‘Privacy Is Not Dead, But It Is Dying’

Menny Barzilay
Author: ISACA Now
Date Published: 20 October 2022

Editor’s note: Menny Barzilay, partner and co-founder of the global cyber crisis management company Cytactic, will be a keynote speaker for ISACA Conference Asia 2022, to take place virtually 16-17 November, presenting on “The Dark Future of Privacy.” Barzilay, an internationally recognized cybersecurity expert and CTO of the Interdisciplinary Cyber Research Center at Tel-Aviv University, recently visited with ISACA Now to discuss privacy mining, deep fakes, digital trust and more. The following is a transcript, edited for length and clarity:

ISACA Now: What is privacy mining and how does it affect everyday people?
As should be clear to everyone by now, privacy is the internet’s main currency. We receive “free” services and we pay with our privacy. Large data holders like Google, TikTok, Facebook and others have mastered the process of transforming our privacy into profits. And since these companies, like any other business, strive to increase their profits, they need to constantly find new ways to collect more of our privacy.

And privacy mining is just that. It’s the process of collecting more and more of our privacy.

ISACA Now: Why do you believe that “trust is the foundation for innovation and technological advance”?
People need to trust new, innovative technologies in order to use them. If people do not trust autonomous cars, they will not ride in them. If they do not trust cryptocurrency, they will not use it. If they do not trust CNN or Reuters, they will stop consuming their news online. In that sense, trustworthiness is an important antecedent for the adoption of any innovation. Without trust, there is no progress.

ISACA Now: Why are deep fakes so dangerous?
It’s already tremendously hard to differentiate between what’s real and what’s fake. Disinformation and “fake news” are key problems in the modern era. While people as individuals can be critical and skeptical, as a group, we tend to follow the crowd. And when the crowd’s actions are based on fake information, the outcomes can be dangerous, and in some cases even lead to casualties (as with the rumor that led to the deaths of two men, and the role of fake news in fueling the anti-vax movement).

Deep fakes represent the next generation of fake information. Or, to put it better, deep fakes significantly amplify the threat. Currently these technologies are not mature enough to cause real havoc. But in a very short time, anyone will be able to create fake videos featuring any person (living or dead, from the US president to Crocodile Dundee), doing and saying anything the creator wants. And it will be impossible to tell that these videos are fake.

We are getting close to a fake-reality mode, in which people cannot even trust their own eyes. This will lead to a trust crisis. People will simply not know what is real and what is fake – what they should dismiss and what they should trust.

ISACA Now: How can we combat this digital trust crisis?
Critical thinking is the best tool to combat fake information. People should learn how to conduct fact checks, verify sources, and refrain from sharing information until they are sure it can be trusted. In a fake reality, you are either part of the solution or part of the problem.

People should also demand accountability from the people and companies who share fake information, whether purposefully or mistakenly. Legislators have a role to play as well. Strict rules against using fake information should be put in place to prevent businesses like Cambridge Analytica from existing. This is an important war, and we cannot afford to lose it.

ISACA Now: What are the next steps in raising awareness about and working to change the “dark future of privacy”?
First and foremost, we should remember that paying with privacy is dangerous. Unlike lost money, lost privacy can never be earned back. Once your secrets are revealed, they can never be secrets again. We should not allow companies to exploit our privacy. People should demand transparency (e.g., disclosure of what private data is collected and how it is being used), and, as I said above, accountability. People should make sure they are involved in the discussion. Because currently, the future of our privacy is not bright.

Privacy is not dead, but it is dying.