Illusion of Choice: You Aren't Deciding How Important Your Privacy Is

Written by aiycH | Published 2020/10/16
Tech Story Tags: the-social-dilemma | privacy | facebook | data | security | the-illusion-of-choice | edward-snowden | big-tech | web-monetization

TLDR You're not the one who gets to decide how valuable your privacy is, whether you have something worth hiding, or whether you're important enough to be spied on; other people decide this for you. When deciding that something should be private, you're reacting to a perceived negative consequence, so you're only able to realize the need for privacy when you have knowledge of a threat. People don't get to choose how valuable their information is, and thus how much importance they should place on their privacy.

"Arguing that you don't care about the right to privacy because you have nothing to hide is no different than saying you don't care about free speech because you have nothing to say." (Edward Snowden)

The above statement is easily the most eloquent justification of privacy that I've seen. Thanks in large part to Snowden, the past decade has seen large parts of society become serious about data privacy, but it still feels like an overwhelming number of people can't be bothered to give this issue even a second of their time.
The apathy of this latter group is carried through statements such as "I have nothing to hide so I have nothing to fear," or "I'm no one special, so no one cares enough to spy on me." When all you want is for the squad to come talk on Signal instead of WhatsApp, this logic is very annoying. I have yet to figure out how to get those last few stubborn individuals to migrate over, so don't expect much wisdom from me in that regard.
What I will be discussing, however, is data privacy from a perspective that I haven't come across before.
Here is the argument: you're not the one who gets to decide how valuable your privacy is, how valuable your data is, whether you have something worth hiding, or whether you're important enough to be spied on—other people decide this for you.
Consider this thought experiment. I'm in Canada right now; I can criticize the government to my heart's content without fear of repercussion, which means that I don't need to treat my statements as sensitive information.
But now imagine that I'm in an authoritarian state, such as China or North Korea, and I criticize the government there. I hope it's clear to everyone that the statements I can safely make in Canada would be very dangerous in an authoritarian country.
Why does this difference exist, though? The message itself and its delivery are exactly the same. What does change is the environment, and this highlights something that we all know and understand but that never explicitly crosses our minds: when we 'decide' that something is worth hiding, we're not actively making a choice, we're reacting to what we perceive to be the negative consequences if that information were exposed.
Try it with any example you can think of and you'll quickly find that your actions to keep something private or hidden are typically motivated by fear of repercussion. You're not deciding so much as you're being forced into this position. So, who really holds the power here if not you? That would of course be the person or thing that controls the consequences; that's who decides whether you have something worth hiding and how valuable your privacy is. Sorry, but you don't have the power to choose that you think you do.

CONSEQUENCES

As I noted above, when deciding that something should be private, you're reacting to a perceived negative consequence. Thus, you're only able to realize the need for privacy when you have knowledge of a threat. This has important implications, because you cannot possibly know all of the ways you could be harmed if certain information were exposed. Sure, maybe you can only see your online activity as totally innocuous and useless, but that doesn't mean everyone else would agree.
Cambridge Analytica is probably the best recent example of how our seemingly innocent and bland data can be used to hurt us. As Cambridge Analytica steadily built psychographic profiles of Facebook users, how many of those users do you think realized that the content they posted would be useful to someone with malicious intentions? How many do you think realized that such simple information could be used to manipulate them?
It doesn't matter how strongly these users believed that their activities carried no risk—people don't get to choose how valuable their information is, and thus, how much importance they should place on their privacy.
Maybe you're not convinced that this is all a big deal, that the Cambridge Analytica scandal was fairly harmless and similar events wouldn't bother you; this thinking is simply a limitation of your imagination. The human psyche is full of vulnerabilities that are ready to be exploited, and your data is the key to facilitating this.
The recently released Netflix movie, The Social Dilemma, does a good job of explaining the ways that tech giants are using people's data to manipulate them. As more of our lives play out virtually, and as tech giants expand their surveillance capabilities to cover every corner of the internet, ever-increasing quantities of data become available to probe and exploit the vulnerabilities in our psychology.
Tech companies can also bolster their ability to find our exploitable weaknesses thanks to the wonders of machine learning. In particular, machine learning allows tech giants to actively conduct experiments on us. For example, Facebook previously conducted a social experiment on 689,000 users. The experiment involved altering the content that appeared in news feeds in order to gauge the effect on users' emotions and moods; the results showed concrete evidence that emotions can be manipulated in this way.
Machine learning allows for a radically improved ability to detect the patterns of success. Subtle changes to the type of content that you see can be tested for their ability to elicit a specified feeling or action, and those changes can then be refined to strengthen the desired response. As machine learning technology becomes more potent, so too will the ability to get a desired response. Social media has in many ways become the gatekeeper for the rest of the internet, so this is already quite dangerous. However, tech giants are able to track us everywhere we go, and sometimes they can even control the content that we see on other sites.
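To make that test-and-refine loop concrete, here is a toy sketch of the idea: an epsilon-greedy bandit that serves content variants, measures a (simulated) response, and gradually concentrates on whichever variant works best. Everything in it is hypothetical; the variant names, the response rates, and simulate_user_response() are made up for illustration, and no real platform's system is this simple.

```python
import random

# Toy sketch of a test-and-refine loop: an epsilon-greedy bandit that
# "tests" content variants and drifts toward whichever one most reliably
# elicits the target response. Purely illustrative; the variants and
# rates below are invented, not taken from any real system.

VARIANTS = ["neutral headline", "outrage framing", "fear framing"]
EPSILON = 0.1  # fraction of the time we explore a random variant

# Hypothetical ground truth: some framings trigger the target emotion more often.
TRUE_RESPONSE_RATE = {
    "neutral headline": 0.02,
    "outrage framing": 0.08,
    "fear framing": 0.05,
}

def simulate_user_response(variant: str) -> int:
    """Stand-in for measuring a real signal (a click, share, or dwell time)."""
    return 1 if random.random() < TRUE_RESPONSE_RATE[variant] else 0

shows = {v: 0 for v in VARIANTS}      # how often each variant was served
successes = {v: 0 for v in VARIANTS}  # how often it got the target response

for _ in range(100_000):
    if random.random() < EPSILON:
        variant = random.choice(VARIANTS)  # explore a random variant
    else:
        # Exploit: serve the variant with the best observed response rate.
        variant = max(VARIANTS,
                      key=lambda v: successes[v] / shows[v] if shows[v] else 0.0)
    shows[variant] += 1
    successes[variant] += simulate_user_response(variant)

for v in VARIANTS:
    rate = successes[v] / shows[v] if shows[v] else 0.0
    print(f"{v}: served {shows[v]:,} times, observed response rate {rate:.3f}")
```

Run it and the loop ends up serving the most provocative variant the overwhelming majority of the time, not because anyone decided it should, but because the feedback measured that it works. Scale that up to thousands of signals per user and the concern below becomes clear.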
With all this power at their fingertips, just how elaborate will their ability to manipulate us get? What about the potential to manipulate even our beliefs and perception of reality? Filter bubbles and echo chambers are frequently criticized for their role in creating civil unrest and polarization in society, but what happens as these abilities to distort our reality become increasingly specific and complex?
The incredible precision with which an individual can be targeted to modify attitudes, behaviors, and cognitions is sheer insanity. In The Social Dilemma, Tristan Harris gives a great, albeit hyperbolic, analogy: it's like sticking electrodes into a spider's brain, where each electrode can fire an individual neuron. Those electrodes can then be used to get the spider to do whatever it is you want it to do, and I think this paints a sufficiently dystopian picture of just how great the danger potentially is.

CONCLUSION

The section above was meant entirely to open your mind to what is possible with your data. We don't know how our data will be misused, but we know that it can be. It doesn't matter if you think you have nothing to hide or that you're unimportant; tech giants, and many other groups, are interested in your data for a reason. By acknowledging what the consequences are, and that we can only react to them, we can really start to appreciate the value of data privacy.
