When discussing privacy matters with ordinary[1] people, I often hear the same argument: “So what if they are tracking me in order to show me better ads? I much prefer hearing about products that interest me, rather than seeing default ads.” Which, I have to admit, is a very fair argument. In fact, one of the issues I have with creating a privacy-focused product is that it’s very hard to let my intended users know about it. Plus, not only are personalized ads more interesting to users, they are usually also better overall - default ads are often made for the lowest common denominator, full of jingles and celebrity endorsements. On the other hand, these ads can get downright creepy: everybody has a story about talking to a friend about a specific topic, only to later see Facebook ads on that same topic. And let’s not get started on ads that are just too personal, like the now-famous Target pregnancy-prediction algorithm. The formula seems clear enough - personalized ads good, too-personalized ads bad. Our efforts should then go towards allowing these ads while somehow making them less invasive, right?
The above paragraph follows the same line of thought as almost every discussion I’ve ever had on the topic. Regardless of the interlocutor, and regardless of how much I chime in, most people[2] seem to agree that creepy is creepy but anything else is fair game. I don’t necessarily disagree with them. The argument itself - that personalized ads are, overall, better for everybody - is pretty convincing. It is so convincing, in fact, that it is a bit harmful. We talk too much about advertising when talking about personal data, because advertising is the number one reason websites collect data nowadays. The critical point, for me, is this: just because personal data is currently used mostly for advertising, that doesn’t mean it’s all it can and will be used for.
The amount of data already collected about us could be put to a lot of different uses. It’s (probably) not currently being used for any nefarious purposes, but why couldn’t it be? Do we have solid reasons to believe that our data will only ever be used to show us better ads? Or are we blinded to the fact that our personal data is being collected, just because the most obvious example of its usage is the ads we are shown every day?
Given how little impact this data collection has on our lives, why shouldn’t we trust those collecting it? These companies also provide some of the most useful products in our lives, things we can no longer imagine living without. But that is also part of the problem - most companies dealing with our data are still living through the years of being led by their original founders, or by hand-picked successors. Would you still trust Google with all your data if it were run by a cabal of anonymous board members? We might trust Google for now, but Google as an entity will exist long after the current leadership is gone. What happens if a less scrupulous group of people takes the reins of any technology giant?
As for the power of the information they already have: would a person in a position of power not be afraid of blackmail from a company that knows (literally) their every move? Have information and power somehow severed their link? Although espionage is often romanticised, the information provided by spies is often as banal as “what does the consul like to do in their free time”. Seemingly simple pieces of information like this can go a long way.
There are plenty of ways to use the data that is being collected. We are letting it be collected because we are focused on how it is currently being used. But should ads truly be the focus of our discussion, just because they are the focus of the tech companies’ discussion?
Arguably one of the greatest American writers of the 20th century, Thomas Pynchon often wrote about people getting swept away by the tides of forces much bigger than themselves. Conspiracies, invented or otherwise, are one of his favourite topics. His most famous work, Gravity’s Rainbow, contains its own share of private-corporation-backed international conspiracy[3]. In it, Pynchon wrote the fittingly named Proverbs for Paranoids, one of which I feel applies very well to the discussion of data privacy and ads: “If they can get you asking the wrong questions, they don't have to worry about answers.”
[1] As usual, I use “ordinary” to refer to people who have spent some time thinking about privacy, but who feel that being perfectly private is just “too much work” (e.g. turning off JavaScript, not using any popular service, only talking to your parents via snail mail because they refuse to download Signal).
[2] See [1].
[3] For a nice read on the topic, see: https://www.thesatirist.com/books/GravitysRainbow.html. “(...) clearly one of the main themes that emerges in Gravity’s Rainbow is the prevalence of corporate power and its attendant technologies. Corporate power crosses national lines, even (especially?) during times of war, even during World War II.”