
February 4, 2019, 19:49 GMT

By Jacob Ward

Facebook and other companies may well be protecting your privacy, but they don't need your personal information to figure out who you are and what you'll do next.

Our human sensory network is built to detect small, immediate anomalies easily and automatically: snakes, gunshots, members of a rival tribe. Our cognitive and perceptual equipment evolved to spot these things here and now. Larger, more abstract threats and patterns mostly escape our immediate comprehension. That inability to perceive big problems is one of the greatest obstacles to understanding, say, the global implications of climate change, or the need to fill out a complicated form to enroll in a 401(k). And in the world of privacy and data, it cripples our ability to see the real effects of data collection.

First, understand that privacy and data are separate things. Your private information – your full name, your Social Security number, your online credentials – is the kind of information we understand best and protect most actively. When a FaceTime bug lets strangers hear and see us, we grasp the danger in the same visceral way we would picture a prowler at our window. But your data – the abstract portrait of who you are and, more importantly, of who you are compared with other people – is your real point of vulnerability to the companies that make money by offering seemingly free services to millions. Not because your data will compromise your personal identity. Because it will compromise your personal autonomy.

"The protection of privacy, as we normally think, does not matter," said Aza Raskin, co-founder of the Center for Humane Technology. "These companies are now building small models, small avatars, and little voodoo dolls, your doll is sitting in the cloud, and will be distributing 100,000 videos to see what's good for you to stay, or what ad with which message is particularly effective in helping you do something. "

Raskin was a successful engineer and entrepreneur, leading teams at Mozilla and Jawbone, before concluding that his work had shaped human behavior in ways he couldn't tolerate. He invented infinite scroll – the now-ubiquitous design pattern in which your feed never ends at the bottom of the page – and later calculated how much time people were losing to his creation. "Infinite scroll wastes at minimum 200,000 human lifetimes a day," he said. "That's the reason I chose a new path."

Now working at the CHT with former Google ethicist Tristan Harris, Raskin spends his days fighting the power companies have to predict and shape human behavior.

Even the smallest interactions with an app or service feed it useful data for building a simulation of you. "Imagine it's just a little thing at first, and as you use the system, it picks up nail clippings and strands of hair," Raskin told NBC News. "What does it matter if you've lost one nail clipping? But put together, it becomes a model of you."

With 2.3 billion users, "Facebook has one of these models for roughly one in four people on Earth – every country, culture, behavior type and socioeconomic background," Raskin said. With these models, and the endless simulations it can run on them, the company can predict your interests and intentions before you even know them.

That is what creates the illusion that our phones are listening to what we say – that we're shown car ads just as we finish a conversation about cars – a notion Facebook and others have strongly denied.

"I understand it's scary to imagine that they listen to your conversations," Raskin said. "But is not it more frightening that they can predict what you are talking about without listening? It's this little model of you. You are super predictable for these platforms. It's about persuasion and prediction, not about privacy. "

But the market for this kind of data is only getting started. "Remember 10 years ago? We could barely look at a map on our phones – we had to print out directions," said DJ Patil, who served as chief data scientist under President Obama.

For now, data is most immediately useful for targeting advertising. Without ever attaching your name or address to your data profile, a company can compare you to other people who have shown similar behavior online (clicked on this, liked that) and serve you the most precisely targeted ads possible.
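
One way to picture that kind of matching is a simple similarity score between behavior profiles. The sketch below is a loose illustration of the idea, not Facebook's actual system; the feature names and numbers are invented for the example.

```python
# Hypothetical "lookalike" targeting sketch: score how closely one person's
# behavioral profile matches people who already clicked an ad. Illustrative only.
import numpy as np

def cosine_similarity(a, b):
    """Similarity of two behavior vectors; 1.0 means identical direction."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# Each vector counts anonymous behaviors: [car pages liked, video ads watched,
# late-night sessions, shopping clicks] -- no name or address required.
known_buyers = np.array([
    [12, 30, 4, 9],
    [10, 25, 6, 7],
])
you = np.array([11, 28, 5, 8])

# Average similarity to people who already converted decides whether
# you get shown the same ad.
score = np.mean([cosine_similarity(you, buyer) for buyer in known_buyers])
print(f"lookalike score: {score:.3f}")  # closer to 1.0 = more tightly targeted
```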

In a statement provided to NBC News, Facebook said that it targets ads using interest categories based on people's Facebook activity, and that users can dissociate themselves from an interest by removing it from their ad preferences. The company also said that those interest categories reflect activity rather than personal characteristics, and that its advertising policies prohibit discrimination.

But this kind of data is so potent that it produces results far beyond those of traditional advertising. Facebook, for example, lets advertisers pay not just for an audience of a certain size but for an actual business outcome, such as a sale, an app download or a newsletter signup. Advertisers once paid a "CPM" – cost per thousand views – for a marketing campaign; they were buying nothing more than the chance to be seen. Now Facebook offers a "CPA," or "cost per action," rate – a once-unimaginable metric, offered because the company is so confident in its understanding of people and their preferences that it can essentially guarantee a certain number of people will do certain things.
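
As a rough illustration of the difference between those two pricing models, here is a back-of-the-envelope comparison; the dollar figures and counts are invented for the example, not real Facebook rates.

```python
# Hypothetical comparison of CPM (exposure-based) vs. CPA (outcome-based) pricing.
impressions = 100_000          # ad views purchased
cpm = 8.00                     # cost per 1,000 views (traditional model)
cpm_cost = impressions / 1_000 * cpm

signups = 250                  # actual newsletter signups delivered
cpa = 3.20                     # cost per action (outcome-based model)
cpa_cost = signups * cpa

print(f"CPM campaign: ${cpm_cost:,.2f} buys exposure only")
print(f"CPA campaign: ${cpa_cost:,.2f} buys guaranteed outcomes")
# Under CPA, the platform bears the risk of non-conversion -- viable only if it
# can predict who will act, which is what the behavioral models enable.
```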

And the data can do more than that. As the past few years have shown, it can predict not only which T-shirt you might want to buy but also which topics are so emotionally charged that you can't look away from them – and which pieces of propaganda will work best on you. That makes platforms that collect data at scale an incredible means of influencing people. Maybe not you. Maybe not today. But over time, the influence adds up to results that are consistent in the aggregate while remaining invisible to any individual.

Tim Wu, a Columbia Law School professor and author of The Attention Merchants, says social platforms – and Facebook in particular – amount to a huge liability. "There's an incredible concentration of power there – that much data and influence makes it a target for Russian hackers," he said. "It used to be that to influence an election you'd have to hack hundreds of newspapers. Now there's a single point of failure for democracy."

"To influence an election, you had to hack hundreds of newspapers – there is only one point of failure for democracy."

And the categories your data places you in can be used for far more than selling you things or pegging your political preferences. Without your ever telling a company your race or sexual orientation, your behavioral history can reveal them. Engineers, Patil added, should be trained to recognize when supposedly anonymous data makes a problematic revelation.

"It really needs to be built into the program and every interview," Patil said. "I always ask job candidates something like," You find something that serves as a proxy for the breed. What to do?

"The correct answer is" To whom should I go in this organization? Which group meets regularly about this? "The wrong answer is:" Hey, wow, good question! "It's the wrong person for the job."

And in the future, handing anonymous data to companies will give them such insight into human behavior that systems built on that data may devalue us without our conscious participation in the process. A recently announced joint venture between Apple and insurance giant Aetna will reward customers who agree to wear an Apple Watch with incentives for healthy habits. The companies have promised to protect individual privacy. At the same time, Aetna told CNBC it hopes to eventually enroll all of its members in the program. What will all that data allow the company to do? Will premiums rise for people whose choices may be healthy but whose data profile suggests a shorter, more stressful life? And once every insurance company has this kind of data, how will people who happen to be cursed with the wrong data profile get affordable coverage?

Unfortunately, your data and its predictive power, while of enormous aggregate value to the companies that collect it, have only the tiniest measurable value in your own life. In a lawsuit, Patel v. Facebook, argued before the Ninth Circuit Court of Appeals, Facebook contended that for people to sue the company for privacy violations under a new Illinois law, they should have to prove they suffered individual harm. And yet the business model depends on the value of aggregating individual data at the largest possible scale.

Taken one person at a time, Facebook seems to be arguing, your data is essentially worthless. But while we can't really point to the effect of data collection in our own lives, the value of your data combined with everyone else's is immeasurable.