Our private data and its treatment: civilization and its discontents
The age of digital recklessness is over. We can no longer be naked on the web, both literally and figuratively. According to a recent study by NortonLifeLock and The Harris Poll, 67% of French people are worried about the protection of their personal data. It's not too soon, but it might already be too late…
Give me your data, I’ll tell you who you are
In the United States, if you are a concerned citizen who wants to vote, everything happens as in France: you register on the electoral rolls. These rolls include your name, address, date of birth and so on: that is to say, data about you. Your personal data.
When you register on Facebook, you fill in your name, your age (if you are honest, you indicate your true year of birth), your face in a profile picture, the city where you live… It is tempting to divulge more about yourself, like the high school you attended (useful for finding your old friends). You can follow an anti-vax page if that's your hobby, or subscribe to an event, anything from a motocross contest to a protest march against a dictatorship. The events you like on FB already say a lot about you.
Last but not least, you compile a list of friends, relatives, co-workers, mates, and people you have never met IRL, just to grow the narcissistic counter of your online popularity.
All of that is data. A spider's web, a diagram if you prefer, that maps who you are. In computer science, what we call data is a representation of information: converting into lines of code, into electrical pulses passing through circuits, the idea that you love chocolate, or that you stand 1.80 m tall.
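To make that idea concrete, here is a toy sketch of what such a "map of you" might look like once encoded. The field names and values are purely illustrative assumptions, not any real platform's schema:

```python
from dataclasses import dataclass, field

@dataclass
class Profile:
    """A toy record of the kind of personal data a platform might hold.

    Every field is an illustrative assumption, not a real schema."""
    name: str
    birth_year: int
    city: str
    height_cm: int
    likes: list[str] = field(default_factory=list)    # pages and events followed
    friends: list[str] = field(default_factory=list)  # the social graph

# Even this tiny record already sketches who you are.
me = Profile("Alice", 1990, "Paris", 180,
             likes=["chocolate", "motocross contest"],
             friends=["Bob", "Carol"])
print(me.likes)
```

A handful of fields like these, multiplied by every like, event, and friend, is the "diagram" the rest of this article is about.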
The CNIL, whose job in France is to oversee these issues, defines personal data as: “Any information directly or indirectly identifying someone (e.g. name, registration number, telephone number, photograph, date of birth, address, fingerprint…).”
Back to Facebook. When you sign up, you give this information to the platform, which can then sell it to advertisers (as per the terms and conditions) so they can size you up better. The social network provides you a free service: you do not pay, but there is a trade-off. So far, nothing really surprising.
But once upon a time (in 2016), there was a company that wanted to run an experiment: to understand the statistical, mathematical, logical links between your Facebook profile data (this map of you) and your voting intention for the next President. A sample of willing people, who signed an agreement, participated in the study. The company began harvesting their data on Facebook but, caught up in the heat of the moment, started exploiting the data of the participants’ social connections as well. This is how, starting from a small sample of people, 50 million US voters’ Facebook profiles came to be analyzed. The company then shared the data it had collected with another company, without the participants’ knowledge. As if your partner shared your nudes with their coworkers, then with their whole family at Christmas dinner.
The database thus created included e-mails, invoices, bank transfers, etc. Above all, it fed an algorithm that targeted undecided voters, sending them ads and messages via bots, to make them lean, decisively, toward dropping a ballot marked “Trump” into the box. Their data revealed their concerns; it was easy to pitch them the candidate who would solve it all.
The first company was Global Science Research. The second was Cambridge Analytica. The one squeezed in the middle was Facebook. Hard to forget the puzzled face of its founder, saying no, no, he bears no culpability in this “Zucky Leaks” scandal. His social network complied with its own terms and conditions, that is to say, it only gave access to friends lists. As for the rest, he washes his hands of it.
The way private data was processed, passed from one party to another, analyzed and manipulated to elect a despot is basically an Orwellian nightmare come to life.
It could never happen again. It should never happen again.
But flash forward to 2018, and the same scenario played out with the election of Jair Bolsonaro in Brazil, this time via WhatsApp (Facebook’s puppy).
It is nothing but the logical consequence of what began with the Brexit referendum in 2016. What’s next?
Personal data is the new oil: its massive manipulation could make our democracies collapse.
The social contract, according to Jean-Jacques Rousseau, is the idea that I give up some of my rights (to carry a weapon, to take the law into my own hands, etc.) in exchange for the State’s protection (police, justice, the welfare state).
Somehow, the social contract around our private data is similar. I agree to provide information about myself in exchange for free services, or the improvement of paid ones.
The best example is Waze. I voluntarily give my data — position in real time, speed, routes, places I frequent (my home, office, barbershop…) — in exchange for the app’s GPS and warning system that allows me to avoid speeding tickets.
The more the algorithm knows me, the better it can cater to me and make my life easier.
Netflix knows my tastes and suggests the new series that will get me hooked for sure. Airbnb should now know that I love chalets high up in the mountains, and shouldn’t waste my time by suggesting 70s-style houses in a city center. I have come to expect UberEats to know what kind of cuisine I prefer, and to offer me the best Nepali restaurant in town wherever I travel.
All this implies processing my personal data for the purpose of meeting my expectations, within limits. With the exception of Waze, let us not forget that we pay for all the services I have listed. We are therefore entitled to expect these companies to take special care when it comes to securing our data. Though I may click “accept the terms and conditions” without ever really reading them, my trust in these companies rests on there being strict conditions on what they can and cannot do with my data. I agree to provide my data for a specific purpose and for nothing else. Amazon Echo is meant to listen to my voice so that I can order a good book to read by the fireplace; it is not supposed to record my private conversations and deduce my political leanings. The State is not supposed to gain access to it one day and learn whether I curse the government over my morning coffee.
If it’s free, you’re the product.
When apps do not charge for their services, ask yourself what their business model is. They certainly trade your data, so why act so surprised?
Nevertheless, providing some of our personal data might soon make our cities smarter, that is to say, more pleasant to live in, less congested, and so on.
The newspaper Vedomosti revealed that Moscow tracks several million citizens’ movements in real time through their mobile phones. The city has invested billions to buy the location data of Russian mobile customers in order to improve mobility and streamline traffic.
Eventually, even your city could build an app that would advise you to take public transportation during rush hour congestion, or a taxi when there’s an incident on the lines. Crowdsourcing data while you’re on the road could help you find a parking spot in Paris, saving you lots of time and swearing.
The ethical boundary is clear: this data must not be matched with our personal identities.
If Moscow wants to make public transportation flow more smoothly, all good: no need to know our names for that. But when the city, at the same time, wants to match mobility data with facial recognition technology, we slide into mass surveillance. In Moscow, 174,000 cameras scrutinize public spaces, metro and train stations. The data collected is compared in real time against the Ministry of the Interior’s database. Facial recognition is used to hunt down debtors, something the chief bailiff has boasted about.
Our data must be anonymous: if cities can scrutinize our movements in real time to smooth traffic, it is a whole other issue if the authorities find out, by the same means, that you regularly attend a mosque or a gay bar.
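One standard technique for keeping mobility data useful while severing the link to identity is pseudonymization: replacing each traveler’s identifier with a keyed hash before storage. The sketch below is a deliberately minimal assumption-laden illustration (the key name, its handling, and the record fields are all hypothetical), and pseudonyms alone are not full anonymity, since movement patterns can still re-identify people without aggregation:

```python
import hashlib
import hmac

# Hypothetical secret: in practice it would be held by a trusted
# party, kept out of source code, and rotated regularly.
SECRET_KEY = b"rotate-me-regularly"

def pseudonymize(subscriber_id: str) -> str:
    """Replace a real identifier with a keyed hash (HMAC-SHA256).

    Traffic analysts receiving the token cannot work back to the
    person without the key."""
    return hmac.new(SECRET_KEY, subscriber_id.encode(), hashlib.sha256).hexdigest()

# A mobility record keeps the useful signal (where, when), not the identity.
record = {
    "rider": pseudonymize("+33 6 12 34 56 78"),
    "station": "Okhotny Ryad",
    "time": "08:42",
}
print(record["rider"][:12], record["station"])
```

The design point is that the same person always maps to the same token (so traffic flows can still be counted), while the mapping back to a name or phone number exists only where the key does.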
Giving up our data does not mean renouncing the preservation of our identity, our privacy and our fundamental freedoms.
Our society 4.0 is always on the razor’s edge of worst-case scenarios. Europe has taken a big step forward with the GDPR. But the problem remains the same: how far can Europe extend its legal arm toward the GAFA and, on a smaller scale, toward companies based outside the EU?
Whether you use Tinder (based in the US) to find the one, or just a hook-up to get your shot of endorphins after a week of suffering idiots at the office, it’s all good. The application’s algorithm records your conversations, pictures, and tastes in sexual partners. It also gives you a “desirability score”, based on how many matches you get (as well as your income, and your physical beauty judged by Instagram criteria).
While Tinder has not yet suffered a leak, the Canadian website Ashley Madison was not as lucky. The adulterous dating site (nicknamed “Home Wrecker Inc.”, though it still draws 124 million visits per month) was hacked in 2015. Hackers downloaded the very sensitive data of 33 million users: names, phone numbers, home addresses, search histories, and sexual preferences. They threatened to make this data public if the website did not shut down immediately. Were they the White Knights of the digital age? The Gatekeepers of good morals? My ass. The hackers extorted the victims through blackmail. People paid to save their reputations, their privacy, their families… some lost their jobs. Three of them committed suicide.
The website bears a heavy responsibility here. Its policy was to keep all data concerning its members without informing them, even after they deleted their profiles, and even after they paid for the “full delete” option.
Whether it’s Ashley Madison or Facebook, the handling of personal data has been outrageously flippant, unconscionable, even cynical.
We cannot tolerate that anymore.
Who pays the final bill?
The miracle solution for some: rent out our data, or sell it (at a fair price, if you please: not €100 but rather €50k for a lifetime). It’s the idea of making the GAFA pay for our data since, at the end of the day, it is ours, our own, our precious.
I am skeptical about this solution: it is difficult to estimate the quantity and quality of data. Do we set a price per kilo, as though our data were avocados? There are many other issues: our data is not a tangible good that we own, and that we could appraise, sell, lease or bequeath.
My name and my email are private data, but I will not ask for royalties every time I hand them over to subscribe to a newsletter. It would be endless.
The GDPR gives us the right to examine how companies use our data. It’s a good start, but it’s not enough. What concrete means do we have to report and sanction misuses of our private data? Which of us has the time to check? Who is safe from hackers, when even the Pentagon’s website has security holes?
Service providers need to engage with their users, but we also need more self-control over our online behavior.
We must teach the younger generations this simple principle: your private data has value; do not give it away easily.
It seems that we are on the right track, and that even young people have lessons to teach us, we the generation who behaved so recklessly on Facebook: the study I mentioned in the introduction reveals that 30% of 18–38 year-olds deleted one of their social media accounts during the past year (compared to 15% of 39–53 year-olds and 8% of those aged 54 and over).
Our children have probably realized that the “all naked on the net” era is over.
They now evolve behind masks, pseudonyms and avatars — which leads to other issues, but that, my friends, is the subject of another article.