This week looks at the explosive revelations from a whistleblower exposing the psychological warfare tools used by Cambridge Analytica.
The man who created Cambridge Analytica
Back in the early days of the Internet there was a web page with a red button. It did nothing, but people would go there and press it in the hope that one day it would do something.
Today we have a different type of button: the privacy button. Trained by years of clicking "I accept" buttons, we now click OK to share our data with countless organisations without a second's thought.
Everyone knew we were sharing our data with someone. But finding out which celebrity you look like, or what your personality type is, was too much of a draw.
Now we can see the extent to which it was abused.
The Guardian has released an astonishing article on how Facebook acted as a conduit, delivering a vast amount of data to an academic from the University of Cambridge, who then sold it on illegally to a company called Cambridge Analytica.
Go and read the story.
It connects the dots between Bannon, Trump, Brexit and Russia.
For those who didn't go and read the story: one of the core people behind the launch of Cambridge Analytica, Christopher Wylie, has turned whistleblower and is revealing the methods used to undermine elections.
It details the key theory, data collection and insight used to deliver the effects we see but only barely understand today.
For every action, there is an equal and opposite reaction.
Whilst the data is out of the bag, the knowledge of how far populations have been compromised, and that it is still happening, will eventually result in a push back the other way.
The push back against fake news and Facebook's involvement had already started, but could this kick-start an even stronger and broader push back?
Protecting the future
Most people in industry are worrying about GDPR, a new data law that aims to protect people's data and privacy. It modernises a law that dates back to a time before Google even existed, so it is much needed.
Let's be clear though.
Passing data from the Cambridge University academic to Cambridge Analytica was illegal under that old law.
The new law makes the penalties much higher, but does not make it any more illegal.
The increased fines are forcing companies to be much more aware of the data they hold and who is responsible for it.
But trying to stop people clicking on buttons that authorise the sharing of data is a waste of time.
Facebook could be forced to take a privacy-first approach. They would argue they already do, but a quick look at the user experience says otherwise.
When it comes to controlling how much data is shared with companies, the selection boxes are often lost underneath the text. More often, though, you do not even have the ability to restrict access, and there is little to no warning from Facebook when a company is pushing the boundaries and trying to get more than is appropriate.
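To make this concrete, here is a minimal sketch of how a third-party app typically asks for your data: it sends you to a login dialog whose `scope` parameter lists every permission it wants, and a single "OK" grants the lot. The app id, redirect URL and the exact mix of permissions below are illustrative assumptions, not taken from any real app.

```python
from urllib.parse import urlencode

# Illustrative values only - not a real app.
APP_ID = "1234567890"
REDIRECT_URI = "https://example-quiz-app.com/auth/callback"

# Each entry is a permission the app asks you to grant in one click.
# A personality quiz arguably needs only a public profile, yet nothing in
# the dialog makes it obvious when an app asks for far more than that.
REQUESTED_PERMISSIONS = [
    "public_profile",
    "email",
    "user_likes",
    "user_friends",  # under the old Graph API this also exposed friends' data
]

def build_login_url() -> str:
    """Build the OAuth login dialog URL the user lands on before clicking OK."""
    params = {
        "client_id": APP_ID,
        "redirect_uri": REDIRECT_URI,
        "scope": ",".join(REQUESTED_PERMISSIONS),
        "response_type": "code",
    }
    return "https://www.facebook.com/dialog/oauth?" + urlencode(params)

if __name__ == "__main__":
    print(build_login_url())
```

The granular checkboxes, where they exist at all, are exactly the selection boxes lost underneath the text described above; the default path is one button that grants everything in the list.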
The ultimate solution is to give ourselves control of our own personal data, so that we can recover it or force it to be anonymised.
The world is not set up that way today but it could be.
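As a rough sketch of what "forcing data to be anonymised" could mean in practice, the snippet below pseudonymises a record by replacing the direct identifier with a salted hash and keeping only the analytical fields. The record layout and salt handling are assumptions for illustration; real anonymisation needs far more care about re-identification risk.

```python
import hashlib
import secrets

# Illustrative record - field names are assumptions, not a real schema.
record = {
    "name": "Jane Example",
    "email": "jane@example.com",
    "quiz_result": "openness: high, extraversion: low",
    "city": "Cambridge",
}

# A secret, per-dataset salt stops simple dictionary attacks on the hashes.
SALT = secrets.token_hex(16)

def pseudonymise(value: str) -> str:
    """Replace an identifier with a salted, truncated SHA-256 digest."""
    return hashlib.sha256((SALT + value).encode("utf-8")).hexdigest()[:16]

def anonymise(rec: dict) -> dict:
    """Drop or pseudonymise the fields that identify a person."""
    return {
        "user_id": pseudonymise(rec["email"]),  # stable key, no longer an email
        "quiz_result": rec["quiz_result"],      # the analytical payload stays
        "city": rec["city"],                    # coarse location kept as-is
    }

print(anonymise(record))
```

Note that pseudonymisation like this can still be linked back to known identifiers by whoever holds the salt, which is why GDPR continues to treat such data as personal data; true anonymisation means nobody can tie the record back to you.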