Small bit of good news on a Saturday evening!
Back on BBC News this morning, where there were three meaty topics up for discussion: combating fake news, the fallout from AirBnB's success, and big tech's increasing control over data, where the EU's solution only serves to endanger individuals' privacy and consolidate big tech further.
All the talk is of dealing with fake news by using more tech. That is not an easy problem to solve: even though huge amounts of it are removed, some still gets through. Worse, big tech is nervous about dealing with high-profile politicians, who increasingly seem to treat acting responsibly as a nice-to-have.
A much better solution, in the same way that a decade ago we learnt not to click on dodgy links in email, is to teach people how to identify fake news in the first place. The New York Times had a piece looking at the increasing amount of news-literacy teaching happening in schools. Research by Stanford, discussed by the FT, found that while most people would say they can tell the difference, in reality they were good at identifying true news but, regardless of age or education, found fake news much harder to spot.
It would be interesting to see how the performance changes for those who have been taught to identify it from a young age.
The rise of AirBnB has changed the short-stay market, adding many more properties and increasing the supply of rooms in cities. That success has drawn short-letting businesses onto the platform as well.
The Guardian points out the imbalance in legal requirements between those short-letting businesses and individuals letting out their homes. No doubt some common standards are needed, but behind the story is really the general lack of available housing. Stopping AirBnB is not the solution; building more housing is.
The FT discussed the EU's new data-sharing principles this week, which seek to reduce the power Big Tech companies hold over our data and use to build barriers to competition.
The EU is seeking to force Big Tech to share data such as health and wellness data, a high-value market. But this data is also highly sensitive, so ensuring it stays secure is an important factor too.
In addition, privacy needs to be taken into account. If I share data with one of the Big Tech companies, I need to be in control of whether that data is shared with another.
GDPR sought to stop this sharing of data, but the end result has been more power consolidated in the hands of companies like Google and Facebook. Google's recent announcement that it is turning off cookies merely serves to consolidate control of more data within Google itself.
Don't misunderstand me: switching off cookies is the right answer. The cookie was a poorly implemented solution that never took account of privacy issues.
Today, though, we need a more open, privacy-aware standard that is independent of the large tech companies.
This week looks at the explosive revelations from a whistleblower exposing the psychological-warfare tools used by Cambridge Analytica.
Back in the early days of the Internet there was a web page with a red button. It did nothing, but people would go there and press it in the hope that one day it would.
Today we have a different type of button: the privacy button. Trained by years of clicking "I accept", we now click OK to share our data with countless organisations without a second's thought.
Everyone knew we were sharing our data with someone. But finding out which celebrity you look like or which personality type you are was too much of a draw.
Now we can see the extent to which it was abused.
The Guardian has published an astonishing article on how Facebook acted as a conduit, delivering a vast amount of data to an academic at the University of Cambridge, who then sold it on illegally to a company called Cambridge Analytica.
Go and read the story.
It connects the dots between Bannon, Trump, Brexit and Russia.
For those who didn't go and read the story: one of the core people behind the launch of Cambridge Analytica, Christopher Wylie, has turned whistleblower and is revealing the methods used to undermine elections.
It details the key theory, the data collection and the insights used to deliver the effects we see today but only barely understand.
For every action, there is an equal and opposite reaction.
The data may be out of the bag, but knowledge of the extent to which populations have been compromised, and that this is still happening, will eventually result in a push back the other way.
The push back on fake news and Facebook’s involvement had already started, but could this kick start an even stronger and broader push back?
Most people in industry are worrying about GDPR, a new data law that aims to protect people's data and privacy. It modernises a law dating back to before Google even existed, so it is much needed.
Let’s be clear though.
Passing data from the Cambridge University academic to Cambridge Analytica was illegal under that old law.
The new law makes the penalties much higher, but it does not make the act any more illegal.
The increased fines are forcing companies to be much more aware of data and who is responsible for it.
But trying to stop people clicking on buttons authorising the sharing of data is a waste of time.
Facebook could be forced to take a privacy-first approach. It would argue it already does, but a quick look at the user experience says otherwise.
When it comes to controlling how much data is shared with companies, the selection boxes are often lost underneath the text.
More often, though, you do not even have the ability to restrict access (as in the image above), and there is little to no warning from Facebook when a company is pushing the boundaries and trying to get more data than is appropriate.
The ultimate solution is to give ourselves control of our own personal data, so that we can recover our data or force it to be anonymised.
The world is not set up that way today but it could be.