What role, if any, should government play in the regulation of the internet? The question has once again reared its head after revelations emerged this weekend that Cambridge Analytica, a private data analytics company, had used personal data from Facebook to influence political elections.
Using data mined from social media profiles, allegedly without consent, the company was able to target political advertisements at individuals based on personal characteristics revealed by their online behaviour. These minutely tailored, personalised messages are said to have played a significant role in the outcome of several elections – most famously, the US presidential vote in 2016.
Perhaps a more useful question to ask is: what can government do to prevent the misuse of data? Digital innovation moves ahead rapidly, with new services and products invented every day. At the same time, data detailing our personal history, interactions, likes and dislikes and relationships is increasingly concentrated in the hands of a few huge platforms. In a recent open letter to mark the world wide web’s twenty-ninth birthday, its inventor Tim Berners-Lee described how the web is now controlled by a few gatekeepers, “allowing a handful of platforms to control which ideas and opinions are seen and shared”.
When digital technologies are misused, what are the steps governments can take to protect citizens and societies?
Data protection and oversight
Cambridge Analytica is said to have hired a company to collect personal data from Facebook users through a quiz app. This was the raw material it used to create unique personality profiles, which could then be used to build an algorithm that identifies voters susceptible to particular messaging. Yet some of the data appears to have been accessed without express consent: users of the app allegedly agreed that their data could be used for academic – not commercial – purposes. This was picked up by neither Facebook nor government.
“There’s been at least one bad actor,” said Peter Wells, Head of Policy at the Open Data Institute. “Somebody created an app on Facebook and then took the data, and then used it for purposes that were not described in that app. There is someone there who’s acted maliciously or unethically.”
Incoming legal changes in Europe will bring greater protections for personal data stored online. From May, the General Data Protection Regulation (GDPR) will force digital companies that handle personal data to comply with new rules. Any company that suffers a breach, for example, will face substantial fines if it fails to report it.
For Craig Fagan, Policy Director at the World Wide Web Foundation, this is a significant step. “I think this is about seeing this as one of the elements of what governance means,” he said. “If you think about fighting corruption, or having a robust political party system, equally among governance and institutions we need to think about how we’re working to ensure, in this digital age, that citizens’ data is protected and we’re maintaining a level of trust.”
“There’s no inspection regime either by Facebook or by government: both could play that kind of role”
Besides the GDPR legislation, both national and city governments are looking more closely at ways to oversee the actions of companies working with data.
In 2017, the UK government announced the creation of a Centre for Data Ethics and Innovation, to “advise on the measures needed to enable and ensure safe, ethical and innovative uses of data-driven technologies.”
New York recently passed a bill to create a task force that will analyse the algorithms the city uses to aid its decision-making, such as those which assess teacher performance, or root out Medicaid fraud. However, its remit will focus on the government’s own algorithms, rather than those in the private sector.
“When Facebook has created this way to access data about their users, they don’t appear to have thought about how to verify whether people are conforming with those conditions,” said Wells. “There’s no inspection regime either by Facebook or by government: both could play that kind of role, though they’d be doing it for different purposes. You could imagine governments doing inspections – not on everybody, but where there’s high risk.”
The right to move data
Part of the reason why breaches cause so much damage is because data is concentrated in the hands of a few providers. “You have a large level of market concentration and dominance by a few key players,” said Fagan. “The bottom line is that the way digital companies have evolved, it’s all based on data. The more data you have, the more advertising revenue. The risk is that when there’s so much data held by these companies it can easily be misused or misrepresented.”
New legislation could be the means to break up this dominant pattern. The GDPR gives European citizens the right to request that any data held on them by companies be deleted or shared with another provider. Citizens have the power to move their data from one provider to another, creating the opportunity for new business models to emerge.
“A lot of people feel trapped by some of these social platforms”
“If you’re able to move your data just as easily as you’re able to move bank accounts, then it makes it less likely that one player can dominate the market – because you can easily take your business elsewhere,” said Fagan.
Survey research from the ODI shows that 94% of respondents said trust was important in deciding whether to share personal data, and that people are willing to share data if they can see tangible benefits, such as saving money or contributing to research. More damning is the fact that just one in ten trust social media organisations – such as Facebook and Twitter – with their personal data.
But for Wells, these events show that the laws might have a blind spot. “Most social media data is about multiple people: it’s chatting with friends,” he said. “Everyone has assumed that Facebook data is owned by an individual, when actually it’s data about multiple people. I could share data without my friends’ consent, and that’s a design pattern failure that people need to think a lot more about: who accesses data that is about multiple people.”
“There are definitely no quick fixes”
The definition of what constitutes personal data in the GDPR remains ambiguous, as do some of the implications of the changes. While portability will become a right, the speed at which organisations must provide individuals with their data, and whether it can be sent directly to third parties, isn’t clear. As people begin to experiment with what the legislation permits, more of these issues will come into the open, and more of them will be argued over.
Getting the balance right
When it comes to government oversight of the internet, balance is important. While citizens’ rights and data need to be protected, there is an ever-present fear that interference could stifle innovation. Doing nothing, meanwhile, could lead to an erosion of trust in the digital innovations which could improve our lives.
“There are definitely no quick fixes,” said Wells. “Government’s always slightly behind, because innovation is about discovering new things – it can’t predict all of those spaces that innovation is going to go into. We can’t damage trust too much, because that’s when people start smashing their smartphones in the street. If we get too many of these things, and if the impacts are too severe, we risk people pulling away from data and technology.”
In the specific case of political advertisements, one solution would be for online advertisements to clearly state their publishers. In the US, a group of senators have proposed the outlawing of “dark advertisements”: adverts invisible to all but the specific demographics they’ve been targeted at.
For Wells, the most important actor in neutralising misuse is civil society, with legislation acting as the tool which government should use to open it up to scrutiny. “We would argue for advertising which contains political content to be openly published: so the adverts, the name of the organisation and whom it’s targeted at,” he said. “By having that published as open data, that’s going to create the opportunity for more scrutiny from journalism and from civil society.
“Starting to work out what people want; what people are willing to work with; what our political values say as a society; what consumers want; what citizens want – it’s hard to do, it takes time.”
(Picture credit: Flickr/IIP Photo Archive)