This opinion piece was written by Alistair Gentry, research artist in residence at the Open Data Institute (ODI). His lunchtime lecture “exploring trust and distrust of public information through art” is on Friday 2 November at 1.00pm at the Open Data Institute, 65 Clifton Street, London, EC2A 4JE, and is also available via livestream on the ODI YouTube channel. This piece also appears in our digital government newsfeed.
In September I was appointed as the Open Data Institute’s research artist in residence, with a brief to develop an artwork that explores the key themes of public trust and “data as culture.”
The ODI’s Data as Culture art programme engages new and diverse audiences with work by artists who explore data critically and materially. It offers an alternative angle for looking at data: unpicking the societal and political issues and opportunities it might create, without being tied to conventional research methods.
My starting point was the asymmetry of data’s impact upon us — the ghost that helps some of us and haunts others — because the risks, costs and side effects of misplaced or mishandled trust fall much more heavily upon some people and some sectors of society than others.
The tech bro mentality that seems so prevalent among data handlers like Facebook is inextricably bound up with the impact of their technologies on nearly all of us. That entanglement arises either by ideological (usually Silicon Valley-inflected libertarian) design or through privileged, unconscious ignorance. “I don’t think I have anything to hide, and if you have nothing to hide you also have nothing to fear” is a sentiment certain people are privileged enough to utter in all seriousness. They might feel differently if they were, for example, an investigative journalist in Malta or a Russian political activist.
Shifting focus slightly to socioeconomic status: in Britain, being poor (often working poor rather than unemployed poor, yet poor enough to need Universal Credit) quickly makes a person acutely aware that mere inefficiency in handling data, without any active malice, can send them and their household into a nosedive of rent arrears, unpaid bills and the food bank.
Closing Pandora’s Box
In the developed world, our lives are extensively monitored and recorded thanks to our constant use of phones and smart devices, and in some cases simply their passive presence on or near our person. Alongside online banking, government services and shopping, data from any or all of these uses and technologies is actively interlinked.
Most people are conscious of it, to a degree, but few of them grasp its full extent, reach or implications. If they were made aware, they’d be amazed, or aghast.
Somebody who knows this very well is Cambridge Analytica’s former lead psychologist Patrick Fagan, who in an interview with the ODI’s Anna Scott says “I don’t think they understand [commercial use of personal data] at all, and I put myself very firmly in that category too. People generally care about emotional things, and data is very rational and complex.”
If even the likes of Fagan aren’t claiming to be particularly on the ball, we should approach with extreme caution anybody who claims to have a total grip on this subject.
As for privacy and our rights to control, permit or deny access to the data that accretes constantly around almost everything we do, at this stage it’s probably a question of somehow trying to stuff everything back into Pandora’s Box, sit on the lid until it stops moving, then start afresh.
At a conference I attended recently, run by an artificial intelligence company, a presenter from a major insurer was incredibly relaxed about the prospect of automatic and instant retrieval of complete dossiers on a person when they so much as idly ask for an insurance quote. You don’t even need to proceed with the policy to be exhaustively profiled, along with every other member of your household, your former colleagues, partners, flatmates and family members who no longer live with you, or indeed are no longer alive at all. A “deceased” tag in the AI database of another presenter at the same conference attests to this.
My own current research confirms that doing such things is appallingly easy with readily available software, no machine learning required. And machine learning and AI techniques are exponentially accelerating old-school human intelligence, research and profiling skills. The moral and ethical framework for these activities lags notably, and worryingly, behind.
Choosing the right form
For my residency at the ODI, initially I planned to create something that used these same technologies but turned the data and the tools for managing it over to the person concerned.
But, so pervasive is the indiscriminate vacuuming up of data, that I decided doing this in an artistic context would still be just doing it rather than questioning it. It would be rather like me making a virtually indistinguishable copy of a famous painting and then just putting my signature on it; it might work as an incredibly arch, arid and esoteric conceptual art statement, but that isn’t the kind of statement I’m interested in making. And it’s already being done, every day, to all of us. I could do it to you tomorrow — be right back, downloading the app now.
In preparing to talk about my work and my research so far at the Open Data Institute this November, as part of its ODI Fridays lunchtime lectures series, I’ve been thinking instead about ways in which we could introduce some empathy, openness and humanity into esoteric and abstract data-mining and data-siloing processes that are already well underway.
As in all of my work, I’ll be trying to make something that helps rather than harms, questions rather than answers. If the ghost of data about us is in some way an inextricable part of us, then, like any other aspect of us, that ghost also has inalienable rights. These should not be up for debate any more than any human being’s arm or leg, for example, should be annexed and monetised unilaterally by a government or company — even if that person clicked “I agree” at the end of a long document they didn’t read properly. — Alistair Gentry
(Picture credit: Pexels/Chris Gonzalez)