This piece is part of a monthly email series called The Explainer, which breaks down big issues in public service in easy-to-understand terms.
In an ideal world, all policy decisions would be made based on hard data. We would eliminate partisan wrangling by treating law-making like a scientific process — one devoted not to any political ideology, but to a simple question: what works?
But it’s not that simple: policy is made in messy, pressurised environments, where often it’s the person who shouts loudest that wins the argument.
Over the past nine years, however, a revolution has been spreading through governments around the world. Today, over 200 public bodies are using a combination of behavioural science, economics and psychology to craft better policy.
It’s called behavioural insights, and it’s a way for governments to steer human behaviour through nudges — simple policy tweaks that use the power of suggestion to influence people’s decisions — rather than laws or taxes.
The idea is to push people to make better choices — both for themselves, and society — while maintaining freedom of choice.
In this explainer, we’ll describe what nudging is, how you can use it in your own work, and why critics have called it a passing trend, or worse, a way to manipulate citizens en masse.
The team that started it all
The UK’s Behavioural Insights Team (BIT), established in 2010, was the world’s first nudge unit. It was inspired by Nobel Prize winner Richard Thaler and legal scholar Cass Sunstein’s 2008 book Nudge, which brought behavioural economics to a mainstream audience.
In Nudge, Thaler and Sunstein demonstrate how people make irrational choices that many traditional economic models ignore. With “choice architecture” — changing the context in which a decision is made — policymakers can push citizens toward better decision-making, they argued.
BIT faced an uphill battle. One newspaper editorial called it “the wackiest and most vogueish corner of government”; another decried its “manipulative” and “Orwellian” methods.
Under scrutiny, the BIT was established with a so-called sunset clause: if it did not achieve a ten-fold return on its running cost of £500,000 a year ($653,500), it would be shut down after two years. Not only did it stay open — the BIT saved over 20 times its running cost.
With small tweaks like changing the messaging on a government website, form or letter, it has encouraged an additional 100,000 people to sign up as organ donors, doubled the number of applicants to the British Army and accelerated tax receipts by over £200 million ($261 million).
Its method is simple: create not one nudge, but several varied and related messages. Then test them against each other with a randomised controlled trial (RCT), to see which is most effective. Finally, use the learnings to make a policy decision.
The BIT’s work supports Thaler and Sunstein’s hypothesis: people don’t always make decisions that are in their best interest. For years, governments have believed that the best way to get people to save for retirement is by giving them generous tax breaks. That’s been shown to have little effect on how people save.
The BIT’s approach was, instead, to automatically enrol UK workers into a workplace pension scheme, while still giving them the choice to opt out. By making the desired choice the default, millions more Britons are saving for retirement.
Nine years later, the BIT — now a limited company owned by its employees, the UK government and innovation charity Nesta — has saved government hundreds of millions of pounds.
By the numbers
200+
The number of government bodies around the world applying behavioural insights to policy
The percentage by which Copenhagen cut street litter when the city stencilled trails of green footprints leading to garbage cans
13% to 80%
The change in 401(k) enrolment after US employers made enrolment the default, letting workers opt out rather than opt in
The amount of public savings the BIT identified over 2014-2016 alone
The percentage by which the Victorian Department of Treasury and Finance was more likely to hire women after de-identifying CVs. Prior to removing identifiers like gender and name, the department was 33% more likely to hire men
The reduction in missed doctor’s appointments after patients in the UK were made aware of the number of people who make their appointments
The number of organ donors per million people in Spain, which has had the world’s highest organ donation rate since instituting an opt-out system in 1979 (the EU’s average is 19.6 per million; the US’s is 26.6 per million)
Nudge units may be a new phenomenon — but using behavioural science to influence human behaviour is not. As Sunstein put it: “In Genesis, Satan nudged, and Eve did, too.”
Singapore, for example, used behavioural science on its way to becoming one of the wealthiest and most innovative states in the world. In the ‘60s and ‘70s, the government introduced a number of public campaigns designed to exert social pressure on citizens, from “Keep Singapore Clean” to a “National Courtesy Campaign”.
In 1986, then-Prime Minister Lee Kuan Yew explained his methods. “I am often accused of interfering in the private lives of citizens. Yet, if I had not, we wouldn’t be here today,” he said.
“We would not have made economic progress if we had not intervened on very personal matters — who your neighbour is, how you live, the noise you make, how you spit or what language you use.”
Critics point out that Lee built Singapore’s success at the cost of fundamental civil liberties, including freedom of speech. But more than 30 years later, the city-state is still using behavioural science to steer citizens’ behaviour.
Residents’ electricity bills show how their consumption compares to neighbours’. Garbage bins are placed away from bus stops, so smokers have to stand away from everyone else. There are free outdoor gyms outside large estates, so residents are reminded to exercise.
Studies show that citizens accept behavioural interventions like these in Singapore, where trust in government is very high. But in countries where trust in public bodies is low, running experiments on citizens has been called paternalistic — and, in some cases, even mass manipulation.
In the US, public outcry over “nanny-state” policies has shut down several attempted uses of behavioural insights, like West Virginia’s effort to impose higher health care premiums on citizens who fail a fitness test.
At worst, this can be called infantilising. But behavioural science is also being put to more nefarious uses that take choice away from citizens, rather than nudge them towards the right one.
North Carolina, for example, has enacted laws limiting the types of IDs that can be used to register to vote, and banned out-of-precinct voting. Whether or not it was the explicit aim, these measures have deterred black Americans from voting.
Another chilling example is China’s nation-spanning experiment in social credit, through which citizens are carefully watched and ranked for trustworthiness.
If you fail to pay your taxes, take up an extra seat on the train or get caught drunk driving, points will be taken away. Citizens are awarded a grade: Grade-A citizens may get first priority for jobs, skip hospital lines and get discounts on energy bills. Grade-D citizens, meanwhile, can be denied public services, banned from travel or even blocked from dating websites. Last year, China blocked people with low social credit scores from buying plane and train tickets 23 million times.
The idea is that when they’re watched, citizens will behave better. Critics have called China’s social credit system a glimpse into a dystopian future; a place where all citizens are watched and rated by government, which doles out rewards and punishments accordingly.
These are the extreme examples — the ones that critics point to as manipulative, even “mind control”.
But the reality is that government is trying to get us to do things all the time: pay our taxes; separate our recycling; get off unemployment benefits. Behavioural insights simply uses data, research and evidence to make those attempts more effective, and citizens more responsive.
The best way practitioners can allay concerns about their work is simple: be transparent. Today’s governments — particularly those which citizens already mistrust — can’t afford to be caught conducting secret experiments on the public, the way tech giants like Facebook have.
Behavioural scientists have to publish and publicise the results of their trials, good or bad, and think about the long-term effects of behavioural interventions.
How do you design a behavioural intervention?
1. Decide what outcome you want to achieve, and how you will measure it.
2. Understand the context: how are users responding to the current system?
3. Build an intervention and put it into practice.
4. Test it — ideally using a randomised controlled trial (RCT) — then measure and learn.
5. Adapt the policy based on your insights.
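The testing step can be sketched in code. Below is a minimal, hypothetical example — not anything the BIT has published — assuming a two-arm trial of a control letter against a social-norm letter, analysed with a standard two-proportion z-test; the sample sizes and response counts are invented for illustration.

```python
import math

def two_proportion_ztest(success_a, n_a, success_b, n_b):
    """Two-sided z-test for a difference between two proportions.

    Returns (uplift, z statistic, two-sided p-value), using the
    pooled proportion under the null hypothesis of no difference.
    """
    p_a = success_a / n_a
    p_b = success_b / n_b
    pooled = (success_a + success_b) / (n_a + n_b)
    # Standard error of the difference under the null
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Normal-approximation p-value via the error function
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return p_b - p_a, z, p_value

# Hypothetical trial: 1,000 people per arm; the social-norm letter
# lifts on-time tax payment from 68% to 74%
uplift, z, p = two_proportion_ztest(680, 1000, 740, 1000)
print(f"uplift = {uplift:.1%}, z = {z:.2f}, p = {p:.4f}")
```

A real trial would also fix the sample size in advance with a power calculation, so that a plausible effect size is detectable before the nudge is rolled out.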
What are the tried-and-tested methods?
- Use the power of defaults: people tend to stick with pre-set options because they’re easier.
- Remove any hassle: if it takes any effort — mailing in a letter, or visiting a government office — people will be put off.
- Make the message clear, or break complex information down into a list of simpler actions.
Which nudges can I use to change citizens’ behaviour?
- Use rewards, especially financial incentives.
- Employ social pressure by showing that most of a person’s peers make the desired choice, like paying taxes on time.
- Encourage people to commit to performing the desired action, such as signing a letter promising to vote.
Behavioural insights: innovative, or mundane?
The other major criticism of behavioural insights is that it can only provide quick fixes to mundane, micro-level problems. Telling people how much electricity their neighbour uses may slightly reduce their consumption — but it’s not going to have the same sweeping impact on climate change that, say, a nationwide carbon tax would.
Can behavioural insights conjure innovative solutions to the intractable problems faced by today’s governments? Can it help us end poverty, curb loneliness or resolve conflict?
Critics say no: these are problems that require innovative, forward-thinking solutions. They can’t be solved by studying past human behaviour.
But the field is driving ever-bigger changes. In the US, automatic enrolment of children in a free school meals program has helped feed millions of kids from low-income families. Tokyo’s installation of blue LED lights at train stations has cut suicide attempts by 84% over 10 years.
The BIT is also extending its work to conflict zones. It has helped with Colombia’s peace negotiations and worked in refugee camps to promote early childhood development and reduce corporal punishment. In the Central African Republic, Nigeria and Myanmar, it will test new conflict resolution techniques.
And even if most applications of behavioural insights aren’t pushing forward big innovations, they’re still valuable. The BIT offers a pathway for young nudge units around the world: start with small, low-cost tweaks and iterative experiments — these alone can save governments millions — then move on to bigger things.
One of the most important things this small unit — which started with just seven people — can teach public servants is that it’s okay to say “We don’t know”.
“It’s very hard for government to admit that something it’s doing might not work, or that it’s failed… People are often running blind, overconfident and dispensing massive sums of money,” David Halpern, chief executive of the BIT, told Apolitical.
“The dirty secret of almost all government is that we don’t know whether we’re doing the right thing.”
When policymakers can admit that they aren’t sure which policy is best, that opens up the freedom to test — which is perhaps behavioural insights’ biggest contribution to government. It’s introducing public servants all over the world to evidence-based approaches, rigorous testing and empiricism.
These are concepts that often seem far away, as governments like the US and UK face ever-growing ideological divides. But more and more, behavioural science — and its spirit of continually testing and improving upon policy — is seen as something public servants should understand and apply. The fact that this iterative, data-driven approach is taking hold should give us hope for the future of government. —Jennifer Guay