This opinion article is an excerpt from the book “Shortest Way Home” by U.S. presidential candidate Pete Buttigieg.
Pete Buttigieg, the 37-year-old US politician colloquially known as “Mayor Pete”, is running for president in the Democratic Party primary elections. If elected, he would become the youngest president in US history. Drawing on lessons learned as mayor of South Bend, Indiana (2012–present), he explains why data is not a replacement for human insight.
The following is an abbreviated excerpt from his recent book, Shortest Way Home.
The rules for a data-driven government
So how does a tech-oriented mayor make sure that the data is serving the administration, rather than becoming an end in itself? Put another way, how does a government official interested in data come to be viewed more like Goldsmith or O’Malley, and less like McNamara? Over time, I’ve learned a number of rules that have helped us to make sure the use of data makes sense, and does good.
First, know the difference between reporting an issue and resolving it. In some cases, the two go so closely together that you can lose track of the distinction. For example, when we installed ShotSpotter technology using microphones to acoustically pinpoint gunshots, we were enhancing our ability to deal with gun violence. An officer could be immediately dispatched to the scene of a shooting, be it an outdoor fight or a domestic violence case, whether someone called it in or not. And this, in turn, would help in the long run to deter gun violence. But in other cases, knowing more doesn’t help.
At a tech conference, I once saw a pitch from a start-up that would automatically detect patterns of opioid use by scanning for trace amounts in sewage. The technology is brilliant, and may do a great deal of good in some places. But in South Bend, our problem wasn’t knowing how prevalent opioid use was in this neighbourhood compared to that one; it was a lack of mental health and addiction resources to deal with the issue wherever we found it. Financing a project to tell us more about the problem could even come at the expense of treatment options, which are grossly underfunded in our county and state health systems. In cases where we have ample means to fix a problem, we need only to find it. The rest of the time, reporting an issue is necessary, but not sufficient, for resolving it.
The difference between graffiti and potholes
A second rule we learned quickly was to recognise that responsiveness and efficiency are not the same thing; in fact, they can sometimes pull against each other. Consider the example of snowplowing. The most responsive thing to do would be to ensure that anytime someone called about an impassable street, a plow crew was immediately dispatched to take care of that block. It would be an attractive thing to be able to do (think of the political credit)—but it’s also clearly not the most efficient; far better to use a zone system, covering the city as quickly as possible, starting with main roads and then moving to residential streets, with added input from a parametric model that takes temperature and precipitation rates into account. Any other approach would take longer, and ultimately mean lower quality of service, higher cost, or both.
Local officials often feel pressure to deal with a squeaky wheel right away, when stepping back and considering a big-picture solution would serve people better. Under the wrong balance of responsiveness and efficiency, data can actually make us worse at our job. This is one reason I eventually backed off from my enthusiasm for the idea of publicising a twenty-four-hour pothole guarantee.
It seemed at first like a great way to show how responsive the city was to road concerns—and doable, because in peak patching season we already get to most potholes within a day or two of them being called in. But after reviewing the concept with engineers, it became clear to me that if I instructed the staff to make sure every hole got taken care of as soon as we knew about it, I could actually reduce the efficiency of the operation. Crews on the West Side might have to drop what they were doing to go deal with a pothole on the North Side, then go chase another work order downtown, all coming to them in order of appearance. An expensive vehicle and work crew would zigzag through the city according to real-time data on which residents were first to call and complain, with little regard for whether it made more sense to have Harter Heights wait a couple days while we systematically took care of the Keller Park area for the season.
At other times, the reverse is true and responsiveness really is more important than efficiency, as in the case of graffiti. It might seem that the most efficient thing would be to treat graffiti like snow—take whatever resources we have for repainting, and have them work the city, street by street, systematically. But if a stop sign gets tagged with graffiti, leaving it there even for a couple days might motivate someone to tag something else nearby. Whether it’s a gang sign or a cartoon bunny, what shows up on Falcon Street may soon be copied on Walnut, and the longer it’s there, the more likely someone will seek to imitate or outdo it. So clearing it right away is the most important thing, and the team works to fix any reported graffiti almost immediately (except on a dedicated graffiti wall opposite the Emporium Restaurant, where artists are welcome to do whatever they like). The result is that people who might be motivated to deface public property find it’s not worth the effort, and now it is less likely to happen in the first place.
Big data makes a big difference
Ultimately, the rise of more data and technology presents tremendous opportunity for cities to be smarter and more efficient in their operations—and therefore become healthier, safer, better places to live. But after taking office, just as quickly as I learned the power of data, I also learned to be mindful of its limitations, and aware of the problems it will not solve. And I learned to maintain some level of respect for the role of intuition.
Good intuition, backed by years of experience, can make it possible to sense things—like whether a trash customer is telling the truth when he calls to say you missed a pickup at his house or whether he just forgot to take the trash out—often with remarkable accuracy and sometimes without even being able to explain how it is sensed. There is great power in human pattern recognition, which actually resembles big data analytics in its most important characteristic: the ability to know things without knowing exactly how we know them.
Often, discussions of performance management gloss over this crucial difference between data analysis in general, and “big data” used with artificial intelligence. Using data in general is nothing new; it is simply the application of factual knowledge to make decisions. As an approach to government, it came as naturally to Alexander of Macedonia as it did to Robert McNamara. For the purposes of using data, the only thing that changed with the introduction of computers is that we can gather and apply it more quickly and precisely. “Big data” is different. It has the potential to change government, along with the rest of our society and economy, in categorically different ways than the use of data in general. Not everyone may share my definition, but to me the difference is this: Using data means gathering information, understanding it, and applying it. Using big data means analysing information to find and apply patterns so complex that we may never grasp them.
Robots behind the steering wheel
Computers can now crunch sets of numbers so vast that the patterns that emerge from them are beyond the reach of the human mind—and yet the patterns can be used. Utilities like our waterworks are beginning to tap into computing capabilities that can accurately predict points of failure in water systems without our truly understanding how the prediction was made—only that it works.
Ironically, what makes this kind of predictive technology most interesting is that it so closely resembles human intuition. More often than we realise, humans rely heavily on knowing things that we can sense, but not explain. One of the reasons it was impossible, till recently, to program a car to drive itself is that many of the mental processes we use to drive a vehicle cannot easily be defined or described (and therefore cannot be programmed).
Like the parks maintenance supervisor or road foreman going through countless decisions in the back of his head, anyone driving a vehicle relies constantly on subconscious pattern recognition. I can’t explain to you how I know the moment when a snowy road has become too slick for normal braking, or whether I am a safe distance from the centreline, or whether I can beat that yellow light. I just know. If I wanted a machine to gain this capacity, I would have to do one of two things: master the precise basis of this knowledge so it could be programmed, or construct a machine capable of learning it the same way I did.
The latter is becoming possible, and this is the true fascination of artificial intelligence. For government, there are extraordinary implications to a program that could anticipate road failures five years in advance, or predict asthma attacks using linked data sets on hospitalisations, weather, and car emissions to sense patterns so exquisitely complex that we will never understand them.
The limits of reason
Now even explicitly social functions, like gauging how much anger a certain policy proposal will cause, can increasingly be achieved through social media analysis that might eventually outmatch even the finest-tuned human political antennae. The ultimate lie detector won’t be designed, like current ones, by programming known patterns in heart rate and perspiration. It will be designed by machine learning, scanning millions of recordings of people saying true or false things, and using this to make predictions based on combinations of indicators beyond our comprehension.
The algorithms advance every year. This is why Netflix has a good sense of what films you would like, perhaps even doing a better job of predicting your preferences than you do, if all you have to go by is a trailer and a description. But these capabilities are also still in their infancy. My Internet TV device still sometimes shows me commercials clearly intended for a middle-aged homemaker or a teenage video-game enthusiast.
And no matter how sophisticated the programs, they will never fully learn our sense of mercy—the rule not to be applied, the efficiency not to be captured. Capable of something resembling intuition but nothing quite like morality, the computers and their programs can only imperfectly replicate the human function we call judgment. Knowing when one valid claim must give way to another, or when a rule must be relaxed in order to do the right thing, is not programmable, if only because it is not completely rational. That’s why, even as reason has partly replaced divine intervention for explaining our world, it will not replace human leadership when it comes to managing it. A person aided by data can make smarter and fairer decisions, but only a person can sense when an unexplainable factor ought to come into play—when, for lack of a better expression, “something is up.” And that, as John Voorde might remind me, has been the job of elected officials all along. — Pete Buttigieg
This is an excerpt from “Shortest Way Home: One Mayor’s Challenge and a Model for America’s Future” by Pete Buttigieg, published 12 February 2019 by Norton. Copyright © 2019 by Pete Buttigieg. Used with permission of the publisher, Liveright Publishing Corporation, a division of W.W. Norton & Company, Inc. All rights reserved.
“Shortest Way Home: One Mayor’s Challenge and a Model for America’s Future”, 352 pages, published 12 February 2019 by Norton.
(Photo credit: Lorie Shaull/Flickr)