Shouldn’t City Hall, the Statehouse, Washington become as savvy as the gadget in your pocket?
They’re getting there, slowly.
Today, sensors along Kansas City’s streetcar line track how many riders, pedestrians and cars fill Main Street and its sidewalks at a given moment. Public Wi-Fi along that River Market-to-Union Station stretch detects how many users have 816 and 913 phone numbers — presumably locals — versus the number who appear to be out-of-towners. It can give a rough idea of whether an evening crowd in the corridor tilts more to millennials or baby boomers. And the free Wi-Fi notices whether the crowd consists more of regulars or newbies.
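In rough terms, the area-code check is simple enough to sketch in a few lines of code. The snippet below is a minimal illustration, not the streetcar corridor's actual software; it assumes the Wi-Fi analytics system exports opted-in, anonymized records with a phone number field, and the field names are hypothetical.

```python
from collections import Counter

# Area codes on the Missouri and Kansas sides of the metro.
LOCAL_AREA_CODES = {"816", "913"}

def summarize_crowd(records):
    """Roughly split Wi-Fi users into locals and likely out-of-towners.

    `records` is assumed to be a list of dicts like {"phone": "8165550101"},
    the kind of opted-in, anonymized export a public Wi-Fi analytics
    dashboard might provide. The field name is hypothetical.
    """
    counts = Counter()
    for rec in records:
        area_code = rec.get("phone", "")[:3]
        counts["local" if area_code in LOCAL_AREA_CODES else "visitor"] += 1
    return counts

# Three apparent locals, one apparent out-of-towner.
print(summarize_crowd([
    {"phone": "8165550101"},
    {"phone": "9135550102"},
    {"phone": "8165550103"},
    {"phone": "3125550104"},
]))
```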
That information helps the city decide when to brighten streetlights for crowds or dim them to save energy. It can alert police when to send more patrols, help restaurants along the line fine-tune their marketing and will eventually signal dispatchers when to deploy more buses.
“If you’ve got a full streetcar headed south to Union Station, you’ll be able to tell the buses to wait a minute for those riders,” said Kansas City chief innovation officer Bob Bennett. “Some of this stuff is cool and some is smart. It’s smart when you’re actually doing analysis on that data and mitigating a crisis before a crisis even occurs.”
Yet these are baby steps.
Government’s ability to collect information, analyze it and use the resulting smarts to stretch tax dollars is only barely gaining traction.
This may be an iPhone age, but government data-crunchers typically work on flip-phone budgets. Many agencies struggle to corral information they already have, much less manage a new breed of real-time statistics. They must juggle privacy issues, retrofit generations of old-school record-keeping to a digital present and overcome suspicion of what looks to some like another geek fad.
At the heart of the promise, and challenge, is what’s become known as “big data” — mountains of information that reveal new insights when computer analyses detect previously unseen patterns or associations. It’s how Netflix recommends movies, how intelligence agencies hunt for terror networks and how public health officials are beginning to predict disease outbreaks.
“Outside of (information technology circles),” one local government chief information officer told researchers, “no one gets the term (big data). … I explicitly told the IT group not to use the term in conversations with our customers across local government.”
The hope
Yet the promise of data-smart efficiency in government beckons from untold directions.

A 2013 report, underwritten by a remote computer storage firm that stands to benefit from more data-centered government, drew on a survey of 150 IT officials in federal agencies to estimate that better use of large databases could trim spending by $500 billion a year.
The McKinsey Global Institute said governments around the world could save $3.7 trillion a year by putting big data to big use.
“In the public sector,” the report said, “making relevant data more readily accessible across otherwise separated departments can sharply reduce search and processing time.”
Think of a teenager in foster care. Because she’d be a ward of the state, it would be possible to track her social media accounts, her school attendance, her health records. Algorithms, although flawed like those who make them, might tip off her counselors that she’s at a particular risk of drug abuse or teen pregnancy.
Could remote sensors tracking whether elderly people are taking their medications, or listening for the thud of a fall, help them live on their own longer rather than going to a pricier assisted living facility?
Might a system that sells fishing permits be tied into resorts, campgrounds and weather reports to help a tourist town better capitalize on the dollars that come with those in search of lunkers?
“The private sector is way ahead on this front,” said Rick Howard, a research vice president for Gartner Inc., an information technology research firm. “Government’s in competition. It is being judged and benchmarked against service providers in the commercial sector. That gap is only going to become greater if they don’t get on board.”
Still, even government is getting in on the game.
On Kansas City’s east side, ShotSpotter technology deployed in 2012 listens for gunfire and triangulates its location. Of 285 arrests using the detectors nationally through June, 164 came in Kansas City. And because it’s often not the first shots that strike a victim but subsequent gunfire, sending officers at the first sound can get police there quickly enough to spare bloodshed.
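The math behind that triangulation is a form of multilateration: each sensor's arrival time constrains how far away the shot could have been, and the point most consistent with all the sensors wins. The sketch below is a simplified brute-force illustration of that idea, not ShotSpotter's proprietary method; the sensor positions and arrival times are made up.

```python
import itertools
import math

SPEED_OF_SOUND = 343.0  # meters per second, roughly, at street level

def locate_gunshot(sensors, grid_step=5.0, extent=500.0):
    """Estimate a gunfire location from sound arrival times.

    `sensors` is a list of (x, y, arrival_time) tuples, in meters and
    seconds. This brute-force grid search looks for the point whose
    implied emission times agree best across all sensors; it is a
    simplified stand-in, not ShotSpotter's actual algorithm.
    """
    best, best_spread = None, float("inf")
    steps = int(extent / grid_step)
    for i, j in itertools.product(range(-steps, steps + 1), repeat=2):
        x, y = i * grid_step, j * grid_step
        # If (x, y) were the source, each sensor implies an emission time.
        implied = [t - math.hypot(x - sx, y - sy) / SPEED_OF_SOUND
                   for sx, sy, t in sensors]
        spread = max(implied) - min(implied)  # consistent source -> small spread
        if spread < best_spread:
            best, best_spread = (x, y), spread
    return best

# Four sensors hear a shot fired near (120, -80); arrival times are illustrative.
sensors = [(0.0, 0.0, 0.4205), (300.0, 0.0, 0.5743),
           (0.0, 300.0, 1.1618), (300.0, 300.0, 1.2259)]
print(locate_gunshot(sensors))  # prints a point close to (120, -80)
```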
The Chicago Department of Public Health monitors Twitter for complaints that could be linked to food poisoning to better focus its restaurant inspections. The project has tracked more than 3,000 tweets and submitted some 1,700-plus reports of food poisoning, although the project notes: “All the tweets in the world can’t put a thermometer in a dairy case.”
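At its simplest, that kind of monitoring is a keyword filter over a stream of tweets, with flagged posts routed to an inspector for follow-up. The sketch below is only an illustration; the watch terms and the tweet format are assumptions, not the Chicago project's actual code.

```python
import re

# Illustrative watch terms; not the actual vocabulary Chicago's project uses.
SYMPTOM_TERMS = re.compile(
    r"\b(food poisoning|threw up|vomit\w*|stomach ache|diarrhea)\b", re.I)
CHICAGO_HINTS = re.compile(r"\b(chicago|chi-town|the loop|wicker park)\b", re.I)

def flag_tweets(tweets):
    """Return tweets that may merit a follow-up from a restaurant inspector.

    `tweets` is assumed to be a list of dicts with "user" and "text" keys,
    the rough shape a Twitter search client might hand back.
    """
    return [tw for tw in tweets
            if SYMPTOM_TERMS.search(tw.get("text", ""))
            and CHICAGO_HINTS.search(tw.get("text", ""))]

sample = [
    {"user": "a", "text": "Pretty sure I got food poisoning at a Chicago taqueria last night"},
    {"user": "b", "text": "Best deep dish in Chicago, no complaints"},
]
for hit in flag_tweets(sample):
    print(hit["user"], "->", hit["text"])
```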
Chicago has also begun wiring sensors to light poles — it aims to create a network of 500 listening posts over the coming 2 1/2 years — to monitor air quality, noise and traffic.
In Boston, city officials have begun sharing fire hydrant locations so volunteers can pledge to dig them out of the snow in winter months.
Researchers in Colorado have used data to map out, county by county, where poor families eligible for a handful of government assistance programs are getting that help and where they’re not.
The U.S. Food and Drug Administration is exploring virtual laboratories, where field researchers will be able to share their data remotely, gaining access to distant data banks and continually adding information to share with far-flung scientists.
Herding bits
Embedded in nearly every government effort to corral big data is a hope for greater efficiency, saving tax dollars by using them better.

Like many cities, Kansas City in 2007 began routing virtually all calls for service through a 311 line (borrowed from the old days when landline phone users dialed 411 for information). No longer would people need to know what city department took care of weedy lots or potholes. They started at one place that knew how to route their calls for help.
Now that service fields about 100,000 requests annually. More than half the city’s residents make contact that way every year. It’s also given officials a way to track both what folks are complaining about and how well City Hall responds.
Now those data identify areas with problems such as trash pickup. City officials can use the data as leverage to demand better performance from contractors. The numbers figure heavily in budget talks, showing where the greatest resident dissatisfaction with city services lies. That 311 information made tree trimming a higher priority, it made clear the city needed to respond more quickly to complaints about code violations and it improved the understanding of problem spots for water main breaks.
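The analysis behind that leverage can be as plain as grouping 311 records by neighborhood and service category and ranking the trouble spots. The sketch below illustrates the idea with made-up records and hypothetical field names; a real 311 export would look different.

```python
from collections import defaultdict
from statistics import median

def hotspots(requests, category):
    """Rank areas by volume and median days-to-close for one 311 category.

    `requests` is assumed to be a list of dicts with "category",
    "neighborhood" and "days_to_close" fields; a real 311 export's
    column names and granularity will differ.
    """
    by_area = defaultdict(list)
    for req in requests:
        if req["category"] == category:
            by_area[req["neighborhood"]].append(req["days_to_close"])
    ranked = sorted(by_area.items(), key=lambda kv: len(kv[1]), reverse=True)
    return [(area, len(days), median(days)) for area, days in ranked]

sample = [
    {"category": "trash", "neighborhood": "Eastside", "days_to_close": 9},
    {"category": "trash", "neighborhood": "Eastside", "days_to_close": 14},
    {"category": "trash", "neighborhood": "Waldo", "days_to_close": 3},
    {"category": "pothole", "neighborhood": "Waldo", "days_to_close": 5},
]
for area, count, med_days in hotspots(sample, "trash"):
    print(f"{area}: {count} complaints, median {med_days} days to close")
```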
“Before, we had no centralized tracking system,” said Kate Bender, the deputy performance officer in the city manager’s office. “Now we do, and we can see what needs attention.”
In Olathe, as in Kansas City, officials conduct resident surveys four times a year to judge whether people are satisfied with city services. The marks show Olathe faring well compared with other cities across the country, said Ed Foley, the performance analyst in the city’s resource management department.
Those data also tell city management where it needs to work. One survey focused on complaints about trash pickup; it showed that as long as someone responded within 24 hours, people were content.
“It takes a lot of resources to respond to something the same day we get a complaint,” Foley said. “If we know that we’ve got a full 24 hours to keep people happy, then we can be more strategic about sending out crews. It lets us use our resources the best way.”
Resident surveys and 311 logs show how City Hall can deliver service upgrades, but they represent the shallow end of the big data pool.
Kansas City applied for a grant that could have brought $40 million from the U.S. Department of Transportation and $10 million from Vulcan Inc.; the money ultimately went to Columbus, Ohio, in June. The city had hoped to build various “smart city” features — venturing into the deep end of the big data pool — along a Prospect Avenue MAX rapid bus line. The plan included electric car charging stations, a driverless airport shuttle and sensors that would count waiting commuters to tweak bus service.
Now that application has been slimmed down for a different $12 million grant from the Department of Transportation. It no longer proposes to pay for a driverless shuttle, but it hopes to install free Wi-Fi along Prospect and better collect data to improve transportation services.
Kansas City has a connectivity edge already. After Google Fiber chose the Kansas City market as the first test bed for its high-speed home internet service, competitors responded by building fiber optic networks of their own. For home consumers, that means far faster Web connections at lower prices than in most American cities.
It’s also created a stronger underlying system of cables and switching stations, what the industry calls backhaul, than most areas.
“That’s our edge,” said Bennett, the city’s innovation officer. “It’s an advantage we can build on.”
That internet advantage has sparked new interest in Kansas City as a tech town. It may be years before that’s put to work in the public sector, but it launched efforts such as the nonprofit technology-centered KC Digital Drive. Aaron Deacon, the group’s managing director, said efforts to put delinquent and abandoned property to use could benefit from a more wired world and the big data it produces.
His organization and others are looking at collecting and sharing increasingly detailed information about properties controlled by the Kansas City Land Bank, the city agency overseeing tax-foreclosed properties. If prospective buyers understood the lots better — whether they carry outstanding liens, values of neighboring property, the drainage — they might be more willing to invest.
“Collecting that data and making it public and open,” Deacon said, “lets you know more about making it usable.”
Breaking big data down into useful pieces helps government planners. But merely making public data more public offers its own possibilities.
If you’re thinking of buying a house, wouldn’t it be nice if just a few computer clicks could tell you if the neighbors consistently get cited for code violations? Shouldn’t taxpayers easily be able to find out how long it takes a rape kit to get processed? What if parents could track their child’s school bus in real time?
One of the biggest hopes for bigger data, said Daniel Castro, director of the Washington, D.C.-based Center for Data Innovation, is openness.
“It’s the opportunity to make what government is doing more available to the public,” he said.
As cities, states and the federal government invest in sensors, networks and the other things that create big data, Castro said, political debates will form about those choices.
“It’s all about transparency,” he said. “What you choose to select has a lot of impact. So does the data you choose not to collect.”
©2016 The Kansas City Star (Kansas City, Mo.) Distributed by Tribune Content Agency, LLC.