Sure, state and local governments have used analytics to relieve traffic congestion, monitor public utilities, evaluate and predict crime, follow education trends, and keep tabs on public resources. And the scale of analytics keeps growing, if private-sector Goliaths like Amazon are any indication. Amazon’s patented algorithms, for example, purport to predict shopping habits before orders are placed. Other private companies are using analytics to draw statistical treasure maps of market trends.
But for all of its potential, big data’s impact in government remains relatively small. Behind closed doors, government insiders are hopeful about the possibilities but skeptical about the first steps. In a study of 150 federal IT professionals, the government IT networking group MeriTalk estimated that federal agencies could save 14 percent of their budgets, or nearly $500 billion, with analytics programs. However, the study also found that only 31 percent of those who had launched an analytics project believed their data strategies would deliver.
A recent IBM report, Realizing the Promise of Big Data: Implementing Big Data Projects, found similar skepticism. The study drew on interviews with 28 federal, state and city CIOs, the majority of whom confessed to fighting a perception of big data as a passing fad. Another common admission was a reluctance to even mention “big data,” for fear of blowback from staff who don’t always understand its potential.
And yet, the technology research firm Gartner says analytics are on their way, reporting that by 2015, the demand for data and analytics jobs will reach 4.4 million globally, but only one-third will be filled.
Despite the conflicting signals, governments are gradually adopting big data tools and strategies, led by pioneering jurisdictions that are piecing together the standards, policy frameworks and leadership structures fundamental to effective analytics use. They give an enticing glimpse of the technology’s potential and a sense of the challenges that stand in the way.
It’s difficult to put a ruler to analytics and its predictions. Perhaps that’s why Anthony Townsend, a senior research scientist at New York University’s Rudin Center for Transportation Policy and Management, finds the term so nebulous. As he points out, “big data analytics” is just another way to describe taking heaps of information and funneling out a conclusion, a process that’s been happening for decades and arguably since the first U.S. Census in 1790.
Townsend made a name for himself on the topic through his research, an effort to pin down the moving definition of a smart city and plumb the depths of technology’s impact on urban life. As an economic and technology adviser, his work has carried him from San Francisco to New York and across the globe. Today, however, his name is most visible on the jacket of his new book, Smart Cities, which examines the underpinnings of data-driven cities. On the issue of government analytics’ stature, Townsend is clear.
“I think the biggest misconception is that it’s widely used, because it isn’t,” he said. “I think most government agencies still operate on rote bureaucratic procedures that don’t use a lot of data mining or analytics to prioritize how government employees do their work and when they do what they do.”
Government’s grip on analytics, Townsend said, is nowhere near as tight as that of private industry, which has taken the tool and yoked it to an ever-greater share of its decision-making. The front-runners in government are more of a hodgepodge of cities and one-off projects. Adoption differs from town to town and city to city, and in some places there’s nothing at all. Costs are prohibitive, analytics skill sets are hard to come by, and most leaders, though open-minded, are still sitting on the fence.
“The places to look are the places that are starting to comprehensively look at how all these [technological] tools should be integrated with the rest of government,” Townsend said.
In the U.S., these include cities like Chicago, New York, Philadelphia and San Francisco. Globally, he said, examples include London, Singapore and Dublin. These cities have bolstered analytics by tying its development to an overall technology architecture: master plans that lay out tangible problems for it to solve, people to champion it and firm leadership to clear the way.
Slow uptake of analytics in government stems partly from the fact that putting big data to work demands a culture shift for public agencies.
“In government you get a lot of barriers where people feel that they’ll be undermined or lose power if they share their data,” Townsend said. “And that’s something that can only be overcome through very strong leadership and demonstrated success. I think we’re still getting up that curve.”
The desire to unleash the power of analytics and other tools has spawned a series of new roles, such as chief innovation officer and chief data officer, that work for or alongside traditional CIOs. In addition, leading jurisdictions are experimenting with new procurement models and policy frameworks designed to backstop these new roles.
“They tend to be most effective when they’re not just innovation drivers but setting technology policy for the entire government,” Townsend said, pointing to New York, San Francisco and other cities that have created open data policies and made data standards part of the city’s vision.
“These people are all agents of change to some degree or another in organizations that are explicitly designed to not change and to resist change,” Townsend said. “And it’s always a thing that frustrates them and makes it difficult for them to be effective, but thank god they keep persevering.”
In Chicago, CIO Brenna Berman leads the effort to create a predictive analytics platform that will process the more than 7 million rows of data the city collects each day. Chicago’s SmartData project will aggregate and analyze that data, identify trends and offer problem-solving predictions. City data is already fed into WindyGrid, Chicago’s live analytics dashboard, which is accessible across all departments.
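To make that concrete, here is a minimal, purely illustrative sketch of the kind of aggregate-and-flag-trends step a platform like SmartData performs on daily city data. The file name, column names and threshold below are hypothetical stand-ins for illustration, not details of Chicago’s actual system.

```python
# Purely illustrative: count service requests per type per day, then flag any
# type whose most recent daily count is well above its recent average.
# The CSV layout, column names and threshold are hypothetical, not drawn from
# Chicago's actual SmartData platform.
import csv
from collections import defaultdict
from statistics import mean

def flag_rising_request_types(csv_path, lookback_days=7, growth_threshold=1.5):
    daily_counts = defaultdict(lambda: defaultdict(int))  # type -> date -> count
    with open(csv_path, newline="") as f:
        for row in csv.DictReader(f):
            daily_counts[row["request_type"]][row["date"]] += 1

    flagged = []
    for request_type, by_date in daily_counts.items():
        dates = sorted(by_date)[-lookback_days:]   # most recent days on file
        if len(dates) < 2:
            continue
        history = [by_date[d] for d in dates[:-1]]
        latest = by_date[dates[-1]]
        if latest > growth_threshold * mean(history):
            flagged.append((request_type, latest, mean(history)))
    return flagged

if __name__ == "__main__":
    for request_type, latest, avg in flag_rising_request_types("service_requests.csv"):
        print(f"{request_type}: {latest} today vs. {avg:.1f}/day recently")
```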
While the project’s scope alone is enough to draw stares, what’s really interesting is SmartData’s open source build, a design free for any city to install and one that may become a building block for local governments in the future.
“That’s really why we feel this is an important project, and frankly, why Bloomberg Philanthropies was so interested in partnering with us on this,” Berman said. The philanthropy kicked in $1 million — a grant from its Mayors Challenge city improvement competition — to help build the SmartData platform, in hopes of toppling the first domino in a city-by-city chain reaction.
“While several municipalities are working to harness the power of big data, Chicago will be the first city to do so open source, making it possible for this great idea to spread and empower other cities,” Bloomberg’s Jim Anderson said in January.
Now that the data faucet is flowing, Berman and her team of data engineers, project analysts and project managers are wading through department workflows. Alongside department staff, the team is hunting for key performance areas where predictive analytics offers the greatest value.
Pilot analytics programs are to follow in the next year or two, ideally one to three for each city department. Afterward, Berman said more pilots will be added until Chicago’s data is all interwoven and easily interpreted through WindyGrid. Ultimately the city wants to use SmartData to respond to real-world problems by integrating analytics into its daily workflows.
“I think this city has the ability of putting predictive analytics into the hands of every department in the city and unlocking the value of predictive analytics regardless of the number of data engineers we have,” Berman said, referring to the platform’s user-friendly dashboard.
Analytics platforms, even in their infancy, have been aimed — and perhaps lovingly cajoled — to be systems that send, receive and interpret information from all parts of an organization. It’s a system that can be harnessed to monitor and coordinate a jurisdiction’s various functions. It’s a system that responds to inputs from a jurisdiction’s interactions with its environment. And if this observation rings dry and textbookish, in full disclosure, that’s likely because it is. Only it doesn’t stem from a book about technology. In fact, it’s biology. Replace the word “jurisdiction” with “human body” and this is the definition of the human central nervous system.
It’s one possible vision for the future, though admittedly a distant one. The central nervous system, however, is an apt metaphor for “smart city” ambitions. At the forefront of those ambitions, IBM has been working to pioneer municipal analytics for years. The company has predicted traffic jam locations in Singapore 30 minutes ahead of time, reduced electricity and water usage in Dubuque, Iowa, and forecast weather patterns for agriculture in Borneo.
Katharine Frase, IBM’s Global Public Sector CTO, has been coordinating the company’s city analytics efforts across the globe. When asked about the future for analytics, Frase sees tomorrow in terms of today.
“It’s important to start from where the city is,” she said. “More data would always be better, but we should start with the data the city already has.”
Often, city leaders are convinced that implementing an analytics program is too costly and requires massive changes, Frase said. That is not always the case. “Start small, don’t think of this as you’ve got to rip everything up and start all over, but start something,” she said.
To date, IBM’s most requested uses for analytics in government are in highly visible areas. Traffic management is most popular, followed by water management, she said. After those, the top contenders are emergency response, energy consumption in buildings and public safety.
The ultimate success of analytics may be proportionate to how readily governments embrace the new data-driven mindset. Beth Blauer, director of Socrata’s GovStat performance monitoring platform, said much of the movement in analytics will depend on factors that are more human than technical.
“Most of our users are just now starting this work, they’re just now turning the corner from understanding it’s not just about publishing data but internalizing the use of data and transforming their decision-making from gut and instinct into real [data-driven] context.”
Blauer, who previously spearheaded Maryland Governor Martin O'Malley's StateStat, a state performance management tool, said the emerging use of analytics is compelling states and cities to define what data is worth measuring, and more importantly, what data is worth acting upon.
“What we’re realizing now is that this is about creating best practices across the board,” Blauer said. “We’re going to see a lot more dialog at every level of governance on what those standards are.”
The hardest part of implementing any of these programs or making a change, she said, is having strong leadership. Leaders must consistently affirm that data-based workflows are how tasks will get done and how decisions will be made. It’s common ground Blauer shares with Townsend, Berman and Frase, who all pointed to leadership as the cornerstone of future progress in analytics.
“There is this whole internal work that needs to be done,” Blauer said. “It’s more of a culture change, it’s not even about technology.”