Big Data Is a ‘New Natural Resource,’ IBM Says

IBM researchers explain the need for big data in the public sector, calling it the “new natural resource.”

“Big Data” is quickly becoming part of the public sector’s lexicon. The catch-all phrase is shorthand for data collected through many different channels — sensors, social media feeds, photos, video and cellphone GPS signals — and it accounts for 2.5 quintillion bytes of data created each day. According to whatsabyte.com, that’s equal to 2.5 billion gigabytes daily.

Public-sector IT leaders and private-sector experts are talking about how they can better integrate this big data into work processes. On Wednesday, June 27, government executives and members of the IBM Watson supercomputer research team gathered for a conference in Washington, D.C., to address how Congress and the federal administration can start utilizing big data cohesively.

The company’s work with the Charleston, S.C., Police Department is one of the examples being highlighted at the conference. The police department is in the early stages of using big data with IBM’s help. The department for years utilized hot spot policing to reduce crime. Earlier this month, IBM announced that the Charleston Police Department is now using the company’s predictive analytics software to evaluate and predict future crime.

Prior to Wednesday’s conference on big data, the police department’s Deputy Chief Anthony Elder (a panelist at the event) said that searching through multiple databases used to be cumbersome and couldn’t be done in real time.

“It takes a lot of time and resources to be able to go through [data]. And you need it now,” Elder said. “So we need a way to be able to get into that big data storage — things that we can already get to. Because that’s what this is — it’s things we can already get to but takes a lot of staff hours to [do so].”

Elder said the police department’s solution allows for an expanded, federated search across multiple databases — information that is typically siloed. If, for example, the Charleston Police Department needed probation and parole data, the search would still return only information the department has already been granted access to by those external agencies. The main change would be getting to that information faster.
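
What Elder describes amounts to a federated query: the same search is fanned out to several siloed sources, limited to data the department can already reach, and the results are merged. The sketch below is illustrative only (the database names, file paths and "records" schema are invented, not details of IBM's product); it shows the general pattern in Python.

```python
import sqlite3
from concurrent.futures import ThreadPoolExecutor

# Hypothetical siloed sources the department is already authorized to query.
# The file names and the "records" table schema are invented for illustration.
SOURCES = {
    "local_records": "local_records.db",
    "probation_parole": "probation_parole.db",
}

def query_source(name, path, subject):
    """Run the same search against one source and tag each hit with its origin."""
    conn = sqlite3.connect(path)
    try:
        rows = conn.execute(
            "SELECT * FROM records WHERE subject = ?", (subject,)
        ).fetchall()
        return [(name, row) for row in rows]
    finally:
        conn.close()

def federated_search(subject):
    """Fan one query out to every source in parallel and merge the results."""
    with ThreadPoolExecutor() as pool:
        futures = [
            pool.submit(query_source, name, path, subject)
            for name, path in SOURCES.items()
        ]
        return [hit for f in futures for hit in f.result()]

if __name__ == "__main__":
    for source, record in federated_search("J. Doe"):
        print(source, record)
```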

Prior to the event, David McQueeney, vice president of software for IBM Research, said that big data – often called “unprocessed” or “unstructured” data — can be utilized in the federal government outside of technical agencies like the U.S. Department of Energy or the military.

For example, the National Oceanic and Atmospheric Administration’s National Weather Service can utilize big data for storm tracking purposes, McQueeney said. “That’s a case where we’re using computational science to generate predictions about the future,” he said.

Combining computational science with social media data can result in better storm tracking, McQueeney said. Predictions about future weather patterns can be generated from data that people share — whether they are aware a storm is coming, whether they’ve taken safety precautions, or whether they say they’ve evacuated.

“You get one form of very large-scale data that details the future of the physical world based on model and simulation, which would be the prediction of the storm track,” McQueeney said. “But there’s an equally enormous amount of data about the state of society and what businesses are doing. What are people doing? Are people reacting quickly enough?”
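
McQueeney’s example pairs two streams: a physics-based forecast of where a storm will go, and social signals about how people are responding. As a purely illustrative sketch (the posts, county names and keyword list below are made up, and this is not how the National Weather Service or IBM process such data), one way to combine them is to tally evacuation mentions per county and compare the counts against the counties in a predicted track.

```python
from collections import Counter

# Hypothetical social media posts, each tagged with a county; invented data.
posts = [
    {"county": "Charleston", "text": "We evacuated last night"},
    {"county": "Charleston", "text": "Boarding up windows before the storm"},
    {"county": "Beaufort", "text": "Not worried, staying put"},
]

EVACUATION_TERMS = ("evacuated", "evacuating", "leaving town")

def evacuation_counts(posts):
    """Tally posts per county that mention an evacuation-related term."""
    counts = Counter()
    for post in posts:
        if any(term in post["text"].lower() for term in EVACUATION_TERMS):
            counts[post["county"]] += 1
    return counts

# A forecast model would flag the counties in the predicted storm track;
# comparing the two shows where public response may be lagging the forecast.
predicted_track = {"Charleston", "Beaufort"}
counts = evacuation_counts(posts)
for county in sorted(predicted_track):
    print(county, counts.get(county, 0), "evacuation mentions")
```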

Fred Streitz, a computational physicist at the Lawrence Livermore National Laboratory in California, said that as scientific simulations such as these grow bigger and more complex, the big data they produce will grow as well.

“As you expand the scope of these simulations, they start creating enormous amounts of data in and among themselves,” Streitz said.

Government agencies have already begun expressing a need for access to supercomputers that can help them process these data sets more quickly.

Miriam Jones is a former chief copy editor of Government Technology, Governing, Public CIO and Emergency Management magazines.