The White House announced this week a national campaign to boost the research, development and deployment of systems supporting smart cities and the Internet of Things, the conceptual connection of everyday objects to one vast network. The effort is immense: The National Science Foundation and the National Institute of Standards and Technology are kicking in $45 million; a group of five federal departments and agencies is adding another $115 million; 20 cities are partnering with local higher education institutions; and a cornucopia of organizations are issuing challenges and conducting their own smaller projects to support the larger campaign.
A running theme through the campaign is connection: of datasets, of people and of physical objects. The MetroLab Network, which brings together cities and universities with $1 million in funding from the John D. and Catherine T. MacArthur Foundation, will act as a national structure through which city officials can share the problems they have in common and the solutions they hope will address them.
That’s how Michael Mattmiller, Seattle’s chief technology officer, sees the project. He hopes to see how other cities are handling the problems the Pacific Northwest faces.
One problem in particular Mattmiller wants to tackle in partnership with the University of Washington is rain. The city’s electric utility serves 400,000 customers, and wet weather puts a strain on the grid.
“When wet weather occurs, [the utility] can very quickly be overwhelmed or suffer outages,” Mattmiller said.
The university and the city want to put a system in place that can forecast weather for very specific areas. Other projects the partners might pursue as part of the MetroLab Network include improving accessibility for people with limited mobility and creating pathways out of homelessness.
But it’s not just about solving the problems the city already knows about — it’s also about finding out what other jurisdictions are focusing on.
“There’s going to be a natural tendency to be a little insular when you’re working on smart cities,” said Bill Howe, associate director of the University of Washington’s eScience Institute. “You’re going to be focusing on what … you’re interested in.”
Once cities start sharing information, he said, they can shine a light on their performance. The pooling of crime data, which is collected in different ways by local police forces, has allowed for better comparisons of crime rates between cities.
Then there’s a third connection the MetroLab Network allows: intra-city. When working on projects that could benefit cities, Howe said, it can be tough to know whom to turn to.
“It’s really cross-disciplinary, so it’s really hard to know who to contact between different businesses and the city,” he said.
Case in point: the city of Houston. While the city has an “open data” policy, it lacks the capacity to really dig through all that information, said William Fulton, director of Rice University’s Kinder Institute for Urban Research.
“They have a list of 2,000 datasets,” Fulton said. “They don’t have the capability to analyze all that data.”
The information is there, he said, but there hasn’t been much analysis of the data or cross-comparison of datasets to reach big-picture conclusions. Establishing a formal partnership with the city through the MetroLab Network could change that.
The university is starting out with three projects using data that already exist. One will draw on housing data to examine patterns of gentrification. Another will identify the prime locations for Houston’s bike-share program. The last will use information about streetlights to figure out how placement and outages affect people.
The streetlight project could be important for cities that have a backlog of infrastructure projects they need to complete, Fulton said.
“Some city says their infrastructure backlog is a billion dollars, that’s not unusual,” he said. “But we don’t have a billion dollars, so how do we prioritize?”
If the university can show, using data on traffic patterns, crime and neighborhood income levels, which streetlights are the most important, it will help the city identify which lights to fix first. The idea has had support from mayors in multiple cities, he said, but hasn’t come to fruition.
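To make the prioritization idea concrete, here is a minimal sketch of one way such a ranking could be computed, assuming each streetlight already carries normalized traffic, crime and income indicators. The field names and weights are illustrative assumptions, not the Kinder Institute’s actual methodology.

```python
def priority_score(light, w_traffic=0.5, w_crime=0.3, w_income=0.2):
    """Score a streetlight by normalized traffic volume, nearby crime rate,
    and an inverse income index (lower-income areas rank higher)."""
    return (w_traffic * light["traffic_norm"]
            + w_crime * light["crime_norm"]
            + w_income * (1.0 - light["income_norm"]))

# Three hypothetical lights with indicators normalized to the range [0, 1].
lights = [
    {"id": "A", "traffic_norm": 0.9, "crime_norm": 0.2, "income_norm": 0.7},
    {"id": "B", "traffic_norm": 0.4, "crime_norm": 0.8, "income_norm": 0.3},
    {"id": "C", "traffic_norm": 0.6, "crime_norm": 0.5, "income_norm": 0.5},
]

# Fix the highest-scoring lights first.
for light in sorted(lights, key=priority_score, reverse=True):
    print(light["id"], round(priority_score(light), 2))
```

In practice, the weights would reflect the city’s own priorities and the indicators would come from the kinds of datasets Fulton describes.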
But the idea is common sense, he said. And soon, he thinks it will be commonplace.
“I think this is going to get to the point where this is not some mayor’s baby, this is going to get to the point where this is just the way you do business,” Fulton said. “It just makes sense.”
The White House’s initiative adds another angle to the connection of datasets. Beyond linking existing datasets and finding new ways to analyze the information cities already hold, some participants are building new sources of insight that could help solve problems. The National Science Foundation (NSF) is awarding a $3.1 million grant to the University of Chicago and Argonne National Laboratory to expand the nascent Array of Things project. The project, which began with a $200,000 budget, aims to put sensors on traffic lights around Chicago.
Last year, the project’s coordinators expected to put up 30 of the sensors. With the NSF grant, the Array of Things will likely consist of 500 nodes by the end of 2017, said Pete Beckman, director of Argonne’s Exascale Technology and Computing Institute. And that, he added, should make it the largest such sensor array dedicated to experimentation.
“The Array of Things is actually quite unique in that it’s the only project that I know of where what goes in the node is the experiment,” he said. “So we have an initial set of sensors but the community can suggest or … even design new things that can go into the nodes. So it makes the whole city into an experimental platform.”
So neither Beckman nor anybody else knows the full extent of what the array could do: if somebody can write a program for the sensors, it can be deployed.
The data will also be open to the public, so anyone can use it as they see fit. The information the nodes will collect includes temperature, barometric pressure, light, vibration, atmospheric content, ambient sound intensity, and pedestrian and vehicle traffic. Possible applications for the array include finding the least-polluted walking routes, identifying alternate routes to avoid congestion and predicting weather-related problems.
“You might be able to determine how much standing water is on the road, which causes flooding,” Beckman said. “So then looking down at the street, it might be possible to write a computer code saying, ‘Well right now we have a blocked storm drain, and the water is two inches deep.’”
The system is predicated on the idea of “edge computing,” which in the context of the array means giving individual nodes the ability to analyze data without talking to a central computer.
“There’s no way we can process all of this sensor data in the cloud; we need some of it to be processed at the street level,” Beckman said. “We can’t process it in the cloud not only for privacy reasons but also for bandwidth reasons.”
For instance, individual sensors will be able to take pictures of bicycles and cars on the road to determine how congested an area is. If those photos were all stored in a central database, it would not only require a huge amount of storage, it would also raise privacy concerns for the people on those bikes and in those cars. So instead, each node will use its photos to recognize a car, count it and report that count to the database without saving the images.
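In code, the pattern Beckman describes looks roughly like the following minimal sketch. It is an illustration of the edge-computing idea, not the Array of Things’ actual software; the detection function and the data fields are hypothetical stand-ins for an on-node vision model.

```python
def detect_vehicles(frame) -> int:
    """Stand-in for an on-node vision model; a real node would run an
    object detector here and return the number of vehicles in the frame."""
    return frame.get("vehicles_visible", 0)  # hypothetical field, for illustration only

def count_vehicles_at_edge(camera_frames, send_report):
    """Tally vehicles locally and transmit only the aggregate count;
    the images themselves are never stored or sent to a central database."""
    total = 0
    for frame in camera_frames:
        total += detect_vehicles(frame)    # analyzed on the node, frame not saved
    send_report({"vehicle_count": total})  # only this summary leaves the node

# Example with simulated frames standing in for camera images.
frames = [{"vehicles_visible": 2}, {"vehicles_visible": 0}, {"vehicles_visible": 3}]
count_vehicles_at_edge(frames, print)  # prints {'vehicle_count': 5}
```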
It represents a step forward for the Internet of Things, he said, and the experiment in Chicago could very well open up doors for other cities to establish similar arrays. He said representatives from New York City, Seattle, Portland, Mexico City, Glasgow, Bristol and Amsterdam have all visited Argonne to learn about the project.
“These cities are all interested in trying stuff out and want to get their hands on a node as soon as possible,” Beckman said.
The White House initiative is bringing New York City, Tampa and Wyoming into the effort to build up an Internet of Things as well. As part of the campaign, the U.S. Department of Transportation is spending $42 million to implement pilot projects in those three jurisdictions that will establish data-sharing connections between vehicles and infrastructure. In Tampa, that will mean sending information about rush-hour traffic to pedestrians’ smartphones. In Wyoming, the state will monitor trucking corridors. And in New York, the city will establish links between 10,000 municipal vehicles, as well as traffic signals and roadside units.
According to research from the National Highway Traffic Safety Administration, vehicle-to-vehicle technology has the potential to intervene in the situations that cause about 80 percent of traffic accidents. When vehicles “know” where other vehicles are, they can tell a driver when it’s safe to make a left turn or enter an intersection. And connecting vehicles to traffic signals could improve stoplight cycles and help ease congestion.
The Department of Transportation is already planning to require such technology in all new vehicles. The department issued an advance notice of proposed rulemaking last year, and promised to submit a proposal to the Office of Management and Budget by the end of 2015 for analysis.
The New York, Tampa and Wyoming projects will all help test out the technology, according to a Monday press release from the department.
Throughout all the projects, from Tampa to Seattle, a common response to the smart cities initiative is excitement. Those participating see it as a means of accelerating investment and interest in building smart cities and the Internet of Things.
“We’re going to yield some really great results,” Mattmiller said. “I look forward to getting started.”