
Predictive Analytics Optimize Chicago’s Food Inspection Process

Using an analytics-based method, Chicago discovered critical violations earlier than if it had used the traditional inspection procedure.

This story was originally published by Data-Smart City Solutions.

Consider this: Chicago, a city with nearly 3 million people and more than 15,000 food establishments, has fewer than three dozen inspectors responsible for checking every one of those establishments each year.

On average, 15 percent of those establishments earn a critical violation when inspected.

A critical violation, which generally relates to food temperature control, can drastically increase the odds that a restaurant will cause or spread a foodborne illness. Because of the obvious harm this poses to the public, efficiently and effectively targeting food establishments with critical violations is a top public health priority.

Chicago’s challenging task of quickly locating and addressing these violations is a prime candidate for optimization with advanced analytics. It’s also an opportunity that Chicago’s analytics team has been sure to seize as the city continues to pioneer the use of data.

Chicago committed to building the first-ever municipal open-source predictive analytics platform when it was selected as one of five cities for Bloomberg Philanthropies’ inaugural Mayors Challenge, an ideas competition that encourages cities to generate innovative solutions to major challenges and improve city life. The city’s goal was to aggregate and analyze information to help leaders make smarter, faster decisions and prevent problems before they develop.

The city’s recently completed pilot program to optimize its food inspections process – conducted by the Chicago Department of Innovation and Technology (DoIT) and the Department of Public Health (CDPH), with research partners Civic Consulting Alliance and Allstate Insurance – is a milestone that has yielded striking results. Using an analytics-based procedure, Chicago was able to discover critical violations, on average, seven days earlier than it would have using the traditional inspection procedure.

The results have implications not only for Chicago, but for any city that wishes to optimize its inspection processes using advanced analytics. Moreover, Chicago’s collaborative and open method for launching such an initiative offers lessons for other places that wish to start analytics programs of their own.

Tom Schenk, Chicago’s chief data officer, has had food inspections on his mind for a long time. Nearly two years ago, Schenk held discussions with CDPH about how the two departments could collaborate on the city’s growing analytics program. In pursuit of its goal to build the first municipal open-source predictive analytics platform with the $1 million award from Bloomberg Philanthropies’ Mayors Challenge, the city had begun constructing the SmartData Platform. To test its approach to operationalizing predictive analytics, the city had also already launched its first analytics pilot program, which used advanced analytics to enhance the Department of Streets and Sanitation’s rodent-baiting efforts.

Schenk and CDPH reached out to Civic Consulting Alliance (CCA), a local organization that pairs corporations with city of Chicago departments for meaningful pro bono projects. Through CCA, Chicago met with Allstate, the Chicago-area insurance giant with a history of community involvement. Allstate has worked with Chicago before, notably as a leading member of Get In Chicago, a cross-sector initiative to improve neighborhood safety across the city.

With CCA’s help, the city teamed up specifically with Allstate’s data science team – a new kind of relationship for both parties. Allstate operates Project Bluelight, which commits up to 10 percent of its team members to pro bono projects. Often these projects are volunteer opportunities, such as the Adopt-a-Highway program or service in soup kitchens. This project, however, offered Allstate’s data scientists a chance to apply their skills to a research effort with a new level of impact.

With a coalition in place, Schenk and the team began by interviewing CDPH inspectors to better understand the logistics of their jobs and how they interact with data. Chicago identified food inspection reports, 311 service data and weather data as the top candidates to explore for predictors of food inspection outcomes. The team also used other information on the city’s open data portal, such as community and crime data, to bolster the model.

The importance of Chicago’s open data program to this project cannot be overstated. With multiple parties involved, the Chicago coalition needed a centralized, comprehensive and universally accessible source of data. The open data portal fit this bill, giving partners an easy way to exchange research and analysis while working on the project. Without it, Chicago’s key data sets would have been scattered across disparate systems, making that exchange extremely time-consuming and difficult.

In processing and analyzing the data, Chicago found several key predicting variables that, when present, indicated a considerably higher likelihood that a restaurant would earn a critical violation. These predicting variables include the following:

  • Prior history of critical violations

  • Possession of a tobacco and/or incidental alcohol consumption license 

  • Length of time establishment has been operating

  • Length of time since last inspection

  • Location of establishment

  • Nearby garbage and sanitation complaints

  • Nearby burglaries

  • Three-day average high temperature

These predictors were then factored together into a model, which was tested against the traditional food inspection procedure via a double-blind, post-diction analysis. In other words, after collecting a set of data, Chicago ran a simulation that used that past data to predict what the outcome would have been under data-optimized conditions.
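To make the setup more concrete, here is a minimal sketch of how such a model might be assembled in Python. It assumes a pandas DataFrame of historical inspections with one column per predictor above; the column names, the fit_risk_model helper and the choice of a simple logistic regression are illustrative assumptions, not a reproduction of Chicago’s open-source model.

```python
# Hypothetical sketch of a critical-violation risk model.
# Column names, the helper function and the classifier choice are
# illustrative assumptions, not Chicago's actual implementation.
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

FEATURES = [
    "prior_critical_violations",       # prior history of critical violations
    "has_tobacco_or_alcohol_license",  # tobacco and/or incidental alcohol license
    "days_since_opening",              # length of time the establishment has operated
    "days_since_last_inspection",      # length of time since the last inspection
    "location_risk",                   # location of establishment (encoded numerically)
    "nearby_sanitation_complaints",    # nearby garbage and sanitation complaints (311)
    "nearby_burglaries",               # nearby burglaries
    "avg_high_temp_3day",              # three-day average high temperature
]

def fit_risk_model(inspections: pd.DataFrame) -> LogisticRegression:
    """Fit a classifier that predicts whether an inspection finds a critical violation."""
    X = inspections[FEATURES]
    y = inspections["had_critical_violation"]
    # Hold out the most recent inspections (no shuffling) so the model is
    # evaluated only on data it could not have seen -- the "post-diction" idea.
    X_train, X_test, y_train, y_test = train_test_split(
        X, y, test_size=0.2, shuffle=False
    )
    model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
    print("held-out accuracy:", round(model.score(X_test, y_test), 3))
    return model
```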

What does that process mean, then, when put into practice in a government setting?   

In September and October of 2014, CDPH performed food inspections using its traditional method, and then handed detailed inspection information over to Chicago’s analytics team. During that trial period, CDPH’s inspectors visited 1,637 food establishments in total. Of these, CDPH found 258 establishments – approximately 15 percent of that total – with at least one critical violation.

Keep in mind that the goal of the pilot is for analytics to help deliver inspection results faster, so that critical violations may be detected earlier. This is why data was collected over a two-month period; September’s and October’s numbers served as the source for comparison. In September, CDPH inspectors found more than half of the establishments with violations – 141 of them, or 55 percent, to be precise. The remainder (45 percent) was found in October.

If 55 percent of critical violations were found in the first half under normal operations, what percentage could have been found in the first half under analytics-optimized operations?

It’s worth noting that CDPH inspectors had a good first month — finding more than 50 percent of violations within the first half of a given time period is always a good sign. But what if, instead of inspecting these food establishments via standard procedure, inspectors first visited all locations that met the key predicting variables listed above?

To find out, the analytics team assigned each food establishment inspected during the trial period a probability of earning a critical violation, based on how many of the predicting variables it met. For example, an establishment that met eight out of 10 predicting factors would be assigned a higher probability than one that met two out of 10. Since this newly ranked “forecast” list was assembled using trial-period data, it provided an ideal basis for comparing the two inspection methods.
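A sketch of that ranking step, building on the hypothetical model and FEATURES list above (the function names and the even first-half split are illustrative, mirroring the pilot’s comparison rather than any official CDPH scheduling logic):

```python
# Hypothetical continuation of the sketch above: rank establishments into a
# "forecast" list and count how many critical violations fall in the first half.
import pandas as pd

def build_forecast_list(model, trial_period: pd.DataFrame) -> pd.DataFrame:
    """Order trial-period establishments by predicted risk, highest first."""
    ranked = trial_period.copy()
    ranked["p_critical"] = model.predict_proba(ranked[FEATURES])[:, 1]
    return ranked.sort_values("p_critical", ascending=False)

def violations_in_first_half(ordered: pd.DataFrame) -> int:
    """Count critical violations found in the first half of an inspection order."""
    first_half = ordered.head(len(ordered) // 2)
    return int(first_half["had_critical_violation"].sum())

# Usage (illustrative): compare the forecast order against the order actually used,
# assuming trial_period rows are sorted in the real inspection sequence.
# forecast = build_forecast_list(model, trial_period)
# print(violations_in_first_half(forecast))       # simulated, data-optimized order
# print(violations_in_first_half(trial_period))   # traditional inspection order
```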

Chicago’s analytics team found that inspections could be allocated more efficiently with the data-optimized forecast list than with the traditional procedure’s list. In the simulation, 178 establishments with critical violations – 69 percent of the total – were found in the first half.

These numbers mean that had inspectors been using the data-optimized forecast list, an additional 37 establishments with critical violations would have been detected during the first month. Detecting such establishments earlier helps CDPH prevent restaurant patrons from becoming victims of foodborne illness.

Comparing the traditional procedure to the data-optimized procedure, the pilot’s results amounted to a 25 percent increase in the critical violations found in the first half. Given that the first half was one month long, translating that 25 percent increase into time shows that a data-optimized inspection procedure finds critical violations, on average, seven days earlier.
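The arithmetic behind those figures can be checked directly from the counts reported above; the short snippet below simply reproduces it. (Using the exact counts, the gain works out slightly above the rounded 25 percent figure, and a quarter of a roughly 30-day month is about seven days.)

```python
# Reported counts from the September-October 2014 trial period.
total_violations = 258     # establishments with at least one critical violation
found_traditional = 141    # found in the first month under the traditional schedule
found_simulated = 178      # found in the simulated first half using the forecast list

share_traditional = found_traditional / total_violations    # ~0.55
share_simulated = found_simulated / total_violations        # ~0.69
additional_found = found_simulated - found_traditional      # 37 establishments
relative_gain = (share_simulated - share_traditional) / share_traditional  # ~0.26

print(f"{share_traditional:.0%} vs {share_simulated:.0%}, "
      f"+{additional_found} establishments, gain {relative_gain:.0%}")
```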

What Does It All Mean?

The success of Chicago’s food inspections pilot does not mean more critical violations are being found, nor does it mean that CDPH will change its inspection processes overnight. Rather, the pilot is a key step toward the continued adoption of advanced analytics in city operations. The same overall share of critical violations – roughly 15 percent – is still being found, but it is being found faster than before, which brings public health benefits to the city. These results, and the process used to obtain them, are similar to Chicago’s aforementioned work on rodent baiting analytics.

Sean Thornton is a Program Advisor for the Civic Analytics Network at Harvard’s Ash Center for Democratic Governance and Innovation, and a writer for the Ash Center publication Data-Smart City Solutions. Based in Chicago, Sean holds joint master’s degrees in Public Policy and Social Service Administration from the University of Chicago. His work has spanned the city’s public, philanthropic, and nonprofit sectors.