So, researchers are working with the National Oceanic and Atmospheric Administration (NOAA) on new models that use a form of machine learning to forecast where wildfires are likely to strike, weeks or even months ahead of time. That’s something current models can’t do.
Current models ingest massive amounts of data, such as atmospheric composition, air temperature and air pressure, and feed it into complex equations involving physics, chemistry and “atmospheric transport,” or the way chemicals move through the air. Those calculations produce simulations of future weather events.
The interval of time the model advances with each prediction is called a timestep, and to predict weeks or months into the future, you need to chain together many timesteps. That requires more computing power than is practical with today’s models.
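To make the idea concrete, here is a minimal, purely illustrative Python sketch of why multi-timestep forecasts are expensive and how a learned emulator could stand in for the costly physics step. The state, step functions and weights below are invented for illustration; they are not NOAA’s or APL’s actual models.

```python
import numpy as np

# Illustrative only: a toy "atmospheric state" and two ways to advance it one
# timestep. The variables, equations and emulator are hypothetical examples,
# not the NOAA or APL models described in this article.

def physics_step(state: np.ndarray) -> np.ndarray:
    """Stand-in for one expensive physics/chemistry timestep."""
    # A real model would solve transport and chemistry equations here.
    return 0.99 * state + 0.01 * np.roll(state, 1)

def emulator_step(state: np.ndarray, weights: np.ndarray) -> np.ndarray:
    """Stand-in for one cheap learned timestep (a single linear layer here)."""
    return weights @ state

def rollout(step_fn, state: np.ndarray, n_steps: int) -> np.ndarray:
    """Chain timesteps: forecasting further ahead means more steps, more compute."""
    for _ in range(n_steps):
        state = step_fn(state)
    return state

state0 = np.random.rand(100)  # toy gridded quantity, e.g. a pollutant field
# Toy "trained" weights that mimic the physics step above.
weights = 0.99 * np.eye(100) + 0.01 * np.roll(np.eye(100), 1, axis=0)

forecast_physics = rollout(physics_step, state0, n_steps=56)
forecast_learned = rollout(lambda s: emulator_step(s, weights), state0, n_steps=56)
print(np.allclose(forecast_physics, forecast_learned))  # True for this toy setup
```

The point of the sketch is the loop: every additional week of forecast means more chained steps, so making each step cheaper is what opens the door to longer-range predictions.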
That’s where researchers at the Johns Hopkins Applied Physics Laboratory (APL) come in. They’ve entered into a partnership with NOAA, NASA and Morgan State University in Baltimore to develop machine learning models that speed up predictions and allow for longer-range forecasting.

The team is also applying these methods to improve air quality forecasts. The new machine learning method needs only 21 hours of input data to produce accurate air quality forecasts, while traditional models can require months’ worth of data. NOAA, meanwhile, is migrating its air quality forecasting guidance to a new, higher-resolution model, which requires more computational power.

“We’re working on a wildfire emulator that’s a deep-learning emulator and the idea is to establish probability estimates for the likelihood of wildfires and then estimate the emissions,” said Jennifer Sleeman, senior artificial intelligence researcher at APL.
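As a rough illustration of the two-stage idea Sleeman describes (first a probability that a wildfire occurs, then an estimate of its emissions), the sketch below uses a toy logistic model and made-up numbers. None of the features, weights or constants come from the APL emulator.

```python
import numpy as np

# Hypothetical illustration of a two-stage estimate: first the probability that a
# wildfire occurs, then the expected smoke emissions if it does. All features,
# weights and constants are invented; this is not APL's deep-learning emulator.

def fire_probability(features: np.ndarray, w: np.ndarray, bias: float) -> float:
    """Toy probability model: a logistic function over weather and fuel features."""
    return float(1.0 / (1.0 + np.exp(-(features @ w + bias))))

def expected_emissions(prob: float, fuel_load: float, emission_factor: float) -> float:
    """Toy expected emissions: fire probability times fuel available times a factor."""
    return prob * fuel_load * emission_factor

# One hypothetical grid cell: [dryness index, temperature anomaly, wind speed]
features = np.array([0.8, 1.5, 0.3])
w = np.array([2.0, 1.0, 0.5])  # illustrative "learned" weights
p = fire_probability(features, w, bias=-2.0)
emissions = expected_emissions(p, fuel_load=120.0, emission_factor=0.015)
print(f"fire probability: {p:.2f}, expected emissions: {emissions:.2f}")
```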
Speeding up the forecasts and getting information out ahead of time can give those vulnerable to wildfire smoke the opportunity to take action before it’s too late.
The deep learning model requires enormous amounts of data and time for a “training phase.” That takes place on a central processing unit, where the model is trained on historical data to learn the patterns researchers want it to find, a process also called pattern learning.
“Pattern learning, that’s really what these kinds of machine learning techniques like deep learning do,” said Marisa Hughes, climate intelligence lead at APL. “You provide them with a great deal of data, and they get good at a specific task that you’re giving them as you reward them for performing well on that data, in this case forecasting.”
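The training loop Hughes describes can be sketched in a few lines: the model forecasts from historical inputs, is scored against what actually happened, and its parameters are adjusted to reduce the error. The linear model and synthetic data below are assumptions chosen for brevity, not the team’s actual deep learning setup.

```python
import numpy as np

# Sketch of "pattern learning" on historical data: the model forecasts, is scored
# against what actually happened, and its parameters are adjusted to do better.
# The linear model and synthetic data are assumptions made for illustration only.

rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 3))                     # historical predictors (toy)
true_w = np.array([0.5, -1.2, 2.0])                # hidden pattern in the data
y = X @ true_w + rng.normal(scale=0.1, size=1000)  # outcomes that followed (toy)

w = np.zeros(3)          # model parameters, learned during the training phase
learning_rate = 0.1
for epoch in range(200):
    forecast = X @ w                          # predict on past data
    grad = 2 * X.T @ (forecast - y) / len(y)  # gradient of mean squared error
    w -= learning_rate * grad                 # "reward" good forecasts by reducing error

print("learned weights:", np.round(w, 2))     # approaches the hidden pattern
```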
Sleeman said air quality forecasting is far more challenging than weather forecasting because of variables such as the concentrations of ozone and the particulate matter involved. But with current trends toward more frequent and hotter wildfires, the researchers see it as a necessity.
“We view this as a challenge that is becoming more important and more critical as we’re seeing an increasing number of air quality events,” Hughes said. “NOAA has some very specific forecasts about the connection to how climate change is affecting wildfires through changes in rainfall and temperatures and humidity, etc.”

Wildfire smoke and particulates can affect vulnerable individuals immediately, but there are also long-term effects, which aren’t always obvious.
“It’s not the same as being able to say, ‘This is exactly what the risk of exposure is at this time,’” Hughes said. “[Long-term] diagnosis is much more challenging since we’re talking about longer exposures.”
It’s also probable that the effects of long-term exposure go unnoticed when a person becomes sick or dies. “Unfortunately, we’re expecting an increase in these events and their scales,” Hughes said. “So being able to develop these kinds of tools now can help us with the forecast and protect populations and help people make decisions about protecting themselves.”