Cameron Tongier of the US Fish and Wildlife Service Fire Management Branch spoke with InformationWeek from his temporary office near a fire line in Idaho. He's one of several front-line wildfire managers we spoke with about the long arc of data analysis that leads up to daily situation reports for wildfire managers.

Curtis Franklin Jr., Senior Editor at Dark Reading

September 3, 2015

7 Min Read
Cameron Tongier, geo-spatial coordinator for the US Fish and Wildlife Service Fire Management Branch

(Continued from preceding page)

Other groups in the US government handle resource-mapping applications. "Where we come in is when they need current fire information to run the models with," Quayle said. "[Agencies] will use our fire detection data as an ignition point and [through] the first 24 hours, then run seven- to 14-day simulations from there."

Simulated Forest Fires And The Real Deal

Information from satellites, ground-based sensors, human operators, and other sources is used in planning for future incidents, with modeling tools making use of all that data.

These modeling tools are also used in real time to help guide firefighters on the front lines about what to expect in a given incident, providing details such as how certain vegetation burns.

"[The modeling tool] takes that data and our [front line] information and the weather data and runs what we call F-SIM (Fire Simulation)," said Tongier. "That gives us an idea of what the probabilities are that a fire will do a certain thing in a certain time."


Running different scenarios through the model allows the fire management team to understand "what if" possibilities. "You can put the fire data in, so you can tell it's not going to go out in a particular amount of time," Tongier said. "Then, you can run scenarios on parts of the fire to see what happens if conditions change. So we can see what it would look like in 14 days if things change, or don't. That lets us set up reaction points so we have a long-term plan."

As with most models, more data makes for better results. "The broader the set and the bigger the fire, the more accurate a prediction you can make. The land-fire data is the base data for all of it. With all the other variables, it tells us what the fire will look like in all the fuel models," Tongier said.
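The probabilistic output Tongier describes -- the odds that a fire "will do a certain thing in a certain time" -- can be illustrated with a toy Monte Carlo sketch. The real F-SIM model incorporates fuel models, terrain, and weather streams; everything below (the grid, the spread probability, the function names) is purely illustrative:

```python
import random

def simulate_spread(ignition, days, spread_chance):
    """Toy one-dimensional fire spread: each day the fire front
    may advance one cell with probability spread_chance."""
    front = ignition
    for _ in range(days):
        if random.random() < spread_chance:
            front += 1
    return front

def burn_probability(target_cell, runs=10_000, days=14, spread_chance=0.4):
    """Fraction of simulated scenarios in which the fire reaches
    target_cell within the simulation window -- the same kind of
    question a 7- to 14-day F-SIM run answers, in miniature."""
    hits = sum(
        simulate_spread(0, days, spread_chance) >= target_cell
        for _ in range(runs)
    )
    return hits / runs

# A "reaction point" question: how likely is the fire to reach cell 5
# (say, a road or structure) within 14 days under these conditions?
prob = burn_probability(target_cell=5)
```

Re-running the same simulation with changed conditions (a different `spread_chance` standing in for a weather shift) is the "what if" scenario comparison Tongier describes.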

Information From The Ground

Satellite data provides vital overviews of a fire, but there is some data that simply must come from the front lines. "If we get a lightning storm, or someone flicks a cigarette out the window, from the ignition we begin collecting information on the incident," said Tongier.

He described the process in detail: "We get first responders on the ground, and someone takes on the role of incident commander. They get the information on the fire, and about 98% of the time we catch it early."

Some information, especially the data required to inform some of the newer modeling software, is telephoned back into a dispatch center. Tongier said it takes a concerted effort in the dispatch center to enter as much data as possible as early as possible. Some of that data comes from very basic tools. "We'll establish a perimeter of the fire with someone on the ground and a GPS," he said. "We'll take that and tie that back into some of the other work we do, like prescribed fire. We'll put that into fuel model maps and try to figure out what the fire did and put it into the models for what we do [to respond]."
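Once a walked GPS perimeter is back at dispatch, one basic product is the fire's size. A minimal sketch of that calculation, using the standard shoelace formula on projected coordinates (the coordinates, conversion, and perimeter below are hypothetical; the article does not describe the dispatch center's actual software):

```python
def polygon_area(points):
    """Shoelace formula: area of a simple polygon whose vertices
    are (x, y) pairs in meters (i.e., GPS lat/lon already projected
    to a planar coordinate system)."""
    n = len(points)
    area = 0.0
    for i in range(n):
        x1, y1 = points[i]
        x2, y2 = points[(i + 1) % n]
        area += x1 * y2 - x2 * y1
    return abs(area) / 2.0

SQ_METERS_PER_ACRE = 4046.86

# Hypothetical perimeter walked with a handheld GPS, in projected meters.
perimeter = [(0, 0), (400, 0), (400, 300), (0, 300)]
acres = polygon_area(perimeter) / SQ_METERS_PER_ACRE
```

A 400 m by 300 m rectangle works out to roughly 29.7 acres -- the kind of figure that would then be tied back into fuel model maps.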


All of the above takes place in what Tongier calls the "initial attack phase" in the first 24 hours after a fire has begun. If the fire isn't well contained within those first 24 hours, the incident goes into "extended attack," and that's usually when the larger world will start hearing about the incident. At that point, Tongier said, "It goes beyond the first incident commander, and he calls back into dispatch, and it goes into the large fire scenario."

All of the data coming from sources on the ground and in the air needs to be standardized and normalized if it's to be useful. That's where the work of people such as Rochelle Pederson comes in. Pederson is chair of the National Wildfire Coordinating Group (NWCG) Data Management Committee. She said that the initial data flowing into the system fits easily into a traditional database, and it comes in according to protocols established for all first responders and incident commanders.

Pederson said that the protocols remove as much guesswork and need to improvise as possible for those in the field. "We have standard forms that are used in the field, some automated and some not," she said. "An incident commander will have a form and radio it back to the dispatchers. What NWCG does is create common forms and provide recommendations for agencies to adopt standards about the data."


Tongier said that the common forms ensure that the proper information goes into the system. "It sets up what the situation is, what the fuel is, and what the resources assigned to the fire are," he said.

The Future Of Fire

Ultimately, all fires end. When they do, the work of managing the recovery starts, as does the work of using data from the still-smoldering fire to better understand the next incident. "When you have data and have it on another 100 fires, you can start to see trends," Tongier said. "We can ask whether it's due to weather or to climate, we may see more extreme behavior -- we start to be able to answer those questions. It's an interesting shift that we're making."


NWCG's Pederson pointed out that the type of data that goes into the evolving models is informing the direction of future data gathering. "We're talking about data that's used to describe a fire occurrence -- an ignition. What the fire's doing, key points in time in the fire activity, and also the management response to the particular ignition," she said.

To get data from closer to the fire, Pederson said, "We actually are doing some prototyping and working with the National Association of State Foresters to develop an app to gather data. We want it to be efficient for the users but protect the data as well."

Tongier pointed to an app that's already being deployed in the field. "[First responders] have an app called ARP Collector, where they can go out with an iPad or iPhone. Park service, wildlife, and [the Bureau of Land Management] are setting things so if you're standing at the edge of the fire and open the app, it collects information and sends it right back to the database."

Data collection and modeling aren't the only things that are changing.

Quayle and his group at the Forest Service's RDAS program are working on new ways to present information based on satellite data. "Within minutes of collecting the data, we process it and drive a number of products. We take the data and instead of handing over the raw data, we do some processing. We put it into geospatial data sets, analyze it, and produce value-added products and make those available on the website year round," he said. "The visualization has changed with the advent of new technology. At the beginning, we provided a lot of static products like JPG files, but with things like Google Earth, people want to use those, and people want to see them."
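One of the value-added formats Quayle alludes to is a Google Earth overlay, which takes fire-detection points and wraps them in KML. A minimal sketch of that conversion (the function, coordinates, and detection names are illustrative, not the Forest Service's actual pipeline):

```python
def detections_to_kml(detections):
    """Wrap satellite fire-detection points, given as (name, lon, lat)
    tuples, in a minimal KML document viewable in Google Earth."""
    placemarks = "\n".join(
        f"  <Placemark><name>{name}</name>"
        f"<Point><coordinates>{lon},{lat},0</coordinates></Point></Placemark>"
        for name, lon, lat in detections
    )
    return (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<kml xmlns="http://www.opengis.net/kml/2.2">\n'
        "<Document>\n"
        f"{placemarks}\n"
        "</Document>\n"
        "</kml>\n"
    )

# Hypothetical detection near central Idaho.
kml = detections_to_kml([("Detection 1", -114.74, 43.68)])
```

The appeal of a format like this over a static JPG is exactly the shift Quayle describes: the same points can be panned, zoomed, and layered with other geospatial data by anyone with a viewer.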


With the coming of winter, the 2015 forest fire season will come to an end. Then, the work of understanding, planning, and budgeting for the next fire season will begin. It all begins and ends with data.

About the Author(s)

Curtis Franklin Jr.

Senior Editor at Dark Reading

Curtis Franklin Jr. is Senior Editor at Dark Reading. In this role he focuses on product and technology coverage for the publication. In addition he works on audio and video programming for Dark Reading and contributes to activities at Interop ITX, Black Hat, INsecurity, and other conferences.

Previously he was editor of Light Reading's Security Now and executive editor, technology, at InformationWeek where he was also executive producer of InformationWeek's online radio and podcast episodes.

Curtis has been writing about technologies and products in computing and networking since the early 1980s. He has contributed to a number of technology-industry publications including Enterprise Efficiency, ChannelWeb, Network Computing, InfoWorld, PCWorld, Dark Reading, and ITWorld.com on subjects ranging from mobile enterprise computing to enterprise security and wireless networking.

Curtis is the author of thousands of articles, the co-author of five books, and has been a frequent speaker at computer and networking industry conferences across North America and Europe. His most popular book, The Absolute Beginner's Guide to Podcasting, with co-author George Colombo, was published by Que Books. His most recent book, Cloud Computing: Technologies and Strategies of the Ubiquitous Data Center, with co-author Brian Chee, was released in April 2010. His next book, Securing the Cloud: Security Strategies for the Ubiquitous Data Center, with co-author Brian Chee, is scheduled for release in the Fall of 2018.

When he's not writing, Curtis is a painter, photographer, cook, and multi-instrumentalist musician. He is active in amateur radio (KG4GWA), scuba diving, stand-up paddleboarding, and is a certified Florida Master Naturalist.
