Forecasting backdraft with multimodal method: Fusion of fire image and sensor data

  • Tianhang Zhang
  • Fangqiang Ding
  • Zilong Wang
  • Fu Xiao
  • Chris Xiaoxuan Lu

Research output: Journal article publication › Journal article › Academic research › peer-review

Abstract

Experienced firefighters can fuse flame images, smoke patterns, and varying temperature, sound, and odour cues in complex and fast-changing fire scenes to foresee flashover and explosion. This study mimics firefighters and proposes a novel transformer algorithm that fuses fire images and temperature sensor data to forecast the backdraft explosion in a building fire. The backdraft forecast model is demonstrated with full-scale fire tests. After training on 2,674 fire scenarios with various fire intensities and images from multiple view angles, the Fusion-Transformer model can forecast the risk of backdraft with an overall accuracy of 84%. Moreover, the occurrence time and explosion scale of backdraft can be predicted with Mean Absolute Errors (MAE) of 1.6 s and 0.14 m, respectively. Compared with single-modal models, the fusion of fire images and temperature sensor data improves the accuracy of the backdraft forecast by over 50%. This work demonstrates the use of a transformer algorithm in forecasting fire evolution and critical events. It also bridges the gap between data fusion methods and fire forecasting, inspiring future universal AI-driven smart firefighting practices.
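
The abstract does not specify the network internals, so the following is only a minimal sketch of the kind of image-plus-sensor fusion transformer it describes; every layer size, tokenization scheme, and head name here is an illustrative assumption rather than the authors' implementation.

```python
# Hypothetical sketch: fuse fire-image tokens and temperature-sensor tokens
# in one transformer encoder, with three heads matching the abstract's
# outputs (backdraft risk, occurrence time, explosion scale). All sizes
# and names are assumptions for illustration only.
import torch
import torch.nn as nn

class FusionTransformer(nn.Module):
    def __init__(self, d_model=128, n_heads=4, n_layers=2,
                 sensor_channels=8):
        super().__init__()
        # Image branch: patchify an RGB fire image into d_model-dim tokens.
        self.patch_embed = nn.Conv2d(3, d_model, kernel_size=16, stride=16)
        # Sensor branch: project each timestep of temperature readings
        # (sensor_channels thermocouples, assumed) to one token.
        self.sensor_embed = nn.Linear(sensor_channels, d_model)
        # Learned modality embeddings distinguish the two token streams.
        self.modality = nn.Parameter(torch.zeros(2, d_model))
        self.cls_token = nn.Parameter(torch.zeros(1, 1, d_model))
        layer = nn.TransformerEncoderLayer(
            d_model, n_heads, dim_feedforward=4 * d_model, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=n_layers)
        # Three heads: risk probability, occurrence time (s), scale (m).
        self.risk_head = nn.Linear(d_model, 1)
        self.time_head = nn.Linear(d_model, 1)
        self.scale_head = nn.Linear(d_model, 1)

    def forward(self, image, sensors):
        # image: (B, 3, H, W); sensors: (B, T, C) temperature time series.
        img_tok = self.patch_embed(image).flatten(2).transpose(1, 2)
        img_tok = img_tok + self.modality[0]
        sen_tok = self.sensor_embed(sensors) + self.modality[1]
        cls = self.cls_token.expand(image.size(0), -1, -1)
        tokens = torch.cat([cls, img_tok, sen_tok], dim=1)
        fused = self.encoder(tokens)[:, 0]  # CLS token summarizes the scene
        return (torch.sigmoid(self.risk_head(fused)),  # P(backdraft)
                self.time_head(fused),                 # time to event
                self.scale_head(fused))                # explosion scale

model = FusionTransformer()
risk, t, scale = model(torch.randn(2, 3, 224, 224), torch.randn(2, 30, 8))
```

Under these assumptions, concatenating both token streams and letting self-attention mix them is one common "early fusion" design; the paper's actual fusion strategy may differ.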

Original language: English
Article number: 107939
Journal: Engineering Applications of Artificial Intelligence
Volume: 132
Publication status: Published - Jun 2024

Keywords

  • Building fire
  • Computer vision
  • Deep learning
  • Fusion transformer
  • Smart firefighting

ASJC Scopus subject areas

  • Control and Systems Engineering
  • Artificial Intelligence
  • Electrical and Electronic Engineering
