Researchers develop facial recognition technology for use in firefighting
According to the Federal Emergency Management Agency, 3,400 deaths occurred in America as a result of fires in 2017. While the number of fires in the US has steadily decreased over time, the number of fire-related deaths remains high. At the University of New Mexico Department of Electrical and Computer Engineering, a team of computer engineers led by Professor Manel Martínez-Ramón is working on a series of projects aimed at making the life-saving work of firefighters easier and safer.
Martínez-Ramón has been working toward the development of a high-tech wearable device for firefighters. The device would help emergency service workers with navigation, communication, and threat assessment during tense, life-threatening situations. His projects, funded by a National Science Foundation grant called Next Generation Connected and Smart Cyber Fire Fighter System, have already garnered considerable enthusiasm from experts in both the emergency services and computer engineering fields.
The latest research paper to be published in association with this grant is titled "Semi-supervised facial expression recognition using reduced spatial features and Deep Belief Networks," coauthored by Aswathy Rajendra Kurup, Meenu Ajith, and Martínez-Ramón. The article describes a new facial-recognition algorithm designed for use in firefighting technology. The algorithm enables computing systems to identify the emotions displayed by facial expressions with 98% accuracy.
What is especially noteworthy about this technology is that it is the first facial-recognition algorithm of its kind to use semi-supervised learning – a method in which a computational network learns from a combination of labeled and unlabeled data. Researchers input both labeled images (pictures of faces that a researcher has tagged with the emotional expression they display) and unlabeled images into a network of computational nodes programmed with the algorithm. Over time, the system "learns" which facial expressions correspond to which emotions – including for the many images no one ever labeled.
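The general idea can be illustrated with a toy "self-training" loop – one common flavor of semi-supervised learning. This is a deliberately simplified sketch, not the paper's Deep Belief Network or its facial features: a nearest-centroid classifier on synthetic one-dimensional data stands in for the network, and all data values below are invented for illustration.

```python
import random
from statistics import mean

# Toy self-training sketch of semi-supervised learning (illustrative only).
# A nearest-centroid classifier is fit on a handful of labeled points, then
# repeatedly pseudo-labels its most confident unlabeled points and refits.
random.seed(0)
labeled = [(0.1, 0), (0.2, 0), (4.9, 1), (5.1, 1)]   # (value, class)
unlabeled = ([random.gauss(0.0, 0.3) for _ in range(20)] +   # true class 0
             [random.gauss(5.0, 0.3) for _ in range(20)])    # true class 1

def centroids(data):
    """Mean of each class -- the entire 'model' in this sketch."""
    return [mean(x for x, y in data if y == c) for c in (0, 1)]

for _ in range(5):                       # a few self-training rounds
    if not unlabeled:
        break
    c0, c1 = centroids(labeled)
    # The margin between the two distances serves as a confidence score:
    # points much closer to one centroid than the other sort to the end.
    scored = sorted(unlabeled, key=lambda x: abs(abs(x - c0) - abs(x - c1)))
    confident, unlabeled = scored[-10:], scored[:-10]
    # Adopt the model's own predictions as labels for the confident points.
    labeled += [(x, 0 if abs(x - c0) < abs(x - c1) else 1)
                for x in confident]

c0, c1 = centroids(labeled)   # centroids settle near the true cluster means
```

The key property mirrored here is that the model's own confident predictions on unlabeled data feed back into training, so a small labeled set can anchor learning over a much larger unlabeled pool.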
Ajith, Kurup, and Martínez-Ramón relied heavily on CARC resources to test their algorithm. The researchers used CARC systems both for training and for experimentation with different databases. "I don't think we could have done this without CARC," says Martínez-Ramón.
Developing an accurate facial recognition algorithm is an important facet of Martínez-Ramón's ongoing efforts to create a wearable device that would aid firefighters during life-threatening emergencies. Facial expressions are an indispensable form of human communication; a smile or a grimace can tell us whether someone is safe and happy or scared and in pain. Under conditions of poor visibility, unbearable heat, and imminent danger, however, it can be difficult for a firefighter to quickly and accurately identify potential victims. Wearing a device equipped with the latest facial recognition technology could help a firefighter and their commander recognize the face of someone in need of help, which, in turn, could reduce fire-related deaths.
The anticipated result of the Next Generation Connected and Smart Cyber Fire Fighter System grant is a complex network of devices, used by firefighters throughout the country, that employs supercomputing technology to save lives. According to the grant proposal, the finished product will use microphones, cameras, body sensors, and ambient sensors to monitor oxygen levels, potential hazards, the presence of victims, and other important indicators of scene safety. Martínez-Ramón and his team plan to release two more papers soon in association with this project. While there is still much work to be done before the unveiling of a completed device, firefighters can rest assured that some of the brightest minds in computer engineering are dedicated to helping them do their jobs safely and effectively.