- Researchers now plan to expand the app's capabilities and collaborate with healthcare providers to integrate it into existing emergency response protocols.
- The technology uses facial expression recognition to identify the unilateral changes in facial muscle movement that occur in stroke patients.
In the critical realm of emergency healthcare, the timely detection of a stroke can make all the difference in a patient’s recovery and long-term prognosis.
A team of biomedical engineers at RMIT University has developed a smartphone application that leverages the power of artificial intelligence (AI) to identify stroke patients within seconds.
Led by PhD scholar Guilherme Camargo de Oliveira, under the guidance of Professor Dinesh Kumar, the research team has published their findings in the prestigious journal Computer Methods and Programs in Biomedicine.
The key to their innovation lies in the application’s ability to analyse facial expressions and detect the subtle asymmetries that are characteristic of a stroke.
The app, which boasts an impressive accuracy rate of 82 per cent, could prove to be a game-changer for paramedics responding to emergency situations. By providing them with a user-friendly, real-time diagnostic tool, the technology has the potential to significantly improve patient outcomes.
Early detection is critical
“Early detection of stroke is critical, as prompt treatment can significantly enhance recovery outcomes, reduce the risk of long-term disability, and save lives,” said Dinesh Kumar, Professor at RMIT University’s School of Engineering.
The underlying technology builds upon the Facial Action Coding System (FACS) developed in the 1970s, using facial expression recognition to identify the unilateral changes in facial muscle movement that occur in stroke patients.
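To make the idea concrete, here is a minimal, hypothetical sketch of how a facial asymmetry score might be computed from landmark coordinates produced by any face-landmark detector. The landmark indices, coordinates, and normalisation below are illustrative assumptions for explanation only, not the researchers' published model.

```python
import numpy as np

# Hypothetical illustration: given 2D facial landmarks (x, y) from any
# face-landmark detector, score left/right asymmetry by mirroring the
# left-side points across the face midline and measuring how far they
# fall from their right-side counterparts. The landmark pairs and the
# normalisation are illustrative only, not the published method.

def asymmetry_score(landmarks, left_idx, right_idx):
    """landmarks: (N, 2) array; left_idx/right_idx: matching landmark pairs."""
    pts = np.asarray(landmarks, dtype=float)
    midline_x = pts[:, 0].mean()                      # crude vertical midline
    left = pts[left_idx]
    right = pts[right_idx]
    mirrored_left = left.copy()
    mirrored_left[:, 0] = 2 * midline_x - left[:, 0]  # reflect across midline
    face_width = pts[:, 0].max() - pts[:, 0].min()
    # Mean mirrored-pair distance, normalised by face width so the score
    # does not depend on image resolution or how close the face is.
    return np.linalg.norm(mirrored_left - right, axis=1).mean() / face_width

# Example with made-up coordinates for a face whose right mouth corner droops.
landmarks = np.array([
    [30, 40], [70, 40],   # outer eye corners (left, right)
    [35, 70], [65, 74],   # mouth corners (left, right)
    [50, 55],             # nose tip
])
score = asymmetry_score(landmarks, left_idx=[0, 2], right_idx=[1, 3])
print(f"asymmetry score: {score:.3f}")  # larger values suggest more asymmetry
```

In practice, a score like this would only be one input to a trained classifier; a single threshold on raw asymmetry would not by itself distinguish stroke from normal facial variation.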
The approach addresses a pressing issue: studies have shown that a substantial percentage of strokes are missed in emergency departments and community hospitals, often because the symptoms are subtle and because assessments of patients from diverse backgrounds can be affected by bias.
The researchers now plan to expand the capabilities of the app, collaborating with healthcare providers to integrate it into existing emergency response protocols.
By incorporating the ability to detect other neurological conditions that affect facial expressions, the team aims to create a comprehensive tool that can serve as a valuable asset for first responders and healthcare professionals alike.