Artificial Intelligence Identifies Awareness in Comatose Patients Days Before Doctors Detect It
SeeMe, an AI tool developed by a team at Stony Brook University, could change how doctors and families approach care for patients with severe brain injuries. Led by computational neuroscientist Sima Mofakham, the team built SeeMe to use computer vision to track the tiniest facial movements in patients believed to be unconscious.
By tracking these movements with extreme precision, SeeMe could serve as an early indicator that awareness is gradually returning in patients who have suffered brain trauma.
The phenomenon of covert consciousness, in which a person remains inwardly aware but outwardly unresponsive, was first documented in 2006. Up to a quarter of people who appear unresponsive nonetheless show brain activity in response to simple spoken commands. SeeMe might even open the door to communication with patients who were previously thought to be unreachable.
SeeMe offers a more reliable way of tracking consciousness, especially when patients cannot respond to routine bedside commands such as opening their eyes or squeezing a hand. In a recent study, SeeMe detected signs of consciousness up to eight days earlier than clinicians did. In one case, it picked up mouth movements on day 18 after admission, while the patient did not show a clear motor response to commands until day 37.
Patients who exhibited more frequent and pronounced facial movements early on tended to have better long-term outcomes, recovering more quickly and more fully after discharge. This correlation suggests that the early signs SeeMe detects could be crucial for doctors and families deciding how to proceed with care.
Mofakham, who is originally from Iran, and her team aim to refine the tool by expanding it to analyze other forms of movement, such as electrical signals from muscles. They also hope to build a 'yes or no' system that would let patients who are conscious but unable to move or speak answer simple questions through facial cues.
SeeMe looks for changes in facial expression too small for the human eye to spot, examining movement at the level of individual skin pores. The study describing SeeMe was published in Communications Medicine. If its results hold up, patients with long-term brain injuries could have their awareness detected earlier and with greater certainty.
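The article does not describe SeeMe's actual pipeline, but the basic idea of flagging sub-visible motion can be illustrated with simple frame-to-frame comparison. The sketch below, with a hypothetical `movement_score` function and synthetic frames standing in for real video, scores how much of the image changed between two consecutive grayscale frames; a real system would add face alignment, noise modeling, and far finer spatial resolution.

```python
import numpy as np

def movement_score(prev_frame: np.ndarray, next_frame: np.ndarray,
                   threshold: float = 2.0) -> float:
    """Fraction of pixels whose intensity changed by more than `threshold`.

    A toy stand-in for the frame-to-frame comparison a computer-vision
    pipeline might use to flag motion too subtle for the human eye.
    """
    diff = np.abs(next_frame.astype(np.float64) - prev_frame.astype(np.float64))
    return float(np.mean(diff > threshold))

# Synthetic 64x64 grayscale "face" frames: the second frame shifts one
# small patch by a single pixel, mimicking a subtle facial twitch.
rng = np.random.default_rng(0)
frame_a = rng.integers(0, 256, size=(64, 64)).astype(np.uint8)
frame_b = frame_a.copy()
frame_b[30:34, 30:34] = np.roll(frame_a[30:34, 30:34], 1, axis=1)

still_score = movement_score(frame_a, frame_a)   # identical frames: score is 0
twitch_score = movement_score(frame_a, frame_b)  # tiny patch moved: score > 0
```

Even a one-pixel shift in a 4x4 patch produces a nonzero score, which is the kind of signal a classifier could aggregate over time to distinguish purposeful responses from noise.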