FIRST LOOK: Surgical Fingerprints: Real-time Analysis of Intraoperative Events
Intraoperative adverse events (IAEs), such as bowel or vascular injury, are estimated to occur in 2% of operations. IAEs take a toll on patient quality of life and increase costs, with average admission charges estimated to be 41% higher for patients who experience IAEs. Up to two-thirds of surgical errors occur intraoperatively, and 86% of these are secondary to cognitive factors such as failures in judgment, memory, or vigilance that lead to poor decisions. Currently, analysis of the intraoperative phase of care is limited to review of dictated operative reports, which are notoriously incomplete. While video has been shown to be more accurate for identification of IAEs, manual review of video is costly and time-consuming.
We developed a machine learning approach to analyze laparoscopic video and identify and segment operative steps. Using annotated video, we trained a support vector machine and a neural network to classify video frames into their respective operative steps. Hidden Markov models then segmented the videos into operative steps with >90% accuracy (Fig. 1). We used coresets for semantic summarization of video, increasing the efficiency of segmentation. The cumulative log probability at each frame allowed for real-time estimation of deviation from an expected operative path and yielded a “surgical fingerprint” that visually summarized potential areas of unexpected operative events (Fig. 2).
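The core idea — smoothing noisy per-frame step classifications with an HMM and tracking the cumulative log probability of the best path — can be sketched as follows. This is a minimal illustration, not the authors' implementation: the function names, the toy left-to-right transition matrix, and the use of Viterbi decoding are assumptions chosen to make the technique concrete.

```python
import numpy as np

def viterbi_segment(log_emissions, log_trans, log_init):
    """Segment a video into operative steps.

    log_emissions: (T, K) per-frame classifier log-probabilities
                   (e.g., from an SVM or neural network) for K steps.
    log_trans:     (K, K) HMM log transition matrix between steps.
    log_init:      (K,) log probabilities of the initial step.
    Returns the most likely step per frame and the Viterbi scores.
    """
    T, K = log_emissions.shape
    dp = np.full((T, K), -np.inf)        # best log-prob ending in step k at frame t
    back = np.zeros((T, K), dtype=int)   # backpointers for path recovery
    dp[0] = log_init + log_emissions[0]
    for t in range(1, T):
        scores = dp[t - 1][:, None] + log_trans     # scores[i, j]: prev step i -> step j
        back[t] = np.argmax(scores, axis=0)
        dp[t] = scores[back[t], np.arange(K)] + log_emissions[t]
    path = np.zeros(T, dtype=int)
    path[-1] = int(np.argmax(dp[-1]))
    for t in range(T - 2, -1, -1):       # backtrack to recover the full step sequence
        path[t] = back[t + 1][path[t + 1]]
    return path, dp

def fingerprint(dp):
    """Cumulative log-probability of the best path up to each frame.
    Plotted over time, sharp drops relative to a reference trace flag
    deviation from the expected operative course."""
    return dp.max(axis=1)
```

A left-to-right transition matrix (each step can persist or advance to the next, but not jump backward) encodes the expected ordering of operative steps, so isolated misclassified frames are smoothed away rather than treated as step changes.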
Real-time analysis of surgical video provides the foundation for intra- and post-operative clinical decision support to augment surgical decision-making. We plan to link population-based pre-procedure risk scores to patient-specific surgical fingerprints, with the goal of preventing IAEs and identifying optimal, patient-specific post-procedural management — including recognizing rare patient scenarios and seemingly routine patients who may not fit standard protocols. By unlocking intraoperative care as a quantitative data source to predict IAEs, complications, and readmissions, we expect to affect daily clinical care by providing clinicians with an effective, data-driven tool for perioperative management.
For more information about Dr. Hashimoto’s research, please contact Partners HealthCare Innovation.
Guy Rosman2, Daniela Rus2, Ozanan R. Meireles1
1Department of Surgery, Massachusetts General Hospital
2Computer Science and Artificial Intelligence Laboratory, Massachusetts Institute of Technology
Fig. 1. Over 90% accuracy of machine segmentation (blue) vs. human segmentation (green)
Fig. 2. Fingerprints comparing A) routine sleeve gastrectomy vs. B) sleeve + lysis of adhesions