
A Novel Mixed Reality Navigation System for Laparoscopy Surgery

Institution:
1 Department of Radiology, Brigham and Women's Hospital, Harvard Medical School, Boston, MA, USA.
2 Boston University Medical School, Boston, MA, USA.
3 Fraunhofer MEVIS, Bremen, Germany.
4 Isomics, Inc., Boston, MA, USA.
Publication Date:
Sep-2018
Journal:
Med Image Comput Comput Assist Interv
Volume Number:
11073
Pages:
72-80
Citation:
Int Conf Med Image Comput Comput Assist Interv. 2018 Sep;21(Pt 1):72-80.
PubMed ID:
31098598
PMCID:
PMC6512867
Keywords:
Ergonomics, Laparoscopy surgery, Audio navigation, Mixed-reality, Surgical navigation, Visual navigation
Appears in Collections:
NCIGT, SLICER, SPL
Sponsors:
P41 EB015898/EB/NIBIB NIH HHS/United States
P41 RR019703/RR/NCRR NIH HHS/United States
Generated Citation:
Jayender J., Xavier B., King F., Hosny A., Black D., Pieper S., Tavakkoli A. A Novel Mixed Reality Navigation System for Laparoscopy Surgery. Int Conf Med Image Comput Comput Assist Interv. 2018 Sep;21(Pt 1):72-80. PMID: 31098598. PMCID: PMC6512867.
Downloaded: 29 times.

OBJECTIVE: To design and validate a novel mixed reality head-mounted display for intraoperative surgical navigation.

DESIGN: A mixed reality navigation for laparoscopic surgery (MRNLS) system using a head-mounted display (HMD) was developed to integrate the displays from a laparoscope, a navigation system, and diagnostic imaging, providing context-specific information to the surgeon. Immersive auditory feedback was also provided to the user. Sixteen surgeons were recruited to quantify the differential improvement in performance based on the mode of guidance provided to the user: laparoscopic navigation with CT guidance (LN-CT) versus mixed reality navigation for laparoscopic surgery (MRNLS). The users performed three tasks: (1) standard peg transfer, (2) radiolabeled peg identification and transfer, and (3) radiolabeled peg identification and transfer through sensitive wire structures.

RESULTS: For the more complex task of peg identification and transfer, significant improvements were observed in time to completion, kinematics such as mean velocity, and the task load index subscales of mental demand and effort when using the MRNLS (p < 0.05) compared to the current standard of LN-CT. For the final task of peg identification and transfer through sensitive structures, time to complete the task and frustration were significantly lower for the MRNLS than for the LN-CT approach.

CONCLUSIONS: A novel mixed reality navigation system for laparoscopic surgery (MRNLS) has been designed and validated. It could improve the ergonomics of laparoscopic procedures while minimizing the need for additional monitors in the operating room.