Augmenting Reality in the Operating Room

March 30, 2017

The reality of life in the operating room is about to look more like virtual reality, as leaders in clinical medicine at the University of Maryland, Baltimore (UMB) and colleagues with expertise in computing and imaging at the University of Maryland, College Park (UMCP) work together on lifesaving technologies. Several examples of virtual reality and augmented reality applications were demonstrated on March 27 to a small gathering at the Newseum in Washington, D.C. 

Sarah Murthi, MD, associate professor at the University of Maryland School of Medicine (UMSOM) and trauma surgeon at the R Adams Cowley Shock Trauma Center, let visitors see for themselves how an ultrasound examination looks with the addition of augmented reality. With the help of volunteer “patient” Eric Lee, a research programmer at the UM Institute for Advanced Computer Studies (UMIACS) in College Park, Murthi demonstrated visualization headgear. Using the headgear, doctors can keep their eyes on the patient while seeing images of the ultrasound – and potentially much more information added to their view.

Sarah Murthi, MD explains the augmented reality technology used to perform an ultrasound scan on volunteer "patient" Eric Lee.

“Ultimately, you can imagine the whole medical staff wearing these masks. And then whoever wants to can see the imaging. Right now the entire room is trying to look at this one display, but if we could all actually see the images then we could do whatever else we need to do while we’re looking at it,” Murthi said. Keeping all of the critical information about a patient in plain view might also help avoid problems doctors currently encounter, such as missing changes in vital signs while focusing on an ultrasound scan. “The other real thing that happens is that you’ll be all caught up in the imaging and meanwhile the patient’s blood pressure drops and they become unstable,” she said.

Next to Murthi, colleague Caron Hong, MD, associate professor at UMSOM and a critical care anesthesiologist at Shock Trauma, helped visitors use virtual reality goggles to place an endotracheal tube through the mouth of a medical mannequin and into its airway. Intubation, a common procedure used when a patient needs assistance breathing, requires considerable practice to master. “It’s like driving a car with a trailer backing up,” complained Newseum President and CEO Jeffrey Herbst as he gave the procedure a try.

Both doctors also stressed the great educational value of virtual and augmented reality. “Once this virtual world is created and it can rebroadcast in real time, a thousand people could watch at the same time and learn whatever lesson they needed to learn,” said Murthi.

Following the demonstrations, Murthi and collaboration partner Amitabh Varshney, PhD, professor of computer science, director of the Augmentarium, and vice president for research at UMCP, joined colleagues from other universities to share accomplishments and ideas. Varshney explained that he and Murthi are working on several projects together. “One of them is looking at re-creating the controlled chaos of the Shock Trauma Center by using immersive camera arrays,” he said. “By recording exactly what happens in Shock Trauma, it would allow residents and interns to place themselves in the shoes of a surgeon and see what they are looking at and what decisions they are making as they are performing these surgeries.”

Murthi added that a combination of virtual and augmented reality may one day support the care of battlefield patients. First responders on the ground might use augmented reality to see important data and imaging while treating and transporting patients. At the same time, advanced medical staff at a remote location could use virtual reality to see what first responders are seeing and provide valuable insight and guidance.

Other medical uses for virtual reality around the country were discussed, including the use of three-dimensional mapping to plan and practice brain surgery, developing clinically guided scenarios for soldiers experiencing post-traumatic stress disorder, and reducing the experience of pain during burn treatments by engaging patients in interactive virtual reality games.

“Our focus has been to look for technologies that we could have in hospitals soon, in the next year or so since a lot of this equipment already exists,” said Murthi. “The question is how can we use it and develop it so we can already start saving lives.”

About the Augmentarium

Much of the work of Varshney and his UMIACS colleagues takes place at UMCP’s Augmentarium, a research and teaching facility used to conceptualize, design, and build immersive, interactive technologies. Initial funding for the Augmentarium came from the National Science Foundation. Funding for current projects comes from the University of Maryland Strategic Partnership: MPowering the State.

About The University of Maryland Strategic Partnership: MPowering the State

The University of Maryland Strategic Partnership: MPowering the State is a collaboration between the state of Maryland’s two most powerful public research institutions: the University of Maryland, Baltimore (UMB) and the University of Maryland, College Park (UMCP). It leverages the sizable strengths and complementary missions of both institutions to advance interdisciplinary research, create opportunities for students, and solve important problems for the people of Maryland and the nation. Working together, UMB and UMCP achieve innovation and impact through collaboration.

The University of Maryland Strategic Partnership Act of 2016 strengthened and formalized the structured relationship between UMB and UMCP, which began in 2012. The law deepens the alliance and energizes UMB and UMCP to pursue even greater transformative change and impact, far surpassing what either institution could achieve independently.