COVID-19 Patient Data Visualization Tools

Collaborators: R Adams Cowley Shock Trauma Center

Point-of-care ultrasound (POCUS) is ideal for lung and cardiac imaging in patients with COVID-19: it is a rapid, lower-risk imaging option that provides important anatomic and functional information in real time. We are developing new computational tools that store, analyze, and display POCUS images quickly, accurately, and in a way designed to facilitate clinical decision-making. The technology will allow quantification and standardization of lung and cardiac ultrasound findings in COVID-19, which is not currently possible. While this effort is COVID-19-specific, the technology will impact and improve the care of all acutely ill patients, fundamentally changing how patients are managed both in and out of the hospital setting.
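
For a concrete flavor of the kind of standardization involved, one widely used scheme grades each of 12 lung zones from 0 to 3 by the artifacts seen and sums the grades into a comparable total. The sketch below is purely illustrative (hypothetical names, not our actual pipeline):

    # Illustrative sketch: the common 12-zone lung ultrasound (LUS) severity
    # score, one way findings are standardized; not the MBRC pipeline itself.
    GRADES = {0: "normal A-lines", 1: "separate B-lines",
              2: "coalescent B-lines", 3: "consolidation"}

    def lus_total_score(zone_scores):
        """Sum per-zone severity grades (each 0-3) into a 0-36 total."""
        assert all(0 <= s <= 3 for s in zone_scores.values())
        return sum(zone_scores.values())

    # Hypothetical exam: 6 zones per hemithorax, graded by clinician or model.
    exam = {f"R{i}": s for i, s in enumerate([0, 1, 1, 2, 0, 0], 1)}
    exam.update({f"L{i}": s for i, s in enumerate([0, 0, 2, 3, 1, 0], 1)})
    print(lus_total_score(exam))  # -> 10 out of a possible 36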

Weather Forecasting in Virtual Reality

Collaborators: Mason Quick, Patrick Meyers, Scott Rudlosky, ESSIC

Immersive visualization technology is rapidly changing the way three-dimensional data is displayed and interpreted. Rendering meteorological data in four dimensions (space and time) exploits the full data volume and provides weather forecasters with a complete representation of the time-evolving atmosphere. With the National Oceanic and Atmospheric Administration, we created an interactive, immersive “fly-through” of real weather data: users are visually guided through satellite observations of an event that caused heavy precipitation, flooding, and landslides in the western United States. This narrative and display highlight how VR tools can supplement traditional forecasting and enhance meteorologists’ ability to predict weather events.
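
To sketch the core mechanic (with an assumed data layout, not NOAA’s actual formats), a 4D fly-through boils down to smoothly interpolating a time-stacked 3D volume while the camera moves:

    # Minimal sketch, assuming hourly 3D grids stacked along time; a renderer
    # would draw frame_at(t) each tick as the camera flies along its path.
    import numpy as np

    volume = np.random.rand(24, 20, 64, 64)  # (hour, level, lat, lon), fake data

    def frame_at(t):
        """Linearly interpolate between hourly volumes for smooth playback."""
        t0 = int(np.floor(t)) % 24
        t1 = (t0 + 1) % 24
        a = t - np.floor(t)
        return (1.0 - a) * volume[t0] + a * volume[t1]

    for t in np.linspace(0.0, 23.0, 5):      # a 5-keyframe preview of the loop
        print(round(t, 2), float(frame_at(t).mean()))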

Collaborative Virtual Reality Sand Table

Collaborators: University of Maryland Applied Research Laboratory for Intelligence and Security

We have created a prototype multi-user networked virtual reality platform designed to enhance mission-planning collaboration in a VR setting. Multiple users in different locations around the world can work together within this environment by manipulating 3D assets, which can be added, moved, and deleted at will. A permission hierarchy can be customized for each planning event: a single host can control and place assets within the scene while the others observe passively, or several hosts can manipulate the 3D assets together with no passive observers.
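
The sketch below illustrates one way such a hierarchy might be modeled; the names are ours for illustration, not the platform’s actual API:

    # Illustrative permission model: hosts may add/move/delete assets,
    # observers may only watch; roles are assigned per planning event.
    from enum import Enum, auto

    class Role(Enum):
        HOST = auto()
        OBSERVER = auto()

    class PlanningSession:
        def __init__(self):
            self.roles = {}

        def join(self, user, role):
            self.roles[user] = role

        def can_edit(self, user):
            return self.roles.get(user) is Role.HOST

    session = PlanningSession()
    session.join("alice", Role.HOST)      # a single controlling host...
    session.join("bob", Role.OBSERVER)    # ...with a passive observer,
    session.join("carol", Role.HOST)      # or multiple co-equal hosts.
    print(session.can_edit("bob"))        # -> False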

Blended Reality for Occupational Training

In various projects with industrial partners, we have created blended reality environments for occupational training, such as for the service and maintenance of complex machinery. Using multiple 360-degree scans of a facility, we create its “digital twin,” then augment it with overlaid data representing maintenance instructions or other operationally relevant information. From this foundation we build full training modules in virtual reality, adding interactive elements and virtual objects as needed to construct training narratives.
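
As one small, purely illustrative piece of such a pipeline (an assumption on our part, not production code), overlaid instructions can be anchored to view directions in the equirectangular images that 360-degree scans typically produce:

    # Sketch: map a view direction (yaw, pitch in degrees) to pixel
    # coordinates in a w-by-h equirectangular panorama of the digital twin.
    def anchor_to_pixel(yaw_deg, pitch_deg, w, h):
        x = (yaw_deg % 360.0) / 360.0 * w     # yaw wraps around horizontally
        y = (90.0 - pitch_deg) / 180.0 * h    # pitch +90 (up) to -90 (down)
        return int(x), int(y)

    # Hypothetical overlay: a valve-check note 30 deg right, 10 deg below level.
    print(anchor_to_pixel(30.0, -10.0, 4096, 2048))  # -> (341, 1137)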

Four Strings Around the Virtual World

Collaborators: Irina Muresanu, UMD School of Music

We’ve spearheaded a unique collaboration between the University of Maryland’s School of Music and College of Computer, Mathematical, and Natural Sciences, combining classical music with modern technology to transform the way people experience a concert performance. The Maryland Blended Reality Center built a prototype VR system to capture pieces from world-renowned concert violinist Irina Muresanu’s “Four Strings Around the World” project, a series of solo violin pieces representing traditional music from the Middle East, South America, Europe, China, and the United States. The resulting experience transports viewers to scenic locations around the world that evoke these culturally diverse pieces as Muresanu performs them.

Cinematic Hologram

We have built a state-of-the-art 64-camera studio that allows us to offer scalable, cinematic-quality 3D scene generation and capture from images and videos. Using tiled camera arrays to acquire training data from multiple points of view, we have developed: new compression and decompression formats suited to capturing and ingesting high-precision, dense, space-time light fields; real-time decompression and rendering of the light field at a viewpoint, with latency low enough for interactive frame rates; and software tools for editing, manipulating, streaming, and annotating light-field-based precision telepresence environments. With these developments we can produce cinematic-quality, multi-view holographic reconstructions of real people.
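
To convey the core idea of light-field rendering, here is a generic textbook-style sketch (not our production renderer): a virtual viewpoint between cameras is synthesized by bilinearly blending the nearest captured views on the camera plane.

    # Generic light-field sketch: blend the four cameras nearest a requested
    # viewpoint on an assumed 8x8 camera-plane grid (fake image data).
    import numpy as np

    views = np.random.rand(8, 8, 128, 128, 3)  # (cam_row, cam_col, H, W, RGB)

    def render(u, v):
        """u, v in [0, 7]: continuous viewpoint on the camera-plane grid."""
        i0, j0 = int(u), int(v)
        i1, j1 = min(i0 + 1, 7), min(j0 + 1, 7)
        a, b = u - i0, v - j0
        return ((1 - a) * (1 - b) * views[i0, j0] + a * (1 - b) * views[i1, j0]
                + (1 - a) * b * views[i0, j1] + a * b * views[i1, j1])

    frame = render(3.4, 5.7)   # a viewpoint no physical camera occupied
    print(frame.shape)         # -> (128, 128, 3)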

Navigable Immersive Opera Experiences

Collaborators: Craig Kier, UMD School of Music

With colleagues at the University of Maryland School of Nursing, we are exploring virtual reality and music therapy as non-opioid pain interventions. For this project, we captured and recreated a 360-degree navigable performance by the Maryland Opera Studio, which is part of a comparative study of VR environments as pain interventions. Combining the novelty and immersion of VR with the known therapeutic effects of music is a unique and promising approach to patient care, and could further our understanding of the complex relationship between music and healing.

Augmented Reality for External Ventricular Drainage

Collaborators: Greg Schwartzbauer, R Adams Cowley Shock Trauma Center

External ventricular drainage (EVD) is a high-risk procedure in which a catheter is inserted through a patient’s skull to drain cerebrospinal fluid and relieve intracranial pressure. To assist with this procedure, we have developed the Augmented Reality Catheter Tracking and Visualization Methods and System, which renders onto AR glasses both the tracked catheter as it passes through the skull into the brain and a CT scan of the brain registered to the skull. Our technique uses a new linear marker detection method that requires minimal changes to the catheter and is well suited for other thin medical devices that require high-precision tracking.
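
To give a flavor of what linear-marker tracking involves (a generic sketch of the idea, not the published method), the image-space axis of a thin marker can be recovered by fitting a line to its segmented pixels:

    # Generic sketch: recover a thin marker's image-space axis as a point
    # plus a unit direction by PCA over segmented marker pixels.
    import numpy as np

    def fit_marker_line(pixels):
        """pixels: (N, 2) array of marker pixel coordinates."""
        centroid = pixels.mean(axis=0)
        _, _, vt = np.linalg.svd(pixels - centroid)  # principal direction
        return centroid, vt[0]                       # point on line, unit axis

    # Hypothetical segmentation: noisy pixels along a slanted catheter marker.
    t = np.linspace(0, 100, 50)
    pts = np.stack([10 + 0.8 * t, 20 + 0.6 * t], axis=1) + np.random.randn(50, 2)
    point, axis = fit_marker_line(pts)
    print(point, axis)   # axis ~ +/-(0.8, 0.6), the tracked direction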

Immersive Media for Health Messaging

Collaborators: Azieb Kidanu, Pam Clark, UMD School of Public Health

Public health best practices for communicating the harms of tobacco use are well established for traditional products such as cigarettes, but there is a critical knowledge gap on how to communicate the health risks of newer products like e-cigarettes, vape pens, and hookahs. In collaboration with the University of Maryland’s School of Public Health, and with funding from the National Institutes of Health (NIH) and the Food and Drug Administration (FDA), we created a virtual hookah bar environment to test the effectiveness of immersive media in modifying young adults’ perceptions of the harm and addiction potential of waterpipe tobacco smoking.

Augmented Reality for Patient Care and Diagnostics

While medical imaging has evolved radically, images are still displayed much as they were in 1950: visual data are shown on flat 2D screens, on displays that force health care providers to look away from the patient, and even away from their own hands while operating. AR’s ability to concurrently display imaging data and other patient information could save lives and decrease medical errors. This is especially true for procedures performed outside an operating room; during “bedside procedures” patients may be at greatest risk, and AR could provide the greatest benefit. We are currently developing and testing several AR tools for patient care and diagnostics, including tools to support intubation and ultrasound imaging.

Immersive Environments for Combating Implicit Bias

Collaborators: Kris Marsh, Rashawn Ray, Long Doan, Sociology

We are adapting virtual reality and immersive environments to better understand, and intervene in, various dimensions of implicit bias (e.g., race, ethnicity, gender, sexuality, disability), and to develop new tools that law enforcement agencies in particular can use to improve training around these issues. We have developed VR scenarios that provide an immersive, lifelike environment in which officers can experience and react to situations they may encounter. We have also developed tools that track physiological data from participants as they experience different scenarios, including eye and head motion, heart rate, and voice stress.
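
A minimal sketch of the logging side (the stream names are ours, for illustration) shows the one essential requirement: every sample stamped against a single shared clock, so streams can later be aligned with scenario events:

    # Sketch: timestamp every physiological sample against one monotonic
    # clock so gaze, heart rate, and voice-stress streams can be aligned.
    import time
    from collections import defaultdict

    class SessionLog:
        def __init__(self):
            self.t0 = time.monotonic()
            self.streams = defaultdict(list)

        def record(self, stream, value):
            self.streams[stream].append((time.monotonic() - self.t0, value))

    log = SessionLog()
    log.record("gaze", (0.12, -0.40))          # normalized eye direction
    log.record("heart_rate", 88)               # beats per minute
    log.record("event", "scenario_started")    # marker for later alignment
    print({k: len(v) for k, v in log.streams.items()})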

Immersive Language Learning

Collaborators: University of Maryland Applied Research Laboratory for Intelligence and Security

In natural language acquisition environments, facial expressions and micro-expressions, eye movements, gaze directions, gestures, and other subtle body language cues are important communication mechanisms. Without such cues, it is difficult to produce the complex foreign-language scenarios needed to train advanced language learners. For our prototype language acquisition module, we record a real-life language environment using 360-degree cameras and omnidirectional microphones. The photorealistic quality of the recorded content conveys a level of realism surpassing even the best graphical renderings available. The recreation of the recorded environment not only preserves the language context, but also captures the detailed cues that are essential for high-level language learning.

Memory and Recall in Virtual Environments

Virtual reality displays afford superior spatial awareness by leveraging the vestibular and proprioceptive senses. Since classical times, people have used memory palaces as a spatial mnemonic: information is organized spatially in an environment and associated with salient features of that environment. We’ve explored whether virtual memory palaces viewed in a head-mounted display (HMD) allow users to recall information better than a traditional desktop display does. In our study, we found that virtual memory palaces in an HMD provided an 8.8% improvement in memory recall compared with traditional desktop displays.

Immersive Introduction to Coding

Collaborators: Jan Plane, Kate Atchison, Maryland Center for Women in Computing

In conjunction with the Maryland Center for Women in Computing (MCWIC), we continue to support Computer Science Connect (CompSciConnect), a series of summer day camps targeting young women in particular, whose content grows increasingly technical, with a focus on VR and AR, over the three years of middle school. Each summer, nearly 150 girls participate in the program, learning to code in Unity for virtual environments. These students see the many ways in which VR and AR can be applied, gain hands-on experience solving problems with VR and AR, and create applications with the help of graduate and undergraduate student mentors.

Exploring History in Virtual Reality

Immersive environments afford an amazing opportunity to experience history as never before. Working with the Newseum, we created a prototype virtual reality walkthrough of Cold War-era East Berlin as a complement to its Berlin Wall exhibit. Visitors could relive the anxiety of Berliners: finding themselves atop a guard tower searching for courageous wall-jumpers, sifting through artifacts in a Berlin home, discovering a hidden escape tunnel, and eventually taking a sledgehammer to the wall that divided Berlin for nearly three decades.

Non-opioid Pain Management

Collaborators: Luana Colloca, UMD School of Nursing

Over 100 million adults in the United States suffer from acute or chronic pain. Meanwhile, misuse of the opioids prescribed to treat pain has increased dramatically in recent years, and the attendant opioid epidemic has resulted in roughly 2 million Americans abusing heroin. Non-opioid strategies for treating and managing pain are therefore vital to curbing this epidemic, and we are currently pursuing several such interventions. We have studied how VR acts to reduce acute pain perception, and we are working toward a study assessing the efficacy of VR in reducing pain in ICU patients with headache after subarachnoid hemorrhage, a classic example of bench-to-bedside research.

Virtual Environments for Quadriplegic Patients

Collaborators: Maryland Institute College of Art | Game Lab, R Adams Cowley Shock Trauma Center

The Quadcade is a collaborative project between the MICA Game Lab, the Maryland Blended Reality Center, and the University of Maryland Medical Center’s Shock Trauma Center, exploring how VR games can be used in the rehabilitation of severe spinal cord injuries. We have designed a suite of VR games that use commercial VR hardware, including the Quadstick, a joystick made for quadriplegics. The games help patients through the stressful trials of breathing off a ventilator, and also help newly injured patients become familiar with adaptive technologies. We are currently testing our first prototype and designing a second phase that includes virtual social interaction.