Using VR to Study the Senses in Context



The past few decades of brain research on multisensory integration have demonstrated that many of our taken-for-granted ideas about how the senses work are misguided. Among other things, we have learned that auditory, visual, or tactile perception very rarely happens without being affected by the other senses. Sensory perception always occurs in the context of other senses: the way you process the sounds you hear, for example, is fundamentally changed by what you are seeing. Still, most perception research has relied on simple computer interfaces, using flashes of white light and bursts of white noise to probe how the human sensory system works. Recent conversations in the field have complicated this body of research, with mounting evidence that other contextual factors shape how sensory information interacts. Pushing the idea of contextual sensory research further, some researchers argue that these relatively simplistic studies tell us little about how the brain makes sense of information coming from the eyes, ears, and elsewhere in the noisier, more distracting contexts of the real world and lived life. So how can we reconcile basic knowledge about what the brain does out of context, in a laboratory setting, with the difficulty of studying what the brain does elsewhere, in the unpredictable settings where how the sensory environment comes together actually matters?


...

Example of a classic minimal psychophysics experiment. The research participant presses a button when they see the white circle or hear the noise that might accompany it.
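
To make the structure of such a task concrete, here is a minimal sketch, in plain Python with hypothetical condition names, of how trials in a classic audiovisual detection experiment are typically scheduled: each stimulus type appears many times in shuffled order, with jittered onsets so participants cannot anticipate the stimulus.

```python
import random

# Hypothetical conditions for a classic audiovisual detection task:
# a flash alone, a noise burst alone, both together, or neither (catch trial).
CONDITIONS = ["visual_only", "auditory_only", "audiovisual", "catch"]
TRIALS_PER_CONDITION = 20

def build_trial_schedule(seed=0):
    """Return a shuffled list of trials with randomly jittered onset delays.

    Jittering the delay before each stimulus keeps participants from
    anticipating its onset.
    """
    rng = random.Random(seed)
    trials = [
        {"condition": c, "onset_delay_s": rng.uniform(0.5, 1.5)}
        for c in CONDITIONS
        for _ in range(TRIALS_PER_CONDITION)
    ]
    rng.shuffle(trials)
    return trials

if __name__ == "__main__":
    for trial in build_trial_schedule()[:5]:
        print(trial)
```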


I worked on neuroscience projects investigating how the human brain integrates auditory and visual information in complex environments more like the ones we experience outside of laboratories. To study what the brain does in a setting that is both controlled and relatively realistic, we used immersive “virtual reality” technology to fabricate more complex environments and scenarios. Because the development of immersive technologies continues to outpace that of mobile “in situ brain imaging”, VR and related technologies remain valuable tools for neuroscientists and psychologists trying to understand brain activity outside the lab.


...

Sound-proofed VR room with an Oculus Rift and mounted surround-sound speakers, which we used to study audiovisual integration in the brain


For this project we used an Oculus Rift HMD, a surround-sound speaker system, and VR environments I built using Vizard, Python, and 3DS Max to create an immersive version of classic multisensory psychophysics experiments. Our initial sketches of these environments emphasized different kinds of contextual features we could exploit to get a range of simple and complex virtual worlds. By making some of the environments minimal, more like the classic psychophysics experiments, we could look for patterns in how the brain extracts multisensory information from more complex versus less complex environments. This ability of VR to produce either highly realistic or stripped-down worlds let us investigate the ‘ecological validity’ of experimental paradigms: how the results of past brain experiments might scale to the real world.
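
To give a sense of what this looks like in practice, here is a minimal sketch of a single detection trial in Vizard. This is not our actual experiment code: the model and sound file names (lab_room.osgb, noise_burst.wav) are placeholders, and HMD setup is omitted for brevity.

```python
import viz
import vizshape
import viztask

viz.setMultiSample(4)
viz.go()  # start the Vizard renderer (Rift setup omitted)

# Load an environment model exported from 3DS Max; the file name is a placeholder.
environment = viz.addChild('lab_room.osgb')

# A small white sphere serves as the visual target, hidden until the trial starts.
target = vizshape.addSphere(radius=0.05)
target.setPosition([0, 1.8, 3])
target.color(viz.WHITE)
target.visible(viz.OFF)

# White-noise burst to pair with (or replace) the visual target.
noise = viz.addAudio('noise_burst.wav')

def detection_trial(condition, flash_duration=0.1):
    """Present one trial and print the participant's response time."""
    yield viztask.waitTime(1.0)  # inter-trial interval
    start = viz.tick()
    if condition in ('visual_only', 'audiovisual'):
        target.visible(viz.ON)
    if condition in ('auditory_only', 'audiovisual'):
        noise.play()
    yield viztask.waitTime(flash_duration)
    target.visible(viz.OFF)
    yield viztask.waitKeyDown(' ')  # participant presses spacebar on detection
    print(condition, 'response time:', viz.tick() - start)

viztask.schedule(detection_trial('audiovisual'))
```

A real trial loop would iterate over a full shuffled schedule like the one sketched earlier and would also accept responses made while the stimulus is still on screen; this version waits for the flash to end first, purely to keep the sketch short.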

On one end, my responsibilities in this project included coding VR environments using the Vizard Python library and development platform,1 reformulating research questions to meet technically achievable goals, creating a data system for organizing and analyzing the vast amounts of information VR makes accessible, and performing extensive hardware troubleshooting and modification to retool entertainment devices to the needs and standards of a scientific experiment. On the other end, I was responsible for recruitment, scheduling, study moderation, debriefing, and, not least, introducing people to VR for the first time (this was 2014-15).
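
On the data side, the per-trial record keeping involved looks roughly like the sketch below (plain Python; the field names are hypothetical). VR exposes head pose and timing information on every frame, so even a simple design accumulates a lot of structured data that has to be organized consistently across participants and sessions.

```python
import csv
from pathlib import Path

# Hypothetical per-trial fields: one row per trial, keyed by participant
# and session so runs can be concatenated for analysis later.
FIELDNAMES = [
    "participant_id", "session", "trial", "condition",
    "environment", "response_time_s", "detected",
]

def append_trial_record(path, record):
    """Append one trial's results to a per-participant CSV, writing the
    header only when the file is first created."""
    path = Path(path)
    is_new = not path.exists()
    with path.open("a", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=FIELDNAMES)
        if is_new:
            writer.writeheader()
        writer.writerow(record)

append_trial_record("p01_session1.csv", {
    "participant_id": "p01", "session": 1, "trial": 1,
    "condition": "audiovisual", "environment": "rich",
    "response_time_s": 0.342, "detected": True,
})
```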


... Sketches of the trial-structure schematic and room-dimension testing
... Low-fi wireframe demonstrating the final experiment run-through
... One of the virtual environments used in the study, built to look like the experiment room
... An environment I designed for our second round of VR multisensory studies


I also laid the groundwork in research planning and VR environment design for the lab’s follow-up VR study on “cross-modal attentional cuing” in complex environments, completed after I had graduated from the lab.

Our two-part multisensory detection study led to unexpected findings about how the brain integrates sensory information in more complex environments, which we published in Multisensory Research.2 The research also served as a proof of concept for using VR technologies to replicate past research and verify principal findings in neuroscience. I presented our results at the annual Society for Neuroscience conference and gave a related discussion-panel talk at an interdisciplinary symposium on synesthesia and the intersection of the humanities and sciences.



  1. Vizard is a VR-focused IDE and development platform, similar to the more popular Unity engine but geared toward research applications. 

  2. Bailey, H. D., Mullaney, A. B., Gibney, K. D., & Kwakye, L. D. (2018). Audiovisual integration varies with target and environment richness in immersive virtual reality. Multisensory Research, 31(7), 689–713. doi:10.1163/22134808-20181301