A Virginia Tech researcher is expanding the boundaries of virtual reality, connecting virtual and physical worlds in unprecedented ways to give users real world experiences that engage all the senses.

Virtual reality simulates the natural world with 3D images, 360-degree views, and immersive audio, but it has its limitations. With his project “Virtual Sensory Convergence,” Thomas Tucker, associate professor in Virginia Tech’s School of Visual Arts and fellow with the Institute for Creativity, Arts, and Technology (ICAT), takes an immersive virtual reality experience and extends its borders outward. An exploration of reality through virtual reality technology, the work uses a creative approach to activate more senses in a virtual setting.

This work is supported by the Institute for Creativity, Arts, and Technology and aligns with the institute’s focus area, “Amplifying the Arts.” This collection of projects exemplifies the many ways art and technology continually define and redefine the world and demonstrates how art presents opportunities to traverse domains, often in unexpected ways. “Virtual Sensory Convergence” shows how the collision of art, technology, and experimentation can drive innovation in all directions.

Tucker has transformed a virtual experience that can be isolating into a world where participants interact with their surroundings. This world includes physical props that the user can reach out, touch, and manipulate — objects controlled by live puppeteers.

Designed to be experienced by one person at a time, the project promotes unhindered curiosity, interactivity, and imagination. In the simulation, the user might see an object in the virtual world, reach out and touch it in the physical world, and then experience a combination of effects in both realities. For example, the user may step on a shape he or she sees on the floor, triggering a consequence such as the emergence of an object. The user could then move that object, producing a pitch-shifting sound and causing more objects to appear. As the user moves among and interacts with these objects, a symphony of sounds builds, and the user may discover that each object has its own smell or feel a pulse of air on his or her skin.

“My intention for this body of work is to fully immerse the participant into a re-creation of my own unique abstract world where they create quasi-emotional bonds with the objects within the virtual reality scene,” said Tucker. “As part of a five-year goal, this project will go through several technological iterations and thematic cycles where each will build on the other and grow over the years. My overall goal is to develop a new kind of engaging artistic experience that integrates virtual reality with all five senses for a unique kind of performance.”

Tucker’s work as a visual artist has involved creating spatial environments that dynamically represent inner vision and 3D form. Over the years, the ways he manipulates space, sound, and visual images have been shaped by the many technologies he uses. What started out as simple hand drawings years ago has progressed to complex animations, projection mapping installations, and virtual reality.

Tucker received funding from ICAT’s Research Leave Extension program for sabbatical time to focus on “Virtual Sensory Convergence.” This support helped him foster interinstitutional collaborations and partnerships in New Zealand with Victoria University of Wellington, the University of Auckland, and the Miramar Creative Centre. The work was also featured in the Ars Electronica Garden Aotearoa New Zealand, which showcases projects from artists, media companies, and scientists and researchers from New Zealand’s tertiary institutions.

“This project has been a huge collaboration of many talented people from around the world who have given both their time and passion to this endeavor,” Tucker explained. “I am grateful to Ben Knapp and the support members at ICAT for giving me enough runway during my research leave to develop this vision to completion. This project would not have happened without the expertise of David Franusich, who solved all the technical challenges, helped with the coding, and even stepped in to be one of the vocal talents for the sound.” 

- Peyton Manfre contributed to this story