Although it may sound like futuristic science fiction, the pursuit of virtual reality technology—which allows a user to interact with a computer-generated, simulated environment—has been around for decades. In fact, in 1968 Ivan Sutherland, a former professor at the University of Utah, created what is widely considered to be the first virtual reality and augmented reality head-mounted display system.

Since then, researchers at the U of U have continued to develop and refine such technology for purposes ranging from education and training to design and prototyping.

One such project began in 1995 with the first-generation TreadPort, a unique device capable of simulating realistic locomotion through virtual spaces. The TreadPort was designed at Sarcos, an engineering and robotics firm founded by Distinguished Professor of Mechanical Engineering Stephen Jacobsen. The Office of Naval Research funded the project, called a “locomotion interface,” for military training purposes. “Back then it was smaller and had only one screen that projected visuals,” says John Hollerbach, professor in the School of Computing and principal investigator of the project. “But it has been under continuous refinement ever since.”

As he obtained more funding for the project, Hollerbach sought to create a more realistic virtual environment by adding more mechanical display elements, such as a tilt mechanism, active vertical support, and a stereo audio display for realistic sound.

“We continued to develop more virtual aspects that would engage multiple senses,” says Hollerbach. “But one day in 2003, I heard Marc Watson, a chief ride developer at Universal Studios, say, ‘To get a sense of immersion in a virtual environment, you need a totality of sensory effects. All of your senses must be engaged to feel immersed and to suspend disbelief.’ Then I wanted to do more.”

ADDING ATMOSPHERIC AND WIND DISPLAYS

Hollerbach went on to secure funding from the National Science Foundation (NSF) through its Information Technology Research Program. He put together an interdisciplinary team of computer scientists, graphics experts, and mechanical engineers, including Mark Minor, associate professor of mechanical engineering, who heads design and control of the mechanical aspects of the system; Eric Pardyjak and Meredith Metzger, both associate professors of mechanical engineering, who are working on the flow design of the wind display; and Peter Willemsen, a former research assistant professor at the U of U, now at the University of Minnesota Duluth, who is building the graphics hardware and software.

Students building the system are drawn from mechanical engineering and computer science.

LEFT TO RIGHT: Mark Minor, Mechanical Engineering | Eric Pardyjak, Mechanical Engineering | Meredith Metzger, Mechanical Engineering | John Hollerbach, School of Computing
(courtesy photo: U of U College of Engineering 2009 Research Report)

Today, the project is called the TreadPort Active Wind Tunnel (TPAWT, pronounced “teapot”) and is unique in adding a controllable two-dimensional wind tunnel to the existing TreadPort system.

The system houses a 6’ x 10’ computer-controlled treadmill surrounded by three projection screens that provide a 180-degree horizontal field of view. Under computer control, the treadmill can tilt 20 degrees up or down to simulate walking uphill or downhill, and the user can walk or run at speeds of up to eight miles per hour.
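
For a rough sense of what “under computer control” involves, the sketch below maps a virtual terrain slope and the user’s pace to tilt and belt-speed commands, clamped to the limits cited above. It is a minimal illustration in Python, not the project’s actual control software; all names and the slope-to-tilt rule are made up for illustration.

    import math

    # Limits taken from the article: +/-20 degrees of tilt, 8 mph top speed.
    MAX_TILT_DEG = 20.0
    MAX_SPEED_MPH = 8.0

    def treadmill_command(terrain_slope, user_speed_mph):
        """Map the virtual terrain slope (rise over run) and the user's pace
        to (tilt_deg, belt_speed_mph), clamped to the hardware limits."""
        tilt_deg = math.degrees(math.atan(terrain_slope))
        tilt_deg = max(-MAX_TILT_DEG, min(MAX_TILT_DEG, tilt_deg))
        belt_speed = max(0.0, min(MAX_SPEED_MPH, user_speed_mph))
        return tilt_deg, belt_speed

    # Example: a 10 percent uphill grade at a brisk walk.
    print(treadmill_command(0.10, 4.0))  # ~(5.7 degrees, 4.0 mph)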

“Currently we have graphics simulating the Wasatch Mountains and we’re working on making them more realistic,” says Pardyjak. “We’re also designing graphics to simulate a specific city, such as downtown Salt Lake City.”

The wind component of TPAWT is an actively controlled wind tunnel that allows wind angle and speed to be regulated. Wind speeds can reach 20 miles per hour.

“We project a virtual world onto the screens, and the graphical environment displays artifacts of the wind, while the wind generation system creates and controls the wind flow,” says Minor. “We can steer the wind along both sides of the screens, which creates the illusion for the user that the wind is literally coming from the screen.”
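
The article does not detail the control law, but the flavor of steering wind in from the sides can be suggested with a toy mixing rule: split a desired apparent wind (angle and speed) into a stronger flow on one side and a weaker flow on the other. Everything below, names and formula alike, is an illustrative guess, not the TPAWT controller.

    import math

    MAX_WIND_MPH = 20.0  # top wind speed cited in the article

    def side_jet_commands(angle_deg, speed_mph):
        """Toy mixing rule: split a desired apparent wind into left/right
        side flows. angle_deg is measured from straight ahead; positive
        means the wind appears to come from the user's right."""
        speed = max(0.0, min(MAX_WIND_MPH, speed_mph))
        angle = math.radians(max(-90.0, min(90.0, angle_deg)))
        bias = 0.5 + 0.5 * math.sin(angle)  # 0 = all left, 1 = all right
        return {"left_mph": speed * (1.0 - bias), "right_mph": speed * bias}

    print(side_jet_commands(0.0, 10.0))   # head-on wind: both sides equal
    print(side_jet_commands(45.0, 10.0))  # wind from the right: right side dominates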

NEXT STEPS

The research team is currently designing and adding displays for radiant heat, cold, and smell to enable users to engage in a completely immersive virtual environment.

“If you walk out of a shadow into the sun, you would be able to feel the heat of the sun,” says Hollerbach. “Also, you could smell the trees while walking in the forest, or smell exhaust from a bus in the city and feel the whoosh of the air as it passes by you.”

The team has identified a number of possible uses for the completed TPAWT system.

“If you have a faithful virtual environment, it could be used for all kinds of educational purposes,” says Hollerbach. “If you can simulate reality, then you can use it for emergency response, rehabilitation for spinal cord injuries, training, psychological studies, exercise, and recreation.”

“I think one of the main uses of this system is for training first responders to locate contaminants, such as those from a dirty bomb or a chemical leak,” adds Minor. “You can use the wind to simulate how chemicals would move through the air in order to contain a leak.”
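
To make the dispersion idea concrete, a standard textbook way to estimate how a continuous release drifts downwind is the steady-state Gaussian plume model, sketched below. The release rate, wind speed, and dispersion coefficients are illustrative placeholders; the article does not describe which models the team actually uses.

    import math

    def plume_concentration(x, y, z, Q=1.0, u=5.0, H=2.0):
        """Steady-state Gaussian plume with ground reflection.
        x: downwind distance (m), y: crosswind offset (m), z: height (m),
        Q: release rate (g/s), u: wind speed (m/s), H: release height (m).
        Returns concentration in g/m^3."""
        if x <= 0:
            return 0.0
        sigma_y = 0.08 * x  # crude linear plume spread; real models tie
        sigma_z = 0.06 * x  # these coefficients to atmospheric stability
        crosswind = math.exp(-y**2 / (2 * sigma_y**2))
        vertical = (math.exp(-(z - H)**2 / (2 * sigma_z**2))
                    + math.exp(-(z + H)**2 / (2 * sigma_z**2)))
        return Q / (2 * math.pi * u * sigma_y * sigma_z) * crosswind * vertical

    # Concentration 100 m downwind, on the plume centerline, at nose height.
    print(plume_concentration(100.0, 0.0, 1.5))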

As part of an NSF follow-on project, Pardyjak and his collaborators plan to use the system to aid in city planning and building.

“We want to conduct simulations to optimize the design of cities for air quality and energy efficiency,” says Pardyjak. “We will take technology we’ve developed for TPAWT and run simulations, allowing us to move buildings and create different designs to see how air quality and energy efficiency are affected.”
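
The article does not name the simulation tools, but the shape of such a study is easy to sketch: vary a design parameter, rerun a model, and compare a metric. In the toy sweep below, the “model” is a placeholder based on the rule of thumb that deep, narrow street canyons trap pollutants; a real study would run a full flow-and-dispersion simulation for each candidate layout.

    def toy_air_quality(street_width_m, building_height_m=20.0):
        """Placeholder metric: relative pollutant trapping grows with the
        canyon aspect ratio (building height / street width). Lower is better."""
        return building_height_m / street_width_m

    # Sweep candidate street widths and rank the designs, best first.
    for width in sorted([10.0, 15.0, 20.0, 30.0], key=toy_air_quality):
        print(f"street width {width:4.1f} m -> "
              f"relative trapping {toy_air_quality(width):.2f}")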

Reprinted from the University of Utah College of Engineering 2009 Research Report