Haptic hardware offers waterfall of immersive experience, could someday aid blind users
Increasingly sophisticated computer graphics and spatial 3D sound are combining to make the virtual world of games bigger, badder and more beautiful than ever. And beyond sight and sound, haptic technology can create a sense of touch—including vibrations in your gaming chair from an explosion, or difficulty turning the wheel as you steer your F1 racecar through a turn because of g-forces.
While all this typically relies on force feedback using mechanical devices, University of Maryland researchers are now offering a new take that delivers lifelike haptic experiences with controlled water jets. But don’t worry—you and your surroundings will stay dry.
The team presented a paper and demoed its new waterjet-based technology known as JetUnit at the ACM Symposium on User Interface Software and Technology (UIST 2024) last month in Pittsburgh.
In addition to its use in gaming, the system might one day aid blind users by providing force feedback cues for spatial navigation and other interactions, thus enhancing accessibility, said project leader Zining Zhang, a computer science doctoral student in the Small Artifacts (SMART) Lab.
The UMD device offers a wide range of haptic experiences, from subtle sensations that mimic a gentle touch to sharp pokes that feel like a needle injection.
User testing has demonstrated that JetUnit is successful at creating diverse haptic experiences within a virtual reality setting, with participants reporting a heightened sense of realism and engagement. Compared to other available haptic technologies, the system can create an unusually broad range of forces based on changes in water pressure.
The key is a thin, watertight membrane made of nitrile—the same synthetic rubber used for sterile gloves—attached to a compact, self-contained chamber that isolates the water from the user. To reduce water turbulence inside the chamber, the team also 3D-printed several internal structures that allow strong forces to be transmitted through the membrane without any leakage.
Zhang said the UMD team decided to try contained water jets because water, as an incompressible fluid, transfers energy efficiently. Water also posed the challenge of keeping users dry, however, and that became a major focus as the team developed the system, she said.
Zhang said she received significant feedback as the project progressed from her adviser, Huaishu Peng, an assistant professor of computer science who is director of the SMART Lab.
Other help on the project came from Jun Nishida, an assistant professor of computer science whose research involves wearable technologies for cognitive enhancement.
Both Peng and Nishida have appointments in the University of Maryland Institute for Advanced Computer Studies, which Zhang credits with providing a “rich, collaborative environment” for the team to do their work.
The Singh Sandbox makerspace and its manager, Gordon Crago, also assisted, providing an array of tools and lighting hardware as the team was developing the JetUnit prototype.
Looking ahead, Zhang envisions integrating thermal feedback via multi-level water temperature control and expanding JetUnit to become a full-body haptic system, paving the way for even more “sensational” uses.
More information:
Zining Zhang et al, JetUnit: Rendering Diverse Force Feedback in Virtual Reality Using Water Jets, Proceedings of the 37th Annual ACM Symposium on User Interface Software and Technology (2024). DOI: 10.1145/3654777.3676440
University of Maryland
Citation:
Haptic hardware offers waterfall of immersive experience, could someday aid blind users (2024, November 12)
retrieved 13 November 2024
from https://techxplore.com/news/2024-11-haptic-hardware-waterfall-immersive-aid.html
This document is subject to copyright. Apart from any fair dealing for the purpose of private study or research, no part may be reproduced without written permission. The content is provided for information purposes only.