A team at the University of Cambridge has developed an immersive virtual environment in which users can summon and switch between a suite of 3D design tools using simple hand gestures.
Using machine learning, the team developed ‘HotGestures’, named after the hotkeys found in conventional desktop software. These gestures let creators build and edit virtual objects without navigating menus, preserving their creative flow and concentration. Although gesture-driven interfaces have long been a staple of science-fiction films, the team says this is the first time they have been made practical. Their findings are reported in the journal IEEE Transactions on Visualization and Computer Graphics.
Although VR has been hyped as a technology that could transform many sectors, it has yet to win over a broad audience beyond gamers. “VR should bestow special advantages, yet most people are not eager to engage with it for long stretches,” said Professor Per Ola Kristensson of Cambridge’s Department of Engineering, who led the project. “Even setting aside the allure of visually rich environments and concerns about comfort, VR has not delivered experiences that are unattainable in the physical world.”
Regular users of desktop applications are accustomed to hotkeys: rapid keyboard commands such as Ctrl+C and Ctrl+V for copy and paste. These shortcuts avoid the cumbersome process of navigating menus, although they require prior knowledge of the commands.
“We aimed to reimagine hotkeys for VR, creating an intuitive system that doesn’t presuppose familiarity with the commands,” explained Kristensson, who is also co-director of the Centre for Human-Inspired Artificial Intelligence. Rather than keystrokes, Kristensson’s group devised ‘HotGestures’, which activate tools in VR by recognizing natural hand movements.
For instance, a cutting gesture instantly brings up a pair of virtual scissors, while a mimed spraying motion calls up a virtual spray can. With no menus to scroll through and no hotkey combinations to memorize, users can fluidly switch between tools by changing their gesture as they work, removing the interruptions typically caused by menus or physical controllers. “In our everyday communication, we naturally gesticulate, so it was a logical step to incorporate gestural dialogue into virtual scenes,” Kristensson remarked.
Their research involved building a gesture-recognition neural network capable of identifying ten distinct hand gestures tied to common modeling actions, such as sketching, sculpting, or modifying virtual structures.
In experiments, participants completed tasks using HotGestures, traditional menus, or a combination of the two. The gestural approach provided fast, effortless shortcuts for selecting and operating tools, and participants praised its novelty, speed, and simplicity; it also worked well alongside the usual menu-based interaction. The system reliably distinguishes intended commands from casual hand movements, ruling out unintended activations, and proved faster than menu-driven interfaces.
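The core idea of mapping recognized gestures to tools while rejecting casual hand motion can be sketched in a few lines. This is a deliberately simplified stand-in for the team’s neural-network recognizer: here a nearest-template classifier with a rejection threshold plays the role of the network, and the gesture names, feature vectors, and threshold value are all illustrative assumptions, not details from the paper.

```python
import numpy as np

# Illustrative subset of gesture labels; the real system recognizes ten.
GESTURES = ["scissors", "spray", "sketch", "sculpt"]

class GestureRecognizer:
    """Nearest-template classifier with a rejection threshold.

    Each gesture is represented by a template feature vector (imagine
    averaged hand-joint angles). Input frames whose best match is too
    far from every template are rejected as casual, non-command motion,
    so ordinary hand movement does not trigger a tool switch.
    """

    def __init__(self, templates, threshold=0.5):
        self.labels = list(templates)
        self.matrix = np.stack([templates[g] for g in self.labels])
        self.threshold = threshold  # max distance for a confident match

    def classify(self, features):
        dists = np.linalg.norm(self.matrix - features, axis=1)
        best = int(np.argmin(dists))
        # Reject ambiguous input: no unintended tool activations.
        return self.labels[best] if dists[best] < self.threshold else None

# Toy 4-D "hand features": one template vector per gesture.
templates = {g: np.eye(len(GESTURES))[i] for i, g in enumerate(GESTURES)}
rec = GestureRecognizer(templates, threshold=0.5)

print(rec.classify(np.array([0.9, 0.1, 0.0, 0.0])))  # close to "scissors"
print(rec.classify(np.array([0.4, 0.4, 0.4, 0.4])))  # ambiguous -> None
```

In the published system, a neural network produces the per-gesture scores from a stream of hand-tracking data, but the rejection step plays the same role: only a confident match switches tools.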
“Current VR systems can’t offer this functionality,” noted Kristensson. “If VR usage is comparable to mouse and keyboard interfaces, then it’s not maximizing its potential. It should empower users with remarkable, otherwise unattainable abilities.” The researchers have openly shared their source code and data, inviting VR developers to integrate this innovation into upcoming experiences.
Kristensson envisions this approach becoming a new norm for VR interactivity. “We’ve been tied to outdated metaphors like the desktop filing system for too long. It’s time for fresh, transformative methods of engagement with our devices. When executed well, VR has the potential to be utterly enchanting.”