What Experts Predicted vs. Where Haptics Actually Is
A Special Edition of All Things Haptics
Dear Haptics Insider,
Over the past few years, a set of overlapping technical predictions about haptics surfaced in conversations with researchers, founders, and product leaders building real systems. Different domains, different constraints, but a surprising amount of alignment. Many of those conversations happened through Haptics Club, a podcast centered on haptics.
This post revisits those predictions and places them alongside what has actually reached the market.
1. Spatial Computing: Hands vs. Controllers
The industry is currently divided on whether the future of spatial computing should rely on bare hands or dedicated hardware.
Forecast
In February 2023, Tom Carter, CEO of Ultraleap, a company focused on hand-tracking technology for XR, stated that hand tracking had become “table stakes” and described the use of hands as the “correct interface for that computing platform.”
Later in 2023, Denny Unger, CEO of Cloudhead Games, a VR game studio, countered that relying solely on hand tracking removes a primary sense, touch, from the digital experience. He added that 90% of the development community relies on tactile buttons.
In Market
The Apple Vision Pro launched in 2024 with a controllerless, hand-tracking-only interface, establishing a hands-only interaction model for a commercial spatial computing platform. (Apple)
In parallel, companies such as Doublepoint are using neural networks on smartwatches to enable wrist-based spatial input with tactile feedback. (Doublepoint)
In 2025, Samsung’s Galaxy XR launched with separate controllers available, combining hand tracking with conventional input in a commercial spatial computing headset. (Samsung)
Source
• HC Episode #36 - Jan 20, 2023 - Making digital worlds feel more human, with Tom Carter (00:49)
• HC Episode #46 - Oct 23, 2023 - Haptics in VR – Live with Denny Unger (08:16)