If you've ever played a game that supports hand tracking on any Meta Quest headset, you'll know that the experience usually isn't very good. Even games that are well-designed and built from the ground up for hand-tracking, like Silhouette, are more frustrating than fun.
But Rogue Ascent is something completely different. I'm not sure what sort of voodoo magic developer Nooner Bear Studio is using, but it's unlike anything anyone else has done up to this point. It's easily the best hand tracking game on Quest 2, even though games like Little Cities already do it pretty well.
And that's why I'm no longer worried about the Apple Vision Pro only using hand tracking when games inevitably debut on the expensive headset. While Apple didn't show off any actual VR games at WWDC, I fully expect the headset to help evolve hand-tracked VR games when it launches early next year.
Hand tracking is finally useful
The early days of hand tracking — by that, I mean the first three years — were very rough. The best hand tracking games for Quest during that time produced a few diamonds in the rough, but even those were incredibly finicky more often than I'd like.
For the most part, I didn't blame the games. Hand tracking is difficult to get right, especially when developers had to rely on the four basic cameras on the outside of the Quest 2 or Quest Pro. But Rogue Ascent does it differently. It's using Meta's most recent version of hand tracking — which drastically improves accuracy and speed thanks to AI-powered smarts — plus its own blend of magic.
Don't believe me? See it for yourself in my video below and pay attention to a few key things:
1. The speed at which I can move while the game still accurately registers my hand movements.
2. The accuracy of the aiming, which is dead-on — on par with a controller.
3. The ability to move around as elegantly and smoothly as with a controller.
As you might have guessed by now, Rogue Ascent is a roguelike shooter that tasks players with making it as high up a tall building as possible by blasting their way through each floor's obstacles. Each area ends with a quick conversation with an NPC, then a trip up the elevator to the next level.
Like other roguelikes, you'll eventually die and be transported back to a hub area. Your character receives some permanent upgrades earned through your travels, while other upgrades and weapons are accumulated only for an individual run up the tower.
It's a familiar formula that feels fresh thanks to the novel input mechanic and the fact that it actually works.
Apple still has work to do
While we know some of the tech that's inside the Apple Vision Pro (herein called AVP), we're still not sure how far Apple's current tech can take VR gaming. Apple primarily showed off its eye-tracking tech at WWDC — not the hand-tracking tech — and it's quickly becoming clear why.
Apple isn't ready to give you a fully hand-tracked experience just yet. Where's the proof? It's right in the company's own presentations of how its tracking tech works. The developer of Smash Drums on Quest quickly pointed this out on Twitter:
The [latency/input lag/you name it] is WILD and definitely not ready for prime time. Oculus Quest hand tracking v1.0 was already better than this when it launched 2 years ago, and it's evolved a lot in the meantime... Step it up, Apple! This is not even MVP level here 😬 https://t.co/WGC8FN5huR — June 7, 2023
If you watch the video, you'll see how far behind the AVP's actual tracking is versus the arm movements from the person in the video. The thumb tracking, in particular, lags far behind even the extremely slow rotation movements of the person's arms.
Meanwhile, in the Rogue Ascent video above, I'm flinging my arms around without even considering my speed, which highlights at least one weakness in Apple's current hardware iteration.
All of Apple's demos mainly rely on your eyes looking at a virtual object — which, by all accounts, is the best eye tracking any headset has to date — and a simple "pinch" to confirm your selection.
While this feels "magic" by normal Apple standards — a word used by many folks who got to try the headset at or before WWDC — Apple looks to be heavily relying on its eye-tracking tech instead of hand tracking.
And while some demos showcase intelligent-enough hand tracking recognition, these demos mostly rely on a user holding their arm still to allow a butterfly to land on a finger, for example.
But I have little doubt that Apple will continue to improve this experience as time goes on. The headset has at least half a year until it's actually available for purchase which means plenty of time for the behind-the-scenes tech to mature.
Apple does world anchors better than any XR company on the market right now — even Meta — because it's been building AR tech into the iPhone for years. This tells me that any limitation we're seeing is because hand tracking doesn't have the iPhone's long AR history to build on, not because Apple is somehow incompetent.
Rec Room VR footage on Apple Vision Pro! Cool stuff! Thoughts? https://t.co/FWwc0gxk1M pic.twitter.com/xshO05usrO — June 7, 2023
Plus, as developers have confirmed, Unreal Engine and Unity both support the AVP, and the developers of popular titles like Rec Room and Fruit Ninja have already confirmed that AVP versions are on the way. You can see Rec Room running on an Apple Vision Pro in the Tweet above, and while it looks smooth, some have pointed out that the fidelity looks worse than similar hand-tracked games on Quest.
Regardless of early quality, having more developers onboard that can share experiences between devices is seriously great news for the future of XR, even if Apple only calls its headset a "spatial computing device."