To get some sense of the future for Apple’s new Vision devices, early adopters might want to explore ARway.ai, which seems to deliver some of the elements of the augmented reality (AR) vision we think Apple will eventually achieve.
It delivers mapped, location-based AR content to your surroundings as you move through them wearing the Apple headset. There's a video that gives a sense of these experiences here.
Why is this early-adopter territory?
At this time — and at the current Vision Pro price — few of us will want to walk through crowded shopping malls for fear of ridicule or robbery. Not only that, but the weight of the headset means it really isn't the prime-time device we know Cupertino is dreaming of. Eventually, it will become smaller and lighter, which means it will be something people choose to wear as they get around.
That time hasn't yet come, but you have to expect Apple is reasonably convinced it can iterate on the existing product until it gets there. That is when these solutions will reach the mass market; right now (as Box CEO Aaron Levie told me recently), the first-generation spatial computing device is for professional markets.
Those pro markets include enterprise users who can already see value in what’s available, convinced Apple hobbyists, and developers/adopters who want to kick Apple reality around to see what it does now — and where it may be going later on.
With these tools we build unreality
So, with that preamble, what does ARway actually do? The basic idea is that it merges the digital and physical worlds. The focus of the company right now seems to be on augmented retail experiences.
That’s just one way of handling the tech.
Other ways might be within warehousing, where the Vision device might direct staff to relevant holds; in hospitals, to make essential equipment easier to find and patient data easier to see; or when handling complex building or maintenance tasks, in which blueprints could be overlaid atop what is already there. Curated tours of visitor attractions also seem part of the plan, given ARway has reached a deal to create these experiences with museums and tourist attractions in Saudi Arabia.
What ARway provides
To achieve this, ARway offers its own SDK, which provides tools across both iOS and visionOS. Maps are created using AI, which automates the creation of 400,000-square-foot 3D spatial maps from 2D floor plans. The idea is that you can build these augmented mixed-reality experiences for both iPhones and Vision devices.
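To give a sense of what building this kind of anchored content involves, here's a minimal sketch using Apple's publicly documented RealityKit framework rather than ARway's SDK (whose API isn't covered here); the function name, label text, and anchor position are all hypothetical placeholders.

```swift
import RealityKit
import UIKit

// Minimal sketch: pin a wayfinding label to a fixed point in mapped space.
// This uses Apple's RealityKit, not ARway's SDK; in a real indoor-navigation
// app the anchor transform would come from the spatial map, not a hard-coded offset.
func addWayfindingLabel(to arView: ARView) {
    // Anchor two meters in front of the session origin (hypothetical position).
    let anchor = AnchorEntity(world: SIMD3<Float>(0, 0, -2))

    // A simple 3D text entity standing in for a retail wayfinding marker.
    let mesh = MeshResource.generateText(
        "Aisle 4: Electronics",   // placeholder label
        extrusionDepth: 0.01,
        font: .systemFont(ofSize: 0.1)
    )
    let material = SimpleMaterial(color: .white, isMetallic: false)
    let label = ModelEntity(mesh: mesh, materials: [material])

    anchor.addChild(label)
    arView.scene.addAnchor(anchor)
}
```

The interesting part in ARway's case is where that anchor transform comes from: the 2D-floor-plan-to-3D-map step the company says its AI automates.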
In a statement, ARway CEO Evan Gappelberg said: "This is a huge opportunity for early investors to invest in the next multi-decade multi-trillion dollar megatrend… which is spatial computing and augmented reality technology."
Whether these technologies will actually live up to such hype remains to be seen. It's also fair to say ARway is just one company in a busy space, with plenty of firms working to offer their own tools. Google's ARCore, Unity, and iStaging (among many others), and even Apple and Adobe, have relevant tech to bring to bear.
Who will build the code?
In the end, all these vying platforms proffer prophecies of the kinds of emerging AI-supported low/no-code development environments we have to expect will be used to build digital twins of the entire planet.
Realistically, given the shortage of human developers, AI will then likely play a much bigger part in coding these tools for a spatially augmented planet.
That means if you've somehow gotten hold of one of Apple's Vision Pro devices, you should be exploring what feeds this emerging ecosystem, rather than sitting back watching wall-size movies or shows in virtual spaces. If that's all you can think to do with it, perhaps you should just get a cinema ticket and a bigger TV.
But if you’re wise, you’ll recognize a future is coming. You should see what it might be.
Please follow me on Mastodon, or join me in the AppleHolic’s bar & grill and Apple Discussions groups on MeWe.