As I prepped for Virtual Vector's launch, I thought about how long it might be until some new details about Apple's N301 headset would make headlines. Just a few weeks, it turns out.
Wayne Ma continued his streak of headset scoops at The Information on Friday with a report led by Apple's plans for iris scanning, a feature that could become the company's equivalent of Face or Touch ID for immersive headsets or glasses. Other headset differentiators discussed include downward-facing cameras to track the lower body, a weight less than that of Meta's Quest Pro, and support for magnetic prescription lens inserts.
Does any of this sound like a total slam dunk to you? The more we hear about its premium features, the easier it gets to believe that Apple really might launch this headset at $3,000. But, just to use iris scanning as an example, will the pitch of making it "easier for multiple people to use the same device" and securing payments be worth the expense and complexity at this time?
I might feel some of the bullish enthusiasm that many others have about Apple entering the market if we knew more than frustratingly little about its content plans. The hardware story is different. I don't doubt for a second that Apple is capable of releasing a headset that pushes today's boundaries–if anything, it's difficult to imagine Apple pulling the trigger if it hasn't locked in a few features it feels confident boasting about.
What I really struggle to get on board with is the idea that Apple's device itself will have anything uniquely meaningful in store. Over the past year we've seen Meta and others make big hardware moves in anticipation of competition from Apple, most recently with the reveal of the Quest Pro. Meanwhile, every detail that's been filled in around the story of Apple's pivotal standalone-or-bust moment has painted a picture of a headset design team that's facing the same challenges as everybody else. Apple could be the first to take a crack at solving certain headset woes, like the "alienation" issue, but there's little chance it's the only company driving toward a given approach.
We've now heard enough about the headset's features and development to have a decent sense of what's in store, and I'll happily eat my words if we're shown anything to the contrary. Now, though, it's clearer than ever that the truest "new" development when Apple launches might not be the hardware itself, but the ways the XR industry then contorts in reaction to it. Frankly, I can't imagine that story being overshadowed by one about specific sensors, weight, or resolution.
A triple from Google
Yes, Google recently made a point of signaling a new era by announcing plans to test AR prototypes in public, but notable XR news still comes out of Mountain View only sporadically. This week, though, Google has been busy: the company made three announcements that are worth a look.
- Project Starline, Google's line of immersive call booths kitted out with light field displays and numerous sensors, is moving into an early access program. So far, Google has identified Salesforce, WeWork, T-Mobile, and Hackensack Meridian Health as partners slated to get prototype booths soon. Google lists remote "employee onboarding and building rapport" as examples it has used Starline for internally, and I can see how those interactions might be improved by Starline's "magic window" effect. But it wouldn't surprise me if Google encourages or requires these test participants to stay away from tougher exchanges like performance reviews or–perish the thought–immersive layoffs.
- Tests of new Pixel phone-enabled features for business customers might convince you that Google Glass could have a real future. Sure, Glass Enterprise devices still feature tiny peripheral displays like the failed Explorer Edition glasses did, but the fact that they're intended just for hands-free information uses on the job already makes them more palatable. Add in live translation and transcription–an idea for glasses that piqued interest when Google recently pitched it with a sentimental spin–and maybe Glass starts to seem like less of a niche device and more like a legit communication tool.
- Google's Advanced Technology and Projects group (ATAP) gave a first look at what it's calling the ATAP Motion Platform, something that could tie into XR tracking and input solutions. In typical researcher speak, ATAP describes the platform as "a scalable tool to holistically treat motion data regardless of the sensor that it came from," a system that uses machine learning to capture and process human movements. That might make the platform useful for various approaches to body tracking in XR, but at one point in Google's video we also see a wrist-worn prototype letting a user control actions on a screen by making small hand movements. The parallel with Meta's research into neural input wristbands is striking.