For some time, I’ve been pondering how manufacturers like Apple will bring consumer-ready Augmented Reality (AR) headwear to market. Now, though, I think I have it.
Look at the model set out by the Apple Watch: it launched as a companion device to the iPhone. Heavy tasks are handled on the phone, while quick snippets of information appear on the Watch. Now replace ‘Watch’ with ‘Glasses’.
There’s a long history of devices sending visual content to other displays. AirPlay, Chromecast and the like have allowed the primary device to do the work and the companion to output the result.
Let’s put this into an AR context. The work required to compute the graphics for a high-fidelity experience? It lives on the smartphone. Network connection? Smartphone. Mass data storage? Yep, smartphone.
All of the heavy lifting can be done ‘off-device’. This allows the headwear to be little more than a platform for motion and spatial sensors, close-range wireless (such as that used in Apple’s AirPods, plus Bluetooth 5) and the display technology. Add in some bone-conduction audio and near-field microphones and you’ve got a fully fledged device.
Given that much of the device’s work can be done elsewhere, it allows for smaller batteries along with wireless charging. Imagine being able to take them off, lay them on your smartphone and have wireless charging top them right up!
The glasses don’t have to last all day, just as AirPods don’t. They just have to charge quickly and operate as a companion device, not a primary one. Well, for the first version anyway. Three generations of Apple Watch later, we’ve got a battery that lasts two days and integrated 4G for connectivity.
So, here goes: just as I once predicted the functions of the Apple Watch long before it arrived, here are my predictions for the Apple Glasses:
- The AR Apple Glasses will be a companion device for an iPhone
- They will display content similar to the Apple Watch: Siri interface, notifications, travel / directions information and health / exercise information
- They will also function as an AR viewer for things like games and immersive experiences. The computation will be done on the iPhone and streamed to the glasses.
- They will feature touch and tap controls like AirPods
- They will support ‘hey Siri’ in some form
- They will be water resistant
- They will use some kind of MicroLED screens – there’s potential for light-field displays, but the technology may not be available in time
- There will be a camera / sensor array for position and object tracking
- They will charge wirelessly and quickly
- Apple will position them as a ‘human augmentation’ meant to ‘break the hold of the screen’ and ‘allow us to view the world in a new way’
Things they may have. These are less likely, but still possible:
- Video and stills recording via an embedded camera
- Hand tracking / gestures
- Eye tracking
- Health monitoring via metal / sapphire contacts from the scalp to the device
- Bone conduction speakers built in to the arms
- Screen extensions – look at your Apple Watch and see information around the screen, just as the Touch Bar augments the MacBook Pro
- Prescription lenses
- Sunglasses / Transitions lenses
And there we have it. That’s my flag planted. In a few years’ time, the above may come back to bite me in the backside, or perhaps, like the Apple Watch (Communicator+), I’ll be proven correct and we’ll have a great, market-changing device.
What are your thoughts? Let me know on the social media platform of your choice – you’ll find a Tame Geek there!
(Image credit: Magic Leap)