Smart glasses are having a moment. Meta’s latest Ray-Ban specs are available now, its newest Oakleys are coming soon and Google and Samsung will probably jump into the tech race for your face next year with Android XR. With all that momentum, it always felt inevitable to me that Apple would introduce its own smart glasses sooner or later — and recent signs point to sooner.
Reports claim that Apple has paused its rumored Vision Air hardware — the smaller, lighter successor to its existing Vision Pro mixed-reality headset — in favor of smart glasses. To me, that sounds like a pivot to compete with the wave of AI-powered glasses that everyone from Meta and Samsung to Google, Snap, Amazon, Xreal, Rokid and even OpenAI is either selling, developing or rumored to be exploring.
As I test various smart glasses this fall, I see the pieces coming together for Apple. It already has the product catalog and wearable technology in place to make a splash, and it’s much further along than you might realize. Here’s how Apple’s current headphones, watches, phones and software could shape its first pair of smart glasses.
Audio tech via AirPods
Apple’s been working on tech for our faces for over a decade. When I wore the first AirPods back in 2016 and got mocked for how weird they looked, it felt like Apple road-testing a new look for our faces. It succeeded: Today, everyone wears AirPods and other wireless buds — and no one gets mocked for it.
Since then, Apple’s been rolling out computational audio features that could fit perfectly into smart glasses. Think live translation in the latest AirPods firmware, head-nodding gestures for quick replies, heart rate tracking, ambient noise filtering to sharpen focus or assist with hearing loss, and spatial 3D audio. There’s also the new open-ear noise cancellation tech on AirPods 4, plus FDA-cleared hearing assistance — a feature already popping up in smart glasses like EssilorLuxottica’s Nuance Audio.
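Some of that head-tracking plumbing is already open to developers. As a rough illustration, here’s a minimal Swift sketch using the real CMHeadphoneMotionManager API (Core Motion, iOS 14 and later), which streams head orientation from compatible AirPods; the nod threshold below is my own arbitrary number for illustration, not Apple’s actual gesture logic.

    import CoreMotion

    // A minimal sketch: stream head pose from AirPods via the real
    // CMHeadphoneMotionManager API (iOS 14+). Requires a motion-usage
    // description in Info.plist. The 0.35-radian nod threshold is
    // invented for illustration, not Apple's gesture logic.
    let headMotion = CMHeadphoneMotionManager()

    func startNodWatch() {
        guard headMotion.isDeviceMotionAvailable else { return }
        headMotion.startDeviceMotionUpdates(to: .main) { motion, _ in
            guard let motion else { return }
            // Pitch swings sharply when the wearer nods "yes."
            if motion.attitude.pitch > 0.35 {
                print("Possible nod detected")
            }
        }
    }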
Control tech via Apple Watch
Meta’s newest Display glasses come with the Neural Band, a wristband that uses electrodes to read tiny muscle impulses and turn them into in-air gestures for controlling the on-lens display. Apple already has a foot in the door here with its own wrist-based gesture controls, like the Apple Watch’s Double Tap and new wrist-flick gestures.
Apple’s glasses could also link directly to the Watch for quick access to on-screen readouts, allowing them to skip a built-in display altogether. Think of the Watch as a viewfinder for camera glasses, or a wearable touchscreen for selecting connected apps. Meta has already hinted that its Neural Band could eventually evolve into a watch, and Google has plans for watches and glasses to intersect, too.
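To make that concrete, here’s a minimal Swift sketch of what a Watch-as-controller setup could look like. The .handGestureShortcut modifier is a real SwiftUI API (watchOS 11 and later) that lets Double Tap activate a control; the glasses-side call is purely hypothetical, since Apple hasn’t announced any such API.

    import SwiftUI

    // Sketch of a hypothetical glasses-companion screen on Apple Watch.
    // The .handGestureShortcut modifier is real (watchOS 11+); the
    // GlassesSession call is invented for illustration only.
    struct GlassesShutterView: View {
        @State private var isRecording = false

        var body: some View {
            Button(isRecording ? "Stop recording" : "Start recording") {
                isRecording.toggle()
                // Hypothetical: tell paired glasses to start/stop capture.
                // GlassesSession.shared.setRecording(isRecording)
            }
            // Pinching thumb and finger twice (Double Tap) triggers
            // this button without touching the screen.
            .handGestureShortcut(.primaryAction)
        }
    }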
Camera tech via iPhone Air (and Vision Pro)
Apple’s an old hand at shrinking high-performance cameras into small spaces. The super-thin iPhone Air pulled off its most impressive feat of miniaturization yet this fall, and glasses demand even smaller cameras.
Apple also already has experience putting cameras and other sensors on headsets. The Vision Pro’s array of cameras is likely far more complex than anything Apple’s glasses would include.
Maybe Apple will add stereo 3D recording, letting you capture spatial videos on the glasses to relive later with a Vision headset. It’s the same record-your-memories fantasy the Vision Pro tried to sell with its in-headset recording.
And much as Meta has found, solving AI for glasses could lead to better AI in other Apple projects down the road, like cars.
Apple Stores are a natural fit for glasses demos
Meta is working to build retail experiences to demo its new Display glasses, but Apple already has a global fleet of stores — the same ones it used for the complex tech demos during the Vision Pro launch. Apple Stores would make perfect sense for glasses fittings, with prescriptions filled online, much as the Vision Pro already does with lens partner Zeiss.

