Apple Is Building the Device That Killed Humane
What Apple Learned
On Monday, Bloomberg reported that Apple has accelerated work on three AI wearables: smart glasses targeting late 2026 production, a camera-equipped pendant the size of an AirTag, and AirPods with built-in cameras. All three connect to the iPhone. All three feed visual context to a rebuilt Siri. All three put always-on cameras on your body.
If this sounds familiar, it should. Humane shipped a wearable AI camera for $700 in April 2024 and collapsed ten months later. HP bought the wreckage for $116 million — about 88% less than Humane's peak valuation. Returns outpaced sales by summer. The charging case caught fire. Executives used ice packs to cool prototypes before investor demos.
Rabbit sold a $199 AI gadget that couldn't set a timer. By mid-2025, employees stopped getting paid.
Apple looked at all of this and decided to build three of them.
The difference isn't ambition. It's architecture.
Humane tried to replace the phone. The AI Pin was standalone — its own cellular connection, its own compute, its own $24/month subscription. It did everything badly because it had to do everything alone on a chip the size of a postage stamp.
Apple's pendant is an accessory. Thin disc, aluminum and glass, two cameras, three microphones, a speaker. Processing happens on the iPhone. The pendant sees. The phone thinks. This is the Meta playbook, not the Humane playbook.
Meta shipped 7 million Ray-Ban smart glasses in 2025, tripling the prior two years combined. EssilorLuxottica raised its production target from 10 million to 20–30 million units by end of 2026, and its stock jumped 14% on the earnings call. Meta proved one thing: people will wear a camera on their face if it looks like sunglasses, costs what sunglasses cost, and doesn't try to be a phone.
Apple's pendant takes the same lesson and shrinks the form factor further. No display. No standalone compute. No monthly subscription. Clip it to your shirt or wear it on a necklace. It's an eye for your phone — a sensor that gives Siri visual context about the world around you.
The glasses follow the same logic. High-resolution camera for photos and video, second camera for environmental context. Phone calls, music, navigation, and the ability to look at something and ask Siri what it is. Production could start in December 2026, with a launch in 2027. These aren't Vision Pro — they're Ray-Ban Metas with Siri and an Apple logo.
The Privacy Problem Nobody Solved
Here's where it gets uncomfortable.
Harvard students demonstrated last year that Meta Ray-Ban footage could be piped into facial recognition systems to identify strangers in real time. The glasses have an LED that's supposed to signal when the camera is recording. Hobbyists sell mods that disable it. The camera stays on. The light stays off.
At the University of San Francisco, a man wearing Ray-Bans approached women on campus while recording. POV videos of men approaching women at bars, filmed on Meta glasses, went viral. The EU opened debate on smart glasses privacy rules. U.S. lawmakers signaled reviews but haven't acted.
Apple will face all of this, plus a new wrinkle: the pendant camera is always on. Not recording — but feeding visual data to Siri continuously, so it can answer context-aware questions about your surroundings. "What's on this restaurant menu?" "Is this shirt blue or green?" "Who just walked in?"
Apple says the camera won't take photos or videos. But it will see. The distinction between a camera that records and a camera that processes in real time is meaningful to engineers and meaningless to the person standing across from you wondering if they're being filmed.
Three Products, One Bet
The smart glasses, pendant, and camera AirPods aren't three separate products. They're three entry points to the same idea: give Siri eyes.
Current Siri knows what you type and say. Future Siri knows what you see. Apple is betting that visual context is the missing layer — the thing that turns a voice assistant from a party trick into something useful enough to justify wearing a camera on your body every day.
It's the same bet Meta made. The difference is Apple's ecosystem lock-in. When your AirPods have cameras, your glasses have cameras, and your pendant has cameras, Siri doesn't just know your calendar and your contacts. It knows what you're looking at, who you're talking to, and what's on the table in front of you.
This is either the most natural evolution of the smartphone or the most comprehensive personal surveillance system ever shipped to consumers. Probably both.
Why It Matters
The Humane AI Pin failed because it was a standalone product trying to replace a phone. Rabbit failed because it shipped broken software on a $199 brick. Meta succeeded because it made smart glasses feel like normal glasses.
Apple is entering at exactly the right moment. The standalone AI device market is dead. The phone-connected AI accessory market just proved it works at scale. And Apple has the one advantage no competitor can match: 2 billion active devices already in pockets, already running Siri, already waiting for eyes.
The question isn't whether Apple will ship these products. Bloomberg's Mark Gurman reports they're further along than expected. The question is what happens when a billion people walk around with always-on cameras connected to the most capable personal AI ever built.
We're about to find out.
---
Sources: Bloomberg (Mark Gurman), MacRumors, 9to5Mac, Tom's Guide, CNBC, TechCrunch, Help Net Security, Slate