Apple Reportedly Developing Trio of AI Wearables to "See" the World
New reports suggest smart glasses, a camera-equipped pendant, and upgraded AirPods.
Summary
- Apple is reportedly working on three new wearable devices designed to integrate with Apple Intelligence
- The trio includes smart glasses (similar to Meta’s Ray-Bans), a camera-equipped pendant, and AirPods with cameras
- These devices are intended to provide “visual context” to Siri, allowing the AI to see and interpret the user’s surroundings
Apple is looking beyond the iPhone screen to find the next frontier for its artificial intelligence. According to a new report from TechCrunch (citing Bloomberg’s Mark Gurman), the tech giant is actively developing a trio of new hardware form factors: smart glasses, an AI-powered pendant, and camera-equipped AirPods. These devices are designed to act as the “eyes and ears” of the Apple ecosystem, allowing Siri to observe the real world and provide proactive assistance without the user ever needing to unlock their phone.
The most prominent device in development is a pair of smart glasses, internally codenamed N50. Unlike the expensive and bulky Apple Vision Pro, these glasses would lack a display entirely. Instead, they would function similarly to the popular Ray-Ban Meta glasses, featuring built-in cameras, speakers, and microphones. The goal is to allow users to ask Siri questions about what they are looking at, capture photos, and receive audio responses, all while offloading the heavy computational lifting to a connected iPhone to preserve battery life and reduce weight.
The report also details plans for AirPods equipped with low-resolution infrared cameras. These sensors would not be used for photography but rather to provide environmental data to the AI, enabling features like gesture control or spatial awareness. Additionally, Apple is exploring a “pendant” device—roughly the size of an AirTag—that could be clipped to clothing or worn as a necklace. Like the glasses, this device would serve as a passive visual sensor for Siri, avoiding the pitfalls of failed standalone AI gadgets by strictly functioning as an iPhone accessory rather than a replacement.
This hardware push signals a shift in Apple’s AI strategy. Rather than trying to cram powerful processors into standalone gadgets, Apple appears focused on using wearables to feed visual data back to the iPhone. This approach allows the company to leverage the massive installed base of iPhone users while offering cheaper, more fashionable entry points into its ambient computing ecosystem.
While these projects are in active development, they are not imminent. Reports suggest that production on the smart glasses could begin in December 2026, targeting a potential release in 2027. The camera-equipped AirPods and pendant are also expected to follow a similar timeline, though plans could still shift as the technology matures.