(Bloomberg) — Apple Inc. is in the advanced stages of development of its new AirPods with built-in cameras, marking a major milestone for what could be its first wearable device designed for the age of artificial intelligence.
The project has reached the stage where prototypes have near-final design and functionality, according to people familiar with the matter. The earbuds, which rely on cameras to see the space around the user and provide information, are undergoing advanced testing, the people said, speaking on condition of anonymity because the work remains confidential.
Apple is betting the new device can capitalize on the success of AirPods while moving the line into the world of artificial intelligence-enhanced hardware — an area where the company faces competition from OpenAI, Meta Platforms Inc. and others.
The cameras essentially serve as the eyes of the Siri digital assistant and aren't designed to take photos or videos. The components sit in both the left and right earbuds and let the device capture visual information at low resolution. The product will look similar to the AirPods Pro 3, except with a longer stem to house the camera.
Apple originally planned to put the earbuds on sale as early as the first half of this year, but the launch was pushed back by delays to an improved version of Siri. The new Siri is now expected to arrive in September, after Apple upgraded its underlying model with Alphabet Inc.’s Gemini technology.
Testers within Apple are actively using prototypes of the new AirPods, which are in a stage called DVT, or design verification testing. That's the last major development phase before PVT, or production verification testing, which involves manufacturing early production-line units.
While the hardware is largely ready, concerns about the artificial intelligence elements could further delay the launch if Apple isn't satisfied with the quality of its visual intelligence features, the people said. A spokesman for Cupertino, California-based Apple declined to comment.
The idea is to let users ask questions about items they're looking at. For example, a user might glance at a set of food ingredients and ask what to make for dinner. This is similar to the experience of uploading photos to an AI service like OpenAI’s ChatGPT, or to the iPhone’s own visual intelligence feature.
Apple has also been working on other uses for the AI-powered cameras. The device can alert the wearer based on what the camera sees, or use external visuals to provide more advanced turn-by-turn navigation. The AI can reference specific landmarks ahead when telling users where to turn.
Apple is also working to address privacy concerns from people who don’t want to be photographed by nearby electronic devices — an issue for smart glasses as well. The company built a small LED into the earbuds that lights up when visual data is sent to the cloud. Given the small size of the earbuds, it’s unclear how visible the light will be.
The new AirPods have been in development for about four years and will be part of a wave of artificial intelligence products. The company plans smart glasses and a camera-equipped pendant as early as next year, but both products lag behind the AirPods in development.
Apple expects strong demand for the new AirPods, and operations teams are working hard to secure enough components. That’s a particularly challenging task right now because of industry-wide shortages of memory chips and other silicon.
Unlike the Apple Vision Pro headset, another device laden with external sensors, the new AirPods don’t support gesture controls. The upcoming Apple smart glasses also currently lack that feature, according to people familiar with the matter.
Apple’s incoming CEO, John Ternus, has been touting a strong product lineup at employee events in recent weeks, saying the company is poised to disrupt the industry the way it did with the iPhone, iPad and iPod.
“We’re about to change the world again,” Ternus told employees after being named Apple’s next CEO. “If you’re lucky, and I mean really lucky, you have one or two moments in your career where you can be a part of something really important. Now we’re at a point where we can do that again.”
Ternus is overseeing the development of about 10 major new products, including touchscreen MacBooks, foldable iPhones and artificial intelligence-powered smart home devices. He succeeded Tim Cook as CEO on September 1 and will host the company’s big annual launch event that month.
Apple has also been looking to upgrade the iPhone’s visual artificial intelligence capabilities. It plans to launch a new Siri camera mode in iOS 27, which will make visual AI features more prominent in the iPhone operating system. This change should introduce more users to the concept of feeding visual data into AI services.
Apple is competing with companies such as OpenAI and Meta in the emerging field of artificial intelligence devices. OpenAI has been poaching from Apple’s hardware engineering talent pool to develop rivals to Apple’s smart home and mobile devices, while Meta is improving its own AI wearables.
Bloomberg News first reported in February 2024 that Apple was developing AirPods with built-in cameras. The company also developed an Apple Watch with a camera but canceled the project last year.
Given the popularity of earbuds, AirPods are a natural entry point for Apple to start deploying artificial intelligence. Customers often buy them alongside Apple’s flagship iPhone.
AirPods have been a hit for Apple since they first launched in 2016. The company released new earbuds late last year that added a heart rate monitor similar to the one in the Apple Watch, and it updated its AirPods Max headphones in March.
©2026 Bloomberg