On the second day of Connect 2025, during a developer-focused session, Meta announced that it would open its smart glasses platform, allowing external developers to build AI-powered services for the Ray-Ban and Oakley smart glasses lines and unlocking richer, more interactive experiences.
Previously, Meta’s smart glasses supported only a limited range of third-party integrations, such as Spotify and Audible for streaming content. With the introduction of the new Wearables Device Access Toolkit, developers will gain access to built-in sensors, microphones, and other core functions of the glasses. By combining these capabilities with Meta’s multimodal AI technologies, they will be able to design services tailored to real-world scenarios and everyday use cases.
Meta also revealed its first wave of partners, including Twitch, Disney, and golf data service provider 18Birdies. Twitch is developing functionality to enable real-time live streaming directly from the glasses, Disney is creating an interactive AI-powered park guide that allows visitors to query information about attractions and events, while 18Birdies is working on integrating swing recommendations and course analytics to help users improve their game.
Notably, these new services will not be limited to the latest Ray-Ban Display model with its built-in screen. Owners of the first-generation Ray-Ban Meta smart glasses will also gain access, extending the value of their devices, though with one limitation: interaction will remain voice-only.
As for whether Meta plans to introduce additional exclusive features for the Ray-Ban Display, the company has not given a direct answer, though more application experiences are expected in the near future.
According to Meta, the Wearables Device Access Toolkit will initially launch in a limited developer preview, with broader availability planned for 2026. By opening the platform, Meta envisions smart glasses evolving beyond basic functions such as music playback and phone calls to AI-driven utilities capable of understanding user needs, delivering real-time insights, enabling richer interactive content, and even acting in an agentic capacity to perform tasks.
In a comparable move, HTC recently unveiled its VIVE Eagle smart glasses, arguing that an open-architecture approach encourages collaboration across software, AI models, and industry partners. Meta, however, remains focused on its partnerships with Ray-Ban and Oakley and, unlike HTC, has not extended this openness to frame or lens design.