Amazon recently confirmed that it is developing a pair of smart glasses designed specifically for its delivery drivers, corroborating earlier reports. At the heart of these glasses lies AI-driven sensing and computer vision technology, enabling the device to interpret images captured by its built-in camera and display real-time guidance through a heads-up display (HUD) seamlessly embedded in the lenses.
According to Amazon, the glasses have been in development for some time and have undergone early testing by hundreds of delivery drivers, whose feedback helped refine the design. The goal is to enhance overall delivery efficiency while improving drivers’ safety on their routes.
The company explained that the smart glasses activate automatically once the driver parks the vehicle. The HUD then shows a list of the packages to be delivered at the current stop. Moreover, when the driver retrieves a parcel from a pile of packages, the glasses can use visual recognition to confirm that the correct item has been selected.
As the driver proceeds on foot, the HUD provides turn-by-turn walking navigation to the destination, highlights potential hazards along the way, and assists in navigating complex environments such as apartment buildings.
In essence, the device allows drivers to locate addresses and complete deliveries without constantly reaching for or operating their phones. Even capturing proof-of-delivery photos can be accomplished directly through the glasses.
On the hardware side, the smart glasses pair with a custom-designed vest that integrates a controller and a dedicated “emergency button,” allowing drivers to quickly summon help if needed.
The glasses feature a swappable battery design to ensure all-day power and can be fitted with prescription or photochromic lenses to suit individual driver preferences.
Amazon expects future versions to include more advanced AI capabilities—such as warning drivers when they are about to leave a package at the wrong address and detecting additional potential hazards like pets in a yard.
Beryl Tomay, Amazon’s Vice President of Transportation, stated that the glasses “reduce the need for drivers to juggle both their phones and packages,” helping them stay focused and thereby improving their safety. She also revealed that early testing showed drivers could save roughly thirty minutes per shift on average.
Although Amazon did not disclose plans for a consumer version of the glasses, The Information previously reported that the company is also developing a consumer-grade model, potentially slated for release in late 2026 or early 2027.