Amazon announced on Wednesday that it’s developing AI-powered smart glasses for its delivery drivers. The wearable technology aims to provide a hands-free experience that reduces the need to constantly shift attention between phones, packages, and surroundings while navigating delivery routes.

The e-commerce giant says the glasses will help delivery drivers scan packages, follow turn-by-turn walking directions, and capture proof of delivery without touching their phones. The system combines AI-powered sensing, computer vision, and cameras to create a heads-up display that shows hazards, navigation details, and delivery tasks directly in the driver’s field of vision.

Functionality

When a driver parks at a delivery location, the glasses automatically activate. The system first helps locate the correct package while the driver is still inside the vehicle, then provides turn-by-turn walking directions to the delivery address using Amazon’s geospatial technology. The display guides drivers through complex environments like apartment buildings and alerts them to hazards without requiring them to check their phones.

The glasses pair with a controller worn in the delivery vest containing operational controls, a swappable battery, and a dedicated emergency button. Amazon notes the glasses support prescription lenses and transitional lenses that automatically adjust to light conditions.

The technology was designed with input from hundreds of delivery associates and emerged from direct driver feedback. Kaleb M., a delivery associate working for Maddox Logistics Corporation in Omaha, Nebraska, who tested the technology, explained:

“I felt safer the whole time because the glasses have the info right in my field of view. Instead of having to look down at a phone, you can keep your eyes forward and look past the display – you’re always focused on what’s ahead.”

Amazon is currently trialing the glasses with delivery drivers in North America and plans to refine the technology before a wider rollout. The announcement wasn’t entirely unexpected: Reuters reported in November 2024 that Amazon was working on the smart glasses under the internal codename “Amelia.”

Efficiency

Amazon likely hopes the glasses will shave precious seconds off each delivery. With millions of packages delivered daily, even small time savings per delivery could translate into significant operational improvements.

The technology targets the “last mile” problem: the final stretch from the distribution center to the customer’s doorstep. This stage is notoriously costly and complicated, requiring drivers to locate addresses across diverse neighborhoods. Amazon’s shipping costs rose 8% in the third quarter to $23.5 billion, making efficiency gains particularly valuable.

In the future, Amazon says the glasses will provide drivers with “real-time defect detection” that could notify them if they accidentally drop off a package at the wrong address. The system will also detect pets in yards and automatically adjust to hazards like low light conditions.

Automation

Wednesday’s announcement came alongside the unveiling of Blue Jay, a next-generation robotics system, and Project Eluna, an agentic AI tool for warehouse operations. Blue Jay coordinates multiple robotic arms to perform picking, stowing, and consolidating tasks simultaneously, combining what used to be three separate robotic stations into one streamlined workspace. Currently being tested in South Carolina, the system can handle approximately 75% of items Amazon stores.

Project Eluna is an AI-powered assistant being piloted at a fulfillment center in Tennessee this holiday season. The system processes real-time and historical data to help operations managers anticipate bottlenecks, responding to natural language queries like “Where should we shift people to avoid a bottleneck?”

These announcements arrived one day after The New York Times reported on internal Amazon documents indicating the company’s robotics team aims to automate 75% of operations. According to the leaked documents, Amazon’s automation team projects the company could avoid hiring more than 160,000 U.S. workers by 2027, saving approximately 30 cents per item processed. This alone translates to roughly $12.6 billion in savings between 2025 and 2027.

The documents suggest that if Amazon doubles its sales by 2033 as expected, automation could help avoid hiring more than 600,000 workers over the next decade. Nobel Prize-winning economist Daron Acemoglu warned:

“Nobody else has the same incentive as Amazon to find the way to automate. Once they work out how to do this profitably, it will spread to others, too.”

Tye Brady, chief technologist at Amazon Robotics, emphasized at the company’s Delivering the Future event that “the real headline isn’t about robots. It’s about people – and the future of work we’re building together.” Amazon maintains that automation will shift employees from repetitive physical tasks to higher-value work like quality control and problem-solving.

Challenges ahead

Despite the promising demonstration, significant technical hurdles remain. Sources told Reuters in 2024 that developing the glasses could take years. One major challenge is creating a battery compact enough to be comfortable yet powerful enough to last an eight-hour shift. Additionally, the onboard navigation system requires extensive mapping data, which could take several years to collect.

The smart glasses build on Amazon’s consumer-focused Echo Frames, which feature Alexa integration. However, the delivery glasses add a small embedded display, a significant technical leap that requires more processing power. The Echo Frames have seen modest adoption, with reports indicating only about 10,000 units of the latest version sold.
Amazon’s smart glasses represent a pragmatic approach to augmented reality in the workplace, focusing on clear utility rather than flashy features. Unlike Meta’s Orion augmented reality glasses, which the company has no plans to ship, or Snap’s fifth-generation Spectacles with their bulky design and 45-minute battery life, Amazon’s delivery glasses target a specific, measurable problem: efficiency in the last 100 yards of delivery.

If successful, the technology could transform not just Amazon’s operations but the broader logistics industry. The company’s massive scale means innovations that work at Amazon often become industry standards. Yet the broader question remains: as Amazon demonstrates that technology can make individual workers more productive, will that productivity translate into better working conditions, or simply fewer jobs?
