Aira’s new smart glasses give blind users a guide through the visual world

When it comes to augmented reality technologies, visuals always seem to be an essential part of most people’s definitions, but one startup is offering an interesting take on audio-based AR that also draws on computer vision. Even without integrated displays, glasses are still an important part of the company’s products, which are designed with vision-impaired users in mind.

Aira has built a service that essentially puts a human assistant in a blind user’s ear by beaming live-streaming footage from the glasses’ camera to the company’s agents, who can then give audio instructions to the end users. The guides can offer them directions or describe scenes for them. The service is really a combination of high-tech hardware and highly attentive human assistants.

The hardware the company has run this service on in the past has been a hodgepodge of third-party solutions. This month, the company began testing its own smart glasses, called the Horizon Smart Glasses, which are designed from the ground up to be the ideal solution for vision-impaired users.

The company charges based on usage: $89 per month gets users the device and up to 100 minutes of service. There are various pricing tiers for power users who need a bit more time.

The glasses integrate a 120-degree wide-angle camera so guides can get a fuller picture of a user’s surroundings and won’t have to ask them to point their head in a different direction quite as often. The glasses are powered by what the startup calls the Aira Horizon Controller, which is actually just a repurposed Samsung smartphone that supplies the device’s compute, battery and network connection. The controller is, appropriately, operated entirely through physical buttons, and it can also connect to a user’s smartphone if they want to route controls through the Aira mobile app.

Though the startup isn’t planning to part ways with its human assistants anytime soon, the company is predictably aiming to venture deeper into the capabilities offered by computer vision tech. Earlier this month, it announced its own digital assistant, called Chloe, which will eventually be able to do a whole lot but is launching with the ability to read: users can point their glasses at some text and hear what’s written. The startup also recently showed off a partnership with AT&T that enables the glasses to identify prescription pill bottles and read the labels and dosage instructions aloud to users.


The company is currently testing the new glasses but hopes to begin replacing old units with the Horizon by June.
