halo is here, the open-source glasses with a private AI agent that can see, hear and speak

halo’s open-source glasses come with a private AI agent

 

Halo has arrived, the open-source glasses by Brilliant Labs embedded with a private AI agent that can see, hear, and speak. An upgrade of Frame, the eyewear interacts with both the wearer and its surroundings, assisting users in real time and keeping its own memory. At the center of the Halo system is Noa, the private AI assistant. It holds live conversations with the wearer, who hears it through the bone-conduction speakers at the tail of the glasses. The agent can also tell users what they’re looking at, name the song that’s playing, or answer a general question. 

 

For the Halo design team, the open-source glasses are a hands-free device for getting information or processing visual and audio content from the user’s surroundings. The private AI agent also has memory support, meaning it can remember what the wearer saw, heard, or said. Over time, this saved memory can make the assistant more personal and allow it to anticipate what its owner wants, since it stores conversations and behavior and can recall personal details. The team says, however, that the open-source glasses are still built to protect the user’s privacy, so all memory storage and AI processing are managed with privacy controls in place.

all images courtesy of Brilliant Labs

 

 

Eyewear that creates and develops its own apps

 

Halo’s hardware includes several components built around the open-source glasses: a color display, a low-power optical sensor used for AI processing, two microphones with sound activity detection, a low-power AI processor, and two bone conduction speakers. These components support the real-time functions and let the user hear audio without blocking the ears. Since the device is open source, anyone can view, change, or share the software and hardware designs, especially developers and researchers; the design files and source code are available online. 

 

This also allows others to make custom versions of the Halo system, and the open-source model supports learning, testing, and community-driven improvements. The glasses also include Vibe Mode, a built-in tool that lets users turn ideas into apps. The user types an idea into the Vibe Mode terminal, and Noa reads the input and generates a version of the application, which the user can then edit or adjust. 

the device has two microphones with sound activity detection and a low-power AI processor

 

 

Wearable technology that projects info in front of the eyes

 

The eyewear also acts as a display system, meaning that information created or shared by Noa appears directly in the user’s view. The user can see results, images, or app interfaces without reaching for a phone or computer, since the small display is built into the glasses. The open-source glasses support custom applications, so developers can write code to add new tools or features directly into Halo. These apps can use visual or audio data from the glasses, and they can also tap into Noa’s memory functions to track long-term tasks or goals.

 

The team says that Halo is designed to fit most people, but not all. The device is built for an IPD (inter-pupillary distance) range of 58 to 72 mm, which covers most users. Since not everyone falls within this range, the team advises wearers to use the Eye Measure app to check whether the device suits their eyes. The device weighs slightly over 40 grams and, under regular conditions, lasts a full day on a single charge. The pre-order pool for the open-source glasses has opened, and shipping starts in late 2025.

at the tail of the wearable technology, there are two bone conduction speakers

the device supports real-time functions and allows the user to hear without blocking the ears

the AI agent has memory support, which means it can remember what the wearer saw, heard, or said

this saved memory can make the assistant more personal and allow it to predict what its owners want

the shipping of the device begins in late 2025

 

project info:

 

name: Halo

brand: Brilliant Labs | @brilliantlabsar

The post halo is here, the open-source glasses with a private AI agent that can see, hear and speak appeared first on designboom | architecture & design magazine.
