snapchat’s AR glasses have lenses that create visuals and sounds of the book you’re reading

Snapchat AR glasses with lenses that generate visuals

 

The Snap Spectacles, a pair of AR glasses from Snapchat, have lenses that generate images and sounds from the book the user is reading. Named Augmented Reading Lenses, the project is a collaboration between the National Library Board of Singapore and Snap Inc., with LePub Singapore as the campaign’s production lead. These Snapchat AR glasses and lenses use real-time OCR (optical character recognition), which converts printed text into a machine-readable format, and generative AI to produce the visuals. The device already has stereo speakers, so the soundscapes are a natural addition to the reading experience.

all images courtesy of Snap Inc., National Library Board of Singapore, and LePub Singapore

 

 

Sounds play as the user reads the text

 

The Snapchat AR glasses and lenses use text recognition and machine learning to identify the content the user is reading and activate the related visuals and sounds. First, the device scans the printed text as the user reads. Then, images float before their eyes, accompanied by sound effects linked to specific words or scenes. When the book describes an environmental or action sound, like a door opening, the Snapchat AR glasses with these new lenses play that audio through the speakers.
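The trigger step described above, matching recognized words against a library of sound effects, can be sketched in a few lines of Python. Everything here (the cue table, the file names, the function name) is a hypothetical illustration of the general technique, not Snap’s actual implementation:

```python
# Hypothetical sketch: map OCR-recognized words to sound cues.
# The cue table and file names are invented for illustration.
SOUND_CUES = {
    "door": "door_creak.wav",
    "rain": "rain_loop.wav",
    "thunder": "thunder_clap.wav",
}

def cues_for_text(ocr_text: str) -> list[str]:
    """Return the sound clips whose trigger words appear in the scanned text."""
    # Normalize: strip punctuation and lowercase each recognized word.
    words = {w.strip(".,!?;:\"'").lower() for w in ocr_text.split()}
    return [clip for word, clip in SOUND_CUES.items() if word in words]

# A sentence mentioning a door and rain triggers two cues.
print(cues_for_text("The door swung open as rain fell outside."))
```

A production system would work on a stream of OCR results and likely use scene-level language understanding rather than literal keyword matching, but the lookup above captures the basic word-to-audio association the article describes.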

 

So far, the company and the library say that the visuals appear in sync with what the user is reading. Once they look up from the page, they can see the images depicted in the text in their field of vision. The National Library Board of Singapore adds that the project is part of its initiative to use technology to encourage more people to read. The teams collaborated with LeGarage, the innovation arm of LePub Singapore, to help develop the reading experience and the campaign for the Snapchat AR glasses and lenses. They plan to roll out beta-testing devices later in 2025 in Singapore to gather feedback before the public launch.

view of the Snap Spectacles

the device uses real-time OCR and generative AI to produce the visuals and sounds

the images and sounds appear as the user reads

users can also interact with the floating imagery, based on what they’re reading

sample visuals when the user reads Pride and Prejudice

even Frankenstein shows up as a generated visual

the device’s built-in stereo speakers play the soundscapes

users can see the images depicted in the text in their field of vision

the beta testing rolls out later in 2025

 

project info:

 

name: Augmented Reading Lenses

companies: Snap Inc., Snap AR Studio, LePub Singapore | @spectacles, @lepub_worldwide

library: National Library Board of Singapore | @nlbsingapore

 

designboom has received this project from our DIY submissions feature, where we welcome our readers to submit their own work for publication. see more project submissions from our readers here.

 

edited by: matthew burgos | designboom

