NUS team develops affordable headset to help visually impaired ‘see’

AiSee works by analysing images seen through a built-in camera and giving information about the object to the user through a verbal prompt. PHOTO: NATIONAL UNIVERSITY OF SINGAPORE

SINGAPORE – A new headset developed by a team of university researchers aims to give “sight” to the visually impaired – at a price tag of under $500.

Dubbed AiSee, the prototype works by analysing images captured by its built-in camera and relaying information about the object to the user verbally.

Associate Professor Suranga Nanayakkara, lead researcher of Project AiSee at the National University of Singapore (NUS), said: “We want to fundamentally rethink how interfaces between human and technology can be made to fit the abilities and expectations of the target users.”

He added: “To achieve this vision, we create novel human-computer interfaces and interactions that seamlessly integrate with a user’s mind, body and behaviour, providing an enhanced perception and cognition.

“We call these ‘assistive augmentations’, which not only look at compensating lack of ability, but also focus on helping users achieve their full potential.”

During his postdoctoral studies at the Massachusetts Institute of Technology in 2012, Prof Nanayakkara saw how a blind friend would take pictures of lecture notes with his phone camera.

He would use his hands to feel the edges of the paper and then hold the phone above it to take a picture before using KNFB Reader – a mobile app for blind, low-vision and other print-disabled users that converts text to speech – to listen to the text.

Prof Nanayakkara thought an all-in-one device with a camera could do the job faster and better.

“This was the inspiration for this particular device. I wanted to develop an interface that provides seamless access to people to interact with the world,” he said.

Worn on the head, AiSee is a compact device that lets users identify objects by holding them up and pressing a button to capture an image. Software then extracts features such as text and logos from the image for processing.

An artificial intelligence (AI)-powered image processing unit in the headset then uses large language models (LLMs) such as OpenAI’s GPT-4 to comprehend the image and respond quickly to the user’s queries.

The headset uses bone conduction, bypassing the ears and transmitting sound through the skull. This lets visually impaired users receive auditory information while still hearing what is going on around them, keeping them safer, especially in risky situations.
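In broad strokes, the workflow described above (capture an image, send it with the user’s question to a vision-capable LLM, then read the answer aloud) can be sketched in a few lines of Python. This is an illustrative outline only, not AiSee’s actual software; the choice of model, the sample question and the pyttsx3 speech library are assumptions.

```python
# Illustrative sketch of a capture-ask-speak pipeline, not AiSee's actual code.
# Assumptions: the OpenAI chat completions API with a vision-capable model,
# and pyttsx3 for offline text-to-speech.
import base64

import pyttsx3                 # text-to-speech (assumed choice)
from openai import OpenAI      # official OpenAI Python SDK

client = OpenAI()  # reads OPENAI_API_KEY from the environment


def describe_object(image_path: str, question: str) -> str:
    """Send the captured image and the user's question to the LLM."""
    with open(image_path, "rb") as f:
        image_b64 = base64.b64encode(f.read()).decode()

    response = client.chat.completions.create(
        model="gpt-4o",  # assumed vision-capable model
        messages=[{
            "role": "user",
            "content": [
                {"type": "text", "text": question},
                {"type": "image_url",
                 "image_url": {"url": f"data:image/jpeg;base64,{image_b64}"}},
            ],
        }],
    )
    return response.choices[0].message.content


def speak(text: str) -> None:
    """Read the answer aloud; a real headset would route this to bone conduction."""
    engine = pyttsx3.init()
    engine.say(text)
    engine.runAndWait()


if __name__ == "__main__":
    answer = describe_object("capture.jpg", "What am I holding?")
    speak(answer)
```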

AiSee, which was first developed in 2018 by Prof Nanayakkara and his team, has since evolved from a finger-worn (ring-like) interface into a headset, making it hands-free and easy to wear.

The new prototype is supported by LLMs to allow users to have a more natural interaction with the device.

The initial research and development for the project, which started in 2015, was supported by research grants from various organisations. The team has also received $150,000 from B.P. de Silva Holdings for the next phase of the project.

Discussions are ongoing with SG Enable to conduct user testing with five persons with visual impairment.

The findings will help refine and improve AiSee’s features and performance. Field tests will begin in July and run for three to four months to gather user feedback for further improvements.

Prof Nanayakkara and his team are working to make the next prototype lighter than its current 140g and adjustable to fit all head sizes. The camera button will also be replaced with a wake word to capture images.

The team also intends to cut the time the AI takes to process information and respond, and to enable it to handle multiple questions.

There are also plans to commercialise the product, pending tie-ups with the private sector.

It is hoped that the product can be priced at under $500, about a tenth of the price of existing assistive tools such as Israeli firm OrCam Technologies’ wearable camera, the OrCam MyEye Pro, and about a sixth of the price of the Netherlands-based Envision Glasses Home Edition, which is built on the since-discontinued Google Glass Enterprise Edition 2.

Associate Professor Suranga Nanayakkara (left) with NUS student Mark Myres, who tested AiSee as a visually impaired user. PHOTO: NATIONAL UNIVERSITY OF SINGAPORE

NUS student Mark Myres, a visually impaired user who has been helping to test AiSee since November 2023, said he hopes for a hands-free device to help him with his daily tasks. This would cut down on the number of smartphone apps that people like him have to use to recognise colours, currency and objects, he said.

“Most assistive devices seem very targeted at either totally blind or visually impaired people. AiSee is a good balance. Both visually impaired and blind people could get a lot of benefits from this,” he added.
