AI backpack “sees” for visually impaired people

It warns them about any potential obstacles in their path.

As an AI engineer, Jagadish Mahendran has spent a lot of time trying to help robots “see” the world around them.

Now, he’s doing the same for visually impaired people.

Using technology developed by Intel, Mahendran has created a voice-activated AI backpack that guides people with visual impairments in outdoor environments — and the system costs just $800, compared to thousands for some smart glasses.

A Better Assistive Technology

An estimated 285 million people are visually impaired, meaning they have vision problems that can’t be corrected with glasses. Of that group, 39 million are blind.

For blind people, navigating an outdoor environment can be both difficult and potentially dangerous — they may have trouble safely crossing the street or knowing when they need to step up onto a curb.

Guide dogs can help in these situations, but they can be expensive (and some people are allergic). White canes, meanwhile, won’t help them avoid overhead hazards, such as hanging tree branches.

There are other assistive technology devices to help with navigation, but they aren’t always ideal.

Voice-assisted smartphone apps can give visually impaired people turn-by-turn directions, but they can’t help them avoid obstacles.

Smart glasses usually cost thousands of dollars, while smart canes require a person to dedicate one hand to the tech — not great if they’re, say, trying to carry groceries home from the store.

Mahendran’s AI backpack, MIRA, aims to be a better alternative.

“When I met my visually impaired friend, Breean Cox, I was struck by the irony that while I have been teaching robots to see, there are many people who cannot see and need help,” he said. “This motivated me to build the visual assistance system.”

Building an AI Backpack

Before jumping into development on the AI backpack, Mahendran and his collaborators interviewed several people with visual impairments to ensure the device would address the challenges they faced.

Armed with those insights, they developed a system consisting of a small backpack, a vest, and a fanny pack.

A $300 Luxonis OAK-D spatial AI camera contains the Intel computer vision tech that serves as MIRA’s “brains.”

To train the camera’s AI to identify curbs, crosswalks, and other objects, the researchers fed it images from existing databases, as well as some they took and labeled themselves.
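The article doesn’t say which framework the team used for training, but the process it describes (fine-tuning a detector on labeled photos of curbs, crosswalks, and so on) typically looks something like the minimal sketch below, here using torchvision. The class list and the single stand-in training example are placeholders for illustration, not MIRA’s actual data or code.

```python
import torch
import torchvision
from torchvision.models.detection.faster_rcnn import FastRCNNPredictor

# Hypothetical label set based on the objects mentioned in the article.
CLASSES = ["background", "curb", "crosswalk", "stop sign", "tree branch"]

# Start from a detector pre-trained on COCO and swap in a new head
# that predicts this project's classes instead.
model = torchvision.models.detection.fasterrcnn_resnet50_fpn(weights="DEFAULT")
in_features = model.roi_heads.box_predictor.cls_score.in_features
model.roi_heads.box_predictor = FastRCNNPredictor(in_features, num_classes=len(CLASSES))

optimizer = torch.optim.SGD(model.parameters(), lr=0.005, momentum=0.9)
model.train()

# One training step on a stand-in example; in practice the images and boxes
# would come from the existing databases and the team's own labeled photos.
images = [torch.rand(3, 480, 640)]
targets = [{"boxes": torch.tensor([[120.0, 300.0, 520.0, 360.0]]),
            "labels": torch.tensor([1])}]  # one "curb" box
loss_dict = model(images, targets)         # dict of detection losses
loss = sum(loss_dict.values())
optimizer.zero_grad()
loss.backward()
optimizer.step()
```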

After training, they mounted the camera in the vest and connected it to a computing device inside the backpack — this could be anything from a laptop to a Raspberry Pi.
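As a rough illustration of that camera-to-host link, here is a minimal sketch using the DepthAI Python API that the OAK-D ships with; the model file, labels, and thresholds are placeholders, not the MIRA project’s actual code. The camera runs the detection network on its own chip and streams results to whatever computer sits in the backpack.

```python
import depthai as dai

# Build a pipeline: the OAK-D's color camera feeds an on-device detection
# network, and the results stream to the host over the USB link.
pipeline = dai.Pipeline()

cam = pipeline.create(dai.node.ColorCamera)
cam.setPreviewSize(300, 300)
cam.setInterleaved(False)

nn = pipeline.create(dai.node.MobileNetDetectionNetwork)
nn.setBlobPath("obstacle_detector.blob")  # placeholder model compiled for the OAK-D
nn.setConfidenceThreshold(0.5)
cam.preview.link(nn.input)

xout = pipeline.create(dai.node.XLinkOut)
xout.setStreamName("detections")
nn.out.link(xout.input)

# The host (laptop, Raspberry Pi, etc.) simply reads detections as they arrive.
with dai.Device(pipeline) as device:
    queue = device.getOutputQueue(name="detections", maxSize=4, blocking=False)
    while True:
        for det in queue.get().detections:
            print(det.label, round(det.confidence, 2))
```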

A GPS unit mounted on top of the backpack also connects to the computer, and the battery that powers the whole system goes in the fanny pack.

A Bluetooth-enabled earpiece lets the wearer communicate with the AI backpack.

They can give it commands, such as “Describe,” which prompts the AI to list nearby objects along with their clock positions (e.g., “Stop sign at 2 o’clock”).
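The project’s code isn’t published in the article, but converting a detection’s position into a clock-face direction is simple to sketch. Assuming the camera reports each object’s horizontal offset and forward distance in metres (the names and numbers here are illustrative):

```python
import math

def clock_position(x: float, z: float) -> str:
    """Convert an object's horizontal offset (x, metres, right-positive) and
    forward distance (z, metres) into a clock-face direction, with 12 o'clock
    straight ahead and 3 o'clock directly to the wearer's right."""
    angle = math.degrees(math.atan2(x, z))  # 0 = straight ahead, 90 = right
    hour = round(angle / 30) % 12           # 30 degrees per clock hour
    return f"{hour or 12} o'clock"

def describe(detections):
    """Handle the 'Describe' command: list objects with their clock positions."""
    return [f"{label} at {clock_position(x, z)}" for label, x, z in detections]

# A stop sign ahead and to the right, a person straight ahead:
print(describe([("Stop sign", 3.5, 3.5), ("Person", 0.0, 5.0)]))
# ['Stop sign at 2 o'clock', 'Person at 12 o'clock']
```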

They can also hear the AI’s automatic warnings about potential dangers — to let them know a hanging branch is straight ahead, for example, the AI will say “Top, front.”
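A similarly small piece of logic could turn a hazard’s 3-D position into that kind of spoken warning. The coordinate convention and distance thresholds below are assumptions for illustration, not the project’s actual values.

```python
from typing import Optional

def hazard_warning(x: float, y: float, z: float,
                   warn_distance: float = 2.0) -> Optional[str]:
    """Map a hazard's position (metres, camera frame: x right, y up, z forward)
    to a short spoken warning such as "Top, front". Returns None when the
    object is still too far away to warn about."""
    if z > warn_distance:
        return None
    vertical = "Top" if y > 0.5 else ("Bottom" if y < -0.5 else "Center")
    horizontal = "left" if x < -0.5 else ("right" if x > 0.5 else "front")
    return f"{vertical}, {horizontal}"

# A branch hanging above head height, 1.5 metres straight ahead:
print(hazard_warning(0.0, 1.0, 1.5))  # "Top, front"
```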

MIRA can run for eight hours on a single charge and is designed to blend in.

The Future of MIRA

Mahendran told Freethink that the AI backpack prototype cost about $800 — that’s already thousands less than most smart glasses, but he and his team are working to get the cost down even further.

They plan to publish a research paper on the system in the near future and will make everything they develop for the project — the code, datasets, etc. — open source.


Right now, they’re raising funds for testing and looking for more volunteers to help them reach their ultimate goal of providing visually impaired people with an open-source, AI-based assistance system for free.

“It’s incredible to see a developer take Intel’s AI technology for the edge and quickly build a solution to make their friend’s life easier,” Hema Chamraj, Intel’s director of technology advocacy and AI4Good, said.

“The technology exists; we are only limited by the imagination of the developer community.”

