Improving assistive technology

Students design robotic wearable to empower the visually impaired

Foresight co-founders (clockwise from upper left) Ed Bayes, Milan Wilborn, Nick Collins, and Anirban Ghosh.

For the 285 million visually impaired people worldwide, assistive technology has come a long way since the white cane was popularized in the 1920s. Yet high-tech solutions to help these individuals navigate the world around them can often be intrusive, unintuitive, and expensive.

A group of Harvard students has launched Foresight, a startup developing a wearable navigation aid for the visually impaired that uses cutting-edge soft robotics and computer vision technology. The device connects to the camera of a user’s smartphone, which is worn around the neck, to detect nearby objects; as objects approach and pass, soft textile units on the wearer’s body inflate to provide haptic feedback.
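That description suggests a simple sense-to-feedback loop: detect an object, estimate its distance and direction, and inflate the actuator facing it. The sketch below shows one way such a loop could be wired together; the Detection fields, the three-meter range, and the set_pressure callback are illustrative assumptions, not details of Foresight’s actual software.

```python
from dataclasses import dataclass

@dataclass
class Detection:
    label: str          # object class reported by the vision model
    distance_m: float   # estimated distance to the object, in meters
    bearing_deg: float  # direction relative to the wearer (0 = straight ahead)

def feedback_level(distance_m: float, max_range_m: float = 3.0) -> float:
    """Map distance to a 0-1 inflation level: closer objects press harder."""
    return max(0.0, 1.0 - distance_m / max_range_m)

def update_haptics(detections, set_pressure):
    """Inflate the textile unit facing each detected object."""
    for det in detections:
        set_pressure(det.bearing_deg, feedback_level(det.distance_m))

# Example with a stubbed actuator interface that just prints commands:
update_haptics(
    [Detection("person", 1.5, 20.0), Detection("chair", 2.5, -60.0)],
    set_pressure=lambda bearing, level: print(f"{bearing:+.0f} deg -> {level:.2f}"),
)
```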

The group won the Innovation Award in the 2020 Harvard President’s Innovation Challenge, which recognizes ideas in the early stages of development with the potential to be world-changing.

"Being diagnosed with a permanent eyesight problem means getting used to a whole new way of navigating the world," explained Ed Bayes, a student in Harvard’s Master in Design Engineering (MDE) program, offered jointly by the John A. Paulson School of Engineering and Applied Sciences (SEAS) and Graduate School of Design (GSD). “We spoke to people living with visual impairment to understand their needs and built around that.”

The startup was born out of the joint SEAS/GSD course Nano Micro Macro, in which teams of students are challenged to apply emerging technology from Harvard labs. Foresight was inspired by students’ experience with blind family members who spoke of the stigma surrounding the use of assistive devices.

A rendering of the Foresight wearable navigation aid. (Image provided by Foresight)

“Importantly, Foresight is discreet, affordable, and intuitive. It provides an extra layer of comfort to help people move around more confidently,” Bayes added.

The teammates shared an interest in using technology to help people with disabilities and gravitated toward soft robotics, working with the labs of Katia Bertoldi, William and Ami Kuan Danoff Professor of Applied Mechanics; and Conor Walsh, Paul A. Maeder Professor of Engineering and Applied Sciences.

“Most wearable navigation aids rely on vibrating motors, which can be uncomfortable and bothersome to users,” said Anirban Ghosh, M.D.E. ’21. “Soft actuators are more comfortable and can provide the same tactile information.”

With Foresight, the distance between an object and the user determines the amount of pressure they feel on their body from the actuators: the closer the object, the stronger the inflation.

“The varied inflation of multiple actuators represents the angular differences of where those objects are in space,” said Nick Collins, M.D.E. ’21. “We settled on a simple, low-tech solution because we want this to be widely adoptable in places that don’t have access to a lot of high-end fabrication facilities. The simpler it is, the more widespread it can actually be deployed, which was important to us.”
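Collins’ description suggests that each actuator’s inflation combines a distance term with an angular-alignment term. Below is a minimal sketch of one such mapping, assuming a hypothetical four-actuator ring and a normalized 0–1 pressure scale; none of these constants or names come from Foresight.

```python
NUM_ACTUATORS = 4    # assumed layout: front, right, back, left
MAX_RANGE_M = 3.0    # assumed: objects beyond this produce no feedback

def actuator_pressures(distance_m: float, bearing_deg: float) -> list[float]:
    """Return a 0-1 inflation level for each actuator in the ring.

    Closer objects inflate harder; the bearing selects which actuators
    inflate, with neighbors sharing partial pressure for angular cues.
    """
    # Pressure grows linearly as the object approaches.
    strength = max(0.0, 1.0 - distance_m / MAX_RANGE_M)

    pressures = [0.0] * NUM_ACTUATORS
    sector = 360.0 / NUM_ACTUATORS
    for i in range(NUM_ACTUATORS):
        center = i * sector
        # Smallest angular difference between the object and this actuator.
        delta = abs((bearing_deg - center + 180.0) % 360.0 - 180.0)
        # Full weight when aligned, fading to zero one sector away.
        weight = max(0.0, 1.0 - delta / sector)
        pressures[i] = strength * weight
    return pressures

# Example: an object 1 m away, 30 degrees to the right of straight ahead,
# inflates the front actuator most and the right actuator partially.
print(actuator_pressures(1.0, 30.0))
```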

This mock-up shows how a user would wear the Foresight navigational aid. (Image provided by Foresight)

“The computer vision algorithm detects and classifies objects,” added Milan Wilborn, a materials science and mechanical engineering Ph.D. student in the lab of Joanna Aizenberg, Amy Smith Berylson Professor of Materials Science. “The software traces a bounding box around each object it ‘sees’ and by calibrating the ratios of that bounding box, the software can estimate how far an object is from the camera, and how quickly it is moving.”
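Under a standard pinhole-camera model, a bounding box alone can yield a rough range estimate once the camera is calibrated and typical object sizes are known. The sketch below illustrates that idea; the focal length, height priors, and function names are assumptions for illustration, not values from Foresight’s calibration.

```python
KNOWN_HEIGHTS_M = {"person": 1.7, "chair": 0.9, "door": 2.0}  # assumed priors
FOCAL_LENGTH_PX = 1200.0  # assumed, from a one-time phone-camera calibration

def estimate_distance(label: str, box_height_px: float) -> float:
    """Estimate distance (m) from a detection's bounding-box height.

    Pinhole model: pixel_height = focal_length * real_height / distance,
    so distance = focal_length * real_height / pixel_height.
    """
    real_height = KNOWN_HEIGHTS_M.get(label, 1.0)  # fall back to a 1 m prior
    return FOCAL_LENGTH_PX * real_height / max(box_height_px, 1.0)

def estimate_speed(dist_prev_m: float, dist_curr_m: float, dt_s: float) -> float:
    """Closing speed (m/s) from the change in distance between two frames."""
    return (dist_prev_m - dist_curr_m) / dt_s

# Example: a person's bounding box grows from 300 px to 340 px over 0.2 s.
d1 = estimate_distance("person", 300.0)  # ~6.8 m
d2 = estimate_distance("person", 340.0)  # ~6.0 m
print(estimate_speed(d1, d2, 0.2))       # ~4 m/s closing speed
```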

The support and advice they’ve received from mentors in the Harvard Innovation Labs have been critical to their progress. The co-founders are now revisiting some of the tricky questions related to the software and sensors, and they plan to do more prototyping and user testing in the future.

“We want to know if the information we are giving them actually translates into a user-friendly interpretation of objects in the space around them,” Collins said. “This is another tool in their arsenal. Our desire isn’t necessarily to get rid of any current tools, but to provide another, more robust sensory experience.”

The team is inspired by the opportunity to use emerging technology to deliver a practical solution that could help many people.

“As a Ph.D. student, it is exciting to see this technology that is created in the lab, these non-linear mechanical systems, translating out of the lab and turning into a potential product,” Wilborn said. “It is so rewarding to be able to work on developing a device that could help people.”

Topics: Entrepreneurship

Press Contact

Adam Zewe | 617-496-5878 | azewe@seas.harvard.edu