A woman sits in an autonomous wheelchair in a hallway setting.

Merging autonomy with manual control for wheelchair users

U-M researchers are tackling the same trust issues as the AV industry.

Experts

Vineet Kamat

Portrait of Vineet Kamat


John L. Tishman Family Professor of Construction Management and Sustainability

Professor of Civil and Environmental Engineering

Carol Menassa

Portrait of Carol C. Menassa.


Professor of Civil and Environmental Engineering and John L. Tishman Construction Management Faculty Scholar

Autonomous technologies promise passengers travel without concern—the ability to get from Point A to Point B without needing to be engaged in the process. Yet passengers still don’t trust computers the way they trust human drivers, and most autonomous vehicles on the road today are equipped for a human to take over.

The same technologies that allow autonomous vehicles (AVs) to navigate city streets can also give motorized wheelchair users the ability to get from place to place without personally controlling the chair, but wheelchair users also want a way to override the computer. Engineers at the University of Michigan aim to provide that functionality.

Video transcript

Carol: There is a lot of research that focuses on making wheelchairs autonomous.

Carol: However, if we look at the needs of the people who are going to use this, there are certain scenarios and situations that call for shared control.

Narrator: This is called CoNav, a semi-autonomous wheelchair that prioritizes sharing control with its user. It was developed by Michigan Engineering researchers to help relieve some of the challenges wheelchair users may have with navigating and maneuvering through environments that are complex or unfamiliar.

Vineet: Some of these obstacles are so profound that individuals may often decide, instead of going through those difficulties, they might as well not partake in a particular activity.

Narrator: As the need for temporary and permanent use of wheelchairs rises, assistive tech like robotic and autonomous wheelchairs could go a long way in promoting independent mobility and reducing reliance on caregivers. But they often fail to consider a user’s personal preferences while navigating or maneuvering.

Vineet: Perhaps the user, if they were driving the wheelchair manually, may have a comfortable speed at which they travel in a straight line. Maybe they have a comfortable distance and speed at which they pass through other humans in the hallway.

Narrator: With CoNav, a user can choose their preferred path through an environment and let the wheelchair guide them to their destination. Along that path, CoNav observes the surroundings, maneuvering around and avoiding any obstacles in the way. If the chair gets too close to something, or the user feels the chair is not navigating how they prefer, they can take back control.

Carol: Trust is very important in this type of situation because there are so many things at stake. You want to trust that you’re going to be safe, and that any people in that environment are going to be safe.

Yifan: When you feel it is too close to the wall or too close to the obstacle…

Vineet: It’s about the user, through demonstration over time, letting the robotic platform know what their preferences are under what set of circumstances, and the robot recognizing those as a pattern and slowly imbibing them into its own behavior and autonomy.

Carol: The feedback between the user and the system is very important, and that is the feedback that’s going to, over time, initiate or establish that trust in the system.

Narrator: In the future, the researchers are planning to have wheelchair users test CoNav’s capabilities, which they’re hoping will be the next step towards integrating the system into existing powered wheelchairs.

Vineet: We really have an opportunity to make a real impact on society by supporting a group that will really benefit from having a capability like this that will keep them mobile—allow them to partake of everything life has to offer, longer, over time.

Most wheelchairs available to those needing regular transportation are either fully manual or fully autonomous, with few options in between. The research team is harnessing light detection and ranging (LiDAR) sensors and an onboard camera to give people with disabilities both the freedom to let the software drive and the ability to intervene when they are less trusting of autonomous decisions.

Labeled image of a motorized CoNav wheelchair, equipped with an enhanced display, joystick, VLP-16 LiDAR, Witmotion IMU, ZED camera, and US Digital Encoder.
The CoNav autonomous wheelchair developed by U-M engineers gives users the option of taking control. Photo: Vineet Kamat, Carol Menassa
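The article does not spell out how CoNav decides when to yield to its rider, but a common pattern for this kind of oversight is to let LiDAR range readings throttle the autonomous speed command and prompt the rider near obstacles. The sketch below illustrates that general idea only; the function, thresholds and behavior are assumptions, not details of the U-M system.

```python
# Minimal sketch (not the CoNav implementation): gate the autonomous speed
# command using LiDAR range readings, and flag when the rider may want to
# take over. Thresholds and names are illustrative assumptions.

SLOW_DISTANCE_M = 1.5   # start slowing when an obstacle is within this range (assumed)
STOP_DISTANCE_M = 0.5   # stop and offer manual control below this range (assumed)


def gate_speed(planned_speed_mps, lidar_ranges_m):
    """Scale the planner's forward speed by proximity to the nearest obstacle.

    Returns (speed_mps, offer_manual_control).
    """
    nearest = min(lidar_ranges_m)
    if nearest <= STOP_DISTANCE_M:
        # Too close: stop and let the rider decide how to proceed.
        return 0.0, True
    if nearest <= SLOW_DISTANCE_M:
        # Scale speed linearly between the stop and slow thresholds.
        scale = (nearest - STOP_DISTANCE_M) / (SLOW_DISTANCE_M - STOP_DISTANCE_M)
        return planned_speed_mps * scale, False
    return planned_speed_mps, False


# Example: an obstacle 0.9 m away scales a 1.0 m/s planned speed to 0.4 m/s.
print(gate_speed(1.0, [3.2, 0.9, 2.4]))
```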

“The sweet spot is something we call ‘shared control’ or ‘shared autonomy,’ where the robot is assisting you to the extent you want, but it is not ever putting the passenger in a situation where they cannot control their destiny,” said Vineet Kamat, the U-M John L. Tishman Family Professor of Construction Management and Sustainability and a professor of civil and environmental engineering. 

“People with physical disabilities primarily want to maintain their independence but have significant navigation and maneuvering challenges operating in the built environment.”
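Shared control of this kind is often implemented by blending the rider's joystick command with the autonomous planner's command, weighted by how much assistance the rider wants at that moment. The following is a minimal sketch of that generic formulation; the DriveCommand type, the linear blend and the assist_level weight are illustrative assumptions and are not taken from CoNav.

```python
from dataclasses import dataclass


@dataclass
class DriveCommand:
    linear_mps: float    # forward speed, m/s
    angular_rps: float   # turning rate, rad/s


def blend(user: DriveCommand, auto: DriveCommand, assist_level: float) -> DriveCommand:
    """Blend rider and autonomous commands.

    assist_level = 0.0 means full manual control, 1.0 means full autonomy.
    A linear blend is the simplest shared-autonomy rule; real systems
    typically add safety limits and smarter arbitration on top of it.
    """
    a = min(max(assist_level, 0.0), 1.0)
    return DriveCommand(
        linear_mps=(1.0 - a) * user.linear_mps + a * auto.linear_mps,
        angular_rps=(1.0 - a) * user.angular_rps + a * auto.angular_rps,
    )


# Example: a rider who wants mostly autonomous driving (assist_level = 0.8)
# but nudges the joystick to slow down and turn slightly left.
cmd = blend(DriveCommand(0.2, 0.1), DriveCommand(0.8, 0.0), assist_level=0.8)
print(cmd)  # approximately 0.68 m/s forward, 0.02 rad/s turn
```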

In many instances, the prospect of having to steer through a complex or crowded pathway may discourage people with disabilities from partaking in social, business or educational opportunities. In the U.S., roughly 2.7 million people per year experience health issues that require the use of a wheelchair.

Carol Menassa sits in the CoNav autonomous wheelchair while two researchers stand nearby during testing.
Carol Menassa, a U-M professor of civil and environmental engineering, sits in the CoNav autonomous wheelchair during testing. At left is undergraduate student Qianwei Wang, and Vineet Kamat, the U-M John L. Tishman Family Professor of Construction Management and Sustainability and a professor of civil and environmental engineering, is in the middle. Photo: Marcin Szczepanski, Michigan Engineering.

Kamat and Carol Menassa, a U-M professor of civil and environmental engineering and a John L. Tishman Construction Management Faculty Scholar, have collaborated for years, helping robots understand and reason about built environments, both indoors and out. This year, their research team outfitted a motorized wheelchair with both LiDAR and a 3D camera, tapped into the wheelchair’s drive system and wrote algorithms that allow for shared control with the help of a video game controller.
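The story does not describe how the game controller is wired into the drive system. One plausible arrangement, assuming a ROS-style software stack that the article does not confirm, is a small multiplexer node that forwards the planner's velocity command until the gamepad is pushed past a dead zone, at which point the rider's input takes precedence. Topic names, scaling factors and the dead zone below are all hypothetical.

```python
#!/usr/bin/env python
# Hypothetical sketch of a gamepad-override node, assuming a ROS setup with
# standard Joy and Twist messages. Topic names, scales and the dead zone are
# assumptions; this is not the CoNav codebase.
import rospy
from sensor_msgs.msg import Joy
from geometry_msgs.msg import Twist

DEADZONE = 0.15        # stick deflection below this is treated as "hands off"
MAX_LINEAR = 0.8       # m/s, assumed comfortable top speed
MAX_ANGULAR = 1.0      # rad/s


class OverrideMux:
    def __init__(self):
        self.user_cmd = None
        self.pub = rospy.Publisher("cmd_vel", Twist, queue_size=1)
        rospy.Subscriber("joy", Joy, self.on_joy)
        rospy.Subscriber("planner_cmd_vel", Twist, self.on_planner)

    def on_joy(self, msg):
        # Map the left stick to a drive command; ignore small deflections.
        forward, turn = msg.axes[1], msg.axes[0]
        if abs(forward) < DEADZONE and abs(turn) < DEADZONE:
            self.user_cmd = None          # rider is not overriding
            return
        cmd = Twist()
        cmd.linear.x = forward * MAX_LINEAR
        cmd.angular.z = turn * MAX_ANGULAR
        self.user_cmd = cmd

    def on_planner(self, msg):
        # Rider input, when present, preempts the autonomous planner.
        self.pub.publish(self.user_cmd if self.user_cmd is not None else msg)


if __name__ == "__main__":
    rospy.init_node("override_mux")
    OverrideMux()
    rospy.spin()
```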

“Trust is very important in this type of situation, because there are so many things at stake,” Menassa said. “You want to trust that you are going to be safe and that any people in the environment are going to be safe.”

It’s the same problem the auto industry is confronting. In October 2023, J.D. Power reported: “Consumer confidence in fully-automated, self-driving vehicles continues to decline for the second consecutive year… Consumers show less readiness on all metrics, with the lowest level of comfort riding in a fully automated, self-driving vehicle and using fully automated, self-driving public transit.”

They have been testing their system, called CoNav, in the basement corridors of the G.G. Brown Building on North Campus, with able-bodied volunteers operating the chair.

“Feedback between the user and the system is very important,” Menassa said, “and that is the feedback that’s going to, over time, initiate or establish that trust.”

A woman sits in a wheelchair as another person places a block M sticker on the back of a wheelchair. Two other people observe, standing closely by.
Researchers add finishing touches to the CoNav autonomous wheelchair, developed at U-M. Photo: Marcin Szczepanski, Michigan Engineering.

The research team includes Yifan Xu, a U-M graduate student research assistant; Jordan Lillie, a U-M biomedical engineering technician; and undergraduate student Qianwei Wang.

Following the technical validation and testing that’s currently underway, the team will turn its focus to testing with people with disabilities.