Researchers to Develop Human-like Navigating Robots

Study Funded by $1.5M NSF Grant

Hugo, an augmented VGo telepresence robot from VGo Communications, is shown being driven remotely by a human operator (visible on Hugo's screen). Participating in a mobile conversation with Hugo is Adam Norton, a UMass Lowell fine arts graduate who now works as an educator and designer in Yanco's Robotics Lab.

By Edwin L. Aguirre

A team of researchers from UMass Lowell, the University of Michigan and Tufts University has received a two-year grant from the National Science Foundation (NSF) totaling nearly $1.5 million to create intelligent robot systems that will navigate more like humans.

For its part, UMass Lowell will receive nearly $409,000 for the project, with Computer Science Prof. Holly Yanco as principal investigator.

“Our research will develop and evaluate an intelligent robot capable of being genuinely useful to a human and capable of natural dialog with a human about their shared navigation task,” says Yanco. “In particular, the robots will be able to ask for directions and clarifications to those directions.”

The team’s work will be tested on two platforms: robotic wheelchairs and telepresence robots. Robotic wheelchairs help people travel to their desired destinations, while telepresence robots serve as virtual eyes and ears for a remote human operator as they navigate an environment.

Yanco says the research will create mobility-assistance technologies for people with disabilities in perception (blindness or low vision), cognition (developmental delay or dementia) or general frailty (old age).

“It will also support telepresence applications such as telecommuting, telemedicine and search and rescue,” she says.