UML Roboticists are Programming the Future

Asst. Prof. Reza Ahmadzadeh holds a HEXA robot.

04/17/2019
By Ed Brennen

Asst. Prof. Reza Ahmadzadeh handed out two things on the first day of his Mobile Robotics 1 course this spring: a syllabus and a Cozmo, an artificial intelligence robot the size of a coffee mug that students can take home during the semester and program.
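For readers curious what programming a Cozmo might look like, here is a minimal sketch using Anki's Python SDK for the robot (the `cozmo` package); the specific behaviors shown are illustrative and are not the actual course assignments.

```python
# Minimal Cozmo sketch using Anki's Python SDK (pip install cozmo).
# The behaviors below are illustrative, not the actual coursework.
import cozmo
from cozmo.util import degrees, distance_mm, speed_mmps

def hello_cozmo(robot: cozmo.robot.Robot):
    # Have the robot introduce itself, drive forward, and turn around.
    robot.say_text("Hello, Mobile Robotics!").wait_for_completed()
    robot.drive_straight(distance_mm(150), speed_mmps(50)).wait_for_completed()
    robot.turn_in_place(degrees(180)).wait_for_completed()

if __name__ == "__main__":
    cozmo.run_program(hello_cozmo)
```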

One of four new faculty members to join the Computer Science Department this year, Ahmadzadeh specializes in robot learning and human-robot interaction. When he earned his master’s degree in mechanical engineering in his native Iran in 2001, Ahmadzadeh says, there was maybe one robot at the entire university. He’s excited now to be teaching and conducting research at the Kennedy College of Sciences, where students and faculty have access to the latest in robot technology, interdisciplinary labs and testing facilities.

“I’m lucky to be here and to be able to use these resources,” says Ahmadzadeh, who earned his Ph.D. in robotics, cognition and interaction technologies from the University of Genoa in Italy in 2015.

Ahmadzadeh doesn’t have to go far to find one of the country’s leading roboticists; his Dandeneau Hall office is next door to Distinguished University Professor Holly Yanco, director of the New England Robotics Validation and Experimentation (NERVE) Center and founder of UML’s Human-Robot Interaction Lab.

Despite all the talk of robots taking over jobs, they still have a way to go before they can function on par with humans, Yanco says. In fact, during a talk on the future of human-robot interaction last year at UMass Lowell’s Innovation Hub in Haverhill, she drew some surprised looks from the crowd when she described robots as essentially “stupid.”

“Right now, our robots will show you a video of the last five minutes of where they got stuck,” Yanco said as she described her team’s latest research on giving humanoid robots and other autonomous systems the ability to evaluate how well they can perform a task — or if they can do the task at all. “They don’t know how to quickly summarize a situation like people do. Our goal is to develop methods and metrics that would enable autonomous systems to assess their own performance.”

They’re doing so through a project called SUCCESS (Self-assessment and Understanding of Competence and Conditions to Ensure System Success), which is funded with a $7.5 million grant from the U.S. Department of Defense. UML is collaborating on the five-year initiative with three other institutions: Carnegie Mellon University, Brigham Young University and Tufts University.

Building a Better Robot

At the NERVE Center’s home at 110 Canal St. in Lowell, Yanco and her colleagues are evaluating how well a pair of $25,000 “Baxter” robots can complete assembly tasks, problem-solving scenarios and games. The red and black two-armed humanoid Baxters can display facial expressions on their built-in computer screens while they carry out their assigned tasks.

Working with researchers at Carnegie Mellon’s Robotics Institute, the team is building a software database that lays out all of the variables the robots could encounter and ways in which they could execute tasks based on their previous behavior. By looking at the robots’ track record, researchers hope to predict how well they will perform in the future. The data could be used by operators in the field to help them anticipate how the machines will behave. It could also help computer scientists and engineers design and build the next generation of enhanced robotics.
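As a rough illustration of the idea of predicting future performance from a robot’s track record, the sketch below estimates a per-condition success rate from logged task outcomes. The data structures and condition names are hypothetical, not the SUCCESS project’s actual software.

```python
# Hypothetical sketch: estimate how likely a robot is to succeed at a task
# under given conditions, based on its logged track record.
# Not the SUCCESS project's actual database or models.
from collections import defaultdict
from dataclasses import dataclass

@dataclass
class TaskRecord:
    task: str          # e.g., "pick_and_place"
    condition: str     # e.g., "cluttered_table" (illustrative label)
    succeeded: bool

class CompetenceModel:
    def __init__(self):
        self.outcomes = defaultdict(list)

    def log(self, record: TaskRecord):
        self.outcomes[(record.task, record.condition)].append(record.succeeded)

    def predicted_success_rate(self, task: str, condition: str) -> float:
        # Laplace-smoothed success rate, so unseen conditions default to 0.5.
        history = self.outcomes[(task, condition)]
        return (sum(history) + 1) / (len(history) + 2)

model = CompetenceModel()
model.log(TaskRecord("pick_and_place", "cluttered_table", True))
model.log(TaskRecord("pick_and_place", "cluttered_table", False))
print(model.predicted_success_rate("pick_and_place", "cluttered_table"))  # 0.5
```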

Prof. Holly Yanco will use a Baxter robot to develop self-assessing autonomous systems.

“Hopefully, the study will lead to better human-robot teamwork and increase the level of trust, expectation and efficiency between the two,” said Yanco, who described how even the simple task of a robot handing an item to a human co-worker requires careful study and consideration.

For good old-fashioned humans, working side-by-side with robots is becoming increasingly common across many industries. According to the World Economic Forum’s latest “Future of Jobs Report,” 37 percent of companies surveyed expect to be using stationary robots by 2022.

“That’s continuing to grow,” says Yanco, who sees collaborative robots — such as exoskeletons that people can wear to help them complete physically demanding tasks like lifting heavy boxes — as a particularly promising “breakout market.”

A Boost to Manufacturing

Since 2017, UML has also been a partner in the $250 million Advanced Robotics Manufacturing (ARM) Institute, a national initiative that focuses research on robots capable of interacting with humans on manufacturing floors and learning new manufacturing processes.

“Hopefully, this will bring back a lot of manufacturing to the country,” says Yanco, who is heading up the university’s regional ARM institute efforts along with the Massachusetts Institute of Technology’s Computer Science and Artificial Intelligence Laboratory, Northeastern University, Worcester Polytechnic Institute and MassRobotics, a nonprofit robotics innovation hub.

“We are proud to bring our resources and expertise, along with our excellence in advanced manufacturing research, to the nation’s first manufacturing innovation institute focused on robotics,” Chancellor Jacquie Moloney said when Gov. Charlie Baker announced the partnership. “This field holds tremendous promise for the economy and the creation of jobs for skilled workers in the commonwealth and beyond.”

Indeed, with 122 robotics-related companies employing close to 5,000 people across the commonwealth (according to the Massachusetts Technology Collaborative’s 2016 statistics), demand for highly skilled professionals is strong.

Given the cluster of robotics employers in the region, it’s no surprise that student interest is high. There are close to 60 students currently enrolled in UML’s robotics minor, an interdisciplinary 24-credit program offered by the departments of Computer Science, Electrical & Computer Engineering and Mechanical Engineering.

In addition to landing co-ops, internships and eventual jobs at companies such as iRobot, Amazon Robotics and Symbotic, robotics students can do hands-on research at the NERVE Center, which provides testing services for industry, academia and the government in collaboration with the National Institute of Standards and Technology.

The NERVE Center’s most famous alum is Valkyrie, NASA’s $2 million humanoid robot that moved to MassRobotics in Boston after a two-year stay on campus. UML students now travel there to work on the 6-foot, 300-pound robot, which one day may help build a space station on Mars.

Spread over the first and second floors at 110 Canal St. in downtown Lowell, the NERVE Center features a robotic manipulation testbed called the Robot ARMada, a collection of robotic arms, end effectors and sensor systems. The center also features movement assessment and performance labs where researchers collaborate with faculty from the Physical Therapy Department to evaluate wearable robots.

One area they’re currently studying is what happens to a person’s muscles over time when wearing an exoskeleton. By using sensors to look at muscle activation, researchers are trying to predict whether a person’s muscles will atrophy if they spend 40 hours a week carrying increased weight with robotic assistance. The goal is to develop an exoskeleton that can adjust to an individual throughout the day, thereby reducing fatigue and eventual injuries.

“In the morning, when you’re feeling strong, you can lift 150 pounds. Then later, when you’re not feeling so strong, the exoskeleton kicks in to help you,” Yanco explains. “We have to look at how people work, and how robots work, and how they’re going to work together.”
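A highly simplified sketch of that adaptive idea follows, assuming a wearable with a muscle-activation (EMG) sensor and a controllable assist level; the thresholds and scaling are made up for illustration and are not the NERVE Center’s actual controllers.

```python
# Hypothetical sketch of an exoskeleton that scales its assistance as the
# wearer fatigues. Sensor readings, thresholds, and the assist model are
# illustrative assumptions only.

def assist_fraction(muscle_activation: float, baseline: float) -> float:
    """Return the fraction of the load the exoskeleton should carry (0..1).

    muscle_activation: current normalized EMG reading for the lift.
    baseline: the wearer's typical activation for the same lift when fresh.
    """
    fatigue = max(0.0, muscle_activation / baseline - 1.0)  # >0 means working harder than usual
    return min(1.0, 0.2 + 0.6 * fatigue)  # always assist a little, cap at full assist

# Morning: activation near baseline -> light assistance.
print(assist_fraction(1.02, baseline=1.0))   # ~0.21
# Late in the shift: activation well above baseline -> the exoskeleton "kicks in."
print(assist_fraction(1.60, baseline=1.0))   # ~0.56
```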

But what about robots and people working together at home, à la Rosie, the robot maid on “The Jetsons”? Ahmadzadeh is working on an algorithm that would enable consumers to teach new skills to robots so they can adapt to dynamic environments—such as getting around someone’s kitchen to make them a pot of coffee.

“Instead of coding the robot with a new skill, which many people don’t know how to do, you can easily show the robot how to do something. It learns from your movements,” Ahmadzadeh explains.
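A bare-bones sketch of that learning-from-demonstration idea: record a few demonstrated trajectories, resample them to a common length, and average them into a single motion the robot can replay. This is a generic illustration, not Ahmadzadeh’s actual algorithm.

```python
# Bare-bones learning-from-demonstration sketch: average several demonstrated
# end-effector trajectories into one reproducible motion. Generic illustration only.
import numpy as np

def resample(trajectory: np.ndarray, n_points: int) -> np.ndarray:
    """Linearly resample a (T, D) trajectory to n_points samples."""
    t_old = np.linspace(0.0, 1.0, len(trajectory))
    t_new = np.linspace(0.0, 1.0, n_points)
    return np.column_stack(
        [np.interp(t_new, t_old, trajectory[:, d]) for d in range(trajectory.shape[1])]
    )

def learn_skill(demonstrations: list, n_points: int = 100) -> np.ndarray:
    """Combine kinesthetic demonstrations into a single averaged trajectory."""
    aligned = [resample(demo, n_points) for demo in demonstrations]
    return np.mean(aligned, axis=0)

# Two noisy demonstrations of the same 2-D reaching motion.
demo1 = np.column_stack([np.linspace(0, 1, 80), np.linspace(0, 0.5, 80)]) + 0.01 * np.random.randn(80, 2)
demo2 = np.column_stack([np.linspace(0, 1, 120), np.linspace(0, 0.5, 120)]) + 0.01 * np.random.randn(120, 2)
skill = learn_skill([demo1, demo2])   # (100, 2) trajectory the robot could replay
```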

While robots have made tremendous strides thanks to recent advances in neural networks, computation and hardware, Ahmadzadeh says true full autonomy is still the stuff of science fiction.

“It’s amazing how far we’ve come in the last 10 to 20 years, but the part that is missing is intelligence. We don’t have a notion of intelligence so far,” says Ahmadzadeh, who hopes to contribute to the breakthrough needed for this revolutionary step in robotics through his work at UMass Lowell.

“I want to see when the robots get to human level,” he says. “That’s what fascinates me about robotics.”