Yanco Awarded Over $1M for Robotics Research
By Edwin L. Aguirre
Robots have been used in disaster responses for more than a decade — at New York’s World Trade Center in 2001, in Biloxi, Miss., after Hurricane Katrina in 2005 and in the aftermath of Japan’s Fukushima Daiichi nuclear reactor meltdown in 2011. These machines helped search for survivors and assess damage in environments deemed too hazardous for humans.
“It’s better to put a robot in harm’s way than a person,” says computer science Prof. Holly Yanco, founder of the department’s Robotics Lab in Olsen Hall.
Thanks to a research grant from Google, Yanco and her team are developing a better way for first responders to communicate with each other and work with robots in the field. The one-year grant, worth nearly $52,000, is one of four robotics technology-related awards recently made to Yanco, totaling more than $1 million. The other funders are the National Science Foundation (NSF) and the National Institute of Standards and Technology (NIST).
“A major limitation of disaster responses is the difficulty of sharing information — between responders out in the field, between the field and command center, etc. — particularly as we increase the amount of available digital data from satellites, robots, handheld sensors and many other sources,” notes Yanco.
“For example, the video feed from current robot systems can only be seen by the robot operator and people looking at the computer screen over the operator’s shoulder. While the camera feed can be recorded and then brought or sent to people who are not at the deployment site, it introduces delay into the search process. Our goal is to improve disaster response through more effective information sharing, including the fusion of multiple data streams to create augmented visualizations,” she says.
Yanco, who is the principal investigator (PI) for the Google project, will use robots in conjunction with Google Glass, which has a video display mounted on eyeglass frames, and Project Tango, a tablet PC laden with an array of sensors. First responders and/or robots will be fitted with Google Glass and Project Tango to create a 3-D map as they move through the disaster site. The map is then sent to the command center in real time.
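The pipeline described above — accumulating pose-tagged sensor data into a 3-D map and streaming updates to a command center — can be sketched loosely in code. The following toy Python example is not the team’s actual software; the voxel resolution, message format and class names are all illustrative assumptions. It shows one simple way incremental map updates might be packaged for real-time transmission:

```python
import json

# Illustrative sketch only: a responder's device quantizes sensed obstacle
# points into a coarse 3-D occupancy grid and packages the voxels that are
# new since the last transmission, so a command center can rebuild the map
# incrementally. Voxel size and message schema are assumptions.

VOXEL = 0.5  # grid resolution in meters (assumed)

def to_voxel(point):
    """Quantize an (x, y, z) point in meters to integer voxel coordinates."""
    return tuple(int(c // VOXEL) for c in point)

class MapStreamer:
    def __init__(self):
        self.occupied = set()  # all voxels seen so far
        self.pending = []      # voxels not yet sent to the command center

    def add_hit(self, point):
        """Record a sensed obstacle point; queue its voxel if it is new."""
        v = to_voxel(point)
        if v not in self.occupied:
            self.occupied.add(v)
            self.pending.append(v)

    def flush_update(self, pose):
        """Package new voxels plus the responder's pose as one JSON update."""
        msg = json.dumps({"pose": pose, "new_voxels": self.pending})
        self.pending = []
        return msg

# Example usage: three range hits, two of which fall in the same voxel.
streamer = MapStreamer()
for pt in [(0.1, 0.2, 1.0), (0.2, 0.3, 1.1), (2.0, 0.0, 1.0)]:
    streamer.add_hit(pt)
update = streamer.flush_update(pose=[0.0, 0.0, 0.0])
```

In a real deployment the JSON string would be sent over the network as each update is flushed; the command center would merge the incoming voxels into its own copy of the map.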
“Once you have a detailed 3-D map of the site, you can develop a way of ‘looking through the walls’ during search-and-rescue operations; that is, predicting the kind of structure you will likely encounter behind a wall or door. We may even add a thermal imaging camera for quickly locating victims,” explains Yanco. “We will also employ the work of one of our former graduate students, Mark Micire, to direct a swarm of robots — or people — using Google Glass and a touch table computer such as Microsoft Surface.”
Yanco, who directs the university’s New England Robotics Validation and Experimentation (NERVE) Center, is also the PI on the other three research grants:
“Human-Supervised Perception and Grasping in Clutter,” NSF ($553,607): The three-year project, which is being conducted in collaboration with Asst. Prof. Robert Platt, Jr., of Northeastern University and the Crotched Mountain Rehabilitation Center in Greenfield, N.H., uses a robotic arm mounted on a motorized wheelchair or scooter to assist the elderly and people with disabilities in picking up things from a cluttered shelf, cabinet or floor. This assistive manipulation has the potential to be used in a variety of military, police, underwater and outer space applications.
“Development of Standard Test Methods for Response Robots with an Emphasis on Training,” NIST ($280,090): The NERVE Center will develop international standard test methods to quantitatively evaluate the operating proficiency of police bomb-squad members and other emergency responders. The goal of the three-year project is to create training methods that shorten team members’ learning curves in operating a robot, track their progress and test scores, and produce better operators, regardless of the robot’s manufacturer.
“Research and Development of Test Methods for Autonomous Robots in Manufacturing Environments,” NIST ($181,380): More and more robots are being used in manufacturing plants around the world. Mobile robots, called automated guided vehicles (AGVs), can be found moving across production floors as well as warehouses and order-fulfillment centers. The two-year project aims to test how safely these AGVs work with and around people amid the obstacles of various manufacturing environments and workplace scenarios. The technology can also be applied to other fields involving mobile, autonomous machines, such as hospital-delivery robots, commercial floor-cleaning robots and agricultural robots.
Growth in the robotics market continues, with robots used not only in the automotive manufacturing industry but also in life sciences and materials handling and processing. In fact, the first quarter of 2015 set a new record for robots ordered and shipped in North America, according to the Robotic Industries Association.
So what does Yanco think of the future of human-robot interaction?
“Robots will still need to communicate with people,” she says. “They will not take over jobs — it just may be that the jobs will be different as people work together with robots. Robots will not take over the world, so there is no need to worry!”