Grad Student Controls Swarm of Robots With Fingertips

Multi-Touch Controller Gets Nearly 87,000 Hits on YouTube

Micire can manipulate multiple robots by tapping, tagging and dragging the individual icons with his fingertips.

09/22/2010
By Edwin L. Aguirre

Imagine being able to control an army of robots with the touch of your fingertips. Sound like science fiction? Not to Mark Micire, a Ph.D. student in the Robotics Lab in the Computer Science Department.

Micire has developed a simple yet effective multi-touch interface for commanding and controlling a swarm of robots as part of his doctoral thesis. Such technology could conceivably be applied to military and police planning, disaster relief and search-and-rescue operations, warehouse inventory, and environmental monitoring and mapping, to name a few.

Micire presented his program to the public on Aug. 23 during his thesis defense.

“In addition to having a full audience in the room, he broadcast the talk online, with another 40 people attending that way,” says Assoc. Prof. Holly Yanco, the lab’s director. “Before his defense, the talk was promoted on Microsoft Surface’s blog and Twitter feed.”

Micire’s presentation has since been featured on numerous technology blogs, including Slashdot, Wired, Popular Science and Gizmodo. As of Sept. 24, his YouTube video demonstration had been viewed nearly 87,000 times.

Micire achieved his breakthrough by merging an existing technology, Microsoft’s Surface computer, with his new, innovative onscreen “joystick,” which he dubbed the DREAM controller.

Microsoft Surface is an interactive computer with a large, 30-inch tabletop flat-screen display. It lets users grab and manipulate digital content and move information using simple touch and hand gestures and object recognition instead of a typical mouse and keyboard.

“The potential of Microsoft Surface has been underutilized,” says Micire. “Since its introduction, it has been used mainly for entertainment, gaming and product demos. My program is one of the first practical, real-world applications of Surface.”

His DREAM controller is a simple, intuitive command-and-control interface that uses a rapid hand detection and recognition algorithm.

“The DREAM controller can provide all the functionality of a physical joystick through multi-touch interaction,” he says.

The controller, which is displayed on Microsoft Surface directly underneath each user’s hand, automatically tracks hand movements and can respond to up to 10 points of contact simultaneously, compared with the single touch point a typical touch screen recognizes.
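To give a rough sense of how a fingertip can stand in for a physical stick, here is a minimal sketch in Python. It is an invented illustration rather than Micire’s code or the actual Microsoft Surface API; the Contact and VirtualJoystick names are hypothetical. The idea is simply to treat a fingertip’s displacement from the point where it first lands as the joystick position, springing back to center when the finger lifts.

from dataclasses import dataclass

@dataclass
class Contact:
    """A single touch point reported by a multi-touch display."""
    touch_id: int
    x: float
    y: float

class VirtualJoystick:
    """Hypothetical onscreen joystick anchored where a fingertip first lands."""

    def __init__(self, radius: float = 100.0):
        self.radius = radius      # onscreen throw of the stick, in pixels
        self.anchor = None        # where the finger first touched
        self.axes = (0.0, 0.0)    # current stick output, each axis in [-1, 1]

    def touch_down(self, c: Contact) -> None:
        self.anchor = (c.x, c.y)
        self.axes = (0.0, 0.0)

    def touch_move(self, c: Contact) -> None:
        if self.anchor is None:
            return
        dx = (c.x - self.anchor[0]) / self.radius
        dy = (c.y - self.anchor[1]) / self.radius
        # Clamp to the unit circle so the output behaves like a real thumbstick.
        mag = (dx * dx + dy * dy) ** 0.5
        if mag > 1.0:
            dx, dy = dx / mag, dy / mag
        self.axes = (dx, dy)

    def touch_up(self, c: Contact) -> None:
        self.anchor = None
        self.axes = (0.0, 0.0)    # spring back to center when the finger lifts

A driving loop would read the axes every frame and translate them into forward and turning speeds for whichever robot is being teleoperated.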

In his demo, Micire uses the joystick to control a swarm of hypothetical smart “robots” as they navigate and explore a virtual city block. Through simple fingertip commands, he is able to select robots from a group, tag them, set waypoints and coordinate their formation from above, quickly, precisely and with very little effort. He can also swoop down to a street-level view and manually control each robot, panning and zooming in on its location to provide a real-time pair of eyes on the ground. This capability offers a great tactical advantage, especially in law enforcement and counterterrorism. One programmer has already adopted the DREAM controller for use in Portal and MS Flight Sim games on Microsoft Surface.
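The select-and-command workflow in the demo can be pictured with another short, hypothetical Python sketch. The Robot and SwarmController classes below are invented for illustration and are not drawn from Micire’s system; they only show how tapping near an icon might toggle a robot in or out of the current selection, and how a subsequent tap on open map space could queue a waypoint for every selected robot.

import math

class Robot:
    def __init__(self, name, x, y):
        self.name = name
        self.x, self.y = x, y
        self.waypoints = []               # ordered list of (x, y) goals

class SwarmController:
    """Hypothetical tap-to-select, tap-to-send logic for a swarm display."""

    def __init__(self, robots, tap_radius=30.0):
        self.robots = robots
        self.selected = set()             # names of currently selected robots
        self.tap_radius = tap_radius      # how close a tap must be to an icon

    def tap(self, x, y):
        """A tap near a robot icon toggles that robot's selection."""
        for r in self.robots:
            if math.hypot(r.x - x, r.y - y) <= self.tap_radius:
                self.selected.symmetric_difference_update({r.name})
                return r
        return None

    def set_waypoint(self, x, y):
        """A tap on open map space queues a waypoint for every selected robot."""
        for r in self.robots:
            if r.name in self.selected:
                r.waypoints.append((x, y))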

Micire says data from multiple types of robots can be integrated into one seamless command-and-control display. The system can also be combined with data sets from all kinds of sources, such as building blueprints, city maps, topographic maps and more.
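One common way to achieve that kind of integration, sketched here in hypothetical Python, is to put every robot type behind the same small interface so a single display can render them all over a shared map layer. The RobotFeed protocol and the feed classes below are invented for illustration, not taken from Micire’s software.

from typing import Protocol, Tuple

class RobotFeed(Protocol):
    """Minimal interface any robot type would expose to the display."""
    def position(self) -> Tuple[float, float]: ...
    def status(self) -> str: ...

class GroundRobotFeed:
    def __init__(self, x: float, y: float):
        self._x, self._y = x, y
    def position(self):
        return (self._x, self._y)
    def status(self):
        return "patrolling"

class AerialRobotFeed:
    def __init__(self, x: float, y: float, altitude: float):
        self._x, self._y, self._alt = x, y, altitude
    def position(self):
        return (self._x, self._y)
    def status(self):
        return f"airborne at {self._alt:.0f} m"

def render_overlay(feeds, map_layer="building blueprint"):
    """Draw every robot, whatever its type, on top of one shared map layer."""
    for f in feeds:
        x, y = f.position()
        print(f"[{map_layer}] robot at ({x:.0f}, {y:.0f}) - {f.status()}")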

Micire’s project is partly funded through Microsoft and Yanco’s National Science Foundation CAREER grant.

For more information, go to robotics.cs.uml.edu. You can also watch Micire’s thesis defense at www.vimeo.com/14543098 and a Q&A at www.vimeo.com/14548015.