Learn How Robots Can Help When Disaster Strikes

Robotics technology plays an increasingly important role in search-and-rescue missions. Robots are used to explore areas that are deemed too dangerous or difficult for human teams to access. They can, for example, be used to investigate a hazardous material spill or search for disaster survivors. In a disaster, a robot may save the lives of not only the victims but also the rescue workers who might otherwise place themselves in harm's way to search for survivors. Because of the life-saving potential of search-and-rescue robots in emergency situations, researchers are investigating better ways to control the robots in stressful and challenging environments.

Robots that are used for search and rescue are essentially an extension of the human rescue team. Cameras, microphones, and other sensors that are attached to the robot transmit critical information to the rescue team, which typically controls the robot's movement remotely. Until recently, rescuers who managed a search-and-rescue robot normally had to manipulate complicated joysticks, dials, and switches on a very elaborate controller with multiple electro-mechanical parts. As described in our blog entry last year, the robotics lab at the University of Massachusetts, Lowell (UML), has developed a natural user interface (NUI) controller that promises greater finesse and control of robots during search-and-rescue operations.

Today we are pleased to present a new short video that highlights the accomplishments of this work and gives you an update on its status.

Building the DREAM Controller

The Lowell robotics lab takes a NUI approach for the Dynamically Resizing Ergonomic and Multi-Touch (DREAM) Controller, which has been in development since 2008. Two Microsoft technologies underlie the DREAM Controller: the Microsoft Robotics Developer Studio (used for simulation) and Microsoft Surface (the user interface).

The Microsoft Surface is a coffee-table-sized device with a computer inside and a touch-sensitive interface on top. The Surface allows multiple users to interact with the computer simultaneously by using whole-hand or multiple-finger gestures. These gestures enable rescue teams to control robots with greater dexterity than they could with traditional robotics controllers—and precise control of the robots is critical for search-and-rescue efforts. In addition, the Surface permits more than one robot to be controlled simultaneously—previously not possible with a single controller.

To use the DREAM Controller implemented on the Surface, users simply place their hands on the interface. The DREAM Controller identifies the user's fingers and thumbs and displays a virtual "joystick" beneath each hand. The user then manipulates the virtual joystick with their thumb. With a joystick under each hand, the two thumbs provide up to four degrees of freedom (X-Y on each thumb), enabling control of four dimensions at once.
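To make the thumb-to-joystick mapping concrete, here is a minimal sketch of how a touch interface might translate thumb displacement into normalized joystick axes. The rest points, pixel radius, and the left/right axis assignments are all illustrative assumptions, not the DREAM Controller's actual implementation:

```python
import math

def thumb_axes(rest, current, radius=60.0):
    """Map a thumb's displacement from its rest point to normalized
    joystick axes in [-1, 1]. `radius` (in pixels) is an assumed
    maximum thumb travel; a real system would calibrate it per hand."""
    dx = current[0] - rest[0]
    dy = current[1] - rest[1]
    # Clamp the displacement to the joystick's circular range.
    dist = math.hypot(dx, dy)
    if dist > radius:
        dx, dy = dx * radius / dist, dy * radius / dist
    return dx / radius, dy / radius

# Two thumbs yield four degrees of freedom: for example, the left
# thumb could drive translation and the right thumb camera pan/tilt
# (a hypothetical mapping for illustration).
left = thumb_axes((100, 300), (130, 300))   # half deflection right
right = thumb_axes((500, 300), (500, 240))  # full deflection up
```

The key idea is that each thumb contributes two continuous axes, so a single pair of hands can command four independent control dimensions.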

The Lowell team (Holly Yanco and Mark Micire) is also developing a series of pre-programmed gestures with guidance from expert search-and-rescue volunteers. The goal is to develop code that enables the DREAM Controller to recognize specific gestures that rescue workers make naturally during a search-and-rescue operation, thereby facilitating and accelerating rescue efforts.
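As a rough illustration of what recognizing a pre-programmed gesture can look like, the sketch below classifies a single touch stroke by its dominant direction. The gesture names and thresholds are hypothetical; the Lowell team's actual gesture vocabulary comes from expert search-and-rescue volunteers:

```python
def classify_swipe(start, end, min_len=50.0):
    """Classify a touch stroke as one of four hypothetical swipe
    gestures based on its dominant direction. Strokes shorter than
    `min_len` pixels are ignored as accidental touches."""
    dx, dy = end[0] - start[0], end[1] - start[1]
    if (dx * dx + dy * dy) ** 0.5 < min_len:
        return None  # too short to count as a deliberate gesture
    if abs(dx) >= abs(dy):
        return "pan-right" if dx > 0 else "pan-left"
    # Screen coordinates: y decreases toward the top of the display.
    return "move-forward" if dy < 0 else "move-back"

classify_swipe((0, 0), (120, 10))   # a rightward swipe
classify_swipe((0, 0), (5, -200))   # an upward swipe
```

Real multi-touch gesture recognition is considerably richer (multiple fingers, timing, and shape all matter), but the principle is the same: map a natural motion to a discrete command the robot can act on.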

The novel NUI approach to robotics that the Lowell robotics lab employed in this socially significant application helped the DREAM Controller project win one of eight grants that Microsoft Research awarded under our Social Human Robot Interaction Request for Proposals (RFP). The grant included financial support, a donated Microsoft Surface, and access to the Microsoft Research team.

I think the DREAM Controller project truly shows what a better first response system—using NUI technology—could look like in the very near future. Check out the video!

Stewart Tansley, Senior Research Program Manager, Microsoft Research Connections

Learn More