
Gesture-controlled computers and robotic nurses being developed for operating rooms

By Ben Coxworth

February 7, 2011


Researchers are developing a system that would allow surgeons to control both computers and robotic scrub nurses via hand gestures (Photo: Purdue University)


Although surgeons need to frequently review medical images and records during surgery, they’re also in the difficult position of not being able to touch non-sterile objects such as keyboards, computer mice or touchscreens. Stepping away from the operating table to check a computer also adds time to a procedure. Researchers from Indiana’s Purdue University are addressing this situation by developing gesture-recognition systems for computers, so that surgeons can navigate through and manipulate screen content simply by moving their hands in the air. The system could additionally be used with robotic scrub nurses, also being developed at Purdue, to let the devices know what instruments the surgeon wants handed to them.

The system incorporates a Microsoft Kinect camera (yes, from the gaming system) and specialized algorithms to recognize hand gestures as instructions.
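The article doesn't describe Purdue's actual recognition algorithms, but a common approach to mapping hand movements onto commands is template matching: compare an observed hand trajectory against a small library of reference gestures and pick the closest one. The sketch below illustrates the idea with hypothetical gesture templates and command names; it is not the Purdue implementation.

```python
import math

# Hypothetical gesture templates: each command name maps to a short,
# normalized 2-D hand trajectory. These names and shapes are illustrative
# assumptions, not Purdue's actual gesture vocabulary.
TEMPLATES = {
    "next_image": [(0.0, 0.0), (0.5, 0.0), (1.0, 0.0)],  # swipe right
    "prev_image": [(1.0, 0.0), (0.5, 0.0), (0.0, 0.0)],  # swipe left
    "zoom_in":    [(0.0, 0.0), (0.0, 0.5), (0.0, 1.0)],  # raise hand
}

def trajectory_distance(traj_a, traj_b):
    """Mean point-to-point distance between two equal-length trajectories."""
    return sum(math.dist(a, b) for a, b in zip(traj_a, traj_b)) / len(traj_a)

def classify(trajectory, threshold=0.3):
    """Return the best-matching command, or None if nothing is close enough.

    The threshold rejects trajectories that don't resemble any template,
    so random hand motion is ignored rather than misread as a command.
    """
    best_cmd, best_dist = None, float("inf")
    for cmd, template in TEMPLATES.items():
        d = trajectory_distance(trajectory, template)
        if d < best_dist:
            best_cmd, best_dist = cmd, d
    return best_cmd if best_dist <= threshold else None
```

In practice a system like this would work on depth-camera skeleton data and far richer features, but the match-against-templates structure is the same.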

“One challenge will be to develop the proper shapes of hand poses and the proper hand trajectory movements to reflect and express certain medical functions,” said Juan Pablo Wachs, an assistant professor of industrial engineering. “You want to use intuitive and natural gestures for the surgeon, to express medical image navigation activities, but you also need to consider cultural and physical differences between surgeons. They may have different preferences regarding what gestures they may want to use.”


There are also other considerations that the researchers are taking into account in the design of the system. For instance, they don’t want surgeons to be required to wear special types of gloves or colors of clothing in order for their hands to be “read.” The system should also be able to recognize and respond to gestures quickly, and provide confirmation that it understands the request. At the same time, however, it should not accidentally respond to extraneous gestures, such as those made to colleagues in the room.
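One simple way to balance quick response against ignoring extraneous gestures is debouncing: only act on a gesture once it has been seen consistently across several consecutive frames. The filtering Purdue actually uses isn't described in the article; this is a generic sketch of the idea.

```python
class GestureDebouncer:
    """Fire a command only after it is classified identically in N
    consecutive frames, so incidental hand movements (e.g. gesturing
    to a colleague) don't trigger actions. Hypothetical sketch."""

    def __init__(self, frames_required=5):
        self.frames_required = frames_required
        self.last = None    # command seen in the previous frame
        self.count = 0      # how many consecutive frames it has been seen

    def update(self, command):
        """Feed one per-frame classification (or None).

        Returns the command exactly once, on the frame where it is
        confirmed; returns None otherwise.
        """
        if command is not None and command == self.last:
            self.count += 1
        else:
            self.last, self.count = command, 1
        if command is not None and self.count == self.frames_required:
            return command  # confirmed; fires only on this frame
        return None
```

Returning the command exactly once also gives the system a natural point at which to show the surgeon a confirmation on screen.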

The Purdue team also wants the system to be relatively inexpensive, and quickly and easily adaptable to different operating rooms, lighting conditions and other variables.

The system could be particularly effective when combined with the robotic scrub nurses, although they wouldn’t be intended to replace human nurses in all situations. “While it will be very difficult using a robot to achieve the same level of performance as an experienced nurse who has been working with the same surgeon for years, often scrub nurses have had very limited experience with a particular surgeon, maximizing the chances for misunderstandings, delays and sometimes mistakes in the operating room,” Wachs said. “In that case, a robotic scrub nurse could be better.”

While other groups have also researched the use of robotic scrub nurses, Wachs claims that his is the first to look into the incorporation of gesture – instead of voice – recognition. The Purdue system is also apparently unique in that it uses advanced algorithms to predict where the surgeon’s hands will be next, or what screen images will next be requested.
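The details of those predictive algorithms aren't public, but the simplest possible illustration of predicting where a hand will be next is constant-velocity extrapolation from recent samples. The function below is a deliberately minimal stand-in for that idea, not Purdue's method.

```python
def predict_next(positions):
    """Constant-velocity extrapolation of the next hand position.

    Given a list of observed (x, y) samples, assume the hand keeps
    moving at the velocity implied by the last two samples. A real
    tracker would use something more robust (e.g. a Kalman filter),
    but the prediction step has the same shape.
    """
    if not positions:
        return None
    if len(positions) < 2:
        return positions[-1]  # no velocity estimate yet; predict no motion
    (x1, y1), (x2, y2) = positions[-2], positions[-1]
    return (2 * x2 - x1, 2 * y2 - y1)
```

Anticipating the hand's position (or the next image request) lets the system start responding before the gesture finishes, shaving latency off each interaction.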

About the Author
Ben Coxworth is an experienced freelance writer, videographer and television producer whose interest in all forms of innovation is particularly fanatical when it comes to human-powered transportation, film-making gear, environmentally-friendly technologies and anything that's designed to go underwater. He lives in Edmonton, Alberta, where he spends a lot of time going over the handlebars of his mountain bike, hanging out in off-leash parks, and wishing the Pacific Ocean wasn't so far away.