Ultrasound Identifies Hand Gestures, May Lead to Hands-Free Control of Surgical Systems

The researchers used a conventional ultrasound probe to image the muscles of the forearm while volunteers performed different hand gestures. They then applied computer vision and machine learning tools to correlate the observed muscle movements with the gestures that produced them. Working in reverse, the trained system could identify which gesture was being performed from the detected muscle motion alone. Notably, the technique still worked when the imaging was done at the wrist, the spot where a future smartwatch containing an ultrasound transducer would sit.
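For readers curious what such a pipeline might look like, below is a minimal sketch of the general idea, not the EchoFlex authors' actual implementation. It assumes forearm or wrist ultrasound frames are available as grayscale NumPy arrays, summarizes inter-frame muscle motion with dense optical flow via OpenCV, and trains an off-the-shelf classifier on those motion features; the data are synthetic stand-ins so the example runs end to end.

```python
# Illustrative sketch only: classify gestures from ultrasound motion features.
# Not the EchoFlex pipeline; frame data, feature choices, and the classifier
# here are assumptions made for demonstration.
import numpy as np
import cv2
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split


def motion_features(frames):
    """Summarize muscle motion across a frame sequence using dense optical flow."""
    feats = []
    for prev, curr in zip(frames[:-1], frames[1:]):
        flow = cv2.calcOpticalFlowFarneback(
            prev, curr, None, 0.5, 3, 15, 3, 5, 1.2, 0
        )
        mag, ang = cv2.cartToPolar(flow[..., 0], flow[..., 1])
        # Coarse descriptor: mean/std of flow magnitude plus an angle histogram.
        hist, _ = np.histogram(ang, bins=8, range=(0, 2 * np.pi), weights=mag)
        feats.append(np.concatenate([[mag.mean(), mag.std()], hist]))
    return np.mean(feats, axis=0)  # one fixed-length vector per sequence


# Synthetic stand-in data: a bright band drifting at a gesture-specific rate,
# loosely mimicking muscle tissue moving under the probe.
rng = np.random.default_rng(0)

def fake_sequence(shift):
    frames = []
    base = rng.integers(0, 60, size=(64, 64)).astype(np.uint8)
    for t in range(10):
        frame = base.copy()
        row = (10 + shift * t) % 60
        frame[row:row + 4, :] = 220
        frames.append(frame)
    return frames

X, y = [], []
for label, shift in enumerate([1, 2, 3]):  # three pretend gestures
    for _ in range(30):
        X.append(motion_features(fake_sequence(shift)))
        y.append(label)

X_train, X_test, y_train, y_test = train_test_split(
    np.array(X), np.array(y), test_size=0.25, random_state=0, stratify=y
)
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_train, y_train)
print("held-out accuracy:", clf.score(X_test, y_test))
```

In a real system, the classifier would of course be trained on labeled ultrasound recordings of actual gestures rather than synthetic frames, and the feature extraction would likely be considerably richer.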
Take a look at this demo video showing off the technology:
Flashbacks:
Gestureplex Wrist Controller for Hands Free Operation of Devices in Surgical Theater…
Microsoft’s Kinect Technology Utilized for Vascular Surgery…
Robotic Assistant Offers a Helping Hand in the OR…
New System for Hands-Free Control of Image Viewer During Surgery…
Controlling Augmented Reality in the Operating Room…
Real-Time Touch-Free Gesture Control System for Image Browsing in The OR…
Low Cost Glove Translates Sign Language, May Be Used to Practice Surgery in Virtual Reality…
Paper published in the Proceedings of the 2017 CHI Conference on Human Factors in Computing Systems: EchoFlex: Hand Gesture Recognition using Ultrasound Imaging…