An image-based uterus positioning interface using ADALINE networks for robot-assisted hysterectomy

Hiu Man Yip, David Navarro-Alarcon, Yunhui Liu

Research output: Chapter in book / Conference proceeding › Conference article published in proceeding or book › Academic research › peer-review

Abstract

Surgical manipulators are becoming more common in modern operating theatres; robots that work side by side with the surgeon and perform supportive tasks are one example. However, allowing the user to control such a robot in a user-friendly manner is challenging. In this paper, we present our work on an image-based adaptive user interface that lets a surgeon whose hands are occupied control a robot assisting with uterus positioning during laparoscopic hysterectomy. The interface can be operated in two modes: a pick-and-place mode and a command-specifying mode. In the pick-and-place mode, the user specifies the desired starting and ending points of the manipulation with his/her eyes, and the robot moves automatically based on these points; in the command-specifying mode, the user specifies which joint to move and in which direction by looking at features on the laparoscopic monitor, and a driving command is then sent to the robot. Details of both control approaches, together with experimental results demonstrating how they work in uterus positioning, are presented.
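The title names ADALINE (adaptive linear neuron) networks as the learning component of the interface. As a point of reference only, the sketch below shows a minimal ADALINE trained with the Widrow-Hoff LMS rule; the gaze-feature inputs and the +1/-1 joint commands are hypothetical illustrations, not the paper's actual data or mapping.

```python
# Minimal ADALINE trained with the Widrow-Hoff LMS rule.
# All data below is a hypothetical illustration.

def adaline_train(samples, targets, lr=0.05, epochs=100):
    """Fit weights (w[0] is the bias) by minimising squared error."""
    n = len(samples[0])
    w = [0.0] * (n + 1)
    for _ in range(epochs):
        for x, t in zip(samples, targets):
            # Linear activation: y = w0 + w . x
            y = w[0] + sum(wi * xi for wi, xi in zip(w[1:], x))
            err = t - y
            # LMS update: step each weight along the error gradient
            w[0] += lr * err
            for i, xi in enumerate(x):
                w[i + 1] += lr * err * xi
    return w

def adaline_predict(w, x):
    """Threshold the linear output to a +1 / -1 command."""
    y = w[0] + sum(wi * xi for wi, xi in zip(w[1:], x))
    return 1 if y >= 0.0 else -1

# Hypothetical 2-D gaze features mapped to +1 ("advance joint")
# or -1 ("retract joint") commands.
X = [(0.9, 0.8), (0.8, 0.9), (0.1, 0.2), (0.2, 0.1)]
T = [1, 1, -1, -1]
w = adaline_train(X, T)
print([adaline_predict(w, x) for x in X])
```

Unlike a perceptron, the ADALINE updates against the raw linear output rather than the thresholded one, which gives a smooth squared-error objective and lets the weights keep adapting to the user, a property that suits an adaptive interface of this kind.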
Original language: English
Title of host publication: 2017 IEEE International Conference on Real-time Computing and Robotics (RCAR)
Publisher: IEEE
DOIs
Publication status: Published - 2017
Externally published: Yes
