
dc.contributor.advisor: Taylor, Russell H
dc.creator: Chalasani, Preetham
dc.date.accessioned: 2019-07-30T01:34:57Z
dc.date.available: 2019-07-30T01:34:57Z
dc.date.created: 2019-05
dc.date.issued: 2018-11-08
dc.date.submitted: May 2019
dc.identifier.uri: http://jhir.library.jhu.edu/handle/1774.2/61410
dc.description.abstract: Robotic surgical systems have contributed greatly to the advancement of Minimally Invasive Surgeries (MIS). More specifically, telesurgical robots have provided enhanced dexterity to surgeons performing MIS procedures. However, current robotic teleoperated systems have only limited situational awareness of the patient anatomy and surgical environment that would typically be available to a surgeon in an open surgery. Although the endoscopic view enhances visualization of the anatomy, perceptual understanding of the environment and anatomy is still lacking due to the absence of sensory feedback. In this work, these limitations are addressed by developing a computational framework to provide Complementary Situational Awareness (CSA) in a surgical assistant. This framework aims to improve the human-robot relationship by providing elaborate guidance and sensory feedback capabilities for the surgeon in complex MIS procedures. Unlike traditional teleoperation, this framework enables the user to telemanipulate the situational model in a virtual environment and uses that information to command the slave robot with appropriate admittance gains and environmental constraints. Simultaneously, the situational model is updated based on the interaction of the slave robot with the task-space environment. However, developing such a system to provide real-time situational awareness requires that many technical challenges be met. To estimate intraoperative organ information, continuous palpation primitives are required. Intraoperative surface information needs to be estimated in real time while the organ is being palpated/scanned. The model of the task environment needs to be updated in near real time using the estimated organ geometry so that the force feedback applied to the surgeon's hand corresponds to the actual location of the model.
This work presents a real-time framework that meets these requirements to provide situational awareness of the environment in the task space. Further, visual feedback is provided so that the surgeon/developer can view near-video-frame-rate updates of the task model. All these functions are executed in parallel and require synchronized data exchange. The system is highly portable and can be incorporated into any existing telerobotic platform with minimal overhead.
dc.format.mimetype: application/pdf
dc.language.iso: en_US
dc.publisher: Johns Hopkins University
dc.subject: Model-Mediated
dc.subject: Teleoperation
dc.subject: Medical Robotics
dc.subject: System Architecture
dc.subject: Real-Time
dc.title: Complementary Situational Awareness for an Intelligent Telerobotic Surgical Assistant System
dc.type: Thesis
thesis.degree.discipline: Computer Science
thesis.degree.grantor: Johns Hopkins University
thesis.degree.grantor: Whiting School of Engineering
thesis.degree.level: Doctoral
thesis.degree.name: Ph.D.
dc.date.updated: 2019-07-30T01:34:57Z
dc.type.material: text
thesis.degree.department: Computer Science
dc.contributor.committeeMember: Kazanzides, Peter
dc.contributor.committeeMember: Kobilarov, Marin
dc.publisher.country: USA

