
da Vinci® Research Kit

Minimally invasive robotic surgery research

Enabling us and our collaborators to perform advanced research in the field of robotic surgery.

The University of Leeds received the da Vinci® surgical robot as a donation from Intuitive Surgical (https://intuitivesurgical.com/), the worldwide market leader in robotic surgery and manufacturer of the system. The medical robot, worth £1 million, is enabling us and our collaborators to perform advanced research in the field of robotic surgery.
The University of Leeds is the only university in the north of England, and the only one outside London, to have a da Vinci® Surgical Robot and a da Vinci Research Kit (https://research.intusurg.com/index.php/Main_Page) dedicated to technology-oriented research.

Today, robotic laparoscopy is performed without any degree of autonomy in the robot's motion. Our vision is an intelligent system, equipped with advanced sensors, able to support the surgeon in basic or tedious tasks. We aim to reduce the complexity of the operation, enabling wider adoption of minimally invasive surgery and leading to much better outcomes for patients.


We are developing strategies for shared control of the robot's motion, as well as advanced learning techniques, in order to enhance the robot's intelligence and enable close collaboration between the surgeon and their robotic assistant.
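A common starting point for shared control is to blend the surgeon's teleoperation command with an autonomous assistive command through a weighting factor. The sketch below is purely illustrative (the function name, signals, and fixed blending weight are hypothetical, not the lab's actual controller, which may adapt the weight from sensing or learned models):

```python
import numpy as np

def shared_control(u_surgeon, u_auto, alpha):
    """Blend a surgeon's command with an autonomous assistive command.

    alpha = 1.0 -> pure teleoperation (surgeon in full control)
    alpha = 0.0 -> fully autonomous motion
    Intermediate values share authority between the two.
    """
    if not 0.0 <= alpha <= 1.0:
        raise ValueError("alpha must lie in [0, 1]")
    u_surgeon = np.asarray(u_surgeon, dtype=float)
    u_auto = np.asarray(u_auto, dtype=float)
    return alpha * u_surgeon + (1.0 - alpha) * u_auto

# Example: 3-D tool-tip velocity commands (mm/s), equal authority
u_s = [10.0, 0.0, -2.0]   # surgeon input
u_a = [8.0, 1.0, 0.0]     # autonomous suggestion
print(shared_control(u_s, u_a, 0.5))  # → [ 9.   0.5 -1. ]
```

In practice the blending weight would vary with task context, e.g. increasing the robot's authority during repetitive motions and returning full control to the surgeon near delicate tissue.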

Relevant Publications

[1] N. Marahrens, D. Jones, N. Murasovs, C. S. Biyani, and P. Valdastri, "An Ultrasound-Guided System for Autonomous Marking of Tumor Boundaries During Robot-assisted Surgery," IEEE Transactions on Medical Robotics and Bionics, pp. 1–1, Sept. 2024, doi: 10.1109/TMRB.2024.3468397.

[2] G. Loza, P. Valdastri, and S. Ali, "Real-time surgical tool detection with multi-scale positional encoding and contrastive learning," Healthcare Technology Letters, vol. 11, no. 2–3, pp. 48–58, Apr. 2024, doi: 10.1049/htl2.12060.

[3] J. Hu, D. Jones, M. R. Dogar, and P. Valdastri, "Occlusion-Robust Autonomous Robotic Manipulation of Human Soft Tissues With 3-D Surface Feedback," IEEE Transactions on Robotics, vol. 40, pp. 624–638, 2024, doi: 10.1109/TRO.2023.3335693.

[4] N. Marahrens, B. Scaglioni, D. Jones, R. Prasad, C. S. Biyani, and P. Valdastri, "Towards Autonomous Robotic Minimally Invasive Ultrasound Scanning and Vessel Reconstruction on Non-Planar Surfaces," Frontiers in Robotics and AI, vol. 9, Oct. 2022, doi: 10.3389/frobt.2022.940062.

[5] A. Attanasio, B. Scaglioni, E. De Momi, P. Fiorini, and P. Valdastri, "Autonomy in Surgical Robotics," Annual Review of Control, Robotics, and Autonomous Systems, vol. 4, no. 1, pp. 651–679, May 2021, doi: 10.1146/annurev-control-062420-090543.
