da Vinci® Research Kit

Minimally invasive robotic surgery research

Enabling us and our collaborators to perform advanced research in the field of robotic surgery.

The University of Leeds received the da Vinci® surgical robot as a donation from Intuitive Surgical (https://intuitivesurgical.com/), the worldwide market leader in robotic surgery and manufacturer of the system. The medical robot, worth £1 million, is enabling us and our collaborators to perform advanced research in the field of robotic surgery.
The University of Leeds is the only university in the north of England, and the only one outside London, to have a da Vinci® Surgical Robot and a da Vinci Research Kit (https://research.intusurg.com/index.php/Main_Page) dedicated to technology-oriented research.

Today, robotic laparoscopy is performed without any degree of autonomy in the robot's motion. Our vision is an intelligent system, equipped with advanced sensors, able to support the surgeon in basic or tedious tasks. We aim to reduce the complexity of the operation, enabling wider adoption of minimally invasive surgery and leading to better outcomes for patients.


We are developing strategies for shared control of robot motion, as well as advanced learning techniques, to enhance the robot's intelligence and enable close collaboration between the surgeon and their robotic assistant.

Relevant Publications

[1] T. Zeng et al., "NeeCo: Image Synthesis of Novel Instrument States Based on Dynamic and Deformable 3D Gaussian Reconstruction," IEEE Transactions on Medical Imaging, pp. 1–1, Dec. 2025, doi: 10.1109/TMI.2025.3648299.

[2] H. Xu et al., "SurgRIPE challenge: Benchmark of surgical robot instrument pose estimation," Medical Image Analysis, vol. 105, p. 103674, Oct. 2025, doi: 10.1016/j.media.2025.103674.

[3] N. Marahrens, D. Jones, N. Murasovs, C. S. Biyani, and P. Valdastri, "An Ultrasound-Guided System for Autonomous Marking of Tumor Boundaries During Robot-assisted Surgery," IEEE Transactions on Medical Robotics and Bionics, pp. 1–1, Sep. 2024, doi: 10.1109/TMRB.2024.3468397.

[4] G. Loza, P. Valdastri, and S. Ali, "Real-time surgical tool detection with multi-scale positional encoding and contrastive learning," Healthcare Technology Letters, vol. 11, no. 2–3, pp. 48–58, Apr. 2024, doi: 10.1049/htl2.12060.

[5] J. Hu, D. Jones, M. R. Dogar, and P. Valdastri, "Occlusion-Robust Autonomous Robotic Manipulation of Human Soft Tissues With 3-D Surface Feedback," IEEE Transactions on Robotics, vol. 40, pp. 624–638, 2024, doi: 10.1109/TRO.2023.3335693.