Author manuscript; available in PMC: 2024 Mar 26.
Published in final edited form as: Proc IEEE Int Symp Bioinformatics Bioeng. 2021 Dec 15;2021. doi: 10.1109/bibe52308.2021.9635563

Supervised Autonomous Electrosurgery for Soft Tissue Resection

Jiawei Ge 1, Hamed Saeidi 2, Michael Kam 3, Justin Opfermann 4, Axel Krieger 5
PMCID: PMC10965307  NIHMSID: NIHMS1978506  PMID: 38533465

Abstract

Surgical resection is the current clinical standard of care for treating squamous cell carcinoma. Maintaining an adequate tumor resection margin is the key to a good surgical outcome, but tumor edge delineation errors are inevitable in manual surgery due to difficulties in visualization and hand-eye coordination. Surgical automation is a growing field of robotics that aims to relieve surgeon burdens and achieve consistent and potentially better surgical outcomes. This paper reports a novel robotic supervised autonomous electrosurgery technique for soft tissue resection that achieves millimeter accuracy. The tumor resection procedure is decomposed to the subtask level for more direct understanding and automation. A 4-DOF suction system is developed and integrated with a 6-DOF electrocautery robot to perform resection experiments. A novel near-infrared fluorescent marker is manually dispensed on cadaver samples to define a pseudotumor and is intraoperatively tracked using a dual-camera system. The autonomous dual-robot resection cooperation workflow is proposed and evaluated in this study. The integrated system achieves autonomous localization of the pseudotumor by tracking the near-infrared markers, and performs supervised autonomous resection in cadaver porcine tongues (N=3). All three pseudotumors were successfully removed from the porcine samples. The average surface and depth resection errors are 1.19 mm and 1.83 mm, respectively. This work is an essential step towards autonomous tumor resections.

Keywords: autonomous surgery, surgical robotics, tumor resection, electrosurgery, image-guided surgery

I. INTRODUCTION

Head and neck cancer (HNC) is the sixth most common cancer worldwide, with an estimated 890,000 new cases and 450,000 deaths in 2018 [1]. Squamous cell carcinoma (SCC) is the most common form of HNC, making up about 90 percent of HNC cases [2], [3]. Squamous cells are flat and thin and are located in the epidermis, the outermost layer of the skin, meaning that SCC primarily presents in epithelial tissues [4], [5]. Surgical resection, or excision, is the current clinical standard of care for SCC, with or without adjuvant therapies [6]. Preoperatively, surgeons inspect diagnostic computed tomography (CT) and magnetic resonance (MR) images, then localize and map tumor boundaries at the surface of the skin using biocompatible color inks such as India ink [7], [8]. Intraoperatively, surgeons remove the tumor with an adequate margin of healthy tissue around it. Narrow margins may lead to recurrences or metastases, while overly wide margins impair post-surgical function [9].

Robotic-assisted surgery (RAS) is the standard of care for HNC resection [10]–[12]. Compared to conventional open and laparoscopic surgeries, RAS provides several advantages such as enhanced visualization, dexterous tools, and filtered and scaled motion [10], [13]. However, despite these advantages, two limitations exist. First, surgeons can easily lose track of tumor boundaries when surgical sites are obscured by intraoperative bleeding. Surgical diathermy incision is a preferred solution to reduce bleeding compared to scalpel incision [14], but it further obfuscates the surgical site by causing tissue charring. Second, tumor resections require prolonged focus on delineating tumor boundaries and performing accurate incisions, and surgeon fatigue after hours of surgery can lead to inconsistent resection quality. In both of these scenarios, resection errors lead to poor surgical outcomes.

Advanced tumor imaging techniques have been developed to achieve robust intraoperative tumor tracking despite bleeding and tissue charring. Fluorescence-guided surgery (FGS) is commonly used as an effective technique to define intraoperative tumor location and margins [15]. Monoclonal antibodies are capable of selectively binding to specific tumor cells, and high-contrast tumor imaging is achieved by conjugating fluorescent dyes to monoclonal antibodies. Two fluorescent agents, cetuximab-IRDye800CW and panitumumab-IRDye800CW, are in phase II trials for identifying SCC in the head and neck region [16]. Moreover, a non-tumor-specific fluorescent marker has been developed to replace traditional India ink tattooing in FGS [17], [18]. Similar to India ink, the dot-shaped marker is manually dispensed on tumor surface boundaries to assist tumor delineation and resection. The aforementioned dyes fluoresce in the near-infrared (NIR) spectral region, and thus have stronger signal penetration and a better signal-to-noise ratio than visible light. The NIR signal can be robustly detected using NIR cameras regardless of blood and tissue charring.

Increasing the level of autonomy (LOA) in surgery is one solution to minimize the surgeon's burden and achieve more consistent and potentially better surgical outcomes. Current RAS systems remain at the direct control stage (LOA 1 [19]), where surgeons have complete teleoperated control of surgical tasks. McKinley et al. achieved autonomous fake tumor resection using the da Vinci Research Kit (dVRK) and customized tools [20]. In that study, the authors created a pseudotumor by embedding a cylindrical rubber shape in a silicone rubber pad. The fake tumor was autonomously palpated, exposed, and removed. Hu et al. demonstrated the feasibility of semiautonomous tumor fragment resection [21]. A jelly-like green fluorescent fake tumor fragment was autonomously debrided using the Raven II surgical robot and a customized suction tool. Both phantom studies reach the supervised autonomy stage (LOA 3 [19]), where surgeons supervise robots that autonomously execute surgical tasks. However, the phantom tumor and tissue models were unrealistic, and the studies are difficult to apply to real tissue surgeries. Our prior work demonstrated LOA 3 autonomous soft tissue surface incision using a KUKA LBR Med robot and a customized electrocautery tool [22], [23]. A portion of a cadaver porcine tongue was designated preoperatively as the tumor region, and the tumor region was then separated from the cadaver sample using supervised autonomous electrocautery. However, despite successfully separating the tumor margins from the cadaveric tissue, the resection task was incomplete: only the tumor margins were cut, and the bottom surface of the tumor remained attached to the porcine cadaver tongue.

In this work, we performed complete supervised autonomous electrosurgery for pseudotumor resection guided by NIR fluorescent markers, using the robotic system presented in Fig. 1. A cylindrical tissue target on the cadaver sample was designated as a pseudotumor. Our previously developed NIR markers were dispensed on cadaver samples [18] and tracked to mark the pseudotumor surface edge for guiding the surgical resection. Using the presented robotic system, pseudotumors on cadaver samples could be autonomously incised and removed under user supervision. As our first contribution, surgical tumor resection is decomposed to subtask granularity for better understanding and automation. The detailed execution workflow of a dual-robot system is proposed and evaluated in this study. The second contribution is the development of a 4-DOF suction system for autonomous tissue maneuvering, which is required to achieve a complete surgical resection at LOA 3. The system control is developed in the Robot Operating System (ROS) environment and integrated into our robotic system to cooperate with the electrocautery robot. The accuracy of the motion stage delivering the suction pad under camera guidance, and the suction capabilities, are evaluated and reported. Our third contribution is demonstrating the feasibility of complete autonomous surgical resection of pseudotumors. Three consecutive tissue bulks were autonomously resected from cadaver porcine tongues. The surface incision and deep margin dissection errors were evaluated and reported.

Fig. 1.

Fig. 1.

Image of a) the experimental system used in the proposed complete autonomous pseudotumor resection study, and b) the Arduino controlled customized 4-DOF suction system. c) A close view of the electrocautery tip and suction pad collaboration.

This paper is organized as follows. Section II describes the surgical procedure decomposition and simplification, experimental system, suction system, and autonomous control system. In section III, detailed experimental tasks are explained and the results are reported. The discussions and conclusions are presented in section IV.

II. Methods

A. Surgical Tumor Resection Subtask Decomposition and Simplification

The automation of a surgical process is extremely complex. Autonomously completing multiple surgical subtasks in consecutive steps is a recently proposed methodology [24], [25], which this study follows. The clinical SCC resection workflow can be decomposed into three tasks: tumor localization and mapping from diagnostic CT/MR images, tumor resection, and reconstruction of the surgical site using staples/sutures. The third task is outside the scope of this paper and not included in the proposed study. The first two tasks are further decomposed into subtasks as shown in the clinical workflow of Fig. 2. The biggest challenge is the dexterous cooperation of tissue retraction using forceps with the repetitive deep margin dissection guided by tumor visual feedback. Since we aim first to develop and integrate a robotic system in this feasibility study towards complete resection automation, a simplified tumor model is used to facilitate the surgical process. Our goal is to define a tissue flap on cadaver samples as the pseudotumor, and to autonomously resect the tissue flap with a predefined margin. Tissue retraction with forceps is performed clinically to expose deep tumor margins for easier dissection. In this study, tissue retraction with suction is used as an alternative to forceps to manipulate tissues with reduced deformation and damage [26]. The final simplified workflow is presented in Fig. 2. Surface incision, suction retraction, and deep margin dissection of a pseudotumor are the three subtasks to automate. As illustrated in Fig. 3, surface incision is first performed as a standalone subtask, followed by suction retraction and deep margin dissection performed consecutively in iterations until the pseudotumor is completely removed from the cadaver sample.

Fig. 2.

Fig. 2.

Clinical (left) and simplified (right) workflow of the tumor resection processes.

Fig. 3.

Fig. 3.

Detailed execution flow chart of the pseudotumor resection. The portion corresponding to the three subtasks (surface incision, suction retraction, and deep margin dissection) are highlighted in blue squares.

1). Pseudotumor Definition and Marking Strategy:

The pseudotumor is defined by dispensing our previously developed NIR markers on the surface of porcine cadaver tongue samples to mark the pseudotumor surface edge [18], [22], [23]. Three NIR marker dots are evenly dispensed on a 12 mm diameter circular pattern, which is similar in size to a T1-stage SCC. As mentioned above, SCC primarily grows from epithelial tissues. An early-stage SCC remains in the epidermis and has a thickness of about 1 mm, which we adopt as the thickness of the pseudotumors in our study. Given a predefined surface edge offset and a depth offset beneath the edge, the corresponding spatial region of a cadaver sample is considered a pseudotumor. Both lateral and depth offsets are set to 4 mm to replicate appropriate margins for SCC surgical resections [27]. In summary, we create the pseudotumor model by approximating the tumor geometry as a cylindrical tissue volume growing from the cadaver sample surface with 12 mm diameter and 1 mm depth, and aim to resect the pseudotumor with 20 mm diameter and 5 mm depth to account for a 4 mm tumor margin. Localization of the pseudotumor is based on the three NIR marker positions: the pseudotumor surface center coincides with the center of the three NIR markers.
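The geometry above can be sketched in a few lines: the three markers define a circumcircle whose center is the pseudotumor center, and widening its radius by the 4 mm lateral offset yields the 20 mm diameter cut path. This is a minimal numpy sketch under the paper's stated geometry; the function name and defaults are illustrative, not taken from the authors' codebase.

```python
import numpy as np

def resection_path(markers_xy, margin=4.0, n_points=30):
    """Fit the circle through three NIR marker points (projected to the
    xy plane, units in mm), widen it by the resection margin, and sample
    the widened circle as a discrete cut path."""
    (x1, y1), (x2, y2), (x3, y3) = markers_xy
    # Circumcenter of the three markers (standard determinant formula).
    d = 2 * (x1 * (y2 - y3) + x2 * (y3 - y1) + x3 * (y1 - y2))
    ux = ((x1**2 + y1**2) * (y2 - y3) + (x2**2 + y2**2) * (y3 - y1)
          + (x3**2 + y3**2) * (y1 - y2)) / d
    uy = ((x1**2 + y1**2) * (x3 - x2) + (x2**2 + y2**2) * (x1 - x3)
          + (x3**2 + y3**2) * (x2 - x1)) / d
    r = np.hypot(x1 - ux, y1 - uy) + margin  # 6 mm marker radius + 4 mm margin
    theta = np.linspace(0, 2 * np.pi, n_points, endpoint=False)
    return (ux, uy), r, np.column_stack([ux + r * np.cos(theta),
                                         uy + r * np.sin(theta)])

# Markers evenly spaced on a 12 mm diameter circle around the origin:
markers = [(6 * np.cos(a), 6 * np.sin(a))
           for a in (0, 2 * np.pi / 3, 4 * np.pi / 3)]
center, radius, path = resection_path(markers)
# center ≈ (0, 0); radius = 10 mm, i.e. the 20 mm diameter cut path
```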

B. Experimental System

Fig. 1 shows our robotic system, which consists of four components: i) a dual-camera system, ii) a 6-DOF robotic manipulator with an electrosurgical unit (ESU), iii) a 4-DOF suction system, and iv) the autonomous control system. The dual-camera component includes an RGBD camera (D415, Intel Corp., Santa Clara, CA), a NIR camera (acA2040-90umNIR, Basler AG, Ahrensburg, Germany), and a 760 nm high-power light-emitting diode (North Coast Technical Inc., Chesterland, OH). The robotic manipulator includes a UR5 manipulator (Universal Robots, Odense, Denmark), a customized 35 cm laparoscopic electrosurgical tool, an electrosurgical generator (DRE ASG-300 ESU, DRE Veterinary, Louisville, KY), and a grounding pad. The suction system and the autonomous control system are further elaborated in Sections II-C and II-D, respectively. To reduce odor during experimentation, all electrosurgical tasks were conducted under a fume hood and in the presence of a portable smoke evacuator (Smoke Shark, Bovie, Clearwater, FL).

C. Suction System

As shown in Fig. 1.b, a 4-DOF motion stage and a vacuum gripping system are combined to form a suction system that performs the retraction subtask. Requirements for the suction system include generating adequate vacuum pressure to adhere to the cadaver tissue surface, accurately delivering the suction pad to the pseudotumor surface center (minimum three translational DOF), and elevating and tilting the pseudotumor to expose the deep margin to the surgeons/cameras and the electrosurgical tool while minimizing tissue damage during retraction (minimum one rotational DOF). The rotational DOF is not strictly required by the geometric constraints of the presented open surgery setup (Fig. 1.c), but it creates a torque and extra tension on the tissue deep margin dissection edge in each iteration. This tension helps expose the tissue deep margin with the minimum suction force, and thus causes the minimum surgical tissue deformation and damage.

The suction system comprises a Cartesian motion stage plus a fourth Nema-23 stepper motor (Fuyu Technology, Chengdu, China), an Arduino Uno board and an Arduino CNC shield (Somerville, MA), four DRV8825 motor drivers (Pololu Robotics and Electronics, Las Vegas, NV), a 24 V power supply, two 6 V mini air pumps, a voltage regulator (Amazon, Seattle, WA), and a 3D-printed adapter. The two 6 V mini pumps are powered and connected in series with a bellows vacuum pad (ID = 8 mm, OD = 20 mm) to provide suction force. Notably, preliminary manual tissue resection experiments were performed to determine the minimum number of pumps required. Following the same sample preparation technique used in Section III-B and the same simplified tumor resection workflow, two pumps were the minimum requirement for achieving zero vacuum failures during five consecutive manual resections using the 8 mm ID suction pad. The suction pad is assembled on a 3D-printed adapter and fixed on the fourth stepper motor. This stepper motor is mounted on the end effector of the Cartesian stepper-motorized stage to provide a pitch DOF. The stage motion range is 100×100×100 mm. The stepper motors run in 1/32 microstepping mode, giving translational and rotational resolutions of 1.56 μm and 0.056 degrees, respectively. No motor sensors are integrated because, theoretically, no missed steps will occur with 0.93 N·m torque motors in the proposed soft tissue maneuvering scenarios.
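The reported resolutions follow directly from the microstepping configuration. The sketch below reproduces them assuming standard 1.8°/step Nema-23 motors and a 10 mm lead screw; both figures are assumptions chosen to match the reported 1.56 μm and 0.056 degree values, not specifications from the paper.

```python
# Resolution of the 4-DOF stage at 1/32 microstepping.
FULL_STEPS_PER_REV = 200   # assumed 1.8 degrees per full step (Nema-23)
MICROSTEPPING = 32         # DRV8825 1/32 microstep mode
LEAD_MM = 10.0             # assumed lead-screw travel per revolution

microsteps_per_rev = FULL_STEPS_PER_REV * MICROSTEPPING    # 6400
trans_res_um = LEAD_MM / microsteps_per_rev * 1000         # ≈ 1.56 um
rot_res_deg = 360.0 / microsteps_per_rev                   # ≈ 0.056 deg
```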

Integrating this suction system into our robotic system is necessary for resection automation. We customized an Arduino firmware in the ROS environment using the ROS rosserial package. The firmware communicates with ROS topics carrying motor step and direction commands, and thus drives the CNC shield and the four stepper motors. The 4-DOF motion stage kinematics is defined using the Kinematics and Dynamics Library (KDL) [28]. The forward kinematics solver is used to register the motion stage base frame with respect to the checkerboard frame, and thus localize the motion stage within the whole system. The inverse kinematics solver is used to calculate motor joint commands from the suction pad path generated in the autonomous mode. The detailed autonomous control loop is explained in the following subsection.
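For the translational axes, converting a planned Cartesian displacement into the step/direction commands published to the firmware amounts to simple scaling. The following is an illustrative sketch only (the paper uses KDL and rosserial for this); the function name, the dictionary message format, and the 640 steps/mm factor (6400 microsteps per revolution over an assumed 10 mm lead) are hypothetical.

```python
def to_step_commands(current_mm, target_mm, steps_per_mm=640):
    """Convert a Cartesian displacement into per-axis step counts and
    directions, of the kind sent to an Arduino CNC shield over rosserial.
    steps_per_mm = 6400 microsteps/rev / 10 mm lead (assumed)."""
    commands = []
    for axis, (cur, tgt) in zip("xyz", zip(current_mm, target_mm)):
        delta = tgt - cur
        commands.append({"axis": axis,
                         "direction": 1 if delta >= 0 else -1,
                         "steps": round(abs(delta) * steps_per_mm)})
    return commands

# Example: move the suction pad 5 mm up (z) to lift the pseudotumor.
cmds = to_step_commands((0, 0, 0), (0, 0, 5.0))
# z axis: 3200 steps in the positive direction
```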

D. Autonomous Control System

Fig. 4 shows the block diagram of the control loop for the presented robotic system. Representative images from a pseudotumor resection on a cadaver porcine tongue are shown in Fig. 5 for a more intuitive understanding. A standard hand-eye calibration is performed using a checkerboard prior to all experiments. The relative positions and orientations of the cameras, the UR5, and the suction system are acquired, and their coordinate frames are registered using the tf library [29]. Real-time video of the target samples is streamed from the NIR and RGBD cameras and processed. The three NIR markers on each sample surface are distinguished and tracked using the Visual Servoing Platform (ViSP) [30], then overlaid on the sample point clouds (pink, red, and green points in Fig. 5.b,d) [22], [23]. The pseudotumor surface edge generator projects the three NIR markers onto the world xy plane, defines a circle, and adds the 4 mm offset enabling a negative margin [27]. The enlarged circle is then downsampled to thirty 2D points and overlaid on the sample 3D point clouds (white points in Fig. 5.b,d). The number thirty was decided empirically and should be adjusted for different pseudotumor dimensions. The overlaying method uses the tf library and the Point Cloud Library [31]. Using the generated pseudotumor edge point clouds, the UR5 and the suction system have two path planners and one path planner, respectively. The first UR5 planner adds a 5 mm offset beneath the sample surface and sets the electrosurgical tool orientation to vertical. The path is converted to a joint space trajectory using KDL [28], and smooth joint velocities are planned using the Reflexxes Motion Library [32]. The processed joint commands are sent to the Universal Robots ROS Driver to drive the UR5 robot and ensure closed-loop control (Fig. 5.c). The suction system planner works in real time with the second UR5 planner to achieve pseudotumor lift-up and electrosurgical deep margin dissection, respectively.
The suction system planner generates the first suction waypoint at the central 3D point of the thirty pseudotumor edge points, and the second waypoint 5 mm above the first to lift the pseudotumor. The suction pad orientation at the first two waypoints is parallel to the world xy plane. Fourteen further waypoints are then generated to gradually lift and tilt the pseudotumor to expose the deep margin in front of the electrosurgical tool (blue points in Fig. 5.d). The last waypoint is 16 mm above the first and has a 45 degree upward tilt angle. The second UR5 planner groups the thirty pseudotumor edge points into left and right edges with respect to the suction pad tf frame and appends to it. The paired real-time positions are updated using KDL, and the orientations are defined to control the electrosurgical tool to perform deep margin dissection repetitively. Each deep margin dissection follows an upside-down trapezoidal profile, and the electrosurgical tool has a 30 degree downward tilt angle (as represented in Fig. 5.e-h). The suction system path is converted to joint value commands using KDL, and further converted to step and direction commands that are sent to the customized Arduino firmware to drive the motors (the suction pad motion during one lift-and-tilt motion is presented in Fig. 5.i).
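The suction waypoint schedule described above (two horizontal waypoints, then fourteen lift-and-tilt waypoints) can be sketched as a simple linear interpolation. This is a minimal sketch of the stated geometry; the linear interpolation between lift height and tilt angle is an assumption, as the paper does not specify the interpolation profile.

```python
import numpy as np

def suction_waypoints(center, lift_mm=5.0, final_lift_mm=16.0,
                      final_tilt_deg=45.0, n_tilt=14):
    """Waypoint 1: pseudotumor surface center, pad horizontal.
    Waypoint 2: 5 mm above, pad still horizontal.
    Waypoints 3-16: gradually lift to 16 mm and tilt to 45 degrees."""
    cx, cy, cz = center
    wps = [(cx, cy, cz, 0.0), (cx, cy, cz + lift_mm, 0.0)]
    for h, t in zip(np.linspace(lift_mm, final_lift_mm, n_tilt + 1)[1:],
                    np.linspace(0.0, final_tilt_deg, n_tilt + 1)[1:]):
        wps.append((cx, cy, cz + h, t))
    return wps  # 16 waypoints of (x, y, z, tilt angle)

wps = suction_waypoints((0.0, 0.0, 0.0))
# 16 waypoints; the last is 16 mm above the first with a 45 degree tilt
```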

Fig. 4.

Fig. 4.

Control loop of the presented autonomous surgical resection system.

Fig. 5.

Fig. 5.

a) Image of a marked porcine tongue sample with three NIR markers evenly dispensed on a 12 mm diameter circular pattern. b) Sample point cloud prior to surface incision. c) Image during the surface incision. d) Sample point cloud after surface incision and prior to deep margin dissection. In figures b) and d): detected NIR marker 3D points are highlighted in pink, red, and green (colors do not match markers); the pseudotumor center point and surface margin points are highlighted in yellow and white, respectively. The suction pad path is highlighted with a blue line in figure d). Representative images of the e) first, f) second, g) third, and h) fourth key positions of the electrocautery tip during a dissection. The electrocautery tip positions and motion directions are highlighted in red. i) Image of a representative lift-and-tilt motion of the suction system. The suction pad center position moves from the light to the dark blue dots, and the pad central axis rotates to expose the deep margin.

III. Experiments and Results

A. Suction System Motion Accuracy and Suction Capability Evaluation

1). Tasks and Evaluation:

In this study, we aimed to evaluate the suction system i) motion accuracy of delivering the suction pad guided by NIR markers, and ii) actual suction force using two mini air pumps and a Bellow vacuum pad (ID=8mm) on a cadaver porcine tongue.

One stored porcine cadaver tongue was purchased from a grocery store (H Mart, Lyndhurst, NJ) and cut into anterior, middle, and posterior segments as presented in Fig. 6.a. The weight of each sample segment was measured, and its minimum bounding box dimensions were measured using a ruler. Using a transparent thin film with a drawn 2 cm diameter circle as the pseudotumor surface edge, each sample segment was marked along the pseudotumor edge with three evenly spaced NIR markers. A human operator was asked to randomly place each sample segment within the 4-DOF suction system motion range, click on the NIR markers using a GUI on the computer, and turn on the vacuum pump power switch. The suction pad was then autonomously driven to the center position of the three markers and applied suction on the sample with the pad in a horizontal orientation. Attempts were made to lift the sample off the table, and the success rate was recorded. If a lift trial succeeded, a digital force gauge was hooked on the back of the sample, and the operator pulled the force gauge until the suction pad detached from the sample. For each trial, the maximum measured suction force was recorded. Additionally, the vacuum air pressure was measured using an air pressure gauge. The pattern film was then placed on top of each sample to match the NIR marker positions. The distance between the drawn circle center on the film and the center of the area where the suction pad was applied was measured using a ruler as the suction system motion error. Since three samples were used and the aforementioned tests were performed three times for each sample, a total of nine results were reported (N=9).

Fig. 6.

Fig. 6.

Image of a) three porcine samples, suction pad, transparent pseudotumor pattern, and b) positional accuracy evaluation example of the suction pad delivery experiments.

2). Results:

Experimental results are summarized in Table I. The average vacuum pressure was −55 kPa. The theoretical suction force is 2.76 N (Force = Pressure × Area, with a suction pad ID of 8 mm). Nine out of nine sample lifting trials succeeded. The suction system was able to lift porcine cadaver tongues weighing up to 135.56 g, and provided an average maximum suction force of 2.66 ± 0.61 N. The suction system motion accuracy was 2.75 ± 1.45 mm in the sample surface plane, guided by the NIR markers and the dual-camera system.
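The theoretical force quoted above follows from the measured vacuum and the pad's inner-diameter area, as a quick sanity check:

```python
import math

# Theoretical suction force F = |P| * A, with P = -55 kPa (measured
# average vacuum) and an 8 mm inner-diameter bellows pad.
pressure_pa = 55_000                       # magnitude of -55 kPa
pad_area_m2 = math.pi * (0.008 / 2) ** 2   # ID = 8 mm
force_n = pressure_pa * pad_area_m2        # ≈ 2.76 N
```

This agrees with the measured average maximum suction force of 2.66 ± 0.61 N, suggesting only small leakage losses at the pad-tissue interface.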

TABLE I.

Suction System Motion and Suction Performance

                               anterior      middle        posterior     total
  weight (g)                   30.13         83.15         135.56
  length (mm)                  62            63            69
  width (mm)                   40            49            59
  height (mm)                  22            29            43
  sample lifting success rate  3/3           3/3           3/3           9/9
  max suction force (N)        2.70 ± 0.44   2.81 ± 0.17   2.48 ± 1.08   2.66 ± 0.61
  motion accuracy (mm)         2.23 ± 0.81   4.20 ± 1.02   1.83 ± 1.19   2.75 ± 1.45

B. Supervised Autonomous Pseudotumor Resection

1). Tasks and Evaluation:

This study was designed to demonstrate the feasibility of supervised autonomous tissue resection. Three stored cadaver porcine tongues were purchased from a grocery store (H Mart, Lyndhurst, NJ) and clamped using our previously developed sample plate (Fig. 7.a,b) [33]. Each clamped sample was marked following the same procedure as explained in Section III-A1, using a 12 mm diameter circle pattern (Fig. 7.a). The three NIR markers were manually clicked in the GUI, and the robotic system followed the task flow explained in Section II-D. A complete removal of the target pseudotumor from the cadaver sample was considered one successful resection trial (Fig. 7.b). If a pseudotumor was removed, pictures of the top, front, and rear views were taken for resection error evaluation. A ruler was included in each view to establish the pixel-to-millimeter scale. Surface incision and deep margin dissection errors were quantified using customized MATLAB code and reported. For the surface incision error, we compared the distance between the final surface contour (green in Fig. 7.c) and the expected surface contour (red in Fig. 7.c, 20 mm diameter). The error was calculated for each pixel on the green contour by finding the closest point on the red contour. For the deep margin dissection error, we calculated the vertical distance from each pixel on the deep margin contour (green in Fig. 7.d,e) to the nearest pixel along the horizontal axis on the final surface contour (red in Fig. 7.d,e), and subtracted the 5 mm predefined cutting depth. The average, standard deviation, maximum, and minimum errors of the surface incision and deep margin dissection were reported.
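The surface-error metric (per-pixel closest-point distance between the actual and expected contours) can be sketched as below. The paper's evaluation used MATLAB; this is an equivalent numpy sketch with an illustrative function name, not the authors' code.

```python
import numpy as np

def surface_error_mm(actual_px, expected_px, mm_per_px):
    """For each point on the actual contour, find the distance to the
    closest point on the expected contour, scaled to millimeters.
    Returns (mean, std, max, min) of the per-point errors."""
    a = np.asarray(actual_px, float)[:, None, :]    # (N, 1, 2)
    e = np.asarray(expected_px, float)[None, :, :]  # (1, M, 2)
    d = np.min(np.linalg.norm(a - e, axis=2), axis=1) * mm_per_px
    return d.mean(), d.std(), d.max(), d.min()

# Toy check: an actual circle of radius 11 px vs an expected radius of
# 10 px, at 1 mm/px, should give a uniform 1 mm error.
t = np.linspace(0, 2 * np.pi, 360, endpoint=False)
actual = np.column_stack([11 * np.cos(t), 11 * np.sin(t)])
expected = np.column_stack([10 * np.cos(t), 10 * np.sin(t)])
mean_err, std_err, max_err, min_err = surface_error_mm(actual, expected, 1.0)
# mean_err ≈ 1.0 mm
```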

Fig. 7.

Fig. 7.

Image of a) a marked sample using the transparent film, and b) a successful resection trial. c) Top, d) front, and e) rear views of the removed tissue. The expected and actual incision contours are highlighted in red and green, respectively, in c). The surface and deep margin contours are highlighted in red and green, respectively, in d) and e).

2). Results:

Three out of three consecutive resection tests were successful, as demonstrated in Table II. The surface incision errors were 1.26 ± 0.63, 1.38 ± 0.56, and 0.92 ± 0.54 mm, respectively. The depth errors were 2.35 ± 0.72, 2.03 ± 0.58, and 1.11 ± 0.45 mm, respectively. All images presented in Fig. 7 are from the first resection test.

TABLE II.

Autonomous Pseudotumor Resection Accuracy (mm)

                           average   std    max    min
  study 1  surface error   1.26      0.63   2.56   0.09
           depth error     2.35      0.72   3.75   1.25
  study 2  surface error   1.38      0.56   2.45   0.04
           depth error     2.03      0.58   4.12   2.08
  study 3  surface error   0.92      0.54   1.94   0.00
           depth error     1.11      0.45   4.49   2.21

IV. Discussions and Conclusions

The observed errors in the resection experiments can mainly be attributed to soft tissue deformation. The hypothesized pseudotumor in this study is a virtual cylindrical volume appended to the suction pad. However, when the suction pad is applied to the soft tissue and performs retraction, the tissue is stretched between the connecting tissues beneath it and the suction force above it. The tissue elongates most at the center and progressively less toward the periphery, where suction is not directly applied (suction pad ID = 8 mm < 20 mm = surface incision path diameter). The tissue periphery droops and is partially removed during the near-horizontal deep margin dissection. The stretch and the periphery droop contribute to the deep margin and surface resection errors, respectively. In future work, a tissue deformation model will be included to predict the deformation caused by stretch and to compensate for and reduce the corresponding errors. The relation between suction pad dimensions and pseudotumor dimensions will also be studied, and different pads will be selected for different tumor resection experiments. Another limitation of this study is that no image feedback was used during the resection. Intraoperative image feedback could acquire additional information during resection, adjust dissection paths, and potentially improve resection accuracy. We will develop and integrate this function into our control system in future work.

This paper reported the feasibility of supervised autonomous electrosurgery resection on cadaver tissue. First, the surgical tumor resection procedure is decomposed and simplified to subtask granularity for better understanding and automation. Second, a 4-DOF suction system is developed as an assistant robot to cooperate with the electrocautery robot. Third, our previous autonomous control system for surface incisions [22], [23] is integrated with the suction system and expanded to perform deep margin dissections. Fourth, the integrated system is shown to be able to autonomously resect the target pseudotumor from cadaver porcine tongues with millimeter accuracy. This work is an essential step towards autonomous robotic tumor resections.

Acknowledgments

This work is supported by the National Institutes of Health under award numbers 1R01EB020610 and R21EB024707, and supported by the Intramural Research Program of the National Institutes of Health, National Cancer Institute, Center for Cancer Research. The content is solely the responsibility of the authors and does not necessarily represent the official views of the National Institutes of Health.

Contributor Information

Jiawei Ge, Department of Mechanical Engineering, Johns Hopkins University, Baltimore, MD, USA.

Hamed Saeidi, Department of Computer Science, University of North Carolina Wilmington, Wilmington, NC, USA.

Michael Kam, Department of Mechanical Engineering, Johns Hopkins University, Baltimore, MD, USA.

Justin Opfermann, Department of Mechanical Engineering, Johns Hopkins University, Baltimore, MD, USA.

Axel Krieger, Department of Mechanical Engineering, Johns Hopkins University, Baltimore, MD, USA.

References

  • [1] Bray F, Ferlay J, Soerjomataram I, Siegel RL, Torre LA, and Jemal A, "Global cancer statistics 2018: GLOBOCAN estimates of incidence and mortality worldwide for 36 cancers in 185 countries," CA: A Cancer Journal for Clinicians, vol. 68, no. 6, pp. 394–424, 2018.
  • [2] Curado MP and Hashibe M, "Recent changes in the epidemiology of head and neck cancer," Current Opinion in Oncology, vol. 21, no. 3, pp. 194–200, May 2009.
  • [3] Vigneswaran N and Williams MD, "Epidemiological trends in head and neck cancer and aids in diagnosis," Oral and Maxillofacial Surgery Clinics of North America, vol. 26, no. 2, pp. 123–141, May 2014.
  • [4] Sánchez-Danés A and Blanpain C, "Deciphering the cells of origin of squamous cell carcinomas," Nature Reviews Cancer, vol. 18, no. 9, pp. 549–561, Sep. 2018.
  • [5] Yan W, Wistuba II, Emmert-Buck MR, and Erickson HS, "Squamous cell carcinoma - similarities and differences among anatomical sites," American Journal of Cancer Research, vol. 1, no. 3, pp. 275–300, Jan. 2011.
  • [6] Montero PH and Patel SG, "Cancer of the oral cavity," Surgical Oncology Clinics of North America, vol. 24, no. 3, pp. 491–508, Jul. 2015.
  • [7] Sadeghi N, Li N-W, Taheri MR, Easley S, and Siegel RS, "Neoadjuvant chemotherapy and transoral surgery as a definitive treatment for oropharyngeal cancer: A feasible novel approach," Head & Neck, vol. 38, no. 12, pp. 1837–1846, Dec. 2016.
  • [8] Sarode SC, Sarode G, Patil S, Mahajan P, Anand R, and Patil A, "Comparative study of acrylic color and India ink for their use as a surgical margin inks in oral squamous cell carcinoma," World Journal of Dentistry, vol. 6, no. 1, p. 26, 2015.
  • [9] Haque R, Contreras R, McNicoll MP, Eckberg EC, and Petitti DB, "Surgical margins and survival after head and neck cancer surgery," BMC Ear, Nose, and Throat Disorders, vol. 6, p. 2, Feb. 2006.
  • [10] Oliveira CM, Nguyen HT, Ferraz AR, Watters K, Rosman B, and Rahbar R, "Robotic surgery in otolaryngology and head and neck surgery: A review," Minimally Invasive Surgery, vol. 2012, 2012.
  • [11] Weinstein GS, O'Malley BW, Magnuson JS, Carroll WR, Olsen KD, Daio L, Moore EJ, and Holsinger FC, "Transoral robotic surgery: a multicenter study to assess feasibility, safety, and surgical margins," The Laryngoscope, vol. 122, no. 8, pp. 1701–1707, Aug. 2012.
  • [12] Loevner LA, Learned KO, Mohan S, O'Malley BW, Scanlon MH, Rassekh CH, and Weinstein GS, "Transoral robotic surgery in head and neck cancer: What radiologists need to know about the cutting edge," RadioGraphics, vol. 33, no. 6, pp. 1759–1779, Oct. 2013.
  • [13] Kalantari F, Rajaeih S, Daneshvar A, Karbasi Z, and Mahdi Salem M, "Robotic surgery of head and neck cancers, a narrative review," European Journal of Translational Myology, vol. 30, no. 2, Jan. 2020.
  • [14] Kumar V, Tewari M, and Shukla HS, "A comparative study of scalpel and surgical diathermy incision in elective operations of head and neck cancer," Indian Journal of Cancer, vol. 48, no. 2, pp. 216–219, Jun. 2011.
  • [15] Nagaya T, Nakamura YA, Choyke PL, and Kobayashi H, "Fluorescence-guided surgery," Frontiers in Oncology, vol. 7, Dec. 2017.
  • [16] ClinicalTrials.gov, trial registrations NCT03134846 and NCT03405142. [Online]. Available: https://clinicaltrials.gov/
  • [17] Shademan A, Decker RS, Opfermann JD, Leonard S, Krieger A, and Kim PCW, "Supervised autonomous robotic soft tissue surgery," Science Translational Medicine, vol. 8, no. 337, pp. 337ra64–337ra64, May 2016.
  • [18] Ge J, Opfermann JD, Saeidi H, Huenerberg KA, Badger CD, Cha J, Schnermann MJ, Joshi AS, and Krieger A, "A novel indocyanine green-based fluorescent marker for guiding surgical tumor resection," Journal of Innovative Optical Health Sciences, vol. 14, no. 03, p. 2150013, May 2021.
  • [19] Yip M and Das N, "Robot autonomy for surgery," arXiv:1707.03080 [cs], Jul. 2017.
  • [20] McKinley S, Garg A, Sen S, Gealy DV, McKinley JP, Jen Y, Guo M, Boyd D, and Goldberg K, "An interchangeable surgical instrument system with application to supervised automation of multilateral tumor resection," in 2016 IEEE International Conference on Automation Science and Engineering (CASE), Aug. 2016, pp. 821–826.
  • [21] Hu D, Gong Y, Seibel EJ, Sekhar LN, and Hannaford B, "Semiautonomous image-guided brain tumour resection using an integrated robotic system: A bench-top study," The International Journal of Medical Robotics and Computer Assisted Surgery, vol. 14, no. 1, p. e1872, 2018.
  • [22] Saeidi H, Ge J, Kam M, Opfermann JD, Leonard S, Joshi AS, and Krieger A, "Supervised autonomous electrosurgery via biocompatible near-infrared tissue tracking techniques," IEEE Transactions on Medical Robotics and Bionics, vol. 1, no. 4, pp. 228–236, Nov. 2019.
  • [23] Ge J, Saeidi H, Opfermann JD, Joshi AS, and Krieger A, "Landmark-guided deformable image registration for supervised autonomous robotic tumor resection," in Medical Image Computing and Computer Assisted Intervention (MICCAI), vol. 11764, pp. 320–328, Oct. 2019.
  • [24] Nagy DÁ, Nagy TD, Elek R, Rudas IJ, and Haidegger T, "Ontology-based surgical subtask automation, automating blunt dissection," Journal of Medical Robotics Research, vol. 3, no. 03n04, p. 1841005, 2018.
  • [25] Elek R, Nagy TD, Nagy DÁ, Garamvölgyi T, Takács B, Galambos P, Tar JK, Rudas IJ, and Haidegger T, "Towards surgical subtask automation—blunt dissection," in 2017 IEEE 21st International Conference on Intelligent Engineering Systems (INES), 2017, pp. 000253–000258.
  • [26] Vonck D, Goossens RHM, van Eijk DJ, de Hingh IHJT, and Jakimowicz JJ, "Vacuum grasping as a manipulation technique for minimally invasive surgery," Surgical Endoscopy, vol. 24, no. 10, pp. 2418–2423, 2010.
  • [27] Nahhas AF, Scarbrough CA, and Trotter S, "A review of the global guidelines on surgical margins for nonmelanoma skin cancers," The Journal of Clinical and Aesthetic Dermatology, vol. 10, no. 4, pp. 37–46, Apr. 2017.
  • [28] Smits R, Bruyninckx H, and Aertbeliën E, "KDL: Kinematics and Dynamics Library," 2011.
  • [29] Foote T, "tf: The transform library," in 2013 IEEE Conference on Technologies for Practical Robot Applications (TePRA), Apr. 2013, pp. 1–6.
  • [30] Marchand E, Spindler F, and Chaumette F, "ViSP for visual servoing: a generic software platform with a wide class of robot control skills," IEEE Robotics & Automation Magazine, vol. 12, no. 4, pp. 40–52, Dec. 2005.
  • [31] Rusu RB and Cousins S, "3D is here: Point Cloud Library (PCL)," in 2011 IEEE International Conference on Robotics and Automation, May 2011, pp. 1–4.
  • [32] Kröger T, "Opening the door to new sensor-based robot applications—The Reflexxes Motion Libraries," in 2011 IEEE International Conference on Robotics and Automation, May 2011, pp. 1–4.
  • [33] Opfermann JD, Leonard S, Decker RS, Uebele NA, Bayne CE, Joshi AS, and Krieger A, "Semi-autonomous electrosurgery for tumor resection using a multi-degree of freedom electrosurgical tool and visual servoing," in 2017 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), 2017, pp. 3653–3660.