Abstract
In order to realize many of the potential benefits associated with robotically assisted minimally invasive surgery, the robot must be more than a remote controlled device. Currently, using a surgical robot can be challenging, fatiguing, and time consuming. Teaching the robot to actively assist with surgical tasks, such as suturing, has the potential to vastly improve both patient outcomes and the surgeon's efficiency. One obstacle to completing surgical sutures autonomously is the difficulty of tracking surgical suture threads. This paper proposes an algorithm which uses a Non-Uniform Rational B-Spline (NURBS) curve to model a suture thread. The NURBS model is initialized from a single selected point located on the thread. The NURBS curve is then optimized by minimizing the image match energy between the projected stereo NURBS images and the segmented thread images. The algorithm is able to accurately track a suture thread in real-time as it translates, deforms, and changes length.
I. Introduction
Robotic surgical systems used in Minimally Invasive Surgery (MIS) present an opportunity to pair human surgical expertise with the precision and repeatability of robots. Surgical suturing is one task that can benefit from the synergy of surgeons and surgical robots. Currently, the robot is controlled through teleoperation. Requiring the surgeon's complete control of the robotic system can make certain tasks during MIS, such as suturing, challenging, fatiguing, and time consuming. This is largely due to the lack of haptic feedback, a limited robotic workspace, and a narrow field of view from the laparoscopic camera system. Autonomous robotic manipulation and control of the suture needle and thread would allow the robot to complete the suture while relying on minimal guidance from the surgeon. Once the surgeon specifies key aspects of the suture, the rest of the task can be completed by the robot. The surgeon can then plan the next step of the procedure while supervising the quick and precise autonomous completion of the suture.
In order to complete autonomous tasks during MIS, it is important to track task-critical elements (e.g. the suture needle, thread, and tissue for autonomous suturing). This paper focuses on the thread tracking problem. In the proposed method, the suture thread is initialized as a 3-dimensional NURBS curve. The NURBS suture thread model is projected into a stereo image pair and the projected models are optimized using an energy-based image matching algorithm. The updated models are then recombined into a 3-dimensional NURBS update. This approach allows the suture thread to be tracked in real-time as it deforms, and the method is robust to changes in thread length, self intersection, and knot formation.
The outline of the paper is as follows: Section II summarizes previous related research in the literature. Section III discusses how the suture thread is segmented. Section IV describes how the shape of the suture thread is modeled. Section V explains how the NURBS model is seeded from a single point. Section VI discusses how the suture thread model is updated in real-time. Section VII demonstrates the capabilities of the suture thread initialization and tracking algorithm. Finally, Section VIII presents the discussion and conclusion.
II. Literature Review
The problem of autonomous surgical suturing encompasses many facets of robot control and perception. An autonomous surgical suture requires tracking the thread, tissue, and needle in order to plan and execute the robot motion. This must be completed using techniques that are robust and adaptable to the dynamic surgical environment.
Iyer et al. demonstrate that it is possible, using tissue markers, to complete an automated circular needle drive [1]. This drive is completed while tracking the needle. In this implementation, the needle was not attached to any thread. This contrasts with actual suture needle kits, where the thread and the needle form a single unit as shown in Fig. 1. Once the needle is driven, the suture thread must be pulled through and tied off. Since suture thread can be 70 cm long, a single suture kit can be used for multiple sutures.
Fig. 1. A sample suture needle kit has a semi-circular needle crimped onto a violet suture thread. This particular kit was manufactured by Ethicon (Cincinnati, Ohio), part number J341.
Detecting and manipulating the suture thread is an important component of completing a suture autonomously. While it is possible to circumvent suture thread tracking using specialized hardware [2], such hardware may not always be practical to use. Previous works have demonstrated the feasibility of detecting a suture thread as well as similar objects. Javdani et al. [3] demonstrated that it is possible to model suture threads as Dynamic One-dimensional Objects (DOO). Each suture thread is modeled as a collection of connected bones. The incident and twisting angles between the bones are modeled as an internal deformation energy of the thread. By minimizing the total internal energy coupled with the image match energy, the suture thread is identified. In this example, the suture thread is segmented from the image using Canny edge detection [3]. This approach requires knowledge of the suture thread material model. An alternative approach by Schulman et al. [4] used a point cloud to help identify various deformable objects (including rope). Similar to [3], actively deforming a physical model increases the internal energy so that the image match energy can be minimized; the final goal is to minimize the sum of the internal energy and the image match energy. Currently, MIS robots do not employ the RGB plus depth imaging device that is required to generate a point cloud. Consequently, it is more practical to rely exclusively on the stereo imaging systems that are deployed in existing robotic surgical systems. Padoy et al. used multiple approaches to detect curvilinear structures [5], [6]. One of these, [5], used NURBS curves to describe the shape of the detected suture. The control points of the curve were optimally positioned using a Markov random field model that was iterated with the FastPD optimization algorithm [7]. In addition to Markov random fields, active contours (snakes) can be used for image based curve optimization [8]. This includes optimizing closed NURBS curves with active contour modeling [9].
Previous algorithms for suture thread detection assumed that the end points were held by tracked manipulators during initialization [5]. While this allows tracking algorithms to be tested, it neglects the problems associated with tracking the suture end points. In an actual surgical environment, the suture thread may have only one end grasped, or may not be grasped at all. Additionally, the end point constraint is not directly addressed. Due to the nature of the image energy functions in previous works [3], [5], mismatches are not penalized if the image of the suture thread model is a subset of the actual suture thread image. This allows the suture thread model to 'shrink' with no apparent penalty. To mitigate this, a penalty was introduced where the cost would increase as the length of the model changed [5]. This can cause problems if an occluded portion of the suture thread is slowly exposed, causing the thread to lengthen.
This paper presents an algorithm that can be used to initialize and then accurately track the suture thread from an initial seed point. The tracking algorithm is capable of tracking the suture thread as it moves, deforms, changes length, and tightens knots.
III. Suture Detection
Reliance on automatic suture thread detection requires top-level knowledge of the surgical procedure. There may be multiple suture threads in the image, and some sutures might already have been tied off. Detecting the suture thread of interest requires a directed pruning of irrelevant sutures. Supervisory selection maximizes the performance of the robot while keeping the surgeon in control of the procedure. Identification of the target suture thread can be quickly and easily performed by the surgeon, provided that there exists an adequate user interface that enables the surgeon to communicate his/her intentions.
There are many methods that a surgeon can use to input a point on the suture thread. The surgeon can enter the point by area selection, 3-dimensional projection, gripper based initialization, or even an external laser pointer. The method used can be selected to complement the surgeon's existing user interface. In this implementation, a 3-dimensional space mouse was used to input the seed point. The seed point, when coupled with the segmented image, allows the initial thread position to be identified.
A. Suture Thread Segmentation
In the proposed method, the suture thread is segmented from the image using a thin feature enhancement algorithm developed by Frangi et al. [10]. The algorithm is modified to include the directionality information that was explored by Steger [11]. During thin feature enhancement, a gray scale image representation is convolved with the second derivative of a two-dimensional Gaussian distribution. The variance of the Gaussian function corresponds to the scale space size of the image filter. The result is a scale space Hessian matrix of the original gray image. That is to say, each image pixel is now a set of 4 pixels that comprise a 2 × 2 symmetric matrix. This 2 × 2 real symmetric matrix has real eigenvalues (|λ1| < |λ2|) and orthogonal eigenvectors v1 ⊥ v2. The magnitude and ratio of these eigenvalues are used to generate the output image magnitude. A large magnitude eigenvalue indicates that the local image region aligns well with the second derivative of a Gaussian function in the corresponding eigenvector direction. If one eigenvalue is large and the other is small, the local image region looks like a ridge (or a section of suture thread). The output vector for each pixel location Vr(i, j) can be computed directly from the eigenvalues of the corresponding Hessian matrix.
$$V_r(i,j) = \exp\!\left(-\frac{R_B^2}{2\beta^2}\right)\left(1 - \exp\!\left(-\frac{S^2}{2c^2}\right)\right)\begin{bmatrix}\cos\theta \\ \sin\theta\end{bmatrix} \tag{1}$$
Here $R_B = \lambda_1/\lambda_2$, $S = \sqrt{\lambda_1^2 + \lambda_2^2}$ is the Frobenius norm of the Hessian, and the parameters β and c are user defined. The normalized eigenvector corresponding to the smaller eigenvalue is considered to be the local thread direction (v1). Since the eigenvector can be arbitrarily scaled, the angle (θ) of the thread direction is wrapped into the domain θ ∊ [0, π). The result is that each pixel from the gray scale image is now a vector Vr(i, j) ∊ ℝ2. As part of the post processing, the image is smoothed using a low pass Gaussian kernel, G. Due to the natural wrapping of the angle domain, the Gaussian blur is computed on the doubled-angle representation of the direction field.

$$V_o(i,j) = G * \left(\left\|V_r(i,j)\right\|\begin{bmatrix}\cos 2\theta(i,j) \\ \sin 2\theta(i,j)\end{bmatrix}\right)$$
An example comparing an unsegmented image to a segmented image is shown in Fig. 2. The purple suture thread is highlighted against the background. The segmented image is false colored to indicate direction.
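To make this step concrete, below is a minimal Python (NumPy/SciPy) sketch of the Hessian eigen-analysis described above. The function name, the parameter values, and the use of a single filter scale are illustrative assumptions; the paper's CUDA implementation and exact parameterization may differ.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def segment_thread(gray, sigma=2.0, beta=0.5, c=15.0):
    """Thin feature enhancement in the spirit of [10] with the direction
    output of [11].  Parameter values here are illustrative only."""
    gray = gray.astype(float)
    # Scale space Hessian: convolve with Gaussian second derivatives.
    Hrr = gaussian_filter(gray, sigma, order=(2, 0))   # d2/drow2
    Hcc = gaussian_filter(gray, sigma, order=(0, 2))   # d2/dcol2
    Hrc = gaussian_filter(gray, sigma, order=(1, 1))   # mixed derivative

    # Closed-form eigenvalues of each 2x2 symmetric Hessian.
    disc = np.sqrt((Hrr - Hcc) ** 2 + 4.0 * Hrc ** 2)
    mu_a = 0.5 * (Hrr + Hcc + disc)
    mu_b = 0.5 * (Hrr + Hcc - disc)
    swap = np.abs(mu_a) > np.abs(mu_b)                 # enforce |lam1| <= |lam2|
    lam1 = np.where(swap, mu_b, mu_a)
    lam2 = np.where(swap, mu_a, mu_b)

    # Ridge magnitude from the eigenvalue ratio and total strength, eq. (1).
    Rb = lam1 / (lam2 + 1e-12)
    S = np.sqrt(lam1 ** 2 + lam2 ** 2)
    mag = np.exp(-Rb ** 2 / (2 * beta ** 2)) * (1 - np.exp(-S ** 2 / (2 * c ** 2)))

    # Angle of the eigenvector of the algebraically larger eigenvalue; for a
    # dark thread on a bright background this points across the thread, so
    # the thread direction is the perpendicular, wrapped into [0, pi).
    theta = np.mod(0.5 * np.arctan2(2.0 * Hrc, Hrr - Hcc) + 0.5 * np.pi, np.pi)
    return mag, theta
```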
Fig. 2. A sample segmented image (b) together with the corresponding original image (a). The segmented image uses false color to denote the pixel direction estimated by the algorithm. In addition to the suture thread, other image edges are also segmented. This is because, in the gray scale image, the border appears as a contrasting location relative to the surrounding edges. The thread self intersection can be identified from the direction information, as can be seen from the different colors of the intersecting thread segments. Suture thread initialization and tracking must be robust against falsely segmented regions.
IV. Suture Thread NURBS Model
In the proposed tracking approach, the thread is modeled as a 3-dimensional NURBS curve. The suture thread is defined by a set of n + 1 control points C = [c0, …, cn]T. The order of the curve, p, indicates the polynomial degree that is used to smooth together the control points. A NURBS curve also requires a set of m + 1 knots where m = n + p + 1. In this case, the set of knots, U, is distributed in the range [0, 1]. The end knots have a multiplicity of p + 1. This ensures that the end points of the curve fall on the points c0 and cn, respectively. NURBS curves have been used previously in vision research because they generate smooth curves using a relatively low dimensional space [12], [5]. This naturally allows the NURBS curve to have a reduced internal energy compared to point wise curve definitions. The NURBS curve is parametrically defined as in [13]
$$P(u) = \frac{\sum_{i=0}^{n} N_{i,p}(u)\, w_i\, c_i}{\sum_{i=0}^{n} N_{i,p}(u)\, w_i} \tag{2}$$
where u ∊ [0, 1] is the parameter used to generate the points on the curve. The term Ni,p(u) is a B-spline basis function of order p. In this study, the weight wi is defined to be 1 for all indices. This allows the NURBS curve to be easily projected from 3-dimensional space to 2-dimensional images. Additionally, unity weights reduce the space of optimized parameters, reducing the overall optimization time. The basis functions are defined over the knot vector of the NURBS curve as defined above. After the curve point set P(u) is found, it is projected into the stereo images resulting in the point sets Pl(u) and Pr(u).
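As an illustration of this model, the following sketch evaluates such a curve with unity weights, in which case (2) reduces to a plain B-spline. The control point values and the cubic order are illustrative assumptions; the clamped knot construction follows the definition above.

```python
import numpy as np
from scipy.interpolate import BSpline

# Cubic model with illustrative control points c_0..c_n (n = 4).
p = 3
ctrl = np.array([[0.0, 0.0, 0.0], [10.0, 5.0, 2.0], [20.0, -3.0, 4.0],
                 [30.0, 6.0, 1.0], [40.0, 0.0, 0.0]])
n = len(ctrl) - 1

# m + 1 knots with m = n + p + 1, clamped so the curve hits c_0 and c_n:
# end knots have multiplicity p + 1 and all knots lie in [0, 1].
interior = np.linspace(0.0, 1.0, n - p + 2)[1:-1]
knots = np.concatenate([np.zeros(p + 1), interior, np.ones(p + 1)])

curve = BSpline(knots, ctrl, p)
u = np.linspace(0.0, 1.0, 200)
P = curve(u)          # 200 x 3 array of points P(u) on the thread model
```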
V. Initializing the Suture Thread
The curve initialization function generates a thread model, including loops and dead ends, when given an arbitrary point along the thread. The implementation of this algorithm is built on previous work by Kaul et al. [14]. The user selects the initial point using a projected 3-dimensional location entered with the space mouse. The region grows outward from the initial point according to a predefined cost function, similar to Dijkstra's tree growth algorithm. The initially selected point has a cost of 0. The cost of the pixels adjacent to the current pixel is calculated using the magnitude of the segmented image pixels.
$$\mathrm{cost}(i + i_x,\, j + j_y) = \mathrm{cost}(i, j) + \frac{1}{\left\|V_o(i + i_x,\, j + j_y)\right\|^{\rho}} \tag{3}$$
Here, i, j represent the current image location while ix, jy ∊ {−1, 0, 1}. The offsets ix and jy span the 8-connected neighborhood around a pixel. The parameter ρ is used to improve the delineation between the thread and the noisy background. The lowest net cost for each pixel is stored. The lowest cost pixel is then used to compute its neighbors' costs. The growth of this path is shown in Fig. 3a. When the cost of a pixel is calculated, it is added to a set of tracked pixels which grows outward at each iteration. After the front of the pixel growth reaches a certain distance away from the origin point, preset as the step size, a thread point is marked at that location. A corresponding thread point is found in the other stereo image, and the stereo point is added to the current point list. When a new stereo point is found, the list that it belongs to must be identified. The new point is compared to the current point lists. If the point matches one of the lists, then it is appended to that list. Otherwise, the point seeds a new list. The end result is a list of thread segments representing every continuous non-branched segment of thread. These thread segments are then combined into the final thread model: endpoints of thread segments that are close to each other and have similar angles are connected. The merging of matching segments is shown in Fig. 3b. The resulting list of thread points is then used to generate a curve. The NURBS curve is calculated using a least squares approximation of the generated 3-dimensional point list. The result is a list of NURBS control points C = [c0, …, cn]T. Once the thread has been initialized, the NURBS model is updated using an iterative tracking algorithm.
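The sketch below illustrates the Dijkstra-style growth from the seed point in Python. The exact cost form, the values of `step` and `rho`, and the ring-based marker placement are assumptions made for illustration; the stereo matching of marked points performed by the full algorithm is omitted here.

```python
import heapq
import numpy as np

def grow_from_seed(mag, seed, step=15.0, rho=2.0):
    """Dijkstra-style growth over the segmented magnitude image `mag`,
    starting from the user-selected `seed` pixel (row, col)."""
    h, w = mag.shape
    cost = np.full((h, w), np.inf)
    dist = np.zeros((h, w))              # geometric distance from the seed
    cost[seed] = 0.0
    heap = [(0.0, seed)]
    markers = []                         # thread points marked every `step`
    while heap:
        c, (i, j) = heapq.heappop(heap)
        if c > cost[i, j]:
            continue                     # stale entry: a cheaper path was found
        if dist[i, j] >= step and not any(
                np.hypot(i - mi, j - mj) < step for mi, mj in markers):
            markers.append((i, j))       # growth front reached the step size
        for di in (-1, 0, 1):            # 8-connected neighborhood
            for dj in (-1, 0, 1):
                ni, nj = i + di, j + dj
                if (di or dj) and 0 <= ni < h and 0 <= nj < w:
                    # Walking along strong thread response is cheap, eq. (3).
                    nc = c + 1.0 / (mag[ni, nj] + 1e-6) ** rho
                    if nc < cost[ni, nj]:
                        cost[ni, nj] = nc
                        dist[ni, nj] = dist[i, j] + np.hypot(di, dj)
                        heapq.heappush(heap, (nc, (ni, nj)))
    return markers, cost
```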
Fig. 3. As the segmented suture thread region grows, it branches as in (a). Once the regions are done growing, they are connected together based on their alignment, as illustrated in (b).
VI. NURBS Curve Iteration
After completing the suture thread initialization, the curve is defined as a set of control points, C = [c0, …, cn]T. Previous NURBS improvement techniques include using a Markov random field to optimize the set of control points [5]. This approach randomly updates the control points and checks whether the image match improves; it has the disadvantage of not inherently tracking the ends of the suture thread as they change. Optimizing a closed NURBS curve also circumvents the need to deliberately track end points [9]. Optimizing the control points using the projected images of the curve allows the algorithm to update itself directly from the images while tracking the ends of the suture thread.
The proposed algorithm tracks the deforming suture thread as it moves in the surgical environment by iteratively updating the control points of the suture thread model. This includes removing and inserting control points as needed. The goal is to minimize the image energy while preventing the internal energy from being too high. Evenly spacing the NURBS control points helps to minimize the internal energy.
A. Image Energy
The energy of the NURBS curve image is based on both the image magnitude and the image alignment. Not only should the curve image match the peak of the segmented image, it should also share the same tangential direction. The image energy equation actually represents a pair of image energies, one for the left and one for the right image. Without loss of generality, the l subscript will stand for the pair l and r unless otherwise noted.
$$E_l = -\int_0^1 \left\| V_{ol}(u) \right\|\, \left| t_l(u)^{T}\, \hat{V}_{ol}(u) \right| \, \mathrm{d}u \tag{4}$$
Here tl(u) is the unit tangent to the curve at the point pl(u) in image space, while Vol(u) = Vo(pl(u)) = Vol(i, j) is the segmented image output vector and direction at the curve image point pl(u), with $\hat{V}_{ol}(u)$ denoting its unit direction.
The spatial gradient of the image energy is used to generate forces on the curve image points. This gradient is approximated as
$$\nabla E_l(u) \approx -\left| t_l(u)^{T}\, \hat{V}_{ol}(u) \right|\, \nabla \left\| V_{ol}(u) \right\| \tag{5}$$
It is assumed that the spatial derivative of the alignment term $\left| t_l(u)^{T}\, \hat{V}_{ol}(u) \right|$ is small. That is to say, the alignment between the segmentation direction and the curve tangent does not change quickly. This gradient force is then projected onto the image curve normal nl(u). The point offset is then given by
$$\Delta p_l(u) = -\left( \nabla E_l(u)^{T}\, n_l(u) \right) n_l(u) \tag{6}$$
This prevents the points from sliding along the segmented curve. If the local energy gradient is small, then the local image region is actively searched for a potential matching point. This widens the image support of the NURBS optimization algorithm. Optimizing the end points requires that they are given their own energy cost term.
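A Python sketch of this normal-projected image force follows. The (row, col) point convention, nearest-pixel sampling, and unit gain are assumptions of the sketch rather than details from the paper.

```python
import numpy as np

def normal_point_offsets(P_img, mag, theta):
    """Image forces on the projected curve points, projected onto the curve
    normals (a sketch of (4)-(6)).  P_img is a (k+1) x 2 array of (row, col)
    curve points; mag and theta come from the segmentation step."""
    eps = 1e-9
    grow, gcol = np.gradient(mag)                    # gradient of the magnitude
    t = np.gradient(P_img, axis=0)                   # finite difference tangents
    t /= np.linalg.norm(t, axis=1, keepdims=True) + eps
    n = np.stack([-t[:, 1], t[:, 0]], axis=1)        # in-image unit normals
    r, c = np.round(P_img).astype(int).T             # nearest-pixel samples
    # Alignment between the curve tangent and the segmented direction.
    d = np.stack([np.sin(theta[r, c]), np.cos(theta[r, c])], axis=1)
    align = np.abs(np.sum(t * d, axis=1))
    # Gradient force with the alignment treated as locally constant, as in (5).
    F = align[:, None] * np.stack([grow[r, c], gcol[r, c]], axis=1)
    # Project onto the normals so points do not slide along the thread, (6).
    return np.sum(F * n, axis=1, keepdims=True) * n
```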
B. End Point Energy
A difficulty in tracking the thread arises due to the special case of the thread endpoints. Other works have used fixed end points or penalized changing the length of the suture thread in order to assist in tracking the endpoints [5], [3]. In this work, the end of the thread is tracked using the local segmented image information as it aligns to the thread model.
The end points have their own calculated energy term.
$$E_{el}(u) = -\left( \left\| V_{ol}(u) \right\| - V_{cl} \right), \qquad u \in \{0, 1\} \tag{7}$$
The parameter Vcl is the cutoff magnitude of the image. The cutoff magnitude is the average of the foreground mean magnitude and the background mean magnitude of the segmented image. This attracts the curve ends to the thread ends. The end point forces act on both ends
$$F_l(0) = -\left( \left\| V_{ol}(0) \right\| - V_{cl} \right) t_l(0) \tag{8}$$

$$F_l(1) = +\left( \left\| V_{ol}(1) \right\| - V_{cl} \right) t_l(1) \tag{9}$$
As in equation (5), the gradient of the segmentation-to-NURBS alignment is neglected. The positive sign at u = 1 is due to the fact that the tangent of the NURBS curve points from u = 0 to u = 1; consequently, the sign must be flipped at the opposite end point, where u = 0. This force is projected to be along the end point tangent direction. The point offset is then given by
$$\Delta p_l(0) = \left( F_l(0)^{T}\, t_l(0) \right) t_l(0) \tag{10}$$
The equation is similar for u = 1. The forces are formulated such that the internal points are updated only in directions normal to the curve, while the end points are additionally updated in directions tangential to the curve.
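A short sketch of this end point update follows; the unit gain and nearest-pixel sampling are assumptions of the sketch.

```python
import numpy as np

def end_point_offsets(P_img, mag, v_cut):
    """End point update along the curve tangent (a sketch of (7)-(10)):
    a tip lying on strong thread response (magnitude above the cutoff
    v_cut) is pushed outward, otherwise it is pulled back."""
    t = np.gradient(P_img, axis=0)
    t /= np.linalg.norm(t, axis=1, keepdims=True) + 1e-9
    r0, c0 = np.round(P_img[0]).astype(int)
    r1, c1 = np.round(P_img[-1]).astype(int)
    # The tangent points from u = 0 toward u = 1, so the sign flips at u = 0.
    dp0 = -(mag[r0, c0] - v_cut) * t[0]
    dp1 = +(mag[r1, c1] - v_cut) * t[-1]
    return dp0, dp1
```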
C. Pointwise Update
The NURBS curve is defined to be a linear combination of the control points. Since it is impractical to compute the curve for every u ∊ [0, 1], only a finite number of points are actually computed. If the curve is generated for a list of parameter values uj ∊ u where 0 ≤ j ≤ k, then the basis functions Ni,p(u) can be precomputed and, as such, a matrix A(u) ∊ ℝ(k+1)×(n+1) can be constructed
$$A(u)_{j,i} = N_{i,p}(u_j), \qquad 0 \le j \le k, \quad 0 \le i \le n \tag{11}$$
The denominator from (2) is not present because, with unity weights, it always sums to 1. The vector u = [u0, …, uk] is defined such that the points are evenly spaced along the entire NURBS curve. The NURBS curve is now formulated as a linear combination of the control points
$$P = A(u)\, C \tag{12}$$
Here P = [p0, …, pk]T is the set of points on the NURBS curve, C is the set of control points as defined previously, and A(u) is defined in (11). By retaining a precomputed copy of the matrix A(u), the NURBS curve can be quickly recomputed when the control points are updated without changing the parameter vector u. After the finite curve point set P is found, it is projected into the stereo images, resulting in the point sets Pl and Pr. The precomputed discrete points of the NURBS curve greatly simplify the force computation on the set of NURBS curve points. This results in an offset Δpil that is applied to each set of projected NURBS points. Once the left and right image updates are found, they must be deprojected into 3-dimensional space.
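A sketch of the precomputation follows; building A by evaluating unit coefficient vectors is one simple construction, not necessarily the one used in the paper.

```python
import numpy as np
from scipy.interpolate import BSpline

def basis_matrix(knots, p, u):
    """Precompute A(u) of (11), A[j, i] = N_{i,p}(u_j), by evaluating each
    basis function through a unit coefficient vector."""
    n1 = len(knots) - p - 1                  # number of control points, n + 1
    A = np.empty((len(u), n1))
    for i in range(n1):
        e = np.zeros(n1)
        e[i] = 1.0                           # picks out the basis N_{i,p}
        A[:, i] = BSpline(knots, e, p)(u)
    return A

# With A cached, each refresh of the curve is one matrix product, eq. (12):
#   P = A @ C                                # (k+1) x 3 points from controls
```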
D. 3-Dimensional Deprojection
Now that the update vectors Δpil and Δpir have been found for the sets of image points, they must be mapped from stereo image space into 3-dimensional space. This can be accomplished by analyzing the projection of the orthonormal curve basis t, n, b (tangent, normal, binormal). Assume that the 3-dimensional optimization vector is of the form Δpi = αiti + βini + γibi, where αi, βi, and γi are update gains. Then, provided that ||Δpi|| is small, the projected vector Δpil approximates the sum αitil + βinil + γibil. This can be reduced to the following over-determined matrix equation.
$$\begin{bmatrix} t_{il} & n_{il} & b_{il} \\ t_{ir} & n_{ir} & b_{ir} \end{bmatrix}\begin{bmatrix} \alpha_i \\ \beta_i \\ \gamma_i \end{bmatrix} = \begin{bmatrix} \Delta p_{il} \\ \Delta p_{ir} \end{bmatrix} \tag{13}$$
The least squares solution of (13) gives the values of αi, βi, and γi. Since the offset of each curve point Δpi is found in 3-dimensional space, the update is constrained to the epipolar lines of the stereo camera system [15].
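A minimal sketch of this least squares step for a single curve point, assuming the projected frame vectors and image offsets are already available (the helper name and NumPy usage are illustrative):

```python
import numpy as np

def deproject_offset(dp_l, dp_r, frame_l, frame_r, frame_3d):
    """Solve (13) for one curve point.  frame_l and frame_r each hold the
    projected tangent, normal, and binormal as columns of a 2 x 3 matrix;
    frame_3d holds the 3-D frame vectors t, n, b as columns of a 3 x 3."""
    M = np.vstack([frame_l, frame_r])          # 4 equations, 3 unknowns
    rhs = np.concatenate([dp_l, dp_r])
    gains, *_ = np.linalg.lstsq(M, rhs, rcond=None)
    return frame_3d @ gains                    # alpha*t + beta*n + gamma*b
```

These values are then used to generate the 3-dimensional offset matrix ΔP = [Δp0, …, Δpk]T. This offset matrix in the curve point space is mapped into the control point space using the matrix A(u) as defined in (11).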
$$\Delta C = A(u)^{T}\, \Delta P \tag{14}$$
The end points are updated directly using the computed offset. This is done because the end points are defined to be the end control points (i.e. Δc0 = Δp0). The matrix D ∊ ℝ(n+1)×3 is a denominator matrix used to normalize the force offsets in the matrix ΔC.
$$D = A(u)^{T}\, \mathbf{1}_{(k+1) \times 3} \tag{15}$$
Here $\mathbf{1}_{(k+1) \times 3}$ is a matrix of ones, so each entry of D sums the basis weight its control point receives from the curve points. This serves to normalize the effects of the curve point forces. The control point set at the next time step is then simply the previous control points plus the normalized offset set.
$$C_{t+1} = C_{t} + \Delta C \oslash D \tag{16}$$
The operator ⊘ indicates that the matrix ΔC is divided element wise by the denominator matrix D.
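Put together, the update of (14)-(16) is a few matrix operations. A sketch, assuming A from (11), the 3-dimensional point offsets dP, and the control points C as NumPy arrays; setting the end normalization weights to one is an assumption of this sketch:

```python
import numpy as np

def update_controls(C, A, dP):
    """Control point update of (14)-(16).  The direct end point update
    reflects the definition dc_0 = dp_0 (and likewise at the far end)."""
    dC = A.T @ dP                     # map point offsets to control space, (14)
    D = A.T @ np.ones_like(dP)        # per-control normalization weights, (15)
    dC[0], dC[-1] = dP[0], dP[-1]     # end controls track the end points directly
    D[0] = D[-1] = 1.0                # assumed: unit weights at the ends
    return C + dC / D                 # element-wise division is the ⊘ of (16)
```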
1) Control Point Post Processing: Once the control points have been updated, points are added and removed based on the inter control point distance. Points are pruned if they are too close together. Conversely, points are inserted where there is a large gap between neighboring points. The points are inserted such that the curve does not change shape. The goal is to keep the set of interpoint distances ||ck − ck−1|| and ||ck − ck+1|| within a certain range. This mitigates the possibility that control points bunch up while at the same time ensuring that there will not be any gaps in the thread model. While it is true that straight segments of the curve do not need to be modeled with as many points, the presence of the points enables the model to respond more quickly to local changes in the thread shape.
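A sketch of this respacing step follows. The distance thresholds are illustrative; the paper inserts points so that the curve shape is preserved, while this sketch uses simple midpoints for brevity.

```python
import numpy as np

def respace_controls(C, d_min=5.0, d_max=25.0):
    """Prune control points that bunch up and fill large gaps between
    neighbors, keeping both end control points."""
    out = [C[0]]
    for idx, c in enumerate(C[1:], start=1):
        gap = np.linalg.norm(c - out[-1])
        if gap < d_min and idx != len(C) - 1:
            continue                             # too close: prune this point
        if gap > d_max:
            out.append(0.5 * (out[-1] + c))      # too far: insert a midpoint
        out.append(c)
    return np.array(out)
```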
Fig. 4 shows the composite result of fitting a NURBS curve to a suture thread. The quality of the fit allows a robotic gripper to grab the thread (see the attached video).
Fig. 4. An image of the thread model overlaid onto the camera image. The semi-translucent green curve is the NURBS curve, while the red circles are control points.
VII. Experimental Validation
The goal of the experimental validation is to evaluate the performance of the suture thread tracking algorithm. The following aspects are validated: initialization, motion tracking, and distortion tracking. In order to test the algorithm, a pair of calibrated stereo cameras with a resolution of 640 × 480 pixels was mounted above a motorized X-Y linear stage that allows the thread to be tracked as it moves. The experimental stage is shown in Fig. 5. The orange and red marbled paper was used in an attempt to mimic colors and patterns that might be found in a surgical setting. The computer completing the image processing contains an Intel Core 2 Quad processor running at 3 GHz with 8 GB of installed RAM. The segmentation algorithm was implemented using CUDA and ran on an Nvidia GTX 650 Ti with 2 GB of GDDR5 RAM. During the experiments, the segmentation algorithm and the tracking algorithm ran in parallel threads. The segmentation algorithm runs with a loop speed of 15 Hz, while the suture tracking thread runs with a loop speed of 20 Hz. Since the suture thread tracking and the segmentation loop operate in parallel, the thread model lags behind the segmentation model in the video display thread.
Fig. 5. The X-Y linear stage with a sample of thread. The orange and red patterned construction paper is meant to emulate some of the colors and patterns that might be present in an actual surgical environment. One end of the suture thread is affixed to the immobile piece of paper while the other is affixed to the linear stage. This allows the thread to deform as it is tracked.
A. Quantitative Accuracy
In order to assess the accuracy of the thread tracking algorithm, the proposed algorithm was used to estimate the lengths of three pre-cut suture threads. Pre-cut lengths were used to avoid marking the thread during imaging, since marking the thread would have interfered with the segmentation. The lengths of the suture threads were 107 mm, 175 mm, and 200 mm. Each pre-cut segment was positioned on the work space, initialized, and tracked. This includes tracking the thread as it moves and deforms. One instance of each thread was measured in this way. The thread was twisted and kinked in the different images so that it did not lie flat on the workspace. Additionally, one side of the suture thread was affixed to an immobile section of the background paper. This was done to ensure that the thread would deform significantly during tracking, as illustrated in Fig. 5. The stage moved its attached thread endpoint in a figure-eight pattern with a maximum speed of 11 mm/s. The length of the thread model during deformation was logged; the mean and standard deviation of the logged length are summarized in Table I. The term ℓ represents the actual length of the suture thread, while ℓ̃ is the average estimated length of the suture thread. The standard deviation of the measurements (σ) and the percent error (%) are also listed.
TABLE I. The measured (ℓ) and detected (ℓ̃) lengths of the suture threads. The standard deviation (σ) and the percent error (%) are also reported.

| ℓ (mm) | ℓ̃ (mm) | σ (mm) | % error |
|---|---|---|---|
| 107 | 108.8 | 2.41 | 1.68 |
| 175 | 181.4 | 2.65 | 3.66 |
| 200 | 203.53 | 2.77 | 1.76 |
As the table indicates, the algorithm is capable of tracking the length of the suture thread to within a few percent, or a few millimeters. The largest error occurred with the 175 mm thread. This indicates that the error is neither a fixed value nor dependent on length. It is most likely that the shape of the thread itself introduces the most error and is consequently the source of most of the variance.
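For reference, the logged model length is straightforward to compute from the sampled curve points; a minimal sketch, assuming P is the (k+1) × 3 array of 3-dimensional curve points from (12), in millimeters:

```python
import numpy as np

# Polyline approximation of the NURBS arc length, in the units of P (mm).
length_mm = np.sum(np.linalg.norm(np.diff(P, axis=0), axis=1))
```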
B. Qualitative Tracking Results
The video submitted with this paper contains several components that are meant to illustrate the thread tracking algorithm. The video shows how the tracking looks in the segmented image space as well as in the raw images.
The video first demonstrates that the initialization algorithm is able to generate a successful point list even when the thread self intersects numerous times. Once the NURBS model is initialized, the tracking algorithm is demonstrated during thread motion, deformation, changing lengths, and knot tying.
In order to test the suture tracking algorithm, the linear stage was used to move the suture thread. One end of the thread was fixed to the moving stage, while the other end of the thread was fed through a hole in an immobile background. As the stage moves, the suture thread is pulled through the hole. This causes the thread to move, lengthen, and deform simultaneously. The thread is successfully tracked while the stage moves with a velocity of 9.7 mm/s. The video submitted with this paper shows how the thread is tracked.
The video also shows how the tracking algorithm is capable of tracking a suture knot as it is drawn tight. Key images from the loop tightening are shown in Fig. 6. As the overhand knot loop is pulled tight, the control points move closer and closer together. Eventually, they are close enough that the control points forming the loop are deleted. The NURBS model then continues to track the suture as it moves and deforms.
Fig. 6. As the suture knot is pulled tight in (a), the control points are drawn closer together. Eventually, the control points converge into a narrow area (b). When the area is small enough, the control points are pruned (c). When the tight knot is moved, it is then tracked by a single control point (d).
The included video also demonstrates how a robotically controlled surgical gripper can reorient and pick up a suture thread that was localized and tracked using the proposed algorithm. The suture thread was threaded through a tissue phantom (part number SCS-10 by Simulab Corp.). Once the user entered the suture thread point, the thread was traced and a grasp point near the end was identified. In addition to providing a grasp point, the NURBS curve model also provides a tangent to the suture thread. This allows the gripper to reorient as needed to grasp the suture. The robot uses visual feedback to translate and reorient until it is able to grasp the suture. The robot gripper coordinates were identified using colored circle markers attached to the gripper. The gripper in the video was designed in the lab [16] and is mounted on an IRB140 (ABB Ltd., Zurich, Switzerland) industrial robotic arm. Once the arm grabs the suture thread, the thread is pulled to show that the grasp was successful.
VIII. Discussion and Conclusions
This paper presents a novel method that can be used to track a complete surgical suture thread online, in real-time, using a calibrated stereo vision system. In the proposed method, the suture thread model is segmented and initialized from a single seed point on the thread. Once the suture thread is initialized, a NURBS model tracks the thread as it moves in real-time. The segmentation operates at 15 Hz while the NURBS tracking operates at 20 Hz. The algorithm is robust against thread deformations, translations, and dynamic thread length. The algorithm is well suited to track a thread end as the thread is pulled through a tissue sample. The method was validated in benchtop experiments using actual suture thread, with the test bench environment colored to emulate a more realistic surgical environment.
The current limitations of the initialization algorithm include some sensitivity to undesired segmented pixels. It is also currently unable to detect when the thread intersects itself along a finite length (e.g. an overhand knot).
The main limitation of the thread tracking algorithm is that it does not add or remove control points based on the local thread shape. This can reduce the measurement accuracy of the actual thread, since some bends might be poorly approximated. Another deficiency is that, when the thread is moving, it can lead the NURBS model; this lag is also due to the parallel processing threads. One way around this might be to incorporate a velocity model into the NURBS curve iteration.
Further work will focus on actively tracking the suture thread as it is manipulated by surgical grippers. This will allow the thread to be tracked as it is tied into a suture knot. Additionally, the NURBS fitting algorithm can be improved to support tracking the thread using previous velocity information.
Acknowledgments
This work was supported in part by the National Science Foundation under grants IIS-0905344 and CNS-1035602, and by the National Institutes of Health under grants R21 HL096941 and R01 EB018108.
REFERENCES
- [1]. Iyer S, Looi T, Drake J. A single arm single camera system for automated suturing. In: Robotics and Automation (ICRA), 2013 IEEE International Conference on; 2013. pp. 239–244.
- [2]. Leonard S, Wu KL, Kim Y, Krieger A, Kim PCW. Smart Tissue Anastomosis Robot (STAR): A vision-guided robotics system for laparoscopic suturing. IEEE Transactions on Biomedical Engineering. Apr. 2014. pp. 1305–1317.
- [3]. Javdani S, Tandon S, Tang J, O'Brien JF, Abbeel P. Modeling and perception of deformable one-dimensional objects. In: Robotics and Automation (ICRA), 2011 IEEE International Conference on; 2011. pp. 1607–1614.
- [4]. Schulman J, Lee A, Ho J, Abbeel P. Tracking deformable objects with point clouds. In: Robotics and Automation (ICRA), 2013 IEEE International Conference on; 2013. pp. 1122–1129.
- [5]. Padoy N, Hager GD. 3D thread tracking for robotic assistance in tele-surgery. In: Intelligent Robots and Systems (IROS), 2011 IEEE/RSJ International Conference on; 2011. pp. 2102–2107.
- [6]. Padoy N, Hager G. Deformable tracking of textured curvilinear objects. In: Proceedings of the British Machine Vision Conference. BMVA Press; 2012. pp. 5.1–5.11.
- [7]. Komodakis N, Tziritas G, Paragios N. Fast, approximately optimal solutions for single and dynamic MRFs. In: Computer Vision and Pattern Recognition (CVPR), 2007 IEEE Conference on; 2007. pp. 1–8.
- [8]. Kass M, Witkin A, Terzopoulos D. Snakes: Active contour models. International Journal of Computer Vision. 1988;1(4):321–331. Available: http://dx.doi.org/10.1007/BF00133570.
- [9]. Meegama RGN, Rajapakse JC. NURBS snakes. Image and Vision Computing. 2003;21(6):551–562. Available: http://www.sciencedirect.com/science/article/pii/S0262885603000660.
- [10]. Frangi A, Niessen W, Vincken K, Viergever M. Multiscale vessel enhancement filtering. In: Wells W, Colchester A, Delp S, editors. Medical Image Computing and Computer-Assisted Intervention — MICCAI'98. Springer Berlin Heidelberg; 1998. pp. 130–137. (Lecture Notes in Computer Science; vol. 1496). Available: http://dx.doi.org/10.1007/BFb0056195.
- [11]. Steger C. An unbiased detector of curvilinear structures. IEEE Transactions on Pattern Analysis and Machine Intelligence. 1998;20(2):113–125.
- [12]. Martinsson H, Gaspard F, Bartoli A, Lavest J. Reconstruction of 3D curves for quality control. In: Ersbøll B, Pedersen K, editors. Scandinavian Conference on Image Analysis. Springer Berlin / Heidelberg; 2007. pp. 760–769.
- [13]. Piegl L, Tiller W. The NURBS Book. 2nd ed. Springer-Verlag New York, Inc.; New York, NY, USA: 1997.
- [14]. Kaul V, Yezzi A, Tsai YJ. Detecting curves with unknown endpoints and arbitrary topology using minimal paths. IEEE Transactions on Pattern Analysis and Machine Intelligence. 2012;34:1952–1965. doi: 10.1109/TPAMI.2011.267.
- [15]. Cham T-J, Cipolla R. Stereo coupled active contours. In: Proceedings of IEEE Computer Society Conference on Computer Vision and Pattern Recognition; 1997.
- [16]. Liu T. Design and Prototyping of a Three Degrees of Freedom Robotic Wrist Mechanism for a Robotic Surgery System. Master's thesis. Case Western Reserve University; Cleveland, OH: 2010.
