Abstract
Open surgery requiring cytoreduction remains the primary treatment for many cancers. The extent of resection is vital to surgical outcome and greatly affects patients' follow-up treatment, including the need for revision surgery in the case of positive margins, the choice of chemotherapy, and overall survival. Existing imaging modalities such as CT, MRI, and PET are useful for diagnosis and long-term monitoring but do not provide the temporal or spatial resolution needed for intraoperative surgical guidance. Surgeons must instead rely on visual evaluation and palpation to distinguish tumors from surrounding tissues. Fluorescence imaging provides high-resolution, real-time mapping with the use of a contrast agent and can greatly enhance intraoperative imaging.
Here we demonstrate an intraoperative, real-time fluorescence imaging system for direct highlighting of target tissues for surgical guidance: optical projection of acquired luminescence (OPAL). Image alignment, accuracy, and resolution were determined in vitro prior to demonstration of feasibility for operating room (OR) use in large animal models of sentinel lymph node biopsy. Fluorescence identification of regional lymph nodes after intradermal injection of indocyanine green (ICG) was performed in pigs with surgical guidance from the OPAL system. Acquired fluorescence images were processed and rapidly re-projected to highlight ICG within the true surgical field. OPAL produced enhanced visualization for resection of lymph nodes. These results show that the OPAL system can use fluorescence image capture and projection to display, aligned within the surgical field, image data derived from signals invisible to the human eye in the OR setting.
Keywords: fluorescence, imaging, surgery, cancer, margins, oncology, indocyanine green
Introduction
Despite improvements in minimally invasive surgical techniques and endoscopic imaging, open surgery remains the approach of choice for many procedures where tactile feedback, a large working space, and other factors are important. Diagnostic imaging modalities including CT, MRI, and PET/SPECT enable non-invasive detection of primary and metastatic tumors throughout the body for staging of disease and pre-surgical planning, but open surgery is still the mainstay of many oncologic operations. Surgeons rely on tactile as well as visual cues to differentiate cancer from surrounding healthy tissues, particularly as tissue locations shift with patient position, after each incision, and during exploration. Enhancement of tissue contrast by optical reporters that selectively accumulate in cancer tissue, lymphatics, or even nerves can therefore greatly improve surgeon confidence, speed procedures, and increase the accuracy of resection, eliminating repeat surgeries and increasing cure rates1,2.
In the operating room, sterility is imperative and space and time are limited. These constraints explain some of the difficulty in introducing new imaging technologies to the operating room (OR). Most intraoperative imaging systems (CT, MRI, ultrasound) are mobile but bulky: they must be wheeled into place alongside the operating table and surgeon for use, then returned when finished. Intraoperative CT and MRI systems are necessarily large and heavy due to the limitations of their detection technologies, and intraoperative ultrasound requires patient contact and constant guidance by an experienced surgeon, so there is little motivation to move beyond standard wheeled units. In contrast, optical imaging provides a non-contact, full-field view during imaging3. Optical imaging can detect biological events ranging from the molecular and sub-cellular levels to organ systems with a large field of view (FOV) and high frame rate, resolution, and sensitivity.
Fluorescence Image Guided Surgery
Optical imaging utilizes nonionizing radiation. In the NIR region (700–1000 nm), tissue absorption and auto-fluorescence are minimal, increasing depth penetration and reducing background signal, respectively4. Fluorescence imaging enables real-time, high-resolution mapping of contrast agent distribution in superficial structures.
Fluorescence imaging can be used to improve visualization in sentinel lymph node (SLN) biopsies5. In patients, cancer cells may appear first in the sentinel nodes before spreading to other areas of the body. While peri-tumoral injections of radioactive colloid or blue dye are currently used for SLN biopsies, fluorescence imaging can be more sensitive and selective, improving the likelihood of accurate removal. Fluorescence molecular imaging has been shown to improve identification and subsequent removal of ovarian cancer peritoneal metastases in humans6.
Here we describe a fluorescence image capture and projection strategy, optical projection of acquired luminescence (OPAL), which promises to significantly enhance the ease of use and adoption of fluorescence guidance for open surgery procedures, including real-time guidance of sentinel lymph node biopsy. This new method features rapid acquisition of fluorescence intensity maps from the operating field and then projects the acquired fluorescence image information directly onto the patient rather than onto an out-of-field digital display (Figure 1). The OPAL system will improve workflow in the OR over other methods by providing fluorescence visualization on demand, via non-disruptive, overhead projection, leaving the surgeon free to operate unimpeded.
Figure 1.
Artistic rendition depicting enhancement of fluorescence guided surgery with OPAL direct projection. Direct display of fluorescence image information onto the imaging field simplifies use of fluorescence during oncologic procedures.
Materials and Methods
Optical Projection of Acquired Luminescence System
We upgraded our preclinical prototype OPAL system7 for use in a clinical operating room (OR) environment. A 2500-lumen, consumer-grade DLP projector provided sufficient brightness and field of view at 1–1.5 m above the surface. A 0.3-megapixel monochrome CMOS camera (Firefly MV, Point Grey Research, Richmond, Canada) with a 785 nm EdgeBasic™ long-pass edge filter (Semrock BLP01-785R-25) was affixed to the projector with its field of view centered at 1 m. Excitation light was provided by a handheld, high-power 780 nm LED with 420 mW output (Thorlabs M780L2), a 26.5 mm diameter Carclo polycarbonate collimating lens, and a 769/41 nm BrightLine® single-band bandpass filter (Semrock FF01-769/41-25). The OPAL system was affixed to a mobile LED surgical light (Harmony LED585, Steris Corp., Mentor, OH) via a custom-machined aluminum adapter fitting the central hub (Figure 2).
Figure 2.
Photograph of the OPAL system attached to an LED surgical light for OR applications.
System Overview and Design
The complete system consists of three essential components: a camera, a projector, and a processing unit. By applying a horizontal integration approach in our design process, we streamlined the system around a single control center, or enterprise service bus (ESB)8. The ESB manages all communication between independent components and subsystems, and routes inputs and outputs to their appropriate destinations (Figure 3). For this system, a MATLAB function comprises the ESB, enabling easy modification and multiplatform compatibility.
Figure 3.
Workflow of OPAL control system.
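As a minimal sketch of the workflow in Figure 3, the loop below acquires a camera frame, routes it through a processing stage, and routes the result to the projector output. The camera adaptor name and video format, and the processFrame/projectFrame stages, are placeholders and assumptions for illustration, not the authors' implementation.

```matlab
% Minimal sketch of an ESB-style control loop (Image Acquisition Toolbox).
% The adaptor/format strings and the processFrame/projectFrame helpers are
% illustrative assumptions; see the later sections for expanded sketches.
function opal_esb_loop()
    cam = videoinput('pointgrey', 1, 'F7_Mono8_640x480');  % assumed camera settings

    ctrl = figure('Name', 'OPAL control');   % closing this window stops the loop
    while ishandle(ctrl)
        raw  = getsnapshot(cam);     % route the camera output onto the bus
        mask = processFrame(raw);    % hypothetical processing stage (CLAHE, threshold)
        projectFrame(mask);          % hypothetical projector output stage
        drawnow limitrate;           % keep GUI callbacks responsive
    end

    delete(cam);                     % release the camera when the loop ends
end
```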
Projector
The defining feature of the system is the projection of processed image data back onto the field by a projector with sufficiently high luminosity. After image processing is complete, the ESB sends the processed data to the projector via a Java application9 run through MATLAB. To ensure proper alignment, the projector provides the ESB with its resolution and display parameters for accurate co-registration with the received data.
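Beyond the cited utility9, one simple way to drive a projector attached as a secondary display from MATLAB is sketched below. The assumption that the projector is the last entry in MonitorPositions, and the variable processedMask, are illustrative.

```matlab
% Minimal sketch: drive the projector as a secondary display.
% Assumes the projector is the last row of MonitorPositions (row format is
% [x y width height] on recent MATLAB releases) and that processedMask holds
% the output of the processing pipeline.
monitors = get(0, 'MonitorPositions');   % one row per attached display
projRect = monitors(end, :);             % assumed: projector is the last display

projFig = figure('Units', 'pixels', 'Position', projRect, ...
                 'MenuBar', 'none', 'ToolBar', 'none', ...
                 'NumberTitle', 'off', 'Color', 'k');
projAx = axes('Parent', projFig, 'Position', [0 0 1 1]);  % fill the figure

imshow(processedMask, 'Parent', projAx);  % display the mask on the projector
```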
Enterprise Service Bus (ESB)
The ESB serves both as a router for all connections to and from components and as a processing center that ensures the data being routed to a component is compatible with it. This allows flexibility in exchanging component models and ease of adding or removing components. However, with each added component the ESB becomes increasingly complex internally as a result of managing the additional connections8.
Alignment Calibration
Since the camera and projector exist in separate coordinate systems and possess different fields of view, alignment calibration is necessary to match captured and projected images to the surgical field. To calibrate the camera–projector system, we selected the projector's field as the reference frame and applied transformations to conform the camera's frame to it. Once each frame has been characterized, all operations and calculations can be performed in the reference frame and the results transformed back to another frame if necessary.
Prior to use, calibration is performed through a series of steps that includes projecting and capturing images. Specifically, we use a basic shape (e.g., square, circle, rectangle) with a known set of coordinates and dimensions in the reference (projector) frame. The system projects this shape onto a blank surface located at the same distance as the surgical subject. The camera captures a still image of the shape, which is processed to determine the corresponding coordinates and dimensions of the shape in the camera's frame. We found that shapes with discrete edges were processed most easily and accurately using MATLAB's built-in corner detection algorithms.
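One way to realize this calibration is sketched below, assuming a single projected square with known corner coordinates in the projector frame. The corner and fitgeotrans toolbox functions stand in for the corner detection referred to above, and sortCornersClockwise is a hypothetical helper for matching corner ordering; none of this is the authors' exact routine.

```matlab
% Sketch of camera-to-projector calibration from one projected square.
squareProj = [100 100; 500 100; 500 500; 100 500];   % known corners, projector px (example)

calib = getsnapshot(cam);                  % camera view of the projected square
bw    = imbinarize(calib);                 % isolate the bright projected shape
cPts  = corner(double(bw), 4);             % four strongest corners in the camera frame
cPts  = sortCornersClockwise(cPts);        % hypothetical helper: match corner ordering

% Projective transform mapping camera coordinates into the projector frame
tformCam2Proj = fitgeotrans(cPts, squareProj, 'projective');

% Subsequent fluorescence images are warped into the reference frame before display
projRef = imref2d([projHeight, projWidth]);            % projector resolution (assumed)
aligned = imwarp(fluorImage, tformCam2Proj, 'OutputView', projRef);
```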
Accuracy of alignment of the projected image with detected signals was evaluated to verify calibration and to establish object-size limits of detection. The alignment calibration routine was first performed as described above. An image containing nine dots, five with a diameter of 1.5 mm and four with a diameter of 1.0 mm (P0), was then projected onto a flat, non-fluorescent surface. The camera was triggered to capture an image of the field of view, including all dots projected in P0. This image (C1) was thresholded and resized using the calibration parameters, and the resulting image (P1) was projected. A second camera trigger captured this image (C2). This procedure was repeated nine times at each of two threshold levels, 80% and 99%, using Otsu's automated thresholding. The Euclidean distance between centroids and the overlap correlation were compared for each dot in C1 and C2 to evaluate the accuracy of alignment based on location and size, respectively.
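A sketch of this analysis is shown below, assuming C1 and C2 are the two thresholded binary captures; matching centroids by image order is a simplifying assumption.

```matlab
% Sketch of the alignment-accuracy analysis for the nine projected dots.
s1 = regionprops(C1, 'Centroid');          % dot centroids, first capture
s2 = regionprops(C2, 'Centroid');          % dot centroids, second capture
c1 = vertcat(s1.Centroid);                 % N-by-2 [x y] coordinates
c2 = vertcat(s2.Centroid);

% Euclidean distance between matched centroids (assumes dots are returned in
% the same order; nearest-neighbour matching could be used instead)
centroidDist = sqrt(sum((c1 - c2).^2, 2));

% Overlap correlation between the two binary masks, reflecting size/shape agreement
overlapCorr = corr2(double(C1), double(C2));
```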
Image Processing
When only a single global threshold is applied, each pixel is contrasted with the entire image, and important lower-intensity fluorescence signals can be drowned out by brighter signals nearby. Contrast-limited adaptive histogram equalization (CLAHE) instead adjusts contrast locally: the image is divided into small regions, the histogram of each region is clipped at a preset limit, and the clipped counts are redistributed across that histogram. Adaptive histogram equalization is then performed, with the transformation for each region blended using the distribution functions of neighboring regions10. CLAHE thus reduces the difference in contrast between smaller and larger areas of fluorescence, preventing the smaller areas from being drowned out. The difference in detail between a simple global threshold and our CLAHE processing method is demonstrated in Figure 4A–C. User-based input allows real-time customization of image processing and display functions when desired (Table 1), accessed within the graphical user interface (GUI), including threshold sliders, colormap drop-down menus, and an image capture button (Figure 4D).
Figure 4.
Demonstration of on-the-fly automated thresholding for real-time projection of fluorescence information showing (A) raw fluorescence image data from the camera, (B) simple thresholding of A, and (C) contrast-limited adaptive histogram equalization thresholding of A. (D) A simplified graphical user interface for OPAL control and visualization with (1) threshold level slider, (2) 8-bit colormap selector, (3) color selector for binary threshold images, (4) image capture button, and (5) on/off button for real-time projection. Images of the raw camera capture (left) and the processed image for projection (right) are displayed on the controller's computer monitor.
Table 1.
List of control options in the OPAL graphical user interface for user-based input.
| Selector | Type | Function |
|---|---|---|
| Colormap | Drop-down list | This menu offers a variety of image processing and viewing options. The default setting is a simple grayscale map with no special processing apart from the initial thresholding. CLAHE processing can be accessed from this drop-down menu. The other options include many of MATLAB's built-in colormaps, including "jet", "hsv", "hot", "cool", and "colorcube". This provides the user with various approaches to viewing the data, allowing customization of contrasts and gradients to best complement the working environment. A monochrome green colormap was chosen for best contrast against the colors expected in the surgical field and for the exceptional sensitivity of the human eye to this color. |
| Threshold | Slider | By adjusting this slider, the user adjusts the threshold value for image processing. The threshold value, set by the slider position from left to right, ranges from 0.00 to 1.00 and corresponds to the luminance level of a pixel relative to the possible maximum and minimum levels9. Setting the threshold value to zero turns the entire image black, whereas setting it to one makes the image completely white. |
| Threshold Color | Drop-down | This control opens a modal dialog allowing the user to select a preset color or define a custom color using a color mixer. The selected color replaces the lighter (default white) output color from thresholding and processing. |
| Image Capture | Button | The image capture feature allows the user to save a set of unprocessed and processed images at the current moment. The captured images are the exact images seen in the GUI, including the raw camera feed as well as the final processed image. |
| Projection | Button | For convenience, the GUI provides an easy way to toggle projection on and off. This function could also be controlled by a foot pedal for hands-free operation in the OR. |
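The controls in Table 1 map naturally onto standard MATLAB uicontrol elements. The following is a minimal, illustrative sketch; the layout values, callback bodies, and use of application data for shared state are assumptions, not the authors' GUI code.

```matlab
% Minimal sketch of the GUI controls listed in Table 1.
fig = figure('Name', 'OPAL control', 'MenuBar', 'none', 'NumberTitle', 'off');

% Threshold slider (0.00-1.00), stored as shared application data
uicontrol(fig, 'Style', 'slider', 'Min', 0, 'Max', 1, 'Value', 0.8, ...
    'Units', 'normalized', 'Position', [0.05 0.85 0.5 0.08], ...
    'Callback', @(s, ~) setappdata(fig, 'threshold', get(s, 'Value')));

% Colormap / processing selector
uicontrol(fig, 'Style', 'popupmenu', ...
    'String', {'gray', 'CLAHE', 'jet', 'hsv', 'hot', 'cool', 'colorcube'}, ...
    'Units', 'normalized', 'Position', [0.6 0.85 0.3 0.08], ...
    'Callback', @(s, ~) setappdata(fig, 'colormapIdx', get(s, 'Value')));

% Image capture button: flags a capture request for the main loop
uicontrol(fig, 'Style', 'pushbutton', 'String', 'Capture', ...
    'Units', 'normalized', 'Position', [0.05 0.7 0.25 0.1], ...
    'Callback', @(~, ~) setappdata(fig, 'captureRequested', true));

% Projection on/off toggle
uicontrol(fig, 'Style', 'togglebutton', 'String', 'Projection on/off', ...
    'Units', 'normalized', 'Position', [0.35 0.7 0.3 0.1], ...
    'Callback', @(s, ~) setappdata(fig, 'projectionOn', get(s, 'Value')));
```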
After histogram equalization, a 2-D median filter is applied to the image for additional noise reduction. The appropriate colormap is then applied and the image data converted to a suitable data type for projection. Using information collected during system calibration, the necessary frame transformation is applied to the image. The final processed image is projected whenever the projection setting is activated; otherwise, the processed image can still be previewed on the controller's monitor in real time.
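Putting these steps together, a minimal sketch of the per-frame processing is shown below, expanding the hypothetical processFrame placeholder used in the earlier control-loop sketch (here with the calibration outputs passed in explicitly). The function signature, the fixed 3×3 median filter, and the binary green mapping are illustrative assumptions.

```matlab
% Sketch of the per-frame processing pipeline described above.
function rgbOut = processFrame(raw, threshold, tformCam2Proj, projRef)
    eq   = adapthisteq(raw);                 % CLAHE contrast enhancement10
    mask = imbinarize(eq, threshold);        % user-selected threshold (GUI slider)
    mask = medfilt2(mask, [3 3]);            % 2-D median filter for noise reduction

    % Monochrome green mapping: binary mask becomes an RGB image for projection
    green  = double(mask);
    rgbOut = cat(3, zeros(size(green)), green, zeros(size(green)));

    % Frame transformation from camera to projector coordinates (from calibration)
    rgbOut = imwarp(rgbOut, tformCam2Proj, 'OutputView', projRef);
end
```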
Intraoperative demonstration of OPAL imaging for Regional Lymph Node Dissection in Swine
All animal studies were conducted in accordance with protocols approved by the Washington University Animal Studies Committee. For all three studies, 35-kg female Yorkshire pigs were premedicated with atropine (0.04 mg/kg intramuscularly) and a cocktail of Telazol (tiletamine and zolazepam), ketamine, and xylazine (1 mL/50 lb [22.7 kg] of body weight intramuscularly) prior to induction and maintenance of anesthesia with isoflurane (1–5% v/v in O2) via intubation. Vital signs were monitored during anesthesia. As a model of SLN biopsy for gynecologic cancer, 5 mg/ml ICG dissolved in sterile water (Sigma Aldrich, St. Louis, MO) was injected intradermally into the vulva (0.1 ml at each of three sites, bilaterally) using an insulin syringe with a 29-gauge needle. For SLN imaging in the second study, 5 mg/ml ICG (0.3 ml) was injected into the leg using an insulin syringe with a 29-gauge needle. To model SLN biopsy for head and neck cancer surgery, 5 mg/ml ICG (0.3 ml) was injected under the skin at the base of the chin as a single injection. All pigs were euthanized approximately 5 minutes post-injection by intravenous potassium chloride.
For intraoperative guidance, the OPAL system was positioned directly over the operating field, approximately 1 m above the skin surface. The handheld excitation light source was operated by an assistant, directed by the surgeon, to illuminate the region of interest without blocking the OPAL field of view. Automated system calibration was performed to ensure alignment of acquired and projected images. The threshold level of acquired fluorescence images was adjusted by the user via the GUI slider, and the resulting fluorescence mask was displayed by the projector onto the operating surface. The image mask was projected in green light, for ideal visual contrast, at 10–15 frames per second. Superficial inguinal lymph nodes were identified by transdermal fluorescence prior to incision, and the region was illuminated with green light. Guided by the location of the green light, a single incision was made to expose the inguinal lymph nodes and complete the biopsy.
Results
Process Control and Accuracy
The basic functionality of our system relies on accurate projection, such that the projector illuminates the same location where the camera detects fluorescence. It is also vital that the calibration process be in statistical control: after each calibration, projected items should remain at the same locations even after halting and resuming projection. That is, the calibration error should remain consistent within each calibration attempt.
Using a flat surface placed 1.00 m from the lens of the projector and 1.11 m from the camera lens, we repeatedly captured an image of a physical target and then projected that image back on top of the target. The calibration error was measured as the distance from the center of the physical target to the center of the projected image. This process was repeated several times per sample, with each sample consisting of the measurements taken after an independent calibration. With 95% confidence, the true mean calibration error was between 1.01 mm and 1.75 mm; the mean was tested using a two-sided Student's t-test at an alpha level of 0.05.
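For reference, a minimal sketch of this summary statistic is shown below, assuming err_mm holds the pooled error measurements in millimeters; the one-sample t-test confidence interval from the Statistics Toolbox is equivalent.

```matlab
% Sketch of the 95% confidence interval for the mean calibration error.
n     = numel(err_mm);                       % number of pooled measurements
mu    = mean(err_mm);
sem   = std(err_mm) / sqrt(n);               % standard error of the mean
tcrit = tinv(0.975, n - 1);                  % two-sided 95% critical value (Statistics Toolbox)
ci95  = [mu - tcrit*sem, mu + tcrit*sem];    % reported in the text as 1.01 to 1.75 mm

% Equivalently, a one-sample t-test at alpha = 0.05 also returns this interval:
% [~, p, ci95] = ttest(err_mm);
```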
Accuracy of alignment
Correlation between images was broadly similar for both threshold levels, although the correlation for the 80% threshold was lower than that for the 99% threshold. In each case there was variation in the Euclidean distance between centroids, with the centroid distance being smaller for the 80% threshold than for the 99% threshold. This is because the higher threshold produced smaller dots in the 99% images, with greater relative differences between centroids for each dot.
These results also demonstrate the detection resolution of the system for fluorescent objects of 1.5 and 3 mm diameter at a distance of 1 m. The accuracy of alignment is sufficiently high to accurately locate these objects within the operating field. Size errors resulted from overestimation by nearly a factor of two for the 1.5 mm objects; however, accuracy of alignment was very high at both threshold levels (Figure 5).
Figure 5.
Analysis of alignment accuracy of the OPAL system. (A) Overlay of image captures from initial projected spots (green) with thresholded (80% and 99% levels), re-projected spots captured at a distance of 1 m from the OPAL system, demonstrating high-resolution detection and accurate projection of 3.3 mm (top and middle rows) and 1.5 mm (bottom row) spots throughout the full field of view. Further analysis demonstrated relatively low correlation of localization (B), due to overestimation of spot size, and high accuracy of alignment (C), as measured by the Euclidean distance between spot centroids.
Model of SLN biopsy for gynecological cancer
ICG transport to the regional lymph nodes was visualized on-field after intradermal injection in the vulva of a female pig. Acquired fluorescence image information was projected onto the imaging field in monochrome green to facilitate localization and biopsy of the sentinel lymph node. The right and left inguinal nodes were visualized and removed with guidance from the OPAL technology. Green illumination tracked well with lymph node location to guide lymph node biopsy (Figure 6). This initial study illustrated the benefits of on-field visualization of the projected green fluorescence pre-incision, post-incision, and post-removal. Fluorescence signal was confirmed to be localized to the lymph node and surrounding lymphatic vessels.
Figure 6.
Intraoperative OPAL use identifying locations of inguinal lymph nodes after intradermal injection of ICG into the vulva of a female pig. (A) Transdermal fluorescence detection of inguinal lymph nodes (lower) and after initial skin incision (upper). (B) Representative raw fluorescence image capture from the OPAL camera. (C) Resulting thresholded, binary image map created from B for projection onto the surgical field.
Model of SLN biopsy for melanoma of the lower limb
After intradermal injection in the leg of the pig, fluorescence from the ICG was clearly visible on-field pre-incision. To facilitate removal of the lymph node, the acquired fluorescence was projected in monochrome green onto the field, and the projected signal tracked well during removal of the tissue. Figure 7 illustrates the on-field visualization of the projected green light pattern after skin incision and elevation of the suspected lymph node. Fluorescence signal was confirmed to be localized to the superficial inguinal lymph node and surrounding lymphatic vessels.
Figure 7.
(A) OPAL-guided biopsy of superficial inguinal lymph nodes demonstrating ICG accumulation after intradermal injection in the lower hindlimb. (B & C) Representative raw fluorescence image capture and thresholded, binary image map for projection, respectively.
Model of SLN biopsy in head and neck cancer
After injection in the neck, there was strong on-field visualization of fluorescence pre-incision. The acquired fluorescence information was projected onto the imaging field in monochrome green. A gland in the neck was visualized and removed with guidance from the OPAL technology. The gland was primarily composed of fatty tissue but was visualized by fluorescence due to the presence of lymphatic tissue. A small piece of removed tissue was visibly green, verifying the presence of ICG. Figure 8 illustrates the on-field visualization of the projected green fluorescence pre-incision and post-incision. The highly fluorescent signal was confirmed to be localized to the lymph node containing ICG.
Figure 8.
OPAL highlighting of a cervical lymph node after intradermal injection of ICG ventral to the mandible. (A) A handheld NIR light source illuminates the area of interest, exciting ICG fluorescence which is detected and re-projected by OPAL. (B) ICG accumulation is confirmed by the green color in the lymph node, confirming the accuracy and sensitivity of OPAL (C).
Discussion
Here we have reported a novel device for intraoperative fluorescence detection and fluorescence guided SLN biopsy in large animals, replicating the clinical operating room environment. Application of OPAL for detection of ICG fluorescence in lymph nodes and lymphatic vessels in large animal models of sentinel lymph node biopsy demonstrates feasibility for intraoperative surgical guidance in human medicine.
Intraoperative optical imaging devices have now been developed for fluorescence imaging during surgical procedures3,6,11–14. Like ultrasound systems, optical imaging systems designed for use in the OR have become less bulky over time and can be categorized as hand-held or hands-free. Handheld optical spectroscopy (contact) and imaging (non-contact) devices are prevalent, but they remain encumbered by wired connections to light sources, computers, and monitors and require constant attention from the operating team. Hands-free devices, once positioned, require little interaction other than attention to adjacent display monitors on the system itself or a nearby wall. Display of images on monitors may distract surgeons from the area of interest and compromise their coordination, affecting surgical outcome. These limitations favor the continued development of real-time optical imaging systems for intraoperative procedures.
Reported hands-free fluorescence imaging systems collect and display fluorescence data in real time with anatomical image overlay, spectral deconvolution, background subtraction, and contrast enhancement2,6. Upcoming technological improvements promise head-mounted, see-through displays with stereoscopic vision to directly enhance the surgeon's view of the surgical field with fluorescence information15, though many hurdles must be overcome before head-mounted technology will be adopted routinely by surgeons.
In contrast, the OPAL system was mounted on an overhead surgical light for unobtrusive, intraoperative detection of ICG in tissues and lymph nodes. Green light signals rapidly alerted the surgeon and surgical team to the location of ICG, even below the tissue surface, without the distraction of offset computer monitors, handheld cameras, or head-mounted devices and their associated cables.
As reported by others, ICG traveled from the site of injection to draining lymph nodes within minutes, enabling localization of the lymph node prior to incision in most cases. ICG has been shown to be equally or more sensitive and specific for SLN detection compared with radioactive colloid and blue dye injections1. While fluorescence detection is limited by tissue light attenuation to a few centimeters below the skin surface, lymphatic transport was visualized in real time. In addition, camera-based fluorescence detection is much more sensitive than naked-eye visualization of blue dye.
Sweeping the handheld NIR excitation source over the area of interest was effective in localizing ICG in tissue via fluorescence signals projected onto the skin or tissue surface. This worked well for non-invasive lymph node detection prior to incision in the groin (Figures 6 and 7), but less well for the deeper-seated cervical lymph nodes (Figure 8) due to depth-dependent light attenuation.
The projection of monochrome green was strong, confirming the location of ICG in all three scenarios and aiding in discovery and resection of the lymph nodes. The OPAL projections tracked successfully while tissues were moved and excised. While projection of green light onto the field of view successfully identified regions of interest, it significantly altered the color and contrast of the tissues; on-demand or intermittent projection of fluorescence image data may therefore be preferable to continuous projection. A graded color scale would also provide more information on the presence of ICG than a single shade of monochrome green.
By projecting output that is darker in areas of higher ICG accumulation and lighter in areas of low accumulation, the surgeon would be better informed about where to make the incision and which tissues to resect, and could make decisions concerning margins with lower risk of damage to adjacent vital structures. Alternatively, brighter illumination sources could be used to illuminate the whole surgical field without affecting the color temperature of the surgical lighting, making the OPAL system truly hands-free.
The applied calibration algorithm uses 2-D imaging and shape detection based on MATLAB functions. Feature detection and data projection were excellent, with detection of objects smaller than 2 mm and highly accurate sizing and co-localization of projected images on a flat surface. Alignment is likely to suffer in significantly non-flat environments, as can be expected during surgery, requiring improvements in depth-of-field focus for both camera and projection. Advanced calibration processes could include 3-D imaging and depth perception to create much more detailed coordinate system (frame) characterizations and transformations, and could be integrated with surgical navigation systems. These improvements would be helpful but may come at the expense of refresh rate. The system would also benefit significantly if much of the foundation were written in a more efficient, lower-level language such as C++.
Conclusions
Fluorescence imaging is used regularly in human medicine to assess patency of blood vessels and ureters during surgery. Near-infrared light (700–1000 nm) penetrates biological tissue much better than visible light, and background fluorescence is lower in this region as well. The human eye has its highest sensitivity to green light (~550 nm) and is insensitive to light beyond 750 nm16, whereas digital camera sensors are much more sensitive to NIR light. New high-intensity LED surgical lighting has optimal color rendering without the infrared component of previous halogen and other light sources. These factors led to the OPAL concept, in which NIR fluorescence can be detected concurrently with surgical lighting and then displayed directly on the operating field with high visual contrast.
Fluorescence image capture and direct projection with NIR fluorescent molecular probes will provide significant enhancements to current fluorescence-guided surgery methods. Light beyond 750 nm is not visible to the human eye and so cannot be seen directly, while green light provides excellent contrast against the red/white/yellow background within the abdomen. Projection of anatomically aligned fluorescence image data directly onto the surgical field will eliminate the need for a digital monitor display during open surgery and make fluorescence information available to the entire surgical team.
Additionally, the OPAL system is similar in operation to other intraoperative imaging systems currently employed in the operating room, allowing a smooth transition into the OR. Because the OPAL system is primarily composed of the projector system, laptop, and NIR illumination source, it is highly portable and can be implemented in the OR without disrupting normal surgical routine.
Acknowledgments
This study was supported by grants from the National Institutes of Health Office of Research Infrastructure Programs (K01RR026095) and the Barnes-Jewish Hospital Foundation BJHF-7583-55.
Abbreviations
- CT
Computed Tomography
- ESB
Enterprise Service Bus
- FOV
Field of View
- ICG
Indocyanine Green
- LED
Light Emitting Diode
- MRI
Magnetic Resonance Imaging
- NIR
Near Infrared
- OPAL
Optical Projection of Acquired Luminescence
- OR
Operating Room
- PET
Positron Emission Tomography
- SLN
Sentinel Lymph Node
- SPECT
Single Photon Emission Computed Tomography
References
- 1. Handgraaf HJ, Verbeek FP, Tummers QR, et al. Real-time near-infrared fluorescence guided surgery in gynecologic oncology: a review of the current state of the art. Gynecologic Oncology. 2014 Dec;135(3):606–613. doi: 10.1016/j.ygyno.2014.08.005.
- 2. de Boer E, Harlaar NJ, Taruttis A, et al. Optical innovations in surgery. The British Journal of Surgery. 2015 Jan;102(2):e56–e72. doi: 10.1002/bjs.9713.
- 3. De Grand AM, Frangioni JV. An operational near-infrared fluorescence imaging system prototype for large animal surgery. Technology in Cancer Research & Treatment. 2003 Dec;2(6):553–562. doi: 10.1177/153303460300200607.
- 4. Rudin M, Weissleder R. Molecular imaging in drug discovery and development. Nat. Rev. Drug Discov. 2003 Feb;2(2):123–131. doi: 10.1038/nrd1007.
- 5. Schaafsma BE, Verbeek FP, Peters AA, et al. Near-infrared fluorescence sentinel lymph node biopsy in vulvar cancer: a randomised comparison of lymphatic tracers. BJOG: An International Journal of Obstetrics and Gynaecology. 2013 May;120(6):758–764. doi: 10.1111/1471-0528.12173.
- 6. van Dam GM, Themelis G, Crane LM, et al. Intraoperative tumor-specific fluorescence imaging in ovarian cancer by folate receptor-alpha targeting: first in-human results. Nature Medicine. 2011 Oct;17(10):1315–1319. doi: 10.1038/nm.2472.
- 7. Sarder P, Gullicksrud K, Mondal S, Sudlow GP, Achilefu S, Akers WJ. Dynamic optical projection of acquired luminescence for aiding oncologic surgery. Journal of Biomedical Optics. 2013 Dec;18(12):120501. doi: 10.1117/1.JBO.18.12.120501.
- 8. Gold-Bernstein B, Ruh W. Enterprise Integration: The Essential Guide to Integration Solutions. Addison Wesley Longman Publishing Co., Inc.; 2004.
- 9. Vachiramon P. Fullscreen 1.1. MATLAB Central File Exchange. http://www.mathworks.com/matlabcentral/fileexchange/23404-fullscreen-1-1. Retrieved June 11, 2014.
- 10. Zuiderveld K. Contrast limited adaptive histogram equalization. In: Heckbert PS, editor. Graphics Gems IV. Academic Press Professional, Inc.; 1994. pp. 474–485.
- 11. Crane LM, van Oosten M, Pleijhuis RG, et al. Intraoperative imaging in ovarian cancer: fact or fiction? Molecular Imaging. 2011 Aug;10(4):248–257. doi: 10.2310/7290.2011.00004.
- 12. Gioux S, Choi HS, Frangioni JV. Image-guided surgery using invisible near-infrared light: fundamentals of clinical translation. Molecular Imaging. 2010 Oct;9(5):237–255.
- 13. Zhu B, Rasmussen JC, Lu Y, Sevick-Muraca EM. Reduction of excitation light leakage to improve near-infrared fluorescence imaging for tissue surface and deep tissue imaging. Medical Physics. 2010 Nov;37(11):5961–5970. doi: 10.1118/1.3497153.
- 14. Marshall MV, Rasmussen JC, Tan IC, et al. Near-infrared fluorescence imaging in humans with indocyanine green: a review and update. Open Surg Oncol J. 2010;2(2):12–25. doi: 10.2174/1876504101002010012.
- 15. Liu Y, Zhao YM, Akers W, et al. First in-human intraoperative imaging of HCC using the fluorescence goggle system and transarterial delivery of near-infrared fluorescent imaging agent: a pilot study. Translational Research: The Journal of Laboratory and Clinical Medicine. 2013 Nov;162(5):324–331. doi: 10.1016/j.trsl.2013.05.002.
- 16. Gross H. Handbook of Optical Systems. Weinheim: Wiley-VCH; 2005.