The Academy of Clinical Laboratory Physicians and Scientists (ACLPS) established the Paul E. Strandjord Young Investigator Awards Program in 1979 to encourage students and trainees in laboratory medicine to consider academic careers. Each year, a call for abstracts is sent to each member, inviting submission of scientific papers.
All submitted abstracts are peer reviewed by a committee of ACLPS members selected confidentially by the director of the Young Investigator Program, Neal I. Lindeman, MD. Reviewers are blinded to authors and institutions. Young Investigator Award recipients are granted free registration to the annual meeting, reimbursement for a portion of travel expenses, and the opportunity to present their scientific work before an audience of peers and mentors.
The following abstracts were presented at Lab Medicine 2014, the 49th Annual Meeting of ACLPS, May 29 to May 31, 2014, in San Francisco, CA. Authors receiving a 2014 Young Investigator Award are marked with an asterisk (*).
ACLPS abstracts are published in the American Journal of Clinical Pathology (AJCP) as received by ACLPS without AJCP editorial involvement. Content and typographical errors and inconsistencies in these abstracts are the responsibility of the abstract authors.
1 Abnormal B Lymphoid Blasts May Not Always Indicate Accelerated or Blast Phase in Chronic Myelogenous Leukemia (CML)
Lori Soma,* Cheng Ding, and Sindhu Cherian.* University of Washington, Seattle, WA.
Objective: To determine if CML patients with <20% abnormal B lymphoid blasts (ABLB) by flow cytometric studies (FC) progress to accelerated phase or blast crisis. Methods: The University of Washington Hematopathology database was searched from 2005 to 2012 for patients who were reported to have a positive p210 by molecular studies and for patients with the diagnosis of “abnormal immature B cell population” by FC, which was reported as <20% of white cells. The patient history was evaluated for clinical presentation, blast transformation, therapy, and outcome. The FC dot-plots were reviewed to assess the abnormal phenotype and ABLB percentage. If ABLB were >0.15% of the white cells by FC, histologic examination of the bone marrow was performed, including immunohistochemical stains (IHC: CD34, PAX5, and TdT). Results: The search criteria yielded 2,491 total entries, of which five patients were found who had CML and ABLB reported at less than 20% of the viable white cells by FC in peripheral blood or bone marrow. All specimens were obtained at initial presentation; patient ages ranged from 22 to 57 years, and the group included three men and two women. Upon review of the data, the ABLB percentage by FC ranged from 0.006% to 23.5% of total white cells. Commonly seen abnormalities included increased CD10 (n = 4), increased CD19 (n = 3), and decreased or slightly decreased CD38 (n = 5). Three patients demonstrated 10% to 24% ABLB by FC and/or IHC, with the remaining two less than 1% by FC. IHC in all three patients demonstrated increased numbers of interstitial TdT- and PAX5-positive cells (10%–20%), as well as small clusters (interstitial and some paratrabecular; two patients with >10 TdT-positive cells per cluster). One patient was treated with hyper-CVAD (two cycles) and then followed as chronic phase, and the remaining four patients were treated and followed as chronic phase CML. All patients are alive without molecular evidence of disease or in chronic phase without evidence of ABLB (14–47 months of follow-up). The ABLB seen at diagnosis or in the diagnostic time frame was not seen in subsequent FC analysis where available (or reported in subsequent documentation). Conclusion: The presence of ABLB in CML patients at presentation should not be considered in isolation when determining disease phase and treatment.
3 Rapid Detection of the Active Cardiac Glycoside Convallatoxin of Lily of the Valley Using LOCI Digoxin Assay
Kerry J. Welsh, Richard S. P. Huang, Jeffrey K. Actor, and Amitava Dasgupta.* Department of Pathology and Laboratory Medicine, University of Texas Health Science Center at Houston, Houston, TX.
Objectives: Lily of the valley plant is toxic to animals and humans due to the presence of the cardiac glycoside convallatoxin. Extracts of this plant are used in herbal medicine despite reported cases of toxicity. The luminescent oxygen channeling technology-based digoxin immunoassay (LOCI digoxin assay; Siemens Diagnostics) was explored for rapid detection of convallatoxin. The potential in vitro binding of convallatoxin with Digibind was also evaluated. Methods: Aliquots of a drug-free serum pool and a digoxin serum pool were supplemented with microliter amounts of lily of the valley extract or nanogram to microgram quantities of convallatoxin, followed by measurement of apparent digoxin concentrations using the LOCI digoxin assay. Mice were administered lily of the valley extract (50 μL) or 50 μg of convallatoxin, and digoxin concentrations in sera were measured 1 and 2 hours after gavage. Aliquots of a serum pool supplemented with convallatoxin or lily of the valley extract were further supplemented with various concentrations of Digibind, and free apparent digoxin concentrations were measured. Results: Apparent digoxin concentrations were observed when aliquots of a drug-free serum pool were supplemented with convallatoxin or lily of the valley extract; for example, the apparent digoxin level was 2.6 ng/mL in the presence of 500 ng/mL of convallatoxin. Apparent digoxin levels were also detected in mice after gavage with convallatoxin or herbal extract, and the half-life of convallatoxin was 0.43 hours. Bidirectional interference (negative at lower concentrations and positive at higher concentrations) of both convallatoxin and lily of the valley extract with serum digoxin measurement using the LOCI assay was observed. Digibind was capable of binding convallatoxin in vitro: in the presence of 5 μg/mL of Digibind, apparent free digoxin was reduced from 0.58 ng/mL to none detected when a drug-free serum pool was supplemented with lily of the valley herbal extract. Conclusions: The LOCI digoxin assay can be used for rapid detection of convallatoxin, and Digibind can neutralize convallatoxin in vitro.
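As a worked illustration of the half-life estimate above, the sketch below applies the standard first-order elimination formula to two timed apparent digoxin values; the concentrations are hypothetical, and only the 0.43-hour result is taken from the abstract.

```python
import math

def half_life(c1, t1, c2, t2):
    """First-order half-life from two timed concentrations:
    k = ln(c1/c2) / (t2 - t1), t1/2 = ln(2) / k."""
    k = math.log(c1 / c2) / (t2 - t1)  # elimination rate constant (1/h)
    return math.log(2) / k

# Hypothetical apparent digoxin values (ng/mL) at 1 and 2 hours after gavage;
# a 0.43-hour half-life implies roughly a 5-fold fall per hour (2**(1/0.43) ~ 5).
print(round(half_life(2.5, 1.0, 0.5, 2.0), 2))  # -> 0.43
```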
4 Thromboelastography Is a Suboptimal Test for Determination of the Underlying Cause of Bleeding Associated With Cardiopulmonary Bypass
Kerry J. Welsh, Angelica Padilla, Amitava Dasgupta,* Andy N. D. Nguyen, and Amer Wahed. Department of Pathology and Laboratory Medicine, University of Texas Health Science Center at Houston, Houston, TX.
Objectives: Patients undergoing cardiac surgery with cardiopulmonary bypass (CPB) are at high risk of bleeding. The Coagulation-Based Hemotherapy (CBH) service is a clinical pathology consultation service that provides intraoperative management of coagulopathy and transfusion support based on predefined algorithms for patients undergoing surgery with CPB. One test used by the service is thromboelastography (TEG), a test that measures the dynamics of clot formation and lysis. The goal of this investigation was to compare TEG with standard coagulation tests in patients with active bleeding due to CPB-induced coagulopathy. Methods: Patients who had an intraoperative consultation by the CBH service from May 1, 2012, through August 1, 2013, were selected for inclusion in the study. The TEG values and coagulation tests (PT, PTT, fibrinogen, and d-dimer) of patients during episodes of active bleeding attributed to coagulopathy were characterized. The utility of TEG to predict postoperative bleeding was assessed. A second analysis was performed to determine if a shortened TEG R time is associated with thrombotic events. Results: Paired TEG and standard coagulation tests were available from 21 patients during an episode of active bleeding attributed to coagulopathy; 15 of these patients had normal TEG values and three showed only a shortened R time, yet all of these patients had abnormalities of standard coagulation tests. Eighteen of 67 patients who underwent surgery with CPB had an episode of postoperative bleeding. The TEG R time, clotting index (CI), PT, and PTT collected after CPB were associated with postoperative bleeding in the univariate analysis, but only PT was independently associated with postoperative bleeding in multivariate analysis. The mean TEG MA was 68.98 ± 5.13 mm for 30 patients on aspirin alone and 68.88 ± 8.52 mm for 12 patients on both aspirin and clopidogrel; both values are within the reference range (50–70 mm). In the second analysis, three of 38 patients with a normal TEG and four of 43 patients with a shortened R time had a thrombotic event during their hospitalization (P = 1.00). Conclusions: TEG had limited utility in identifying the underlying cause of active bleeding and was not independently predictive of postoperative bleeding associated with cardiac surgery compared with conventional coagulation tests. The TEG MA does not identify patients on aspirin or clopidogrel, a risk factor for bleeding associated with cardiac surgery. A shortened TEG R time may not represent a hypercoagulable state.
5 Powered Versus Manual Bone Marrow Biopsies: A Retrospective Review
Douglas W. Lynch, Stephanie L. Stauffer, and Nancy S. Rosenthal. Sponsor: Matthew D. Krasowski. University of Iowa Hospital and Clinics, Iowa City, IA.
Powered bone marrow biopsy systems (PS) have been used for several years, but few data exist regarding the quality of the pathologic specimens obtained. We evaluated the quality and quantity of the aspirates and biopsy specimens obtained in the same patient with a PS compared with the standard manual method (MM). The University of Iowa Pathology laboratory information system was reviewed for patients who had undergone bone marrow biopsies performed by both the OnControl Bone Marrow System and the MM, and a total of 68 patients were identified. Each specimen was reviewed for a dilute aspirate, bone in the clot section, cortical bone/subcortical location, aspiration and crush artifact, presence of skin, and the presence of a single, intact core biopsy specimen. The length of each biopsy specimen, percentage of evaluable marrow, and calculated length of evaluable marrow were compared for each case using a paired t test. The core biopsy specimens obtained by the PS were significantly longer than those obtained by the MM (16.9 vs 14.4 mm, P = .0040); however, those obtained by the MM had more evaluable marrow elements (66% vs 40%, P = .0001), and the MM was superior in 46 of the 68 patients when the length of evaluable marrow was calculated (9.7 vs 7 mm, P = .0048). Every biopsy specimen acquired by the MM had some evaluable marrow, while two core biopsy specimens obtained by the PS had none. With the PS, we were more likely to see cortical bone/subcortical location, skin, aspiration artifact, and bone in the clot section, and we were less likely to see a single, intact core biopsy specimen. Our findings show that longer core biopsy specimens are obtained by the OnControl Bone Marrow System but that the MM is still superior when the percentage and length of evaluable bone marrow are analyzed.
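A minimal sketch of the paired comparison described above, assuming the analysis were run in Python with SciPy; the per-patient core lengths below are hypothetical stand-ins for the study data.

```python
from scipy import stats

# Hypothetical paired core-biopsy lengths (mm) from the same six patients.
powered = [18.0, 15.5, 17.2, 16.0, 19.1, 15.8]  # powered system (PS)
manual  = [14.0, 13.8, 15.1, 14.5, 15.0, 13.9]  # manual method (MM)

# Paired t test: each patient serves as his or her own control.
t_stat, p_value = stats.ttest_rel(powered, manual)
print(f"t = {t_stat:.2f}, P = {p_value:.4f}")
```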
8 Case Studies Illustrating the Clinical Utility of Liquid Chromatography–Tandem Mass Spectrometry for the Assessment of Free Thyroid Hormone Status
Verena Gounden,1 Francesco S. Celi,2,3 and Steven J. Soldin.1,4 1Department of Laboratory Medicine and 2NIDDK, National Institutes of Health, Bethesda, MD; 3Division of Endocrinology and Metabolism, Virginia Commonwealth University, Richmond, VA; and 4Department of Medicine, Georgetown University, Washington, DC.
The majority of routine clinical laboratories perform free thyroxine (FT4) and free tri-iodothyronine (FT3) measurements on immunoassay (IA) platforms. The validity of FT4 and FT3 analysis by direct analogue IA has long been questioned. These assays are affected by changes in binding protein concentrations, have a weak inverse linear log relationship to thyroid-stimulating hormone (TSH) in hypo- and hyperthyroid individuals, and have poor performance at the upper and lower values of the concentration range. Liquid chromatography–tandem mass spectrometry (LC-MS/MS) following ultrafiltration of the sample at 37°C (method as previously described by Gu et al. Clin Biochem. 2007;40:1386–1391) has been shown to perform better in such circumstances. Here we describe two cases with TSH values discordant with IA FT4 and/or FT3 values in which the LC-MS/MS method for FT3 and FT4 analysis was used to better elucidate thyroid status in complex clinical scenarios. Case 1: A 30-year-old man with Kaposi’s sarcoma was admitted to the intensive care unit. Initial results showed an elevated TSH and a low FT4 (IA) of 0.7 ng/dL (reference interval, 0.8–1.5); thyroid antibodies were negative. The patient was commenced on thyroxine replacement therapy. After 5 weeks of replacement therapy, immunoassay results were as follows: TSH of 3.52 μIU/mL (0.4–4.0), FT4 (IA) of 0.66 ng/dL, and FT3 of 140 pg/dL (180–420); the FT4 and FT3 remained low. Following sample treatment with heterophilic blocking antibodies, IAs were rerun with no change in values. On LC-MS/MS analysis, results were as follows: FT4, 2.7 ng/dL (1.7–2.4) and FT3, 2.6 pg/mL (2.0–6.1). In addition, thyroxine-binding globulin analysis revealed a decreased value of 4.3 μg/mL (13–39). This decrease is the likely reason for the low FT4 and FT3 IA results, while the persistently high-normal TSH was likely due to sick euthyroid syndrome. Case 2: A 38-year-old woman with sickle cell anemia and primary hypothyroidism was receiving levothyroxine replacement therapy. On follow-up, TSH remained elevated at 8.56 μIU/mL with a normal FT4 of 0.8 ng/dL and a low total T3. The patient also continued to have symptoms of hypothyroidism. LC-MS/MS results were low for FT4 (1.3 ng/dL [1.7–2.4]) and low normal for FT3 (1.7 pg/mL [1.5–6.1]). Levothyroxine replacement therapy was increased with subsequent normalization of TSH levels. Conclusion: The authors believe that LC-MS/MS analysis of FT4/FT3 should be performed on all samples with discrepant TSH and free thyroid hormone immunoassay values.
9 A Comparison of Thromboelastometry (ROTEM) and Standard Coagulation Testing: Similar Results but Improved Patient Care?
Michael D. Reyes, Benjamin Ramos, Kendall P. Crookston,* and Sara C. Koenig. Department of Pathology, University of New Mexico School of Medicine, Albuquerque, NM.
Thromboelastometry is increasingly being incorporated into the evaluation of hemostasis in a variety of clinical settings. Despite its increasing popularity and use, questions remain regarding its correlation with standard coagulation testing and its appropriate use. In this study, two primary objectives are addressed: (1) assessment of the correlation between specific ROTEM parameters and standard coagulation testing and (2) review of the potential advantages of ROTEM compared with standard coagulation testing. ROTEM analysis was performed on 47 whole-blood samples collected over a 6-month period. Thirty-four samples were collected from inpatients and 13 from outpatients seen in the Coumadin Clinic. The remaining inpatient sample volume was sent for concurrent routine coagulation testing (PT/INR, PTT, and fibrinogen) following ROTEM analysis. Coumadin Clinic patients provided consent, and the first draw was sent for routine coagulation studies, while point-of-care INR testing was performed on the second sample (per clinic protocol). A correlation was found between the maximum clot firmness (MCF) of the fibrin-specific clot formation (FIBTEM) and fibrinogen levels (r = 0.82). In addition, we observed a strong correlation between the amplitude at 10 minutes (A10) of the FIBTEM and fibrinogen levels (r = 0.98). FIBTEM A10 was highly predictive of the final ROTEM measurement of fibrinogen (FIBTEM MCF). Correlation between the clotting time (CT) EXTEM and INR (r = 0.76) and between CT INTEM and PTT (r = 0.64) was seen as well. In addition to the correlation between various ROTEM parameters and standard coagulation testing, specific ROTEM features may promote improved patient care. The rapid turnaround time, evaluation of whole-blood coagulation (as opposed to plasma-based testing), and ability to view the data from any location are distinct advantages. Additionally, while ROTEM testing is generally more expensive than standard coagulation testing, improved patient care and decreased overall cost may be realized through targeted transfusion therapy and more rapid operating room turnover times.
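A minimal sketch of the correlation step reported above, assuming a Pearson analysis in Python with SciPy; the FIBTEM A10 and fibrinogen pairs are hypothetical, not the study data.

```python
from scipy import stats

# Hypothetical paired results: FIBTEM A10 (mm) and fibrinogen (mg/dL).
a10        = [8, 10, 12, 15, 18, 22, 25]
fibrinogen = [150, 190, 230, 290, 340, 420, 470]

r, p = stats.pearsonr(a10, fibrinogen)  # the abstract reports r = 0.98 for this pair
print(f"r = {r:.2f}, P = {p:.4f}")
```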
10 Severe Vitamin B12 Deficiency Mimicking Thrombotic Thrombocytopenic Purpura
Joshua K. Routh,1 Cristhiam M. Rojas-Hernandez,2 Kendall P. Crookston,1,2* and Sara C. Koenig.1 Departments of 1Pathology and 2Internal Medicine, University of New Mexico School of Medicine, Albuquerque, NM.
Microangiopathic hemolytic anemia (MAHA) is associated with serious conditions, including thrombotic thrombocytopenic purpura (TTP), hemolytic uremic syndrome (HUS), and disseminated intravascular coagulopathy (DIC); however, vitamin B12 deficiency can rarely present as a pseudothrombotic microangiopathy (pseudo-TMA). A previously healthy 43-year-old man with chronic alcoholism presented to a rural medical center with a 2-week history of confusion, fevers, chills, exertional dyspnea, dizziness, and fatigue associated with diarrhea and hematochezia. Laboratory results were notable for a platelet count of 19 × 10³/μL, hemoglobin of 5.0 g/dL, hematocrit of 15.1%, and an MCV of 118 fL. Creatinine was within normal limits. A peripheral blood smear showed numerous schistocytes. A presumed diagnosis of TTP/HUS was given, and the patient was transfused with 3 units of plasma and transferred to a tertiary hospital for therapeutic plasma exchange (TPE). Further testing revealed a decreased vitamin B12 level of 93 pg/mL (normal range, 193–986). A repeat peripheral blood smear showed marked macrocytosis, hypersegmented neutrophils, marked thrombocytopenia, and increased red cell fragments. The differential diagnosis was expanded to include pseudo-TMA secondary to B12 deficiency. The patient received a total of five TPE procedures and 1,000 μg of vitamin B12 daily for 7 days. His platelets, hemoglobin, MCV, and clinical condition gradually improved. Stool cultures tested negative for Shiga toxin. The ADAMTS13 level drawn at the tertiary hospital before the first TPE was 71%. Additional workup confirmed the diagnosis of pernicious anemia, given the presence of intrinsic factor antibodies. Although the patient had received 3 units of plasma prior to transfer, it is unlikely that his pretransfusion ADAMTS13 level was low enough to be consistent with TTP. Additionally, the patient’s creatinine had remained within normal limits throughout his hospitalization. At this point, a diagnosis of pseudo-TMA resulting from vitamin B12 deficiency was made and TPE was discontinued. The occurrence of diarrhea and hematochezia was interpreted as a consequence of malabsorption and of severe thrombocytopenia at presentation, respectively. The patient’s laboratory data and clinical condition continued to improve, and he was discharged home with vitamin B12 supplementation.
11 Characteristics of a Rare Adverse Event: Fatal Transfusion-Associated Anaphylaxis
Eric A. Gehrie,1 Pampee P. Young,2* Christopher A. Tormey,1* and Garrett S. Booth.2 1Department of Laboratory Medicine, Yale University, New Haven, CT, and 2Department of Pathology, Microbiology & Immunology, Vanderbilt University, Nashville, TN.
Anaphylaxis is a rare but potentially catastrophic complication of blood component transfusion. Using a novel approach of data-mining publicly available US Food and Drug Administration (FDA) transfusion reaction records to yield relatively large aggregate case numbers, we sought (1) to investigate the general characteristics of fatal anaphylactic transfusion reactions (ATRs) and (2) to determine possible risk factors associated with ATRs. Methods: We filed Freedom of Information Act (FOIA) requests with the FDA for reports of fatal ATRs from 2006 to 2012. For applicable cases, several parameters were collected, including patient age, type of product, volume of product infused, and duration of transfusion. Results: There were 13 reports of fatal ATRs from 2006 to 2012 with data available. The majority of patients were male (n = 8). The median age was 68 years (range, 52–85 years). Eight deaths were associated with fresh-frozen plasma (FFP), three were attributed to platelets (PLTs), and two were attributed to RBCs. The median time to onset was estimated to be 9 minutes for FFP transfusions, 20 minutes for PLT transfusions, and 32.5 minutes for RBC transfusions. The volume of blood product transfused prior to symptom onset was estimated in eight cases. The median volume transfused was 95 mL for FFP (n = 4), 287.5 mL for PLT transfusions (n = 2), and 65 mL for RBC transfusions (n = 2). The infusion rate could be estimated in eight cases: four FFP transfusions (median rate: 14 mL/min), two PLT transfusions (median rate: 12.4 mL/min), and two RBC transfusions (median rate: 2 mL/min). Conclusion: Transfusion recipients who die of ATRs reported to the FDA are typically older adults receiving FFP. Further work is currently under way to determine whether the infusion rate in cases of fatal FFP and PLT transfusions is substantially faster than standard practice. FOIA requests appear to be a promising method to gather information about transfusion-associated events that are too rare to study at an individual medical center.
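A minimal sketch of how the per-component median infusion rates above could be derived from case records; the tuples below are illustrative values, not the FDA data.

```python
import statistics

# Hypothetical fatal-ATR cases: (component, volume infused in mL, minutes to onset).
cases = [("FFP", 95, 7), ("FFP", 140, 10), ("PLT", 250, 20), ("RBC", 65, 32)]

rates = {}
for component, volume_ml, minutes in cases:
    rates.setdefault(component, []).append(volume_ml / minutes)  # mL/min per case

for component, values in sorted(rates.items()):
    print(component, round(statistics.median(values), 1), "mL/min (median)")
```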
14 Likely Evolution of ABL1 Kinase Domain Mutation T315L out of T315I in Chronic Myelogenous Leukemia, Lymphoid Blast Phase With T-Lymphoblasts
Geoffrey D. Wool and Farid F. Chehab. Sponsor: Enrique Terrazas.* Department of Laboratory Medicine, UCSF, San Francisco, CA.
We present a unique case of a patient with chronic myelogenous leukemia (CML), lymphoid blast phase with T lymphoblasts, who also has coexisting ABL1 kinase domain mutations T315L and T315I. Bone marrow biopsy and initial ancillary studies were performed at an outside institution and reviewed at UCSF Hematopathology. ABL1 kinase domain mutation analysis was performed at UCSF by Sanger sequencing of complementary DNA (forward and reverse orientations) from peripheral blood. The patient is a 59-year-old man who presented with night sweats, splenomegaly, weight loss, and leukocytosis (WBC of 320,000/μL with 11% T lymphoblasts). Consistent with positive peripheral blood fluorescence in situ hybridization for the 9;22 translocation, qPCR revealed the major breakpoint p210 (b3a2) and a BCR-ABL1:ABL1 international standard ratio of 83.64. The marrow biopsy specimen was markedly hypercellular (>95%) and was composed predominantly of blasts and maturing granulocytes with virtually absent erythropoiesis. Megakaryocytes were mildly increased in number and were predominantly small. Other diagnoses, such as mixed phenotype acute leukemia with t(9;22)(q34;q11.2) or Ph+ T-lymphoblastic leukemia/lymphoma, were considered but were ruled out by morphologic and immunophenotypic criteria. The patient was started on TKI treatment, developed recurrent splenomegaly 1 month later, and was begun on hyper-CVAD. Follow-up testing after treatment revealed a persistently high BCR-ABL1 burden. ABL1 kinase domain mutation testing revealed a mixed mutant allele burden at codon 315 (without the wild-type ACT codon). The major allele was CTT at codon 315, c.943A>C, c.944C>T, corresponding to a T315L mutation. There was a minor population of ATT at codon 315, c.944C>T, corresponding to a T315I mutation. Thus, it is likely that a C>T mutation from the normal codon gave rise to the ATT/T315I codon, from which an A>C substitution then produced the CTT/T315L codon. The ABL1 kinase domain mutation T315L is rare, having been described in one case of Ph+ B-lymphoblastic leukemia/lymphoma. Whether the progression of the T315I allele to T315L implies a worse prognosis than that already associated with CML lymphoid blast phase disease is unclear at this time. In summary, this is an exceptional case report of a patient with a rare CML with T-lymphoid blast phase as well as a T315L ABL1 kinase domain mutation that appears to have evolved out of a poor prognosis T315I allele.
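The codon logic above can be made explicit with a toy lookup covering only the three relevant codons; this is an illustration of the reported sequence changes, not part of the Sanger analysis itself.

```python
# Wild-type ABL1 codon 315 is ACT (Thr). c.944C>T yields ATT (T315I);
# adding c.943A>C yields CTT (T315L).
CODON = {"ACT": "Thr (wild type)",
         "ATT": "Ile (T315I, via c.944C>T)",
         "CTT": "Leu (T315L, via c.943A>C + c.944C>T)"}

for codon in ("ACT", "ATT", "CTT"):
    print(codon, "->", CODON[codon])
```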
17 A Report of Clotting During LDL Apheresis
Kathryn Matney,1 Mary Berg,1* James Falko,2 and Nicole L. Draper.1 Departments of 1Pathology and 2Internal Medicine, University of Colorado, Aurora, CO.
Low-density lipoprotein (LDL) apheresis is a recommended treatment for severe hyperlipidemia. In individuals with familial hypercholesterolemia (FH), apheresis may be the treatment of choice. Severe reactions involving clotting during the procedure are not well described in the literature. Case Report: We report the case of a 63-year-old woman who underwent LDL apheresis for refractory FH. LDL apheresis was done using a dextran-sulfate cellulose (DSC) column with the Kaneka MA-01 Liposorber system (Kaneka Corporation, Japan). Before the first procedure, her serum values were total cholesterol, 241 mg/dL; LDL, 155 mg/dL; and lipoprotein(a), 142 mg/dL. She received a 25-U/kg loading dose of unfractionated heparin (UH) with a 2,100-mL/h UH infusion rate. At initiation of the procedure, the patient was asymptomatic, but her blood pressure (BP) slowly declined. At 75 minutes, it reached a nadir of 72/45 mm Hg. A bolus of saline improved her BP to 106/52 mm Hg, but at 60 minutes, the blood in the tubing leading to the filter had started to clot. Despite replacing the pressure valve and separator column, the patient’s blood clotted off the instrument at 185 minutes. In addition, her BP again dropped below 100 mm Hg systolic despite repeated boluses of saline. The procedure was stopped after processing 32% of the target plasma volume. After the first procedure, her serum values were total cholesterol, 147 mg/dL; LDL, 85 mg/dL; and lipoprotein(a), 70 mg/dL. Twenty-one days later, the patient underwent a second procedure. Before the second procedure, her serum values were total cholesterol, 238 mg/dL; LDL, 155 mg/dL; and lipoprotein(a), 122 mg/dL. For this procedure, she received an increased loading dose (50 U/kg) of UH followed by a 2,100-mL/h infusion rate. The patient did not develop clotting during this procedure, but her BP was labile; following a vasovagal episode while getting up to the bedside commode, the procedure was terminated at 38.5% of the target volume. After the second procedure, her serum values were total cholesterol, 122 mg/dL; LDL, 66 mg/dL; and lipoprotein(a), 53 mg/dL. The patient returned for third and fourth treatments. She completed and tolerated both procedures well. For these procedures, she was given an increased UH loading dose of 35 U/kg followed by a 2,100-mL/h infusion during the procedure. Conclusion: We report a case of severe clotting associated with hypotension during LDL apheresis performed by the DSC method. Clotting during DSC apheresis is not well described in the literature; there was one case reported in 1995. A growing body of literature on the role of lipoprotein(a) in atherothrombotic complications and hemostasis supports a possible mechanism by which clotting could occur during apheresis. This theory is consistent with our case in that the patient’s lipoprotein(a) levels progressively declined with each procedure. Although further investigation is needed, the available literature suggests that higher anticoagulant doses during apheresis should be considered in individuals with marked elevations in lipoprotein(a).
18 Prevalence of Novel Psychoactive Substance Use Among Stimulant Abusers in San Francisco
Jennifer M. Colby, Alan H. B. Wu,* and Kara L. Lynch. Department of Laboratory Medicine, University of California San Francisco, San Francisco, CA.
Novel psychoactive substances (NPS), also known as designer drugs, are usually legal alternatives to common drugs of abuse that can be found in brick-and-mortar stores as well as purchased over the Internet. NPS usage and side effects have been the subject of significant media coverage in the United States, but there are few population-based studies on NPS prevalence. Reasons for this include the lack of available standards and unknown cross-reactivity with commercially available immunoassays that are routinely used as screening tools. We developed a liquid chromatography/high-resolution mass spectrometry (HRMS) screening method for amphetamine-like NPS and then used the method to determine the prevalence of NPS in a subset of our patient population and to identify substances that caused false positives in our amphetamine immunoassay. One major advantage of our HRMS method is that we can acquire accurate mass data in an untargeted manner and then, during data processing, specifically look for masses that correspond to NPS. This technique, known as suspect screening, allows us to search the data for NPS without acquiring reference standards for each compound. Our liquid chromatography method used a Phenomenex Kinetex C18 column and a simple 0% to 100% gradient of mobile phase B over 10 minutes. Mobile phase A consisted of 5 mmol/L ammonium formate and 0.05% formic acid in water; mobile phase B consisted of 0.05% formic acid in 50:50 methanol:acetonitrile. Our ABSciex 5600 QTOF mass spectrometer collected full-scan high-resolution data on precursor ions with information-dependent acquisition of up to 30 simultaneous product ion spectra. We analyzed more than 200 urine samples that had positive immunoassay screens for amphetamines to determine the positivity rate for over 400 compounds, including amine NPS. Ninety percent of the samples confirmed positive for methamphetamine, and 42% confirmed positive for amphetamine. Only 3% of samples were false-positive screens that confirmed positive for another drug, including trazodone, dextromethorphan, doxylamine, and bupropion. Approximately 5% of samples did not confirm positive for any known compound and are undergoing further testing. We confirmed ethylamphetamine, methylone, and MDMA in our patient population but ultimately found that fewer than 10% of the samples were positive for any amine NPS. While this low prevalence rate may reflect a study population bias, as the pool of individuals who abuse illegal stimulants may be different from the pool who would use NPS, our study indicates that NPS usage is not as frequent within our population as the news media might have us believe.
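A minimal sketch of the suspect-screening idea described above: each observed precursor m/z is matched against theoretical [M+H]+ masses within a ppm tolerance. The masses are illustrative monoisotopic values, and the function and tolerance are assumptions, not the authors' actual pipeline.

```python
# Illustrative [M+H]+ monoisotopic masses for three amine NPS/stimulants.
SUSPECTS = {"methylone": 208.0968, "MDMA": 194.1176, "ethylamphetamine": 164.1434}

def match_suspects(mz_observed, tol_ppm=10.0):
    """Return suspects whose theoretical m/z lies within tol_ppm of the observation."""
    hits = []
    for name, mz_theory in SUSPECTS.items():
        ppm = abs(mz_observed - mz_theory) / mz_theory * 1e6
        if ppm <= tol_ppm:
            hits.append((name, round(ppm, 1)))
    return hits

print(match_suspects(208.0975))  # [('methylone', 3.4)]
```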
21 Quantitation of Infliximab in Serum Using Surface Plasmon Resonance
Katie L. Thoren, Kara L. Lynch, and Alan H. B. Wu.* Department of Laboratory Medicine, University of California–San Francisco, San Francisco, CA.
Infliximab is a chimeric monoclonal antibody that targets tumor necrosis factor–alpha (TNFα) and is used to treat a variety of chronic autoimmune disorders. Therapeutic drug monitoring for infliximab is important in guiding patient management; serum concentrations can indicate whether the dosing needs to be adjusted or the patient should be switched to a different therapeutic. Current testing options for infliximab, however, are limited. Here, we use surface plasmon resonance (SPR) to measure infliximab concentrations in serum and, more generally, explore the potential use of this technology in the clinical laboratory. All SPR experiments were performed on the ProteOn XPR36 (Bio-Rad). Recombinant human TNFα (GenWay Biotech) was immobilized to a GLC chip (Bio-Rad) using standard amine coupling. Normal serum samples were spiked with known concentrations of infliximab (Janssen Biotech) ranging from 0.3 to 20 μg/mL. To generate calibration curves, infliximab-spiked serum was diluted 1:10 in buffer and injected over the immobilized TNFα at 25 μL/min for 2 minutes. The surface was then washed with 10 mmol/L HCl and allowed to reequilibrate before injection of subsequent samples. We found that TNFα-immobilized chips are stable for at least 50 binding/wash cycles. The lower limit of detection for our assay is 0.3 μg/mL, and the functional sensitivity (20% coefficient of variation) is 1 μg/mL. Patient samples (n = 10) have been run using the SPR assay; those that have been tested by our reference laboratory (n = 2) correlate well with the SPR results. Additional patient comparison studies are under way using the iLite Infliximab bioassay (Biomonitor). Overall, these results demonstrate the feasibility of using SPR to measure infliximab levels in patient samples. SPR offers several advantages over other methods currently used to measure infliximab: it requires virtually no sample preparation; it measures the active, free fraction of drug; and it provides a faster turnaround time because the measurements are made in real time. The technique could potentially be used to measure other protein therapeutics in the future.
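A minimal sketch of the calibration step, assuming an approximately linear SPR response over the working range; the response-unit values are hypothetical, and the actual ProteOn analysis may use a different curve model.

```python
import numpy as np

conc = np.array([0.3, 1, 3, 10, 20])      # infliximab spikes (ug/mL)
resp = np.array([12, 40, 118, 405, 790])  # hypothetical SPR response units

slope, intercept = np.polyfit(conc, resp, 1)  # least-squares calibration line

def to_concentration(response_units):
    """Invert the calibration line to report a concentration."""
    return (response_units - intercept) / slope

print(f"{to_concentration(250):.1f} ug/mL")
```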
24 Sirt1 Expression in B-Cell Acute Lymphoblastic Leukemia and Dose-Dependent Effect of Vorinostat
Kristine Konopka, Priya Weerasinghe, Nghia D. Nguyen, Amitava Dasgupta,* and Robert E. Brown. University of Texas Health Science Center at Houston, Houston, TX.
Sirt1 (silent mating type information regulation 2, homolog 1), an NAD+-dependent histone deacetylase, blocks differentiation and inhibits p53-mediated apoptosis. In this study, we evaluated Sirt1 expression in B-cell acute lymphoblastic leukemia (B-ALL) patient samples in the context of gender, age, and presence of chromosome abnormality. Since Sirt1 may function as a therapeutic target, we also assessed the effect of the histone deacetylase inhibitor vorinostat (suberoylanilide hydroxamic acid, SAHA) on the human B-ALL cell line CCRF-SB. With institutional review board approval, we retrieved bone marrow cases of B-ALL from our database, from July 2008 to July 2013, following the 2008 World Health Organization (WHO) Classification. Only initial diagnostic cases with adequate materials were included, excluding postchemotherapy cases. Twelve cases fulfilled the criteria and included seven males and five females, ages 4 to 56 years (six pediatric and six adult); among these patients, six showed abnormal chromosome analyses. A Sirt1 monoclonal antibody was applied to bone marrow aspirate clot sections, and DAB chromogenic signal detection was scored on a scale of 0 to 3+. All cases (12 of 12) had variable nuclear expression of Sirt1 with weak (±) to 3+ staining; the majority of leukemic cells in each case had mild to moderate intensity (1–2+). Sirt1 expression was verified in CCRF-SB cells by immunohistochemistry, which also showed 1 to 2+ nuclear staining. Dose-response and kinetics experiments were then performed on CCRF-SB cells treated with vorinostat. Cell death and proliferation were determined by light microscopy, electron microscopy, and PI/cell death assay, while cell cycle and apoptosis were assessed by flow cytometry. Expression of Sirt1 was determined utilizing immunofluorescence. At 48 hours posttreatment, 1-, 2-, and 5-μM concentrations displayed 32% ± 4.5%, 53% ± 6.4%, and 86% ± 6.2% cell death, respectively. Light and electron microscopy showed cells undergoing apoptosis, while cell cycle analysis revealed a sub-G1 peak but no significant changes in G1 or G2. Immunofluorescence experiments displayed a decrease in nuclear Sirt1 expression at 24 hours with 2 μM vorinostat exposure compared with untreated control cells. Overall, our findings suggest Sirt1 is expressed in B-ALL regardless of gender, age, or genetic status. Furthermore, we found essentially equivalent Sirt1 expression in both favorable (eg, hyperdiploid) and unfavorable (eg, t(9;22)(q34;q11.2)) karyotypes. We conclude that Sirt1 blocks differentiation in B-ALL, and our findings support the use of vorinostat as a potential therapeutic agent.
26 ROTEM Case of the Week: An Online, Case-Based Educational Intervention for the Introduction of a Novel Coagulation Test to Practicing Physicians
K. Krehbiel,1 L. Rosenbaum,1 J. Risan, and J. L. Marinaro.2 Sponsor: Kendall Crookston. Departments of 1Pathology and 2Anesthesiology and Critical Care Medicine, University of New Mexico School of Medicine, Albuquerque, NM.
This year, the rotational thromboelastometry (ROTEM) panel was introduced at our academic, level 1 trauma and tertiary referral hospital as a rapid method to assess a patient's coagulation status and to direct appropriate usage of blood products. As most clinicians have no prior exposure to the ROTEM panel, we sought to develop a didactic and online educational offering to help practicing physicians understand the interpretation and utility of this test. An educational curriculum was developed not only to instruct clinicians who would order and interpret the ROTEM but also to increase both awareness and utilization of this novel laboratory panel. Two separate groups completed the training concurrently: 14 pathology faculty/residents and 23 critical care/anesthesia faculty and residents. The curriculum consisted of a 1-hour ROTEM lecture given by the Transfusion Medicine service, followed by an 8-week online course of short case-based quizzes with detailed answers provided. A pretest was given prior to the lecture, and an identical online posttest was administered at the end of the 8-week course. The effectiveness of the curriculum was assessed by evaluating pretest vs posttest scores, overall participation in the weekly quizzes, participant feedback on the clinical applicability, and satisfaction with the training. Data were analyzed using the Spearman ρ correlation method. Of the 37 participants enrolled in the training, 26 (70%) completed at least one of the weekly quizzes. Fourteen participants (38%) completed all eight of the weekly quizzes; 16 participants (43%) completed the entire course, including the pre- and posttest. Due to this limited sample size, the study did not reach statistical significance for demonstrating a change in pretest vs posttest scores. However, 13 of the 16 participants (81%) who completed the entire course demonstrated improvement, with an average improvement of 2.8 correct answers (including participants who showed no improvement) on the 11-question pre- and posttest. Of the 10 participants who responded to the satisfaction survey, 100% stated they enjoyed the program and were applying the training principles in their clinical practice. Our results indicate that a weekly case-based quiz is an effective educational tool for the introduction of the ROTEM panel.
28 Establishing Peripheral Blood Smear Review Criteria Based on Interpretation of Automated Analyzer Flags in an Outpatient Cancer Care Setting
Awet Tecleab, Elizabeth Southern, John McGovern, and Ellinor I. Peerschke.* Department of Laboratory Medicine, Memorial Sloan Kettering Cancer Center, New York, NY.
Background: Outpatient cancer care requires rapid resulting of blood cell and leukocyte differential counts. Automated analyzer flags are highly sensitive to the presence of immature/abnormal cells, which increases manual review rates and may delay result reporting. Aim: To establish relevant peripheral blood smear review criteria for flagged automated differential results in a patient population treated for solid tumors in an outpatient setting. Method: Automated CBC and differential analysis was performed on the Sysmex XS-1000i analyzer at four separate outpatient sites. More than 500 flagged samples were reviewed by microscopic examination of Wright/Giemsa-stained peripheral blood films. Only the presence of blasts, nucleated RBCs (nRBCs), atypical lymphocytes, and platelet abnormalities was considered a positive finding in this patient population. Left shift/immature granulocyte flags were expected in patients undergoing chemotherapy and were excluded from review when present alone. Results: The distribution of automated analyzer flags occurring alone or in combination was abnormal WBC scattergram (25%), left shift (LS) (29%), nRBCs (26%), atypical lymphocyte (17%), blast (19%), immature granulocytes (IG) (60%), and platelet abnormalities (36%). The true-positive rates for significant flags occurring alone and in combination with other flags were 0% and 29%, respectively, for abnormal WBC scattergram; 25% and 33% for nRBCs; 0% and 1% for atypical lymphocyte; 0% and 8.5% for blast; and 0% and 3.5% for platelet flags. Whereas the nRBC flag showed a high true-positive rate, the remaining flags, present either alone or in combination, failed to reveal unexpected findings on peripheral blood smear review. This included the platelet clump flags, which were remarkable only for occasional platelet clumps or fibrin strands on smear review that did not change the automated platelet count. Based on these findings, we recommend releasing (1) automated platelet counts greater than or equal to 100,000/μL, regardless of platelet flags; (2) automated differential cell counts associated with single flags only (except the nRBC flag); and (3) combinations of LS/IG flags with or without blast and/or atypical lymphocyte flags. Conclusion: These measures reduce the manual review rate by 74%. The data support the importance of appropriate automated CBC and differential flag interpretation for specific patient populations, such as cancer patients, to guide implementation of clinically relevant manual review criteria that enhance patient care, maximize laboratory efficiency, and reduce operational costs.
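The release rules recommended above can be expressed as a simple decision function; the flag names and data structure are illustrative, not the Sysmex or LIS interface.

```python
def requires_smear_review(flags):
    """Apply the proposed criteria to the set of analyzer flags on a differential.
    (Separately, automated platelet counts >= 100,000/uL are released
    regardless of platelet flags.)"""
    flags = set(flags)
    if "nrbc" in flags:
        return True  # nRBC flag always triggers review (high true-positive rate)
    ls_ig_combo = {"left_shift", "immature_gran", "blast", "atyp_lymph"}
    if flags & {"left_shift", "immature_gran"} and flags <= ls_ig_combo:
        return False  # rule 3: LS/IG with or without blast/atypical lymphocyte
    if len(flags) <= 1:
        return False  # rule 2: single flags release (nRBC already excluded)
    return True

print(requires_smear_review({"left_shift", "blast"}))  # False: release
print(requires_smear_review({"nrbc"}))                 # True: review
```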
29 Clinical Validation of a Custom-Designed Microarray Assay for the Diagnosis of Hematologic Malignancies
Jess F. Peterson,1 Nidhi Aggarwal,2 Susanne M. Gollin,1 Urvashi Surti,1 Steven H. Swerdlow,2 Aleksandar Rajkovic,1 and Svetlana A. Yatsenko.1 Sponsor: Alan Wells.2 1Pittsburgh Cytogenetics Laboratory, 2Department of Pathology, University of Pittsburgh Medical Center, Pittsburgh, PA.
Classic cytogenetic analysis (G-banding) plays a critical role in the diagnosis, prognosis, and treatment of hematologic malignancies and allows for the detection of large genomic alterations. However, the necessity for dividing cells, poor chromosome morphology, and low resolution of G-banding are limiting factors for a complete and accurate chromosome study. The introduction of high-resolution arrays, including comparative genomic hybridization (CGH) and single-nucleotide polymorphism (SNP), has enabled the detection of most abnormalities detectable by G-banding, in addition to copy number variations and loss of heterozygosity (LOH). Our group developed and validated a microarray assay for clinical use in the diagnosis of hematologic malignancies. The custom design contains 180,000 oligonucleotide CGH probes or 120,000 CGH + 60,000 SNP probes with an average genomic resolution of 135 to 150 kb, in addition to enhanced 5- to 10-kb resolution spanning 900 genes involved in oncogenesis. A total of 27 specimens from patients with a diagnosed hematologic malignancy (11 AML, four MDS, four B-ALL, three CLL, two CML, two PCM, and one PV) were analyzed with the CGH or CGH + SNP array and compared with G-banding and FISH results. Completely concordant results between G-banding/FISH and microarray analysis were observed in five of 27 cases. In four of 27 cases with balanced translocations as the sole abnormality, microarray analysis showed normal results as expected. In 18 of 27 cases, microarray analysis revealed additional chromosomal aberrations, including heterozygous and homozygous large genomic and submicroscopic imbalances, deletions at the breakpoints of some recurrent translocations, low-level mosaic aneusomies, and LOH. However, four of the 18 cases also had abnormalities detected by G-banding/FISH that were not detected by microarray analysis. Clinically significant findings detected by microarray analysis include deletions of the NF1 gene in two AML patients, MYC amplification in one case each of AML and CLL, homozygous deletions of both CDKN2A and CDKN2B in two cases of B-ALL, PAX5 deletions in two B-ALL patients, and TP53 deletions in one case each of AML and B-ALL. In addition, nine regions of LOH were detected, some involving clinically significant genes such as RB1 and TP53. Our results demonstrate the added benefit of incorporating a microarray into clinical practice to enhance the detection of clinically significant alterations in hematologic malignancies.
30 Development and Validation of a Clinical Sequencing Assay Using RNA-Seq to Direct Treatment of Relapsed Pediatric Cancers
Marisa Needham,1 Colby Chiang,2 Mani Mahadevan,1 Larry Silverman,1 Robin Legallo,1 and Ira Hall.2 Sponsor: David Bruns.*1 Departments of 1Pathology and 2Biochemistry and Molecular Genetics, School of Medicine, University of Virginia, Charlottesville, VA.
Background: Cancer continues to be the leading cause of death in children in the United States. The aim of this study was to develop a clinically validated RNA-sequencing (RNA-Seq) assay of formalin-fixed, paraffin-embedded (FFPE) tumor tissue to detect actionable mutations in pediatric cancer relapse patients who have a less than 20% chance of event-free survival. Methods: We studied FFPE tissue from 12 tumors previously identified to have gene amplification and/or overexpression by clinically validated methods. Purified RNA was quantified and evaluated by use of a NanoDrop spectrophotometer (2–3,000 ng/μL RNA), the Invitrogen Quant-iT Qubit RNA Broad-Range Assay Kit (1 ng/μL–1 μg/μL), and the Agilent Bioanalyzer RNA Pico 6000 Kit (AMR, 0.05–5 ng/μL). RNA Integrity Numbers ranged from 2 to 4.3; 28S/18S values were 0 for the majority of samples. All purified samples were sonicated to ensure sufficient fragmentation of RNA. Samples were sequenced on the Illumina MiSeq platform, and manual analysis of overexpression was performed for the first four samples. Within-sample normalization of genes of interest was accomplished by selecting nine housekeeping genes expressed in all samples. When matched normal tissue was unavailable, data from The Cancer Genome Atlas (TCGA) database were used for comparison with our results. The bioinformatics pipeline in development will define approximately 100 suitable housekeeping genes for normalization. Results: At abstract submission, four of 12 specimens had been analyzed. They contained duplications of MDM2, HER2 (ERBB2), or MYC or an EGFR mutation. The four samples were multiplexed for sequencing on one MiSeq flow cell lane. Overexpression of MDM2 (177-fold), HER2 (20-fold), and EGFR (7-fold) was detected by manual analysis. Coverage was insufficient to produce reads in exons 19 to 21 of the EGFR gene to detect mutations. Therefore, only one specimen per flow cell was sequenced in subsequent runs. At the time of abstract submission, the TCGA database had no normal control data of the appropriate tissue type for the MYC sample analysis; the MYC data will be analyzed with the bioinformatics pipeline. Conclusions: We describe a proof of principle for a clinically validated whole-transcriptome RNA-Seq assay to detect overexpression of clinically relevant genes in cancer patients.
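A minimal sketch of the within-sample normalization and fold-change calculation described above, using three stand-in housekeeping genes (the validated pipeline will use roughly 100) and hypothetical read counts.

```python
HOUSEKEEPING = ["ACTB", "GAPDH", "TBP"]  # stand-ins for the full housekeeping set

def normalized(counts, gene):
    """Express a gene's reads relative to the mean housekeeping signal."""
    hk_mean = sum(counts[h] for h in HOUSEKEEPING) / len(HOUSEKEEPING)
    return counts[gene] / hk_mean

# Hypothetical read counts for a tumor and a normal reference profile.
tumor = {"MDM2": 53_100, "ACTB": 900, "GAPDH": 1_100, "TBP": 1_000}
ref   = {"MDM2": 300,    "ACTB": 950, "GAPDH": 1_050, "TBP": 1_000}

fold = normalized(tumor, "MDM2") / normalized(ref, "MDM2")
print(f"MDM2 fold change: {fold:.0f}x")  # ~177x, echoing the MDM2 example
```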
31 LIS-Facilitated Active Surveillance Detects Immune Refractory Patients Earlier Than Passive Surveillance
Erik G. Jenson, Ann Burbank, and Nancy Dunbar.* Department of Pathology, Dartmouth-Hitchcock Medical Center, Lebanon, NH.
Background: Platelet transfusion (PltTxn) recipients who repeatedly fail to achieve an adequate corrected count increment (CCI) should be investigated for HLA antibodies, as response to transfusion may be improved by providing HLA- or crossmatch-compatible PltTxns. At our institution, identification of these patients is achieved through passive surveillance (PS), which relies on physician recognition to prompt further investigation and HLA antibody testing (HLA-ABT). Theoretically, the laboratory information system (LIS) can be configured to achieve active surveillance (AS) by automatically calculating a CCI following every PltTxn, flagging patients who may benefit from HLA-ABT. Objective: Determine whether an investment to enable AS is warranted. Methods: This retrospective study compares the historic performance of PS with a hypothetical AS in identifying patients with HLA antibodies. Our population consisted of all patients with positive HLA-ABT who were subsequently supported with crossmatch- or HLA-compatible PltTxns from 2010 to 2013. To simulate AS, a CCI was calculated for PltTxns preceding HLA-ABT. Inadequate response was defined as a CCI <7.5 calculated from post-PltTxn laboratory values drawn within 5 hours of platelet dispense. The earliest active surveillance detection date (EASDD) was defined as the date when two consecutive inadequate responses were observed. The EASDD was then compared with the patient PltTxn history to determine how many additional PltTxns beyond the EASDD each patient received prior to HLA-ABT. To investigate the possibility that AS may result in increased HLA-ABT, we also applied the AS algorithm retrospectively to every inpatient on the hematology-oncology ward who received PltTxns during October 2013. Results: Eight immune-refractory patients were diagnosed via PS from 2010 to 2013. All of these patients would have been investigated for potential HLA antibodies earlier using the AS algorithm. These patients received an average of five PltTxns (range, 1–12; median, 4) between the EASDD and a positive HLA-ABT. Application of the AS algorithm to inpatients identified two additional patients for further investigation during the study period. Both of these patients were negative for HLA antibodies following HLA-ABT. Conclusion: AS identifies immune-refractory patients earlier than PS, potentially sparing these patients additional incompatible platelet transfusions through earlier provision of compatible units. Active surveillance may, however, result in increased HLA antibody testing in patients with other nonimmune explanations for platelet refractoriness.
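A minimal sketch of the LIS calculation proposed above, using the standard corrected count increment with platelet counts in ×10⁹/L (consistent with the <7.5 threshold) and the two-consecutive-failures rule; the patient values are hypothetical.

```python
def cci(pre, post, bsa_m2, dose_e11):
    """CCI = (post - pre platelet count, x10^9/L) x BSA (m2) / dose (x10^11 platelets)."""
    return (post - pre) * bsa_m2 / dose_e11

def flag_for_hla_testing(ccis, threshold=7.5):
    """Flag once two consecutive transfusions show an inadequate response (the EASDD rule)."""
    return any(a < threshold and b < threshold for a, b in zip(ccis, ccis[1:]))

# Hypothetical patient: two transfusions of 3 x 10^11 platelets, BSA 1.8 m2.
increments = [cci(11, 17, 1.8, 3.0), cci(9, 14, 1.8, 3.0)]
print([round(c, 1) for c in increments], flag_for_hla_testing(increments))
# [3.6, 3.0] True -> candidate for HLA antibody testing
```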
32 Using Single-Nucleotide Variants for Ancestry Determination in a Cancer Predisposition Next-Generation Sequencing Assay
Patrick C. Mathias, Noah G. Hoffman,* Colin C. Pritchard,* and Brian H. Shirts.* Department of Laboratory Medicine, University of Washington, Seattle, WA.
During the review of genetic laboratory data, determination of a patient’s ancestry can help to provide appropriate interpretations for variants observed only in select populations, as well as to provide an additional quality control metric for identifying possible sample swaps. To address this need, we demonstrate the use of open source software to predict geographical regions of ancestry for patient samples in a germline next-generation sequencing assay, which sequences exons and intronic sequences in 50 genes associated with hereditary cancer predisposition. We apply a software program called Locating Ancestry from SEquence Reads (LASER), which uses principal component analysis (PCA) to capture the variation in sample single-nucleotide variants relative to reference data from the Human Genome Diversity Project. This method is suited to targeted sequencing assays because it restricts the coverage of the references to correspond to the targeted regions of the genome. The results of this analysis are used to build two-dimensional ancestry maps against a reference PCA space, which represents data from over 900 reference individuals covering eight broad geographical regions but does not include groups of mixed race. Additionally, we apply a k-nearest neighbors (KNN) algorithm to the principal components to classify each sample and provide discrete results for ancestry prediction. We used these approaches to analyze 257 patient samples with the following self-reported ancestry distribution: 75% non-Ashkenazi European, 7% Ashkenazi, 3.5% Latin American, 3% Asian, 2% African, 1.5% Native American, 1% Middle Eastern, and 7% mixed ancestry. The ancestries predicted by KNN were 100% concordant with self-report for the non-Ashkenazi European, Asian, and Middle Eastern samples without mixed background. Fifteen of 17 Ashkenazi samples were classified as European, and the other two were classified as Middle Eastern. Because the reference data set does not include individuals with mixed race, individuals with a mixed background are predicted by KNN to be one of the parent ancestries or an ancestry group lying between the parent groups in PCA space. In all 257 samples, the sample position on the graph of the first two principal components relative to the reference individuals was consistent with the patient’s self-reported ancestry. This work demonstrates that additional information from next-generation sequencing data may be applied for quality control in addition to more personalized interpretation of the variants detected by these techniques.
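A minimal sketch of the projection-plus-classification step using scikit-learn; LASER's actual procedure (projecting each sample into a reference PCA space built from Human Genome Diversity Project genotypes) is more involved, and the genotypes and region labels below are toy values.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(0)
# Toy reference panel: 120 individuals x 500 SNVs coded 0/1/2 alternate alleles.
ref_genotypes = rng.integers(0, 3, size=(120, 500)).astype(float)
ref_labels = np.repeat(["Europe", "Asia", "Africa"], 40)

pca = PCA(n_components=2).fit(ref_genotypes)          # reference PCA space
knn = KNeighborsClassifier(n_neighbors=10).fit(
    pca.transform(ref_genotypes), ref_labels)         # discrete region calls

sample = rng.integers(0, 3, size=(1, 500)).astype(float)  # one patient sample
print(knn.predict(pca.transform(sample))[0])
```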
33 Batched Sample Extraction Improves Precision and Turnaround Times for the Measurement of Tacrolimus by Immunoassay
Michael O. Alberti, Alan M. Fukuchi, Anthony W. Butch, and Kathleen A. Kelly.* Department of Pathology & Laboratory Medicine, University of California Los Angeles, Los Angeles, CA.
Current methods for measuring immunosuppressants include manual (or “off-line”) extraction of individual samples and automated (or “online”) extraction methods. Both of these methods pose challenges: off-line extraction of hundreds of individual samples per day results in hand fatigue and increased turnaround times (TAT), whereas online extraction methods compromise sensitivity and precision. We therefore developed a batched off-line extraction method to be used in conjunction with the Abbott ARCHITECT platform for measurement of tacrolimus concentrations in whole blood and compared the analytical performance of this method with off-line extraction of individual samples. Pretreatment (protein precipitation and drug extraction) of individual EDTA whole-blood specimens is required prior to using the ARCHITECT tacrolimus immunoassay. We modified this pretreatment step with a centrifuge and rack system that could accommodate 20 to 40 samples at one time. Total assay imprecision (%CV), as determined from patient sample pools run in duplicate for 5 days, was 3.8% (mean tacrolimus concentration = 15 ng/mL) and 4.6% (mean = 4.2 ng/mL) for individual off-line extraction and 3.4% and 3.8%, respectively, for batched extraction. Functional sensitivity (20% CV), as determined from duplicates of patient sample pools (tacrolimus concentrations ranging from 0.2 to 4.0 ng/mL), was similar between the two methods (0.5 ng/mL). Method comparison studies, using samples from renal transplant patients treated with tacrolimus (n = 117), demonstrated that the two methods were highly correlated over a range of concentrations (BATCHED = 0.992 [INDIVIDUAL] + 0.13; r = 0.99; average bias = 0.6). Timed analysis (at 15 ng/mL) of off-line individual vs batched extraction for 20 or 40 samples showed a time savings of approximately 13% to 14% overall, as well as an 11-minute reduction in TAT, for batched extraction. In summary, batched extraction of samples for tacrolimus measurement by immunoassay on the ARCHITECT platform had similar functional sensitivity, correlation, and bias compared with individual off-line extraction of samples. Additionally, batched extraction demonstrated improved precision at low tacrolimus concentrations and resulted in less hand fatigue and a modest time savings compared with individual off-line extraction of samples, thus improving patient care.
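A minimal sketch of the two summary statistics reported above, total imprecision as %CV and a least-squares method comparison, computed over hypothetical duplicate results.

```python
import numpy as np

# Hypothetical pool results (ng/mL) run in duplicate over 5 days.
pool = np.array([15.2, 14.7, 15.5, 14.9, 15.1, 15.6, 14.6, 15.0, 15.3, 14.8])
cv = pool.std(ddof=1) / pool.mean() * 100
print(f"total imprecision: {cv:.1f}% CV")

# Hypothetical paired patient results for the method comparison.
individual = np.array([2.1, 4.8, 7.5, 10.2, 14.9, 19.8])
batched    = np.array([2.2, 4.9, 7.6, 10.3, 15.0, 19.9])
slope, intercept = np.polyfit(individual, batched, 1)
print(f"BATCHED = {slope:.3f} x INDIVIDUAL + {intercept:.2f}")
```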
35 Paraneoplastic Antibody Panel Ordering Patterns and Clinical Management: Utilization Review at an Academic Medical Center
Chad Vanderbilt,1 Teerin Liewluck,2 Tina Cline,1 and Philip J. Boyer.1 Sponsor: Mary P. Berg. Departments of 1Pathology and 2Neurology, University of Colorado School of Medicine, Aurora, CO.
Suspecting that a large proportion of paraneoplastic antibody panel (PAP) test results were negative, and with cost-control initiatives in place, we undertook a utilization review to assess ordering patterns and clinical management of positive results. A query of our laboratory result database identified 292 PAP test orders during a 13-month period in 2012–2013. We systematically reviewed clinical history and indications for testing, ordering service, clinician response to results, clinical outcome, and display of results in our electronic medical record (EMR). A PubMed search for reports of PAP utilization review and best practices was undertaken. At least one antibody was identified in 48 of 292 tests (16%), representing 46 patients. Serum evaluation was much less likely to be positive (1/25, 4%) if not ordered directly or indirectly by a neurologist. Cerebrospinal fluid testing was rarely positive (1/44, 2%). In 20% of cases, positive results were either (1) interpreted as negative or (2) not acknowledged as positive by the ordering physician in the EMR. Only 48% of positive results led to a change in management, most commonly immunotherapy and/or a search for occult malignancy. There was a strong association between a history of an autoimmune disease and a positive result (67% of positive tests). In three patients with a positive test, a remote history of malignancy (two breast, one bladder) was known; one patient had a concurrent malignancy (thyroid); and two malignancies were diagnosed 8 months (small cell lung carcinoma) and 10 months (anaplastic astrocytoma) after the positive result despite initially negative malignancy screening. Review of reports in our EMR revealed that (1) positive results were reported as “See Report” rather than “Positive” in the synoptic result review section, and (2) discerning results was difficult due to report complexity and lack of segregation of positive results at the top of the report. No peer-reviewed reports directly addressing PAP utilization or best practices were identified. While only 16% of tests yielded a positive result, the PAP was much less likely to be positive when ordered by a nonneurologist, suggesting a role for a “gatekeeper.” Long-term surveillance of patients with positive PAP results may be necessary to identify occult malignancy. The misinterpretation or lack of acknowledgment of 20% of positive tests is worrisome; investigation is ongoing to discern the root cause(s). Reporting in the EMR synoptic result review section was changed to read “Positive” rather than “See Report,” and reports now list positive results at the top; postintervention analysis is ongoing.
36 Electronic Health Record Alert to Providers Identifies Excessive Ordering of Cardiac Troponin Testing
Sara A. Love,1 Zeke J. McKinney,2 Yader Sandoval,2 Stephen W. Smith,2 Rebecca Kohler,2 and Fred S. Apple.1* Departments of 1Laboratory Medicine and Pathology and 2Medicine, Hennepin County Medical Center, Minneapolis, MN.
A new, hospital-wide cardiac troponin I (cTnI) order set was implemented in clinical practice, based on serial cTnI testing (0, 3, 6, 9 h) for the diagnosis of myocardial infarction (MI). During 2 months in the fall of 2013, any order set placed after this initial one triggered an electronic health record (EHR) alert. Providers were not limited in the number of order sets used to rule in/out MI. The purpose of this study was to investigate cTnI order utilization under this new system. All EHR alerts initiated patient inclusion, with subsequent EHR review and cTnI tabulation. Data included demographics, duration of stay, location, ICD-9 diagnoses, and cTnI orders and results. Normal or increased cTnI results based on the 99th percentile (0.025 μg/L; Abbott Architect) were evaluated relative to the alert firing and for the number of cTnI results prior to or following the alert. In total, 1,477 alerts were generated in the EHR as clinicians requested additional cTnI orders beyond an initial serial order set. The 1,477 alerts were associated with 833 encounters for 702 patients and 3,045 cTnI results. Ninety-three percent of all encounters that triggered an alert were determined to be unrelated to acute coronary syndrome (ACS) per ICD-9 codes. There was an average of 2.1 alerts per encounter (range, 0–12) and 1.2 encounters per patient (range, 1–6). There was an average of 1.8 cTnI results available prior to an alert and 2.5 cTnI results following an alert. In theory, if all of the additional order sets had been completed, approximately 7,000 additional cTnI tests would have been reported. Of the 1,477 alerts, 347 (23%) fired before a single cTnI result was available, in 293 encounters. Across the 833 encounters, an average of 3.6 cTnI results were completed per encounter (range, 0–21). The first cTnI value was increased in 238 encounters (30%), increasing to 359 (43%) over time. Of the 238 encounters with initially increased cTnI values, 205 (86%) remained increased for the duration of measured cTnI results. Among these chronically increased encounters, between 0 and 17 additional cTnI results were reported after the first alert of the encounter, with 77% having one to four additional results after the alert was initially triggered. We conclude that (a) providers order cTnI excessively to rule in/out MI, (b) excessive orders are predominantly in non-ACS patients, and (c) providers do not follow recommended MI guidelines for cTnI ordering. Improvements in education and quality monitoring could provide an avenue to more appropriate utilization of cTn testing, at substantial financial savings.
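[Editor's sketch] The alert rule described above reduces to a simple count: every cTnI order set after the first in an encounter fires one alert. The function and data layout below are hypothetical illustrations of that rule, not the production EHR logic.

```python
# Hypothetical sketch: one alert per cTnI order set beyond the first
# in an encounter (not the production EHR implementation).

def count_alerts(num_order_sets):
    """Alerts fired for one encounter under the rule described."""
    return max(num_order_sets - 1, 0)

# An encounter with three order sets fires two alerts.
assert count_alerts(3) == 2
# Summing over encounters gives the study's alert tally.
print(sum(count_alerts(n) for n in [1, 3, 2, 5]))  # 7 alerts across 4 encounters
```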
37 Physicians’ Challenges and Potential Solutions in Clinical Laboratory Test Ordering and Result Interpretation
Julie Taylor,1 Pamela Thompson,1 and John Hickner.2 Sponsor: Brian Smith.3 1Division of Laboratory Programs, Standards, and Services, CDC, Atlanta, GA; 2Department of Family Medicine, University of Illinois Chicago, Chicago, IL; and 3Yale School of Medicine, New Haven, CT.
Objective: The Centers for Disease Control and Prevention (CDC) sought to identify challenges primary care physicians face in diagnostic laboratory test ordering and result interpretation, as well as solutions that are useful and currently available to them. Methods: General internal medicine and family medicine physicians were surveyed about their uncertainty when ordering and interpreting laboratory tests, tactics used to overcome uncertainty, challenges in ordering tests and interpreting test results, perceived solutions to these challenges, and the helpfulness of consultation with the clinical laboratory (OMB 0920-0893). Results: In total, 1,768 physicians responded to the survey. Physicians reported ordering diagnostic laboratory tests in an average of 31.4% of patient encounters per week. They reported uncertainty about ordering tests in 14.7% and uncertainty in interpreting results in 8.3% of these diagnostic encounters. The tactics physicians found most helpful for overcoming uncertainty were review of e-references, referral to specialists, and curbside consultation. When physicians were uncertain about test interpretation, they frequently contacted patients for follow-up visits to review their medical histories. The most common and problematic challenges in test ordering were cost to the patients and restrictions on insurance coverage. Other challenges included different names for the same test, different tests included in panels with the same names, and tests only available within a panel. The most common and problematic challenges in interpreting and using test results were problems receiving test results and confusing report formats. Most physicians endorsed a variety of information technology solutions to these challenges, but the solutions were not widely available. Conclusions: With more than 500 million patient visits to primary care physicians per year, the rate of test ordering reported in this study (31.4%, or 157 million visits with test ordering) and the levels of uncertainty reported could affect test ordering for 23 million patients (14.7% of these visits) and test result interpretation for 13 million patients (8.3% of these visits) per year. Improvement in information technology and clinical decision support systems, clear report formats, and access to laboratory consultations may reduce uncertainty and improve laboratory test utilization.
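[Editor's sketch] The projections in the conclusion follow from simple multiplication of the figures reported in the abstract; a quick check is below.

```python
# Verify the visit projections reported in the conclusion
# (inputs taken directly from the abstract).
visits = 500e6                  # annual US primary care visits
ordering_rate = 0.314           # encounters with test ordering
uncertain_ordering = 0.147      # uncertainty when ordering
uncertain_interpreting = 0.083  # uncertainty when interpreting

visits_with_tests = visits * ordering_rate
print(f"{visits_with_tests / 1e6:.0f} million visits with test ordering")  # 157
print(f"{visits_with_tests * uncertain_ordering / 1e6:.0f} million")       # 23
print(f"{visits_with_tests * uncertain_interpreting / 1e6:.0f} million")   # 13
```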
38 Comparison of Multiple Respiratory Viral Infections in Adults and Children
Diana K. Sung and Fann Wu. Sponsor: Richard O. Francis. Department of Pathology & Cell Biology, Columbia University, New York, NY.
A benefit of multiplex PCR testing of respiratory samples is the detection of coinfections with two or more viruses. The clinical significance of multiple respiratory viral infections, however, is not well known. The objective of this study was to investigate the incidence and clinical significance of triple-positive (TP) respiratory viral infections and to compare them in adults and children. Data from respiratory samples tested by multiplex PCR (BioFire Diagnostics, Salt Lake City, UT) at a tertiary care hospital from November 2012 through October 2013 were analyzed for coinfection. A total of 4,915 patients, including 2,514 pediatric and 2,401 adult, tested positive for at least one virus. Thirteen pediatric (0.52%) and six adult (0.25%) patients tested positive for coinfection with three viruses. Of the TP pediatric patients, 92.3% (n = 12) were under the age of 3 years vs 60.9% (n = 1,532) of pediatric patients who tested positive for at least one virus. TP adult patient ages ranged from 23 to 84 years. The most frequent virus detected in TP pediatric patients was human rhinovirus/enterovirus at 69.2% (n = 9), followed by respiratory syncytial virus (n = 6) and adenovirus (n = 5). In contrast, coronavirus OC43 was most frequently detected in adult TP patients at 66.7% (n = 4), followed by influenza A H3 (n = 3) and human rhinovirus/enterovirus (n = 3). Ten of 13 (76.9%) TP pediatric patients presented to the emergency department (ED), two to the outpatient clinic, and one as an inpatient. The majority that presented to the ED (8/10) were hospitalized; however, all who were hospitalized were discharged after 1 to 3 days of expectant management. None of the TP pediatric patients were known to have any medical conditions causing immunosuppression, whereas the majority of TP adult patients (5/6) were immunocompromised, three being lung transplant recipients and two having stage III chronic lymphocytic leukemia. Four of six TP adult patients were tested as inpatients and two as outpatients. The majority (5/6) had a more severe clinical course than the pediatric group, including septic shock and respiratory failure. Our results indicate that TP respiratory viral infections result in hospitalizations of previously healthy infants and toddlers. Triple infections in adults, however, are more likely associated with immunocompromised states as well as complicated and prolonged hospital stays. Further studies are warranted to investigate the impact of multiple respiratory viral infections.
39 Real-Time Surveillance of Epidemiological Trends in Drug Abuse and Overdose Using Mass Spectrometry–Based Urine Toxicology
Joshua A. Hayden and Geoffrey S. Baird.* University of Washington, Seattle, WA.
Drug overdoses and related emergency room (ER) visits represent a substantial problem, with prescription drugs accounting for a greater proportion than illicit drugs. The goal of this work was to determine whether mass spectrometry–based urine toxicology (UT) could provide real-time surveillance of drug abuse. We utilized two separate UT assays: a liquid chromatography tandem mass spectrometry (LC-MS/MS) assay and a gas chromatography mass spectrometry (GC-MS) assay. The LC-MS/MS assay was utilized as part of our institution’s chronic pain screening, while the GC-MS assay was used as the comprehensive UT in our ER. Data on drug-related deaths were obtained from the county medical examiner. We found that the prevalence of prescribed opioids in our chronic pain patients correlated with the prevalence of abused drugs in this patient population (ρ = 0.71), with prescription opioids being five times more commonly abused than illicit drugs (cocaine, heroin, and methamphetamine). Oxycodone, methadone, and hydrocodone were the main drugs prescribed and abused; these were also the main opioids detected in overdosed patients in our ER. The prevalence of drugs detected in overdosed patients mirrored the prevalence of drugs causing overdose deaths, with eight of the top 10 most common drugs occurring in both groups. These drugs included citalopram and diphenhydramine, which were detectable only because of the use of mass spectrometry–based UT. In addition, mass spectrometry allowed us to detect and obtain prevalence estimates for levamisole, a clinically relevant cocaine adulterant. The estimated prevalence (70%; 95% confidence interval, 64%–76%) was statistically indistinguishable from estimates by the Drug Enforcement Administration (70.3%). The demographics (age, sex) and prevalence of alcohol (27.1%) in overdosed patients in our ER were statistically indistinguishable from those of patients dying of overdoses (alcohol prevalence, 26%). This work indicates that mass spectrometry–based UT can provide real-time surveillance of drug abuse, allowing identification of at-risk patient demographics and patterns of drug abuse, including drugs not detected by standard immunoassays.
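[Editor's sketch] The prevalence correlation reported above (ρ = 0.71) is a Spearman rank correlation between prescription prevalence and abuse prevalence; a minimal sketch of that computation is below, with illustrative numbers rather than the study data.

```python
# Spearman rank correlation between prescribing and abuse prevalence.
# Drug names and prevalences are illustrative, not the study data.
from scipy.stats import spearmanr

prescribed = {"oxycodone": 0.32, "methadone": 0.21, "hydrocodone": 0.18,
              "morphine": 0.08, "fentanyl": 0.05}
detected = {"oxycodone": 0.28, "methadone": 0.19, "hydrocodone": 0.15,
            "morphine": 0.16, "fentanyl": 0.04}

# Dicts share insertion order, so values pair up drug by drug.
rho, p = spearmanr(list(prescribed.values()), list(detected.values()))
print(f"Spearman rho = {rho:.2f} (p = {p:.3f})")
```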
40 A General Method for Validating Assays on Variable Body Fluids
Matthew Schrage, Geoffrey Baird,* Andrew Hoofnagle,* and Petrie M. Rainey.* Department of Laboratory Medicine, University of Washington, Seattle, WA.
Validation of clinical assays in body fluids is problematic in multiple ways. Assay manufacturers frequently do not include testing of body fluids among their claims, requiring individual laboratories to validate body fluid testing as laboratory-developed tests. We developed a generalizable method for validating multiple analytes by demonstrating matrix commutability in model effusions of varying composition. We made a model transudate by ultrafiltration of a patient serum pool using the Millipore Amicon Ultra 30-kD centrifugal filtration system. The ultrafiltrate was then mixed back with pooled serum in varying proportions to create a range of matrices, specifically mixtures of 0%, 25%, 50%, 75%, and 100% transudate in serum. Spike-and-recovery assays were performed in triplicate across these matrices to determine whether they were commutable. When possible, spiking solutions spanning the analytical measurement range were used to determine linearity. We verified this approach as part of introducing a new chemistry analyzer, the Beckman AU 680. Matrix effects were assessed using the slopes of analyte recovery vs matrix composition (expected to be zero in the absence of a matrix effect). The slopes were not statistically different from zero for bicarbonate, creatinine, total bilirubin, total protein, albumin, lactate dehydrogenase, amylase, lipase, and triglycerides. The 95% confidence intervals for the slopes were not clinically significant for sodium (0.017 to 0.054), potassium (0.015 to 0.051), chloride (0.016 to 0.048), magnesium (0.036 to 0.079), phosphate (0.019 to 0.051), serum urea nitrogen (0.0068 to 0.035), and glucose (0.020 to 0.046). Matrix commutability could not be demonstrated for the cholesterol assay (95% confidence interval for the slope, 0.11 to 0.22). This novel approach should serve as an excellent model for validating a variety of assays for matrix commutability.
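[Editor's sketch] The slope test described here is ordinary linear regression of recovery against matrix fraction; a minimal sketch is below, with illustrative numbers rather than the study data.

```python
# Regress spike recovery against transudate fraction; a slope whose 95% CI
# includes zero (or spans only a clinically trivial range) supports
# matrix commutability. Values are illustrative, not the study data.
import numpy as np
from scipy import stats

matrix_fraction = np.array([0.00, 0.25, 0.50, 0.75, 1.00])  # fraction transudate
recovery = np.array([1.01, 0.99, 1.00, 1.02, 0.98])         # fractional recovery

fit = stats.linregress(matrix_fraction, recovery)
t_crit = stats.t.ppf(0.975, df=len(matrix_fraction) - 2)
lo, hi = fit.slope - t_crit * fit.stderr, fit.slope + t_crit * fit.stderr
print(f"slope = {fit.slope:.3f} (95% CI, {lo:.3f} to {hi:.3f})")
```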
41 The Microbiology Diagnostic Management Team: A Novel Initiative to Promote Interpretive and Individualized Laboratory Medicine
Jonathan E. Schmitz, Charles W. Stratton, Carol A. Rauch, and James Chappell.* Department of Pathology, Microbiology, and Immunology, Vanderbilt University Medical Center, Nashville, TN.
A novel initiative in the practice of clinical microbiology—termed the Microbiology Diagnostic Management Team, or Microbiology DMT—was launched in 2010 and has evolved over the past 3 years in the Department of Pathology, Microbiology, and Immunology at Vanderbilt University Medical Center. The Microbiology DMT involves daily coordinated efforts of the microbiology faculty, fellow, and pathology residents and strives to realize the following goals: (1) create a formal venue for expert interpretation of challenging laboratory findings, (2) proactively identify cases benefiting from further intervention through monitoring of sentinel results, and (3) optimize the laboratory response to intra- and extra-departmental consultative requests. The scope of the Microbiology DMT encompasses bacteriology, virology, mycology, and molecular diagnostics, along with certain aspects of parasitology, infectious disease serology, and microbiological reference testing. The members of the Microbiology DMT meet daily (M-F) to review emerging cases, monitor developments in ongoing cases, and formulate actions on a patient-by-patient basis. Regular interventions include submitting interpretive commentary in the electronic medical record, advising managing clinicians on appropriate follow-on testing, and facilitating involvement (as indicated) by other specialty services (in particular, Infectious Diseases, Clinical Pharmacy, and Infection Control and Prevention). In this presentation, we outline the general workflow of the Microbiology DMT and describe the added value of a DMT-centered microbiology service to patient care. We also discuss the collateral impact of the DMT on other aspects of the clinical microbiology laboratory, such as administrative/management issues, house staff education, antimicrobial stewardship, and resource utilization. The clinical laboratory is under increasing pressure to make more direct and definable contributions to positive patient outcomes, and we believe that the Microbiology DMT is an effective mechanism for clinical microbiologists to maximize their impact as health care practitioners.
42 Systemic Lupus Erythematosus Complicated by Macrophage Activation Syndrome, Portal Vein Thrombosis, and Ischemic Colitis: An Autopsy Case Report
Yihong Ma, Vishnu Reddy,* and Stephanie Reilly. Department of Pathology, University of Alabama at Birmingham, Birmingham, AL.
Macrophage activation syndrome (MAS) is a severe, potentially life-threatening condition. It has been described in association with systemic lupus erythematosus (SLE), Kawasaki disease, and systemic juvenile idiopathic arthritis (sJIA). The hallmark clinical and laboratory features include high fever, hepatosplenomegaly, lymphadenopathy, pancytopenia, liver dysfunction, disseminated intravascular coagulation (DIC), hypofibrinogenemia, hyperferritinemia, and hypertriglyceridemia. We report a 60-year-old white woman with SLE and Sjögren syndrome who was admitted with sepsis from acute colitis. Although initially stable, she declined despite therapy, raising concern for MAS. Her ferritin levels were significantly increased (1,776–8,944 ng/mL), along with positive DIC laboratory results (d-dimer 5,000–20,000 ng/mL, fibrinogen 100 mg/dL, platelets 50 × 10³/cmm, INR 7.8) and a high white cell count (40–50 × 10³/cmm). She died on hospital day 8. Postmortem examination confirmed acute colitis involving a 60-cm segment and revealed acute portal vein thrombosis (PVT) with extensive associated hepatic necrosis. Bone marrow was hypercellular with hemophagocytic activity; CD163 staining demonstrated increased numbers of macrophages with hemophagocytosis. The immediate cause of death was attributed to MAS associated with PVT and acute colitis; the underlying cause of death was SLE. Associations have been reported between SLE and MAS, and between SLE and thrombosis. The incidence of MAS in sJIA is 7% to 13%. In contrast, MAS associated with SLE is rare, with an incidence of 0.9% to 4.6%. Cytotoxic dysfunction in SLE may lead to persistent, excessive expansion and activation of T cells and macrophages. Arterial and/or venous thrombosis is a well-known clinical entity in SLE, with a prevalence >10%. Thrombosis is attributed to a number of thrombophilic defects, including lupus anticoagulant and anticardiolipin antibodies. PVT in SLE patients after abdominal surgery has been reported, but no statistical data have been found for non–surgery-related SLE cases. To our knowledge, this is the first fatal case of MAS and PVT in an SLE patient. It highlights the importance of considering rare complications and the need for early recognition and immediate therapeutic intervention.
44 Retrospective Evaluation of Multi-Reviewer Sign-Out for a Next-Generation Sequencing Assay Targeting Somatic Mutations in Cancer
Eric Q. Konnick, David Wu, Brian H. Shirts, Jonathan F. Tait, and Colin C. Pritchard.* Department of Laboratory Medicine, University of Washington, Seattle, WA.
Introduction: Interpretation of next-generation sequencing (NGS) assays performed on clinical tumor tissue is challenging due to the complicated nature of neoplastic tissue and the bioinformatics pipelines necessary to analyze and annotate the raw sequence data. Broad inclusion of numerous genes on test panels necessitates extensive reviewer understanding of the relevant pathways and signaling cascades. Somatic mutation analysis in neoplasia is further complicated by high mutation rates, tumor heterogeneity, and admixture of normal tissue, requiring analysts to be cautious in their interpretation of variants. The goal of this project was to evaluate the absolute and comparative performance of resident and medical director analysis of multigene tumor panel output. Methods: We evaluated the performance of a multireviewer model for identification and reporting of significant findings of an NGS panel for cancer and other neoplasms in 101 sequential patients. The use of target capture combined with NGS on Illumina HiSeq or MiSeq instruments allows us to detect mutations, copy number changes, and structural variants relevant to cancer treatment, prognosis, and diagnosis. Results were independently analyzed by two to four reviewers, including at least one senior pathology resident and one or more physician-level laboratory medicine faculty. Results deemed significant by each reviewer were compiled and discussed in a consensus (sign-out) session. We compared the variants identified by residents and medical directors and assessed the rates of agreement between reviewers and inclusion in the final patient reports. Results: In 101 patients, the bioinformatics pipeline identified 387,826 variants, which were reduced to 148,582 by removing intronic, noncoding, and synonymous substitutions. Of the remaining variants, 867 were brought to consensus conference for possible inclusion in the final patient report, with the number of variants per sample ranging from 1 to 51. In total, 472 variants (55%) were reported across 96 cases, with no variants reported for five patients (5%). Of the reported variants, 90 (19%) were identified by only one reviewer, with the majority of these (58, 66%) identified by the resident reviewer. Results that were discussed at consensus conference but not reported had been selected by only a single reviewer in 51% of cases. Conclusions: The use of multiple independent reviewers, including residents, to evaluate somatic mutations identified in neoplastic tissue by a multigene NGS panel increases the yield of reported results and may limit reporting of incidental findings through discussion and a more standardized approach to reporting.
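[Editor's sketch] The first triage step described in the results (387,826 raw variants reduced to 148,582) is an annotation-based filter; a schematic sketch is below. The column names and consequence labels are our assumptions, not the laboratory's actual pipeline schema.

```python
# Schematic first-pass filter: drop intronic, noncoding, and synonymous
# calls before manual review (illustrative schema, not the real pipeline).
import pandas as pd

variants = pd.DataFrame({
    "gene": ["TP53", "KRAS", "BRAF"],
    "consequence": ["missense", "synonymous", "intronic"],
})

EXCLUDED = {"intronic", "noncoding", "synonymous"}
reviewable = variants[~variants["consequence"].isin(EXCLUDED)]
print(reviewable)  # only the TP53 missense call survives triage
```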
45 A Novel Method for Monitoring Therapy and Guiding Treatment Decisions in a New Era of Anticoagulants
Jaime Noguez and James Ritchie.* Department of Pathology and Laboratory Medicine, Emory University School of Medicine, Atlanta, GA.
A number of orally administered drugs that selectively inhibit single enzymes in the coagulation cascade have recently been introduced to the market. These new oral anticoagulants hold the promise of overcoming some of the limitations associated with traditionally used anticoagulants and are predicted to have a therapeutic advantage. Perhaps one of the most appealing attributes of these new drugs is that they do not require routine monitoring, owing to predictable pharmacologic profiles with both fixed and weight-based dosing regimens. There are specific clinical situations, however, in which detecting and quantifying them may be required for clinical decision making, such as perioperative patient management, monitoring of high-risk patients, ruling in/out an adverse drug event, and management of bleeding events. Many challenges exist for monitoring these new drugs, and current coagulation assays have demonstrated limited clinical usefulness. The inability to accurately and reliably measure their concentrations when the need arises has made clinicians wary of prescribing them. Using a Waters Acquity UPLC coupled to a Waters Xevo TQ MS, we developed and validated a sensitive liquid chromatography tandem mass spectrometry (LC-MS/MS) method to simultaneously quantify all four oral anticoagulant drugs on the market (warfarin, dabigatran, rivaroxaban, and apixaban) in plasma samples. Quantification was based on multiple-reaction monitoring and the use of stable isotope–labeled drug analogs as internal standards. This method allowed reproducible and accurate quantification of warfarin, dabigatran, rivaroxaban, and apixaban using only 100 μL of plasma, with limits of quantification of 2.5, 5.0, 0.5, and 1.5 ng/mL, respectively. Linearity was established up to 2,000 ng/mL, with no significant interferences observed in lipemic, hemolyzed, or icteric plasma samples and no significant matrix effects observed for samples collected in green (heparin), light blue (citrate), lavender (EDTA), or yellow (ACD) top BD Vacutainer Tubes. This multicomponent MS method for the determination of anticoagulants is the first of its kind and provides laboratories with a novel, cost-effective way to monitor therapy and guide treatment decisions in the new era of oral anticoagulant drugs.
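[Editor's sketch] Quantification by multiple-reaction monitoring with stable isotope–labeled internal standards reduces, in essence, to reading an analyte/internal-standard peak-area ratio off a calibration curve. The generic sketch below assumes a linear calibration; the function name and values are illustrative, not the validated method's parameters.

```python
# Generic isotope-dilution quantification: normalize the analyte peak area
# to its labeled internal standard, then invert a linear calibration.
# All names and numbers are illustrative assumptions.

def quantify(analyte_area, internal_std_area, slope, intercept):
    """Concentration (ng/mL) from the analyte/IS peak-area ratio, given a
    linear calibration of the form ratio = slope * conc + intercept."""
    ratio = analyte_area / internal_std_area
    return (ratio - intercept) / slope

# e.g., a peak-area ratio of 0.42 on a curve with slope 0.004 per ng/mL
print(f"{quantify(4200, 10000, 0.004, 0.0):.0f} ng/mL")  # 105 ng/mL
```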
48 Comparison of Two Human β-Chorionic Gonadotropin Immunoassays Used in Early Pregnancy and Embryo Implantation Evaluations
Jing Cao, Samantha Miller, and Michael Y. Tsai.* Department of Laboratory Medicine and Pathology, University of Minnesota, Minneapolis, MN.
Human chorionic gonadotropin β-subunit core fragment (β-hCGcf) is the major form of hCG found in urine samples in pregnancy. Nevertheless, multiple other hCG variants are present in urine during early pregnancy. The objective of this study was to compare the performance of two β-hCG immunoassays used for early pregnancy and embryo implantation evaluation in the Effects of Aspirin in Gestation & Reproduction (EAGeR) study conducted by the National Institute of Child Health and Human Development. Standards of β-hCG, β-hCGcf, nicked β-hCG (β-hCGn), intact hCG, and nicked hCG (hCGn) were obtained from the National Institute for Biological Standards and Control (NIBSC) and were measured using the β-hCG immunoassay kits from Diagnostic Automation (Calabasas, CA) and from BioVendor (Candler, NC); both kits were used to assay urine samples from the EAGeR study for early pregnancy (<5 weeks) and implantation evaluation performed at the University of Minnesota Advanced Research and Diagnostic Laboratory. NIBSC standards were serially diluted in assay diluent to 40 mIU/mL and then to 1 mIU/mL. Assay reactivity to each hCG variant standard was calculated as the measured concentration divided by the concentration expected from dilution, expressed as a percentage. At a concentration of 1 mIU/mL, the reactivity to β-hCG was 190% and 55% for the Diagnostic Automation and BioVendor assays, respectively, while at 40 mIU/mL, the reactivity was 133% and 179% for the two assays. When tested against the other NIBSC standards of hCG variants, reactivity was overall higher for the Diagnostic Automation assay than for the BioVendor assay. We conclude that, between the two β-hCG immunoassays used for early pregnancy and embryo implantation evaluation in the EAGeR study, the Diagnostic Automation assay showed higher reactivity to all hCG variants than the BioVendor assay at a concentration of 1 mIU/mL, indicating the need to establish assay-specific hCG reference ranges in early pregnancy studies.
49 Testing for Lupus Anticoagulant by Two Different Methodologies Provides Additional Diagnostic Information at a Tertiary Care Hospital
Daimon P. Simmons,1 Adrianna Z. Herskovits,1 Elisabeth Battinelli,2 Peter H. Schur,3 Susan J. Lemire,1 and David M. Dorfman.1* 1Department of Pathology, 2Department of Medicine, Division of Hematology, 3Department of Rheumatology/Immunology, Brigham and Women’s Hospital, Boston, MA.
Antiphospholipid antibody syndrome is characterized by laboratory evidence of antiphospholipid antibodies (eg, a lupus anticoagulant [LA], anticardiolipin, and/or anti–β2-GPI) in a clinical setting of venous or arterial thrombosis or specific pregnancy morbidity. Recently, the International Society on Thrombosis and Haemostasis recommended that laboratories utilize two different testing modalities to detect a LA. Our hospital is a tertiary care center that tests patients for the presence of a LA in the setting of thrombosis or pregnancy morbidity and in the evaluation of connective tissue disease. Many asymptomatic patients with a prolonged PTT are also tested for a LA. In accordance with the new recommendations, our coagulation laboratory has instituted a dilute Russell’s viper venom time (dRVVT) in addition to the previously used PTT-based method to detect LAs. Test results were tracked over a 3-month period. Of the positive results (135 of 328 total cases), 50% were detected by the PTT method alone, 22% by the dRVVT method alone, and 28% by both modalities. This confirms that not all cases of LA in antiphospholipid antibody syndrome are detected with a single LA testing modality. LA results were correlated with anticardiolipin and anti–β2-GPI levels in patients who also underwent antibody testing (188 and 118 cases, respectively). LAs detected simultaneously by both testing modalities were more likely to be accompanied by a high-titer anticardiolipin antibody. Chart review was performed to correlate clinical history with laboratory findings of LA. Asymptomatic patients with a prolonged PTT are a persistent diagnostic challenge, and these patients were more likely to have a positive LA detected by the PTT-based method. Patients with a positive LA detected by dRVVT were more likely to have a history of thrombosis than patients with a negative LA or a LA detected by PTT only. However, excluding the asymptomatic patients from analysis revealed an equivalent increase in thrombotic risk in patients with a positive LA detected by either PTT- or dRVVT-based methods. These findings show that the addition of a second testing modality for LA provides additional diagnostic information and may be helpful in identifying patients at future risk of thrombosis.
52 Computer-Aided Diagnosis of Classical Hodgkin Lymphoma (CHL) by Flow Cytometry
David P. Ng, David Wu,* and Jonathan Fromm.* Department of Laboratory Medicine, Division of Hematopathology, University of Washington, Seattle, WA.
Objective: Previously described manual methods for diagnosing CHL from flow cytometry depend on sequential gating of small populations of Hodgkin/Reed Sternberg (HRS) cells and are not widely standardized. We sought to evaluate machine learning methods in classifying cases with and without CHL as well as determine novel features that may be useful in classifying these cases. Methods: Flow cytometry data from 144 clinical cases using a previously described nine-color panel flow cytometry were analyzed using algorithms developed on Python 2.7.6 and the “scikit-learn” module. Seventy-eight 50 × 50 two-dimensional histograms were generated using the flow cytometry data and a reciprocal power function (y = x−3/2) applied so as to favor rare events. These features were vectorized, the dimensionality (d = 195000) reduced by principal component analysis (PCA) or ensemble methods, and rescaled before support vector machines (SVM), gradient boosting, or random forest classifiers were applied. Results: All three classifiers showed no statistical difference in performance, with 89% to 92% accuracy on stratified k-fold cross-validation. Interestingly, nearly the same set of cases was misclassified by all three methods. As a whole, there were more false-positive cases than false-negative cases. Dimensionality reduction by ensemble methods selected for points in a CD5+/CD40+/CD64- region. Dimensionality reduction by PCA demonstrated better performance with lower overhead given large training sets. Additionally, the inclusion and exclusion of HRS gated cells had little impact on the overall performance, selected support vectors in SVM, or dimensionality reduction by ensemble methods. Conclusion: We show computer-aided classification of CHL flow cases is possible and can be a useful adjunct to manual diagnosis of cases. All three classifiers can provide probabilistic confidences for each result, and cutoffs can be chosen to minimize false negatives to better serve as a screening tool. More interestingly, there appear to be features beyond the recognition of discrete HRS populations that are useful in the classification of these cases. We hypothesize these may be immunophenotypically abnormal T-cell infiltrates, and additional research will be required to determine the characteristic immunophenotype of these infiltrates.
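[Editor's sketch] The described workflow (vectorized 2D-histogram features, reciprocal power transform, PCA, then an SVM under stratified k-fold cross-validation) can be reproduced schematically with scikit-learn. The sketch below targets a current Python/scikit-learn rather than the Python 2.7.6 used in the study, and the data are random placeholders, not the clinical cases.

```python
# Schematic reproduction of the workflow: histogram features -> reciprocal
# power transform -> PCA -> SVM with stratified k-fold cross-validation.
# Data are random placeholders, not the 144 clinical cases.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.model_selection import StratifiedKFold, cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X = rng.random((144, 78 * 50 * 50)) + 1e-6  # 78 vectorized 50x50 histograms/case
y = rng.integers(0, 2, size=144)            # placeholder labels: 1 = CHL

X = X ** (-3 / 2)  # reciprocal power transform favoring rare events
clf = make_pipeline(PCA(n_components=50), StandardScaler(), SVC())
scores = cross_val_score(clf, X, y, cv=StratifiedKFold(n_splits=5))
print(f"cross-validated accuracy: {scores.mean():.2f}")
```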
53 Parathyroid Hormone (PTH) Measurement in Fine-Needle Aspiration Biopsy (FNAB) Washings Identifies Parathyroid Tissue With High Specificity
Hemamalini Ketha, Michael A. Lasho, and Alicia Algeciras-Schimnich.* Department of Laboratory Medicine and Pathology, Mayo Clinic, Rochester, MN.
Background: Measurement of PTH in fine-needle aspiration biopsy (FNAB) washings following ultrasound-guided biopsy has gained popularity as a means to localize suspected parathyroid lesions and to differentiate parathyroid tissue from thyroid nodules. Interpretation of the PTH concentration in the washout is influenced by the collection volume, collection technique, and the PTH assay used. Objective: To establish a diagnostic cutoff for PTH in FNAB needle washings that identifies parathyroid-derived tissue with 100% specificity. Methods: This was a retrospective study in which all Mayo Clinic patients who had PTH measured on FNAB needle washings from June 2008 through September 2013 were identified. FNAB washings were performed by rinsing the biopsy needles, after expulsion of cellular material for cytology, with 0.1 to 0.2 mL of normal saline. The needle washings from five passes per biopsied site were pooled (final volume, 0.5–1.0 mL) and sent refrigerated to the laboratory for testing. PTH was measured using the Roche Elecsys assay (Indianapolis, IN). Medical records were reviewed to document clinical presentation, PTH concentration in serum and FNAB washings, imaging procedures, and cytology and pathology outcomes. Statistical analysis was performed using JMP software (version 8.0; SAS). Results: Ninety-three FNAB washing specimens from 79 patients (59 females and 20 males; age, mean ± SD, 64 ± 13 years) were tested. In 79 patients, there was sufficient clinical information to classify the biopsied structures unequivocally as parathyroid (n = 38) or nonparathyroid (n = 41) tissue. ROC curve analysis produced an area under the curve (AUC) of 0.96. A PTH cutoff of 100 pg/mL showed 100% specificity, 82% sensitivity, and a positive predictive value of 100%. Of the seven parathyroid specimens missed by the FNAB washings, three were also missed by cytology. A cutoff FNAB washing PTH to serum PTH ratio of 32.3 discriminated between parathyroid and nonparathyroid tissue with 100% specificity and 81% sensitivity (ROC AUC, 0.89). Of 30 histology-confirmed parathyroid specimens, 21 (67%) were correctly identified by a PTH >100 pg/mL, whereas only 17 (57%) were detected by cytology. Conclusions: FNAB washings with a PTH concentration >100 pg/mL, or a PTH FNAB washing to serum ratio of 32.3, were selected to achieve 100% specificity, hence avoiding unnecessary surgeries. A PTH cutoff of >100 pg/mL showed slightly better diagnostic sensitivity than cytology but was still suboptimal. Performance of PTH FNAB washings might be further improved by more accurate sampling of the biopsied structures.
54 Failure of Laboratories to Supplement Aminotransferase Assays with Pyridoxal 5′ Phosphate Leads to Discrepant Results: A Case Report and Investigation of Discrepant Serum AST and ALT Activity in Patient Subgroups
John R. Mills,1 Craig A. Wittwer,1 Dina N. Greene,2 Lynn A. Cheryk,1 Darci R. Block,1 and Nikola A. Baumann.1* 1Department of Laboratory Medicine and Pathology, Mayo Clinic, Rochester, MN, and 2Kaiser Permanente, TPMG Northern California Regional Laboratory, Berkeley, CA.
The IFCC recommends that aspartate aminotransferase (AST) and alanine aminotransferase (ALT) assays include pyridoxal-5′-phosphate (P5P/vitamin B6), a cofactor required for maximal activity. Despite these recommendations, the 2013 CAP proficiency testing survey showed that <5% of laboratories with Roche chemistry analyzers reported using P5P-supplemented reagents. We present a case report of a patient with P5P deficiency and macroenzyme AST who presented with discrepant AST results from several laboratories, leading to unnecessary workup and additional testing. The objectives of this study were to compare serum AST and ALT measurements using reagents with or without P5P and to assess the clinical impact in the general patient population, in patients with P5P deficiency, and in patients with macroAST. Residual samples from liver function testing (N = 349 serum; AST, 10–3,690 U/L), P5P testing (N = 239 plasma; P5P, <2–180 mcg/L), and macroAST cases (N = 7 serum) were obtained. AST and ALT were measured on the Roche Cobas c501 using the ALTL [−P5P], ALTLP [+P5P], ASTL [−P5P], and ASTLP [+P5P] reagents (Roche). Serum and plasma were validated for all reagents. Linear regression analysis of method comparisons for AST and ALT yielded ASTL = 0.89(ASTLP) − 2.1 (R² = 0.98) and ALTL = 0.86(ALTLP) + 0.48 (R² = 0.98). For AST, eight of 349 patients would be classified differently according to the manufacturer’s reference interval (RI) using the +P5P reagent. Of 349 patients, nine had increases in AST >40% with P5P. For ALT, 12 of 349 patients would be classified differently according to the manufacturer’s RI using the +P5P reagent. Of 349 patients, 11 had increases in ALT >40% with P5P. Patients with plasma P5P <2 mcg/L had greater increases (median, 44%; range, 22%–67%; P < .0001) in AST activity upon supplementation compared with those with normal plasma P5P concentrations (5–50 mcg/L). Patients with macroAST had greater increases in AST activity than patients without macroAST (median, 100%; range, 8%–1,500%; P < .001). In conclusion, patients with severe P5P deficiency and/or macroAST are at higher risk for discordant AST results due to inconsistent use of P5P supplementation. The case report and our data support universal adoption of P5P-supplemented AST and ALT assays to improve harmonization and reduce the clinical confusion that arises from discrepant liver enzyme results between laboratories.
55 Iron Levels and Risk of Hepatocellular Carcinoma in Liver Explants
Margaret E. Lawless,1,2 Geoffrey S. Baird,2* Paul E. Swanson,1 and Matthew M. Yeh.1 Departments of 1Pathology and 2Laboratory Medicine, University of Washington, Seattle, WA.
Excess iron accumulation in the liver contributes to liver fibrosis, cirrhosis, and hepatocellular carcinoma (HCC). In animal models, dietary iron overload has been shown to cause development of preneoplastic nodules and hepatocellular carcinoma in the absence of fibrosis or cirrhosis. The exact mechanism by which iron is hepatocarcinogenic is unclear but is most likely increased generation of reactive oxygen species. In this study, we aimed to determine whether various indices of iron status could be of value in predicting the presence of hepatocellular carcinoma in livers explanted for end-stage liver disease. We performed multivariate analyses on 533 liver explants at our institution, evaluating the independent association of multiple indices of iron status with the presence of HCC. Serum ferritin, iron, AFP, and TIBC, as well as semiquantitation of liver tissue iron by iron stain (0–4+), were assessed for their utility as predictors of HCC status by generation of ROC curves. HCC was present in 158 of the liver explants (30%). Etiologies of cirrhosis primarily included HCV/HBV infection and alcohol; nonalcoholic steatohepatitis, primary biliary cirrhosis, and several others were also represented in the analyses. Two multivariate analyses showed no statistically significant relationship between any of the four iron indices and the presence of HCC (P > .15). Areas under the curve (AUC) in individual ROC analyses were 0.54 for serum ferritin, 0.60 for serum iron, 0.51 for TIBC, 0.50 for semiquantitative tissue iron, and 0.63 for serum AFP. There was a significant negative predictive effect for lower serum ferritin levels (<50 ng/mL), with an odds ratio of 0.54 (95% CI, 0.31–0.93). A logistic model containing serum iron, log(ferritin), tissue iron, and AFP was able to predict HCC with an AUC of 0.68 (accuracy = 74%). In this model, only serum iron (P = .00005) and AFP (P = .003) had significant coefficients. Across multiple statistical techniques, indices of iron status and indicators of iron overload showed no clinically useful predictive effect; however, there is a small but statistically significant relationship between low iron storage levels (as measured by serum ferritin) and risk of HCC in liver explants, suggesting that lower iron levels have a minor protective effect.
57 Dilute-and-Shoot Mass Spectrometry to Identify Hemoglobin Variants
Jane Y. Yang1 and David A. Herold.1,2* 1Department of Pathology, University of California San Diego, La Jolla, CA, and 2VAMC-San Diego, San Diego, CA.
Cation exchange (CE)–HPLC and capillary zone electrophoresis (CZE) are often insufficient to identify specific hemoglobin variants. In addition to the information obtained from these classic techniques, we use mass spectrometry (MS) to confirm the identities of hemoglobin variants. Washed whole-blood samples, including known specimens (for validation) and specimens that could not be identified by CE-HPLC and CZE, were lysed in water (1:1, v/v) and centrifuged. Specimens were diluted in electrospray solvent (5% acetonitrile, 2% formic acid) and injected onto a C4 capillary column on a Waters Acquity UPLC for desalting, coupled to a QSTAR Elite QTOF mass spectrometer (AB SCIEX). Mass spectra were deconvoluted and analyzed for peak distributions indicative of mass differences (Δm) due to a single amino acid variant in the α (M = 15,126.36) or β (M = 15,867.22) chain. The presence of a “normal” distribution of peaks in the spectra indicated that the mass difference between the “wild-type” hemoglobin chain and the variant was below the variability of the mass measurement (<6 amu), thereby ruling out variants with larger mass differences. Hemoglobins J-Baltimore (β16Gly>Asp), D-Punjab (β121Glu>Gln), and C (β6Glu>Lys), as indicated by CE-HPLC and CZE, were confirmed by MS (experimental Δm 58.5 [β], <6 amu, and <6 amu, respectively), corresponding to calculated Δm of 58.04, −0.99, and −0.95. Hemoglobins Athens-Georgia (β40Arg>Lys), G-Pest (α74Asp>Asn), and J-Habana (α71Ala>Glu), with calculated Δm of −28.02, −0.99, and 58.04, were also confirmed by MS (experimental Δm −29, <6 amu, and 58 [α], respectively). In cases where this approach yields multiple possible variants, the position of the amino acid substitution can be confirmed by digestion and tandem MS protein sequencing, which is possible with the sample volumes remaining. The authors gratefully acknowledge Drs Majid Ghassemian and Elizabeth Komives at the UCSD Biomolecular and Proteomics Mass Spectrometry Facility for training and use of their instrument.
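[Editor's sketch] The calculated Δm values quoted above follow from differences in standard average residue masses; a minimal sketch of that computation is below (the masses are standard published values, and the function name is ours).

```python
# Delta-m for a single amino acid substitution: variant residue mass minus
# wild-type residue mass. Average residue masses (Da) are standard values.
RESIDUE_MASS = {"Gly": 57.05, "Ala": 71.08, "Asp": 115.09, "Asn": 114.10,
                "Glu": 129.12, "Gln": 128.13, "Lys": 128.17, "Arg": 156.19}

def delta_m(wild_type, variant):
    return RESIDUE_MASS[variant] - RESIDUE_MASS[wild_type]

print(f"{delta_m('Gly', 'Asp'):+.2f}")  # J-Baltimore beta16 Gly>Asp: +58.04
print(f"{delta_m('Glu', 'Gln'):+.2f}")  # D-Punjab beta121 Glu>Gln: -0.99
print(f"{delta_m('Arg', 'Lys'):+.2f}")  # Athens-Georgia beta40 Arg>Lys: -28.02
```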
59 Frequency of Overlooked Abnormal Results in Urine and Blood Identified After Retrospective Expanded Screening for Toxic Elements by Inductively Coupled Plasma Mass Spectrometry Highlights Limitations With Current Screening Approaches
Sarah A. Hackenmueller,1 Jake Walden,2 Mongola Yang,2 Christian Law,2 and Frederick G. Strathmann.1,3* 1Department of Pathology, University of Utah; 2ARUP Laboratories, Salt Lake City, UT; and 3ARUP Institute for Clinical and Experimental Pathology, Salt Lake City, UT.
Timely diagnosis of heavy metal poisoning represents a significant issue in clinical toxicology. In addition, overlapping symptoms after exposure to several of the heavy metals present a challenge to the physician and the laboratory in ensuring that the appropriate test is ordered and readily available. Historically, the laboratory has provided quantitative “panels” comprising the elements most commonly involved in poisonings; however, sample type (ie, urine or blood) can also be a challenge due to the complex biological distribution of heavy metals. In contrast, it is common for many hospitals to use point-of-care toxicology screens, immunoassay-based laboratory tests, or comprehensive mass spectrometry–based screens to provide qualitative compound-class or compound identification in both urine and serum/plasma for overdose situations involving non–heavy metal drug exposures. A retrospective review of results from heavy metal panels in blood containing three (n = 13,370) or four (n = 1,852) elements revealed overall positivity, defined as at least one element elevated above the reference interval, to be <7% for each panel. This low positivity rate raises the possibility that the currently designed panels may be missing potential elemental exposures. To determine whether a comprehensive heavy metal screen by inductively coupled plasma–mass spectrometry would be of clinical utility, we proactively screened 101 blood and 159 urine specimens submitted for trace element testing. The originally ordered tests for the submitted specimens included individual elements as well as panels containing between three and six elements in blood or urine. Of the 159 submitted urine specimens that were negative for the originally ordered elements, 57 (36%) had detectable concentrations greater than the established reference intervals for aluminum, arsenic, bismuth, chromium, cobalt, iodine, manganese, nickel, selenium, or zinc. Of the 101 submitted blood specimens that were negative for the originally ordered elements, 46 (46%) had detectable concentrations greater than the established reference intervals for antimony, arsenic, lead, or manganese. The percentage of positive results identified here with a wider screening protocol for known and potentially toxic elements, taken together with the identified challenges in accurately detecting elemental exposure, suggests that an expanded elemental screening panel may be of clinical utility in unknown elemental exposures.
60 A Case of Good Syndrome With a Full Genome Analysis of the Patient and Family
Jonathan H. Esensten,1 Mickie H. Cheng,2,3 Mark S. Anderson,2,3 Adam S. Lauring,5 Farid F. Chehab,1 Katherine Gundling,3 and Mark Seielstad.1,4 Sponsor: Enrique Terrazas.1* 1Department of Laboratory Medicine, 2Diabetes Center, 3Department of Medicine, and the 4Blood Systems Research Institute, University of California, San Francisco, CA, and 5Department of Medicine, Division of Infectious Diseases, Department of Microbiology & Immunology, University of Michigan Medical School, Ann Arbor, MI.
Good syndrome is a rare disorder associated with thymoma and severe humoral and cellular immunodeficiency. Marked absence of B cells in the peripheral blood is a common finding. Genetic and environmental risk factors for this disease are unknown, although hypotheses for possible genetic causes have been proposed, such as mutations in TACI and Ikaros. We present a case of a 38-year-old man with a history of thymoma resected at age 22 years, myasthenia gravis, recurrent pneumonia, and hypogammaglobulinemia on chronic prednisone therapy. He presented with diarrhea secondary to CMV colitis, disseminated Mycobacterium avium complex infection, and a new diagnosis of sclerosing cholangitis. His hospital course was complicated by Candida fungemia, VRE bacteremia, and presumed fungal endophthalmitis. Laboratory evaluation was notable for an absolute lymphocyte count of 0.73 × 10⁹/L (normal range, 1.0–3.4 × 10⁹/L) on admission and an absolute CD19+ B-cell count of 3 × 10⁶/L (normal range, 90–660 × 10⁶/L). Absolute B-cell counts had been very low for at least 3 years. The patient’s absolute CD4+ T-cell count was also decreased at 125 × 10⁶/L (normal range, 410–1,590 × 10⁶/L). An HIV test was negative. CD16+CD56+ natural killer cells were also decreased at 3 × 10⁶/L (normal range, 90–590 × 10⁶/L). The patient’s total serum IgG was 400 mg/dL (normal range, 672–1,760 mg/dL), with all subclasses decreased. Due to the history of thymoma and the presence of both cellular and humoral immunodeficiency, a diagnosis of Good syndrome was made. The patient, his parents, and his sister were consented for blood collection for whole-genome sequence analysis (University of California, San Francisco IRB 10-02294). Besides the patient, the other family members reported no history of recurrent infections or autoimmune conditions. Whole-genome sequencing was performed by Complete Genomics using a high-throughput DNA nanoarray platform. An analysis of the kindred using Ingenuity Variant Analysis did not show likely deleterious mutations in either of the proposed loci associated with Good syndrome (TACI and Ikaros). These results suggest possible locus heterogeneity for Good syndrome. Homozygous deleterious mutations in genes annotated to be involved in immune cell regulation were not detected in the proband in our analysis.
61 Serum Total B12, Holotranscobalamin, and Methylmalonic Acid in the Diagnosis of Vitamin B12 Deficiency
Jeffrey W. Meeusen, Maria A. Willrich, Nikola A. Baumann, and Alicia Algeciras-Schimnich.* Department of Laboratory Medicine and Pathology, Mayo Clinic, Rochester, MN.
Background: Early detection of vitamin B12 (B12) deficiency is essential to prevent potentially serious complications. The biochemical evaluation for B12 deficiency includes measurement of total B12 and methylmalonic acid (MMA). The use of total B12 as the first-line screen is limited by its low sensitivity. MMA is increased in individuals with B12 deficiency but lacks specificity, especially in patients with impaired renal function, limiting its utility. Recently, assays for holotranscobalamin (holoTC), the bioavailable protein–B12 complex, have become commercially available. However, the utility of holoTC in the detection of B12 deficiency is controversial. Objectives: To evaluate the performance of B12, holoTC, and MMA in identifying B12 deficiency. Methods: Study cohorts included healthy individuals (n = 210) and patients undergoing B12 deficiency evaluation (n = 239). Clinical B12 deficiency (n = 62) was verified by chart review. Serum holoTC was assayed using the Abbott ARCHITECT, total B12 using the Beckman UniCel DxI 800, and MMA by LC-MS/MS. The performance of each analyte alone or in a two-tier approach (total B12 or holoTC followed by MMA) was evaluated by receiver operating characteristic (ROC) curve analysis. Results: The central 95% interval of holoTC for the normal cohort was 25 to 317 pmol/L. In the clinical cohort, median values were decreased with clinical B12 deficiency for holoTC (28 pmol/L [IQR, 21–36 pmol/L] vs 98 pmol/L [IQR, 71–189 pmol/L]) and total B12 (134 ng/L [IQR, 115–146 ng/L] vs 382 ng/L [IQR, 273–653 ng/L]) and increased for MMA (0.36 nmol/mL [IQR, 0.25–0.58 nmol/mL] vs 0.17 nmol/mL [IQR, 0.14–0.21 nmol/mL]) (P < .0001 for all comparisons). ROC analysis identified optimal cutoffs for holoTC at 46 pmol/L (AUC, 0.97) with 92% sensitivity and 91% specificity, B12 at 197 ng/L (AUC, 0.96) with 90% sensitivity and 92% specificity, and MMA at 0.26 nmol/mL (AUC, 0.87) with 74% sensitivity and 88% specificity. Diagnostic efficiency was 91.5% for both holoTC and total B12, while MMA efficiency was 85%. Our current two-tier approach, confirming total B12 between 150 and 400 ng/L with MMA (cut-point ≥0.4 nmol/mL), showed 90% sensitivity and 94% specificity. A similar approach confirming holoTC between 50 and 75 pmol/L with MMA (cut-point ≥0.27 nmol/mL) was 97% sensitive and 86% specific. Using holoTC as the initial screen required fewer MMA confirmations than total B12 (52 vs 108). Conclusions: Serum holoTC and B12 are efficient biomarkers of clinical B12 deficiency. Using holoTC as the initial screen could decrease the number of MMA confirmatory tests, decreasing cost and time to result while increasing diagnostic sensitivity.
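[Editor's sketch] The two-tier rule lends itself to a simple decision function. The sketch below encodes one reading of the holoTC-first approach, with cutoffs taken from the abstract; the handling of values outside the 50 to 75 pmol/L reflex zone is our assumption, not a published algorithm.

```python
# Sketch of the holoTC-first two-tier screen (cutoffs from the abstract;
# handling of values outside the reflex zone is our assumption).

def b12_two_tier(holotc_pmol_l, mma_nmol_ml=None):
    """Classify B12 status: screen with holoTC, reflex 50-75 pmol/L to MMA."""
    if holotc_pmol_l < 50:
        return "deficient"
    if holotc_pmol_l <= 75:
        if mma_nmol_ml is None:
            return "reflex to MMA"
        return "deficient" if mma_nmol_ml >= 0.27 else "not deficient"
    return "not deficient"

print(b12_two_tier(40))        # deficient
print(b12_two_tier(60))        # reflex to MMA
print(b12_two_tier(60, 0.31))  # deficient
```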
62 Challenges and Clinical Impact of Visual Hemolysis Detection for Trace and Toxic Element Assessment in Serum
Meghan Driscoll,1 Carrie J. Haglock-Adler,2 Kira Hinckley,3 Kelly L. Scholes,3 Christian Law,3 and Frederick G. Strathmann.1,2* 1Department of Pathology, University of Utah Health Sciences Center, Salt Lake City, UT; 2ARUP Institute for Clinical and Experimental Pathology, Salt Lake City, UT; and 3ARUP Laboratories, Salt Lake City, UT.
The current state of evaluation for the presence of hemolysis in serum samples was assessed to determine whether visual inspection is adequate to identify hemolyzed samples for trace element testing. Additionally, the effect of varying degrees of hemolysis on serum samples tested for select trace elements was assessed to determine the degree to which hemolysis would affect reported values. Current laboratory protocol requires hemolysis to be evaluated visually inside the cleanroom testing facility to reduce environmental contamination. Samples are assigned a hemolysis index (HI ≈ mg/dL of hemoglobin), and any sample greater than 200 is rejected for testing. To assess the adequacy of the visual assignment process, 87 serum samples submitted for routine clinical testing had visual HIs verified spectrophotometrically using a Roche cobas c702. Of the 56 samples with a visual HI less than 200 and determined to be acceptable for clinical testing, 54 (96.4%) were correctly identified compared with the measured hemolysis value. Of the 31 samples with a visual HI greater than 200, 14 (45.2%) were correctly identified as having an HI greater than 200, while 17 (54.8%) were incorrectly identified based on spectrophotometric measurement. To determine the clinical significance of hemolysis on serum trace and toxic element measurements, a pool of serum samples with a measured HI of <2 and a pool of washed RBCs were used. The RBC pool was hemolyzed by exposure to three freeze/thaw cycles, and the serum and RBC pools were mixed to achieve 0%, 1%, 5%, 25%, 50%, 75%, and 100% hemolyzed RBCs in serum. The mixed samples were analyzed for Cu, Zn, Co, Mn, and Ni using inductively coupled plasma–mass spectrometry. Clinically significant increases of 221% and 40% for Ni and Zn, respectively, were observed at an HI of 300 (1% RBCs). In contrast, relatively little impact was seen on serum measurements of Cu, Co, and Mn at an HI less than 1,000. In summary, visual inspection for HI is an adequate technique for identifying hemolysis in serum samples at and below an HI of 200; however, improvements are needed to avoid falsely rejecting samples whose true HI is less than 200. Additionally, these findings indicate that falsely elevated results due to hemolysis have the potential to negatively impact patient care through inaccurate indications of toxicity or masking of deficiency, leading to inappropriate medical treatment.
63 Simultaneous Identification of Nine Oral Antidiabetic Drugs in Plasma and Urine by Liquid Chromatography–Tandem Mass Spectrometry
Brenda B. Suh-Lailam,1 Geoffrey S. Rule,2 and Kamisha L. Johnson-Davis.1,2* 1Department of Pathology, University of Utah, Salt Lake City, UT, and 2ARUP Laboratories, Salt Lake City, UT.
Oral antidiabetic drugs (OADs) are utilized for the management of hyperglycemia in type II diabetes. Despite their efficacy in treatment, inadvertent or intentional overdose of OADs can cause life-threatening hypoglycemia. Detection of OADs is used to investigate hypoglycemia of unknown origin and may differentiate hypoglycemia caused by OADs from that caused by insulinoma. A liquid chromatography–tandem mass spectrometry (LC-MS/MS) method was developed for the detection of OADs in plasma and urine. Additionally, a retrospective analysis of 1,932 test results was performed to determine the positivity rate for OADs. Liquid-liquid extraction of plasma and urine samples was followed by separation and detection of nine OADs (first generation: chlorpropamide, acetohexamide, tolbutamide, and tolazamide; second generation: glipizide, glyburide, glimepiride, repaglinide, and nateglinide) by LC-MS/MS using an API4000 with atmospheric pressure chemical ionization. Identification of specific drugs was achieved by two ion transitions in selected reaction monitoring mode. Method comparison with an external laboratory showed 100% concordance for all analytes in plasma except glyburide (97.3%). Analyte recoveries ranged from 74% to 100% in plasma and 80% to 95% in urine. The limit of quantitation was 100 ng/mL for first-generation OADs and 5 ng/mL for second-generation OADs. Within- and between-run imprecision ranged from 2.5% to 23.8% and 1.0% to 17.2% in plasma and 2.3% to 31.2% and 2.2% to 23.9% in urine, respectively. Imprecision for chlorpropamide, acetohexamide, and tolbutamide was variable due to the lack of isotopically labeled internal standards for these analytes. Commonly used medications (eg, acetaminophen) did not interfere with this assay; however, glipizide results were negatively affected by hemolysis and lipemia. The effects of collection tube type were evaluated in sodium heparin, EDTA, serum, and KF/Na oxalate tubes. All of the tube types showed a negative bias for chlorpropamide, while tolazamide showed a negative bias when serum or an EDTA tube was used. Retrospective analysis of patient results in plasma showed a 16% (315/1,932) positivity rate, with glipizide, glyburide, and glimepiride most commonly identified. In urine, only glipizide was detected, in 17 of 461 (4%) patients. This assay is sensitive and specific for the qualitative detection of nine OADs and is useful for investigating clinical cases of hypoglycemia of unknown origin using plasma or urine.
64 Genomic Database Performance Improvements With Document-Based Database Architecture
Wade L. Schulz, Donn K. Felker, and Brent G. Nelson. Sponsor: Michael Linden. University of Minnesota, Minneapolis, MN.
Clinical genome sequencing has become increasingly common as the cost of sequencing has declined and efficiency has improved. While the underlying sequencing technology has improved dramatically, software solutions for data storage and analysis have not kept pace. In this study, we considered whether new database architectures, such as document-oriented databases, can improve the efficiency of genomic data storage and analysis. We used a cloud computing platform to compare write speed and query performance between a common relational database (MySQL) and a document-oriented database (MongoDB) with traditional disk and solid-state storage devices. Single-nucleotide polymorphism data from dbSNP were parsed and loaded into each platform, and multiple queries were performed to determine the benefits of each platform. We found that, while the time needed to load individual records was similar, bulk loading data into MongoDB had a significant performance benefit. In addition, query performance in the document-oriented MongoDB was approximately two orders of magnitude faster than in MySQL. Surprisingly, we found that standard disk storage within the cloud platform performed significantly better for write operations than solid-state drives, likely due to the random-write nature of these operations. These data demonstrate that document-oriented databases can provide significant benefits for new database systems in the field of genomics. They also underscore the need for performance testing on new hardware, as a change in vendor or storage type may lead to unintended consequences for system performance. Given these benefits, the cost of genomic data storage can be significantly reduced by migrating to a document-oriented database architecture.
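[Editor's sketch] To make the bulk-loading contrast concrete, the sketch below shows the document-oriented pattern described, using pymongo; the connection string, database, collection, and record fields are placeholders, not the study's actual schema.

```python
# Document-oriented bulk load and indexed query with pymongo.
# Connection string, names, and fields are placeholders.
from pymongo import MongoClient

client = MongoClient("mongodb://localhost:27017")
snps = client["genomics"]["dbsnp"]

records = [{"rsid": f"rs{i}", "chrom": "1", "pos": 10_000 + i, "alleles": "A/G"}
           for i in range(100_000)]

# Bulk insert: one round-trip per batch rather than one per record, which is
# where the reported write-performance gain comes from.
snps.insert_many(records, ordered=False)

# An indexed document query avoids the join machinery of a relational schema.
snps.create_index([("chrom", 1), ("pos", 1)])
hit = snps.find_one({"chrom": "1", "pos": 10_500})
print(hit["rsid"])
```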
66 Using Internal Control T-Cell and NK-Cell Subsets as an Objective Measure of Flow Cytometric Quality
Alicia M. Hunt, Stephen P. Ten Eyck, and Fiona E. Craig.* Department of Pathology, University of Pittsburgh School of Medicine, Pittsburgh, PA.
Flow cytometry studies are an important part of the evaluation of potential hematopoietic neoplasms. T- and NK-cell neoplasms may have subtle differences in staining from nonneoplastic populations. Although visual inspection of normal populations is commonly used as a subjective internal control, no objective control measures are currently in place. Normal T- and NK-cells are present in peripheral blood samples submitted for flow cytometry, possibly providing an opportunity for the development of an objective control. We hypothesized that normal T- and NK-cell populations would demonstrate a distinct and reproducible staining pattern, including those analyzed in the presence of T- or NK-cell neoplasms, and that these normal populations could be used to follow flow cytometry instrument performance over time. Flow cytometric data were reviewed from peripheral blood specimens stained with CD45 V500, CD2 V450, CD3 PE-Cy7, CD7 PE, CD4 Per-CP-Cy5.5, CD8 APC-H7, CD56 APC, CD16&CD57 FITC, and acquired with FACS Canto-II instruments. All peripheral blood specimens from July 2012 and June to July 2013 were reviewed, as well as those with an abnormal T-cell population from May 2011 to September 2013. T- and NK-cell subsets, including aberrant populations, were identified using a FACS DIVA template (BD Biosciences), and median fluorescence staining intensity (MFI) was determined. Specimens (n = 213) were classified by abnormal population (none, B cell, T cell, and other), and MFI was compared for four populations of T cells (CD3+CD2+, CD4+CD8−, CD4−CD8+(bright), and CD7+(bright) naive T cells). No significant difference in MFI was found between specimens with no abnormal population and specimens with an abnormal T-cell, B-cell, or other population (eg, CD3+CD2+ in specimens with no abnormal population vs those with an abnormal T-cell population [27,807 vs 23,473, P = .28]). In addition, the CD3 MFI of specimens from March 2013 to September 2013 was used to plot a moving average and build a Levey-Jennings plot. The moving average demonstrated a significant difference between the two flow cytometry instruments, despite instrument cloning (22,488 vs 26,771, P = .0007). It also highlighted a significant change in mean CD3 MFI before and after relocation of the laboratory (15,478 vs 44,551, P < .0001). In conclusion, normal T- and NK-cell populations were identified in all specimens and could be used to establish objective internal control criteria, even in those specimens with aberrant T- or NK-cell populations. Normal T-cell staining can also be used to monitor the longitudinal performance of flow cytometry instruments in the clinical laboratory.
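The longitudinal monitoring step lends itself to a simple quality-control computation. The sketch below, with simulated CD3 MFI values standing in for real instrument data, derives Levey-Jennings limits from an establishment window and a moving average of the kind used to detect the instrument differences reported above; the window sizes and limits are illustrative choices, not the authors' parameters.

```python
import numpy as np

# Simulated daily CD3 MFI values from the normal CD3+CD2+ internal control
# population; in practice these would come from the instrument's FCS data.
rng = np.random.default_rng(0)
cd3_mfi = rng.normal(loc=25_000, scale=2_500, size=60)

# Levey-Jennings limits from an establishment period (first 20 runs here).
baseline = cd3_mfi[:20]
mean, sd = baseline.mean(), baseline.std(ddof=1)

# Flag runs outside mean +/- 2 SD, as on a Levey-Jennings plot.
for run, value in enumerate(cd3_mfi, start=1):
    flag = " FLAG" if abs(value - mean) > 2 * sd else ""
    print(f"run {run:2d}: CD3 MFI {value:9,.0f}{flag}")

# A moving average exposes drift or step changes (eg, after laboratory
# relocation or between cloned instruments).
window = 10
moving_avg = np.convolve(cd3_mfi, np.ones(window) / window, mode="valid")
print(f"2 SD limits: {mean - 2 * sd:,.0f} to {mean + 2 * sd:,.0f}")
print(f"latest {window}-run moving average: {moving_avg[-1]:,.0f}")
```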
67 An Open-Source Online Database for Clinical Pathology Consults
Cigdem Himmetoglu Ussakli,1 Patrick C. Mathias,1 Sinan Ussakli,2 and Noah Hoffman.1* 1Department of Laboratory Medicine, University of Washington, Seattle, WA, and 2Microsoft Corporation, Redmond, WA.
In a typical residency program, a substantial portion of the patient care on clinical pathology services takes place while a resident is on call for the laboratory. Since the issues handled by residents are typically not recorded in the conventional electronic medical record, documenting the educational value of these on-call experiences and providing evidence of competence can be a difficult task for faculty. In addition, a historical record of on-call experiences provides a valuable knowledge base. Our department had previously developed an on-call database system with a Microsoft Active Server Pages front-end and a Microsoft Access back-end. This system had accumulated approximately 26,000 call entries by June 2013 but required replacement due to the scale of the database and a lack of support for its outdated tools. A project to rewrite the on-call database using an entirely open-source software stack was initiated in April 2013, using an Apache web server, a Python web framework called Flask, and a PostgreSQL database instance hosted on a department server. Federated account management and authentication to our institutional domain were provided using pubCookie. This HIPAA-compliant, open-source database offers interfaces to track incoming calls to on-call residents; new record and edit screens tailored to the residents' on-call workflow to facilitate data entry; bookmarkable, full-text search; lists and views of records with or without PHI; printing for faculty review; and other features such as tagging, flagging, linking to other database entries, commenting, and voting. Multiple user privilege levels allow delegation of certain administrative functions to trusted users. Since the new system was implemented in July 2013, there have been 2,262 call entries, 15 defined "tags" (call categories) created, and 70 commented entries. Each entry has been updated an average of four times. A survey of 10 residents currently using the database showed an overall satisfaction of 4.5 on a 5-point scale. All respondents agreed or strongly agreed that the database has a positive impact on patient care and helps to improve test utilization and transfer of care. The users cited automatic searching for previous entries by medical record number, autosave, and linking to other calls and publications as the most useful features. This effective open-source software provides a secure platform for any laboratory medicine department that aims to track on-call cases online, with a positive impact on resident education and patient care and with powerful features that maintain ease of use and simplicity.
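As a hedged sketch of the described architecture (Flask over PostgreSQL), the code below shows a minimal call-entry endpoint and a full-text search endpoint; the route names, schema (a calls table with mrn and summary columns), and connection string are hypothetical, and production use would additionally require the authentication and PHI controls described above.

```python
# Hypothetical minimal sketch: Flask front-end over PostgreSQL, assuming a
# table such as:
#   CREATE TABLE calls (id SERIAL PRIMARY KEY, mrn TEXT, summary TEXT);
from flask import Flask, jsonify, request
import psycopg2

app = Flask(__name__)
DSN = "dbname=oncall user=resident"  # hypothetical connection string

def db():
    return psycopg2.connect(DSN)

@app.route("/calls", methods=["POST"])
def new_call():
    # Record a new on-call entry from a JSON payload.
    entry = request.get_json()
    with db() as conn, conn.cursor() as cur:
        cur.execute(
            "INSERT INTO calls (mrn, summary) VALUES (%s, %s) RETURNING id",
            (entry["mrn"], entry["summary"]),
        )
        return jsonify(id=cur.fetchone()[0]), 201

@app.route("/calls/search")
def search_calls():
    # Full-text search over call summaries using PostgreSQL's built-in
    # to_tsvector/plainto_tsquery machinery.
    q = request.args.get("q", "")
    with db() as conn, conn.cursor() as cur:
        cur.execute(
            "SELECT id, summary FROM calls "
            "WHERE to_tsvector('english', summary) @@ plainto_tsquery(%s)",
            (q,),
        )
        return jsonify(results=cur.fetchall())

if __name__ == "__main__":
    app.run()
```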
68 Survey of Microbiology Laboratory Technologists About Continuing Education Presentations by Pathology Residents
Thomas McDonald. Sponsor: Richard Haspel. Beth Israel Deaconess Medical Center, Boston, MA.
Laboratory technologists must complete Continuing Education (CE) credits in order to maintain their certification by various state and national agencies. At BIDMC, pathology residents give live presentations that count as CE credits for laboratory technologists. These presentations generally involve PowerPoint slides and last about 20 minutes. In order to learn about the technologists’ views of these presentations and to identify ways to improve them, an eight-question SurveyMonkey survey was developed and emailed to 43 laboratory technologists. Anonymous responses were received from 22 individuals. The results of this study were as follows: 45% of the respondents wanted to have a CE presentation every 2 weeks, while 32% wanted a weekly presentation, and 23% wanted a monthly presentation. As for topics of CE presentations, 95% of respondents were “definitely interested” in having presentations about specific BIDMC patients who received workups in the microbiology laboratory. Other popular topics were specific diseases (eg, HIV), fundamentals of microbiology (eg, antibiotics), and new technologies (eg, molecular testing methods). The technologists were also asked how often they work on a case that they think would be a good case for a CE presentation. Their responses were “about one time every two weeks” (42%), “about once per month” (29%), and “less than once per month” (29%). In addition, 68% expressed an interest in anonymously evaluating the presentations. Free-text responses included several comments indicating a preference for presentations that focus on the clinical aspects of cases (as opposed to the laboratory aspects). The conclusions of this study are as follows: the respondents generally favor having a CE presentation every 2 weeks. The most desired topic was presentations about specific BIDMC patients, especially about the clinical aspects of the cases. In addition, the majority of the respondents work on a case at least once per month that they think would be a good case for a CE presentation. Hence, mechanisms must be in place whereby laboratory personnel can feel comfortable in communicating these interesting cases to the pathology residents or attendings. Finally, a method should be implemented whereby the laboratory personnel can anonymously evaluate the CE presentations. In light of the results of this pilot study, the survey questions are currently being refined in order to further elucidate areas of ambiguity, such as the technologists’ preferred method for evaluating the presentations. An enhanced survey will be administered in April 2014 to laboratories of different sizes with the aim of increasing the generalizability of the results and increasing the reliability of the results through increased survey numbers.
70 Clinical Utility of Calreticulin Mutation Testing in the Diagnosis of Nonmutated JAK2 Myeloproliferative Neoplasms, Idiopathic Thrombocytosis, Leukocytosis, and Thrombosis
Joshua R. Menke, Henrietta Seet, and Farid F. Chehab. Sponsor: Enrique Terrazas. Department of Laboratory Medicine, University of California, San Francisco, CA.
Introduction: Somatic mutations in exon 9 of calreticulin (CALR) were recently found in approximately 80% of patients with nonmutated JAK2 myeloproliferative neoplasms (MPN), including primary myelofibrosis (PMF) and essential thrombocythemia (ET), and were associated with a low risk of thrombosis relative to mutated JAK2 disease. The two most common CALR mutations, type I (52-bp deletion) and type II (5-bp insertion), correlated with the diagnosis of PMF and ET, respectively. However, the clinical utility of CALR mutation testing in the diagnosis of JAK2-negative MPN, idiopathic thrombocytosis, leukocytosis, and thrombosis has not been defined. Methods: Archived peripheral blood or bone marrow DNA samples from 2009 to 2013 from patients with suspected PMF, ET, idiopathic thrombocytosis, leukocytosis, or thrombosis that were JAK2 V617F negative were retrieved from the molecular diagnostic laboratory at a single institution. The clinical diagnoses and bone marrow biopsy reports for all patients were reviewed, and diagnoses were confirmed by WHO criteria. Primers bracketing exon 9 of the CALR gene were used to amplify the region that encompasses the previously reported mutations. The resulting PCR product was analyzed by capillary electrophoresis and its size determined against DNA markers. Results: Eighteen of 24 patients (75%) with nonmutated JAK2 MPN showed mutations in CALR. CALR mutations included eight type I mutations, six type II mutations, and four previously undescribed insertions/deletions. While all CALR-mutated patients showed a mixed allele burden, one patient with progressive ET had a CALR type II mutation load of 36% in 2006 that evolved to a nearly clonal burden of 91% in 2010 after chemotherapy. Type I CALR mutations were found in five patients with PMF and two patients with ET. Type II CALR mutations were found in six patients with ET and one patient with PMF. In two other cases in which the thrombocytosis had been considered reactive, positive CALR mutation testing established the diagnosis of MPN. None of the patients with a history of idiopathic leukocytosis or thrombosis showed CALR mutations. Conclusions: Mutations in CALR are clinically useful in confirming the diagnosis of MPN, including in some cases of thrombocytosis with an unclear etiology. CALR mutation types I and II correlated with the diagnosis of PMF and ET, respectively, in most cases and may help reclassify cases with bone marrow findings that could represent either prefibrotic PMF or ET. CALR mutation testing is not recommended for patients with JAK2-negative idiopathic leukocytosis or thrombosis because of the low likelihood of finding a CALR mutation in this cohort.
71 Derivation of In Silico Standard Curves to Maximize the Number of Samples in BCR-ABL qPCR Runs of CML Patients Monitored for Minimal Residual Disease
Justin Mak, Martha Gunthorpe, and Farid Chehab. Sponsor: Deborah French. Department of Laboratory Medicine, University of California, San Francisco, CA.
Quantitative analysis of BCR-ABL1 transcripts is of considerable prognostic value in minimal residual disease monitoring of chronic myelogenous leukemia (CML). Real-time PCR (qPCR) is the method of choice for the determination of BCR-ABL1/ABL1 ratios but requires two standard curves, one for BCR-ABL1 and another for the internal control ABL1. Typically, these standard curves are generated in triplicate with cDNAs from 5 and 3 log dilutions of K562 (a BCR-ABL1–positive cell line) and HL60 (a BCR-ABL1–negative cell line), respectively, thus limiting the number of wells available for patient samples. We evaluated the use of in silico standard curves using ratios generated from 50 independent standard curve runs carried out on the QuantStudio qPCR platform (Life Technologies). Cycle threshold (Ct) values were averaged for each cDNA dilution over the 50 runs and used to generate in silico BCR-ABL1 and ABL1 standard curves, which revealed regression coefficients of 0.997 and 0.998, respectively. The linear regression equation from each standard curve was then used to determine the BCR-ABL1/ABL1 ratios for 60 individuals whose ratios had previously been determined with traditional standard curves. Comparison of traditional and in silico ratios over 5 logs showed an overall correlation coefficient of 0.985. More important, 99% of ratios remained in their log intervals. Since minimal residual disease is evaluated based on a log change over the course of kinase inhibitor treatment, these results support the reliability of in silico standard curves for accurately determining BCR-ABL1/ABL1 ratios while maximizing patient samples per run. Routine implementation of this approach would still require traditional standard curves to be run once a month to confirm that they continue to fall within the range of the in silico standard curve. Overall, the ability to double the number of patient samples per assay shortens turnaround times, decreases costs, and minimizes technical labor.
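A minimal numerical sketch of the in silico standard-curve approach follows: mean Ct values per log dilution (illustrative numbers, not the laboratory's data) are fit by linear regression, and the fitted equations are inverted to compute a BCR-ABL1/ABL1 ratio for a hypothetical patient well.

```python
import numpy as np

# Mean Ct per log dilution, averaged over many historical runs (illustrative
# values; the study averaged 50 runs on the QuantStudio platform).
log_dilution_k562 = np.array([0, -1, -2, -3, -4])     # 5 logs, BCR-ABL1
mean_ct_bcr_abl1 = np.array([18.2, 21.6, 25.0, 28.4, 31.9])
slope_b, intercept_b = np.polyfit(log_dilution_k562, mean_ct_bcr_abl1, 1)

log_dilution_abl1 = np.array([0, -1, -2])             # 3 logs, ABL1 control
mean_ct_abl1 = np.array([20.1, 23.5, 26.9])
slope_a, intercept_a = np.polyfit(log_dilution_abl1, mean_ct_abl1, 1)

def relative_quantity(ct, slope, intercept):
    """Invert the fitted standard curve: Ct -> relative input quantity."""
    return 10 ** ((ct - intercept) / slope)

# Hypothetical patient-well Ct values -> BCR-ABL1/ABL1 ratio.
bcr_abl1 = relative_quantity(27.8, slope_b, intercept_b)
abl1 = relative_quantity(22.4, slope_a, intercept_a)
print(f"BCR-ABL1/ABL1 ratio: {100 * bcr_abl1 / abl1:.3f}%")
```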
72 Immunophenotypic Phenograms Facilitate Rapid Interpretation of Leukemia and Lymphoma Multiparameter Immunophenotyping Assays
Allen W. Bryan Jr. and German Pihan. Sponsor: Richard Haspel. Department of Pathology, Beth Israel Deaconess Medical Center, Boston, MA.
Multiparameter flow cytometry assays are integral to the diagnosis and classification of leukemia and lymphoma (L&L). Specific attributes of many L&L forms are increasingly recognized as aids to rapid diagnosis, often predicting the driving mutations of these disorders. However, typical numerical displays of phenotypes strip out data such as stain intensity and coexpression patterns, which often convey important diagnostic and prognostic information. A visualization tool that concisely and intuitively displays such information could therefore aid L&L diagnosis.
To facilitate visualization, recognition, and recall of information-dense and specific multiparameter immunophenotypes, we developed an intuitive visualization tool based on simple algorithms that can be routinely integrated into the average clinical cytometry laboratory. Central to this tool is an interactive circular plot displaying presence/absence and expression intensity for multiple L&L antigens. Markers are arranged circumferentially in sectors denoting stereotypical cell type: stem cells, B cells, T cells, NK cells, and myeloid cells. Radial co-display of marker intensity permits succinct assessment of overall expression patterns and likely diagnostic entities. "Phenograms" were produced both by manual data entry and by automated conversion from FCS files using application-specific algorithms and were found to be effective at conveying expression patterns to clinicians and trainees with minimal training required. Immunophenotypic "phenograms" represent a simple and effective way of visually summarizing flow cytometry assays, suitable for training both naive and advanced cytometry practitioners to readily recognize patterns in flow cytometry data, thus enhancing the capabilities of the busy flow cytometry laboratory.
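A minimal sketch of the circular phenogram concept, assuming an illustrative marker panel and invented intensities (here loosely resembling a B-lymphoblastic phenotype); the real tool is interactive and driven by FCS files, which this static matplotlib example does not attempt.

```python
import matplotlib.pyplot as plt
import numpy as np

# Illustrative marker panel grouped by lineage sector and invented
# normalized intensities (CD34/TdT/CD19/CD10 bright, T/NK markers absent).
markers = ["CD34", "TdT",           # stem/blast sector
           "CD19", "CD20", "CD10",  # B-cell sector
           "CD3", "CD5", "CD7",     # T-cell sector
           "CD16", "CD56",          # NK sector
           "CD13", "CD33"]          # myeloid sector
intensity = np.array([0.9, 0.7, 0.8, 0.1, 0.9, 0.0,
                      0.1, 0.0, 0.0, 0.1, 0.3, 0.4])

# Radial bar length encodes expression intensity; angle encodes the marker.
theta = np.linspace(0, 2 * np.pi, len(markers), endpoint=False)
ax = plt.subplot(projection="polar")
ax.bar(theta, intensity, width=2 * np.pi / len(markers) * 0.9)
ax.set_xticks(theta)
ax.set_xticklabels(markers)
ax.set_yticklabels([])
ax.set_title("Phenogram sketch: radius = expression intensity")
plt.show()
```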
73 Thrombin-Containing Rapid Serum Tubes Remove Fibrinogen Bands on Serum Protein Electrophoresis
Nabiha Huq Saifee,1 Monica B. Pagano,1,2 and Mark H. Wener.* Sponsor: Mark H.Wener.1* 1Department of Laboratory Medicine, University of Washington, Seattle, WA, and 2Puget Sound Blood Center, Seattle, WA.
Serum protein electrophoresis is a key part of screening for monoclonal gammopathies. Fibrinogen may be present as a contaminant in poorly clotted serum samples and may be mistaken for a monoclonal protein spike in the β/γ region or may mask a true monoclonal immunoglobulin spike. Fibrinogen in serum samples is most commonly found in patients on anticoagulation therapy or with an indwelling catheter kept open with low-dose heparin; however, it can also be found in patients with coagulation disorders such as dysfibrinogenemias or liver disease. Several methods have been described to distinguish fibrinogen bands from true monoclonal spikes, including removal of fibrinogen using protamine sulfate/thrombin or ethanol precipitation, immunofixation with anti-fibrinogen antibody to demonstrate the presence of fibrinogen, and determination of a γ/IgG ratio. Here, we show that thrombin treatment of fibrinogen-contaminated serum samples with Becton Dickinson (BD) Vacutainer rapid serum tubes (RST) is a convenient, rapid, and effective method for removing suspected fibrinogen bands and revealing underlying monoclonal spikes. The BD Vacutainer RST (368774) contains a thrombin-based clotting agent formulated to clot whole blood within 5 minutes of collection. To remove suspected fibrinogen, 500 μL of serum or plasma is placed in the RST and inverted. After incubation for 5 minutes at room temperature, the specimen is spun for 10 minutes at 2,000 rpm. Protein electrophoresis is then repeated on the treated specimen. Treatment of nine plasma samples with this method completely removed the fibrinogen peak in eight of nine specimens. One specimen containing 450 mg/dL fibrinogen had a small residual fibrinogen band after thrombin treatment. A paired t test of monoclonal peak quantification in plasma and serum samples revealed no difference between samples with and without thrombin treatment (P = .24). Thrombin-containing rapid serum tubes provide a rapid, convenient, and effective method to eliminate fibrinogen from poorly clotted serum specimens, and use of these tubes does not alter the quantification of monoclonal components on serum protein electrophoresis.
74 Targeted Sequencing of Specimens With Limited DNA on a Microfluidics-Coupled NGS Platform for Solid Tumor Diagnostics
Andrew C. Nelson,1 Matthew Schomaker,3 Christine Henzler,1,2 Teresa Kemmer,3 Jon D. Wilson,1 Sophia Yohe,1* and Bharat Thyagarajan.1* 1Department of Laboratory Medicine and Pathology, 2Minnesota Supercomputing Institute, and 3Fairview Health Services, University of Minnesota Health, Minneapolis, MN.
Next-generation sequencing technologies enable the simultaneous detection of many genomic alterations but are hampered by specimens with limited yields of double-stranded DNA, such as cytology biopsies and formalin-fixed, paraffin-embedded (FFPE) tissue. We developed an oncology sequencing panel to accurately test low DNA input specimens in a cost-effective and rapidly adaptable manner. The panel covers 850 hotspot mutations in 19 genes utilizing 86 PCR amplicons produced on an integrated microfluidics chip (IFC). FFPE specimens are subjected to a preliminary quantitative PCR QC step to rule out excessively degraded DNA. Specimens are pooled and sequenced on a MiSeq platform. We investigated assay performance at low DNA input quantities for both cytology and FFPE specimens (ranging from 1–10 ng and 7.5–15 ng, respectively) compared to replicate samples loaded with higher DNA quantities (10–20 ng for cytology and 15–50 ng for FFPE). Uniformity of coverage and reproducibility of variant calls were the two major factors impacting the accuracy of the assay at low DNA input levels. Average coverage was deceptively reassuring at low input levels (>7,400× and >8,000× for cytology and FFPE, respectively) and was not significantly different from the average coverage achieved with higher DNA quantities (>8,700× and >6,700×, respectively). However, 7% of amplicons in low-input specimens dropped below 1,000× average coverage vs only 2% of amplicons for higher DNA inputs. The total number of variants called was increased by 15% (cytology) and 27% (FFPE) for low DNA input samples compared to replicates at higher inputs. Concordance between replicates within the low DNA input range was also poor (<90%). Above the respective 10- or 15-ng minimum inputs, cytologic specimens demonstrated 100% concordance and FFPE specimens demonstrated 96% concordance across all sequenced bases for variants detected at or above a 10% variant allele frequency (VAF). At hotspot locations, detection of known mutations showed 100% sensitivity and a false-positive rate of 0.05% at a 5% VAF limit of detection. Experiments utilizing preamplification to improve accuracy and decrease minimum DNA input are ongoing. Our data indicate that targeted sequencing using a PCR-based microfluidics platform can provide accurate results at reasonable VAF thresholds with challenging clinical specimens.
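The coverage-uniformity check described above reduces to a simple computation over per-amplicon depths, sketched below with simulated values for an 86-amplicon panel; the depth distribution and 1,000× floor mirror the figures in the abstract but are otherwise invented.

```python
import numpy as np

# Simulated per-amplicon mean depths for an 86-amplicon panel; a lognormal
# spread mimics uneven amplification at low DNA input.
rng = np.random.default_rng(1)
depth = rng.lognormal(mean=np.log(8_000), sigma=0.9, size=86)

print(f"average coverage: {depth.mean():,.0f}x")
low = depth < 1_000  # amplicons falling below the 1,000x uniformity floor
print(f"amplicons <1,000x: {low.sum()}/{depth.size} ({100 * low.mean():.1f}%)")
```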
75 Managing Emergency Department Add-On Laboratory Orders
Emily L. Ryan and Ross J. Molinaro.* Department of Pathology and Laboratory Medicine, Emory University School of Medicine, Atlanta, GA.
Laboratory test orders added to specimens after the initial orders have been placed can tax resources and are subject to reporting delays, which may affect patient care. These added-on orders disrupt normal workflow and can also affect staffing needs. We examined 31 days of emergency department (ED) orders (35,840 in total) at two institutions to explore differences between initial ordering patterns and the 627 (1.7%) add-on orders and to determine whether automated storage and laboratory processes could decrease these delays. Most of the laboratory add-on orders were chemistry tests (62%). The following three tests accounted for 45% of the add-on orders: troponin, magnesium, and lipase. For each of these three tests, we assessed time to result for both initial orders and add-on orders. "Time to result" for initial orders was defined as the time from specimen receipt to results sent electronically. "Time to result" for add-on orders was defined as the time from the order being placed electronically to results sent electronically, since the specimen had already been received. Add-on orders were significantly delayed in all three instances. One institution had refrigerated storage on the automation line (hospital 1), while at the second institution, sample storage was off-line (hospital 2). We assessed whether storage attached to the automated line reduced the time to result for add-on orders. For troponin, an off-line test at both institutions, no difference was observed. For magnesium and lipase, both online tests, there was an additional 0.36- to 2.21-hour delay in result reporting for add-on orders when sample storage was off-line. Median time to result for magnesium at hospital 1 was 0.57 hours (IQR, 0.38–1.07), while at hospital 2 it was 2.37 hours (IQR, 1.00–3.38) (P < .0001). The second approach to improving add-on time to result was a cost-benefit analysis of adding magnesium to all comprehensive metabolic panels and reporting the result only in instances where it was ordered. Depending on the institution, it was estimated that $100 to $200 a month in unbillable results would be incurred in order to eliminate the approximately 2-hour delay in result reporting, which could lead to a decreased ED stay. In our assessment, the addition of automated storage is the optimal approach to decrease reporting delays of add-on test orders.
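The two "time to result" definitions can be expressed compactly in code. The sketch below, using invented timestamps and column names, computes the interval from specimen receipt (initial orders) or order placement (add-on orders) to result and summarizes it by test and order type.

```python
import pandas as pd

# Invented order records; column names and timestamps are assumptions.
orders = pd.DataFrame({
    "test": ["MG", "MG", "LIPASE", "MG"],
    "type": ["initial", "addon", "addon", "initial"],
    "received": pd.to_datetime(["2014-01-01 08:00"] * 4),
    "ordered": pd.to_datetime(["2014-01-01 07:55", "2014-01-01 09:30",
                               "2014-01-01 10:00", "2014-01-01 07:58"]),
    "resulted": pd.to_datetime(["2014-01-01 08:40", "2014-01-01 11:45",
                                "2014-01-01 12:30", "2014-01-01 08:35"]),
})

# Initial orders: specimen receipt -> result; add-on orders: order placement
# -> result (the specimen is already in the laboratory).
start = orders["received"].where(orders["type"] == "initial", orders["ordered"])
orders["tt_result_h"] = (orders["resulted"] - start).dt.total_seconds() / 3600

print(orders.groupby(["test", "type"])["tt_result_h"].median())
```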
76 Geographical Distribution of HPV Genotypes in Minnesota
Omar Bushara,1 Jason Borah,1 Sophia Yohe,1 Andrew Nelson,1 Shabnam Zarei,1 Maria Surowiecka,1 Matthew Schomaker,3 Jon Wilson,1 Shalini Kulasingham,2 and Bharat Thyagarajan.1* 1Department of Laboratory Medicine and Pathology, 2Division of Epidemiology and Community Health, University of Minnesota, Minneapolis, MN, and 3University of Minnesota Medical Center, Fairview, MN.
Although one previous study demonstrated geographical differences in the distribution of HPV genotypes, such differences have not been demonstrated within the United States. Given the limited uptake of the HPV vaccine, identification of regions with a higher prevalence of high-risk HPV types may allow allocation of greater resources toward those areas. We extracted HPV information (presence of HPV infection and the HPV genotype) and residential zip codes from the medical records of 44,446 women who underwent screening for cervical cancer using Pap smear and HPV testing at the University of Minnesota Medical Center, Fairview, from January 1, 2008, to December 31, 2012. HPV testing was performed using a PCR-based method that amplified the L1 region using the MY09/MY11 primer sets, followed by a triple restriction enzyme digest using PstI, RsaI, and HaeIII to identify the HPV genotypes. After excluding women with multiple HPV infections (n = 2,463), missing zip code information (n = 1,276), and areas with fewer than 100 women tested (n = 495), we evaluated the geographical distribution of HPV genotypes in 16 counties using Esri ArcGIS 10.2 for Desktop software with geographic boundary files from the National Historical Geographic Information System (www.nhgis.org). The overall prevalence of all HPV infections (39 HPV genotypes) was 19.8%, while the prevalence of high-risk vaccine-preventable HPV infections (HPV16 and 18) (hrHPV) was 3.5%. As reported in previous studies, the prevalence of both all HPV genotypes and hrHPV was higher among younger women (age <30 years) (45.9% and 8.7%, respectively) compared to women ≥30 years of age (15.0% and 2.4%, respectively) (P < .0001). We observed wide geographical variation in the prevalence of all HPV infections and hrHPV, with Goodhue County showing the highest prevalence of both all HPV infections and hrHPV infections (36.7% and 7.9%), while Benton, Kanabec, and Wright counties showed the lowest prevalence of HPV and hrHPV infections (<6.2% and <0.4%; P < .0001). This is the first study to identify specific geographical areas with a higher prevalence of vaccine-preventable HPV infections. Future studies that are more representative of the overall population are needed to replicate these findings and to identify areas for targeted campaigns to improve HPV vaccination rates.
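The mapping analysis rests on a per-county prevalence aggregation that can be sketched as follows; the study used ArcGIS for the geographic step, and the simulated records and prevalence parameters below are illustrative only.

```python
import numpy as np
import pandas as pd

# Simulated screening records for three counties; the prevalence parameters
# loosely echo the overall rates quoted above but are otherwise invented.
rng = np.random.default_rng(2)
county = np.repeat(["Goodhue", "Benton", "Wright"], 500)
df = pd.DataFrame({
    "county": county,
    "hpv_pos": rng.random(county.size) < 0.20,    # any of 39 genotypes
    "hpv16_18": rng.random(county.size) < 0.035,  # vaccine-preventable hrHPV
})

summary = df.groupby("county").agg(
    n=("hpv_pos", "size"),
    any_hpv_pct=("hpv_pos", lambda s: 100 * s.mean()),
    hr_hpv_pct=("hpv16_18", lambda s: 100 * s.mean()),
)
summary = summary[summary["n"] >= 100]  # study excluded areas with <100 women
print(summary.round(1))                 # per-county rates, ready for mapping
```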
77 Clinical Pathology Intervention Improves Rate of Heparin Anticoagulation
Grant P. Harrison,1 Matt Mullen,2 Thomas R. Vetter,3 Ron Turman,1 Patti Tichenor,1 Laura Taylor,1 and Marisa B. Marques.1* Departments of 1Pathology, 2Medicine, and 3Anesthesiology, University of Alabama at Birmingham, Birmingham, AL.
Unfractionated heparin (UFH) remains the mainstay treatment for acute thrombosis in hospitalized patients. Due to its inherent properties, UFH use must be monitored to ensure therapeutic anticoagulation. Following repeated observations that the partial thromboplastin time (PTT) did not agree with the anti-Xa result, we undertook a prospective side-by-side comparison of the two assays, which resulted in the practice changes described here. We worked as a multidisciplinary group through three phases to optimize UFH therapy: period 1 was the baseline (April–July 2012), when PTT was used to monitor UFH; period 2 began when our pharmacy protocols were renamed to improve the likelihood of physicians choosing the correct dose of UFH (January–March 2013); and period 3 (April–July 2013) began when we replaced the PTT with the anti-Xa assay to monitor UFH. For each period, we calculated several variables. At baseline, 94 of 177 (53%) patients had anti-Xa of at least 0.3 U/mL (therapeutic range, 0.3–0.7 U/mL) at an average of 18.3 hours from the start of UFH infusion, compared with 109 of 165 (66%) at 19.0 hours and 135 of 170 (79%) at 20.1 hours for periods 2 and 3, respectively (P < .0001 overall). At the end of period 1, we realized that 69% of patients were started on UFH infusions without a bolus ("soft protocol"), which is not indicated for the treatment of venous or arterial thrombosis. At that point, UFH protocols were renamed by their intention, such as "DVT/PE," "cardiac," and "OB," and compliance with the correct dose improved. Furthermore, for periods 1 and 2, when PTT and anti-Xa were run concurrently, the PTT agreed with the anti-Xa only 46% of the time, with the remainder being falsely low or falsely high at similar rates; this disagreement between the two assays was statistically significant (P < .001). Based on these data and following a hospital-wide campaign, we stopped using the PTT and replaced it with the anti-Xa for routine UFH monitoring. As described above, the percentage of patients anticoagulated at 18 to 20 hours from the start of the UFH infusion increased by 26% (53% in period 1 to 79% in period 3). Across periods 1, 2, and 3, the mean age (SD) of the patients (58 ± 18, 59 ± 14, and 60 ± 15 years, respectively; P = .45), the median length of stay (P = .80), and the in-hospital mortality (12.4%, 9.1%, and 13.5%; P = .43) were not different. We conclude that the clinical laboratory's collaboration with pharmacy to update the UFH protocols and an education campaign to institute the anti-Xa assay as the test of choice for monitoring UFH have improved outcomes, with a significantly higher percentage of patients fully anticoagulated before 24 hours.
78 Massive Transfusion Protocol Initiation Does Not Cause Increased Plasma Usage in a Level I Trauma Center
Awais Khan,1 Lance A. Williams III,2 and Marisa B. Marques.2* Departments of 1Medicine and 2Pathology, University of Alabama at Birmingham, Birmingham, AL.
While plasma has a clear indication for massive hemorrhage and thrombotic thrombocytopenic purpura (TTP), its benefits remain unproven for most other indications, such as prevention of bleeding prior to bedside interventions. Optimizing the use of blood components is a current focus in transfusion medicine. While reduction in the utilization of red blood cells has been the primary goal at our institution, we have also observed an overall decrease in the amount of plasma transfused over the last 5 to 6 years. The purpose of this study was to analyze how many plasma units were transfused by each service line, such as trauma/burn, surgical, and nonsurgical specialties, in 2008 compared with 2013, especially since we initiated a massive transfusion protocol (MTP) in 2011. Our MTP contains six red blood cell units and four units of plasma and is primarily utilized for trauma-induced massive hemorrhage. We used the transfusion service computer system to collect data for a period of 4 weeks in the months of August and September of each year, excluding plasma given to patients with TTP. The total demand for plasma decreased by 32%, from 822 units in 2008 (pre-MTP) to 560 in 2013 (post-MTP). The reduced need was due to 33% fewer patients transfused, since the average number of units/patient remained constant at 3.7. When analyzed by service, trauma/burn patients received the highest mean number of units, at 5.5 and 6.2 in 2008 and 2013, respectively. Patients in surgical specialties went from 3.3 to 3.7 units/patient, while the average decreased from 3.3 to 2.8 pre- and post-MTP, respectively, for patients admitted to nonsurgical specialties. The contribution of each service to overall utilization showed an increase for trauma/burn from 25% to 29%, consistent with the availability of the MTP, while surgical use decreased from 39% to 37% and nonsurgical use remained similar at 36% and 35%. Our analysis revealed that the overall decrease in plasma utilization was due to fewer units for all indications when all patients were combined. Only patients with nonsurgical conditions received fewer units/admission. Moreover, the initiation of the MTP did not result in a dramatic increase in overall usage of plasma per patient, especially in the trauma/burn group.
81 Evaluation of the Effects of PREVI Color Gram Automated Staining System on Reporting of Positive Blood Culture Samples
Ari S. Nowacek, Angella Charnot-Katsikas, Sue Boonlayangoor, Kathleen G. Beavis, and Vera Tesic. Sponsor: Vera Tesic. Department of Pathology, University of Chicago Medical Center, Chicago, IL.
Sepsis is associated with significant morbidity and mortality. As such, prompt treatment with appropriate antibiotics is crucial. To achieve this, bacteria grown from blood cultures must be accurately identified and reported. The first and, arguably, most crucial step in this process is the Gram stain. The Gram stain is a vital tool for characterizing causative agents in bacteremia. Gram stain results are used by physicians immediately, before culture and susceptibility results are available, to place patients on antibiotics likely to successfully treat their infection.
The manual Gram stain technique has been used effectively for over 100 years. However, this method is limited by a number of issues. Importantly, the results are highly dependent upon the user's experience. Thus, even with a standardized procedure, results can still vary between users. To reduce this variability, automated Gram stainers have been developed and utilized by clinical microbiology laboratories.
The purpose of this study was to evaluate the effect of one model, the PREVI Color Gram Automated Stainer, on the accuracy of reporting Gram stains from positive blood cultures. We analyzed more than 11,000 positive blood cultures over a 40-month period and compared the number of corrective reports generated before and after implementing automation. The number of corrective reports was significantly reduced, from 9.9/1,000 to 6.1/1,000. There was also a redistribution of the types of mistakes made. Automation reduced most types of interpretation error, but the greatest decrease was in the misinterpretation of gram-positive bacilli as gram-negative bacilli. However, automation increased the misinterpretation rate of bacteria as yeast.
Overall, these data provide evidence that Gram stain automation, which decreases the variability associated with traditional manual methods, can reduce the rate of Gram stain misinterpretation from positive blood cultures. This, in turn, should improve clinical outcomes by helping to ensure that patients are initially started on the correct antibiotics for their infection.
82 Improved Accuracy of Medication Compliance Determination and Significant Reduction in Turnaround Time Using a Qualitative Time-of-Flight Mass Spectrometry and Immunoassay-Based Screening Approach
Kelly Doyle,1 Chantry Clark,2 and Frederick G. Strathmann.1,3* 1Department of Pathology, University of Utah; 2ARUP Laboratories, Salt Lake City, UT; and the 3ARUP Institute for Clinical and Experimental Pathology, Salt Lake City, UT.
Conventional approaches to pain management compliance testing utilize antibody-based screens and point-of-collection (POC) cups for rapid initial assessment. Disadvantages of POC cups and immunoassay-based screens include low specificity, false positivity/negativity, narrow detection profiles within drug classes, and limited ability to accurately determine compliance from the results. Positive results typically require confirmation by mass spectrometry, while negative results often go uninvestigated, resulting in inefficient and potentially incorrect compliance determination. In this study, compliance determination from urine specimens using a comprehensive hybrid assay that combines heterogeneous immunoassays having acceptable performance with time-of-flight (TOF) mass spectrometry was compared to a conventional immunoassay screen with reflex to mass spectrometry confirmation. Our objectives were to (1) compare the in-laboratory turnaround time (TAT) and total TAT between assays, (2) determine if false-negative/positive results from the immunoassay screen are resolved by the hybrid assay, and (3) demonstrate the utility of the hybrid assay in providing qualitative evidence of specific compounds within a drug class and an overall compliance assessment. ANOVA and post hoc analysis of 3 months of patient results indicated a statistically significant reduction in the mean analysis time for the hybrid assay compared to the screen-with-reflex workflow for both in-laboratory time (32.2 hours) and total time (76.5 hours) and compared to confirmation alone (19.2 hours in-laboratory time; 32.2 hours total time). Accuracy of compliance interpretation was assessed on 42 residual urine specimens with known prescription histories and a positive immunoassay screen result. A subset of samples was also analyzed using NexScreen POC cups, with performance nearly equivalent between the NexScreen POC and the conventional immunoassay with the exception of three additional positive benzodiazepine results identified by the POC device. The hybrid assay was superior in confirming compliance per patient (33/42 vs 27/42) and per prescription (48/57 vs 40/57), as well as in identifying evidence of nonprescription substance abuse (12 vs 8). These data demonstrate the utility of a combined mass spectrometry and immunoassay approach in providing clinicians a method to determine medication compliance and substance abuse.
83 An Optimal Von Willebrand Activity to Antigen Ratio: Cost-Effective Method of Diagnosing Von Willebrand Disease With Abnormal Multimer Pattern in Children
Esther P. Soundar,1 Lakshmi Srivaths,2,3 Rabia Shafi,1 Jun Teruya,1,3 and Shui-Ki R. Hui.1,3* Departments of 1Pathology & Immunology and 2Pediatric Hematology, Baylor College of Medicine, and 3Texas Children’s Hospital, Houston, TX.
Laboratory diagnosis of von Willebrand disease with abnormal multimer pattern (VWD-AMP) can be especially challenging due to the lack of an established, standardized von Willebrand factor activity to antigen ratio (VWFCoF/Ag) trigger for multimer analysis (MA), especially in pediatrics. Current published guidelines recommend a VWFCoF/Ag of <0.5 to 0.7 as a trigger for MA. Previously, our hospital's standard of practice was to use a VWFCoF/Ag of <0.5 to 0.7 or a ristocetin cofactor activity of <80% as a trigger for MA, based on published data and anecdotal evidence. Recently, we performed a retrospective study that demonstrated, via receiver operating characteristic analysis, that a VWFCoF/Ag of <0.7 is the optimal trigger for MA for VWD-AMP. We sought to examine the effect of implementation of the new trigger for MA in terms of both diagnosis and cost-effectiveness. Methods: We performed a retrospective chart review of children with a VWF panel performed at our tertiary hospital 1 month pre- and postimplementation of the newly established VWFCoF/Ag trigger for MA. We compared the proportion of patients who had MA performed during those time periods. We also calculated the cost-effectiveness ratio (CER), the cost per unit of outcome (correct diagnosis of VWD-AMP), for both the pre- and postimplementation months and compared the CERs to determine which trigger for MA costs less per correct diagnosis. Results: Prior to implementation of the new trigger, a significantly higher proportion of patients (20 of 42, 47.6%) had MA performed compared to those (13 of 78, 16.7%) in the postimplementation period (P < .001). In the preimplementation period, only one patient (5%) was confirmed to have an abnormal MA, a proportion significantly lower than in the postimplementation period, in which three patients (23.1%) were confirmed to have VWD-AMP (P < .001). The cost-effectiveness analysis shows that our new trigger of VWFCoF/Ag <0.7 for MA costs $1,083 per correct diagnosis of VWD-AMP, whereas the old trigger costs $5,000 per correct diagnosis. Conclusion: Diagnosis of VWD-AMP in children is often difficult, and the measurement of VWFCoF/Ag and MA are key components. The new VWFCoF/Ag trigger for MA results in significantly fewer MAs and a substantially lower cost per correct diagnosis. This study shows that a locally established, data-driven trigger for MA can yield a cost-effective algorithmic approach to diagnosing VWD-AMP.
84 Comparison of Minimal Residual Disease Detection by Multi-Parameter Flow Cytometry and PCR in NPM1-Mutated vs NPM1/FLT3 ITD Mutated Acute Myeloid Leukemia
Jesse Manuel Jaso and Pei Lin.* The University of Texas MD Anderson Cancer Center, Houston, TX.
Mutations in the NPM1 gene occur in up to 50% of patients with acute myeloid leukemia and a normal karyotype (NK-AML). NK-AML with isolated NPM1 mutation has a good prognosis that is adversely affected by the presence of internal tandem duplications (ITD) in the FLT3 gene. Further prognostic information can be obtained by assessment of response to initial chemotherapy via the measurement of minimal residual disease (MRD). MRD can be measured by immunophenotypic assessment using flow cytometry (FC) or by PCR-based detection of the mutated gene. While PCR-based methods are generally more sensitive, an optimal method of MRD detection has not been fully established. We searched our patient database from 2010 to 2014 for patients with de novo untreated NK-AML and NPM1 mutation who underwent serial MRD monitoring by both eight-color FC and PCR. We identified 19 patients (seven men, 13 women; median age at diagnosis, 57 years). Of these patients, eight had isolated NPM1 mutation (NPM1mut) and 12 had concurrent FLT3 ITD mutation (NPM1mut/FLT3mut). Detection of MRD by either method at any time point was associated with relapse (median time to relapse, 7 months). No patient with negative MRD by both methods relapsed. Among the NPM1mut group, five of eight patients (63%) were negative for MRD by FC and PCR after the first course of induction therapy; however, three of eight (38%) had at least one sample with MRD and subsequently relapsed. Among the NPM1mut/FLT3mut group, nine of 12 (75%) patients were negative for MRD by both FC and PCR after one course of induction. However, six of 12 (50%) had at least one sample with MRD, followed by relapse. Only one patient had discordant results between FC and PCR, with persistent MRD detected by FC only. This patient was NPM1mut and had extramedullary relapse involving the skin 2 months after an MRD-negative bone marrow obtained after one cycle of chemotherapy. Our results show that MRD measurement by FC and PCR correlates well in NK-AML patients with NPM1 mutation with or without concurrent FLT3 ITD mutation and that MRD detection by either method correlates with impending relapse. However, relapse may still occur, including in extramedullary sites, despite lack of MRD. In addition, our results confirm that the majority of patients in both groups quickly achieve MRD-negative status after induction, with slightly higher rates of relapse in NPM1mut/FLT3mut patients. Because rare cases may show discordant results, the combined use of both methods may be optimal.
87 Ordering Practices and Utility of Peripheral Blood Smears to “Rule Out Hemolysis”
Daniel Martig and Michael A. Linden.* Department of Laboratory Medicine and Pathology, University of Minnesota, Minneapolis, MN.
Objectives: Our institution has a tradition of routinely ordering numerous peripheral blood smears for professional interpretation by hematopathologists. A typical clinical scenario includes an anemic patient whose peripheral blood is sent for our review to "rule out hemolysis." While this practice had been a gold standard in evaluating the anemic patient, quantitative and qualitative clinical laboratory tests are now available to help evaluate for hemolysis. Moreover, the incidence of hemolytic anemia is rather low. In this study, we sought to examine physician ordering practices in the evaluation of anemia, including hemolysis, and to determine the utility of the peripheral smear in this context. Methods: We reviewed pertinent clinical and laboratory data corresponding to peripheral smears sent to our division for morphologic review in December 2013. Selected chart review was performed where indicated. Results: During the selected time period, there were 213 blood smears corresponding to 202 patients, 51.2% female and 48.8% male. The mean age was 49 years (range, 4 days to 91 years). Across all smears reviewed, the mean hemoglobin was 10.1 g/dL (range, 4.8–16.6), and the mean MCV was 89 fL (range, 46–116). Many of the cases had additional ancillary testing performed, including reticulocyte count (57.3%), total bilirubin (77.9%), haptoglobin (11.7%), LDH (43.2%), d-dimer (6.1%), and fibrinogen (24.9%). Clinical indications were heterogeneous, although many were ordered to assess for hemolysis, DIC, and/or TTP/HUS (24.9%) or anemia (25.3%). Of all cases, only one patient (representing two smears) had findings supporting a clinical diagnosis of a microangiopathic hemolytic anemia. Of the smears, 8.9% showed rare or very rare fragmented red blood cells, including helmet cells and schistocytes, but these were insufficient for a diagnosis of hemolytic anemia. Moreover, 53% of the smears with fragments were from patients who had no clinical suspicion of hemolysis. Conclusions: A peripheral blood smear can be helpful to guide diagnostic testing. In our institution, however, while the smear is ordered in parallel with other laboratory tests, quantitative tests are resulted first and most often influence clinical decisions. In addition, in most instances, the presence of rare red cell fragments on a smear does not support a diagnosis of a hemolytic anemia. Based on these findings, we propose an algorithm by which smears are reviewed for morphologic features of hemolytic anemia only when the CBC, clinical history, and laboratory data heighten the pretest probability.
88 Identifying Variability in Test Utilization With Big Data in the Veterans Affairs Healthcare System
Ronald G. Hauser and Cynthia A. Brandt. Sponsor: Christopher A. Tormey. Department of Veterans Affairs, West Haven, CT.
Introduction: Health care quality includes the effective delivery of care, meaning that the delivery of health care originates from scientific knowledge, not provider idiosyncrasy. We have proposed a metric, the calculated pretest probability (PTP), to quantify utilization variability. It supposes that differences in patients' pretest probability of disease correlate with their probability of an abnormal diagnostic test. Intuitively, a patient visiting sub-Saharan Africa has a higher probability of malaria than a hermit in Canada and therefore is more likely to have malaria on a blood smear. If we observe large differences in abnormal rates between providers, and the sample is large enough, it is quite likely the providers test populations with different risk. With this approach, we assess utilization variability by comparing abnormal rates between provider groups in Veterans Affairs (VA) hospitals. Method: We began by selecting 251 high-volume tests with noninterpretive results (normal/abnormal only). The test results from 2013 were extracted from the VA Region 4 data warehouse. We then standardized the results to "normal"/"abnormal"/"other" with 3,219 unique, manually curated rules. Tests with >5% of results as "other" were dropped. We determined the positive rate for each hospital and each geographically clustered hospital group (VISN). The results were grouped in two ways: (1) by VISN and (2) by hospitals within a VISN. Groups with a member having fewer than 100 tests were excluded. Results: We extracted 31.2 million laboratory results from the 251 tests chosen. After standardizing the results, we excluded 43 tests with >5% of results as "other." Grouping the results and excluding groups with a member having fewer than 100 tests left 51 VISN groups and 214 hospitals clustered within a VISN. The positive rate differed by more than 2-fold between the lowest and highest abnormal rate in 21 of 51 VISN groups. In four groups, the positive rate differed by more than 10-fold. VISN comparisons not meeting the fold cutoff often still showed wide variation, such as HbA1c between VISN1 (78% positive, N = 18,957) and VISN2 (53% positive, N = 89,281). Hospitals within VISNs showed similar variability (100/214 with at least a 2-fold difference in positive rate). Conclusion: Our experiment highlights widespread variability in a small subset of high-volume tests. While the results shown here demonstrate differences in the positive rate, previous case studies by our group have linked differences in abnormal rates to differential population disease risk. Overall, this project employs a simple method to identify variability in test utilization.
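The core variability metric reduces to a grouped positive-rate computation with the study's exclusion rules. A sketch under stated assumptions follows, with simulated results for two VISNs at the HbA1c rates quoted above.

```python
import numpy as np
import pandas as pd

# Simulated HbA1c results for two VISNs at the positive rates quoted above.
rng = np.random.default_rng(3)
frames = [
    pd.DataFrame({"visn": visn, "abnormal": rng.random(n) < p})
    for visn, p, n in [("VISN1", 0.78, 18_957), ("VISN2", 0.53, 89_281)]
]
results = pd.concat(frames, ignore_index=True)

# Positive rate per group, with the study's low-volume exclusion.
rates = results.groupby("visn")["abnormal"].agg(n="size", pos_rate="mean")
rates = rates[rates["n"] >= 100]
fold = rates["pos_rate"].max() / rates["pos_rate"].min()
print(rates.round(3))
print(f"fold difference in positive rate: {fold:.2f}")
```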
90 Effect of Overnight Storage on Post-Thaw Viabilities of Hematopoietic Progenitor Cells
David R. Gibb,1 Donna Summers,2 and Diane S. Krause.1,2 Sponsor: Christopher Tormey. 1Department of Laboratory Medicine and 2Yale Stem Cell Center, Yale University School of Medicine, New Haven, CT.
High-dose myeloablative chemotherapy followed by autologous or allogeneic transplantation of hematopoietic progenitor cells (HPCs) is standard-of-care treatment for a variety of malignancies. Cryopreservation of autologous HPCs is necessary to allow for administration of chemotherapy. Currently, there are no standard guidelines for the storage of cells prior to cryopreservation, including optimal cell concentrations. Additionally, depending on laboratory workflow or the need to send HPCs to other centers, cryopreservation may occur immediately after collection or following overnight storage at 4°C. Multiple studies have demonstrated that prolonged storage at 4°C may reduce the viability of CD34+ stem cells and lead to prolonged platelet engraftment. As part of our ongoing process improvement program, we compared postthaw cell viabilities of HPCs collected via apheresis (HPC-A) and cryopreserved on the same day or the day after apheresis collection. As previously validated, there was no statistical difference between the mean viabilities for HPCs frozen on the same day (n = 4,224) vs the following day (n = 318). On further analysis, we observed a trend toward lower viabilities with increased cell concentration. When we compared postthaw viability for cells cryopreserved after overnight storage at different concentrations, there was a statistically significant decrease in viability for those frozen at a cell concentration >3 × 10⁸/mL, which had an average viability of 56.1% ± 15.5% (SD), vs those frozen at <1 × 10⁸/mL, which had a viability of 66.7% ± 12.8% (SD) (P < .0001). Due to large variations in viabilities among different patients, we next performed a paired t test analysis on the 125 paired products from donors who had one collection frozen on the same day and another frozen after overnight storage. This paired analysis revealed a significant decrease in the viability of cells stored overnight at 4°C (no storage, 67.1 ± 12.9; 4°C storage, 57.9 ± 13.9; P < .0001). Thus, we conclude that postthaw viability can be improved if all products are cryopreserved on the same day as collection or, when cryopreserved after overnight storage, frozen at a cell concentration <1 × 10⁸/mL. There was no correlation between postthaw viability and time to engraftment, most likely because the dose of viable CD34+ cells infused was adequate for rapid engraftment.
92 Comparison of Vancomycin-Intermediate and Vancomycin-Sensitive Staphylococcus aureus Using Matrix-Assisted Laser Desorption/Ionization–Time-of-Flight Mass Spectrometry
Cheryl Mather, April Abbott, and Susan Butler-Wu. Sponsor: Geoffrey Baird.* Department of Laboratory Medicine, University of Washington, Seattle, WA.
Vancomycin-intermediate Staphylococcus aureus (VISA) infections are associated with a higher risk of treatment failure and poorer outcomes compared to vancomycin-sensitive S aureus (VSSA) infections. The goal of this study was therefore to assess whether matrix-assisted laser desorption/ionization–time-of-flight mass spectrometry (MALDI-TOF MS) could be used to distinguish VISA from VSSA isolates based on differential protein expression patterns. Twenty-three S aureus isolates, 12 VISA (MIC ≥4 μg/mL) and 11 VSSA (MIC ≤1.5 μg/mL), were grown on a variety of routine culture media with and without vancomycin and subjected to analysis by MALDI-TOF MS. The presence or absence of protein peaks and the average intensity of each peak were determined for each isolate. When repeatedly sampled from blood agar, there was little difference between the two groups with regard to the presence or absence of peaks or expression patterns. However, when vancomycin was present in the blood agar, we found one peak unique to VSSA isolates. Furthermore, the expression pattern also demonstrated significant differences, with three VSSA peaks present at ≥1.8× intensity (overexpressed) compared to VISA isolates. Using Mueller-Hinton (MH) agar with vancomycin, we found additional significant differences in both the presence of peaks and their expression pattern, with nine peaks overexpressed in VISA and two peaks overexpressed in VSSA isolates. Using support vector machine analysis, we were able to correctly classify all VISA and VSSA isolates from the MH agar using a minimum of four VSSA and five VISA isolates for training. We therefore conclude that there are differences between VISA and VSSA isolates that are appreciable by MALDI-TOF MS. The inclusion of primary media containing low concentrations of vancomycin followed by MALDI-TOF MS analysis may provide a clinically useful way to identify vancomycin-intermediate S aureus in a more timely fashion than routine methods.
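A sketch of the support vector machine classification step, assuming each isolate is summarized as a vector of peak intensities; the simulated data, number of peaks, and discriminating-peak shift are invented, and only the train/test split sizes mirror the abstract.

```python
import numpy as np
from sklearn.svm import SVC

# Simulated peak-intensity vectors: 30 peaks per isolate, with the first
# four peaks shifted upward in VISA to mimic differential expression.
rng = np.random.default_rng(4)

def isolates(n, shift):
    peaks = rng.normal(1.0, 0.1, size=(n, 30))
    peaks[:, :4] += shift
    return peaks

X = np.vstack([isolates(12, 0.8), isolates(11, 0.0)])  # 12 VISA, 11 VSSA
y = np.array(["VISA"] * 12 + ["VSSA"] * 11)

# Train on five VISA and four VSSA isolates (the minimum reported above)
# and classify the remainder.
train = np.r_[0:5, 12:16]
test = np.setdiff1d(np.arange(len(y)), train)
clf = SVC(kernel="linear").fit(X[train], y[train])
accuracy = (clf.predict(X[test]) == y[test]).mean()
print(f"held-out accuracy: {accuracy:.2f}")
```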