Journal of Digital Imaging. 2019 Mar 11;33(1):54–63. doi: 10.1007/s10278-019-00194-3

Free DICOM-Viewers for Veterinary Medicine

Survey and Comparison of Functionality and User-Friendliness of Medical Imaging PACS-DICOM-Viewer Freeware for Specific Use in Veterinary Medicine Practices

Andreas Brühschwein 1, Julius Klever 1, Anne-Sophie Hoffmann 1, Denise Huber 1, Elisabeth Kaufmann 1, Sven Reese 2, Andrea Meyer-Lindenberg 1
PMCID: PMC7064695  PMID: 30859340

Abstract

There is increasing prevalence of digital diagnostic imaging in veterinary medicine with a progressive need to use medical imaging software. As Digital Imaging and Communications in Medicine (DICOM)-viewers for veterinary use do not require medical device approval in many countries, freeware viewers might be a practical alternative. The aim of this study was to identify and evaluate free DICOM-viewer software for veterinary purposes. The functionality and user-friendliness of various DICOM-viewers from the internet were analyzed and compared. Inclusion criteria for the evaluation were free availability, PACS (picture archiving and communication system)-connectivity, and stand-alone and client-based software. Based on this, eight viewers were found: Ginkgo CADx, Horos, K-PACS, MAYAM, MITO, OsiriX Lite, RadiAnt, Synedra personal. In these DICOM-viewers, 14 core tools were tested and rated on a score from 1 to 10 by multiple observers with different levels of training, using studies of four imaging modalities. Criteria were functionality and user-friendliness. For each viewer, the total number of a predefined set of 47 important tools was counted. The ranking based on functionality and user-friendliness of 14 core tools (mean score in brackets) was as follows: 1. Horos/OsiriX Lite (8.96), 2. RadiAnt (8.90), 3. K-PACS (8.02), 4. Synedra (7.43), 5. MAYAM (6.05), 6. Ginkgo CADx (5.53), 7. MITO (3.74). The DICOM-viewers offered between 20 and 44 tools of the predefined important tool set and are sufficient for most veterinary purposes. An increasing number of tools did not necessarily impair user-friendliness, if the user interface is well designed. Based on the results of this study, veterinarians will find suitable free DICOM-viewers for their individual needs. In combination with PACS-freeware, this allows veterinary practices to run a low-budget digital imaging environment.

Keywords: Digital Imaging and Communications in Medicine (DICOM), Imaging software, Diagnostic radiology, Diagnostic imaging, Digital radiography (DX), Computed tomography (CT), Magnetic resonance imaging (MRI), Diagnostic ultrasound (US)

Introduction

In medicine, the growing prevalence of digital diagnostic imaging has changed the workflow. Today, the replacement of conventional hard copy films by digital radiography has reached small veterinary practices. As a consequence, there is increasing exchange of DICOM (Digital Imaging and Communications in Medicine) data among veterinarians. Even practices that continue to work with conventional film radiography regularly need to handle digital diagnostic imaging media from other veterinarians, commonly DICOM-data of imaging studies on compact discs (CD). Various clinics use different viewer software. The use of self-contained viewers on CD can be cumbersome and can also cause technical problems [1]. DICOM is an open cooperative standard to transmit medical imaging data [2–8]. DICOM-viewers are medical software for the display of DICOM image files on computer monitors [9–18]. Single-veterinarian practices and small veterinary clinics have limited financial resources for picture archiving and communication systems (PACS) and medical imaging software. For human use, medical imaging software commonly requires medical device approval by national authorities. In the USA, the Food and Drug Administration of the Department of Health and Human Services [19] regulates radiology devices (Part 892) through the Code of Federal Regulations (21CFR892) [20]. In the European Union, the European Commission is responsible for European legislation and regulates medical devices by Council Directives [21, 22]. In most countries, to the authors' knowledge, the use of DICOM-viewer software for diagnostic imaging in animals requires neither regulatory medical device approval nor legal licensing. This also applies to hardware such as computer screens used for diagnostic purposes. Officially approved medical device software for use in human medicine is commercially available but often expensive and not profitable for veterinarians.
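As an illustration of the standard mentioned above (not code from the study): per the DICOM file format specification (PS3.10), a conforming file begins with a 128-byte preamble followed by the four magic characters "DICM". Viewers use this marker, among other checks, to recognize DICOM data on media such as patient CDs. A minimal sketch:

```python
# Hedged sketch: recognize a DICOM Part 10 file by its magic bytes.
# A DICOM file starts with a 128-byte preamble (content is
# application-defined) followed by the characters "DICM".

def looks_like_dicom(path):
    with open(path, "rb") as f:
        preamble = f.read(128)  # preamble content is unconstrained
        magic = f.read(4)
    return len(preamble) == 128 and magic == b"DICM"
```

Real viewers then parse the data elements (tag, value representation, length, value) that follow the marker; libraries such as pydicom handle this parsing.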

Medical imaging freeware from the internet can solve the discrepancy between financial restriction and the need for appropriate technical solutions [23]. An open-source PACS in combination with free DICOM-viewers allows a low-cost digital imaging environment [23]. Various surveys have assessed freeware and open-source software [9–18]. These studies commonly focused on imaging research, image processing, and human medicine, and the software was evaluated by imaging or software experts or by radiologists. Veterinary practitioners are often not extensively trained in diagnostic imaging but commonly need to diagnose animal diseases using medical imaging software. In veterinary practices, there is rising demand for DICOM-viewer software, driven by the spread of digital diagnostic imaging.

The goal of our study was to identify, analyze, compare, and assess free DICOM-viewers for use in veterinary practices.

Materials and Methods

The design of the study was prospective and experimental. We searched the internet with a non-systematic approach for free medical DICOM image viewing software using an internet search engine and the terms DICOM, viewer, software, medical imaging software, freeware, free, and open-source, individually and in various combinations. Inclusion criteria for evaluation in this study were free availability and stand-alone and client-based software. There were no minimum requirements except PACS-connectivity.

The software was downloaded from the internet and installed, launched, and reviewed. For comparison of Microsoft Windows and Apple software, we used computers with equal hardware (64-bit operating system, 8-GB RAM, quad-core CPU, Gbit LAN). We used standard commercially available computer screens without medical device approval. The loading time of a test CT study (507 CT images, 268 MB) was measured for all viewers. Two experienced veterinary radiologists, based on their subjective opinion, defined a set of 47 overall important tools and a smaller subset of 14 of these tools judged most relevant for the needs of veterinary diagnostic imaging (core tools). One observer listed and evaluated all viewer tools for operational capability in every program, compared them with the predefined set of 47 important tools, and counted the relevant proportion. The set of core tools underwent further evaluation by various observers who were recruited at the Centre of Clinical Veterinary Medicine of the LMU Munich by email and personal invitation. The observers were asked to record gender, veterinary degree, total years of veterinary training and experience, area of expertise or interest, as well as a self-assessment of general computer skills on a scale from 1 to 10 (1 = beginner, 10 = expert) using a survey form. All DICOM-viewers were evaluated in a randomized order by each observer. For every DICOM-viewer, the test person was asked to note personal experience and prior practice with this software (1 = never used, 10 = often used).

We used a set of anonymized imaging studies that are commonly used in veterinary medicine to test the viewer tools:

  1. radiography (DX, ventro-dorsal canine pelvic digital radiograph of a flat panel detector, Siemens Axiom Luminos dRF),

  2. diagnostic ultrasound (US, multiframe DICOM-File with 101 single images of a canine abdomen, GE Logiq E9),

  3. computed tomography (CT, two corresponding transverse plane pre- and post-contrast series of a canine head and brain with a total of 120 images of a helical multi-slice scanner, Siemens Definition AS),

  4. magnetic resonance imaging (MRI, one sagittal and two corresponding transverse plane series of a canine spine with a total of 51 images from a high-field MRI scanner, Siemens Magnetom Symphony).

Observers were asked to perform the following specified tasks and score the operability of the core tools for each viewer using a scale from 1 to 10 (1 = very hard, 10 = very easy). The set of basic tools containing zoom (magnification), pan, windowing (contrast and brightness), rotate, flip (mirror), and measurements of length and angle was evaluated on a radiograph. Cine-mode (movie) was tested on a diagnostic ultrasound video loop. Scrolling, split screen viewing, slice level synchronization in two parallel series, and use of cross-reference lines in two orthogonal plane series were evaluated in an MRI study. Use of and switching between window presets and Hounsfield unit measurement were tested in a CT study. The whole evaluation process was supervised by one of the authors, who guided the observer in order to guarantee compliance with the fixed test protocol.
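Purely to illustrate the windowing and Hounsfield unit (HU) tools named above (this is not code from the study): a viewer converts stored CT pixel values to HU using the rescale slope and intercept from the DICOM header, then maps a window center/width pair, such as a brain preset, linearly onto the display gray range. The slope and intercept below are typical example values, not universal constants.

```python
# Hedged sketch of a CT windowing tool and an HU readout.

def to_hounsfield(stored_value, slope=1.0, intercept=-1024.0):
    # HU = stored pixel value * RescaleSlope + RescaleIntercept
    # (slope/intercept come from the DICOM header; defaults are typical)
    return stored_value * slope + intercept

def apply_window(hu, center, width):
    # Map HU linearly onto the 0..255 display range: values below the
    # window bottom become black, values above the top become white.
    lower = center - width / 2.0
    level = (hu - lower) / width * 255.0
    return int(max(0.0, min(255.0, level)))

# Example: with a brain preset (center 40 HU, width 80 HU), soft tissue
# near 40 HU renders mid-gray, while dense bone and air are clipped.
```

Switching presets, as tested in the CT study, amounts to re-running the same mapping with a different center/width pair.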

After the evaluation period and prior to statistical analysis, the supervisors of the evaluation process reported their experience with the viewers. Based on their impressions, a consensus ranking was established.

Finally, the results of the multi-observer scorings were evaluated using statistical analysis software (SPSS version 23, IBM). Descriptive statistical analysis was performed for the test persons and the scoring results. Mean scores for each assessed tool in each viewer were separately calculated for the total viewer score, for the test person category (students, veterinarians, veterinary radiologists), and for areas of applications (radiography, sonography, CT/MRI).

For radiography, a mean score based on the set of basic tools containing zoom (magnification), pan, windowing (contrast and brightness), rotate, flip (mirror), and measurements of length and angle was calculated. For the ultrasound, ranking the mean score of the basic tools in combination with the cine-mode (movie, video) tool was considered. For CT and MRI assessment of the viewer, the mean score of the basic features in combination with the CT- and MRI-specific tools (scrolling, split screen viewing, slice level synchronization, cross-reference lines, window presets, and HU measurement) was calculated. The tools considered in the evaluation of the area of application are shown in Table 1.

Table 1.

Tools for evaluation of modality-specific scoring (area of application)

Tool                                                          Basic tools/radiography   Ultrasound   CT/MRI
Browse through images (scroll)                                                                          ✓
Pan with mouse continuously                                              ✓                  ✓           ✓
Zoom in/out (magnification) with mouse continuously                      ✓                  ✓           ✓
Windowing (contrast and brightness) with mouse continuously              ✓                  ✓           ✓
Window-presets                                                                                          ✓
Mirroring (flipping)                                                     ✓                  ✓           ✓
Rotate, 90° steps                                                        ✓                  ✓           ✓
Split screen                                                                                            ✓
HU measurements (punctual, region of interest (ROI), variable ROI, ROI-copy)                            ✓
Distance (length) measurement                                            ✓                  ✓           ✓
Angle measurement                                                        ✓                  ✓           ✓
Synchronization (slice level), position manually                                                        ✓
Cross-reference lines                                                                                   ✓
Cine-mode (video/movie)                                                                     ✓

✓ = tools included in evaluation

A non-parametric correlation test (Spearman’s rho) was used to analyze the influence of gender, veterinary degree, total years of veterinary training and experience, self-assessment of general computer skills, and prior experience with each viewer on the score.
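The study's statistics were computed in SPSS; purely to illustrate the method, Spearman's rho is the Pearson correlation of the rank-transformed data, with tied values receiving their average rank. A self-contained sketch:

```python
# Illustrative sketch of Spearman's rho (rank correlation), not the
# SPSS implementation used in the study.

def _ranks(values):
    # Assign 1-based ranks; tied values share their average rank.
    order = sorted(range(len(values)), key=lambda i: values[i])
    ranks = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1
        avg = (i + j) / 2.0 + 1.0  # average of 1-based positions i+1..j+1
        for k in range(i, j + 1):
            ranks[order[k]] = avg
        i = j + 1
    return ranks

def spearman_rho(x, y):
    # Pearson correlation computed on the ranks of x and y.
    rx, ry = _ranks(x), _ranks(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = sum((a - mx) ** 2 for a in rx) ** 0.5
    sy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (sx * sy)
```

Because only ranks enter the computation, the coefficient captures any monotonic relationship, which makes it suitable for ordinal survey scores like the 1-to-10 ratings used here.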

Results

The following eight free DICOM-viewers were found and met the inclusion criteria for further evaluation: Ginkgo CADx [24], Horos [25], K-PACS [26], MAYAM [27], MITO (Medical Imaging Toolkit) [28], OsiriX Lite [29], RadiAnt [30], Synedra View Personal [31] (in alphabetical order). Two of the viewers require an Apple OSX operating system, four of the viewers were designed for Microsoft Windows environment, and two viewers were platform independent (Table 2).

Table 2.

Results of freeware PACS DICOM-viewer evaluation

Freeware viewer Operating system Version (year) Rank by consensus rating Core tool ranking Core tool score Radiography score Ultrasound score CT/MRI score Loading time (rank) Content of 47 important tools (rank)
Horos OSX 2.0 (2016) 1 1 8.96 9.00 9.02 8.90 19 s (1) 44/47 (1)
OsiriX Lite OSX 8.0 (2016) 1 1 (*) (*) (*) (*) 19 s (1) 44/47 (1)
RadiAnt WIN 3.4 (2016) 2 2 8.90 8.96 9.01 8.79 32 s (4) 30/47 (3)
K-PACS WIN 1.6 (2008) 4 3 8.02 7.77 7.74 8.04 59 s (7) 25/47 (5)
Synedra WIN 16.0 (2016) 3 4 7.43 8.38 8.43 7.34 34 s (5) 31/47 (2)
MAYAM WIN/OSX/LINUX 2.0 Beta1 (2014) 6 5 6.05 6.97 7.03 5.94 34 s (5) 20/47 (7)
Ginkgo CADx WIN/OSX/LINUX 3.7 (2014) 5 6 5.53 7.76 7.25 5.67 59 s (7) 29/47 (4)
MITO WIN 2.0 (2013) 7 7 3.74 5.14 4.88 3.78 26 s (3) 24/47 (6)

(*) for subjective assessment, only Horos was evaluated

The loading times of the test study were between 19 s for Horos and OsiriX Lite and 59 s for Ginkgo CADx and K-PACS (Table 2).

Horos and OsiriX Lite offered the highest number of tools with 44 of the 47 predefined important tools available (Table 2). The Windows viewers with the highest proportion of important viewer tools were Synedra (31), RadiAnt (30), and Ginkgo CADx (29). The availability of the predefined important tools for the needs in veterinary diagnostic imaging is shown for each viewer in Table 3.

Table 3.

Important tool list and available tools in each viewer (core tools in bold)

MAYAM MITO K-PACS Ginkgo RadiAnt Synedra Horos/OsiriX
Browse through images (scroll)
Pan with mouse continuously
Zoom in/out (magnification) with mouse continuously
Windowing (contrast and brightness) with mouse continuously
Triple flexible simultaneous mouse button mapping for panning, zooming and windowing
Window-presets
Customizable window presets
Mirroring (flipping)
Rotate, 90° steps
Rotate continuously
Split screen
HU measurements (punctual, region of interest (ROI), variable ROI, ROI-copy)
Distance (length) measurement
Angle measurement
Area measurement oval
Area measurement circular
MPR-tool (multi-planar reconstruction)
PACS-connectivity
Dual monitor tool
Synchronization (slice level), position automatically
Synchronization (slice level), position manually
Cross-reference lines
3D-navigation
Freeplane-MPR
Curved MPR
SSD (surface shaded display)
VR (volume rendering technique)
MIP (maximum intensity projection)
Multi-frame DICOM support
Hanging protocols
Image fusion
Segmentation
STL-export
Cine-mode (video/movie)
Overlay hide (de-/activation)
Overlays customizable
DICOM header (metadata) readout
DICOM tag editing (patient data)
DICOM-export
Image export or copy (JPEG)
Patient-CD (burning)
Image file import (JPEG)
Send to PACS
Video-export
DICOM print
CD-import
Structured report
Total number of important tools 20 24 25 29 30 31 44

Twenty veterinary students (57%, 18 female, 2 male), 11 veterinarians (32%, 7 female, 4 male), and four veterinary radiologists (11%, 1 female, 3 male), in total 35 observers, evaluated and scored 14 core viewer tools. Veterinary students attended vet school 4 to 6.5 years (mean 4.5 years). Veterinarians had work experience between 0.5 and 6 years (mean 2.5 years) and veterinary radiologists between 4 and 14 years (mean 10.75 years). Areas of expertise or interest included small animal medicine (n = 10), surgery (n = 9), internal medicine (n = 4), diagnostic imaging (n = 4), large animal medicine (n = 2), bovine medicine (n = 2), ophthalmology (n = 2), anesthesiology (n = 2), equine medicine (n = 1), swine medicine (n = 1), cardiology (n = 1), dermatology (n = 1), emergency and critical care medicine (n = 1), animal reproduction medicine (n = 1), veterinary medicine (n = 1), and “no special interest” (n = 2) in a total of 26 female (74%) and 9 male (26%) observers. Self-assessment of general computer skills was between 2 and 9 (mean 5.7) for all observers, between 2 and 9 (mean 5.25) for veterinary students, between 2 and 8 (mean 5.7) for veterinarians, and between 7 and 9 (mean 7.5) for veterinary radiologists.

Horos and OsiriX Lite had identical important and core tool sets with the same viewer appearance and operator interface and also the same loading times. Therefore, for subjective assessment and scoring of the core viewer tools, only Horos was evaluated. Table 2 shows the results of the subjective individual evaluation by 35 observers, including overall ranking and mean score of the 14 core tools, mean scores of important tools for specific areas of application, image loading times, numbers of important features, and feature ranking. The Mac OS X platform viewer Horos (OsiriX Lite, respectively) was ranked highest overall and in all imaging categories. RadiAnt, Synedra, and K-PACS were ranked highest among the Windows viewers.

Veterinary radiologists scored slightly higher than veterinarians and veterinary students (Fig. 1). Gender, total years of veterinary training and experience, and self-assessment of general computer skills did not significantly affect the overall viewer ranking.

Fig. 1.

Fig. 1

Total evaluation results according to veterinary degree

A Spearman’s rho correlation coefficient of 0.229 (p < 0.001) indicated a significant correlation between prior observer experience and overall score only for the RadiAnt viewer. Students with almost no prior viewer experience scored RadiAnt lower than Horos. Veterinarians and veterinary radiologists scored RadiAnt slightly higher than Horos. Veterinarians had minimal prior experience with RadiAnt. The four veterinary radiologists had moderate prior practice with Horos or OsiriX and extensive prior practice with RadiAnt. Prior practice of observers with DICOM-viewer freeware is shown in Fig. 2.

Fig. 2.

Fig. 2

Prior experience of test persons with DICOM-viewer freeware

Based on the results of the web search, Horos is a Mac OS X platform viewer that is built upon OsiriX and other open-source medical imaging libraries [25]. OsiriX MD is commercially available, FDA-approved software that is claimed to be the most widely used DICOM-viewer in the world; it offers a limited free version, OsiriX Lite [29], with a “not for medical use” disclaimer that is limited to series containing fewer than 500 images. To a great extent, Horos equals OsiriX Lite [25]. Horos is made available under the GNU Lesser General Public License, Version 3 (LGPL-3.0) [25]. Unfortunately, no Windows version was available. Installation and operation were trouble-free. Despite the high number of tools, the toolbar was still clearly arranged and flexibly adjustable, facilitating familiarization. The software responded quickly and ran reliably. Horos and OsiriX offer an interface that enables the use of veterinary-specific components. VetHP, a plug-in for a CT- and MRI-hanging protocol, enables the automatic image orientation preferred by veterinarians: dorsal at the top of the image in transverse series, as well as cranial (rostral) at the left side of the image and dorsal at the top for sagittal CT and MRI series [32]. OriOn is a plug-in that improves image illustration for presentations and reports by removing all overlays except the annotations for orientation. The VetTools plug-in offers measurement features for surgical planning of tibial plateau leveling osteotomies and osteosynthesis [33]. In summary, Horos and OsiriX Lite demonstrated strong and compelling performance and were our favorite free DICOM-viewer software for all imaging modalities based on important tool content, subjective consensus rating, multi-observer evaluation, and the fastest image loading times (Table 2).

RadiAnt is a Windows DICOM-viewer that is advertised as “Smart. Easy. Fast.” on its webpage [30]. This promise was confirmed in our survey. Installation was fast and without difficulties. The viewer ran very reliably and loaded and responded very quickly. We found the gray graphical user interface simple and attractive. The buttons and symbols were well designed, clearly arranged, and well organized within a toolbar on top, which made the software very intuitive. RadiAnt had an adequate but not vast number of important tools. The viewer offered more than 20 different operating languages. Unfortunately, the installation licenses are time-limited; occasionally, the viewer software requires a license update that is currently free of charge. In total, we ranked RadiAnt in second position based on consensus rating and multi-observer scoring, in third place for important tool content, and in fourth place for loading time. In summary, RadiAnt was a very compelling software, powerful for all imaging modalities, and our favorite Windows DICOM-viewer (Table 2).

Based on our online research, Synedra View Personal is a free Windows DICOM-viewer; a Mac OS X compliant version exists in the professional and professional diagnostic editions, which have medical device approval (Class IIb medical device compliant with Medical Devices Directive MDD 93/42/EEC) and are commercially available [31]. English, German, French, and also Russian and Turkish editions are advertised [31]. We observed no problems during either installation or operation. The task bar and right mouse button menu were logically arranged. The limited free personal version still offers a very large number of tools, although the viewer could be more intuitively designed. The fixed right mouse button menu was comprehensive and very well structured, but it precludes flexible mouse button mapping for the right key. Settings for scrolling and series synchronization are cumbersome. We missed a cross-reference line feature in the personal version; instead, a three-dimensional cursor was available. Measurement tools are comprehensive, especially for Hounsfield units in CT images. The use of multiple screens was supported. Finally, we ranked Synedra in second place for important tool content, in third place in consensus rating, in fourth place in multi-observer evaluation, and in fifth place based on loading time. In summary, we experienced Synedra as a powerful DICOM-viewer (Table 2).

Based on the internet research, K-PACS (version 1.6.0) is a Windows platform DICOM-viewer that offers English and German operating language editions [26]. The design of the viewer resembles “Windows XP.” During the installation process, we experienced some technical difficulties, and the program turned out not to quit the DICOM listener (Store-SCP) service when the program was terminated, which may block other imaging software using the same port. The tools are spread over a menu on top, a side bar, and symbols on the bottom; the main buttons are obvious and concise. Panning and zooming share one mouse button depending on the mouse cursor position in the image display: panning in the center and zooming in the periphery. K-PACS was ranked in third place based on our multi-observer evaluation, in fourth place based on consensus rating, in fifth place regarding important tool content, and in seventh place for study loading time (Table 2).

In our webpage research, Ginkgo CADx (version 3.7.1) was advertised as an advanced open-source DICOM-viewer and dicomizer that converts various image files (png, jpeg, bmp, pdf, tiff) to DICOM image files and is distributed under the GNU Lesser General Public License (LGPL) [24]. Versions for Windows, Mac OS X, and Linux are available [24]. The installation process was without any difficulty. The design of the viewer interface resembled “Windows XP.” With a main menu on top, the buttons and symbols were widely spread and small-sized; they could be better arranged and more intuitively designed. Scrolling and playing a video loop are cumbersome. Concurrent display of two identical image series was possible, but we could not view two different image series simultaneously using a split screen. Ginkgo CADx was ranked in fourth place based on important tool content, in fifth place based on consensus rating, and in seventh place for study loading time (Table 2).

Based on our online search, MAYAM 2.0 is a cross-platform DICOM workstation developed in Java using the dcm4che toolkit [27]. Advanced computer skills of the operator and Java version 1.8 or earlier on the computer are required to run the application. The user interface is simple, with attractively designed buttons showing orange symbols on a dark gray background. The interactive tooltips (mouseover) did not work consistently. All tools are clearly arranged in a single task bar on top. The play speed controller takes up much space and does not display the actual frame rate. In radiographs, measurements were not metrically scaled, and HU measurements did not appear reliable. We missed a tool for angular measurements. Unfortunately, series synchronization and cross-reference lines worked in CT but not in MRI studies. MAYAM 2.0 was ranked in fifth place based on multi-observer evaluation and study loading time, in sixth place based on our consensus rating, and in seventh place for important tool content (Table 2).

Based on the results of our internet research, MITO (version 2.0) is described by the publisher as an open-source, Windows-based software architecture for advanced medical imaging, released under the terms of the GNU General Public License [28]. Main characteristics are DICOM compliance, 2D-3D visualization (VR, SR, MIP), image segmentation and fusion, ROI, and virtual reality [28]. An advanced 3D user interface enables Kinect and Wii control [28]. During the installation process and at the start of use, we regularly received error messages and observed a few interruptions and failures of function. In the end, the program was functional most of the time but occasionally required patience from the user; the viewer did not always respond instantly. The design of the user interface resembled “Windows XP” with a main menu on top. The presence of various windows was sometimes confusing, and MITO was not very intuitive. According to the manufacturer’s information, the program offers many 3D tools including image fusion. We could not view two parallel split screen CT or MRI series, we could not measure Hounsfield units, and the annotations and overlays were poorly visible. Videos did not always run smoothly. MITO was ranked in third place for study loading time, in sixth place for important tool content, and in seventh place by consensus rating and multi-observer evaluation (Table 2).

None of the evaluated freeware viewers offered veterinary-specific tools. However, Horos and OsiriX Lite offer an interface for veterinary-specific plug-ins.

Discussion

Based on the results of this study, we found DICOM-viewer freeware with great performance. There are extensive differences between the programs, which emphasizes the importance of our study. The clear and homogeneous results enable and simplify recommendations. Medical viewer software is a very important tool in diagnostic imaging, and trained radiologists are familiar with DICOM-viewer software. In many countries, there are only a few specialists in veterinary diagnostic imaging. General veterinary practitioners have limited training in diagnostic imaging, and nevertheless they need to cover this area of expertise. In spite of its increasing importance for veterinary practitioners, DICOM-viewer software is not a focus of veterinary research. We only found general papers, but no detailed review or research articles [34–36]. We hope that the quality and value of DICOM-viewer software will receive more attention and appreciation in veterinary medicine based on the results of this survey. This study also sets benchmarks for comparison with commercially available viewer software in veterinary medicine. Prior studies on medical viewer software focused on human medicine and research [1, 9–12, 14–16, 37–39], and therefore we aimed to identify, evaluate, and compare appropriate freeware DICOM-viewer software from the internet for veterinary purposes. Many studies are old [10–12, 14, 15, 37, 39] and their results might be outdated. Over time, new software will be released, and new tools might be provided in existing viewer software. Due to the rapidly changing software market, surveys will require regular updates in the future. Despite a thorough internet search prior to our survey, the list of selected viewers might be incomplete.

We found DICOM-viewer freeware with strong and compelling performance, especially Horos, OsiriX Lite, RadiAnt, K-PACS, and Synedra. Horos and OsiriX Lite offered a vast number of tools, followed by RadiAnt and Synedra. The good performance of Horos and OsiriX Lite in our survey was not surprising: OsiriX showed convincing results in prior studies [9, 16]. Entering the search term “OsiriX” into the PubMed database on November 10, 2016, returned 286 items, confirming the wide use of the software. However, we observed only moderate prior experience with OsiriX Lite or Horos in our radiologists’ group of observers and none in the veterinarians’ and students’ groups, likely due to the limited prevalence of Apple computers at our institution.

Besides digital radiography, most viewers offered features for all imaging modalities that are commonly used in veterinary medicine, including sonography, CT, and MRI. We suppose that many special tools are not commonly used by veterinarians. A detailed comparison of all available tools in each viewer would have been very time-consuming. Instead, we focused on features that we considered most relevant and important for diagnostic radiology in veterinary practices and created an important tool list (Table 3). The tool selection and also the selection of the imaging studies might have caused bias. However, the extent of the list and the various distinct approaches to viewer evaluation should compensate for this potential effect. A randomized order of viewer scoring minimized potential bias due to a learning effect during the evaluation process. The time period for each observer was deliberately not measured precisely, to avoid influencing the score if the observer felt under pressure. The evaluation process for 14 core tools in seven DICOM-viewers took between 45 and 120 min for each observer, probably depending on individual computer skills and the level of experience with the specific viewer and with DICOM-viewer software in general.

We expected a negative correlation between the number of features and user-friendliness and assumed decreasing intuitiveness with increasing numbers of tools. The display of many tools might complicate the software and make its use harder, especially for inexperienced beginners who are not familiar with DICOM-viewer software. This assumption was not confirmed. Horos had the highest evaluation scores combined with the highest number of tools. There was also no statistically significant difference between the students’ and radiologists’ evaluations; the latter had only a minimally higher mean score. An increasing number of tools is therefore not a hurdle for beginners: a viewer can offer many tools combined with an intuitive user interface that preserves user-friendliness. Evidently, a smart design of the user interface can overcome this problem.

Observers who scored the viewers had various degrees, training levels, years of veterinary experience, and areas of expertise or interest. Substantial prior viewer experience was observed in veterinary radiologists with Horos, OsiriX, and RadiAnt, but they accounted for only 11% of the observers, and this did not significantly affect the results (data not shown). Veterinarians (32%) and veterinary students (57%) had minimal or no prior viewer experience. Prior experience improved the results of RadiAnt in the groups of veterinarians and veterinary radiologists, but Horos and OsiriX still retained first rank. Age, gender, or other group-specific characteristics might have an impact on preferences in software design. In our survey, gender, computer skills, veterinary degree, and experience did not significantly affect the grading of the viewers.

Overall, given the large number of observers and their quite heterogeneous composition, the results should be representative for veterinary practices and match the needs of our target audience. The age of the test persons was not recorded in detail, but based on our recruitment approach and the median years of veterinary experience, we had a rather young test population, with nobody older than 45 years. Given the homogeneous and similar results between students and veterinarians, we assume that the results would be quite similar in older observers.

Veterinarians are required to store their digital images, because codes of veterinary medical ethics regulate the archiving of technical documentation of diagnosis and treatment in animals. We therefore also defined PACS-connectivity and stand-alone, client-based software as inclusion criteria for this survey. This enables a low-cost digital environment with viewer software, network, and PACS in veterinary practices [23, 36, 40, 41]; the use of a freeware PACS for this purpose has been described earlier [23]. Moreover, using self-contained DICOM-viewer software from compact discs can be time-consuming and cumbersome, so working with one's own familiar viewer is advantageous [1].

None of the evaluated freeware viewers offered veterinary-specific features, and only a few veterinary-specific plug-ins are available for Horos and OsiriX Lite [32, 33]. This is not unexpected, because the veterinary market is small compared to the human medical market. Commercially available viewers for veterinary medicine offer tools for measuring the Norberg angle or the vertebral heart scale, or for planning tibial plateau leveling osteotomies, tibial tuberosity advancement surgeries, modified Maquet procedures, osteosyntheses, or endoprosthetic surgeries [33].
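At their core, such measurement plug-ins perform simple landmark geometry on user-placed points. As an illustration (not taken from any of the evaluated viewers), the following minimal Python sketch shows how a Norberg-angle-style measurement could be computed from three landmarks; the helper name `angle_deg` and all pixel coordinates are hypothetical:

```python
import math

def angle_deg(vertex, p1, p2):
    """Angle at `vertex` between the rays vertex->p1 and vertex->p2, in degrees."""
    v1 = (p1[0] - vertex[0], p1[1] - vertex[1])
    v2 = (p2[0] - vertex[0], p2[1] - vertex[1])
    dot = v1[0] * v2[0] + v1[1] * v2[1]
    return math.degrees(math.acos(dot / (math.hypot(*v1) * math.hypot(*v2))))

# Norberg angle (per hip): angle between the line connecting the two femoral
# head centers and the line from one head center to the ipsilateral
# craniolateral acetabular rim. Coordinates below are invented pixel values.
left_head, right_head = (100.0, 200.0), (300.0, 200.0)
left_rim = (60.0, 120.0)
print(round(angle_deg(left_head, right_head, left_rim), 1))  # -> 116.6
```

In a real plug-in, the three coordinates would come from mouse clicks on the calibrated radiograph; the arithmetic itself is no more than this.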

Ginkgo CADx, Horos, MAYAM, and MITO are genuinely free software published under the GNU Lesser General Public License. K-PACS is an outdated commercial version that is freely available. OsiriX Lite, RadiAnt, and Synedra are free demo versions of commercial software. Freeware users should be aware that DICOM-viewers may start as freeware projects and later evolve into commercial software once a large user base has been gained. Another common business model is to offer professional paid software with full medical device approval alongside a limited free version without such approval. Freeware authors might also ask for contributions to maintain and improve the software (donationware). Some providers offer technical support only if a license is bought, because freeware normally lacks technical support; this disadvantage is commonly compensated for by internet forums, where technical problems can be discussed with the community of other users.

In recent years, we have observed an increasing exchange of imaging data in veterinary medicine, driven by the spread of digital radiography. JPEG files are commonly transmitted by email, which facilitates transfer and viewing when the recipient has no medical imaging software; not only clients, but also veterinary colleagues might not have or use a viewer that can display DICOM data. However, JPEG files have major disadvantages: image quality is usually degraded by compression, the range of windowing is limited, and the metadata are lost. In our survey, all viewers could read the metadata from the DICOM header, which is a common and important feature of medical imaging software [9, 39, 42]. The results of this survey should motivate veterinarians and veterinary students to install and use DICOM-viewer software and to exchange and view DICOM data instead of JPEG images.
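The difference is visible at the byte level: a DICOM Part-10 file begins with a 128-byte preamble and the magic bytes "DICM", followed by tagged data elements that carry the metadata a JPEG lacks. The following self-contained Python sketch encodes and reads back two such elements (explicit VR little endian, short-length VRs only); it is illustration-only, omits the File Meta Information group that a conformant file requires, and uses an invented patient name:

```python
import struct

def element(group, elem, vr, value):
    """Encode one explicit-VR little-endian data element (short-length VRs)."""
    if len(value) % 2:              # DICOM values must have even length
        value += b' '
    return struct.pack('<HH2sH', group, elem, vr, len(value)) + value

# Minimal, non-conformant DICOM-like stream: preamble, magic, two elements.
buf = b'\x00' * 128 + b'DICM'
buf += element(0x0008, 0x0060, b'CS', b'CR')            # Modality
buf += element(0x0010, 0x0010, b'PN', b'Rex^Canine')    # PatientName

def read_elements(data):
    """Parse explicit-VR little-endian elements after the DICM magic."""
    assert data[128:132] == b'DICM', 'not a DICOM part-10 stream'
    pos, out = 132, {}
    while pos < len(data):
        group, elem, vr, length = struct.unpack('<HH2sH', data[pos:pos + 8])
        out[(group, elem)] = data[pos + 8:pos + 8 + length].decode('ascii').rstrip()
        pos += 8 + length
    return out

meta = read_elements(buf)
print(meta[(0x0010, 0x0010)])   # -> Rex^Canine
```

Real files should of course be handled with a full DICOM library rather than hand-rolled parsing; the point is that patient, study, and acquisition metadata travel inside the file itself, which a JPEG export discards.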

Conclusion

Based on the results of this study, veterinarians will find suitable free DICOM-viewers for their individual needs. In combination with PACS-freeware, this allows veterinary practices to run a low-budget digital imaging environment.

Acknowledgements

The authors gratefully acknowledge Dr. Julia Neugebauer for reviewing this article.

Footnotes

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

References

  • 1.Hosch RE, Rivard AL. Evaluation of self-contained PACS viewers on CD-ROM. J Digit Imaging. 2014;27(4):470–473. doi: 10.1007/s10278-014-9675-2.
  • 2.Wright MA, Ballance D, Robertson ID, Poteet B. Introduction to DICOM for the practicing veterinarian. Vet Radiol Ultrasound. 2008;49(s1):S14–S18. doi: 10.1111/j.1740-8261.2007.00328.x.
  • 3.Widmer WR. Acquisition hardware for digital imaging. Vet Radiol Ultrasound. 2008;49(s1):S2–S8. doi: 10.1111/j.1740-8261.2007.00326.x.
  • 4.Wallack S. Digital image storage. Vet Radiol Ultrasound. 2008;49(s1):S37–S41. doi: 10.1111/j.1740-8261.2007.00332.x.
  • 5.DICOM Standards Committee. Veterinary Identification Tags (PS 3.3, 3.5, 3.6, 3.16 2006). DICOM Correction Item (CP-643). USA: DICOM Standards Committee; 2006.
  • 6.DICOM Standards Committee. NEMA PS3 / ISO 12052, Digital Imaging and Communications in Medicine (DICOM) standard. Rosslyn, VA, USA: National Electrical Manufacturers Association.
  • 7.Bidgood WD, Jr, Horii SC, Prior FW, Van Syckle DE. Understanding and using DICOM, the data interchange standard for biomedical imaging. J Am Med Inform Assoc. 1997;4(3):199–212. doi: 10.1136/jamia.1997.0040199.
  • 8.Bidgood WD, Jr, Horii SC. Introduction to the ACR-NEMA DICOM standard. Radiographics. 1992;12(2):345–355. doi: 10.1148/radiographics.12.2.1561424.
  • 9.Valeri G, Mazza FA, Maggi S, Aramini D, La Riccia L, Mazzoni G, et al. Open source software in a practical approach for post processing of radiologic images. Radiol Med. 2015;120(3):309–323. doi: 10.1007/s11547-014-0437-5.
  • 10.Solomon RW. Free and open source software for the manipulation of digital images. AJR Am J Roentgenol. 2009;192(6):W330–W334. doi: 10.2214/AJR.08.2190.
  • 11.Scarsbrook AF. Open-source software for radiologists: a primer. Clin Radiol. 2007;62(2):120–130. doi: 10.1016/j.crad.2006.09.023.
  • 12.Puech PA, Boussel L, Belfkih S, Lemaitre L, Douek P, Beuscart R. DicomWorks: Software for reviewing DICOM studies and promoting low-cost teleradiology. J Digit Imaging. 2007;20(2):122–130. doi: 10.1007/s10278-007-9018-7.
  • 13.Haak D, Page C-E, Deserno TM. A survey of DICOM viewer software to integrate clinical research and medical imaging. J Digit Imaging. 2016;29(2):206–215. doi: 10.1007/s10278-015-9833-1.
  • 14.Escott EJ, Rubinstein D. Informatics in radiology (infoRAD): free DICOM image viewing and processing software for the Macintosh computer: what’s available and what it can do for you. Radiographics. 2004;24(6):1763–1777. doi: 10.1148/rg.246045049.
  • 15.Escott EJ, Rubinstein D. Free DICOM image viewing and processing software for your desktop computer: what’s available and what it can do for you. Radiographics. 2003;23(5):1341–1357. doi: 10.1148/rg.235035047.
  • 16.Ariani A, Carotti M, Gutierrez M, Bichisecchi E, Grassi W, Giuseppetti GM, Salaffi F. Utility of an open-source DICOM viewer software (OsiriX) to assess pulmonary fibrosis in systemic sclerosis: preliminary results. Rheumatol Int. 2014;34(4):511–516. doi: 10.1007/s00296-013-2845-6.
  • 17.Presti GL, Carbone M, Ciriaci D, Aramini D, Ferrari M, Ferrari V. Assessment of DICOM viewers capable of loading patient-specific 3D models obtained by different segmentation platforms in the operating room. J Digit Imaging. 2015;28(5):518–527. doi: 10.1007/s10278-015-9786-4.
  • 18.Lee JT, Munch KR, Carlis JV, Pardo JV. Internet image viewer (iiV). BMC Med Imaging. 2008;8:10. doi: 10.1186/1471-2342-8-10.
  • 19.FDA - Food and Drug Administration. Available from: http://www.fda.gov. Accessed 30 Nov 2017
  • 20.U.S. Code of Federal Regulations 2016. Available from: http://www.ecfr.gov. Accessed 30 Nov 2017
  • 21.Revisions of Medical Device Directives of the European Commission 2016. Available from: http://ec.europa.eu/growth/sectors/medical-devices/regulatory-framework/revision_en. Accessed 30 Nov 2017
  • 22.Medical Device Directory (Council Directive 93/42/EEC of 14 June 1993 concerning medical devices OJ L 169 of 12 July 1993, European Commission) 2016. Available from: https://ec.europa.eu/growth/single-market/european-standards/harmonised-standards/medical-devices_en. Accessed 30 Nov 2017
  • 23.Iotti B, Valazza A. A reliable, low-cost picture archiving and communications system for small and medium veterinary practices built using open-source technology. J Digit Imaging. 2014;27(5):563–570. doi: 10.1007/s10278-014-9692-1.
  • 24.Ginkgo CADx DICOM-Viewer Spain 2016. Available from: http://ginkgo-cadx.com. Accessed 30 Nov 2017
  • 25.Horos DICOM-Viewer 2016. Available from: https://www.horosproject.org. Accessed 30 Nov 2017
  • 26.K-PACS DICOM-Viewer Germany 2016. Available from: http://www.image-systems.biz/download-center. Accessed 30 Nov 2017
  • 27.Mayam DICOM-Viewer India 2016. Available from: http://oviyam.raster.in/mayam2.html. Accessed 30 Nov 2017
  • 28.MITO - Medical Imaging Toolkit DICOM-Viewer Italy 2016. Available from: http://ihealthlab.icar.cnr.it/index.php/projects/9-mito.html. Accessed 30 Nov 2017
  • 29.OsiriX Lite DICOM-Viewer Switzerland 2016. Available from: http://www.osirix-viewer.com. Accessed 30 Nov 2017
  • 30.RadiAnt DICOM-Viewer Poland 2016. Available from: http://www.radiantviewer.com. Accessed 30 Nov 2017
  • 31.Synedra Personal View DICOM-Viewer Austria 2016. Available from: http://www.synedra.com. Accessed 30 Nov 2017
  • 32.VetHP (Veterinary Hanging Protocols) and OriOn (Orientation Only) Lucent Media 2016. Available from: http://www.lucent-media.com. Accessed 30 Nov 2017
  • 33.VetTools-Plugin K&M Germany 2016. Available from: http://vetmed.eu. Accessed 30 Nov 2017
  • 34.Poteet BA. Veterinary teleradiology. Vet Radiol Ultrasound. 2008;49(s1):S33–S36. doi: 10.1111/j.1740-8261.2007.00331.x.
  • 35.Puchalski SM. Image display. Vet Radiol Ultrasound. 2008;49:S9–S13. doi: 10.1111/j.1740-8261.2007.00327.x.
  • 36.Robertson ID, Saveraid T. Hospital, radiology, and picture archiving and communication systems. Vet Radiol Ultrasound. 2008;49(s1):S19–S28. doi: 10.1111/j.1740-8261.2007.00329.x.
  • 37.Honea R, McCluggage CW, Parker B, O’Neall D, Shook KA. Evaluation of commercial PC-based DICOM image viewer. J Digit Imaging. 1998;11(3 Suppl 1):151–155. doi: 10.1007/BF03168289.
  • 38.Langer S, Wang J. An evaluation of ten digital image review workstations. J Digit Imaging. 1997;10(2):65–78. doi: 10.1007/BF03168558.
  • 39.Varma D. Free DICOM browsers. Indian J Radiol Imaging. 2008;18(1):12–16.
  • 40.Armbrust LJ. PACS and image storage. Vet Clin North Am Small Anim Pract. 2009;39(4):711–718. doi: 10.1016/j.cvsm.2009.04.004.
  • 41.Robertson I. Image dissemination and archiving. Clin Tech Small Anim Pract. 2007;22(3):138–144. doi: 10.1053/j.ctsap.2007.05.008.
  • 42.Varma DR. Managing DICOM images: tips and tricks for the radiologist. Indian J Radiol Imaging. 2012;22(1):4–13. doi: 10.4103/0971-3026.95396.
