WHEN CONTEMPLATING a potential research project, the collective brain trust of our research team often sits down to discuss the merits of the idea. In addition to the scientific value, study design, and null hypothesis, we always ask ourselves what the practical value is to the community at large. Is the proposed study too narrow in focus, too esoteric, or too technical in nature to appeal broadly to the radiologists, technologists, and administrators who occupy the “front lines” of the radiology marketplace? If the answer to these questions is “no,” our next question extends to industry. Could this research project have an impact on the vendors and manufacturers who design, build, and implement the technology? If the answer to this fundamental question is “yes,” we often proceed, in the hope of a project that offers practical value to the entire radiology community.
The problem with this approach is that one of the principal target audiences, industry, is often either not paying attention or ignoring the study and its results. This has broad-based implications for all radiology practitioners, regardless of the initials after your name. Whatever one thinks about the importance of a given study, if the people responsible for product development and refinement aren’t listening, then the study’s impact is muted. The key, then, is to get industry more engaged in research and to ensure that vendors have a vested interest in incorporating research results into their product development. Too often, however, a great study is presented to the scientific community, only to be noticed by a select group of academicians. The situation is, in many respects, like the tree that falls in the woods with no one there to hear it.
While many vendors tout market-driven engineering,1 the reality is more appropriately termed competition-driven engineering. When one vendor brings a new product or application to market, other vendors develop their own branded products in many of the same ways, with a slight twist. In the end, practitioners are left with little to no choice with respect to true technical innovation and distinction. This indirectly stifles competition, allowing the larger vendors to maintain market share at the expense of the smaller ones. If one were to reinvent the engineering process, the ideal solution would be to place greater importance on clinical research in defining the “best practice” standards against which products are measured. A number of examples illustrate this concept of “data-driven engineering.”
In a recent prospective, multicenter, time-motion study comparing computed radiography (CR) and direct radiography (DR), a large percentage of the time difference between CR and DR was found to be due to postacquisition processing.2,3 While vendors like to dwell on the differential acquisition times of CR and DR, an often overlooked step came to light: postacquisition processing, which includes quality assurance (QA), was shown to be an important determinant of technologist productivity. The results suggested that maximizing technologist productivity requires a combination of re-engineered workflow and technology. By allocating postacquisition processing to a third party (ie, a quality assurance specialist), technologist productivity could be enhanced, and both total examination time and the time differential between CR and DR would be greatly diminished. The ideal solution (by both productivity and economic measures) would be to develop new computer-based software to support this reallocation, in the form of automated QA. Left to their own devices, few vendors would unilaterally invest engineering resources to develop such an application. If, however, the results of this multicenter study are representative of the radiology community at large, doing so would be a logical extension of those resources. The radiology community wins by improving technologist productivity, reducing personnel costs, and potentially improving image quality and consistency. The vendor who invests the engineering resources also wins, by developing a product with “real-life” application that translates into improved sales and name recognition.
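To make the idea of automated QA concrete, here is a minimal sketch in Python of what a rule-based screening step might look like, assuming per-exam metadata arrives as a simple record; the field names and exposure-index thresholds are illustrative assumptions, not drawn from the study or from any vendor product.

```python
# Hypothetical rule-based QA screen for digital radiography exams.
# Exams that pass are auto-released; flagged exams are routed to a
# QA specialist, taking the step off the technologist's worklist.

# Illustrative acceptance window for the detector exposure index.
EXPOSURE_INDEX_RANGE = (1.0, 3.0)
REQUIRED_FIELDS = ("patient_id", "body_part", "view", "exposure_index")

def automated_qa(exam: dict) -> list:
    """Return a list of QA flags; an empty list means auto-release."""
    flags = [f"missing field: {f}" for f in REQUIRED_FIELDS if f not in exam]
    ei = exam.get("exposure_index")
    if ei is not None:
        low, high = EXPOSURE_INDEX_RANGE
        if not low <= ei <= high:
            flags.append(f"exposure index {ei} outside [{low}, {high}]")
    return flags

# Example: an overexposed chest image is flagged for human review.
print(automated_qa({"patient_id": "123", "body_part": "CHEST",
                    "view": "PA", "exposure_index": 3.4}))
```

In practice such rules would be far richer (positioning, collimation, artifact detection), but even this simple screen shows how the QA step could be shifted from the technologist to software plus a dedicated specialist.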
Another example of “data-driven” engineering can be illustrated with electronic auditing software developed to track radiologist workflow during softcopy interpretation on PACS.4 To date, vendors have done little beyond limited observational studies to track radiologist workflow. What feedback has been provided has come largely from academic radiologists, who in all likelihood practice quite differently from the majority of nonacademic, private-practice radiologists. At the same time, vendor advisory groups are largely composed of these same academic radiologists, who offer opinions and recommendations that may run contrary to the radiologist mainstream. If this feedback and limited observation feed into traditional market-driven engineering, the end product will likely be poorly matched to the practice patterns of most radiologists. Competing vendors, in turn, follow similar engineering strategies, with the end result being a PACS workstation designed around the mindset of a select few.
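As a hedged illustration of what such auditing software might capture, consider the minimal event logger below, assuming the workstation can invoke a hook on each user action; the event names and fields are hypothetical and do not represent the API of the tool described in reference 4.

```python
import json
import time

class AuditLog:
    """Append timestamped user-interaction events for offline analysis."""

    def __init__(self, path: str):
        self.path = path

    def record(self, user: str, action: str, **details) -> None:
        event = {"time": time.time(), "user": user, "action": action, **details}
        with open(self.path, "a") as f:
            f.write(json.dumps(event) + "\n")

# Hypothetical calls from within workstation event handlers:
log = AuditLog("audit.jsonl")
log.record("reader_07", "open_study", modality="CT", images=600)
log.record("reader_07", "window_level", preset="lung")
log.record("reader_07", "dictate_report", seconds=95)
```

The point of such instrumentation is that it records what radiologists actually do at the workstation, rather than what a small advisory group says they do.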
The answer, in our opinion, is to use this electronic auditing software to prospectively collect data on large populations of users from a variety of backgrounds and training. Only by observing how end-users interact with the workstation can vendors truly understand how the radiologist mainstream moves through the interactive processes of image display, interpretation, and reporting. With these data in hand, vendors can develop more intelligent and intuitive user interfaces, hanging protocols, decision support software, and navigational devices. This “data-driven” approach to engineering takes on particular importance in the current climate of multislice CT, functional MRI, and PET/CT. As the size and complexity of these volumetric datasets continue to grow, radiologists face mounting challenges in maintaining productivity, interpretation accuracy, and profitability.5 If the radiologist, technologist, and engineering communities work together through “data-driven engineering,” we can succeed in an unprecedented fashion. If, instead, we maintain the traditional paradigms of competition- and market-driven engineering, we will fail miserably. In the end, we must always remember that the tail shouldn’t wag the dog.
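As a closing, purely illustrative sketch of this data-driven analysis, the snippet below mines the hypothetical audit log from the previous example for the most common consecutive action pairs, the kind of signal that could inform default tool placement or hanging protocols; the log format is the assumption carried over from that example.

```python
import json
from collections import Counter

def common_action_pairs(log_path: str) -> Counter:
    """Count consecutive action pairs across the audit log; frequent
    pairs suggest which tools a default interface should put up front."""
    with open(log_path) as f:
        actions = [json.loads(line)["action"] for line in f]
    return Counter(zip(actions, actions[1:]))

# E.g., if "open_study" is most often followed by "window_level",
# a data-driven workstation might apply that preset automatically.
print(common_action_pairs("audit.jsonl").most_common(3))
```

A real analysis would of course segment by user, modality, and session, but even this toy aggregation shows how mainstream practice patterns, rather than the preferences of a select few, could drive interface design.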
References
- 1. Channin DS. Driving market-driven engineering. Radiology. 2003;229:311–313. doi:10.1148/radiol.2292031199.
- 2. Reiner BI, Siegel EL, Hooper FJ, et al. Technologist productivity in the performance of digital radiographic examinations: comparison of computed and direct radiography. Presented at: Radiological Society of North America; December 1, 2003; Chicago, IL.
- 3. Reiner BI, Siegel EL, Musk A, et al. Impact of quality assurance on technologist productivity in the performance of digital radiography examinations. Presented at: Radiological Society of North America; December 2, 2003; Chicago, IL.
- 4. Siddiqui KM, Siegel EL, Reiner BI, et al. Development and use of workstation auditing tools to define and improve radiologist workflow. Presented at: Radiological Society of North America; November 30, 2003; Chicago, IL.
- 5. Reiner BI, Siegel EL, Siddiqui K. Evolution of the digital revolution: a radiologist perspective. J Digit Imaging. 2003;16:324–330. doi:10.1007/s10278-003-1743-y.