Abstract
Objective
Qualitative methods are particularly well-suited to studying the complexities and contingencies that emerge in the development, preparation, and implementation of technological interventions in real-world clinical practice, and much remains to be done to use these methods to their full advantage. We aimed to analyze how qualitative methods have been used in health informatics research, focusing on objectives, populations studied, data collection, analysis methods, and fields of analytical origin.
Methods
We conducted a scoping review of original, qualitative empirical research in JAMIA from its inception in 1994 to 2019. We queried PubMed to identify relevant articles, ultimately including and extracting data from 158 articles.
Results
The proportion of qualitative studies increased over time, constituting 4.2% of articles published in JAMIA overall. Studies overwhelmingly used interviews, observations, grounded theory, and thematic analysis. These articles used qualitative methods to analyze health informatics systems before, after, and separate from deployment. Providers have typically been the main focus of studies, but there has been an upward trend of articles focusing on healthcare consumers.
Discussion
While there has been a rich tradition of qualitative inquiry in JAMIA, its scope has been limited when compared with the range of qualitative methods used in other technology-oriented fields, such as human–computer interaction, computer-supported cooperative work, and science and technology studies.
Conclusion
We recommend increased public funding for, and adoption of, a broader variety of qualitative methods by scholars, practitioners, and policy makers, as well as an expansion of the variety of participants studied. This should lead to systems that are more responsive to practical needs, improving usability, safety, and outcomes.
Keywords: methods, qualitative research, medical informatics, human–computer interaction, computer-supported cooperative work, science and technology studies
INTRODUCTION
“Since designers do not routinely visit work sites or talk to users, they are unlikely to come across information that would cast doubt on the generalized beliefs about work and users on which their systems are based. Lacking such data—and excluding as unscientific the informal, local information that could help them to design systems better suited to real users in particular workplaces—it is little wonder that these scientists produce systems that users do not want to use.”
-Diana E. Forsythe1
The above epigraph was written about 2 decades ago by Diana Forsythe,1 on the state of user research in medical informatics. At the time, qualitative methods were rarely used in health informatics research. Qualitative methods are a range of techniques for making meaning, which take written, oral, visual, and artifactual accounts of everyday practice as evidence.2 Qualitative methods are well-suited for studying how people design and work with health information technologies to construct meaning and order action.3–6
In Studying Those Who Study Us, the 2001 book excerpted in the epigraph above, Forsythe noted a historical dearth of qualitative evidence in health informatics, related to the distance between designers and field sites, generalized beliefs about the nature of clinical work, and a commitment to formal knowledge as the only valid frame for building systems.1 Forsythe attributed low user acceptance to the dominance of the formal knowledge paradigm.1
While qualitative methods have made significant inroads in health informatics in the intervening 2 decades,1 the field still has a long way to go in leveraging the strengths of qualitative research to meet the promise of health information technologies to improve clinical practice and patient outcomes. In this article, we report a scoping review of 158 qualitative articles published in JAMIA from its inception in 1994 through 2019. We then offer recommendations for health informatics scholars, including those who sit on editorial boards, health information technology (IT) professionals (including vendors and payers), and policy makers (research funders and health IT policy makers) about how to expand the range of qualitative approaches in future health informatics work.
Health information technologies have been instrumental in radically reconfiguring clinical work,7,8 the power relations between medical professionals and hospital administrations,9 public health practices,10 and the relationships between laypersons and their own bodies within and outside of clinical environments.11 Not all health IT disruptions have been universally beneficial.12–16
To reduce the risk of harm to health workers and consumers, public health agencies, and research, health informatics researchers and practitioners stand to gain by seeking to understand healthcare practices as they are actually performed at the front lines of care before technologically intervening in them. Qualitative methods are excellent for producing contextualized analyses, as shown both by the tradition of health informatics in JAMIA and by other technology-oriented fields, such as human–computer interaction (HCI),17 computer-supported cooperative work (CSCW),18 and science and technology studies (STS).19 In addition to generating rich accounts of the experience of working with health information technology platforms, some qualitative methods such as ethnographic fieldwork or long-term interview studies can also help health informatics researchers and practitioners, as they constantly reevaluate how technologically enabled interventions reconfigure clinical practices.20
While quantitative methods are useful for quantifying and predicting problems, qualitative methods can shed light on the impressions, narratives, and discourses that underlie human behavior. For example, qualitative studies of clinicians’ “noncompliance” with process regimens or patients’ “nonadherence” to medical regimens open avenues for institutional change to adapt to the practical and often unpredictable needs of clinicians and patients.7,21 Further, in health informatics, qualitative methods such as grounded theory22,23 and thematic analysis24 have been used to study and mitigate safety hazards created by usability issues,25 to mitigate the disruptions to workflow produced in the course of software rollouts,26 and to coordinate care for patients with special needs.27 However, qualitative methods have not been widely used within health informatics in the past.1 In the spirit of this special issue, we have taken the opportunity to take stock of the full scope of qualitative work in JAMIA and to identify opportunities for future work.
Objective
This scoping review investigated the following research question: how have qualitative methods been used in JAMIA? We paid particular attention to the varieties of objectives, populations studied, collection methods, cited analysis methods, and fields of analytical origin. In the discussion, we contrast the use of a relatively limited range of methods in the qualitative research in JAMIA with the wider variety of approaches used in HCI, CSCW, and STS. In these other technology-oriented fields, researchers and practitioners have used a wider variety of qualitative approaches to illuminate issues related to the design of systems, the contexts of their use, and patterns of historical development that shape technological deployment and uptake. These topics are of great interest to health informatics researchers and professionals, and the field could benefit from adopting a broader range of qualitative methods.
MATERIALS AND METHODS
We conducted a scoping review of qualitative research published in JAMIA, following the framework proposed by Arksey and O'Malley.28 Scoping reviews are better suited to heterogeneous and broad topics, such as the one approached in this study.29 The review included the following stages: identifying the research question, identifying relevant studies, study selection, charting (ie, extracting) the data, and collating, summarizing, and reporting the results. We used the PRISMA extension for scoping reviews30 as a guide. We opted not to conduct a critical appraisal in this scoping review; instead, we focused on uncovering absences. A detailed review protocol may be requested from the corresponding author.
Data sources and search strategy
In January 2020, we queried the PubMed database to identify potentially relevant work for this study, using keyword terms associated with qualitative methods (Table 1). We included articles entered into the PubMed database between 1994, the year of the journal’s inception, and 2019. Because the study focuses on 1 journal and has a well-defined scope (qualitative empirical research), the search query was iteratively developed by the members of the study team, who are health informatics researchers.
Table 1.
| Search component | Terms |
|---|---|
| Journal keyword | “Journal of the American Medical Informatics Association: JAMIA” |
| Methods keywords | qualitative, “grounded theory”, ethnomethodology, ethnomethodological, interview, interviews, ethnographic, observational, observations, “constant comparative”, “constant comparison”, “mixed method”, “mixed methods”, “mixed-method”, “mixed-methods” |
| Excluding | “systematic review” |
| Date of entry | from “1994/01/01” to “2020/01/01” |
| Full query | (“Journal of the American Medical Informatics Association: JAMIA”[Journal]) AND (qualitative OR “grounded theory” OR ethnomethodology OR ethnomethodological OR interview OR interviews OR ethnographic OR observational OR observations OR “constant comparative” OR “constant comparison” OR “mixed method” OR “mixed methods” OR “mixed-method” OR “mixed-methods”) NOT (“systematic review”) AND ((“1994/01/01”[Date - Entry] : “2020/01/01”[Date - Entry])) |
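For readers who wish to reproduce or adapt the search, the full query in Table 1 can also be submitted programmatically through the NCBI E-utilities esearch endpoint. The following Python sketch is illustrative only: the endpoint and parameters follow the publicly documented E-utilities interface, the query string is the one in Table 1, and details such as paging, rate limiting, and API keys are omitted.

```python
import requests

# NCBI E-utilities esearch endpoint (illustrative sketch; add an API key and
# paging via retstart/retmax for production use).
ESEARCH_URL = "https://eutils.ncbi.nlm.nih.gov/entrez/eutils/esearch.fcgi"

# Full query from Table 1.
query = (
    '("Journal of the American Medical Informatics Association: JAMIA"[Journal]) '
    'AND (qualitative OR "grounded theory" OR ethnomethodology OR ethnomethodological '
    'OR interview OR interviews OR ethnographic OR observational OR observations '
    'OR "constant comparative" OR "constant comparison" OR "mixed method" '
    'OR "mixed methods" OR "mixed-method" OR "mixed-methods") '
    'NOT ("systematic review") '
    'AND (("1994/01/01"[Date - Entry] : "2020/01/01"[Date - Entry]))'
)

response = requests.get(
    ESEARCH_URL,
    params={"db": "pubmed", "term": query, "retmax": 1000, "retmode": "json"},
    timeout=30,
)
response.raise_for_status()
result = response.json()["esearchresult"]

print("Records matching the query:", result["count"])
print("First 10 PMIDs:", result["idlist"][:10])
```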
Study selection
We included articles that documented original, qualitative empirical research published in JAMIA from 1994 to 2019. These included exclusively qualitative articles (eg, ethnographic works)3,31 and some mixed-methods articles in which qualitative methods had a significant role in determining the results of the study (eg, studies in which interviews and quantitative surveys were conducted and triangulated).32 We excluded articles in which the results were exclusively or almost exclusively quantitative (eg, Likert scale-based surveys), as well as articles that used qualitative collection or coding methods but then analyzed the data with quantitative (eg, quantifying frequencies of medical errors), machine learning, or natural language processing methods (called system development in Figure 1). We also excluded reviews, essays, white papers, articles assessing electronic health records (EHRs) or database data quality, proposals, and articles that were not empirical or that did not provide enough information for our analysis.
Screening, data charting, and synthesis of results
The literature search identified 531 potentially relevant articles for screening. We divided the screening and data extraction tasks among the research team. We then assessed titles, abstracts, and article bodies to select eligible articles following the criteria above. To ensure consistency among reviewers, we first screened and assessed several articles together, discussing possible inconsistencies. Then, we divided the entire set of articles into roughly equal-sized subsets, assigned them to each member of the team, and determined eligibility within those subsets independently. We reviewed one another’s subsets and met to build consensus. We excluded 373 articles that did not fit the eligibility criteria, resulting in a final sample of 158 articles. The process is shown in Figure 1.
In our analysis of the 158 articles in our sample, we operationalized the use of qualitative methods along these parameters: study objectives, populations of interest (ie, who was studied), the data collection methods used, the analysis methods, and the fields from which the analytical methods were drawn.
We extracted study objectives from articles’ abstracts and introductions. Populations and data collection methods were extracted from the methods section. Data analysis methods were extracted from methods sections and cross-referenced in references sections. We extracted the fields from which the analytical methods were drawn from the referenced texts—for manuscripts, this was available in the publication venue’s stated field, and for books, it was available in the Library of Congress categorization available on the copyright page. We independently charted these data in a spreadsheet and discussed the results as a team to ensure consistency.
After extraction, we drew upon thematic analysis24 to identify the objectives of the studies and to synthesize the objectives into 7 main categories. Specifically, we followed the first 5 steps of Braun and Clarke’s method (data familiarization, code generation, theme identification, theme review, and theme definition), omitting the sixth and final reporting step; the purpose was to draw a scope rather than to closely examine the themes we found. For populations of interest, data collection methods, and analytical methods, we extracted, cleaned, and merged the common patterns (eg, populations were grouped into healthcare providers, healthcare consumers, and other stakeholders). At least 2 researchers analyzed and coded each dimension. In the next section, we present our results as descriptive analyses of study objectives, data collection methods, analytical methods, and fields of analytical origin. A complete list of included articles and their classifications can be found in the Supplementary Material.
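As a concrete illustration of the charting and merging step, the sketch below shows how extracted population labels might be collapsed into the 3 stakeholder groups used in our analysis. The file name, column names, and label-to-group mapping are hypothetical examples, not our actual charting spreadsheet.

```python
import pandas as pd

# Hypothetical charting spreadsheet: one row per included article, with the
# population labels extracted from each methods section (semicolon-separated).
charted = pd.read_csv("charted_articles.csv")  # assumed columns: pmid, year, populations

# Example mapping from extracted labels to the 3 stakeholder groups.
GROUPS = {
    "physician": "healthcare providers",
    "nurse": "healthcare providers",
    "pharmacist": "healthcare providers",
    "hospital manager": "healthcare providers",
    "patient": "healthcare consumers",
    "caregiver": "healthcare consumers",
    "ehr vendor": "other stakeholders",
    "payer": "other stakeholders",
    "researcher": "other stakeholders",
}

def to_groups(populations: str) -> frozenset:
    """Map a semicolon-separated string of labels to a set of stakeholder groups."""
    labels = (label.strip().lower() for label in populations.split(";"))
    return frozenset(GROUPS[label] for label in labels if label in GROUPS)

charted["stakeholder_groups"] = charted["populations"].map(to_groups)
print(charted[["pmid", "stakeholder_groups"]].head())
```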
RESULTS
From 1994 to 2019, a total of 3791 articles were published in JAMIA. As shown in Figure 1, of these articles, 158 mainly used qualitative analysis methods. References and data are available in the Supplementary Material.
Although qualitative articles represented only 4.2% of total JAMIA articles, the number of qualitative articles published in JAMIA has increased over the years. Figure 2 presents the proportion of qualitative articles published in JAMIA throughout its history and the number of qualitative studies published per year, normalized by the total number of articles.
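The normalization behind Figure 2 is straightforward; a minimal sketch follows, assuming 2 hypothetical files listing the publication year of every JAMIA article and of each included qualitative article.

```python
import pandas as pd

# Hypothetical inputs (assumed columns: pmid, year).
all_articles = pd.read_csv("jamia_all_articles.csv")    # all 3791 JAMIA articles, 1994-2019
qualitative = pd.read_csv("included_qualitative.csv")   # the 158 included articles

totals = all_articles.groupby("year").size()
qual_counts = qualitative.groupby("year").size().reindex(totals.index, fill_value=0)

# Per-year proportion of qualitative articles: counts normalized by the total
# number of JAMIA articles published that year.
proportion = qual_counts / totals
print(proportion.round(3))

# Overall share across the whole period (4.2% in our sample).
print(f"Overall proportion: {qual_counts.sum() / totals.sum():.1%}")
```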
These 158 studies used different collection and analysis methods to explore different objectives and to target different populations in the health informatics domain. In the following sections, we detail these objectives, populations of interest, collection methods, analytic methods, and fields of analytical origin in order to characterize the qualitative body of work published in JAMIA.
Study objectives
From our thematic analysis, we identified 7 broad themes from the objectives of JAMIA’s body of qualitative work. The definitions for these 7 categories of objectives are summarized in Table 2. In order of frequency, the most common objective category was describe technology use, followed by understand needs, learn from deployment, and then initial evaluation. Assess system impact, analyze technology development, and conceptual or definitional were less common. Except for conceptual or definitional (5 articles), we found that all the others could be separated into 2 groups directly related to the system development process: studying health informatics systems before deployment (76) and after deployment (77).
Table 2.
| Deployment phase | Objective category | N |
|---|---|---|
| Before deployment (N=76) | Understand needs: these studies focused on analyzing medical practices and users' needs. They often involved understanding people's processes, workflows, and mental models, to gather requirements for the development of an IT system that would align with user needs, expectations, and practices. Example: Moen and Brennan33 conducted interviews with health consumers about their experiences managing health information at home, aiming to derive implications for consumer health informatics systems. | 41 |
| | Analyze technology development: these studies focused on analyzing the processes used to develop health informatics technologies, to analyze how health informatics systems were conceptualized, negotiated, and implemented. Example: Ratwani et al34 conducted interviews with EHR vendor staff to analyze their user-centered design practices aiming to identify challenges related to system development that play a role in EHR usability. | 6 |
| | Initial evaluation: these studies described an early evaluation of a system that had not been deployed in a real environment, such as a clinical trial or usability test, to identify opportunities for improvement. Example: Wilcox et al35 conducted a pilot study of a medication-tracking tool, interviewing patients and healthcare providers (clinical pharmacists). | 29 |
| After deployment (N=77) | Learn from deployment: these studies focused on identifying best practices and potential challenges (particularly sociotechnical challenges) of deploying a system or an institutional program for use in a real-world setting, often aiming to support the success of future efforts. Example: Novak et al36 explored the work a group of nurses performed as mediators of the adoption and use of a barcode medication administration system in an inpatient setting. | 26 |
| | Assess system impact: these studies assessed the impact of a system deployed to supplement or replace established paper-based practices or software systems. They have often used methods such as a longitudinal study in 1 setting, or compared different settings, shedding light on uptake and effectiveness, and highlighting issues that need to be addressed to better support end users. Example: Richardson and Ash37 analyzed the effects of introducing hands-free communication devices in 2 hospitals. | 8 |
| | Describe technology use: these studies typically focused on understanding how people use technologies that have been used in a real setting for a significant time, often focusing on unintended consequences and work-arounds. These studies may promote reconfigurations of work practices or computer systems. Example: Winkelman et al38 analyzed how patients with chronic inflammatory bowel disease used and valued internet-based patient access to electronic patient records. | 43 |
| Separate from deployment | Conceptual or definitional: these studies defined a nomenclature, the curriculum of a course, policies, or a research space. Example: Embi and Payne39 described the field of Clinical Research Informatics and identified its main challenges and opportunities. | 5 |
In the following, we present trends over time for the top 4 most common objective categories (Figure 3). Articles that focused on understanding needs have increased each year since 2011. We observed a similar pattern for initial evaluation articles, which increased after 2015. However, this increasing trend did not hold for either describe technology use or learn from deployment; both of these objective categories increased after 2002, with no sustained monotonic trend since that year.
Study populations
We grouped populations of interest into 3 stakeholder groups: healthcare providers, healthcare consumers, and other stakeholders. There have been studies which focused on 1, 2, or all 3 stakeholder groups at the same time.
The first group, healthcare providers, encompasses clinicians (eg, physicians, nurses, pharmacists, health professional students) and any healthcare staff, such as hospital managers and health IT staff. Studies of healthcare providers have often examined clinical systems deployed in the workspace, such as EHRs, computerized decision support (CDS), or hands-free communication device systems for use in hospitals.
The healthcare consumers group encompasses patients and caregivers, such as family members and friends. Many of the articles studying this population have analyzed systems intended for use outside of clinical settings (eg, self-management tools or social media websites), by patients themselves or by patients alongside healthcare providers (eg, patient portals). Articles focusing on this population have also explored systems used within traditional clinical settings, such as analyzing patients’ experiences concerning physicians’ use of EHRs during consultations or focusing on existing or potential systems intended to support inpatients’ needs.
Third, the other stakeholders group includes actors that are not healthcare providers or healthcare consumers, such as researchers, government agencies, health IT vendors, payers, and consumer technology companies.
Figure 4A shows a Venn diagram of the number of articles studying each group. Articles studying only healthcare providers were in the majority (89), followed by healthcare consumers (28), and finally other stakeholders (15). Comparatively, there were fewer articles focusing on multiple stakeholder groups. Although healthcare providers and healthcare consumers can be considered the main populations for healthcare informatics, few articles (18) studied them at the same time. There were 3 articles in which all 3 groups were studied at the same time.
We also analyzed the stakeholder groups in terms of relevant articles published per year, normalized to total publication counts. In Figure 4B, the lines for healthcare providers and for healthcare consumers represent the number of articles which exclusively studied the referred population (89 and 28 articles, respectively) as well as the articles studying them alongside other stakeholders (6 and 2, respectively). The line representing healthcare providers and consumers includes the 15 articles which studied both providers and consumers at the same time, as well as the 3 articles which additionally included other stakeholders. The line representing other stakeholders includes all 26 articles that studied any “other” stakeholders (eg, researchers, health IT vendors, and payers). We found that the proportion of qualitative articles focusing on healthcare providers followed a relatively constant trend over the years. With healthcare consumers, however, there was an upward trend of qualitative articles starting in 2011.
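The counts in Figure 4 follow from set membership over the stakeholder groups charted for each article; continuing the hypothetical charted data frame sketched in the Methods section, the regions and trend lines could be computed as follows.

```python
# charted["stakeholder_groups"] holds the set of stakeholder groups per article
# (see the hypothetical charting sketch in the Methods section).
def studies(group):
    return charted["stakeholder_groups"].apply(lambda g: group in g)

prov = studies("healthcare providers")
cons = studies("healthcare consumers")
other = studies("other stakeholders")

# Venn regions in Figure 4A (in our sample: 89, 28, and 15 exclusive articles,
# plus 18 studying both providers and consumers, 3 of which include all groups).
print("providers only:", (prov & ~cons & ~other).sum())
print("consumers only:", (~prov & cons & ~other).sum())
print("other stakeholders only:", (~prov & ~cons & other).sum())
print("providers and consumers:", (prov & cons).sum())

# Lines in Figure 4B combine the exclusive counts with the articles that add
# "other" stakeholders, eg, providers only plus providers with others (89 + 6).
print("providers line:", (prov & ~cons).sum())
print("consumers line:", (cons & ~prov).sum())
print("providers and consumers line:", (prov & cons).sum())
print("other stakeholders line:", other.sum())
```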
Data collection methods
As shown in Figure 5, interviews were by far the most common collection method (111 articles), followed by observations (64), artifact analysis (31), focus groups (28), and surveys (18). In the “other” category, 7 used think-aloud protocols, 2 used participatory design, 1 each used Delphi methods, EHR timestamp data to qualitatively study clinical workflow, app store reviews, social media data, and usability inspection. Many studies used a combination of methods, such as interviews and observations.
Analysis methods
As shown in Figure 6, a plurality of articles (66 of 158) did not cite an analysis method, although some of these reported an analysis method by invoking its name (eg, “grounded theory”). We focused mainly on cited analysis methods, which provided us with the specific methodological lineages of each study, including the fields from which the authors drew the analytical method. The most commonly cited methods were Straussian (33 articles) and Glaserian (16) grounded theory, followed by thematic analysis (14). Other methods used included content analysis (8), mixed-methods analytical techniques (4), case study (3), and concept mapping (3) methods. There were 2 citations for each of the following analysis methods: ethnography, process tracing, and usability analysis. There was 1 citation for each of these: action research (a critical reflection of action with research), affinity diagrams (an idea-mapping approach), contextual design (a user-centered design process), discourse analysis (an approach analyzing language in social contexts), the empirical program of relativism (an approach within STS), focus groups, frame analysis (a multidisciplinary approach focused on identifying how individuals understand situations and activities), framework analysis (an applied policy approach), hierarchical task analysis (an approach used to describe activities), participatory design (an approach which engages end users with design), and template analysis (a structured approach to thematic analysis).
Fields of origin
Regarding the fields of origin of cited analysis methods (Figure 7): out of the 92 articles that provided at least 1 such citation, a majority (50) cited sociological sources (books or articles), 19 cited interdisciplinary sources, 17 cited medical or health informatics sources, 11 cited psychological sources, 6 cited design sources, and 5 cited education sources. There were 2 citations to information systems and management sources, 1 cited a social services source, and 1 cited an anthropological source. Some articles cited analysis methods from more than 1 field (eg, sociological and interdisciplinary sources).
DISCUSSION
The proportion of qualitative work, particularly work including consumers, has been increasing in JAMIA over time. At the same time, the scope of qualitative research reported in JAMIA appears to be relatively restricted in comparison with the venues of other technology-oriented fields, such as the ACM Conference on Human Factors in Computing Systems (CHI), whose published proceedings function as the flagship journal of the field of HCI; the ACM Conference on CSCW, the similarly structured flagship venue for computer-supported cooperative work; and Science, Technology, & Human Values, the flagship journal for STS. We were also surprised to find only 2 ethnographic works and only 1 article citing an analytical method originating in STS; Forsythe herself was an anthropologist and STS scholar.40
HCI, CSCW, and STS have paid attention to work practices that are complex, collaborative, and require expertise, such as healthcare, aviation, and nuclear power plant operation.41–45 Many of the methods used in HCI, CSCW, and STS have already been successfully applied to health technologies, albeit outside of health informatics venues.1,26,46 Health informatics will likely benefit from adopting those approaches more centrally.
In the interest of fostering productive interdisciplinary engagements, we provide specific recommendations for scholars (authors, reviewers, editorial board members), practitioners (EHR vendors, hospital administrators, insurance, and payers), and government representatives (research funders and health IT policy makers). We recommend (1) broadening the scope of qualitative approaches to include those already used in other technology-oriented fields; (2) understanding research, practice, and policy making in health informatics as a sociotechnical enterprise—as social in addition to technical—bringing social considerations into the realm of legitimate scientific investigation; and (3) supporting broader and more socially oriented studies both financially and structurally.
Broaden the scope of qualitative methods to improve research and practice
Here, we provide examples of some types of qualitative work more commonly found in other technology-oriented fields and which health informatics researchers and practitioners may find valuable. We describe how methods relatively absent in our review—such as ethnographic approaches, distributed cognition, usability inspections, think-aloud protocols, and participatory design—can be used to enhance the field of health informatics. These methods can help researchers and practitioners understand practices prior to developing technology solutions, to fix usability issues early (while they are inexpensive to fix and before they cost lives) and to serve users throughout software design and development.
Understand practices prior to planning technological interventions
Aligned with Forsythe's view,47 studies from other technology-oriented fields48 have emphasized the importance of understanding how social activity is structured within a community of potential users prior to the introduction of technological interventions. This approach is opposed to assuming that perceived problems can be easily fixed through a technological solution. HCI and CSCW have shown the importance of conducting thorough research in daily clinical work prior to planning technological interventions. This is key to minimizing technologically induced disruptions in clinical practice which endanger patients.12
Therefore, we recommend researchers, practitioners, and policy makers understand people's daily practices before developing or mandating technological solutions. There are several methods for studying everyday activity.49 For example, ethnographic approaches have been used to study battleship navigation50 and the relative safety of paper “flight strips” compared with computer software in air traffic control.51,52 Distributed cognition, an ethnographic approach, has been used to study critical areas of healthcare, such as trauma care and cardiac surgery, because such work is collaborative, complex, uncertain, and ever-changing.4–6,53 Such work has even been published in health informatics journals such as Journal of Biomedical Informatics and International Journal of Medical Informatics,5,6,53 but it was rare in JAMIA.
Evaluate systems early and often, prioritizing and fixing usability issues while they are inexpensive to fix and before they cost lives
In our scoping review, observations and interviews were common. However, other methods, such as usability inspections and think-aloud protocols, were rare. These methods have been more commonly used in HCI,17,54 a field closely integrated with corporate software design.46,55
In qualitative usability inspections, several trained usability experts independently inspect the interactive design of a software system for usability issues,17,56 similar to the way peer reviewers examine a manuscript. The qualitative results are then collated to guide changes to the design, before a prototype is built. Inspections may be repeated throughout design and development.
In think-aloud protocols, participants use a software prototype—even one made of paper—to accomplish a task while thinking out loud. This often uncovers qualitative reasons for user confusion (“usability bugs”) early,17 so they may be addressed while the “system” is still a prototype that is inexpensive and easy to revise, before it becomes an enterprise software system that is expensive and difficult to revise,57 and before any patients are harmed.58
Usability inspections were mandated by Meaningful Use,59,60 a US government program that incentivized the expanded use of EHRs in healthcare.61 However, according to Halamka, who served as cochair of ONC's Health IT Standards Committee,62 policy makers prioritized data collection requirements, creating usability issues associated with repetitive and low-value data entry. We recommend that policy makers and practitioners prioritize usability.
Support users as they lead iterative software design and development
All too often, software systems are simply foisted upon users.63,64 Proponents of participatory design and community-driven design65 argue that this is both ineffective and unethical and that the actual end users must drive key decisions during every stage of software design.66 Proponents of participatory software design, in the spirit of the Scandinavian social democratic labor movement, were originally concerned with the democratic design of computer software for the workplace;66 as computing has expanded beyond the workplace, so has the realm of participatory software design.67
Participatory design methods were rare in JAMIA; we found only 2 articles. To better support end user needs, health informatics researchers and practitioners should collaborate with experienced participatory designers and researchers, centralizing end users as key decision-makers. Additionally, by validating and meaningfully addressing the needs of end users, wide adoption of participatory design’s social–democratic stance66 among researchers, practitioners, administrators, and policy makers may address some of the deeper issues in healthcare, such as clinician burnout, patient dissatisfaction, and patient safety hazards.68,69
Understand health informatics as a sociotechnical enterprise to guide its continual reform
Diana Forsythe studied “those who study us.”1 This reflected her commitment to feminist STS scholarship—she questioned the assumptions of power that typically define the divides between “investigators” and “subjects.”40,70 To continue her legacy, health informatics funders and researchers can invest in carrying out the extensive and delicate work of socially analyzing health informatics across policy, research, business, and practice.
Study stakeholders other than patients and providers
Most of the qualitative studies in our review focused on providers (89), though a steady stream of consumer-focused work has emerged (28). Notably, other stakeholders—such as health IT vendors, payers, administrators, and nonclinical personnel, such as medical secretaries, social workers, and the researchers themselves—represented the smallest portion of our review (15). Health informatics is a complex sociotechnical enterprise involving many partners, regulators, institutions, and personnel; leaving out key stakeholders frequently leads to serious problems.71–73
Because EHR developers and managers, hospital IT staff and administrators, payers, and policy makers shape the technological environments that clinicians and patients contend with, the field will benefit from more qualitative study of these influential groups. Funders should enable researchers to study these key stakeholders using methods such as those from CSCW74 and STS.75
Study geographically distributed social practices
Multisited ethnography76 is a method that allows for the study of distributed practices. In the language of ethnographers in STS, the work of materializing, shaping, and sustaining the apparatus known as the EHR is distributed among vendors, healthcare organizations, payers, policy makers, and patient advocacy groups such as Leapfrog. Such study, once supported by funders, should enable researchers and practitioners to understand the broad sociotechnical contexts that influence the development and uptake of health information systems—and perhaps even to rationalize barriers to success that have been considered too sensitive to redress.
Increase the focus on clinical encounters
We found a recent stream of qualitative work focused on healthcare consumers. This recent interest may have been motivated in part by an emerging consumer health technology market (eg, mobile health apps), as well as by regulatory incentives to implement patient portals77 and to incorporate patient-generated health data into healthcare practices.78,79 It has become critical to analyze and understand how patients use these tools and how their use can influence clinical practices; clinical encounters involving patient-generated health data may influence patient satisfaction, a measure for which healthcare organizations are increasingly held accountable.79 Despite the critical importance of the topic, studies analyzing interactions between healthcare providers and consumers were uncommon. Thus, there are opportunities for researchers to qualitatively study technologies intended to facilitate relations between experts and laypersons.
Study how policies shape organizational objectives, which affect clinicians and patients in turn
In our review, we found that studies have tended to focus on specific implementations at the level of the healthcare organization. Since public policy increasingly shapes the objectives of healthcare organizations,61 it may be fruitful for researchers to also study how policies shape organizational actions and how those actions affect clinicians and patients in turn. Policy makers should fund such research, and researchers should be involved in the policy making process; ideally, there should be a 2-way relationship between health IT policy makers and researchers studying the front lines of care.
Expand support for impact studies
There has been a recent and rapid expansion of US healthcare technology markets, which have been tightly linked to and driven by regulatory incentives that have not foregrounded issues of usability, focusing instead on technological capacity for data capture, storage, and transmission.80 In our analysis, learning from deployment and describing technology use were considerably more common than assessing system impact—on workers and consumers—after a system was deployed. This may be attributable in part to policy: in response to Meaningful Use, now called Promoting Interoperability, many institutions acquired such systems, creating opportunities to study technological deployment and use rather than to decide whether it was desirable to have deployed those technologies in the first place. An expansion of multiyear and even multidecade longitudinal impact studies may warrant funding.
Support broader, socially oriented studies in health informatics
There are multiple standards of qualitative rigor, and we provide recommendations for addressing some of the challenges that the authors, reviewers, and editorial boards of health informatics venues may encounter when attempting to accommodate them.81–84
Value novel views of established practices, not only novel technologies
As discussed in the section titled "understand practices prior to planning technological interventions," the study of ordinary, everyday, well-established clinical work is essential to minimizing disruption. Such research rarely involves presenting a marketable, novel technology or predictive model, but often reveals novel views of established practices; funders, reviewers, and editorial boards should view it as valuable, essential, prestigious, and central to health informatics—and therefore worthy of funding and publication in its flagship journal.
Cite original sources to substantiate analysis methods
Some articles referred to “conventional” or “standard” forms of qualitative analysis methods, often citing interdisciplinary sources, medical sources, and entire textbooks, but over 40% of eligible articles did not cite the analytical method used. There are many such methods, each with its own disciplinary and paradigmatic context,23,85 and each should be interpreted in that context. Authors should at least cite specific, original sources to substantiate analysis methods.
Explain and respect disciplinary norms and paradigmatic commitments
There is a wide variety of qualitative methods available, each of which is rigorous, and each of which judges rigor differently. For example, in ethnographic work, “open data” would present an unethical breach of participant confidentiality.86
More broadly, in the past half-century, new paradigms have emerged within qualitative research; Lincoln and Guba have characterized these paradigms as interpretivist, and they have long held that it is inappropriate to evaluate such research using positivist or postpositivist standards of scientific rigor.81–84 Specifically, instead of seeking reproducible results to uncover general mechanisms that govern the social world from an impartial perspective, many qualitative researchers reject assumptions that such general mechanisms exist, instead seeking interpretations that, while always and by necessity partial, are fair, revealing, and meet the unique needs of social movements, such as the data justice movement.81,87
In our experience, STS has a strong base of interpretivist scholarship, followed by CSCW and HCI. When composing interpretivist articles for HCI venues, one is expected to specify and explain the analytical methods used, to delineate their disciplinary orientation and paradigmatic commitments, and to support those norms and commitments with citations. Such practice may benefit interpretivist authors aiming to publish in JAMIA.
Increase or remove specific space limits for qualitative studies
Authors will need to be provided the space necessary to explain divergent disciplinary norms and paradigmatic commitments81–84 and to provide the rich descriptions that are characteristic of qualitative work. Science, Technology, & Human Values has an 8000-word limit.88 The published proceedings of CSCW and CHI no longer have specific space limits; reviewers judge length based on appropriateness for methods and contributions.89,90 So, to better accommodate qualitative research, editorial boards should increase or remove specific space limits.
Cultivate reviewer bases for evaluating multiple forms of rigor
According to Lincoln, Guba, and others, not all qualitative methods include replicability or reproducibility in their standards of rigor—this does not mean that certain standards are lower; they are instead incomparable.2,81–84 It is often difficult to ensure replicability or reproducibility for questions of immediate interest to nonresearchers.91 There may be much of importance to learn, then, from paradigms that do not assume reproducibility and concomitant generalizability. To make space for such research,81–84 the criteria of replicability, reproducibility, and generalizability should be applied more selectively by reviewers; editorial boards should work to cultivate reviewer bases with experience evaluating alternative orientations.81–84
Limitations
Our review has some limitations. First, we focused on 1 journal. However, because JAMIA is widely considered the flagship journal in health informatics, there is value in analyzing the type of qualitative research it publishes to identify possible gaps and opportunities for future studies. Also, due to scope, we did not perform a comprehensive review of HCI, CSCW, and STS, so our recommendations about the qualitative methods available to health informaticists from these fields are drawn from, and therefore constrained by, our extensive collective experience in those fields. Further, we did not include critical essays, an important and distinct method of inquiry that deserves focused study. Finally, although critical appraisal of articles is not required in scoping reviews, such analysis would be useful to identify areas to improve quality in the literature.29
Future work
In the short term, further review involving quality appraisal of qualitative work—in JAMIA, in other health informatics venues, and even in health informatics work published in HCI, CSCW, and STS venues—may be in order. These initial reviews will be critical for the course of research that follows—if history is any guide, quality assessment will be an item of interest and controversy.81–84
As for the long term—it is clear that there is much to be done to continue the legacy of Diana Forsythe. A monumental and many-pronged task lies ahead: to invite scholars from HCI, CSCW, and STS to engage deeply and extensively with the JAMIA authorship and readership, to foster long-term collaborations with those scholars, and to cultivate a scholarly community that can recognize and evaluate multiple forms of rigorous qualitative work that have been largely absent from JAMIA and that may inform health policy. Scholars, practitioners, and policy makers will need to resist, in every space and at every turn, the foreclosure of potentially beneficial avenues of inquiry.
CONCLUSION
We found a growing presence of qualitative research in JAMIA, though relatively limited in scope. We have provided specific suggestions for expanding the variety of qualitative approaches considered central to the research, practice, and policy of health IT and health informatics, especially those which are commonly used in the fields of HCI, CSCW, and STS. This will enable researchers and practitioners to produce work that is more responsive to the needs of clinicians, patients, and other users and subjects of health data, improving patient outcomes and safety.
FUNDING
This work was supported by the National Center for Research Resources and the National Center for Advancing Translational Sciences, National Institutes of Health, grant number TL1TR001415-05. The content is solely the responsibility of the authors and does not necessarily represent the official views of the NIH.
AUTHOR CONTRIBUTIONS
MIH and MCF made substantial contributions to the conception and design of the work, the acquisition and analysis of data, and interpretation of results, drafting, and critical revisions for important intellectual content. BDT and ZS provided substantial contributions to data acquisition, analysis, and results interpretation. SM provided substantial contributions to the interpretation of data for the work, as well as drafting and revisions, and critical revisions for important intellectual content. EVE provided substantial contributions to the design of the work, the acquisition of data and interpretation of results for the work, and drafting and critical revisions for important intellectual content. YC made substantial contributions to the design of the work, as well as the interpretation of results for the work, and critical revisions for important intellectual content. All authors provided final approval of the version to be published and agree to be accountable for all aspects of the work in ensuring that questions related to the accuracy or integrity of any part of the work are appropriately investigated and resolved.
SUPPLEMENTARY MATERIAL
Supplementary material is available at Journal of the American Medical Informatics Association online.
Supplementary Material
ACKNOWLEDGMENTS
We thank the editors and reviewers for their careful and thorough inspection of prior versions of this manuscript and for their exceptionally thoughtful and constructive advice.
CONFLICT OF INTEREST STATEMENT
None declared.
REFERENCES
- 1. Forsythe DE. Studying those who study us; Blaming the user in medical informatics: medical informatics appropriates ethnography; The cultural nature of scientific practice. In: Hess DJ, ed. Studying Those Who Study Us: An Anthropologist in the World of Artificial Intelligence. Stanford, CA: Stanford University Press; 2001: 132.
- 2. Boellstorff T, Nardi B, Pearce C, et al. Ten myths about ethnography. In: Ethnography and Virtual Worlds: A Handbook of Method. Princeton, NJ: Princeton University Press; 2012: 29–48.
- 3. Unertl KM, Weinger MB, Johnson KB, et al. Describing and modeling workflow and information flow in chronic disease care. J Am Med Inform Assoc 2009; 16 (6): 826–36. doi: 10.1197/jamia.M3000
- 4. Xiao Y, Lasome C, Moss J, et al. Cognitive properties of a whiteboard: a case study in a trauma centre. In: ECSCW 2001. Berlin: Springer; 2001: 259–78.
- 5. Hazlehurst B, McMullen CK, Gorman PN. Distributed cognition in the heart room: how situation awareness arises from coordinated communications during cardiac surgery. J Biomed Inform 2007; 40 (5): 539–51.
- 6. Rajkomar A, Blandford A. Understanding infusion administration in the ICU through distributed cognition. J Biomed Inform 2012; 45 (3): 580–90.
- 7. Wallenburg I, Bal R. The gaming healthcare practitioner: How practices of datafication and gamification reconfigure care. Health Informatics J 2019; 25 (3): 549–57.
- 8. Eikey EV, Chen Y, Zheng K. Unintended adverse consequences of health IT implementation: workflow issues and their cascading effects. In: Cognitive Informatics. Berlin: Springer; 2019: 31–43.
- 9. Campbell EM, Sittig DF, Ash JS, et al. Types of unintended consequences related to computerized provider order entry. J Am Med Inform Assoc 2006; 13 (5): 547–56.
- 10. Lupton D. Critical perspectives on digital health technologies. Sociol Compass 2014; 8 (12): 1344–59.
- 11. Rich E, Miah A. Mobile, wearable and ingestible health technologies: towards a critical research agenda. Health Sociol Rev 2017; 26 (1): 84–97.
- 12. Sittig DF, Ash JS, Zhang J, et al. Lessons from “unexpected increased mortality after implementation of a commercially sold computerized physician order entry system.” Pediatrics 2006; 118 (2): 797–801.
- 13. Ash JS, Berg M, Coiera E. Some unintended consequences of information technology in health care: the nature of patient care information system-related errors. J Am Med Inform Assoc 2003; 11 (2): 104–12.
- 14. Ash JS, Sittig D, Campbell E, et al. An Unintended Consequence of CPOE Implementation: Shifts in Power, Control, and Autonomy. AMIA Annu Symp Proc 2006; 2006: 11–15.
- 15. Ash JS, Sittig DF, Dykstra R, et al. The unintended consequences of computerized provider order entry: Findings from a mixed methods exploration. Int J Med Inf 2009; 78: S69–76.
- 16. Cabitza F, Rasoini R, Gensini GF. Unintended consequences of machine learning in medicine. JAMA 2017; 318 (6): 517–8.
- 17. Benyon D. Understanding. In: Designing Interactive Systems: A Comprehensive Guide to HCI and Interaction Design. New York: Addison Wesley; 2010: 146–75.
- 18. Button G, Harper R. The relevance of ‘work-practice’ for design. In: Computer Supported Cooperative Work (CSCW). Berlin: Springer; 1996: 263–80.
- 19. Suchman L. Making work visible. Commun ACM 1995; 38 (9): 56–64.
- 20. Kohn L, Corrigan J, Donaldson M. To Err is Human: Building a Safer Health System. Washington, DC: National Academy Press; 2000.
- 21. Butler M, Coggan C, Norton R. A qualitative investigation into the receptivity to hip protective underwear among staff and residents of residential institutions. N Z Med J 1998; 111 (1075): 383–5.
- 22. Glaser BG. Basics of Grounded Theory Analysis: Emergence vs Forcing. 2nd printing. Mill Valley, CA: Sociology Press; 1992.
- 23. Corbin J. Taking an analytical journey. In: Morse JM, Stern PN, Corbin J, et al., eds. Developing Grounded Theory: The Second Generation. New York: Routledge; 2016: 35–53.
- 24. Braun V, Clarke V. Using thematic analysis in psychology. Qual Res Psychol 2006; 3 (2): 77–101.
- 25. Patterson ES, Cook RI, Render ML. Improving patient safety by identifying side effects from introducing bar coding in medication administration. J Am Med Inform Assoc 2002; 9 (5): 540–53.
- 26. Hartswood MJ, Procter RN, Rouchy P, et al. Working IT out in medical practice: IT systems design and development as co-realisation. Methods Inf Med 2003; 42 (04): 392–7.
- 27. Ranade-Kharkar P, Weir C, Norlin C, et al. Information needs of physicians, care coordinators, and families to support care coordination of children and youth with special health care needs (CYSHCN). J Am Med Inform Assoc 2017; 24 (5): 933–41.
- 28. Arksey H, O'Malley L. Scoping studies: towards a methodological framework. Int J Soc Res Methodol 2005; 8 (1): 19–32.
- 29. Pham MT, Rajić A, Greig JD, Sargeant JM, Papadopoulos A, McEwen SA. A scoping review of scoping reviews: advancing the approach and enhancing the consistency. Res Synth Methods 2014; 5 (4): 371–85.
- 30. Tricco AC, Lillie E, Zarin W, et al. PRISMA Extension for Scoping Reviews (PRISMA-ScR): Checklist and Explanation. Ann Intern Med 2018; 169 (7): 467.
- 31. Saleem JJ, Patterson ES, Militello L, et al. Exploring barriers and facilitators to the use of computerized clinical reminders. J Am Med Inform Assoc 2005; 12 (4): 438–47.
- 32. Anderson NR, Lee ES, Brockenbrough JS, et al. Issues in biomedical research data management and analysis: needs and barriers. J Am Med Inform Assoc 2007; 14 (4): 478–88.
- 33. Moen A, Brennan PF. Health@Home: The Work of Health Information Management in the Household (HIMH): Implications for Consumer Health Informatics (CHI) Innovations. J Am Med Inform Assoc 2005; 12 (6): 648–56.
- 34. Ratwani RM, Fairbanks RJ, Hettinger AZ, et al. Electronic health record usability: analysis of the user-centered design processes of eleven electronic health record vendors. J Am Med Inform Assoc 2015; 22 (6): 1179–82.
- 35. Wilcox L, Woollen J, Prey J, et al. Interactive tools for inpatient medication tracking: a multi-phase study with cardiothoracic surgery patients. J Am Med Inform Assoc 2016; 23 (1): 144–58.
- 36. Novak LL, Anders S, Gadd CS, et al. Mediation of adoption and use: a key strategy for mitigating unintended consequences of health IT implementation. J Am Med Inform Assoc 2012; 19 (6): 1043–9.
- 37. Richardson JE, Ash JS. The effects of hands-free communication device systems: communication changes in hospital organizations. J Am Med Inform Assoc 2010; 17 (1): 91–8.
- 38. Winkelman WJ. Patient-perceived usefulness of online electronic medical records: employing grounded theory in the development of information and communication technologies for use by patients living with chronic illness. J Am Med Inform Assoc 2005; 12 (3): 306–14. doi: 10.1197/jamia.M1712
- 39. Embi PJ, Payne PRO. Clinical research informatics: challenges, opportunities and definition for an emerging domain. J Am Med Inform Assoc 2009; 16 (3): 316–27.
- 40. Hess DJ. Editor’s introduction. In: Hess DJ, ed. Studying Those Who Study Us: An Anthropologist in the World of Artificial Intelligence. Stanford, CA: Stanford University Press; 2001: xi–xxv.
- 41. Oliveira R, Dupuy-Chessa S, Calvary G. Formal verification of UI using the power of a recent tool suite. In: proceedings of the 2014 ACM SIGCHI Symposium on Engineering Interactive Computing Systems; April 26–May 1, 2014: 235–40; Toronto, Canada.
- 42. Roth W-M, Jornet A.. Situational awareness as an instructable and instructed matter in multi-media supported debriefing: A case study from aviation. Comput Supp Coop Work 2015; 24 (5): 461–508. [Google Scholar]
- 43. Weber RN. Manufacturing gender in commercial and military cockpit design. Sci Technol Hum Values 1997; 22 (2): 235–53. [Google Scholar]
- 44. Perin C. Operating as experimenting: Synthesizing engineering and scientific values in nuclear power production. Sci Technol Hum Values 1998; 23 (1): 98–128. [Google Scholar]
- 45. Fitzpatrick G, Ellingsen G.. A review of 25 years of cscw research in healthcare: contributions, challenges and future agendas. Comput Supp Coop Work 2013; 22 (4-6): 609–65. [Google Scholar]
- 46. Kelley C, Lee B, Wilcox L. Self-tracking for mental wellness: understanding expert perspectives and student experiences. In: proceedings of the 2017 CHI Conference on Human Factors in Computing Systems; May 6–11, 2017: 629–41; Denver, CO. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 47. Hess D, Downey G, Suchman L, et al. Diana E. Forsythe (11 November 1947-14 August 1997. Soc Stud Sci 1998; 28 (1): 175–82. [Google Scholar]
- 48. Suchman L. Do Categories Have Politics? The language/action perspective reconsidered. In: proceedings of the Third European Conference on Computer-Supported Cooperative Work. September 13–17, 1993: 1–14; Milan, Italy. [Google Scholar]
- 49. Nardi BA. Studying context: a comparison of activity theory, situated action models, and distributed cognition. In: Nardi BA, ed. Context and Consciousness: Activity Theory and Human–Computer Interaction. Cambridge, MA: MIT Press; 1996: 69–102.
- 50. Hutchins E. Cognition in the Wild. Cambridge, MA: MIT Press; 1995.
- 51. Harper RHR. The organisation in ethnography – a discussion of ethnographic fieldwork programs in CSCW. Comput Supported Coop Work 2000; 9 (2): 239–64.
- 52. MacKay WE. Is paper safer? The role of paper flight strips in air traffic control. ACM Trans Comput-Hum Interact 1999; 6 (4): 311–40.
- 53. Hazlehurst B, Gorman PN, McMullen CK. Distributed cognition: an alternative model of cognition for medical informatics. Int J Med Inform 2008; 77 (4): 226–34.
- 54. Benyon D. Evaluation. In: Designing Interactive Systems: A Comprehensive Guide to HCI and Interaction Design. New York: Addison Wesley; 2010: 225–50.
- 55. Apple Computer, Inc. Human interface design and the development process. In: Macintosh Human Interface Guidelines. Menlo Park, CA: Addison-Wesley Publishing Company; 1992: 33–46.
- 56. Nielsen J. Enhancing the explanatory power of usability heuristics. In: Proceedings of the SIGCHI Conference on Human Factors in Computing Systems; April 24–28, 1994: 152–8; Boston, MA.
- 57. Marcus A. User interface design’s return on investment: examples and statistics. In: Bias RG, Mayhew DJ, eds. Cost-Justifying Usability: An Update for the Internet Age. New York: Elsevier; 2005: 17–40.
- 58. Leveson NG. The Therac-25: 30 years later. Computer 2017; 50 (11): 8–11.
- 59. Office of the National Coordinator for Health Information Technology. Safety-enhanced design; 2015. https://www.healthit.gov/test-method/safety-enhanced-design Accessed April 27, 2020.
- 60. Lowry S, Quinn M, Ramaiah M, et al. NISTIR 7804: Technical Evaluation, Testing, and Validation of the Usability of Electronic Health Records. Gaithersburg, MD: National Institute of Standards and Technology, US Dept of Commerce; 2012. https://www.nist.gov/system/files/documents/2017/05/09/NISTIR-7804.pdf Accessed April 27, 2020.
- 61. Blumenthal D, Tavenner M. The “meaningful use” regulation for electronic health records. N Engl J Med 2010; 363 (6): 501–4.
- 62. Halamka JD, Tripathi M. The HITECH era in retrospect. N Engl J Med 2017; 377 (10): 907–9.
- 63. Miliard M. Nurses not happy with hospital EHRs. Healthcare IT News; 2014. https://www.healthcareitnews.com/news/nurses-not-happy-hospital-ehrs Accessed May 2, 2020.
- 64. Drachmann H. [The Capital Region rehires only 1 in 6 medical secretaries]. Politiken. 2017. https://politiken.dk/indland/art6257249/Region-Hovedstaden-genansætter-kun-1-ud-af-6-lægesekretærer Accessed December 18, 2017.
- 65. Hess J, Offenberg S, Pipek V. Community driven development as participation? Involving user communities in a software design process. In: Proceedings of the Tenth Anniversary Conference on Participatory Design; September 30–October 4, 2008: 31–40; Bloomington, IN.
- 66. Muller MJ, Kuhn S. Participatory design. Commun ACM 1993; 36 (6): 24–8.
- 67. Björgvinsson E, Ehn P, Hillgren P-A. Participatory design and “democratizing innovation.” In: Proceedings of the 11th Biennial Participatory Design Conference; November 29–December 3, 2010: 41–50; Sydney, Australia.
- 68. Gardner RL, Cooper E, Haskell J, et al. Physician stress and burnout: the impact of health information technology. J Am Med Inform Assoc 2019; 26 (2): 106–14.
- 69. Panagioti M, Geraghty K, Johnson J, et al. Association between physician burnout and patient safety, professionalism, and patient satisfaction: a systematic review and meta-analysis. JAMA Intern Med 2018; 178 (10): 1317–31. [Retracted]
- 70. Suchman L. Feminist STS and the sciences of the artificial. In: Hackett EJ, Amsterdamska O, Lynch M, et al., eds. The Handbook of Science and Technology Studies. Cambridge, MA: The MIT Press; 2008: 139–64.
- 71. Bossen C, Groth Jensen L, Witt F. Medical secretaries’ care of records: the cooperative work of a non-clinical group. In: Proceedings of the ACM 2012 Conference on Computer Supported Cooperative Work; February 11–15, 2012: 921–30; Bellevue, WA.
- 72. Møller NLH, Vikkelsø S. The clinical work of secretaries: exploring the intersection of administrative and clinical work in the diagnosing process. In: From Research to Practice in the Design of Cooperative Systems: Results and Open Challenges. Berlin: Springer; 2012: 33–47.
- 73. Eikey EV, Murphy AR, Reddy MC, et al. Designing for privacy management in hospitals: understanding the gap between user activities and IT staff’s understandings. Int J Med Inform 2015; 84 (12): 1065–75.
- 74. Pine KH, Mazmanian M. Institutional logics of the EMR and the problem of “perfect” but inaccurate accounts. New York: ACM Press; 2014: 283–94.
- 75. Woolgar S, Grint K. Computers and the transformation of social analysis. Sci Technol Hum Values 1991; 16 (3): 368–78.
- 76. Marcus GE. Multi-sited ethnography: five or six things I know about it now. In: Multi-Sited Ethnography. New York: Routledge; 2012: 24–40.
- 77. Office of the National Coordinator for Health Information Technology. Patient Portal for Improved Communication. 2017. https://www.healthit.gov/success-story/patient-portal-improved-communication Accessed April 12, 2020.
- 78. Bietz MJ, Bloss CS, Calvert S, et al. Opportunities and challenges in the use of personal health data for health research. J Am Med Inform Assoc 2016; 23 (e1): e42–8.
- 79. Sanger PC, Hartzler A, Lordon RJ, et al. A patient-centered system in a provider-centered world: challenges of incorporating post-discharge wound data into practice. J Am Med Inform Assoc 2016; 23 (3): 514–25.
- 80. Adler-Milstein J, Jha AK. HITECH Act drove large gains in hospital electronic health record adoption. Health Aff (Millwood) 2017; 36 (8): 1416–22.
- 81. Schwandt TA, Lincoln YS, Guba EG. Judging interpretations: but is it rigorous? Trustworthiness and authenticity in naturalistic evaluation. New Dir Eval 2007; 2007 (114): 11–25.
- 82. Lincoln YS, Guba EG. But is it rigorous? Trustworthiness and authenticity in naturalistic evaluation. New Dir Prog Eval 1986; 1986 (30): 73–84.
- 83. Manning K. Authenticity in constructivist inquiry: methodological considerations without prescription. Qual Inq 1997; 3 (1): 93–115.
- 84. Lincoln YS, Guba EG. Paradigmatic controversies, contradictions, and emerging confluences. In: Denzin NK, ed. The Sage Handbook of Qualitative Research. New York: Sage; 2005: 163–88.
- 85. Boellstorff T, Nardi B, Pearce C, et al. Data analysis. In: Ethnography and Virtual Worlds: A Handbook of Method. Princeton, NJ: Princeton University Press; 2012: 159–81.
- 86. Boellstorff T, Nardi B, Pearce C, et al. Ethics. In: Ethnography and Virtual Worlds: A Handbook of Method. Princeton, NJ: Princeton University Press; 2012: 129–50.
- 87. Kuntsman A, Miyake E, Martin S. Re-thinking digital health: data, Appisation and the (im)possibility of ‘opting out’. Digit Health 2019; 5: 2055207619880671. doi:10.1177/2055207619880671
- 88. Society for Social Studies of Science. Science, Technology, & Human Values: Submission Guidelines. n.d. https://us.sagepub.com/en-us/nam/science-technology-human-values/journal200858#submission-guidelines Accessed April 12, 2020.
- 89. Drucker S, Bjørn P. Papers | CHI 2021. n.d. https://chi2021.acm.org/for-authors/presenting/papers Accessed May 3, 2020.
- 90. Association for Computing Machinery. ACM CSCW Community Survey. 2014. https://cscw.acm.org/2014/survey-questions.pdf Accessed April 12, 2020.
- 91. National Academies of Sciences, Engineering, and Medicine; Policy and Global Affairs; Committee on Science, Engineering, Medicine, and Public Policy. Reproducibility and Replicability in Science. Washington, DC: National Academies Press; 2019.