Journal of the American Medical Informatics Association (JAMIA). 1998 Nov-Dec;5(6):493–502. doi: 10.1136/jamia.1998.0050493

Medical Informatics and the Science of Cognition

Vimla L Patel 1, David R Kaufman 1
PMCID: PMC61330  PMID: 9824797

Abstract

Recent developments in medical informatics research have afforded possibilities for great advances in health care delivery. These exciting opportunities also present formidable challenges to the implementation and integration of technologies in the workplace. As in most domains, there is a gulf between technologic artifacts and end users. Since medical practice is a human endeavor, there is a need for bridging disciplines to enable clinicians to benefit from rapid technologic advances. This in turn necessitates a broadening of disciplinary boundaries to consider cognitive and social factors pertaining to the design and use of technology. The authors argue for a place of prominence for cognitive science. Cognitive science provides a framework for the analysis and modeling of complex human performance and has considerable applicability to a range of issues in informatics. Its methods have been employed to illuminate different facets of design and implementation. This approach has also yielded insights into the mechanisms and processes involved in collaborative design. Cognitive scientific methods and theories are illustrated in the context of two examples that examine human-computer interaction in medical contexts and computer-mediated collaborative processes. The framework outlined in this paper can be used to refine the process of iterative design, end-user training, and productive practice.


Recently there has been a proliferation of articles, editorials, and edited volumes reflecting on the state of the art and future challenges of medical informatics.1,2,3,4 Clearly, there are numerous technologic, sociologic, and organizational issues that are important concerns of the discipline. As a discipline matures, there is a need to broaden horizons and critically examine directions, especially in regard to educating future practitioners and researchers. We approach the field of medical informatics from a somewhat different vantage point—namely, as researchers in the area of medical cognition, education, and human-computer interaction. We are particularly concerned with how theory from these disciplines can inform the practice and training of medical informaticians.

In this paper, we examine the role of cognitive science in addressing current and future needs in medical informatics. We first consider a range of philosophic and conceptual issues in defining disciplinary boundaries. This transitional period in the history of informatics affords us a great opportunity to capitalize on technologic and sociologic trends to shape clinical practice and education. Medical information systems embody ideals in design that often do not readily yield practical solutions in implementation. Theories and methods from cognitive science can greatly inform medical informatics by addressing issues such as the usability of systems, the process of medical decision making, and the training of physicians and end users. Specifically, we present our own research and methodologies pertaining to 1) usability assessment of medical information technologies and 2) the analysis of scientific collaboration.

The Nature of the Discipline

What is the proper subject matter of the discipline of medical informatics, and what falls outside its purview? Shortliffe5 considers medical informatics “a basic science discipline in medicine,” although one that has evolved distinct characteristics that have tended to separate it from other traditional academic and research medical specialties. He suggests that medical informatics holds both realized and potential importance for the science and practice of medicine. He argues that the failure to define a scientific basis for the emerging discipline of informatics has limited its impact on scientists in other medical fields and has hindered its timely introduction into medical education.

Warner suggests that medical informatics is an emerging discipline. He defines medical informatics as “The study, invention, implementation of structures and algorithms to improve communication, understanding, and management of medical information.”6(p207)

In our view, this definition is somewhat narrow, focusing on an older model of computing. Recognizing the central roles of designers and users of these technologies in the field of medical informatics, Greenes and Shortliffe7 define medical informatics as “the field that is concerned with the cognitive, information processing, and communication tasks of medical practice, education, and research including the information science and the technology that support these tasks.” This definition stretches the conventional boundaries of informatics and places computing and technology in a more supporting role. There appears to be a growing belief that medical informatics is more than the thin intersection of computing and medical practice. In a recent editorial, Hasman and Safran8 suggest that we need to emphasize communication and collaboration rather than computing. In our view, this captures a critical shift in the nature of the informatics paradigm. It suggests a place of prominence for a human dimension as reflected in the social and cognitive sciences. In fact, cognitive science has long since had a place at the periphery of medical informatics, especially in relation to medical artificial intelligence and medical decision making.9 However, we believe that methods and theories in cognitive science can make a more profound contribution to this discipline. Maintaining rigid disciplinary structures may not allow sufficient adaptability to capitalize on important trends. For example, communication technologies have made possible long-distance collaboration, which is beginning to transform the way people work.

The integration of cognitive science in the field of medical informatics presents some interesting and worthwhile challenges. Different disciplines can inform and shape the field of medical informatics to generate a more comprehensive and multifaceted perspective, which requires the development of shared objectives, methods, and vocabulary. These issues are considered in the following section.

On Dialects and Languages: Toward a Convergence of Meaning

The evolution of any discipline mirrors natural and cultural evolution in important respects. Any community of learning and practice made up of people from diverse areas—such as computer science, epidemiology, information science, and biomedicine, as well as educational research, sociology, anthropology, and psychology—needs to find a means of effective communication among the participants. Each of these disciplines has evolved its own language of communication, and the ongoing struggle is to find common ground. In the development of a hybrid language or dialect, speakers create a form of discourse known as pidgin.* For example, in Fiji, the language spoken in the market place, pidgin Hindi, is a combination of Fijian, Hindi, and English. This emerging language is a melange of terms and structures from its various languages of origin. It serves as the means of minimal or functional communication. An example of the emergence of this sort of dialect can be found in attempts by medical informaticians to identify conceptual units and their organizational structures. The foundational disciplines use a range of constructs, including “terminology” (from information science), “vocabulary” (from medicine), and “ontology” (from computer science and philosophy), all of which have similar referents.

As a language develops, it may evolve into a form known as creole. Creoles are real languages, but their roots in other languages are clearly apparent. One area of medical informatics research involves defining standards for exchanging information among systems. The diverse informatics community has settled on the term “message” for the unit for exchange. This term is now widely understood, and it has a distinct meaning, different from its meaning in medicine (which is more concerned with content than structure) or in computer science (where it has a predominant meaning related to object-oriented programming). Similarly, in cognitive science, (human) information processing has taken on a different meaning from its more conventional usage in computer science.

We in informatics are trying to communicate with each other through a common dialect. Many of us are not native speakers of medical informatics; rather, we speak a pidgin variation that liberally draws on our native disciplinary tongues. Disciplines often have a long evolutionary history. A mature discipline in which everyone speaks the same dialect affords a range of grounding strategies in communication to coordinate shared meanings.10 While medical informatics is developing a substantial shared terminology, there are still considerable idiosyncrasies that hamper communication. Controlled medical vocabularies are an interesting case in point, because there are several emerging standards, each with different assumptions and subtle differences in meaning that can create boundaries. The roots of medical informatics can be traced back to at least the late 1950s.11 However, 40 years is a relatively short time in the history of an emerging discipline. In the process of creating a creole, different objectives and vocabularies are negotiated across disciplinary boundaries. As the clarity, validity, and utility of these theories and methods become apparent, they are incorporated into the emerging discipline.

In the current state of informatics, methods, theories, and concepts are drawn mostly from computer science, information science, and other domains. As a real language and culture develop, people are born into the new community; similarly, when new members of informatics come of age, they will increasingly graduate not from programs in computer science or experimental medicine, but from programs devoted to medical informatics. Cognitive science and neuroscience are multidisciplinary fields that have witnessed similar developments in recent years. We are, perhaps, currently somewhere between pidgin and creole, where the disciplinary and institutional boundaries are still very apparent.

At the moment, informatics is in a transitional state. As the disciplines mature, affiliated research communities may endeavor to concentrate their efforts more narrowly by targeting a specific set of objectives, or they may broaden their vision to consider a wider range of issues. Both kinds of transitions are not uncommon in the world of science and, in fact, are not mutually exclusive. In the next section, we argue for a place of prominence for cognitive science in the evolving disciplinary matrix of medical informatics. In our view, cognitive science can contribute to the basic science dimension of medical informatics as well as inform the practical aspects having to do with design and implementation.

Cognitive Science and Medical Informatics

One source of concern is the pressing need for effective information management in medicine. The urgency of this matter has been spurred, in part, by the rapid proliferation of new biomedical knowledge, diagnostic procedures, and therapeutic interventions. These developments have been paralleled by advances in computer-based technologies for recording, storing, managing, accessing, and communicating information. As in most domains, there is a gulf between technologic artifacts and end users.12 Medical practice is a human endeavor, and there is a need for bridging disciplines to enable clinicians to benefit from rapid technologic advances. Theories and methods from cognitive science can provide an effective complement to other medical informatics approaches in addressing issues of the usability of systems, the processing of information, and the training of physicians.

Cognitive science is a multidisciplinary field incorporating theories and methods from psychology, linguistics, philosophy, anthropology, and computer science in the study of cognition. In recent years, theories and methods from cognitive science have been applied to a wide range of practical domains, including medical education and informatics.13 Medical cognition includes studies of cognitive processes, such as perception, comprehension, reasoning, decision making, and problem solving in medical practice itself and in representative experimental tasks (e.g., simulated patient problems). Cognitive science in this respect acts as a basic science and provides a framework for the analysis and modeling of complex human performance. For example, theories of human memory and knowledge organization lend themselves to characterizations of expert clinical knowledge that can then be contrasted with the representation of such knowledge in medical systems. With the advent of emerging medical information technologies, we need to be concerned with the ways in which people can use these systems accurately, efficiently, and safely. Table 1 presents parallel issues in cognitive science, medical cognition, and medical informatics.

Table 1.

Parallel Issues in Cognitive Science, Medical Cognition, and Medical Informatics

Cognitive Science | Medical Cognition | Medical Informatics
Knowledge organization and human memory | Organization of clinical and basic science knowledge | Development and use of medical knowledge bases
Problem solving, heuristics/reasoning strategies | Medical problem solving and decision making | Medical artificial intelligence/cognitive models of decision support
Perception/attention | Radiologic and dermatologic diagnosis | Medical imaging systems
Text comprehension | Learning from medical texts | Electronic text processing
Conversational analysis | Medical discourse | Medical language processing
Distributed cognition | Collaborative practice and research in health care | Computer-supported collaborating agents
Coordination of theory and evidence | Diagnostic and therapeutic reasoning | Evidence-based clinical guidelines
Mental models | Skills training | Cognitive usability assessment

Cognitive science and studies of medical cognition can meaningfully inform and shape design, development, and assessment.14,15 Similarly, basic cognitive research has begun to have an effect on the development of decision support technology.16 Also pertinent is the emerging area of distributed and collaborative cognition.17 This notion suggests that cognitive processes such as planning, learning, and decision making can be construed as a joint effort among various agents, including human beings and machines. With the rapid ascendance of the Internet as a vital communication medium, collaborative endeavors in medical informatics research have become increasingly prominent.18 In the next section, we illustrate how cognitive science methods and theories can be used to further our understanding of human-computer interaction in medical contexts and computer-mediated collaborative processes.

Human-Computer Interaction and Medical Technologies

Despite the great potential of medical information technologies, their implementation and integration into medical practice have often proved to be more difficult than was anticipated.14,19,20 Much research has focused on a range of technical issues in the implementation of these systems, including computer communication and networking, physical input devices (e.g., mouse and keyboard), and the development of software standards for controlled medical vocabularies.21,22 However, research has only recently begun to investigate the cognitive and social dimensions of physicians' encounters with computer-based technologies in medical practice.23

Human—computer interaction is a science of design that seeks to understand and support human beings interacting with technology.24 In our research, we are principally interested in characterizing the usability and learnability of medical technologies.25 “Usability” refers to the capacity of a technology to be used easily and effectively by a range of users (e.g., health care workers), given specified training and user support, to perform a range of tasks (e.g., diagnosis and patient management) within a specified range of settings (e.g., clinics, offices, and hospital wards).26 “Learnability” refers to the ease with which a user can attain certain levels of competency. Training is an essential (and often overlooked) ingredient in promoting the effective use of technologies. We have observed that training is too often narrowly focused on attaining basic competency in the use of a system (e.g., basic commands, navigation, etc.). Although this is an essential aspect of the learning process, there is a need to tailor training toward developing specific cognitive skills that will lead to more productive use. For example, a medical student can learn to do a Medline search relatively easily. However, it is more difficult to develop effective search strategies that can maximize the yield of relevant literature and minimize extraneous costs. The development of expertise is clearly predicated on substantial experience. However, advanced training can significantly accelerate the learning curve.27

Our laboratory is actively engaged in research evaluating medical record systems and computer-based learning systems.25,28 The objectives are twofold: to contribute to a process of iterative design in the development of more effective systems, and to continue to refine a theoretic and methodologic framework for the cognitive use of medical technologies. We employ two classes of usability techniques: usability inspection methods and usability testing. Usability inspection methods are a set of analytic techniques for characterizing the usability-related aspects of the interface. These methods are typically employed by an analyst or experimenter working with the system being tested. Usability testing involves observing end users employing a system to perform representative tasks (for example, recording a patient history in a computer-based patient record system). We have developed a set of cognitively based video analytic techniques for characterizing subjects' behavior. These methodologies have been employed in a range of tasks and settings. In this section, we illustrate the use of the usability inspection method known as the cognitive walk-through (CW).

We have adapted this design evaluation methodology to study the usability and learnability of medical information systems.25,29 The purpose of a walk-through is to evaluate the process by which users perform a task and the ease with which they can do this. The CW methodology involves identifying sequences of actions and goals needed to accomplish a specific task. More specifically, the primary aims of the CW procedure are to determine whether the user's background knowledge and the cues provided by the interface are sufficient to construct the goal structure necessary to generate the action sequence required to perform a task and to identify potential usability problems. To do this, an experimenter/analyst performs a task simulation, “stepping through” the sequence of actions necessary to achieve a goal. The principal assumption underlying this method is that a given task has a particular generic goal-action structure (basically, ways in which a user's objectives can be translated into the particular actions). This analysis also provides us with substantial insight into the cognitive demands of a task. For example, tasks that require the user to execute lengthy sequences of actions or require movement between different screens make heavier demands on a user's working memory.

The walk-through is an example of a strong theory-based approach with clear practical implications. This approach draws on theories of problem solving, skill acquisition, and human—computer interaction. The end product of an analysis is a set of cognitive models that can both inform theories of human performance and have obvious consequences for design and implementation. We illustrate below the CW procedure in the context of a Medline search. This is an activity with which most readers have some familiarity and which is sufficiently complex to illustrate the CW methodology. In the following scenario, the top-level goal is to perform a database search to locate pertinent review articles about the relationship between diabetes and pregnancy. The following outline illustrates a goal-action sequence for accessing the MEDLINE database from the Ovid bibliographic information retrieval system:

Goal: Find recent review articles related to pregnancy and diabetes.

Subgoal: Do MEDLINE Search.

Subgoal: Access Ovid Browser and Query System.

Action: Open Browser.

System Response: Lists available databases (e.g., PsychINFO, HEALTHSTAR, MEDLINE).

Subgoal: Open MEDLINE database (1993-1997).

Actions: Scroll down and select MEDLINE.

Action: Press <Enter>

In the preceding sequence, there are three subgoals and three actions needed to access MEDLINE. The CW characterizes the (hypothetic) goals and subgoals of the user, related actions, system responses, and potential problems. Subgoals reflect inferences needed to connect a higher-level goal to specific actions. The actions arise from the user's intentions but are critically shaped by system responses. In the following characterization of a goal-action sequence, the next goal is to do a keyword search for articles related to diabetes.

Goal: Find articles related to diabetes.

Subgoal: Do a keyword search on diabetes.

Action: Enter key combination <Control> <R>.

System Response: “Enter a word or phrase to be searched in titles and abstracts.”

Actions: Type in diabetes and press <Enter>.

System Response: Returns 22,998 entries.

The same sequence is then repeated for the term “pregnancy.” Once these goals have been accomplished, the two sets of results must be integrated to find those entries that correspond to both diabetes and pregnancy. This part of the walk-through is illustrated by the following segment:

Goal: Merge list of entries.

Potential Problem: Subject must map term “combine” to action “merge.”

Subgoal: Combine data sets (diabetes and pregnancy).

Action: Enter key combination <Control> <N>.

System Response: Screen with two sets and instructions. “Use the spacebar to select at least two sets to combine and then press <Enter>.”

Actions: Scroll to diabetes and press spacebar.

Actions: Scroll to pregnancy and press spacebar.

Action: Press <Enter>.

System Response: Combine sets screen: Choose Boolean connective “and” or “or.”

Potential Problem: Choice of connectives.

Actions: Select “and” and press <Enter>.

System Response: Returns 870 entries.

Potential Problem: List is too extensive.

The top-level goal of the entire search necessitates several actions, and there are a number of potential problems of which only a few are indicated here. The problems may pertain to the surface characteristics of the interface (e.g., clarity of dialogue elements) or may be of a more conceptual nature (e.g., mapping terms to actions). The system returns 870 entries, which would make the task of finding relevant articles too cumbersome. The user must then find a way to narrow the search space. The final goal relates to limiting a search to articles in English, studies of human subjects, and review articles.

A MEDLINE search is a task of minimal-to-moderate complexity. (Current e-mail programs are systems of minimal complexity, whereas most computer-based patient record systems are of substantial complexity.) Our analysis indicated that this complete task requires 22 actions and involves 12 goals and subgoals and nine transitions between screens. A first-time MEDLINE user may be frustrated by the complexity of the interface and the sequence of actions necessary to accomplish a goal. In addition, the transitions between screens will invariably cause navigational problems for some users. After using the system a few times, however, the user is likely to develop sufficient facility to achieve a range of basic goals without too much difficulty.
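To make the bookkeeping behind such an analysis concrete, the following sketch (in Python; it is ours, not part of the original study, and the class names, fields, and step kinds are illustrative assumptions) encodes a walk-through as a list of typed steps and tallies the kinds of complexity measures cited above: goals and subgoals, actions, screen transitions, and flagged potential problems.

```python
from dataclasses import dataclass, field
from typing import List

# Step kinds used in a cognitive walk-through transcript (our own labels).
GOAL, SUBGOAL, ACTION, RESPONSE, PROBLEM = "goal", "subgoal", "action", "response", "problem"

@dataclass
class Step:
    kind: str                  # one of the kinds above
    text: str                  # e.g., "Press <Enter>"
    new_screen: bool = False   # True if a system response moves to a new screen

@dataclass
class WalkThrough:
    task: str
    steps: List[Step] = field(default_factory=list)

    def complexity(self) -> dict:
        """Summarize cognitive demands: counts of goals/subgoals, actions,
        screen transitions, and flagged usability problems."""
        return {
            "goals_and_subgoals": sum(s.kind in (GOAL, SUBGOAL) for s in self.steps),
            "actions": sum(s.kind == ACTION for s in self.steps),
            "screen_transitions": sum(s.kind == RESPONSE and s.new_screen for s in self.steps),
            "potential_problems": sum(s.kind == PROBLEM for s in self.steps),
        }

# A fragment of the MEDLINE walk-through described in the text.
medline = WalkThrough(task="Find recent review articles on pregnancy and diabetes")
medline.steps += [
    Step(GOAL, "Find articles related to diabetes"),
    Step(SUBGOAL, "Do a keyword search on diabetes"),
    Step(ACTION, "Enter key combination <Control> <R>"),
    Step(RESPONSE, "Enter a word or phrase to be searched", new_screen=True),
    Step(ACTION, "Type in 'diabetes' and press <Enter>"),
    Step(RESPONSE, "Returns 22,998 entries"),
]
print(medline.complexity())
```

Applied to the complete MEDLINE walk-through, a tally of this sort would reproduce the counts reported above (22 actions, 12 goals and subgoals, and nine screen transitions).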

The walk-through can serve a number of purposes, including contributing to the iterative software design process and aiding in the development of instructional materials. We have also used this method to develop a coding scheme to analyze end users' performance of a task.25 The walk-through reveals a subset of potential user problems but is most effective when used in conjunction with video-based usability testing involving representative end users. This approach can also yield valuable information about the efficiency of various procedures (e.g., the number of actions needed to search a database); the prior knowledge needed to draw various inferences from the system's behavior; the consistency of tasks supported by a system (most tasks should have similar goal-action hierarchies); and the transparency of system feedback (responses to users' actions). Video-based usability testing can also contribute to effective training by characterizing productive strategies and by making transparent the various affordances of the system (e.g., undocumented shortcuts) as well as the constraints.

We have used the CW technique to characterize the learnability of multimedia instructional software and, more recently, have applied this technique in the study of various kinds of computer-based patient record systems. These systems represent immensely complex interactive environments that make numerous conceptual as well as perceptual and motor demands on the user. There is a critical need to study cognitive aspects of the interface and its effects on both advanced and novice users. In addition, these systems greatly affect information gathering strategies and problem representation. As these systems proliferate, they are likely to have a substantial impact on the way medical students as well as novice physicians learn clinical medicine. We are currently engaged in an effort to understand and delineate the cognitive dimensions of physicians' interactions with computer-based patient record systems.

Computer-based systems do not merely facilitate or enhance the performance of a given task; they also have an enduring impact on the mastery of related tasks. Salomon et al.,30 in considering the effects of technology on intellectual performance, introduce a useful distinction between the effects with technology and the effects of technology. The former are concerned with the changes in performance displayed by users while equipped with the technology. For example, when using an effective medical information system, physicians should be able to gather information more systematically and represent this information in a more structured manner. In this capacity, the medical information technologies may alleviate some of the burden on the physicians' working memory and permit them to focus on higher-order thinking skills, such as hypothesis generation and evaluation. The phrase effects of technology refers to lasting changes in general cognitive capacities (knowledge and skills) as a consequence of interaction with a technology. For example, extensive use of information technologies may result in enduring changes in diagnostic and therapeutic reasoning even in the absence of the system. This suggests that medical information systems and decision-support technologies may have ancillary positive consequences but may also induce complacency and certain dependencies on systems. In our view, effective training can serve to maximize the positive consequences and minimize the counterproductive ones.

The focus of much research in human—computer interaction has been on understanding the solitary individual who uses a computer or workstation and deriving guidelines for design based on this understanding. Although we view this work as important, there is clearly a need to understand the social, contextualized, and distributed nature of work in health care settings. This issue is discussed in the next section.

Collaboration in Medicine

Recent communication technologies are beginning to transform the ways in which individuals work.31 The community of researchers in medical informatics is uniquely positioned to exploit as well as advance these new capabilities to improve the quality of health care delivery. Collaboration has always been central to health practices, with nurses, physicians, and support staff jointly contributing to patient care. Similarly, researchers in the medical informatics community have been engaging in joint efforts for many years. However, these efforts have been somewhat impeded by geographic distances, which seem to create a kind of cultural and institutional isolation. This accounts in part for the many idiosyncratic and incompatible medical information systems that have been implemented in various clinical settings. The Internet and other communication technologies have paved the way for unparalleled collaborative activity and shared expertise.

The InterMed Collaboratory is an Internet-based medical informatics project involving five participating institutions.32,33,34 There are two broad mandates of this project. The first is to further the “development, sharing, and demonstration of software and system components, data sets, procedures, and tools that will facilitate the collaborations and support the application goals of these projects.”18 The second is to provide a distributed suite of clinical applications, guidelines, and knowledge bases for clinical, educational, and administrative purposes. The development of a shared integrated library distributed across institutions with different kinds of expertise is increasingly recognized as a necessity for providing access to a broad range of informatics resources to meet the needs of health care professionals.

We have conducted an evaluation of the InterMed collaboratory with two complementary objectives in mind—to assess whether the project has reached or is on course to reach its stated objectives, and to assess the collaboration as an ongoing experiment in computer-mediated collaborative design. One of the central goals of the collaboratory is the design and implementation of a methodology for sharing computer-based generic guidelines based on a standard representation model.35 Guideline-related activities have accordingly been the focus of our research.

We present a brief synopsis of our investigations. The InterMed enterprise aims to provide a broadly applicable model for shared component-based, collaborative development in medicine. We similarly view the ultimate objective of our research as the development of a broadly based analytic and methodologic framework for evaluating and investigating computer-mediated collaborative design. We anticipate that such a framework will have the potential to contribute substantially, by informing and shaping models of collaborative technology development.

The InterMed Collaboratory has been working to develop a representation called the Guideline Interchange Format (GLIF) to facilitate the sharing of guideline information among different systems and institutions. This is a challenging and important undertaking, given the large effort required to produce guidelines and the benefits that could be derived from minimizing duplication of effort, facilitating modifiability and, through iterative design, reducing ambiguity. One of our first objectives was to understand and characterize the different InterMed teams and investigate the processes they had defined in efforts to achieve their goals. We have also endeavored to study InterMed's evolving communication patterns, ongoing decisions, and distributed and collaborative activities. InterMed employs a range of asynchronous and synchronous communication media including e-mail, telephone conferencing, video conferencing and face-to-face meetings. The World Wide Web has also provided a medium for demonstrating and sharing tools, models and graphic representations. The study of this collaboratory has employed a wide range of analyses including electronic interviews, sociometric analyses of e-mail communication (patterns of dyadic and group communication over time), discourse analysis of dialogue, and meeting activity analysis designed to characterize planning and design. We have also examined the translation of text-based clinical guidelines into a GLIF representation.36
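The GLIF model itself is specified elsewhere.35 Purely as an illustration of what a shareable, flowchart-style guideline representation involves, the sketch below (our own simplification in Python; the step types, field names, and clinical content are invented for the example and are not the actual GLIF schema) encodes a small guideline as named steps, actions and conditional branches, linked by next-step references.

```python
from dataclasses import dataclass, field
from typing import Dict, Optional

# A deliberately simplified, GLIF-inspired guideline representation:
# a flowchart of named steps, each either an action or a conditional
# branch on patient data. All names and content here are illustrative only.

@dataclass
class ActionStep:
    description: str
    next_step: Optional[str] = None      # name of the following step, if any

@dataclass
class ConditionalStep:
    criterion: str                        # e.g., "LDL > 160 mg/dL"
    if_true: Optional[str] = None
    if_false: Optional[str] = None

@dataclass
class Guideline:
    name: str
    first_step: str
    steps: Dict[str, object] = field(default_factory=dict)

# A toy cholesterol-management fragment, invented for illustration.
guideline = Guideline(
    name="Hypercholesterolemia management (toy example)",
    first_step="assess_ldl",
    steps={
        "assess_ldl": ConditionalStep(criterion="LDL > 160 mg/dL",
                                      if_true="start_therapy",
                                      if_false="lifestyle_advice"),
        "start_therapy": ActionStep(description="Consider drug therapy; recheck in 6 weeks"),
        "lifestyle_advice": ActionStep(description="Dietary counseling; recheck in 3 months"),
    },
)
```

Because the steps are named and declaratively linked, a guideline expressed this way can in principle be exchanged between systems and re-rendered or executed locally, which is the kind of sharing GLIF is intended to support.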

Sociometric analysis is a method used to characterize relationships among individuals. We have employed it to study communication patterns among members of the InterMed Collaboratory. This method involves representing the number of exchanges in which each individual takes part as well as the channels of interaction, such as who communicates with whom and how frequently. In these sociometric representations, each node corresponds to an individual participant in the interchange. The links between nodes correspond to the channels and the frequency of communication between the participants. Representations are developed for the overall time period in question and, to examine how communication patterns evolve, at various points during the collaboration. The analysis can be used as a basis for characterizing clusters of project-related activity and the division of labor.
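To indicate the kind of tallying such an analysis entails, the sketch below (in Python; the participants, message records, and site assignments are invented for illustration and are not the InterMed data) accumulates dyadic e-mail exchanges into a channel-frequency table from which clusters and cross-site versus within-site communication can be read off.

```python
from collections import Counter

# Each record is (sender, recipients) for one e-mail message.
# Participants, messages, and site assignments are invented for illustration.
messages = [
    ("B", ["G", "L"]),
    ("G", ["B"]),
    ("C", ["InterMed-Central"]),   # posting to the project listserver
    ("J", ["C", "G"]),
]
site_of = {"B": "Site1", "C": "Site2", "G": "Site3", "J": "Site4", "L": "Site1"}

# Sociometric representation: nodes are participants; a link (sender, recipient)
# carries the frequency of communication on that channel.
channel_freq = Counter()
for sender, recipients in messages:
    for recipient in recipients:
        channel_freq[(sender, recipient)] += 1

# Separate cross-site from within-site exchanges (listserver traffic excluded,
# since it reaches every participant).
cross_site = sum(n for (s, r), n in channel_freq.items()
                 if s in site_of and r in site_of and site_of[s] != site_of[r])
within_site = sum(n for (s, r), n in channel_freq.items()
                  if s in site_of and r in site_of and site_of[s] == site_of[r])

print(channel_freq)
print("cross-site:", cross_site, "within-site:", within_site)
```

Aggregating such tables over successive intervals is one way to chart how communication patterns change over the course of a collaboration.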

Figure 1 presents a sociometric representation of the communication patterns of InterMed members during a month of e-mail exchanges. It both characterizes the patterns of interaction among individuals within and beyond their institutional boundaries and charts the evolution of these patterns over time. Members from the four institutions involved in InterMed are grouped together, and InterMed Central refers to the listserver that distributes e-mail to every participant.

Figure 1. Sociometric analysis of e-mail communication among InterMed collaborators during April 1996.

The sociometric representation of the four weeks of e-mail communication indicates identifiable clusters of communication. There was a considerable degree of cross-site communication, with a select number of individuals (participants B, C, G, J, and L) being the main participants in exchanges with individuals at other institutions, and considerably less intrasite communication. Furthermore, the listserver (InterMed Central) was the predominant method used to disseminate information. In total, our analyses have spanned the first 96 weeks of the InterMed Collaboratory's guidelines research.33 We have looked at changes in communication patterns across various time intervals (e.g., one week), in relation to specific decision points (e.g., selection of clinical guidelines), and as guided by various events, such as the preparation of publications or grant proposals. This method provides certain insights into the division of labor in the planning, design, and implementation processes.

Sociometric analysis is a valuable but coarse method that focuses on structure (e.g., frequency of exchanges) rather than content (e.g., the substance of a message or topic) for studying communication patterns. We have employed this method in conjunction with other methodologies, such as discourse analytic techniques, to characterize productive as well as counterproductive episodes of collaboration. In brief, the results have characterized patterns of convergence and divergence in InterMed's goals and objectives, adaptive as well as suboptimal activity patterns, and the distinct role each of the communication modes has played in shaping the collaboration. This has been used to chart progress over the period of data collection, showing patterns of collaboration and cooperation using different media (conference calls, e-mail, progress reports, and face-to-face interaction).

Although there has been exponential growth in interest and research, collaboration science is still in its infancy. Collaboration is increasingly central to the field of informatics and, more broadly, to the workforce at large. How can collaboration be made a more productive enterprise? What patterns of communication facilitate decision making in design and development? Alternatively, what patterns of communication and activity lead to suboptimal outcomes? Which technologies can be most effectively used to achieve various goals (e.g., joint writing of papers)? The objective is to develop an explanatory framework that addresses these concerns. This framework has been used to characterize both research collaborations and collaborative practices in professional health care settings.34

Conclusions

Medical informatics is an emerging discipline characterized by rapid development and exciting new initiatives that promise to have a significant impact on the practice of medicine. In this paper, we have argued that the cognitive and social sciences can illuminate different facets of design and implementation. Dramatic technologic changes, such as those occurring today, invariably go hand in hand with profound social and cultural changes. Tension and strife result when the latter do not receive adequate attention. It has been well documented that technologic development often outstrips its productive use in a community of practitioners. The enhanced functionality and efficiency afforded by new machinery need to be balanced with concerns for usability, learnability, and adaptability to the needs of the setting. With perpetual change being one of the few certainties, the challenge is to adapt to the constantly shifting balance between recognizing and promoting these technologic changes and understanding their social consequences. Stable paradigms for clinical computing remain somewhat elusive for the moment. As long as there is a complementarity between the social and cognitive on the one hand and the technologic on the other, a satisfactory equilibrium can be more readily achieved.

This article identifies several ways in which cognitive science can contribute to objectives that concern researchers and practitioners in medical informatics. These were illustrated in the context of two research scenarios, one involving usability inspection and the other focusing on patterns of communication in collaborative processes. The methodologies and theories illustrated in this paper are oriented toward understanding and characterizing the cognitive, and to some extent the social, impact of technology. We have expanded on a framework drawn from cognitive science, human—computer interaction, and research in computer-mediated communication. The results of this research can contribute to tangible changes in systems, training, and practice. Similarly, the study of practice can help shape theories of human performance, technology-based learning, and scientific and professional collaboration that extend beyond the domain of medicine.

This work was supported in part by grant MA-134-49 from the Medical Research Council of Canada and by High Performance Computing and Communications contracts LM-43514 and LM-05857 from the National Library of Medicine.

Footnotes

* We owe a great debt to Paul Smolensky for this analogy. We would also like to thank David Evans for contributing to the development of these ideas.

Individuals who use a system vary considerably in their background knowledge. To simplify matters, we can identify three broad classes of users: beginners who have minimal or no experience; novices who have a limited command of the basic functionality of the system; and experts who have attained considerable mastery in their use of the system. A walk-through is generally concerned with modeling beginners and novices.

It is a method commonly used to study relationships among peers in a classroom, with a view to identifying children at risk.

References

1. Haux R. Aims and tasks of medical informatics. Int J Med Inform. 1997;44(1):9-20.
2. Hasman A. Challenges for medical informatics in the 21st century. Int J Med Inform. 1997;44(1):1-7.
3. Tuttle MS. Medical informatics challenges of the 1990s: acknowledging secular change. J Am Med Inform Assoc. 1997;4(4):322-4.
4. Van Bemmel JH, Musen MA (eds). Handbook of Medical Informatics. New York: Springer, 1997.
5. Shortliffe EH. The science of biomedical computing. In: Pages JC, Levy AH, Gremy F, Anderson J (eds). Meeting the Challenge: Informatics and Medical Education. Amsterdam, The Netherlands: North-Holland, 1984:1-10.
6. Warner H. Medical informatics: a real discipline? J Am Med Inform Assoc. 1995;2:207-14.
7. Greenes RA, Shortliffe EH. Medical informatics: an emerging academic discipline and institutional priority. JAMA. 1990;263:1114-20.
8. Hasman A, Safran C. A new name, a new scope. Int J Med Inform. 1997;44(1):v.
9. Ledley RS, Lusted LB. Reasoning foundations of medical diagnosis. Science. 1959;130:9-21.
10. Clark HH, Brennan SE. Grounding in communication. In: Clark HH (ed). Using Language. Cambridge, England: Cambridge University Press, 1996:127-49.
11. Collen MF. A History of Medical Informatics in the United States, 1950 to 1990. Indianapolis, Ind: American Medical Informatics Association, 1995.
12. Norman DA. Cognitive engineering. In: Norman DA, Draper SW (eds). User-centered System Design. Hillsdale, N.J.: Lawrence Erlbaum Assocs, 1986:31-62.
13. Evans DA, Patel VL (eds). Advanced Models of Cognition for Medical Training and Practice. NATO ASI Series F: Computer and Systems Sciences. Heidelberg, Germany: Springer-Verlag, 1992;97:193-212.
14. Patel VL, Kushniruk AW. Understanding, navigating and communicating knowledge: issues and challenges. Methods Inf Med. Nov 1998.
15. Patel VL, Groen GJ, Ramoni MF, Kaufman DR. Machine depth versus psychological depth: a lack of equivalence. In: Keravnou E (ed). Deep Models for Medical Knowledge Engineering. Amsterdam, The Netherlands: Elsevier North-Holland, 1992.
16. Kushniruk AW, Patel VL. Knowledge-based HDSS: cognitive approaches to the extraction of knowledge and the understanding of decision support needs. In: Tan J (ed). Health Decision Support Systems. Gaithersburg, Md: Aspen Publishers, 1988:127-52.
17. Salomon G (ed). Distributed Cognition: Psychological and Educational Considerations. Cambridge, England: Cambridge University Press, 1993.
18. Shortliffe EH, Barnett GO, Cimino JJ, Greenes RA, Huff SH, Patel VL. Collaborative medical informatics research using the Internet and the World Wide Web. Proc AMIA Annu Fall Symp. 1996:125-9.
19. Williams LS. Microchip versus stethoscopes: Calgary Hospital MDs face off over controversial computer system. Can Med Assoc J. 1992;147(10):1534-47.
20. Tang PC, Patel VL. Major issues in user interface design for health professional workstations. Int J Biomed Comput. 1994;34:139-48.
21. Cimino JJ, Clayton PD, Hripcsak G, Johnson SB. Knowledge-based approaches to the maintenance of a large controlled medical terminology. J Am Med Inform Assoc. 1994;1(1):35-50.
22. Ball MJ, Collen MF. Aspects of the Computer-based Patient Record. New York: Springer-Verlag, 1992.
23. Patel VL, Cytryn KN, Jones PC, Safran C. The changing face of the clinical encounter with the collaborative health care team: a socio-cognitive evaluation. Manuscript submitted for publication. Cognitive Studies in Medicine. Montreal, Canada: McGill University, May 1988.
24. Carroll JM. Human-computer interaction: psychology as a science of design. Annu Rev Psychol. 1997;48:61-83.
25. Kushniruk AW, Kaufman DR, Patel VL, Lévesque Y, Lottin P. Assessment of a computerized patient record system: a cognitive approach to evaluation of an emerging medical technology. MD Comput. 1996;13:406-15.
26. Shackel B. Usability: context, framework, definition, design, and evaluation. In: Shackel B, Richardson S (eds). Human Factors for Informatics Usability. Cambridge, England: Cambridge University Press, 1991:21-37.
27. Lesgold A, Ivill-Friel J, Bonar J. Toward intelligent systems for testing. In: Resnick LB (ed). Knowing, Learning, and Instruction: Essays in Honor of Robert Glaser. Hillsdale, N.J.: Lawrence Erlbaum Assocs, 1989:337-60.
28. Kaufman DR, Kushniruk AW, Yale J-F, Patel VL. Physicians' knowledge and decision making for hypercholesterolemia and coronary heart disease. Submitted for publication, 1998.
29. Polson PG, Lewis C, Rieman J, Wharton C. Cognitive walkthroughs: a method for theory-based evaluation of user interfaces. Int J Man-Machine Stud. 1992;36:741-73.
30. Salomon G, Perkins DN, Globerson T. Partners in cognition: extending human intelligence with intelligent technologies. Educ Res. 1991;20(3):2-9.
31. Wulf WA. The collaboratory opportunity. Science. 1993;261:854-5.
32. Shortliffe EH, Patel VL, Cimino JJ, Barnett GO, Greenes RA. A study of collaboration among medical informatics laboratories. Artif Intell Med. 1998;12(2):97-123.
33. Oliver DE, Barnes MR, Barnett GO, et al. InterMed: an Internet-based medical collaboratory. Proc Annu Symp Comput Appl Med Care. 1995:1023.
34. Patel VL, Kaufman DR, Allen VG, Shortliffe EH, Cimino JJ, Greenes RA. Building collaboratories from scientific laboratories in medical informatics research. Cognitive Studies in Medicine. Montreal, PQ, Canada: McGill University, 1998. Submitted for publication.
35. Ohno-Machado L, Gennari JH, Murphy SN, et al. The GuideLine Interchange Format: a model for representing guidelines. J Am Med Inform Assoc. 1998;5:357-72.
36. Patel VL, Allen VG, Arocha JF, Shortliffe EH. Representing clinical guidelines in GLIF: individual and collaborative expertise. J Am Med Inform Assoc. 1998;5:467-83.
