International Journal of General Medicine. 2012 Nov 6;5:917–921. doi: 10.2147/IJGM.S38805

Pivot and cluster strategy: a preventive measure against diagnostic errors

Taro Shimizu,1 Yasuharu Tokuda2
PMCID: PMC3508570  PMID: 23204855

Abstract

Diagnostic errors constitute a substantial portion of preventable medical errors. Accumulating evidence shows that most such errors result from one or more cognitive biases, and a variety of debiasing strategies have been introduced. In this article, we introduce a new diagnostic strategy, the pivot and cluster strategy (PCS), which encompasses both mental processes of making a diagnosis, the intuitive process (System 1) and the analytical process (System 2), in one strategy. With PCS, physicians can recall a set of the most likely differential diagnoses (System 2) for an initial diagnosis made by the intuitive process (System 1), thereby double-checking their diagnosis with two consecutive diagnostic processes. PCS is expected to reduce cognitive errors and enhance diagnostic accuracy and validity, thereby realizing better patient outcomes and cost- and time-effective health care management.

Keywords: diagnosis, diagnostic errors, debiasing

Introduction

One of the major objectives of clinical education is to improve the clinical reasoning abilities of medical students and residents. This ability is a key factor in physicians’ professional performance and competency.1 Diagnostic errors constitute a substantial portion of preventable medical errors.2 They can be associated with higher morbidity in affected patients than other types of medical errors such as medication errors, surgical mistakes, or skill deficiencies.3–8 In particular, malignant or rapidly evolving conditions can cause great harm if undiagnosed, misdiagnosed, or diagnosed in an untimely manner.

Errors have been classified as “cognitive” (data gathering or synthesis, faulty knowledge), “system-related” (technical failures or organizational problems), and “no-fault” (unusual presentation or patient-related such as deception or poor cooperation).1,9,10 The majority of diagnostic errors are likely due to cognitive errors in physicians’ clinical reasoning, specifically bias and failed heuristics.1,11,12 The minimization of cognitive errors has been a major challenge in the diagnostic process.

Many types of biases have been identified that lead physicians to fail in diagnosis.12–23 They include availability bias (the tendency to weigh the likelihood of things by the ease with which they are recalled), representative bias (the tendency to be guided by prototypical features of disease and miss atypical variants), confirmation bias (the tendency to seek data to confirm, not refute, the hypothesis), base rate neglect (the tendency to ignore the true rate of disease and pursue rare, but more exotic, diagnoses), and premature closure (a processing bias, the tendency to stop considering other possibilities after reaching a diagnosis).1,13,15,24–28

Clinical reasoning can be classified into two classes of mental processes referred to as the intuitive process (System 1) and analytical process (System 2). This dual-processing model of thinking and reasoning has been explored extensively in psychology and has been applied to diagnostic reasoning in medicine.29–31

The intuitive process occurs through unconscious intuitive matching to a prior example accumulated in memory.32 This process is rapid and contextualized, whereas analytical reasoning is deliberate, systematic, logical, and conceptually applies the more traditional methods of medical decision-making.

For any specific clinical situation, we cannot prescribe which diagnostic process the clinician should rely on, because one or both processes might be appropriate depending on contextual and other factors. In some situations, a trade-off between speed and accuracy might be important, requiring discretionary use of each process.10 For example, experienced physicians tend to rely more on intuitive reasoning based on pattern recognition, which works rapidly and effectively to cope with common or routine problems, despite the fact that this approach might be more easily affected by biases.10,33–38 On the other hand, analytical reasoning can provide a more accurate diagnosis based on logical processes, but may require physicians to use more of their working memory, thereby limiting their speed.29

Physicians often become anchored in their initial hypothesis, whether it is right or not, search for confirming evidence to support their initial diagnosis, underestimate evidence against it, and therefore fail to adjust their initial impression in light of all available information.13,15 The effects of anchoring by an early incorrect diagnosis may lead to inaccurate judgment, inappropriate decisions, and ultimately unwanted and harmful impact on patients. Preliminary diagnostic impressions may be subject to bias and should always be checked against other possibilities.

A variety of debiasing strategies for making diagnoses have been introduced, such as metacognition, cognitive forcing strategies, reflection, enforcing analytical reasoning, feedback, electronic systems, and checklists.13,14,22,23,29,39–49 In this article, we introduce a new, practical, and quick-impact error-reducing strategy.

In general, physicians generate an initial diagnosis as the most likely hypothesis through intuitive or analytical processing, based on the collected history and physical examination in conjunction with the knowledge and experience stored in their memory. However, this process might lead to diagnostic error due to cognitive bias, with or without faulty knowledge. We propose here that it may be possible to reduce error and improve diagnostic accuracy by automatically and simultaneously recalling a cluster of diagnoses whose clinical pictures closely resemble the initial diagnostic impression. Thus, in their initial diagnostic attempt, physicians do not necessarily need to reach the correct diagnosis. In other words, they can treat the initial diagnosis as a so-called “pivot” disease and, at the same time, deploy the cluster of diseases whose clinical manifestations (symptoms or other findings) closely resemble those of the pivot.50

Using this “pivot and cluster strategy” (PCS), physicians might mitigate diagnostic errors. The PCS enables recall of the most likely differentials in an “en bloc” manner, automatically and swiftly, according to a cluster (a set of differential diagnoses) that is prepared in advance. A key characteristic of the PCS is that it provides a guide for diagnostic reasoning and acts as a cognitive aid that may reduce cognitive error (Figure 1). Although we introduce an arbitrary pivot disease in Table 1, every disease has the potential to be a pivot. In addition, any disease in the cluster in Table 1 can serve as another pivot; when another disease becomes the new pivot, a specific cluster can be organized around it. Some actual examples in which PCS could have served effectively are described in the following clinical scenarios.

Figure 1.

The pivot and cluster strategy is explained with a “disease map”.

Notes: Suppose all diseases exist within the square frame called a “disease map,” which includes all identified diseases. When the physician makes an initial diagnosis by intuition, the pivot (star) diagnosis is plotted on the map. A circle can be drawn at a certain distance from the pivot. The dots inside the circle form the cluster of this pivotal diagnosis. Diagnoses whose clinical pictures closely resemble that of the pivotal diagnosis are distributed close to the pivotal disease; thus, the distance between dots represents the similarity of the clinical pictures of two diagnoses. Thinking of the cluster as a whole, along with the pivotal diagnosis, could minimize cognitive deficits in building differential diagnoses, thereby preventing biases in making diagnoses. The radius of the circle may depend on the physician’s certainty of the diagnosis: the more concern the physician has about the differential diagnoses, the greater the radius.
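The geometry of the disease map can be sketched computationally. The toy model below is purely illustrative and not part of the original article: the disease coordinates are invented, each disease is a point in a two-dimensional feature space of clinical findings, similarity is Euclidean distance, and the cluster is every disease within a chosen radius of the pivot.

```python
import math

# Toy "disease map": each disease is a point in a two-dimensional
# feature space of clinical findings (coordinates are invented).
DISEASE_MAP = {
    "hepatic encephalopathy":  (0.0, 0.0),
    "hypoglycemia":            (0.4, 0.2),
    "hyponatremia":            (0.5, 0.5),
    "intracranial hemorrhage": (0.7, 0.1),
    "lactic acidosis":         (0.6, 0.8),
    "acute appendicitis":      (5.0, 5.0),  # clinically distant
}

def cluster(pivot, radius):
    """Return all diseases within `radius` of the pivot on the map."""
    px, py = DISEASE_MAP[pivot]
    return sorted(
        d for d, (x, y) in DISEASE_MAP.items()
        if d != pivot and math.dist((px, py), (x, y)) <= radius
    )

print(cluster("hepatic encephalopathy", radius=0.8))
# → ['hypoglycemia', 'hyponatremia', 'intracranial hemorrhage']
print(cluster("hepatic encephalopathy", radius=1.1))
# → ['hypoglycemia', 'hyponatremia', 'intracranial hemorrhage', 'lactic acidosis']
```

Widening the radius, as a less certain physician would, admits more differentials into the cluster.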

Table 1.

Examples of pivot and cluster

Pivot | Cluster
Hepatic encephalopathy | Hypoglycemia with/without vitamin B1 deficiency, hyponatremia, intracranial hemorrhage, lactic acidosis, etc
Acute appendicitis | Diverticulitis, inflammatory bowel disease, Behçet’s disease, pelvic inflammatory disease, urinary calculus, etc
Polymyalgia rheumatica | Paraneoplastic syndrome, vasculitis, infectious endocarditis, hypothyroidism, tuberculosis, etc
Rib fracture due to trauma | Multiple myeloma, bone metastasis, intercostal neuralgia, costochondritis, pleuritis, referred pain, etc
Cerebral infarction | Hypoglycemia, multiple sclerosis, intracranial hemorrhage, Todd’s palsy, migraine, conversion reaction, etc
Major depressive disorder | Substance abuse, hypothyroidism, adrenal insufficiency, diabetes, frontal lobe tumor, pancreatic cancer, etc
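In software terms, Table 1 is a lookup from pivot to pre-prepared cluster. The sketch below is a hypothetical illustration (disease lists abridged from Table 1; the names and structure are our own): recalling a cluster is a dictionary lookup, and because any disease in a cluster can itself serve as a pivot, merging overlapping clusters expands the differential.

```python
# Abridged pivot -> cluster lookup modeled on Table 1 (illustrative only).
CLUSTERS = {
    "hepatic encephalopathy": {
        "hypoglycemia", "hyponatremia",
        "intracranial hemorrhage", "lactic acidosis",
    },
    "cerebral infarction": {
        "hypoglycemia", "multiple sclerosis", "intracranial hemorrhage",
        "Todd's palsy", "migraine", "conversion reaction",
    },
}

def differentials(pivot):
    """Recall the pre-prepared cluster for an intuitive pivot diagnosis."""
    return CLUSTERS.get(pivot, set())

def merged(pivot_a, pivot_b):
    """Union of two pivots' clusters, expanding the differential list."""
    return differentials(pivot_a) | differentials(pivot_b)

# The two clusters overlap, so re-pivoting keeps shared differentials in view.
overlap = differentials("hepatic encephalopathy") & differentials("cerebral infarction")
print(sorted(overlap))  # → ['hypoglycemia', 'intracranial hemorrhage']
```

Because clusters are prepared in advance, recall is a constant-time lookup rather than an open-ended search, which mirrors the "en bloc" recall the strategy describes.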

Clinical scenario 1

An internal medicine resident at an urgent care clinic saw a 32-year-old woman with no significant past medical history, complaining of right lower abdominal pain that had come on earlier in the day. The resident had seen another woman with appendicitis and similar right lower quadrant abdominal pain just 2 days earlier. He again made the diagnosis of appendicitis, but the surgeon who examined the patient did not agree. Although this example might seem extreme, the diagnostic process of this internal medicine resident was confounded by commonly seen biases such as availability bias, representative bias, and confirmation bias, all contributing to misdiagnosis. If the resident had used PCS, he would have considered not only appendicitis but also its cluster, making it more likely that alternative possibilities such as diverticulitis, pelvic inflammatory disease, urinary tract infection, ovarian cyst, or ectopic pregnancy would have been considered. Thus, the PCS pulls in other possibilities on the differential that might not otherwise receive consideration.

Clinical scenario 2

A 68-year-old man was brought to an emergency department (ED) with loss of consciousness. The patient was known to have cirrhosis of the liver due to alcoholism and had multiple prior visits to the ED. On a previous visit, a similar presentation with altered level of consciousness had been attributed to hepatic encephalopathy, which was treated, and the patient was discharged. On this occasion, the emergency physician again attributed the patient’s condition to hepatic encephalopathy after a minimal examination and started treatment for hepatic encephalopathy, but the patient’s condition did not improve. The patient was admitted, and computed tomography of the head revealed intracranial hemorrhage.

The emergency physician’s reasoning might have been vulnerable to several biases including the availability heuristic, posterior probability error, premature closure, or anchoring heuristic. However, if he had adopted the PCS for the cluster of hepatic encephalopathy (“altered level of consciousness”), he would have generated an appropriate differential that would have included hypoglycemia, intracranial hemorrhage, toxidrome, sepsis, and others.

Discussion

Clinical scenarios 1 and 2 demonstrate that PCS, a combination of an intuitive diagnostic process with a cognitive forcing strategy (one of the analytical processes), has the potential to improve patient outcomes. Cognitive biases are inevitable; however, diagnostic errors due to cognitive bias can be minimized by using PCS. The strategy effectively serves as a “safety net” for life-threatening diagnoses if the correct diagnosis is not made initially. The pivot automatically brings in its cluster of likely alternative diagnoses that need to be ruled out, guarding in particular against premature diagnostic closure, one of the most pervasive cognitive biases.1

PCS differs from traditional debiasing strategies in at least four ways. First, PCS is a debiasing strategy that combines the two mental processes, the intuitive process (System 1) and the analytical process (System 2), in one strategy. This enables physicians to double-check their diagnosis by way of two consecutive diagnostic processes, so diagnostic accuracy is expected to be higher than with other debiasing processes. Second, PCS urges physicians to focus specifically on resemblance in clinical manifestation, regardless of disease entity; this has not been emphasized in traditional approaches. Third, PCS visualizes a differential diagnosis list as a cluster on the “disease map,” as shown in Figures 1 and 2. This enables physicians to generate or expand the differential diagnosis list easily, especially when the new pivot is included in the original cluster, because the two clusters overlap in some differentials (Figure 2). Fourth, the concept of PCS can help learners grasp the building of differential lists with the visual aid of the map. These ideas cannot be realized by the traditional differential diagnosis list.

Figure 2.

This figure depicts the overlap of two clusters of different pivots.

Notes: Two clusters (inside the dashed and solid circles) overlap in some of the most likely differential diagnoses. That means the two clusters resemble each other in clinical manifestation (eg, left-sided abdominal pain) but differ in some respects when making differentials (eg, exact location, radiation of pain).

PCS is especially useful when the cognitive workload of the clinician is high, or for less clinically experienced interns and residents, who are less likely to generate appropriate differential diagnosis lists. PCS can be taught to medical undergraduates and postgraduates. For instance, medical schools can offer classes in which students cultivate PCS skills, and residency programs could offer extracurricular training sessions for improving those skills. In every session, medical students, interns, and residents are repeatedly required to list a cluster for a specific diagnosis. Such training will enable them to list effective differential diagnoses even in a pressing and emergent situation. It is also important for learners to be made aware of the wide variety of cognitive pitfalls in many medical situations.

Conclusion

This article suggests that a simple strategy could improve diagnostic accuracy with daily practice. PCS can be developed for settings in which diagnostic error is prevalent: internal medicine, emergency medicine, and family practice. Further research could be directed at the cost–benefit analysis of using PCS from the perspective of health economics.

Acknowledgments

We would like to express our gratitude to Dr Pat Croskerry, Professor, Department of Emergency Medicine, and Division of Medical Education, Faculty of Medicine, Dalhousie University, Halifax, Nova Scotia, Canada for his valuable reviewing and editing of our article.

Footnotes

Disclosure

We confirm that neither of the authors has a conflict of interest in this work.

References

1. Graber ML, Franklin N, Gordon R. Diagnostic error in internal medicine. Arch Intern Med. 2005;165(13):1493–1499. doi: 10.1001/archinte.165.13.1493.
2. Kohn LT, Corrigan JM, Donaldson MS, editors. To Err Is Human: Building a Safer Health System. Washington, DC: National Academies Press; 2000.
3. Schiff GD, Hasan O, Kim S, et al. Diagnostic error in medicine: analysis of 583 physician-reported errors. Arch Intern Med. 2009;169(20):1881–1887. doi: 10.1001/archinternmed.2009.333.
4. Kostopoulou O, Delaney BC, Munro CW. Diagnostic difficulty and error in primary care – a systematic review. Fam Pract. 2008;25(6):400–413. doi: 10.1093/fampra/cmn071.
5. Brennan TA, Leape LL, Laird NM, et al. Incidence of adverse events and negligence in hospitalized patients: results of the Harvard Medical Practice Study I. N Engl J Med. 1991;324:370–376. doi: 10.1056/NEJM199102073240604.
6. Wilson RM, Runciman WB, Gibberd RW, et al. The Quality in Australian Health Care Study. Med J Aust. 1995;163:458–471. doi: 10.5694/j.1326-5377.1995.tb124691.x.
7. Thomas EJ, Studdert DM, Burstin HR, et al. Incidence and types of adverse events and negligent care in Utah and Colorado. Med Care. 2000;38:261–262. doi: 10.1097/00005650-200003000-00003.
8. Tokuda Y, Kishida N, Konishi R, Koizumi S. Cognitive error as the most frequent contributory factor in cases of medical injury: a study on verdict’s judgment among closed claims in Japan. J Hosp Med. 2011;6(3):109–114. doi: 10.1002/jhm.820.
9. Graber M, Gordon R, Franklin N. Reducing diagnostic errors in medicine: what’s the goal? Acad Med. 2002;77(10):981–992. doi: 10.1097/00001888-200210000-00009.
10. Norman G, Eva KW. Diagnostic error and clinical reasoning. Med Educ. 2010;44(1):94–100. doi: 10.1111/j.1365-2923.2009.03507.x.
11. Gandhi TK, Kachalia A, Thomas EJ, et al. Missed and delayed diagnoses in the ambulatory setting: a study of closed malpractice claims. Ann Intern Med. 2006;145(7):488–496. doi: 10.7326/0003-4819-145-7-200610030-00006.
12. Kahneman D, Slovic P, Tversky A. Judgment Under Uncertainty: Heuristics and Biases. New York, NY: Cambridge University Press; 1982.
13. Croskerry P. The importance of cognitive errors in diagnosis and strategies to minimize them. Acad Med. 2003;78(8):775–780. doi: 10.1097/00001888-200308000-00003.
14. Mamede S, Schmidt HG, Rikers R. Diagnostic errors and reflective practice in medicine. J Eval Clin Pract. 2007;13(1):138–145. doi: 10.1111/j.1365-2753.2006.00638.x.
15. Redelmeier DA. The cognitive psychology of missed diagnoses. Ann Intern Med. 2005;142(2):115–120. doi: 10.7326/0003-4819-142-2-200501180-00010.
16. McNutt R, Abrams R, Hasler S. Diagnosing diagnostic mistakes: case and commentary. AHRQ WebM&M spotlight case. May 2005. http://www.webmm.ahrq.gov/case.aspx?caseID=95. Accessed February 22, 2011.
17. Croskerry P. Achieving quality in clinical decision making: cognitive strategies and detection of bias. Acad Emerg Med. 2002;9:1184–1204. doi: 10.1111/j.1553-2712.2002.tb01574.x.
18. Berner ES, Graber ML. Overconfidence as a cause of diagnostic error in medicine. Am J Med. 2008;121(Suppl):2–33. doi: 10.1016/j.amjmed.2008.01.001.
19. Kahneman D. A perspective on judgment and choice: mapping bounded rationality. Am Psychol. 2003;58(9):697–720. doi: 10.1037/0003-066X.58.9.697.
20. Kahneman D, Tversky A. Choices, values, and frames. Am Psychol. 1984;39:341–350.
21. Elstein AS, Schwarz A. Clinical problem solving and diagnostic decision making: selective review of the cognitive literature. BMJ. 2002;324:729–732. doi: 10.1136/bmj.324.7339.729.
22. Croskerry P. The cognitive imperative: thinking about how we think. Acad Emerg Med. 2000;7:1223–1231. doi: 10.1111/j.1553-2712.2000.tb00467.x.
23. Croskerry P. Cognitive forcing strategies in clinical decision making. Ann Emerg Med. 2003;41:110–119. doi: 10.1067/mem.2003.22.
24. Klein JG. Five pitfalls in decisions about diagnosis and prescribing. BMJ. 2005;330(7494):781–783. doi: 10.1136/bmj.330.7494.781.
25. Poses RM, Anthony M. Availability, wishful thinking, and physicians’ diagnostic judgments for patients with suspected bacteremia. Med Decis Making. 1991;11(3):159–168. doi: 10.1177/0272989X9101100303.
26. Brezis M, Halpern-Reichert D, Schwaber MJ. Mass media-induced availability bias in the clinical suspicion of West Nile fever. Ann Intern Med. 2004;140(3):234–235. doi: 10.7326/0003-4819-140-3-200402030-00024.
27. Heath L, Acklin M, Wiley K. Cognitive heuristics and AIDS risk assessment among physicians. J Appl Soc Psychol. 1991;21:1859–1867.
28. Peay MY, Peay ER. The evaluation of medical symptoms by patients and doctors. J Behav Med. 1998;21(1):57–81. doi: 10.1023/a:1018715521706.
29. Evans JS. Dual-processing accounts of reasoning, judgment, and social cognition. Annu Rev Psychol. 2008;59:255–278. doi: 10.1146/annurev.psych.59.103006.093629.
30. Stanovich KE. Who Is Rational? Studies of Individual Differences in Reasoning. Mahwah, NJ: Lawrence Erlbaum; 1999.
31. Croskerry P. A universal model for diagnostic reasoning. Acad Med. 2009;84:1022–1028. doi: 10.1097/ACM.0b013e3181ace703.
32. Medin D, Schaffer MM. A context theory of classification learning. Psychol Rev. 1978;85:207–238.
33. Eva KW. What every teacher needs to know about clinical reasoning. Med Educ. 2005;39(1):98–106. doi: 10.1111/j.1365-2929.2004.01972.x.
34. Norman G, Young M, Brooks L. Non-analytical models of clinical reasoning: the role of experience. Med Educ. 2007;41:1140–1145. doi: 10.1111/j.1365-2923.2007.02914.x.
35. Norman G. Dual processing and diagnostic errors. Adv Health Sci Educ Theory Pract. 2009;14(Suppl 1):37–49. doi: 10.1007/s10459-009-9179-x.
36. Schmidt HG, Boshuizen HP. On acquiring expertise in medicine. Educ Psychol Rev. 1993;5:1–17.
37. Schmidt HG, Norman GR, Boshuizen HP. A cognitive perspective on medical expertise: theory and implication. Acad Med. 1990;65(10):611–621. doi: 10.1097/00001888-199010000-00001.
38. Eva KW. The aging physician: changes in cognitive processing and their impact on medical practice. Acad Med. 2002;77(Suppl 10):S1–S6. doi: 10.1097/00001888-200210001-00002.
39. Mamede S, Schmidt HG, Rikers RM, Penaforte JC, Coelho-Filho JM. Influence of perceived difficulty of cases on physicians’ diagnostic reasoning. Acad Med. 2008;83(12):1210–1216. doi: 10.1097/ACM.0b013e31818c71d7.
40. Schiff GD. Minimizing diagnostic error: the importance of follow-up and feedback. Am J Med. 2008;121(Suppl 5):S38–S42. doi: 10.1016/j.amjmed.2008.02.004.
41. Singh H, Thomas EJ, Khan MM, Petersen LA. Identifying diagnostic errors in primary care using an electronic screening algorithm. Arch Intern Med. 2007;167(3):302–308. doi: 10.1001/archinte.167.3.302.
42. Mamede S. Effect of availability bias and reflective reasoning on diagnostic accuracy among internal medicine residents. JAMA. 2010;304(11):1198–1203. doi: 10.1001/jama.2010.1276.
43. Mamede S, Schmidt HG, Penaforte JC. Effects of reflective practice on the accuracy of medical diagnoses. Med Educ. 2008;42(5):468–475. doi: 10.1111/j.1365-2923.2008.03030.x.
44. Mamede S, Schmidt HG. The structure of reflective practice in medicine. Med Educ. 2004;38(12):1302–1308. doi: 10.1111/j.1365-2929.2004.01917.x.
45. Mamede S, Schmidt HG, Rikers RM, Penaforte JC, Coelho-Filho JM. Breaking down automaticity: case ambiguity and the shift to reflective approaches in clinical reasoning. Med Educ. 2007;41(12):1185–1192. doi: 10.1111/j.1365-2923.2007.02921.x.
46. Mamede S, Schmidt HG. Correlates of reflective practice in medicine. Adv Health Sci Educ Theory Pract. 2005;10(4):327–337. doi: 10.1007/s10459-005-5066-2.
47. Croskerry P. The feedback sanction. Acad Emerg Med. 2000;7:1232–1238. doi: 10.1111/j.1553-2712.2000.tb00468.x.
48. Ely JW, Graber ML, Croskerry P. Checklists to reduce diagnostic errors. Acad Med. 2011;86(3):307–313. doi: 10.1097/ACM.0b013e31820824cd.
49. Graber ML. Educational strategies to reduce diagnostic error: can you teach this stuff? Adv Health Sci Educ Theory Pract. 2009;14(Suppl 1):63–69. doi: 10.1007/s10459-009-9178-y.
50. Eddy DM, Clanton CH. The art of diagnosis: solving the clinicopathological exercise. N Engl J Med. 1982;306:1263–1268. doi: 10.1056/NEJM198205273062104.

Articles from International Journal of General Medicine are provided here courtesy of Dove Press
