While reading a routine EEG recorded for 20–30 min, the interpreter has to review approximately 120–180 pages or screens, each containing 19–25 channels of EEG and other waveforms. This includes normal and abnormal EEG activity during different states of consciousness and in response to a variety of stimuli. It can seem a daunting task, particularly for a novice, not only to summarize this information in a concise report that contains all the necessary and sufficient details but also to provide a clinical correlation that guides the treating physician. Reporting becomes even more complex when specialized types of EEG and prolonged EEG studies are taken into consideration.
Fortunately, several resources are available to assist the aspiring reader in learning how to identify normal and abnormal findings and report EEGs, including a number of EEG textbooks, professional society guidelines (American Clinical Neurophysiology Society, 2006; Tatum et al., 2016), a scholarly review (Kaplan and Benbadis, 2013) and a glossary of EEG terms (Kane et al., 2017). In addition, at many academic centers, EEG interpretation is taught during neurology residency, clinical neurophysiology fellowship or other medical subspecialty training. However, access to these resources and to dedicated EEG training programs is not available in all parts of the world. At the same time, routine EEG is a commonly ordered test in clinical practice for a variety of indications, and neurologists often report EEGs despite having limited training and experience.
The approach to EEG reporting also differs across centers and among individual readers. The large variety of clinically relevant EEG findings, differences in the use and understanding of terminology, and variations in describing the clinical relevance of observations and terms (e.g. phase reversal) lead to confusion and variable quality of reports. Free text EEG reports provide flexibility in accommodating this wide variety of findings, but the reporting style often varies from center to center, and most electroencephalographers perpetuate these differences by continuing, throughout their careers, to use the style they learned during training. Further, even highly knowledgeable and experienced readers at a single center may disagree with one another on an observation or its significance because of the inherent subjectivity involved in EEG interpretation.
Clearly, some standardization of EEG reporting is necessary, while still allowing for individual differences based on specific situations and the reader’s natural language and writing style. The most recent American Clinical Neurophysiology Society (ACNS) guideline on EEG reporting (Tatum et al., 2016) recommends the use of five sections: a succinct history with relevant clinical information and the indication for the EEG; a technical description of the conditions and parameters of recording; a complete and orderly description of salient normal and abnormal EEG findings using standard terminology; an EEG interpretation or impression reflecting a synthesis of the significance of the EEG findings; and a clinical correlation expressing the clinical relevance of the findings in language understandable to all clinicians. Despite the availability of guidelines, they are not always followed, and use of a free text format can result in missed information and considerable variability of reports. As computer-based technology has become more sophisticated, attempts have been made at local and broader levels to apply it to EEG reporting.
The standardized computer-based organized reporting of EEG (SCORE) system was developed in Europe (Beniczky et al., 2013, Beniczky et al., 2017) and has quickly become popular, with widespread use in multiple countries and endorsement by the International Federation of Clinical Neurophysiology (IFCN) and the International League Against Epilepsy (ILAE). The system uses software developed by Holberg EEG that guides the reader through a structured reporting process in which features are chosen from a predefined list based on standardized terminology. There are two software packages: a premium edition that has to be purchased and a free version with limited features and basic functionality. The report is organized in a manner similar to that proposed in the latest ACNS guideline (Tatum et al., 2016): study information including the indication for the EEG and technical details, a systematic description of normal and abnormal EEG findings with visualization on a 2D head model, a summary of EEG findings and their diagnostic significance, and a clinical correlation. There is also a header with the hospital name and a section providing information about patient demographics and the referring provider.
An important question is whether EEG reports generated with the SCORE system are superior to free text reports. In this volume of Clinical Neurophysiology Practice, Japaridze and colleagues compare the quality of routine EEG reporting with the SCORE system against free text reports in 157 patients evaluated at a center in Tbilisi, Georgia, which they consider to be an underprivileged area (Japaridze et al., 2022). They found that the SCORE system not only improved the quality of reporting but was also rated by referring physicians as understandable and useful.
All EEGs were reported in both formats, free text and SCORE, the latter using the free software version translated into Georgian. The article does not specify whether the same or different readers entered the two reports, how many readers were involved in the study, or what their background was in terms of EEG training and familiarity with EEG guidelines. Inclusion of key features (quality indicators) based on the current ACNS guideline (Tatum et al., 2016) was assessed in both types of reports.
They observed that seventy-five percent (18 out of 24) of the key features were described exclusively (14) or more often (4) in SCORE reports compared with free text reports. On closer analysis of the key features in Table 2, the top two features that were never included in the free text reports were the indication for the EEG and information about the recording electrode array. This is surprising, since these elements are typically part of free text reports in most parts of the world. It is also a little confusing, since the example of a free text report provided in the supplement does in fact provide a diagnosis (focal epilepsy), while the corresponding SCORE report states “monitoring the effect of medication” without specifying a clinical diagnosis. Similarly, the free text report does mention the 10–20 system of electrode placement, while the SCORE report merely adds the detail “10–20 and inferior row.” In addition, it is concerning that only 2 of the 157 free text reports described the diagnostic significance of the findings.
Using a Likert scale, referring physicians also indicated that the SCORE format was informative, easy to understand, useful and more refined (of higher granularity and precision) than free text reports. However, only 20 selected reports were provided to them, and it is unclear from the article whether the authors included both types of reports and asked for a comparison between the two, or only shared the SCORE format.
While the differences between SCORE and free text reports were quite striking in this study, we cannot help wondering whether the situation would be similar with free text reports at other centers. The authors do acknowledge that well written free text reports that follow all the recommendations of the ACNS guideline may be comparable to SCORE reports in terms of the details provided. But where this is not done, because the guidelines are unavailable or ignored, or because extensive EEG training is lacking, the SCORE system may improve the quality of EEG reports by providing a template and guiding the reader to include essential details. Caution must be exercised before using the results from a single center to make a generalized statement about underprivileged regions; similar findings need to be demonstrated in other such areas of the world. Also, the concept of “underprivileged” needs to be better defined: are these regions that lack resources, training or both? In terms of resources, written free text reports may actually be less expensive than using a computer program. The free version avoids the cost of the premium edition, but it still requires collaboration with information technology services and cannot be integrated with certain types of EEG hardware and electronic record systems. Regardless of the version used, readers need training and have to go through a learning curve before they can use the SCORE system accurately and efficiently.
As pointed out by others (Sperling, 2013, Tatum, 2017), it is also important to keep in mind that, although SCORE promotes inclusion of more details in the report, it does not improve the quality of interpretation. Therefore, there is an ongoing need for medical education and training.
The time required to complete a SCORE report has also been raised as a concern (Sperling, 2013, Tatum, 2017). Brogger et al. (2018) specifically addressed this issue and found that the median time to score and report a routine EEG using the SCORE system was 12.5 min, with longer and more variable times for abnormal EEGs than for normal ones (median 8.5 min). Although no direct comparison with the time for free text reporting was performed in that study and no other publications provide this information, these times appear acceptable. However, the authors were experienced users of SCORE and acknowledged that a center starting with SCORE EEG may take much longer.
Free text and SCORE EEG reports do not represent a complete either-or situation. Proponents of free text reports, who worry that SCORE imposes a cookie-cutter approach that takes the “art” out of EEG reporting and forces them to choose from a limited set of options, should be reassured that SCORE does allow some details to be added as free text, especially in the summary and clinical interpretation. Conversely, with the wide availability of electronic medical records, free text reports can be created using templates that steer and encourage the reader to comment on all the key features.
In conclusion, by prompting and guiding the reader, the SCORE system ensures that all the important EEG features are addressed and documented, whereas some of these may be omitted or overlooked in free text reports. This can lead to a better quality of EEG reporting. We agree with the authors that this may be particularly helpful in regions of the world with limited training and resources, as long as the free software version continues to be offered. Further, by using standard and uniform terminology, it can improve communication and data sharing across centers and between clinicians and researchers. Widespread adoption also obviates the need for individual centers to design and validate their own computerized system, saving time and effort, and allows international standardization of reports.
Conflicts of interest and funding sources
None.
References
- American Clinical Neurophysiology Society. Guideline 7: guidelines for writing EEG reports. J. Clin. Neurophysiol. 2006;23(2):118–121. doi: 10.1097/00004691-200604000-00008.
- Beniczky S., Aurlien H., Brøgger J.C., Fuglsang-Frederiksen A., Martins-da-Silva A., Trinka E., et al. Standardized computer-based organized reporting of EEG: SCORE. Epilepsia. 2013;54:1112–1124. doi: 10.1111/epi.12135.
- Beniczky S., Aurlien H., Brøgger J.C., Hirsch L.J., Schomer D.L., Trinka E., et al. Standardized computer-based organized reporting of EEG: SCORE – Second version. Clin. Neurophysiol. 2017;128:2334–2346. doi: 10.1016/j.clinph.2017.07.418.
- Brogger J., Eichele T., Aanestad E., Olberg H., Hjelland I., Aurlien H. Visual EEG reviewing times with SCORE EEG. Clin. Neurophysiol. Pract. 2018;3:59–64. doi: 10.1016/j.cnp.2018.03.002.
- Japaridze G., Kasradze S., Aurlien H., Beniczky S. Implementing the SCORE system improves the quality of EEG reporting. Clin. Neurophysiol. Pract. 2022 (this volume). doi: 10.1016/j.cnp.2022.07.004.
- Kane N., Acharya J., Beniczky S., Caboclo L., Finnigan S., Kaplan P.W., Shibasaki H., Pressler R., van Putten M.J.A.M. A revised glossary of terms most commonly used by clinical electroencephalographers and updated proposal for the report format of the EEG findings. Revision 2017. Clin. Neurophysiol. Pract. 2017;2:170–185. doi: 10.1016/j.cnp.2017.07.002.
- Kaplan P.W., Benbadis S.R. How to write an EEG report. Dos and don’ts. Neurology. 2013;80(Suppl 1):S43–S46. doi: 10.1212/WNL.0b013e3182797528.
- Sperling M.R. Commentary on standardized computer based organized reporting of EEG; SCORE. Epilepsia. 2013;54:1135–1136. doi: 10.1111/epi.12210.
- Tatum W.O. Standardized computerized EEG reporting – it’s time to even the score. Clin. Neurophysiol. 2017;128:2330–2331. doi: 10.1016/j.clinph.2017.08.021.
- Tatum W.O., Selioutski O., Ochoa J.G., Munger Clary H., Cheek J., Drislane F., Tsuchida T.N. American Clinical Neurophysiology Society guideline 7: guidelines for EEG reporting. J. Clin. Neurophysiol. 2016;33:328–332. doi: 10.1097/WNP.0000000000000319.