Highlights
• Understanding the user experience is key to effective design.
• Agile design practices foster efficient systems development.
• Implementing technological change requires socio-cultural understanding of teams.
• User-testing data from multiple sources allows for better systems evaluation.
• A well-designed system requires little training or explanation to use.
Keywords: Interface design, Systems development, Environmental monitoring, Healthcare environments, Mixed-methods, User-centered design
Abstract
Background
Intensive care units (ICUs) are busy around the clock, and it is difficult to maintain low sound levels that support patient rest. To help ICU staff manage their activities we developed a visual display that monitors and reports sound levels in real time. This facilitates immediate feedback, encouraging proactive behavior change to limit disturbances.
Methods
Following the principles of user-centered design, we created a ‘user persona’ to understand the needs and goals of potential users of the system. We then conducted iterative user testing with current members of the ICU team, primarily using the ‘think aloud’ method, to refine the design and functionality of our novel system. In-situ ethnography was used to evaluate how the team used the display.
Results
The final design was simple, clear, and efficient, and both functional and aesthetically pleasing for the key user demographic. We identified challenges in the implementation and adoption process that were separate from the ‘usability’ of the system itself.
Conclusions
Embedding the design process within the core user demographic ensured the final product delivered relevant information for key users, and that this information was intuitive to interpret. Initiating sustainable change is not straightforward. It requires recognition of cultural practices within teams, departments, professions, and organizations, together with strategies to maximize engagement.
1. Background
1.1. Monitoring sound levels in the healthcare environment
Sound levels in intensive care units (ICUs) are typically between 50 and 60 dBA, with peaks up to 128 dB [1]. By comparison, a typical library is 35 dBA, the limit for road-legal car exhausts in the UK is 74 dBA [2], and a running chainsaw is ~110 dBA [3]. Prolonged exposure >85 dBA can damage hearing and requires hearing protection in the workplace [4]. Sounds >200 dBA can be instantly fatal [5]. Sound levels in the ICU have more subtle effects, such as disturbing patient sleep, distracting staff, or hindering communication. To protect staff and patients, the World Health Organization recommends that the 24 h average sound level in patient care areas should not exceed 35 dBA [6].
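The WHO figure is an equivalent continuous level rather than an arithmetic mean of decibel readings: averaging is performed on sound energy and only then converted back to decibels. The conventional definition (standard acoustics, not specific to this study) is

$$L_{Aeq,T} = 10\,\log_{10}\!\left(\frac{1}{T}\int_{0}^{T}\frac{p_A^{2}(t)}{p_0^{2}}\,\mathrm{d}t\right)\ \text{dBA}, \qquad p_0 = 20\ \mu\text{Pa},$$

where $p_A(t)$ is the A-weighted sound pressure and $T$ is the averaging period. Because the average is taken over energy, loud events dominate the value: long quiet periods do little to offset brief loud peaks.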
Managing sound levels in the ICU is therefore a priority. It is known that ICU admission affects long-term mental health [7]. Patients can find the ICU highly distressing [8], and noise related to equipment and staff activity is frequently cited as contributing to patient anxiety. In an environment where peak sound levels >100 dB occur up to 16 times every hour [1] it is unsurprising that patients report noise as a significant barrier to good sleep [9].
Attempts to reduce sound levels in ICUs have achieved short-term success but sustained lower sound levels remain difficult to achieve [10]. Embedded behavioral or cultural change is difficult to realize, and requires repeated interventions over prolonged timeframes. This is particularly challenging where staff turnover is high. System change has a better chance of longevity [11], but inadequately designed systems can worsen team performance [12] and many healthcare informatics systems have been shown to be suboptimal, with several usability challenges identified [13], [14].
To be adopted successfully, an engineered solution must be well designed. Good design is achieved when a system fits an existing workflow and is usable by as many people as possible with minimal explanation. Many technological systems fail because their ‘conceptual image’ (the model that users form of how the system performs, what the information it displays means, and how they can influence its working) does not match reality. Through good design, designers can communicate the correct conceptual mapping to users [15]. This leads to efficiency in use, which influences effectiveness and, ultimately, satisfaction.
To ensure systems are usable in real working conditions, the design process needs to be based on a clear understanding of the purpose for which systems will be used, and the key features that users need. User-centered methods, such as those we deployed, allow designers to accommodate the human complexities associated with technological innovation uptake and adoption [16].
A key design concept for systems built to improve workflow, working conditions, or efficiency is to minimize user cognitive load [17]. Effective displays should be clear, unambiguous and simple; each visual element reduced to its simplest form whilst retaining functionality [18]. Visual alert systems that do not account for user psychology are unlikely to be successful in the long term. Signs that attract attention through visual alerts when high sound limits are breached have been shown to improve noise control temporarily [19]. They may fail because their visual novelty wanes over time. One of the aims of the SILENCE project (ref: NIHR PB-PG-0613-31034) was to design a real-time monitoring system allowing staff in the ICU to adjust activities to reduce sound levels. To do this we needed to develop a robust understanding of existing ICU noise management strategies and how these were embedded within routine clinical care. We then leveraged this understanding to create a new noise-monitoring system coupled to a novel clinical display that exemplified good design principles.
2. Methods
This quality improvement phase of the project ran at the adult intensive care unit at the John Radcliffe Hospital, Oxford from April 2017 to May 2018 (Oxford University Hospitals NHS Foundation Trust Datix ref: 3247). We did not recruit patients, nor collect identifiable patient information. Local ethics policies do not require formal review or approval for studies based on environmental data.
This was an advanced design mixed-methods study [20]. Specifically, the design development work was qualitative as we needed to understand the user experiences associated with the task of managing sound levels in patient areas. Quantitative analysis, embedded within the wider qualitative study, informed and supported design decisions made on the basis of understanding the qualitative data.
Sound levels were collected through a microphone array that mapped sound across a self-contained area within the general adult ICU at the John Radcliffe Hospital, Oxford. This area included four beds in a communal space and two single-occupation rooms. Details of this array system, designed to locate sound origins and convert sound level data for display, are published elsewhere [21]. Sound levels were used to contextualize the need for sound level monitoring, establish the need to present these data to clinical staff in near real time, and to help identify features of the soundscape that would be useful to be included on a visual display.
To create the display we recruited nurses with experience of working in the ICU. The final group of five was purposively sampled to ensure a range of seniority (NHS Bands 5–7). This group produced a ‘goal and role’ based user persona [22], [23], [24], describing the typical expected user of the system, which informed subsequent design decisions. Design teams can optimize products to maximize the chances of successful deployment and adoption by recognizing the needs of a typical user, their expectations of the product, and how it will fit into existing tasks or processes. To build the persona, the nurses were asked to think about their goals, activities, expectations, and frustrations during a shift. We then discussed the role of the ICU nurse in terms of a typical patient admission to the unit. This discussion allowed us to create a ‘user journey’ and identify ‘pause points’ [25] in the workflow when information about sound levels would be helpful.
We assessed each version of the interface against the elements of usability defined in ISO 9241 (effectiveness, efficiency, and user satisfaction) [26]. New prototype designs were produced in response to user feedback following the ‘agile’ design process, characterized by rapid iteration guided by evaluation through a design-test-review cycle [27]. Visual clutter can predict information search efficiency [28] related to display complexity [29]. In recognition of this, a screenshot from each iteration was assessed using the Feature_Congestion package [30] running in MATLAB r2014a (MathWorks UK, Cambridge, UK), as described by Rosenholtz et al. [31]. This analysis produces two measures of visual complexity: feature congestion (which conveys clutter and predicts the difficulty of locating a new variable added to the display), and sub-band entropy (related to the amount of information available, influenced by the number of features, colors, and shapes). Both values correlate well with real-world performance in visual search tasks [29]. Displays with higher values are less organized and more difficult to search. We used visual clutter analysis as an objective measure of information complexity, in conjunction with subjective user feedback, to assess usability.
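To illustrate the idea behind the clutter metrics, without reproducing the Feature_Congestion toolbox itself, the following minimal Python sketch computes a crude sub-band-entropy-style score: the image is decomposed into a few coarse scales, and the Shannon entropy of the coefficients in each band is averaged. The decomposition, bin count, and test images are illustrative assumptions; the published measure uses an oriented multi-scale (wavelet-style) decomposition and additional color channels [29], [31].

```python
import numpy as np

def shannon_entropy(values: np.ndarray, bins: int = 64) -> float:
    """Shannon entropy (bits) of a histogram of coefficient values."""
    hist, _ = np.histogram(values, bins=bins)
    p = hist / hist.sum()
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

def downsample(img: np.ndarray) -> np.ndarray:
    """Halve resolution by 2x2 block averaging."""
    h, w = (img.shape[0] // 2) * 2, (img.shape[1] // 2) * 2
    img = img[:h, :w]
    return (img[0::2, 0::2] + img[1::2, 0::2] + img[0::2, 1::2] + img[1::2, 1::2]) / 4.0

def subband_entropy_proxy(gray: np.ndarray, levels: int = 3) -> float:
    """Average entropy over a crude multi-scale decomposition: at each level the
    'sub-band' is the detail lost by downsampling (image minus its upsampled blur)."""
    current = gray.astype(float)
    entropies = []
    for _ in range(levels):
        low = downsample(current)
        up = np.repeat(np.repeat(low, 2, axis=0), 2, axis=1)  # nearest-neighbour upsample
        h, w = up.shape
        detail = current[:h, :w] - up
        entropies.append(shannon_entropy(detail))
        current = low
    entropies.append(shannon_entropy(current))  # residual low-pass band
    return float(np.mean(entropies))

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    simple = np.zeros((128, 128))
    simple[32:96, 32:96] = 1.0            # one plain rectangle
    cluttered = rng.random((128, 128))    # dense pixel noise
    print(f"simple display:    {subband_entropy_proxy(simple):.2f}")
    print(f"cluttered display: {subband_entropy_proxy(cluttered):.2f}")
```

As with the published measures, a visually busy image yields a higher score than a sparse one.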
Each iteration was explored with users, primarily using the ‘think aloud’ method. Ten purposively sampled prospective users were recruited from the adult ICU team, all with current experience of ICU nursing. The prototype interfaces were presented individually, allowing the nurses to discuss their preferences with regard to the visual representation of sound and potential interactivity with the future interface.
The nurses were asked to interact with a static simulation of a series of prototypes, verbalizing their actions as they interacted with each [32]. Specifically, they were asked to explain how they interpreted the visual images and what action they would take if presented with this information on an electronic screen. We also used semi-structured discussions and simple questionnaires to explore understanding of sound levels and how users thought a new graphical user interface (GUI) of sound levels might integrate into the established workflow. The nurses were also asked if they were aware of the guidelines for sound levels in the ICU and how often they thought the local unit was compliant.
Interface use was assessed by in-situ ethnography. The display was active during all observation sessions, and field notes captured a description of the environment with a particular focus on how people interacted with the GUI and how the presence of the display influenced behavior. Sound levels were recorded concurrently. The researcher aimed to be unobtrusive in the environment to avoid influencing behavior. Where clinical staff voluntarily engaged with the research team we explored their understanding of the interface and how they felt awareness of sound levels in the ICU was changing. Field notes were reviewed shortly after ethnography sessions and key points identified and summarized.
The key principles of design informing this phase of the project are explored in detail elsewhere [33]. This adherence to good design practice ensured the system accommodated users’ needs and requirements, that it was straightforward to use, was pleasing to the eye, and communicated information clearly.
3. Results
3.1. User persona
Since 90% of the UK NHS nursing workforce is female [34], we created ‘Suzanne’, our user persona. She is a nurse with moderate experience working in a general adult ICU. We used hand-drawn graphics generated during the group discussion to understand Suzanne’s needs and expectations during her work. See Fig. 1.
Fig. 1.
User Persona: Suzanne – intensive care nurse and key user of proposed interface.
Suzanne’s primary goal is to manage her patient’s wellbeing by providing specialist care to maintain a ‘steady’ state of health and reduce disturbance. To achieve this she expects limited interruptions and does not want to disturb her patient more than necessary. She is frustrated by anything she perceives as preventing her from achieving her goals. Barriers include being overwhelmed by workplace stimuli and demands on her attention. She is particularly bothered by telephones and persistent alerts from patient monitoring systems, finds it time-consuming to negotiate for help, and is irritated when unable to find equipment.
She sees herself as caring and empathetic, an integral part of a supportive team, and effective at multitasking. She can feel emotionally detached from her work and she worries about her work/home balance. She is concerned about burnout.
Suzanne aims to limit disturbance by personalizing alarm thresholds for her patient but admits she can respond to alarms somewhat reflexively. She can also be oblivious to background noise which can feel like a monotonous drone. She is aware she can suffer ‘alarm fatigue’.
3.2. User journey
A user journey combined with the user persona allowed us to identify how the new GUI might integrate into existing workflows. We mapped ‘admission of a patient after an operation’ as a representative high-workload activity. The nurses described the main tasks associated with a new admission and we then asked them to reflect on their likely ability to be ‘present in the moment’ and their situational awareness. We were looking for key points in the admission process where it would be feasible to integrate a visual representation of sound levels. See Fig. 2.
Fig. 2.
User Journey: admission of a patient to the intensive care unit after an operation.
On the patient’s arrival Suzanne must assimilate detailed information from multiple sources to establish their needs. She may need to transfer the patient to new monitoring equipment, likely changing settings and triggering disconnection alarms in the process. As she becomes more familiar with her patient, her anxiety levels fall and her situational awareness increases. Retrospectively, Suzanne is aware the admission process was disturbing for other people, but admits that “in the moment” she is unaware of the disruption.
Predicted sound level changes associated with each stage of the user journey were mapped against how aware Suzanne is likely to be of these in real time. This suggested there were natural pauses in the process, during which Suzanne might reflect on ‘how things are going’, and that this could include a review of recent sound levels and disturbance.
3.3. GUI iteration
There were four iterations of the GUI. For simplicity in this paper we present visuals of the final version only. Full details of all iterations are available elsewhere [33].
An initial paper prototype was developed and presented to ten nurses for discussion. These sessions included a think-aloud review, exploration of the nurses’ understanding and experience of noise levels in an ICU, and an opportunity to design their own interface.
From a list of potential sound-related display elements, seven users prioritized noise level, five requested sound-source location, six thought details of the source and type of noise would be useful, and two were interested in the severity level. When prompted for their own interface designs, five people wanted suggestions to reduce noise levels, and four wanted to see trends or comparison data to assess current levels against targets. Individual designs varied but similarities enabled interface elements to be classified and grouped for future testing.
Based on findings from initial user tests, new low-fidelity designs were generated and presented for think-aloud assessment. This led to a formal specification that was used to create a functional electronic version for subsequent iterations and development, including interactivity testing. A design choice was made to split the display into separate screens. This disconnected real-time sound localization from trend values, simplifying the interface.
This split-display system made use of progressive data disclosure. System heuristics were based on spatial understanding of the ICU environment. The think-aloud process demonstrated that users were able to navigate the hierarchical display and correctly interpret the iconography. New functionality suggested at this stage included displaying individual bed space sound level values to motivate staff to reduce sound levels. It was also suggested that bed space values could help identify nurses who might be free to support patient care needs, and help staff recognize which bed spaces were consistently louder or quieter than others. Feedback also included a need to acknowledge alerts, and senior staff wanted to be able to export reports from the system. There was particular interest in being able to compare shifts and associate data with patient feedback questionnaires.
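A minimal sketch of the progressive-disclosure idea is shown below (a hypothetical illustration, not the SILENCE implementation): an overview screen summarizes each bed space with a single banded value, and a detail screen is opened only on demand. The thresholds, bed labels, and traffic-light bands are assumptions for the example; the deployed display used its own color scale and averaging windows.

```python
from dataclasses import dataclass

QUIET_DBA = 35.0   # hypothetical 'green' threshold, echoing the WHO guideline value
LOUD_DBA = 60.0    # hypothetical 'red' threshold

def band(level_dba: float) -> str:
    """Map a level to a traffic-light band for the overview screen."""
    if level_dba < QUIET_DBA:
        return "green"
    return "amber" if level_dba < LOUD_DBA else "red"

@dataclass
class BedSpace:
    name: str
    recent_dba: list[float]   # e.g. the last few 15-minute LAeq values

def overview_screen(beds: list[BedSpace]) -> str:
    """Screen 1: one summary line per bed space, no localization detail."""
    lines = ["Unit overview:"]
    for bed in beds:
        latest = bed.recent_dba[-1]
        lines.append(f"  {bed.name}: {latest:.1f} dBA [{band(latest)}]")
    return "\n".join(lines)

def detail_screen(bed: BedSpace) -> str:
    """Screen 2: drill down into a single bed space only when requested."""
    values = ", ".join(f"{v:.1f}" for v in bed.recent_dba)
    return f"{bed.name} recent 15-min values (dBA): {values}"

if __name__ == "__main__":
    beds = [BedSpace("Bed 1", [52.4, 55.1, 49.8]),
            BedSpace("Bed 2", [33.0, 34.2, 36.5])]
    print(overview_screen(beds))     # overview first...
    print(detail_screen(beds[1]))    # ...detail only on an explicit request
```

Keeping the overview sparse and deferring richer information to a second screen is one way to hold clutter scores down, consistent with the fall in feature congestion reported in Table 1.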
A summary [35] of comments relating to screenshots of near-final displays led to constructive feedback for the design team and a final specification for the fully-featured working prototype (see Figs. 3a and 3b). Specifically, this included a more vivid color scale and removal of mid-level (moderate) sounds from the localization display to increase focus on the most important high-intensity sounds. Indicators of sounds <35 dB were retained to encourage proactive behavioral change [36]. As one respondent commented: “makes me want to experiment […] to see if we can get it to go green”.
Fig. 3a.
Final primary screen, showing sound values by shift, hour, and current levels.
Fig. 3b.
Final sound localization display.
Examples of each iteration, together with examples from the electronic patient record display and the standard ICU patient monitor used in the hospital, were evaluated for visual complexity (Table 1). Both feature congestion and sub-band entropy are measures of visual information; lower scores indicate simpler views that are easier to scan for information.
Table 1.
Feature congestion and sub-band entropy scores for each iteration in comparison to standard electronic patient monitoring and electronic health record systems in routine use.
Display | Feature Congestion | Sub-band Entropy
---|---|---
Cerner Millennium® | 6.43 | 3.98
ICU patient monitor | 3.94 | 2.72
Iteration 1 | 2.34 | 1.09
Iteration 2 | 3.83 | 2.49
Iteration 3 | 3.05 | 1.75
Iteration 4 | 1.81 | 1.09
All versions were visually simpler than both the standard electronic patient record system in use (Cerner Millennium®) and the patient monitoring screens used for routine care.
Interface use was assessed by in-situ ethnography. Observations were conducted by the primary researcher (JD) who was well-known to the ICU clinical team. Field notes were recorded from approximately six and a half hours of observations. Highlights from comments and feedback on the prototype interface design and purpose are outlined below:
• The screen was out of sight for most of the bay. No one ever spontaneously registered that it was ‘on’.
• The “subtle, understated” look and dark background were particularly liked.
• Several people suggested the ‘now’ values should be more dynamic. This was corroborated by field notes recording that the current sound level visuals did not change noticeably with sudden loud sounds. Reporting the current ‘peak’ rather than LAeq(15 mins) might increase impact (a short sketch after this list illustrates the difference).
• Retrospective review of sound levels was unanimously liked.
• The wireless connectivity between the display screen and the mainframe computer was intermittent. This meant we could not leave the system to run when we were not present in the ICU.
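The point about the ‘now’ value is a property of the metric itself: a 15-minute LAeq is an energy average, so a single brief loud event moves it only slightly, whereas a peak value responds immediately. A minimal Python sketch with hypothetical numbers (not project data):

```python
import math

def laeq(levels_dba: list[float]) -> float:
    """Equivalent continuous level: energy-average of short-term dBA samples."""
    mean_energy = sum(10 ** (l / 10) for l in levels_dba) / len(levels_dba)
    return 10 * math.log10(mean_energy)

# One hypothetical 15-minute window of per-second samples: a quiet background
# with a single one-second 75 dBA event (e.g. a dropped tray).
window = [48.0] * 899 + [75.0]

print(f"LAeq(15 min): {laeq(window):.1f} dBA")   # ~49.9 dBA: barely changes
print(f"Peak:         {max(window):.1f} dBA")    # 75.0 dBA: captures the event
```

A display driven by the peak (or by a much shorter averaging window) would therefore appear far more responsive to the sudden sounds the nurses noticed.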
4. Discussion
Including future users of the proposed system in the design process was a key feature of this project. Their involvement at all stages ensured that the design and functionality of the system remained focused on their needs, and their input meant that the visual information presented on screen mapped to their understanding of their working environment. The ‘agile’ design method of rapid prototyping allows for fast-paced change and was ideal for this project. Each test stage was evaluated quickly, allowing features to be introduced, tested, and discarded if they were not appropriate or did not map to end-user needs and expectations. The speed of development possible through the agile method worked well for both the clinical and design teams.
Early designs were developed on paper. This low-fidelity approach is a cost-effective and rapid method to generate concepts and ideas. Concurrent software programming enabled final designs to be fully operational quickly, and the interactivity of the think-aloud technique “invites the user to become a participant in the analysis of his or her own cognitive processes” [37]. This technique kept the design centered on the users, crucial because new devices tend to succeed or fail not because of their technical merits or failings, but as a result of the socio-cultural context into which they are introduced. Products with a higher chance of long-term success are those that include implementation and evaluation strategies by design, and are created with an understanding of their setting, purpose, and use [38].
The iteration process quickly identified that a single screen presented too much information to be assimilated, evident in the high feature congestion scores, which predicted inefficient visual search. The move to progressive disclosure produced a fall in visual complexity scores. The consistency between qualitative and quantitative data demonstrated that design decisions were valid, and that changes made on the basis of qualitative interpretation led to objective, measurable differences across design iterations. Information search is therefore efficient in the final displays because of the low cognitive load required to interpret the information. This low cognitive load is achieved through simplicity and intuitive logic: use of color is limited and consistent, and the circles displaying sound levels for different timescales are presented in order.
The final assessment was completed via in-situ ethnography sessions. We collected observational field notes that included real-time sound level monitoring data. Whilst it was clear there was a cohort of staff interested in addressing the problem of sound levels in the unit, it was immediately obvious that the display screen was installed in the wrong location, as no-one looked at it spontaneously.
Practical considerations constrained installation, and this influenced user experiences. The display was mounted on a wall behind two bed spaces. This was the only place where it was within reach of a power socket and also out of immediate view of patients. This position meant the display was not in easy sight of staff at any point, and it was completely hidden from view when privacy curtains were closed. It is possible that because the screen was physically between two bed spaces, staff looking after patients in other beds may have felt the screen was not relevant to them. The ICU is organized by bed space: everything around an individual bed is necessary for that patient, but not relevant for other patients. Staff are therefore not conditioned to look into another bed space for information about their patient, and the positioning of the system made it difficult to incorporate into natural workflows for most people. We also suffered from intermittent wireless network connectivity. Two years on, handovers occur outside the main unit bay. Installing a static display in the handover area, with a complementary hand-held device for ad-hoc mid-shift monitoring, would be a potential solution to consider.
5. Conclusions
“A good user interface is like a joke. If you have to explain it, it’s not that good” [39].
The interface is visually appealing, easy to search, and similar to other monitoring displays familiar to the ICU clinical team. The user experience methodology imposed a focus on the needs and preferences of the end user which allowed exploration of typical use-cases through scenario and real-time in-situ assessment. We were able to draw coherent conclusions from concurrent evaluation of mixed data sets throughout the project. We assessed cognitive load through visual clutter scoring and user testing and evaluated real-world usability through observation field notes considered alongside real-time sound values. This mixed methods approach led to a highly acceptable display that is both functional and aesthetically pleasing. Whilst some features need fine-tuning before wider deployment, the system as a whole has clear benefits.
The array system used for the research project is too complex for widespread use. However, we know that focusing on the individual patient space matches the mental model of how ICU staff (especially nurses) view their workspace, and we know from source localization work [40] that loud sounds occur predominantly around patients’ heads. A spontaneous suggestion for use of the real-time display was the ability to identify a “quiet bed” as a way to balance workload across the team when requesting practical assistance. Nurses were also keen to use the display and reporting features to identify persistently loud areas, to facilitate relocating patients they felt would benefit from “silent side room” care. Displaying sound levels for individual patient spaces is relatively simple and should be a key feature of any future sound level display designed for the ICU.
Finally, initiating sustainable change in a complex environment is not straightforward. It requires recognition of cultural practices within teams, departments, professions, and organizations, together with strategies to maximize staff engagement [41], [38]. This mixed-methods project brought the full multi-disciplinary team together, including a cohort of patients, to plan and deliver the intervention. In this respect the project was successful and benefitted from a co-design approach [42] that ensured relevance and significance to patient care. We were, however, still reliant on individuals with the authority and motivation to work flexibly and independently to bypass restrictive limitations on technology integration.
Funding
This paper presents independent research funded by the National Institute for Health Research (NIHR) under its Research for Patient Benefit (RfPB) Programme (Grant Reference Number PB-PG-0613-31034) and by the Oxford NIHR Biomedical Research Centre (Grant Reference Number NIHR-BRC-1215-20008). The views expressed are those of the authors and not necessarily those of the NIHR or the Department of Health and Social Care.
CRediT authorship contribution statement
Julie L. Darbyshire: Conceptualization, Data curation, Formal analysis, Funding acquisition, Investigation, Methodology, Project administration, Resources, Software, Validation, Visualization, Writing – original draft, Writing – review & editing. Paul R. Greig: Visualization, Writing – original draft, Writing – review & editing. Lisa Hinton: Funding acquisition, Investigation, Supervision, Writing – original draft, Writing – review & editing. J. Duncan Young: Conceptualization, Funding acquisition, Investigation, Methodology, Resources, Software, Supervision, Validation, Writing – original draft, Writing – review & editing.
Declaration of Competing Interest
The authors declare that they have no known competing financial interests or personal relationships that could have appeared to influence the work reported in this paper.
Acknowledgements
Many thanks to the staff and patients of the Adult Intensive Care Unit, Oxford University Hospitals NHS Foundation Trust, with particular thanks to matrons Matt Holdaway and Lyn Bennett for their support for this project. Thanks also to the software development team at Oxford Computer Consultants for their expertise translating the sound data and user comments into visual realities.
References
- 1.Darbyshire J.L., Young J.D. An investigation of sound levels on intensive care units with reference to the WHO guidelines. Crit. Care. 2013;17(5):R187. doi: 10.1186/cc12870. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 2.Vehicle Certification Agency: Cars and Noise. https://www.vehicle-certification-agency.gov.uk/fcb/cars-and-noise.asp.
- 3.Health and Safety Executive: Noise. http://www.hse.gov.uk/event-safety/noise.htm.
- 4.Department for Work and Pensions: The Control of Noise at Work Regulations, vol. 1643, 2005.
- 5.European Space Agency: Large European Acoustic Facility. http://www.esa.int/spaceinimages/Images/2014/01/Large_European_Acoustic_Facility.
- 6.Berglund B., Lindvall T., Schwela D. Guidelines for Community Noise. World Health Organisation; 1999. [Google Scholar]
- 7.Hatch R., Young D., Barber V., Griffiths J., Harrison D.A., Watkinson P. Anxiety, Depression and Post Traumatic Stress Disorder after critical illness: a UK-wide prospective cohort study. Crit. Care. 2018;22(1):310. doi: 10.1186/s13054-018-2223-6. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 8.Darbyshire J.L., Greig P.R., Vollam S., Young J.D., Hinton L. 'I can remember sort of vivid people, but to me they were plasticine' Delusions on the intensive care unit: what do patients think is going on? PLoS ONE. 2016;11(4):e0153775. doi: 10.1371/journal.pone.0153775. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 9.Freedman N.S., Kotzer N., Schwab R.J. Patient perception of sleep quality and etiology of sleep disruption in the intensive care unit. Am. J. Respir. Crit. Care Med. 1999;159(4):1155–1162. doi: 10.1164/ajrccm.159.4.9806141. [DOI] [PubMed] [Google Scholar]
- 10.Delaney L., Litton E., Van Haren F. The effectiveness of noise interventions in the ICU. [DOI] [PubMed]
- 11.Trbovich P., Shojania K.G. Root-cause analysis: swatting at mosquitoes versus draining the swamp. BMJ Quality Safety. 2017;26(5):350. doi: 10.1136/bmjqs-2016-006229. [DOI] [PubMed] [Google Scholar]
- 12.Hagiwara M.A., Sjöqvist B.A., Lundberg L., Suserud B., Henricson M., Jonsson A. Decision support system in prehospital care: a randomized controlled simulation study. Am. J. Emerg. Med. 2013;31(1):145–153. doi: 10.1016/j.ajem.2012.06.030. [DOI] [PubMed] [Google Scholar]
- 13.Liew M.S., Zhang J., See J., Ong O.L. Usability challenges for health and wellness mobile apps: mixed-methods study among mhealth experts and consumers. JMIR Mhealth Uhealth. 2019;7(1):e12160. doi: 10.2196/12160. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 14.Hirschtritt M.E., Hirschtritt D.B. Improving usability of health information technology. JAMA. 2019;322(4):364–365. doi: 10.1001/jama.2019.6455. [DOI] [PubMed] [Google Scholar]
- 15.Norman D.A. The MIT Press; Cambridge, Massachusetts, US: 2013. The Design of Everyday Things. [Google Scholar]
- 16.Greenhalgh T., Robert G., Macfarlane F., Bate P., Kyriakidou O. Diffusion of innovations in service organizations: systematic review and recommendations. The Milbank Quarterly. 2004;82(4):581–629. doi: 10.1111/j.0887-378X.2004.00325.x. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 17.S. Oviatt, Human-centred design meets cognitive load theory: designing interfaces that help people think, in: MM ’06: Proceedings of the 14th ACM international conference on Multimedia. Santa Barbara, California, US, 2006.
- 18.Tuch A.N., Presslaber E.E., Stöcklin M., Opwis K., Bragas-Avila J.A. The role of visual complexity and prototypicality regarding first impression of websites: working towards understanding aesthetic judgments. Int. J. Hum. Comput. Stud. 2012;70(11):794–811. [Google Scholar]
- 19.Plummer N.R., Herbert A., Blundell J.E., Howarth R., Baldwin J., Laha S. SoundEar noise warning devices cause a sustained reduction in ambient noise in adult critical care. J. Intensive Care Soc. 2019;20(2):106–110. doi: 10.1177/1751143718767773. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 20.Cresswell J.W. Sage Publications Ltd; London: 2015. A Concise Introduction to Mixed Methods Research. [Google Scholar]
- 21.Müller-Trapet M., Cheer J., Fazi F.M., Darbyshire J., Young J.D. Acoustic source localization with microphone arrays for remote noise monitoring in an Intensive Care Unit. Appl. Acoust. 2018;139:93–100. doi: 10.1016/j.apacoust.2018.04.019. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 22.Johansson M., Messeter J. Present-ing the user: constructing the persona. Digital Creativity. 2005;16(4):231–243. [Google Scholar]
- 23.Human Interface Guidelines: Visual Design (Colour). https://developer.apple.com/design/human-interface-guidelines/ios/visual-design/color/.
- 24.A. Cooper, The Inmates are Running the Asylum: Why High Tech products drive us crazy and how to restore the sanity: Sams Publishing, 2004.
- 25.A. Gawande, The Checklist Manifesto: How to Get Things Right, Profile Books Ltd, London, UK, 2011.
- 26.ISO: ISO 9241-210:2019 Ergonomics of human-system interaction -Part 210: Human-centred design for interactive systems, 2019.
- 27.Dyba T., Dingsoyr T. What do we know about agile software development? IEEE Softw. 2009;26(5):6–9. [Google Scholar]
- 28.J.M. Wolfe, Approaches to Visual Search: Feature Integration Theory and Guided Search, in: A.C. Nobre, S. Kastner (Eds.), The Oxford Handbook of Attention, Oxford University Press, Oxford, 2014, pp. 11–55.
- 29.Rosenholtz R., Li Y., Nakano L. Measuring visual clutter. J. Vision. 2007;7(2) doi: 10.1167/7.2.17. [DOI] [PubMed] [Google Scholar]
- 30.A. Deza, denseClutterFunctions/Feature_Congestion. https://github.com/ArturoDeza/Piranhas/tree/master/MATLAB/denseClutterFunctions/Feature_Congestion.
- 31.R. Rosenholtz, Y. Li, L. Nakano, Feature congestion and subband entropy measures of visual clutter, http://dspace.mit.edu/handle/1721.1/37593.
- 32.Jaspers M., Steen T., Bos C., Geenen M. The think aloud method: a guide to user interface design. Int. J. Med. Inform. 2004;73(11-12):781–795. doi: 10.1016/j.ijmedinf.2004.08.003. [DOI] [PubMed] [Google Scholar]
- 33.Darbyshire J.L. University of Oxford; Oxford: 2020. Sleep in the Intensive Care Unit: Limiting Elements of Noise in the Critical Care Environment (SILENCE): DPhil Thesis. [Google Scholar]
- 34.Royal College of Nursing. Nursing and Midwifery Council; London: 2019. The NMC Register. [Google Scholar]
- 35.Ziebland S., McPherson A. Making sense of qualitative data analysis: an introduction with illustrations from DIPEx (personal experiences of health and illness) Med. Educ. 2006;40(5):405–414. doi: 10.1111/j.1365-2929.2006.02467.x. [DOI] [PubMed] [Google Scholar]
- 36.N. Eyal, Hooked:How to Build Habit-forming Products, Penguin Books Ltd, London, UK, 2014.
- 37.J. Nielsen, T. Clemmensen, C. Yssing, Getting access to what goes on in people’s heads? Reflections on the think-aloud technique, in: Proceedings of the Second Nordic Conference on Human-computer Interaction. Aarhus, Denmark, 2002.
- 38.Greenhalgh T., Wherton J., Papoutsi C., Lynch J., Hughes G., A'Court C., Hinder S., Fahy N., Procter R., Shaw S. Beyond adoption: a new framework for theorizing and evaluating nonadoption, abandonment, and challenges to the scale-up, spread, and sustainability of health and care technologies. J. Med. Int. Res. 2017;19(11) doi: 10.2196/jmir.8775. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 39.https://twitter.com/kimunertlphd/status/1113850393388298244.
- 40.Darbyshire J.L., Muller-Trapet M., Cheer J., Fazi F.M., Young J.D. Mapping sources of noise in an intensive care unit. Anaesthesia. 2019;74(8):1018–1025. doi: 10.1111/anae.14690. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 41.Rogers E.M. Simon and Schuster; New York: 2010. Diffusion of Innovations. [Google Scholar]
- 42.Locock L., Robert G., Boaz A., Vougioukalou S., Shuldham C., Fielden J., Ziebland S., Gager M., Tollyfield R., Pearcey J. Using a national archive of patient experience narratives to promote local patient-centered quality improvement: an ethnographic process evaluation of 'accelerated' experience-based co-design. J. Health Serv. Res. Po. 2014;19(4):200–207. doi: 10.1177/1355819614531565. [DOI] [PubMed] [Google Scholar]