Author manuscript; available in PMC: 2020 Apr 16.
Published in final edited form as: J Health Psychol. 2017 Jul 12;25(4):429–438. doi: 10.1177/1359105317718615

Trust in hybrid closed loop among people with diabetes: Perspectives of experienced system users

Molly L Tanenbaum 1, Esti Iturralde 1, Sarah J Hanes 1, Sakinah C Suttiratana 2, Jodie M Ambrosino 3, Trang T Ly 1,4, David M Maahs 1,5, Diana Naranjo 1, Natalie Walders-Abramson 5, Stuart A Weinzimer 3, Bruce A Buckingham 1, Korey K Hood 1
PMCID: PMC7162558  NIHMSID: NIHMS1566749  PMID: 28810490

Abstract

Automated closed loop systems will greatly change type 1 diabetes management; user trust will be essential for acceptance of this new technology. This qualitative study explored trust in 32 individuals following a hybrid closed loop trial. Participants described how context-, system-, and person-level factors influenced their trust in the system. Participants attempted to override the system when they lacked trust, whereas trusting the system reduced self-management burden and stress. Findings highlight considerations for fostering trust in closed loop systems. Systems may be able to engage users by offering varying levels of control to match trust preferences.

Keywords: automated systems, hybrid closed-loop, technology acceptance, trust, type 1 diabetes

Introduction

Closed loop (CL) automated insulin delivery systems have shown promise for changing type 1 diabetes management in the near future. Current systems, including those being tested and one that will be available to consumers in 2017, include several components: an insulin pump, continuous glucose monitoring (CGM), and a third device that uses an algorithm to convert CGM data into automated insulin delivery through the pump. Some systems manage glucose levels overnight, while others—including the first approved by the Food and Drug Administration (FDA) in 2016 (Gordon, 2016)—are hybrid closed loop (HCL) systems, which still require meal boluses but assist in managing daytime and nighttime glucose levels (Ly and Buckingham, 2015). Trials of such systems have demonstrated success, keeping participants’ glucose levels within range 70–80 percent of the time (Ly and Buckingham, 2015; Russell et al., 2014, 2016; Thabit et al., 2015a, 2015b). Given these positive results, HCL systems have the potential to ease daily burdens of diabetes management, improve glycemic control, and reduce risk of long-term complications.
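To illustrate the loop these components form, the following minimal sketch shows one sense-compute-deliver cycle. It is purely illustrative: the target value, gain, and function names are assumptions, and real systems use clinically validated controllers (the trial described below used a proportional-integral-derivative based algorithm; Ly et al., 2016).

```python
# Illustrative sense-compute-deliver cycle of a closed loop system.
# All numbers and names are hypothetical; real systems use clinically
# validated controllers, not this simplified proportional rule.

TARGET_MG_DL = 120  # hypothetical glucose target

def compute_microbolus(cgm_reading: float, insulin_on_board: float) -> float:
    """Simplified proportional dosing: more insulin the further glucose
    sits above target, throttled by insulin already active."""
    error = cgm_reading - TARGET_MG_DL
    if error <= 0:
        return 0.0  # at or below target: suspend delivery
    dose = 0.001 * error  # hypothetical proportional gain (U per mg/dL)
    return max(0.0, dose - 0.1 * insulin_on_board)

# One 5-minute cycle: read the CGM, compute a dose, command the pump.
reading = 180.0  # mg/dL from the CGM
print(f"microbolus: {compute_microbolus(reading, insulin_on_board=0.5):.3f} U")
```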

HCL users must be able to develop trust in these systems in order to use them and hand over control of diabetes management to them (Barnard et al., 2015). However, little is known about how user trust in an HCL system initially develops. For this article, we draw upon Wagner’s (2009) definition of trust: the belief that the automated technology will “mitigate the trustor’s risk” in situations where the trustor puts their outcomes at risk by relying on the system (Lee and See, 2004; Robinette et al., 2015). Trust is a necessary element of HCL technology adoption, not only because of the potential health risks involved in using these systems but also because sustained use is needed for systems to benefit their users.

To date, few studies on trust and acceptance of these systems have focused on participants with firsthand HCL experience. The majority of studies have involved individuals without direct HCL experience and have found a fairly high level of a priori acceptance of this technology; many participants report willingness to use a CL system when available (Barnard et al., 2015; Elleri et al., 2010; Van Bon et al., 2010, 2011). Some participants predicted that they would trust a system and benefit from it, including trusting overnight insulin delivery (Elleri et al., 2010) and expecting the technology to stabilize glucose levels, save time, and decrease worry (Van Bon et al., 2010, 2011). Some felt they would need about 2 weeks of CL experience to witness its benefits and establish trust (Van Bon et al., 2010). Others foresaw having difficulty trusting a system and wanting to override it (Shepard et al., 2012). Anticipated barriers to trust included concerns about the accuracy of the CGM, algorithm, and personalized advice, and concern about human error in inputting data (Shepard et al., 2012). However, because these participants were predicting whether they might trust an HCL system before experiencing it directly, there is no guarantee that their predictions would align with reality once they gained direct HCL experience.

Very few studies have explored how experienced HCL users develop trust in these systems. One study surveyed participants following an HCL trial; most said they would adopt the technology when available (Bevier et al., 2014). Participants factored in usefulness and usability of the system, health benefit, and convenience in how willing they were to accept the new technology. A second, mixed-methods study of adolescents following an overnight CL trial noted some trust-related issues around CGM accuracy and insulin delivery (Barnard et al., 2014).

A limited number of studies have also examined technology acceptance and trust in the components of HCL systems. Perceived CGM inaccuracy has been linked with device discontinuation, infrequent device use, and lower quality of life (Chamberlain and Dana Dopita, 2013; Polonsky and Hessler, 2015; Ramchandani et al., 2011). In addition, former CGM users frequently cite lack of trust as a reason for discontinuing use (Polonsky and Hessler, 2015).

In addition to the limited literature on how HCL systems earn the trust of their users, available theoretical models of technology acceptance do not fully account for the need to trust the automated nature of HCL systems. The Technology Acceptance Model (TAM), widely applied to acceptance of healthcare information technology, is the theory most commonly referenced in CL and HCL studies (Holden and Karsh, 2010). Studies have found usefulness and perceived ease of use to be key TAM facets in willingness to accept a new technology (Bevier et al., 2014; Holden and Karsh, 2010; Van Bon et al., 2010). Another theory relevant to CL systems is the Diffusion of Innovation Theory (DIT) (Gonder-Frederick et al., 2011). DIT describes a process of technology adoption that begins with learning about a technology, then deciding to use it, using it, and evaluating it. While both theories address technology acceptance, they do not adequately capture CL-specific considerations, such as the user risk involved in relying on an automated system to manage diabetes and the perceived benefits of trust (e.g. diabetes-specific burden that could be eased by using a system).

HCL systems are an emerging technology that will be increasingly available in the immediate future, and their value to individuals with diabetes depends on the systems’ ability to retain users. Therefore, it is important to understand how these systems initially engage users and gain their trust. To date, no studies have investigated the process of trusting HCL systems in individuals with HCL trial experience, and existing theoretical frameworks do not fully capture HCL-specific considerations. Qualitative methods are well suited to developing an understanding of how users build trust in a new technology and to expanding existing theories of technology acceptance to account for the trust and risk involved in using automated systems. We therefore aimed to explore two questions: (1) What factors influence trust in HCL systems? (2) How do users experience giving over control to an automated system? Answering these questions will inform theory and assist with increasing acceptability and uptake of these systems.

Method

HCL trial

Eligible participants in this multicenter trial were between 14 and 40 years of age, had been diagnosed with type 1 diabetes for at least 12 months, and had a total daily insulin requirement of >0.4 U/kg/day. Individuals were excluded if they had experienced diabetic ketoacidosis in the past 30 days or a hypoglycemic seizure or loss of consciousness in the past 3 months, were pregnant, or had medical or psychiatric conditions that would interfere with protocol completion. Adults participated for 5 days on the system, adolescents for 4 days. Participants spent nights in a hotel or house setting with staff in close proximity. Three cohorts were studied. The protocol was approved by the Institutional Review Boards at the three collaborating sites: Stanford University, University of Colorado, and Yale University. Complete details have been published elsewhere (Ly et al., 2016). Two systems were tested. System A used a research Android-based platform with a third-generation Enlite® sensor (Medtronic, Northridge, CA) and required wireless communication between the pump, a translator box, and the Android phone; the Android phone allowed for remote monitoring. System B used a fourth-generation Enlite sensor (Medtronic) and an early research prototype of the Medtronic Hybrid Closed Loop system pump; in System B, the pump and algorithm were integrated, but the system did not allow for remote monitoring. Both systems presented information about current active insulin.

Procedure

Focus groups were held at the end of the trial, were led by a psychologist at each site, and used a structured interview guide to promote consistency. Interview questions covered pre-participation expectations, experiences of wearing the system, attitudes about usability, experiences with diabetes management on the system, and post-study attitudes. Groups lasted between 40 and 75 minutes and were audio-recorded and transcribed.

Data analysis

The goal of this study was to understand initial trust development in HCL systems overall, not to compare systems. Therefore, data from all nine focus groups (three from System A and six from System B) were coded together using content analysis (Pope and Mays, 2000). A codebook was created based on the original study questions and an initial open-coding phase in which two coders read and independently coded one transcript to generate an initial set of codes, which were then applied to the remaining transcripts. Coders met weekly with a third researcher (D.N.) to resolve coding discrepancies and to add, merge, or refine codes as needed. Updated codes were re-applied to earlier transcripts in an iterative process. Coding was entered into NVivo 10 qualitative data analysis software (QSR International Pty Ltd, 2014). Inter-rater reliability was calculated by having both coders code 25 percent of all transcripts, drawn from randomly generated segments of data. Agreement was counted when both coders endorsed, or both did not endorse, a given theme in a specific segment. Based on this criterion, there was 86 percent agreement on average across themes.
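To make the agreement criterion concrete, the following minimal sketch (illustrative only; the data and theme names are hypothetical, and this is not the study’s analysis code) computes per-theme percent agreement from two coders’ binary endorsements of the same segments:

```python
# Minimal sketch of segment-level percent agreement between two coders.
# Data are hypothetical; 1 = theme endorsed in a segment, 0 = not endorsed.

def percent_agreement(coder_a, coder_b):
    """Share of segments where both coders endorsed, or both did not
    endorse, a theme -- the agreement criterion described above."""
    matches = sum(a == b for a, b in zip(coder_a, coder_b))
    return 100 * matches / len(coder_a)

# One pair of lists per theme; each position is one randomly selected segment.
themes = {
    "CGM accuracy":      ([1, 1, 0, 0, 1], [1, 0, 0, 0, 1]),
    "overriding system": ([0, 1, 1, 0, 0], [0, 1, 1, 0, 0]),
}

per_theme = {t: percent_agreement(a, b) for t, (a, b) in themes.items()}
average = sum(per_theme.values()) / len(per_theme)
print(per_theme, f"average = {average:.0f}%")
```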

Results

A total of 32 individuals participated (see Table 1 for participant characteristics). Major themes relating to establishing trust in an HCL system during a research trial were as follows: (1) factors influencing trust (context-, system-, and person-level factors) and (2) experiences of giving over control and developing trust in an HCL system. Figure 1 depicts the interrelationships of these themes within an overarching model of trust development in automated systems.

Table 1.

Participant characteristics (n = 32).

                            Cohort 1:        Cohort 2:        Cohort 3:
                            adults (n = 8)   adults (n = 9)   adolescents (n = 15)

HCL system version          A                B                B
Age (years)                 27.8 ± 5.8       28.5 ± 6.7       16.6 ± 0.9
Diabetes duration (years)   17.7 ± 5.7       18.9 ± 8.2       8.2 ± 3.9
HbA1c
 %                          6.7 ± 0.6        7.0 ± 0.7        9.0 ± 1.1
 mmol/mol                   50 ± 6           53 ± 8           75 ± 12

HbA1c: glycated hemoglobin.
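For reference, the two HbA1c rows in Table 1 are related by the standard NGSP-to-IFCC master equation (a background conversion, not part of the original study methods):

$$\mathrm{HbA1c\ (mmol/mol)} = 10.929 \times \left(\mathrm{HbA1c\ (\%)} - 2.15\right)$$

For example, Cohort 1’s mean of 6.7% converts to 10.929 × (6.7 − 2.15) ≈ 50 mmol/mol, matching the table.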

Figure 1.

Trust in CL systems: influencing factors and implications.

Factors that influenced trust

Contextual factors.

Trust in the system was context-dependent; many participants described trusting the system in one context but not in another. For example, participants were more willing to trust overnight management and less comfortable trusting the system to handle meals and exercise:

I trusted it throughout the whole [study trial]. I thought it was really good for at night and monitoring your blood sugar going up and down. But after eating it was a little off. (Adult, System B)

System-level factors.

System attributes that influenced participant trust included device brand, CGM accuracy, information availability, and alerts.

Device brand.

Some participants expressed that certain device brands would present a barrier to trusting a system. These barriers were often due to past negative experiences with a particular brand of device. In addition, participants were less likely to trust the study sensor when they had positive experiences with their own sensor of a different brand.

CGM accuracy.

Accuracy of the CGM was a very important precursor to establishing trust in the overall system:

[The algorithm] is based off your sensor. If you don’t trust your sensor, you’re not going to trust the insulin that you’re getting. (Adult, System B)

Perceptions of CGM inaccuracy negatively affected overall trust in the system:

I was more confident in my own feeling of being low than what the sensor was telling me. That day that I was 53 the sensor said that I was 100 … I was not confident in the sensor or in the system. (Adult, System A)

When participants did not trust the CGM, they performed more frequent glucose checks:

The pump is nothing without a CGM, so if I don’t trust the sensor ... It didn’t undermine my confidence in the algorithm of the system but before I did any sort of treatment, I would check [my blood glucose level]. I don’t think I checked less by any means. I checked more. (Adult, System B)

Therefore, perceptions of CGM inaccuracy led to decreased trust and increased self-management burden through more frequent blood glucose monitoring.

Information availability.

Access to accurate, real-time diabetes information increased trust. Many users monitored system information frequently and felt it was easily accessible, including CGM trend graphs and insulin delivery information:

I liked that the graph is there. You don’t have to wait for it to build itself. You hit the button for the graph and it’s there. (Adult, System B)

You have to go and look at [insulin given]. That was kind of new to me. It was like, “I can trust this because if I go back and look, I’ve had no insulin for the last hour and a half. I’m okay.” (Adult, System B)

Participants described relying on and benefiting from other system information, such as arrows that indicated direction and speed of change in glucose levels:

Respondent 1: The arrows were very accurate and helpful for me. If I was 90 or 85 and I had a down arrow, I usually would be like 70 or close to 60, which is low. I’d know if I had an arrow I would be pretty close to having a low. Having those arrows were really helpful.

Respondent 4: It was peace of mind. It really was. (Adolescents, System B)

I trusted these arrows a lot more. And I like how there were triple arrows … I thought that was cool. Just to know, “Oh wow, I’m really going down or I’m really going up.” (Adolescent, System B)

The availability of accurate information helped participants make diabetes management decisions, such as when to check glucose levels or to bolus. That being said, participants had individual preferences for quantity and accessibility of information. Some considered the arrows to be a source of stress:

[When you see a triple-arrow down], you don’t know what’s actually happening to you. You’re just free-falling. (Adult, System B)

While some participants expressed a desire for more readily accessible information about insulin dosing, others were satisfied with the information presented. Participants’ ability to access desired information, therefore, fostered trust in the system.

Alerts.

Being able to rely on system alerts was a key part of establishing trust. Some participants appreciated the ability to customize alerts, while others felt the alarms were easy to miss or ignore. Some participants wished the system had distinct alerts for certain events (e.g. low blood glucose levels at night), as users could begin to ignore alerts and they did not want to miss important ones. Others complained that the alarms were too frequent and intrusive, causing embarrassment at work.

Person-level factors.

Participant characteristics played a role in developing trust. Participants who had managed diabetes on their own for many years spoke about learning to trust an automated system as a gradual process:

Last night [my glucose level] was 160 and as I was starting to fall asleep, it was that 20 years of diabetes, I woke up, “What is it? What am I doing? Do I feel low?” you know, that stress-out moment. I’ve learned to trust [the system] a little bit more every day. (Adult, System B)

The first night I didn’t trust it at all. I didn’t. I’ve had diabetes for 17 years. I’ve done it all myself and trusting something else to do that for you without you having really any say in it was … But at the end [of the trial] I completely trusted it 100 percent. (Adult, System B)

Another person-level factor was participants’ self-evaluation of how well they manage diabetes on their own, without the HCL system. Essentially, participants asked themselves, “Can this system do better than I can?” and evaluated the system in comparison to their own performance. For some participants, trust developed quickly; these individuals were more concerned with trusting themselves to manage diabetes on their own again after the study ended:

Respondent 1: I just know that after [the study, my diabetes is] not going to be as controlled. I think I’m going to go back to where I was before.

Respondent 2: It’s harder to control because you can’t constantly sit there and pull out your little graph … it’s going to be hard getting back to the old ways. (Adolescents, System B)

The control-trust spectrum: challenges of giving over control to an automated system and implications of trust.

Figure 1 depicts a continuum that runs from retaining full control over diabetes management at one end to trusting an automated system entirely, and giving over control to it, at the other. Many participants described starting the trial on the “control” end of the continuum and resisting giving control over to the system to manage diabetes for them. As one participant stated,

It’s hard to give up control. I think about my diabetes constantly. I make a million decisions a day. So it’s hard to let that go. (Adult, System B)

At the “control” end of the continuum, participants described attempts to override the system and make different decisions, particularly early on in the trial. The desire to override arose when participants wanted to modify bolus amounts and to administer correction doses:

Sometimes I really wanted to correct but it wouldn’t let me. (Adolescent, System B)

There wasn’t an option for those high-fat, high-carb meals. There wasn’t an option that you could do a dual wave bolus. That was my main concern. (Adult, System A)

Some participants tried workarounds such as entering meals early, inputting more carbohydrate than actually consumed, or entering “fake meals.” These actions enabled participants to retain some control over diabetes management and to achieve desired results rather than trust the system entirely:

I overbolused for all my meals because I felt like I was going so high after I ate. I think because they turned off the micro-bolusing when you eat ... I felt like I always wanted more insulin. (Adult, System B)

Additionally, some participants wished they could provide the system with more detailed information on their daily activities to have more confidence that the algorithm would make appropriate decisions:

If I had the option of [inputting] severity or intensity of exercise, like on a 1, 2, 3 scale, Am I going for a walk? … That would be 1. Am I going to boot camp? That’s a 3, and I know I’m going to bottom out at 30 or 40 afterwards, so it would be nice. (Adult, System B)

Many participants were able to move toward the trusting end of the continuum once the system proved itself through concrete results:

The first day I was wigging out … but then as I realized how effective it was I learned to trust it. Especially seeing the first time it caught a low on me—because I way overdosed for a meal, and the first time I saw it cut off insulin and catch it I was like, “I think I’m going to be okay.” So then I learned to roll with it. (Adolescent, System B)

As would be expected, participants had a harder time giving over control and trusting the system when disappointed by the results:

[My glucose level] was much, much higher this week than normal … I think it was nice to see when it was working. It was just very, very hard for me to put my trust in it. (Adult, System A)

Contrasting with our observation that lack of trust in the system led to greater self-management burden, we found that trust in the system led to decreased self-management and decision-making burden and improved quality of life:

Respondent 3: You don’t have to be as concerned about stuff. You don’t have to trust your instincts.

Respondent 2: If you’re feeling weird ... it was nice to look at your number and be like “Okay, you’re fine. Don’t worry about it” or “Maybe you should check.” (Adolescents, System B)

At the “control” end of the continuum, participants were less hopeful about the technology’s ability to benefit them:

I really was hoping that I wouldn’t go high and low so much. This isn’t close to a cure. I’m still feeling my diabetes. I still thought about it constantly. I felt just as much or more burden … I think my A1c would significantly go up using this with the settings right now. (Adult, System B)

Meanwhile, those who expressed trust in the system then began to imagine the benefits it would have in their daily lives and for their long-term health:

I would be able to trust that the system was doing its job and I would be able to focus more on what I have to do instead of “Oh I’m feeling low and I need to eat.” (Adolescent, System B)

The main thing was having to put my trust in it. Once I did that I felt that it was nice not worrying ... I liked that down the road it will be less worry and be able to keep you tighter. (Adult, System A)

Trusting the system increased hopefulness about the future of diabetes care, as participants could imagine ways in which the system would benefit their daily lives.

Discussion

Our results highlight challenges users face when transitioning to a semi-automated system to manage diabetes and indicate that trust in these systems depends on contextual, system-based, and user-specific factors. Trust in HCL was context-dependent; overall, users trusted the system more to manage diabetes overnight than to handle meals and exercise. System attributes also influenced trust, including CGM accuracy, information availability, alerts, and past experiences with certain device brands. Participants felt frustrated when they could not easily access desired information, such as insulin dose information. In addition, user characteristics influenced trust; participants who had managed diabetes without HCL for many years expressed skepticism that an automated system could make better decisions than they could. Others were more eager for an automated system to manage their diabetes, particularly if they perceived the system to be superior to their own self-management. Many participants wanted to retain some control over diabetes decision-making and attempted to override the system to give themselves different amounts of insulin. Our findings point to a continuum between control and trust that has been highlighted in the literature on trust in other automated systems (e.g. self-driving cars) (Carlson et al., 2014; Heitmeyer and Leonard, 2015; Helldin et al., 2013; Parasuraman and Riley, 1997). As depicted in Figure 1, there are self-management and quality of life implications at both ends of the continuum.

Advances in automated technology have created the need to better understand what factors influence trust in automated processes. Human-automation interaction (HAI) research has shown that user trust has implications for how users engage with a system (Carlson et al., 2014; Parasuraman and Riley, 1997). Technology may be imperfect and unable to adapt immediately to any given real-world situation; systems should therefore instill an appropriate, rather than maximal, level of trust (Carlson et al., 2014). Too little trust in a system could dissuade those with type 1 diabetes from using CL systems or lead users to employ workarounds and other strategies to “trick” the system or regain some control. Conversely, over-trusting is potentially harmful: users who trust an automated system too much may not monitor the system closely, may lose skills and become less vigilant over time, and may not be prepared to intervene when necessary (Carlson et al., 2014).

Therefore, it will be important to calibrate user trust to retain an appropriate level of HCL or CL user engagement (i.e. being in the middle of the continuum, with a balance of control and trust). HAI research has identified methods of optimizing user trust through system design. One strategy is to provide “uncertainty information”—a rating of how well a system can handle conditions at any given time. For example, self-driving car users provided with uncertainty information trusted the system less, were more alert and engaged, and were able to intervene more quickly when necessary, compared with users not given uncertainty information (Helldin et al., 2013). A second method of calibrating user trust involves simulator programs that enable users to run through scenarios before using the system themselves (Heitmeyer and Leonard, 2015). System features unrelated to functionality may also influence trust. For instance, when a self-driving car has anthropomorphic characteristics (e.g. a name, gender, and voice), users trust it more and think it is smarter and better able to anticipate and plan for future events (Waytz et al., 2014).

HCL users will likely have different individual preferences for the amount of control they want over diabetes management decision-making and the type of information provided. A recent overnight CL trial found that fear of hypoglycemia decreased for adolescent users during the trial, but this fear increased for their parents (Barnard et al., 2014). Thus, parents of children or teenagers on CL or HCL systems may have different trust-related needs than adult system users. Future systems may be able to engage more users by offering varying levels of user controls, such as customizable information and alerts. In the future, systems may also be able to adapt to users’ wishes for control or lack of control. Interestingly, one study developed an automated system that could assess and adapt its own trustworthiness by learning from previous human inputs (Floyd et al., 2014). Frequency of user intervention on a system has been used as a measure of lack of trust (Gao et al., 2013); future HCL and CL systems may be able to detect where a user falls along the control-trust spectrum and respond accordingly.
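As a concrete illustration of that last idea, the sketch below (entirely hypothetical; the threshold values, labels, and field names are assumptions rather than features of any existing HCL system or of Gao et al.’s model) maps a user’s override frequency onto a coarse position on the control-trust spectrum:

```python
# Illustrative sketch: estimating a user's position on the control-trust
# spectrum from how often they intervene on (override) the system.
# Thresholds and labels are hypothetical, chosen only for demonstration.
from dataclasses import dataclass

@dataclass
class UsageLog:
    hours_on_system: float
    manual_overrides: int  # e.g. manual corrections, "fake meal" entries

def trust_position(log: UsageLog) -> str:
    """Map override rate (per 24 h) onto a coarse control-trust spectrum."""
    rate = log.manual_overrides / (log.hours_on_system / 24)
    if rate > 6:
        return "control end: frequent overrides, low trust"
    if rate > 2:
        return "middle: calibrated mix of control and trust"
    return "trust end: rare overrides, high reliance"

print(trust_position(UsageLog(hours_on_system=96, manual_overrides=30)))
# -> "control end: frequent overrides, low trust" (30 overrides in 4 days)
```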

Our findings should be considered in the context of our study design. Given our small sample, we cannot draw conclusions about differences between age groups (adult vs adolescent) or systems, or about differences based on pre-study glycemic control and HCL attitudes. Further investigation of potential differences in HCL acceptance by age and glycemic control is needed. Also, we interviewed participants who had experience with two particular HCL systems as part of a research trial during which they were closely monitored by research staff, and system use was relatively short. Results may not generalize to all HCL or CL systems or to larger scale, longer term, real-world system use. The algorithm tested was both conservative in its insulin dosing and adaptive to the user; however, 4–5 days was not adequate time for the algorithm to adapt optimally. Long-term use may improve user perceptions of system performance and strengthen trust. While first impressions have been shown to influence trust in automated systems (Robinette et al., 2015), longitudinal studies will provide important insights about the trajectories of trust over time and about the process of regaining trust.

Our study is unique in being the first to report qualitatively on trust development among experienced HCL users and to develop a theoretical model relevant for users of an automated system. Findings have implications for future research on development and acceptance of HCL technology. It will be important for HCL and CL trials to assess human factors, such as user experience, and individual characteristics (e.g. age, diabetes duration, technology attitudes, diabetes self-efficacy) that may influence trust and technology acceptance.

Funding

The author(s) disclosed receipt of the following financial support for the research, authorship, and/or publication of this article: This work was supported by a grant from the Juvenile Diabetes Research Foundation. Researchers at Yale were also supported by the Michael D. Ryan & Rosemary McNicholas Ryan Pediatric Diabetes Research Fund.

The author(s) declared the following potential conflicts of interest with respect to the research, authorship, and/or publication of this article: S.A.W. receives research support from and serves as consultant to Medtronic and has served on an advisory board for Insulet. D.M.M. is on the advisory board for Insulet, and his institution has received research funding from Medtronic and DexCom. B.A.B. is on medical advisory boards for Sanofi, Novo-Nordisk, BD, UnoMedical, and Medtronic and has received research grant and/or material support from Medtronic, DexCom, LifeScan, Roche, Bayer, UnoMedical, and Tandem. T.T.L. has received honoraria from Medtronic. K.K.H. has served as a consultant to Bigfoot Biomedical and received research support from DexCom.

Footnotes

Declaration of conflicting interests

M.L.T., E.I., S.J.H., S.C.S., and D.N. report no potential conflicts of interest relevant to this article.

References

1. Barnard KD, Pinsker JE, Oliver N, et al. (2015) Future artificial pancreas technology for type 1 diabetes: What do users want? Diabetes Technology & Therapeutics 17: 311–315.
2. Barnard KD, Wysocki T, Allen JM, et al. (2014) Closing the loop overnight at home setting: Psychosocial impact for adolescents with type 1 diabetes and their parents. BMJ Open Diabetes Research & Care 2: e000025.
3. Bevier WC, Fuller SM, Fuller RP, et al. (2014) Artificial pancreas (AP) clinical trial participants’ acceptance of future AP technology. Diabetes Technology & Therapeutics 16: 590–595.
4. Carlson MS, Desai M, Drury JL, et al. (2014) Identifying factors that influence trust in automated cars and medical diagnosis systems. In: Proceedings of the AAAI symposium on the intersection of robust intelligence and trust in autonomous systems, 24–26 March. Palo Alto, CA: AAAI.
5. Chamberlain J and Dana Dopita R (2013) Persistence of continuous glucose monitoring use in a community setting 1 year after purchase. Clinical Diabetes 31: 106–109.
6. Elleri D, Acerini CL, Allen JM, et al. (2010) Parental attitudes towards overnight closed-loop glucose control in children with type 1 diabetes. Diabetes Technology & Therapeutics 12: 35–39.
7. Floyd MW, Drinkwater M and Aha DW (2014) How much do you trust me? Learning a case-based model of inverse trust. In: Lamontagne L and Plaza E (eds) Proceedings of the twenty-second international conference on case-based reasoning, Cork, 29 September–1 October, pp. 125–139. Cham: Springer.
8. Gao F, Clare AS, Macbeth JC, et al. (2013) Modeling the impact of operator trust on performance in multiple robot control. In: Proceedings of the AAAI spring symposium: Trust and autonomous systems, Stanford, CA, 25–27 March.
9. Gonder-Frederick L, Shepard J and Peterson N (2011) Closed-loop glucose control: Psychological and behavioral considerations. Journal of Diabetes Science and Technology 5: 1387–1395.
10. Gordon S (2016) FDA approves first “artificial pancreas” for type 1 diabetes. CBS News, 28 September. Available at: http://www.cbsnews.com/news/fda-approves-first-artificial-pan-creas-type-1-diabetes-medtronic-minimed-670g/
11. Heitmeyer CL and Leonard EI (2015) Obtaining trust in autonomous systems: Tools for formal model synthesis and validation. In: Proceedings of the third FME workshop on formal methods in software engineering, Florence, 18 May. New York: IEEE, pp. 54–60.
12. Helldin T, Falkman G, Riveiro M, et al. (2013) Presenting system uncertainty in automotive UIs for supporting trust calibration in autonomous driving. In: Proceedings of the 5th international conference on automotive user interfaces and interactive vehicular applications, Eindhoven, 28–30 October. New York: ACM, pp. 210–217.
13. Holden RJ and Karsh BT (2010) The technology acceptance model: Its past and its future in health care. Journal of Biomedical Informatics 43: 159–172.
14. Lee JD and See KA (2004) Trust in automation: Designing for appropriate reliance. Human Factors 46: 50–80.
15. Ly TT and Buckingham BA (2015) Technology and type 1 diabetes: Closed-loop therapies. Current Pediatrics Reports 3: 170–176.
16. Ly TT, Weinzimer SA, Maahs DM, et al. (2016) Automated hybrid closed-loop control with a proportional-integral-derivative based system in adolescents and adults with type 1 diabetes: Individualizing settings for optimal performance. Pediatric Diabetes.
17. Parasuraman R and Riley V (1997) Humans and automation: Use, misuse, disuse, abuse. Human Factors 39: 230–253.
18. Polonsky WH and Hessler D (2015) Perceived accuracy in continuous glucose monitoring: Understanding the impact on patients. Journal of Diabetes Science and Technology 9: 339–341.
19. Pope C and Mays N (2000) Qualitative Research in Health Care. London: BMJ Books.
20. QSR International Pty Ltd (2014) NVivo qualitative data analysis software (Version 10).
21. Ramchandani N, Arya S, Ten S, et al. (2011) Real-life utilization of real-time continuous glucose monitoring: The complete picture. Journal of Diabetes Science and Technology 5: 860–870.
22. Robinette P, Wagner AR and Howard AM (2015) The Effect of Robot Performance on Human-Robot Trust in Time-Critical Situations. Atlanta, GA: Georgia Institute of Technology.
23. Russell SJ, El-Khatib FH, Sinha M, et al. (2014) Outpatient glycemic control with a bionic pancreas in type 1 diabetes. New England Journal of Medicine 371: 313–325.
24. Russell SJ, Hillard MA, Balliro C, et al. (2016) Day and night glycaemic control with a bionic pancreas versus conventional insulin pump therapy in preadolescent children with type 1 diabetes: A randomised crossover trial. Lancet Diabetes & Endocrinology 4: 233–243.
25. Shepard JA, Gonder-Frederick L, Vajda K, et al. (2012) Patient perspectives on personalized glucose advisory systems for type 1 diabetes management. Diabetes Technology & Therapeutics 14: 858–861.
26. Thabit H, Elleri D, Leelarathna L, et al. (2015a) Unsupervised overnight closed loop insulin delivery during free living: Analysis of randomised cross-over home studies in adults and adolescents with type 1 diabetes. The Lancet 385: S96.
27. Thabit H, Tauschmann M, Allen JM, et al. (2015b) Home use of an artificial beta cell in type 1 diabetes. New England Journal of Medicine 373: 2129–2140.
28. Van Bon AC, Brouwer TB, von Basum G, et al. (2011) Future acceptance of an artificial pancreas in adults with type 1 diabetes. Diabetes Technology & Therapeutics 13: 731–736.
29. Van Bon AC, Kohinor MJ, Hoekstra JB, et al. (2010) Patients’ perception and future acceptance of an artificial pancreas. Journal of Diabetes Science and Technology 4: 596–602.
30. Wagner AR (2009) The Role of Trust and Relationships in Human-Robot Social Interaction. Atlanta, GA: Georgia Institute of Technology.
31. Waytz A, Heafner J and Epley N (2014) The mind in the machine: Anthropomorphism increases trust in an autonomous vehicle. Journal of Experimental Social Psychology 52: 113–117.
