ABSTRACT
Rationale
Awareness of their standing relative to best practices motivates primary healthcare (PHC) teams to improve their practices. However, gathering the data necessary to create such a portrait is a challenge. An effective way to support practice improvement is therefore to simplify the availability of data portraying aspects of PHC teams' practices that might need improvement. Timely access is one of the foremost challenges of PHC, yet very few tools supporting reflection on the implementation of best practices to improve access are available to PHC teams.
Aims and Objectives
To develop an online reflective tool that evaluates the state of a PHC team member's advanced access practice and formulates customized recommendations for improvement.
Methods
This sequential multimethod study was informed by a literature review and an expert panel composed of researchers, patients, provincial and local decision‐makers, and PHC clinical and administrative staff in the province of Quebec, Canada. Consensus was reached on the content of the questionnaire and the prioritization of the recommendations.
Results
No reflective tool on advanced access practices was found in the literature review. Grey literature was used to create an initial version of the questionnaire. This version was revised and enriched through consultation phases with the expert panel. Then, five iterations of the tool were tested with 169 PHC team members, which led to the development of two distinct versions: one for clinical staff and one for administrative agents responsible for appointment booking. The final versions of the reflective tool are available online in both English and French.
Conclusion
This reflective tool provides a portrait of PHC team members' advanced access practices as well as an automated report that contains personalized and prioritized recommendations for improvement. Further developments are necessary for its optimal use among PHC professionals other than physicians and nurse practitioners.
Keywords: access to care, Canada, primary health care, quality improvement, questionnaire design, self‐assessment, survey
1. Introduction
Ongoing scientific and technological advances require healthcare teams to adapt continually [1]. An effective way to engage teams in a reflection process is by demonstrating the gap between their current performance and best practices. Such a reflection on their practice can be powerful for inspiring the desire to improve while helping them define concrete goals that will lead the way to action [2, 3, 4, 5].
Assessing the gap between current and best practices requires external data to provide a picture of current practices [2, 3], as personal reflections on the state of one's practice may be prone to biases. However, accessing data to create portraits of primary healthcare (PHC) practices is a challenge, as the data are often limited and difficult to obtain in a user‐friendly format [4, 6]. Thus, an effective way to support the continuous improvement of PHC teams' practices involves simplifying the availability of data portraying potential areas of improvement.
Timely access—providing care when patients need it—is an essential component of efficient PHC, but it is also one of the foremost challenges of PHC in several countries [7, 8]. The advanced access (AA) model, conceived in the United States in the early 2000s, was developed to improve access and continuity in PHC. Its effectiveness has been demonstrated in various healthcare settings [9, 10, 11], and it has been endorsed by the US Institute for Healthcare Improvement as well as the Royal College of General Practitioners in the United Kingdom [12]. The model was revised in 2021 to better reflect interdisciplinary practice and the use of electronic medical records in PHC practices [13]. The revised model is based on five pillars: (1) comprehensive planning for needs, supply and recurring variation; (2) regular adjustment of supply to demand; (3) processes of appointment booking and scheduling; (4) integration and optimization of collaborative practice; and (5) communication about AA and its functionalities.
Since 2012, the AA model has been actively promoted by various Canadian medical associations and organizations, including the College of Family Physicians of Canada. In Quebec, the Association of Family Physicians of Quebec and the Ministry of Health and Social Services have also actively promoted the model. To support AA implementation in Quebec over the last decade, free online and face‐to‐face training sessions, as well as implementation guides, have been offered by various organizations, including the Association of Family Physicians of Quebec [14, 15, 16, 17]. These guides explain AA to the various stakeholders and present the main strategies to implement it. However, they offer little support to ensure the continuity of implementation over time. Recent data show that these efforts have led to widespread but incomplete implementation of the key principles of AA across the province, which varies considerably between clinics and between individuals within the same clinic [18, 19].
Improved implementation of the different pillars of AA requires increased engagement of PHC team members in the process. One way to contribute to their engagement is to encourage them to reflect on the gaps between their current AA practice and recommended AA practices. However, to our knowledge, there is no existing tool that provides data to guide such a reflection process [18]. The objective of this study was to develop and evaluate an online reflective tool on Advanced Access (Outil réflexif sur l'accès adapté; ORAA) for all PHC team members that formulates personalized recommendations for AA improvement through an automated report generated for each respondent.
2. Method
This project was embedded in a larger study whose first objective was to actualize and operationalize the AA model developed in 2002 by Murray and collaborators [13, 20]. The resulting revised model was then used as the basis for the conception of the reflective tool in the present study. The current study used a sequential multiphase consultation process [21], divided into three phases: (1) a literature review on AA implementation tools and guidelines; (2) development of the reflective tool with a panel of AA experts; and (3) evaluation of the tool. Figure 1 summarizes the study process.
Figure 1.
Flow diagram of tool development. AA, advanced access; FP, family physician; NP, nurse practitioner; ORAA, Reflexive Tool on Advanced Access (Outil réflexif sur l'accès adapté); PHC, primary healthcare.


2.1. Phase 1: Literature Review on AA Implementation Tools and Guidelines
A literature review was conducted on existing AA implementation tools and guidelines for PHC clinical staff and administrative agents to develop an initial, yet comprehensive version of the reflective tool. We searched ‘assessment’, ‘reflective’ and ‘tools’ in combination with terms such as ‘advanced access’, ‘open access’, ‘same‐day scheduling’, ‘timely access’ and ‘AA implementation’ in MEDLINE, CINAHL, HealthSTAR and PsycINFO for scientific peer‐reviewed studies published between 2001 and 2022 (Appendix S1). Grey literature, such as reports or AA implementation guides, was also reviewed. English and French publications were considered. Two analysts conducted independent screenings of titles and abstracts, followed by full reviews of the retrieved references. Any disagreements were resolved through discussion. The data extraction grid allowed for the compilation of tool items (and their formulation) operationalizing expected behaviours while practicing under the AA model. Data extraction followed the same dual‐review process as the publication retrieval process.
2.2. Phase 2: Development of the Tool
The development of the tool involved generating two deliverables: a questionnaire to identify the AA strategies implemented by each respondent and an algorithm to identify the optimal recommendations to support each respondent's AA implementation and improvement efforts. These deliverables were developed through an iterative process involving two rounds of consultation via an e‐survey and two virtual meetings. See Figure 1 for a summary of the study processes.
Key stakeholders in AA—provincial and local decision‐makers, family physicians, nurse practitioners, registered nurses, quality improvement coaches, project managers, administrative agents responsible for appointment booking, patients and researchers working in the field of AA—were invited by the research team to participate in the panel. To be considered an expert, all participants, except for patients, had to have at least 5 years of experience with AA, concrete experience in AA implementation and improvement, and to speak French. A list of experts was shared with key informants, then purposive and snowballing recruitment techniques were used to complete the list. Principal investigators contacted potential experts by email to invite them to participate in the project. A panel of between 25 and 30 experts was targeted to ensure the development of productive group dynamics and a diverse range of perspectives and to maximize the likelihood of reaching consensus among experts [22]. Patients were recruited from a patient committee (n = 4) involved in another AA‐related project led by the research team [23]. All patients were retired and each was attached to a different PHC clinic. Patients also had different health profiles; some had more health issues than others, which influenced their needs in terms of access to care and services. Table 1 presents the composition of the panel.
Table 1.
Composition of the advanced access expert panel.
| Role regarding advanced access (not exclusive) | No. (%) of experts (n = 31) |
|---|---|
| Researchers | 4 (13) |
| Patients | 4 (13) |
| Decision‐makers | |
| Provincial | 2 (6) |
| Regional | 2 (6) |
| Clinical and administrative staff | |
| Family physicians | 9 (29) |
| Nurse practitioners | 3 (10) |
| Registered nurses | 2 (6) |
| Quality improvement coaches | 2 (6) |
| Administrative agents responsible for appointment booking | 2 (6) |
| Project managers | 2 (6) |
2.2.1. Round 1
An initial version of the questionnaire aimed at all PHC team members (family physicians, nurse practitioners, registered nurses, administrative agents responsible for appointment booking and other PHC professionals) was designed by the research team based on the revised AA model [13] and the results of the literature review. In July 2020, all panelists, except for patients, received a personalized link to an e‐survey [24, 25, 26] via the SurveyMonkey platform. They were asked to rate the importance of suggested strategies to achieve full implementation of the AA model on a unipolar five‐point Likert scale (Not important at all to Essential). In this first round, panelists were also asked whether they agreed with the response scale chosen for each item and, if not, to suggest an alternative. Three criteria were established to determine when consensus was reached on an item or a response scale: 75% or more of responses with a high importance rating (4–5 on the Likert scale), a median of 4 or more, and an interquartile range of 0 or 1. Items lacking consensus were kept for further reflection or clarification in the subsequent round.
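The three consensus criteria can be expressed compactly. The following sketch is illustrative only (the ratings shown are invented, and the quantile interpolation used by Python's standard library may differ slightly from how a panel would compute the interquartile range by hand):

```python
import statistics

def consensus_reached(ratings):
    """Apply the three Round 1 consensus criteria to one item's ratings
    on the unipolar scale (1 = Not important at all ... 5 = Essential)."""
    high_share = sum(r >= 4 for r in ratings) / len(ratings)  # criterion 1: >= 75% rate 4-5
    q1, median, q3 = statistics.quantiles(ratings, n=4)       # default 'exclusive' method
    # criterion 2: median of 4 or more; criterion 3: IQR of 0 or 1
    return high_share >= 0.75 and median >= 4 and (q3 - q1) <= 1

# Hypothetical ratings from 10 panelists for one item
print(consensus_reached([5, 4, 4, 5, 4, 4, 5, 4, 3, 5]))  # True
```

An item failing any one criterion, for example one with polarized ratings and a wide interquartile range, would be carried forward to the second round.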
2.2.2. Round 2
In the second round in November 2020, the results of the first round regarding the level of importance of items for which no consensus was reached were presented to panelists. They were asked whether they wished to confirm or modify their initial responses. The final consensus on each item was then evaluated using the same criteria as in the first round, and the formulation of some items was modified considering any comments received during both rounds.
2.2.3. Validation Meetings
The first validation meeting was held online, due to the pandemic, in February 2021 and attended by 27 experts. Its aim was to review the elements needed to design the algorithm that would produce personalized recommendations. The goal of the algorithm is to compare each respondent's current situation, as portrayed by their answers to the questionnaire, with the most impactful AA implementation strategies, then suggest strategies to implement as a matter of priority—that is, the recommendations. Prioritizing strategies to be implemented helps maintain respondent motivation by proposing a limited number of changes that have the potential to have the greatest impact [27].
The validation meeting was carried out in two stages. First, panelists were divided into sub‐groups, which took turns focusing on each of the five pillars. For each pillar, a member of the research team acted as a facilitator, presenting the group with an assessment of the estimated level of impact of each strategy on timely access. This assessment was made by the research team based on the results of the literature review and the e‐survey. Panelists were invited to comment on and review this assessment. Second, panelists were presented with a list of recommendations aimed at facilitating the implementation or improvement of each strategy. This list was based on the results of the literature review and the e‐survey. Panelists were engaged in a carousel activity that allowed them to move sequentially between stations, comment on and discuss topics and review the list with the help of a facilitator [28].
The second validation meeting was held with the four patients on the panel and focused on their access experience to validate the pre‐test version of the questionnaire and ensure that items deemed important from the patient point of view were not overlooked.
2.3. Phase 3: Testing of the Tool
Using the SurveyMonkey platform, pre‐tests of the questionnaire were conducted with 169 PHC clinical staff and administrative agents not involved in the panel of experts. These individuals were involved in a quality improvement initiative [23] and worked in eight PHC clinics within three regions of the province of Quebec. This iterative improvement process was conducted between December 2020 and April 2022. Participants in this pre‐test were invited to comment on the tool in writing after its completion. In addition, to identify flow problems during completion or items that were unclear or needed to be reformulated, we purposefully selected 21 individuals for cognitive interviews based on their role within the AA model [29]: eight administrative agents responsible for appointment booking, seven family physicians, three nurse practitioners, one registered nurse, one psychologist and one social worker. Cognitive interviews aim to evaluate sources of response error in a questionnaire. To do so, a ‘think‐aloud’ interviewing technique was used, in which participants are encouraged to verbalize their thoughts as they read and respond to each question. The interviewer watched for any signs of confusion, hesitation or misinterpretation and noted suggestions to improve the quality and relevance of the questions. Follow‐up questions or exploration of alternative wording and phrasing were also used when necessary. The report format and the relevance of the recommendations received in the personal report were also assessed.
2.4. Ethics Approval
This study was approved by the Scientific Research Committee (an independent committee working alongside the research ethics board under the umbrella of the Centre intégré de santé et de services sociaux de la Montérégie‐Centre), and ethics approval was provided by the research ethics board of the Centre intégré de santé et de services sociaux de la Montérégie‐Centre (2020‐441, CP 980475). Informed consent for piloting of the tool by clinical staff and administrative agents was gathered as part of #MP‐04‐2020‐410.
3. Results
3.1. Phase 1: Literature Review on AA Implementation Tools and Guidelines
We identified 3378 citations in the electronic database search. After elimination of duplicates, 3333 titles were screened, and 38 abstracts and 12 grey literature resources were reviewed. Only the 12 grey literature resources were retained for inclusion in the study (Figure 2). Of these 12 resources, 9 were AA implementation guides and 3 were websites that offer resources on AA implementation. No reflective tool on AA practice was found. The 12 resources found were combined with the revised model to develop the first draft of the questionnaire, the initial prioritization of each strategy and the list of recommendations.
Figure 2.

Flow diagram of resource selection. AA, advanced access.
3.2. Phase 2: Development of the Tool
Building on the literature review results, each sub‐pillar of the revised AA model was distilled by the research team into one or more items depending on its complexity. The fifth pillar, ‘Communication about AA and its functionalities’, was excluded from the development of the questionnaire, as it does not include processes that can be directly evaluated in a reflective manner by PHC team members. This first draft of the questionnaire was then submitted to the expert panel through the e‐survey, and its content was presented to the patient partners at the second validation meeting. Both e‐survey rounds and the first validation meeting were held with all members of the expert panel, except for patients, who participated only in a separate validation meeting.
Of the 39 items or sub‐items presented in the first round of the e‐survey, consensus was reached on 31 items being either ‘Essential’ or ‘Very important’. The remaining eight items were resubmitted to the panel during the second round. Respondents were then asked to decide whether they wished to confirm or modify their first‐round responses in light of the results obtained for each item. Consensus was reached on seven items or sub‐items after this second round. Consensus was nearly reached on the final item, ‘My patients can book an appointment with me online or by email’, with 71% of respondents considering it either ‘Essential’ or ‘Very important’. However, because the patient partners considered this item as ‘Very important’ to ‘Essential’, it was decided to include it in the questionnaire but to assign it a lower priority level for AA implementation. Otherwise, patients approved the rest of the questionnaire. Table 2 shows the development path of the final items of the questionnaire.
Table 2.
Development path of final items—clinical staff version.
| Pillar | Sub‐pillar | Item content— final version | Sub‐item content—final version |
|---|---|---|---|
Abbreviation: EMR, electronic medical record.
Items for the administrative agents version of the tool.
The validation meeting was then held with the aim of enriching and validating two elements necessary for the conception of the algorithm: (1) the assessment of each item's potential impact on AA implementation and (2) a list of recommendations related to each item. The results of this consultation provided the research team with the material necessary to create nonautomated customized reports as a first step, followed by development of the algorithm.
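The logic of the algorithm, comparing a respondent's answers against the expert‐rated impact of each strategy and surfacing a limited set of prioritized recommendations, can be sketched as follows. This is a minimal illustration, not the actual ORAA implementation: the item texts, impact levels, colour labels and cap on the number of recommendations are all invented for the example.

```python
from dataclasses import dataclass

@dataclass
class Item:
    pillar: int
    text: str
    impact: int            # expert-rated impact on timely access (3 = high ... 1 = low)
    recommendation: str    # recommendation shown when the strategy is not in place

# Illustrative stand-in for the colour chart used in the automated reports
PRIORITY_COLOURS = {3: "red", 2: "yellow", 1: "green"}

def build_report(items, answers, max_recommendations=5):
    """Return prioritized (recommendation, colour) pairs for strategies the
    respondent has not implemented. `answers` maps item text -> True/False."""
    gaps = [i for i in items if not answers.get(i.text, False)]
    gaps.sort(key=lambda i: i.impact, reverse=True)   # most impactful gaps first
    return [(i.recommendation, PRIORITY_COLOURS[i.impact])
            for i in gaps[:max_recommendations]]

# Hypothetical items and respondent answers
items = [
    Item(2, "I regularly review supply and demand data", 3,
         "Set up a monthly review of appointment supply and demand."),
    Item(3, "Patients can book an appointment online", 1,
         "Consider offering online appointment booking."),
]
report = build_report(items, {"Patients can book an appointment online": True})
```

Capping the number of recommendations mirrors the rationale given above: a short list of high‐impact changes is more likely to sustain respondent motivation than an exhaustive one.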
3.3. Phase 3: Evaluating the Tool
Once the content of the questionnaire was consolidated, five iterations were piloted within eight PHC clinics. In addition to the written comments of participants, the completion rate—the ratio of the number of completed questionnaires to the number of questionnaires started—was considered as an indicator of the relevance, ease of completion and appropriateness of the questionnaire. The number of missing answers was also used as an indicator of the degree of complexity or vagueness of each item. After each round of piloting, the research team met to discuss these issues and address them by modifying the questionnaire. In successive versions of the questionnaire, adjustments were made to simplify its completion. This included the addition of umbrella questions to avoid overly complex response options. When needed, a sub‐question providing further clarification was added only for those who responded positively to the umbrella question. Minor changes were also made to certain terms or phrases where there seemed to be recurring difficulties in understanding. Table 3 presents the evolution of testing and content of the tool.
Table 3.
Evolution of testing and content of the clinical staff version of the tool.
| Pilot test iteration | Number of items; sub‐items | Addition (+) or removal (−) of items (I) or sub‐items (SI) | Questionnaires completed: total (FP/NP/O) | Questionnaires with missing data (%) | Cognitive interviews: total (FP/NP/O) |
|---|---|---|---|---|---|
| 1 | 26; 13 | | 15 (15/0/0) | 20% | 1 (1/0/0) |
| 2 | 30; 22 | P2: +1 SI; P3: +2 I; P4: +2 I, +8 SI | 29 (24/3/2) | 17% | 0 (0/0/0) |
| 3 | 28; 13 | P4: −2 I, −9 SI | 15 (11/1/3) | 9% | 8 (4/1/3) |
| 4 | 28; 14 | P3: +1 SI | 7 (5/2/0) | 0% | 3 (2/1/0) |
| 5 | 29; 19 | | 67 (39/5/23) | 4% | 1 (0/1/0) |
Abbreviations: FP, family physician; NP, nurse practitioner; O, other clinical staff; P2, Pillar 2 of AA; P3, Pillar 3 of AA; P4, Pillar 4 of AA.
The testing phase also revealed the need to create a second version of the tool aimed specifically at administrative agents responsible for appointment booking, whose work was deemed too different from that of other PHC team members to be integrated into the same tool. The language of this shorter version was adapted, and it was developed iteratively starting from the second pilot test iteration. In total, it was tested by 36 administrative agents.
At the beginning of the testing process of both versions, nonautomated personalized reports, developed in collaboration with the research team members, were used as part of an overarching research project by quality improvement coaches to help guide the AA implementation process [23]. These coaches offered ongoing support and mentorship with data analysis, change management and implementation of changes related to the quality improvement process. Such responsibilities made them well positioned to test the tool and be aware of PHC team members' needs in terms of feedback. The algorithm was then developed with the aim of replicating this process while also standardizing it and increasing efficiency. A colour chart was integrated into the algorithm to indicate the priority of each recommendation (see Figure 3). The personalized automated reports it generated were then tested in the final rounds of the testing phase (iteration 3 onwards), which allowed the research team to refine the algorithm.
Figure 3.

Colour chart illustrating the priority of recommendations in automated reports.
Figure 4 shows an extract of a personalized automated report.
Figure 4.

Extract of a personalized automated report.
A total of 169 questionnaires and 21 cognitive interviews were completed during the evaluation phase of both versions. The interviews allowed the research team to explore the variety of working realities in greater depth and to adapt, simplify or clarify formulations and response choices accordingly in both versions. They also ensured items were understood as intended by all respondents, regardless of their role or clinic set‐up. The final version of the reflective tool on AA for clinical staff contains 29 items and 19 sub‐items. Five items are related to comprehensive planning for patients' needs, service delivery and recurring variations, 4 items to regular adjustment of supply to demand, 14 items to the appointment system and 6 items to the integration and optimization of collaborative practice. The final version for administrative agents responsible for appointment booking contains 15 items and 5 sub‐items. One item is related to regular adjustment of supply to demand, 10 items to the appointment system and 4 items to the integration and optimization of collaborative practice. Both versions of the tool are available through an online platform in English and French. Access can be requested by contacting the corresponding author.
4. Discussion
This study resulted in the conception of two versions of a reflective tool comprising a questionnaire that provides a portrait of the implementation of AA in a PHC team member's practice and an algorithm that provides an automated report with personalized recommendations to further implement AA. To our knowledge, no other reflective tool exists to support PHC teams in their ongoing AA implementation efforts, although a decade of initiatives to implement the model has resulted in implementation gaps and disappointing results in terms of timely access [18, 30].
Reflection on the gaps between current and best practices is an increasingly encouraged means of ensuring ongoing professional development [4]. The Collège des Médecins du Québec's requirement of spending a minimum of 10 h in practice evaluation activities per 5‐year reference period bears witness to this [31]. However, this reflection process is hampered by the difficulty of gathering external data portraying current practices, limiting the ability of PHC team members to accurately self‐assess their practices [4, 6, 32]. Thus, stand‐alone tools are needed to support the continuous reflective efforts of PHC team members. However, the development of such tools is resource intensive, as it requires multiple iterations through testing and refinement loops with many individuals as well as data acquisition for comparison purposes. The ORAA tool responds to this need by offering a rapid, user‐friendly and inexpensive means to regularly assess progress in implementing the AA model.
The availability of data on one's practice is important but not sufficient to induce change [4, 33, 34]. Various factors influence the impact of feedback on practices. A meta‐analysis found that feedback has greater impact when it is delivered frequently, is written rather than given orally, and when it includes information that helps the individual understand what must be changed to improve his or her performance [34]. The conception of ORAA was originally inspired by a reflective questionnaire on the Patient Medical Home, a model of PHC developed by the College of Family Physicians of Canada [35]. This questionnaire depicts how the Patient Medical Home model is implemented in the respondent's practice with the aim of supporting efforts to further improve its implementation. ORAA replicates this process, offering a complete picture of one's AA practice, while going a step further by offering personalized and actionable recommendations. ORAA delivers feedback that promotes change by being easily available for regular use and by providing written feedback composed of prioritized personalized recommendations for future actions. It can also be used as part of a more structured quality improvement process.
ORAA was designed using a rigorous method based on the scientific literature and consultations with a panel of experts. To date, it has been used by more than 1100 PHC team members in three Canadian provinces: Quebec, New Brunswick and Ontario [18]. The first version, for clinical staff, is particularly well‐suited to family physicians and nurse practitioners whose practices are organized around a pool of registered patients followed by an interprofessional PHC team. The second version is well‐suited for administrative agents responsible for appointment booking, who are key stakeholders in AA implementation. However, one limitation of the tool is its restricted applicability to the practice of other PHC professionals such as social workers, physiotherapists and pharmacists, especially those who do not have registered patients. The AA model currently provides a set of principles designed to enhance access to professionals who have affiliated patients, and it may need to be further adapted before the tool can be fully applied by interprofessional PHC teams. Another limitation of this tool is the limited involvement of patient partners in its development. To adequately capture patients' needs and experience of access, which is the ultimate product of an AA practice, we developed a comprehensive data collection tool on patient experience that is offered in a complementary manner to teams who choose to complete the ORAA. Results emerging from the use of this tool have been published elsewhere [36].
Engaging all staff in an AA implementation project has been shown to be an important factor in implementation success [37]. Also, because AA implementation requires a collective approach in addition to individual efforts [37], future research should focus on the development of an organizational summary of ORAA to reflect on items that could be improved by organizations rather than individuals (e.g., communication with patients, contingency plans in case of prolonged absences, etc.). Such a tool should present an aggregate picture of AA practices within a PHC organization, enabling managers to identify aspects of AA that are already well established as well as next steps to prioritize for further implementation.
5. Conclusion
This study contributes to the development of an innovative and much‐needed tool in the realm of reflections on access to PHC. ORAA addresses a persistent gap in assessing and improving the implementation of popular models to improve timely access. Unlike previous approaches, ORAA combines a comprehensive questionnaire with an algorithm that generates personalized recommendations, offering PHC team members—clinical and administrative—an available and cost‐effective means of reflecting on and enhancing their AA practices. For optimal results, ORAA should be used as an entry point in the process of further implementing AA and thus be combined with other measures such as a quality improvement process [23, 38]. Future work should focus on expanding its applicability to various PHC professionals and developing organizational‐level summaries to support collective AA implementation efforts within PHC organizations.
Conflicts of Interest
The authors declare no conflicts of interest.
Supporting information
Supporting information.
Acknowledgments
The authors would like to thank all participants who took part in this study, without whom it could not have been carried out. This research was made possible by a grant from the Ministry of Health and Social Services of Quebec. The funding organization was not involved in the design of the study or the writing of the protocol.
Data Availability Statement
The data that support the findings of this study are available on request from the corresponding author. The data are not publicly available due to privacy or ethical restrictions.
References
- 1. Vachon B., Désorcy B., Camirand M., et al., “Engaging Primary Care Practitioners in Quality Improvement: Making Explicit the Program Theory of an Interprofessional Education Intervention,” BMC Health Services Research 13, no. 1 (2013): 106.
- 2. Vachon B., Désorcy B., Gaboury I., et al., “Combining Administrative Data Feedback, Reflection and Action Planning to Engage Primary Care Professionals in Quality Improvement: Qualitative Assessment of Short Term Program Outcomes,” BMC Health Services Research 15 (2015): 391.
- 3. Ivers N. M., Tu K., Francis J., et al., “Feedback Gap: Study Protocol for a Cluster‐Randomized Trial of Goal Setting and Action Plans to Increase the Effectiveness of Audit and Feedback Interventions in Primary Care,” Implementation Science 5, no. 1 (2010): 98.
- 4. Sargeant J., Bruce D., and Campbell C. M., “Practicing Physicians' Needs for Assessment and Feedback as Part of Professional Development,” Journal of Continuing Education in the Health Professions 33 (2013): S54–S62.
- 5. Tremblay M. C., Garceau L., Thiab Diouf N., et al., “Improving Understanding of Reflexivity in Family Medicine: Development of an Educational Tool Based on a Rapid Review,” MedEdPublish 10 (2021): 181.
- 6. Wong S., Johnston S., Burge F., and McGrail K., “Value in Primary Healthcare—Measuring What Matters?,” HealthcarePapers 18, no. 4 (2019): 58–67.
- 7. Shah N., Latifovic L., Meaney C., et al., “Association Between Clinic‐Reported Third Next Available Appointment and Patient‐Reported Access to Primary Care,” JAMA Network Open 5, no. 12 (2022): e2246397.
- 8. Shahaed H., Glazier R. H., Anderson M., et al., “Primary Care for All: Lessons for Canada From Peer Countries With High Primary Care Attachment,” Canadian Medical Association Journal 195, no. 47 (2023): E1628–E1636.
- 9. Rose K. D., “Advanced Access Scheduling Outcomes: A Systematic Review,” Archives of Internal Medicine 171, no. 13 (2011): 1150–1159.
- 10. Hudec J. C., MacDougall S., and Rankin E., “Advanced Access Appointments,” Canadian Family Physician 56, no. 10 (2010): e361.
- 11. Fournier J., Heale R., and Rietze L., “‘I Can't Wait’: Advanced Access Decreases Wait Times in Primary Healthcare,” Healthcare Quarterly 15, no. 1 (2012): 64–68.
- 12. Hudon C., Luc M., Beaulieu M. C., et al., “Implementing Advanced Access to Primary Care in an Academic Family Medicine Network: Participatory Action Research,” Canadian Family Physician 65, no. 9 (2019): 641–647.
- 13. Breton M., Gaboury I., Beaulieu C., et al., “Revising the Advanced Access Model Pillars: A Multimethod Study,” CMAJ Open 10, no. 3 (2022): E799–E806.
- 14. Brazeau S., Couture P., and Karemere Bimana H., Guide pour l'implantation de l'accès adapté. L'expérience d'une région ‐ Laval (Laval: Centre intégré de santé et services sociaux de Laval, 2016).
- 15. Breton M., Gaboury I., Sasseville M., et al., “Development of a Self‐Reported Reflective Tool on Advanced Access to Support Primary Healthcare Providers: Study Protocol of a Mixed‐Method Research Design Using an e‐Delphi Survey,” BMJ Open 11, no. 11 (2021): e046411.
- 16. Health Quality Ontario, Quality Improvement e‐Learning Modules: Timely Access to Primary Care (HQO, 2020).
- 17. Centre de santé et de services sociaux du Bas St‐Laurent, Outils, zone professionnelle, Accès adapté (CIUSSS BSL, 2020).
- 18. Breton M., Gaboury I., Beaulieu C., Deville‐Stoetzel N., and Martin E., “Ten Years Later: A Portrait of the Implementation of the Advanced Access Model in Quebec,” Healthcare Management Forum 36, no. 5 (2023): 317–321.
- 19. Breton M., Deville‐Stoetzel N., Gaboury I., et al., “Comparing the Implementation of Advanced Access Strategies Among Primary Health Care Providers,” Journal of Interprofessional Care 38, no. 2 (2023): 209–219, 10.1080/13561820.2023.2173157.
- 20. Murray M. and Berwick D. M., “Advanced Access: Reducing Waiting and Delays in Primary Care,” JAMA 289, no. 8 (2003): 1035–1040.
- 21. Creswell J. and Plano Clark V., Designing and Conducting Mixed Methods Research, 2nd ed. (CA: SAGE, 2011).
- 22. Veugelers R., Gaakeer M. I., Patka P., and Huijsman R., “Improving Design Choices in Delphi Studies in Medicine: The Case of an Exemplary Physician Multi‐Round Panel Study With 100% Response,” BMC Medical Research Methodology 20, no. 1 (2020): 156.
- 23. Gaboury I., Breton M., Perreault K., et al., “Interprofessional Advanced Access—A Quality Improvement Protocol for Expanding Access to Primary Care Services,” BMC Health Services Research 21, no. 1 (2021): 812.
- 24. Jünger S., Payne S. A., Brine J., Radbruch L., and Brearley S. G., “Guidance on Conducting and REporting DElphi Studies (CREDES) in Palliative Care: Recommendations Based on a Methodological Systematic Review,” Palliative Medicine 31, no. 8 (2017): 684–706.
- 25. Falzarano M. and Pinto Zipp G., “Seeking Consensus Through the Use of the Delphi Technique in Health Sciences Research,” Journal of Allied Health 42, no. 2 (2013): 99–105.
- 26. Hsu C.‐C. and Sandford B. A., “The Delphi Technique: Making Sense of Consensus,” Practical Assessment, Research, and Evaluation 12 (2019): 1–8.
- 27. Rogers E. M., Diffusion of Innovations, 5th ed. (New York: Free Press, 2003), 576.
- 28. Kagan M., Kagan Cooperative Learning (2015).
- 29. Willis G., Analysis of the Cognitive Interview in Questionnaire Design (Oxford University Press, 2015).
- 30. Canadian Institute for Health Information, How Canada Compares: Results From the Commonwealth Fund's 2020 International Health Policy Survey of the General Population in 11 Countries (CIHI, 2021).
- 31. Collège des médecins du Québec, “Formation continue obligatoire des médecins,” July 1, 2024, https://www.cmq.org/fr/pratiquer-la-medecine/formation-continue/formation-continue-obligatoire-des-medecins.
- 32. Davis D. A., Mazmanian P. E., Fordis M., Van Harrison R., Thorpe K. E., and Perrier L., “Accuracy of Physician Self‐Assessment Compared With Observed Measures of Competence: A Systematic Review,” JAMA 296, no. 9 (2006): 1094–1102.
- 33. Ivers N., Jamtvedt G., Flottorp S., et al., “Audit and Feedback: Effects on Professional Practice and Healthcare Outcomes,” Cochrane Database of Systematic Reviews 2012, no. 6 (2012): CD000259.
- 34. Hysong S. J., “Meta‐Analysis: Audit and Feedback Features Impact Effectiveness on Care Quality,” Medical Care 47, no. 3 (2009): 356–363.
- 35. College of Family Physicians of Canada, “The PMH Questionnaire,” https://patientsmedicalhome.ca/self-assess/.
- 36. Deville‐Stoetzel N., Gaboury I., Haggerty J., and Breton M., “Patients Living With Social Vulnerabilities Experience Reduced Access at Team‐Based Primary Healthcare Clinics,” Healthcare Policy 18, no. 4 (2023): 89–105.
- 37. Abou Malham S., Touati N., Maillet L., Gaboury I., Loignon C., and Breton M., “What Are the Factors Influencing Implementation of Advanced Access in Family Medicine Units? A Cross‐Case Comparison of Four Early Adopters in Quebec,” International Journal of Family Medicine 2017 (2017): 1595406.
- 38. Breton M., Gaboury I., Martin E., et al., “Impact of Externally Facilitated Continuous Quality Improvement Cohorts on Advanced Access to Support Primary Healthcare Teams: Protocol for a Quasi‐Randomized Cluster Trial,” BMC Primary Care 24, no. 1 (2023): 97.
