Osteoarthritis and Cartilage Open
2020 Nov 4;2(4):100115. doi: 10.1016/j.ocarto.2020.100115

Validation of Canadian health administrative data algorithms for estimating trends in the incidence and prevalence of osteoarthritis

Jessica Widdifield a,b,c, R Liisa Jaakkimainen a,b,c,d, Jodi M Gatley b, Gillian A Hawker b,e,f, Lisa M Lix g, Sasha Bernatsky h, Bheeshma Ravi a, David Wasserstein a, Bing Yu b, Karen Tu c,i,j
PMCID: PMC9718092  PMID: 36474895

Summary

Objective

To estimate the 1) accuracy of algorithms for identifying osteoarthritis (OA) using health administrative data; and 2) population-level OA prevalence and incidence over time in Ontario, Canada.

Method

We performed a retrospective chart abstraction study to identify OA patients in a random sample of 7500 primary care patients from electronic medical records. The validation sample was linked with several administrative data sources. Accuracy of administrative data algorithms for identifying OA was tested against two reference standard definitions by estimating the sensitivity, specificity and predictive values. The validated algorithms were then applied to the Ontario population to estimate and compare population-level prevalence and incidence from 2000 to 2017.

Results

OA prevalence within the validation sample ranged from 10% to 23% across the two reference standards. Algorithms varied in accuracy depending on the reference standard, with the sensitivity highest (77%) for patients with OA documented in medical problem lists. Using the top-performing administrative data algorithms, the crude population-level OA prevalence ranged from 11% to 25% and standardized prevalence ranged from 9% to 22% in 2017. Over time, prevalence increased whereas incidence remained stable (~1% annually).

Conclusion

Health administrative data have limited sensitivity for identifying all OA patients, and appear to be more sensitive at detecting OA patients whose physician formally documented the diagnosis in a medical problem list than individuals whose diagnosis is documented outside of problem lists. Irrespective of the algorithm used, OA prevalence has increased over the past decade while annual incidence has been stable.

Keywords: Health administrative data, Epidemiology, Osteoarthritis, Electronic medical records, Incidence and prevalence, Validation

1. Introduction

Osteoarthritis (OA) is the most common form of arthritis [1], but incidence and prevalence estimates vary across studies. Various case definitions of OA (symptomatic, radiographic, self-reported, or doctor-diagnosed) [[2], [3], [4], [5], [6]] have been used across different clinical studies. Similarly, epidemiological studies using health administrative and electronic medical record (EMR) data have used various OA algorithms [[7], [8], [9], [10], [11], [12], [13], [14], [15]], which may contribute to varying disease estimates across studies.

Health administrative data are generated at every patient encounter within a health care system, whether through a physician visit, a diagnostic or therapeutic procedure, a hospital admission, or receiving/filling prescription medication [16]. There is uncertainty surrounding the optimal approach to identifying OA in administrative data, as algorithms have been shown to have poor sensitivity and low positive predictive values (PPV) [[17], [18], [19], [20]]. This suboptimal performance may arise because OA care is not prioritized during physician encounters enough to warrant assignment of a diagnosis or a diagnosis code [6], or because imperfect reference standards were used during the validation of algorithms.

The key aspect that differentiates health administrative data from EMRs is that the latter contain more detailed clinical data comprising a mix of coded (structured) and free-text data. As the coded data in EMRs (such as diagnoses in problem lists) are much more readily accessible to researchers, clinical diagnoses embedded in free-text data are often missed. Ascertaining patient populations using coded data like problem lists may therefore be imperfect, as data completeness varies substantially across providers [[21], [22], [23]]. Thus, when the case identification process is confined to coded EMR data alone, classification errors in disease status can be significant [[24], [25], [26]] and can considerably bias study findings [27].

In an attempt to resolve these issues, we conducted a detailed review of EMRs to identify OA patients in order to test the accuracy of administrative data OA algorithms against two different definitions for classifying OA patients (based on the location of the diagnosis within the EMR). We then applied the top-performing algorithms to the entire Ontario population to compare the annual estimates of OA incidence and prevalence from 2000 to 2017.

2. Method

Setting and design. Ontario residents are covered by single-payer, public health insurance that covers hospital care and physicians’ services. We first conducted a retrospective chart abstraction study on a random sample of patients from primary care EMRs to identify patients with a physician-documented OA diagnosis. This validation sample was used to construct two different reference standard definitions, which varied based on where an OA diagnosis was documented in the EMR. We tested multiple administrative data algorithms against both reference standards. Subsequently, we conducted a population-based study using the top-performing validated algorithms to assess annual OA incidence and prevalence.

Validation cohort selection. The validation sample was derived from the Electronic Medical Record Primary Care Database. The patient profile of this primary care population closely approximates that of the general population in terms of demographics and disease burden [28], making these data a preferred source for validation studies [[29], [30], [31], [32]].

The database contains a full-chart extract of all clinical data from a patient's EMR, including cumulative patient profiles with medical problem lists, progress notes, laboratory and diagnostic test results, prescriptions, specialist consultation letters and hospital discharge summaries. The comprehensiveness of the data has been evaluated, and all data go through quality assessments after collection and before research use [28,33]. These assessments include verifying that individual patient-level and practice-level data entry fields meet data quality thresholds. At the time of the study, the database comprised 400 primary care physicians distributed across Ontario who use the PS (Practice Solutions) Suite EMR. The database consisted of 443,253 unique patients, of whom 350,863 met potential eligibility criteria for this study.

Patient inclusion criteria required a valid health insurance number and date of birth, one or more physician visits in the past year, and enrolment in the physician's practice (to allow for sufficient clinical information to verify disease status). Physician inclusion criteria required at least two years of data for their individual practice population, to ensure the EMR was adequately populated. Among 73,014 eligible patients from 83 primary care physicians meeting both patient and physician eligibility criteria, we randomly sampled 7500 patients aged 20 years and older.

Data abstraction. For each of the 7500 patients, the complete EMR was reviewed by one of five trained chart abstractors, who examined every entry to identify any documented diagnosis of OA (at any joint) using a standardized data abstraction tool. Abstractors coded each patient encounter (including progress notes, consultation letters and imaging reports) as to whether or not a diagnosis of OA was documented; a diagnosis was not inferred if none was explicitly stated. Abstractors also coded whether the OA diagnosis was documented in the medical problem list and/or in any other location within the EMR, including free-text portions (i.e. progress notes, consultation letters).

To assess the intra- and inter-rater reliability of abstraction, an initial 10% sample was abstracted a second time by the same abstractor and once by a different abstractor. Kappa scores for inter- and intra-rater reliability all exceeded 0.85, indicating good agreement for all five chart abstractors.
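The kappa statistic behind these reliability figures (assumed here to be Cohen's kappa) corrects raw agreement for the agreement expected by chance. A minimal sketch, using made-up abstraction labels rather than the study's actual data:

```python
# Cohen's kappa for two raters' categorical judgements.
# The label lists below are illustrative, not the study's abstraction results.

def cohens_kappa(rater_a, rater_b):
    """Chance-corrected agreement between two equal-length label lists."""
    assert len(rater_a) == len(rater_b)
    n = len(rater_a)
    labels = set(rater_a) | set(rater_b)
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Chance agreement: product of each rater's marginal label frequencies
    expected = sum(
        (rater_a.count(l) / n) * (rater_b.count(l) / n) for l in labels
    )
    return (observed - expected) / (1 - expected)

a = ["OA", "OA", "no", "no", "OA", "no", "no", "no", "OA", "no"]
b = ["OA", "OA", "no", "no", "OA", "no", "no", "OA", "OA", "no"]
print(round(cohens_kappa(a, b), 2))  # → 0.8
```

Here 9/10 raw agreement with these marginals yields kappa = 0.8; values above 0.85, as reported for all abstractors, indicate agreement well beyond chance.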

Reference standard definitions. We defined a priori two alternative reference standard definitions of OA within the validation sample. Reference standard 1 classified individuals as having OA if the diagnosis was present in any location of their EMR. Reference standard 2 classified individuals as having OA only if the diagnosis was on their medical problem list. Those who did not meet the stated OA case criteria were defined as non-cases. Thus, for reference standard 2, patients with an OA diagnosis documented only outside their medical problem list (i.e. in unstructured free-text data within clinical notes) were classified as non-cases.

Health administrative data sources. Several administrative data sources were obtained, and records were linked between databases using unique, encoded identifiers [16]. We used the Ontario Health Insurance Plan (OHIP) Claims Database to identify physician billing diagnosis codes for OA [34] using International Classification of Diseases (ICD) code 715. Within OHIP, physicians provide only one diagnosis code, representing the main ‘reason for the visit’, with each billing claim. Physician specialty (associated with OHIP billing claims for OA) was identified by linking with the ICES Physician Database. From the Canadian Institute for Health Information Discharge Abstract Database, we identified OA diagnosis codes (ICD-9 715, ICD-10 M15–M19) within all diagnoses listed in the hospital discharge abstract. The National Ambulatory Care Reporting System identified OA diagnoses recorded during emergency department visits [35]. All hospital data prior to April 1, 2002 have diagnoses coded in ICD-9 [36] and can contain up to 16 recorded diagnoses per hospital encounter; hospitalizations and emergency department visits after April 1, 2002 are coded using ICD-10-CA [36], with up to 25 diagnoses per admission. Patient demographic information and vital status were obtained from the Registered Persons Database (RPDB). Annual population denominators were ascertained for all OHIP beneficiaries who were 20 years and older, alive, and had accessed the healthcare system at least once within a seven-year period of the reporting year.

Derivation of health administrative data algorithms. Administrative data algorithms are simple rule-based case definitions derived from coded data (such as diagnosis codes and pharmacy dispensations) from one or more sources. Administrative data OA algorithms were developed using combinations of physician billing diagnoses for OA (ICD 715) and primary and secondary hospital discharge diagnoses for OA (ICD-9 715; ICD-10 M15–M19), with or without prescription drug claims (for NSAIDs, COX-2 inhibitors), by varying the time windows between diagnosis codes or the period in which diagnosis codes appeared (e.g. within 2 years), and by whether the diagnosis codes were rendered by a musculoskeletal specialist (orthopedic surgeon, rheumatologist, or internist). We were interested a priori in identifying administrative data OA algorithms that maximized sensitivity and PPV (to identify all true OA patients and minimize false positives) [37].
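A case definition such as "≥1 hospitalization with an OA diagnosis ever, or ≥2 physician OA billing claims within 2 years" can be sketched as a simple rule over linked records. The record layout and field names below are illustrative assumptions, not the actual ICES data schema:

```python
# Hedged sketch of a rule-based OA case definition. Record shapes are
# hypothetical: phys_claims as (date, code) pairs, hosp_diagnoses as codes.
from datetime import date

OA_PHYS_CODE = "715"  # OHIP/ICD-9 physician billing code for OA
OA_HOSP_STEMS = {"715", "M15", "M16", "M17", "M18", "M19"}  # ICD-9/ICD-10 stems

def meets_oa_definition(phys_claims, hosp_diagnoses):
    """True if >=1 hospital OA diagnosis ever, or >=2 physician OA claims
    within any rolling 2-year (730-day) window."""
    if any(code[:3] in OA_HOSP_STEMS for code in hosp_diagnoses):
        return True
    oa_dates = sorted(d for d, code in phys_claims if code == OA_PHYS_CODE)
    # If any two claims fall within 730 days, some adjacent sorted pair does
    return any(
        (later - earlier).days <= 730
        for earlier, later in zip(oa_dates, oa_dates[1:])
    )

claims = [(date(2015, 3, 1), "715"), (date(2016, 9, 10), "715")]
print(meets_oa_definition(claims, []))  # → True (two claims ~18 months apart)
```

Varying the required claim count, the window length, or the specialty of the billing physician yields the family of algorithms compared in Table 2.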

Statistical analysis. We descriptively characterized the validation sample and assessed the prevalence of OA according to the two reference standard definitions. We estimated the accuracy of administrative data algorithms against the two reference standards by computing the sensitivity, specificity, PPV, and negative predictive value (NPV), accompanied by their 95% confidence intervals (CIs), to identify the algorithms with the optimal combination of sensitivity (to capture all cases) and PPV (to reduce false positives). We then applied the selected administrative data algorithms to the entire Ontario adult population to provide comparative estimates of OA incidence and prevalence in the population. For each selected algorithm, annual crude and age- and sex-standardized prevalence and incidence over time were estimated. Annual prevalence was calculated as the total number of patients aged 20 years or older classified as having OA divided by the number of Ontario residents aged 20 years or older. The annual denominators consisted of all Ontario residents for each calendar year, excluding those <20 years of age or those who died in the prior year. Once a case was included, it remained a case in subsequent years until death. Annual incidence was calculated by dividing the number of new OA patients aged 20 or older by the population at risk among Ontario residents aged 20 or older, with incidence defined as the first occurrence of OA per 100 population. As administrative data are available from 1990 onwards, to reduce potential observation period effects (misclassification of incident and prevalent cases after the establishment of administrative data), cases were ascertained over all available years of data but annual estimates are reported from 2000 onwards; a 10-year washout period was therefore used to report incidence.
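The accuracy measures follow directly from a 2×2 cross-classification of algorithm result against reference standard. The sketch below uses normal-approximation 95% CIs (the study's exact CI method is an assumption), with cell counts back-calculated approximately from the "≥1 P ever" row of Table 2 against reference standard 1:

```python
# Sensitivity, specificity, PPV and NPV with normal-approximation 95% CIs
# from a 2x2 confusion matrix. Cell counts are reconstructed approximately
# from Table 2 (">=1 P ever" vs reference standard 1: 1736 cases, 5764 non-cases).
from math import sqrt

def proportion_ci(successes, total, z=1.96):
    """Point estimate and normal-approximation CI for a proportion."""
    p = successes / total
    half = z * sqrt(p * (1 - p) / total)
    return p, max(0.0, p - half), min(1.0, p + half)

def accuracy_metrics(tp, fp, fn, tn):
    return {
        "sensitivity": proportion_ci(tp, tp + fn),  # true cases detected
        "specificity": proportion_ci(tn, tn + fp),  # non-cases ruled out
        "ppv": proportion_ci(tp, tp + fp),          # flagged patients truly cases
        "npv": proportion_ci(tn, tn + fn),          # unflagged truly non-cases
    }

m = accuracy_metrics(tp=1190, fp=905, fn=546, tn=4859)
for name, (p, lo, hi) in m.items():
    print(f"{name}: {p:.1%} ({lo:.1%}-{hi:.1%})")
```

With these counts the point estimates reproduce that row of Table 2 (sensitivity 68.5%, specificity 84.3%, PPV 56.8%, NPV 89.9%).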
To adjust for differences in population distribution over time, direct age and sex standardization was done using 2001 Ontario census population estimates.
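Direct standardization weights each age-sex stratum's crude rate by that stratum's share of a fixed standard population. A minimal sketch with hypothetical strata and counts (the study used the 2001 Ontario census distribution):

```python
# Direct age-sex standardization: a weighted average of stratum-specific rates,
# with weights from a fixed standard population. Strata and counts are made up.

def directly_standardized_rate(stratum_rates, standard_counts):
    """stratum_rates and standard_counts are dicts keyed by (age_group, sex)."""
    total = sum(standard_counts.values())
    return sum(
        rate * standard_counts[stratum] / total
        for stratum, rate in stratum_rates.items()
    )

# Crude rates per 100 population in each stratum of the study-year population
rates = {("20-64", "F"): 8.0, ("20-64", "M"): 6.0,
         ("65+", "F"): 30.0, ("65+", "M"): 25.0}
# Hypothetical standard (e.g. census) population counts per stratum
standard = {("20-64", "F"): 400, ("20-64", "M"): 400,
            ("65+", "F"): 110, ("65+", "M"): 90}
print(round(directly_standardized_rate(rates, standard), 2))  # → 11.15
```

Because the weights are fixed across calendar years, year-to-year changes in the standardized rate reflect changes in stratum-specific rates rather than population aging.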

All analyses were performed at ICES (www.ices.on.ca) using SAS version 9.2 (SAS Institute, Cary, NC, USA). This study was approved by the Research Ethics Board at Sunnybrook Health Sciences Centre.

3. Results

3.1. Description of the validation sample: Table 1

Table 1.

Characteristics of 7500 OA and non-OA patients according to different reference standard definitions.

Reference Standard 1a
Reference Standard 2b
OA cases Non-OA cases OA cases Non-OA cases
Number of patients, n (%) 1736 (23.1%) 5764 (76.9%) 738 (9.8%) 6762 (90.2%)
Age, mean ± SD years 64.4 ± 13.5 46.0 ± 15.2 66.6 ± 13.0 48.5 ± 16.1
Age group, n (%)
 20–44 years 128 (7.4%) 2863 (49.7%) 37 (5.0%) 2954 (43.7%)
 45–64 years 784 (45.2%) 2249 (39.0%) 301 (40.8%) 2732 (40.4%)
 65–74 years 412 (23.7%) 415 (7.2%) 198 (26.8%) 629 (9.3%)
 75–84 years 295 (17.0%) 178 (3.1%) 141 (19.1%) 332 (4.9%)
 ≥85 years 117 (6.7%) 59 (1.0%) 61 (8.3%) 115 (1.7%)
Sex, n (%)
 Female 1046 (60.3%) 3251 (56.4%) 478 (64.8%) 3819 (56.5%)
 Male 690 (39.7%) 2513 (43.6%) 260 (35.2%) 2943 (43.5%)
a Reference standard 1: includes individuals with any physician-documented OA diagnosis anywhere in their entire medical record.

b Reference standard 2: includes individuals with any physician-documented OA diagnosis in their medical problem list.

When the reference standard was considered to be a physician-documented OA diagnosis within any part of the EMR (reference standard 1), we identified 1736 cases and 5764 non-cases, corresponding to a 23.1% OA prevalence. With the second reference standard (physician-documented OA diagnosis only in the medical problem list), 738 cases and 6762 non-cases were identified, corresponding to a 9.8% OA prevalence. Patient characteristics by reference standard definition are presented in Table 1.

3.2. Validation results: Table 2

Table 2.

Accuracy of administrative data algorithms to identify OA against 2 different reference standard definitions.

Algorithm Reference Standarda Sensitivity (95% CI) Specificity (95% CI) PPV (95% CI) NPV (95% CI) Post-Test Prevalenceb
≥1 P ever 1 68.5 (66.3–70.7) 84.3 (83.4–85.3) 56.8 (54.7–59.0) 89.9 (89.1–90.7) 27.9%
2 77.2 (74.2–80.3) 77.5 (76.5–78.5) 27.2 (25.3–29.2) 96.9 (96.4–97.4)
≥2 P in 2 years 1 45.3 (42.9–47.6) 95.0 (94.5–95.6) 73.3 (70.7–76.0) 85.2 (84.4–86.1) 14.3%
2 56.4 (52.8–59.9) 90.3 (89.6–91.0) 38.8 (35.9–41.7) 95.0 (94.5–95.5)
≥2 P in 3 years 1 47.1 (44.8–49.5) 94.6 (94.0–95.2) 72.5 (69.9–75.1) 85.6 (84.7–86.5) 15.0%
2 58.8 (55.3–62.4) 89.7 (89.0–90.5) 38.5 (35.6–41.3) 95.2 (94.7–95.8)
≥3 P in 3 years 1 33.8 (31.5–36.0) 97.3 (96.8–97.7) 78.8 (75.8–81.7) 83.0 (82.1–83.9) 9.9%
2 43.5 (39.9–47.1) 93.7 (93.2–94.3) 43.1 (39.6–46.7) 93.8 (93.3–94.4)
≥1H ever 1 13.0 (11.4–14.6) 99.4 (99.3–99.6) 87.6 (83.6–91.6) 79.1 (78.2–80.1) 3.4%
2 17.6 (14.9–20.4) 98.1 (97.8–98.4) 50.4 (44.3–56.5) 91.6 (91.0–92.2)
Top Performing Definitions
≥1H ever or ≥ 1 P ever 1 68.6 (66.4–70.8) 84.2 (83.3–85.2) 56.7 (54.6–58.8) 89.9 (89.1–90.7) 28.0%
2 77.5 (74.5–80.5) 77.4 (76.4–78.4) 27.2 (25.3–29.1) 96.9 (96.5–97.4)
≥1H ever or ≥ 2 P ever 1 50.9 (48.6–53.3) 93.4 (92.7–94.0) 69.9 (67.4–72.4) 86.3 (85.5–87.2) 16.9%
2 62.7 (59.2–66.2) 88.1 (87.4–88.9) 36.6 (33.9–39.3) 95.6 (95.1–96.1)
≥1H or (≥2 P in 5 years) 1 48.4 (46.0–50.7) 94.1 (93.5–94.7) 71.3 (68.7–73.9) 85.8 (85.0–86.7) 15.7%
2 60.0 (56.5–63.6) 89.1 (88.4–89.9) 37.6 (34.8–40.4) 95.3 (94.8–95.9)
≥1H ever or (≥3 P in 5 years) 1 48.4 (46.0–50.7) 94.1 (93.5–94.7) 71.3 (68.7–73.9) 85.8 (85.0–86.7) 15.7%
2 60.0 (56.5–63.6) 89.1 (88.4–89.9) 37.6 (34.8–40.4) 95.3 (94.8–95.9)

Abbreviations: H = hospitalization; P = physician billing claim.

a Reference Standard 1: 1736 cases with a physician-confirmed OA diagnosis and 5764 non-cases based on a complete review of the entire medical record, corresponding to a 23.1% OA prevalence. Reference Standard 2: 738 cases with a physician-documented OA diagnosis provided in their medical problem list only and 6762 non-cases, corresponding to a 9.8% OA prevalence.

b Post-Test Prevalence: prevalence of OA within health administrative data among the 7500 patients in the validation cohort, (true positive + false positive cases)/7500.

The diagnostic test characteristics of the administrative data algorithms against each reference standard are presented in Table 2. The test characteristics of algorithms identifying reference standard 2 (OA diagnosis documented in the EMR problem list) differed significantly (non-overlapping confidence intervals) from those identifying reference standard 1, which also included individuals labelled as having OA elsewhere in their EMR. Across all algorithms tested, sensitivity was consistently and significantly higher for detecting OA cases under reference standard 2 (EMR problem list only) than under reference standard 1. Overall, specificity estimates across algorithms were modest to excellent, and tended to be highest for reference standard 1 (likely due to fewer cases misclassified in the reference standard). PPV estimates were also highest for reference standard 1 (likely for the same reason). In all instances, PPVs increased as the number of physician diagnosis codes required in administrative data increased. Algorithms that incorporated additional features, such as being seen by a musculoskeletal specialist or an NSAID dispensation, did not improve accuracy. Further, only 28.3% and 36.9% of OA patients in reference standards 1 and 2, respectively, had OA billing claims by a musculoskeletal specialist, dramatically decreasing the sensitivity of algorithms requiring specialty claims.

Algorithms that required “at least one hospital admission with an OA diagnosis or one (or two) physician OA diagnosis code(s) ever” were associated with maximal sensitivity, while algorithms that required “at least one hospital admission associated with an OA diagnosis or at least two (or three) physician diagnosis codes within five years” had the highest PPVs. These four algorithms were selected to compare estimates of population-level prevalence.

3.3. Estimated population incidence and prevalence of OA: Fig. 1

Fig. 1.

Fig. 1

Annual crude prevalence of OA per 100 population among adults aged 20 years and older in Ontario for the years 2000–2017. Abbreviations: H = hospitalization; P = physician billing claim.

From 2000 to 2017, the number of Ontario adults aged 20 years of age and older (the population denominator) increased from 8,643,521 to 10,969,729 (Supplementary Table 1). Based on the best algorithms, the crude OA prevalence more than doubled over this time period, from 5.4–11.2% in 2000 to 11.4–25.3% in 2017 (Fig. 1). Age- and sex-standardized prevalence estimates nearly doubled over the study period, with an estimated 9.3–22.0 cases per 100 population in 2017 (Fig. 2, Supplementary Table 2).

Fig. 2.

Fig. 2

Annual age- and sex-standardized prevalence per 100 population among adults aged 20 years and older in Ontario for the years 2000–2017. Abbreviations: H = hospitalization; P = physician billing claim.

The crude incidence declined from 1.0–2.1 cases per 100 population in 2000 to 0.7–1.5 cases per 100 population in 2017 (Fig. 3, Supplementary Table 3). Standardized incidence across definitions was approximately 1 newly identified OA case per 100 persons each year (Fig. 4). Annual incidence estimates were slightly elevated in 2000–2007, which may reflect observational period effects rather than a true decline in incidence over time (Supplementary Table 4).

Fig. 3.

Fig. 3

Annual age- and sex-standardized incidence of OA per 100 population among adults aged 20 years and older in Ontario for the years 2000–2017. Abbreviations: H = hospitalization; P = physician billing claim.

Fig. 4.

Fig. 4

Counts of new OA cases and total OA cases across algorithms for 2017, identified among the 10,969,729 adult Ontario population. Abbreviations: H = hospitalization; P = physician billing claim.

The age-standardized estimates in males and females showed increasing trends similar to the overall estimates, with higher estimates for females than males (Supplementary Fig. 1).

In 2017, among the 10,969,729 Ontario adults, counts of new and total OA cases varied widely across algorithms (Fig. 4). The most sensitive case definition ascertained nearly twice as many new annual cases from the population (n = 121,068) as the algorithm with the highest PPV (but lower sensitivity), which identified 67,187 new annual cases. The total number of OA cases identified over the study period was similarly higher for the most sensitive algorithm (n = 2,775,888; 25.3%) than for the lower-sensitivity algorithm (n = 1,250,161; 11.4%). The most sensitive algorithm achieved a population-level prevalence that most closely approximated the pre-test prevalence of OA ascertained from the primary care random sample.

4. Discussion

We present a comprehensive assessment of the accuracy of algorithms for detecting OA in health administrative data. Our findings indicate that administrative data have limited sensitivity for identifying all OA patients and appear to be most sensitive at detecting OA patients whose physician formally documented the diagnosis in a medical problem list. These findings underscore the need for careful scrutiny of the use of administrative data for OA surveillance, as they may substantially underestimate the number of individuals with OA. By contrasting different reference standard definitions used for administrative data validation studies, we observed the effects of alternative reference standards on algorithm performance. These findings illustrate the under-documentation of OA diagnoses in medical problem lists: defining OA cases by medical problem lists alone yielded lower PPVs for administrative data algorithms, because that reference standard definition completely misses true OA cases whose diagnosis is documented outside of problem lists. Our population-based findings showed increasing OA prevalence over time irrespective of the algorithm used to identify OA, whereas annual incidence remained steady over the past decade.

In our comprehensive review of primary care EMRs from a large random sample, only 58% of OA patients had their diagnosis documented in their medical problem list. These findings are consistent with prior studies [[38], [39], [40]] that have found OA to be under-recognized in primary care. As many as 60–90% of people with OA have at least one other condition, most often diabetes, hypertension, dyslipidemia and heart disease [41,42]. In the setting of other ‘more important’ health problems, people with OA and their physicians may place OA at a lower priority for attention. This under-documentation suggests that medical problem lists may be inadequate for identifying OA patients for secondary research purposes. Although there are limited reports on identifying OA patients in Canadian primary care EMRs, Birtwistle et al. identified OA patients using a combination of ICD-9 codes and diagnoses documented in problem lists and reported an OA prevalence of 14% [43]. In our large validation sample, the OA prevalence based on diagnoses in medical problem lists was only 10%; however, we did not also ascertain patients using ICD codes contained within the EMR (as doing so would bias our algorithm performance). More recently, reports of low overall completeness of problem lists [22] further suggest caution when relying only on problem lists for disease surveillance. Outside of Canada, the under-recording of arthritis diagnoses in primary care has been similarly reported [25,26]. It may be that only more severe OA leads to formal documentation, that OA is not prioritized as a co-existing medical condition warranting documentation, or that other factors are at play.

Among previous studies examining the accuracy of administrative data OA algorithms [[17], [18], [19], [20],[44], [45], [46], [47], [48], [49]], only a few were conducted in Canada [[17], [18], [19], [20]]. Past OA reference standards used in Canadian studies have varied (from patient self-reported diagnosis to diagnosis confirmed via imaging reports) and have uniformly been associated with suboptimal sensitivities and predictive values [[17], [18], [19], [20]]. A prior study also used a range of reference standards that variably combined OA on imaging, self-report, and OA diagnostic criteria in 171 Canadians with knee pain invited for OA screening, and reported sensitivities of 21–57% using administrative data but high PPVs (82–100%) [19]. The low sensitivity for identifying OA patients meeting diagnostic criteria may be explained by the screening process (potentially identifying previously undiagnosed OA), and the high PPVs may be explained by the high prevalence of OA within their reference standard and the exclusion of individuals with other musculoskeletal issues. In our study, we chose a priori to evaluate the performance of administrative data algorithms against two OA reference standards. We opted not to include OA diagnostic criteria as a standard, or to confine OA to those with imaging-confirmed disease, as primary care physicians do not consistently use these tools and they are not required for diagnosis [50]. While there was less misclassification of OA when classifying diagnoses from all medical encounters, we suspected that reference standard 1 would most likely include individuals across the full spectrum of disease, and thus that administrative data algorithms would have lower sensitivity but higher PPVs due to fewer false positives in administrative data.
We also suspect that individuals labelled with OA in problem lists (reference standard 2) may have had more health contacts related to their OA (warranting an entry in the problem list), and therefore that administrative data algorithms would have higher sensitivity at detecting these OA cases. However, as reference standard 2 completely misses some true OA cases, the PPVs estimated against reference standard 2 should be interpreted with caution.

Our findings also have important implications for other settings, particularly US insurance claims databases, which may have less extensive data histories on individual patients than in Canada. The sensitivity of administrative data algorithms to identify OA in these settings may be even lower, leading to even greater underestimates of the burden of OA. Additionally, the doubling of OA prevalence in the population over our study's timeframe has important implications for policy making, health services planning, raising awareness, and informing funding allocations for OA care and research. Over time, the increasing pain, loss of function, diminished quality of life and increased disability [[51], [52], [53], [54]] associated with OA will contribute to a growing financial and physical burden on individuals and society [[55], [56], [57], [58]]. OA care is provided mainly in ambulatory or outpatient settings, and already accounts for a substantial share of primary care office visits [[59], [60], [61], [62], [63]]. Moreover, Canadian wait times to see musculoskeletal specialists – including rheumatologists and orthopedic surgeons – are the longest among subspecialists, indicating that the capacity to provide quality OA care needs to be increased [64,65]. With our findings indicating that OA now affects 1 in 5 adults, planning for the health services needed to manage these individuals is urgently required.

There have been some reports using computerized health data to describe the incidence and prevalence of OA [1,7,[66], [67], [68], [69], [70], [71], [72]], but limited reports evaluating trends over time. A UK study on temporal trends between 1992 and 2013 reported that the annual age- and sex-standardized incidence of diagnosed OA showed a small increase over 1992–2004 but remained stable or even declined thereafter, with similar trends observed for both sexes [70]. In contrast, an earlier study from British Columbia, Canada reported that the age-standardized incidence did not change among males but increased among females over 1996–2004 [66]. In our study, which spans more recent years, we observed higher incidence during the initial observational period than in more recent years; however, we did not observe any differences in incidence trends between sexes.

In our validation sample, we detected a high prevalence of doctor-diagnosed OA in primary care (23%), with 36% of individuals aged 45 years and older having OA. The overarching purpose of public health surveillance for OA is to facilitate the primary, secondary and tertiary prevention of OA. This includes being able to accurately identify the total number of individuals with OA and the number of patients diagnosed annually. Across the top-performing algorithms, the crude population-level OA prevalence ranged from 11% to 25% in 2017. Algorithms with lower sensitivity identified lower prevalence and may not be as useful for public health surveillance. Using our most sensitive algorithm, we identified 2,775,888 Ontario patients with at least one hospital or physician diagnosis code over the study period; over a million OA patients would go undetected if one additional diagnosis code were required. Thus, careful consideration of the algorithm is required in any surveillance system aimed at detecting individuals with OA.

There are potential limitations of our study. First, we assumed that a physician diagnosis of OA in our validation sample was correct and complete. We did not require patients to satisfy classification criteria, as these are not routinely employed or documented during the course of clinical care. OA diagnoses were taken verbatim and no inferences were made from signs or symptoms; however, assessing the validity of the diagnosis made by a physician was not the study's objective. Second, our findings are based on patients who are rostered to a primary care physician and thus may not be generalizable to patients without access to primary care. Third, changes to physician coding practices in Ontario, including the introduction of different payment models and higher remuneration for the care of other diseases, such as diabetes, may have influenced the sensitivity of administrative data to detect OA cases among individuals with multimorbidity. This is not a limitation per se, but a caveat reflecting the potential for diagnostic overshadowing in billing claims, where only one diagnosis code is permitted per claim. Finally, we may have misclassified prevalent OA as incident OA in the population analyses; the early years of data coverage are most prone to this misclassification, which inflates apparent incidence in those years. Previous studies have shown that a 5-year washout period may be sufficient, but a 10-year washout period (which we employed) further reduces misclassification and is preferred [73,74]. Moreover, it is difficult to state when OA actually becomes ‘incident’, since mild OA can be present quite early in life and worsen slowly over decades.

In conclusion, improving the documentation of OA diagnoses in both EMR and administrative data will enhance OA surveillance and evaluation efforts. Health administrative data appear to be more sensitive at detecting OA patients whose physicians formally documented the diagnosis in medical problem lists. Ongoing investigation is required to determine whether these individuals have more severe OA, are actively receiving care for their OA, are subject to diagnostic overshadowing in the presence of multimorbidity, or differ for other reasons. Additionally, we observed the prevalence of OA to be increasing over time, irrespective of the algorithm used to identify OA, whereas annual incidence appears to be stable.

Author contributions

Jessica Widdifield was involved in conception and design, obtaining funding, acquisition of data, analysis and interpretation of the data, manuscript preparation, and takes responsibility for the integrity of the work.

R. Liisa Jaakkimainen was involved in conception and design, obtaining funding, acquisition of data, analysis and interpretation of the data, and manuscript preparation.

Jodi Gatley was involved in design, analysis and interpretation of the data, administrative, technical, and logistic support, and manuscript preparation.

Lisa Lix was involved in design, analysis and interpretation of the data, and manuscript preparation.

Gillian Hawker was involved in design, obtaining funding, analysis and interpretation of the data, and manuscript preparation.

Sasha Bernatsky was involved in design, analysis and interpretation of the data, and manuscript preparation.

Bheeshma Ravi was involved in design, analysis and interpretation of the data, and manuscript preparation.

David Wasserstein was involved in design, analysis and interpretation of the data, and manuscript preparation.

Bing Yu was involved in design, acquisition of data, analysis and interpretation of the data, administrative, technical, and logistic support, and manuscript preparation.

Karen Tu was involved in conception and design, obtaining funding, acquisition of data, analysis and interpretation of the data, and manuscript preparation.

Role of the funding source

This work was supported by the Canadian Institutes of Health Research and the Public Health Agency of Canada, who played no role in the design or conduct of the study.

Declaration of competing interest

None of the authors have conflicts of interest related to this study.

Acknowledgements

First and foremost, we wish to thank all the family physicians who provide data to the Electronic Medical Record Primary Care Database (EMRPC), also known as the Electronic Medical Record Administrative data Linked Database (EMRALD). We would like to thank our five chart abstractors for their work (Nancy Cooper, Abayomi Fowora, Diane Kerbel, Anne Marie Mior and Barbara Thompson), health information analysts Myra Wang and Jacqueline Young for assistance in preparing the reference standard sample, and project management support from Raquel Duchen (ICES).

This study was supported by ICES, which is funded by an annual grant from the Ontario Ministry of Health and Long-Term Care (MOHLTC). This study also received funding from the Public Health Agency of Canada and the Canadian Institutes of Health Research. Parts of this material are based on data and information compiled and provided by the MOHLTC and the Canadian Institute for Health Information (CIHI). The analyses, conclusions, opinions and statements expressed herein are solely those of the authors and do not reflect those of the funding or data sources; no endorsement is intended or should be inferred.

Karen Tu is supported by a research scholar award by the Department of Family and Community Medicine at the University of Toronto. Jessica Widdifield receives support from the Arthritis Society Stars Career Development Award (STAR-19-0610). Lisa Lix is supported by a Tier 1 Canada Research Chair.

Footnotes

Appendix A

Supplementary data related to this article can be found at https://doi.org/10.1016/j.ocarto.2020.100115.

Contributor Information

Jessica Widdifield, Email: jessica.widdifield@utoronto.ca.

R. Liisa Jaakkimainen, Email: liisa.jaakkimainen@ices.on.ca.

Jodi M. Gatley, Email: Jodi.Gatley@ices.on.ca.

Gillian A. Hawker, Email: g.hawker@utoronto.ca.

Lisa M. Lix, Email: Lisa.Lix@umanitoba.ca.

Sasha Bernatsky, Email: sasha.bernatsky@mcgill.ca.

Bheeshma Ravi, Email: bheeshma.ravi@sunnybrook.ca.

David Wasserstein, Email: david.wasserstein@utoronto.ca.

Bing Yu, Email: Bing.Yu@ices.on.ca.

Karen Tu, Email: k.tu@utoronto.ca.

Appendix A. Supplementary data

The following is the supplementary data related to this article:

Multimedia component 1
mmc1.docx (41KB, docx)

References

  • 1.Global Burden of Disease Study Collaborators. Global, regional, and national incidence, prevalence, and years lived with disability for 301 acute and chronic diseases and injuries in 188 countries, 1990-2013: a systematic analysis for the global burden of disease study 2013. Lancet. 2015;386:743–800. doi: 10.1016/S0140-6736(15)60692-4. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 2.Dagenais S.G.S., Wai E.K. Systematic review of the prevalence of radiographic primary hip osteoarthritis. Clin. Orthop. Relat. Res. 2009;467:623–637. doi: 10.1007/s11999-008-0625-5. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 3.Public Health Agency of Canada . PHAC; Ottawa: 2017. National Surveillance of Osteoarthritis and Rheumatoid Arthritis in Canada: Results from the Canadian Chronic Disease Surveillance System. [Google Scholar]
  • 4.Plotnikoff R.K.N., Lytvyak E., Penfold C., Schopflocher D., Imayama I., Johnson S.T., Raine K. Osteoarthritis prevalence and modifiable factors: a population study. BMC Publ. Health. 2015;15:1195. doi: 10.1186/s12889-015-2529-0. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 5.MacDonald K.V.S.C., Langlois K., Marshall D.A. Symptom onset, diagnosis and management of osteoarthritis. Health Rep. 2014;25:10–17. [PubMed] [Google Scholar]
  • 6.Murphy L.B., Sacks J.J., Helmick C.G., Brady T.J., Barbour K.E., Hootman J.M., et al. Arthritis prevalence: which case definition should be used for surveillance? Comment on the article by jafarzadeh and felsons. Arthritis Rheum. 2019;71:172–175. doi: 10.1002/art.40733. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 7.Marshall D.A., Shahid R., Bertazzon S., Seidel J.E., Patel A.B., Nasr M., et al. Geographic variation in osteoarthritis prevalence in alberta: a spatial analysis approach. 2019;103:112–121. [Google Scholar]
  • 8.Kopec J.A., Berthelot J.M., Le Petit C., Aghajanian J., Sayre E.C., Cibere J., et al. Descriptive epidemiology of osteoarthritis in british columbia, Canada. J. Rheumatol. 2007;34:386–393. [PubMed] [Google Scholar]
  • 9.Kopec J.A., Sayre E.C., Cibere J., Flanagan W.M., Aghajanian J., Anis A.H., et al. Trends in physician-diagnosed osteoarthritis incidence in an administrative database in British Columbia, Canada, 1996-1997 through 2003-2004. Arthritis Rheum. 2008;59:929–934. doi: 10.1002/art.23827. [DOI] [PubMed] [Google Scholar]
  • 10.Rahman M.M., Goldsmith C.H., Anis A.H., Kopec J.A. Osteoarthritis incidence and trends in administrative health records from british columbia, Canada. J. Rheumatol. 2014;41:1147–1154. doi: 10.3899/jrheum.131011. [DOI] [PubMed] [Google Scholar]
  • 11.Sun J.G.K., Svenson L.W., Bell N.R., Frank C. Estimating osteoarthritis incidence from population-based administrative health care databases. Ann. Epidemiol. 2006;17:51–56. doi: 10.1016/j.annepidem.2006.06.003. [DOI] [PubMed] [Google Scholar]
  • 12.Marshall D.A., Barnabe C., MacDonald K.V., Maxwell C., Mosher D., Wasylak T., et al. Estimating the burden of osteoarthritis to plan for the future. Arthritis Care Res. 2015;67:1379–1386. doi: 10.1002/acr.22612. [DOI] [PubMed] [Google Scholar]
  • 13.Yu D.P.G., Bedson J., Jordan K.P. Annual consultation incidence of osteoarthritis estimated from population-based health care data in england. Rheumatology. 2015;54:2051–2060. doi: 10.1093/rheumatology/kev231. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 14.Powell K.E., Presley R.J., Tolsma D., Harris S., Mertz K.J., Viel K., et al. Administrative data as a tool for arthritis surveillance: estimating prevalence and utilization of services. J. Publ. Health Manag. Pract. 2003;9:291–298. doi: 10.1097/00124784-200307000-00007. [DOI] [PubMed] [Google Scholar]
  • 15.Harrold L.R., Straus W., Andrade S.E., Reed J.I., Cernieux J., Lewis B.E., Gurwitz J.H. Challenges of estimating health service utilization for osteoarthritis patients on a population level. J. Rheumatol. 2002;29:1931–1936. [PubMed] [Google Scholar]
  • 16.Cadarette S.M., Wong L. An introduction to health care administrative data. Can. J. Hosp. Pharm. 2015;68:232–237. doi: 10.4212/cjhp.v68i3.1457. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 17.Lix L., Mann J. Winnipeg; 2008. Defining and Validating Chronic Diseases: an Administrative Data Approach – an Update with Icd-10-Ca. [Google Scholar]
  • 18.Lix L., Burchill C., Metge C., McKeen N., Moore D., Bond R. Winnipeg; 2006. Defining and Validating Chronic Diseases: an Administrative Data Approach. [Google Scholar]
  • 19.Rahman M.M., Goldsmith C.H., Anis A.H., Cibere J. Validation of administrative osteoarthritis diagnosis using a clinical and radiological population-based cohort. International Journal of Rheumatology. 2016;2016:1–7. doi: 10.1155/2016/6475318. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 20.Fortin M., Sanche S., Almirall J. Self-reported versus health administrative data: implications for assessing chronic illness burden in populations. A cross-sectional study. CMAJ Open. 2017;5:E729–E733. doi: 10.9778/cmajo.20170029. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 21.Wright A., McCoy A.B., Hickman T.T., Hilaire D.S., Borbolla D., Bowes W.A., 3rd, et al. Problem list completeness in electronic health records: a multi-site study and assessment of success factors. Int. J. Med. Inf. 2015;84:784–790. doi: 10.1016/j.ijmedinf.2015.06.011. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 22.Singer A., Kroeker A.L., Yakubovich S., Duarte R., Dufault B., Katz A. Data quality in electronic medical records in manitoba: do problem lists reflect chronic disease as defined by prescriptions? Can. Fam. Physician. 2017;63:382–389. [PMC free article] [PubMed] [Google Scholar]
  • 23.Martin P.M., Sbaffi L. Electronic health record and problem lists in leeds, United Kingdom: variability of general practitioners' views. Health Inf. J. 2019;1460458219895184 doi: 10.1177/1460458219895184. [DOI] [PubMed] [Google Scholar]
  • 24.Tate A.R., Martin A.G., Ali A., Cassell J.A. Using free text information to explore how and when gps code a diagnosis of ovarian cancer: an observational study using primary care records of patients with ovarian cancer. BMJ Open. 2011;1 doi: 10.1136/bmjopen-2010-000025. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 25.Yu D.J.K., Peat G. Underrecording of osteoarthritis in United Kingdom primary care electronic health record data. Clin. Epidemiol. 2018;10:1195–1201. doi: 10.2147/CLEP.S160059. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 26.Ford E.N.A., Koeling R., Tate A., Carroll J., Axelrod L., Smith H.E., et al. Optimising the use of electronic health records to estimate the incidence of rheumatoid arthritis in primary care: what information is hidden in free text? BMC Med. Res. Methodol. 2013;13:105. doi: 10.1186/1471-2288-13-105. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 27.Manuel D.G., Rosella L.C., Stukel T.A. Importance of accurately identifying disease in studies using electronic health records. BMJ. 2010;341:c4226. doi: 10.1136/bmj.c4226. [DOI] [PubMed] [Google Scholar]
  • 28.Tu K., Widdifield J., Young J., Oud W., Ivers N.M., Butt D.A., Leaver C.A., Jaakkimainen L. Are family physicians comprehensively using electronic medical records such that the data can be used for secondary purposes? A canadian perspective. BMC Med Inform Decis Mak. 2015 Aug 13;15:67. doi: 10.1186/s12911-015-0195-x. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 29.Tu K.M.D., Lam K., Kavanagh D., Mitiku T.F., Guo H. Diabetics can be identified in an electronic medical record using laboratory tests and prescriptions. J. Clin. Epidemiol. 2011;64:431–435. doi: 10.1016/j.jclinepi.2010.04.007. [DOI] [PubMed] [Google Scholar]
  • 30.Tu K., Wang M., Young J., Green D., Ivers N.M., Butt D., Jaakkimainen L., Kapral M.K. Validity of administrative data for identifying patients who have had a stroke or transient ischemic attack using EMRALD as a reference standard. Can. J. Cardiol. 2013;29:1388–1394. doi: 10.1016/j.cjca.2013.07.676. [DOI] [PubMed] [Google Scholar]
  • 31.Widdifield J.B.C., Bernatsky S., Paterson J.M., Green D., Young J., Ivers N., Butt D.A., Jaakkimainen R.L., Thorne J.C., Tu K. An administrative data validation study of the accuracy of algorithms for identifying rheumatoid arthritis: the influence of the reference standard on algorithm performance. BMC Muscoskel. Disord. 2014;15:1–9. doi: 10.1186/1471-2474-15-216. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 32.Butt D.A., Tu K., Young J., Green D., Wang M., Ivers N., et al. A validation study of administrative data algorithms to identify patients with parkinsonism with prevalence and incidence trends. Neuroepidemiology. 2014;43:28–37. doi: 10.1159/000365590. [DOI] [PubMed] [Google Scholar]
  • 33.Tu K.M.T., Ivers N.M., Guo H., Lu H., Jaakkimainen L., Kavanagh D.G., Lee D.S., Tu J.V. Evaluation of electronic medical record administrative data linked database (emrald) Am. J. Manag. Care. 2014;20:e15–21. [PubMed] [Google Scholar]
  • 34.Ministry of Health and Long-Term Care . Government of Ontario; 2019. Ontario Health Insurance Plan: OHIP Schedule of Benefits and Fees. Available from: http://health.gov.on.ca/en/pro/programs/ohip/sob/ [Google Scholar]
  • 35.Canadian Institute for Health Information . CIHI; 2019. Data Holdings. Available from: https://www.cihi.ca/en/access-data-and-reports/make-a-data-request/data-holdings. [Google Scholar]
  • 36.CIHI . CIHI; 2019. ICD-10-CA/CCI Implementation Schedule. Available from: https://www.cihi.ca/en/icd-10-cacci-implementation-schedule. [Google Scholar]
  • 37.Chubak J., Pocobelli G., Weiss N.S. Tradeoffs between accuracy measures for electronic health care data algorithms. J. Clin. Epidemiol. 2012;65:343–349 e2. doi: 10.1016/j.jclinepi.2011.09.002. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 38.McHugh G.A., Silman A.J., Luker K.A. Quality of care for people with osteoarthritis: a qualitative study. J. Clin. Nurs. 2007;16:168–176. doi: 10.1111/j.1365-2702.2007.01885.x. [DOI] [PubMed] [Google Scholar]
  • 39.Dziedzic K.S., Allen K.D. Challenges and controversies of complex interventions in osteoarthritis management: recognizing inappropriate and discordant care. Rheumatology. 2018;57:iv88–iv98. doi: 10.1093/rheumatology/key062. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 40.Coxon D., Frisher M., Jinks C., Jordan K., Paskins Z., Peat G. The relative importance of perceived doctor's attitude on the decision to consult for symptomatic osteoarthritis: a choice-based conjoint analysis study. BMJ Open. 2015;5 doi: 10.1136/bmjopen-2015-009625. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 41.Kadam U.T., Jordan K., Croft P.R. Clinical comorbidity in patients with osteoarthritis: a case-control study of general practice consulters in england and wales. Ann. Rheum. Dis. 2004;63:408–414. doi: 10.1136/ard.2003.007526. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 42.van Dijk G.M., Veenhof C., Schellevis F., Hulsmans H., Bakker J.P., Arwert H., et al. Comorbidity, limitations in activities and pain in patients with osteoarthritis of the hip or knee. BMC Muscoskel. Disord. 2008;9:95. doi: 10.1186/1471-2474-9-95. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 43.Birtwhistle R., Morkem R., Peat G., Williamson T., Green M.E., Khan S., et al. Prevalence and management of osteoarthritis in primary care: an epidemiologic cohort study from the canadian primary care sentinel surveillance network. CMAJ Open. 2015;3:E270–E275. doi: 10.9778/cmajo.20150018. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 44.Katz J.N., Liang M.H., Bacon A.M., Kaplan H., Kieval R.I., Lindsey S.M., Roberts W.N., Sheff D.M., Spencer R.T., Weaver A.L., Baron J.A. Sensitivity and positive predictive value of medicare part b physician claims for rheumatologic diagnoses and procedures. Arthritis Rheum. 1997;40:1594–1600. doi: 10.1002/art.1780400908. [DOI] [PubMed] [Google Scholar]
  • 45.Harrold L.R., Andrade S.E., Reed J.I., Cernieux J., Straus W., Weeks M., Lewis B., Gurwitz J.H. Evaluating the predictive value of osteoarthritis diagnoses in an administrative database. Arthritis Rheum. 2000;43:1881–1885. doi: 10.1002/1529-0131(200008)43:8<1881::AID-ANR26>3.0.CO;2-#. [DOI] [PubMed] [Google Scholar]
  • 46.Losina E.B.J., Baron J.A., Katz J.N. Accuracy of medicare claims data for rheumatologic diagnoses in total hip replacement recipients. J. Clin. Epidemiol. 2003;56:515–519. doi: 10.1016/s0895-4356(03)00056-8. [DOI] [PubMed] [Google Scholar]
  • 47.Gabriel S.E., O'Fallon W.M. A mathematical model that improves the validity of osteoarthritis diagnoses obtained from a computerized diagnostic database. J. Clin. Epidemiol. 1996;49:1025–1029. doi: 10.1016/0895-4356(96)00115-1. [DOI] [PubMed] [Google Scholar]
  • 48.Widdifield J., Lix L., Paterson J.M., Bernatsky S., Tu K., Ivers N., Bombardier C. Systematic review and critical appraisal of validation studies to identify rheumatic diseases in health administrative databases. Arthritis Care Res. 2013;65:1490–1503. doi: 10.1002/acr.21993. [DOI] [PubMed] [Google Scholar]
  • 49.Shrestha S., Losina E., Katz J.N. Diagnostic accuracy of administrative data algorithms in the diagnosis of osteoarthritis: a systematic review. BMC Med. Inf. Decis. Making. 2016;16:82. doi: 10.1186/s12911-016-0319-y. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 50.NICE . National Institute for Health and Clinical Excellence; London: 2014. Osteoarthritis: Care and Management. [Google Scholar]
  • 51.Losina E., Walensky R.P., Reichmann W.M., Holt H.L., Gerlovin H., Solomon D.H., et al. Impact of obesity and knee osteoarthritis on morbidity and mortality in older americans. Ann. Intern. Med. 2011;154:217–226. doi: 10.1059/0003-4819-154-4-201102150-00001. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 52.Boutron I., Rannou F., Jardinaud-Lopez M., Meric G., Revel M., Poiraudeau S. Disability and quality of life of patients with knee or hip osteoarthritis in the primary care setting and factors associated with general practitioners' indication for prosthetic replacement within 1 year. Osteoarthritis Cartilage. 2008;16:1024–1031. doi: 10.1016/j.joca.2008.01.001. [DOI] [PubMed] [Google Scholar]
  • 53.Rosemann T., Laux G., Szecsenyi J. Osteoarthritis: quality of life, comorbidities, medication and health service utilization assessed in a large sample of primary care patients. J. Orthop. Surg. Res. 2007;2:12. doi: 10.1186/1749-799X-2-12. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 54.Zakaria Z.F., Bakar A.A., Hasmoni H.M., Rani F.A., Kadir S.A. Health-related quality of life in patients with knee osteoarthritis attending two primary care clinics in Malaysia: a cross-sectional study. Asia Pac. Fam. Med. 2009;8:10. doi: 10.1186/1447-056X-8-10. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 55.Bombardier C., Hawker G., Mosher D. 2011. The Impact of Arthritis in canada: Today and over the Next 30 Years. [Google Scholar]
  • 56.Maetzel A., Li L.C., Pencharz J., Tomlinson G., Bombardier C., Community H., et al. The economic burden associated with osteoarthritis, rheumatoid arthritis, and hypertension: a comparative study. Ann. Rheum. Dis. 2004;63:395–401. doi: 10.1136/ard.2003.006031. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 57.Maetzel A. The challenges of estimating the national costs of osteoarthritis: are we making progress? J. Rheumatol. 2002;29:1811–1813. [PubMed] [Google Scholar]
  • 58.Kingsbury S.R., Gross H.J., Isherwood G., Conaghan P.G. Osteoarthritis in europe: impact on health status, work productivity and use of pharmacotherapies in five european countries. Rheumatology. 2014;53:937–947. doi: 10.1093/rheumatology/ket463. [DOI] [PubMed] [Google Scholar]
  • 59.Public Health Agency of Canada . 2010. Life with Arthritis in Canada: A Personal and Public Health Challenge. Available from: https://www.canada.ca/content/dam/phac-aspc/migration/phac-aspc/cd-mc/arthritis-arthrite/lwaic-vaaac-10/pdf/arthritis-2010-eng.pdf [Google Scholar]
  • 60.Birrell F., Croft P., Cooper C., Hosie G., Macfarlane G., Silman A. Health impact of pain in the hip region with and without radiographic evidence of osteoarthritis: a study of new attenders to primary care. The pcr hip study group. Ann. Rheum. Dis. 2000;59:857–863. doi: 10.1136/ard.59.11.857. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 61.Peat G., McCarney R., Croft P. Knee pain and osteoarthritis in older adults: a review of community burden and current use of primary health care. Ann. Rheum. Dis. 2001;60:91–97. doi: 10.1136/ard.60.2.91. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 62.Peat G., Thomas E., Duncan R., Wood L., Hay E., Croft P. Clinical classification criteria for knee osteoarthritis: performance in the general population and primary care. Ann. Rheum. Dis. 2006;65:1363–1367. doi: 10.1136/ard.2006.051482. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 63.Woolf A.D., Pfleger B. Burden of major musculoskeletal conditions. Bull. World Health Organ. 2003;81:646–656. [PMC free article] [PubMed] [Google Scholar]
  • 64.Widdifield J., Bombardier C., Thorne C., Jaakkimainen L., Paterson M., Bernatsky S., et al. Wait times to rheumatology care for patients with rheumatic diseases: a data linkage study of primary care electronic medical records and administrative data. CMAJ Open. 2016;4(2):E205–E212. doi: 10.9778/cmajo.20150116. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 65.Jaakkimainen L., Glazier R., Barnsley J., Salkeld E., Lu H., Tu K. Waiting to see the specialist: patient and provider characteristics of wait times from primary to specialty care. BMC Fam. Pract. 2014;15:16. doi: 10.1186/1471-2296-15-16. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 66.Kopec J.A., Rahman M.M., Sayre E.C., Cibere J., Flanagan W.M., Aghajanian J., et al. Trends in physician-diagnosed osteoarthritis incidence in an administrative database in british columbia, Canada, 1996-1997 through 2003-2004. Arthritis Rheum. 2008;59:929–934. doi: 10.1002/art.23827. [DOI] [PubMed] [Google Scholar]
  • 67.Sun J., Gooch K., Svenson L.W., Bell N.R., Frank C. Estimating osteoarthritis incidence from population-based administrative health care databases. Ann. Epidemiol. 2007;17:51–56. doi: 10.1016/j.annepidem.2006.06.003. [DOI] [PubMed] [Google Scholar]
  • 68.Kopec J.A., Rahman M.M., Berthelot J.M., Le Petit C., Aghajanian J., Sayre E.C., et al. Descriptive epidemiology of osteoarthritis in british columbia, Canada. J. Rheumatol. 2007;34:386–393. [PubMed] [Google Scholar]
  • 69.Yu D., Peat G., Bedson J., Jordan K.P. Annual consultation incidence of osteoarthritis estimated from population-based health care data in england. Rheumatology. 2015;54:2051–2060. doi: 10.1093/rheumatology/kev231. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 70.Yu D., Jordan K.P., Bedson J., Englund M., Blyth F., Turkiewicz A., et al. Population trends in the incidence and initial management of osteoarthritis: age-period-cohort analysis of the clinical practice research datalink, 1992-2013. Rheumatology. 2017;56:1902–1917. doi: 10.1093/rheumatology/kex270. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 71.Morgan O.J., Hillstrom H.J., Ellis S.J., Golightly Y.M., Russell R., Hannan M.T., et al. Osteoarthritis in england: incidence trends from national health service hospital episode statistics. ACR Open Rheumatol. 2019;1:493–498. doi: 10.1002/acr2.11071. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 72.Turkiewicz A., Petersson I.F., Bjork J., Hawker G., Dahlberg L.E., Lohmander L.S., et al. Current and future impact of osteoarthritis on health care: a population-based study with projections to year 2032. Osteoarthritis Cartilage. 2014;22:1826–1832. doi: 10.1016/j.joca.2014.07.015. [DOI] [PubMed] [Google Scholar]
  • 73.Widdifield J., Paterson J.M., Bernatsky S., Tu K., Tomlinson G., Kuriya B., et al. The epidemiology of rheumatoid arthritis (RA) in Ontario, Canada. Arthritis Rheum. 2014;66:786–793. doi: 10.1002/art.38306. [DOI] [PubMed] [Google Scholar]
  • 74.Ng R., Bernatsky S., Rahme E. Observation period effects on estimation of systemic lupus erythematosus incidence and prevalence in quebec. J. Rheumatol. 2013;40:1334–1336. doi: 10.3899/jrheum.121215. [DOI] [PubMed] [Google Scholar]


Articles from Osteoarthritis and Cartilage Open are provided here courtesy of Elsevier
