Validity of diagnoses of SARS-CoV-2 infection in Canadian administrative health data: a multiprovince, population-based cohort study
====================================================================================================================================

* Lisa M. Lix
* Christel Renoux
* Carolina Moriello
* Ko Long Choi
* Colin R. Dormuth
* Anat Fisher
* Matthew Dahl
* Fangyun Wu
* Ayesha Asaf
* J. Michael Paterson

## Abstract

**Background:** Accurate coding of diagnoses of SARS-CoV-2 infection in administrative data benefits population-based studies about the epidemiology, treatment and outcomes of COVID-19. We describe the validity of diagnoses of SARS-CoV-2 infection recorded in hospital discharge abstracts, emergency department records and outpatient physician service claims from 3 Canadian provinces.

**Methods:** In this cohort study, population-based inpatient, emergency department and outpatient records were linked to SARS-CoV-2 polymerase chain reaction (PCR; reference standard) test results from British Columbia, Manitoba and Ontario for Apr. 1, 2020, to Mar. 31, 2021. Sensitivity, specificity, positive predictive value (PPV) and negative predictive value (NPV) of diagnoses of SARS-CoV-2 infection were estimated for each quarter in the study period, overall and by province, age group and sex.

**Results:** Our study encompassed more than 13 million SARS-CoV-2 PCR test results. Specificity and NPV of diagnoses of SARS-CoV-2 infection were consistently high (i.e., most estimates were > 95%). Overall sensitivity estimates were 86.2%, 60.4% and 20.3% in the first quarter for the inpatient, emergency department and outpatient cohorts, and 66.2%, 47.5% and 25.0% in the last quarter, respectively. For inpatients, overall PPV estimates ranged from 50.0% to 66.4%. For emergency department patients, overall PPV estimates were 76.9% and 68.3% in the first and last quarters, respectively. For outpatients, PPV estimates were 6.8% and 29.1% in the first and last quarters, respectively.

**Interpretation:** We found variations in the validity of diagnoses of SARS-CoV-2 infection recorded in different health care settings, geographic areas and over time. Our multiprovince validation study provides evidence about the potential use of inpatient and emergency department records as an alternative to population-based laboratory data for identification of patients with SARS-CoV-2 infection, but does not support the use of outpatient claims for this purpose.

Administrative health data are increasingly used to examine outcomes, risk factors and treatments at the population level for individuals diagnosed with SARS-CoV-2 infection. However, the validity of diagnosis coding in these data is largely unknown outside of inpatient settings.1–5 Validity may vary between jurisdictions and over time owing to variations in testing protocols, diagnosis coding procedures and staff training.

Canada has a substantial history of diagnostic validation studies for administrative health data that can inform studies about the validity of diagnoses of SARS-CoV-2 infection.6,7 The Canadian Institute for Health Information published its first guidance on *International Statistical Classification of Diseases and Related Health Problems, 10th Revision, Canada* (ICD-10-CA) diagnosis codes for patients with COVID-19 admitted to hospital or seen in emergency departments in March 2020.8 Provincial and territorial guidance for diagnosis and fee codes for physician service claims was also published.
For example, Ontario introduced a COVID-19 diagnosis code in March 2020 to be used “when treating patients with suspected or confirmed COVID-19 and/or when treating a patient by telephone/video for suspected or confirmed COVID-19,”9 and British Columbia published COVID-19 codes for “services directly related to COVID-19.”10

To our knowledge, only Wu and colleagues3 have studied the validity of COVID-19 diagnoses in Canadian inpatient and emergency department records, and they focused on Alberta. Information about the validity of diagnoses of SARS-CoV-2 infection in administrative data is important to understand COVID-19 epidemiology, treatments and outcomes. Our study assessed the validity of diagnoses of SARS-CoV-2 infection recorded in inpatient hospital discharge abstracts, emergency department records and outpatient physician service claims in 3 Canadian provinces.

## Methods

We undertook a population-based cohort study using administrative health data linked with SARS-CoV-2 polymerase chain reaction (PCR) laboratory test results from BC, Manitoba and Ontario (provinces for which we had access to linked data sources) for Apr. 1, 2020, through Mar. 31, 2021. Throughout the study period there was publicly available PCR testing in these provinces. We adopted the Strengthening the Reporting of Observational Studies in Epidemiology reporting guidelines with the Reporting of Studies Conducted Using Observational Routinely-collected Data extension.11,12

### Data sources

Four administrative health databases were used in each province: health insurance registry, physician service claims, emergency department discharge records and inpatient hospital discharge records (Appendix 1, Table S1, available at [www.cmajopen.ca/content/11/5/E790/suppl/DC1](http://www.cmajopen.ca/content/11/5/E790/suppl/DC1)). All data sources were linked at the individual level using anonymized health insurance numbers; only individuals with valid health insurance numbers were included.

Health insurance registry files capture start and end dates of health insurance coverage, including loss of coverage due to death or migration; demographic and residence location information is also captured. Physician service claims capture services provided by specialists and general practitioners, including type and date of service, fee code and at least 1 diagnosis code associated with the reason for the service. Diagnosis codes are recorded using modifications of the eighth (Ontario) and ninth revisions of ICD (ICD-8 and ICD-9).13 We included claims for office, telephone and virtual patient consultations, home visits and long-term care visits. Emergency department discharge records contain information about visits to hospital-based emergency departments, including visit date, chief complaint and diagnoses, typically coded using ICD-10-CA. Inpatient hospital discharge records contain diagnostic and procedural information for acute hospital stays, including up to 25 diagnoses coded using ICD-10-CA.

SARS-CoV-2 laboratory test results were linked to the administrative data. Test results were from the BC Ministry of Health COVID-19 Test Laboratory Data, the Ontario Laboratories Information System, and the Manitoba Cadham Provincial COVID-19 Laboratory Testing and Results Database.
### Study cohorts

Inpatient, emergency department and outpatient study cohorts were composed of provincial health insurance registrants with at least 1 inpatient hospital discharge record, emergency department discharge record and outpatient physician service claim, respectively. Cohort exclusions were based on missing sex or health care coverage, and overlap in dates of hospitalization, emergency department visits and outpatient visits. The cohorts were stratified into 3-month (quarter) subgroups.

### Study measures

Positive test cases, negative test cases and no-test cases were identified from laboratory data. A positive test case had at least 1 positive PCR test with a specimen collection date within the quarter, and a negative test case had at least 1 negative PCR test with a specimen collection date and no positive PCR tests within the quarter. No-test cases had no PCR tests, or only indeterminate PCR tests, within the quarter.

The ICD-10-CA code U07.1 in any position was used to ascertain diagnosed cases in hospital and emergency department records (Appendix 1, Table S2). For the inpatient cohort, true positive cases had a COVID-19 diagnosis code and a specimen collection date for a positive PCR test between (and including) the hospital admission and discharge dates. For the emergency department cohort, true positive cases had a COVID-19 diagnosis code with a service date at most 2 days before or 2 days after the specimen collection date for a positive PCR test.

For the outpatient cohort, newly developed COVID-19 coding directives provided by each ministry of health were used for case ascertainment (Appendix 1, Table S2). In Ontario and Manitoba, the diagnosis codes were 0809 and 079.82,14 respectively. In BC, case ascertainment was initially based on the newly developed diagnosis code C19.10 However, this diagnosis code was not associated with any claims during the study period; accordingly, newly developed service codes relevant to COVID-19 were used for case ascertainment.10 True positive cases had a COVID-19 diagnosis code with a service date in physician claims at most 2 days before or 2 days after the specimen collection date for a positive PCR test.

### Statistical analysis

We used frequencies, percentages, means and standard deviations (SDs) to describe the cohorts (i.e., age group, sex, income quintile, and rural or urban residence). Area-level income quintile was based on postal code of residence and household income from the Statistics Canada census.15,16 Rural or urban residence was based on each province’s definition; for example, Manitoba rural residents lived outside of the major urban centres of Winnipeg and Brandon.

Validity was assessed using sensitivity, specificity, positive predictive value (PPV) and negative predictive value (NPV). We reported the estimates as percentages with 95% confidence intervals (CIs). The Youden Index (calculated as sensitivity + specificity − 1) was also reported.17 We produced validation measure estimates for each cohort and quarter for the 3 provinces combined and then separately for each province. Estimates were also stratified by sex and age group (< 65 yr, 65–79 yr, ≥ 80 yr).
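To make the case ascertainment and validity calculations concrete, the sketch below illustrates the ±2-day service-date matching rule and the computation of sensitivity, specificity, PPV, NPV and the Youden Index from a 2 × 2 cross-classification of diagnosis codes against PCR results. It is a minimal sketch under stated assumptions, not the study’s analysis code: the function names and example counts are hypothetical, and the Wilson score interval is our own choice because the article does not state which 95% CI method was used.

```python
# Illustrative sketch only -- not the study's analysis code. The function names,
# the Wilson score interval for the 95% CIs and the example counts are assumptions.
from datetime import date
from math import sqrt


def within_window(service_date: date, specimen_date: date, window_days: int = 2) -> bool:
    """Emergency department/outpatient rule: a coded record matches a positive
    PCR test if the service date is at most `window_days` before or after the
    specimen collection date (the sensitivity analyses widen this window)."""
    return abs((service_date - specimen_date).days) <= window_days


def wilson_ci(successes: int, n: int, z: float = 1.96) -> tuple[float, float]:
    """Wilson score 95% confidence interval for a proportion, in percent."""
    if n == 0:
        return (float("nan"), float("nan"))
    p = successes / n
    denom = 1 + z ** 2 / n
    centre = (p + z ** 2 / (2 * n)) / denom
    half = z * sqrt(p * (1 - p) / n + z ** 2 / (4 * n ** 2)) / denom
    return (100 * (centre - half), 100 * (centre + half))


def validity_measures(tp: int, fp: int, fn: int, tn: int) -> dict:
    """Sensitivity, specificity, PPV and NPV (estimate, CI lower, CI upper) plus
    the Youden Index from a 2 x 2 table of diagnosis codes vs. PCR results."""
    sens, spec = tp / (tp + fn), tn / (tn + fp)
    return {
        "sensitivity_%": (100 * sens, *wilson_ci(tp, tp + fn)),
        "specificity_%": (100 * spec, *wilson_ci(tn, tn + fp)),
        "ppv_%": (100 * tp / (tp + fp), *wilson_ci(tp, tp + fp)),
        "npv_%": (100 * tn / (tn + fn), *wilson_ci(tn, tn + fn)),
        "youden_index": sens + spec - 1,
    }


# Hypothetical example: a service date 2 days before the specimen collection date,
# and quarterly counts for one cohort and province.
print(within_window(date(2020, 4, 10), date(2020, 4, 12)))   # True
print(validity_measures(tp=850, fp=430, fn=140, tn=120_000))
```

In the study itself, record-level true and false positives and negatives for each cohort, quarter and province would supply counts like these; the same calculations apply unchanged when the ascertainment window is widened in the sensitivity analyses described next.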
To assess the robustness of our findings, we performed 4 prespecified sensitivity analyses using larger case ascertainment windows, as per previous studies.1,2 In the first, for the inpatient cohort, true positive cases had a specimen collection date for a positive test in the period from 7 days before the admission date to the discharge date. In the second, for the inpatient cohort, the time window to ascertain true positive cases extended from 14 days before to 14 days after the hospital admission date, consistent with the work of Kluberg and colleagues.1,2 In the third and fourth sensitivity analyses, for the emergency department and outpatient cohorts, respectively, we identified true positives using a time window from 5 days before to 5 days after the specimen collection date for a positive test.

### Ethics approval

Approvals were provided by the following ethics boards: the Clinical Research Ethics Board at the University of British Columbia and the Health Research Ethics Board at the University of Manitoba.

## Results

The study cohorts were composed of approximately 1.3 million inpatients, 3.2 million emergency department patients and 15.1 million outpatients (Figure 1). Ontario residents accounted for 51.2%, 77.9% and 67.3% of the inpatient, emergency department and outpatient cohorts, respectively. Average age was 55.9 years for the inpatient cohort, 43.3 years for the emergency department cohort and 44.4 years for the outpatient cohort (Table 1; Appendix 1, Tables S3–S5).

Figure 1: Flow diagram for construction of study cohorts. Note: ED = emergency department.

Table 1: Characteristics of inpatient, emergency department and outpatient cohorts

This study encompassed more than 13 million SARS-CoV-2 PCR test results. Monthly PCR tests, positive PCR tests and PCR test rates per 10 000 population are reported in Appendix 1, Table S6. The percentage of positive tests ranged from 0.1% to 12.6%, and positive test rates per 10 000 population ranged from 1 to 51 per month.

Figures 2, 3 and 4 contain overall estimates of sensitivity, specificity, PPV and NPV for the inpatient, emergency department and outpatient cohorts, respectively, by quarter. Province-specific estimates are reported in Tables 2, 3 and 4; case frequencies are in Appendix 1, Tables S7 to S9. Specificity and NPV estimates were consistently high and frequently exceeded 95%.

Figure 2: Overall validation estimates by quarter (Q): inpatient cohort. Note: NPV = negative predictive value, PPV = positive predictive value, Q1 = April 2020–June 2020, Q2 = July 2020–September 2020, Q3 = October 2020–December 2020, Q4 = January 2021–March 2021. Error bars represent 95% confidence intervals.

Figure 3: Overall validation estimates by quarter (Q): emergency department cohort. Note: NPV = negative predictive value, PPV = positive predictive value, Q1 = April 2020–June 2020, Q2 = July 2020–September 2020, Q3 = October 2020–December 2020, Q4 = January 2021–March 2021. Error bars represent 95% confidence intervals.
Figure 4: Overall validation estimates by quarter (Q): outpatient cohort. Note: NPV = negative predictive value, PPV = positive predictive value, Q1 = April 2020–June 2020, Q2 = July 2020–September 2020, Q3 = October 2020–December 2020, Q4 = January 2021–March 2021. Error bars represent 95% confidence intervals.

Table 2: Validation estimates by province and quarter: inpatient cohort

Table 3: Validation estimates by province and quarter: emergency department cohort

Table 4: Validation estimates by province and quarter: outpatient cohort

For the inpatient cohort, the overall sensitivity was 86.2% (95% CI 84.2%–88.1%) in the first quarter (i.e., Q1); it dropped to 66.2% (95% CI 64.7%–67.6%) in the last quarter (i.e., Q4). The overall PPVs were 66.4% (95% CI 64.5%–68.4%) in Q1 and 66.3% (95% CI 65.0%–67.6%) in Q4. The lowest overall PPV was 50.0% (95% CI 46.8%–53.2%) in Q2. Province-specific PPVs for Q1 ranged from 30.0% (95% CI 26.9%–33.0%) in BC to 75.0% (95% CI 73.6%–76.4%) in Ontario. In BC, the PPV was low in Q1 and Q2, then increased. In Ontario, the PPV dropped in Q3 and Q4 compared with Q2. The Youden Index ranged from 0.49 to 0.88.

For the emergency department cohort, the overall sensitivity was 60.4% (95% CI 58.3%–62.5%) in Q1 and 47.5% (95% CI 46.5%–48.6%) in Q4. The overall PPVs were 76.9% (95% CI 75.0%–78.8%) and 68.3% (95% CI 67.2%–69.4%) in Q1 and Q4, respectively. Sensitivity was poor for Manitoba throughout the study period, with a maximum of 11.6% (95% CI 9.5%–13.6%) in Q4; the corresponding PPV in Q4 was 39.9% (95% CI 34.1%–45.7%). In comparison, maximum sensitivity for Ontario was 61.0% (95% CI 59.7%–62.2%) in Q1, and the corresponding PPV was 77.0% (95% CI 75.8%–78.2%). The Youden Index had a maximum of 0.60.

For the outpatient cohort, overall sensitivity was 20.3% (95% CI 19.4%–21.3%) in Q1 and 25.0% (95% CI 24.6%–25.4%) in Q4. The overall PPVs were 6.8% (95% CI 6.5%–7.1%) in Q1 and 29.1% (95% CI 28.7%–29.5%) in Q4. Sensitivity in Manitoba and Ontario was low but increased slightly from 1.3% (95% CI 0%–3.8%) and 21.1% (95% CI 20.6%–21.7%), respectively, in Q1, to 6.3% (95% CI 5.5%–7.2%) and 35.6% (95% CI 35.3%–35.9%), respectively, in Q4. In BC, sensitivity declined from 10.2% (95% CI 8.6%–11.8%) in Q1 to 2.5% (95% CI 2.3%–2.6%) in Q4; the PPV increased slightly from 1.0% (95% CI 0.9%–1.2%) in Q1 to 12.9% (95% CI 12.2%–13.7%) in Q4. The Youden Index had a maximum of 0.29.

Overall sensitivity and PPV generally increased across age groups in the inpatient cohort, but declined across age groups in the emergency department and outpatient cohorts (Appendix 1, Tables S10–S12). No consistent pattern was observed for sex (Appendix 1, Tables S10–S12).

For the prespecified sensitivity analyses for the inpatient cohort (Appendix 1, Tables S13 and S14), expanding the duration of the case ascertainment window led to absolute increases in estimated sensitivity of up to 46%, with the largest increases in BC.
The first sensitivity analysis resulted in greater improvements in overall sensitivity and PPV estimates than the second; temporal trends were similar to those observed in the primary analysis. For the third sensitivity analysis (i.e., emergency department cohort; Appendix 1, Table S15), expanding the case ascertainment window resulted in absolute increases of up to 10% in sensitivity; increases in PPV were small. For the outpatient cohort, expanding the case ascertainment window led to absolute increases in sensitivity and PPV of up to 18%, although most increases were small (i.e., less than 5%; Appendix 1, Table S16).

## Interpretation

Our multiprovince validation study showed that diagnoses of SARS-CoV-2 infection had PPV estimates ranging from 17.8% to 75.0% in inpatient records, 0.0% to 81.3% in emergency department records, and 0.6% to 31.1% in outpatient records, illustrating substantial variation by province and over time. Positive predictive value estimates either improved or remained stable over time, whereas sensitivity estimates generally declined. Positive predictive value estimates depend on the prevalence of positive tests, which varied considerably. Expanding the duration of the observation window for ascertaining diagnoses of SARS-CoV-2 infection in health care records improved sensitivity and PPV estimates for inpatient data, but the effect for outpatient data was generally small.

Using US inpatient data from May to October 2020, Kluberg and colleagues1,2 reported sensitivity estimates of 95% and PPV estimates of 81% using ICD-10 code U07.1. Similarly, Kadri and colleagues4 reported sensitivity estimates of 98% and PPV estimates of 92% from April to May 2020. Estimates from our study were lower in all provinces. However, PPVs improved over time in both Manitoba and BC and decreased only slightly in Ontario.

Our study provides important information about the validity of COVID-19 diagnosis coding in emergency department records and outpatient physician claims. For the emergency department cohort, overall validity was poorer than in a prior Canadian study.3 Positive predictive value estimates were highest in Ontario. Performance was noticeably poorer in Manitoba than in the other 2 provinces. Manitoba does not use the National Ambulatory Care Reporting System for emergency department records and has fewer fields for diagnosis codes in emergency department records than Ontario and BC, which may account in part for this finding.

The validity of COVID-19 diagnosis coding in outpatient claims was poor and only minimally affected by expanding the case ascertainment window. This finding may be attributed to limited access to family physicians, particularly during the early months of the pandemic; the multiple reasons a person may consult their physician regarding COVID-19; the likelihood that testing-related visits were directed to hospital- or community-based testing clinics rather than doctors’ offices; and the time needed for physicians and billing clerks to become accustomed to new diagnosis or fee codes. Our findings do not support use of physician service claims as a substitute for population-based laboratory data to identify patients with SARS-CoV-2 infection.

Strengths of our study include assessment of diagnostic validity in inpatient and outpatient settings, during multiple periods and in 3 provinces. Our access to population-wide, community-based PCR laboratory test results made validation possible outside of inpatient hospital settings.
### Limitations

The generalizability of our findings to outpatient physician service claims from other provinces and territories is unknown; each jurisdiction implemented its own COVID-19 coding for outpatient claims. In addition, although SARS-CoV-2 PCR laboratory testing was openly and widely accessible to symptomatic patients throughout the study period, the results of these PCR tests could influence diagnosis coding behaviour. Our findings may not generalize to settings where SARS-CoV-2 PCR testing policies and practice differed from those in these provinces during the study period, although many countries appear to have implemented similar testing policies.18

Another limitation is the potential lack of independence between PCR testing and the COVID-19 diagnosis. For the 2 to be independent, a positive or negative test result would need to be possible for all cohort members and in all settings. For example, we found that in Ontario, 47% of inpatient hospitalizations were accompanied by a PCR test; for emergency department visits this figure was 15%, and for outpatient visits it was only 3%.

### Conclusion

We identified variations in the validity of diagnoses of SARS-CoV-2 infection recorded in different health care settings, geographic areas and over time. The performance of diagnosis codes for COVID-19 case ascertainment was better for inpatient and emergency department than for outpatient administrative data in the first year of the pandemic, likely owing to greater standardization of diagnosis coding practices in the former. However, over this period and subsequently, there were changes in guidelines and testing practices that may influence generalizability. Nevertheless, this study provides valuable insights about the validity of administrative data sources for COVID-19 case ascertainment that can benefit population-based research and surveillance.

## Data sharing

This study was conducted by the Canadian Network for Observational Drug Effect Studies (CNODES) using administrative health data obtained through data-sharing agreements between its member research centres and their respective provincial data stewards. Data availability thus differs by site.

British Columbia: The authors do not have permission to share data from this study. The data that support the findings of this study are available from Population Data BC ([https://www.popdata.bc.ca/](https://www.popdata.bc.ca/)), but restrictions apply to the availability of these data, which were used under licence for the current study and so are not publicly available.

Manitoba: Data used in this article were derived from administrative health and social data as a secondary source. The data were provided under specific data-sharing agreements only for approved use at the Manitoba Centre for Health Policy (MCHP). The original source data are not owned by the researchers or MCHP and as such cannot be provided to a public repository. The original data source and approval for use have been noted in the acknowledgments of the article. Where necessary, source data specific to this article or project may be reviewed at MCHP with the consent of the original data providers, along with the required privacy and ethical review bodies.

Ontario: The data set from this study is held securely in coded form at ICES.
Although legal data-sharing agreements between ICES and data providers (e.g., health organizations and government) prohibit ICES from making the data set publicly available, access may be granted to those who meet prespecified criteria for confidential access, available at [https://www.ices.on.ca/DAS](https://www.ices.on.ca/DAS) (email: das{at}ices.on.ca).

## Acknowledgements

This study was made possible through data-sharing agreements between the CNODES member research centres and the respective provincial governments of British Columbia, Manitoba (Health Information Privacy Committee [HIPC] no. 2021-378), and Ontario.

## Footnotes

* **Competing interests:** None declared.
* This article has been peer reviewed.
* **The Canadian Network for Observational Drug Effect Studies (CNODES) Investigators:** Samy Suissa (principal investigator); Colin R. Dormuth (British Columbia); Brenda R. Hemmelgarn (Alberta); Jacqueline Quail (Saskatchewan); Dan Chateau (Manitoba); J. Michael Paterson (Ontario); Jacques LeLorier (Québec); Adrian R. Levy (Atlantic: Nova Scotia, Newfoundland and Labrador, New Brunswick, Prince Edward Island); Pierre Ernst and Kristian B. Filion (UK Clinical Practice Research Datalink); Lisa Lix (database); Robert W. Platt (methods); and Ingrid S. Sketris (knowledge translation).
* **Contributors:** Lisa Lix, Christel Renoux, Carolina Moriello, Colin Dormuth, Anat Fisher, Matthew Dahl, Fangyun Wu, Ayesha Asaf and J. Michael Paterson were involved in data extraction, study design, data analysis and interpretation of results, and critically reviewed the manuscript for important intellectual content. Ko Choi was involved in data analysis and interpretation of results, and critically reviewed the manuscript for important intellectual content. Lisa Lix, Carolina Moriello, J. Michael Paterson and Ko Choi drafted the manuscript. All authors gave final approval of the version to be published and agreed to be accountable for all aspects of the work.
* **Funding:** The Canadian Network for Observational Drug Effect Studies (CNODES), a collaborating centre of the Drug Safety and Effectiveness Network, is funded by the Canadian Institutes of Health Research (CIHR; grant no. DSE-146021). The funders had no role in the design of the study, the analysis and interpretation of data, or the writing of the manuscript.
* **Supplemental information:** For reviewer comments and the original submission of this manuscript, please see [www.cmajopen.ca/content/11/5/E790/suppl/DC1](http://www.cmajopen.ca/content/11/5/E790/suppl/DC1).
* **Disclaimer:** The BC Ministry of Health approved access to and use of BC data, facilitated by Population Data BC, for this study. All inferences, opinions and conclusions drawn in this manuscript are those of the authors and do not reflect the opinions or policies of the data stewards. British Columbia data sources were as follows ([https://www2.gov.bc.ca/gov/content/health/conducting-health-research-evaluation/data-access-health-data-central](https://www2.gov.bc.ca/gov/content/health/conducting-health-research-evaluation/data-access-health-data-central)): British Columbia Ministry of Health [creator] (2022): Medical Services Plan (MSP) Payment Information File. BC Ministry of Health [publisher]. BC Ministry of Health (2022); Canadian Institute for Health Information [creator] (2022): National Ambulatory Care Reporting System. BC Ministry of Health [publisher].
BC Ministry of Health (2022); Canadian Institute for Health Information [creator] (2022): Discharge Abstract Database (Hospital Separations). BC Ministry of Health [publisher]. BC Ministry of Health (2022); British Columbia Ministry of Health [creator] (2022): Consolidation File (MSP Registration & Premium Billing). BC Ministry of Health [publisher]. BC Ministry of Health (2022). Parts of this material are based on data and/or information compiled and provided by the Canadian Institute for Health Information (CIHI). This study was supported by ICES, which is funded in part by an annual grant from the Ontario Ministry of Health. The authors also acknowledge the Manitoba Centre for Health Policy for use of data contained in the Manitoba Population Research Data Repository under project no. 2022-004 (HIPC no. 2021/2022-25). Data used in this study are from the Manitoba Population Research Data Repository housed at the Manitoba Centre for Health Policy, University of Manitoba, and were derived from data provided by Manitoba Health. The opinions, results and conclusions are those of the authors. No endorsement by the provincial governments, data stewards, CIHI, Health Canada or CIHR is intended or should be inferred.

This is an Open Access article distributed in accordance with the terms of the Creative Commons Attribution (CC BY-NC-ND 4.0) licence, which permits use, distribution and reproduction in any medium, provided that the original publication is properly cited, the use is noncommercial (i.e., research or educational use), and no modifications or adaptations are made. See: [https://creativecommons.org/licenses/by-nc-nd/4.0/](https://creativecommons.org/licenses/by-nc-nd/4.0/)

## References

1. Kluberg SA, Hou L, Dutcher SK, et al. (2022) Validation of diagnosis codes to identify hospitalized COVID-19 patients in health care claims data. Pharmacoepidemiol Drug Saf 31:476–80.
2. Kluberg S. Validation of claims-based algorithms to identify hospitalized COVID-19 events within the FDA Sentinel System. ICPE all access special COVID-19 sessions; 2020 Nov. 6–Dec. 3; virtual meeting.
3. Wu G, D’Souza AG, Quan H, et al. (2022) Validity of ICD-10 codes for COVID-19 patients with hospital admissions or ED visits in Canada: a retrospective cohort study. BMJ Open 12:e057838.
4. Kadri SS, Gundrum J, Warner S, et al. (2020) Uptake and accuracy of the diagnosis code for COVID-19 among US hospitalizations. JAMA 324:2553–4.
5. Bhatt AS, McElrath EE, Claggett BL, et al. (2021) Accuracy of ICD-10 diagnostic codes to identify COVID-19 among hospitalized patients. J Gen Intern Med 36:2532–5.
6. Hinds A, Lix LM, Smith M, et al. (2016) Quality of administrative health databases in Canada: a scoping review. Can J Public Health 107:e56–61.
7. Lix LM, Vasylkiv V, Ayilara O, et al. (2022) A synthesis of algorithms for multi-jurisdiction research in Canada. Int J Popul Data Sci 7:1911.
8. (2020) ICD-10-CA coding direction for suspected COVID-19 cases (Canadian Institute for Health Information, Ottawa) Available: [https://www.cihi.ca/sites/default/files/document/covid-19-presentation-en.pdf](https://www.cihi.ca/sites/default/files/document/covid-19-presentation-en.pdf). Accessed 2022 June 16.
9. (2020) Re: COVID-19 Temporary Fee Schedule Codes Implemented - Physicians can begin to submit claims for COVID-19 on May 1, 2020 [INFOBulletin] (Health Services Branch, Ministry of Health, Toronto) modified 2020 May 1. Available: [https://www.ontario.ca/document/ohip-infobulletins-2020/bulletin-4755-covid-19-temporary-fee-schedule-codes-implemented](https://www.ontario.ca/document/ohip-infobulletins-2020/bulletin-4755-covid-19-temporary-fee-schedule-codes-implemented). Accessed 2022 June 1.
10. (2020) Virtual Telehealth Reference Guide - billing (Divisions of Family Practice, Vancouver) Available: [https://divisionsbc.ca/sites/default/files/54972/Virtual%20Telehealth%20Reference%20Guide%20-%20billing_0.pdf](https://divisionsbc.ca/sites/default/files/54972/Virtual%20Telehealth%20Reference%20Guide%20-%20billing_0.pdf). Accessed 2022 June 16.
11. von Elm E, Altman DG, Egger M, et al., STROBE Initiative (2007) The Strengthening the Reporting of Observational Studies in Epidemiology (STROBE) statement: guidelines for reporting observational studies. PLoS Med 4:e296.
12. Benchimol EI, Smeeth L, Guttmann A, et al., RECORD Working Committee (2015) The REporting of studies Conducted using Observational Routinely-collected health Data (RECORD) statement. PLoS Med 12:e1001885.
13. Lix LM, Walker R, Quan H, et al., CHEP-ORTF Hypertension Outcomes and Surveillance Team (2012) Features of physician services databases in Canada. Chronic Dis Inj Can 32:186–93.
14. Claims processing solution (CPS) (Government of Manitoba, Winnipeg) Available: [https://www.gov.mb.ca/health/claims/index.html](https://www.gov.mb.ca/health/claims/index.html). Accessed 2022 June 16.
15. Buajitti E, Chiodo S, Rosella LC (2020) Agreement between area- and individual-level income measures in a population-based cohort: implications for population health research. SSM Popul Health 10:100553.
16. Roos NP, Mustard CA (1997) Variation in health and health care use by socioeconomic status in Winnipeg, Canada: Does the system work well? Yes and no. Milbank Q 75:89–111.
17. Youden WJ (1950) Index for rating diagnostic tests. Cancer 3:32–5.
18. Hale T, Angrist N, Goldszmidt R, et al. (2021) A global panel database of pandemic policies (Oxford COVID-19 Government Response Tracker). Nat Hum Behav 5:529–38.

© 2023 CMA Impact Inc. or its licensors