Published in Vol 27 (2025)


Preprints (earlier versions) of this paper are available at https://preprints.jmir.org/preprint/52244.
Perspectives on Using Artificial Intelligence to Derive Social Determinants of Health Data From Medical Records in Canada: Large Multijurisdictional Qualitative Study


Original Paper

1Department of Health Behavior and Health Equity, School of Public Health, University of Michigan–Ann Arbor, Ann Arbor, MI, United States

2Upstream Lab, MAP Centre for Urban Health Solutions, Li Ka Shing Knowledge Institute, Unity Health Toronto, Toronto, ON, Canada

3Primary Healthcare Research Unit, Memorial University of Newfoundland and Labrador, St. John's, NL, Canada

4Department of Family Medicine, Rady Faculty of Health Sciences, University of Manitoba, Winnipeg, MB, Canada

5Department of Family Medicine, Dalhousie University, Halifax, NS, Canada

6School of Health and Human Performance, Dalhousie University, Halifax, NS, Canada

7Department of Community Health & Epidemiology, College of Medicine, University of Saskatchewan, Saskatoon, SK, Canada

8Department of Family Medicine, University of Calgary, Calgary, AB, Canada

9Department of Family and Community Medicine, Faculty of Medicine, University of Toronto, Toronto, ON, Canada

10Department of Family and Community Medicine, St. Michael’s Hospital, Toronto, ON, Canada

11Dalla Lana School of Public Health, University of Toronto, Toronto, ON, Canada

Corresponding Author:

Andrew D Pinto, CCFP, MSc, MD

Upstream Lab, MAP Centre for Urban Health Solutions

Li Ka Shing Knowledge Institute

Unity Health Toronto

30 Bond Street

Toronto, ON, M5B 1W8

Canada

Phone: 1 416 864 6060 ext 76148

Email: andrew.pinto@utoronto.ca


Background: Data on the social determinants of health could be used to improve care, support quality improvement initiatives, and track progress toward health equity. However, this data collection is not widespread. Artificial intelligence (AI), specifically natural language processing and machine learning, could be used to derive social determinants of health data from electronic medical records. This could reduce the time and resources required to obtain social determinants of health data.

Objective: This study aimed to understand perspectives of a diverse sample of Canadians on the use of AI to derive social determinants of health information from electronic medical record data, including benefits and concerns.

Methods: Using a qualitative description approach, in-depth interviews were conducted with 195 participants purposefully recruited from Ontario, Newfoundland and Labrador, Manitoba, and Saskatchewan. Transcripts were analyzed using an inductive and deductive content analysis.

Results: A total of 4 themes were identified. First, AI was described as the inevitable future, facilitating more efficient, accessible social determinants of health information and use in primary care. Second, participants expressed concerns about potential health care harms and a distrust in AI and public systems. Third, some participants indicated that AI could lead to a loss of the human touch in health care, emphasizing a preference for strong relationships with providers and individualized care. Fourth, participants described the critical importance of consent and the need for strong safeguards to protect patient data and trust.

Conclusions: These findings provide important considerations for the use of AI in health care, particularly when health care administrators and decision makers seek to derive social determinants of health data.

J Med Internet Res 2025;27:e52244

doi:10.2196/52244


The social determinants of health (SDoH), people’s daily living and working conditions that are influenced by policies and structures (eg, racism and housing) [1], contribute to systemic and avoidable health inequities across groups [2,3]. These resource- and power-related determinants shape access to high-quality health care services and delivery and widen the gap in health outcomes across sociocultural groups [4-6]. For example, people with lower incomes in the United States often experience barriers to accessing care due in part to gaps in health insurance, and many Black individuals have experienced health care discrimination and worse postoperative care outcomes than White individuals due to structural racism [7-10]. Income inequities in access to primary and specialist care persist in Canada despite universal health care [11,12].

Primary health care is at the nexus of medical care, public health, and community and social services [13-15], and the collection of SDoH data in primary care is essential to identify and tackle inequities that contribute to poor health outcomes [4,16,17]. These data could be used to improve care, help patients with their social and financial situations through coordination with local services [17-20], and guide health care changes and public policy [17,19-21]. However, it is challenging to operationalize SDoH data collection in real-world clinical settings. Practical and technological challenges, including the lack of a unified and standardized measurement of SDoH [22], variability across electronic medical record (EMR) systems [23-25], and finite health care system capacity, can limit data interoperability and reduce the power of these data to inform health care systems and public health policies [23-25]. In addition, the added workload of administering, collecting, documenting, and responding to identified needs can strain health care teams’ capacity and increase the risk of burnout [21,26-29].

Artificial intelligence (AI) could help realize the potential benefits of SDoH data by identifying patient needs [30,31] and responding to health care overcapacity and disease complexity [32,33]. Machine learning is a common type of AI used to detect, predict, and categorize outcomes by looking for patterns in the data that are associated with known observations or “ground truth” cases [34,35]. Although this work is still in exploratory stages, machine learning and natural language processing have demonstrated the feasibility of detecting a number of SDoH from EMR data, such as adverse childhood experiences [36], social connections [37], living situation [31,36], and employment [31], as well as predicting health outcomes based on the SDoH [25].
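The cited studies used a range of models; as a purely illustrative sketch (not the method of any of those studies), the example below shows the general supervised-learning pattern: a classifier is fit to human-labelled “ground truth” note snippets and then estimates the probability that a new note reflects a given determinant, here housing insecurity. The notes, labels, and choice of library (scikit-learn) are hypothetical.

```python
# Minimal sketch (not the cited authors' method): a supervised text classifier
# that flags possible housing insecurity in free-text EMR notes. The notes and
# labels below are hypothetical toy data for illustration only.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Toy "ground truth" cases: note snippets annotated by humans (1 = housing insecurity).
notes = [
    "pt reports couch surfing since eviction last month",
    "lives with spouse in own home, no concerns",
    "currently staying at a shelter downtown",
    "stable housing, works full time",
    "worried about missing rent again this month",
    "no housing issues reported at this visit",
]
labels = [1, 0, 1, 0, 1, 0]

# Bag-of-words features plus logistic regression: the simplest end-to-end pattern.
model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
model.fit(notes, labels)

# Estimate the probability of housing insecurity for an unseen note; in practice
# such predictions would be surfaced for human review rather than acted on alone.
new_note = ["patient mentions sleeping in car after losing apartment"]
print(model.predict_proba(new_note)[0][1])
```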

While there is strong potential for adopting AI to identify SDoH data, there are multiple potential harms. Concerns with using AI technology in health care include the exacerbation of biases and inequities, discrimination [32], data security and privacy, and a lack of infrastructure [38-40]. For example, the datasets themselves may reflect discriminatory or biased practices, which the algorithms are then trained to learn [32,41]. Similarly, unstructured physician notes may contain biases that are replicated in the patterns the models learn and can affect patient outcomes [32,41].

The perspectives of patients and the general public (who have been or may become patients) are critical to shaping decisions surrounding the implementation of AI in health care. Patient data are used to develop AI algorithms, and patients are affected by having AI inform their care [42]. Despite this importance, few articles focus on public or patient perspectives on AI in health care [38-40,42]. A scoping review of 37 articles (grey literature and peer reviewed) found that many patients reported positive views on AI, although views may differ based on patients’ experiences, concerns, and trust [42]. However, there is limited knowledge of patients’ perspectives on using AI to derive their SDoH information from existing EMR data, which may elicit different views due to the sensitive nature of this information. Given the widespread adoption of EMRs in primary care settings, the sensitivity of SDoH data, and the potential harms associated with AI in health care, it is essential to learn viewpoints on whether and how AI could be implemented in an equitable, acceptable, and safe manner. This qualitative study aimed to understand the perspectives of a diverse sample of Canadians on using AI to derive SDoH information from existing EMR data in primary care, including potential benefits and concerns.


Study Background

This study was conducted as part of a multicomponent project that developed and refined a standardized SDoH questionnaire for primary care settings, known as the Screening for Poverty And Related Social Determinants and Intervening to Improve Knowledge of and Links to Resources (SPARK) Tool (Multimedia Appendix 1) [43]. Details on the SPARK Tool and the broader project are provided elsewhere [43]. For this paper, we report on data gathered from in-depth interviews with participants related to their perspectives on having AI derive SDoH data from the EMR. Participants became aware of the SDoH before the in-depth interviews by completing the SPARK Tool and were provided with examples of determinants.

Study Design, Setting, and Sampling Approach

A qualitative description approach was chosen to provide a comprehensive summary of participants’ experiences and preferences on using AI to derive SDoH data in primary care settings [44]. One-on-one, semistructured interviews were conducted using video teleconference software across 4 Canadian provinces: Ontario, Newfoundland and Labrador, Saskatchewan, and Manitoba.

Maximum variation sampling helped to ensure that a diverse sample of participants was included across ages, races and ethnicities, locations (rural or urban), languages, and gender identities where possible [45]. After indicating interest in the study, potential participants were asked about these characteristics, which enabled the purposive selection of individuals who were not initially well represented in the sample. Adults (aged 18 years and older) were recruited through social media advertisements, email distribution lists, and posters in community and health centers [43]. Advertisements were translated into the 3 most commonly spoken languages other than English and French in each province, although interviews were only conducted in English due to a lack of non-English speakers contacting the study team. The study aimed to recruit a sample of approximately 200 individuals, with a greater number of participants from Ontario due to the size of the province, to ensure adequate diversity across the multiple domains of the SPARK Tool and to inform the multicomponent study objectives and sampling approach.

Data Collection

The interviewers were female research staff (DH, AD-P, AZS, LK, and IAM) or research assistants, and all received training or had previous experience with qualitative interviewing. The study was also informed regularly by a national advisory group composed of patient partners, scientists, and other collaborators. Data collection occurred from April 2021 to January 2022. The interview guide was established and iteratively revised through meetings with the research team and participant feedback to ensure that the questions were comprehensible for participants without a background in AI. Participants provided verbal informed consent before participating in the study, in accordance with research ethics board requirements. Participants were asked their perspectives on having secure software access patients’ records to derive their SDoH information, as opposed to asking patients for this information directly (Multimedia Appendix 2). Follow-up questions explored possible benefits of or concerns with the use of this software.

Data Analysis

Interviews were audio-recorded and transcribed verbatim using a professional service. NVivo (version 12; Lumivero) software was used for data management. The codebook was developed collaboratively by the interviewers and the analysis team (VHD, JRQ, IAM, DH, AZS, LK, AD-P, and ADP). Two study team members (VHD and JRQ) with qualitative research experience led the analysis. An initial codebook outline was created by 2 of the interviewers (AD-P and IAM) during independent review of 2 transcripts. This outline helped to guide the development of a preliminary codebook, which was developed by the lead analysts after examining 10 randomly selected transcripts, including the 2 previously examined transcripts (4 from Ontario, 2 each from Newfoundland and Labrador, Saskatchewan, and Manitoba). The remaining analysis team members reviewed the same 10 transcripts and provided detailed feedback on the preliminary codebook. After the codebook was revised and agreed upon, 2 study team members continued to refine it by reviewing 2 additional transcripts each. Codes were compared for consistency and alignment, and extensive documentation was created to outline decisions made during multiple meetings between the coders. The codebook continued to undergo minor edits as more transcripts were reviewed. After 12 transcripts were thoroughly reviewed and examined between the 2 coders, the remaining interviews were randomly split within each province between the coders.

A qualitative content analysis was conducted to focus on theme development across participant interviews [46-48]. A combined inductive and deductive approach was used to enable flexibility to incorporate emerging codes and findings from transcripts, while also focusing on codes pertaining to the research and interview questions.

The researchers familiarized themselves with the transcripts before analysis by thoroughly reading the contents. For the inductive coding approach, the transcripts were read and meaning units (chunks of data, such as sentences or a paragraph) were selected and coded without a predetermined plan [46,47]. For the deductive coding approach, codes specific to potential benefits, challenges, or concerns with using AI to derive SDoH information were created before analyzing the data, based on the research aims and interview questions. Meaning units were assigned to these predetermined codes, and codes were grouped into categories and themes in an iterative process. Throughout the data analysis, regular meetings were held with the analysis team to discuss updates and preliminary findings and to assist with the context and interpretation of codes into themes. Figure 1 provides a simplified description of the analysis workflow based on the methodology by Elo and Kyngäs [48].

Figure 1. Qualitative content analysis based on the methodology by Elo and Kyngäs [48].

Positionality, Reflexivity, and Trustworthiness

The 5 main interviewers (IA, DH, AZS, LK, and AD-P) reside in different provinces and have different backgrounds and perspectives. The team met regularly to discuss the interviews and the project, both while the interviews were being conducted and during the analysis, alongside VHD, JRQ, ADP, and at times EA. This team approach helped to prevent any one perspective from dominating the interviews and analysis, serving as a form of data triangulation to enhance trustworthiness [49].

The 2 main analysts (VHD and JRQ) engaged in reflexivity to understand how their own biases and preconceptions could influence how the data were approached and analyzed, through regular journaling and discussions before and throughout the analysis [49,50]. They also wrote memos while analyzing interviews to document early findings [49,50]. For example, they documented their own beliefs about whether AI should be used to derive SDoH information, which became more nuanced based on the benefits and concerns expressed by participants. Both analysts created extensive documentation outlining the processes and decisions for the coding approach and analysis [49].

Ethical Considerations

This study was approved by the Unity Health Toronto Research Ethics Board (#20-241), Saskatchewan Behavioural Research Ethics Board (#2373), Newfoundland and Labrador Health Research Ethics Board (#2020.259), and University of Manitoba Health Research Ethics Board (#HS24204).


Overview

There were 195 interviews conducted across the 4 provinces, lasting approximately 30-45 minutes each. Most participants lived in Ontario (124/195, 64%), identified as women (126/195, 65%), and non-White (122/169, 72% among those who disclosed their race and ethnicity), while 38% (75/195) of participants reported at least 1 unmet social need (Table 1).

Table 1. Participant demographics (N=195; adapted from Adekoya et al [43], which is published under a Creative Commons Attribution 4.0 International License [51]).

Characteristics: Participants, n (%)a

Canadian province
  Ontario: 125 (64)
  Saskatchewan: 25 (13)
  Manitoba: 24 (12)
  Newfoundland and Labrador: 21 (11)

Race and ethnicity
  Asian: 71 (36)
  Black: 17 (9)
  Indigenous: 8 (4)
  White: 47 (24)
  Other: 10 (5)
  Multiracial: 16 (8)
  Data not collected (Manitoba) or no responseb: 26 (12)

Gender
  Man: 58 (30)
  Woman: 126 (65)
  Transgender, gender fluid, or nonbinary: 9 (5)
  No response: ≤5

Sex at birth
  Male: 58 (30)
  Female: 133 (68)
  Intersex: ≤5
  No response: ≤5

One or more unmet social needsc
  Yes: 75 (38)
  No: 120 (62)

Difficulty making ends meet
  Yes: 43 (22)
  No: 151 (77)
  No response: ≤5

Highest level of educational attainment
  Less than a high school diploma: ≤5
  High school diploma or some postsecondary education: 48 (25)
  Trades certificate or diploma: 14 (7)
  College or university degree: 94 (48)
  Postgraduate degree: 34 (17)
  No response: ≤5

aPercentage values are not provided for values that are ≤5.

bRace and ethnicity data were not collected in Manitoba.

cUnmet social needs were based on the following categories: precarious employment (presence of all of the following: short term, casual, or temporary employment; fear of being fired if raised employment concerns; and varying pay); living in social housing; missed rent or utility bill payments; missed appointment due to transportation cost; avoided filling prescription or made it last longer due to cost; difficulty making ends meet; and lack of social support.

Themes

The results are presented as main themes and subthemes. The 4 main themes include AI as the inevitable future; potential health care harms; loss of the human touch; and consent is critical. Additional quotes that provide evidence of themes are provided in Multimedia Appendix 3.

AI as the Inevitable Future: Facilitating More Efficient, Accessible SDoH Information and Use

Participants described common benefits of using AI in medicine to derive their SDoH information, viewing it as the future of technological advancement and noting that “it’s much better that way” (05_20, Newfoundland and Labrador). Some participants spoke of AI as inevitable regardless of their perspectives for or against its use and seemed to acquiesce to its use in medicine.

Yeah, I’m for it… I mean it’s the future of medicine and it’s the future of the world so… Whether I have concerns or not, it’s going to take over, but personally I don’t have concerns, I’m okay with it, yep.
[01_135 (Ontario)]
Efficient, Streamlined Social Determinants of Health Data

Most participants expressed that the benefits of using AI in primary care were to streamline SDoH data collection in a more efficient and timely manner for patients, staff, and the health care system.

Well, I see benefits cause it streamlines the process and makes your information more accessible to people so you’re not the one who has to like repeat and remind [healthcare staff] all the time.
[04_16 (Manitoba)]

One participant mentioned that, if used correctly and accurately, AI could reduce the burden on the health care system and free up resources for providers to speak with patients.

People think that artificial intelligence and technology will be replacing jobs but like that’s not true because it only is there to help us move faster in life…Once you give the job [of deriving patients’ SDoH data through AI] …the people who were doing that [previously] could have one-on-one…interpersonal communication with patients and you know speak to them while they’re in the waiting room.
[01_06 (Ontario)]
AI Could Overcome Barriers to Disclosure

A small number of participants described how using AI to derive SDoH information based on existing EMR data could overcome barriers to verbal disclosure, in a “more accessible” (01_124, Ontario) manner.

So, I do [see benefits of AI], I think like folks that have challenges expressing themselves like…people that are vulnerable…the homeless, folks that are coming out of traumatic situations…people that have language barriers, speech barriers; I think this technology would be very beneficial to them…And people that have disabilities.
[01_99 (Ontario)]
Data Use to Improve Health

Participants expressed ways that using AI to derive SDoH information could be useful for improving health or health care. For example, 1 participant described how it would help to

move towards a system of greater continuity of care and a system where you are not forced to tell each new professional you see the same old story. [01_134 (Ontario)]

Participants expressed that AI could help generate automated alerts of potential conditions for individuals based on group memberships from the SDoH data, in addition to other health information in the EMR.

I think it’s good for certain things like I think if you’re… from a particular country or race or something and you’re prone to very like high health risk diseases for example…I think that’s important and sure a computer can fill that information out so that way they red flag like check her heart every time she comes in.
[01_68 (Ontario)]

Participants indicated that AI could help automate connections to local social and community resources to assist with their social situation.

That’s a way to screen and actually find the best help for you within your area…say for example the person has addiction or…mental health challenges or…are in an abusive situation…Sometimes these programs are so filled there is not enough space available…this program can provide all this information accessible to you and show the patient immediately if tomorrow they can get help… they’re put on some waiting list or [indicate where] there are shelters available or…programs from the government.
[01_38 (Ontario)]

Some participants expressed that disaggregated SDoH data could be used at a larger health care or population level, as opposed to the individual level. This included assistance with policy-making decisions to reduce health inequities, disease surveillance, understanding the causes of diseases, and prevention. COVID-19 disparities were mentioned as an example of how SDoH data could be used.

I think the benefits is if we’re going to be using to track down diseases like not [individual] people [and determine]…What are the cause?... That would be beneficial and [it could be used for prevention] … [You could] do more campaigns…[but] knowing the stats and releasing that to the public in general I think that would help everybody…[It would help] to make them aware of what’s going on… Especially now with COVID.
[03_16 (Saskatchewan)]

Similarly, 1 participant described how the information generated could be used to inform “culturally competent” care through staff training practices.

You [can use AI for demographic purposes, such as you] find that you have 50% African, 30% Asian, 10% White, 20% mixed [racial identities] … Then it would therefore affect training because clearly the majority of your clientele will be African so you have to be culturally competent to be able to treat [them] with whatever they are coming with, the possibility so I think in that sense it could be beneficial.
[03_19 (Saskatchewan)]
Potential Health Care Harms: Distrust in AI Used in Public Systems

Many participants held strong views about the potential harms of implementing AI in primary care and expressed an overall distrust in AI.

AI Inaccuracies and Impact on Care

The primary concerns with using AI to derive SDoH information in health care were inaccurate SDoH predictions and the subsequent consequences for care. Many participants were particularly concerned about their race and ethnicity, gender identity, and sexual orientation being incorrectly identified, including “misclassifications” (01_28, Ontario).

I think that could actually kind of be dangerous… I’m not sure I would trust the computer…I mean I work in [information technology] but [Laugh] I actually wonder how would a computer know for sure what my racial identity is?... I’m just afraid of the computer making the wrong choice and then it somehow impacting my healthcare negatively… If the computer [was] in the background [and] was only doing it for statistics that is not for an individual’s healthcare then I mean I don’t really care. I feel like there could be room for error.
[01_32 (Ontario)]

However, a few participants felt that more inaccuracies would result from a human compared with AI and they would trust the computer over humans with their SDoH information:

I think maybe a computer would probably be better than an individual doing it because you know [for an] individual there is always I would think a larger chance for a margin of human error rather than a computer.
[01_56 (Ontario)]

With regard to updating SDoH information, some participants noted that social situations are malleable and may change, but those changes may not be accurately captured in their EMR by their physician unless patients are explicitly asked about them through a survey. Participants mentioned this often in the context of financial situations, sexual orientation, or gender identity.

I think asking people one-by-one even though it is a bit more time consuming is worth it because you know like even for sexuality and all like it always changes so it’s not like they can, it can be generated once because it should come from the main source…because like let’s say you’re in a court and someone says like you know where did you get this information from? It’s like I didn’t even provide that information.
[01_39 (Ontario)]

Participants also described how AI would not pick up on the subtle human cues that are intrinsic to conversation.

The only problem, when it comes to like using that type of technology is there’s no emotion behind it so when you’re speaking with someone one-on-one you might say something in a certain way but you don’t mean it that way. So, your facial expression, the way you speak it communicates your idea across differently.
[01_27 (Ontario)]

In addition, some participants had different perspectives on using AI depending on the SDoH variable being predicted. They mentioned that perhaps certain SDoH could be derived from EMR data while others should be asked of patients directly.

So, I think for things that are like you know pretty like core defining questions like your race, your gender, your age, your citizenship like I don’t want the thing guessing that I’m Chinese based on my last name for example…So, I think those things are, you know like more like the legal aspects of things… Other things like you know your income like okay, it can predict it and probably it’s like pretty accurate…I would just want to be assured that whatever predictions it makes like it would be to try to benefit me and not to try to like characterize myself and like sell me some new drugs.
[01_72 (Ontario)]
Privacy and Security

Privacy and security were important concerns of participants:

I think that privacy…has to be the number one issue that guides every aspect of this.
[04_21 (Manitoba)]

Participants expressed concerns with the management of SDoH information in the EMR, including that “…the infrastructure and the environment in which you are operating is not equipped to protect the data you are collecting.” (01_13, Ontario). Another participant mentioned “It’s frightening what computers are being programmed to do without any accountability.” [04_21 (Manitoba)].

Some participants also mentioned that retrieving the data from the EMR was in itself a “huge invasion of privacy” [01_80 (Ontario)].

Several participants described how they did not want AI to derive their SDoH information because it removed their ability to control what they share with their doctor and what they are comfortable sharing, given the sensitive nature of the SDoH questions. For example, some participants were not willing to have certain SDoH data, such as their race and ethnicity or sexual orientation, in the EMR at all.

Importantly I think it would make me very leery or weary of what I share every time I talk to a healthcare provider cause I might want to share a sensitive piece of information but I don’t want that information to be widely available on my record …Your program to retroactively go back and gather this information from all of my visits just makes me feel very like not in control of my own information.
[01_50 (Ontario)]

The interviews occurred at a time of media attention surrounding hacking and privacy breaches of government systems across provinces, as well as COVID-19 vaccine policies. Participants also reported distrust in government and in technology overall.

My thoughts on this whole COVID-19, yeah, it’s a long story and stuff but I don’t trust the government so I wouldn’t want it to be done... I don’t have a lot of trust in computers... You can say everything is secure but like I said I’ve seen it over time and especially in the last 20 months there’s been a lot of breaches…You want to know? Phone me. You want to question me or ask me? Bring me in person-to-person. Don’t do it over the computer... because you can be hacked at any time.
[03_22 (Saskatchewan)]
Data Misuse and Discrimination

A few participants mentioned concerns about data misuse and discrimination. For example, one participant described how the use of AI could be stigmatizing and could negatively stereotype people, particularly those living with disabilities. Concerns about discrimination based on race and ethnicity and on income were also reported.

Some participants extrapolated the harms beyond the health care system, comparing the potential for AI to discriminate, or be used for discrimination, to their concerns regarding the criminal justice system and policing. For example, 1 participant indicated that it would facilitate “[racial] profiling in general and not only healthcare but the government is going to use it, so I am concerned” and provided an example of it being used by fascist governments for “genocide against communities” [01_07 (Ontario)].

There were also concerns regarding the data being misused if it were to be leaked.

The companies are trying to surveil us or they’re trying to… use our data for their own means…I’m thinking about like police officers... I think it was in British Columbia where like…the municipal police department …was using AI to surveil folks and… that is like very problematic and could really harm a lot of people and I think that this could just be linked to that and like I feel like police would maybe somehow have access to this if the government does.
[01_41 (Ontario)]

One participant described how the algorithm would essentially be stereotyping individuals in order to predict and identify their SDoH information.

Well, and it’s based on a whole lot of basically stereotyping groups so saying oh, well I am racialized and low-income therefore these things must be true and I’m not at all comfortable with, cause it’s, the whole system relies on those stereotypes and assumptions.
[04_17 (Manitoba)]

However, 1 participant believed that AI “is not racist. [Laugh] They don’t see race” [03_24 (Saskatchewan)] and is better suited to SDoH data retrieval as opposed to human data collection.

Loss of the Human Touch: Preference for Provider Relationships and Individualized Care

A few participants expressed an overall preference for strong patient-provider relationships and interactions and emphasized the need for individualized care. Participants believed AI could pose a barrier to patient-centered care and the patient-provider relationship by removing the “human factor” with providers, which could otherwise be nurtured through a conversation about their SDoH situation. For example, they expressed that it would be “de-personalizing encounters with the doctor” [04_20 (Manitoba)] or would be “dehumanizing,” particularly for people experiencing a mental health condition and social isolation, by “just being dealt with by a machine” [01_125 (Ontario)].

I think having the human touch to our view like these surveys is always good because well people will give more personal opinions if you allow them to whereas if it was just AI then you know we’re just numbers.
[01_36 (Ontario)]
Healthcare should be about individuals and it should be about connection. It should be about you and me face-to-face and to put artificial intelligence in there to pigeon hole me into a certain group I think would be very dangerous. I think we want to get away from that and start looking at individual situations and everybody’s you know life and how we can interact with them best.
[03_03 (Saskatchewan)]

Similarly, 1 participant provided a unique perspective as someone who identifies as living with disabilities.

I’d just like to see that be individualized and person centered... That requirement for person-centered care for diversity of options, a diversity of supports, and for that holistic care to be present and connected and supportive. Mostly I think it’s a human rights actual systemic problem to use AI.
[03_21 (Saskatchewan)]
Consent Is Critical: Strong Safeguards Are Needed to Protect Patients’ Data and Trust
Overview

Many participants advocated for strong safeguards and fully informed consent before AI is used to derive their SDoH information. Most of these participants described opt-in consent; however, some described opt-out consent. Participants called for “full transparency” [01_78 (Ontario)] regarding why the data are being collected by AI, what they will be used for, how they will be stored, who has access, and how the system will work, as well as oversight before implementation. Without these safeguards, participants mentioned that it would be a “breach of trust” [01_78 (Ontario)] and that they would feel upset and “caught off guard” [01_38 (Ontario)].

And I think verbal consent or written consent is very important…as opposed to the passive consent you know it’s like when you download something how it’s all agree, agree, agree…We really do need to read the fine print…And because I think that we, our private information is more valuable than we actually realize and we’ve gotten so accustomed to passing it out at times that we’re failing to realize how valuable and how sensitive it can be so….
[05_12 (Newfoundland and Labrador)]
So it is a black box in the way it uses that [information] but then is there a team which is always correcting it if it’s out of whack?
[01_13 (Ontario)]
Data Access and Verification

Multiple participants wanted the opportunity to check what the AI algorithms derived for their SDoH information and have the chance to rectify inaccuracies.

Maybe you know it would be okay if the AI [derives the SDoH information] first and then the patient…is asked to confirm like say here is what we think it is, do you think this is correct - and then the patient can say yes or no but I think not to just do it and like keep it there behind closed doors.
[03_05 (Saskatchewan)]

Principal Findings

Overall, there were varied perspectives on whether AI should be used in primary care to derive patients’ SDoH information. Some participants expressed that the benefits of more efficient data collection or improved care did not outweigh the potential harms from inaccurate predictions, privacy and security concerns, reduced patient-provider interactions, and data misuse and discrimination. Many participants emphasized the need for strong safeguards if AI is used, including fully informed consent, transparency, oversight, and the ability to check and verify predictions.

Benefits

Participants in other studies have similarly identified that the use of AI in health care could improve efficiency and promote better health and health care [52]. Integrating SDoH information through AI can help to predict risks of negative health outcomes (eg, suicide [53] and HIV [54]) and health care use [25,55-57]. Other studies have found that the identification of unmet social and financial needs using EMR data could assist with predicting the need for community resources and therefore facilitate community resource connections and personalized interventions to address patients’ social situations [57,58]. It could also help to inform future program design and community planning to build capacity and outreach, encourage access to necessary health services, and reduce related inequities [57,58].

Navigating Concerns and Establishing Strong Patient Protections

Many of the potential harms of AI that participants raised focused on their personal care and treatment at the individual level, as opposed to the use of aggregated data for system- or organization-level change. This may partly explain the strong concerns that surfaced about AI. Some participants highlighted the impracticality of continuously verifying AI-generated outcomes over time and expressed willingness to contribute their data anonymously for research and quality improvement. Health services may consider the feasibility of adopting alternative models of patient consent [59]. For example, tiered consent enables patients to opt in or out of sharing their data under various conditions (eg, data used at the aggregate level for algorithm development or at the individual level) [59,60]. It has been recommended that organizations implement patient education on consent regarding AI and work alongside patients to create a consent process that is coherent and straightforward [60]. Similar to the participants in this study, others have discussed consent, transparency, verification, control, and oversight as part of recommendations for the ethical use of AI [55,61-64].

AI is a broad term that describes many types of complex statistical methods, which are often neither transparent nor well understood by the public. This can foster mistrust in AI and in medicine, and given the novelty of AI in health care settings, there are still many unknowns. A major concern is the potential for AI to misclassify a person’s racial or ethnic background or to misrepresent social circumstances that fluctuate over time. Even with highly accurate AI models, misclassification is possible, as are algorithmic or data bias and societal bias [32]. Without careful oversight, inaccurate or biased AI models could significantly impact patient care or propagate discrimination and health inequities [32,41]. This may contribute to the overcriminalization and systemic discrimination experienced by racialized and low-income communities [65,66]. Thus, although resource-intensive, it is imperative that patients have the ability to verify AI-generated SDoH outcomes and to modify them. It is also incumbent on AI teams to produce algorithms that are not merely interpretable but that provide justification for their outputs [67] and proactively address health equity [68].
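One practical way to support the verification and oversight discussed here is a routine audit that compares AI-derived SDoH labels against patient-verified labels across self-identified groups. The following Python sketch is illustrative only; the group names, labels, and the misclassification_by_group function are hypothetical rather than a method used in this study.

    from collections import defaultdict

    def misclassification_by_group(records):
        """records: iterable of (group, ai_label, patient_verified_label) tuples.
        Returns the misclassification rate per group, a minimal audit that could
        flag subgroups where an SDoH model performs worse."""
        errors, totals = defaultdict(int), defaultdict(int)
        for group, predicted, verified in records:
            totals[group] += 1
            if predicted != verified:
                errors[group] += 1
        return {group: errors[group] / totals[group] for group in totals}

    # Hypothetical audit data: (self-identified group, AI-derived label, patient-verified label)
    audit = [
        ("group_a", "food_insecure", "food_insecure"),
        ("group_a", "not_food_insecure", "not_food_insecure"),
        ("group_b", "food_insecure", "not_food_insecure"),
        ("group_b", "not_food_insecure", "not_food_insecure"),
    ]
    print(misclassification_by_group(audit))  # {'group_a': 0.0, 'group_b': 0.5}

Divergent error rates of this kind would be a trigger for the careful oversight described above, rather than a definitive measure of fairness.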

Other studies examining perspectives on AI in health care have reported similar concerns about privacy and data breaches, infrastructure and oversight, and a lack of choice or control over data [24,38,39,52,62,63,69]. At the time of the interviews, the Canadian news media frequently discussed ransomware attacks and data breaches in medical data systems [70], including one described as the “worst in Canadian history” [71]. This context could have influenced participants’ concerns about using AI to derive SDoH data in this study.

Health care, particularly primary care, is a highly personal “social enterprise, powered by committed, caring, and collaborative connections between the humans involved” [72]. It is therefore not surprising that a major concern in this study was the potential for AI to threaten patient-centered and individualized care, harming the patient-provider relationship. Similar findings have been reported in the literature [69,73] and may reflect concerns about AI in medicine in general, as opposed to the backend extraction of SDoH data specifically. Some articles have suggested restricting AI from decision-making [74,75], such as automatically deriving patient SDoH data and recording it in patient records. Instead, AI could be used as a tool within patient-centered care that helps providers prioritize discussions about SDoH with their patients based on the risk of unmet needs [74]. Based on these discussions, team-based care approaches could support personalized actions following identification of a need.
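The following Python sketch illustrates this decision-support framing under stated assumptions: a hypothetical, separately validated model supplies a predicted_unmet_need_risk score, and the prioritize_for_sdoh_conversation function only ranks patients for a provider-led conversation rather than writing anything to the record.

    def prioritize_for_sdoh_conversation(patients, top_n=3):
        """Rank patients by a model-estimated risk of unmet social needs so that a
        provider can decide whether to raise SDoH in conversation; nothing is
        written to the patient record automatically."""
        ranked = sorted(patients, key=lambda p: p["predicted_unmet_need_risk"], reverse=True)
        return ranked[:top_n]

    # Hypothetical clinic roster; risk scores would come from a separate, validated model.
    roster = [
        {"patient_id": "A", "predicted_unmet_need_risk": 0.12},
        {"patient_id": "B", "predicted_unmet_need_risk": 0.81},
        {"patient_id": "C", "predicted_unmet_need_risk": 0.45},
    ]
    for patient in prioritize_for_sdoh_conversation(roster, top_n=2):
        print(patient["patient_id"], patient["predicted_unmet_need_risk"])
    # B 0.81
    # C 0.45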

Implementation of AI for SDoH data extraction should incorporate a strong focus on equity and meaningful partnerships with underrepresented communities at every stage to promote safety and minimize potential harms [32,38,62]. This should include community governance, particularly for Indigenous and Black communities [61], which would inform whether SDoH data are collected, the types of data collected, the frequency of collection, and how the data are used. In combination, using an equity framework to guide implementation could help to mitigate medical mistrust and associated health disparities [76,77], particularly among individuals who experience structural racism and discrimination in health care [76,78]. Overall, while the necessary safeguards may diminish some of the returns of more timely, efficient SDoH information retrieval using AI, they must be an inextricable part of using AI to derive SDoH data.

Strengths and Limitations

This study has multiple strengths. It involved a large sample of 195 participants across 4 Canadian provinces and used a maximum variation approach to recruit a diverse sample. For example, the majority of participants who disclosed their race and ethnicity identified as non-White, and most had at least 1 unmet social need. Few multijurisdictional studies have examined perspectives on using AI to derive SDoH data; thus, this study contributes to reducing knowledge gaps regarding the implementation of AI to derive SDoH data in primary care. Various techniques were also used to increase the trustworthiness and quality of the study. Furthermore, the SPARK project comprises a dedicated team of researchers and patient partners who are passionate about health and social inequities and who meet regularly to discuss the study and inform decision-making.

Several limitations were also identified. This study reports the findings from questions on AI that were asked as part of a larger interview on general SDoH data collection and use in primary care; participants’ perspectives may have been influenced by the questions that preceded the AI discussion. In addition, despite study recruitment materials being available in multiple languages for each province, the team relied on potential participants to make initial contact and was not approached by non-English speakers. While the sampling strategy captured a range of formal educational experiences, approximately one-third of participants had a college, university, or postgraduate degree. Thus, the study sample may not include individuals who experience language barriers, and the findings may be less transferable to individuals who did not complete grade school or high school. Furthermore, race-based data were not collected in Manitoba because of the need for greater community engagement. This study did not receive Research Ethics Board approval in all provinces to report participants’ ages; however, each province considered age when recruiting a diverse sample. Finally, given project timelines and resource constraints, qualitative data analysis was not formally initiated until the interviews were completed. Despite this, close communication, regular meetings, and ongoing feedback enabled the interview guide to be refined for greater clarity, and the interview questions were piloted by the study team beforehand.

Conclusions

This large qualitative study examined perspectives on using AI to derive SDoH data from existing EMR data in primary care. Participants described benefits related to efficiency, access, and improvement of care, while concerns focused on inaccuracies, negative consequences for care, privacy and security breaches, reduced patient-provider interaction, data misuse, and discrimination. Strong safeguards, including fully informed consent, verification, and oversight, can help to mitigate harm. There is a need to engage communities in the development, implementation, and evaluation of future AI initiatives to determine opportunities and safety measures for using AI. Interviews with providers, health care administrators, and decision-makers are also necessary to understand the feasibility of integrating AI for SDoH information in primary care.

Acknowledgments

We would like to sincerely thank the study participants from across Canada who kindly offered their time and perspectives. We are grateful for the entire SPARK team, including the many patient partners. We are also appreciative of Ellah San Antonio and Nada Dali for helping with the interviews.

This project was supported in part by the Canadian Institutes of Health Research (FRN 156885). Author [ADP] is supported as a Clinician-Scientist by the Department of Family and Community Medicine, Faculty of Medicine at the University of Toronto and at St. Michael’s Hospital; the Li Ka Shing Knowledge Institute, St. Michael’s Hospital; and a CIHR Applied Public Health Chair in Upstream Prevention. The opinions, results, and conclusions reported in this article are those of the authors and are independent of any funding sources.

Authors' Contributions

KA-B, MI, LAJ, AK, EGM, NM, CN and ADP contributed to the study conceptualization and design. IAM, DH, AZS, LK, and AD-P collected the data. The data analysis and interpretation were led by VHD and JRQ, and supported by DH, AZS, LK, AD-P, and ADP. Interpretation of the data was supported by EA, JC, MR, DS, AZ, and SG. VHD wrote the paper, and was assisted by JRQ and SG. All authors revised and approved the final draft.

Conflicts of Interest

None to declare.

Multimedia Appendix 1

SPARK (Screening for Poverty And Related Social Determinants and Intervening to Improve Knowledge of and Links to Resources) tool.

DOCX File , 27 KB

Multimedia Appendix 2

Interview questions assessing sociodemographic data collection using artificial intelligence.

DOCX File , 13 KB

Multimedia Appendix 3

Additional representative quotes.

DOCX File , 28 KB

  1. Social determinants of health. World Health Organization. URL: https://www.who.int/health-topics/social-determinants-of-health [accessed 2022-01-20]
  2. Braveman P, Gottlieb L. The social determinants of health: it's time to consider the causes of the causes. Public Health Rep. 2014;129 Suppl 2(Suppl 2):19-31. [FREE Full text] [CrossRef] [Medline]
  3. Marmot M, Friel S, Bell R, Houweling TAJ, Taylor S, Commission on Social Determinants of Health. Closing the gap in a generation: health equity through action on the social determinants of health. Lancet. 2008;372(9650):1661-1669. [CrossRef] [Medline]
  4. Pinto AD, Glattstein-Young G, Mohamed A, Bloch G, Leung F-H, Glazier RH. Building a foundation to reduce health inequities: routine collection of sociodemographic data in primary care. J Am Board Fam Med. 2016;29(3):348-355. [FREE Full text] [CrossRef] [Medline]
  5. Peiris D, Brown A, Cass A. Addressing inequities in access to quality health care for indigenous people. CMAJ. 2008;179(10):985-986. [FREE Full text] [CrossRef] [Medline]
  6. Baumann AA, Cabassa LJ. Reframing implementation science to address inequities in healthcare delivery. BMC Health Serv Res. 2020;20(1):190. [FREE Full text] [CrossRef] [Medline]
  7. Dickman SL, Himmelstein DU, Woolhandler S. Inequality and the health-care system in the USA. Lancet. 2017;389(10077):1431-1441. [CrossRef] [Medline]
  8. Nafiu OO, Mpody C, Kim SS, Uffman JC, Tobias JD. Race, postoperative complications, and death in apparently healthy children. Pediatrics. 2020;146(2):e20194113. [CrossRef] [Medline]
  9. Kraemer K, Cohen ME, Liu Y, Barnhart DC, Rangel SJ, Saito JM, et al. Development and evaluation of the American college of surgeons NSQIP pediatric surgical risk calculator. J Am Coll Surg. 2016;223(5):685-693. [CrossRef] [Medline]
  10. Johnson TJ. Intersection of bias, structural racism, and social determinants with health care inequities. Pediatrics. 2020;146(2):e2020003657. [CrossRef] [Medline]
  11. Haggerty J, Levesque JF, Harris M, Scott C, Dahrouge S, Lewis V, et al. Does healthcare inequity reflect variations in peoples' abilities to access healthcare? Results from a multi-jurisdictional interventional study in two high-income countries. Int J Equity Health. 2020;19(1):167. [FREE Full text] [CrossRef] [Medline]
  12. Hirello L, Pulok MH, Hajizadeh M. Equity in healthcare utilization in Canada's publicly funded health system: 2000-2014. Eur J Health Econ. 2022;23(9):1519-1533. [CrossRef] [Medline]
  13. DeVoe JE, Bazemore AW, Cottrell EK, Likumahuwa-Ackman S, Grandmont J, Spach N, et al. Perspectives in primary care: a conceptual framework and path for integrating social determinants of health into primary care practice. Ann Fam Med. 2016;14(2):104-108. [FREE Full text] [CrossRef] [Medline]
  14. Rasanathan K, Montesinos EV, Matheson D, Etienne C, Evans T. Primary health care and the social determinants of health: essential and complementary approaches for reducing inequities in health. J Epidemiol Community Health. 2011;65(8):656-660. [CrossRef] [Medline]
  15. Exworthy M, Morcillo V. Primary care doctors' understandings of and strategies to tackle health inequalities: a qualitative study. Prim Health Care Res Dev. 2019;20:e20. [FREE Full text] [CrossRef] [Medline]
  16. Penman-Aguilar A, Talih M, Huang D, Moonesinghe R, Bouye K, Beckles G. Measurement of health disparities, health inequities, and social determinants of health to support the advancement of health equity. J Public Health Manag Pract. 2016;22 Suppl 1(Suppl 1):S33-S42. [FREE Full text] [CrossRef] [Medline]
  17. Andermann A, CLEAR Collaboration. Taking action on the social determinants of health in clinical practice: a framework for health professionals. CMAJ. 2016;188(17-18):E474-E483. [FREE Full text] [CrossRef] [Medline]
  18. Fichtenberg CM, Alley DE, Mistry KB. Improving social needs intervention research: key questions for advancing the field. Am J Prev Med. 2019;57(6 Suppl 1):S47-S54. [FREE Full text] [CrossRef] [Medline]
  19. Gottlieb L, Tobey R, Cantor J, Hessler D, Adler NE. Integrating social and medical data to improve population health: opportunities and barriers. Health Aff (Millwood). 2016;35(11):2116-2123. [CrossRef] [Medline]
  20. Pinto AD, Bloch G. Framework for building primary care capacity to address the social determinants of health. Can Fam Physician. 2017;63(11):e476-e482. [FREE Full text] [Medline]
  21. Davis VH, Dainty KN, Dhalla IA, Sheehan KA, Wong BM, Pinto AD. "Addressing the bigger picture": a qualitative study of internal medicine patients' perspectives on social needs data collection and use. PLoS One. 2023;18(6):e0285795. [FREE Full text] [CrossRef] [Medline]
  22. Davis VH, Rodger L, Pinto AD. Collection and use of social determinants of health data in inpatient general internal medicine wards: a scoping review. J Gen Intern Med. 2023;38(2):480-489. [FREE Full text] [CrossRef] [Medline]
  23. Hatef E, Rouhizadeh M, Tia I, Lasser E, Hill-Briggs F, Marsteller J, et al. Assessing the availability of data on social and behavioral determinants in structured and unstructured electronic health records: a retrospective analysis of a multilevel health care system. JMIR Med Inform. 2019;7(3):e13802. [FREE Full text] [CrossRef] [Medline]
  24. Alpert J, Kim HJ, McDonnell C, Guo Y, George TJ, Bian J, et al. Barriers and facilitators of obtaining social determinants of health of patients with cancer through the electronic health record using natural language processing technology: qualitative feasibility study with stakeholder interviews. JMIR Form Res. 2022;6(12):e43059. [FREE Full text] [CrossRef] [Medline]
  25. Chen M, Tan X, Padman R. Social determinants of health in electronic health records and their impact on analysis and risk prediction: a systematic review. J Am Med Inform Assoc. 2020;27(11):1764-1773. [FREE Full text] [CrossRef] [Medline]
  26. Williams-Roberts H, Neudorf C, Abonyi S, Cushon J, Muhajarine N. Facilitators and barriers of sociodemographic data collection in Canadian health care settings: a multisite case study evaluation. Int J Equity Health. 2018;17(1):186. [FREE Full text] [CrossRef] [Medline]
  27. Bakken S. Can informatics innovation help mitigate clinician burnout? J Am Med Inform Assoc. 2019;26(2):93-94. [FREE Full text] [CrossRef] [Medline]
  28. Lofters AK, Shankardass K, Kirst M, Quiñonez C. Sociodemographic data collection in healthcare settings: an examination of public opinions. Med Care. 2011;49(2):193-199. [CrossRef] [Medline]
  29. Kirst M, Shankardass K, Bomze S, Lofters A, Quiñonez C. Sociodemographic data collection for health equity measurement: a mixed methods study examining public opinions. Int J Equity Health. 2013;12:75. [FREE Full text] [CrossRef] [Medline]
  30. Patra BG, Sharma MM, Vekaria V, Adekkanattu P, Patterson OV, Glicksberg B, et al. Extracting social determinants of health from electronic health records using natural language processing: a systematic review. J Am Med Inform Assoc. 2021;28(12):2716-2727. [FREE Full text] [CrossRef] [Medline]
  31. Lybarger K, Dobbins NJ, Long R, Singh A, Wedgeworth P, Uzuner Ö, et al. Leveraging natural language processing to augment structured social determinants of health data in the electronic health record. J Am Med Inform Assoc. 2023;30(8):1389-1397. [FREE Full text] [CrossRef] [Medline]
  32. d'Elia A, Gabbay M, Rodgers S, Kierans C, Jones E, Durrani I, et al. Artificial intelligence and health inequities in primary care: a systematic scoping review and framework. Fam Med Community Health. 2022;10(Suppl 1):e001670. [FREE Full text] [CrossRef] [Medline]
  33. Balicer RD, Cohen-Stavi C. Advancing healthcare through data-driven medicine and artificial intelligence. In: Nordlinger B, Villani C, Rus D, editors. Healthcare and Artificial Intelligence. Cham. Springer International Publishing; 2020:9-15.
  34. Callahan A, Shah NH. Chapter 19 - machine learning in healthcare. In: Sheikh A, Cresswell KM, Wright A, Bates DW, editors. Key Advances in Clinical Informatics. United States. Academic Press; 2017:279-291.
  35. Davenport T, Kalakota R. The potential for artificial intelligence in healthcare. Future Healthc J. 2019;6(2):94-98. [FREE Full text] [CrossRef] [Medline]
  36. Bejan CA, Angiolillo J, Conway D, Nash R, Shirey-Rice JK, Lipworth L, et al. Mining 100 million notes to find homelessness and adverse childhood experiences: 2 case studies of rare and severe social determinants of health in electronic health records. J Am Med Inform Assoc. 2018;25(1):61-71. [FREE Full text] [CrossRef] [Medline]
  37. Zhu VJ, Lenert LA, Bunnell BE, Obeid JS, Jefferson M, Halbert CH. Automatically identifying social isolation from clinical narratives for patients with prostate cancer. BMC Med Inform Decis Mak. 2019;19(1):43. [FREE Full text] [CrossRef] [Medline]
  38. Richardson JP, Smith C, Curtis S, Watson S, Zhu X, Barry B, et al. Patient apprehensions about the use of artificial intelligence in healthcare. NPJ Digit Med. 2021;4(1):140. [FREE Full text] [CrossRef] [Medline]
  39. Aggarwal R, Farag S, Martin G, Ashrafian H, Darzi A. Patient perceptions on data sharing and applying artificial intelligence to health care data: cross-sectional survey. J Med Internet Res. 2021;23(8):e26162. [FREE Full text] [CrossRef] [Medline]
  40. Esmaeilzadeh P. Use of AI-based tools for healthcare purposes: a survey study from consumers' perspectives. BMC Med Inform Decis Mak. 2020;20(1):170. [FREE Full text] [CrossRef] [Medline]
  41. Leslie D, Mazumder A, Peppin A, Wolters MK, Hagerty A. Does "AI" stand for augmenting inequality in the era of covid-19 healthcare? BMJ. 2021;372:n304. [FREE Full text] [CrossRef] [Medline]
  42. Moy S, Irannejad M, Manning SJ, Farahani M, Ahmed Y, Gao E, et al. Patient perspectives on the use of artificial intelligence in health care: a scoping review. J Patient Cent Res Rev. 2024;11(1):51-62. [FREE Full text] [CrossRef] [Medline]
  43. Adekoya I, Delahunty-Pike A, Howse D, Kosowan L, Seshie Z, Abaga E, et al. Screening for poverty and related social determinants to improve knowledge of and links to resources (SPARK): development and cognitive testing of a tool for primary care. BMC Prim Care. 2023;24(1):247. [FREE Full text] [CrossRef] [Medline]
  44. Sandelowski M. Whatever happened to qualitative description? Res Nurs Health. 2000;23(4):334-340. [CrossRef] [Medline]
  45. Patton M. Qualitative Research & Evaluation Methods. Thousand Oaks. Sage; 2002.
  46. Graneheim UH, Lundman B. Qualitative content analysis in nursing research: concepts, procedures and measures to achieve trustworthiness. Nurse Educ Today. 2004;24(2):105-112. [CrossRef] [Medline]
  47. Lindgren BM, Lundman B, Graneheim UH. Abstraction and interpretation during the qualitative content analysis process. Int J Nurs Stud. 2020;108:103632. [CrossRef] [Medline]
  48. Elo S, Kyngäs H. The qualitative content analysis process. J Adv Nurs. 2008;62(1):107-115. [CrossRef] [Medline]
  49. Guba EG, Lincoln YS. Epistemological and methodological bases of naturalistic inquiry. ECTJ. 1982;30(4):233-252. [CrossRef]
  50. Guillemin M, Gillam L. Ethics, reflexivity, and “Ethically Important Moments” in research. Qualitative Inquiry. 2004;10(2):261-280. [CrossRef]
  51. Attribution 4.0 International (CC BY 4.0). Creative Commons. URL: https://creativecommons.org/licenses/by/4.0/ [accessed 2025-03-05]
  52. Wu C, Xu H, Bai D, Chen X, Gao J, Jiang X. Public perceptions on the application of artificial intelligence in healthcare: a qualitative meta-synthesis. BMJ Open. 2023;13(1):e066322. [FREE Full text] [CrossRef] [Medline]
  53. Zheng L, Wang O, Hao S, Ye C, Liu M, Xia M, et al. Development of an early-warning system for high-risk patients for suicide attempt using deep learning and electronic health records. Transl Psychiatry. 2020;10(1):72. [CrossRef] [Medline]
  54. Feller DJ, Zucker J, Yin MT, Gordon P, Elhadad N. Using clinical notes and natural language processing for automated HIV risk assessment. J Acquir Immune Defic Syndr. 2018;77(2):160-166. [FREE Full text] [CrossRef] [Medline]
  55. Ethics and governance of artificial intelligence for health: WHO guidance. World Health Organization. 2021. URL: https://www.who.int/publications-detail-redirect/9789240029200 [accessed 2023-07-30]
  56. Nijhawan AE, Metsch LR, Zhang S, Feaster DJ, Gooden L, Jain MK, et al. Clinical and sociobehavioral prediction model of 30-day hospital readmissions among people with HIV and substance use disorder: beyond electronic health record data. J Acquir Immune Defic Syndr. 2019;80(3):330-341. [FREE Full text] [CrossRef] [Medline]
  57. Soy Chen MS, Danielle B, Kelly M, Allison K, John F, John S. Using applied machine learning to predict healthcare utilization based on socioeconomic determinants of care. The American Journal of Managed Care. 2020;26(1):26-31. [FREE Full text] [CrossRef]
  58. Kasthurirathne SN, Vest JR, Menachemi N, Halverson PK, Grannis SJ. Assessing the capacity of social determinants of health data to augment predictive models identifying patients in need of wraparound social services. J Am Med Inform Assoc. 2018;25(1):47-53. [FREE Full text] [CrossRef] [Medline]
  59. Wiertz S, Boldt J. Evaluating models of consent in changing health research environments. Med Health Care Philos. 2022;25(2):269-280. [FREE Full text] [CrossRef] [Medline]
  60. Kotsenas AL, Balthazar P, Andrews D, Geis JR, Cook TS. Rethinking patient consent in the era of artificial intelligence and big data. J Am Coll Radiol. 2021;18(1 Pt B):180-184. [CrossRef] [Medline]
  61. Fisher S, Rosella LC. Priorities for successful use of artificial intelligence by public health organizations: a literature review. BMC Public Health. 2022;22(1):2146. [FREE Full text] [CrossRef] [Medline]
  62. Hartzler AL, Xie SJ, Wedgeworth P, Spice C, Lybarger K, Wood BR, et al. SDoH Community Champion Advisory Board. Integrating patient voices into the extraction of social determinants of health from clinical notes: ethical considerations and recommendations. J Am Med Inform Assoc. 2023;30(8):1456-1462. [FREE Full text] [CrossRef] [Medline]
  63. Beets B, Newman TP, Howell EL, Bao L, Yang S. Surveying public perceptions of artificial intelligence in health care in the United States: systematic review. J Med Internet Res. 2023;25:e40337. [FREE Full text] [CrossRef] [Medline]
  64. Jobin A, Ienca M, Vayena E. The global landscape of AI ethics guidelines. Nat Mach Intell. 2019;1(9):389-399. [CrossRef]
  65. Bingley WJ, Haslam SA, Steffens NK, Gillespie N, Worthy P, Curtis C, et al. Enlarging the model of the human at the heart of human-centered AI: a social self-determination model of AI system impact. New Ideas in Psychology. 2023;70:101025. [CrossRef]
  66. Richardson R, Schultz JM, Crawford K. Dirty data, bad predictions: how civil rights violations impact police data, predictive policing systems, and justice. NYU L Rev Online. 2019;94:15.
  67. Ghassemi M, Naumann T, Schulam P, Beam AL, Chen IY, Ranganath R. A review of challenges and opportunities in machine learning for health. AMIA Jt Summits Transl Sci Proc. 2020;2020:191-200. [FREE Full text] [Medline]
  68. Rajkomar A, Hardt M, Howell MD, Corrado G, Chin MH. Ensuring fairness in machine learning to advance health equity. Ann Intern Med. 2018;169(12):866-872. [FREE Full text] [CrossRef] [Medline]
  69. Haan M, Ongena YP, Hommes S, Kwee TC, Yakar D. A qualitative study to understand patient perspective on the use of artificial intelligence in radiology. J Am Coll Radiol. 2019;16(10):1416-1419. [CrossRef] [Medline]
  70. Wilner AS, Luce H, Ouellet E, Williams O, Costa N. From public health to cyber hygiene: cybersecurity and Canada’s healthcare sector. International Journal. 2022;76(4):002070202110679. [FREE Full text] [CrossRef]
  71. N.L. health-care cyberattack is worst in Canadian history, says cybersecurity expert. CBC News. 2021. URL: https://www.cbc.ca/news/canada/newfoundland-labrador/nl-cyber-attack-worst-canada-1.6236210 [accessed 2023-08-22]
  72. Lin SY, Mahoney MR, Sinsky CA. Ten ways artificial intelligence will transform primary care. J Gen Intern Med. 2019;34(8):1626-1630. [FREE Full text] [CrossRef] [Medline]
  73. Young AT, Amara D, Bhattacharya A, Wei ML. Patient and general public attitudes towards clinical artificial intelligence: a mixed methods systematic review. Lancet Digit Health. 2021;3(9):e599-e611. [FREE Full text] [CrossRef] [Medline]
  74. Witkowski K, Okhai R, Neely SR. Public perceptions of artificial intelligence in healthcare: ethical concerns and opportunities for patient-centered care. BMC Med Ethics. 2024;25(1):74. [FREE Full text] [CrossRef] [Medline]
  75. Bjerring JC, Busch J. Artificial intelligence and patient-centered decision-making. Philos. Technol. 2020;34(2):349-371. [CrossRef]
  76. Jaiswal J, Halkitis PN. Towards a more inclusive and dynamic understanding of medical mistrust informed by science. Behav Med. 2019;45(2):79-85. [FREE Full text] [CrossRef] [Medline]
  77. Pellowski JA, Price DM, Allen AM, Eaton LA, Kalichman SC. The differences between medical trust and mistrust and their respective influences on medication beliefs and ART adherence among African-Americans living with HIV. Psychol Health. 2017;32(9):1127-1139. [FREE Full text] [CrossRef] [Medline]
  78. Benkert R, Peters RM, Clark R, Keves-Foster K. Effects of perceived racism, cultural mistrust and trust in providers on satisfaction with care. J Natl Med Assoc. 2006;98(9):1532-1540. [Medline]


AI: artificial intelligence
EMR: electronic medical record
SDoH: social determinants of health
SPARK: Screening for Poverty And Related Social Determinants and Intervening to Improve Knowledge of and Links to Resources


Edited by T de Azevedo Cardoso, KJ Craig; submitted 28.08.23; peer-reviewed by BG Patra, Y Khan, J Wu, A Frier, K Fuji; comments to author 30.06.24; revised version received 31.10.24; accepted 29.11.24; published 06.03.25.

Copyright

©Victoria H Davis, Jinfan Rose Qiang, Itunuoluwa Adekoya MacCarthy, Dana Howse, Abigail Zita Seshie, Leanne Kosowan, Alannah Delahunty-Pike, Eunice Abaga, Jane Cooney, Marjeiry Robinson, Dorothy Senior, Alexander Zsager, Kris Aubrey-Bassler, Mandi Irwin, Lois A Jackson, Alan Katz, Emily Gard Marshall, Nazeem Muhajarine, Cory Neudorf, Stephanie Garies, Andrew D Pinto. Originally published in the Journal of Medical Internet Research (https://www.jmir.org), 06.03.2025.

This is an open-access article distributed under the terms of the Creative Commons Attribution License (https://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work, first published in the Journal of Medical Internet Research (ISSN 1438-8871), is properly cited. The complete bibliographic information, a link to the original publication on https://www.jmir.org/, as well as this copyright and license information must be included.