Viewpoint
Abstract
In this viewpoint, we explore the use of big data analytics and artificial intelligence (AI) and discuss important challenges to their ethical, effective, and equitable use within opioid use disorder (OUD) treatment settings. Applying our collective experiences as OUD policy and treatment experts, we discuss 8 key challenges that OUD treatment services must contend with to make the most of these rapidly evolving technologies: data and algorithmic transparency, clinical validation, new practitioner-technology interfaces, capturing data relevant to improving patient care, understanding and responding to algorithmic outputs, obtaining informed patient consent, navigating mistrust, and addressing digital exclusion and bias. Through this paper, we hope to critically engage clinicians and policy makers on important ethical considerations, clinical implications, and implementation challenges involved in big data analytics and AI deployment in OUD treatment settings.
J Med Internet Res 2025;27:e58723. doi:10.2196/58723
Introduction
Opioid Use Disorder and Treatment Policy
Opioid use disorder (OUD) is a global public health crisis and is associated with significant morbidity, mortality, and implications for socioeconomic development [
]. There is a broad range of policy responses to reduce the public health burden of OUD, including demand reduction, supply reduction, harm reduction, and treatment policies [ - ]. Drug demand reduction interventions include public education and communication programs about the risks and harms of opioids and treatment options [ , ]. Supply restriction interventions include reducing unlawful access to opioids through law enforcement and reducing inappropriate lawful access by influencing physician opioid prescribing practices through Prescription Drug Monitoring Programs and clinical guidelines [ , ]. Harm reduction interventions include opioid overdose education and naloxone distribution (OEND) programs, drug checking, syringe service programs, and supervised injection facilities [ ]. Treatment policies tend to focus on increasing access to and use of medications for opioid use disorder (MOUD), such as methadone and buprenorphine, and other psychological and behavioral interventions [ ].

Challenges to Policy Implementation
Among the range of policy responses, robust international evidence supports harm reduction interventions and MOUD [
, , - ]. Despite this, there is a significant treatment gap across low-, middle-, and high-income countries [ , ]. In the United States, for example, only 13.4% of people who might have benefited were able to access MOUD, such as methadone and buprenorphine [ ]. Worldwide, only 1 in 12 people in need of treatment for substance use disorder can access it [ ]. Furthermore, there is substantial variation across and within countries in the programmatic components, implementation, and quality of the different interventions [ , - ]. Jin et al [ ] illustrate this in their international systematic review of treatment programs offering MOUD, which vary considerably in quality, accessibility, and consistency, limiting uptake and efficacy among people with OUD.

Further complicating the implementation of policy responses to OUD is the diversity in potential outcomes, not all of which are equally valued. For example, harm reduction interventions intended to reduce the negative impact of drug use without requiring abstinence have been shown to reduce opioid overdose deaths and sequelae of injecting drug use, such as blood-borne virus transmission or soft tissue infections [
- ]. Yet, due to the stigma associated with drug use [ ], including by treatment providers [ ], there is often poor provision of these interventions in many settings [ ]. Many countries emphasize law enforcement approaches to reduce the drug supply over the provision of supervised drug consumption rooms, despite growing evidence of the impact of this intervention in reducing drug overdose events [ ]. Similarly, OUD service providers may have different perspectives on how positive outcomes from pharmacotherapy are defined. For some services, a positive outcome is defined as a person becoming abstinent and exiting a treatment program with the assistance of medication. In contrast, for others, it is defined as remaining on MOUD in the long term while reducing harmful illicit use [ , ].

Against this complex and contested environment, there has been widespread support for data-driven systems and surveillance to inform policy, planning, funding, governance, monitoring, and evaluation of OUD interventions and treatment services [
- ]. As with other areas of health care, big data analytics and artificial intelligence (AI) are catalyzing a fundamental shift in what can be accomplished through data-driven systems to tackle the opioid crisis [ ]. This includes predictive modeling using large, linked datasets to identify underserved areas with high opioid overdose rates and the development of clinical decision support systems (CDSS) to identify and stratify OUD risk [ - ].

Defining AI
AI refers to the use of computational systems and mathematical algorithms that simulate human-like intelligence to analyze complex problems [
]. AI is currently a consolidated field of research in health care and is widely applied in other sectors [ ]. The European Commission high-level expert group on AI defines it as “...systems that display intelligent behavior by analyzing their environment and taking actions—with some degree of autonomy—to achieve specific goals” [ ]. These types of algorithms, or models, use large volumes of data (namely, big data), which come from heterogeneous sources, to identify patterns, find relationships between data entities, and generate predictions for supporting decision-making.

In the specific context of OUD, AI systems may be able to use a range of data types and sources, including patient records, treatment histories, and demographic information, to generate insights into the characteristics of patients and the consequences that treatments and policies have on the management of related conditions. These types of analyses can be supervised, in which the model knows the outcome, or unsupervised, in which the outcome is unknown and modeling helps to identify groups of patients or outcomes. AI has the potential to enable health care systems to continuously improve their performance by learning from new data. In the context of OUD services, this may include enhancing clinical decision-making with AI-generated predictive insights, personalizing care, and improving operational efficiency.
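To make the supervised-unsupervised distinction concrete, the following is a minimal, self-contained sketch. All records, features, labels, and numbers are synthetic and invented purely for illustration; they are not drawn from any real dataset or clinical rule.

```python
# Illustrative only: tiny synthetic records for people in OUD treatment.
# Each record is (days retained in treatment, number of missed doses).
records = [(30, 9), (45, 8), (200, 1), (180, 2), (35, 7), (210, 0)]
retained_6_months = [False, False, True, True, False, True]  # known outcome

# Supervised analysis: the outcome is known, so a decision rule can be
# learned from labeled examples (here, a simple missed-dose cutoff).
def fit_cutoff(records, labels):
    best_cut, best_err = 0, len(labels) + 1
    for cut in range(0, 11):
        # predict "retained" when missed doses fall below the cutoff
        err = sum(((misses < cut) != lab) for (_, misses), lab in zip(records, labels))
        if err < best_err:
            best_cut, best_err = cut, err
    return best_cut

cutoff = fit_cutoff(records, retained_6_months)

# Unsupervised analysis: no outcome labels; a 1D k-means groups the same
# records into 2 clusters, which an analyst would then interpret.
def kmeans_1d(xs, iters=10):
    centers = [min(xs), max(xs)]
    for _ in range(iters):
        groups = [[], []]
        for x in xs:
            groups[abs(x - centers[0]) > abs(x - centers[1])].append(x)
        centers = [sum(g) / len(g) if g else centers[i] for i, g in enumerate(groups)]
    return sorted(centers)

centers = kmeans_1d([misses for _, misses in records])
```

The supervised step produces a rule tied to a known outcome, whereas the unsupervised step only surfaces structure (two adherence clusters) whose clinical meaning must still be interpreted, which is the distinction drawn above.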
Challenges to Deploying AI Applications in Health Care Settings
The large-scale deployment and integration of AI solutions into health care systems face known challenges, including the lack of a coherent regulatory framework across jurisdictions [
] and data protection, legal, ethical, and trust issues [ ]. However, we argue that there are additional complexities and challenges within the OUD policy and treatment services environment in implementing AI and big data solutions. These include the highly regulated nature of OUD treatment services, the criminalization and stigmatization of people who use drugs, inequities in access to technology and data, and a degree of cognitive dissonance clinicians may have in their dual roles of regulating behavior and providing person-centered care [ , ]. Using our collective experiences as OUD treatment providers, evidence experts, and policy makers, we aim to highlight the challenges to be addressed so that these technologies can be used ethically, effectively, and equitably.

While it is beyond our scope to provide a cutting-edge account of this rapidly evolving field, we begin by providing examples of big data analytics and AI applications in OUD service settings. We then focus on 8 key challenges that OUD treatment services must contend with to make the most of these rapidly evolving technologies.
Big Data Analytics and AI Applications in OUD Treatment Settings
Big data analytics is characterized by the integration and analysis of a large volume of continuously generated, heterogeneous, and complex data from various sources, including sensors, smartphones, electronic health records (EHRs), results of clinical investigations, and the internet [
]. This contrasts with more traditional research studies with typically fixed boundaries, including clearly defined variables and data collected at specific time points over an established period. Types of data include clinical trial data, anthropometrics (eg, weight and height measurements), demographic information, payer and insurance data, lifestyle, behavioral and psychological traits, continuous physiological measurements, clinical phenotyping (eg, diagnoses, medication use, medical imaging, and procedure results), and process measures captured from mobile and wearable health applications (eg, smartphone apps and text messaging) [ ].

Important technological developments that have expanded the amount of timely and actionable health data include continuous digital connectivity through the mobile internet [
] and the digitization of how we interact with each other and our environment [ ]. For example, our mobile phones can provide continuous location information and the means to engage in real time with social media communities [ ]. Bluetooth-enabled wearable technology allows the continuous monitoring of our heart rate and rhythm, oxygen levels, sleep patterns, and other physiological measurements [ ]. Many health services provide digital interfaces to exchange and store information, creating patient-held EHRs [ ]. An overview of some AI applications applicable to the OUD treatment setting is provided below.
Due to the volume of generated health care data and the velocity at which they are produced, it is beyond the capacity of most health care systems to absorb and respond to every individual data stream [
]. For example, in a system designed to detect and respond to an opioid overdose, physiological information such as respiratory rate will need to be contextualized alongside the individual’s clinical phenotype, and the health care system will need to be primed to respond to a predetermined threshold for action [ ]. Advanced statistical methods, the hallmark of AI and big data analytics, have therefore become necessary to extract timely, understandable, and actionable outputs from health care data [ ].

Several reviews have described use cases for big data analytics and AI in the management of OUD [
- , , - ]. This includes the discovery of new pharmaceuticals [ ], population-level surveillance and public health planning [ , , , , ], OUD or opioid overdose risk prediction [ , , , , , - ], prediction models for treatment engagement and retention [ , , , , , ], generative AI interfaces to provide advice and support [ ], and improved monitoring and diagnostics [ , - ]. The table below provides more details on a selection of these applications.

Table. Big data analytics and AI approaches and examples, by domain.

Population-level epidemiology and needs assessment
- Contextualizing opioid overdose events with visual representations of neighborhood-built environment conditions [ ]: Geospatial data, Google Street View images, nonemergency “311” service requests, and US Census data were used as indicators to produce a high-resolution spatial-temporal analysis, indicating that OUD is influenced by social and neighborhood determinants, such as depressing or insecure living environments, poverty, and health issues, to inform health policies and guide responses to the opioid crisis.
- Mapping opioid overdose events against health care access and socioeconomic factors: Spatiotemporal patterns and maps using color to show variation in aggregates of geographical data, created from opioid overdose–related emergency department visits, geolocation, and data on socioecological factors (health behaviors, health care, social and economic factors, and physical environment), identified that emergency department visit rates were significantly associated with changes in health care factors (ie, access to care and quality of care) and socioeconomic factors (ie, levels of education, employment, income, family and social support, and community safety) [ ].
- Using social media data to identify trends in opioid use [ - ]: Statistical techniques applied to social media data may provide close to real-time, county-level estimates of overdose mortality, a basis to inform prevention and treatment decisions.
- Social media analysis and digital phenotyping [ ]: Other studies have identified the possibility of detecting a “phenotype” of social media use by text analysis in conditions such as schizophrenia, which may allow new opportunities to support early illness or relapse detection in the community.
- Using social media to provide targeted interventions for OUD: The social media platform X (formerly known as Twitter) offers a feasible approach to identify some people who use opioids, making it a possible arena to disseminate evidence-based content and facilitate linkage to treatment and harm reduction services [ ].

Integrated treatment approaches
- Identifying intervention touchpoints for people with OUD across sectors and services: Data linkage studies, for example, between the records of major health and social agencies, combined with machine learning predictive models, have the potential to identify key intervention (touch) points at which to provide health or social care, treatment for OUD, or harm reduction interventions and can improve understanding of clinical trajectories for people with OUD [ , ].
- Generative AI and chatbots: Chatbots can offer support, accountability, and some forms of psychotherapy; preliminary studies show encouraging applications in addiction treatment [ - ].

Clinical decision-making
- Development of clinical decision support algorithms for OUD treatment: Combining evidence-based tools and expert consensus with electronic health care data and monitoring tools to develop the screening measure, symptom tracking measure, and clinical decision support algorithm necessary to implement measurement-based care for OUD with buprenorphine in primary care [ ].
- Stratifying overdose risk: NarxCare is a proprietary analytic tool that analyzes state-mandated prescription databases in the United States to calculate a risk score for possible overdose death, which is displayed in the patient’s electronic medical record [ ].

Tailoring interventions and personalized approaches (personalized medicine or treatment)
- Predictive analytics to identify high-risk periods for people with OUD and link these with appropriate interventions: Linking ecological momentary assessment data (where participants are prompted by a smartphone app to self-report on factors such as sleep, stress, pain, craving, and mood), ambulatory physiological assessment using mobile sensors or smartwatches, and social media data, and then using deep neural networks for predictive analysis, may help identify people at high risk of cravings or withdrawal symptoms so that they can receive dynamic dose adjustments of medications for OUD (eg, methadone and buprenorphine) to improve treatment retention [ ].

Drug discovery
- Drug discovery models using generative AI may drive forward potential brain-targeting biological therapeutics for people who use drugs [ ].

Performance and quality benchmarking
- Identifying service-level characteristics associated with disengagement from OUD treatment: Predicting premature discontinuation of OUD treatment, using a supervised machine learning analysis of millions of treatment episodes to identify predictors of treatment discontinuation; the most influential risk factors include characteristics of the service setting, geographic region, primary source of payment, referral source, employment status, and delays to entering treatment [ ].
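The rule-based risk stratification listed above can be illustrated with a deliberately simplified, hypothetical sketch. Every input, weight, and threshold below is invented for illustration; this is not NarxCare's or any other vendor's actual algorithm, which are proprietary and far more complex.

```python
# Hypothetical toy example only: all inputs, weights, and cutoffs are
# invented. Real proprietary scores are more complex and, as discussed
# in this paper, often not transparent to clinicians or patients.

def overdose_risk_score(prescriber_count: int, pharmacy_count: int,
                        overlapping_opioid_days: int) -> int:
    """Toy additive score: more prescriber/pharmacy fragmentation and
    longer overlapping opioid prescriptions yield a higher score."""
    return (2 * max(prescriber_count - 1, 0)
            + 2 * max(pharmacy_count - 1, 0)
            + overlapping_opioid_days // 7)

def risk_band(score: int) -> str:
    """Map the numeric score to a band a clinician might see in an EHR."""
    if score >= 10:
        return "high"
    if score >= 4:
        return "moderate"
    return "low"

# Example: 4 prescribers, 3 pharmacies, 21 days of overlapping prescriptions
score = overdose_risk_score(4, 3, 21)  # 2*3 + 2*2 + 3 = 13
band = risk_band(score)                # "high"
```

Even this toy version shows why transparency matters: without knowing the weights and cutoffs, a clinician cannot judge why a patient was flagged or what would change the band.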
Implementation Challenges in OUD Treatment Services
As mentioned earlier, there are important caveats to the potential of big data analytics and AI to transform OUD services, including limited research evidence on the impacts of this technology on treatment outcomes, uncertainties around the regulatory frameworks for the use of AI in health care, and significant translational and implementation obstacles [
- ]. Here, we focus on 8 key challenges that OUD treatment services must contend with to make the most of these rapidly evolving technologies.

Service Performance Data and Algorithmic Transparency
Given the treatment gap and the lack of evidence-based practice [
], there have been calls to introduce data-driven approaches to monitor and improve OUD treatment service performance [ , ]. For example, the Pew Charitable Trust—a US nonpartisan, data-driven think tank—proposed making intervention impact and quality data publicly available to create accountability at the state level on efforts to tackle the opioid crisis [ ]. This proposal advocates for the collection and publication of disaggregated data on the numbers diagnosed with OUD, prescribed MOUD, and supported to remain on treatment for at least 6 months [ ]. In 2021, the Scottish Government launched a public health–led MOUD quality standards monitoring program, which incorporated many of the measures recommended by the Pew expert panel [ ]. Within 3 years, improvements in treatment access were observed in Scotland, including a significant shortening of waiting times to access MOUD [ ].

There have been some notable examples of using big data analytics to identify service-level performance gaps [
]. For example, a US-based study used machine learning techniques on a sample of 941,286 OUD treatment episodes collected in 2015-2017 and identified inequities in access to treatment dominated by race, health insurance, and housing status [ ]. Another study used granular spatiotemporal data to map area-level access and quality of care against emergency department opioid overdose visits, potentially exposing jurisdictional variability in performance [ ]. Nevertheless, service providers and funders may be reluctant to engage in these approaches because increased public scrutiny can lead to performance management and other sanctions. Furthermore, insights into performance may not be welcome in settings where services are beleaguered by multiple pressures, including underfunding, lack of incentives, competing priorities, and a lack of capacity.

Transparency and accountability are equally important where big data and AI systems are deployed in OUD treatment settings. The term explainable AI describes important aspects of transparency in using predictive algorithms, including an algorithm’s purpose, function, accuracy, and traceability [
]. Furthermore, with regard to function, it is important for clinicians and patients to understand the limits of these predictions, including the data used for training the algorithm and the implications of both accepting and rejecting the output [ ]. Unfortunately, a lack of algorithmic transparency is a common finding. In a recent systematic review on the use of AI in mental health research, a lack of transparency was noted in the reporting of methodological flaws relating to statistical processes and data preprocessing [ , ]. This lack of transparency often links back to the proprietary nature of AI algorithms that have been developed as commercial products [ ]. However, it may also be an unintentional consequence where black-box deep learning approaches are used or where there is a level of complexity that the algorithm creators themselves do not understand [ ].

Clinical Validation
Linked to algorithmic transparency is the clinical validation of AI and big data applications in real-world OUD service settings [
, ]. A recent narrative review on the application of AI in opioid use disorder highlighted the necessity for validation and robust evaluation of these technologies [ ]. People in need of OUD treatment are a heterogeneous group with social, psychological, and biological diversity and complexity. This may make them particularly vulnerable to dataset shift, where a machine learning system underperforms because it was developed on a dataset that is materially different from the one in which it is deployed [ ].

The widespread use of unregulated big data analytics in OUD treatment settings has already been identified. NarxCare (Bamboo Health) is an analytics platform embedded in the EHRs of health care providers in over 45 US states [
]. The platform analyzes prescription drug monitoring data and combines them with other health data to provide clinicians with an opioid overdose risk score [ ]. Despite no evidence of its safety, effectiveness in reducing overdose deaths, or risk assessment of its potential to increase disparities in access to care, NarxCare is positioned to influence over 1 billion clinical encounters every year in the United States [ ]. The pervasiveness of the NarxCare product in everyday patient care underscores the need for more robust regulation of “Software as a Medical Device” products.

It is also important to have a mandated system for monitoring and reporting mistakes and errors arising from using big data and AI analytics tools in health care settings [
]. However, it may be challenging for clinicians to correctly recognize and attribute errors to these tools, considering that algorithm creators themselves do not always understand how their products work. In addition, the lines of accountability for any error detection process have not been defined, for example, whether this would be the manufacturer’s responsibility or an independent regulator [ ].

Navigating the Practitioner-Technology Interface
The interface between AI systems and clinicians is an important yet arguably understudied domain. For example, returning to the use of CDSS for identifying and treating OUD [
] or opioid prescribing [ ], it is unclear how to manage the incongruity between an algorithmic and a human clinical decision. What are the medicolegal implications if an algorithmic decision is overridden based on values or ethics-based judgment? Errors in AI and big data outputs are inevitable, yet it remains unclear where responsibility lies if an erroneous output is ignored or acted upon [ ]. A clinician may cause serious harm if, for example, a correct prediction is ignored or an incorrect one is acted on.

In addition, as seen with early technology-clinician interfaces such as radiological screening [
] and medicine interaction prompts [ ], there is a risk of “prompt fatigue” with predictive AI models [ ]. This is particularly likely where predictive warnings do not seem relevant to typical clinical encounters and may lead to a subconscious dismissal of warnings by clinicians, resulting in error when a correct prediction is given. Indeed, if new technologies become instruments of control and surveillance, thus increasing administrative burden and reducing clinical time and autonomy, clinicians may reject them. The human element of connection has been identified as essential in addiction care by both clinicians and patients [ ]. AI and big data interfaces must be designed to be unobtrusive and facilitative of person-centered care to preserve the relational aspect of clinical encounters.

Capturing the Right Data
Health care data are often siloed into many different systems, such as medical imaging, pathology, EHRs, electronic prescribing tools, and insurance databases. These data are notoriously difficult to link and integrate due to the lack of uniformity in technology infrastructure and privacy and regulatory barriers. The lack of data interoperability limits effective AI model generation, training, and deployment [
]. In addition, the inability of different systems to communicate and share data in real time presents major challenges to the generation of relevant outputs in time-limited situations, for example, in opioid overdose scenarios [ ].

Yet, linking diverse data sources has proven to be useful in understanding the clinical trajectories of people with OUD. For example, linked ecological momentary assessment, biosensor, and social media data have been used to increase the predictive accuracy when using AI to model OUD clinical trajectories [
]. However, it is important to note that simply adding more data because they are available does not improve model accuracy. In the work by Marsch et al [ ], the data types and streams were carefully selected based on existing evidence, clinical expertise, and judgment.

Furthermore, there are necessary and important regulatory protections to prevent potentially sensitive data from being collected or used outside the context for which they were gathered. This is particularly important in the context of OUD, where prediction models can easily become “surveillance models,” which may impact the individual's privacy and work against their best interests [
- ]. For example, predictive modeling using social media data may be able to predict future drug use–related behaviors, creating opportunities for intervening and preventing a possible drug overdose [ , ]. However, people using social media have not consented to their data being used this way and may be uncomfortable with it. Their concerns are especially justified in jurisdictions with less robust regulatory and human rights protections, where exposure may lead to arrest or coercive OUD treatment [ ].

Data sharing frameworks and models, such as the Trusted Research Environment (TRE), have been created in some jurisdictions to address some of the issues raised here [
]. A TRE is a controlled remote access virtual computing domain allowing approved researchers to work with individual-level data but export only aggregate-level results once screened by data custodians [ ]. This helps to reduce the costs and risks to organizations when conducting big data analytics studies, which stem from data protection breaches and the operating costs of centralized systems [ , ].

Responding to and Understanding AI and Big Data Outputs
There is a real risk that services may not keep up with the outputs and insights generated by big data analytics and AI technology. For example, cloud-based AI algorithms can effectively process smartphone-collected data to detect respiratory depression among people using opioids in real time to trigger an automated emergency response [
]. However, this opioid overdose detection and alert system works only if local emergency services have the capacity to respond to this additional data stream. Similarly, modeling studies showing shortfalls in service coverage [ , ] can be helpful if they result in a concerted effort for improvement rather than defensiveness, resentment, or a loss of morale among care providers and funders. While it is laudable that our sector is engaging with new technologies and approaches, it is important that we make sustainable investments in services that have a measurable impact on patient outcomes. In the context of predictive and diagnostic technologies, this must include strengthening existing health care infrastructures to ensure they can respond to an increased demand for their services.

Obtaining Informed Consent
Health care providers are ethically and legally obliged to obtain informed consent when providing a health intervention [
, ]. Obtaining informed consent is also a key aspect of ethical research practice [ ]. This also applies to the use of big data analytics and AI. Andreotta et al [ ] have identified 3 issues when obtaining informed consent in relation to the use of big data analytics: the issue of algorithmic transparency (how does it work?), the potential for data to be repurposed (how will it be used?), and providing people with a reasonable alternative should they refuse consent (do I have a choice?).

Unfortunately, in relation to transparency, providing detailed information on the workings of an AI algorithm may not always be helpful to the patient in the decision-making process [
]. Explainable AI, a relatively new research area, attempts to define the levels of understanding people may wish to have, commensurate with the algorithms’ functions [ ]. One of the goals of explainable AI is to standardize how we communicate meaningfully with individuals to empower them to make informed decisions on the risks and benefits of data sharing [ ].

Indeed, the importance of communication about data cannot be overstated. Data acquired and processed for a defined health care–related purpose are vulnerable to data security breaches and trafficking, making this a key international regulatory priority [
, ]. The data security landscape is in constant flux, requiring continually improving systems, including advanced encryption measures [ ]. There is, therefore, an ethical obligation when obtaining informed consent to raise awareness of the risks of data security breaches and the measures taken to mitigate them.

More challenging to incorporate into consenting practices is the issue of repurposed data. AI algorithms may reveal insights and generate secondary uses for the same data that were not anticipated during the original consenting process [
]. While securing informed consent from the original participants of a large dataset is impractical, patients may wish to be protected from data repurposing that results in actions that are personally detrimental. For example, insights from linked datasets used to map intervention touch points for people with OUD [ ] could easily be repurposed to highlight areas that then become targets of increased policing activity. Robust data protection laws must be in place to reassure patients that sensitive data will not be exploited. The trusted research environments described earlier can act as an arbiter of public trust by guaranteeing that personal data are not reidentified, providing some reassurance against such risks [ ].

Trust in Institutions and Technology
Public trust in institutions and technology is one of the biggest barriers to the growth of big data analytics and AI innovation [
]. Among the most cited trust concerns are safety and security, state overreach, inequalities and bias, and unknown unknowns [ ]. Similarly, establishing and maintaining trust is paramount to effective health care delivery [ ]. Historically, there have been high levels of public trust in health care institutions, yet this has eroded over the years to the point of crisis in countries such as the United States and the United Kingdom [ ].

Unfortunately, trust has long been an issue in OUD treatment settings due to experiences of institutional stigma, discrimination, and conflict [
]. For example, self and social stigma and mistrust of services may result in individuals not disclosing substance use or related social risks such as homelessness or criminal justice involvement [ , ]. Individuals with OUD may also be concerned about their geolocation data being used to penalize them if shared with the police [ ]. In other words, people with OUD had issues of trust relating to safety and security, state overreach, inequalities and bias, and unknown unknowns long before the wider public raised such concerns about AI. This is justified, as marginalized groups and those whose behaviors are stigmatized or criminalized stand to risk more when trusting their data to public institutions than those in mainstream society [ - ].

One consequence of mistrust is the systematic refusal of a specific group to provide data or consent to data sharing. This results in data absenteeism, excluding them from public health surveillance–based service planning. Data absenteeism is more likely to occur among marginalized or multiply disadvantaged groups, such as people who use drugs, people experiencing homelessness, or those involved with criminal justice [
, ]. The consequence for some groups with OUD is that their needs remain unidentified and unplanned for at the policy level [ ]. There is also evidence that some groups miss out on timely data-driven interventions as a result of data absenteeism [ ].
Three important considerations need to be addressed when building trust among people with OUD in relation to big data analytics and AI: Do they have a say in the research questions asked of their data? Will the health gains, profits, or benefits from using their data be accessible to them? Will they be disadvantaged by unanticipated, socially biased policy decisions based on the data they have provided? The British Columbia Provincial Overdose Cohort is a large linked administrative health dataset on people who have had a drug overdose event [
]. This big data initiative is unique in its active engagement of people with OUD to identify research questions of importance to the affected community, thereby addressing the three key trust questions outlined here [ , ].
Furthermore, policy makers, treatment funders, and providers must take steps to protect people seeking OUD treatment from being disadvantaged should they decline consent to participate in big data initiatives or wish for their data to be withdrawn. The ongoing engagement and consent of people who use drugs, multiply disadvantaged groups, and their advocates is critical to ensuring that the impact of data absenteeism is managed and attenuated.
Digital Exclusion and Bias
People who use drugs and other marginalized and minoritized groups are particularly vulnerable to digital exclusion [
]. Digital exclusion refers to the barriers some people face in participating fully in society due to a lack of access to, or inability to use, digital technologies [ ]. It is a form of social inequality rooted in the unequal distribution of digital resources. Digital exclusion also contributes to data absenteeism: rather than declining to share data, the individual has never had the skills, technology, or opportunity to create a digital identity or to generate digitally captured data [ ]. Furthermore, these groups are easily overlooked in digital co-design and in public engagement on data use and AI initiatives [ ].
Algorithmic bias in big data analytics describes a systematically repeating predictive or output error that results in prejudicial outcomes against individuals, groups, or social categories on the basis of race, gender, sexuality, socioeconomic grouping, culture, or geographic location [
, ]. AI algorithms are trained on existing data that reflect structural inequalities in society, including digital exclusion and prevailing implicit and explicit societal biases against people who use drugs [ ]. Consequently, people who use drugs are particularly vulnerable to algorithmic bias [ , ].
Identifying algorithmic bias requires proactive and prospective performance monitoring. An algorithm may, for example, perform adequately overall in triggering a risk alert for an unwanted clinical outcome yet perform poorly for a specific demographic group. Where such discrepancies are identified, more detailed information could be collected for that segment to enable retraining of the model to improve both fairness and accuracy [
]. Similarly, when an algorithm signals the need for clinical contact, the clinician should be given guidance on how to interpret the thresholds for action, with transparency about how the person's demographics differ from those of the algorithm's training cohort [ ]. A criticism of the NarxCare overdose risk score is that it occupies a prominent location within the patient's EHR, embedding itself in the consultation, yet is not incorporated into the clinical workflow: there are no clinically validated score thresholds to trigger an action [ ].
Conclusion
OUD remains a challenging global public health crisis. AI and big data analytics offer new opportunities for data-driven improvements in policy, treatment options, quality, and access. However, there are significant implementation challenges that services must be prepared for. These technologies expose both patients and services to increased scrutiny and surveillance. We need to ensure that we have made sufficient investments in services so that they can understand and respond to big data and AI outputs. Furthermore, we need to be able to advise our patients, who are already mistrustful of institutions and technology, on the risks and benefits of data sharing. Finally, we need to constantly appraise the limits, validity, legitimacy, and potential for bias of the AI algorithms used in our practice. Our clinical expertise and critical scientific skills are needed more than ever to counterbalance the tempting automaticity of AI and big data analytics. The 8 challenges identified here must be addressed if these new technologies are to be used ethically, effectively, and equitably.
Acknowledgments
Support for this research was provided by the Commonwealth Fund. The views presented here are those of the authors and should not be attributed to the Commonwealth Fund or its directors, officers, or staff.
Conflicts of Interest
None declared.
References
- Strang J, Volkow ND, Degenhardt L, Hickman M, Johnson K, Koob GF, et al. Opioid use disorder. Nat Rev Dis Primers. 2020;6(1):3. [CrossRef] [Medline]
- Strickland JC, Victor G, Ray B. Perception of resource allocations to address the opioid epidemic. J Addict Med. 2022;16(5):563-569. [FREE Full text] [CrossRef] [Medline]
- Strang J, Babor T, Caulkins J, Fischer B, Foxcroft D, Humphreys K. Drug policy and the public good: evidence for effective interventions. Lancet. 2012;379(9810):71-83. [CrossRef] [Medline]
- National Academies of Sciences, Engineering, and Medicine, Health and Medicine Division, Board on Health Sciences Policy, Committee on Pain Management and Regulatory Strategies to Address Prescription Opioid Abuse. Evidence on strategies for addressing the opioid epidemic. In: Phillips JK, Ford MA, Bonnie RJ, editors. Pain Management and the Opioid Epidemic: Balancing Societal and Individual Benefits and Risks of Prescription Opioid Use. Washington, DC. National Academies Press; 2017.
- O'Leary C, Ralphs R, Stevenson J, Smith A, Harrison J, Kiss Z, et al. The effectiveness of abstinence-based and harm reduction-based interventions in reducing problematic substance use in adults who are experiencing homelessness in high income countries: a systematic review and meta-analysis. Campbell Syst Rev. 2024;20(2):e1396. [FREE Full text] [CrossRef] [Medline]
- Resiak D, Mpofu E, Rothwell R. Sustainable harm reduction needle and syringe programs for people who inject drugs: a scoping review of their implementation qualities. Sustainability. 2021;13(5):2834. [CrossRef]
- Tonin FS, Fernandez-Llimos F, Alves da Costa F. Evidence of the impact of harm minimization programs. In: Encyclopedia of Evidence in Pharmaceutical Public Health and Health Services Research in Pharmacy. Cham, Switzerland. Springer International Publishing; 2020:1-23.
- Connery HS, McHugh RK, Reilly M, Shin S, Greenfield SF. Substance use disorders in global mental health delivery: epidemiology, treatment gap, and implementation of evidence-based treatments. Harv Rev Psychiatry. 2020;28(5):316-327. [FREE Full text] [CrossRef] [Medline]
- Colledge-Frisby S, Ottaviano S, Webb P, Grebely J, Wheeler A, Cunningham EB, et al. Global coverage of interventions to prevent and manage drug-related harms among people who inject drugs: a systematic review. Lancet Glob Health. 2023;11(5):e673-e683. [FREE Full text] [CrossRef] [Medline]
- Krawczyk N, Rivera BD, Jent V, Keyes KM, Jones CM, Cerdá M. Has the treatment gap for opioid use disorder narrowed in the U.S.? A yearly assessment from 2010 to 2019. Int J Drug Policy. 2022;110:103786. [FREE Full text] [CrossRef] [Medline]
- World Drug Report 2024. UNODC. 2024. URL: https://www.unodc.org/documents/data-and-analysis/WDR_2024/WDR24_Key_findings_and_conclusions.pdf [accessed 2024-07-09]
- Moustaqim-Barrette A, Dhillon D, Ng J, Sundvick K, Ali F, Elton-Marshall T, et al. Take-home naloxone programs for suspected opioid overdose in community settings: a scoping umbrella review. BMC Public Health. 2021;21(1):597. [FREE Full text] [CrossRef] [Medline]
- Tonin FS, Alves da Costa F, Fernandez-Llimos F. Impact of harm minimization interventions on reducing blood-borne infection transmission and some injecting behaviors among people who inject drugs: an overview and evidence gap mapping. Addict Sci Clin Pract. 2024;19(1):9. [FREE Full text] [CrossRef] [Medline]
- Jin H, Marshall BDL, Degenhardt L, Strang J, Hickman M, Fiellin DA, et al. Global opioid agonist treatment: a review of clinical practices by country. Addiction. 2020;115(12):2243-2254. [CrossRef] [Medline]
- Sumnall HR, Atkinson A, Montgomery C, Maynard O, Nicholls J. Effects of media representations of drug related deaths on public stigma and support for harm reduction. Int J Drug Policy. 2023;111:103909. [FREE Full text] [CrossRef] [Medline]
- Sulzer SH, Prevedel S, Barrett T, Voss MW, Manning C, Madden EF. Professional education to reduce provider stigma toward harm reduction and pharmacotherapy. Drugs: Education, Prevention and Policy. 2021;29(5):576-586. [CrossRef]
- Ivsins A, Warnock A, Small W, Strike C, Kerr T, Bardwell G. A scoping review of qualitative research on barriers and facilitators to the use of supervised consumption services. Int J Drug Policy. 2023;111:103910. [CrossRef] [Medline]
- Biondi BE, Zheng X, Frank CA, Petrakis I, Springer SA. A literature review examining primary outcomes of medication treatment studies for opioid use disorder: what outcome should be used to measure opioid treatment success? Am J Addict. 2020;29(4):249-267. [FREE Full text] [CrossRef] [Medline]
- Ghaleb EAA, Dominic PDD, Muneer A, Almohammedi AA. Big data in healthcare transformation: a short review. 2022. Presented at: 2022 International Conference on Decision Aid Sciences and Applications (DASA); March 23-25, 2022:265-269; Chiangrai, Thailand. [CrossRef]
- World Health Organization. International Standards for the Treatment of Drug use Disorders: Revised Edition Incorporating Results of Field-Testing. Geneva, Switzerland. World Health Organization; 2020.
- Quality assurance in treatment for drug use disorders: key quality standards for service appraisal. UNODC. 2021. URL: https://www.unodc.org/documents/QA_OCTOBER_2021.pdf [accessed 2023-06-18]
- Implementing quality standards for drug services and systems: a six-step guide to support quality assurance. EMCDDA. 2021. URL: https://www.euda.europa.eu/publications/manuals/implementing-quality-standards-drug-services-and-systems-six-step-guide-support-quality-assurance_en [accessed 2024-12-13]
- Blanco C, Wall MM, Olfson M. Data needs and models for the opioid epidemic. Mol Psychiatry. 2022;27(2):787-792. [FREE Full text] [CrossRef] [Medline]
- Bharat C, Hickman M, Barbieri S, Degenhardt L. Big data and predictive modelling for the opioid crisis: existing research and future potential. Lancet Digit Health. 2021;3(6):e397-e407. [FREE Full text] [CrossRef] [Medline]
- Hayes CJ, Cucciare MA, Martin BC, Hudson TJ, Bush K, Lo-Ciganic W, et al. Using data science to improve outcomes for persons with opioid use disorder. Subst Abus. 2022;43(1):956-963. [FREE Full text] [CrossRef] [Medline]
- Gadhia S, Richards GC, Marriott T, Rose J. Artificial intelligence and opioid use: a narrative review. medRxiv. 2022. [CrossRef]
- Freda PJ, Kranzler HR, Moore JH. Novel digital approaches to the assessment of problematic opioid use. BioData Min. 2022;15(1):14. [FREE Full text] [CrossRef] [Medline]
- Buonora MJ, Axson SA, Cohen SM, Becker WC. Paths forward for clinicians amidst the rise of unregulated clinical decision support software: our perspective on NarxCare. J Gen Intern Med. 2024;39(5):858-862. [CrossRef] [Medline]
- Pesarsick J, Gwilliam M, Adeniran O, Rudisill T, Smith G, Hendricks B. Identifying high-risk areas for nonfatal opioid overdose: a spatial case-control study using EMS run data. Ann Epidemiol. 2019;36:20-25. [FREE Full text] [CrossRef] [Medline]
- Eguale T, Bastardot F, Song W, Motta-Calderon D, Elsobky Y, Rui A. A machine learning application to classify patients at differing levels of risk of opioid use disorder: clinician-based validation study. JMIR Med Inform. 2024. [FREE Full text] [CrossRef]
- Wang P. On defining artificial intelligence. J Artif Gen Intell. 2019;10(2):1-37. [CrossRef]
- Schmid T, Hildesheim W, Holoyad T, Schumacher K. The AI methods, capabilities and criticality grid. Künstl Intell. 2021;35(3-4):1-16. [CrossRef]
- A definition of artificial intelligence: main capabilities and scientific disciplines. Shaping Europe’s Digital Future. 2018. URL: https://digital-strategy.ec.europa.eu/en/library/definition-artificial-intelligence-main-capabilities-and-scientific-disciplines [accessed 2024-10-15]
- Meskó B, Topol EJ. The imperative for regulatory oversight of large language models (or generative AI) in healthcare. NPJ Digit Med. 2023;6(1):120. [FREE Full text] [CrossRef] [Medline]
- Kumar P, Chauhan S, Awasthi LK. Engineering Applications of Artificial Intelligence. 2023;120:105894. [CrossRef]
- Tyndall M, Dodd Z. How structural violence, prohibition, and stigma have paralyzed North American responses to opioid overdose. AMA J Ethics. 2020;22(1):E723-E728. [FREE Full text] [CrossRef] [Medline]
- Shilo S, Rossman H, Segal E. Axes of a revolution: challenges and promises of big data in healthcare. Nat Med. 2020;26(1):29-38. [CrossRef] [Medline]
- Tas B, Lawn W, Traykova EV, Evans RAS, Murvai B, Walker H, et al. A scoping review of mHealth technologies for opioid overdose prevention, detection, and response. PsyArXiv. 2022. [CrossRef]
- Flickinger TE, Waselewski M, Tabackman A, Huynh J, Hodges J, Otero K, et al. Communication between patients, peers, and care providers through a mobile health intervention supporting medication-assisted treatment for opioid use disorder. Patient Educ Couns. 2022;105(7):2110-2115. [FREE Full text] [CrossRef] [Medline]
- Goldfine C, Lai JT, Lucey E, Newcomb M, Carreiro S. Wearable and wireless mHealth technologies for substance use disorder. Curr Addict Rep. 2020;7(3):291-300. [FREE Full text] [CrossRef] [Medline]
- Rollston R, Gallogly W, Hoffman L, Tewari E, Powers S, Clear B. Collaborative, patient-centred care model that provides tech-enabled treatment of opioid use disorder via telehealth. BMJ Innov. 2022;8(2):117-122. [CrossRef]
- Dutta PP, Pal S, Mukherjee M. Healthcare Big Data: A Comprehensive Overview. Hershey, PA. IGI Global; 2018:72-100.
- Gupta R, Srivastava D, Sahu M, Tiwari S, Ambasta RK, Kumar P. Artificial intelligence to deep learning: machine intelligence approach for drug discovery. Mol Divers. 2021;25(3):1315-1360. [FREE Full text] [CrossRef] [Medline]
- Beaulieu T, Knight R, Nolan S, Quick O, Ti L. Artificial intelligence interventions focused on opioid use disorders: a review of the gray literature. Am J Drug Alcohol Abuse. 2021;47(1):26-42. [CrossRef] [Medline]
- Bonfiglio NS, Mascia ML, Cataudella S, Penna MP. Digital help for substance users (SU): a systematic review. Int J Environ Res Public Health. 2022;19(18):11309. [FREE Full text] [CrossRef] [Medline]
- Rumbut J, Fang H, Carreiro S, Smelson D, Boyer E. An overview of wearable biosensor systems for real-time substance use detection. IEEE Internet Things J. 2022;9(23):23405-23415. [CrossRef]
- Johnson M, Albizri A, Harfouche A, Tutun S. Digital transformation to mitigate emergency situations: increasing opioid overdose survival rates through explainable artificial intelligence. IMDS. 2021;123(1):324-344. [CrossRef]
- Lo-Ciganic WH, Huang JL, Zhang HH, Weiss JC, Wu Y, Kwoh CK, et al. Evaluation of machine-learning algorithms for predicting opioid overdose risk among medicare beneficiaries with opioid prescriptions. JAMA Netw Open. 2019;2(3):e190968. [FREE Full text] [CrossRef] [Medline]
- Oteo A, Daneshvar H, Baldacchino A, Matheson C. Overdose alert and response technologies: state-of-the-art review. J Med Internet Res. 2023;25:e40389. [FREE Full text] [CrossRef] [Medline]
- Garbin C, Marques N, Marques O. Machine learning for predicting opioid use disorder from healthcare data: a systematic review. Comput Methods Programs Biomed. 2023;236:107573. [CrossRef] [Medline]
- Li Y, Miller HJ, Root ED, Hyder A, Liu D. Understanding the role of urban social and physical environment in opioid overdose events using found geospatial data. Health Place. 2022;75:102792. [CrossRef] [Medline]
- Kong Y, Zhou J, Zheng Z, Amaro H, Guerrero EG. Using machine learning to advance disparities research: subgroup analyses of access to opioid treatment. Health Serv Res. 2022;57(2):411-421. [FREE Full text] [CrossRef] [Medline]
- Liu YS, Kiyang L, Hayward J, Zhang Y, Metes D, Wang M, et al. Individualized prospective prediction of opioid use disorder. Can J Psychiatry. 2023;68(1):54-63. [FREE Full text] [CrossRef] [Medline]
- Marsch LA, Chen CH, Adams SR, Asyyed A, Does MB, Hassanpour S, et al. The feasibility and utility of harnessing digital health to understand clinical trajectories in medication treatment for opioid use disorder: D-TECT study design and methodological considerations. Front Psychiatry. 2022;13:871916. [FREE Full text] [CrossRef] [Medline]
- Pozzi G. Automated opioid risk scores: a case for machine learning-induced epistemic injustice in healthcare. Ethics Inf Technol. 2023;25(1):3. [FREE Full text] [CrossRef] [Medline]
- Stafford C, Marrero WJ, Naumann RB, Lich KH, Wakeman S, Jalali MS. Identifying key risk factors for premature discontinuation of opioid use disorder treatment in the United States: a predictive modeling study. Drug Alcohol Depend. 2022;237:109507. [FREE Full text] [CrossRef] [Medline]
- Gottlieb A, Yatsco A, Bakos-Block C, Langabeer JR, Champagne-Langabeer T. Machine learning for predicting risk of early dropout in a recovery program for opioid use disorder. Healthcare (Basel). 2022;10(2):223. [FREE Full text] [CrossRef] [Medline]
- Lambert TP, Gazi AH, Harrison AB, Gharehbaghi S, Chan M, Obideen M, et al. Leveraging accelerometry as a prognostic indicator for increase in opioid withdrawal symptoms. Biosensors (Basel). 2022;12(11):924. [FREE Full text] [CrossRef] [Medline]
- Nandakumar R, Gollakota S, Sunshine JE. Opioid overdose detection using smartphones. Sci Transl Med. 2019;11(474):eaau8914. [CrossRef] [Medline]
- Xia Y, Hu J, Zhao S, Tao L, Li Z, Yue T, et al. Build-in sensors and analysis algorithms aided smartphone-based sensors for point-of-care tests. Biosensors and Bioelectronics: X. 2022;11:100195. [CrossRef]
- Acharya A, Izquierdo AM, Gonçalves SF, Bates RA, Taxman FS, Slawski MP, et al. Exploring county-level spatio-temporal patterns in opioid overdose related emergency department visits. PLoS One. 2022;17(12):e0269509. [FREE Full text] [CrossRef] [Medline]
- Cuomo R, Purushothaman V, Calac AJ, McMann T, Li Z, Mackey T. Estimating county-level overdose rates using opioid-related twitter data: interdisciplinary infodemiology study. JMIR Form Res. 2023;7:e42162. [FREE Full text] [CrossRef] [Medline]
- Curtis B, Giorgi S, Ungar L, Vu H, Yaden D, Liu T, et al. AI-based analysis of social media language predicts addiction treatment dropout at 90 days. Neuropsychopharmacology. 2023;48(11):1579-1585. [CrossRef] [Medline]
- Matero M, Giorgi S, Curtis B, Ungar LH, Schwartz HA. Opioid death projections with AI-based forecasts using social media language. NPJ Digit Med. 2023;6(1):35. [FREE Full text] [CrossRef] [Medline]
- Hswen Y, Naslund JA, Brownstein JS, Hawkins JB. Online communication about depression and anxiety among twitter users with schizophrenia: preliminary findings to inform a digital phenotype using social media. Psychiatr Q. 2018;89(3):569-580. [FREE Full text] [CrossRef] [Medline]
- Tofighi B, Aphinyanaphongs Y, Marini C, Ghassemlou S, Nayebvali P, Metzger I, et al. Detecting illicit opioid content on Twitter. Drug Alcohol Rev. 2020;39(3):205-208. [FREE Full text] [CrossRef] [Medline]
- Xavier C, Zhao B, Gan WQ, Slaunwhite A. Provincial overdose cohort: population data linkage during an overdose crisis. IJPDS. 2020;5(5). [CrossRef]
- Larney S, Jones N, Fiellin D, Nielsen S, Hickman M, Dobbins T, et al. Data resource profile: the opioid agonist treatment and safety (OATS) study, New South Wales, Australia. Int J Epidemiol. 2021;49(6):1774-1775. [FREE Full text] [CrossRef] [Medline]
- Chun-Hung L, Guan-Hsiung L, Wu-Chuan Y, Yu-Hsin L. Chatbot-assisted therapy for patients with methamphetamine use disorder: a preliminary randomized controlled trial. Front Psychiatry. 2023;14:1159399. [FREE Full text] [CrossRef] [Medline]
- Ogilvie L, Prescott J, Carson J. The use of chatbots as supportive agents for people seeking help with substance use disorder: a systematic review. Eur Addict Res. 2022;28(6):405-418. [FREE Full text] [CrossRef] [Medline]
- Anthony CA, Rojas EO, Keffala V, Glass NA, Shah AS, Miller BJ, et al. Acceptance and commitment therapy delivered via a mobile phone messaging robot to decrease postoperative opioid use in patients with orthopedic trauma: randomized controlled trial. J Med Internet Res. 2020;22(7):e17750. [FREE Full text] [CrossRef] [Medline]
- Dela Cruz AM, Walker R, Pipes R, Wakhlu S, Trivedi MH. Creation of an algorithm for clinical decision support for treatment of opioid use disorder with buprenorphine in primary care. Addict Sci Clin Pract. 2021;16(1):12. [FREE Full text] [CrossRef] [Medline]
- Arnold C. Inside the nascent industry of AI-designed drugs. Nat Med. 2023;29(6):1292-1295. [CrossRef] [Medline]
- Kelly CJ, Karthikesalingam A, Suleyman M, Corrado G, King D. Key challenges for delivering clinical impact with artificial intelligence. BMC Med. 2019;17(1):195. [FREE Full text] [CrossRef] [Medline]
- Petersson L, Larsson I, Nygren JM, Nilsen P, Neher M, Reed JE, et al. Challenges to implementing artificial intelligence in healthcare: a qualitative interview study with healthcare leaders in Sweden. BMC Health Serv Res. 2022;22(1):850. [FREE Full text] [CrossRef] [Medline]
- Awrahman BJ, Aziz Fatah C, Hamaamin MY. A review of the role and challenges of big data in healthcare informatics and analytics. Comput Intell Neurosci. 2022;2022:5317760. [FREE Full text] [CrossRef] [Medline]
- Furstenau LB, Leivas P, Sott MK, Dohan MS, López-Robles JR, Cobo MJ, et al. Big data in healthcare: conceptual network structure, key challenges and opportunities. Digital Communications and Networks. 2023;9(4):856-868. [CrossRef]
- Williams AR, Nunes EV, Bisaga A, Levin FR, Olfson M. Development of a cascade of care for responding to the opioid epidemic. Am J Drug Alcohol Abuse. 2019;45(1):1-10. [FREE Full text] [CrossRef] [Medline]
- Williams AR, Mauro CM, Feng T, Wilson A, Cruz A, Olfson M, et al. Performance measurement for opioid use disorder medication treatment and care retention. Am J Psychiatry. 2023;180(6):454-457. [FREE Full text] [CrossRef] [Medline]
- Alvanzo A, Baillieu R, Biello S, Brown J, Geller A, Hollen A, et al. Policy brief: states should measure opioid use disorder treatment to improve outcomes. Pew Charitable Trusts. 2022. URL: https://pew.org/3Ez9WeW [accessed 2024-07-11]
- Dickie E, Tracey C, Duncan M. Medication assisted treatment MAT standards for Scotland access, choice, support interim report. Scottish Drug Deaths Taskforce. 2021. URL: https://www.drugsandalcohol.ie/33995/1/MAT-Subgroup-Interim-Report-on-programme-to-date_Mar21.pdf [accessed 2023-06-28]
- Public Health Scotland. National Benchmarking Report on Implementation of the Medication assisted Treatment (MAT) standards: Scotland 2022/23. 2023. URL: https://www.drugsandalcohol.ie/39020/1/national-benchmarking-report-on-implementation-of-the-mat-standards-scotland-2022-23.pdf [accessed 2024-12-13]
- Minh D, Wang HX, Li YF, Nguyen TN. Explainable artificial intelligence: a comprehensive review. Artif Intell Rev. 2021;55(5):3503-3568. [CrossRef]
- Monteith S, Glenn T, Geddes J, Whybrow PC, Achtyes E, Bauer M. Expectations for Artificial Intelligence (AI) in Psychiatry. Curr Psychiatry Rep. 2022;24(11):709-721. [FREE Full text] [CrossRef] [Medline]
- Tornero-Costa R, Martinez-Millana A, Azzopardi-Muscat N, Lazeri L, Traver V, Novillo-Ortiz D. Methodological and quality flaws in the use of artificial intelligence in mental health research: systematic review. JMIR Ment Health. 2023;10:e42045. [FREE Full text] [CrossRef] [Medline]
- Andreotta AJ, Kirkham N, Rizzi M. AI, big data, and the future of consent. AI Soc. 2022;37(4):1715-1728. [FREE Full text] [CrossRef] [Medline]
- Keane PA, Topol EJ. With an eye to AI and autonomous diagnosis. NPJ Digit Med. 2018;1:40. [FREE Full text] [CrossRef] [Medline]
- Finlayson SG, Subbaswamy A, Singh K, Bowers J, Kupke A, Zittrain J, et al. The clinician and dataset shift in artificial intelligence. N Engl J Med. 2021;385(3):283-286. [FREE Full text] [CrossRef] [Medline]
- Tettey F, Parupelli SK, Desai S. A review of biomedical devices: classification, regulatory guidelines, human factors, software as a medical device, and cybersecurity. Biomedical Materials & Devices. 2023;2(1):316-341. [CrossRef]
- Zhang J, Zhang Z. Ethics and governance of trustworthy medical artificial intelligence. BMC Med Inform Decis Mak. 2023;23(1):7. [FREE Full text] [CrossRef] [Medline]
- Lehman CD, Wellman RD, Buist DSM, Kerlikowske K, Tosteson ANA, Miglioretti DL, et al. Diagnostic accuracy of digital screening mammography with and without computer-aided detection. JAMA Intern Med. 2015;175(11):1828-1837. [FREE Full text] [CrossRef] [Medline]
- Phansalkar S, van der Sijs H, Tucker AD, Desai AA, Bell DS, Teich JM, et al. Drug-drug interactions that should be non-interruptive in order to reduce alert fatigue in electronic health records. J Am Med Inform Assoc. 2013;20(3):489-493. [FREE Full text] [CrossRef] [Medline]
- Miller WR, Moyers TB. The forest and the trees: relational and specific factors in addiction treatment. Addiction. 2015;110(3):401-413. [CrossRef] [Medline]
- Lehne M, Sass J, Essenwanger A, Schepers J, Thun S. Why digital medicine depends on interoperability. NPJ Digit Med. 2019;2(1):79. [FREE Full text] [CrossRef] [Medline]
- Tay WTJ, Oteo A, Baldacchino A. Rapid opioid overdose response system technologies. Curr Opin Psychiatry. 2023. [CrossRef]
- Howe III EG, Elenberg F. Ethical challenges posed by big data. Innov Clin Neurosci. 2020;17(10-12):24-30. [FREE Full text] [Medline]
- Goold I. Digital tracking medication: big promise or big brother? Law, Innovation and Technology. 2019;11(2):203-230. [CrossRef]
- Mansouri-Benssassi E, Rogers S, Reel S, Malone M, Smith J, Ritchie F, et al. Disclosure control of machine learning models from trusted research environments (TRE): New challenges and opportunities. Heliyon. 2023;9(4):e15143. [FREE Full text] [CrossRef] [Medline]
- Graham M, Milne R, Fitzsimmons P, Sheehan M. Trust and the goldacre review: why trusted research environments are not about trust. J Med Ethics. 2023;49(10):670-673. [FREE Full text] [CrossRef] [Medline]
- Kerasidou CX, Malone M, Daly A, Tava F. Machine learning models, trusted research environments and UK health data: ensuring a safe and beneficial future for AI development in healthcare. J Med Ethics. 2023;49(12):838-843. [FREE Full text] [CrossRef] [Medline]
- Walker R, Logan T, Clark JJ, Leukefeld C. Informed consent to undergo treatment for substance abuse: a recommended approach. J Subst Abuse Treat. 2005;29(4):241-251. [CrossRef] [Medline]
- Whitney SN, McGuire AL, McCullough LB. A typology of shared decision making, informed consent, and simple consent. Ann Intern Med. 2004;140(1):54-59. [CrossRef] [Medline]
- Allan J. Ethics and practice of research with people who use drugs. In: Liamputtong P, editor. Handbook of Research Methods in Health Social Sciences. Singapore. Springer; 2017:1-17.
- Colaner N. Is explainable artificial intelligence intrinsically valuable? AI & Soc. 2021;37(1):231-238. [CrossRef]
- Thapa C, Camtepe S. Precision health data: requirements, challenges and existing techniques for data security and privacy. Comput Biol Med. 2021;129:104130. [CrossRef] [Medline]
- Almalawi A, Khan AI, Alsolami F, Abushark YB, Alfakeeh AS. Managing security of healthcare data for a modern healthcare system. Sensors (Basel). 2023;23(7):3612. [FREE Full text] [CrossRef] [Medline]
- Arnold MH. Teasing out artificial intelligence in medicine: an ethical critique of artificial intelligence and machine learning in medicine. J Bioeth Inq. 2021;18(1):121-139. [FREE Full text] [CrossRef] [Medline]
- Chakravorti B. AI's trust problem. Harvard Business Review. 2024. URL: https://hbr.org/2024/05/ais-trust-problem [accessed 2024-07-12]
- Gille F, Smith S, Mays N. Why public trust in health care systems matters and deserves greater research attention. J Health Serv Res Policy. 2015;20(1):62-64. [CrossRef] [Medline]
- Cernasev A, Hohmeier KC, Frederick K, Jasmin H, Gatwood J. A systematic literature review of patient perspectives of barriers and facilitators to access, adherence, stigma, and persistence to treatment for substance use disorder. Explor Res Clin Soc Pharm. 2021;2:100029. [FREE Full text] [CrossRef] [Medline]
- Livingston JD. A Framework for Assessing Structural Stigma in Health-Care Contexts for People with Mental Health and Substance Use Issues. Ottawa, Canada. Mental Health Commission of Canada; 2021:26.
- Siegel Z. In a world of stigma and bias, can a computer algorithm really predict overdose risk? Annals of Emergency Medicine. 2022;79(6):A16-A19. [CrossRef]
- McGraw D, Mandl KD. Privacy protections to encourage use of health-relevant digital data in a learning health system. NPJ Digit Med. 2021;4(1):2. [FREE Full text] [CrossRef] [Medline]
- Newman C, MacGibbon J, Smith AKJ, Broady T, Lupton D, Davis M, et al. Why trust digital health? Understanding the perspectives of communities affected by BBVs/STIs and social stigma. UNSW Centre for Social Research in Health, Sydney. 2020. [CrossRef]
- Davis S. Contact tracing apps: extra risks for women and marginalized groups. Health and Human Rights. 2020. URL: https://www.hhrjournal.org/2020/04/contact-tracing-apps-extra-risks-for-women-and-marginalized-groups/ [accessed 2023-01-08]
- Policy brief: mapping and population size estimates of sex workers. Global Network of Sex Work Projects. 2015. URL: https://nswp.org/sites/default/files/Mapping%26Population%20Size%20Estimates%20Policy%20Brief%2C%20NSWP%20-%20November%202015_0.pdf [accessed 2023-01-08]
- Viswanath K, McCloud RF, Lee EWJ, Bekalu MA. Measuring what matters: data absenteeism, science communication, and the perpetuation of inequities. Ann Am Acad Pol Soc Sci SAGE Publications Inc. 2022;700(1):208-219. [CrossRef]
- Lee EWJ, Viswanath K. Big data in context: addressing the twin perils of data absenteeism and chauvinism in the context of health disparities research. J Med Internet Res. 2020;22(1):e16377. [FREE Full text] [CrossRef] [Medline]
- Wu E, Villani J, Davis A, Fareed N, Harris DR, Huerta TR, et al. Community dashboards to support data-informed decision-making in the HEALing communities study. Drug Alcohol Depend. 2020;217:108331. [FREE Full text] [CrossRef] [Medline]
- Slaunwhite A, Palis H, Xavier C, Desai R, Zhao B, Zhao B. Engagement of persons with lived experience in research that uses linked administrative health data - the BC Provincial Overdose Cohort. Int J Popul Data Sci. 2022;7(3):2100. [CrossRef]
- Tay Wee Teck J, Butner JL, Baldacchino A. Understanding the use of telemedicine across different opioid use disorder treatment models: A scoping review. J Telemed Telecare. 2023. [CrossRef]
- Park S, Humphry J. Exclusion by design: intersections of social, digital and data exclusion. Inf Commun Soc Routledge. 2019;22(7):934-953. [CrossRef]
- Claborn KR, Creech S, Whittfield Q, Parra-Cardona R, Daugherty A, Benzer J. Ethical by design: engaging the community to co-design a digital health ecosystem to improve overdose prevention efforts among highly vulnerable people who use drugs. Front Digit Health. 2022;4:880849. [FREE Full text] [CrossRef] [Medline]
- Panch T, Mattie H, Atun R. Artificial intelligence and algorithmic bias: implications for health systems. J Glob Health. 2019;9(2):010318. [FREE Full text] [CrossRef] [Medline]
- Walsh CG, Chaudhry B, Dua P, Goodman KW, Kaplan B, Kavuluru R, et al. Stigma, biomarkers, and algorithmic bias: recommendations for precision behavioral health with artificial intelligence. JAMIA Open. 2020;3(1):9-15. [FREE Full text] [CrossRef] [Medline]
- Ezell JM, Ajayi BP, Parikh T, Miller K, Rains A, Scales D. Drug use and artificial intelligence: weighing concerns and possibilities for prevention. Am J Prev Med. 2024;66(3):568-572. [CrossRef] [Medline]
Abbreviations
AI: artificial intelligence
CDSS: clinical decision support systems
EHR: electronic health record
MOUD: medications for opioid use disorder
OEND: opioid overdose education and naloxone distribution
OUD: opioid use disorder
TRE: trusted research environments
Edited by A Mavragani; submitted 22.03.24; peer-reviewed by T Schmid, Z Su, M Saberikamarposhti; comments to author 04.05.24; revised version received 13.07.24; accepted 17.11.24; published 28.04.25.
Copyright©Matthew Amer, Rosalind Gittins, Antonio Martinez Millana, Florian Scheibein, Marica Ferri, Babak Tofighi, Frank Sullivan, Margaret Handley, Monty Ghosh, Alexander Baldacchino, Joseph Tay Wee Teck. Originally published in the Journal of Medical Internet Research (https://www.jmir.org), 28.04.2025.
This is an open-access article distributed under the terms of the Creative Commons Attribution License (https://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work, first published in the Journal of Medical Internet Research (ISSN 1438-8871), is properly cited. The complete bibliographic information, a link to the original publication on https://www.jmir.org/, as well as this copyright and license information must be included.