Published on 29.2.2024 in Vol 26 (2024)

Preprints (earlier versions) of this paper are available at https://preprints.jmir.org/preprint/49022.
Promises, Pitfalls, and Clinical Applications of Artificial Intelligence in Pediatrics


Viewpoint

1Children’s Hospital of Atlanta, Atlanta, GA, United States

2School of Medicine, Emory University, Atlanta, GA, United States

3Healio, South New Jersey, NJ, United States

4Cognoa, Inc, Palo Alto, CA, United States

5Division of Health Informatics, Department of Pediatrics, University of Pittsburgh, Pittsburgh, PA, United States

6UPMC Children’s Hospital of Pittsburgh, Pittsburgh, PA, United States

7Fowler School of Engineering, Chapman University, Orange, CA, United States

8SSI Strategy, Parsippany, NJ, United States

9Lapsi Health, Amsterdam, Netherlands

10Rady Children’s Hospital, San Diego, CA, United States

11AdaptX, Seattle, WA, United States

12Children’s Hospital of Orange County, Orange, CA, United States

13University of California Irvine School of Medicine, Irvine, CA, United States

Corresponding Author:

Carmela Salomon, PhD

Cognoa, Inc

2185 Park Blvd

Palo Alto, CA, 94306

United States

Phone: 1 8664264622

Email: carmela.salomon@cognoa.com


Artificial intelligence (AI) broadly describes a branch of computer science focused on developing machines capable of performing tasks typically associated with human intelligence. Those who associate AI with the world of science fiction may meet its rise with hesitancy or outright skepticism. However, AI is becoming increasingly pervasive in our society, from algorithms that help sift through airline fares to those that suggest words in emails and SMS text messages based on user choices. Data are continuously collected and leveraged by software platforms to analyze patterns and make predictions across multiple industries. Health care is gradually becoming part of this technological transformation, as advancements in computational power and storage converge with the rapid expansion of digitized medical information. Given the growing and inevitable integration of AI into health care systems, it is our viewpoint that pediatricians urgently require training and orientation to the uses, promises, and pitfalls of AI in medicine. AI is unlikely to solve the full array of complex challenges confronting pediatricians today; however, if used responsibly, it holds great potential to improve many aspects of care for providers, children, and families. Our aim in this viewpoint is to provide clinicians with a targeted introduction to the field of AI in pediatrics, including key promises, pitfalls, and clinical applications, so they can play a more active role in shaping the future impact of AI in medicine.

J Med Internet Res 2024;26:e49022

doi:10.2196/49022


What Is Artificial Intelligence in Medicine?

Artificial intelligence (AI) is a broad term describing the use of machine-based learning algorithms and software [1]. Nonmedical applications of AI include text prediction when we write emails and movie suggestions on cable and streaming services based on our previous viewing choices. Machine learning (ML) is a specific branch of AI focused on altering programs or algorithms based on exposure to data in order to improve performance over time; in other words, the “machine” is “learning” as it accumulates more data and patterns. ML exists on a continuum: supervised algorithms, for example, may require a great deal of outside input in order to function, while unsupervised algorithms can function with much greater degrees of autonomy [2]. One ML approach, useful when exploring complex nonlinear patterns, involves convolutional neural networks (CNNs). CNNs apply a special mathematical operation called a convolution across cumulative steps to perform specialized tasks such as image processing. Natural language processing (NLP) solutions using ML are also being pioneered to help computers understand, categorize, and extract insights from natural language data sets [1]. In the field of medicine, AI has shown promise in assisting with a wide array of clinical tasks, including risk prediction, diagnosis, augmented decision-making, and treatment and monitoring. Intelligent technology may also be leveraged to streamline workflows and automate some routine but historically time-intensive tasks, such as clinical note-taking [3]. NLP solutions, for example, have proven extremely useful in helping clinicians more efficiently leverage the large quantities of data collected in electronic medical records (EMRs) every day. NLP “reads” EMRs, attempting to understand medical words and phrases within the specific context of a given patient. IBM Watson, for example, has already been used successfully to examine large data sets of EMRs from diverse populations to create lists of common complications for a given population. Watson has also been used to gather medical literature in response to queries contained within examined EMRs [4].
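To make the idea of NLP “reading” clinical text more concrete, the toy sketch below maps a few phrases in a free-text note to structured concepts and flags simple negation. The mini-lexicon, negation cues, and example note are invented for illustration; production EMR pipelines rely on trained language models, robust negation detection, and curated medical ontologies rather than hand-written rules.

```python
# Minimal, illustrative sketch of rule-based concept extraction from a clinical
# note. Real NLP pipelines used on EMR text rely on trained language models and
# medical ontologies; this toy example only shows the general idea of mapping
# free text to structured findings.

import re

# Hypothetical mini-lexicon mapping surface phrases to coded concepts.
LEXICON = {
    "wheezing": "finding:wheeze",
    "shortness of breath": "finding:dyspnea",
    "albuterol": "medication:albuterol",
    "fever": "finding:fever",
}
NEGATION_CUES = ("no ", "denies ", "without ")

def extract_concepts(note: str) -> list[dict]:
    """Return concepts found in a free-text note, with a crude negation flag."""
    findings = []
    for sentence in re.split(r"[.;\n]", note.lower()):
        negated = any(cue in sentence for cue in NEGATION_CUES)
        for phrase, concept in LEXICON.items():
            if phrase in sentence:
                findings.append({"concept": concept, "negated": negated})
    return findings

note = ("8-year-old with shortness of breath and wheezing overnight. "
        "No fever. Given albuterol with partial relief.")
for item in extract_concepts(note):
    print(item)
```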

In the following sections of this viewpoint, we explore a selection of pediatric conditions where AI-based approaches are being pioneered to support risk prediction, diagnosis, therapy and response monitoring, or clinical workflow efficiency.


Risk Prediction

Despite well-established clinical standards and pathways for asthma management, asthma remains one of the most frequent clinical concerns in pediatric offices and emergency rooms [5]. Innovative AI-based solutions are being explored to help predict asthma risk so that earlier intervention can occur. One study analyzed over 335,000 asthma events spanning the years 2005 to 2018 and then used ML to predict the risk of hospitalization for patients with asthma [6]. Similarly, another set of ML algorithms integrated clinical data with information on weather, neighborhood characteristics, and community viral load to predict the likelihood of hospitalization for children with asthma [7]. Another study used EMR data from 9934 children to train ML models to accurately predict the likelihood of a child’s asthma persisting over time [8].
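As a schematic illustration of how such tabular risk models are typically built, the sketch below trains a gradient-boosted classifier on synthetic EMR-style features and reports discrimination on a held-out split. The features, simulated outcome, and model choice are placeholders and do not reproduce the cited studies’ pipelines.

```python
# Schematic sketch of training a tabular risk model for asthma-related
# hospitalization. Features, data, and model choice are synthetic placeholders.

import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
n = 5000
# Hypothetical features: prior ED visits, controller-medication adherence,
# neighborhood air-quality index, and age in years.
X = np.column_stack([
    rng.poisson(1.0, n),          # prior ED visits in the past year
    rng.uniform(0, 1, n),         # adherence proportion
    rng.normal(50, 15, n),        # air-quality index
    rng.integers(2, 18, n),       # age
])
# Synthetic outcome: risk rises with ED visits and poor adherence.
logit = 0.8 * X[:, 0] - 2.0 * X[:, 1] + 0.02 * X[:, 2] - 4.0
y = (rng.uniform(size=n) < 1 / (1 + np.exp(-logit))).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
model = GradientBoostingClassifier().fit(X_tr, y_tr)
print("AUROC:", round(roc_auc_score(y_te, model.predict_proba(X_te)[:, 1]), 3))
```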

Monitoring and Medication Management

Several companies are pioneering efforts in this space. One company is optimizing AI for spirometry in a cell phone app [9]. The app captures a patient’s exhalation with the phone microphone and then translates these acoustic signals into a curve representing lung volume. A portable spirometry test allows patients to conveniently track deterioration or therapeutic improvement over time. A separate mobile tool uses smart sensor data to register inhaler usage. This sensor analyzes a patient’s inhalation history in relation to weather conditions in order to recognize potential triggers of asthma attacks [10]. This analysis can also inform suggestions for treatment dosage based on the exacerbating factors present [10]. Another innovative digital therapy (DTx) uses AI to alert patients and parents to the early stages of an asthma attack. The DTx continuously analyzes respiratory sounds in order to more quickly recognize changes that could signal an oncoming asthmatic event. Ideally, warnings from the DTx will help children and caregivers recognize the triggers as well as the symptoms of an asthmatic crisis. Predicting outcomes in real time by using patient data and pattern detection algorithms could lead to more precise and personalized asthma management [11].
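The signal-processing intuition behind microphone spirometry can be sketched as follows: estimate airflow from the loudness envelope of the exhalation recording, then integrate flow over time to obtain a volume curve. The sound-to-flow calibration constant and the simulated recording below are assumptions made purely for illustration; deployed apps use calibrated, learned acoustic models rather than a fixed linear mapping.

```python
# Minimal sketch of the idea behind microphone spirometry: loudness envelope ->
# assumed flow -> integrated volume. All constants are illustrative placeholders.

import numpy as np

fs = 8000                                   # sample rate (Hz)
t = np.arange(0, 3.0, 1 / fs)               # 3-second exhalation
# Simulated microphone signal: noise shaped by a decaying exhalation envelope.
true_envelope = np.exp(-t / 0.8)
audio = true_envelope * np.random.default_rng(1).normal(0, 1, t.size)

# 1) Loudness envelope via RMS in short windows.
win = int(0.05 * fs)
rms = np.sqrt(np.convolve(audio**2, np.ones(win) / win, mode="same"))

# 2) Assumed linear sound-to-flow calibration (placeholder constant).
flow = 4.0 * rms                            # liters per second

# 3) Volume is the running integral of flow.
volume = np.cumsum(flow) / fs               # liters
fev1 = volume[int(1.0 * fs)]                # exhaled volume at 1 second
print(f"Estimated FEV1: {fev1:.2f} L, FVC: {volume[-1]:.2f} L")
```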


A leading children’s hospital has developed a program that uses pattern recognition and real-time data analysis to efficiently diagnose rare diseases in newborns. Leveraging AI to help automate genomic diagnosis could streamline and expedite the diagnostic process in neonatal intensive care units and pediatric intensive care units (PICUs). While clinical adoption considerations must first be addressed, such technology could potentially reduce delays to targeted life-changing treatments in newborns with diseases of previously unknown etiology [12].
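One core step in automated genomic diagnosis is matching a patient’s phenotype profile against the phenotype profiles of known genetic diseases. The toy sketch below ranks hypothetical diseases by phenotype overlap using a Jaccard score; real systems draw on Human Phenotype Ontology terms, semantic similarity, and variant-level evidence, so the disease profiles shown here are invented placeholders.

```python
# Toy sketch of the phenotype-matching step in automated genomic diagnosis.
# Disease profiles are illustrative placeholders, not curated knowledge.

PATIENT_PHENOTYPES = {"seizure", "hypotonia", "lactic acidosis"}

DISEASE_PROFILES = {
    "Disease A (hypothetical)": {"seizure", "hypotonia", "lactic acidosis", "cardiomyopathy"},
    "Disease B (hypothetical)": {"jaundice", "hepatomegaly"},
    "Disease C (hypothetical)": {"seizure", "microcephaly"},
}

def jaccard(a: set, b: set) -> float:
    """Overlap between two phenotype sets (intersection over union)."""
    return len(a & b) / len(a | b)

ranked = sorted(
    ((jaccard(PATIENT_PHENOTYPES, profile), name) for name, profile in DISEASE_PROFILES.items()),
    reverse=True,
)
for score, name in ranked:
    print(f"{name}: phenotype overlap {score:.2f}")
```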


Another serious concern in pediatrics is the early diagnosis of sepsis. In a study of almost 500 PICU patients, AI detected severe sepsis as early as 8 hours before traditional EMR-based screening algorithms did [13]. This earlier detection can have a profound impact on the management of sepsis in the PICU by allowing for earlier intervention and thus potentially reducing morbidity and mortality.
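The alerting pattern behind continuous sepsis surveillance can be illustrated with a simple rolling risk score over streaming vital signs. The hand-weighted logistic score, threshold, and simulated vitals below are stand-ins for the trained physiomarker models described in the cited study; they are not a validated clinical score.

```python
# Illustrative sketch of continuous sepsis risk scoring over streaming vitals.
# The weights and threshold are invented placeholders, not a trained model.

import math
from collections import deque

def risk_score(hr: float, map_bp: float, temp_c: float) -> float:
    """Hand-tuned placeholder logistic score over heart rate, MAP, temperature."""
    z = 0.06 * (hr - 110) - 0.08 * (map_bp - 60) + 0.9 * (temp_c - 37.5)
    return 1 / (1 + math.exp(-z))

window = deque(maxlen=12)          # rolling window of recent risk scores
ALERT_THRESHOLD = 0.7

# Simulated readings drifting toward a septic physiology.
stream = [(100 + 5 * i, 65 - 1.5 * i, 37.0 + 0.2 * i) for i in range(12)]
for step, (hr, map_bp, temp) in enumerate(stream):
    window.append(risk_score(hr, map_bp, temp))
    smoothed = sum(window) / len(window)
    if smoothed > ALERT_THRESHOLD:
        print(f"step {step}: smoothed risk {smoothed:.2f} -> early sepsis alert")
```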


In the United States, approximately 7% of adolescents develop persistent opioid use after surgery [14]. A new company focused on health and wellness for athletes recently made use of the AI-powered software OR Advisor (AdaptX). OR Advisor enables frontline clinicians to rapidly analyze real-world data collected in their EMRs, track clinical metrics over time, and monitor treatment outcomes [14]. OR Advisor can help physicians more quickly answer clinically relevant questions, such as how adjusting medication doses might affect pain management. Using OR Advisor, Seattle Children’s Bellevue Clinic and Surgery Center was able to reduce opioid administration from 85% of surgeries to less than 1% [15]. Decreasing perioperative opioid use could not only save thousands of lives per year but also reduce complications, thereby shortening the length of stay and improving recovery.
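The kind of metric tracking described above can be illustrated by summarizing, month by month, the proportion of surgeries in which any opioid was administered and comparing it against a baseline period. The figures below are invented and the shift check is deliberately simple; this is not AdaptX’s actual methodology.

```python
# Sketch of tracking a perioperative metric over time: monthly proportions of
# surgeries with any opioid administration, compared against a baseline period.
# Data are fabricated for illustration.

baseline_months = [0.85, 0.83, 0.86, 0.84]            # pre-intervention proportions
recent_months = [0.40, 0.12, 0.03, 0.008, 0.006]      # after a protocol change

baseline_mean = sum(baseline_months) / len(baseline_months)
for i, p in enumerate(recent_months, start=1):
    change = (baseline_mean - p) / baseline_mean * 100
    flag = "sustained shift" if p < baseline_mean / 2 else "watch"
    print(f"month {i}: opioid use in {p:.1%} of cases "
          f"({change:.0f}% below baseline, {flag})")
```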


AI in radiology is perhaps the most impressive application of AI in medicine, with CNNs often performing as accurately as, or even more accurately than, clinicians. AI has already been applied in a variety of pediatric settings, such as scoliosis quantification, fracture and injury detection, and cystic fibrosis scoring on chest radiographs [16], but there are many other potential uses. One of the most promising uses of CNNs is as a screening tool for life-threatening diagnoses that require rapid intervention, such as acute ischemic strokes [17]. CNNs can accelerate the timeline of a non–life-threatening diagnosis (such as a small pneumonia) as well. Sophisticated AI image interpretation can also potentially lead to novel discoveries in medical images (such as tumor-stroma interfaces) that can help with subtyping, therapy modification, and prognostication of medical conditions [18]. To date, the majority of AI-based radiology solutions have been developed for adult populations. Radiology imaging advocacy groups have recently begun lobbying the US Congress to develop policies addressing the scarcity of pediatric-specific AI-based innovations [19].
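For readers unfamiliar with CNNs, the PyTorch sketch below defines a deliberately tiny network that maps a grayscale radiograph to a single “urgent finding” probability. The layer sizes, input resolution, and task framing are arbitrary illustrative choices; clinical-grade radiology models are far larger and are trained and validated on curated, labeled imaging data.

```python
# Minimal sketch of a CNN for a binary radiograph screening task.
# Untrained and illustrative only.

import torch
import torch.nn as nn

class TinyRadiographCNN(nn.Module):
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        self.classifier = nn.Sequential(
            nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(32, 1)
        )

    def forward(self, x):
        return self.classifier(self.features(x))    # raw logit per image

model = TinyRadiographCNN()
dummy_batch = torch.randn(4, 1, 224, 224)            # 4 grayscale radiographs
prob_urgent = torch.sigmoid(model(dummy_batch)).squeeze(1)
print(prob_urgent)                                    # untrained outputs near 0.5
```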


AI is being leveraged to support improved behavioral and mental health outcomes in pediatric populations, including in children with autism. While autism diagnosis is possible as early as the age of 18 months, the average age of diagnosis in the United States remains older than 4 years [20,21]. The majority of these children are diagnosed in specialty care; however, the increasing demand for evaluations coupled with a shortage of specialists has contributed to lengthy waits for assessments [20,22,23]. Delays in treatment initiation can have lifelong implications for children and their families [24,25]. Applications using AI can potentially reduce these delays. For example, a Food and Drug Administration (FDA)–authorized AI-based diagnostic device, Canvas Dx, uses inputs from a child’s health care provider, caregiver, and video analysis to help health care providers rapidly diagnose or rule out autism in young children with concern for developmental delay [26-28]. The diagnostic device can be used in primary care settings, potentially decreasing delays to treatment initiation and thus significantly improving the child’s trajectory [27].
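Conceptually, a multi-input diagnostic aid combines several information streams and may abstain when the evidence is mixed. The sketch below fuses three hypothetical stream-level probabilities (caregiver questionnaire, clinician questionnaire, video-based features) with placeholder weights and thresholds and returns “positive,” “negative,” or “indeterminate”; it is illustrative only and does not represent the cited device’s actual algorithm.

```python
# Conceptual sketch of fusing multiple input streams into one diagnostic aid
# that can abstain. Weights, thresholds, and scores are invented placeholders.

def combine(caregiver_p: float, clinician_p: float, video_p: float) -> str:
    """Each input is a model-estimated probability of autism from one stream."""
    weights = (0.35, 0.35, 0.30)
    fused = sum(w * p for w, p in zip(weights, (caregiver_p, clinician_p, video_p)))
    if fused >= 0.75:
        return f"positive (fused score {fused:.2f})"
    if fused <= 0.25:
        return f"negative (fused score {fused:.2f})"
    return f"indeterminate (fused score {fused:.2f}) - refer for further evaluation"

print(combine(0.82, 0.78, 0.70))   # concordant high-risk inputs
print(combine(0.60, 0.35, 0.50))   # mixed inputs fall into the abstention band
```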

AI may also have a role to play in reducing the diagnostic bias prevalent in some autism care pathways. Currently, compared with their White peers, Black and Latino or Hispanic children receive later autism diagnoses [29,30]. One study found that, in comparison to White children, Hispanic and Black children are 65% and 19% less likely, respectively, to be diagnosed with autism, and these racial or ethnic disparities exist regardless of economic class [29]. Similarly, girls are diagnosed with autism an average of 1.5 years later than their male peers [31,32]. Children from rural communities also experience diagnostic delays and access barriers. AI could help address some of these biases and promote diagnostic equity in several ways. Item-level racial or gender bias noted in some existing diagnostic and screening tools [33], for example, could be reduced through careful training of AI-based diagnostic alternatives on large, accurately labeled samples that are diverse in gender, race, and ethnicity. AI-based, remotely administered technologies may also reduce access barriers for children in rural communities who face health care disparities due to geographical isolation, hospital closures, and insufficient clinical workforce coverage [34].


Mental health management in preteens and adolescents is another important concern in the primary care pediatric setting and can be especially challenging in the context of limited mental health resource availability. An estimated 31% of teens have an anxiety disorder [35]. The severity of these disorders and the scarcity of providers make it imperative that patients and caregivers have easily accessible resources that can be used while waiting to see a mental health provider. One example is Woebot, a research-based mental health software app that provides digital cognitive behavioral therapy to patients. Using AI, Woebot tailors its conversations to each patient, with the goal of providing personalized and high-quality care [33]. In a study published in JMIR, the intervention group experienced a 22% reduction in depressive symptoms over the course of 2 weeks of using this software in comparison with the control group [33]. The product was initially tested with subjects between the ages of 18 and 65 years; however, it may prove useful in the pediatric space as well, given ongoing mental health access and resource issues. Table S1 in Multimedia Appendix 1 [6-10,12,13,36,37] summarizes the topical studies discussed in this viewpoint, including further detail about methodology, performance metrics, and product development.


Although AI has the potential to meaningfully benefit pediatric care, a number of regulatory, implementation, and ethical challenges exist. Additionally, poorly conceived AI has the potential to cause harm and therefore requires careful clinical consideration, risk identification, and mitigation as part of any workflow implementation process. First, clinicians should be aware that training algorithms often requires large data sets. In pediatrics, the small sample sizes of some data sets, especially when split by age group, can pose challenges when building unbiased AI algorithms. Relatedly, in order to build less biased algorithms, training data must be representative of the applicable populations across domains such as race, ethnicity, gender, socioeconomic status, and geographic contexts. Obtaining samples with known demographics and large enough subsamples of intersecting groups can prove challenging.

While we previously touched on the potential of AI to enhance health equity, the reverse may also be true. Without careful and equitable sampling, algorithms can potentially perpetuate rather than reduce bias. Even when this risk is mitigated by training AI algorithms on a diverse patient base, bias can still occur at the data labeling level. This is because the algorithm may be “learning” from the implicit biases of humans who are annotating the input data or validating its output classifications. Clinicians can help mitigate these risks by requesting information on the size and diversity of the training data used and through careful interrogation of any potential labeling biases.
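One concrete way clinicians can interrogate potential bias is to request a subgroup performance audit. The sketch below computes sensitivity and specificity separately for two demographic groups from a handful of fabricated prediction records; in practice, this would be run over a held-out validation set with known labels and adequately sized subgroups.

```python
# Simple sketch of a subgroup performance audit: compare sensitivity and
# specificity of a model's predictions across demographic groups.
# Records below are fabricated for illustration.

from collections import defaultdict

# (group, true_label, predicted_label)
records = [
    ("group_a", 1, 1), ("group_a", 1, 1), ("group_a", 0, 0), ("group_a", 1, 0),
    ("group_b", 1, 1), ("group_b", 1, 0), ("group_b", 1, 0), ("group_b", 0, 0),
]

counts = defaultdict(lambda: {"tp": 0, "fn": 0, "tn": 0, "fp": 0})
for group, truth, pred in records:
    key = {(1, 1): "tp", (1, 0): "fn", (0, 0): "tn", (0, 1): "fp"}[(truth, pred)]
    counts[group][key] += 1

for group, c in counts.items():
    sens = c["tp"] / (c["tp"] + c["fn"]) if (c["tp"] + c["fn"]) else float("nan")
    spec = c["tn"] / (c["tn"] + c["fp"]) if (c["tn"] + c["fp"]) else float("nan")
    print(f"{group}: sensitivity {sens:.2f}, specificity {spec:.2f}")
```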

Even when AI is built on representative data sets and yields generalizable algorithms, implementation challenges may reduce the extent to which such technologies can address inequities in practice. An unbiased algorithm implemented inequitably could, in fact, maintain or perpetuate disparities in pediatric health care. Families without access to a computer or reliable internet connectivity, for example, may not be able to access some digital technologies that could benefit them. As AI-powered tools become more commonplace in clinical practice, we recommend that their implementation be accompanied by physician education, antibias training, and system-wide change aimed at making health care more accessible to marginalized populations. Similarly, disparities in access to technology should be addressed. These goals should be supported through tactics such as the development of consensus ethical frameworks for the use of AI in health care, the introduction of robust AI and ethics curricula in medical education, and regulatory frameworks.

There are a number of ethical questions to consider when integrating AI-based technologies into clinical pathways. For example, what are the implications of physicians relying on outputs from complex, multidimensional ML models in clinical decision-making when they may have limited visibility into how such models arrive at diagnostic conclusions? [3,38] Thoughtful use of technology also necessitates weighing the benefits of intelligent technology against the potential scale of harm that could occur if a widely deployed algorithm were to malfunction [3]. Open questions also remain about who should be held accountable for incorrect diagnostic or treatment recommendations produced by devices powered by AI [39]. Some patients have also raised concerns about how their data privacy and security will be maintained within such systems [40]. In response to these questions, a number of ethical frameworks for the use of AI in clinical practice are being developed [41,42]. These include action points related to building institutional capabilities to redress any emerging AI-related harms; distinguishing between appropriate and inappropriate AI task delegation; developing metrics to help measure and monitor the trustworthiness of AI-based technologies; and financially incentivizing the inclusion of ethical, legal, and social considerations in AI research projects [41].

Open questions also remain about how to effectively regulate AI-based medical technologies. To date, the FDA has cleared more than 160 AI-powered devices for use [43]. Existing regulatory frameworks, however, do not account for the ability of intelligent devices to iteratively “learn” from the data they are exposed to and alter their algorithms in response [44]. Should such features be disabled in approved products to avoid drift in performance and function? If not, how should such features be scrutinized and regulated on an ongoing basis? The FDA’s proposed regulatory framework for software as a medical device (SaMD) technologies provides some direction toward potential solutions, such as predetermined change control plans and algorithm change protocol specifications [45].


As clinical AI apps become more commonplace, pediatricians will require fundamental training and orientation to the data science behind AI so they can use AI-enabled tools appropriately and evaluate their strengths and weaknesses. However, AI integration into medical curricula is currently limited and inconsistent, and no topical content is tested in key licensing examinations [46]. This has left both students and practicing clinicians with knowledge and confidence gaps regarding the application of AI in clinical practice [47-49].

Medical schools should establish introductory AI courses to foster clinical confidence and competence; the future pediatrician will benefit greatly from the synergy between AI and human cognition. A number of proposed curriculum updates are being explored [46,50,51], including MEd2030, an updated medical school curriculum that will add 10 modules (eg, biomedical informatics and AI, transdisciplinary collaboration and diversity principles, biomedical entrepreneurship, and design innovation) [52]. In the continuing education sector, conferences and programs on digital therapeutics and AI-based health technologies are also growing as more companies introduce FDA-regulated products.

Actively engaging with AI-based technologies will allow pediatricians to play a greater role in the implementation and use of AI in every aspect of health care. Learning to apply developing technologies is one key to optimizing patient care and improving the future of health care systems globally. When EMRs were first introduced in the 2000s, many clinicians knew about them, but few were engaged in the implementation and continual improvement of the technology. EMRs have since infiltrated almost every aspect of a clinician’s practice; yet, as a result of limited early-stage clinical engagement, they are often not optimized for the clinical workflow [1]. By seeking to understand and engage with the next wave of technological advances earlier and more proactively, pediatricians can help ensure that these tools are better tailored to the needs of both care providers and patients.


Health care delivery models are changing rapidly, and AI is poised to greatly impact the future of pediatric care. Efficient diagnostic tools, faster data analysis, predictive outcome modeling, more streamlined care experiences, and automation of some previously time-intensive tasks are just some of the potential benefits offered by AI. However, widespread clinical integration of AI into pediatric care pathways will also require thoughtful solutions to complex data quality and privacy issues as well as ethical and regulatory implementation barriers. We recommend that tailored AI-focused curricula and ongoing training opportunities be developed and implemented rapidly to support pediatricians and medical students alike in using such technologies responsibly and in understanding their strengths and limitations more deeply. With improved education, we are optimistic about the future potential of AI to enhance clinical efficiencies and outcomes and to support broader access to high-quality pediatric health care in marginalized communities.

Acknowledgments

No generative AI was used in any portion of the manuscript writing.

Authors' Contributions

AO contributed to the manuscript section focusing on using artificial intelligence (AI) to expedite genomic diagnosis and to predict the physiological deterioration of noncardiac patients in the intensive care unit as well as patients in the cardiothoracic intensive care unit. DvS contributed to the case study examples including the use of AI in pediatric respiratory diseases. AK made revisions to each section and helped with the organization of the manuscript. CS contributed to the section on the challenges of AI in health care, made revisions to each section of the manuscript, and helped with the organization of the manuscript. ST contributed to all sections of the manuscript and provided final review and edits. Primary contributions were in the overview of AI, the use of AI in clinical care specifically in regard to autism spectrum disorder and behavioral and mental health, challenges in AI, and future medical education. RK contributed to all sections of the manuscript and provided overall review and edits. SS contributed to all sections of the manuscript and provided overall review and edits. AC contributed to the section titled “What is artificial intelligence in medicine?” and provided overall review and edits. DL contributed to the section of the manuscript discussing the use of AI to reduce opioid use in surgery.

Conflicts of Interest

RK, HB, and AK are consultants for Cognoa, Inc. ST and CS are employees of Cognoa, Inc and have its stock options. ST additionally receives consulting fees for Cognito Therapeutics, volunteers as a board member of the American Academy of Pediatrics, Orange County Chapter, and American Academy of Pediatrics–California, is a paid advisor for MI10 LLC, and owns stock in NTX, Inc, and HandzIn. HB is also an advisor for FreeSpira. DvS is an employee and shareholder of Lapsi Health. DL is an employee of AdaptX. AO is an employee of Rady Children’s Hospital. AO also serves on the board of directors for San Diego Health Connect and the advisory board for KLAS Research.

Multimedia Appendix 1

Summary of reviewed studies.

DOCX File , 16 KB

  1. Chang AC. Intelligence-Based Medicine: Artificial Intelligence and Human Cognition in Clinical Medicine and Healthcare. London, United Kingdom. Academic Press; 2020.
  2. Aylward BS, Abbas H, Taraman S, Salomon C, Gal-Szabo D, Kraft C, et al. An introduction to artificial intelligence in developmental and behavioral pediatrics. J Dev Behav Pediatr. 2023;44(2):e126-e134. [FREE Full text] [CrossRef] [Medline]
  3. Topol EJ. High-performance medicine: the convergence of human and artificial intelligence. Nat Med. 2019;25(1):44-56. [CrossRef] [Medline]
  4. Miller DD, Brown EW. Artificial intelligence in medical practice: the question to the answer? Am J Med. 2018;131(2):129-133. [FREE Full text] [CrossRef] [Medline]
  5. Zahran HS, Bailey CM, Damon SA, Garbe PL, Breysse PN. Vital signs: asthma in children - United States, 2001-2016. MMWR Morb Mortal Wkly Rep. 2018;67(5):149-155. [FREE Full text] [CrossRef] [Medline]
  6. Luo G, He S, Stone BL, Nkoy FL, Johnson MD. Developing a model to predict hospital encounters for asthma in asthmatic patients: secondary analysis. JMIR Med Inform. 2020;8(1):e16080. [FREE Full text] [CrossRef] [Medline]
  7. Patel SJ, Chamberlain DB, Chamberlain JM. A machine learning approach to predicting need for hospitalization for pediatric asthma exacerbation at the time of emergency department triage. Acad Emerg Med. 2018;25(12):1463-1470. [FREE Full text] [CrossRef] [Medline]
  8. Bose S, Kenyon CC, Masino AJ. Personalized prediction of early childhood asthma persistence: a machine learning approach. PLoS One. 2021;16(3):e0247784. [FREE Full text] [CrossRef] [Medline]
  9. Larson EC, Goel M, Boriello G, Heltshe S, Rosenfeld M, Patel SN. SpiroSmart: using a microphone to measure lung function on a mobile phone. ACM Press; 2012. Presented at: UbiComp '12: The 2012 ACM Conference on Ubiquitous Computing; September 5-8, 2012; Pittsburgh, PA. p. 280. [CrossRef]
  10. Van Sickle D, Barrett M, Humblet O, Henderson K, Hogg C. Randomized, controlled study of the impact of a mobile health tool on asthma SABA use, control and adherence. Eur Respir J. 2016;48(suppl 60):PA1018. [FREE Full text] [CrossRef]
  11. Dimitriades JB. Ep. 85: value of digital therapeutics – Jhonatan Dimitriades, MD (CEO Lapsi Health). Health Podcast Network. 2021. URL: https://healthpodcastnetwork.com/episodes/health-unchained/ep-85-value-of-digital-therapeutics-jhonatan-dimitriades-md-ceo-lapsi-health/ [accessed 2021-11-15]
  12. Clark MM, Hildreth A, Batalov S, Ding Y, Chowdhury S, Watkins K, et al. Diagnosis of genetic diseases in seriously ill children by rapid whole-genome sequencing and automated phenotyping and interpretation. Sci Transl Med. 2019;11(489):eaat6177. [FREE Full text] [CrossRef] [Medline]
  13. Kamaleswaran R, Akbilgic O, Hallman MA, West AN, Davis RL, Shah SH. Applying artificial intelligence to identify physiomarkers predicting severe sepsis in the PICU. Pediatr Crit Care Med. 2018;19(10):e495-e503. [FREE Full text] [CrossRef] [Medline]
  14. AdaptX- eliminating opioids from ambulatory care. AdaptX. URL: https://www.adaptx.com/ [accessed 2024-01-31]
  15. Low D. Closing the gateway from surgery to persistent opioid use. Institute for Healthcare Improvement. 2019. URL: http://www.ihi.org/communities/blogs/closing-the-gateway-from-surgery-to-persistent-opioid-use [accessed 2021-11-05]
  16. Otjen JP, Moore MM, Romberg EK, Perez FA, Iyer RS. The current and future roles of artificial intelligence in pediatric radiology. Pediatr Radiol. 2022;52(11):2065-2073. [FREE Full text] [CrossRef] [Medline]
  17. Öman O, Mäkelä T, Salli E, Savolainen S, Kangasniemi M. 3D convolutional neural networks applied to CT angiography in the detection of acute ischemic stroke. Eur Radiol Exp. 2019;3(1):8. [FREE Full text] [CrossRef] [Medline]
  18. Baxi V, Edwards R, Montalto M, Saha S. Digital pathology and artificial intelligence in translational medicine and clinical practice. Mod Pathol. 2022;35(1):23-32. [FREE Full text] [CrossRef] [Medline]
  19. Stempniak M. Radiology groups urge Congress to address scarcity of AI solutions in pediatric care. Radiology Business. 2022. URL: https://www.radiologybusiness.com/topics/artificial-intelligence/radiology-congress-ai-solutions-pediatric-care [accessed 2024-01-31]
  20. Maenner MJ, Shaw KA, Bakian AV, Bilder DA, Durkin MS, Esler A, et al. Prevalence and characteristics of autism spectrum disorder among children aged 8 years - autism and developmental disabilities monitoring network, 11 sites, United States, 2018. MMWR Surveill Summ. 2021;70(11):1-16. [FREE Full text] [CrossRef] [Medline]
  21. Pierce K, Gazestani VH, Bacon E, Barnes CC, Cha D, Nalabolu S, et al. Evaluation of the diagnostic stability of the early autism spectrum disorder phenotype in the general population starting at 12 months. JAMA Pediatr. 2019;173(6):578-587. [FREE Full text] [CrossRef] [Medline]
  22. Maenner MJ, Shaw KA, Baio J, Washington A, Patrick M, DiRienzo M, et al. Prevalence of autism spectrum disorder among children aged 8 years - autism and developmental disabilities monitoring network, 11 sites, United States, 2016. MMWR Surveill Summ. 2020;69(4):1-12. [FREE Full text] [CrossRef] [Medline]
  23. Zuckerman KE, Lindly OJ, Sinche BK. Parental concerns, provider response, and timeliness of autism spectrum disorder diagnosis. J Pediatr. 2015;166(6):1431-1439.e1. [FREE Full text] [CrossRef] [Medline]
  24. Vivanti G, Dissanayake C, Victorian ASELCC Team. Outcome for children receiving the early start denver model before and after 48 months. J Autism Dev Disord. 2016;46(7):2441-2449. [FREE Full text] [CrossRef] [Medline]
  25. MacDonald R, Parry-Cruwys D, Dupere S, Ahearn W. Assessing progress and outcome of early intensive behavioral intervention for toddlers with autism. Res Dev Disabil. 2014;35(12):3632-3644. [FREE Full text] [CrossRef] [Medline]
  26. CanvasDx. URL: https://canvasdx.com/ [accessed 2021-10-26]
  27. Megerian JT, Dey S, Melmed RD, Coury DL, Lerner M, Nicholls CJ, et al. Evaluation of an artificial intelligence-based medical device for diagnosis of autism spectrum disorder. NPJ Digit Med. 2022;5(1):57. [FREE Full text] [CrossRef] [Medline]
  28. FDA authorizes marketing of diagnostic aid for autism spectrum disorder. U.S. Food & Drug Administration. 2021. URL: https://www.fda.gov/news-events/press-announcements/fda-authorizes-marketing-diagnostic-aid-autism-spectrum-disorder [accessed 2024-01-31]
  29. Durkin MS, Maenner MJ, Baio J, Christensen D, Daniels J, Fitzgerald R, et al. Autism spectrum disorder among US children (2002-2010): socioeconomic, racial, and ethnic disparities. Am J Public Health. 2017;107(11):1818-1826. [FREE Full text] [CrossRef] [Medline]
  30. Wiggins LD, Durkin M, Esler A, Lee LC, Zahorodny W, Rice C, et al. Disparities in documented diagnoses of autism spectrum disorder based on demographic, individual, and service factors. Autism Res. 2020;13(3):464-473. [FREE Full text] [CrossRef] [Medline]
  31. McCormick CEB, Kavanaugh BC, Sipsock D, Righi G, Oberman LM, De Luca DM, et al. Autism heterogeneity in a densely sampled U.S. population: results from the first 1,000 participants in the RI-CART study. Autism Res. 2020;13(3):474-488. [FREE Full text] [CrossRef] [Medline]
  32. Harrison AJ, Long KA, Tommet DC, Jones RN. Examining the role of race, ethnicity, and gender on social and behavioral ratings within the autism diagnostic observation schedule. J Autism Dev Disord. 2017;47(9):2770-2782. [FREE Full text] [CrossRef] [Medline]
  33. Prochaska JJ, Vogel EA, Chieng A, Kendra M, Baiocchi M, Pajarito S, et al. A therapeutic relational agent for reducing problematic substance use (Woebot): development and usability study. J Med Internet Res. 2021;23(3):e24850. [FREE Full text] [CrossRef] [Medline]
  34. Digital therapeutics: reducing rural health inequalities. Digital Therapeutics Alliance. 2022. URL: https://dtxalliance.org/wp-content/uploads/2021/01/DTA_Rural-Health_r13_110220.pdf [accessed 2024-04-30]
  35. Any anxiety disorder. The National Institute of Mental Health. URL: https://www.nimh.nih.gov/health/statistics/any-anxiety-disorder [accessed 2021-11-05]
  36. Yang J, Zhang K, Fan H, Huang Z, Xiang Y, Yang J, et al. Development and validation of deep learning algorithms for scoliosis screening using back images. Commun Biol. 2019;2:390. [CrossRef] [Medline]
  37. Wall DP, Liu-Mayo S, Salomon C, Shannon J, Taraman S. Optimizing a de novo artificial intelligence-based medical device under a predetermined change control plan: improved ability to detect or rule out pediatric autism. Intelligence-Based Med. 2023;8:100102. [CrossRef]
  38. Ghassemi M, Oakden-Rayner L, Beam AL. The false hope of current approaches to explainable artificial intelligence in health care. Lancet Digit Health. 2021;3(11):e745-e750. [FREE Full text] [CrossRef] [Medline]
  39. Davenport T, Kalakota R. The potential for artificial intelligence in healthcare. Future Healthc J. 2019;6(2):94-98. [FREE Full text] [CrossRef] [Medline]
  40. Richardson JP, Smith C, Curtis S, Watson S, Zhu X, Barry B, et al. Patient apprehensions about the use of artificial intelligence in healthcare. NPJ Digit Med. 2021;4(1):140. [FREE Full text] [CrossRef] [Medline]
  41. Floridi L, Cowls J, Beltrametti M, Chatila R, Chazerand P, Dignum V, et al. An ethical framework for a good AI society: opportunities, risks, principles, and recommendations. In: Ethics, Governance, and Policies in Artificial Intelligence. Cham, Switzerland. Springer International Publishing; 2021;19-39.
  42. Miller KW. Moral responsibility for computing artifacts: "the rules". IT Prof. 2011;13(3):57-59. [FREE Full text] [CrossRef]
  43. Katznelson G, Gerke S. The need for health AI ethics in medical school education. Adv Health Sci Educ Theory Pract. 2021;26(4):1447-1458. [FREE Full text] [CrossRef] [Medline]
  44. Kagiyama N, Shrestha S, Farjo PD, Sengupta PP. Artificial intelligence: practical primer for clinical research in cardiovascular disease. J Am Heart Assoc. 2019;8(17):e012788. [FREE Full text] [CrossRef] [Medline]
  45. Proposed regulatory framework for modifications to Artificial Intelligence/Machine Learning (AI/ML)-based Software as a Medical Device (SaMD) - discussion paper and request for feedback. U.S. Food and Drug Administration. 2019. URL: https://www.regulations.gov/document/FDA-2019-N-1185-0001 [accessed 2024-01-31]
  46. Paranjape K, Schinkel M, Panday RN, Car J, Nanayakkara P. Introducing artificial intelligence training in medical education. JMIR Med Educ. 2019;5(2):e16048. [FREE Full text] [CrossRef] [Medline]
  47. Banerjee M, Chiew D, Patel KT, Johns I, Chappell D, Linton N, et al. The impact of artificial intelligence on clinical education: perceptions of postgraduate trainee doctors in London (UK) and recommendations for trainers. BMC Med Educ. 2021;21(1):429. [FREE Full text] [CrossRef] [Medline]
  48. Dos Santos DP, Giese D, Brodehl S, Chon SH, Staab W, Kleinert R, et al. Medical students' attitude towards artificial intelligence: a multicentre survey. Eur Radiol. 2019;29(4):1640-1646. [FREE Full text] [CrossRef] [Medline]
  49. Sit C, Srinivasan R, Amlani A, Muthuswamy K, Azam A, Monzon L, et al. Attitudes and perceptions of UK medical students towards artificial intelligence and radiology: a multicentre survey. Insights Imaging. 2020;11(1):14. [FREE Full text] [CrossRef] [Medline]
  50. Kolachalama VB, Garg PS. Machine learning and medical education. NPJ Digit Med. 2018;1:54. [FREE Full text] [CrossRef] [Medline]
  51. Wartman SA, Combs CD. Reimagining medical education in the age of AI. AMA J Ethics. 2019;21(2):E146-E152. [FREE Full text] [CrossRef] [Medline]
  52. Chang A. Medical education for the future clinician: a proposal for our medical education leaders. Medical Intelligence 10. URL: https://www.mi10.ai/2021/10/06/medical-education-for-the-future-clinician/ [accessed 2024-04-30]


AI: artificial intelligence
CNN: convolutional neural network
DTx: digital therapy
EMR: electronic medical record
FDA: Food and Drug Administration
ML: machine learning
NLP: natural language processing
PICU: pediatric intensive care unit


Edited by T de Azevedo Cardoso; submitted 15.05.23; peer-reviewed by V Kaelin, D Riggins; comments to author 11.08.23; revised version received 01.09.23; accepted 22.01.24; published 29.02.24.

Copyright

©Hansa Bhargava, Carmela Salomon, Srinivasan Suresh, Anthony Chang, Rachel Kilian, Diana van Stijn, Albert Oriol, Daniel Low, Ashley Knebel, Sharief Taraman. Originally published in the Journal of Medical Internet Research (https://www.jmir.org), 29.02.2024.

This is an open-access article distributed under the terms of the Creative Commons Attribution License (https://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work, first published in the Journal of Medical Internet Research, is properly cited. The complete bibliographic information, a link to the original publication on https://www.jmir.org/, as well as this copyright and license information must be included.