ViewPoint
Abstract
Artificial intelligence (AI) is being rolled out across the UK National Health Service (NHS) to improve efficiency; yet, its carbon footprint is largely invisible within mandatory Green Plan reporting. This work shows where NHS carbon reporting omits AI-related emissions and proposes feasible accounting and procurement measures that allow trusts to assess whether AI adoption advances or undermines net zero. A review of NHS sustainability guidance, the Department for Environment, Food & Rural Affairs conversion factors, and recent evidence on AI energy use shows that current Scopes 1-3 accounting omits substantial emissions at 3 points. First, a lack of granularity provides averages that can obscure the extreme energy intensity of certain AI workloads. Second, life-cycle emissions from specialized hardware (eg, graphics processing units) are often excluded unless trusts own the equipment, ignoring upstream manufacturing impacts. Third, widespread use of unprocured generative AI tools is unmeasured; extrapolating general practice survey data suggests that ChatGPT queries alone could release ≈349 t CO₂e per year in primary care. To close these reporting gaps, we propose three measures: (1) AI-specific carbon disclosure clauses in vendor contracts, (2) inclusion of cradle-to-grave emission factors for AI hardware in Scope 3 reporting, and (3) lightweight monitoring of external AI traffic (while recognizing the potential ethical issues this raises). Implementing these measures would give health care leaders a more accurate baseline against which to judge whether AI supports or undermines the NHS net-zero target.
J Med Internet Res 2025;27:e79174. doi: 10.2196/79174
Why Artificial Intelligence Matters for NHS Net Zero
Artificial intelligence (AI) is increasingly integrated into health care delivery, including within the UK’s National Health Service (NHS) []. Although AI promises more efficient care [] and potential climate benefits [], it also carries a carbon cost due to the energy-intensive computation and manufacturing of digital infrastructure []. On the one hand, AI-driven automation could reduce emissions by streamlining care processes, optimizing resource use, and supporting remote care. On the other hand, AI systems (particularly large-scale models) consume significant energy and contribute to carbon emissions through data center operations, hardware production, and software training. Given that the NHS accounts for 4% of the United Kingdom’s total carbon emissions [], it is crucial to understand how AI aligns with its sustainability goals. This can only be done if the reporting of carbon emissions from AI is accurate. This work therefore focuses on carbon emissions reporting in the NHS and on how AI complicates current practice, showing where current reporting omits AI-related emissions and proposing feasible accounting and procurement measures that would allow trusts to better measure AI emissions.
The NHS has committed to achieving net zero emissions by 2040 for direct operations (Scopes 1 and 2) and by 2045 for indirect emissions (Scope 3), as part of the United Kingdom’s wider goal of net zero by 2050 []. A visual of these targets from NHS England can be seen in []. Current reporting frameworks identify emissions within these scopes, but critically, there is no specific accounting for the emissions associated with AI technologies, potentially leaving a significant gap in the NHS’s net zero roadmap. In this piece, I argue that the NHS’s current net zero strategy potentially underestimates or overlooks the carbon footprint of AI. Without proper measurement and regulation (such as AI carbon accounting and sustainable procurement), rising AI adoption could undermine the NHS’s environmental goals. The focus of this viewpoint is not whether AI will prove environmentally positive or negative for the NHS (which has been covered elsewhere, such as [,]) but rather that current reporting practices are not stringent enough, given the increasing use of AI in the NHS.

NHS Net Zero Commitments and Reporting
The NHS has set ambitious net-zero targets, now embedded in law via the Health and Care Act 2022 []. This legislation places a duty on all NHS trusts, foundation trusts, and Integrated Care Boards to contribute to statutory emissions reduction targets []. In practical terms, every NHS trust has been asked to develop a Green Plan to map out a rapid decarbonization trajectory in line with these national goals []. These reports cover the organization’s greenhouse gas emissions across Scopes 1 and 2 and a defined subset of Scope 3, aligning with NHS England’s Carbon Footprint (directly controlled emissions) and Carbon Footprint Plus (influenced emissions) frameworks []. In short, all NHS trusts are now required to measure and disclose their carbon footprint annually.
No specific mandate exists to isolate AI software emissions in NHS reports, but trusts are expected to account for the carbon impact of digital health technologies under their broader footprint. AI tools are typically subsumed under existing categories such as IT and electricity use. If an AI application runs on hospital-owned servers or devices, its energy consumption contributes to the trust’s Scope 2 electricity emissions, which are calculated using standard conversion factors (eg, kgCO₂ per kWh of UK electricity) []. These Scope 2 emissions are reported in line with the Greenhouse Gas Protocol and UK government guidelines []. In contrast, if an AI solution is delivered as a service by a vendor (eg, a cloud-based AI diagnostic tool), the associated emissions fall into Scope 3 (the supply chain or NHS Carbon Footprint Plus). In such cases, trusts rely either on data from the supplier or, more commonly, on estimations, since many AI vendors do not disclose product-level carbon data. For AI and other digital services, where direct measurements are rarely available, trusts often resort to input-output modeling or proxy indicators.
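To make the arithmetic concrete, the minimal sketch below estimates Scope 2 emissions for an AI workload running on trust-owned hardware; the energy use and grid intensity figures are illustrative assumptions, not measured or currently published values.

```python
# Illustrative Scope 2 calculation for an on-premise AI workload.
# Both figures below are assumptions for the sketch, not published values:
# real reporting would use metered energy and the current DEFRA grid factor.

AI_SERVER_ENERGY_KWH_PER_YEAR = 8_760.0   # assumed: a 1 kW server running continuously
GRID_INTENSITY_KGCO2E_PER_KWH = 0.20      # assumed UK grid factor

scope2_kg = AI_SERVER_ENERGY_KWH_PER_YEAR * GRID_INTENSITY_KGCO2E_PER_KWH
print(f"Scope 2 emissions attributable to the AI workload: {scope2_kg:,.0f} kgCO2e/year")
# In practice, this load is rarely metered separately, so it is usually absorbed
# into the hospital's overall electricity figure.
```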
Given the data gaps from AI tool suppliers, NHS trusts use established frameworks to estimate emissions from digital technologies, principally the UK Government’s Greenhouse Gas Conversion Factors, published annually by the Department for Environment, Food & Rural Affairs (DEFRA). This DEFRA framework provides standardized CO₂ equivalent values per unit of activity, for example, per kWh of electricity, per passenger-km of travel, or per pound sterling of spend in various economic sectors []. This means that if a trust spends a certain amount on IT services or software subscriptions (which would include AI software contracts), that spend can be converted into an estimated carbon emission figure by using the relevant factor. This approach captures the approximate footprint of the wider IT sector per pound spent, rather than the specific energy used by a particular algorithm, but is often the only feasible method when suppliers do not provide detailed data. This method has the potential to over- or underestimate emissions.
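For comparison, a minimal sketch of the spend-based approach is given below; the conversion factor and the spend figure are hypothetical placeholders, not published DEFRA values, and serve only to show why the method cannot distinguish efficient from energy-intensive AI services.

```python
# Illustrative spend-based estimate for an AI-as-a-service contract.
# The conversion factor is a hypothetical placeholder, not a published DEFRA figure.

ANNUAL_AI_CONTRACT_SPEND_GBP = 120_000    # assumed annual subscription spend
IT_SECTOR_FACTOR_KGCO2E_PER_GBP = 0.15    # assumed spend-based factor

estimated_kg = ANNUAL_AI_CONTRACT_SPEND_GBP * IT_SECTOR_FACTOR_KGCO2E_PER_GBP
print(f"Estimated Scope 3 emissions: {estimated_kg:,.0f} kgCO2e/year")
# The same spend could buy a highly optimized model or a very energy-intensive one;
# the spend-based method yields the same figure either way.
```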
There are 3 key problems with this methodology, which contribute to potential underreporting of carbon emissions regarding the use of AI technologies: a lack of granularity, the AI infrastructure life cycle, and unprocured AI tools.
Problem 1 – Granularity and High-Intensity AI Workloads
Although the above methods allow NHS trusts to include AI in their carbon ledgers, they may not fully capture the true carbon footprint of AI software. One limitation is that of granularity, as generic emission factors provide averages that can obscure the extreme energy intensity of certain AI workloads. For example, one analysis equated training OpenAI’s GPT-3 to the carbon emissions of hundreds of transatlantic flights []. Such one-time training emissions would not be visible to a hospital trust procuring the model as a service, since the trust might only log the ongoing usage or subscription cost. Further to this, models are often retrained to avoid critical issues such as data drift [], which is unlikely to be evident to those engaged in reporting. Even at the usage stage, AI algorithms can draw significantly more power than conventional software []. A recent study warned that implementing large language model systems across hospitals could have very significant environmental consequences, noting that a single complex AI query uses enough electricity to charge a smartphone 11 times (and consumes 20 mL of cooling water), with ChatGPT estimated to use 15 times the energy of a traditional Google search []. If a hospital begins running thousands of AI queries on patient data, these energy costs quickly accumulate. Yet, unless the trust actively measures the IT electricity load or the cloud provider reports it, the default accounting might underestimate these emissions. In many cases, the emissions from running hospital AI are simply grouped into facility electricity use, where they are hard to distinguish from other IT or equipment energy demands.
A potential method for overcoming this issue could be to add an AI-specific carbon disclosure clause to digital health contracts. This could, for example, require vendors to provide model-level emissions data (training, retraining, and inference energy use) and data-center carbon intensity by using Greenhouse Gas Protocol methods. Trusts would then record these figures separately rather than folding them into generic IT spend [,].
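One hypothetical way such a clause could be operationalized is as a small structured record that vendors complete and trusts log alongside procurement data; the field names and values below are illustrative assumptions, as no NHS standard for this disclosure currently exists.

```python
# Hypothetical structure for an AI-specific carbon disclosure from a vendor.
# Field names and example values are illustrative; they are not an NHS standard.
from dataclasses import dataclass


@dataclass
class AIVendorCarbonDisclosure:
    model_name: str
    training_emissions_kgco2e: float            # one-off training footprint
    expected_retraining_kgco2e_per_year: float  # planned retraining emissions
    inference_kgco2e_per_1000_queries: float    # usage-stage intensity
    datacentre_intensity_gco2e_per_kwh: float   # carbon intensity of hosting
    reporting_method: str                       # eg, "GHG Protocol, location-based"


example = AIVendorCarbonDisclosure(
    model_name="ExampleTriageModel",
    training_emissions_kgco2e=50_000.0,
    expected_retraining_kgco2e_per_year=5_000.0,
    inference_kgco2e_per_1000_queries=2.0,
    datacentre_intensity_gco2e_per_kwh=150.0,
    reporting_method="GHG Protocol, location-based",
)
```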
Problem 2 – AI Infrastructure Life Cycle and Embedded Emissions
Another challenge is that the life-cycle impacts of AI infrastructure are often overlooked in current reporting. DEFRA’s conversion factors and NHS models primarily address operational emissions (eg, energy use, fuel, travel). However, AI systems rely on power-hungry hardware (graphics processing units, servers) whose manufacture and maintenance carry a carbon cost that is not accounted for by looking at electricity consumption alone. Researchers have highlighted that the embedded carbon in digital hardware is significant, and the manufacturing of high-end processors can roughly double the carbon footprint of AI operations when considered over the hardware’s life []. Unless a trust directly purchases new hardware for AI (in which case some of that manufacturing footprint might appear in the supply chain emissions of medical equipment or IT equipment procurement), these upstream impacts remain hidden under current DEFRA estimates. For AI services run in the cloud, the manufacturing footprint is on the vendor’s books and would only reach the NHS if the vendor reports it or if the NHS uses a broad spend-based factor that averages such upstream impacts. In short, today’s estimation techniques might undercount the full life-cycle emissions of deploying AI.
A consideration for this problem could be to extend Green Plans to include a full life-cycle assessment of AI hardware []. In practice, procurement templates could insist on cradle-to-grave carbon factors for graphics processing units, servers, and cooling equipment and incorporate these into Scope 3 reporting. This would address the hidden manufacturing footprint of high-end processors noted above.
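As a rough illustration of how a cradle-to-grave factor could enter Scope 3 reporting, the sketch below amortizes an assumed embodied-carbon figure for a GPU server over its service life; all numbers are assumptions, not measured values for any specific product.

```python
# Illustrative amortization of embodied (manufacturing) carbon for AI hardware.
# The embodied-carbon figure, lifetime, and capacity share are assumptions.

EMBODIED_CARBON_KGCO2E = 3_000.0   # assumed cradle-to-gate footprint of one GPU server
SERVICE_LIFE_YEARS = 4             # assumed useful life
TRUST_SHARE_OF_CAPACITY = 0.25     # assumed fraction of capacity used by the trust

annual_embodied_kg = (EMBODIED_CARBON_KGCO2E / SERVICE_LIFE_YEARS) * TRUST_SHARE_OF_CAPACITY
print(f"Embodied carbon attributable to the trust: {annual_embodied_kg:.0f} kgCO2e/year")
# This would sit alongside, not replace, the operational electricity emissions.
```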
Problem 3 – Unprocured Generative AI in Daily Practice
The third area where current reporting falls short is unprocured, freely available AI tools such as ChatGPT and Google Gemini. As these are not officially procured by the NHS, and their use by individuals within the health care system is not known or accounted for, their environmental impacts are not reported. The exact usage of these AI tools in the NHS is unknown, but a recent survey found that 20% of general practices in the United Kingdom use generative AI such as ChatGPT in their everyday practice [].
Estimates for how much carbon is released per ChatGPT query vary (as OpenAI does not publish environmental reports). However, various sources use the figure of 4.32 g of carbon released per prompt [,]. In July 2025, 33.6 million general practice appointments took place in the United Kingdom []. To estimate the total number per year, we divide this by 30.4 (the average number of days in a month) and multiply by 365 days in a year to reach 403,421,053 general practice appointments per year. If we assume 20% of these are using unprocured generative AI [], then we can estimate that each year, 348,555.79 kg of carbon is released from these tools in general practice alone (see the estimated calculations below).
Estimated number of general practice (GP) appointments per year
33.6 million GP appointments in July 2025.
(33,600,000 / 30.4) × 365 = 403,421,053 appointments per year
Estimated total annual unprocured artificial intelligence (AI) queries
20% use generative AI → 80,684,211 AI queries per year.
Estimated total annual CO₂ emissions from unprocured generative AI
Each ChatGPT query emits 4.32 g CO₂.
80,684,211 × 4.32 g = 348,555,790 g = 348,555.79 kg
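The estimate above can be reproduced with the short calculation below, which simply restates the assumptions already given (July 2025 appointment volume scaled to a year, 20% generative AI use, one query per AI-assisted appointment, and 4.32 g CO₂ per query).

```python
# Reproduces the rough estimate of annual emissions from unprocured generative AI
# in general practice, under the assumptions stated in the text.

JULY_2025_APPOINTMENTS = 33_600_000
AVG_DAYS_PER_MONTH = 30.4
SHARE_USING_GENAI = 0.20
GRAMS_CO2_PER_QUERY = 4.32

appointments_per_year = JULY_2025_APPOINTMENTS / AVG_DAYS_PER_MONTH * 365
ai_queries_per_year = appointments_per_year * SHARE_USING_GENAI
emissions_kg = ai_queries_per_year * GRAMS_CO2_PER_QUERY / 1000

print(f"Appointments per year: {appointments_per_year:,.0f}")  # ~403,421,053
print(f"AI queries per year:   {ai_queries_per_year:,.0f}")    # ~80,684,211
print(f"Estimated emissions:   {emissions_kg:,.0f} kg CO2e")   # ~348,556 kg
```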
Over a year, this is equivalent to around 2,300,000 phlebotomies [] or 20,000 magnetic resonance imaging scans []. Further, a worldwide survey by Elsevier Health of 2607 clinicians [] found that 48% supported using generative AI to aid clinical decision-making. It is therefore likely that many clinicians in the NHS are currently using such tools and that the resulting emissions are not reported. As the extent of the use of these tools is unknown, there is no way for it to be accounted for in the current reporting mechanisms.
Setting up lightweight monitoring of generative AI web traffic, by deploying browser or network logging that counts staff queries to external AI tools and multiplies them by an agreed emission factor, may be a way to overcome this lack of reporting. However, this itself raises ethical concerns over privacy [] and therefore may not be deemed appropriate.
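Purely as an illustration, and leaving aside the privacy concerns noted above, such monitoring could be as simple as counting outbound requests to known generative AI domains in aggregate and applying an agreed per-query emission factor; the domain list and factor below are assumptions, not an endorsed method.

```python
# Minimal sketch: aggregate, non-identifying count of requests to known generative
# AI domains in a network log extract, multiplied by an agreed per-query factor.
# The domain list and emission factor are illustrative assumptions.

GENAI_DOMAINS = {"chatgpt.com", "chat.openai.com", "gemini.google.com"}
GRAMS_CO2_PER_QUERY = 4.32  # same assumed figure used in the estimate above


def estimate_genai_emissions_kg(requested_domains: list[str]) -> float:
    """Return estimated kg CO2e for the generative AI queries in a log extract."""
    query_count = sum(1 for domain in requested_domains if domain in GENAI_DOMAINS)
    return query_count * GRAMS_CO2_PER_QUERY / 1000


# Tiny fabricated log extract (domains only, no user identifiers):
log_extract = ["chatgpt.com", "nhs.uk", "gemini.google.com", "chatgpt.com"]
print(f"{estimate_genai_emissions_kg(log_extract):.5f} kg CO2e")  # 3 queries ≈ 0.013 kg
```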
Moving Forward
Current carbon reporting metrics for the NHS are unlikely to fully capture the environmental impact of AI being implemented. Unlike more traditional sources (energy, travel, waste), which have well-defined reporting protocols, AI does not yet have an agreed-upon carbon accounting method in the NHS context. A 2021 scoping review on AI in the NHS noted that standardized measures are lacking for quantifying AI-associated emissions, which limits the ability of health systems to track improvements or trade-offs in this area []. More recently, a 2024 policy paper by UK researchers and NHS experts echoed the need for better tools: it called for developing an open and shared database of carbon factors for digital health and for standardizing how we calculate the carbon impact of digital interventions (from electronic records to AI) []. The report emphasizes that as health care digitizes, robust preimplementation and postimplementation carbon assessments of new technology are needed to ensure that purported emission savings (eg, through telemedicine or AI-driven efficiency) are not undermined by unaccounted digital emissions.
In practice, NHS trusts currently face data and methodology gaps when it comes to AI. They may know, for example, that a telehealth AI platform reduces patient travel (a clear carbon saving that they can count in reduced transport emissions), but they likely lack precise data on the platform’s own cloud computing footprint. The default approach using DEFRA conversion factors and broad averages is useful but coarse. It treats digital services with a one-size-fits-all lens, which can misrepresent both highly optimized low-carbon software and particularly energy-intensive AI processes. The result is that the carbon footprint of AI in health care is often only partially accounted for. It might be recorded as a small increment in electricity usage or an imputed supply-chain emission, even if the true impact (especially at scale) could be larger.
Broader Implications
Although this viewpoint focuses on missed AI emissions in NHS reporting, it illustrates a wider challenge in global health care. AI is being implemented without full accounting of its externalities, of which the environment is only one, alongside safety, equity, labor, supply chains, and service resilience. Poorly specified accounting can export carbon and other risks to different regions, weakening international comparisons of AI-enabled care. Progress therefore depends on harmonized disclosure and auditable methods that let health systems judge whether AI genuinely improves care without shifting costs to other people or places.
Acknowledgments
This work received no specific funding. DJR receives funding by the National Institute for Health and Care Research (NIHR) Artificial Intelligence for Multiple Long-Term Conditions (AIM) program (NIHR203982). The views expressed are those of the author and not necessarily those of the NIHR or the Department for Health and Social Care.
Conflicts of Interest
None declared.
References
- Artificial intelligence. NHS England Transformation Directorate. Apr 30, 2025. URL: https://transform.england.nhs.uk/information-governance/guidance/artificial-intelligence/ [accessed 2025-05-26]
- NHS AI expansion to help tackle missed appointments and improve waiting times. NHS England. Mar 14, 2024. URL: https://www.england.nhs.uk/2024/03/nhs-ai-expansion-to-help-tackle-missed-appointments-and-improve-waiting-times/ [accessed 2025-08-13]
- Bloomfield P, Clutton-Brock P, Pencheon E, Magnusson J, Karpathakis K. Artificial intelligence in the NHS: climate and emissions. The Journal of Climate Change and Health. Oct 2021;4:100056. [CrossRef]
- Bratan T, Heyen NB, Hüsing B, Marscheider-Weidemann F, Thomann J. Hypotheses on environmental impacts of AI use in healthcare. The Journal of Climate Change and Health. Mar 2024;16:100299. [CrossRef]
- Delivering a "net zero" National Health Service. NHS England. URL: https://www.england.nhs.uk/greenernhs/wp-content/uploads/sites/51/2020/10/delivering-a-net-zero-national-health-service.pdf [accessed 2025-05-26]
- Policy paper. Net zero strategy: build back greener. Department for Energy Security and Net Zero and Department for Business, Energy & Industrial Strategy. 2022. URL: https://www.gov.uk/government/publications/net-zero-strategy [accessed 2025-05-26]
- Mafi A, Holmes S. Artificial intelligence risks becoming an environmental disaster. BMJ. Mar 13, 2025;388:r505. [CrossRef] [Medline]
- Open Government License (version 3.0) for public sector information. The National Archives. URL: https://www.nationalarchives.gov.uk/doc/open-government-licence/version/3/ [accessed 2025-10-24]
- UK Government. Health and Care Act 2022. Legislation.gov.uk. 2022. URL: https://www.legislation.gov.uk/ukpga/2022/31/contents [accessed 2025-05-26]
- Trust contributions to the NHS carbon footprint plus. Royal College of Emergency Medicine. URL: https://rcem.ac.uk/wp-content/uploads/2023/06/Greener-NHS-Carbon-Footprint-Plus-trust-methodology.pdf [accessed 2025-05-26]
- Our action 50 green plan. Cambridge University Hospitals NHS Foundation Trust. URL: https://www.cuh.nhs.uk/about-us/climate-emergency/our-action-50-green-plan/ [accessed 2025-05-26]
- Greenhouse Gas Protocol. URL: https://ghgprotocol.org/ [accessed 2025-05-26]
- Greenhouse gas reporting: conversion factors 2024. Department for Energy Security and Net Zero. URL: https://www.gov.uk/government/publications/greenhouse-gas-reporting-conversion-factors-2024 [accessed 2025-05-26]
- Truhn D, Müller-Franzes G, Kather JN. The ecological footprint of medical AI. Eur Radiol. Feb 2024;34(2):1176-1178. [FREE Full text] [CrossRef] [Medline]
- Sahiner B, Chen W, Samala R, Petrick N. Data drift in medical machine learning: implications and potential remedies. Br J Radiol. Oct 2023;96(1150):20220878. [CrossRef] [Medline]
- Alloghani M. Architecting green artificial intelligence products: recommendations for sustainable AI software development and evaluation. In: Artificial Intelligence and Sustainability. Cham. Springer; 2024:65-86.
- Kleinig O, Sinhal S, Khurram R, Gao C, Spajic L, Zannettino A, et al. Environmental impact of large language models in medicine. Intern Med J. Dec 2024;54(12):2083-2086. [CrossRef] [Medline]
- Mytton D. Assessing the suitability of the Greenhouse Gas Protocol for calculation of emissions from public cloud computing workloads. J Cloud Comp. Aug 08, 2020;9(1):45. [CrossRef]
- Henderson P, Hu J, Romoff J, Brunskill E, Jurafsky D, Pineau J. Towards the systematic reporting of the energy and carbon footprints of machine learning. J Mach Learn Res. Jan 1, 2020. URL: https://jmlr.org/papers/volume21/20-312/20-312.pdf [accessed 2025-10-24]
- Limit hospital emissions by using short AI prompts - study. University of Reading. Nov 15, 2024. URL: https://tinyurl.com/4v3zmy5h [accessed 2025-05-26]
- Alissa H, Nick T, Raniwala A, Arribas Herranz A, Frost K, Manousakis I, et al. Using life cycle assessment to drive innovation for sustainable cool clouds. Nature. May 2025;641(8062):331-338. [CrossRef] [Medline]
- Blease CR, Locher C, Gaab J, Hägglund M, Mandl KD. Generative artificial intelligence in primary care: an online survey of UK general practitioners. BMJ Health Care Inform. Sep 17, 2024;31(1):e101102. [FREE Full text] [CrossRef] [Medline]
- What is the CO2 emission per ChatGPT query? Smartly.AI. URL: https://smartly.ai/blog/the-carbon-footprint-of-chatgpt-how-much-co2-does-a-query-generate [accessed 2025-05-26]
- Mittal A. ChatGPT: how much does each query contribute to carbon emissions? LinkedIn. URL: https://www.linkedin.com/pulse/chatgpt-how-much-does-each-query-contribute-carbon-emissions-mittal-wjf8c/ [accessed 2025-05-26]
- Appointments in general practice. NHS England. URL: https://digital.nhs.uk/data-and-information/publications/statistical/appointments-in-general-practice [accessed 2025-05-26]
- Spoyalo K, Lalande A, Rizan C, Park S, Simons J, Dawe P, et al. Patient, hospital and environmental costs of unnecessary bloodwork: capturing the triple bottom line of inappropriate care in general surgery patients. BMJ Open Qual. Jul 2023;12(3):1-8. [FREE Full text] [CrossRef] [Medline]
- McAlister S, McGain F, Petersen M, Story D, Charlesworth K, Ison G, et al. The carbon footprint of hospital diagnostic imaging in Australia. Lancet Reg Health West Pac. Jul 2022;24:100459. [FREE Full text] [CrossRef] [Medline]
- ChatGP? The uncertain role of generative AI in NHS care. Digital Health. URL: https://www.digitalhealth.net/2023/10/chatgp-the-uncertain-role-of-generative-ai-in-nhs-care/ [accessed 2025-05-26]
- Vodicka E, Mejilla R, Leveille SG, Ralston JD, Darer JD, Delbanco T, et al. Online access to doctors' notes: patient concerns about privacy. J Med Internet Res. Sep 26, 2013;15(9):e208. [FREE Full text] [CrossRef] [Medline]
- Daniel-Watanabe L, Moore R, Tongue B, Royston S. What is the carbon footprint of digital healthcare? UK Energy Research Centre. URL: https://d2e1qxpsswcpgz.cloudfront.net/uploads/2024/03/UKERC_EnergySHINES_What-is-the-Carbon-Footprint-of-Digital-Healthcare.pdf [accessed 2025-05-26]
Abbreviations
AI: artificial intelligence
DEFRA: Department for Environment, Food & Rural Affairs
NHS: National Health Service
Edited by A Mavragani; submitted 16.Jun.2025; peer-reviewed by S Ali, M Sawyer; comments to author 09.Sep.2025; revised version received 15.Sep.2025; accepted 09.Oct.2025; published 27.Oct.2025.
Copyright©Duncan J Reynolds. Originally published in the Journal of Medical Internet Research (https://www.jmir.org), 27.Oct.2025.
This is an open-access article distributed under the terms of the Creative Commons Attribution License (https://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work, first published in the Journal of Medical Internet Research (ISSN 1438-8871), is properly cited. The complete bibliographic information, a link to the original publication on https://www.jmir.org/, as well as this copyright and license information must be included.

