Original Paper
Abstract
Background: Online postal self-sampling (OPSS) allows service users to screen for sexually transmitted infections (STIs) by ordering a self-sampling kit online, taking their own samples, returning them to a laboratory for testing, and receiving their results remotely. OPSS availability and use have increased in both the United Kingdom and globally over the past decade, but OPSS has been adopted in different regions of England at different times, with different models of delivery. It is not known why certain models were decided on or how implementation strategies have influenced outcomes, including the sustainability of OPSS in sexual health service delivery.
Objective: This study aims to evaluate the implementation of OPSS in 3 case study areas of England, with a focus on the sustainability of implementation and the relationship between implementation strategies and outcomes.
Methods: Qualitative data collection methods were used: interviews with staff and stakeholders involved in the implementation and delivery of OPSS, analysis of local implementation and national policy documents, and observations in sexual health clinics. Analysis of interviews and observations was undertaken using qualitative implementation science frameworks, including normalization process theory, the Consolidated Framework for Implementation Research, and the major system change framework. Documentary sources were used primarily to map processes over time and triangulate against interview and observational evidence.
Results: Across the 3 case study areas, 60 staff and stakeholders were interviewed, 12 observations were conducted, and data from 86 documents were collated. Rather than being a discrete digital health intervention, we found that OPSS was part of—or occurred parallel to—major system changes in all areas. These changes were driven by budgetary pressures in all areas, but there was variation in other objectives used to rationalize the decision to adopt. The financial context and organizational relationships in each area determined the implementation strategies available to decision makers and how these strategies were enacted, which in turn led to different outcomes at different time points. OPSS implementation was not a one-off outcome but an ongoing process in response to changes in context, which in turn affected how staff perceived and engaged with OPSS. The COVID-19 pandemic had profound but divergent effects on OPSS implementation in each area, accelerating it in some contexts and reversing it in others.
Conclusions: In this multisite case study, OPSS implementation was part of systems change to address a wider problem of insufficient funding to deliver sexual health care. Decisions about implementing OPSS were made before sufficient evidence was available to effectively guide the process. The resultant unintended consequences need acknowledgment to enable future commissioners and sexual health services to optimize sexual health service provision.
International Registered Report Identifier (IRRID): RR2-10.1136/bmjopen-2022-067170
doi:10.2196/72812
Keywords
Introduction
Background
Online postal self-sampling (OPSS) is a digital health intervention that allows service users to screen for sexually transmitted infections (STIs) and blood-borne viruses (BBVs) by ordering a self-sampling kit online, taking their own samples, returning them to a laboratory for testing, and receiving their results remotely []. OPSS differs from self-testing in that samples are sent off for processing in a laboratory, rather than being processed within the test kit [-]. Results are usually communicated electronically, and users in need of treatment or partner notification are typically referred to clinic-based sexual health services [].
Many settings globally are developing and implementing OPSS, with its availability and uptake having been accelerated during the COVID-19 pandemic [-]. The proportion of tests conducted via OPSS within England’s National Chlamydia Screening Programme, for example, rose from 19% in 2019 to 43% in 2023 []. This transition is occurring within the context of wider efforts to digitize health care; there is a national recommendation that OPSS should be provided as part of all sexual health services in England, while the World Health Organization recommends that self-sampling be provided alongside clinic-based sexual health services to improve the uptake of chlamydia and gonorrhea testing [,].
In England, public health commissioners are responsible for the process of planning, agreeing, purchasing, and monitoring health services []. The commissioning of free, open-access STI or BBV testing has primarily been the responsibility of local authorities since the implementation of the Health and Social Care Act 2012 []. Commissioners are responsible for identifying a preferred model of service provision and instigating a tendering process to select preferred sexual health services that can deliver the model within budgetary constraints. Although OPSS is increasingly part of sexual health service models, it has been adopted in different regions of England at different times, with different models of delivery. For example, OPSS is delivered by National Health Service (NHS) trusts in some areas and outsourced in others to sexual health clinical services, laboratories, or online testing platforms provided by the private sector [-]. There is a lack of evidence as to why certain delivery models were decided on, how they have been implemented, or how implementation strategies have influenced the sustainability of OPSS within sexual health service delivery.
There is extensive evidence about factors influencing the adoption and embedding of digital health interventions in health care []. However, fewer studies have considered the sustainability of their implementation, and this evidence gap has been noted specifically in relation to OPSS and other remote testing services for STIs or BBVs [,]. In addition, it is unclear how applicable pre–COVID-19 findings (before March 2020) are to the current context of digital health implementation. This is important, given the impact of COVID-19 on normalizing online health care delivery [].
Objectives
This study is nested within the ASSIST (assessing the impact of OPSS for STIs on health inequalities, access to care and clinical outcomes in the United Kingdom) study, a comprehensive evaluation of OPSS within the UK context, which aims to assess the impact of these services on health inequalities, access to care, and clinical and economic outcomes and to identify the factors that influence their implementation and sustainability []. This study focuses on the implementation of OPSS in 3 case study areas (CSAs) of England. In so doing, it addresses important evidence gaps in digital STI or BBV testing.
Methods
Design
Evaluation of OPSS implementation from staff and stakeholder perspectives formed 1 workstream within ASSIST, a wider mixed methods realist evaluation of OPSS, with the published protocol documenting the full study methods and design []. Separate papers covered the findings from other workstreams, for example, on service user experiences [], health inequalities, access to care, and clinical and economic outcomes (Gibbs J, unpublished data, April 2025 and Jackson L, unpublished data, April 2025).
In this study, a combination of qualitative data collection methods was used: interviews, document analysis, and contextual observation. Methods and findings are reported in line with COREQ (). Each data source had a distinct primary purpose and was initially analyzed separately; preliminary findings from each were then compared and triangulated, with explanatory detail added where available. Details of implementation models, impacts, and sustainability were drawn primarily from interviews. Observations were used to understand how OPSS featured in clinic-based settings and to complement experiences of OPSS drawn from interviews. Documentary sources were primarily used to construct a timeline for implementation.
The design and methods were informed by normalization process theory (NPT), a substantive middle-range theory that has been used extensively for the study of the work required for initiating, integrating, and embedding (normalizing) digital health innovations into practice [].
Setting
We evaluated OPSS implementation in 3 CSAs in England, labeled CSA 1, CSA 2, and CSA 3 for anonymity. The CSAs were purposively selected on the basis that each had undergone a distinct commissioning process for OPSS, had implemented OPSS at a different time, and had continued clinic-based services alongside OPSS. This enabled exploration of different delivery models at different stages following OPSS implementation. Each CSA was predominantly urban and had a population of at least 500,000, which was diverse in terms of socioeconomic circumstances and the proportion of people who are: ethnic minorities; lesbian, gay, bisexual, transgender, queer, and similar minorities; and young. These populations may be disproportionately affected by STIs or BBVs []. The focus of data collection was clinic-based and OPSS service settings, with 1 clinic-based service in each CSA chosen as the setting for contextual observation. These clinics were all level-3 sexual health services—offering the most complex care—and situated near the center of each CSA.
Interviews
Sampling and Recruitment
We used a purposive snowball sampling strategy to reach a range of staff and stakeholders who could offer different perspectives on the implementation of OPSS. We sought to speak to people in leadership roles, such as commissioners, clinical leads, and service managers, who were involved in the decisions to adopt OPSS and staff at varying levels of seniority working in clinical and nonclinical roles in sexual health services (clinic-based and online) to understand the impact of OPSS on their work. We aimed to recruit up to 20 staff and stakeholders per CSA. Our sample size was determined by the concept of information power []. This enabled us to capture the range of diverse participant perspectives we were seeking, rather than to achieve data saturation, which would not have been feasible due to key perspectives—such as those of a commissioner—only being held by 1 person in some CSAs.
Primary investigators at each CSA were initially asked to circulate details of the study to staff working in their service, who contacted TS if they were willing to be interviewed. At the end of each interview, participants were asked to suggest other people they thought would be able to offer a useful perspective on the research topic. As interviews progressed across all 3 CSAs and we began to analyze the data, we noted roles that we had recruited in some CSAs but not others or that were likely to have a perspective on a topic that had arisen in other interviews. We then asked primary investigators to connect us with staff or stakeholders who could fill these gaps.
Data Collection
Semistructured interviews lasting 30 to 60 minutes were conducted, using Microsoft Teams, between January and December 2022. Each followed a topic guide covering: the adoption of OPSS; participants' perspectives on OPSS in relation to wider sexual health service delivery; the impact of OPSS on their and colleagues' work; and their reflections on OPSS and its implementation, including its sustainability where relevant to the CSA (; also available in the protocol by Gibbs et al []). Questions were tailored depending on whether participants were more involved in adoption or delivery. We mitigated recall bias by seeking overlapping perspectives where possible and comparing interview data with contemporaneous documents. Participants were interviewed individually, aside from 3 with overlapping roles in CSA 1, who were interviewed together. All interviews were conducted by TS or JS.
Data Analysis
All interviews were audio-recorded and transcribed by a professional transcription company, and transcripts were checked for accuracy. Our analysis began with NPT as a deductive framework to enable exploration of the normalization of OPSS within staff and stakeholder work () [,-]. We also used the Consolidated Framework for Implementation Research (CFIR), which we felt was able to capture examples of implementation context more directly than NPT [].
Coding was undertaken by TS and JS using NVivo (Lumivero) software; they reviewed and refined the coding approach, informed by discussion with the wider research team: FMB, JG, AH, GW, LJ, and DC. Interim findings were presented to staff at all research sites, including some who participated in the study and others who did not, to inform interpretation, check whether our interim findings aligned with their experiences of OPSS implementation, and verify that we had adequately captured the context in each site.
Contextual Observation
Overview
TS conducted 2 forms of contextual observation in person at clinic-based sexual health services: think-aloud exercises with reception staff and nonparticipant observation of clinical consultations []. The aim was to obtain an understanding of how OPSS featured in practice, within staff work in clinic settings, to supplement interview accounts of perceptions and experiences. We were interested in both clinical staff who provide expert guidance to service users during consultations and reception staff who—we knew from the interviews—often interact with service users first and direct them toward OPSS or clinic-based services. As it would not have been feasible to obtain consent from service users before they presented at the clinic reception, we opted to use think-aloud exercises to explore how reception staff would interact with service users who might benefit from OPSS.
Data Collection
Reception staff present on the day of data collection were invited to take part in the research individually. They were each read 4 short scenarios about a service user who was presenting at the sexual health service (). They were asked to say aloud what they were thinking about each scenario and how they would respond to the service user. TS would prompt them to keep thinking aloud and to explain which services they would direct service users toward and why. TS took contemporaneous notes as the participant was speaking.
Health advisors, whose role includes providing partner notification, management of people diagnosed with STIs or BBVs, counseling, and signposting to services, obtained informed consent from any service users willing to have their consultation observed. TS would take notes during the consultation, primarily using a preagreed framework that focused on whether and how OPSS featured in each consultation, as well as what support staff were offering to service users that may be absent from an OPSS experience.
Data Analysis
TS took notes on whether and how OPSS was featured in spaces such as the waiting room or consultation rooms. Each site visit was discussed afterward by TS and JS to explore how it confirmed, contradicted, or provided further explanation to the interview data.
Documents
Sampling and Data Collection
Documents were sourced in 3 phases. First, a Google (Google LLC) search was conducted to identify publicly available documents related to UK policy, strategy, and guidance on sexual health services and the role of digital platforms, limited to publications released after the Health and Social Care Act 2012. Search terms included “sexual health” plus “digital” or “online.” Sources of documentation included clinical guidelines, national strategy or policy, reports, position statements, or recommendations by the national government, government agencies, or clinical or service user organizations.
Second, specific searches were conducted to identify documents of relevance to the adoption and delivery of sexual health services in each CSA. Searches were conducted on Google and on specific local authority websites. The search was limited to documents published from 3 years before the delivery of OPSS in each CSA. Search terms included the case study region and areas within each region, with “sexual health” plus “digital” or “online.” Sources of documentation included needs assessments, meeting agendas and minutes, committee reports, newsletters, service plans, and strategies or reports of commissioning processes.
Third, we contacted each research site to seek documentation not in the public domain. We requested service specifications, strategy documents, and consultation documents as well as borough- and service-level minutes from meetings that recorded decisions around how OPSS was instigated, provided, and sustained. We also asked interview participants if they were aware of, and could share, documents that offered other insights into the implementation or delivery of OPSS in their CSA.
Data Analysis
Documents that did not mention online delivery of sexual health services were excluded. Included documents were cataloged by date, CSA (or national), and type. Documents were then ordered to build a timeline of implementation in each CSA by providing evidence of dates of when key developments in OPSS occurred.
Data Synthesis
Interviews were the dominant source of data in our evaluation, and we first coded them using a framework based on NPT. This initial analysis was used to develop early “pen portraits” of our CSAs and to inform analysis of documentary sources (ie, to identify pivotal contextual events or circumstances and the key implementation milestones in each site). Extraction from documentary sources was used primarily to provide the “hard” data on when key events happened and also, where available, to triangulate interviewees’ recollections.
As our analysis progressed, we inductively identified that the implementation of OPSS in all CSAs was part of—or took place alongside—substantial transformation of sexual health services. Therefore, we used other frameworks to inform our coding to complement NPT and added CFIR (), guided by the Dissemination and Implementation Models in Health webtool []. We applied a conceptual framework relating to major system change, developed by Fulop et al [], to structure the account of how OPSS was conceived and implemented. We were also guided by the implementation outcome measures proposed by Proctor et al [], which provided a structured framework for reporting implementation outcomes.
| Theory or framework | Key propositions | Application and adaptation to ASSISTa |
| --- | --- | --- |
| NPTb [] | | |
| Implementation outcomes by Proctor et al [] | | |
| Major system change framework [] | | |
| CFIRd [] | | |
aASSIST: assessing the impact of online postal self-sampling for sexually transmitted infections on health inequalities, access to care and clinical outcomes in the United Kingdom.
bNPT: normalization process theory.
cItalicization indicates how similar concepts featured in different frameworks were treated.
dCFIR: Consolidated Framework for Implementation Research.
We used these frameworks to develop a table that enabled us to make an initial comparison of the implementation in each CSA, from adoption to outcomes. We continuously refined the table categories and content as analysis progressed until we created the version and accompanying narrative presented in the results. Documents were compared with the interview data to explore where they confirmed, contradicted, or further explained it. Contextual observation enabled us to test the findings derived from our NPT-informed analysis of the interview data, examining whether OPSS appeared in staff workflows as described in the interviews.
Ethical Considerations
Ethics approval was granted by the NHS South Central—Berkshire B Research Ethics Committee (21/SC/0223). All participants provided written informed consent to be interviewed or observed as relevant to their role in the study. Data were deidentified by not naming the CSAs, removing any names from transcripts, and listing job categories rather than individual roles. Participants took part without compensation.
Results
We interviewed 60 staff and stakeholders across our 3 CSAs, conducted 12 observations, and collated data from 86 documents ().
| | CSA 1 | CSA 2 | CSA 3 | Total |
| --- | --- | --- | --- | --- |
| Interviews | | | | |
| Commissioners | 3 | 6 | 1 | 10 |
| Senior managers and clinical leads | 2 | 9 | 2 | 13 |
| Clinical staff (consultants, junior doctors, nurses, health advisors, and health promotors) | 7 | 10 | 6 | 23 |
| Nonclinical staff (administrators, communications, and clinic managers) | 3 | 3 | 2 | 8 |
| Stakeholders (pharmacists and external OPSS staff) | 1 | 3 | 2 | 6 |
| Observations | | | | |
| Observations: think-aloud exercises with administrative staff | 5 | 0a | 4 | 9 |
| Observations: clinics (patients) | 1 | 1 (3) | 1b (3) | 3 |
| Documents (including needs assessments, commissioning specifications, tenders or service proposals, contracts, service guidelines, newsletters, user consultations, and case studies) | 14 | 51 | 11 | 86 (including 10 national documents) |
aThe clinic-based service selected for contextual observation did not have the capacity to release staff to participate in the think-aloud exercises.
bObservations were of 1 clinic but 2 health advisors.
Description and Comparison of Implementation Processes and Outcomes
Overview
We first provide a thematic description of the implementation processes and outcomes, which are also summarized in . Further details are available in an extended data table, . A timeline from the perspective of stakeholders is shown in , highlighting major contextual changes (blue) and implementation phases, with preparation in purple, OPSS delivery in pink, and clinic changes in green. We then compare the influence of implementation strategies on implementation outcomes and their interaction with context.
| | CSA 1 | CSA 2 | CSA 3 |
| --- | --- | --- | --- |
| Adoption decision | | | |
| Decision makers | | | |
| Decision rationale | | | |
| Implementation approach | | | |
| Initial model of OPSS provision | | | |
| Relationship with wider, clinic-based services | | | |
| Implementation mechanisms | | | |
| Actions to engage all actors in implementation (cognitive participation) | | | |
| Timescales for launch | | | |
| Implementation outcomes | | | |
| Initial (pre–COVID-19) staff coherence and acceptability | | | |
| Penetration (initial; ie, speed and extent to which OPSS was delivered to service users) | | | |
| Sustainabilityd | | | |
| Penetration (late—until end of ASSISTf data collection 2022) | | | —e |
| Fidelity and adaptations: unplanned | | — | — |
| Fidelity and adaptations: planned (reflexive monitoring) | | | |
| Staff acceptability: staff work to enact and sustain OPSS (collective action and relational restructuring) | | | |
aEvidence from data sources is provided in .
bSTI: sexually transmitted infection.
cBBV: blood-borne virus.
dMarch 2020 coincided with lockdown and closure or significant restrictions in clinic-based service delivery because of the COVID-19 pandemic ().
eNot available.
fASSIST: assessing the impact of online postal self-sampling for sexually transmitted infections on health inequalities, access to care and clinical outcomes in the United Kingdom.
gMost of ASSIST’s data collection was completed by the end of 2022, so sustainability in this paper relates to delivery until the end of 2022, but contextual observations and discussions with sites indicate that delivery has continued to change in 2023.

Adoption
In CSA 1, the decision to implement OPSS came from an NHS trust, which chose to include OPSS as part of its bid for a wider sexual health services contract. In contrast, commissioners initiated the decision to adopt OPSS in CSA 2, following a multiyear transformation program for sexual health, which recommended the introduction of a city-wide OPSS service. It was also initiated by commissioners in CSA 3, who specified that OPSS needed to be part of the delivery model.
Implementation Approach
CSA 1 started delivery within 3 months of the trust being awarded the new, integrated sexual health contract covering the entire city and a neighboring local authority. Its implementation was very rapid, with cognitive participation (ie, the relational work to build and sustain OPSS) driven by a small in-house team, often working informally. CSA 1’s in-house model saw every aspect of the service—the website, pathology, and results communication—delivered by the NHS trust except kit production and dispatch, which were outsourced. It integrated OPSS with other face-to-face services, which were equipped to disseminate kits and receive them with completed samples—an example of collective action (ie, the work done to implement and sustain OPSS). The service was launched with a marketing campaign to encourage uptake. There were few restrictions on who could access OPSS or how often.
In CSA 2, in contrast, OPSS was rolled out in phases, starting in 1 to 2 areas in January 2018 and extending to all commissioned areas by June 2018. The implementation of OPSS took place alongside other transformations that were commissioned separately, including the integration of sexual health and contraceptive services and some clinic closures. OPSS was not widely advertised—the aim being to shift existing service users from clinic-based testing to internet-based testing rather than increase overall testing.
The sexual health service in CSA 3 put out a commissioning specification for an OPSS service after it had commenced the new sexual health contract. OPSS was launched 2 months later. The new contract separated sexual health from contraceptive services, which were delivered by a separate organization.
CSAs 2 and 3 outsourced OPSS to a private company; CSA 2 commissioned a bespoke service, while CSA 3 bought in a standard service. Kits in both areas were predominantly posted, although some were collected at clinics. Marketing was limited, primarily to clinics. Patients could only access 1 kit every 3 months and needed to be asymptomatic.
Acceptability of OPSS
We found the strongest and most consistent support for OPSS among staff in CSA 1. There was a coherent view that embedding digital services would improve access and ensure patients with greater need could be seen more easily in clinics. Staff felt engaged in the implementation process and engaged with service users regularly about OPSS, although it increased work in some circumstances, such as when service user data were split between the electronic patient record systems of the NHS trust and the pharmacies that had been contracted to deliver and receive OPSS kits.
In contrast, there was a lack of shared coherence among stakeholders in CSA 2. Commissioners stated that their aim was to maintain STI or BBV testing amid rising demand and reduced funding; however, they also hoped that OPSS would improve access and public health outcomes. Clinic staff, in contrast, often felt that commissioners prioritized cost savings at the expense of meeting service user needs. Staff were initially unaware of, and were not engaged with, OPSS implementation, and some were openly hostile to its implementation.
There was support for OPSS in principle among staff in CSA 3, but the sexual health service expressed concerns about its lack of control over the volume of testing demand, which was much higher than anticipated, and the ensuing spiraling of costs.
Sustainability
CSA 1’s service was severely disrupted from March 2020 by the COVID-19 pandemic, when laboratory capacity within the trust was redirected to testing for COVID-19 and there were challenges sourcing equipment for OPSS kits. Provision of kits decreased, and the kit processing times increased. This required new collective action, with many staff having to deal with service user inquiries about delays to receiving kits or results on a regular basis and some engaging in work to mitigate the disruption, such as weekly meetings with the laboratories and kit suppliers. Kit orders were restricted for the first time to 1 per month. However, staff acceptability in CSA 1 remained high, despite these challenges in delivering OPSS.
In CSA 2, delivery of OPSS increased during the COVID-19 pandemic, and—alongside remote consultations—it enabled STI or BBV testing and treatment to be maintained amid restricted access to face-to-face clinic appointments during lockdowns. The remit of OPSS was expanded to include users with mild symptoms and those whose partners had tested positive for an STI or BBV. Many clinics continued to redirect all asymptomatic service users to OPSS following the end of pandemic restrictions. OPSS was also used to manage demand during the mpox epidemic in 2022, which placed additional pressure on clinic-based services. Staff reported being more aware of OPSS after the pandemic and participating in more collective action, referring service users to OPSS more often and accessing results more frequently, although this did create additional work, such as having to access multiple electronic patient record systems. Clinic staff noted that while OPSS reduced visits from asymptomatic service users, it also brought changes to their clinic workflows. For health care support workers, who had previously managed in-person testing for asymptomatic service users, their role became more administrative, and staff noted that it reduced career development opportunities for them. In contrast, more senior clinical staff saw an increased complexity in their clinic caseloads, and this led to the instigation of longer consultation slots in some settings.
In CSA 3, use increased far more, and far more rapidly, than anticipated, largely because OPSS was launched shortly before the pandemic. This was challenging to reset after pandemic restrictions ended. Reflexive monitoring involved additional work by clinic management and external stakeholders to explore ways to reduce the cost of delivering OPSS. The commissioner provided additional funding to meet this demand, and eventually, a cap on daily orders was agreed upon by commissioners and the sexual health service, making clinics once again the primary access point for testing. In our interviews, conducted 2 years after OPSS had been introduced, there was still a lack of coherence among staff in CSA 3, with many believing that OPSS had been introduced in response to the COVID-19 pandemic. Many also believed that it was an entirely separate service, rather than being commissioned and clinically led by the trust. Staff saw value in OPSS for improving access, but there was a strong consensus via reflexive monitoring that it was too expensive to deliver.
Explanations of Implementation Variation
Two contextual factors determined the implementation approaches available to service leaders and had a substantial impact on the implementation outcomes in all CSAs. The influence of each factor was further impacted by a third contextual factor, the COVID-19 pandemic.
Decreasing Budgets for Sexual Health Services
Participants involved in the decision to adopt OPSS frequently noted that decisions were made because of, or with reference to, reductions in budgets for STI or BBV testing. The pressure on budgets was noted as particularly acute following the transfer of responsibility for most sexual health service commissioning to local authorities. Its stated importance as a driver to implement OPSS was different across CSAs, and there was a lack of consensus within some CSAs on the extent to which it affected OPSS decisions.
Staff in CSA 1, which launched before the steepest decline in national sexual health budgets, told us that while the decision to adopt OPSS was partly driven by a desire to reduce the cost of service delivery, there was also a very strong emphasis on reforming services to improve access, embed digital, and reduce health inequalities:
The [local authority] Director of Public Health at that point was very proactive...and clearly wanted to change how we delivered this service, particularly around decentralisation and a more equitable service I think. More opportunities for people to access it.
[Participant 38, clinical, CSA 1]
The commissioner in CSA 3—who took the decision to adopt OPSS—similarly saw the introduction of OPSS as an opportunity to deliver better outcomes amid funding constraints:
There are areas on the outskirts of the city which are 16 miles away from a city-centre clinic so I felt quite strongly that we- although we had a good service we had an inequitable service.... At the same time we’d also [had] five years of cuts to the public health grant that resulted in us closing the city-centre contraception clinic so again that compromised access even further. So those factors really led me to a place of believing that if we needed to improve the service and improve access to STI and contraception care that it needed to be redesigned.
[Participant 50, commissioner, CSA 3]
Commissioners in CSA 2, by contrast, noted that their main reason for introducing OPSS was to maintain access amid rising demand and falling income, although they also hoped to increase access and improve public health outcomes:
Because of the financial trouble that authorities were in, with the cuts to the public health grant, STI testing had been going up and up in CSA2 and then—for two years before [the OPSS service launched]—stopped because the councils ran out of the money and they needed a way to grow activity without growing costs at the same rate.
[Participant 15, commissioner, CSA 2]
The financial landscape in turn shaped the delivery model and implementation approach in each CSA. CSA 1 chose to deliver OPSS in-house, enabled by the trust’s ability to fund the required infrastructure:
...doing it internally was better and the reasons for that were mostly integration and cost. Because clearly we want the people who are tested on like postal testing to have their results all in our own system rather than having it imported later. And also looking at the costs actually the external costs were very high compared with what we could do internally.... The reason it happened primarily was there was senior management buy in...the agreement to invest some money into it and the driver of losing the service essentially unless we could.
[Participant 38, clinical, CSA 1]
CSA 1 also took a proactive approach to marketing OPSS to encourage uptake. This approach was resisted in CSA 2 to avoid placing more demand on the sexual health budget. Similarly, in CSA 3, the bid for the contract stated that OPSS use would need to be constrained for the service to be financially sustainable:
The level of online testing and future trends would be closely monitored to ensure negative consequences of burgeoning demand are avoided, i.e. reduced positivity and the potential to significantly exceed the financial envelope.
[Document S7, contract, CSA 3]
This budgetary context also shaped implementation outcomes, such as staff acceptability. Clinic staff in CSA 2, for example, were more likely to see OPSS as a cost-saving measure rather than a benefit to their service and had more concerns than staff in other areas about the risk OPSS posed to their services and jobs. In CSA 3, where the costs of OPSS were directly drawing funding from clinic-based services, there was more stated concern about affordability:
I think at the end of the day it’s all about money, you know, which I think’s rather sad because when I look at my own Trust...we had lots of lovely small satellite services dotted all over [the area]. We’ve had to shut them all because again the asymptomatic pathway is all being channelled through [OPSS].
[Participant 5, nonclinical, CSA 2]
Staff in CSA 1, in contrast, were far more positive about the value OPSS brought to their service, despite some service closures and job losses during the commissioning process:
I think back when we first started yes, we was all quite positive about it. You know it took the pressure off the clinics and the walk-in clinics we used to do for people who would just walk-in and just want a general test. We could say you know rather than you sitting waiting for a few hours to be seen why don’t you do this and they were quite grateful of it a lot of them were.
[Participant 32, nonclinical, CSA 1]
Organizational Structure, Culture, and Relationships
The relationships between organizations involved in the commissioning and delivery of OPSS, the division of work between these organizations, and the culture within them were all strong influences on implementation outcomes.
CSA 1’s decision to deliver its OPSS almost entirely within the trust, for example, rather than outsourcing it to an external organization, allowed it to launch very quickly:
We decided what we were going to do and we up and did it, I’ve nothing written down at all, there were no minutes taken in meetings, we acted quickly with it and because we were all in the same [organization].
[Participant 38, clinical, CSA 1]
However, the informal relationship between stakeholders within the trust, such as the sexual health service and the pathology department, meant that equipment could be redirected away from STI and BBV testing and toward COVID-19 during the pandemic:
Covid just killed our online testing service.... So Covid started and we’re using the online testing service and all of a sudden...our main supplier cannot supply us the components simply because there was this competition between them supplying the same kits for Covid testing and STI kit.... At that time it was the height of the pandemic...it’s sexual health, no, we’re not important. Everything, all the efforts went into Covid. So that was a massive challenge to a point that we have to completely switch off our self-testing kit website and change our guidelines.
[Participant 34, clinical, CSA 1]
CSAs 2 and 3, in contrast, were able to expand OPSS delivery in 2020 because they had outsourced it to a specialist company. However, the unit cost in CSA 3 was much higher than in CSA 2, which was able to commission at scale as a consortium of local authorities. In addition, the sexual health service in CSA 3 was not able to make a unilateral decision to cap testing; it had to liaise with the local authority commissioner to make changes to the contract:
So when the contract came out, it asked us to upscale online testing over a period of five years from very little to 30% of our screening total and it very quickly obviously because the pandemic was forcing to almost 70-80% and reeling that back now is extremely difficult because the only way to reel it back in that isn’t extremely complicated is to cap testing. Most commissioners are a bit reluctant plus it is also a blunt tool because you don’t know who you are capping.
[Participant 48, clinical, CSA 3]
The number of local authorities and sexual health services involved in CSA 2’s commissioning process meant that relationships were highly formalized and there was a lack of coherence between some stakeholders. This contributed to an antagonistic culture among some:
[Meetings with commissioners] would be in seminar rooms...so you are sitting across each other and we need you to do X, Y, Z, we think this is that...it’s kind of like, well no I don’t agree...we were defensive of our service and what we needed to do and defensive of our population as well...to have a blanket approach and blanket targets we didn’t agree with. And so they listened but they were all defensive and they had their own agenda about reshaping services and reducing cost.
[Participant 2, clinical, CSA 2]
Some people saw part of what they do being taken away from them, is my sense, and felt very threatened about that like what is their purpose, their value, their specialness? And saw it is a direct kind of challenge and threat to that because...they have been indoctrinated and trained into a need to keep people passive and depending on them. Because it maintains their power and specialness.
[Participant 15, commissioner, CSA 2]
It may also have contributed to the longer, 3-year process between agreeing on the decision to commission OPSS and it becoming operational, as opposed to 6 to 12 months in the other 2 CSAs.
Discussion
Principal Findings
This study of the implementation of OPSS for STI or BBVs across 3 diverse CSAs found that budgetary pressures and organizational relationships determined the implementation strategies available to decision makers and how these strategies were enacted. Our study additionally illustrates the profound but divergent effects the COVID-19 pandemic had on OPSS implementation in different contexts.
Strengths and Limitations
Our study has several strengths. Its multisite design allowed comparison of implementation strategies, and our 12-year span of documentary sources, alongside contemporary interviews and observations, enabled us to examine outcomes, including sustainability. Combining interviews with documentary sources and contextual observations provided richer insights than interviews alone. For example, the observations generated a more nuanced understanding of collective action in “the real world,” where OPSS was less dominant than in an interview specifically about OPSS implementation. They also provided unintended insights into sustainability, for example, how CSA 3 implemented a daily cap on OPSS testing. Our use of multiple theoretical frameworks allowed them to complement each other and to develop a more comprehensive understanding of implementation outcomes, and the factors that led to them, at different levels. The major system change framework helped to standardize the narrative across the 3 CSAs, enabling us to compare implementation approaches and outcomes, while NPT was particularly helpful for surfacing the implications for staff work (captured under “collective action”). CFIR enabled us to capture wider contextual factors, such as financing, policies, and laws, that influenced our findings using the other 2 frameworks.
Our study faced some limitations. The CSAs in this study differed greatly in size and complexity. On the one hand, this was a strength of the study in surfacing contrasting implementation approaches and outcomes within the same sexual health care system. On the other hand, it affected the research processes we were able to undertake. For example, we were not able to source key documents, such as tender specifications and contracts, from all CSAs. This was partly because nearly 10 years had elapsed since OPSS had launched in 1 CSA, and staff turnover meant that no one still in post had access to documentation from the time of adoption. It also reflected different implementation approaches: those that followed more formal processes generated extensive documentation, whereas others documented fewer aspects. This made it harder to ascertain certain aspects of implementation, to compare between CSAs, and to corroborate some of the interview data. The time elapsed since the launch of OPSS in some CSAs may have introduced recall bias in certain interviews. In addition, some of the study team worked for, or closely with, sexual health services included in our evaluation. However, data collection and analysis were conducted by researchers who did not have these relationships.
Comparison With Prior Work
The coherence of OPSS to staff in this study aligns to some extent with the findings of other studies. In their review of home-based sexual health care, Goense et al [] reported that staff believed “home-based STI testing” could improve access, especially for clients vulnerable to STIs. Staff interviewed in the study by Chabot et al [] articulated an expectation that home-based STI testing would provide increased convenience for users. However, there was less evidence of staff conflict or a lack of coherence in these studies than we identified in our study. The research by Ryu et al [] on the experiences of sexual health services during COVID-19 lockdowns in Ontario, Canada, did identify some concerns among staff. These primarily related to service users not having sufficient privacy at home to access services remotely and to staff finding it difficult to provide emotional support and education outside of face-to-face interactions. While staff concerns in our study also included risks to service users, the ambivalence or opposition to OPSS in some CSAs may have been driven by wider sexual health transformation changes, which also affected staff employment and service viability [].
Ryu et al [] identified that the pandemic catalyzed the adoption and implementation of remote models of sexual health care, which persisted beyond the pandemic. This aligns to some extent with the experience of CSA 2 in our evaluation, where expanding the scope of OPSS improved its acceptability among staff. However, our findings illustrate how COVID-19 could have contrasting effects: it severely undermined OPSS in CSAs 1 and 3, although for different reasons. The interaction of the organizational context and OPSS implementation approaches helps us to understand why the pandemic had such contrasting effects. These findings contribute toward addressing an evidence gap articulated by Goense et al [] by providing insights on what affects the sustainability or maintenance of OPSS services. They also address a wider gap in the implementation science literature, where context has been well established as an important factor in implementation, but there is limited evidence on how it can dynamically affect implementation approaches, particularly over time, or on its relationship with implementation outcomes [,,,].
Existing research has observed that financial resources are one of the most commonly cited contextual factors affecting implementation, and as Nilsen et al [] note, implementation research so far has focused predominantly on the need for sufficient financial resources to enable implementation to happen []. However, our study affirms a contrasting perspective, in that OPSS was implemented precisely because of concerns about insufficient financial resources []. Commissioners and services appeared to be faced with an unenviable choice: implement this “new, untested” service or face a failure of sexual health services altogether []. This distinct context, where implementation is initiated because of financial pressures, may help explain some of the surprising findings of this study and could have implications for other digital health innovations. For example, NPT posits that embedding of an innovation is more likely when staff perceive its benefits []. In this study, clinic staff acceptability did not predict implementation success; instead, implementation success—or otherwise—influenced acceptability.
We initially approached OPSS as a digital health innovation and sought to evaluate it as such. However, our findings suggest that the implementation of OPSS was part of a process of large-scale service transformation [] and, as such, has more in common with digital transformation, in that it involved several organizations and was associated with changes in funding and staffing models across both clinic-based and internet-based services. The rapid pace of this transformation in 2 of our CSAs, which each implemented OPSS within 6 to 12 months, offers insight into why they faced challenges with sustainability, especially when compared to the more gradual implementation in CSA 2 and other digital sexual health innovations internationally, such as the 3-year implementation of Get Checked Online in British Columbia, Canada []. This rapidity may have hindered the capacity to obtain sufficient local evidence to effectively guide implementation and suggests that future studies of the implementation of OPSS or similar interventions should consider, as Chabot et al [] have done, how OPSS is embedded in the wider social and structural processes of the public health system.
Conclusions
In this multisite case study, OPSS implementation was part of systems change to address a wider problem of insufficient funding to deliver sexual health care. This systems change, alongside organizational relationships, determined the implementation strategies available to decision makers and the outcomes they achieved, as evidenced by stark differences in sustainability during COVID-19. Implementation of OPSS may have progressed before sufficient evidence to effectively guide implementation was available.
On the basis of our findings, we provide 3 actionable recommendations that could contribute to the successful and sustainable implementation of OPSS services in sexual health systems.
First, those planning to introduce OPSS should allow sufficient time in the preparatory phase to iterate plans, informed by stakeholder feedback, to build coherence among staff and service users. Our study documented that the process of introducing OPSS varied significantly between CSAs in terms of timelines, stakeholder engagement, and how formalized the process was. Our findings suggested that very rapid implementation may affect later sustainability and that local authorities commissioning as collectives are able to deliver OPSS at lower cost, particularly when outsourcing. Sexual health system leaders considering introducing OPSS would benefit from considering the implications of their business and delivery models before implementation.
Second, professionals delivering OPSS and those commissioning OPSS should build flexibility into implementation so that delivery and models can be altered to respond to changes in context. Our findings illustrate how implementation of OPSS is not a one-off event but an ongoing process that is sensitive to the context into which it is introduced and sustained. In our CSAs, important contextual issues included public service budget cuts, the sexual health service transformation agenda, and the COVID-19 pandemic, which affected the process and outcomes of implementation, particularly CSAs’ capacity to respond to changes in demand for OPSS. The nature of contextual changes may not be predictable, but building in the capacity to increase or reduce delivery of OPSS—and rebuilding staff coherence around any changes—could enhance sustainability of implementation.
Third, professionals delivering OPSS and those commissioning OPSS should reflexively monitor changes to staff workload across the sexual health system following the introduction of OPSS. Our study found that introducing OPSS into sexual health systems changes the work of different staff groups in heterogeneous ways. Some of these changes could not have been anticipated, but continually monitoring and meaningfully responding to such changes could enhance the acceptability of OPSS to staff, in turn enhancing its adoption and sustainability.
Acknowledgments
The authors would like to thank the case study area (CSA) staff and stakeholders who generously gave their time to participate in the research as interviewees, provided documentation, or agreed to be observed. This work is dedicated to the memory of Prof Elizabeth Murray and Dr Naomi Fisher, who helped design and write the grant proposal for the study and led the implementation evaluation. This report is independent research funded by the National Institute for Health and Care Research (NIHR) Health Services Research and Delivery Programme (NIHR129157) and supported by the NIHR Applied Research Collaboration North Thames. JS is funded by an NIHR Population Health Career Scientist Fellowship (NIHR303616). The views expressed in this publication are those of the authors and not necessarily those of NIHR or the Department of Health and Social Care.
Data Availability
The datasets generated or analyzed during this study are not publicly available because public availability was not part of participant consent.
Authors' Contributions
JG and FB conceived and designed the study. JG, GW, FMB, and JS designed the initial iterations of the data collection materials and analytic approach. TS collected and analyzed the interview, contextual observation, and document data. JS collected and analyzed a subset of the interview data and collected and organized document data. AH, VA, SD, JR, and CD made substantial contributions to the acquisition of data. JG, GW, AH, VA, AC, DC, SD, LJ, CHM, HM, JR, AS, AW, CD, FMB, TS, and JS made substantial contributions to the interpretation of the analysis. TS and JS drafted the initial versions of the manuscript. All coauthors critically reviewed the manuscript and have approved the submitted version.
Conflicts of Interest
JG is an honorary consultant at an organization in case study area CSA 2 and is on the editorial board for the Sexually Transmitted Infections journal. SD is employed at an organization in CSA 2. VA is employed at an organization in CSA 2, has received speakers’ fees, and is a medical director of Preventx. JR is a consultant at an organization in CSA 1, a member of the European Sexually Transmitted Infections Guidelines Editorial Board, a National Institute for Health and Care Research journals editor, treasurer for the International Union against Sexually Transmitted Infections, and chair of charity trustees for the Sexually Transmitted Infections Research Foundation. AS is employed at an organization in CSA 2. AW is a member of the European Sexually Transmitted Guidelines Editorial Board. CD is employed at an organization in CSA 3. FMB has received speakers’ fees and an institutional grant from Gilead Sciences Ltd. All other authors declare no other conflicts of interest.
Consolidated Criteria for Reporting Qualitative Research (COREQ) checklist.
PDF File (Adobe PDF File), 419 KB

Topic guide, tailored to different interviewees in the study.
DOCX File, 33 KB

Coding frame based on the normalization process theory codebook (May 2022).
DOCX File, 24 KB

Observations and think-aloud data collection tools.
DOCX File, 16 KB

Extended data table.
DOCX File, 24 KB

References
- Sumray K, Lloyd KC, Estcourt CS, Burns F, Gibbs J. Access to, usage and clinic outcomes of, online postal sexually transmitted infection services: a scoping review. Sex Transm Infect. Jun 14, 2022;98(7):528-535. [CrossRef] [Medline]
- Spence T, Kander I, Walsh J, Griffiths F, Ross J. Perceptions and experiences of internet-based testing for sexually transmitted infections: systematic review and synthesis of qualitative research. J Med Internet Res. Aug 26, 2020;22(8):e17667. [FREE Full text] [CrossRef] [Medline]
- Goense CJ, Doan TP, Kpokiri EE, Evers YJ, Estcourt CS, Crutzen R, et al. Understanding practical, robust implementation and sustainability of home-based comprehensive sexual health care: a realist review. AIDS Behav. Oct 04, 2024;28(10):3338-3349. [CrossRef] [Medline]
- Estcourt CS, Gibbs J, Sutcliffe LJ, Gkatzidou V, Tickle L, Hone K, et al. The eSexual Health Clinic system for management, prevention, and control of sexually transmitted infections: exploratory studies in people testing for Chlamydia trachomatis. Lancet Public Health. Apr 2017;2(4):e182-e190. [FREE Full text] [CrossRef] [Medline]
- Cardwell ET, Ludwick T, Fairley C, Bourne C, Chang S, Hocking JS, et al. Web-based STI/HIV testing services available for access in Australia: systematic search and analysis. J Med Internet Res. Sep 22, 2023;25:e45695. [FREE Full text] [CrossRef] [Medline]
- Rahib D, Delagreverie H, Gabassi A, Le Thi TT, Vassel E, Vodosin P, et al. Online self-sampling kits to screen multipartner MSM for HIV and other STIs: participant characteristics and factors associated with kit use in the first 3 months of the MemoDepistages programme, France, 2018. Sex Transm Infect. Mar 04, 2021;97(2):134-140. [CrossRef] [Medline]
- Kersh EN, Mena LA. At-home diagnostics solutions for chlamydia and gonorrhea. JAMA. May 28, 2024;331(20):1701-1702. [FREE Full text] [CrossRef] [Medline]
- Goense CJ, Evers YJ, Manait J, Hoebe CJ, van Loo IH, Posthouwer D, et al. Evaluating the implementation of home-based sexual health care among men who have sex with men: Limburg4zero. AIDS Behav. Mar 08, 2025;29(3):976-992. [CrossRef] [Medline]
- Sexually transmitted infections (STIs): annual data. UK Health Security Agency. 2010. URL: https://www.gov.uk/government/statistics/sexually-transmitted-infections-stis-annual-data-tables/sexually-transmitted-infections-and-screening-for-chlamydia-in-england-2023-report#overall-trends [accessed 2025-07-03]
- Integrated sexual health service specification. Office for Health Improvement & Disparities and the UK Health Security Agency. Mar 20, 2023. URL: https://assets.publishing.service.gov.uk/media/6412fa41d3bf7f79d7b78f5d/Integrated-sexual-health-service-specification-2023.pdf [accessed 2025-06-20]
- WHO Consolidated Guideline on Self-care Interventions for Health: Sexual and Reproductive Health and Rights. World Health Organization. 2019. URL: https://iris.who.int/bitstream/handle/10665/325480/9789241550550-eng.pdf [accessed 2025-07-03]
- Commissioning. National Health Service England. URL: https://www.england.nhs.uk/commissioning/ [accessed 2025-05-23]
- Sexual health. UK Parliament. Jun 2, 2019. URL: https://publications.parliament.uk/pa/cm201719/cmselect/cmhealth/1419/full-report.html [accessed 2025-05-23]
- Wilson E, Free C, Morris TP, Kenward MG, Syred J, Baraitser P. Can internet-based sexual health services increase diagnoses of sexually transmitted infections (STI)? Protocol for a randomized evaluation of an internet-based STI testing and results service. JMIR Res Protoc. Jan 15, 2016;5(1):e9. [FREE Full text] [CrossRef] [Medline]
- Manavi K, Hodson J. Observational study of factors associated with return of home sampling kits for sexually transmitted infections requested online in the UK. BMJ Open. Oct 22, 2017;7(10):e017978. [FREE Full text] [CrossRef] [Medline]
- Smith J, Cook A, Packer C, Stokes-Lampard H. Testing for Chlamydia trachomatis: is more choice a good thing? J Fam Plann Reprod Health Care. Jan 10, 2011;37(1):4-7. [FREE Full text] [CrossRef] [Medline]
- Syred J, Holdsworth G, Howroyd C, Spelman K, Baraitser P. Choose to test: self-selected testing for sexually transmitted infections within an online service. Sex Transm Infect. May 05, 2019;95(3):171-174. [FREE Full text] [CrossRef] [Medline]
- Ross J, Stevenson F, Lau R, Murray E. Factors that influence the implementation of e-health: a systematic review of systematic reviews (an update). Implement Sci. Oct 26, 2016;11(1):146. [FREE Full text] [CrossRef] [Medline]
- Proctor EK, Bunger AC, Lengnick-Hall R, Gerke DR, Martin JK, Phillips RJ, et al. Ten years of implementation outcomes research: a scoping review. Implement Sci. Jul 25, 2023;18(1):31. [FREE Full text] [CrossRef] [Medline]
- Digital revolution to bust COVID backlogs and deliver more tailored care for patients. Department of Health and Social Care United Kingdom Government. Jun 29, 2022. URL: https://www.gov.uk/government/news/digital-revolution-to-bust-covid-backlogs-and-deliver-more-tailored-care-for-patients [accessed 2025-05-23]
- Gibbs J, Howarth AR, Sheringham J, Jackson LJ, Wong G, Copas A, et al. Assessing the impact of online postal self-sampling for sexually transmitted infections on health inequalities, access to care and clinical outcomes in the UK: protocol for ASSIST, a realist evaluation. BMJ Open. Dec 14, 2022;12(12):e067170. [FREE Full text] [CrossRef] [Medline]
- Spence T, Howarth A, Reid D, Sheringham J, Apea V, Crundwell D, et al. How does online postal self-sampling (OPSS) shape access to testing for sexually transmitted infections (STIs)? A qualitative study of service users. BMC Public Health. Aug 28, 2024;24(1):2339. [FREE Full text] [CrossRef] [Medline]
- May C, Finch T. Implementing, embedding, and integrating practices: an outline of normalization process theory. Sociology. Jun 15, 2009;43(3):535-554. [CrossRef]
- Malterud K, Siersma VD, Guassora AD. Sample size in qualitative interview studies: guided by information power. Qual Health Res. Nov 2016;26(13):1753-1760. [CrossRef] [Medline]
- May CR, Albers B, Bracher M, Finch TL, Gilbert A, Girling M, et al. Translational framework for implementation evaluation and research: a normalisation process theory coding manual for qualitative research and instrument development. Implement Sci. Feb 22, 2022;17(1):19. [FREE Full text] [CrossRef] [Medline]
- Dissemination implementation - an interactive webtool to help you use D and I models. University of Colorado Denver. URL: https://dissemination-implementation.org/ [accessed 2025-05-23]
- Fulop NJ, Ramsay AI, Perry C, Boaden RJ, McKevitt C, Rudd AG, et al. Explaining outcomes in major system change: a qualitative study of implementing centralised acute stroke services in two large metropolitan regions in England. Implement Sci. Jun 03, 2016;11(1):80. [FREE Full text] [CrossRef] [Medline]
- Damschroder LJ, Reardon CM, Widerquist MA, Lowery J. The updated Consolidated Framework for Implementation Research based on user feedback. Implement Sci. Oct 29, 2022;17(1):75. [FREE Full text] [CrossRef] [Medline]
- Willis GB. Cognitive Interviewing: A Tool for Improving Questionnaire Design. Thousand Oaks, CA. SAGE Publications; 2005.
- Proctor E, Silmere H, Raghavan R, Hovmand P, Aarons G, Bunger A, et al. Outcomes for implementation research: conceptual distinctions, measurement challenges, and research agenda. Adm Policy Ment Health. Mar 19, 2011;38(2):65-76. [FREE Full text] [CrossRef] [Medline]
- Birken SA, Powell BJ, Presseau J, Kirk MA, Lorencatto F, Gould NJ, et al. Combined use of the Consolidated Framework for Implementation Research (CFIR) and the Theoretical Domains Framework (TDF): a systematic review. Implement Sci. Jan 05, 2017;12(1):2. [FREE Full text] [CrossRef] [Medline]
- Nilsen P, Bernhardsson S. Context matters in implementation science: a scoping review of determinant frameworks that describe contextual determinants for implementation outcomes. BMC Health Serv Res. Mar 25, 2019;19(1):189. [FREE Full text] [CrossRef] [Medline]
- Chabot C, Gilbert M, Haag D, Ogilvie G, Hawe P, Bungay V, et al. Anticipating the potential for positive uptake and adaptation in the implementation of a publicly funded online STBBI testing service: a qualitative analysis. BMC Health Serv Res. Jan 30, 2018;18(1):57. [FREE Full text] [CrossRef] [Medline]
- Ryu H, Blaque E, Stewart M, Anand P, Gómez-Ramírez O, MacKinnon KR, et al. Disruptions of sexually transmitted and blood borne infections testing services during the COVID-19 pandemic: accounts of service providers in Ontario, Canada. BMC Health Serv Res. Jan 13, 2023;23(1):29. [FREE Full text] [CrossRef] [Medline]
- Day S, Singh GJ, Jones S, Kinsella R. Sexual assault reporting amongst users of online sexual health services. Int J STD AIDS. Mar 16, 2021;32(3):280-285. [FREE Full text] [CrossRef] [Medline]
- Wakida EK, Talib ZM, Akena D, Okello ES, Kinengyere A, Mindra A, et al. Barriers and facilitators to the integration of mental health services into primary health care: a systematic review. Syst Rev. Nov 28, 2018;7(1):211. [FREE Full text] [CrossRef] [Medline]
- White C. Sexual health services on the brink. BMJ. Nov 30, 2017;359:j5395. [CrossRef] [Medline]
- Waters A. Sexual health services are at "breaking point" after £1bn in cuts since 2015. BMJ. Nov 16, 2022;379:o2766. [CrossRef] [Medline]
- Best A, Greenhalgh T, Lewis S, Saul JE, Carroll S, Bitz J. Large-system transformation in health care: a realist review. Milbank Q. Sep 18, 2012;90(3):421-456. [CrossRef] [Medline]
- Gilbert M, Haag D, Hottes TS, Bondyra M, Elliot E, Chabot C, et al. Get checked… where? Lessons learned from implementing GetCheckedOnline, an integrated, complex public health system intervention to promote online STI/HIV testing in British Columbia, Canada. In: Proceedings of the 2016 National STD Prevention Conference. 2016. Presented at: STD Prevention Conference; September 20-23, 2016; Atlanta, Georgia.
Abbreviations
| ASSIST: assessing the impact of online postal self-sampling for sexually transmitted infections on health inequalities, access to care and clinical outcomes in the United Kingdom |
| BBV: blood-borne virus |
| CFIR: Consolidated Framework for Implementation Research |
| CSA: case study area |
| NHS: National Health Service |
| NPT: normalization process theory |
| OPSS: online postal self-sampling |
| STI: sexually transmitted infection |
Edited by A Mavragani; submitted 21.02.25; peer-reviewed by E Hannah, C Roleston; comments to author 13.04.25; revised version received 23.05.25; accepted 04.06.25; published 09.09.25.
Copyright©Tommer Spence, Jo Gibbs, Geoff Wong, Alison Howarth, Andrew Copas, David Crundwell, Louise Jackson, Catherine H Mercer, Hamish Mohammed, Vanessa Apea, Sara Day, Jonathan Ross, Ann Sullivan, Andrew Winter, Claire Dewsnap, Fiona M Burns, Jessica Sheringham. Originally published in the Journal of Medical Internet Research (https://www.jmir.org), 09.09.2025.
This is an open-access article distributed under the terms of the Creative Commons Attribution License (https://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work, first published in the Journal of Medical Internet Research (ISSN 1438-8871), is properly cited. The complete bibliographic information, a link to the original publication on https://www.jmir.org/, as well as this copyright and license information must be included.