Published on 17.10.2024 in Vol 26 (2024)

Preprints (earlier versions) of this paper are available at https://preprints.jmir.org/preprint/50184.
Mobile Phone Syndromic Surveillance for Respiratory Conditions in an Emergency (COVID-19) Context in Colombia: Representative Survey Design


Original Paper

1Health Systems Program, International Health Department, Johns Hopkins Bloomberg School of Public Health, Baltimore, MD, United States

2Pontificia Universidad Javeriana, Bogota, Colombia

3IMEK, Cali, Colombia

4Inter-American Development Bank, Bogota, Colombia

*these authors contributed equally

Corresponding Author:

Andres I Vecino-Ortiz, MD, PhD

Health Systems Program

International Health Department

Johns Hopkins Bloomberg School of Public Health

615 N Wolfe Street

Baltimore, MD, 21205

United States

Phone: 1 410 955 3934

Email: andres.vecino@gmail.com


Background: Syndromic surveillance for respiratory infections such as COVID-19 is a crucial part of the public health surveillance toolkit as it allows decision makers to detect and prepare for new waves of the disease in advance. However, it is labor-intensive, costly, and increases exposure to survey personnel. This study assesses the feasibility of conducting a mobile phone–based respiratory syndromic surveillance program in a middle-income country during a public health emergency, providing data to support the inclusion of this method in the standard infection control protocols at the population level.

Objective: This study aims to assess the feasibility of a national active syndromic surveillance system for COVID-19 disease in Colombia.

Methods: In total, 2 pilots of syndromic mobile phone surveys (MPSs) were deployed using interactive voice response technology in Colombia (367 complete surveys in March 2022 and 451 complete surveys in April and May 2022). Respondents aged 18 years and older were sampled using random digit dialing, and after obtaining consent, they were sent a 10-minute survey with modules on sociodemographic status, respiratory symptoms, past exposure to COVID-19 infection and vaccination status, preferences about COVID-19 vaccination, and information source for COVID-19. Pilot 1 used a nationally representative sample while pilot 2 used quota sampling to yield representative results at the regional level. In this work, we assessed the performance characteristics of the survey pilots and compared the demographic information collected with a nationally representative household survey.

Results: For both pilots, contact rates were between 1% and 2%, while cooperation rates were above 80%. The results revealed that younger, female, and more educated participants were more likely to participate in the syndromic survey. Survey rates, demographics, COVID-19 vaccination status, and the prevalence of respiratory symptoms are reported for both pilots; compared with the general population, MPS respondents were more likely to be younger and female.

Conclusions: In a COVID-19 pandemic setting, using an interactive voice response MPS to conduct syndromic surveillance may be a transformational, low-risk, and feasible method to detect outbreaks. This evaluation is expected to provide a path forward for including MPSs among standard surveillance methods.

J Med Internet Res 2024;26:e50184

doi:10.2196/50184

Introduction

Surveillance systems that provide real-time and accurate information are increasingly critical for decision-making in public health, as demonstrated by the COVID-19 pandemic [1]. Commonly used indicators for monitoring the COVID-19 pandemic were the number of laboratory-confirmed cases, the number of persons hospitalized or in intensive care, and the number of COVID-19 deaths [2,3]. However, these indicators are highly dependent on the health systems’ capacity, might fail to represent real-time conditions, are often mediated by other factors such as health care access, and are less likely to detect less severe cases [4]. These challenges become more important in low- and middle-income countries (LMICs) where traditional surveillance tools are scarce. For this reason, new strategies to complement traditional epidemiologic surveillance are needed [5]. The World Health Organization (WHO) has recommended enhancing traditional epidemiologic surveillance systems with other tools such as syndromic surveillance, where individuals can self-report symptoms related to infection [5], improving the timeliness and coverage, at lower costs [6-9].

Mobile phones have the potential to be an efficient tool for syndromic surveillance [1,4,10] because of their widespread use. For example, it has been estimated that there are 133 mobile telephone subscriptions for every 100 people in Colombia [11]. In addition, surveys using mobile phone technology have been shown to have lower costs than household surveys [9]. Finally, mobile phone syndromic surveillance is easy to deploy in emergency contexts where social distancing is required.

Mobile phone–based syndromic surveillance systems used during the COVID-19 pandemic have been mainly passive and limited to smartphone apps linked to contact tracing [1,4,10,12-14], an approach that may be of limited use in LMICs, where the number of smartphones with internet access is low [10]. Other studies in LMICs have used interactive voice response (IVR) as an active surveillance system for behavior, exposure, knowledge, and perceptions related to COVID-19, but not as a syndromic surveillance system [6,15,16]. To our knowledge, there have been no published experiences of an active syndromic surveillance system for COVID-19 or other health emergencies using IVR as a data collection tool. We hypothesize that IVR-based mobile phone surveys are a feasible way to conduct syndromic surveillance in respiratory disease emergencies. This study aims to assess the feasibility of a national active syndromic surveillance system for COVID-19 disease in Colombia.


Methods

Data and Surveillance Instrument

IVR surveys were developed, and cognitive testing was performed [17]. Participant phone numbers were obtained through random digit dialing [18], using a prefix ranging from 300 to 323 (the prefixes assigned to all mobile phone numbers in Colombia) followed by 7 randomly selected digits. Random digit dialing is a technique used in previous research to yield a random sample of phone numbers when a preexisting database does not exist or cannot be accessed for privacy reasons [19]. Respondents who were aged ≥18 years and provided consent were considered eligible and could participate in a 10-minute survey comprising the following topical modules: sociodemographic status, respiratory symptoms, previous COVID-19 infection and vaccination status, preferences about COVID-19 vaccination, and sources of information on COVID-19. On survey completion, respondents were given an airtime incentive of COL $4000 (around US $1) [20].
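For illustration, the random digit dialing scheme described above can be sketched in a few lines of code. This is a hypothetical sketch for exposition, not the dialing software used in the study; only the prefix range 300 to 323 and the 7-digit suffix are taken from the description above.

```python
import random

# Colombian mobile numbers consist of a 3-digit prefix in the range 300-323
# followed by 7 digits, as described in the text above.
PREFIXES = [str(p) for p in range(300, 324)]  # 300 to 323, inclusive

def generate_rdd_number(rng: random.Random) -> str:
    """Return one candidate mobile number for the random digit dialing frame."""
    prefix = rng.choice(PREFIXES)
    suffix = "".join(str(rng.randint(0, 9)) for _ in range(7))
    return prefix + suffix

rng = random.Random(2022)  # fixed seed so the example is reproducible
print([generate_rdd_number(rng) for _ in range(5)])
```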

Sampling

After testing, we deployed 2 pilots of IVR syndromic surveys with different levels of representativeness. The objective of the first pilot was to assess the feasibility of the IVR system to carry out syndromic surveys, as well as determine population profiles at the national level more likely to respond to this tool. The second pilot was aimed at determining whether quota sampling was effective in improving population representativeness for less populated regions. Following previous work [21-23], the 5 different regions were defined as follows: Caribbean (including Córdoba, Sucre, Bolívar, Magdalena, Atlántico, La Guajira, Cesar, and San Andrés y Providencia) with around 12 million inhabitants, Pacific region (including Chocó, Valle del Cauca, Cauca, and Nariño) with a population around 8.5 million, Amazonas river region (including Amazonas, Putumayo, Caquetá, Vaupés, Guaviare, and Guainía) with about 1.1 million population, Orinoco River region (including Meta, Vichada, Casanare, and Arauca) with 2 million population, and the Central region (including Bogotá, Cundinamarca, Huila, Tolima, Boyacá, Santander, Norte de Santander, Caldas, Risaralda, Antioquia, and Quindío) with around 29 million population [24].
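For reference, the regional strata described above can be encoded as a simple lookup structure. The following sketch is illustrative only; the region names, department lists, and approximate populations are transcribed from the preceding paragraph.

```python
# Regional strata used in pilot 2, transcribed from the text above
# (populations are approximate, in millions of inhabitants).
REGIONS = {
    "Caribbean": {
        "departments": ["Córdoba", "Sucre", "Bolívar", "Magdalena", "Atlántico",
                        "La Guajira", "Cesar", "San Andrés y Providencia"],
        "population_millions": 12,
    },
    "Pacific": {
        "departments": ["Chocó", "Valle del Cauca", "Cauca", "Nariño"],
        "population_millions": 8.5,
    },
    "Amazon": {
        "departments": ["Amazonas", "Putumayo", "Caquetá", "Vaupés", "Guaviare", "Guainía"],
        "population_millions": 1.1,
    },
    "Orinoco": {
        "departments": ["Meta", "Vichada", "Casanare", "Arauca"],
        "population_millions": 2,
    },
    "Central": {
        "departments": ["Bogotá", "Cundinamarca", "Huila", "Tolima", "Boyacá",
                        "Santander", "Norte de Santander", "Caldas", "Risaralda",
                        "Antioquia", "Quindío"],
        "population_millions": 29,
    },
}
```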

The sample size for the first pilot was calculated using the STEPS (WHO Stepwise Approach to Noncommunicable Disease Risk Factor Surveillance) survey calculator, as used in previous research [9,16,25,26]. The STEPS calculation is recommended for repeated cross-sectional, population-based household surveys (see equation 1). The assumptions for the sample size were a 95% CI (z score=1.96), a margin of error of 0.05, and a baseline prevalence of 0.34 [27], yielding a target of 345 complete surveys for a nationally representative sample.
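Equation 1 is not reproduced here; under the stated assumptions (and assuming a design effect of 1), the standard single-proportion sample size formula used by calculators of this type is consistent with the reported target:

```latex
n \;=\; \frac{z^{2}\,p\,(1-p)}{e^{2}} \;=\; \frac{1.96^{2}\times 0.34\times 0.66}{0.05^{2}} \;\approx\; 345
```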

The second pilot instead focused on producing values that are representative of 5 regions in Colombia through automated strata sampling [28]. To achieve this, we recalculated the sample size with the prevalence of symptoms obtained from the previous survey (0.06), yielding a total of 87 complete surveys for each of the 5 regions, keeping all other parameters the same.
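As a worked check of both targets, the same formula can be evaluated directly. This is an illustrative sketch; the authors used the STEPS calculator rather than this code.

```python
import math

def sample_size(prevalence: float, z: float = 1.96, margin_of_error: float = 0.05) -> int:
    """Single-proportion sample size, rounded up; a design effect of 1 is assumed."""
    n = (z ** 2) * prevalence * (1 - prevalence) / (margin_of_error ** 2)
    return math.ceil(n)

print(sample_size(0.34))  # pilot 1 (national): 345 complete surveys
print(sample_size(0.06))  # pilot 2 (per region): 87 complete surveys
```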

The surveys were deployed from March 12 to March 15, 2022 (pilot 1), and from April 23 to May 13, 2022 (pilot 2). The second pilot took longer because of the quota sampling: many calls were deemed ineligible because the quota for the respective region had already been filled.

Outcomes—Disposition Codes

Disposition codes were assigned using the following nomenclature. I stands for a complete interview: the respondent was aged ≥18 years, consented, and answered all applicable questions. P stands for a partial interview: the respondent was aged ≥18 years, consented, answered the demographic questions and some questions from the other modules, but did not finish the COVID-19 information module (ie, did not reach the airtime incentive question). B stands for a break-off: the respondent was aged ≥18 years, consented, and answered only the demographic questions. R stands for a refusal after listening to the consent script. U stands for unknown eligibility: these were phone numbers that did not answer, or answered but did not reach the consent script, so eligibility could not be ascertained. Respondents could be deemed ineligible for 3 reasons: (1) the line was not registered, (2) they reported being aged younger than 18 years, or (3) in pilot 2, they belonged to a region whose quota had already been met [29].
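To make these decision rules concrete, the following sketch maps a single call record to a disposition code. The field names are hypothetical and the logic is a simplified reading of the rules above, not the actual survey platform's implementation.

```python
def disposition_code(call: dict) -> str:
    """Map one call record to a disposition code following the rules described above.
    All dictionary keys are illustrative placeholders."""
    if not call.get("line_registered", True):
        return "ineligible: line not registered"
    if call.get("quota_already_met"):            # applies to pilot 2 only
        return "ineligible: quota met"
    if not call.get("reached_consent"):          # no answer, or dropped before consent
        return "U"                               # unknown eligibility
    if call.get("reported_age_under_18"):
        return "ineligible: <18 years"
    if not call.get("consented"):
        return "R"                               # refusal after hearing the consent script
    if call.get("reached_incentive_question"):   # answered all applicable questions
        return "I"                               # complete interview
    if call.get("answered_beyond_demographics"):
        return "P"                               # partial interview
    return "B"                                   # break-off after the demographics module

# Example: a respondent who consented and stopped after the demographics module.
print(disposition_code({"reached_consent": True, "consented": True}))  # prints "B"
```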

Survey rates are defined as follows: (1) the contact rate measures the proportion of all cases in which an eligible mobile phone user was reached by the survey, (I+P+B+R)/(I+P+B+R+U); (2) the response rate measures the proportion of complete and partial interviews among all known eligible mobile phone users and users of unknown eligibility, (I+P)/(I+P+B+R+U); (3) the refusal rate measures the proportion of mobile phone users who did not consent or who broke off the survey after consent among all known eligible users and users of unknown eligibility, (B+R)/(I+P+B+R+U); and (4) the cooperation rate measures the proportion of complete interviews among all known eligible mobile phone users, I/(I+P+B).
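A minimal sketch of these 4 rates, using the pilot 1 disposition counts from Table 2 as example inputs (Table 3 uses slightly different denominators in places, so the printed values may differ marginally from the reported rates):

```python
def survey_rates(I: int, P: int, B: int, R: int, U: int) -> dict:
    """Compute the 4 survey rates defined above from disposition counts:
    I complete, P partial, B break-off, R refusal, U unknown eligibility."""
    denominator = I + P + B + R + U
    return {
        "contact_rate": (I + P + B + R) / denominator,
        "response_rate": (I + P) / denominator,
        "refusal_rate": (B + R) / denominator,
        "cooperation_rate": I / (I + P + B),
    }

# Pilot 1 disposition counts from Table 2 (ineligible calls are excluded from these formulas).
for name, value in survey_rates(I=367, P=36, B=53, R=444, U=43_137).items():
    print(f"{name}: {value:.2%}")
```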

An outline for the disposition codes can be seen in Figure 1.

Figure 1. Outline for the disposition codes.

Outcomes—Demographic Characteristics

Demographic characteristics of the respondents were compared to the National Quality of Life Survey 2021 (NQLS). Importantly, we did not use statistical tests to compare the 2 databases given that the sources of data are different in structure (Table 1).

Table 1. Comparison of sociodemographic and vaccination variables for Colombia, pilot 1a.
Variables | NQLS 2021b,c,d, % (SE) | Pilot 1 (N=367), n (%, SE) | Point difference between NQLS and pilot 1
Sociodemographic
Age (years), mean (SD) | 41.8 (0.068) | 36.3 (0.734) | –5.4
Age groups (years)
18-29 | 27 (0.0019) | 139 (38, 0.025) | +11
30-39 | 21 (0.0018) | 88 (24, 0.022) | +3
40-49 | 18 (0.0017) | 70 (19, 0.020) | +1
50-59 | 16 (0.0015) | 37 (10, 0.015) | –6
60-78 | 18 (0.0016) | 33 (9, 0.014) | –9
Sex (female) | 52 (0.002) | 209 (57, 0.025) | +5
Geographical area
Rural | 22 (0.0012) | 99 (27, 0.023) | +5
Urban | 78 (0.0012) | 268 (73, 0.023) | –5
Education level
Do not have | 5 (0.0007) | 26 (7, 0.012) | +2
Elementary school | 24 (0.0017) | 48 (13, 0.017) | –11
High school | 44 (0.0022) | 128 (35, 0.024) | –9
Technical | 12 (0.0016) | 99 (27, 0.023) | +15
Undergraduate or more | 15 (0.0019) | 66 (18, 0.020) | +3
Vaccinated against COVID-19
Yes | —e | 330 (90, 0.015) | —
No | — | 37 (10, 0.015) | —

aWe do not use tests to compare both databases given that both sources of data are different in structure and cannot be merged in the same database.

bNQLS: National Quality of Life Survey 2021.

cFor the National Quality of Life Survey, absolute values are not reported because we estimated weighted percentages.

dThe percentages of the National Quality of Life Survey 2021 were calculated with sample weights.

eNot applicable.

All statistical analyses were conducted using Stata (version 14; StataCorp LLC).

Ethical Considerations

This study was approved by the Institutional Review Board of the Johns Hopkins Bloomberg School of Public Health (17868), and the Ethics Committee of the Public Health Institute at Universidad Javeriana under filing number 4 of the session conducted on August 13, 2021.


Results

Disposition Codes and Survey Rates for Pilots 1 and 2

Pilot 1

In total, 55,000 phone calls were made for the first pilot. The contact rate was 1.64% (900 out of 55,000), the response rate was 0.73% (403 out of 55,000), and the cooperation rate was 80.48% (367 of 456; see Tables 2 and 3).

Table 2. Disposition codes and survey rates for pilots 1 and 2a.
Call outcomes | Pilot 1 (N=55,000), n (%) | Pilot 2 (N=588,891), n (%)
Complete interview | 367 (0.7) | 451 (0.08)
Partial interview | 36 (0.07) | 24 (0.004)
Break-off | 53 (0.10) | 72 (0.01)
Ineligible: quota met | N/Ab | 3486 (0.59)
Ineligible: <18 years | 72 (0.13) | 633 (0.11)
Ineligible: line not registered | 10,891 (19.80) | 119,542 (20.30)
Refusal | 444 (0.81) | 6230 (1.06)
Unknown | 43,137 (78.43) | 458,453 (77.85)

aThe disposition codes of the pilot surveys are according to the American Association for Public Opinion Research [29].

bN/A: not applicable.

Table 3. Disposition codes and survey rates for pilots 1 and 2a.
Survey rate | Pilot 1, n/N (%) | Pilot 2, n/N (%)
Contact rateb | 900/43,984 (2.05) | 6777/465,230 (1.46)
Response ratec | 403/44,037 (0.92) | 475/465,230 (0.10)
Refusal rated | 497/44,037 (1.13) | 6302/465,230 (1.35)
Cooperation ratee | 367/456 (80.48) | 451/547 (82.45)

aThe disposition codes of the pilot surveys are according to the American Association for Public Opinion Research [29].

bContact rate: measures the proportion of all cases in which some eligible mobile phone user was reached by the survey.

cResponse rate: measures the proportion of both complete and incomplete interviews over both all known eligible mobile phone users and users with unknown eligibility.

dRefusal rate: measures the proportion of mobile phone users that do not consent or break off the survey after consent over both all known eligible and users with unknown eligibility.

eCooperation rate: measures the proportion of complete interviews over all known eligible mobile phone users.

We met the target sample size for a nationally representative sample with 367 complete surveys. The average age was 36.36 years (SD 14.06 years; range 18 to 78 years); the age groups with the highest participation were 18 to 29 years (n=139, 38%) and 30 to 39 years (n=88, 24%), followed by 40 to 49 years (n=70, 19%). More than half of the respondents were women (n=209, 57%), and most had not completed a college degree (n=301, 82%) and lived in an urban area (n=268, 73%).

We found that 6% (21 out of 367) of respondents reported that someone in their household had had any COVID-19–related symptoms in the past 3 days; in just over half of these households, it was the respondent who had the symptoms (n=11, 52%). Likewise, 44% (n=161) of the respondents reported having had a confirmed or suspected case of COVID-19 in the past. Further, 90% (n=329) of respondents reported having received at least one dose of the COVID-19 vaccine, 54% (n=177) reported being fully vaccinated but without the booster, and 27% (n=88) stated being fully vaccinated with the booster. Among the 10% (38/367) who had not been vaccinated, the most commonly cited reason was concern about vaccine safety (n=10, 27%). Additionally, the top 3 means by which respondents learned about the COVID-19 vaccine were other networks or the internet (n=180, 49%), Facebook (n=77, 21%), and friends or relatives (n=43, 12%). Finally, most respondents reported that they clearly understood the information on COVID-19 and trusted the information they received (Table 4).

Table 4. COVID-19 variables for pilot 1.
Variables | Values
Symptoms related to COVID-19 (3 days ago), n (%)
Yes | 21 (6)
No | 346 (94)
Who has COVID-19, n (%)
Respondent | 11 (52)
Household | 5 (24)
Respondent and household | 5 (24)
Age of sick household member (years), mean (SD) | 26.4 (21.74)
Sex of sick household member (female), n (%) | 4 (80)
Community symptoms, n (%)a
Yes | 11 (65)
No | 6 (35)
Have you ever had COVID-19, n (%)
Yes, confirmed | 69 (19)
Yes, suspected | 92 (25)
No | 206 (56)
Wanted to receive the COVID-19 vaccine, n (%)b
Yes | 276 (84)
No | 53 (16)
Would you like to receive the COVID-19 vaccine, n (%)
Yes | 17 (44)
No | 21 (55)
Why have you not been vaccinated, n (%)
Vaccines were not available | 6 (16)
Vaccine does not work | 2 (5)
Vaccine is not safe | 10 (27)
I do not think I need it | 5 (13)
Already got COVID-19 | 2 (5)
I have not been able to get the vaccine | 6 (16)
Other reasons | 5 (13)
Do not know or do not want to respond | 2 (5)
Number of COVID-19 vaccine doses, n (%)
One dose | 61 (19)
Fully, not booster | 177 (54)
Fully, plus booster | 88 (27)
The brand of the first vaccine dose, n (%)
CoronaVac | 87 (26)
Pfizer | 78 (24)
Johnson & Johnson | 66 (20)
AstraZeneca | 45 (14)
Moderna | 44 (13)
Another vaccine | 2 (1)
Do not know | 7 (2)
The brand of the second vaccine dose, n (%)
CoronaVac | 75 (28)
Pfizer | 76 (29)
Johnson & Johnson | 35 (13)
AstraZeneca | 33 (12)
Moderna | 37 (14)
Do not know | 9 (3)
The brand of the booster vaccine dose, n (%)
CoronaVac | 16 (18)
Pfizer | 24 (27)
Johnson & Johnson | 12 (14)
AstraZeneca | 16 (18)
Moderna | 17 (19)
Do not know | 3 (3)
Vaccination record for COVID-19, n (%)
Yes, I have it | 331 (90)
I do not have it | 30 (8)
I do not know it | 3 (1)
I prefer not to answer | 3 (1)
Main source of information about COVID-19 vaccination, n (%)
Facebook | 77 (21)
WhatsApp | 29 (8)
Other networks or internet | 180 (49)
Friends or relatives (word of mouth) | 43 (12)
Radio | 4 (1)
Television | 26 (7)
Print media | 3 (1)
I have not received any information | 5 (1)
Your understanding about COVID-19 vaccination seems, n (%)
Very clear | 244 (66)
It is more or less clear | 92 (25)
Little clear | 24 (7)
Not clear | 7 (2)
Do you trust the information you receive about COVID-19 vaccination, n (%)
Always | 157 (43)
Almost always | 135 (37)
Almost never | 55 (15)
Never | 20 (5)

aCommunity symptoms: do you know about someone outside your household who has had in the last month any of the following symptoms: fever, sore throat, frequent cough, feeling sick, or loss of the sense of smell?

bWanted to receive the COVID-19 vaccine: did you want to receive the COVID-19 vaccine?

Pilot 2

In total, 588,891 phone calls were made for the second pilot. The contact rate was 1.16% (6777 out of 588,891), the response rate was 0.08% (474 out of 588,891), and the cooperation rate was 82.45% (451 out of 547; see Tables 2 and 3).

We met the sample size for the 5 regions in Colombia (Caribbean region: n=90, Pacific region: n=91, Amazon region: n=88, Orinoco River region: n=93, and Central region: n=89). In total, for the 5 regions, a sample of 451 responses was reached. All results are presented in Table 5.

Table 5. Comparison of sociodemographic and vaccination variables for Colombia, pilot 2.
Variables | Caribbean: NQLSa,b,c 2021, % (SE) | Caribbean: IVRd (n=90), % (SE) | Caribbean: difference | Pacific: NQLS 2021, % (SE) | Pacific: IVR (n=91), % (SE) | Pacific: difference | Amazon: NQLS 2021, % (SE) | Amazon: IVR (n=88), % (SE) | Amazon: difference | Orinoquia: NQLS 2021, % (SE) | Orinoquia: IVR (n=93), % (SE) | Orinoquia: difference | Central: NQLS 2021, % (SE) | Central: IVR (n=89), % (SE) | Central: difference
Sociodemographic
Age (years), mean (SD) | 42.04 (0.101) | 36.16 (1.623) | –5.88 | 43.43 (0.167) | 34.89 (1.367) | –8.41 | 40.41 (0.152) | 29.72 (1.115) | –10.69 | 41.01 (0.171) | 32.69 (1.286) | –8.32 | 43.52 (0.111) | 34.17 (1.491) | –9.35
Age groups (years)
18-29 | 29 (0.0028) | 41 (0.052) | 12 | 27 (0.0043) | 40 (0.051) | 13 | 32 (0.0045) | 52 (0.053) | 20 | 31 (0.0047) | 48 (0.052) | 17 | 26 (0.0029) | 46 (0.053) | 20
30-39 | 21 (0.0025) | 24 (0.045) | 3 | 21 (0.0039) | 25 (0.045) | 4 | 22 (0.0040) | 28 (0.048) | 6 | 22 (0.0042) | 26 (0.045) | 4 | 21 (0.0027) | 20 (0.042) | –1
40-49 | 18 (0.0024) | 15 (0.037) | –3 | 17 (0.0036) | 23 (0.044) | 6 | 17 (0.0035) | 16 (0.039) | –1 | 18 (0.0039) | 13 (0.034) | –5 | 18 (0.0025) | 16 (0.038) | –2
50-59 | 15 (0.0021) | 11 (0.033) | –4 | 15 (0.0033) | 9 (0.029) | –6 | 14 (0.0031) | 2 (0.015) | –12 | 13 (0.0034) | 11 (0.032) | –2 | 15 (0.0023) | 10 (0.032) | –5
60-91 | 17 (0.0022) | 9 (0.030) | –8 | 20 (0.0038) | 3 (0.018) | –17 | 15 (0.0031) | 1 (0.011) | –14 | 16 (0.0037) | 2 (0.015) | –14 | 20 (0.0025) | 8 (0.028) | –12
Sex (female) | 52 (0.0030) | 59 (0.053) | 7 | 53 (0.0048) | 52 (0.052) | –1 | 51 (0.0047) | 51 (0.054) | 0 | 50 (0.0051) | 49 (0.052) | –1 | 53 (0.0033) | 64 (0.051) | 11
Geographical area
Rural | 26 (0.0023) | 19 (0.041) | –7 | 33 (0.0037) | 30 (0.048) | –3 | 42 (0.0045) | 35 (0.051) | –7 | 28 (0.0037) | 25 (0.044) | –3 | 16 (0.0014) | 25 (0.045) | 9
Urban | 74 (0.0023) | 81 (0.041) | 7 | 67 (0.0037) | 70 (0.048) | 3 | 58 (0.0045) | 65 (0.051) | 7 | 72 (0.0037) | 75 (0.044) | 3 | 84 (0.0014) | 75 (0.045) | –9
Education level
Do not have | 9 (0.0015) | 4 (0.021) | –5 | 5 (0.0017) | 4 (0.021) | –1 | 7 (0.0022) | 6 (0.024) | –1 | 5 (0.0020) | 3 (0.018) | –2 | 3 (0.0010) | 2 (0.015) | –1
Elementary school | 23 (0.0025) | 8 (0.028) | –15 | 30 (0.0042) | 8 (0.028) | –22 | 37 (0.0047) | 10 (0.032) | –27 | 29 (0.0045) | 12 (0.033) | –17 | 24 (0.0026) | 14 (0.036) | –10
High school | 45 (0.0031) | 31 (0.049) | –14 | 44 (0.0049) | 45 (0.052) | 1 | 43 (0.0048) | 30 (0.048) | –13 | 45 (0.0052) | 34 (0.049) | –11 | 43 (0.0034) | 47 (0.053) | 4
Technical | 12 (0.0021) | 30 (0.048) | 18 | 10 (0.0034) | 29 (0.047) | 19 | 7 (0.0023) | 37 (0.051) | 30 | 11 (0.0034) | 39 (0.050) | 28 | 12 (0.0024) | 19 (0.053) | 7
Undergraduate or more | 11 (0.0021) | 27 (0.046) | 16 | 11 (0.0034) | 14 (0.036) | 3 | 6 (0.0023) | 17 (0.040) | 11 | 10 (0.0036) | 12 (0.033) | 1 | 18 (0.0030) | 18 (0.041) | 0
Vaccination for COVID-19e
Yesf | — | 96 (0.021) | — | — | 82 (0.040) | — | — | 78 (0.044) | — | — | 90 (0.030) | — | — | 94 (0.024) | —
No | — | 4 (0.021) | — | — | 18 (0.040) | — | — | 22 (0.044) | — | — | 10 (0.030) | — | — | 6 (0.024) | —

aNQLS: National Quality of Life Survey 2021.

bThe percentages of the National Quality of Life Survey 2021 were calculated with sample weights.

cFor the National Quality of Life Survey, absolute values are not reported because we estimated weighted percentages.

dIVR: interactive voice response.

eVaccination data were compared with information published in Our World in Data. There should be at least one dose of vaccination. There is no available data from government sources.

fNot applicable.

For the 5 regions, we found that between 6% (5/90) and 12% (11/93) of respondents reported that someone in their household had had any COVID-19–related symptoms in the past 3 days. For the 5 regions, more than 50% (374/451) of the respondents reported that they did not have confirmed or suspected COVID-19 in the past. The COVID-19 vaccination rate varied across regions, from 78% (69/88) in the Amazon region to 96% (86/90) in the Caribbean region. Most regions reported close to 50% (221/451) full vaccination without the booster. Among those who had not been vaccinated, the main reason cited in all regions was concern about vaccine safety. Additionally, the top 3 main sources of information about COVID-19 vaccination were other networks or the internet, Facebook, and friends or relatives (336/451, 74%; Table 6).

Table 6. COVID-19 variables for pilot 2.
VariablesThe Caribbean region (n=90)The Pacific region (n=91)The Amazon region (n=88)The Orinoquia region (n=93)The Central region (n=89)
Symptoms related to COVID-19 (3 days ago), n (%)

Yes5 (6)6 (7)7 (7)11 (12)5 (6)

No85 (94)85 (93)82 (93)82 (88)84 (94)
Who has COVID-19, n (%)

Respondent1 (20)1 (17)3 (50)5 (46)2 (40)

Household4 (80)5 (83)3 (50)3 (27)2 (40)

Respondent and householda27 (3)20 (1)
Age of sick household member, mean (SD)17 (16.51)34 (30.76)22.33 (9.07)32.33 (24)17.5 (17.67)
Sex of sick household member (female)4 (100)3 (60)1 (33)2 (67)1 (50)
Community symptomsb, n (%)

Yes1 (50)1 (33)1 (20)6 (75)

No1 (50)2 (67)4 (80)2 (25)4 (100)
Have you ever had COVID-19, n (%)

Yes, confirmed16 (18)17 (19)12 (14)12 (13)20 (22)

Yes, suspected28 (31)25 (27)32 (36)27 (29)21 (24)

No46 (51)49 (54)44 (50)54 (58)48 (54)
Wanted to receive the COVID-19 vaccinec, n (%)

Yes70 (81)60 (80)51 (74)68 (81)69 (82)

No16 (19)15 (20)18 (26)16 (19)15 (18)
Would you like to receive the COVID-19 vaccine, n (%)

Yes1 (25)3 (19)7 (37)3 (33)1 (20)

No3 (75)13 (81)12 (63)6 (67)4 (80)
Why have you not been vaccinated, n (%)

Vaccines were not available12 (2)22 (2)

Vaccine does not work7 (1)11 (2)

Vaccine is not safe3 (75)9 (57)10 (53)4 (45)3 (60)

I do not think I need it2 (12)1 (11)1 (20)

Already got covid1 (5)

I have not been able to get the vaccine5 (26)2 (22)

Other reasons1 (25)2 (12)1 (5)1 (20)

Do not know or do not want to respond
Number of COVID-19 vaccine doses, n (%)

One dose19 (22)15 (20)18 (26)15 (18)19 (23)

Fully, not booster42 (50)49 (49)35 (51)50 (62)45 (54)

Fully, plus booster24 (28)23 (31)16 (23)16 (20)20 (24)
The brand of the first vaccine dose, n (%)

CoronaVac17 (20)15 (20)1 (2)19 (23)20 (24)

Pfizer23 (27)16 (21)14 (20)22 (26)19 (23)

Johnson & Johnson18 (21)25 (33)16 (23)11 (13)22 (26)

Astra Zeneca14 (16)7 (10)16 (23)16 (19)7 (8)

Moderna12 (14)11 (15)9 (13)14 (17)15 (18)

Another vaccine1 (1)11 (16)

Do not know1 (1)1 (1)2 (3)2 (2)1 (1)
The brand of the second vaccine dose, n (%)

CoronaVac12 (18)11 (19)10 (20)3 (4)17 (26)

Pfizer23 (35)15 (25)13 (25)20 (30)17 (26)

Johnson & Johnson8 (12)11 (19)8 (16)17 (26)10 (15)

Astra Zeneca11 (17)9 (15)8 (16)2 (3)7 (11)

Moderna10 (15)12 (20)11 (21)13 (20)12 (19)

Do not know2 (3)1 (2)1 (2)11 (17)2 (3)
The brand of the booster vaccine dose, n (%)

CoronaVac2 (8)3 (13)3 (19)2 (12)5 (1)

Pfizer12 (50)5 (22)8 (50)6 (38)9 (45)

Johnson & Johnson4 (18)4 (17)2 (12)2 (12)2 (10)

Astra Zeneca2 (8)1 (4)1 (7)3 (20)4 (20)

Moderna2 (8)3 (13)2 (12)2 (12)2 (10)

Do not know2 (8)7 (31)1 (6)1 (5)
Vaccination record for COVID-19, n (%)

Yes, I have it80 (89)71 (78)66 (75)80 (86)82 (92)

I do not have it8 (9)14 (16)16 (18)11 (12)7 (8)

I do not know it2 (2)3 (3)4 (5)2 (2)

I prefer not to answer3 (3)2 (2)
Main source of information about COVID-19 vaccination, n (%)

Facebook16 (18)19 (21)18 (20)20 (21)17 (19)

WhatsApp3 (3)8 (9)6 (7)9 (10)13 (15)

Other networks or internet49 (55)39 (43)35 (40)35 (38)45 (51)

Friends or relatives (word of mouth)9 (10)8 (9)9 (10)10 (11)7 (8)

Radio2 (2)5 (5)4 (5)3 (3)3 (3)

Television8 (9)9 (10)12 (14)10 (11)2 (2)

Print media2 (2)1 (1)1 (1)2 (2)1 (1)

I have not received any information1 (1)2 (2)3 (3)4 (4)1 (1)
Your understanding about COVID-19 vaccination seems, n (%)

Very clear63 (70)61 (67)52 (59)59 (63)58 (65)

It is more or less clear19 (21)19 (21)24 (27)17 (18)23 (26)

Little clear4 (4)9 (10)8 (9)11 (12)7 (8)

Not clear4 (4)2 (2)4 (5)6 (7)1 (1)
Do you trust the information you receive about COVID-19 vaccination, n (%)

Always44 (49)41 (45)38 (43)36 (39)46 (52)

Almost always30 (33)31 (34)26 (30)38 (41)18 (20)

Almost never10 (11)14 (15)19 (22)10 (11)24 (27)

Never6 (7)5 (5)5 (5)9 (9)1 (1)

aNot applicable.

bCommunity symptoms: Do you know about someone outside your household who has had in the last month any of the following symptoms: fever, sore throat, frequent cough, feeling sick, or loss of the sense of smell?

cWanted to receive the COVID-19 vaccine: Did you want to receive the COVID-19 vaccine?

Comparison of NQLS With Mobile Phone Survey Samples

Pilot 1

We found that the demographic variables had a similar distribution between the mobile phone survey and the NQLS. Even though respondents were more likely to be younger and female, the differences do not seem to be meaningful (we make no statements about statistical significance because we cannot test differences between the point estimates of the 2 surveys). Regarding geographical variables, we found that in both the IVR and NQLS samples, about a quarter of respondents reported living in rural areas. Importantly, IVR respondents tend to be more educated than the general population (Table 1).

Pilot 2

Overall, for the 5 regions, we found that the demographic variables had a distribution similar to the pilot 1 results. When comparing pilot 2 results with the NQLS, we found that in regions with lower access to mobile phone technology, some population groups (older, less educated, and rural residents) were less likely to be represented in the IVR survey than in regions with higher access to mobile phone technology. For example, the Central region concentrates around 60% of the country's population; in this region, the share of respondents reporting living in rural areas was 9 percentage points higher than the corresponding NQLS estimate. Meanwhile, in the Orinoco and Amazon river regions (each with around 1.5% of the population), the share of respondents reporting living in urban areas was 3 percentage points higher than the corresponding NQLS estimate.


Discussion

In this study, we conducted 2 different pilots to assess the feasibility of a national active syndromic surveillance system for COVID-19 in Colombia. In both pilots, we found that it is possible to deploy a rapid active syndromic surveillance system for respiratory disease at the national level in Colombia with fair contact rates (between 1% and 2%) and excellent cooperation rates (above 80%) when compared to other studies [6]. Evidence from Burkina Faso, a low-income country, has reported similar cooperation rates using IVR [30] in a nonemergency context. In another study, an IVR survey implemented during the COVID-19 pandemic reported refusal rates in Ecuador and Sri Lanka of 5.1% and 2%, respectively, whereas our refusal rate was close to 1% (6799/509,267). Another study conducted in 5 LMICs reported similar contact, response, cooperation, and refusal rates [8].

The sampling and operational characteristics allow public health officials to monitor changes in the prevalence of respiratory symptoms over time in order to detect and prepare for changes in respiratory disease burden. An additional benefit of this technology is that it does not require face-to-face interaction, reducing exposure for survey personnel; it can be deployed faster in challenging or low-density environments [26]; and it is generally less costly [9]. However, this system has 2 main limitations. On the demand side, respondents are clearly more likely to be younger, female, and more educated than the average population, implying that these groups might be overrepresented, as has been found in other research using IVR [16,31]. To reduce selection bias, measures must be taken to improve participation from less represented groups [28]. It is possible that over time, as younger cohorts age, issues around digital literacy and access to mobile phone numbers will subside.

On the supply side, it is also clear that connectivity and mobile phone availability might play a role in survey participation [31]. To reduce selection bias associated with this effect, the second pilot included quotas by region, revealing similar patterns to those found in the first pilot (more likely to find younger, female, and more educated respondents), but still providing closer demographic estimates to those found in external surveys. The strategy of using quotas can be further expanded to smaller areas of particular interest such as municipalities, if needed.

This study is limited mainly by the availability of government data against which to compare results, particularly data on COVID-19 and other respiratory diseases and on vaccination status by region. The available information about behavior and COVID-19 in Colombia is reported on the Johns Hopkins COVID Behaviors Dashboard, which is at the national level. For example, in March 2022, the Dashboard indicated that in Colombia about 40% of the unvaccinated wanted to get a vaccine, and our national-level data (pilot 1) found that this figure was about 44% (17/38). Likewise, the Dashboard reported an ever–COVID-19 prevalence of 43%, and our data found an ever-prevalence (confirmed or suspected) of 44% (161/367), providing similar estimates from very different data sources.

This evaluation shows that using IVR as a surveillance system may be useful and feasible for conducting syndromic surveillance amid a health emergency. This study also points to new areas for expanding this work, including developing quotas for demand-side variables that affect response rates (sex, age, and educational level), as well as sentinel systems through which people can report on the status of their households and neighborhoods to expand the surveillance network.

Acknowledgments

We thank the anonymous participants who responded to the survey and the Ministry of Health of Colombia and the National Institute of Health for their request to carry out this evaluation, and for their guidance and advice throughout the development of the project. Finally, many thanks to engageSPARK, the mobile phone survey provider, for their outstanding role in delivering the surveys and continuous support. This evaluation was supported by the Inter-American Development Bank (137799). The content is the responsibility of the authors and does not necessarily reflect the views of the Inter-American Development Bank.

Data Availability

The anonymized surveillance data used in this study are available upon written request.

Conflicts of Interest

None declared.

  1. Kennedy B, Fitipaldi H, Hammar U, Maziarz M, Tsereteli N, Oskolkov N, et al. App-based COVID-19 syndromic surveillance and prediction of hospital admissions in COVID Symptom Study Sweden. Nat Commun. 2022;13(1):2110. [FREE Full text] [CrossRef] [Medline]
  2. Mian MS, Razaq L, Khan S, Hussain N, Razaq M. Pathological findings and management of COVID-19 patients: a brief overview of modern-day pandemic. Cureus. 2020;12(5):e8136. [FREE Full text] [CrossRef] [Medline]
  3. Graham LM. Observations from the COVID-19 pandemic. Pediatr Allergy Immunol Pulmonol. 2020;33(2):45-46. [FREE Full text] [CrossRef] [Medline]
  4. Güemes A, Ray S, Aboumerhi K, Desjardins MR, Kvit A, Corrigan AE, et al. A syndromic surveillance tool to detect anomalous clusters of COVID-19 symptoms in the United States. Sci Rep. 2021;11(1):4660. [FREE Full text] [CrossRef] [Medline]
  5. World Health Organization. Public health surveillance for COVID-19: interim guidance, 16 December 2020; Report No.: WHO/2019-nCoV/SurveillanceGuidance/2020.8. World Health Organization. 2020. URL: https://apps.who.int/iris/handle/10665/337897 [accessed 2022-06-28]
  6. Phadnis R, Wickramasinghe C, Zevallos JC, Davlin S, Kumarapeli V, Lea V, et al. Leveraging mobile phone surveys during the COVID-19 pandemic in Ecuador and Sri Lanka: methods, timeline and findings. PLoS One. 2021;16(4):e0250171. [FREE Full text] [CrossRef] [Medline]
  7. Nagpal K, Mathur MR, Biswas A, Fraker A. Who do phone surveys miss, and how to reduce exclusion: recommendations from phone surveys in nine Indian states. BMJ Glob Health. 2021;6(Suppl 5):e005610. [FREE Full text] [CrossRef] [Medline]
  8. Song Y, Phadnis R, Favaloro J, Lee J, Lau CQ, Moreira M, et al. Using mobile phone data collection tool, Surveda, for noncommunicable disease surveillance in five low- and middle-income countries. Online J Public Health Inform. 2020;12(2):e13. [FREE Full text] [CrossRef] [Medline]
  9. Vecino-Ortiz AI, Nagarajan M, Katumba KR, Akhter S, Tweheyo R, Gibson DG, et al. A cost study for mobile phone health surveys using interactive voice response for assessing risk factors of noncommunicable diseases. Popul Health Metr. 2021;19(1):32. [FREE Full text] [CrossRef] [Medline]
  10. Adeniyi EA, Awotunde J, Ogundokun R, Kolawole PO, Abiodun M, Adeniyi AA. Mobile health application and COVID-19: opportunities and challenges. Semantic Scholar. 2020. URL: https:/​/www.​semanticscholar.org/​paper/​MOBILE-HEALTH-APPLICATION-AND-COVID-19%3A-AND-Adeniyi-Awotunde/​a29aeaf1b3653feab5830a6f678f697159d5642f [accessed 2022-06-28]
  11. The world's richest source of ICT statistics and regulatory information. ITU DataHub. URL: https://datahub.itu.int/ [accessed 2022-06-28]
  12. Timmers T, Janssen L, Stohr J, Murk JL, Berrevoets MAH. Using eHealth to support COVID-19 education, self-assessment, and symptom monitoring in the Netherlands: observational study. JMIR mHealth uHealth. 2020;8(6):e19822. [FREE Full text] [CrossRef] [Medline]
  13. Mahmud AS, Chowdhury S, Sojib KH, Chowdhury A, Quader MT, Paul S, et al. Participatory syndromic surveillance as a tool for tracking COVID-19 in Bangladesh. Epidemics. 2021;35:100462. [FREE Full text] [CrossRef] [Medline]
  14. Community based self-administered syndromic surveillance mobile application—a strategic approach to monitor COVID situation at micro level. National Journal of Community Medicine. URL: https://www.njcmindia.com/index.php/file/article/view/197 [accessed 2022-06-28]
  15. Greenleaf AR, Gibson DG, Khattar C, Labrique AB, Pariyo GW. Building the evidence base for remote data collection in low- and middle-income countries: comparing reliability and accuracy across survey modalities. J Med Internet Res. 2017;19(5):e140. [FREE Full text] [CrossRef] [Medline]
  16. Pariyo GW, Greenleaf AR, Gibson DG, Ali J, Selig H, Labrique AB, et al. Does mobile phone survey method matter? Reliability of computer-assisted telephone interviews and interactive voice response non-communicable diseases risk factor surveys in low and middle income countries. PLoS One. 2019;14(4):e0214450. [FREE Full text] [CrossRef] [Medline]
  17. Gibson DG, Farrenkopf BA, Pereira A, Labrique AB, Pariyo GW. The development of an interactive voice response survey for noncommunicable disease risk factor estimation: technical assessment and cognitive testing. J Med Internet Res. 2017;19(5):e112. [FREE Full text] [CrossRef] [Medline]
  18. Wolter K, Chowdhury S, Kelly J. Chapter 7—Design, conduct, and analysis of random-digit dialing surveys. In: Rao CR, editor. Handbook of Statistics. Edinburgh, London, Oxford, UK. Elsevier; 2009:125-154.
  19. Silva J, Solano D, Fernandez C, Romero L, Vargas VJ. Privacy preserving, protection of personal data, and big data: a review of the Colombia case. Procedia Computer Science. 2019. URL: http://hdl.handle.net/11323/4836 [accessed 2022-09-27]
  20. Gibson DG, Kibria GMA, Pariyo GW, Ahmed S, Ali J, Labrique AB, et al. Promised and lottery airtime incentives to improve interactive voice response survey participation among adults in Bangladesh and Uganda: randomized controlled trial. J Med Internet Res. 2022;24(5):e36943. [FREE Full text] [CrossRef] [Medline]
  21. Vecino-Ortiz AI, Bardey D, Castano-Yepes R. Hospital variation in cesarean delivery: a multilevel analysis. Value Health Reg Issues. 2015;8:116-121. [FREE Full text] [CrossRef] [Medline]
  22. Ministerio de Salud y Protección Social. Encuesta Nacional de Salud Pública. 2007. URL: https:/​/www.​minsalud.gov.co/​sites/​rid/​Lists/​BibliotecaDigital/​RIDE/​VS/​ED/​GCFI/​ENCUESTA%20NACIONAL.​pdf [accessed 2019-05-18]
  23. Vecino-Ortiz AI. Determinants of demand for antenatal care in Colombia. Health Policy. 2008;86(2-3):363-372. [FREE Full text] [CrossRef] [Medline]
  24. Proyecciones de población. Departamento Administrativo Nacional de Estadística (DANE). URL: https:/​/www.​dane.gov.co/​index.php/​estadisticas-por-tema/​demografia-y-poblacion/​proyecciones-de-poblacion [accessed 2019-05-04]
  25. Gibson DG, Pariyo GW, Wosu AC, Greenleaf AR, Ali J, Ahmed S, et al. Evaluation of mechanisms to improve performance of mobile phone surveys in low- and middle-income countries: research protocol. JMIR Res Protoc. 2017;6(5):e81. [FREE Full text] [CrossRef] [Medline]
  26. Gibson DG, Pereira A, Farrenkopf BA, Labrique AB, Pariyo GW, Hyder AA. Mobile phone surveys for collecting population-level estimates in low- and middle-income countries: a literature review. J Med Internet Res. 2017;19(5):e139. [FREE Full text] [CrossRef] [Medline]
  27. Goyes AB, Gaona JV, López VB, Niño AS, Díaz GAH. Prevalencia de síntomas respiratorios y riesgo de obstrucción al flujo aéreo en ginebra—valle del cauca. Rev Med. 2017;25(2):42-54. [CrossRef]
  28. Labrique A, Blynn E, Ahmed S, Gibson D, Pariyo G, Hyder AA. Health surveys using mobile phones in developing countries: automated active strata monitoring and other statistical considerations for improving precision and reducing biases. J Med Internet Res. 2017;19(5):e121. [FREE Full text] [CrossRef] [Medline]
  29. Standard definitions final dispositions of case codes and outcome rates for surveys. American Association for Public Opinion Research. 2016. URL: https://aapor.org/standards-and-ethics/standard-definitions/ [accessed 2022-09-28]
  30. Greenleaf AR, Gadiaga A, Guiella G, Turke S, Battle N, Ahmed S, et al. Comparability of modern contraceptive use estimates between a face-to-face survey and a cellphone survey among women in Burkina Faso. PLoS One. 2020;15(5):e0231819. [FREE Full text] [CrossRef] [Medline]
  31. Torres-Quintero A, Vega A, Gibson DG, Rodriguez-Patarroyo M, Puerto S, Pariyo GW, et al. Adaptation of a mobile phone health survey for risk factors for noncommunicable diseases in Colombia: a qualitative study. Glob Health Action. Dec 31, 2020;13(1):1809841-1809893. [FREE Full text] [CrossRef] [Medline]


Abbreviations

IVR: interactive voice response
LMIC: low- and middle-income country
MPS: mobile phone survey
NQLS: National Quality of Life Survey 2021
STEPS: World Health Organization Stepwise Approach to Noncommunicable Disease Risk Factor Surveillance
WHO: World Health Organization


Edited by G Eysenbach; submitted 22.06.23; peer-reviewed by DK Yon; comments to author 22.12.23; revised version received 03.03.24; accepted 08.03.24; published 17.10.24.

Copyright

©Andres I Vecino-Ortiz, Deivis Nicolas Guzman-Tordecilla, Vidhi Maniar, Sandra Agudelo-Londoño, Oscar Franco-Suarez, Nathaly Aya Pastrana, Mariana Rodríguez-Patarroyo, Marino Mejía-Rocha, Jaime Cardona, Mariangela Chavez Chamorro, Dustin Gibson. Originally published in the Journal of Medical Internet Research (https://www.jmir.org), 17.10.2024.

This is an open-access article distributed under the terms of the Creative Commons Attribution License (https://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work, first published in the Journal of Medical Internet Research (ISSN 1438-8871), is properly cited. The complete bibliographic information, a link to the original publication on https://www.jmir.org/, as well as this copyright and license information must be included.