Using Fitness Trackers and Smartwatches to Measure Physical Activity in Research: Analysis of Consumer Wrist-Worn Wearables

Original Paper

1Department of Community Medicine, University of Tromsø – The Arctic University of Norway, Tromsø, Norway

2Department of Computer Science, University of Tromsø – The Arctic University of Norway, Tromsø, Norway

3Norwegian Centre for E-health Research, University Hospital of North Norway, Tromsø, Norway

4Spin-Off Company and Research Results Commercialization Center, 1st Faculty of Medicine, Charles University in Prague, Prague, Czech Republic

5Department of Health and Care Sciences, University of Tromsø – The Arctic University of Norway, Tromsø, Norway

Corresponding Author:

André Henriksen, MSc (Comp Sci), MBA

Department of Community Medicine

University of Tromsø – The Arctic University of Norway

Postboks 6050 Langnes

Tromsø, 9037

Norway

Phone: 47 77644000

Email: andre.henriksen@uit.no


Background: New fitness trackers and smartwatches are released to the consumer market every year. These devices are equipped with different sensors, algorithms, and accompanying mobile apps. With recent advances in mobile sensor technology, privately collected physical activity data can be used as an addition to existing methods for health data collection in research. Furthermore, data collected from these devices have possible applications in patient diagnostics and treatment. With an increasing number of diverse brands, there is a need for an overview of device sensor support, as well as device applicability in research projects.

Objective: The objective of this study was to examine the availability of wrist-worn fitness wearables and analyze the availability of relevant fitness sensors from 2011 to 2017. Furthermore, the study was designed to assess brand usage in research projects, compare common brands in terms of developer access to collected health data, and identify features to consider when deciding which brand to use in future research.

Methods: We searched for devices and brand names in six wearable device databases. For each brand, we identified additional devices on official brand websites. The search was limited to wrist-worn fitness wearables with accelerometers, for which we mapped brand, release year, and supported sensors relevant for fitness tracking. In addition, we conducted a Medical Literature Analysis and Retrieval System Online (MEDLINE) and ClinicalTrials search to determine brand usage in research projects. Finally, we investigated developer accessibility to the health data collected by identified brands.

Results: We identified 423 unique devices from 132 different brands. Forty-seven percent of brands released only one device. Introduction of new brands peaked in 2014, and the highest number of new devices was introduced in 2015. Sensor support increased every year, and in addition to the accelerometer, a photoplethysmograph, for estimating heart rate, was the most common sensor. Out of the brands currently available, the five most often used in research projects are Fitbit, Garmin, Misfit, Apple, and Polar. Fitbit is used in twice as many validation studies as any other brand and is registered in ClinicalTrials studies 10 times as often as other brands.

Conclusions: The wearable landscape is in constant change. New devices and brands are released every year, promising improved measurements and user experience. At the same time, other brands disappear from the consumer market for various reasons. Advances in device quality offer new opportunities for research. However, only a few well-established brands are frequently used in research projects, and even fewer are thoroughly validated.

J Med Internet Res 2018;20(3):e110

doi:10.2196/jmir.9157

Background

The World Health Organization recommends 150 min of moderate intensity physical activity (PA) each week for adults and 60 min for children and adolescents [1]. However, 25% of adults and more than 80% of adolescents do not achieve the recommended PA targets [1]. Results from the Tromsø Study, the longest running population study in Norway, show that only 30.4% of women and 22.0% of men reach the recommended target [2].

Low PA is currently the fourth leading risk factor for mortality worldwide [3]. Even though there is limited evidence that using wearable fitness trackers will improve health [4,5], these devices are still popular, and new fitness devices appear on the consumer market regularly. In 2016, vendors shipped 102 million devices worldwide, compared with 82 million in 2015 [6]. Fifty-seven percent of these devices were sold by the top five brands: Fitbit, Xiaomi, Apple, Garmin, and Samsung. The first quarter of 2017 shows an increase of 18% in devices sold, compared with the same period in 2016 [7]. With a large number of available devices and brands, it is difficult to navigate through an ever-growing list of brands and devices with different capabilities, price, and quality.

Available sensors and internal interpreting algorithms determine device output. Sensor data are, in most devices, reduced to a limited set of metrics before being transferred to the user’s mobile phone. In addition, limited storage space affects how long the device can collect data before such a transfer is needed. Data are stored locally and, in many cases, uploaded to brand-specific or open cloud–based health repositories. Accessing and comparing these data from third-party apps is not always possible. These interoperability challenges were recently identified in a study by Arriba-Pérez et al [8]. They suggested ways to handle these issues, but they did not make any brand or device recommendations. Several studies have compared activity-tracking wearables. As an example, Kaewkannate and Kim [9] compared four popular fitness trackers in 2016, both objectively and subjectively. Data were thoroughly collected, but because of the rapid release of new devices, these four devices will be among the most popular for only a relatively short time. A comparison of brands is also of interest because brands from larger companies are, compared with small start-ups and crowd-funded brands, likely to survive longer. In addition, it is of interest to know which brands support the various available programming options. Sanders et al [10] did a literature review of articles using wearables for health self-monitoring and for sedentary behavior and PA detection. They reviewed various aspects of these devices, but they gave no details about device sensor support and suitability in research.

The objective of this study was to examine how the consumer market for wearables has evolved, and analyze and summarize available devices that can measure PA and heart rate (HR). Moreover, we aim to identify brands that are used extensively in research projects, and compare and consider their relevance for future studies.

Sensors

A plethora of devices promises to measure PA in new and improved ways. These devices use different sensors and algorithms to calculate human-readable metrics based on sensor output. Traditional step counters use pedometers to detect daily step counts. Although cheap and energy efficient, pedometers are not as accurate as accelerometers, which are the current standard for collecting PA data [11]. All modern fitness trackers and smartwatches have an accelerometer. Compared with research tools (eg, ActiGraph [12]), these devices are considered less accurate for some measurements [13,14]. However, they are generally less invasive, cheaper, have more functionality, are more user-friendly, and are increasingly being used in research. Most accelerometer-based fitness wearables measure acceleration in three directions [15] and can be used to estimate type of movement, count steps, calculate energy expenditure (EE) and energy intensity, and estimate sleep patterns and more. The validity and reliability of these metrics vary. Evenson et al [14] did a review in 2015 and found high validity for steps but low validity for EE and sleep. Furthermore, they found reliability for steps, distance, EE, and sleep to be high for some devices.
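To make the accelerometer-based approach concrete, the sketch below counts steps by detecting peaks in the magnitude of a 3-axis accelerometer signal. It is illustrative only and not any vendor's algorithm; the function name, sampling rate, and thresholds are assumptions chosen for readability.

```python
# Illustrative sketch (not any vendor's algorithm): count steps by detecting
# peaks in the magnitude of a 3-axis accelerometer signal.
import numpy as np

def count_steps(ax, ay, az, fs=50, threshold=0.2, min_interval_s=0.3):
    """ax, ay, az: acceleration in g, sampled at fs Hz."""
    mag = np.sqrt(np.asarray(ax)**2 + np.asarray(ay)**2 + np.asarray(az)**2)
    mag -= mag.mean()                                  # remove the ~1 g gravity offset
    window = max(1, int(0.1 * fs))                     # ~100 ms moving-average smoothing
    smooth = np.convolve(mag, np.ones(window) / window, mode="same")
    steps, last_peak = 0, -np.inf
    for i in range(1, len(smooth) - 1):
        is_peak = smooth[i - 1] < smooth[i] > smooth[i + 1]
        if is_peak and smooth[i] > threshold and (i - last_peak) / fs >= min_interval_s:
            steps += 1
            last_peak = i
    return steps
```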

In addition, some wearables have gyroscopes, magnetometers, barometers, and altimeters. A gyroscope can potentially increase device accuracy by measuring rotation, that is, orientation and angular velocity, and thereby better estimate which activity type a person is performing [16]. A magnetometer is a digital compass [15] and can improve motion tracking accuracy by detecting the orientation of the device relative to magnetic north. Magnetometers improve accuracy by compensating for gyroscope drift, a problem where the estimated rotation slowly drifts from the actual motion and must be corrected regularly. Accelerometers, gyroscopes, and magnetometers are often combined into an inertial measurement unit (IMU). Most mobile phones use IMUs to calculate orientation, and an increasing number of fitness wearables include this unit to give more accurate metrics. Barometers or altimeters detect changes in altitude [15] and can be used to improve some metrics (eg, EE), as well as report additional metrics (eg, climbed floors).
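As an illustration of how an IMU can combine these signals, the following sketch implements a basic complementary filter: the gyroscope is integrated for responsive orientation updates, and the accelerometer's gravity estimate slowly corrects the drift (a magnetometer would correct heading drift in the same way). The function and parameter values are hypothetical and not taken from any device firmware.

```python
# Hypothetical sketch of IMU sensor fusion with a complementary filter.
import math

def complementary_filter(accel, gyro, dt, pitch=0.0, roll=0.0, alpha=0.98):
    """accel: (ax, ay, az) in g; gyro: (gx, gy, gz) in rad/s; dt in seconds."""
    ax, ay, az = accel
    gx, gy, _ = gyro
    pitch_acc = math.atan2(-ax, math.sqrt(ay * ay + az * az))  # tilt implied by gravity
    roll_acc = math.atan2(ay, az)
    # blend: mostly integrated gyroscope, nudged toward the accelerometer reference
    pitch = alpha * (pitch + gy * dt) + (1 - alpha) * pitch_acc
    roll = alpha * (roll + gx * dt) + (1 - alpha) * roll_acc
    return pitch, roll
```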

Photoplethysmography (PPG) is a relatively new technique in wearables. PPG is an optical technique to estimate HR by monitoring changes in blood volume beneath the skin [17]. A light-emitting diode projects light onto the skin, and the amount of light reflected back to the sensor varies with blood volume and, thus, with each heartbeat. However, movement, ambient light, and tissue compression affect the light, resulting in signal noise, and cleaning algorithms often use accelerometer data to assist HR estimation [18]. There is some evidence that gyroscopes could be used [19] to reduce PPG signal noise, so we are likely to see more devices in the future equipped with PPG sensors. To further enrich the PA data collection, some devices have a built-in global positioning system (GPS) receiver. This is especially true for high-end fitness trackers and sports watches specifically targeting physically active people. With a GPS, it is possible to track more data, including position, speed, and altitude.
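A simplified sketch of accelerometer-assisted PPG processing is shown below: the HR is taken as the dominant frequency of the PPG signal within a plausible band, after masking bins close to the dominant motion frequency seen by the accelerometer. Commercial cleaning algorithms are far more elaborate; the sampling rate, band limits, and masking width here are illustrative assumptions.

```python
# Illustrative sketch: estimate heart rate from a PPG window by picking the
# dominant spectral peak, after masking bins near the dominant motion frequency
# measured by the accelerometer (a crude motion-artifact guard).
import numpy as np

def estimate_hr_bpm(ppg, accel_mag, fs=25.0, lo=0.7, hi=3.5):
    """ppg and accel_mag: equal-length windows sampled at fs Hz."""
    ppg = np.asarray(ppg, dtype=float)
    accel_mag = np.asarray(accel_mag, dtype=float)
    freqs = np.fft.rfftfreq(len(ppg), d=1.0 / fs)
    ppg_spec = np.abs(np.fft.rfft(ppg - ppg.mean()))
    acc_spec = np.abs(np.fft.rfft(accel_mag - accel_mag.mean()))
    band = (freqs >= lo) & (freqs <= hi)               # 42-210 beats per minute
    motion_f = freqs[band][np.argmax(acc_spec[band])]  # dominant motion frequency
    candidates = band & (np.abs(freqs - motion_f) > 0.15)
    if not candidates.any():                           # fall back if everything was masked
        candidates = band
    return float(freqs[candidates][np.argmax(ppg_spec[candidates])] * 60.0)
```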

Algorithms and Mobile Apps

Raw data from sensors must be converted into readable metrics to be meaningful for the user. Many devices only display a limited set of metrics directly on the device (eg, today’s step count or current HR) and rely on an accompanying mobile app to show the full range of available metrics (eg, historic daily step count and detailed HR data). Although the physical sensors in these devices are very similar, the algorithms that interpret sensor output are unique for most vendors. These algorithms are often company secrets, and they can be changed without notice. In addition, the quality and supported features of the accompanying mobile apps vary, and the total user experience will therefore differ. Each additional sensor included in a device can add new types of metrics for the user or supply internal algorithms with extra data to improve the accuracy of already available metric types. However, additional sensors affect price and power consumption.
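As a toy example of turning sensor-derived metrics into a user-facing number, the sketch below converts a step count into an EE estimate using the common MET convention (kcal ≈ MET × body mass in kg × hours). The cadence cut-points and MET values are illustrative assumptions, not any vendor's proprietary algorithm.

```python
# Toy example: convert a step count into an energy expenditure estimate using
# the MET convention; cadence cut-points and MET values are assumptions only.
def energy_expenditure_kcal(steps, minutes, weight_kg):
    cadence = steps / minutes if minutes else 0.0   # steps per minute
    if cadence < 80:
        met = 2.0        # light walking
    elif cadence < 120:
        met = 3.5        # brisk walking
    else:
        met = 7.0        # running
    return met * weight_kg * (minutes / 60.0)       # kcal ~ MET x kg x hours

print(energy_expenditure_kcal(steps=6000, minutes=60, weight_kg=75))  # -> 262.5
```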

Device Types

There are many similarities between different types of devices, and they may be difficult to categorize. We will use the term wearable in this paper as a common term for wrist-worn devices that can track and share PA data with a mobile phone.

A smartwatch is a wrist-worn device that, mostly, acts as an extension to a mobile phone and can show notifications and track PA and related metrics. Modern smartwatches often include a touch screen and can support advanced features and display high resolution activity trends [15]. Fitness trackers (ie, smart bands or fitness bands), normally worn on the wrist or hip, are devices more dedicated to PA tracking. A fitness tracker is typically cheaper than a smartwatch because of less expensive hardware and often fewer sensors. Because of this, it generally also has better battery life, but it has only a limited interface for displaying tracking results [15].

Other terms are also used, for example, sports watch and GPS watch, which can be considered hybrids of smartwatches and fitness trackers. In addition, there are hybrid watches (ie, hybrid smartwatches), which combine a traditional clockwork and analogue display with a built-in accelerometer. An accompanying mobile app is needed to access most data, but daily step counts are often represented as an analogue gauge on the watch face.

Wearable Usage Scenario

Wearables present a new alternative for tracking PA in research (compared with, eg, ActiGraph), especially when it is desired to collect measurements over a prolonged period of time. In an intervention study, continuous data collection from wearables would allow researchers to better track changes in PA and adjust the intervention accordingly. Wearables can also be used in epidemiological research as a tool for tracking PA over an extended period. This could reveal detailed PA changes in a population over time. In both scenarios, there are several potentially important requirements to consider when choosing a device for the study, including usability, battery life, price, accuracy, durability, look and feel, and data access possibilities.


Search Strategies

Brands, Devices, and Sensors

We searched six databases to create a list of relevant wearable devices: The Queen’s University’s Wearable Device Inventory [20], The Vandrico Wearables database [21], GsmArena [22], Wearables.com [23], SpecBucket [24], and PrisGuide [25,26]. We only used publicly available information when comparing devices. We did the search from May 15, 2017 to July 1, 2017.

We identified wearables in two steps. In step one, we identified and searched the six defined databases. In step two, we extracted all brands from the list of devices identified in step one and examined brand websites for additional devices. If we found the same device in several databases with conflicting information, we manually identified the correct information from the device’s official website or other online sources (eg, Wikipedia and Google search). We removed duplicates and devices not fitting the inclusion criteria.

Brand Usage in Research

We searched Ovid MEDLINE on September 30, 2017 to determine how often the most relevant brands were used in previous studies. For each search, we performed a keyword search with no limitations set. We divided our findings into validation and reliability studies and data collection studies.

To decide which brands to consider most relevant, we did two sets of searches. In the first set, we created a brand-specific keyword search for brands that (1) were among the five best-selling brands in 2015 or 2016 or (2) had released 10 or more unique devices. From the resulting list of articles, we screened the title, abstract, and method section. This screening was done to (1) exclude articles out of scope and (2) identify additional brands used in these studies. We compiled a list of these brands and performed a second set of searches, one for each newly identified brand. Eleven brands were finally included. The specific keyword search used for each brand is given in the Results section, where we summarize our findings.

We also searched the US National Library of Medicine database of clinical studies through the ClinicalTrials website, using the same 11 keyword searches, to determine brand usage in ongoing projects. One author screened the articles, as well as the project descriptions in ClinicalTrials.

Brand Developer Possibilities

To determine how relevant a specific brand is when planning a new research project, we reviewed the 11 identified brands and considered available developer options, supported mobile phone environments, and options for health data storage. We especially reviewed availability of an application programming interface (API) and a software development kit (SDK). Information was collected from Google Play, Apple’s App Store, and official brand websites. Information retrieval was done in September 2017.

Inclusion and Exclusion Criteria

Brands, Devices, and Sensors

The study is limited to wrist-worn consumer devices that utilize accelerometers to measure PA. Devices capable of collecting HR from the wrist using an optical sensor were tagged as PPG devices. Devices were tagged as GPS devices only if they had a built-in GPS tracker. We only included devices meant for personal use, designed to be worn continuously (24/7), and capable of sharing data with mobile phones through Bluetooth. The wrist-worn limitation was added because hip-worn devices are not normally worn during the night (ie, not 24/7). Only devices released before July 1, 2017 were included. We excluded hybrid watches because most hybrid vendors make a large number of watch variations, with what seems to be the same hardware. In addition, these watches are mostly available through high-end suppliers of traditional watches, at a price point that would prevent researchers from considering their use in a large study.

Brand Usage in Research

Due to the large number of available brands, we limited our search to include only the 11 brands already identified as relevant. We excluded brands that are no longer available (ie, company shut down). Review studies were also excluded.

Brand Developer Possibilities

When reviewing brand relevance in research, we only reviewed developer capabilities for the 11 brands we had already included in the list of relevant brands. We set the additional limitation that the brand was used in at least one article in Ovid MEDLINE.

Device Categorization, Data Collection, and Reporting Categories

When collecting information about wearables, we categorized them into three groups:

  1. Smartwatches: a device was tagged as a smartwatch if
    • It supported mobile phone notifications, and the vendor described it as a smart watch, or if
    • It had a touch screen and was not explicitly described as a fitness tracker by the vendor.
  2. Fitness trackers: we classified a device as a fitness tracker if
    • Its main purpose was to track PA, or if
    • The vendor called it a fitness tracker, or if
    • The device did not support notifications from the connected mobile phone (eg, incoming calls or texts).
  3. Hybrid watches: to be considered a hybrid watch, the device had to have an analogue clockwork with a built-in digital accelerometer.

We collected the following variables for each device: brand name, device name, year of release, country of origin, device type (eg, fitness tracker), and whether they had a built-in accelerometer, gyroscope, magnetometer, barometer or altimeter, GPS, and PPG.
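A simple encoding of the categorization rules and collected variables above could look like the following sketch; the field names and the exact rule ordering are illustrative choices, not a formal part of the protocol.

```python
# Sketch of the device categorization rules, applied to the collected variables.
from dataclasses import dataclass
from typing import Optional, Set

@dataclass
class Device:
    brand: str
    name: str
    release_year: Optional[int]
    vendor_label: str              # how the vendor markets it, eg "smartwatch"
    supports_notifications: bool
    has_touch_screen: bool
    analogue_clockwork: bool
    sensors: Set[str]              # eg {"accelerometer", "ppg", "gps"}

def categorize(device: Device) -> str:
    # Hybrid watch: analogue clockwork plus a built-in accelerometer
    if device.analogue_clockwork and "accelerometer" in device.sensors:
        return "hybrid watch"
    # Smartwatch: notifications and marketed as a smartwatch, or touch screen
    # and not explicitly marketed as a fitness tracker
    if (device.supports_notifications and device.vendor_label == "smartwatch") or \
       (device.has_touch_screen and device.vendor_label != "fitness tracker"):
        return "smartwatch"
    # Fitness tracker: PA-focused, or no phone notification support
    return "fitness tracker"
```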

We looked at three aspects of the devices we identified and reported under three categories:

  1. Metrics and trends: in this category, we described the status for available brands, devices, and sensors, as well as reviewed trends in sensor availability over time.
  2. Brand usage in research: in this category, we searched Ovid MEDLINE and ClinicalTrials and determined which brands are most used in a research setting.
  3. Brand developer possibilities: in this category, we reviewed software integration platforms and mobile platform support for the most relevant brands.

Relevant Devices

An overview of the device search process is given in Figure 1. We found 572 devices by searching online and offline databases and 131 additional devices by visiting the official websites for each identified brand, totaling 703 devices. Removing duplicates left 567 unique devices. These were screened for variation, that is, the same device with different design. After excluding 41 because of variation, 526 remained and were screened for eligibility. We removed 103 devices for not fitting the inclusion criteria. The remaining 423 devices were included in the study.

Brands, Devices, and Sensors

Brands

We identified 423 unique wearables, distributed between 132 different brands. Almost half the brands (47.0%, 62/132) had only one device. Moreover, 75.0% (99/132) of brands had three or fewer devices, and 83.3% (110/132) had five or fewer devices. Brands originated from 23 different countries, but the United States (43.2%, 57/132) and China (16.7%, 22/132 for mainland China; 19.0%, 25/132 including Taiwan) accounted for the largest shares of brand origins. Each remaining country represented between 0.8% (1/132) and 5.3% (7/132) of brands.

As the market has grown and wearable technology has become increasingly popular, a number of new brands have appeared on the market. In 2011, there were only three brands available. There was a small increase in brand count in 2012 and 2013, but in 2014, we saw the largest increase with 41 new brands. The number of new brands started to decrease in 2015, with 36 new brands in 2015 and 23 in 2016. Only three new brands have been introduced in 2017, but this number only represents the first 6 months of 2017. The final count for 2017 will likely be higher. An overview of the number of new brands that appeared on the market between 2011 and 2017 is given in Figure 2. Note that some companies are no longer active and, for 17 devices, we could not determine release year.

Most brands had only a small number of wearables, but some produced many more. The brand with the most unique wearables was Garmin (United States), with 40 different devices. No.1 (China) introduced the second highest number of wearables, with 19 devices. An overview of the 22 (out of 132) brands that released more than five devices is given in Table 1. Seven of these 22 brands originated in the United States, five (six including Taiwan) originated in China, and two originated in South Korea. All other countries are represented only once. Some of these brands are no longer active (eg, Pebble and Jawbone).

Devices

Three devices were released in 2011 (earliest year), seven in 2012, 30 in 2013, and 87 in 2014. The year with the highest number of new wearables was 2015, with 121 new devices. In 2016, 120 new devices were released, making it the first year with a decrease in the number of new wearables. The number of new and accumulated devices from 2011 to 2017 is summarized in Table 2. The last column (unknown) represents devices for which we could not identify the release year. The above numbers represent the total number of new devices. When grouped into fitness trackers and smartwatches, smartwatches are slightly overrepresented among new devices. Up until 2014, about half of the devices were smartwatches. In 2015 and 2016, smartwatches represented 59.3% (143/241) of new devices, whereas fitness trackers represented 40.6% (98/241).

Sensors

The number of sensors included in new devices has increased in the last few years. Since 2015, the order of the most common sensors has consistently been PPG, GPS, gyroscope, magnetometer, and barometer or altimeter. In addition, these sensors have had a steady increase in availability in the same period. For 2017, 71% (27/38) of new devices included a PPG sensor, 50% (19/38) included a GPS, 39% (15/38) included a gyroscope, 34% (13/38) included a magnetometer, and 32% (12/38) included a barometer or altimeter. Figure 3 gives an overview of the number of devices released each year that include each sensor, as a percentage of the total number of devices released that year. Devices with more than one sensor are represented once for each sensor they include.

In total, since 2011, 38.5% (163/423) of wearables have been equipped with only one sensor (accelerometer). Moreover, 29.8% (126/423) of devices had two sensors, 12.1% (51/423) had three sensors, 11.1% (47/423) had four sensors, and 6.4% (27/423) had five sensors. Only 2.1% (9/423) of devices had all six sensors. In Table 3, these numbers are broken down by sensor combination. Some sensor combinations do not exist and are excluded.

Figure 1. Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) flowchart.
Figure 2. Number of new and aggregated available brands by year.
Table 1. Total device count for brands with six or more wearables.

Brand | Country | Devices
Garmin | United States | 40
Fitbit | United States | 9
Misfit | United States | 8
LifeTrak | United States | 7
iFit | United States | 6
Jawbone | United States | 6
Pebble | United States | 6
No. 1 | China | 19
Omate | China | 9
Zeblaze | China | 9
Huawei | China | 8
Oumax | China | 7
Mobile Action | Taiwan | 8
Samsung | South Korea | 12
LG | South Korea | 7
WorldSim | England | 7
Polar | Finland | 11
Technaxx | Germany | 6
Awatch | Italy | 7
Epson | Japan | 7
TomTom | Netherlands | 7
MyKronoz | Switzerland | 18

Total brand count for the United States=7, China and Taiwan=6, and South Korea=2. All other countries are represented only once.

Table 2. Number of new and accumulated devices by year.

Devices | 2011 | 2012 | 2013 | 2014 | 2015 | 2016 | 2017 | Unknown
New | 3 | 7 | 30 | 87 | 121 | 120 | 38 | 17
Accumulated | 3 | 10 | 40 | 127 | 248 | 368 | 406 | 423
Figure 3. Percentage of devices released each year, supporting each sensor. GPS: global positioning system; PPG: photoplethysmography.

Brand Usage in Research

The top five vendors in 2015 [27] and 2016 [6], in units sold, were Fitbit, Xiaomi, Apple, Garmin, and Samsung. Brands with more than 10 unique wearables include Garmin, No.1, MyKronoz, Samsung, and Polar. These eight, and additional brands identified during the MEDLINE and ClinicalTrials searches, were considered. We did not find any publications or active clinical trials that used devices from No.1 or MyKronoz. Devices from Basis, BodyMedia, Pebble, Jawbone, Microsoft, and Nike were also used in some of the identified studies, but these brands no longer produce wearables within the scope of this paper and were excluded from further analysis.

The MEDLINE search resulted in 81 included studies that we divided into two groups: (1) validation and reliability studies and (2) data collection studies. Studies where wearable output was compared with existing research instruments known to give accurate results (eg, ActiGraph) or with direct observation, as well as studies where several wearables were compared with each other for accuracy or reliability, were classified as validation and reliability studies. Studies where wearables were used as a tool for intervention or observation, to collect data on PA, HR, EE, sleep, or other available metrics, were classified as data collection studies. Out of these 81 studies, 61 were classified as validation and reliability studies, whereas 20 were classified as data collection studies.

Fitbit devices were used in 54 studies [9,13,28-79]. Out of these, 40 studies were validation or reliability studies. In 22 of the studies, one or more Garmin devices were used [32,33,46,49,50,62,77-92]. Of these, 18 were validation or reliability studies. Eight studies used Apple devices [29,30,35,49,62,79,93,94]. Six of these were validation or reliability studies. All studies using devices from Misfit, Polar, Withings, Mio, Samsung, PulseOn, TomTom, and Xiaomi were validation or reliability studies. Misfit devices were used in 12 studies [9,36,42,43,46,61-63,85,95-97]; Polar devices were used in 6 studies [36,43,46,62,98,99]; Withings [63,85,89,100,101], Mio [29,30,54,102,103], and Samsung [29,30,58,62,96] devices were used in 5 studies; PulseOn devices were used in 4 studies [29,104-106]; TomTom devices were used in 2 studies [54,79]; and Xiaomi devices were used in 1 study [96].

From ClinicalTrials, we found that the vast majority of ongoing projects use, or are planning to use, Fitbit devices. All other devices were mentioned in three or fewer projects, whereas Fitbit devices were mentioned in 31 studies. A summary of these studies and projects is given in Table 4. We further grouped the validation and reliability studies into five categories. A total of 31 studies focused on step counts or distance, 15 studies researched EE, 15 studies measured HR, 10 studies measured sleep, and 7 studies collected other metrics. Multimedia Appendix 1 gives an overview of the articles found in MEDLINE, which brands they included in the study, and which of the five categories they are grouped into.

Brand Developer Possibilities

Next, we considered developer possibilities for the 11 brands already identified as most relevant in research: Apple, Fitbit, Garmin, Mio, Misfit, Polar, PulseOn, Samsung, TomTom, Withings, and Xiaomi. All brands had an app in the Apple App Store and could connect to the iPhone. Except for the Apple Watch, all other brands had an app in Google Play and could be used with Android phones.

Table 3. Number of devices supporting a specific group of sensors, 2011-2017 (devices with unknown release year, n=17, not included).

Accelerometer only, by year, n (%): 2011: 2 (67); 2012: 5 (71); 2013: 16 (53); 2014: 40 (46); 2015: 50 (41.3); 2016: 37 (30.8); 2017: 4 (11).

Accelerometer + 1 sensor, total devices: PPG 59; GPS 30; gyroscope 18; magnetometer 7; barometer 4.

Accelerometer + 2 sensors, total devices: GPS + PPG 17; gyroscope + PPG 15; gyroscope + GPS 5; barometer + PPG 4; gyroscope + magnetometer 3; magnetometer + GPS 3; barometer + GPS 2; magnetometer + PPG 1; gyroscope + barometer 1.

Accelerometer + 3 sensors, total devices: gyroscope + magnetometer + GPS 10; gyroscope + magnetometer + PPG 10; magnetometer + barometer + GPS 10; gyroscope + GPS + PPG 8; barometer + GPS + PPG 4; magnetometer + GPS + PPG 2; gyroscope + barometer + PPG 2; gyroscope + magnetometer + barometer 1.

Accelerometer + 4 sensors, total devices: magnetometer + barometer + GPS + PPG 8; gyroscope + magnetometer + GPS + PPG 7; gyroscope + barometer + GPS + PPG 7; gyroscope + magnetometer + barometer + GPS 3; gyroscope + magnetometer + barometer + PPG 2.

All six sensors, by year, n: 2014: 1; 2015: 2; 2016: 2; 2017: 4.

Total, by year, n: 2011: 3; 2012: 7; 2013: 30; 2014: 87; 2015: 121; 2016: 120; 2017: 38.

PPG: photoplethysmography. GPS: global positioning system.

Table 4. Number of identified articles in Medical Literature Analysis and Retrieval System Online (MEDLINE) and ClinicalTrials. Values are the number of MEDLINE validation or reliability studies (61 articles in total), MEDLINE data collection studies (20 articles in total), ClinicalTrials validation or reliability studies, and ClinicalTrials data collection studies.

Brand | MEDLINE search term | MEDLINE: validation or reliability | MEDLINE: data collection | ClinicalTrials: validation or reliability | ClinicalTrials: data collection
Fitbit | Fitbit AND (Alta OR Blaze OR Charge OR Flex OR Surge) | 40 | 14 | 1 | 30
Garmin | Garmin AND (Approach OR D2 OR Epix OR Fenix OR Forerunner OR Quatix OR Swim OR Tactix OR Vivo*) | 18 | 4 | 1 | 2
Misfit | Misfit AND (Flare OR Flash OR Link OR Ray OR Shine OR Vapor) | 12 | 0 | 0 | 1
Apple | Apple watch | 6 | 2 | 1 | 1
Polar | Polar AND (“Polar Loop” OR M200 OR M4?0 OR M600 OR V800 OR A3?0) | 6 | 0 | 1 | 3
Withings | Withings | 5 | 0 | 0 | 2
Mio | Mio Alpha OR Mio Fuse OR Mio Slice | 5 | 0 | 1 | 2
Samsung | Samsung Gear NOT “Gear VR” NOT Oculus | 5 | 0 | 0 | 2
PulseOn | PulseOn | 4 | 0 | 0 | 1
TomTom | TomTom | 2 | 0 | 1 | 0
Xiaomi | Xiaomi | 1 | 0 | 0 | 1

Three brands supported Windows Phone: Fitbit, Garmin, and Misfit. Apple Health and Google Fit are the two most common open cloud health repositories. Mio, Misfit, Polar, Withings, and Xiaomi were the only brands that automatically synchronized fitness data to both of these repositories through these open APIs. The Apple Watch only synchronized automatically to the Apple Health repository. Seven out of 11 brands had a private cloud repository with an accompanying API, which allows third-party apps to access these data. Five brands had an SDK, which makes it possible to create custom programs to communicate with the device or create watch faces that can run on the device.

The Apple Watch was the only device running on watchOS. Three brands had at least one device running on Android Wear. The remaining seven brands used a custom system. A summary of all attributes for each brand is given in Table 5. Not all devices for a specific brand support all features. In addition, this is a snapshot of the status of these attributes, which are likely to change over time as new devices and brands expand their capabilities. The Apple Watch development environment is called WatchKit SDK and can be used to write apps for the Apple Watch [107]. Apple’s health storage solution is called Apple Health. A variety of different data types can be stored here and accessed by third-party developers through the HealthKit API [108]. Access to any of these services requires enrollment in the Apple Developer Program, which currently costs US $99 per year.

Fitbit offers three major SDKs (Device API, Companion API, and Settings API) for developing apps for Fitbit devices. In addition, Fitbit offers the Web API that can be used to access Fitbit cloud-stored fitness data. The Web API exposes six types of data: PA, HR, location, nutrition, sleep, and weight [109]. Fitbit also has a solution for accessing high-resolution step and HR data (ie, intraday data), granted on a case-by-case basis. There is no cost for developing with the Fitbit SDKs or API.
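As an example of the kind of access the Web API provides, the sketch below requests a 7-day step time series for the authorized user. It assumes an OAuth 2.0 access token with the activity scope has already been obtained; token management and error handling beyond a status check are omitted, and the endpoint path follows Fitbit's documented activity time-series resource.

```python
# Minimal sketch of retrieving a 7-day step time series from the Fitbit Web API.
import requests

ACCESS_TOKEN = "..."  # obtained through Fitbit's OAuth 2.0 authorization flow

resp = requests.get(
    "https://api.fitbit.com/1/user/-/activities/steps/date/today/7d.json",
    headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
    timeout=10,
)
resp.raise_for_status()
for day in resp.json()["activities-steps"]:  # one entry per day
    print(day["dateTime"], day["value"])
```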

There are two generations of programmable Garmin wearables [110]. The Connect IQ SDK can be used by both generations, but devices using the newer Connect IQ 2 generation support more features. Development with this SDK is free. Garmin also offers a cloud-based Web API, Garmin Connect, which allows third-party apps to access users’ cloud-based fitness data. Access to this API costs US $5000 (one-time license). In addition, Garmin maintains a separate Health API intended to be used by companies for wellness improvement of their employees. This API is free but requires a manual approval from Garmin.

Table 5. Brand environment, integration, and development support.

For each of the 11 brands (Apple, Fitbit, Garmin, Mio, Misfit, Polar, PulseOn, Samsung, TomTom, Withings, and Xiaomi), the table indicates supported mobile platforms (Android, iPhone, Windows Phone); integration options (automatic synchronization to Apple Health, automatic synchronization to Google Fit, private cloud storage, cloud storage API, developer SDK); and watch system (Android Wear, watchOS, or custom).

API: application programming interface. SDK: software development kit.

The Misfit developer ecosystem consists of three SDKs (Sleep SDK, Link SDK, and Device SDK) [111]. The Misfit Device SDK is the major SDK for developing apps for, and communicating with, Misfit devices. This SDK is only available on request. Misfit also offers the Misfit Scientific Library, which can be used to access Misfit's proprietary sensor algorithms directly. This library is also only available on request. In addition, the Misfit Cloud API is used to access users’ data from the Misfit cloud server. All SDKs and the API are free.

Polar does not offer a separate SDK. Polar devices can integrate with Google Fit and Apple Health and deposit collected data there [112]. These data are accessed using the Google Fit and Apple HealthKit APIs. In addition, data are uploaded to Polar’s cloud storage, which is accessible by third-party developers through the AccessLink API. Besides PA data (steps, EE, and sleep), basic training data are also stored here. Access to AccessLink is free.

Development for a Samsung smartwatch is done using the Tizen SDK (Samsung's smartwatch operating system is called Tizen). The Samsung Health SDK platform consists of two parts: the Data SDK and the Service SDK. Together, these can be used to store and access health data collected from internal and external sensors, as well as from third-party apps running on a Samsung watch or a mobile phone. Development using any of these services is free [113].

TomTom offers the Sports Cloud API for accessing data collected from TomTom devices. The API provides four types of data: PA (eg, exercises bouts), HR, tracking (eg, steps and EE), and physiology (eg, weight). Access to the API is free [114].

Nokia acquired Withings in 2016, and the original Withings API is now available as the Nokia Health API. Besides PA and sleep measurements, the API also gives access to intraday PA data. Nokia must manually approve access to this high-resolution activity API. The API is free [115].

Summarizing Results

Which features are most important when considering devices for a research project will depend on the purpose and design of the study. It is therefore not possible to identify one brand as the best brand in all circumstances. However, we have tried to quantify various aspects of a brand to identify and summarize their benefits.

Table 6. Brand summary.

Brand | Unique devices | MEDLINE articles | Validation or reliability studies (MEDLINE) | Active ClinicalTrials projects
Fitbit | 9 | 54 | 40 | 31
Garmin | 40 | 22 | 18 | 3
Misfit | 8 | 12 | 12 | 1
Apple | 3 | 8 | 6 | 2
Polar | 11 | 6 | 6 | 4
Samsung | 12 | 5 | 5 | 2
Withings | 2 | 5 | 5 | 2
Mio | 3 | 5 | 5 | 3
PulseOn | 1 | 4 | 4 | 1
TomTom | 7 | 2 | 2 | 1
Xiaomi | 3 | 1 | 1 | 1
MyKronoz | 18 | 0 | 0 | 0
No. 1 | 19 | 0 | 0 | 0

MEDLINE: Medical Literature Analysis and Retrieval System Online. Of the 11 brands reviewed for developer support, Apple, Fitbit, Garmin, Misfit, and Samsung offer an SDK for third-party software on, or communicating with, the device; 7 of the 11 offer an API for developer access to a brand data cloud; Apple, Mio, Misfit, Polar, Withings, and Xiaomi synchronize automatically to Apple Health; and Mio, Misfit, Polar, Withings, and Xiaomi synchronize automatically to Google Fit.

We used eight categories in this custom comparison, which we suggest considering before deciding on a brand for any research project:

  1. Device count: a higher number of available devices makes it possible to pick a device that is more tailored to the study.
  2. Article count: a higher number of articles in Ovid MEDLINE indicates usage in previous studies.
  3. Validation or reliability count: a high number of validation or reliability studies provides knowledge about device and brand accuracy.
  4. ClinicalTrials count: a high number of active projects in ClinicalTrials indicates brand relevance.
  5. SDK support: brands that allow third-party programs to run on their devices or communicate directly with the device, by offering an SDK, add more possibilities for customization.
  6. API support: brands that allow third-party programs to access the data cloud repository, by offering API access, add more possibilities for health data collection and retrieval.
  7. Apple Health: brands supporting automatic synchronization to Apple Health allow usage of the Apple HealthKit API.
  8. Google Fit: brands supporting automatic synchronization to Google Fit allow usage of the Google Fit API.

A consensus between authors was reached to include these specific categories because we think together they indicate how often a specific brand has been used in the past and will be used in the future, and they show which options are available for data extraction. These are not the only possible categories, and each category will not be equally important for all studies.

Table 6 gives a summary of these categories for each brand. A transposed Excel (Microsoft) version for dynamic sorting is given in Multimedia Appendix 2. Multimedia Appendix 1 further divides the MEDLINE validation and reliability studies into subgroups (steps, EE, HR, sleep, and other), making it easier to compare brands for specific study purposes.
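As an illustration of how such a comparison could be operationalized, the sketch below ranks a few brands by a weighted sum over some of these categories, using values from Table 6 and arbitrary, study-specific weights; it is not a recommendation of any particular weighting.

```python
# Illustrative ranking sketch: weight some Table 6 categories to shortlist
# brands for a hypothetical study. Weights are arbitrary and study specific.
brands = {
    # (unique devices, MEDLINE articles, validation studies, ClinicalTrials, SDK, API)
    "Fitbit": (9, 54, 40, 31, True, True),
    "Garmin": (40, 22, 18, 3, True, True),
    "Polar": (11, 6, 6, 4, False, True),
}
weights = (0.5, 1.0, 2.0, 1.0, 5.0, 5.0)  # eg, validation studies weighted highest

def score(row):
    return sum(w * float(v) for w, v in zip(weights, row))

for name, row in sorted(brands.items(), key=lambda kv: score(kv[1]), reverse=True):
    print(f"{name}: {score(row):.1f}")
```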


Availability and Trends

The number of new brands increased every year from 2011 to 2014, but from 2015 to 2016, we saw a decrease in the number of new brands. The number of new devices also increased from 2011 to 2015, with a slight reduction in 2016. Many new and existing companies have tried to enter the wearable market during these years. Some have become popular, whereas others are no longer available. The number of new devices in the first two quarters of 2017 seems low, and there is a small indication that the number of new brands and devices released each year is declining. During the data collection phase, we also identified a large number of hybrid watches. Although we did not report on these, this relatively new branch of wearables has grown in popularity. The Fossil group, representing 19 brands, recently announced they would launch more than 300 hybrid watches and smartwatches in 2017 [116]. Most of these will be hybrids, and 2017 may see the highest number of new hybrids released to date.

We found only nine devices that support all five sensors in addition to the accelerometer. Among the 11 most relevant brands, only the Fitbit Surge, Garmin Forerunner 935, Garmin Quatix 5, Samsung Gear S, and TomTom Adventure fall into this category. Most devices (68%) support at most one sensor in addition to the accelerometer. These numbers indicate that sensor count is not the main argument when choosing a device for personal use. In addition to the accelerometer, the most common sensors are PPG and GPS, regardless of sensor count. One reason for this may be that the added benefit of having these sensors, in a fitness setting, is very clear. Accelerometers can be used for step counting, PA intensity, exercise detection, and other well-understood metrics, whereas the added benefit of a gyroscope may be less intuitive. The added convenience of using a PPG compared with a pulse chest strap, or no HR detection at all, is also easy to understand. Adding a GPS also adds some easy-to-understand benefits, of which tracking progress on a map and the possibility to detect speed are the most obvious. Magnetometers and barometers or altimeters may not be sensors that most people consider relevant for PA, although they can be used to enhance the accuracy of EE and other metrics.

Brand Usage in Research

In the MEDLINE literature search, we found 81 studies that used one or more of the 11 brands we identified as most relevant in research. Out of these, 61 were validation or reliability studies. The remaining 20 studies used wearable devices as data collection instruments to measure PA, HR, EE, sleep, or other metrics. Fitbit was used in twice as many validation or reliability studies as any other brand. This has likely contributed to the high number of studies where Fitbit was used as the only instrument for health data collection. The same trend will likely continue in future publications because numbers from ClinicalTrials for active projects show an overrepresentation of Fitbit-enabled projects. Of the brands currently available, the five most often used in research projects are Fitbit, Garmin, Misfit, Apple, and Polar. In addition, these brands have all existed for several years and have either released a large number of unique devices or shipped a large number of total devices. As such, they are likely to stay on the market for the near future.

A high article count, high number of validation or reliability studies, or high number of studies in ClinicalTrials for a specific brand does not automatically imply validity or reliability. It does, however, show researcher interest in these brands.

Implication for Practice

Table 6 is a good starting point when considering brands for a new research project. Article count, validation or reliability study count, and ClinicalTrials count together indicate brand dependability. Larger numbers indicate how relevant, usable, and valid previous researchers have found each brand to be. In projects where it is relevant, SDK support allows programmatic interaction directly with the device. API support allows storage in, and access to, a brand-specific cloud-based health data repository. Apple Health and Google Fit support are alternative solutions for storing and accessing health data in an open cloud repository. For projects that require multiple brand support, using open solutions reduces the need to implement specific software for each brand. SDK, API, Apple Health, and Google Fit must be supported on both the brand and device level, however.

A high brand device count makes it easier to find a device that best supports the study needs. In addition to available sensors (ie, metrics), validation, and previous usage in research, several other potentially relevant criteria exist, including price, availability, phone environment support, affiliated app features, look and feel, battery life, build quality or robustness, water resistance, connectivity, and usability.

Figure 4. Criteria to consider when choosing brand or device. API: application programming interface; SDK: software development kit.

Furthermore, projects that need programmatic access to the wearable or stored health data should especially consider SDK or API features and ease of use, as well as privacy and security. Figure 4 gives a summary of criteria to consider when selecting brand and device.

Limitations

We visited all the brands’ websites to find additional devices, but several sites did not contain any information about discontinued devices. The release year of a device was rarely available on device webpages, and we had to search for reviews and other sources to find this information. The level of detail in device hardware specifications varied. Some vendors did not specify which sensors they included in their devices and only mentioned which features the device had. In some cases, the sensors could be derived from this information, but in other cases, we had to find this information elsewhere. Wikipedia was also used to collect sensor support and release year for some devices. This openly editable encyclopedia is not necessarily always updated with correct information. For these reasons, there may be some inaccuracies in the reported sensor support and release years. We did not collect information about device discontinuation. The reported numbers for total available devices do not, therefore, reflect the number of devices that can currently be bought in stores but rather the number of unique devices that have existed at some point.

Conclusions

In the last few years, we have seen a large increase in available brands and wearable devices, and more devices are released with additional sensors. However, for activity tracking, some sensors are more relevant than others. In this study, we have focused on sensor support, health data cloud integration, and developer possibilities because we find these to be most relevant for the collection of PA data in research. However, deciding which wearable to use will depend on several additional factors.

The wearable landscape is constantly changing as new devices are released and as new vendors enter or leave the market, or are acquired by larger vendors. What are currently considered relevant devices and brands will therefore change over time, and each research project should carefully consider which brand and device to use. As a tool for future research, we have defined a checklist of elements to consider when making this decision.

Acknowledgments

The authors would like to thank Vandrico Solutions Inc for giving them API access to their database. They would also like to thank Steven Richardson and Debra Mackinnon at Queen’s University for giving them an excerpt from their offline wearable database. The publication charges for this study have been funded by a grant from the publication fund of the University of Tromsø – The Arctic University of Norway.

Conflicts of Interest

None declared.

Multimedia Appendix 1

List of MEDLINE articles included in the results for "Brand usage in research".

XLSX File (Microsoft Excel File), 24KB

Multimedia Appendix 2

Summary of the most important categories to consider when selecting a wearable brand for research.

XLSX File (Microsoft Excel File), 13KB

  1. World Health Organization. Physical activity   URL: http://www.who.int/mediacentre/factsheets/fs385/en [accessed 2017-09-11] [WebCite Cache]
  2. Emaus A, Degerstrøm J, Wilsgaard T, Hansen BH, Dieli-Conwright CM, Furberg AS, et al. Does a variation in self-reported physical activity reflect variation in objectively measured physical activity, resting heart rate, and physical fitness? Results from the Tromso study. Scand J Public Health 2010 Nov;38(5 Suppl):105-118. [CrossRef] [Medline]
  3. World Health Organization. 2017. Global Strategy on Diet, Physical Activity and Health   URL: http://www.who.int/dietphysicalactivity/pa/en [accessed 2017-09-11] [WebCite Cache]
  4. Finkelstein EA, Haaland BA, Bilger M, Sahasranaman A, Sloan RA, Nang EE, et al. Effectiveness of activity trackers with and without incentives to increase physical activity (TRIPPA): a randomised controlled trial. Lancet Diabetes Endocrinol 2016 Dec;4(12):983-995. [CrossRef] [Medline]
  5. Jakicic JM, Davis KK, Rogers RJ, King WC, Marcus MD, Helsel D, et al. Effect of wearable technology combined with a lifestyle intervention on long-term weight loss: the IDEA randomized clinical trial. J Am Med Assoc 2016 Sep 20;316(11):1161-1171. [CrossRef] [Medline]
  6. IDC. Wearables Aren't Dead, They're Just Shifting Focus as the Market Grows 16.9% in the Fourth Quarter, According to IDC   URL: https://www.idc.com/getdoc.jsp?containerId=prUS42342317 [accessed 2018-03-01] [WebCite Cache]
  7. IDC. Xiaomi and Apple Tie for the Top Position as the Wearables Market Swells 17.9% During the First Quarter   URL: https://www.idc.com/getdoc.jsp?containerId=prUS42707517 [accessed 2018-03-01] [WebCite Cache]
  8. de Arriba-Perez F, Caeiro-Rodríguez M, Santos-Gago JM. Collection and processing of data from wrist wearable devices in heterogeneous and multiple-user scenarios. Sensors (Basel) 2016 Sep 21;16(9) [FREE Full text] [CrossRef] [Medline]
  9. Kaewkannate K, Kim S. A comparison of wearable fitness devices. BMC Public Health 2016 May 24;16:433 [FREE Full text] [CrossRef] [Medline]
  10. Sanders JP, Loveday A, Pearson N, Edwardson C, Yates T, Biddle SJ, et al. Devices for self-monitoring sedentary time or physical activity: a scoping review. J Med Internet Res 2016 May 04;18(5):e90 [FREE Full text] [CrossRef] [Medline]
  11. Corder K, Brage S, Ekelund U. Accelerometers and pedometers: methodology and clinical application. Curr Opin Clin Nutr Metab Care 2007 Sep;10(5):597-603. [CrossRef] [Medline]
  12. ActiGraph Corp. ActiGraph   URL: http://actigraphcorp.com/ [accessed 2017-09-11] [WebCite Cache]
  13. Reid RE, Insogna JA, Carver TE, Comptour AM, Bewski NA, Sciortino C, et al. Validity and reliability of Fitbit activity monitors compared to ActiGraph GT3X+ with female adults in a free-living environment. J Sci Med Sport 2017 Jun;20(6):578-582. [CrossRef] [Medline]
  14. Evenson KR, Goto MM, Furberg RD. Systematic review of the validity and reliability of consumer-wearable activity trackers. Int J Behav Nutr Phys Act 2015 Dec 18;12:159 [FREE Full text] [CrossRef] [Medline]
  15. Richardson S, Mackinnon D. Left to their own devices? Privacy implications of wearable technology in Canadian workplaces   URL: http://www.sscqueens.org/publications/left-to-their-own-devices [accessed 2018-03-01] [WebCite Cache]
  16. Wagenaar RC, Sapir I, Zhang Y, Markovic S, Vaina LM, Little TD. Continuous monitoring of functional activities using wearable, wireless gyroscope and accelerometer technology. Conf Proc IEEE Eng Med Biol Soc 2011;2011:4844-4847. [CrossRef] [Medline]
  17. Allen J. Photoplethysmography and its application in clinical physiological measurement. Physiol Meas 2007 Mar;28(3):R1-39. [CrossRef] [Medline]
  18. Kim SH, Ryoo DW, Bae C. Adaptive noise cancellation Using accelerometers for the PPG signal from forehead. Conf Proc IEEE Eng Med Biol Soc 2007;2007:2564-2567. [CrossRef] [Medline]
  19. Casson AJ, Vazquez Galvez A, Jarchi D. Gyroscope vs. accelerometer measurements of motion from wrist PPG during physical exercise. ICT Express 2016 Dec;2(4):175-179. [CrossRef]
  20. Richardson SM, Mackinnon D. Queen's University. 2017. Wearable Device Inventory, Queen's University.
  21. Vandrico. 2017. The wearables database   URL: http://vandrico.com/wearables/ [accessed 2017-07-08] [WebCite Cache]
  22. GSM Arena. 2017.   URL: http://www.gsmarena.com/results.php3?sFormFactors=8 [accessed 2017-07-08] [WebCite Cache]
  23. Wearables. 2017. Wearables.Com: Helping you make sense of wearable tech   URL: http://www.wearables.com/devices/ [accessed 2017-07-08] [WebCite Cache]
  24. Smartwatches. 2017. SpecBucket: Smart Watches   URL: https://smartwatches.specbucket.com/ [accessed 2017-07-08] [WebCite Cache]
  25. Prisguiden. 2017. Smartklokke   URL: https://prisguiden.no/kategorier/smartklokke [accessed 2017-07-08] [WebCite Cache]
  26. Prisguiden. 2017. Aktivitetsarmband   URL: https://prisguiden.no/kategorier/aktivitetsarmband [accessed 2017-07-08] [WebCite Cache]
  27. Business Wire. 2016. The Worldwide Wearables Market Leaps 126.9% in the Fourth Quarter and 171.6% in 2015   URL: https://www.businesswire.com/news/home/20160223005496/en/Worldwide-Wearables-Market-Leaps-126.9-Fourth-Quarter [accessed 2018-03-01] [WebCite Cache]
  28. Balto JM, Kinnett-Hopkins DL, Motl RW. Accuracy and precision of smartphone applications and commercially available motion sensors in multiple sclerosis. Mult Scler J Exp Transl Clin 2016;2:2055217316634754 [FREE Full text] [CrossRef] [Medline]
  29. Shcherbina A, Mattsson CM, Waggott D, Salisbury H, Christle JW, Hastie T, et al. Accuracy in wrist-worn, sensor-based measurements of heart rate and energy expenditure in a diverse cohort. J Pers Med 2017 May 24;7(2) [FREE Full text] [CrossRef] [Medline]
  30. Wallen MP, Gomersall SR, Keating SE, Wisløff U, Coombes JS. Accuracy of heart rate watches: implications for weight management. PLoS One 2016;11(5):e0154420 [FREE Full text] [CrossRef] [Medline]
  31. Chow JJ, Thom JM, Wewege MA, Ward RE, Parmenter BJ. Accuracy of step count measured by physical activity monitors: The effect of gait speed and anatomical placement site. Gait Posture 2017 Sep;57:199-203. [CrossRef] [Medline]
  32. Chen MD, Kuo CC, Pellegrini CA, Hsu MJ. Accuracy of wristband activity monitors during ambulation and activities. Med Sci Sports Exerc 2016 Dec;48(10):1942-1949. [CrossRef] [Medline]
  33. Duncan M, Murawski B, Short CE, Rebar AL, Schoeppe S, Alley S, et al. Activity trackers implement different behavior change techniques for activity, sleep, and sedentary behaviors. Interact J Med Res 2017 Aug 14;6(2):e13 [FREE Full text] [CrossRef] [Medline]
  34. Sprint G, Cook D, Weeks D, Dahmen J, La Fleur A. Analyzing sensor-based time series data to track changes in physical activity during inpatient rehabilitation. Sensors (Basel) 2017 Sep 27;17(10) [FREE Full text] [CrossRef] [Medline]
  35. Chowdhury EA, Western MJ, Nightingale TE, Peacock OJ, Thompson D. Assessment of laboratory and daily energy expenditure estimates from consumer multi-sensor physical activity monitors. PLoS One 2017;12(2):e0171720 [FREE Full text] [CrossRef] [Medline]
  36. Mercer K, Li M, Giangregorio L, Burns C, Grindrod K. Behavior change techniques present in wearable activity trackers: a critical analysis. JMIR Mhealth Uhealth 2016;4(2):e40 [FREE Full text] [CrossRef] [Medline]
  37. Gaudet J, Gallant F, Bélanger M. A bit of fit: minimalist intervention in adolescents based on a physical activity tracker. JMIR Mhealth Uhealth 2017 Jul 06;5(7):e92 [FREE Full text] [CrossRef] [Medline]
  38. Jacobsen RM, Ginde S, Mussatto K, Neubauer J, Earing M, Danduran M. Can a home-based cardiac physical activity program improve the physical function quality of life in children with fontan circulation? Congenit Heart Dis 2016;11(2):175-182. [CrossRef] [Medline]
  39. Ridgers ND, Timperio A, Brown H, Ball K, Macfarlane S, Lai SK, et al. A cluster-randomised controlled trial to promote physical activity in adolescents: the Raising Awareness of Physical Activity (RAW-PA) Study. BMC Public Health 2017 Jan 04;17(1):6 [FREE Full text] [CrossRef] [Medline]
  40. Li LC, Sayre EC, Xie H, Clayton C, Feehan LM. A community-based physical activity counselling program for people with knee osteoarthritis: feasibility and preliminary efficacy of the track-OA study. JMIR Mhealth Uhealth 2017 Jun 26;5(6):e86 [FREE Full text] [CrossRef] [Medline]
  41. Dondzila C, Garner D. Comparative accuracy of fitness tracking modalities in quantifying energy expenditure. J Med Eng Technol 2016 Aug;40(6):325-329. [CrossRef] [Medline]
  42. Bellone GJ, Plano SA, Cardinali DP, Chada DP, Vigo DE, Golombek DA. Comparative analysis of actigraphy performance in healthy young subjects. Sleep Sci 2016;9(4):272-279 [FREE Full text] [CrossRef] [Medline]
  43. Bai Y, Welk GJ, Nam YH, Lee JA, Lee JM, Kim Y, et al. Comparison of consumer and research monitors under semistructured settings. Med Sci Sports Exerc 2016;48(1):151-158. [CrossRef] [Medline]
  44. Lee HA, Lee HJ, Moon JH, Lee T, Kim MG, In H, et al. Comparison of wearable activity tracker with actigraphy for sleep evaluation and circadian rest-activity rhythm measurement in healthy young adults. Psychiatry Investig 2017 Mar;14(2):179-185 [FREE Full text] [CrossRef] [Medline]
  45. Chu AH, Ng SH, Paknezhad M, Gauterin A, Koh D, Brown MS, et al. Comparison of wrist-worn Fitbit Flex and waist-worn ActiGraph for measuring steps in free-living adults. PLoS One 2017;12(2):e0172535 [FREE Full text] [CrossRef] [Medline]
  46. Brooke SM, An HS, Kang SK, Noble JM, Berg KE, Lee JM. Concurrent validity of wearable activity trackers under free-living conditions. J Strength Cond Res 2017 Apr;31(4):1097-1106. [CrossRef] [Medline]
  47. Block VJ, Lizée A, Crabtree-Hartman E, Bevan CJ, Graves JS, Bove R, et al. Continuous daily assessment of multiple sclerosis disability using remote step count monitoring. J Neurol 2017 Feb;264(2):316-326. [CrossRef] [Medline]
  48. Morhardt DR, Luckenbaugh A, Goldstein C, Faerber GJ. Determining resident sleep during and after call with commercial sleep monitoring devices. Urology 2017 Aug;106:39-44. [CrossRef] [Medline]
  49. Dooley EE, Golaszewski NM, Bartholomew JB. Estimating accuracy at exercise intensities: a comparative study of self-monitoring heart rate and physical activity wearable devices. JMIR Mhealth Uhealth 2017 Mar 16;5(3):e34 [FREE Full text] [CrossRef] [Medline]
  50. Leth S, Hansen J, Nielsen OW, Dinesen B. Evaluation of commercial self-monitoring devices for clinical purposes: results from the Future Patient Trial, Phase I. Sensors (Basel) 2017;17(1):1-11.
  51. Bian J, Guo Y, Xie M, Parish AE, Wardlaw I, Brown R, et al. Exploring the association between self-reported asthma impact and Fitbit-derived sleep quality and physical activity measures in adolescents. JMIR Mhealth Uhealth 2017 Jul 25;5(7):e105 [FREE Full text] [CrossRef] [Medline]
  52. Coughlin SS, Hatzigeorgiou C, Anglin J, Xie D, Besenyi GM, De Leo G, et al. Healthy lifestyle intervention for adult clinic patients with type 2 diabetes mellitus. Diabetes Manag (Lond) 2017;7(2):197-204 [FREE Full text] [Medline]
  53. Åkerberg A, Koshmak G, Johansson A, Lindén M. Heart rate measurement as a tool to quantify sedentary behavior. Stud Health Technol Inform 2015;211:105-110. [Medline]
  54. Stahl SE, An HS, Dinkel DM, Noble JM, Lee JM. How accurate are the wrist-based heart rate monitors during walking and running activities? Are they accurate enough? BMJ Open Sport Exerc Med 2016;2(1):1-7 [FREE Full text] [CrossRef]
  55. Alinia P, Cain C, Fallahzadeh R, Shahrokni A, Cook D, Ghasemzadeh H. How accurate is your activity tracker? A comparative study of step counts in low-intensity physical activities. JMIR Mhealth Uhealth 2017 Aug 11;5(8):e106 [FREE Full text] [CrossRef] [Medline]
  56. Winslow BD, Nguyen N, Venta KE. Improved mental acuity forecasting with an individualized quantitative sleep model. Front Neurol 2017;8:160 [FREE Full text] [CrossRef] [Medline]
  57. Spiotta AM, Fargen KM, Denham SL, Fulton ME, Kellogg R, Young E, et al. Incorporation of a physical education and nutrition program into neurosurgery: a proof of concept pilot program. Neurosurgery 2016 Oct;79(4):613-619. [CrossRef] [Medline]
  58. Modave F, Guo Y, Bian J, Gurka MJ, Parish A, Smith MD, et al. Mobile device accuracy for step counting across age groups. JMIR Mhealth Uhealth 2017 Jun 28;5(6):e88 [FREE Full text] [CrossRef] [Medline]
  59. Dominick GM, Winfree KN, Pohlig RT, Papas MA. Physical activity assessment between consumer- and research-grade accelerometers: a comparative study in free-living conditions. JMIR Mhealth Uhealth 2016 Sep 19;4(3):e110 [FREE Full text] [CrossRef] [Medline]
  60. Schoenfelder E, Moreno M, Wilner M, Whitlock KB, Mendoza JA. Piloting a mobile health intervention to increase physical activity for adolescents with ADHD. Prev Med Rep 2017 Jun;6:210-213 [FREE Full text] [CrossRef] [Medline]
  61. Kooiman TJ, Dontje ML, Sprenger SR, Krijnen WP, van der Schans CP, de Groot M. Reliability and validity of ten consumer activity trackers. BMC Sports Sci Med Rehabil 2015;7:24 [FREE Full text] [CrossRef] [Medline]
  62. Fokkema T, Kooiman TJ, Krijnen WP, van der Schans CP, de Groot M. Reliability and validity of ten consumer activity trackers depend on walking speed. Med Sci Sports Exerc 2017 Apr;49(4):793-800. [CrossRef] [Medline]
  63. Mantua J, Gravel N, Spencer RM. Reliability of sleep measures from four personal health monitoring devices compared to research-based actigraphy and polysomnography. Sensors (Basel) 2016 May 05;16(5) [FREE Full text] [CrossRef] [Medline]
  64. Chen JL, Guedes CM, Cooper BA, Lung AE. Short-term efficacy of an innovative mobile phone technology-based intervention for weight management for overweight and obese adolescents: pilot study. Interact J Med Res 2017 Aug 02;6(2):e12 [FREE Full text] [CrossRef] [Medline]
  65. Reichardt LA, Aarden JJ, van Seben R, van der Schaaf M, Engelbert RH, Bosch JA, Hospital-ADL study group. Unravelling the potential mechanisms behind hospitalization-associated disability in older patients; the Hospital-Associated Disability and impact on daily Life (Hospital-ADL) cohort study protocol. BMC Geriatr 2016 Mar 05;16:59 [FREE Full text] [CrossRef] [Medline]
  66. Laranjo L, Lau AY, Martin P, Tong HL, Coiera E. Use of a mobile social networking intervention for weight management: a mixed-methods study protocol. BMJ Open 2017 Jul 12;7(7):e016665 [FREE Full text] [CrossRef] [Medline]
  67. Pumper MA, Mendoza JA, Arseniev-Koehler A, Holm M, Waite A, Moreno MA. Using a Facebook group as an adjunct to a pilot mHealth physical activity intervention: a mixed methods approach. Stud Health Technol Inform 2015;219:97-101. [Medline]
  68. Cook JD, Prairie ML, Plante DT. Utility of the Fitbit Flex to evaluate sleep in major depressive disorder: a comparison against polysomnography and wrist-worn actigraphy. J Affect Disord 2017 Aug 01;217:299-305. [CrossRef] [Medline]
  69. Jo E, Lewis K, Directo D, Kim MJ, Dolezal BA. Validation of biofeedback wearables for photoplethysmographic heart rate tracking. J Sports Sci Med 2016 Sep;15(3):540-547 [FREE Full text] [Medline]
  70. Floegel TA, Florez-Pregonero A, Hekler EB, Buman MP. Validation of consumer-based hip and wrist activity monitors in older adults with varied ambulatory abilities. J Gerontol A Biol Sci Med Sci 2016 Jun 02;72(2):229-236. [CrossRef] [Medline]
  71. Alharbi M, Bauman A, Neubeck L, Gallagher R. Validation of Fitbit-Flex as a measure of free-living physical activity in a community-based phase III cardiac rehabilitation population. Eur J Prev Cardiol 2016 Feb 23;23(14):1476-1485. [CrossRef] [Medline]
  72. Diaz KM, Krupka DJ, Chang MJ, Shaffer JA, Ma Y, Goldsmith J, et al. Validation of the Fitbit One® for physical activity measurement at an upper torso attachment site. BMC Res Notes 2016 Apr 12;9:213 [FREE Full text] [CrossRef] [Medline]
  73. Sushames A, Edwards A, Thompson F, McDermott R, Gebel K. Validity and reliability of Fitbit Flex for step count, moderate to vigorous physical activity and activity energy expenditure. PLoS One 2016;11(9):e0161224 [FREE Full text] [CrossRef] [Medline]
  74. Kang SG, Kang JM, Ko KP, Park SC, Mariani S, Weng J. Validity of a commercial wearable sleep tracker in adult insomnia disorder patients and good sleepers. J Psychosom Res 2017 Jun;97:38-44. [CrossRef] [Medline]
  75. Voss C, Gardner RF, Dean PH, Harris KC. Validity of commercial activity trackers in children with congenital heart disease. Can J Cardiol 2017 Jun;33(6):799-805. [CrossRef] [Medline]
  76. Nelson MB, Kaminsky LA, Dickin DC, Montoye AH. Validity of consumer-based physical activity monitors for specific activity types. Med Sci Sports Exerc 2016 Aug;48(8):1619-1628. [CrossRef] [Medline]
  77. Treacy D, Hassett L, Schurr K, Chagpar S, Paul S, Sherrington C. Validity of different activity monitors to count steps in an inpatient rehabilitation setting. Phys Ther 2017;97(5):581-588.
  78. Huang Y, Xu J, Yu B, Shull PB. Validity of FitBit, Jawbone UP, Nike+ and other wearable devices for level and stair walking. Gait Posture 2016;48:36-41. [CrossRef]
  79. Gillinov S, Etiwy M, Wang R, Blackburn G, Phelan D, Gillinov AM, et al. Variable accuracy of wearable heart rate monitors during aerobic exercise. Med Sci Sports Exerc 2017 Aug;49(8):1697-1703. [CrossRef] [Medline]
  80. Woodman JA, Crouter SE, Bassett Jr DR, Fitzhugh EC, Boyer WR. Accuracy of consumer monitors for estimating energy expenditure and activity type. Med Sci Sports Exerc 2017;49(2):371-377. [Medline]
  81. Bronikowski M, Bronikowska M, Glapa A. Do they need goals or support? A report from a goal-setting intervention using physical activity monitors in youth. Int J Environ Res Public Health 2016;13(9):1-12. [Medline]
  82. Jones AP, Coombes EG, Griffin SJ, van Sluijs EM. Environmental supportiveness for physical activity in English schoolchildren: a study using Global Positioning Systems. Int J Behav Nutr Phys Act 2009;6:42. [Medline]
  83. Mooney R, Quinlan LR, Corley G, Godfrey A, Osborough C, Olaighin G. Evaluation of the Finis Swimsense® and the Garmin Swim™ activity monitors for swimming performance and stroke kinematics analysis. PLoS One 2017;12(2):e0170902. [Medline]
  84. An HS, Jones GC, Kang SK, Welk GJ, Lee JM. How valid are wearable physical activity trackers for measuring steps? Eur J Sport Sci 2017 Apr;17(3):360-368. [CrossRef] [Medline]
  85. Ehrler F, Weber C, Lovis C. Influence of pedometer position on pedometer accuracy at various walking speeds: a comparative study. J Med Internet Res 2016 Oct 06;18(10):e268 [FREE Full text] [CrossRef] [Medline]
  86. Ammann R, Taube W, Neuhaus M, Wyss T. The influence of the gait-related arm swing on elevation gain measured by sport watches. J Hum Kinet 2016;51:53-60. [CrossRef]
  87. Eyre EL, Duncan MJ, Birch SL, Cox V, Blackett M. Physical activity patterns of ethnic children from low socio-economic environments within the UK. J Sports Sci 2015;33(3):232-242. [Medline]
  88. Schaffer SD, Holzapfel SD, Fulk G, Bosch PR. Step count accuracy and reliability of two activity tracking devices in people after stroke. Physiother Theory Pract 2017;33(10):788-796. [Medline]
  89. O'Connell S, ÓLaighin G, Kelly L, Murphy E, Beirne S, Burke N, et al. These shoes are made for walking: sensitivity performance evaluation of commercial activity monitors under the expected conditions and circumstances required to achieve the international daily step goal of 10,000 steps. PLoS One 2016;11(5):e0154956 [FREE Full text] [CrossRef] [Medline]
  90. Jennings CA, Berry TR, Carson V, Culos-Reed SN, Duncan MJ, Loitz CC, et al. UWALK: the development of a multi-strategy, community-wide physical activity program. Transl Behav Med 2017;7(1):16-27. [CrossRef]
  91. Price K, Bird SR, Lythgo N, Raj IS, Wong JY, Lynch C. Validation of the Fitbit One, Garmin Vivofit and Jawbone UP activity tracker in estimation of energy expenditure during treadmill walking and running. J Med Eng Technol 2017;41(3):208-215. [Medline]
  92. Claes J, Buys R, Avila A, Finlay D, Kennedy A, Guldenring D, et al. Validity of heart rate measurements by the Garmin Forerunner 225 at different walking intensities. J Med Eng Technol 2017;41(6):480-485. [CrossRef]
  93. O'Brien A, Schlosser RW, Shane HC, Abramson J, Allen AA, Flynn S, et al. Brief report: just-in-time visual supports to children with autism via the Apple Watch®: a pilot feasibility study. J Autism Dev Disord 2016 Dec;46(12):3818-3823. [CrossRef] [Medline]
  94. Cook S, Stauffer JC, Goy JJ, Graf D, Puricel S, Frobert A, et al. Heart rate never lies: interventional cardiologist and Braude's quote revised. Open Heart 2016;3(1):e000373 [FREE Full text] [CrossRef] [Medline]
  95. Mercer K, Giangregorio L, Schneider E, Chilana P, Li M, Grindrod K. Acceptance of commercially available wearable activity trackers among adults aged over 50 and with chronic illness: a mixed-methods evaluation. JMIR Mhealth Uhealth 2016;4(1):e7 [FREE Full text] [CrossRef] [Medline]
  96. El-Amrawy F, Nounou MI. Are currently available wearable devices for activity tracking and heart rate monitoring accurate, precise, and medically beneficial? Healthc Inform Res 2015 Oct;21(4):315-320 [FREE Full text] [CrossRef] [Medline]
  97. Ferguson T, Rowlands AV, Olds T, Maher C. The validity of consumer-level, activity monitors in healthy adults worn in free-living conditions: a cross-sectional study. Int J Behav Nutr Phys Act 2015;12:42 [FREE Full text] [CrossRef] [Medline]
  98. Hernández-Vicente A, Santos-Lozano A, De Cocker K, Garatachea N. Validation study of Polar V800 accelerometer. Ann Transl Med 2016 Aug;4(15):278 [FREE Full text] [CrossRef] [Medline]
  99. Giles D, Draper N, Neil W. Validity of the Polar V800 heart rate monitor to measure RR intervals at rest. Eur J Appl Physiol 2016 Mar;116(3):563-571 [FREE Full text] [CrossRef] [Medline]
  100. Gruwez A, Libert W, Ameye L, Bruyneel M. Reliability of commercially available sleep and activity trackers with manual switch-to-sleep mode activation in free-living healthy individuals. Int J Med Inform 2017 Jun;102:87-92. [CrossRef] [Medline]
  101. O'Connell S, ÓLaighin G, Quinlan LR. When a step is not a step! Specificity analysis of five physical activity monitors. PLoS One 2017;12(1):e0169616 [FREE Full text] [CrossRef] [Medline]
  102. Parak J, Korhonen I. Evaluation of wearable consumer heart rate monitors based on photopletysmography. Conf Proc IEEE Eng Med Biol Soc 2014;2014:3670-3673. [CrossRef] [Medline]
  103. Spierer DK, Rosen Z, Litman LL, Fujii K. Validation of photoplethysmography as a method to detect heart rate during rest and exercise. J Med Eng Technol 2015;39(5):264-271. [CrossRef] [Medline]
  104. Parak J, Uuskoski M, Machek J, Korhonen I. Estimating heart rate, energy expenditure, and physical performance with a wrist photoplethysmographic device during running. JMIR Mhealth Uhealth 2017 Jul 25;5(7):e97 [FREE Full text] [CrossRef] [Medline]
  105. Delgado-Gonzalo R, Parak J, Tarniceriu A, Renevey P, Bertschi M, Korhonen I. Evaluation of accuracy and reliability of PulseOn optical heart rate monitoring device. Conf Proc IEEE Eng Med Biol Soc 2015 Aug;2015:430-433. [CrossRef] [Medline]
  106. Parak J, Tarniceriu A, Renevey P, Bertschi M, Delgado-Gonzalo R, Korhonen I. Evaluation of the beat-to-beat detection accuracy of PulseOn wearable optical heart rate monitor. Conf Proc IEEE Eng Med Biol Soc 2015 Aug;2015:8099-8102. [CrossRef] [Medline]
  107. Apple. 2017. WatchOS - Apple Developer   URL: https://developer.apple.com/watchos [accessed 2017-09-19] [WebCite Cache]
  108. Apple. 2017. HealthKit - Apple Developer   URL: https://developer.apple.com/healthkit/ [accessed 2017-09-13] [WebCite Cache]
  109. Fitbit. 2017. Fitbit Developer   URL: https://dev.fitbit.com/ [accessed 2017-09-22] [WebCite Cache]
  110. Garmin. Garmin Developers   URL: https://developer.garmin.com/ [accessed 2017-09-13] [WebCite Cache]
  111. Misfit. Misfit Developer Toolkit   URL: https://build.misfit.com/ [accessed 2017-09-13] [WebCite Cache]
  112. Polar. Polar AccessLink   URL: https://www.polar.com/en/connect_with_polar/polar_accesslink [accessed 2017-09-19] [WebCite Cache]
  113. Samsung. Samsung Health - Samsung Developers   URL: http://developer.samsung.com/health [accessed 2017-09-13] [WebCite Cache]
  114. TomTom. TomTom Developer Portal   URL: https://developer.tomtom.com/ [accessed 2017-09-13] [WebCite Cache]
  115. Nokia. Nokia Health API Developer Documentation   URL: https://developer.health.nokia.com/api [accessed 2017-09-13] [WebCite Cache]
  116. PR Newswire Association. Fossil Group Reimagines the Watch   URL: https://www.prnewswire.com/news-releases/fossil-group-reimagines-the-watch-300427943.html [accessed 2018-03-01] [WebCite Cache]


Abbreviations

API: application programming interface
EE: energy expenditure
GPS: global positioning system
HR: heart rate
IMU: inertial measurement unit
MEDLINE: Medical Literature Analysis and Retrieval System Online
PA: physical activity
PPG: photoplethysmography
SDK: software development kit


Edited by G Eysenbach; submitted 17.11.17; peer-reviewed by J Sanders, P Wark, K Winfree, R Fallahzadeh, C Fernández; comments to author 07.12.17; revised version received 18.12.17; accepted 06.01.18; published 22.03.18

Copyright

©André Henriksen, Martin Haugen Mikalsen, Ashenafi Zebene Woldaregay, Miroslav Muzny, Gunnar Hartvigsen, Laila Arnesdatter Hopstock, Sameline Grimsgaard. Originally published in the Journal of Medical Internet Research (http://www.jmir.org), 22.03.2018.

This is an open-access article distributed under the terms of the Creative Commons Attribution License (https://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work, first published in the Journal of Medical Internet Research, is properly cited. The complete bibliographic information, a link to the original publication on http://www.jmir.org/, as well as this copyright and license information must be included.