New wearables with optical heart rate monitors hit the market seemingly every week, and with them come product reviews from wearables, gadget, and technology sites around the world. Nearly every one of these heart rate monitor reviews has a fundamental flaw: they are almost always tested ONLY by the reviewer themselves. Many of these reviews are extremely valuable in helping people understand different aspects of a wearable device, but here’s why that’s a problem for evaluating the heart rate monitoring capabilities:
Testing the accuracy of the heart rate measurement in these devices doesn’t work like the other technology in wearables, because they have to interact with the body to measure biometrics like heart rate – and everybody truly is different.
This is very different from testing a fitness tracker’s ability to show notifications or how a particular app performs on a smartwatch. Those features either work or they don’t. In measuring heart rate, however, every human being is different, and these devices perform differently on different people and in different conditions. (If you’re interested in learning more about how these devices work and why they are subject to human variability, you can find more detail in the post Optical Heart Rate Monitoring: What You Need to Know.)
To illustrate why testing a wearable on just one person is such a problem, let me give you an example from our testing lab, which tests more than 400 wearable devices every year and has some of the world’s experts in testing biometric wearables. This table shows actual results (anonymized, of course) from testing one of the most popular wearable devices on the market today.
A few things from the table to note:
- The device was tested on at least 20 people of all different sizes, shapes, skin tones, and fitness levels to ensure statistical significance in the results found in the report.
- Of all the data points collected, 26% of the readings differed from the chest strap reference by more than +/- 5%. Not very accurate overall.
- However, there was significant variability in how the device performed on individual test participants.
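To make the accuracy metric above concrete, here is a minimal sketch (not our lab’s actual pipeline) of how you might compute the share of readings that deviate from a chest strap reference by more than +/- 5%. All readings below are made up for illustration.

```python
# Sketch: given paired heart rate readings from an optical wearable and a
# chest-strap reference, compute the percentage of readings that deviate
# from the reference by more than a threshold (5% here). The data below
# is hypothetical, purely for illustration.

def pct_outside_threshold(wearable, chest_strap, threshold=0.05):
    """Return the percentage of paired readings where the wearable
    differs from the chest-strap reference by more than `threshold`."""
    outside = sum(
        1 for w, c in zip(wearable, chest_strap)
        if abs(w - c) / c > threshold
    )
    return 100 * outside / len(wearable)

# Hypothetical readings (beats per minute) for one test participant:
wearable_bpm    = [142, 150, 165, 171, 158, 120]
chest_strap_bpm = [140, 152, 150, 170, 160, 138]

print(f"{pct_outside_threshold(wearable_bpm, chest_strap_bpm):.0f}% "
      "of readings off by more than 5%")  # → 33% for this made-up data
```

Running the same calculation per participant, rather than pooled across everyone, is what exposes the variability described in the next bullet.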
Here’s the issue with many of the product reviews of these devices. If the writer of the review is tester #16, #17, or #18, then the writer is very likely to report that the device tracks biometrics accurately. But what if the writer is tester #2, #8, or #10? They are likely to come to a very different conclusion in their product review and thereby mislead many of their readers for whom the device would actually work well.
Here’s another issue. Performance of these devices can change, sometimes drastically, on the same test subject over multiple tests. Our testing lab tests the same participants over time, and we’ve seen some significant differences in performance when testing individuals on the same devices. You can see an example here with participant #2:
[Chart: participant #2’s results, First Test vs. Second Test]
We recognize it’s impossible for most product reviewers and tech publications to properly test these devices on a statistically significant number of people under controlled conditions and testing protocols. Just remember, when you read the next review of a biometric wearable device, that your mileage may vary from the product reviewer’s when it comes to the accuracy of the heart rate monitoring.
We’ll be sharing more in the near future on how to test biometric wearables and what to look out for, so stay tuned!