
Heart rate sensor accuracy/confidence #1248

Open
halemmerich opened this issue Jan 8, 2022 · 11 comments

Comments

@halemmerich
Collaborator

I have tried comparing the internal sensor with a cheap external one. The external one is also optical and worn on the upper arm.

The results look as follows when ignoring 0 values:
(graph: heart rate from both sensors, plus confidence)
track_stairs.csv
(Going up and down stairs between 11:57 and about 12:22, to get some decent movement)

It seems high confidence correlates to similar results between both sensors, but there is a lot of time where confidence is 0. What exactly is the confidence value reported?
Is there untapped potential in the heart rate sensor? Correction via accelerometer or something like that?
It seems the correlation between both sensors is much higher during low movement times (e.g. sleep).
I will try to get my hands on an EKG-based sensor to compare that too.

@gfwilliams
Member

Well, yes, you could probably scrape a bit more data from the heart rate sensor. It's good that the confidence does seem to be a pretty good indicator though.

Confidence: I can't remember the numbers exactly, but: we have a list of the last 10 heart rate readings, and when 70% of them are within 10% of each other, confidence is 100. As they move away from that, confidence drops.
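From memory, that heuristic could be sketched roughly like this (this is a guess at the logic, not the actual firmware code; only the buffer size of 10, the 10% tolerance, and the 70% threshold come from the description above):

```javascript
// Rough sketch of the described confidence heuristic (not the real
// firmware code): keep the last 10 BPM readings, and if at least 70%
// of them lie within 10% of the newest reading, confidence is 100.
// Below that, confidence scales down with the fraction that agrees.
function confidence(history, bpm) {
  history.push(bpm);
  if (history.length > 10) history.shift(); // keep only the last 10
  var close = history.filter(function (h) {
    return Math.abs(h - bpm) <= 0.1 * bpm;  // within 10% of newest
  }).length;
  var frac = close / history.length;
  // 70% agreement (or better) maps to full confidence
  return Math.min(100, Math.round((frac / 0.7) * 100));
}

var hist = [];
[72, 71, 73, 72, 120, 74, 72, 71, 73, 72].forEach(function (b) {
  confidence(hist, b);
});
console.log(confidence(hist, 72)); // high: most readings agree with 72
```

With a stable pulse and one outlier this saturates at 100; a reading stream that alternates wildly drops it well below that.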

The HRM's algorithm (which is a binary blob) does use accelerometer data, but I'm not 100% sure how since it's binary. Right now we don't use that algorithm because I'm trying to keep things open, but a new algorithm (or using the binary blob) would likely help a lot if someone were willing to put the time into it.

@halemmerich
Collaborator Author

Can the binary blob be loaded through JavaScript? Maybe an optional app loading the blob would be a stopgap alternative to integrating closed stuff into the firmware image.

@gfwilliams
Member

It's a thought... It might be possible to do, but it's not going to be easy.

@adamschmalhofer
Contributor

The confidence is 0 on my watch almost all the time, too.

@halemmerich
Collaborator Author

halemmerich commented Jan 13, 2022

I have done a track overnight:
(graph: overnight recording from both sensors)
Confidence looks a lot better and until actual movement started, both sensors deliver very similar curves.
It seems to me that the internal sensor mostly underestimates heart rate while movement is happening. If I actually move the watch relative to my skin, I get vastly overestimated heart rates.
Maybe it would be possible to just multiply the heart rate by some value derived from the accelerometer to fix the underestimates, and to "fix" the overestimates by setting the confidence to zero if the correction value is too big.
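As a toy illustration of that proposal (the gain and cutoff constants here are entirely made up, just to make the idea concrete):

```javascript
// Hypothetical sketch of the proposed correction (made-up constants):
// scale the BPM up with movement magnitude, and drop confidence to
// zero when the correction gets too large to be trusted.
function correctHR(bpm, confidence, movement) {
  // movement: accelerometer magnitude minus 1g, so roughly 0 at rest
  var factor = 1 + 0.5 * movement;      // made-up gain
  if (factor > 1.3) {                   // made-up cutoff
    return { bpm: bpm, confidence: 0 }; // too much movement: distrust it
  }
  return { bpm: Math.round(bpm * factor), confidence: confidence };
}

console.log(correctHR(70, 90, 0));   // at rest: unchanged
console.log(correctHR(70, 90, 0.4)); // moderate movement: scaled up
console.log(correctHR(70, 90, 1.0)); // heavy movement: confidence zeroed
```

The right shape of the correction (linear or otherwise) would have to come out of comparing logged data against a trusted reference sensor.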

@gfwilliams
Member

Thanks - if anything it looks from your last two graphs like confidence is too 'unconfident'?

I'm not sure if it's going to be quite as easy as just multiplying the HRM value. Potentially what we could do is what we did for the step counter: Make an app that logs every HRM value alongside accelerometer and the HRM from a second device which we trust.

Then when we've got a bunch of data from different people we can try different algorithms and see what works best, but in a more scientific, quantifiable way.

How are you logging the data? It sounds like you're pretty much there (although I guess the HRM/accelerometer data needs to be logged in a more fine-grained way).

@halemmerich
Collaborator Author

For these graphs I used the recorder app. I have done some experiments with recording every event, but my CSV file keeps ending up with some lines mixed up.

@halemmerich
Collaborator Author

I have written some code for running from the IDE to record more detailed data than the recorder app does: gist
This records every event by default but can be limited to maxSize accelerometer events before every HRM event to keep the file size manageable.

This yields data looking something like this (about 80 seconds and 360KiB in this case):
(plot of the recorded data)

I can put this into an app, if that helps with analysis.
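For reference, a stripped-down sketch of that kind of IDE-side recorder (the real code is in the linked gist; the event field names follow the Bangle.js "accel" and "HRM" events, but the CSV layout here is my own):

```javascript
// Minimal sketch of an IDE-side event recorder. Lines are printed to
// the Web IDE console, from where they can be saved as CSV.
function toCSV(type, t, fields) {
  return [type, t].concat(fields).join(",");
}

if (typeof Bangle !== "undefined") {
  Bangle.setHRMPower(1, "rec");          // turn the HRM sensor on
  Bangle.on("accel", function (a) {      // {x, y, z, diff, mag}
    console.log(toCSV("ACC", getTime(), [a.x, a.y, a.z, a.mag]));
  });
  Bangle.on("HRM", function (h) {        // {bpm, confidence}
    console.log(toCSV("HRM", getTime(), [h.bpm, h.confidence]));
  });
}

// Example of the resulting line format:
console.log(toCSV("HRM", 1641600000, [72, 90])); // HRM,1641600000,72,90
```

Tagging each line with its event type keeps the two streams interleaved in one file while still being easy to split apart during analysis.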

@gfwilliams
Member

Thanks! This is really helpful! Recording straight from the Web IDE is a good plan - I guess it limits options for recording while running or outside, but even without that this data looks like it'd be a massive help for better heart rate recording. In terms of data size, this would all be for the PC so in a way the more data the better :)

  • It looks like the filtered HRM value is actually pretty clean when there's no movement, so we should really be able to pick up a more accurate, higher confidence value much more quickly than we do currently.
  • The HRM adjusts the LED brightness so that the light sensor reading stays within range (which is why the raw HRM value ends up clipped - see below about recording). Maybe we could get useful info while it's adjusting, but also it feels like it should be able to adjust far more quickly than it does now.

Only things I'd say about the recording are:

  • ACCm/ACCd aren't really needed because they are just calculated from the accelerometer - might save a bit of bandwidth
  • HRM-raw's ppgValue would be super helpful instead - because ppgValue is 'windowed' to 8 bits for the HRM filtering, it's possible there is some extra data we could extract by doing that windowing differently (or not at all).

Have you seen https://github.com/gfwilliams/step-count ?

If we could stick all that HRM data into a repository like that, and we can compile in heartrate.c from the Espruino repository then we can start changing the code and doing some proper experiments.

@halemmerich
Collaborator Author

halemmerich commented Jan 20, 2022

I have done some experimenting with putting my code in a custom app:
https://halemmerich.github.io/BangleApps/#hrm%20acc
As soon as every event type has been seen once, recording starts and the Bangle switches from full screen red to green.

I could not find ppgValue in my HRM-raw events, so I tried with vcPPG and vcPPGoffs. Are these the correct raw values?

This now yields this data:
(plot of the recorded data)

I have looked at the step-count repo, but did not yet try it out. It is a good idea to get some data together for experimenting with algorithms.

@gfwilliams
Member

gfwilliams commented Jan 20, 2022

Hi, yes, sorry - vcPPG is what you need. vcPPG-vcPPGoffs==raw, so you can ignore vcPPGoffs as we should be able to work that out later.

edit: actually it might be handy to have vcPPGoffs as we can see if it's working right - there's definitely some odd stuff going on!
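Given that relationship, recovering the raw value at analysis time is just a subtraction over the two logged fields (a sketch; the field names are the vcPPG/vcPPGoffs values recorded above):

```javascript
// Per the comment above: raw == vcPPG - vcPPGoffs, so if both fields
// are logged, the raw PPG signal can be reconstructed afterwards, and
// the offset stream itself can be inspected for odd behaviour.
function rawPPG(sample) {
  return sample.vcPPG - sample.vcPPGoffs;
}

console.log(rawPPG({ vcPPG: 12345, vcPPGoffs: 12000 })); // 345
```

Keeping vcPPGoffs in the log rather than discarding it also makes it possible to check whether the offset adjustment is tracking the signal correctly.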
