
Tutorial: Real-time iOS Heart Rate Monitor and Dashboard

Michael Carroll • Sep 30, 2015

In Part One of our blog series, we built an Android heart rate monitor application and real-time dashboard. In this part, we'll show you how to build the same application for iOS.

How the iOS App Works

The iOS application collects heart rate readings and streams them to a real-time dashboard. The phone's camera is used to capture the heart rate: when the user presses a finger against the camera lens, an image processing algorithm measures the red component of the finger image to sense the blood flow and calculates a reading.

Readings are averaged over a span of one minute to determine the heart rate. Once the app has a heart rate reading, it streams the data in real time to a monitoring dashboard.

Project Resources

Our final application will look and function like this:

iOS Heart Rate Monitor

PubNub Initialization

AppDelegate.m is the main application entry point for this app. It also defines the function PublishOnPubNub(), which handles publishing heart rate readings to the doctor's portal.

- (BOOL)application:(UIApplication *)application didFinishLaunchingWithOptions:(NSDictionary *)launchOptions {
    // Configure the PubNub client with the publish and subscribe keys.
    PNConfiguration *configuration = [PNConfiguration configurationWithPublishKey:@"pub-c-1b4f0648-a1e6-4aa1-9bae-aebadf76babe"
                                                                     subscribeKey:@"sub-c-e9fadae6-f73a-11e4-af94-02ee2ddab7fe"];
    self.client = [PubNub clientWithConfiguration:configuration];
    [self.client addListener:self];
    [self.client subscribeToChannels:@[@"doctor_id"] withPresence:YES];
    return YES;
}

- (void)PublishOnPubNub:(NSString *)PulseRate docId:(NSString *)Docid {
    // The channel name is the doctor's id followed by the "heartbeat_alert" suffix.
    NSString *DocId = [NSString stringWithFormat:@"%@heartbeat_alert", Docid];
    [self.client publish:PulseRate toChannel:DocId storeInHistory:YES
          withCompletion:^(PNPublishStatus *status) {
        [self stopLoader];
        // Check whether the request completed successfully.
        if (!status.isError) {
            // Message successfully published to the specified channel.
            UIAlertView *alert = [[UIAlertView alloc] initWithTitle:@"PubNub"
                                                            message:@"Your message was successfully sent to the doctor"
                                                           delegate:self
                                                  cancelButtonTitle:@"Ok"
                                                  otherButtonTitles:nil];
            [alert show];
        } else {
            // Request processing failed. Check the 'category' property to find
            // out why; the request can be resent using [status retry].
            UIAlertView *alert = [[UIAlertView alloc] initWithTitle:@"PubNub"
                                                            message:@"Error occurred!"
                                                           delegate:self
                                                  cancelButtonTitle:@"Ok"
                                                  otherButtonTitles:nil];
            [alert show];
        }
    }];
}

Registering with the Doctor's Id

Users register with the doctor's id to send their heart rate readings. The ViewController class defines a function called SaveDoctId_btn() to capture the doctor's id keyed in by the user. This id is then used to form the PubNub channel name, in the same way as in the Android app from Part One.

- (IBAction)SaveDoctId_btn:(id)sender {
    if ([_doctorId_txtfld.text length] == 0) {
        UIAlertView *alert = [[UIAlertView alloc] initWithTitle:@"PubNub"
                                                        message:@"Please enter DoctorId"
                                                       delegate:nil
                                              cancelButtonTitle:@"Ok"
                                              otherButtonTitles:nil];
        [alert show];
    } else {
        [self.view endEditing:YES];
        _backButton.hidden = NO;
        self.pulseRate.text = @"PLACE FINGER ON CAMERA LENS";
        [_heartImage setImage:[UIImage imageNamed:@"Black1_heart.png"]];
        // Fade out the doctor id view, then hide it.
        [UIView animateWithDuration:1.0
                         animations:^{
                             _doctorId_view.alpha = 0;
                         }
                         completion:^(BOOL finished) {
                             [_doctorId_view setHidden:YES];
                         }];
    }
}
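The channel naming used by PublishOnPubNub() is plain string concatenation: the doctor's id followed by a fixed "heartbeat_alert" suffix. A minimal C sketch of the same scheme (the id "dr42" below is a made-up example, not from the app):

```c
#include <stdio.h>

// Builds the publish channel name: the doctor's id plus the
// "heartbeat_alert" suffix, mirroring the Objective-C call
// [NSString stringWithFormat:@"%@heartbeat_alert", Docid].
static void make_channel(const char *doctor_id, char *out, size_t out_len) {
    snprintf(out, out_len, "%sheartbeat_alert", doctor_id);
}
```

The dashboard built in Part One subscribes to the same concatenated channel name, which is what ties a patient's readings to a specific doctor.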

Capturing the Heart Rate

The iOS app captures finger images from the camera frames to detect heartbeats and derives the heart rate from them. The ViewController class is responsible for controlling the camera and capturing the frames.

-(void) startCameraCapture {
    // Blink the heart image while we're sampling
    timer = [NSTimer scheduledTimerWithTimeInterval:0.5
                                             target:self
                                           selector:@selector(BlinkingMethod)
                                           userInfo:nil
                                            repeats:YES];
    // Create the AVCaptureSession
    self.session = [[AVCaptureSession alloc] init];
    // Get the default camera device
    self.camera = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    // Switch on torch mode - we can't detect the pulse without it
    if ([self.camera isTorchModeSupported:AVCaptureTorchModeOn]) {
        [self.camera lockForConfiguration:nil];
        self.camera.torchMode = AVCaptureTorchModeOn;
        [self.camera unlockForConfiguration];
    }
    // Create an AVCaptureInput with the camera device
    NSError *error = nil;
    AVCaptureInput *cameraInput = [[AVCaptureDeviceInput alloc] initWithDevice:self.camera error:&error];
    if (cameraInput == nil) {
        NSLog(@"Error creating camera capture: %@", error);
    }
    // Set the output
    AVCaptureVideoDataOutput *videoOutput = [[AVCaptureVideoDataOutput alloc] init];
    // Create a queue to run the capture on
    dispatch_queue_t captureQueue = dispatch_queue_create("captureQueue", NULL);
    // Set ourselves up as the capture delegate
    [videoOutput setSampleBufferDelegate:self queue:captureQueue];
    // Configure the pixel format
    videoOutput.videoSettings = [NSDictionary dictionaryWithObjectsAndKeys:
                                 [NSNumber numberWithUnsignedInt:kCVPixelFormatType_32BGRA],
                                 (id)kCVPixelBufferPixelFormatTypeKey, nil];
    // Cap the frame rate at 10 fps (each frame lasts at least 1/10 s)
    videoOutput.minFrameDuration = CMTimeMake(1, 10);
    // Use the smallest frame size available
    [self.session setSessionPreset:AVCaptureSessionPresetLow];
    // Add the input and output
    [self.session addInput:cameraInput];
    [self.session addOutput:videoOutput];
    // Start the session
    [self.session startRunning];
    // We're now sampling from the camera
    self.currentState = STATE_SAMPLING;
    // Stop the app from sleeping
    [UIApplication sharedApplication].idleTimerDisabled = YES;
    // Update our UI on a timer every 0.1 seconds
    [NSTimer scheduledTimerWithTimeInterval:0.1 target:self selector:@selector(update) userInfo:nil repeats:YES];
}

-(void) stopCameraCapture {
    [self.session stopRunning];
    self.session = nil;
}

Further, the captureOutput() delegate method averages each frame's RGB values, converts them to HSV, and feeds valid frames to the PulseDetector class for pulse rate sensing.

- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection {
    // If we're paused, don't do anything
    if (self.currentState == STATE_PAUSED) {
        // Reset our frame counter
        self.validFrameCounter = 0;
        return;
    }
    // This is the image buffer
    CVImageBufferRef cvimgRef = CMSampleBufferGetImageBuffer(sampleBuffer);
    // Lock the image buffer
    CVPixelBufferLockBaseAddress(cvimgRef, 0);
    // Access the data
    size_t width = CVPixelBufferGetWidth(cvimgRef);
    size_t height = CVPixelBufferGetHeight(cvimgRef);
    // Get the raw image bytes
    uint8_t *buf = (uint8_t *)CVPixelBufferGetBaseAddress(cvimgRef);
    size_t bprow = CVPixelBufferGetBytesPerRow(cvimgRef);
    // Pull out the average rgb value of the frame
    float r = 0, g = 0, b = 0;
    for (int y = 0; y < height; y++) {
        for (int x = 0; x < width * 4; x += 4) {
            b += buf[x];
            g += buf[x + 1];
            r += buf[x + 2];
        }
        buf += bprow;
    }
    // Unlock the buffer now that we're done reading pixels
    CVPixelBufferUnlockBaseAddress(cvimgRef, 0);
    r /= 255 * (float)(width * height);
    g /= 255 * (float)(width * height);
    b /= 255 * (float)(width * height);
    // Convert from rgb to hsv colourspace
    float h, s, v;
    RGBtoHSV(r, g, b, &h, &s, &v);
    // Sanity check: is a finger actually placed over the camera?
    if (s > 0.5 && v > 0.5) {
        NSLog(@"RatePulse: %@", self.pulseRate.text);
        // Increment the valid frame count
        self.validFrameCounter++;
        // Filter the hue value - a simple band pass filter that removes
        // any DC component and any high frequency noise
        float filtered = [self.filter processValue:h];
        // Have we collected enough frames for the filter to settle?
        if (self.validFrameCounter > MIN_FRAMES_FOR_FILTER_TO_SETTLE) {
            // Add the new value to the pulse detector
            [self.pulseDetector addNewValue:filtered atTime:CACurrentMediaTime()];
        }
        TimerBool = YES;
    } else {
        TimerBool = NO;
        self.validFrameCounter = 0;
        // Clear the pulse detector - we only really need to do this once,
        // just before we start adding valid samples
        [self.pulseDetector reset];
    }
}

The PulseDetector class defines the functions addNewValue() and getAverage(), which detect beat periods from the valid frames (captured at 10 frames per second) and derive the average period over the most recent readings.

-(float) addNewValue:(float) newVal atTime:(double) time {
    // Keep track of the values above and below zero
    if (newVal > 0) {
        upVals[upValIndex] = newVal;
        upValIndex++;
        if (upValIndex >= AVERAGE_SIZE) {
            upValIndex = 0;
        }
    }
    if (newVal < 0) {
        downVals[downValIndex] = -newVal;
        downValIndex++;
        if (downValIndex >= AVERAGE_SIZE) {
            downValIndex = 0;
        }
    }
    // Work out the average value above zero
    float count = 0;
    float total = 0;
    for (int i = 0; i < AVERAGE_SIZE; i++) {
        if (upVals[i] != INVALID_ENTRY) {
            count++;
            total += upVals[i];
        }
    }
    float averageUp = total / count;
    // And the average value below zero
    count = 0;
    total = 0;
    for (int i = 0; i < AVERAGE_SIZE; i++) {
        if (downVals[i] != INVALID_ENTRY) {
            count++;
            total += downVals[i];
        }
    }
    float averageDown = total / count;
    // Is the new value a down value?
    if (newVal < -0.5 * averageDown) {
        wasDown = true;
    }
    // Is the new value an up value, and were we previously in the down state?
    if (newVal >= 0.5 * averageUp && wasDown) {
        wasDown = false;
        // Work out the difference between now and the last time this happened
        if (time - periodStart < MAX_PERIOD && time - periodStart > MIN_PERIOD) {
            periods[periodIndex] = time - periodStart;
            periodTimes[periodIndex] = time;
            periodIndex++;
            if (periodIndex >= MAX_PERIODS_TO_STORE) {
                periodIndex = 0;
            }
        }
        // Track when the transition happened
        periodStart = time;
    }
    // Return up or down
    if (newVal < -0.5 * averageDown) {
        return -1;
    } else if (newVal > 0.5 * averageUp) {
        return 1;
    }
    return 0;
}

-(float) getAverage {
    double time = CACurrentMediaTime();
    double total = 0;
    double count = 0;
    for (int i = 0; i < MAX_PERIODS_TO_STORE; i++) {
        // Only use up to 10 seconds' worth of data
        if (periods[i] != INVALID_ENTRY && time - periodTimes[i] < 10) {
            count++;
            total += periods[i];
        }
    }
    // Do we have enough values?
    if (count > 2) {
        return total / count;
    }
    return INVALID_PULSE_PERIOD;
}

Finally, once the reading has been derived, the app sends it to the real-time dashboard via the PublishOnPubNub() function defined in the AppDelegate class. And that's it!

Wrapping Up

This is just one example of how real-time technology is changing the way we build healthcare applications. The IoT healthcare market is massive and still growing: it's estimated to reach $117 billion by 2020. Real-time technology will continue to find its way into a wide variety of healthcare applications, for patients, doctors, and organizations alike.