Apple has announced a slew of upgrades to ResearchKit, adding hearing, vision and speech tests, an updated UI, and a new research API for monitoring Parkinson's tremor and dyskinesia, while also eyeing the FDA-cleared hardware market.

The move follows the launch of Apple's new Health Records API, which allows developers to create apps that use data from patients' electronic health records, as well as recent fitness updates to the Apple Watch.

According to an Apple core motion engineer, the ResearchKit upgrades will benefit Parkinson's care.

"One of the identifiable symptoms of Parkinson’s is a tremor and this API monitors for tremor at rest, characterised by a shaking or a trembling of the body when somebody is not intending to move," Gabriel Blanco said in a presentation last week.

"Now there are treatments, including medications, that can help control the symptoms of Parkinson’s. However, these very same treatments can often have side effects, such as dyskinesia. One of those that this API can monitor for is a fidgeting or swaying of the body known as choreiform movement."

The API allows for always-on passive monitoring of these movement disorders via Apple's Core Motion framework, letting researchers see trends and longitudinal data over time.
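The article does not include sample code, but the Movement Disorder API Apple described at WWDC 2018 surfaces in Core Motion as `CMMovementDisorderManager`, which is gated behind a research entitlement granted by Apple and runs on Apple Watch. A minimal sketch of starting monitoring and querying results might look like the following; method and property names reflect the shipped API as best I recall, so verify against the Core Motion headers before relying on them.

```swift
import CoreMotion

let manager = CMMovementDisorderManager()

// Ask the system to record tremor and dyskinesia results for the next 7 days.
// Requires Apple's movement-disorder research entitlement.
manager.monitorKinesias(forDuration: 7 * 24 * 60 * 60)

let now = Date()
let weekAgo = now.addingTimeInterval(-7 * 24 * 60 * 60)

// Each tremor result covers a time window and reports the share of that
// window spent at each severity level (none, slight, mild, moderate, strong).
manager.queryTremor(from: weekAgo, to: now) { results, error in
    guard let results = results else { return }
    for r in results {
        print(r.startDate, r.endDate, r.percentNone, r.percentMild, r.percentStrong)
    }
}

// Dyskinetic symptom results report the likelihood of choreiform movement.
manager.queryDyskineticSymptom(from: weekAgo, to: now) { results, error in
    guard let results = results else { return }
    for r in results {
        print(r.startDate, r.endDate, r.percentLikely)
    }
}
```

Because the system records continuously in the background, an app only needs to query periodically to build the longitudinal picture the article describes.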

Parkinson's has been a focus of Apple behind the scenes for some time – it was one of the topics Apple met with the FDA about in 2015 and was the focus of the Parkinson's mPower study, one of the first ResearchKit apps.

Apps like mPower relied on discrete tap tests and more generalised movement data to monitor patients, rather than having access to a bespoke movement disorder API.

Apple also announced active tasks for ResearchKit and CareKit that allow developers to incorporate vision, hearing, and speech tests. The available vision test is a digital implementation of the Amsler Grid, which can be used to detect symptoms of macular degeneration.
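ResearchKit active tasks are built through factory methods on `ORKOrderedTask`. A minimal sketch of presenting the new Amsler grid task might look like the following; the factory method name matches the ResearchKit 2.0 open-source release as best I recall, and the presenting code assumes it runs inside a `UIViewController` that conforms to `ORKTaskViewControllerDelegate`.

```swift
import ResearchKit

// Build the Amsler grid active task (the user views a grid with each eye
// and marks any areas that appear wavy or distorted).
let amslerTask = ORKOrderedTask.amslerGridTask(
    withIdentifier: "amslerGrid",
    intendedUseDescription: "Checks for visual distortions associated with macular degeneration.",
    options: []
)

// Present it like any other ResearchKit task.
let taskViewController = ORKTaskViewController(task: amslerTask, taskRun: nil)
taskViewController.delegate = self  // self: UIViewController & ORKTaskViewControllerDelegate
present(taskViewController, animated: true)
```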

The hearing test is a tone audiometry test, designed to emulate the Hughson-Westlake method of hearing testing. The app plays a tone and instructs users to tap when they hear it. Developers can also include a preliminary task that measures background noise and warns users if the room is too noisy for the audiometry test.
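A sketch of the two pieces just described, using the `ORKOrderedTask` factory for tone audiometry and the environment sound-level step from ResearchKit 2.0; the threshold and interval values below are illustrative, not recommendations, and the signatures should be checked against the framework headers.

```swift
import ResearchKit

// Tone audiometry task: plays tones at varying frequencies and levels
// and records when the user taps to indicate they heard one.
let hearingTask = ORKOrderedTask.toneAudiometryTask(
    withIdentifier: "toneAudiometry",
    intendedUseDescription: nil,
    speechInstruction: nil,
    shortSpeechInstruction: nil,
    toneDuration: 20,  // seconds per tone, maximum
    options: []
)

// Optional pre-check: sample the ambient sound pressure level and stop the
// user from proceeding if the room is too loud for a valid test.
let splStep = ORKEnvironmentSPLMeterStep(identifier: "ambientNoiseCheck")
splStep.thresholdValue = 45           // illustrative dBA ceiling
splStep.samplingInterval = 1          // seconds between samples
splStep.requiredContiguousSamples = 5 // consecutive quiet samples needed to pass
```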

The speech recognition module prompts a user to recite a sentence, then displays a transcript of that sentence and asks them to correct it. It can collect data on the syntactic, semantic, and linguistic features of speech.

One final protocol, Speech in Noise, combines the hearing and speech recognition tests to assess the user's ability to detect and distinguish human speech in a crowded room, which can catch some types of hearing loss that traditional tone audiometry tends to miss.
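Both speech tasks are also exposed as `ORKOrderedTask` factories. The sketch below is my best reconstruction of the ResearchKit 2.0 signatures, including the `allowsEdittingTranscript` parameter spelling as it shipped in the open-source release; treat the exact parameter lists and the locale type as assumptions to verify against the headers.

```swift
import ResearchKit

// Speech recognition task: records the user reciting a sentence, transcribes
// it with speech-to-text, then shows the transcript for the user to correct.
let speechTask = ORKOrderedTask.speechRecognitionTask(
    withIdentifier: "speechRecognition",
    intendedUseDescription: nil,
    speechRecognizerLocale: .englishUS,
    speechRecognitionImage: nil,
    speechRecognitionText: "The quick brown fox jumps over the lazy dog.",
    shouldHideTranscript: false,
    allowsEdittingTranscript: true,  // parameter spelling as shipped in ResearchKit
    options: []
)

// Speech-in-noise task: plays recorded speech over background babble and
// asks the user to repeat what they heard.
let speechInNoiseTask = ORKOrderedTask.speechInNoiseTask(
    withIdentifier: "speechInNoise",
    intendedUseDescription: nil,
    options: []
)
```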

News has also emerged of Apple’s patent application for a wearable blood pressure monitor.

In what appears to be a slight tweak to the standard-of-care inflatable blood pressure cuff, Apple’s version has been built into a wearable that might be smaller, might be Bluetooth connected, and might sport a touchscreen.

Patents and patent applications, especially for a company with the resources of Apple, do not always lead to commercial products. But this patent could suggest a change in focus for the tech titan. Prior to Apple’s involvement in the FDA Pre-Cert program, CEO Tim Cook had said he was not interested in FDA-cleared devices. The possibility of fast-tracked devices could change Apple's calculus on whether to venture into FDA-cleared hardware.

This is at least the third Apple patent to come to light dealing with blood pressure. Last August, Apple was granted a patent for a technique for detecting a blood pressure index using the front-facing camera, the ambient light sensor, the proximity sensor, or a special electrode built into the device. In October, another patent application came to light, this one using wrist-worn accelerometer and photoplethysmogram sensors to calculate a blood pressure value from the pulse transit time.

Both those patents were initially filed in 2015. This latest one was filed in 2016.

Apple has also developed an improved informed consent module, which features quicker navigation, real-time annotations, search capability and the ability to share or save the document.

A version of this story was originally published on the US edition of Healthcare IT News.
