The use of AI to make life easier for those with disabilities and mobility issues is becoming a reality with Hoobox Robotics’ latest technology. 

Hoobox Robotics’ Wheelie 7 kit, powered by Intel AI technology, is said to be the world’s first wheelchair driven by facial recognition, enabling people with disabilities who cannot operate a motorised wheelchair with their hands to control it using only facial movements.

No body sensors are required for this technology, letting quadriplegics move, stop and change the direction of the wheelchair using a handful of gestures such as a kiss, a smile or even a raised eyebrow.

The technology takes seven minutes to install, and the Wheelie 7 kit allows users to pick from nine different facial expressions to control their motorised wheelchair.

Instead of invasive body sensors, the Wheelie 7 uses an Intel RealSense 3D camera mounted on the wheelchair to stream large amounts of data. The solution runs offline on an Intel NUC installed underneath the wheelchair as an onboard computer.

Using next-generation facial analysis, the system can detect facial expressions accurately regardless of lighting conditions. It is precise enough to detect human behaviours such as drowsiness, levels of pain, agitation/sedation levels and spasms, and can even detect when a person is about to sneeze.

The AI algorithms then process the data in real time to control the chair.

Hoobox also uses Intel Core processors and the Intel Distribution of OpenVINO Toolkit to accelerate the inferencing of the facial recognition software and for immediate responsiveness.
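The pipeline described above can be pictured as a per-frame loop: the camera feeds a vision model, the model scores facial expressions, and a controller translates a sustained expression into a chair command. The following sketch is purely illustrative; none of the class names, expression labels, command names or thresholds come from Hoobox’s actual software, and the debouncing logic is an assumed safety measure, not a documented feature.

```python
# Hypothetical sketch of an expression-to-command controller.
# All names and thresholds here are illustrative assumptions.

# Assumed mapping from detected expressions to chair commands
EXPRESSION_TO_COMMAND = {
    "kiss": "forward",
    "smile": "stop",
    "raised_eyebrow": "turn_left",
}

class ExpressionController:
    def __init__(self, threshold=0.8, hold_frames=3):
        # Require the same expression above `threshold` confidence for
        # `hold_frames` consecutive camera frames before issuing a
        # command, so a stray twitch does not move the chair.
        self.threshold = threshold
        self.hold_frames = hold_frames
        self._candidate = None
        self._count = 0

    def update(self, scores):
        """scores: dict mapping expression name -> confidence for one
        frame. Returns a command string once debounced, else None."""
        best = max(scores, key=scores.get)
        if scores[best] < self.threshold or best not in EXPRESSION_TO_COMMAND:
            self._candidate, self._count = None, 0
            return None
        if best == self._candidate:
            self._count += 1
        else:
            self._candidate, self._count = best, 1
        if self._count >= self.hold_frames:
            self._count = 0
            return EXPRESSION_TO_COMMAND[best]
        return None

# Three consecutive confident "kiss" frames trigger one command
ctrl = ExpressionController()
frames = [{"kiss": 0.9, "smile": 0.1}] * 3
commands = [ctrl.update(f) for f in frames]
print(commands)  # [None, None, 'forward']
```

In a real system the per-frame scores would come from a model accelerated with something like the OpenVINO toolkit mentioned above; the debounce window trades a small amount of latency for resistance to false triggers.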

Hoobox CEO and co-founder Dr Paulo Pinheiro said mobility is often enabled through a motorised wheelchair fitted with complex body-mounted sensors that require special training to operate, or users must rely on a caregiver to move them around.

“We are working to help people regain their mobility with more comfort and more precision. We don’t want the quadriplegics only to be able to go from point A to point B, but we want them to do it faster and in a more natural way,” Dr Pinheiro told HITNA.


Dr Pinheiro said the ability to control the wheelchair without invasive body sensors provides users with independence and control over their movement.

“The Wheelie 7 is the first solution to use facial expressions to control a wheelchair. This requires incredible precision and accuracy. We are helping people regain their autonomy,” Dr Pinheiro said.

The Australian Institute of Health and Welfare has estimated that more than 15,000 Australians live with spinal cord injuries, with five Australians sustaining a spinal cord injury every week. A 2018 study found that physical mobility has the largest impact on quality of life for people with spinal cord injuries.

However, even with those numbers, Dr Pinheiro said mobility remains a largely unexplored market.

“Users have very few [mobility] product options to buy. If we take a look at the market over the last 10 years, the current options we have available have not gone that far.”

Intel Head of AI for Social Good Anna Bethke said for those living with restricted mobility and requiring aids to assist movement, AI technology has the potential to transform their lives.

“It’s important to recognise the ways in which technology can help people regain mobility and control of their lives. The Wheelie 7 kit from Hoobox Robotics is a great example of using AI to enable people with limited mobility to move around using natural facial movements they’ve done their entire lives,” she said.


Moving forward, Dr Pinheiro said the company will be looking to further improve the solution and add new features to it.

“The next step is to use the same computer vision technology to detect risk human behaviours, such as fainting and high levels of agitation. This solution will be run on a background of any monitoring application,” he said.

“We are currently looking at using this technology to help assist clinicians in hospitals by allowing them to know, in real-time, if a patient is experiencing pain, discomfort or spasms. Additionally, we are looking at ways to use this to help parents understand when their babies are in discomfort.”



