Kinect Rehabilitation Project
The Kinect Rehabilitation Project is an undergraduate research project carried out by Kathryn LaBelle. It began in the spring semester of 2011 under the direction of Professor Aaron Striegel. The goal is to investigate the practicality of using the Microsoft Kinect as a stroke therapy tool and to develop software that supports that use.
About the Kinect
The Microsoft Kinect is a set of sensors intended for use with the Xbox 360 gaming console. Using depth, imaging, and audio sensors, it detects users’ movements and allows them to play games using only their bodies as controllers. Unlike previous attempts at natural user interfaces, it does not require the player to wear any accessory for the device to track the player’s movements.
Stroke rehabilitation involves closely tracking a patient's movements. Accurate and specific data about a stroke victim's joints during a therapy session would be invaluable to rehabilitation doctors. We found that it is possible to monitor and analyze a user's joint positions by using the Microsoft Kinect's depth and imaging sensors together with user identification algorithms.
Kinect Stroke Therapy Software Tool
To obtain joint position and orientation data from the Kinect, the first iteration of our application uses the OpenNI Framework API together with PrimeSense’s NITE Middleware. NITE provides algorithms to process data from the OpenNI interface to the Kinect's sensors, and the OpenNI API then allows our joint-tracking application to access these algorithms.
Under the current NITE implementation we are able to get position data for 15 joints, and the application draws the corresponding skeleton over the depth map of the scene, as shown below:
Figure 1: Left: NITE joints. Right: screen capture of the application’s skeleton overlay
The joint position data is simultaneously recorded in a CSV file. We were also able to write an application to get joint positions from raw depth data that had been previously recorded.
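As a sketch of the recording step, logging per-frame joint positions to a CSV file could look like the following. The function name, column layout, and joint list are illustrative assumptions, not the project's actual code; the 15 joint names match those NITE tracks.

```python
import csv

# Illustrative names for the 15 joints NITE tracks.
JOINTS = ["head", "neck", "torso",
          "left_shoulder", "left_elbow", "left_hand",
          "right_shoulder", "right_elbow", "right_hand",
          "left_hip", "left_knee", "left_foot",
          "right_hip", "right_knee", "right_foot"]

def write_joint_log(path, frames):
    """frames: list of dicts mapping joint name -> (x, y, z) position.

    Writes one row per depth frame, with an x/y/z column triple per joint.
    Joints missing from a frame are written as NaN.
    """
    with open(path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["frame"] +
                        [f"{j}_{axis}" for j in JOINTS for axis in "xyz"])
        for i, frame in enumerate(frames):
            row = [i]
            for j in JOINTS:
                row.extend(frame.get(j, (float("nan"),) * 3))
            writer.writerow(row)
```

A flat layout like this (one row per frame, one column per coordinate) keeps the file easy to open in a spreadsheet or plotting tool for later analysis.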
The flow of the application we have developed occurs as follows:
1. Identify the user
2. Calibrate the user (the user holds the “psi pose”)
3. For every frame of depth data:
   a. Obtain the position of each available joint
   b. Write the joint positions to the CSV file
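The steps above can be sketched as a single capture loop. The `tracker` and `writer` objects here are hypothetical stand-ins for the OpenNI/NITE calls and the CSV writer, not the real API:

```python
def run_session(tracker, writer, joints):
    """Sketch of the capture flow: identify, calibrate, then log each frame."""
    user = tracker.identify_user()           # 1. identify the user
    tracker.calibrate(user)                  # 2. user holds the "psi pose"
    for frame_index, depth_frame in enumerate(tracker.depth_frames()):
        row = [frame_index]
        for joint in joints:                 # 3a. obtain each available joint
            x, y, z = tracker.joint_position(user, joint, depth_frame)
            row.extend((x, y, z))
        writer.writerow(row)                 # 3b. write positions to the CSV file
```

Separating the loop from the tracker interface also makes it straightforward to replay previously recorded depth data through the same code path.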
The Kinect has the potential to be a very useful tool for stroke rehabilitation. The depth sensor data was surprisingly accurate and consistent. The table below shows the range and standard deviation of joint readings for portions of the data collection where the subject was motionless. These metrics are shown for
At approximately 27 frames per second, the sampling rate is more than high enough to be useful. Sensor data can be recorded and played back, and joint positions can be extracted from depth data after it has been recorded.
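The stability metrics mentioned above, the range and standard deviation of a joint reading over an interval where the subject was motionless, can be computed with the standard library alone. This is an illustrative sketch, not the project's analysis code:

```python
from statistics import pstdev

def stability_metrics(samples):
    """Range and population standard deviation of one joint coordinate
    (e.g. the x position of the head) over a motionless interval."""
    return max(samples) - min(samples), pstdev(samples)
```

For a motionless subject, readings that hover within a few millimetres of each other yield a small range and standard deviation, which is what makes these metrics a useful check on sensor consistency.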
We are hopeful that we will be able to work with doctors at Memorial Hospital to further develop this application so that individual joints may be monitored more easily. We also hope to integrate the application with the Wii balance board rehabilitation software developed by Professor Striegel’s research group.
Check the progress reports page for documents detailing recent and past project updates and findings.
Source code for the Kinect application implementations using both OpenNI (.zip) and the Microsoft SDK for Windows (.zip) is attached to this page.
The presentation (.pptx) from the undergraduate research conference in the spring of 2011 is attached to this page as well.
- 28 Oct 2011