Sunday, August 11, 2013

Kinect Driven Arduino Powered Hand

After buying an Xbox with a Kinect, the Kinect had been lying on my table collecting dust, serving as a moderately useful paperweight. So I had an idea for a project: an Arduino-powered hand driven by the Kinect sensor. The Kinect would find my hand, check the position of my fingers, and send information to the Arduino hand accordingly. I ended up building a basic prototype which worked fairly well, although with a few restrictions. The video can be viewed below:

Skip to 0:57 to watch the hand in action.

The design consisted of the Kinect simply plugged into a USB port on my laptop. From there, a Processing application worked with the data from the Kinect, with the help of two main libraries, FingerTracker and SimpleOpenNI, which let me easily process the Kinect information and track finger locations.
The Processing application sends this data to the Arduino Uno, which uses it to move each servo of the electrical hand I built according to what the Kinect sees. This allowed me to control the fingers very simply, as seen in the video above.
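The post doesn't show the wire format between Processing and the Arduino, but one simple option is to pack the open/closed state of all five fingers into a single byte per update. A minimal sketch of that idea in plain Java (the bit layout here is my assumption, not the project's actual protocol):

```java
// Hypothetical encoding: pack five finger states into one byte so a single
// serial write carries the whole hand state.
// Bit 0 = thumb ... bit 4 = pinky; 1 = finger open, 0 = finger closed.
public class FingerStateEncoder {
    static int encode(boolean[] fingersOpen) {
        int packed = 0;
        for (int i = 0; i < fingersOpen.length; i++) {
            if (fingersOpen[i]) packed |= (1 << i);
        }
        return packed;
    }

    static boolean isOpen(int packed, int finger) {
        return ((packed >> finger) & 1) == 1;
    }

    public static void main(String[] args) {
        // Thumb, index and pinky open; middle and ring closed.
        boolean[] state = { true, true, false, false, true };
        int b = encode(state);
        System.out.println(b);            // 0b10011 = 19
        System.out.println(isOpen(b, 0)); // true  (thumb open)
        System.out.println(isOpen(b, 2)); // false (middle closed)
    }
}
```

The Arduino side would read each byte and test the same bits to decide which servos to move, so both ends only ever exchange one byte per hand update.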

Unfortunately, as a prototype, a few problems and restrictions made the project difficult. The first is the use of five servos, one per finger: together they drew a lot of power from the Arduino, and I found some servos didn't respond as well because of this. The second and biggest problem was the limitations of the finger tracking library. As seen in the video, there are five main squares, created through a calibration of the five fingers. When a fingertip is within its square, the Arduino knows the finger is open and moves the servo; when it is not in the box, the Arduino knows the finger is down and moves the servo accordingly.
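The calibration check described above amounts to a point-in-rectangle test per finger. A minimal sketch in plain Java (the class and box coordinates are illustrative, not taken from the project):

```java
// One calibration square per finger: a fingertip inside its square means
// "finger raised", outside means "finger folded".
public class CalibrationBox {
    final int x, y, w, h; // top-left corner and size, in depth-image pixels

    CalibrationBox(int x, int y, int w, int h) {
        this.x = x; this.y = y; this.w = w; this.h = h;
    }

    boolean contains(int px, int py) {
        return px >= x && px < x + w && py >= y && py < y + h;
    }

    public static void main(String[] args) {
        CalibrationBox indexBox = new CalibrationBox(100, 50, 40, 40);
        System.out.println(indexBox.contains(120, 70)); // true: finger up
        System.out.println(indexBox.contains(200, 70)); // false: finger down
    }
}
```

This also makes the two constraints below easy to see: the boxes are fixed in image space, so the hand can't move, and the test is binary, so a finger is only ever fully up or fully down.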

This unfortunately is not as accurate as a proper design and creates two problematic constraints. The first is that the hand can't move around within the Kinect's field of view, since it has to stay in place for the calibration squares; the second is that each finger is seen as either fully up or fully down, never in between.

Further programming has allowed me to remove the calibration step, which lets me move my whole hand within the Kinect's view while producing the same output on the electrical hand. Unfortunately, this is much more error prone, since the finger tracking library has a lot of trouble distinguishing fingers from other objects in its view.

A way around both problems could be to track a single reference point on the hand, such as the palm. The distance between each fingertip and the palm would tell the electrical hand whether a finger is slightly bent, completely down, or completely up. The palm point would also allow an appropriate quadratic curve to be fitted by graphing the fingertips, with each fingertip's distance to the common palm point evaluated against it. This not only maps the location of each finger more accurately but also allows error handling for objects which are not fingers. For example, the webbing between your fingers is a common error-prone area which disrupts the program by adding extra fingers. These false fingers could be removed if they're found to be outside an appropriate fingertip range: if a point doesn't fit the quadratic curve that all the fingertips should lie on, it can safely be assumed to be not a finger at all but a random object, and thus not processed, though this could prove difficult.
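The palm-point idea above can be sketched as mapping fingertip-to-palm distance onto a servo angle, which also gives the in-between finger positions the box method can't express. A plain-Java sketch (the distances and calibration thresholds are made-up values, not measurements from the project):

```java
// Map each fingertip's distance from a single palm point onto a 0-180 servo
// angle, so a finger can be partially bent rather than only up or down.
public class PalmDistance {
    static double distance(double x1, double y1, double x2, double y2) {
        return Math.hypot(x1 - x2, y1 - y2);
    }

    // minDist = fully curled, maxDist = fully extended (calibration values).
    // Points far outside this range (e.g. finger webbing) are clamped.
    static int servoAngle(double dist, double minDist, double maxDist) {
        double t = (dist - minDist) / (maxDist - minDist);
        t = Math.max(0.0, Math.min(1.0, t));
        return (int) Math.round(t * 180);
    }

    public static void main(String[] args) {
        double palmX = 320, palmY = 240;            // palm point in image space
        double d = distance(400, 180, palmX, palmY); // fingertip at (400, 180)
        System.out.println(servoAngle(d, 40, 140));   // 108: partially extended
        System.out.println(servoAngle(20, 40, 140));  // 0:   closer than minDist
        System.out.println(servoAngle(200, 40, 140)); // 180: fully extended
    }
}
```

Rejecting false fingertips would then be a second filter on top of this, discarding any candidate point whose distance to the palm falls outside the range the fitted curve predicts.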

As it stands, though, with the current libraries it does work fairly well, with slight issues that can be worked on. I hope to attempt this with different finger tracking libraries as well and see if I can achieve better accuracy and fix any problems with the system.

FingerTracker Library:
Processing Application:


  1. Hey, so it looks like you made this a little while ago, but hopefully you could answer a question for me. I'm working on a similar project involving a Kinect, Processing, and an Arduino to make a camera automatically track a person, but I can't seem to get anything usable to pass through the serial port from Processing to the Arduino. Any secrets to getting serial communication to work right, or did you just use Firmata?

    1. Did you use the same baud rate and the same type of parity? I essentially just wrote data from Processing through the Arduino's COM port with a typical write() in Processing, then used the corresponding read() on the Arduino to obtain those values and do what I wanted with them. You should also check your serial monitor just to ensure that the right values are being sent and received.

  2. Hello, great work. I need to do this project; how do I install OpenNI? The website doesn't work.