Case Study: Creating a Robotic Arm with a Kinect Interface

I found a project that builds a robotic arm controlled through the Microsoft Kinect.

Here is an overview of how everything works together:

Component Diagram for the project

Here is the summary:

1.) The fingers are constructed using flex sensors, which are essentially potentiometers that react to bending.

To detect the tilt of the wrist, an accelerometer is used.
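The glove's sensing can be sketched as two small functions: a flex sensor in a voltage divider produces an ADC reading that maps roughly linearly to bend angle, and wrist tilt follows from the gravity vector the accelerometer reports. The calibration readings below are assumptions for illustration, not the project's actual numbers.

```cpp
#include <cmath>

// Hypothetical calibration: 10-bit ADC readings with the finger straight
// and fully curled. Real values depend on the sensor and divider resistor.
const int kFlat = 760;      // finger straight (assumed)
const int kFullBend = 400;  // finger fully bent (assumed)

// Map a raw flex-sensor reading to a bend angle: 0 deg = straight, 90 deg = curled.
double flexToDegrees(int adc) {
    double t = double(kFlat - adc) / double(kFlat - kFullBend);
    if (t < 0.0) t = 0.0;   // clamp outside the calibrated range
    if (t > 1.0) t = 1.0;
    return t * 90.0;
}

// Wrist pitch in degrees from two accelerometer axes. With the arm at rest,
// gravity is the only acceleration, so the tilt falls out of atan2.
double tiltDegrees(double ax, double az) {
    return std::atan2(ax, az) * 180.0 / 3.14159265358979323846;
}
```

With the assumed calibration, a reading of 760 maps to a straight finger and 400 to a fully bent one; anything outside that range is clamped rather than extrapolated.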

2.) Microsoft Kinect's skeletal tracking capability helps reduce the learning curve on the mechanical-engineering side, since the operator's arm does not need to be instrumented with its own sensors.

It is used to retrieve the joint angles: between the elbow and shoulder, and between the elbow and wrist.
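A joint angle can be computed from three tracked joint positions with a dot product. The `Vec3` struct below stands in for whatever joint-position type the Kinect SDK exposes; this is an illustrative sketch, not the project's code.

```cpp
#include <cmath>

struct Vec3 { double x, y, z; };  // stand-in for a Kinect joint position

// Angle at the elbow, i.e. between the elbow->shoulder and elbow->wrist
// vectors, in degrees. In a real app the three positions would come from
// the Kinect SDK's skeletal-tracking frame.
double jointAngleDeg(Vec3 shoulder, Vec3 elbow, Vec3 wrist) {
    Vec3 u = { shoulder.x - elbow.x, shoulder.y - elbow.y, shoulder.z - elbow.z };
    Vec3 v = { wrist.x - elbow.x,    wrist.y - elbow.y,    wrist.z - elbow.z };
    double dot = u.x * v.x + u.y * v.y + u.z * v.z;
    double nu  = std::sqrt(u.x * u.x + u.y * u.y + u.z * u.z);
    double nv  = std::sqrt(v.x * v.x + v.y * v.y + v.z * v.z);
    return std::acos(dot / (nu * nv)) * 180.0 / 3.14159265358979323846;
}
```

For example, with the shoulder directly above the elbow and the wrist straight out in front, the two vectors are perpendicular and the function returns 90 degrees.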

3.) Controllers handle the processing of input and output - capturing input data and producing processed output data according to the firmware's programming.

Controllers are essentially microcomputers; in this case, their job is to read the various sensors on the glove and actuate the hand's motors accordingly.

The Arduino is single-board hardware aimed at hobbyists. Its firmware can be written in C/C++ using the provided IDE, which is Java-based (and therefore cross-platform).
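The read-process-actuate cycle of such firmware can be sketched as below. On a real Arduino this logic would live in `loop()` and use `analogRead()` and the Servo library; here the hardware calls are left out so the mapping logic itself can be shown. The sensor range is the same assumed calibration as above.

```cpp
// Re-implementation of Arduino's map(): rescale a value from one integer
// range to another, the way the stock map() macro does.
long mapRange(long x, long inMin, long inMax, long outMin, long outMax) {
    return (x - inMin) * (outMax - outMin) / (inMax - inMin) + outMin;
}

// One control step: raw flex-sensor reading in, servo angle (0-180) out.
// Assumed sensor range: 400 (finger bent) to 760 (finger straight).
int fingerServoAngle(int adcReading) {
    if (adcReading < 400) adcReading = 400;  // clamp to calibrated range
    if (adcReading > 760) adcReading = 760;
    // Straight finger -> servo at 0 (hand open); bent -> 180 (hand closed).
    return int(mapRange(adcReading, 400, 760, 180, 0));
}
```

On the Arduino itself, the cycle would be `analogRead()` the flex pin, pass the reading through `fingerServoAngle()`, then `servo.write()` the result, repeating in `loop()`.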

There are two portions to programming:

1.) Kinect skeletal tracking - retrieving the skeletal angles and feeding them into the controller

This uses the Kinect SDK that Microsoft provides for developers.

2.) Controllers' firmware programming - generating the output instructions that drive the motors' actuators

This uses the default cross-platform IDE, which supports C/C++.
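The two programs also have to talk to each other: the PC side must ship the Kinect angles to the controller, typically over the serial port. One simple approach is a small framed packet with a start byte and a checksum. The frame layout here is an assumption for illustration, not the project's actual protocol.

```cpp
#include <cstdint>

const uint8_t kStart = 0xA5;  // assumed start-of-frame marker

// PC side: pack the two arm angles (0-180 each) into a 4-byte frame.
void encodeFrame(uint8_t shoulderDeg, uint8_t elbowDeg, uint8_t out[4]) {
    out[0] = kStart;
    out[1] = shoulderDeg;
    out[2] = elbowDeg;
    out[3] = uint8_t(shoulderDeg + elbowDeg);  // simple additive checksum
}

// Firmware side: validate a received frame; on success fill in the angles
// and return true, otherwise reject the frame.
bool decodeFrame(const uint8_t in[4], uint8_t* shoulderDeg, uint8_t* elbowDeg) {
    if (in[0] != kStart) return false;
    if (uint8_t(in[1] + in[2]) != in[3]) return false;  // corrupted byte
    *shoulderDeg = in[1];
    *elbowDeg = in[2];
    return true;
}
```

The checksum lets the firmware drop frames garbled on the wire instead of jerking the motors to a bogus angle; a real design might also resynchronize by scanning for the start byte.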


