Monday 7 July 2014

Gesture-Controlled Robot Using KINECT and LEGO MINDSTORMS!

Objective of the Experiment
This application aims to drive a vehicle remotely, without a driver physically sitting at the steering wheel. A user standing in front of the Kinect sensor controls the LEGO robot with hand gestures.

HARDWARE REQUIREMENTS:
1. LEGO MINDSTORMS NXT KIT

2. KINECT SENSOR

[Image: https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEipKusAvBOos5uv1T_2mE5jnLUwwYsePQcCQEKWaYwGuwk7qkPu-AO12KmMthgdhnaxpY7btQrJNtWtlLzXlR35IIA_A2ttqxNaGpqpJOIfDb2-ixNUqWrM5d9M6jzzhRIuAnKkl4jfbMrl/s1600/Untitled.png]


Download Files Required
Kinect SDK 1.8

Kinect Developer Toolkit 1.8
https://drive.google.com/file/d/0B3e4_6C5_YOjaW85QzVKR2lYVkU/edit?usp=sharing


OpenCV (used to display the frames captured from the Kinect)

NXT++ (used to communicate with the LEGO MINDSTORMS NXT)
NXT++ is an interface written in C++ that allows control of LEGO MINDSTORMS NXT robots directly through a USB or Bluetooth connection. The interface is intended to be simple and easy to use, and it can be used in any C++ program.
http://sourceforge.net/projects/nxtpp/
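
As a quick check that the PC can talk to the brick, a minimal NXT++ program in the style of the library's sample code could look like the sketch below. This is not the code used in this project; the NXT::OpenBT / NXT::Motor call names follow the NXT++ samples and may differ slightly between NXT++ releases.

#include "NXT++.h"     // NXT++ header
#include <windows.h>   // Sleep()

int main()
{
    Comm::NXTComm comm;

    // Connect to an NXT brick that has already been paired over Bluetooth.
    if (NXT::OpenBT(&comm))
    {
        NXT::Motor::SetForward(&comm, OUT_A, 50);  // run motor A at 50% power
        Sleep(2000);                               // let it run for ~2 seconds
        NXT::Motor::Stop(&comm, OUT_A, false);     // stop the motor (coast)
    }

    NXT::Close(&comm);
    return 0;
}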

STEPS
1. Install the required software on the PC:
-- Install Kinect for Windows SDK
-- Install OpenNI 2.2 SDK, 32-bit / 64-bit (install both if you are using 64-bit Windows)
-- Install NiTE 2.2 (again, install both 32-bit and 64-bit versions on 64-bit Windows)

2. Install Visual Studio 2010
-- Start a new empty C++ project in Visual Studio
-- Open the Property Manager to add the include and library paths
-- In the Property Manager:
      a) C/C++ → General → add the include directories for OpenCV, NXT++, NiTE2 and OpenNI
      b) Linker → General → add the paths to the library files for OpenCV, NXT++, NiTE2 and OpenNI (e.g. C:\Program Files (x86)\OpenNI2\Lib)
      c) Linker → Input → add the following library files:
                  i.  fantom.lib (NXT++)
                  ii. OpenNI2.lib (OpenNI)
                  iii. NiTE2.lib (NiTE2)
                  plus the library files required for OpenCV
3. Turn on Bluetooth on both the PC and the LEGO brick and pair them manually using the Microsoft Bluetooth pairing wizard.
4. Identify the hand gesture in each frame from the x, y, z coordinates of the tracked hand.
5. Determine the direction in which the vehicle should move by comparing the x, y, z values against four thresholds (see the sketch after this list).
6. If the coordinates do not cross any of these thresholds, stop the vehicle.
7. Send the command from the PC to the LEGO MINDSTORMS brick over Bluetooth using the APIs of the Fantom library.

8. The command turns the servo motors according to the direction obtained from the Kinect, which drives the LEGO left, right, forward or backward, or stops it.
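
Putting steps 4 to 8 together, the sketch below shows one possible way to do it: track a hand with NiTE 2, compare its position against four thresholds to pick a direction, and send the corresponding motor commands to the NXT through NXT++. The threshold values, the axis-to-direction mapping, the motor ports (OUT_A / OUT_C) and the classify/drive helpers are illustrative assumptions rather than the exact code used here, and NXT++ function names may differ slightly between releases.

// Gesture -> direction -> NXT motor commands (illustrative sketch)
#include "NiTE.h"
#include "NXT++.h"

enum Direction { STOP, FORWARD, BACKWARD, LEFT, RIGHT };

// Thresholds in millimetres around the point where hand tracking started
// (placeholder values - tune them for your own setup).
const float X_LEFT = -150.0f, X_RIGHT = 150.0f;   // left / right
const float Z_NEAR = -150.0f, Z_FAR  = 150.0f;    // forward / backward

Direction classify(const nite::Point3f& hand, const nite::Point3f& origin)
{
    float dx = hand.x - origin.x;
    float dz = hand.z - origin.z;
    if (dx < X_LEFT)  return LEFT;
    if (dx > X_RIGHT) return RIGHT;
    if (dz < Z_NEAR)  return FORWARD;   // hand pushed towards the sensor
    if (dz > Z_FAR)   return BACKWARD;  // hand pulled back
    return STOP;                        // inside the dead zone -> stop
}

void drive(Comm::NXTComm* comm, Direction d)
{
    // Which motor sits on which side depends on how the vehicle is built.
    switch (d) {
    case FORWARD:  NXT::Motor::SetForward(comm, OUT_A, 60);
                   NXT::Motor::SetForward(comm, OUT_C, 60); break;
    case BACKWARD: NXT::Motor::SetReverse(comm, OUT_A, 60);
                   NXT::Motor::SetReverse(comm, OUT_C, 60); break;
    case LEFT:     NXT::Motor::SetForward(comm, OUT_A, 60);
                   NXT::Motor::SetReverse(comm, OUT_C, 60); break;
    case RIGHT:    NXT::Motor::SetReverse(comm, OUT_A, 60);
                   NXT::Motor::SetForward(comm, OUT_C, 60); break;
    default:       NXT::Motor::Stop(comm, OUT_A, false);
                   NXT::Motor::Stop(comm, OUT_C, false);    break;
    }
}

int main()
{
    Comm::NXTComm comm;
    if (!NXT::OpenBT(&comm)) return 1;          // brick must already be paired

    nite::NiTE::initialize();
    nite::HandTracker tracker;
    if (tracker.create() != nite::STATUS_OK) return 1;
    tracker.startGestureDetection(nite::GESTURE_WAVE);  // wave to start tracking

    nite::Point3f origin;                        // position where tracking began
    bool haveOrigin = false;

    while (true) {
        nite::HandTrackerFrameRef frame;
        if (tracker.readFrame(&frame) != nite::STATUS_OK) continue;

        // Promote completed wave gestures to tracked hands.
        const nite::Array<nite::GestureData>& gestures = frame.getGestures();
        for (int i = 0; i < gestures.getSize(); ++i)
            if (gestures[i].isComplete()) {
                nite::HandId id;
                tracker.startHandTracking(gestures[i].getCurrentPosition(), &id);
            }

        // Classify the first tracked hand and drive the robot accordingly.
        const nite::Array<nite::HandData>& hands = frame.getHands();
        Direction d = STOP;
        for (int i = 0; i < hands.getSize(); ++i)
            if (hands[i].isTracking()) {
                if (!haveOrigin) { origin = hands[i].getPosition(); haveOrigin = true; }
                d = classify(hands[i].getPosition(), origin);
            }
        drive(&comm, d);
    }
}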


Extension to the project:
At present the robot's environment is known to us. By fitting a camera on the LEGO and transmitting the video wirelessly to the PC or a head-mounted display, we could send the LEGO robot anywhere on campus, which would give the feel of driving it remotely.


