Objective
The objective of this project is to recognize hand gestures and then use the recognized gestures in an image-rotation application. Once a gesture is sensed, the same pipeline could drive other applications, e.g. scrolling a lift-floor display, selecting a destination station in the metro, or controlling a car's steering. Here we first recognize hand movements (left, right, clockwise, anticlockwise) and then use the result to rotate and shift an image. We carried out the project in the following step-by-step procedure.
- We first selected a mobile accelerometer as the motion-sensing source. For this purpose we used Sensor Node, a free app available on the Play Store.
- Using the app, we collected a data set to generate weights for the four motions: we held the phone in hand, performed each motion, and recorded the accelerometer output from the app.
- After collecting the data set, we used an appropriate machine-learning algorithm to generate the weights.
- We used those weights in our program in the Keil µVision 5 tool to predict the direction of motion.
- We set up a UDP connection to receive the accelerometer data on the PC, and forwarded it to the Keil µVision 5 tool over a UART link.
- Once the UART connection was set up in Python, we used it to stream the input data and applied the trained weights to predict the direction of motion.
- After correctly predicting the direction of motion, we loaded image data into the Keil µVision 5 tool and used it as an image array for rotating and shifting the image.
- Both images (the test image as well as the updated image) are displayed on the TExaS Nokia 5110 screen that is available in the Keil µVision 5 tool.
Sending Data from sensors to KEIL simulation environment
As the very first step of this project, we set up a connection between the mobile app (APK henceforth) and our PC to send the accelerometer output data. This is done over a UDP connection: the APK transmits the data to a UDP socket on the PC, and a Python script listening on the appropriate port captures it. To simulate a real UART connection we created two virtual COM ports, which behave like two hardware ports connected to each other over UART. After receiving the data, the Python script redirects (writes) it onto one virtual COM port; our program in the Keil environment reads the other port to obtain the sensor values.
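The bridge described above can be sketched in Python with the standard `socket` module and the pyserial library. The port numbers, the COM port name, and the comma-separated packet format are illustrative assumptions, not details from the original setup:

```python
# Sketch of the UDP -> virtual COM bridge: listen for accelerometer packets
# from the app and re-emit them onto one end of the virtual COM port pair.
# Assumes (hypothetically) that each UDP packet is ASCII text like "0.12,-9.81,0.30".
import socket


def parse_packet(packet: bytes):
    """Parse one UDP packet of the assumed form b'x,y,z' into three floats."""
    x, y, z = (float(v) for v in packet.decode().strip().split(","))
    return x, y, z


def run_bridge(udp_port=5555, com_port="COM5", baud=115200):
    """Forward sensor packets from a UDP socket to a virtual COM port."""
    import serial  # pyserial; COM5 here stands in for one virtual port of the pair
    uart = serial.Serial(com_port, baud)
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(("", udp_port))
    while True:
        packet, _addr = sock.recvfrom(1024)
        x, y, z = parse_packet(packet)
        # One line per reading, so the Keil side can read the UART line by line
        uart.write(f"{x},{y},{z}\n".encode())
```

Calling `run_bridge()` then loops forever, and the Keil program reads the paired COM port as if a real UART device were attached.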
Machine Learning: Training our model to recognize Gestures
We collected 20 data points for each gesture and then used them to train our feedforward neural network. By training we mean finding the best weight values for the network so that it can predict the gestures with good accuracy on future inputs.
Model Description of our feed forward network:
- 1st layer (input layer): 10 nodes
- 2nd layer (hidden layer): 15 nodes
- Last layer (output layer): 4 nodes
We made three such models, one each for the X, Y, and Z values from the sensor. On analyzing the training data we found that the X and Z values play the major role in deciding the gesture, so we gave the X and Z model predictions more weight than the Y model prediction.
The training was done in python using the TensorFlow library.
After training the models we obtained the weights, which we then used in our embedded C code to do the gesture prediction.
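The prediction step can be sketched in Python as below (the embedded C code performs the same arithmetic). The sigmoid activation, the layer helper, and the 0.4/0.2/0.4 voting weights are illustrative assumptions, not the trained values:

```python
import math


def sigmoid(vec):
    """Elementwise logistic activation (assumed here; the real model may differ)."""
    return [1.0 / (1.0 + math.exp(-x)) for x in vec]


def dense(inputs, weights, biases):
    """One fully connected layer: out[j] = sum_i inputs[i] * weights[i][j] + biases[j]."""
    return [
        sum(inputs[i] * weights[i][j] for i in range(len(inputs))) + biases[j]
        for j in range(len(biases))
    ]


def predict(window, w1, b1, w2, b2):
    """Feed a 10-sample window through the 10-15-4 network; returns 4 gesture scores."""
    hidden = sigmoid(dense(window, w1, b1))
    return sigmoid(dense(hidden, w2, b2))


def combine(px, py, pz, vote=(0.4, 0.2, 0.4)):
    """Weighted vote over the X, Y, Z model outputs; X and Z count more than Y."""
    scores = [vote[0] * x + vote[1] * y + vote[2] * z for x, y, z in zip(px, py, pz)]
    return scores.index(max(scores))  # index of the predicted gesture
```

`combine` returns the gesture index whose weighted score is highest, which is how the three per-axis models are merged into one decision.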
Image Rotation and Display
After correctly predicting the hand gesture, we implement the image-rotation application. For this, we first converted the image to the BMP format and extracted the 2-D image array (i.e. the image matrix) from the BMP file. Keeping the image header fixed, we then applied matrix rotation operations to the image array to rotate the image in the desired direction (clockwise or anticlockwise). For shifting the image left or right, we added or subtracted an offset in the image data so that the resulting image is shifted with respect to the original. We used the TExaS Nokia 5110 screen that is available in the Keil µVision 5 tool to display the image before and after rotation.
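The pixel-level operations can be sketched on a small matrix in Python (the Keil version applies the same idea to the BMP pixel array; the wrap-around behavior of the shift here is an illustrative choice):

```python
def rotate_cw(img):
    """Rotate a 2-D pixel matrix 90 degrees clockwise."""
    return [list(row) for row in zip(*img[::-1])]


def rotate_ccw(img):
    """Rotate a 2-D pixel matrix 90 degrees anticlockwise."""
    return [list(row) for row in zip(*img)][::-1]


def shift_right(img, offset=1):
    """Shift every row right by `offset` pixels, wrapping around the edge."""
    return [row[-offset:] + row[:-offset] for row in img]
```

For example, `rotate_cw([[1, 2], [3, 4]])` moves the top-left pixel to the top-right, giving `[[3, 1], [4, 2]]`.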
The image above shows an up arrow as displayed on the Nokia 5110 screen. We will now rotate this arrow clockwise and anticlockwise, and shift it left and right, using our gestures.
Software Tools
- APK used:
  - Sensor Node
- Simulation platform:
  - Keil µVision 5 Microcontroller Development Kit (MDK version 5) by ARM
  - TExaS Nokia 5110 display
- Python 3 & TensorFlow
- Virtual COM port connector
References
- Tiva™ TM4C123GH6PM Microcontroller Datasheet, SPMS376E
- Library for the TExaS Nokia 5110: http://users.ece.utexas.edu/~valvano/arm/