Hand Gesture Detection for Sign Language Translation

A. INTRODUCTION

CE 3.1

I have provided the descriptive details of the project titled “Hand Gesture Detection System for Sign Language” in this section. I carried out this project while studying at the Karachi Institute of Economics and Technology, pursuing my Bachelor of Engineering in Electronics Engineering, in partial fulfillment of the requirements of the degree. I began the project on this date and completed it on this date.

B. BACKGROUND

CE 3.2

This work was undertaken to make the lives of people with hearing and speech impairments easier by applying technological advancements in electronics engineering. While video calls can be useful between two people who both understand sign language, someone who does not know sign language cannot communicate with a person with such a disability; this fuelled the need for a system that would interpret sign language and translate it into audible speech. For this purpose, a convenient glove was designed in which flex sensors were placed, along with contact sensors and a gyroscope, to measure the flexion of the fingers and the rotation of the hand; an appropriate algorithm then translated these readings into audio. A Bluetooth module was used to communicate the gestures interpreted by the Arduino from the sensors to a cell phone.

CE 3.3

The proposed work focused on the following objectives:


  • To capture the various hand gestures for sign language recognition using sensors

  • To develop a machine algorithm for differentiating hand gestures and producing voice output

  • To provide communication between the sensing device and the voice output device (cell phone)

CE 3.4

I fulfilled my responsibilities both as a team lead, handling the managerial work, and as a team member, performing the design, development, and testing of the system. I researched various papers and journals published on this subject before executing the task. I held discussions on various techniques to select appropriate methods for gesture detection and built the prototype of the system. I configured the assembly, programmed the microcontroller, and performed compilation and testing of the project using the software.

CE 3.5

Following is the hierarchical chart depicting the order and authority under which I carried out this project:


Fig: Organizational chart

CE 3.6

The duties carried out in this project are briefly listed below:


  • To study the literature to acquire an understanding of the selection of components for the proposed system

  • To design the block diagram of the proposed system showing all the functional elements

  • To compile a list of sign language gestures and understand how each gesture is performed

  • To design the sensing environment by implementing flex sensors and contact sensors

  • To develop the communication system between the sensing environment and software system

  • To develop the Matlab code for differentiation and recognition of sign language gestures

  • To assemble the circuit and program the microcontroller to accomplish the task

  • To perform functional testing of the system

C. PERSONAL ENGINEERING ACTIVITY

CE 3.7

I used an Arduino Nano board, which uses the ATmega328P microcontroller, for processing the hand gestures into text. I used two hand gloves, i.e. master and slave gloves, for gesture recognition, with flex sensors attached to each of them. Each glove also has an nRF24L01 RF transceiver connected for inter-glove control signal transmission; the nRF24L01 operates at 2.4 GHz. For sharing the data with the computer, I used the HC-05 Bluetooth module. I attached the flex sensors near the knuckles of the fingers to detect finger flexion. I made contact sensors from scratch using aluminum foils of 0.7-inch diameter for touch sensing. I used the MPU6050 sensor, which combines an accelerometer and a gyroscope, for detecting hand motion. I used the Arduino IDE and Matlab software for programming the microcontrollers and implementing the gesture detection algorithms.

CE 3.8

I applied the knowledge gained from my then-ongoing engineering education in various subjects such as microcontroller architecture and pin configurations, electronic devices and circuits, sensors, wireless communication, and VHDL and Verilog programming. I used my programming knowledge to write the Matlab code for the gesture differentiation and detection algorithm, and my microcontroller programming skills to code the Arduino. I referred to IEEE 1451 for the reliability of the sensors used. I followed the IEEE 802.15.1 standard for the Bluetooth transmission. I followed the UL 60065 standard for the overvoltage safety parameters of the electronic components.

CE 3.9

CE 3.9.1

I carried out a study on gesture recognition systems by going through various research papers. I studied existing systems that used sensors for sign language conversion, noting their drawbacks, and thus selected the most suitable technology and devices for the design and development of the proposed system. I designed the system on the basis of a master-slave glove model, in which the master and slave gloves connect to each other through an RF transmission system. I carried out the design in three sections, viz. gesture sensing, processing, and the output subsystem. The block diagram of the proposed system shows the various functional blocks: I used the Arduino Nano board as the control unit and placed flex sensors along the fingers of the gloves to detect bending. I connected each glove to an RF transceiver for communication. For sensing the motion of the hand, I used the MPU6050 motion sensor, with an ADC converting the sensor's analog output to digital data.


Fig: Block diagram of the proposed system

CE 3.9.2

For the selection of components, I considered implicit requirements such as portability and ease of use. I chose woven tufted fabric for the glove because of its reliability as well as its comfort. I then moved on to the hardware development, which included designing the flex sensor and inertial measurement subsystems and the intercommunication between the processing device and the sensor gloves on a PCB. I used a 3.2” flex sensor and carried out a simulation of the flex sensor system in Matlab to ensure sufficient sensitivity and linearity. I implemented the voltage divider rule (VDR) technique by applying 5 V across the flex sensor connected in series with a grounded pulldown resistor, and detecting the voltage change using the ADC. I calculated the maximum possible voltage for various resistance values across the flex sensor, obtaining 2.39 V with a 27 kΩ resistance. I then studied the linearity of the flex sensor in real time by placing the sensor over the hand and determining a suitable position with adequate voltage variation. I read the voltage drop values into the microcontroller using the analogRead() command. Likewise, I implemented the other four flex sensors on the hand gloves and used the flexion values in the microcontroller programming.

CE 3.9.3

Then, I started the design and development of the contact sensor by reading the voltage on one terminal of the contact while applying high logic to the other terminal to define a contact. I used aluminum foils of about 0.7-inch diameter, supplying logic high to one foil and connecting the other foil to a digital pin of the Arduino. I connected pull-up resistors to the sensor to stop the flickering of its output. After I obtained proper values for the contacts, I fixed them on the glove. I added inertial measurement functionality by connecting the MPU6050 to the microcontroller and reading the 6-DOF parameters. I connected the SDA and SCL pins of the MPU6050 module to the A4 and A5 pins of the Arduino board respectively, and supplied the sensor with 5 V power and ground from the Arduino board itself. Then, to provide inter-glove communication, I used an HC-12 RF transceiver, which operates at 433 MHz, for two-way communication between the gloves. I cross-connected its TX and RX pins to the RX and TX pins of the microcontroller. I performed stability and range tests on this module to check the communication distance without any data loss, which gave a range of 1 m.

CE 3.9.4

After that, I configured the HC-05 Bluetooth module with the microcontroller connected to the master glove. The Bluetooth module was interfaced through the serial communication protocol; using the TX and RX pins (digital pins 0 and 1), I created a virtual serial port. Using a mySerial instance in the Arduino IDE, I programmed the output to be displayed on the computer. For communication with the PC for Matlab processing, I used the UART port over USB, and using Arduino's serial library and the Serial.print() command, I programmed the data to be displayed. In the Arduino IDE, I wrote the code for reading data from the sensors. I used the dmpGetYawPitchRoll() function to obtain the yaw, pitch, and roll values, and applied a median filter to stabilize the IMU values. To scale the values to the 0–360 range, I added 180 to each value and appended the results to gyrostring.

CE 3.9.5

I placed the IMU sensor on the right hand because of its dominance in sign language, and so I wrote the code for the right-hand IMU only. Once the termination signal was generated from the contact sensor value, I called the getFlexValues(), getGyroValues(), and updateGestureString() functions to append these values into a single string. I set a baud rate of 9600 bps and defined the control signals ‘11’ and ‘12’ for gesture initialization and termination, respectively. I also wrote the Matlab code that split the string received from the Arduino into the right- and left-hand flex values and the six gyro values. I used gesture libraries to compare the obtained flex values with the library values, matching them within a 30% margin of error. Then, I moved on to the evaluation and testing of the system.

CE 3.9.6

I used 5 V batteries to power each glove. I stacked the battery and the Arduino Nano on top of each other to save space, and I placed the MPU sensor near the wrist. I powered the master and slave gloves for use and connected the system to the PC's serial port. On the PC, I used CoolTerm, a serial port terminal application, and set the port, data bits, and baud rate. From the list of signs used in sign language, obtained from sign language experts, I gestured a sign for the trial. The gestures were collected and stored in a text file, which I moved to MS Excel and then loaded into Matlab as neural network inputs. After processing, the corresponding gesture was returned as output.

CE 3.10 TECHNICAL PROBLEMS AND SOLUTIONS


  • I found the output flickering between low and high logic when the contact connection was incomplete. This was caused by the high-impedance state (Z-state). To avoid this Z-state, I used a pull-up resistor, which removed the floating input and gave a steady output. However, the issue of a repeated high value remained even when no contact was made. To remove these false values, I applied a debouncer function in which I used the millis() function to obtain the last debounce time and subtracted it from the present instant of time.

  • During the testing of the motion sensing, I used the MPU6050 IMU sensor, which captured the 6-DOF parameters. When I plotted these DOF values against various gestures in Matlab, I found unexpected spikes in the result, which was undesirable. To overcome this, I carried out research, came up with the idea of using the Euler angles approach, and applied a median filter over the obtained values. This technique solved the problem.

CE 3.11 CREATIVE WORK

The material selection was one of the most innovative aspects of the project, aimed at cost reduction and user comfort: I selected woven tufted gloves over nylon and woolen gloves, since nylon gloves could hold static electricity and woolen ones were uncomfortable. I took other innovative steps as well. For the implementation of the flex sensor, I simulated the system to determine the resistance value giving the maximum voltage change across the flex sensor, rather than determining it by trial and error in real time. Here, I used the VDR technique, creating an Excel sheet to calculate the maximum possible voltage for a given resistance using the voltage divider formula V_out = V_in × R / (R_flex + R). This approach helped reduce the time consumed for the purpose.

CE 3.12

It would have been impossible to conduct and complete this project without the support of and coordination between the team members. The team held discussions and shared views for the efficient conduct of the project. My project supervisor gave me motivation, technical support, and resources for the research and literature review for this project. I applied the supervisor's advice to my project to make it better and more relevant. I received support from the library and laboratory staff regarding the availability of materials and access during unusual working hours.

D. SUMMARY

CE 3.13

In this project, a gesture detection system for interpreting sign language into words or speech using a microcontroller, sensors, a communication module, and software processing was successfully designed and developed. Various tests were performed throughout the hardware and software implementation stages of the project for material selection, algorithm development, and proper functioning of the system. About 1.4–1.9 s was taken for the transmission of the gesture data (i.e. sensor data) to the output subsystem for processing after the termination of the gesture, as signaled by the contact sensor. The processing time, as determined using the tic/toc functions of Matlab, was around 0.5 s.

CE 3.14

Ease of communication for people with hearing or speech disabilities was achieved through the conversion of signs to text. I designed and developed the sensing environment that sensed the gestures of sign language through flex sensors, while the dynamic movement was determined using the IMU sensor. I performed a continuous check of the sensing environment through the microcontroller, developed the Arduino code for interfacing the sensors and the communication module, and developed the Matlab code implementing a neural network algorithm for recognition of the gestures.

CE 3.15

This project enriched my understanding of the vast scope of microcontroller applications and Matlab programming. My programming skills in Arduino and Matlab coding improved. I gained in-depth knowledge of sensors and communication systems. This project also made me capable of tackling and solving issues that arise during the execution of a project. My knowledge of project management also improved.