Indian Sign Language Recognition using Convolutional Neural Network

Indian sign language recognition using convolution neural network

E3S Web of Conferences

The goal of the project is to create a machine learning model that can classify the numerous hand motions used in sign language fingerspelling. Communication with deaf and dumb persons is frequently difficult. Sign language consists of a variety of hand, finger, and arm motions that help the deaf and hard of hearing communicate with others and vice versa. In this user-independent model, classification machine learning algorithms are trained on one set of image data and tested on a completely different set. For some people with particular needs, sign language is their only means of communicating their thoughts and feelings. It enables them to understand the world around them through visual descriptions and hence contribute to society. As a result, our model helps address the problem more broadly: by observing the user's hand gestures, the system translates sign language into everyday words.
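A minimal sketch of what a user-independent split could look like: images from held-out signers never appear in the training set. The directory layout and subject IDs below are illustrative assumptions, not details from the paper.

```python
# Hypothetical layout: isl_dataset/<subject_id>/<label>/<image>.jpg
from pathlib import Path

DATA_DIR = Path("isl_dataset")                       # assumed dataset root
TEST_SUBJECTS = {"subject_09", "subject_10"}          # signers reserved for testing only

def split_by_subject(data_dir: Path):
    """Return (train, test) lists of (path, label), split by signer."""
    train, test = [], []
    for img_path in data_dir.glob("*/*/*.jpg"):
        subject_id = img_path.parts[-3]
        label = img_path.parts[-2]
        (test if subject_id in TEST_SUBJECTS else train).append((img_path, label))
    return train, test

train_samples, test_samples = split_by_subject(DATA_DIR)
print(f"train: {len(train_samples)} images, test: {len(test_samples)} images")
```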

Sign Language Recognition System using Convolutional Neural Network and Computer Vision

International Journal of Engineering Research and Technology (IJERT), 2020

https://www.ijert.org/sign-language-recognition-system-using-convolutional-neural-network-and-computer-vision
https://www.ijert.org/research/sign-language-recognition-system-using-convolutional-neural-network-and-computer-vision-IJERTV9IS120029.pdf

Conversing with a person with a hearing disability is always a major challenge. Sign language has indelibly become the ultimate panacea and is a very powerful tool for individuals with hearing and speech disabilities to communicate their feelings and opinions to the world. It makes the integration process between them and others smooth and less complex. However, the invention of sign language alone is not enough; there are many strings attached to this boon. The sign gestures often get mixed up and confused for someone who has never learnt the language or knows it in a different form. However, this communication gap, which has existed for years, can now be narrowed with the introduction of various techniques to automate the detection of sign gestures. In this paper, we introduce a sign language recognition system using American Sign Language. The user captures images of hand gestures with a web camera, and the system predicts and displays the name of the captured sign. We use the HSV colour algorithm to detect the hand gesture and set the background to black. The images undergo a series of processing steps which include various computer vision techniques such as conversion to grayscale, dilation and mask operations, after which the region of interest, in our case the hand gesture, is segmented. The features extracted are the binary pixels of the images. We make use of a Convolutional Neural Network (CNN) for training and to classify the images. We are able to recognise 10 American Sign Language alphabet gestures with high accuracy; our model has achieved a remarkable accuracy of above 90%.
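The preprocessing described above could look roughly like the sketch below: HSV-based skin segmentation, black background, grayscale conversion, dilation and masking of the region of interest. The HSV bounds are illustrative skin-tone thresholds, not the paper's exact values.

```python
import cv2
import numpy as np

def segment_hand(frame_bgr):
    """Return a binary image of the hand region on a black background."""
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    lower, upper = np.array([0, 40, 60]), np.array([25, 255, 255])   # assumed skin range
    mask = cv2.inRange(hsv, lower, upper)                            # skin pixels -> white
    mask = cv2.dilate(mask, np.ones((3, 3), np.uint8), iterations=2) # fill small gaps
    hand_only = cv2.bitwise_and(frame_bgr, frame_bgr, mask=mask)     # background set to black
    gray = cv2.cvtColor(hand_only, cv2.COLOR_BGR2GRAY)
    _, binary = cv2.threshold(gray, 10, 255, cv2.THRESH_BINARY)      # binary pixels fed to the CNN
    return binary

# Example usage: binary = segment_hand(cv2.imread("gesture.jpg"))
```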

Vision-based Hand Gesture Recognition for Indian Sign Language Using Convolution Neural Network

IETE Journal of Research, 2020

Hearing-impaired people can interact with other people through sign language. The proposed system tears down the communication barrier between the Hard of Hearing (HoH) community and those who do not know their sign language. In this paper, we have developed an algorithm to detect and segment the hand region from a depth image using the Microsoft Kinect sensor. The proposed algorithm works well in cluttered environments, e.g. skin-coloured backgrounds and cases where the hand overlaps the face. Convolutional Neural Networks (CNN) are applied to automatically construct features from Indian Sign Language (ISL) signs. These features are invariant to rotation and scaling. The proposed system recognizes gestures with an accuracy of up to 99.3%.
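A rough sketch of depth-based hand segmentation of the kind described above, assuming the hand is the object closest to the Kinect and depth is reported in millimetres. The 120 mm band width is an illustrative choice, not the paper's parameter; the point is that the mask depends only on depth, so skin-coloured backgrounds and face overlap do not confuse it.

```python
import numpy as np

def segment_hand_from_depth(depth_mm: np.ndarray, band: int = 120) -> np.ndarray:
    """Return a binary mask of the nearest surface (assumed to be the hand)."""
    valid = depth_mm[depth_mm > 0]                 # Kinect reports 0 where depth is unknown
    nearest = valid.min()                          # closest surface = hand candidate
    mask = (depth_mm > 0) & (depth_mm < nearest + band)
    return mask.astype(np.uint8) * 255             # hand mask, independent of skin colour
```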

Simulation and Analysis of Hand Gesture Recognition for Indian Sign Language using CNN

International Journal on Recent and Innovation Trends in Computing and Communication

Sign language recognition is a device or program to help deaf and mute people, for whom communication has always been difficult. By recognizing sign language, the system lets people who cannot speak communicate easily with the average person and reduces the communication gap between them. About 15-20% of the world's population is deaf or mute, which is a clear indication of the need for a sign language recognition program. Different methods are used to identify sign language, but many are not practical for economic and commercial reasons, so we use a cheap and affordable method. Sign language recognition systems based on image processing and neural networks are therefore preferred over gadget-based programs, as they are more accurate and easier to implement. This paper aims to create an easy-to-use and accurate...

Indian sign language recognition system

This article briefs about the development of an Indian Sign Language recognition system using machine learning. Two-handed gestures are generally used in Indian Sign Language to communicate. Creating a machine learning model that can identify the different hand gestures used in sign language fingerspelling is the aim of this paper. Convolutional Neural Network (CNN) algorithms are applied on the datasets.

Recognizing Bengali Sign Language Gestures for Digits in Real Time using Convolutional Neural Network

International Journal of Computer Science and Information Security (IJCSIS), Vol. 19 No. 1, January 2021

Recognizing sign language gestures for different languages has been found to be a promising field of research that explores the possibility of communication by interpreting various signs and translating them into text or speech. Establishing a better way of communication between deaf-mute people and ordinary people is the prime objective of this research arena. There are many existing Sign Language Recognition (SLR) systems throughout the world, and these SLR systems are implemented using various methods, tools and techniques with a view to achieving better recognition accuracy. This research work aims at applying the concept of Convolutional Neural Networks (CNN) to recognizing Bengali Sign Language gesture images for digits only, in real time. Bengali sign language images for digits are collected from different individuals, and the CNN model is trained with these images after performing several pre-processing tasks, i.e. resizing to a specific dimension, converting the RGB images to grayscale, finding the equivalent binary images and rotating the images by different degrees in both the left and right directions. The experiment is conducted using two major techniques. Firstly, the model is trained with the dataset containing the equivalent binary images of the raw images collected directly from different individuals. Secondly, the dataset is enriched by rotating all the images by 3°, 6°, 9°, 12° and 15° in both the left and right directions. After applying the rotation technique, the recognition accuracy increases significantly. The maximum recognition accuracy of the proposed CNN model on the dataset without the image rotation technique is 94.17%, whereas the accuracy is 99.75% when the rotated images are included in the dataset.
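The rotation-based augmentation described above could be implemented roughly as follows: each binary image is rotated by 3, 6, 9, 12 and 15 degrees in both directions, enlarging the training set elevenfold. The helper functions are illustrative, not the paper's code.

```python
import cv2

ANGLES = [3, 6, 9, 12, 15]   # degrees, applied in both directions

def rotate(img, angle_deg):
    """Rotate an image about its centre by the given angle."""
    h, w = img.shape[:2]
    m = cv2.getRotationMatrix2D((w / 2, h / 2), angle_deg, 1.0)
    return cv2.warpAffine(img, m, (w, h))

def augment(binary_img):
    """Return the original image plus its ten rotated variants."""
    out = [binary_img]
    for a in ANGLES:
        out.append(rotate(binary_img, a))    # rotation to the left (counter-clockwise)
        out.append(rotate(binary_img, -a))   # rotation to the right (clockwise)
    return out                               # 1 original + 10 rotated copies
```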

Sign Language Recognition using Machine Learning

IRJET, 2022

Communication must happen both ways. Communicating with people who have a hearing disability is a difficult task, so we developed a real-time sign language recognition system using machine learning. The system captures the hand gestures performed by the user through a web camera. The region of interest is identified in each image, and the system predicts and shows the name of the captured sign. The captured images undergo several processing steps, including data acquisition, image processing, feature extraction and classification. We use a Convolutional Neural Network (CNN) to train and build our model. Hence it is very useful for people with hearing and speech disabilities.
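A real-time loop of the kind described above could be sketched as follows: grab a frame from the web camera, crop a fixed region of interest, preprocess it and classify it with a previously trained CNN. The model path, ROI box, input size and label list are assumptions for this sketch, not details from the paper.

```python
import cv2
import numpy as np
from tensorflow.keras.models import load_model

model = load_model("sign_cnn.h5")                      # hypothetical trained model
LABELS = [chr(ord("A") + i) for i in range(26)]        # assumed alphabet classes

cap = cv2.VideoCapture(0)                              # default web camera
while True:
    ok, frame = cap.read()
    if not ok:
        break
    roi = frame[100:300, 100:300]                      # fixed box where the hand is shown
    gray = cv2.cvtColor(roi, cv2.COLOR_BGR2GRAY)
    x = cv2.resize(gray, (64, 64)).astype("float32") / 255.0
    pred = model.predict(x.reshape(1, 64, 64, 1), verbose=0)
    cv2.putText(frame, LABELS[int(np.argmax(pred))], (100, 90),
                cv2.FONT_HERSHEY_SIMPLEX, 1.0, (0, 255, 0), 2)
    cv2.imshow("Sign Language Recognition", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break
cap.release()
cv2.destroyAllWindows()
```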

Sign Language Recognition

The biggest challenge with sign language is that it is not universal; it varies from country to country. Sign language is a way of communication adopted by hearing- and speech-impaired people using hand gestures. It is a challenge for other people to communicate with them and vice versa. To make communication easier, there is a need for a bridge connecting physically challenged people and others. This project focuses on identifying the characters and numbers of Indian Sign Language using a Convolutional Neural Network (CNN) with the Keras and TensorFlow libraries. India does not have a standard sign language but adopts ASL (American Sign Language), which is single-handed, whereas ISL uses two hands for communication. These reasons motivated us to develop software that recognizes hand gestures using ISL.
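The abstract does not give the network architecture, so the Keras sketch below is only a typical small CNN of the kind these projects use; the layer sizes, the 64x64 grayscale input and the 36 output classes (26 characters + 10 digits) are assumptions.

```python
from tensorflow.keras import layers, models

def build_isl_cnn(num_classes: int = 36):
    """Small CNN for grayscale 64x64 gesture images (assumed input format)."""
    model = models.Sequential([
        layers.Input(shape=(64, 64, 1)),
        layers.Conv2D(32, (3, 3), activation="relu"),
        layers.MaxPooling2D((2, 2)),
        layers.Conv2D(64, (3, 3), activation="relu"),
        layers.MaxPooling2D((2, 2)),
        layers.Flatten(),
        layers.Dense(128, activation="relu"),
        layers.Dropout(0.4),
        layers.Dense(num_classes, activation="softmax"),
    ])
    model.compile(optimizer="adam",
                  loss="categorical_crossentropy",
                  metrics=["accuracy"])
    return model

model = build_isl_cnn()
model.summary()
```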

Bangla numerical sign language recognition using convolutional neural networks (CNNs)

Indonesian Journal of Electrical Engineering and Computer Science

The number of deaf and mute individuals worldwide is rising at an alarming rate. Bangladesh has about 2.6 million people who are unable to interact with the community using spoken language. Hearing-impaired citizens in Bangladesh use Bangladeshi Sign Language (BSL) as a means of communication. In this article, we propose a new method for Bengali sign language recognition based on deep convolutional neural networks. Our framework employs convolutional neural networks (CNN) to learn from the images in our dataset and interpret hand signs from input images. The dataset consists of ten sets of images with 31 distinct signs, for a total of 310 images. The proposed system takes snapshots from a video using a webcam, applying a computer vision-based approach. It then compares those photos against a previously trained CNN model and displays the corresponding Bengali numbers (০-৯). After evaluating the model on our dataset, we obtained an overall accuracy ...
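A minimal sketch of the display step described above: mapping the CNN's predicted class index (0-9) to the corresponding Bengali digit glyph. The prediction vector here is a placeholder standing in for the model output.

```python
import numpy as np

BENGALI_DIGITS = ["০", "১", "২", "৩", "৪", "৫", "৬", "৭", "৮", "৯"]

def to_bengali_digit(prediction: np.ndarray) -> str:
    """Convert a softmax output over 10 classes to a Bengali digit string."""
    return BENGALI_DIGITS[int(np.argmax(prediction))]

# Example with a dummy one-hot output for class 7:
dummy = np.eye(10)[7]
print(to_bengali_digit(dummy))   # prints ৭
```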

Sign Language Detection using Convolutional Neural Network

International Journal of Advanced Research in Science, Communication and Technology

Hand gestures are one of the methods used in sign language for non-verbal communication. They are most commonly used by deaf and dumb people who have hearing or speech problems to communicate among themselves or with other people. Various sign language systems have been developed by many makers around the world, but they are neither flexible nor cost-effective for the end users. Hence, this paper introduces software presenting a system prototype that is able to automatically recognize sign language, helping deaf and dumb people to communicate more effectively with each other or with hearing people. Pattern recognition and gesture recognition are developing fields of research. Being a significant part of non-verbal communication, hand gestures play a key role in our daily life.