SynapSign: An Advanced Machine Learning Framework for American Sign Language Recognition Utilizing a Novel Landmark-Based Dataset


Date

2024

Journal Title

Journal ISSN

Volume Title

Publisher

Institute of Electrical and Electronics Engineers Inc.

Access Rights

info:eu-repo/semantics/closedAccess

Abstract

Hearing loss is a common condition affecting a significant proportion of the world's population, creating barriers to effective communication. Sign language, particularly American Sign Language (ASL), is an important tool for the social integration and personal growth of people with hearing loss, and the need for effective tools to facilitate the learning and practice of ASL is increasingly recognized. Although many studies have proposed software that recognizes signs using machine learning techniques with relatively high accuracy, there remains a need for higher performance. This paper presents SynapSign, a desktop application designed to enhance ASL learning using machine learning algorithms. To build the application on the model with the highest accuracy, the performances of Random Forest, XGBoost and Deep Neural Network (DNN) classifiers were investigated. For this purpose, an image dataset of 2600 images was prepared: for each of the 26 letters of the ASL alphabet, 100 hand images annotated with 21 hand landmarks were collected using Google's MediaPipe technology for accurate hand gesture recognition. The three classifiers were then trained on this extensive dataset of ASL hand images, and the resulting models were tested and compared on accuracy, precision and recall. The results reveal that the Random Forest model performs slightly better than the other models on all three metrics, at 99.6%, 99.3% and 99.7%, respectively. SynapSign was therefore developed using this model, with a user interface that lets the user capture sign images from a live camera stream, which are labelled with 21 hand landmarks by the MediaPipe framework's default hand detection model.
Compared to traditional methods, the application provides a more interactive and engaging learning experience, allowing users to practice and improve their ASL skills with real-time feedback. Our findings suggest that SynapSign could serve as a valuable tool for both educational and accessibility purposes, addressing the gap in resources available to ASL learners. © 2024 IEEE.
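The pipeline described in the abstract (MediaPipe's 21 hand landmarks fed to a Random Forest classifier) can be illustrated with a minimal sketch of the feature-extraction step. The function below is an illustrative assumption, not the paper's exact preprocessing: it shifts the landmark coordinates so the wrist (MediaPipe landmark 0) is the origin and rescales them, yielding a 42-dimensional feature vector suitable for a classifier such as scikit-learn's RandomForestClassifier.

```python
def landmarks_to_features(landmarks):
    """Convert 21 (x, y) hand-landmark pairs into a 42-dim feature vector.

    Coordinates are shifted so the wrist (landmark 0) is the origin and
    scaled by the maximum absolute offset, making the features roughly
    invariant to the hand's position and size in the frame. This
    normalization is a hypothetical choice for illustration only.
    """
    if len(landmarks) != 21:
        raise ValueError("expected 21 hand landmarks")
    wx, wy = landmarks[0]
    # Offsets relative to the wrist landmark.
    rel = [(x - wx, y - wy) for x, y in landmarks]
    # Scale factor: largest absolute offset (guard against all-zero input).
    scale = max(max(abs(dx), abs(dy)) for dx, dy in rel) or 1.0
    feats = []
    for dx, dy in rel:
        feats.extend((dx / scale, dy / scale))
    return feats
```

In a real deployment, each frame from the camera would pass through MediaPipe's hand detector first, and the resulting vector would be handed to the trained model's `predict` method to obtain the recognized letter.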

Description

2024 Innovations in Intelligent Systems and Applications Conference, ASYU 2024 -- 2024-10-16 through 2024-10-18 -- Ankara -- 204562

Keywords

Deep Neural Network (DNN), Hand Gesture Recognition, Hearing Impairment, Interaction System, Machine Learning, Random Forest, Sign Language, XGBoost

Source

WoS Q Value

Scopus Q Value

N/A

Volume

Issue

Citation