Publication:
TURKISH SIGN LANGUAGE EXPRESSIONS RECOGNITION USING DEEP LEARNING AND LANDMARK DATA

Abstract

Sign language is a vital communication tool that enables hearing-impaired individuals to express their thoughts and emotions. Turkish Sign Language (TSL) is based on hand gestures, facial expressions, and body movements. In this study, deep learning models were developed to recognize 41 commonly used TSL expressions. An original dataset was created using the MediaPipe Holistic framework to capture 3D landmarks of hand, face, and body movements. The study trained and evaluated GRU, LSTM, and Bi-LSTM models, as well as hybrid architectures such as CNN+GRU, GRU+LSTM, and GRU+Bi-LSTM. Models were trained with a hold-out validation scheme: 80% of the dataset was allocated for training and 20% for testing, and a further 20% of the training data was reserved for validation. Among the deep learning models, the CNN+GRU hybrid achieved the highest accuracy of 96.72%, outperforming similar studies in the literature. Our results demonstrate that deep learning techniques can effectively classify TSL expressions, with the CNN+GRU combination showing particularly strong performance. Future work will focus on expanding the dataset and developing real-time recognition systems that incorporate both skeleton images and landmarks.
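To make the described pipeline concrete, the following is a minimal sketch of a CNN+GRU hybrid classifier over MediaPipe Holistic landmark sequences, written in TensorFlow/Keras. Only the number of classes (41) and the 80/20 hold-out split with a 20% validation portion come from the abstract; the sequence length, per-frame feature count, layer widths, and training hyperparameters are illustrative assumptions, not details taken from the publication.

```python
# Hedged sketch of a CNN+GRU hybrid for landmark-based TSL recognition.
# Assumptions (not from the paper): 30 frames per clip, 1662 flattened
# MediaPipe Holistic values per frame (pose 33x4 + face 468x3 + 2 hands 21x3),
# and all layer sizes / hyperparameters below.
import numpy as np
import tensorflow as tf

NUM_CLASSES = 41      # 41 TSL expressions (from the abstract)
SEQ_LEN = 30          # assumed number of frames per clip
NUM_FEATURES = 1662   # assumed flattened landmark values per frame

def build_cnn_gru(seq_len=SEQ_LEN, num_features=NUM_FEATURES, num_classes=NUM_CLASSES):
    """1D CNN front-end for local temporal patterns, GRU for sequence modeling."""
    return tf.keras.Sequential([
        tf.keras.layers.Input(shape=(seq_len, num_features)),
        tf.keras.layers.Conv1D(64, kernel_size=3, padding="same", activation="relu"),
        tf.keras.layers.MaxPooling1D(pool_size=2),
        tf.keras.layers.GRU(128),
        tf.keras.layers.Dropout(0.3),
        tf.keras.layers.Dense(64, activation="relu"),
        tf.keras.layers.Dense(num_classes, activation="softmax"),
    ])

model = build_cnn_gru()
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

# Hold-out protocol from the abstract: 80% train / 20% test,
# with 20% of the training portion held out for validation.
X = np.random.rand(100, SEQ_LEN, NUM_FEATURES).astype("float32")  # placeholder data
y = np.random.randint(0, NUM_CLASSES, size=100)                   # placeholder labels
split = int(0.8 * len(X))
X_train, X_test, y_train, y_test = X[:split], X[split:], y[:split], y[split:]
model.fit(X_train, y_train, validation_split=0.2, epochs=5, batch_size=16)
print(model.evaluate(X_test, y_test))
```

In this layout the Conv1D/MaxPooling1D stage extracts and downsamples local temporal patterns in the landmark stream before the GRU models the remaining sequence, which is one common way to realize a CNN+GRU hybrid; the paper's exact architecture may differ.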

Publisher

Mugla Sitki Kocman University
