
Herald of the Kazakh-British Technical University


DEVELOPMENT OF A KAZAKH SIGN LANGUAGE RECOGNITION MODEL BASED ON YOLO-NAS

https://doi.org/10.55452/1998-6688-2025-22-1-10-24

Abstract

Developing a reliable Kazakh sign language recognition model is an important step toward inclusive communication and assistance for people with hearing impairments. This paper describes in detail the process of collecting and annotating a dataset of gesture images. Special attention is paid to data preparation and preprocessing to ensure compatibility with the model. Model training involves hyperparameter optimization and various techniques for improving recognition accuracy. We also conducted a comprehensive evaluation of the model on test data to verify its effectiveness in real-world conditions. Beyond the main development phase, we evaluate the YOLO-NAS model on the same dataset to explore potential improvements in accuracy and performance. The results of this research can be used to further develop technologies that facilitate the integration of people with hearing impairments into society, as well as to create educational and communication platforms based on Kazakh sign language.
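As a rough illustration of the pipeline summarized above (annotated gesture images, preprocessing, hyperparameter tuning, and evaluation on held-out data), the sketch below fine-tunes a YOLO-NAS detector with the open-source super-gradients library, which provides the reference YOLO-NAS implementation. It is a minimal sketch only: the dataset layout, class placeholders, and hyperparameter values are assumptions for illustration, not the authors' actual configuration.

```python
# Minimal YOLO-NAS fine-tuning sketch using the open-source super-gradients library.
# Dataset paths, class names, and hyperparameters are illustrative placeholders,
# not the configuration used in the paper.
from super_gradients.training import Trainer, models
from super_gradients.training.dataloaders.dataloaders import (
    coco_detection_yolo_format_train, coco_detection_yolo_format_val)
from super_gradients.training.losses import PPYoloELoss
from super_gradients.training.metrics import DetectionMetrics_050
from super_gradients.training.models.detection_models.pp_yolo_e import (
    PPYoloEPostPredictionCallback)

CLASSES = ["gesture_0", "gesture_1"]  # placeholder labels for the annotated gestures

# Dataloaders expect YOLO-format annotations (one .txt label file per image).
base_params = {"data_dir": "data/ksl", "classes": CLASSES}
train_loader = coco_detection_yolo_format_train(
    dataset_params={**base_params, "images_dir": "train/images", "labels_dir": "train/labels"},
    dataloader_params={"batch_size": 16, "num_workers": 2})
val_loader = coco_detection_yolo_format_val(
    dataset_params={**base_params, "images_dir": "valid/images", "labels_dir": "valid/labels"},
    dataloader_params={"batch_size": 16, "num_workers": 2})

# Start from COCO-pretrained weights and adapt the detection head to the gesture classes.
model = models.get("yolo_nas_s", num_classes=len(CLASSES), pretrained_weights="coco")

trainer = Trainer(experiment_name="yolo_nas_ksl", ckpt_root_dir="checkpoints")
trainer.train(
    model=model,
    training_params={
        "max_epochs": 50,
        "initial_lr": 5e-4,
        "lr_mode": "cosine",
        "lr_warmup_epochs": 3,
        "optimizer": "AdamW",
        "optimizer_params": {"weight_decay": 1e-4},
        "mixed_precision": True,
        "loss": PPYoloELoss(use_static_assigner=False, num_classes=len(CLASSES), reg_max=16),
        "valid_metrics_list": [DetectionMetrics_050(
            score_thres=0.1, top_k_predictions=300, num_cls=len(CLASSES),
            normalize_targets=True,
            post_prediction_callback=PPYoloEPostPredictionCallback(
                score_threshold=0.01, nms_top_k=1000, max_predictions=300, nms_threshold=0.7))],
        "metric_to_watch": "mAP@0.50",  # select the checkpoint by validation mAP@0.50
    },
    train_loader=train_loader,
    valid_loader=val_loader)
```

After training, the best checkpoint can be reloaded with models.get() and its checkpoint_path argument, and the model's predict() method runs inference on individual frames, which is how a real-time recognition demo would typically consume such a detector.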

About the Authors

M. Othman
Professor, Universiti Putra Malaysia, Kuala Lumpur, Malaysia



D. Oralbekova
Senior Lecturer, Satbayev University, Almaty, Kazakhstan



U. G. Berzhanova
Doctoral Student, Satbayev University, Almaty, Kazakhstan





For citations:


Othman M., Oralbekova D., Berzhanova U.G. DEVELOPMENT OF A KAZAKH SIGN LANGUAGE RECOGNITION MODEL BASED ON YOLO-NAS. Herald of the Kazakh-British Technical University. 2025;22(1):10-24. https://doi.org/10.55452/1998-6688-2025-22-1-10-24



This work is licensed under a Creative Commons Attribution 4.0 License.


ISSN 1998-6688 (Print)
ISSN 2959-8109 (Online)