A Novel Image-Based Arabic Hand Gestures Recognition Approach Using YOLOv7 and ArSL21L

Document Type: Original Article

Authors

1 Electronics and Communication Engineering Department, Faculty of Engineering, Fayoum University, Fayoum, Egypt

2 Electrical Engineering Department, Future High Institute for Engineering, Fayoum, Egypt

Abstract

Recognizing and documenting Arabic sign language has recently received considerable attention because it improves communication between deaf and hearing people. The primary goal of sign language recognition (SLR) is the development of automatic systems that enable such communication. Until recently, however, Arabic SLR (ArSLR) received little attention, and building an automatic Arabic hand gesture recognition system remains a challenging task. This work presents a novel image-based ArSL recognition approach in which You Only Look Once v7 (YOLOv7) is used to build an accurate ArSL alphabet detector and classifier on ArSL21L, the Arabic Sign Language Letter dataset. The proposed YOLOv7 medium model achieved the highest mAP@0.5 and mAP@0.5:0.95 scores of 0.9909 and 0.8306, respectively, outperforming both YOLOv5m and YOLOv5l on these metrics. Furthermore, the YOLOv7-tiny model surpassed both YOLOv5s and YOLOv5m in terms of mAP@0.5 and mAP@0.5:0.95, while YOLOv5s recorded the lowest mAP@0.5 and mAP@0.5:0.95 scores of 0.9408 and 0.7661, respectively.
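
For readers who want to reproduce a comparable setup, the sketch below shows one plausible way to fine-tune a YOLOv7 detector on ArSL21L. It is not the authors' released code: the dataset paths, split names, class count, and class names are placeholders, and the training command follows the publicly documented usage of the YOLOv7 repository (https://github.com/WongKinYiu/yolov7), whose flags may vary between releases.

```python
# Minimal sketch (assumed setup, not the authors' code): build a YOLO-style data
# config for ArSL21L and launch YOLOv7 training from the public repository.
import subprocess
from pathlib import Path

# Hypothetical local layout of ArSL21L after download and extraction.
DATASET_ROOT = Path("datasets/ArSL21L")

# The class count and names below are placeholders; take the real list from the
# dataset's own annotation files.
NUM_CLASSES = 32  # assumption -- verify against the ArSL21L release
CLASS_NAMES = [f"letter_{i}" for i in range(NUM_CLASSES)]  # placeholder names

# Write a standard YOLO data config (train/val image folders, class count, names).
data_yaml = DATASET_ROOT / "arsl21l.yaml"
data_yaml.write_text(
    "train: {root}/images/train\n"
    "val: {root}/images/val\n"
    "nc: {nc}\n"
    "names: {names}\n".format(
        root=DATASET_ROOT.as_posix(), nc=NUM_CLASSES, names=CLASS_NAMES
    )
)

# Training command modeled on the YOLOv7 README; run from inside the cloned
# yolov7 repository. Use cfg/training/yolov7-tiny.yaml for the tiny variant.
subprocess.run(
    [
        "python", "train.py",
        "--img", "640", "640",
        "--batch-size", "16",
        "--epochs", "100",
        "--data", str(data_yaml),
        "--cfg", "cfg/training/yolov7.yaml",  # medium-sized YOLOv7 variant
        "--weights", "yolov7.pt",             # pretrained checkpoint (assumed available)
        "--name", "yolov7_arsl21l",
    ],
    check=True,
)
```

Evaluating the resulting checkpoint on the held-out split would then yield the mAP@0.5 and mAP@0.5:0.95 scores of the kind reported in the abstract.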

Keywords