

MACHINE LEARNING APPLICATION FOR PIDGIN SIGN LANGUAGE DETECTION

Biralatei Fawei, Ide Mercy Azibaye, Patrick Kenekayoro, Bunikiye Richard Japheth



Abstract

Sign language is a form of non-verbal communication used by people with speech or hearing difficulties. There is always a need to translate between sign language and text so that the hearing impaired and those without hearing difficulties can communicate. Nigerian schools adopt American Sign Language (ASL), but locally Pidgin Sign Language (PSL) is more predominant. Although studies have used machine learning methods to translate sign language to text, there is a dearth of research focused on the local (Nigerian) Pidgin Sign Language. Thus, it is unknown how well state-of-the-art methods for sign language detection and translation will perform for Nigerian PSL. This research aims to bridge this gap by investigating how well state-of-the-art techniques tailored for ASL perform for Nigerian PSL and by adapting these techniques for better performance. An experimental and engineering research design was adopted. A multilayer perceptron (MLP) methodology was implemented, underpinned by lightweight neural classifiers and temporal smoothing techniques, ensuring the system's performance is both robust and suitable for real-time operation on standard, non-specialized hardware. A webcam was used with Python and OpenCV to capture 1,000 video frames per gesture for 8 different PSL gestures, giving a total of 8,000 frames for training and testing. The Static Classifier's superior accuracy of 94.15% confirms that the method of extracting and normalizing static keypoint data is highly effective for recognizing static PSL signs, while the significant underperformance of the Temporal Classifier, at 24.39% accuracy, points to a critical issue in the dynamic gesture recognition pipeline. Nonetheless, the model recognizes the various hand motions and outputs the appropriate word on the screen.
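The abstract describes a keypoint-based pipeline: webcam frames captured with Python and OpenCV, static keypoint data extracted and normalized, a lightweight MLP classifying the sign, and temporal smoothing stabilizing the word shown on screen. The sketch below is a minimal illustration of how such a loop could be wired together; it is not the authors' implementation. The choice of MediaPipe Hands for keypoint extraction, scikit-learn's MLPClassifier as the lightweight classifier, the 15-frame majority-vote smoothing window, and the placeholder training data are all assumptions not stated in the abstract.

```python
# Illustrative sketch only. Assumed components (not stated in the paper):
# MediaPipe Hands for keypoints, scikit-learn MLPClassifier, majority-vote
# smoothing, and random placeholder data standing in for the 8,000 frames.
from collections import deque, Counter

import cv2
import mediapipe as mp
import numpy as np
from sklearn.neural_network import MLPClassifier


def normalize_keypoints(landmarks):
    """Translate keypoints to the wrist and scale to unit size."""
    pts = np.array([[lm.x, lm.y] for lm in landmarks], dtype=np.float32)
    pts = pts - pts[0]                  # wrist-relative coordinates
    scale = np.max(np.abs(pts)) or 1.0  # avoid division by zero
    return (pts / scale).flatten()      # 21 landmarks x 2 = 42 features


# Lightweight MLP over flattened keypoints. The random data below is a
# placeholder for the real labelled PSL frames (8 gesture classes).
clf = MLPClassifier(hidden_layer_sizes=(64, 32), max_iter=500)
X_demo = np.random.rand(80, 42).astype(np.float32)
y_demo = np.random.randint(0, 8, size=80)
clf.fit(X_demo, y_demo)

hands = mp.solutions.hands.Hands(max_num_hands=1)
recent = deque(maxlen=15)  # temporal smoothing window (assumed size)

cap = cv2.VideoCapture(0)
while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    result = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
    if result.multi_hand_landmarks:
        feats = normalize_keypoints(result.multi_hand_landmarks[0].landmark)
        recent.append(clf.predict([feats])[0])
        # Majority vote over recent frames suppresses per-frame flicker.
        label = Counter(recent).most_common(1)[0][0]
        cv2.putText(frame, str(label), (10, 30),
                    cv2.FONT_HERSHEY_SIMPLEX, 1, (0, 255, 0), 2)
    cv2.imshow("PSL recognition", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break
cap.release()
cv2.destroyAllWindows()
```

In a real setup the placeholder training step would be replaced by fitting the classifier on the captured, labelled gesture frames, and the predicted class index would be mapped to the corresponding PSL word before display.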

Copyright

Copyright © 2025 Biralatei Fawei, Ide Mercy Azibaye, Patrick Kenekayoro, Bunikiye Richard Japheth. This is an open access article distributed under the Creative Commons Attribution License.

Paper Details
Paper ID: IJPREMS51000002110
ISSN: 2321-9653
Publisher: IJPREMS