
Recent Papers

Dedicated to advancing knowledge through rigorous research and scholarly publication


Exploring the Potential of Transformers in Natural Language Processing - A Study on Text Classification

Dr. Bhagyashree Ashok Tingare

DOI: 10.58257/IJPREMS35724
Download Paper

Paper Contents

Abstract

Natural Language Processing (NLP) has seen significant advances in recent years, driven by the emergence of deep learning techniques. Transformers, introduced in 2017, have revolutionized NLP, demonstrating exceptional performance across a variety of tasks. This study explores the potential of Transformers for text classification, a fundamental NLP task. We conduct a comprehensive evaluation of three pre-trained Transformer models (BERT, RoBERTa, and XLNet) on three benchmark datasets (20 Newsgroups, IMDB, and the Stanford Sentiment Treebank). Our results show that Transformers achieve state-of-the-art performance in text classification, outperforming traditional machine learning approaches. We also analyze the strengths and limitations of each model, highlighting their ability to capture long-range dependencies and contextual relationships in text. Our findings suggest that Transformers are robust and effective models for text classification, with applications in sentiment analysis, spam detection, and information retrieval, and we discuss their potential in other NLP tasks such as question answering, machine translation, and text generation. Overall, the study provides an overview of the capabilities of Transformers in text classification and offers insights for researchers and practitioners applying them across a wide range of NLP applications.
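The evaluation the abstract describes (pre-trained Transformer encoders such as BERT, RoBERTa, and XLNet applied to text classification) is commonly set up with the Hugging Face transformers library. The paper does not state which toolkit was used, so the following is only a minimal sketch under that assumption: the model name, label count, and example sentences are illustrative, and the classification head is randomly initialized until fine-tuned on a dataset such as IMDB or the Stanford Sentiment Treebank.

```python
# Minimal sketch (assumed setup, not the authors' code): classify short texts
# with a pre-trained Transformer encoder via the Hugging Face transformers API.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

MODEL_NAME = "bert-base-uncased"  # "roberta-base" or "xlnet-base-cased" can be swapped in

# Two-class setup, e.g. sentiment polarity as in IMDB / SST.
tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
model = AutoModelForSequenceClassification.from_pretrained(MODEL_NAME, num_labels=2)

texts = [
    "The plot was predictable, but the performances were outstanding.",
    "A tedious, poorly edited film with no redeeming qualities.",
]

# Pad and truncate so the batch forms a rectangular tensor of token IDs.
inputs = tokenizer(texts, padding=True, truncation=True, max_length=128,
                   return_tensors="pt")

model.eval()
with torch.no_grad():
    logits = model(**inputs).logits      # shape: (batch_size, num_labels)
    predictions = logits.argmax(dim=-1)  # predicted class index per text

# NOTE: the classification head here is untrained; predictions are only
# meaningful after fine-tuning on a labeled text-classification dataset.
print(predictions.tolist())
```

Swapping MODEL_NAME for the RoBERTa or XLNet checkpoints exercises the other two architectures the study compares, with the tokenizer and model loaded through the same Auto classes.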

Copyright

Copyright © 2024 Dr. Bhagyashree Ashok Tingare. This is an open access article distributed under the Creative Commons Attribution License.

Paper Details
Paper ID: IJPREMS40800009837
ISSN: 2321-9653
Publisher: IJPREMS
About IJPREMS

The International Journal of Progressive Research in Engineering, Management and Science is a peer-reviewed, open access journal that publishes original research articles in engineering, management, and applied sciences.

Contact Us
  • IJPREMS - International Journal of Progressive Research in Engineering, Management and Science, Motinagar, Ujjain, Madhya Pradesh, India
  • Chat with us on WhatsApp: +91 909-885-5509
  • Email us: editor@ijprems.com
  • Mon-Fri: 9:00 AM - 5:00 PM

© 2025 International Journal of Progressive Research in Engineering, Management and Science. Designed and developed by EVG Software Solutions. All rights reserved.
