Abstract
The rapid expansion of structured data stored in relational databases has created an urgent need for efficient retrieval systems that are accessible to both technical and non-technical users. Although Structured Query Language (SQL) remains the industry standard for interacting with relational databases, it requires detailed knowledge of schema design, query syntax, and optimization techniques. This creates a gap for business analysts, decision-makers, and general users who lack SQL expertise. To address this challenge, Natural Language to SQL (NL2SQL) systems have emerged, enabling users to issue queries in plain language that are automatically translated into SQL.

This study presents the design, implementation, and evaluation of an NL2SQL framework built on transformer-based sequence-to-sequence (Seq2Seq) models. Unlike earlier recurrent approaches such as RNNs and LSTMs, transformer models employ self-attention mechanisms that capture long-range dependencies and support parallel processing, capabilities that are essential for handling complex, context-rich natural language queries. The proposed system is trained and validated on benchmark datasets such as WikiSQL and Spider, covering both single-table and multi-table scenarios. Experimental findings show notable improvements in exact-match and execution accuracy compared to baseline models including LSTM and GRU.

By offering a robust and scalable NL2SQL solution, this research lowers the barrier to database access for non-technical users and paves the way for integration into applications such as voice assistants, business intelligence platforms, and conversational AI systems.
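As a minimal illustration of the self-attention mechanism the abstract refers to, the sketch below implements generic scaled dot-product self-attention in NumPy. This is a standard textbook formulation, not the paper's actual model; the toy token embeddings are invented for demonstration.

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Standard scaled dot-product attention: softmax(Q K^T / sqrt(d_k)) V.

    Returns the attended outputs and the attention weight matrix.
    """
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                    # pairwise token similarities
    # Numerically stable softmax over the key dimension
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ V, weights

# Toy example: 3 "tokens" with 4-dimensional embeddings.
# In self-attention, queries, keys, and values all derive from the same sequence.
rng = np.random.default_rng(0)
X = rng.standard_normal((3, 4))
out, w = scaled_dot_product_attention(X, X, X)
print(out.shape)                          # (3, 4): one context vector per token
print(np.allclose(w.sum(axis=-1), 1.0))   # True: each row of weights is a distribution
```

Because every token attends to every other token in a single matrix product, dependencies between distant tokens are captured without the sequential recurrence of RNNs or LSTMs, which is the parallelism advantage the abstract describes.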
Copyright
Copyright © 2025 Sourav L. This is an open access article distributed under the Creative Commons Attribution License.