World Journal of Advanced Research and Reviews
eISSN: 2581-9615 || CODEN: WJARAI || Impact Factor 8.2


Bangla text generation system by incorporating attention in sequence-to-sequence model


Nayan Banik 1, *, Chayti Saha 1, Ikbal Ahmed 2 and Kulsum Akter Shapna 3

1 Department of Computer Science and Engineering, Comilla University, Cumilla, Bangladesh.
2 Department of Computer Science and Engineering, CCN University of Science & Technology, Cumilla, Bangladesh.
3 Department of Statistics, Comilla University, Cumilla, Bangladesh.
 
Research Article
World Journal of Advanced Research and Reviews, 2022, 14(01), 080-094
Article DOI: 10.30574/wjarr.2022.14.1.0292
DOI url: https://doi.org/10.30574/wjarr.2022.14.1.0292
 
Received on 23 January 2022; revised on 30 March 2022; accepted on 01 April 2022
 
In this AI-driven digital era, digital data is pervasive thanks to widespread and inexpensive access to the Internet. The Internet continuously grows with data in many forms, among which textual data is a rich source of information where people express themselves in writing. Social media, blogs, online newspapers, and government documents are notable textual data sources. Extracting information from this enormous amount of data by manual inspection is time-consuming, cumbersome, and sometimes impossible. Natural Language Processing (NLP) is the computational domain that addresses these limitations by solving human-language problems. Text summarization, named entity recognition, and question answering are examples in which a common task for a machine is to generate coherent text. In such scenarios, both the input and the output are sequences of text, but they differ in length. Sequence-to-Sequence (Seq2Seq) is an algorithmic approach that addresses this scenario using layers of recurrent units. However, the simple Seq2Seq model fails to capture long-range dependencies in the input sequence. Research shows that the attention mechanism guides the model to concentrate on specific inputs. The existing literature shows a lack of quality research on this text generation problem in the Bangla language, whereas many other languages show excellent results. This work develops such a system by incorporating attention into the Seq2Seq model and justifies its applicability by comparing it with baseline models. The model perplexity shows that the system can generate human-level readable text from a preprocessed dataset.
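The two ideas the abstract leans on — an attention step that weights encoder states for each decoder step, and perplexity as the evaluation metric — can be sketched in a few lines of NumPy. This is a simplified dot-product attention for illustration only, not the paper's exact architecture; all names and the toy dimensions are assumptions.

```python
import numpy as np

def softmax(x):
    """Numerically stable softmax."""
    e = np.exp(x - np.max(x))
    return e / e.sum()

def attention_context(decoder_state, encoder_states):
    """Dot-product attention: score each encoder state against the
    current decoder state, normalize the scores into weights, and
    return the weighted sum (context vector) plus the weights."""
    scores = encoder_states @ decoder_state   # (T,) alignment scores
    weights = softmax(scores)                 # attention distribution over inputs
    context = weights @ encoder_states        # (H,) context vector
    return context, weights

def perplexity(target_token_probs):
    """Perplexity = exp of the average negative log-likelihood the
    model assigns to the reference tokens; lower is better."""
    return float(np.exp(-np.mean(np.log(target_token_probs))))

# Toy run: 4 encoder time steps, hidden size 3
rng = np.random.default_rng(0)
enc = rng.normal(size=(4, 3))
dec = rng.normal(size=(3,))
ctx, w = attention_context(dec, enc)
print(w.sum())                                # weights sum to 1 (up to float error)
print(perplexity([0.25, 0.25, 0.25, 0.25]))  # uniform over 4 tokens -> 4.0
```

A model that spreads probability uniformly over a 4-token vocabulary scores a perplexity of 4; a model that concentrates mass on the correct next token approaches 1, which is what "human-level readable" generation aims for.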
 
Keywords: Bangla Text Generation; Sequence-To-Sequence; Natural Language Processing; Text Mining
 
https://wjarr.com/sites/default/files/fulltext_pdf/WJARR-2022-0292.pdf


Nayan Banik, Chayti Saha, Ikbal Ahmed and Kulsum Akter Shapna. Bangla text generation system by incorporating attention in sequence-to-sequence model. World Journal of Advanced Research and Reviews, 2022, 14(1), 080-094. Article DOI: https://doi.org/10.30574/wjarr.2022.14.1.0292

Copyright © Author(s). All rights reserved. This article is published under the terms of the Creative Commons Attribution 4.0 International License (CC BY 4.0), which permits use, sharing, adaptation, distribution, and reproduction in any medium or format, as long as appropriate credit is given to the original author(s) and source, a link to the license is provided, and any changes made are indicated.


All statements, opinions, and data contained in this publication are solely those of the individual author(s) and contributor(s). The journal, editors, reviewers, and publisher disclaim any responsibility or liability for the content, including accuracy, completeness, or any consequences arising from its use.

