Abstractive Text Summarization using LSTM, GLOVE, and TensorFlow
Main Article Content
Abstract
Automatic Text Summarization (ATS) is a fundamental task in natural language processing (NLP), aimed at generating concise and coherent summaries of longer texts. In this study, we investigate the use of deep-learning techniques for abstractive text summarization, with a focus on Long Short-Term Memory (LSTM) neural networks, Global Vectors for Word Representation (GloVe) embeddings, and the TensorFlow framework. Our approach leverages the power of LSTM networks to capture sequential dependencies in the input text, enabling the generation of abstractions that extend beyond simple sentence extraction. GloVe embeddings are used to represent words in a continuous vector space, improving the model's grasp of semantic relationships between words. TensorFlow provides the computational framework for efficient model training and deployment. We perform experiments on diverse datasets to evaluate the performance of our abstractive text summarization model. Our findings demonstrate that integrating LSTM, GloVe embeddings, and TensorFlow significantly enhances the quality and fluency of generated summaries compared with traditional extractive summarization approaches. This study contributes to the development of abstractive text summarization techniques, offering a promising approach for automatically distilling important details from textual data. Furthermore, the use of open-source tools such as TensorFlow makes our model accessible and adaptable to a wide range of natural language processing applications.
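The architecture the abstract describes, an LSTM encoder-decoder with pretrained GloVe embeddings built in TensorFlow, can be sketched as follows. This is a minimal illustration, not the authors' implementation: the vocabulary size, sequence lengths, and latent dimension below are hypothetical, and the GloVe matrix is stubbed with random values where a real pipeline would load rows from a file such as glove.6B.100d.txt.

```python
import numpy as np
import tensorflow as tf
from tensorflow.keras import layers, Model

# Hypothetical hyperparameters -- the paper does not specify these.
vocab_size = 5000
embed_dim = 100            # matches the 100-dimensional GloVe vectors
latent_dim = 128
max_src_len, max_tgt_len = 60, 12

# Placeholder for a GloVe embedding matrix; in practice each row would be
# loaded from a pretrained GloVe file for the corresponding vocabulary word.
glove_matrix = np.random.normal(size=(vocab_size, embed_dim)).astype("float32")

# Encoder: embeds the source article with frozen GloVe vectors and runs it
# through an LSTM, keeping the final hidden/cell states as the context.
enc_in = layers.Input(shape=(max_src_len,), name="article_tokens")
enc_emb = layers.Embedding(vocab_size, embed_dim,
                           weights=[glove_matrix], trainable=False)(enc_in)
_, state_h, state_c = layers.LSTM(latent_dim, return_state=True)(enc_emb)

# Decoder: generates summary tokens, initialized with the encoder states,
# so the summary is conditioned on the whole input sequence.
dec_in = layers.Input(shape=(max_tgt_len,), name="summary_tokens")
dec_emb = layers.Embedding(vocab_size, embed_dim)(dec_in)
dec_out, _, _ = layers.LSTM(latent_dim, return_sequences=True,
                            return_state=True)(dec_emb,
                                               initial_state=[state_h, state_c])
probs = layers.Dense(vocab_size, activation="softmax")(dec_out)

model = Model([enc_in, dec_in], probs)
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")
```

Freezing the embedding layer (`trainable=False`) keeps the GloVe semantic space intact during training; at inference time the decoder would be run token by token (e.g. with greedy or beam search) rather than on a full target sequence.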
Article Details
Issue
Section
Articles

This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.
How to Cite
Abstractive Text Summarization using LSTM, GLOVE, and TensorFlow. (2025). Architecture Image Studies, 7(1), 2914-2927. https://doi.org/10.62754/ais.v7i1.1342