Title of article :
Utilizing Gated Recurrent Units to Retain Long Term Dependencies with Recurrent Neural Network in Text Classification
Author/Authors :
Chandra, Nidhi (Research Scholar, Amity Institute of Information Technology, Amity University Uttar Pradesh, India); Ahuja, Laxmi (Deputy Director, Amity Institute of Information Technology, Amity University Uttar Pradesh, India); Khatri, Sunil K. (Amity University Tashkent, Uzbekistan); Monga, Himanshu (Jawaharlal Nehru Government Engineering College, Sundernagar, Mandi, Himachal Pradesh, India)
Pages :
14
From page :
89
To page :
102
Abstract :
Text classification is one of the key research areas in natural language processing. Organizations receive customer reviews and feedback on their products and want to act on them quickly; manual review takes considerable time and effort and can hurt product sales, so organizations increasingly apply machine learning algorithms to process such text in near real time. The gated recurrent unit (GRU), an extension of the recurrent neural network that adds a gating mechanism to the network, provides such a capability. Recurrent neural networks (RNNs) have proven to be the leading approach to sequence classification, as they can retain information from past steps and use it to adjust later predictions. The GRU mitigates the vanishing gradient problem, which lets the model learn long-term dependencies in text and benefits several use cases; sentiment analysis is one NLP use case that requires retaining such dependencies, which is why GRUs are combined with RNNs. This paper presents a text classification technique in which sequential word embeddings are processed by the sigmoid-gated units of a GRU within a recurrent neural network. The method embeds text into fixed-size matrices and explicitly carries long-term dependencies through the network. We applied the GRU model to a movie review dataset and achieved a classification accuracy of 87%.
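Illustrative example :
The gating mechanism the abstract refers to is, in the standard GRU formulation (Cho et al., 2014), a pair of sigmoid gates that control how much past state is kept:

z_t = \sigma(W_z x_t + U_z h_{t-1} + b_z)
r_t = \sigma(W_r x_t + U_r h_{t-1} + b_r)
\tilde{h}_t = \tanh(W_h x_t + U_h (r_t \odot h_{t-1}) + b_h)
h_t = (1 - z_t) \odot h_{t-1} + z_t \odot \tilde{h}_t

Here z_t (update gate) and r_t (reset gate) use the sigmoid function \sigma, which is what lets the hidden state h_t carry long-term dependencies forward. Below is a minimal sketch of the kind of classifier the abstract describes, assuming a Keras-style setup on the IMDB movie review dataset; the layer sizes and hyperparameters are illustrative assumptions, not values taken from the paper.

# Minimal GRU text-classifier sketch (Keras). All hyperparameters are
# illustrative assumptions, not values reported in the paper.
import tensorflow as tf
from tensorflow.keras import layers, models

VOCAB_SIZE = 20000  # assumed vocabulary cutoff
MAX_LEN = 200       # assumed fixed (padded/truncated) review length
EMBED_DIM = 128     # assumed word-embedding dimension

model = models.Sequential([
    # Word-embedding layer: maps each word index to a dense vector,
    # yielding the fixed-size matrix representation of the text.
    layers.Embedding(VOCAB_SIZE, EMBED_DIM),
    # GRU layer: its update and reset gates (sigmoid activations)
    # decide how much past state to keep, retaining long-term dependencies.
    layers.GRU(64),
    # Binary output for positive/negative sentiment.
    layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy",
              metrics=["accuracy"])

# Example run on the Keras IMDB movie-review dataset.
(x_tr, y_tr), (x_te, y_te) = tf.keras.datasets.imdb.load_data(
    num_words=VOCAB_SIZE)
x_tr = tf.keras.preprocessing.sequence.pad_sequences(x_tr, maxlen=MAX_LEN)
x_te = tf.keras.preprocessing.sequence.pad_sequences(x_te, maxlen=MAX_LEN)
model.fit(x_tr, y_tr, epochs=3, validation_data=(x_te, y_te))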
Keywords :
Gated Recurrent Units, Recurrent Neural Network, Word Embedding, Deep Learning, LSTM
Journal title :
Journal of Information Systems and Telecommunication
Serial Year :
2021
Record number :
2703127