Title of article :
Deep Learning Structure for Cross-Domain Sentiment Classification Based on Improved Cross Entropy and Weight
Author/Authors :
Fei, Rong (College of Computer Science and Engineering, Xi’an University of Technology, China); Yao, Quanzhu (College of Computer Science and Engineering, Xi’an University of Technology, China); Zhu, Yuanbo (China Railway First Survey and Design Institute, China); Xu, Qingzheng (College of Information and Communication, National University of Defense Technology, Changsha, China); Li, Aimin (College of Computer Science and Engineering, Xi’an University of Technology, China); Wu, Haozheng (College of Computer Science and Engineering, Xi’an University of Technology, China); Hu, Bo (Beijing Huadian Youkong Technology Co., Ltd., China)
Pages :
20
From page :
1
To page :
20
Abstract :
In sentiment classification, the convolutional neural network (CNN) and the long short-term memory (LSTM) network are praised for their classification and prediction performance, but their accuracy, loss rate, and training time are not ideal. To this end, a deep learning structure combining an improved cross-entropy loss and word-level weighting is proposed for cross-domain sentiment classification, which aims at better text sentiment classification by optimizing and improving the recurrent neural network (RNN) and the CNN. Firstly, the ideas of the hinge loss and the triplet loss are used to improve the cross-entropy loss. The improved cross-entropy loss function is combined with the CNN model and the LSTM network, and both are tested on binary classification problems. Then, the LSTM binary-optimize (LSTM-BO) model and the CNN binary-optimize (CNN-BO) model are proposed, which are more effective at fitting prediction errors and preventing overfitting. Finally, considering the text-processing characteristics of the recurrent neural network, the influence of each input word on the final classification is analysed to obtain the importance of each word to the classification result. The experimental results show that, within the same time, the proposed weight-recurrent neural network (W-RNN) model assigns higher weight to words with stronger emotional tendency, reducing the loss of emotional information and improving classification accuracy.
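The abstract describes improving the cross-entropy loss with hinge- and triplet-style margin ideas for binary sentiment classification. The following is only a minimal illustrative sketch of that general idea in PyTorch, not the authors' exact formulation; the function name, the margin value, and the way the margin term is combined with cross entropy are all assumptions made for illustration.

```python
import torch
import torch.nn.functional as F

def margin_cross_entropy(logits, targets, margin=0.5):
    """Binary cross entropy augmented with a hinge-style margin term.

    Illustrative only: sketches the idea of combining cross entropy with a
    margin penalty, as described in the abstract, under assumed details.

    logits:  raw model outputs, shape (batch,)
    targets: 0/1 labels, shape (batch,)
    margin:  assumed margin hyperparameter
    """
    targets = targets.float()
    # Standard binary cross entropy computed on the raw logits.
    ce = F.binary_cross_entropy_with_logits(logits, targets)
    # Signed score is positive when the prediction points toward the true class.
    signed = torch.where(targets > 0.5, logits, -logits)
    # Hinge-style penalty for correct-class scores that fall inside the margin.
    hinge = torch.clamp(margin - signed, min=0).mean()
    return ce + hinge

# Example usage with random data (batch of 8):
logits = torch.randn(8)
labels = torch.randint(0, 2, (8,))
loss = margin_cross_entropy(logits, labels)
```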
Keywords :
Deep Learning Structure, Cross-Domain Sentiment Classification, Weight, Improved Cross Entropy
Journal title :
Scientific Programming
Serial Year :
2020
Full Text URL :
Record number :
2610873
Link To Document :