A Deep Learning Method Based Self-Attention and Bi-directional LSTM in Emotion Classification

Rong Fei,
Yuanbo Zhu,
Quanzhu Yao,
Qingzheng Xu,
Bo Hu

Abstract


Traditional recurrent neural networks cannot be parallelized, while convolutional neural networks cannot directly process variable-length sequences. In this study, we combine the bidirectional long short-term memory (Bi-LSTM) model with a self-attention mechanism to form the SA-BiLSTM method, which further improves the performance of emotion classification. SA-BiLSTM obtains an attention probability distribution by computing the correlation between each intermediate state and the final state, and weights the state at each time step accordingly. This reduces information redundancy while retaining valid information, and the resulting optimized text feature vector improves the accuracy of text classification. Experimental results on three different data sets show that SA-BiLSTM outperforms six other emotion classification methods in terms of accuracy, loss, running time, and other performance indicators.
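
To make the attention step described above concrete, the following is a minimal PyTorch sketch, not the authors' implementation: it assumes dot-product scoring between each intermediate Bi-LSTM state and the final state, and the layer sizes, vocabulary size, and number of emotion classes are illustrative placeholders.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class SABiLSTM(nn.Module):
    """Illustrative sketch of an SA-BiLSTM classifier: a Bi-LSTM whose
    per-step states are re-weighted by their correlation with the final
    state (self-attention) and then pooled for emotion classification.
    Sizes and the dot-product scoring are assumptions, not the paper's spec."""

    def __init__(self, vocab_size, embed_dim=128, hidden_dim=128, num_classes=2):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, embed_dim, padding_idx=0)
        self.bilstm = nn.LSTM(embed_dim, hidden_dim, batch_first=True,
                              bidirectional=True)
        self.classifier = nn.Linear(2 * hidden_dim, num_classes)

    def forward(self, token_ids):
        # token_ids: (batch, seq_len) integer word indices
        emb = self.embedding(token_ids)                   # (B, T, E)
        states, _ = self.bilstm(emb)                      # (B, T, 2H) intermediate states
        final_state = states[:, -1, :]                    # (B, 2H) final state
        # Correlation between each intermediate state and the final state
        scores = torch.bmm(states, final_state.unsqueeze(2)).squeeze(2)  # (B, T)
        alpha = F.softmax(scores, dim=1)                  # attention probability distribution
        # Weight each time step by its attention probability and pool
        context = torch.bmm(alpha.unsqueeze(1), states).squeeze(1)       # (B, 2H)
        return self.classifier(context)

# Example usage with random token indices (hypothetical 10k-word vocabulary, 6 emotion classes)
model = SABiLSTM(vocab_size=10000, num_classes=6)
logits = model(torch.randint(1, 10000, (4, 20)))          # 4 sentences, 20 tokens each
print(logits.shape)                                        # torch.Size([4, 6])
```
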


Citation Format:
Rong Fei, Yuanbo Zhu, Quanzhu Yao, Qingzheng Xu, Bo Hu, "A Deep Learning Method Based Self-Attention and Bi-directional LSTM in Emotion Classification," Journal of Internet Technology, vol. 21, no. 5, pp. 1447-1461, Sep. 2020.


Published by Executive Committee, Taiwan Academic Network, Ministry of Education, Taipei, Taiwan, R.O.C
JIT Editorial Office, Office of Library and Information Services, National Dong Hwa University
No. 1, Sec. 2, Da Hsueh Rd., Shoufeng, Hualien 974301, Taiwan, R.O.C.
Tel: +886-3-931-7314  E-mail: jit.editorial@gmail.com