Abstract: Current deep learning methods for sentence-level emotion analysis fail to make full use of affective language resources such as emotion words, negation words, and degree adverbs. We propose a new model based on bidirectional encoder representations from transformers (BERT) and dual-channel attention. One channel, based on a bidirectional GRU (BiGRU) network, extracts semantic features, while the other, based on a fully connected network, extracts emotional features. An attention mechanism is introduced into both channels to better capture key information, and the pre-trained BERT model provides word vectors that are then adjusted dynamically according to context, so that genuine emotional semantics are embedded into the model. The final sentence representation is obtained by fusing the semantic and emotional features from the two channels. The experimental results show that BERT has stronger feature-extraction ability than other word-vector tools, while the emotional channel and the attention mechanism enhance the model's ability to capture emotional semantics, significantly improving emotion-classification performance as well as convergence speed and stability.
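To make the dual-channel design concrete, the sketch below mimics the data flow described above with NumPy: BERT token embeddings are replaced by random placeholders, the BiGRU semantic channel by a toy bidirectional tanh recurrence, and the emotional channel by per-token lexicon indicators (emotion word, negation word, degree adverb) passed through a dense layer; both channels are pooled by soft attention and their summaries are concatenated for classification. All dimensions, weights, and the simplified recurrence are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def attention_pool(H, w):
    """Soft attention: weight each time step by softmax(H @ w), then sum."""
    a = softmax(H @ w)                # (T,) attention weights
    return a @ H, a                   # pooled vector (d,), weights (T,)

T, d_bert, d_hid, d_lex = 6, 8, 4, 3  # toy sizes, chosen for illustration

# Stand-in for BERT contextual token embeddings (the real model would
# obtain these from a fine-tuned BERT encoder).
X = rng.normal(size=(T, d_bert))

# --- Semantic channel: toy bidirectional recurrence (BiGRU stand-in) ---
Wf, Uf = rng.normal(size=(d_bert, d_hid)), rng.normal(size=(d_hid, d_hid))
Wb, Ub = rng.normal(size=(d_bert, d_hid)), rng.normal(size=(d_hid, d_hid))

def run_rnn(X, W, U):
    h, out = np.zeros(d_hid), []
    for x in X:                       # simple tanh recurrence, not a full GRU
        h = np.tanh(x @ W + h @ U)
        out.append(h)
    return np.stack(out)

# Concatenate forward and backward passes, as a BiGRU would.
H_sem = np.concatenate([run_rnn(X, Wf, Uf),
                        run_rnn(X[::-1], Wb, Ub)[::-1]], axis=1)  # (T, 2*d_hid)
sem_vec, sem_att = attention_pool(H_sem, rng.normal(size=2 * d_hid))

# --- Emotional channel: lexicon features -> dense layer -> attention ---
# Per-token binary indicators: [emotion word, negation word, degree adverb].
L = rng.integers(0, 2, size=(T, d_lex)).astype(float)
W_e = rng.normal(size=(d_lex, d_hid))
H_emo = np.tanh(L @ W_e)                                          # (T, d_hid)
emo_vec, emo_att = attention_pool(H_emo, rng.normal(size=d_hid))

# --- Fusion: concatenate both channel summaries, then classify ---
fused = np.concatenate([sem_vec, emo_vec])                        # (3*d_hid,)
W_out = rng.normal(size=(fused.size, 2))                          # 2 classes
probs = softmax(fused @ W_out)
```

The key design point the sketch reflects is that each channel is pooled by its own attention before fusion, so the model can emphasize different tokens when summarizing semantics than when summarizing emotional cues.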