English Abstract
Aspect-level sentiment classification is a key topic in natural language processing (NLP). Existing sentiment classification models have drawbacks such as weak perception of word-aspect associations and poor generalization ability. In this paper, we develop a BERT-based interactive attention network (BIAN) to improve aspect-level short-text sentiment classification. First, we use the BERT model as an encoder to extract different types of context features. Next, we build interactive attention networks that learn attention between context and aspect words in both directions. The final attention representations are then constructed and the classification results are output. Experiments on multiple datasets demonstrate that BIAN achieves state-of-the-art performance.
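The interactive attention step described above can be sketched as follows. This is a hypothetical illustration, not the authors' implementation: random NumPy arrays stand in for BERT encoder outputs, the hidden size, pooling choices, and the 3-way classifier head are all assumptions made for brevity.

```python
import numpy as np

np.random.seed(0)
d = 8                          # hidden size (a real BERT encoder would give e.g. 768)
ctx = np.random.randn(5, d)    # context token states (stand-in for BERT output)
asp = np.random.randn(2, d)    # aspect token states (stand-in for BERT output)

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

# Average-pool each side to form a query vector for the other side
ctx_avg = ctx.mean(axis=0)
asp_avg = asp.mean(axis=0)

# Interactive attention: the aspect query attends over context tokens, and vice versa
a_ctx = softmax(ctx @ asp_avg)   # attention weights over context, guided by the aspect
a_asp = softmax(asp @ ctx_avg)   # attention weights over aspect, guided by the context

ctx_rep = a_ctx @ ctx            # attended context representation
asp_rep = a_asp @ asp            # attended aspect representation

# Final representation: concatenate both views and project to sentiment classes
final = np.concatenate([ctx_rep, asp_rep])
W = np.random.randn(3, 2 * d)    # hypothetical 3-class head: negative / neutral / positive
probs = softmax(W @ final)
```

Each side thus conditions its summary on the other, which is what gives the model its word-aspect associative perception.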