Paying Attention to Product Reviews: Sentiment Analysis with Additive, Multiplicative, and Local Attention Mechanisms

Greg Eastman
MAS, 2021
Wu, Yingnian
Every business needs to understand its consumers’ experiences to succeed, but it is impossible for a large company to read every review. By having a computer read customer responses and relay their preferences, businesses can respond to consumers with a more informed approach. We begin this work by reviewing text preprocessing strategies as well as previous solutions to sentiment tasks. Then, using Amazon tool review data, we introduce the addition of an attention mechanism to a BiLSTM encoder-decoder framework. Three attention strategies are tested: additive, multiplicative, and local. To experimentally investigate their performance, we compare the attention models to each other as well as to two controls.
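As a minimal sketch of the three scoring strategies named above (not the thesis's actual implementation), the following NumPy snippet computes attention weights over a sequence of encoder hidden states using additive (Bahdanau-style), multiplicative (Luong "general"), and local (windowed) scoring. All dimensions, weight matrices, and the window position are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
d, T = 8, 10                      # hidden size and sequence length (illustrative)
H = rng.normal(size=(T, d))       # encoder hidden states, one row per timestep
s = rng.normal(size=(d,))         # current decoder state

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

# Additive: score_t = v^T tanh(W1 h_t + W2 s)
W1, W2, v = (rng.normal(size=(d, d)), rng.normal(size=(d, d)),
             rng.normal(size=(d,)))
add_scores = np.tanh(H @ W1.T + s @ W2.T) @ v

# Multiplicative ("general"): score_t = h_t^T W s
W = rng.normal(size=(d, d))
mul_scores = H @ (W @ s)

# Local: restrict attention to a window around an aligned position p
p, window = 5, 2
mask = np.full(T, -np.inf)
mask[max(0, p - window): p + window + 1] = 0.0
local_scores = mul_scores + mask  # -inf scores become zero weight after softmax

for scores in (add_scores, mul_scores, local_scores):
    weights = softmax(scores)     # attention distribution over timesteps
    context = weights @ H         # context vector: weighted sum of encoder states
```

In each case the softmax turns raw scores into a probability distribution over timesteps, and the context vector is the attention-weighted sum of encoder states; local attention simply zeroes out weights outside the window before that sum.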