Published January 1, 2021 | Version v1
Conference paper · Open Access

Sentiment and Context-refined Word Embeddings for Sentiment Analysis

  • 1. Middle East Technical University, Department of Computer Engineering, Ankara, Turkey
  • 2. Koc University, Department of International Relations, Istanbul, Turkey

Description

Word embeddings have become the de facto tool for representing text in natural language processing (NLP) tasks, as they can capture semantic and syntactic relations, unlike predecessors such as Bag-of-Words. Although word embeddings have been employed in numerous studies in recent years and have proven effective in many NLP tasks, they remain immature for sentiment analysis, as they carry insufficient sentiment information. General word embedding models pre-trained on large corpora with methods such as Word2Vec or GloVe achieve limited success in domain-specific NLP tasks. On the other hand, training domain-specific word embeddings from scratch requires large amounts of data and computational power. In this work, we target both shortcomings of pre-trained word embeddings to boost the performance of domain-specific sentiment analysis tasks. We propose a model that refines pre-trained word embeddings with context information and leverages the sentiment scores of sentences obtained from a lexicon-based method to further improve performance. Experimental results on two benchmark datasets show that the proposed method significantly increases the accuracy of sentiment classification.
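The abstract describes refining pre-trained word embeddings with sentiment information from a lexicon-based method, but does not spell out the refinement objective. Below is a minimal, self-contained sketch of the general idea: word vectors are nudged toward words of the same lexicon polarity and away from words of opposite polarity. The toy vectors, lexicon scores, and attract/repel update rule are all illustrative assumptions, not the paper's actual model; a real setup would load Word2Vec or GloVe vectors and a full sentiment lexicon.

```python
import math

# Hypothetical toy pre-trained embeddings (a real pipeline would load
# Word2Vec or GloVe vectors instead).
embeddings = {
    "good":  [0.9, 0.1, 0.3],
    "great": [0.2, 0.8, 0.4],
    "bad":   [0.5, 0.5, 0.9],
}

# Hypothetical sentiment lexicon: polarity scores in [-1, 1].
lexicon = {"good": 0.7, "great": 0.9, "bad": -0.8}

def cosine(u, v):
    """Cosine similarity between two vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv)

def refine(embeddings, lexicon, lr=0.3, steps=10):
    """Sketch of sentiment refinement: attract each vector to words
    with the same lexicon polarity, repel it from words with the
    opposite polarity. This is a generic illustration, not the
    paper's exact objective."""
    vecs = {w: list(v) for w, v in embeddings.items()}
    for _ in range(steps):
        updates = {}
        for w, v in vecs.items():
            delta = [0.0] * len(v)
            for u, uv in vecs.items():
                if u == w:
                    continue
                # Same polarity sign -> attract; opposite -> repel.
                sign = 1.0 if lexicon[w] * lexicon[u] > 0 else -1.0
                for i in range(len(v)):
                    delta[i] += sign * (uv[i] - v[i])
            n = len(vecs) - 1
            updates[w] = [v[i] + lr * delta[i] / n for i in range(len(v))]
        vecs = updates
    return vecs

refined = refine(embeddings, lexicon)
```

After refinement, "good" and "great" (both positive in the lexicon) end up more similar, while "good" and "bad" drift apart, which is the effect the abstract attributes to injecting sentiment information into general-purpose embeddings.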
