Conference paper, Open Access

Waste Not: Meta-Embedding of Word and Context Vectors

   Degirmenci, Selin; Gerek, Aydin; Ganiz, Murat Can

The word2vec and fastText models train two vectors per word: a word vector and a context vector. Typically, the context vectors are discarded after training, even though they may contain information useful for various NLP tasks. We therefore combine word and context vectors in the framework of meta-embeddings. Our experiments show performance gains on several NLP tasks, including text classification, semantic similarity, and analogy. In conclusion, this approach can improve performance on downstream tasks while requiring minimal additional computational resources.
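To illustrate the general idea, and not necessarily the authors' exact combination method, the sketch below extracts the input (word) and output (context) matrices from a gensim Word2Vec model trained with negative sampling and builds two simple meta-embeddings, one by averaging and one by concatenation. The toy corpus, vector size, and the use of gensim's syn1neg attribute for the context vectors are assumptions made for this example.

import numpy as np
from gensim.models import Word2Vec

# Toy corpus; in practice this would be a large text collection.
sentences = [["waste", "not", "want", "not"],
             ["word", "and", "context", "vectors"]]

# Skip-gram with negative sampling, so the output/context matrix is kept in syn1neg.
model = Word2Vec(sentences, vector_size=50, min_count=1, sg=1, negative=5, epochs=50)

word_vecs = model.wv.vectors      # input/word embeddings, shape (V, d)
context_vecs = model.syn1neg      # output/context embeddings, shape (V, d)

# Two simple meta-embedding strategies over the two vector sets:
avg_embeddings = (word_vecs + context_vecs) / 2.0       # averaging, shape (V, d)
cat_embeddings = np.hstack([word_vecs, context_vecs])   # concatenation, shape (V, 2d)

idx = model.wv.key_to_index["context"]
print(avg_embeddings[idx].shape, cat_embeddings[idx].shape)

Either combined matrix can then be used in place of the standard word vectors in a downstream task, which is the kind of substitution whose effect the paper evaluates.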
