Conference paper | Open Access
Degirmenci, Selin; Gerek, Aydin; Ganiz, Murat Can
<?xml version='1.0' encoding='utf-8'?>
<resource xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns="http://datacite.org/schema/kernel-4" xsi:schemaLocation="http://datacite.org/schema/kernel-4 http://schema.datacite.org/meta/kernel-4.1/metadata.xsd">
  <identifier identifierType="URL">https://aperta.ulakbim.gov.tr/record/68991</identifier>
  <creators>
    <creator>
      <creatorName>Degirmenci, Selin</creatorName>
      <givenName>Selin</givenName>
      <familyName>Degirmenci</familyName>
      <affiliation>Marmara Univ, TR-34730 Istanbul, Turkey</affiliation>
    </creator>
    <creator>
      <creatorName>Gerek, Aydin</creatorName>
      <givenName>Aydin</givenName>
      <familyName>Gerek</familyName>
      <affiliation>Marmara Univ, TR-34730 Istanbul, Turkey</affiliation>
    </creator>
    <creator>
      <creatorName>Ganiz, Murat Can</creatorName>
      <givenName>Murat Can</givenName>
      <familyName>Ganiz</familyName>
      <affiliation>Marmara Univ, TR-34730 Istanbul, Turkey</affiliation>
    </creator>
  </creators>
  <titles>
    <title>Waste Not: Meta-Embedding Of Word And Context Vectors</title>
  </titles>
  <publisher>Aperta</publisher>
  <publicationYear>2019</publicationYear>
  <dates>
    <date dateType="Issued">2019-01-01</date>
  </dates>
  <resourceType resourceTypeGeneral="Text">Conference paper</resourceType>
  <alternateIdentifiers>
    <alternateIdentifier alternateIdentifierType="url">https://aperta.ulakbim.gov.tr/record/68991</alternateIdentifier>
  </alternateIdentifiers>
  <relatedIdentifiers>
    <relatedIdentifier relatedIdentifierType="DOI" relationType="IsIdenticalTo">10.1007/978-3-030-23281-8_35</relatedIdentifier>
  </relatedIdentifiers>
  <rightsList>
    <rights rightsURI="http://www.opendefinition.org/licenses/cc-by">Creative Commons Attribution</rights>
    <rights rightsURI="info:eu-repo/semantics/openAccess">Open Access</rights>
  </rightsList>
  <descriptions>
    <description descriptionType="Abstract">The word2vec and fastText models train two vectors per word: a word vector and a context vector. Typically the context vectors are discarded after training, even though they may contain useful information for various NLP tasks. We therefore combine word and context vectors within the framework of meta-embeddings. Our experiments show performance gains on several NLP tasks, including text classification, semantic similarity, and analogy. This approach can thus improve performance on downstream tasks while requiring minimal additional computational resources.</description>
  </descriptions>
</resource>
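The abstract's core idea can be illustrated with a small sketch: rather than discarding a model's context (output) vectors after training, combine them with the word (input) vectors, e.g. by averaging or concatenation. The matrix names, sizes, and combination strategies below are illustrative assumptions for exposition, not details taken from the paper.

```python
import numpy as np

# Illustrative stand-ins for trained embedding matrices. In word2vec with
# negative sampling, W would be the input (word) embeddings and C the
# output (context) embeddings; here both are random placeholders.
rng = np.random.default_rng(0)
vocab_size, dim = 5, 4
W = rng.standard_normal((vocab_size, dim))  # word (input) vectors
C = rng.standard_normal((vocab_size, dim))  # context (output) vectors

# Two simple meta-embedding strategies:
meta_avg = (W + C) / 2.0                    # averaging: keeps dimension d
meta_cat = np.concatenate([W, C], axis=1)   # concatenation: dimension 2d

print(meta_avg.shape)  # (5, 4)
print(meta_cat.shape)  # (5, 8)
```

Averaging preserves the original dimensionality (cheap for downstream models), while concatenation retains all information from both matrices at the cost of doubling the embedding size.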
Views | 37 |
Downloads | 6 |
Data volume | 984 Bytes |
Unique views | 36 |
Unique downloads | 6 |