
Integrating distributional and lexical information for semantic classification of words using MRMF

  • Semantic classification of words using distributional features is usually based on the semantic similarity of words. We show on two different datasets that a classifier trained directly on the distributional features gives better results. We use Support Vector Machines (SVM) and Multirelational Matrix Factorization (MRMF) to train classifiers; both give similar results. However, MRMF, which has not previously been used for semantic classification with distributional features, can easily be extended with additional matrices that contribute information from different sources on the same problem. We demonstrate the effectiveness of this novel approach by including information from WordNet. Thus we show that MRMF provides an interesting approach for building semantic classifiers that (1) gives better results than unsupervised approaches based on vector similarity, (2) gives results comparable to other supervised methods, and (3) can naturally be extended with other sources of information in order to improve the results.
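The abstract's key idea, coupling several relation matrices through a shared word-factor matrix, can be sketched as follows. This is a minimal illustration under assumed shapes and synthetic data, not the paper's implementation: `X1` stands in for a words-by-context distributional matrix and `X2` for a words-by-WordNet-feature matrix, and both are factorized jointly so the shared factors `W` absorb information from both sources.

```python
# Minimal sketch of multi-relational matrix factorization (MRMF).
# All names, shapes, and hyperparameters are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
n_words, n_ctx, n_wn, k = 20, 15, 10, 4

X1 = rng.random((n_words, n_ctx))   # words x context features (distributional)
X2 = rng.random((n_words, n_wn))    # words x WordNet-derived features

W = rng.random((n_words, k)) * 0.1  # word factors, SHARED across relations
C = rng.random((n_ctx, k)) * 0.1    # context-feature factors
S = rng.random((n_wn, k)) * 0.1     # WordNet-feature factors

lr, reg = 0.01, 0.01
for _ in range(500):
    # Gradient steps on the squared reconstruction error of both relations;
    # W is updated jointly, so both sources shape the word representation.
    E1 = W @ C.T - X1
    E2 = W @ S.T - X2
    W -= lr * (E1 @ C + E2 @ S + reg * W)
    C -= lr * (E1.T @ W + reg * C)
    S -= lr * (E2.T @ W + reg * S)

err1 = np.linalg.norm(W @ C.T - X1)
err2 = np.linalg.norm(W @ S.T - X2)
```

Adding another information source then amounts to appending one more matrix and factor pair to the same loop, which is the extensibility the abstract highlights; a downstream classifier would be trained on the rows of `W` (or, as in the direct-feature setting, a label relation could itself be added as a matrix).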

Author:Rosa Tsegaye Aga, Lucas Drumond, Christian Wartena, Lars Schmidt-Thieme
Parent Title (English):Proceedings of COLING 2016, the 26th International Conference on Computational Linguistics: Technical Papers, Osaka, Japan, December 11-17, 2016
Document Type:Conference Proceeding
Year of Completion:2016
Publishing Institution:Hochschule Hannover
Release Date:2017/07/11
GND Keyword:Klassifikation; Semantik
First Page:2708
Last Page:2717
Link to catalogue:1759224634
Institutes:Fakultät III - Medien, Information und Design
DDC classes:020 Library and information sciences
Licence (German):Creative Commons - CC BY - Attribution 4.0 International