WordNet Is All You Need: A Surprisingly Effective Unsupervised Method for Graded Lexical Entailment
Conference paper, 2023

Abstract

We propose a simple unsupervised approach which relies exclusively on WordNet (Miller, 1995) for predicting graded lexical entailment (GLE) in English. Inspired by the seminal work of Resnik (1995), our method models GLE as the sum of two information-theoretic scores: a symmetric semantic similarity score and an asymmetric specificity loss score, both exploiting the hierarchical synset structure of WordNet. Our approach also includes a simple disambiguation mechanism to handle polysemy in a given word pair. Despite its simplicity, our method achieves performance above the state of the art (Spearman ρ = 0.75) on HyperLex (Vulić et al., 2017), the largest GLE dataset, outperforming all previous methods, including specialized word embedding approaches that use WordNet as weak supervision.
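The abstract's two-score idea can be illustrated with a small self-contained sketch. Everything here is hypothetical scaffolding rather than the paper's actual formulas: the toy taxonomy, the frequency counts, and the particular way the symmetric Resnik-style similarity and the asymmetric specificity loss are combined are all assumptions made for illustration.

```python
import math

# Hypothetical hypernym taxonomy (child -> parent), a toy stand-in for
# WordNet's synset hierarchy.
PARENT = {
    "dog": "canine", "cat": "feline",
    "canine": "carnivore", "feline": "carnivore",
    "carnivore": "animal", "animal": "entity",
    "entity": None,
}

# Hypothetical corpus frequencies used to estimate information content.
FREQ = {"dog": 30, "cat": 25, "canine": 5, "feline": 5,
        "carnivore": 3, "animal": 2, "entity": 1}

def ancestors(w):
    """Return the chain [w, parent(w), ..., root]."""
    chain = []
    while w is not None:
        chain.append(w)
        w = PARENT[w]
    return chain

def ic(w):
    """Resnik-style information content: -log p(w), where p(w) pools the
    frequency mass of w and all of its descendants."""
    total = sum(FREQ.values())
    mass = sum(f for node, f in FREQ.items() if w in ancestors(node))
    return -math.log(mass / total)

def lcs(a, b):
    """Least common subsumer: the first (deepest) shared ancestor."""
    anc_b = set(ancestors(b))
    for node in ancestors(a):  # ordered from a up to the root
        if node in anc_b:
            return node
    return None

def graded_entailment(x, y):
    """Sketch of a graded score for "x entails y": a symmetric similarity
    term (IC of the LCS, as in Resnik 1995) minus an asymmetric specificity
    loss that penalises x being *less* specific than y. This combination is
    an assumption, not necessarily the paper's exact formula."""
    sim = ic(lcs(x, y))                   # symmetric part
    spec_loss = max(0.0, ic(y) - ic(x))   # asymmetric part
    return sim - spec_loss
```

The asymmetric term is what makes the score directional: `graded_entailment("dog", "animal")` comes out higher than `graded_entailment("animal", "dog")`, mirroring the intuition that a hyponym entails its hypernym but not vice versa.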
Main file: emnlp23-GLE.pdf (88.43 KB). Origin: files produced by the author(s).

Dates and versions

hal-04250849 , version 1 (20-10-2023)

Licence

Attribution

Identifiers

  • HAL Id : hal-04250849 , version 1

Cite

Joseph Renner, Pascal Denis, Rémi Gilleron. WordNet Is All You Need: A Surprisingly Effective Unsupervised Method for Graded Lexical Entailment. Findings of the Association for Computational Linguistics: EMNLP 2023, 2023, Singapore. ⟨hal-04250849⟩