LREC-COLING 2024 Workshop

Unveiling Semantic Information in Sentence Embeddings

Proceedings of the Fifth International Workshop on Designing Meaning Representations @ LREC-COLING 2024

DOI:10.63317/35atjjx6ap2d

Abstract

This study evaluates the extent to which semantic information is preserved in sentence embeddings generated by state-of-the-art sentence embedding models: SBERT and LaBSE. Specifically, we analyzed 13 semantic attributes encoded in sentence embeddings. Our findings indicate that some semantic features (such as tense-related classes) can be decoded from sentence embedding representations. We also identify a limitation of current sentence embedding models: inferring meaning beyond the lexical level proves difficult.

Details

Paper ID
lrec2024-ws-dmr-05
Pages
pp. 39-47
BibKey
zhang-etal-2024-unveiling
Editor
N/A
Publisher
European Language Resources Association (ELRA) and ICCL
ISSN
N/A
ISBN
N/A
Workshop
Proceedings of the Fifth International Workshop on Designing Meaning Representations @ LREC-COLING 2024
Location
N/A
Date
20–25 May 2024

Authors

  • Leixin Zhang

  • David Burian

  • Vojtěch John

  • Ondřej Bojar

Links