
Attention Understands Semantic Relations

Proceedings of the Thirteenth International Conference on Language Resources and Evaluation (LREC 2022)

DOI:10.63317/2ohxv2w2e9rs

Abstract

Today, natural language processing relies heavily on pre-trained large language models. Although such models are criticized for their poor interpretability, they still yield state-of-the-art solutions for a wide range of very different tasks. While many probing studies have been conducted to measure models' awareness of grammatical knowledge, semantic probing is less common. In this work, we introduce a probing pipeline to study how well semantic relations are represented in transformer language models. We show that in this task, attention scores are nearly as expressive as the layers' output activations, despite their lesser ability to represent surface cues. This supports the hypothesis that attention mechanisms focus not only on syntactic relational information but also on semantic information.

Details

Paper ID
lrec2022-main-430
Pages
pp. 4040-4050
BibKey
chizhikova-etal-2022-attention
Editor
N/A
Publisher
European Language Resources Association (ELRA)
ISSN
2522-2686
ISBN
979-10-95546-38-2
Conference
Thirteenth Language Resources and Evaluation Conference
Location
Marseille, France
Date
20–25 June 2022

Authors

  • Anastasia Chizhikova

  • Sanzhar Murzakhmetov

  • Oleg Serikov

  • Tatiana Shavrina

  • Mikhail Burtsev
