LREC-COLING 2024 workshop

Multi-word Term Embeddings Improve Lexical Product Retrieval

Proceedings of the Seventh Workshop on e-Commerce and NLP @ LREC-COLING 2024

DOI:10.63317/2giicqnq73go

Abstract

Product search differs fundamentally from search for documents, Internet resources, or job vacancies, and therefore requires specialized search systems. The present work describes the H1 embedding model, designed for offline term indexing of product descriptions on e-commerce platforms. The model is compared to other state-of-the-art (SoTA) embedding models within the framework of a hybrid product search system that combines the advantages of lexical retrieval methods and semantic embedding-based methods. We propose an approach to building semantically rich term vocabularies for search indexes. Compared to other production semantic models, H1 paired with the proposed approach stands out due to its ability to process multi-word product terms as one token. For example, in the search queries “new balance shoes” and “gloria jeans kids wear”, the brand entities are represented as single tokens: “new balance” and “gloria jeans”. This increases the precision of the system without affecting recall. The hybrid search system with the proposed model scores mAP@12 = 56.1% and R@1k = 86.6% on the WANDS public dataset, outperforming other SoTA analogues.
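The abstract's core idea, emitting known multi-word entities such as brand names as single tokens, can be sketched with a greedy longest-match tokenizer. This is a minimal illustration under assumptions: the vocabulary below is hypothetical and the matching strategy is not necessarily the one used by the H1 model.

```python
# Minimal sketch of multi-word term tokenization: known multi-word
# entities (e.g. brands) are emitted as single tokens. The vocabulary
# and the greedy longest-match strategy are illustrative assumptions,
# not the paper's actual H1 implementation.

MULTI_WORD_TERMS = {"new balance", "gloria jeans"}  # hypothetical vocabulary
MAX_TERM_WORDS = max(len(t.split()) for t in MULTI_WORD_TERMS)

def tokenize(query: str) -> list[str]:
    """Greedy longest-match: prefer multi-word vocabulary entries."""
    words = query.lower().split()
    tokens, i = [], 0
    while i < len(words):
        # Try the longest candidate span first, down to two words.
        for n in range(min(MAX_TERM_WORDS, len(words) - i), 1, -1):
            candidate = " ".join(words[i:i + n])
            if candidate in MULTI_WORD_TERMS:
                tokens.append(candidate)
                i += n
                break
        else:
            tokens.append(words[i])  # fall back to a single-word token
            i += 1
    return tokens

print(tokenize("new balance shoes"))       # ['new balance', 'shoes']
print(tokenize("gloria jeans kids wear"))  # ['gloria jeans', 'kids', 'wear']
```

In an offline indexing setting, such a tokenizer would be applied to product descriptions before term vectors are computed, so that a brand like “new balance” receives one embedding rather than two unrelated word embeddings.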

Details

Paper ID
lrec2024-ws-ecnlp-12
Pages
pp. 115-124
BibKey
shcherbakov-krasnov-2024-multi
Editor
N/A
Publisher
European Language Resources Association (ELRA) and ICCL
ISSN
N/A
ISBN
N/A
Workshop
Proceedings of the Seventh Workshop on e-Commerce and NLP @ LREC-COLING 2024
Location
Torino, Italy
Date
20-25 May 2024

Authors

  • VS

    Viktor Shcherbakov

  • FK

    Fedor Krasnov
