Schick, Timo and Schütze, Hinrich (6 July 2020): BERTRAM: Improved Word Embeddings Have Big Impact on Contextualized Model Performance. The 58th Annual Meeting of the Association for Computational Linguistics, Seattle, USA, July 6 – 8, 2020. [PDF, 399kB]

Abstract

Pretraining deep language models has led to large performance gains in NLP. Despite this success, Schick and Schütze (2020) recently showed that these models struggle to understand rare words. For static word embeddings, this problem has been addressed by separately learning representations for rare words. In this work, we transfer this idea to pretrained language models: We introduce BERTRAM, a powerful architecture based on BERT that is capable of inferring high-quality embeddings for rare words that are suitable as input representations for deep language models. This is achieved by enabling the surface form and contexts of a word to interact with each other in a deep architecture. Integrating BERTRAM into BERT leads to large performance increases due to improved representations of rare and medium frequency words on both a rare word probing task and three downstream tasks.
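To make the form-context idea in the abstract concrete, the sketch below shows one hypothetical way a surface-form signal (character n-gram embeddings) and encodings of a word's contexts could interact to yield an embedding for a rare word. All names, dimensions, and the gating scheme are illustrative assumptions for exposition; this is not the BERTRAM implementation described in the paper.

```python
# Minimal, hypothetical sketch of a form-context embedder (not the BERTRAM code).
import torch
import torch.nn as nn

class FormContextEmbedder(nn.Module):
    """Infers an embedding for a rare word from (a) its surface form,
    modelled here as a bag of character n-gram ids, and (b) encodings of
    contexts in which the word occurs (e.g. produced by a frozen BERT)."""

    def __init__(self, n_ngrams: int, dim: int = 768, n_heads: int = 8):
        super().__init__()
        self.ngram_emb = nn.Embedding(n_ngrams, dim)        # surface-form lookup
        self.attn = nn.MultiheadAttention(dim, n_heads, batch_first=True)
        self.gate = nn.Linear(2 * dim, 1)                    # form/context mixing weight

    def forward(self, ngram_ids: torch.Tensor,
                context_encodings: torch.Tensor) -> torch.Tensor:
        # ngram_ids: (batch, n_grams); context_encodings: (batch, n_contexts, dim)
        form = self.ngram_emb(ngram_ids).mean(dim=1, keepdim=True)   # (batch, 1, dim)
        # Let the surface form attend over the word's contexts so that both
        # sources of evidence interact before the final embedding is formed.
        ctx, _ = self.attn(form, context_encodings, context_encodings)
        alpha = torch.sigmoid(self.gate(torch.cat([form, ctx], dim=-1)))
        return (alpha * form + (1 - alpha) * ctx).squeeze(1)         # (batch, dim)
```

In such a setup, the resulting vector would replace the rare word's input embedding when BERT is run on a downstream example, which is the sense in which the abstract speaks of embeddings "suitable as input representations for deep language models."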
