Please use this identifier to cite or link to this item:
https://www.um.edu.mt/library/oar/handle/123456789/104594
Full metadata record
DC Field | Value | Language |
---|---|---|
dc.contributor.author | Tanti, Marc | - |
dc.contributor.author | van der Plas, Lonneke | - |
dc.contributor.author | Borg, Claudia | - |
dc.contributor.author | Gatt, Albert | - |
dc.date.accessioned | 2022-12-21T11:04:43Z | - |
dc.date.available | 2022-12-21T11:04:43Z | - |
dc.date.issued | 2021 | - |
dc.identifier.citation | Tanti, M., van der Plas, L., Borg, C., & Gatt, A. (2021). On the language-specificity of multilingual BERT and the impact of fine-tuning. In Proceedings of the Fourth BlackboxNLP Workshop on Analyzing and Interpreting Neural Networks for NLP, Virtual Conference. 214-227. | en_GB |
dc.identifier.uri | https://www.um.edu.mt/library/oar/handle/123456789/104594 | - |
dc.description.abstract | Recent work has shown evidence that the knowledge acquired by multilingual BERT (mBERT) has two components: a language-specific and a language-neutral one. This paper analyses the relationship between them, in the context of fine-tuning on two tasks – POS tagging and natural language inference – which require the model to bring to bear different degrees of language-specific knowledge. Visualisations reveal that mBERT loses the ability to cluster representations by language after fine-tuning, a result that is supported by evidence from language identification experiments. However, further experiments on ‘unlearning’ language-specific representations using gradient reversal and iterative adversarial learning are shown not to add further improvement to the language-independent component over and above the effect of fine-tuning. The results presented here suggest that the process of fine-tuning causes a reorganisation of the model’s limited representational capacity, enhancing language-independent representations at the expense of language-specific ones. | en_GB |
dc.language.iso | en | en_GB |
dc.publisher | Association for Computational Linguistics | en_GB |
dc.rights | info:eu-repo/semantics/restrictedAccess | en_GB |
dc.subject | Artificial intelligence | en_GB |
dc.subject | Multilingual computing | en_GB |
dc.subject | Computational linguistics | en_GB |
dc.title | On the language-specificity of multilingual BERT and the impact of fine-tuning | en_GB |
dc.type | conferenceObject | en_GB |
dc.rights.holder | The copyright of this work belongs to the author(s)/publisher. The rights of this work are as defined by the appropriate Copyright Legislation or as modified by any successive legislation. Users may access this work and can make use of the information contained in accordance with the Copyright Legislation provided that the author must be properly acknowledged. Further distribution or reproduction in any format is prohibited without the prior permission of the copyright holder. | en_GB |
dc.bibliographicCitation.conferencename | Proceedings of the Fourth BlackboxNLP Workshop on Analyzing and Interpreting Neural Networks for NLP | en_GB |
dc.bibliographicCitation.conferenceplace | Virtual conference, 11/11/2021 | en_GB |
dc.description.reviewed | peer-reviewed | en_GB |
Appears in Collections: | Scholarly Works - InsLin |
Files in This Item:
File | Description | Size | Format |
---|---|---|---|
On_the_language_specificity_of_multilingual_BERT_and_the_impact_of_fine_tuning_2021.pdf | Restricted Access | 1.86 MB | Adobe PDF |
Items in OAR@UM are protected by copyright, with all rights reserved, unless otherwise indicated.