Please use this identifier to cite or link to this item:
Title: BERT prescriptions to avoid unwanted headaches : a comparison of transformer architectures for adverse drug event detection
Authors: Portelli, B
Lenzi, E
Chersoni, E 
Serra, G
Santus, E
Issue Date: Apr-2021
Source: Proceedings of the 16th Conference of the European Chapter of the Association for Computational Linguistics, EACL, April 2021, main volume, p. 1740-1747
Abstract: Pretrained transformer-based models, such as BERT and its variants, have become a common choice to obtain state-of-the-art performances in NLP tasks. In the identification of Adverse Drug Events (ADE) from social media texts, for example, BERT architectures rank first in the leaderboard. However, a systematic comparison between these models has not yet been done. In this paper, we aim at shedding light on the differences between their performance by analyzing the results of 12 models, tested on two standard benchmarks.
SpanBERT and PubMedBERT emerged as the best models in our evaluation: this result clearly shows that span-based pretraining gives a decisive advantage in the precise recognition of ADEs, and that in-domain language pretraining is particularly useful when the transformer model is trained just on biomedical text from scratch.
Publisher: Association for Computational Linguistics
Rights: © 1963–2021 ACL; other materials are copyrighted by their respective copyright holders. Materials prior to 2016 here are licensed under the Creative Commons Attribution-NonCommercial-ShareAlike 3.0 International License. Permission is granted to make copies for the purposes of teaching and research. Materials published in or after 2016 are licensed on a Creative Commons Attribution 4.0 International License.
The following publication Portelli, B., Lenzi, E., Chersoni, E., Serra, G., & Santus, E. (2021, April). BERT Prescriptions to Avoid Unwanted Headaches: A Comparison of Transformer Architectures for Adverse Drug Event Detection. In Proceedings of the 16th Conference of the European Chapter of the Association for Computational Linguistics: Main Volume (pp. 1740-1747) is available at
Appears in Collections: Conference Paper

Files in This Item:
File: 2021.eacl-main.149.pdf
Size: 236.3 kB
Format: Adobe PDF
Open Access Information
Status: open access
File Version: Version of Record


Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.