BibTeX record conf/emnlp/HassidPRKMS022
@inproceedings{DBLP:conf/emnlp/HassidPRKMS022,
  author    = {Michael Hassid and
               Hao Peng and
               Daniel Rotem and
               Jungo Kasai and
               Ivan Montero and
               Noah A. Smith and
               Roy Schwartz},
  editor    = {Yoav Goldberg and
               Zornitsa Kozareva and
               Yue Zhang},
  title     = {How Much Does Attention Actually Attend? Questioning the Importance
               of Attention in Pretrained Transformers},
  booktitle = {Findings of the Association for Computational Linguistics: {EMNLP}
               2022, Abu Dhabi, United Arab Emirates, December 7-11, 2022},
  pages     = {1403--1416},
  publisher = {Association for Computational Linguistics},
  year      = {2022},
  url       = {https://doi.org/10.18653/v1/2022.findings-emnlp.101},
  doi       = {10.18653/V1/2022.FINDINGS-EMNLP.101},
  timestamp = {Wed, 21 Feb 2024 11:48:05 +0100},
  biburl    = {https://dblp.org/rec/conf/emnlp/HassidPRKMS022.bib},
  bibsource = {dblp computer science bibliography, https://dblp.org}
}