Catalogue search
Hits 1 – 6 of 6
1
Summarize-then-Answer: Generating Concise Explanations for Multi-hop Reading Comprehension ...
The 2021 Conference on Empirical Methods in Natural Language Processing 2021; ., Harsh; Balasubramanian, Niranjan. - Underline Science Inc., 2021
BASE
2
SHAPE: Shifted Absolute Position Embedding for Transformers ...
The 2021 Conference on Empirical Methods in Natural Language Processing 2021; Inui, Kentaro; Kiyono, Shun. - Underline Science Inc., 2021
BASE
3
Incorporating Residual and Normalization Layers into Analysis of Masked Language Models ...
The 2021 Conference on Empirical Methods in Natural Language Processing 2021; Inui, Kentaro; Kobayashi, Goro; Kuribayashi, Tatsuki; Yokoi, Sho. - Underline Science Inc., 2021
Anthology paper link: https://aclanthology.org/2021.emnlp-main.373/
Abstract: The Transformer architecture has become ubiquitous in natural language processing. To interpret Transformer-based models, their attention patterns have been analyzed extensively. However, the Transformer architecture does not consist of multi-head attention alone; other components can also contribute to Transformers' performance. In this study, we extended the scope of the analysis of Transformers from the attention patterns alone to the whole attention block, i.e., multi-head attention, residual connection, and layer normalization. Our analysis of Transformer-based masked language models shows that the token-to-token interaction performed via attention has less impact on the intermediate representations than previously assumed. These results provide new intuitive explanations of existing reports; for example, discarding the learned attention patterns tends not to adversely affect performance. The codes ...
Keyword: Computational Linguistics; Language Models; Machine Learning; Machine Learning and Data Mining; Natural Language Processing
URL: https://dx.doi.org/10.48448/jh7m-qw81
https://underline.io/lecture/38024-incorporating-residual-and-normalization-layers-into-analysis-of-masked-language-models
BASE
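The abstract above describes widening attention analysis from the attention patterns alone to the whole attention block: multi-head attention, residual connection, and layer normalization. The sketch below is a minimal, hypothetical illustration of that decomposition, not the authors' code: it builds a single-head, post-LayerNorm block with random weights and compares the norm of the attention output against the norm of the residual (input) path, a rough proxy for how much token-to-token mixing actually changes the intermediate representation. All shapes, the single-head simplification, and the random weights are assumptions made for illustration.

```python
# Minimal sketch (not the paper's code): where attention, residual connection,
# and layer normalization enter a BERT-style (post-LN) attention block.
import torch
import torch.nn.functional as F

torch.manual_seed(0)

seq_len, d_model = 8, 64
x = torch.randn(seq_len, d_model)                 # token representations entering the block

# Single-head self-attention with random projections (illustrative only).
W_q = torch.randn(d_model, d_model) / d_model ** 0.5
W_k = torch.randn(d_model, d_model) / d_model ** 0.5
W_v = torch.randn(d_model, d_model) / d_model ** 0.5

q, k, v = x @ W_q, x @ W_k, x @ W_v
attn = F.softmax(q @ k.T / d_model ** 0.5, dim=-1)
attn_out = attn @ v                               # token-to-token mixing via attention

# Post-LN block as in masked language models: LayerNorm(x + Attention(x)).
ln = torch.nn.LayerNorm(d_model)
block_out = ln(x + attn_out)

# Rough proxy for each path's contribution: vector norms before the LayerNorm.
print("mean ||attention output||:", attn_out.norm(dim=-1).mean().item())
print("mean ||residual (input)|| :", x.norm(dim=-1).mean().item())
print("output shape:", tuple(block_out.shape))
```

In the paper's setting this kind of comparison is made on trained masked language models; the random weights here only show how the residual and normalization terms sit alongside the attention output in the computation.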
4
Pseudo Zero Pronoun Resolution Improves Zero Anaphora Resolution ...
The 2021 Conference on Empirical Methods in Natural Language Processing 2021; Inui, Kentaro; Kiyono, Shun. - Underline Science Inc., 2021
BASE
5
Exploring Methods for Generating Feedback Comments for Writing Learning ...
The 2021 Conference on Empirical Methods in Natural Language Processing 2021; Hanawa, Kazuaki; Inui, Kentaro. - Underline Science Inc., 2021
BASE
6
Transformer-based Lexically Constrained Headline Generation ...
The 2021 Conference on Empirical Methods in Natural Language Processing 2021; Hitomi, Yuta; Inui, Kentaro. - Underline Science Inc., 2021
BASE