Catalogue search
Refine your search:
Keyword:
Computational Linguistics (3)
Condensed Matter Physics (2)
FOS Physical sciences (2)
Information and Knowledge Engineering (2)
Neural Network (2)
Semantics (2)
Computation and Language cs.CL (1)
Deep Learning (1)
Electromagnetism (1)
FOS Computer and information sciences (1)
Creator / Publisher:
Eisenschlos, Julian (3)
Müller, Thomas (3)
Krichene, Syrine (2)
The Joint Conference of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing 2021 (2)
Cohen, William (1)
Czapla, Piotr (1)
Eisenschlos, Julian Martin (1)
Gor, Maharshi (1)
Gugger, Sylvain (1)
Howard, Jeremy (1)
Hits 1 – 4 of 4
1
DoT: An efficient Double Transformer for NLP tasks with tables ...
The Joint Conference of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing 2021; Eisenschlos, Julian; Krichene, Syrine. - Underline Science Inc., 2021
BASE
2
MATE: Multi-view Attention for Table Transformer Efficiency ...
The 2021 Conference on Empirical Methods in Natural Language Processing 2021; Cohen, William; Eisenschlos, Julian; Gor, Maharshi; Müller, Thomas. - Underline Science Inc., 2021
Abstract:
Anthology paper link: https://aclanthology.org/2021.emnlp-main.600/
This work presents a sparse-attention Transformer architecture for modeling documents that contain large tables. Tables are ubiquitous on the web and rich in information. However, more than 20% of relational tables on the web have 20 or more rows (Cafarella et al., 2008), and these large tables present a challenge for current Transformer models, which are typically limited to 512 tokens. Here we propose MATE, a novel Transformer architecture designed to model the structure of web tables. MATE uses sparse attention in a way that allows heads to efficiently attend to either rows or columns in a table. This architecture scales linearly with respect to speed and memory, and can handle documents containing more than 8000 tokens with current accelerators. MATE also has a more appropriate inductive bias for tabular data, and sets a new state of the art for three table reasoning datasets. For HybridQA (Chen et al., 2020b), a dataset ...
Keyword: Computational Linguistics; Machine Learning; Machine Learning and Data Mining; Natural Language Processing; Sentiment Analysis
URL:
https://underline.io/lecture/37326-mate-multi-view-attention-for-table-transformer-efficiency
https://dx.doi.org/10.48448/fney-f862
BASE
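The MATE abstract above describes attention heads that attend only within a single row or a single column of a table. As a minimal, hypothetical sketch of that idea (illustrative names and shapes only, not the paper's implementation), the two mask types can be built from per-token table coordinates:

```python
import numpy as np

def make_sparse_masks(rows, cols):
    """Build row-head and column-head attention masks.

    rows, cols: length-n integer sequences giving each token's table
    coordinates. Returns two boolean (n, n) masks: True entries mark
    token pairs a head may attend between.
    """
    rows = np.asarray(rows)
    cols = np.asarray(cols)
    row_mask = rows[:, None] == rows[None, :]  # True where tokens share a row
    col_mask = cols[:, None] == cols[None, :]  # True where tokens share a column
    return row_mask, col_mask

# A 2x2 table flattened in row-major order: one (row, col) pair per token.
rows = [0, 0, 1, 1]
cols = [0, 1, 0, 1]
row_mask, col_mask = make_sparse_masks(rows, cols)
```

Under either mask a token attends only to the tokens in its own row or column, so attention cost grows with the row and column lengths rather than quadratically in the full token count, which is the linear scaling the abstract claims.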
3
DoT: An efficient Double Transformer for NLP tasks with tables ...
The Joint Conference of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing 2021; Eisenschlos, Julian; Krichene, Syrine. - Underline Science Inc., 2021
BASE
4
MultiFiT: Efficient Multi-lingual Language Model Fine-tuning ...
Eisenschlos, Julian Martin; Ruder, Sebastian; Czapla, Piotr. - arXiv, 2019
BASE
© 2013 - 2024 Lin|gu|is|tik