Catalogue search
Hits 1 – 6 of 6
1. One Teacher is Enough? Pre-trained Language Model Distillation from Multiple Teachers ...
The Joint Conference of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing 2021; Huang, Yongfeng; Wu, Chuhan. Underline Science Inc., 2021. (BASE)
2. Hi-Transformer: Hierarchical Interactive Transformer for Efficient and Effective Long Document Modeling ...
The Joint Conference of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing 2021; Huang, Yongfeng; Qi, Tao; Wu, Chuhan; Wu, Fangzhao. Underline Science Inc., 2021. (BASE)
Abstract:
Read paper: https://www.aclanthology.org/2021.acl-short.107
The Transformer is important for text modeling, but it has difficulty handling long documents because its complexity grows quadratically with input length. To address this, we propose a hierarchical interactive Transformer (Hi-Transformer) for efficient and effective long document modeling. Hi-Transformer models documents hierarchically: it first learns sentence representations and then learns document representations. This effectively reduces complexity while still capturing global document context when modeling each sentence. More specifically, we first use a sentence Transformer to learn the representation of each sentence. Then we use a document Transformer to model the global document context from these sentence representations. Next, we use another sentence Transformer to enhance sentence modeling with the global document context. Finally, we use a hierarchical pooling method to obtain document ...
Keyword: Computational Linguistics; Condensed Matter Physics; Deep Learning; Electromagnetism; FOS Physical sciences; Information and Knowledge Engineering; Neural Network; Semantics
URL:
https://dx.doi.org/10.48448/wqt9-bc64
https://underline.io/lecture/25858-hi-transformer-hierarchical-interactive-transformer-for-efficient-and-effective-long-document-modeling
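The pipeline the abstract describes (sentence encoder, document encoder, context-enhanced sentence encoder, hierarchical pooling) can be sketched as data flow. This is a minimal illustration, not the authors' implementation: the `encode` function is a hypothetical stand-in that mixes vectors with their mean, where the real Hi-Transformer applies self-attention; shapes and dimensions are invented for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

def encode(tokens, context=None):
    """Stand-in for one Transformer encoder: mixes each vector with the
    mean of its inputs (plus an optional context vector). Illustrates
    only the data flow, not real attention."""
    x = np.asarray(tokens, dtype=float)
    if context is not None:
        x = np.vstack([x, context])      # treat document context as an extra "token"
    mixed = x + x.mean(axis=0)           # crude global mixing across positions
    return mixed[: len(tokens)]          # drop the context slot again

# A toy "document": 3 sentences of 5 token embeddings each (dim 4)
doc = [rng.normal(size=(5, 4)) for _ in range(3)]

# 1) Sentence Transformer: encode each sentence independently -> sentence reps
sent_reps = np.stack([encode(s).mean(axis=0) for s in doc])   # shape (3, 4)

# 2) Document Transformer: model global context over sentence representations
doc_aware = encode(sent_reps)                                 # shape (3, 4)

# 3) Second sentence Transformer: re-encode tokens with document context
enhanced = [encode(s, context=doc_aware[i]) for i, s in enumerate(doc)]

# 4) Hierarchical pooling: tokens -> sentence reps -> one document vector
doc_rep = np.stack([e.mean(axis=0) for e in enhanced]).mean(axis=0)  # shape (4,)
```

The efficiency claim follows from the structure: attention cost becomes the sum of per-sentence quadratic terms plus one quadratic term over the (much shorter) sequence of sentences, instead of one quadratic term over the whole document.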
3. One Teacher is Enough? Pre-trained Language Model Distillation from Multiple Teachers ...
The Joint Conference of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing 2021; Huang, Yongfeng; Wu, Chuhan. Underline Science Inc., 2021. (BASE)
4. Provably Secure Generative Linguistic Steganography ...
The Joint Conference of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing 2021; Huang, Yongfeng; Yang, Zhongliang. Underline Science Inc., 2021. (BASE)
5. HieRec: Hierarchical User Interest Modeling for Personalized News Recommendation ...
The Joint Conference of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing 2021; Huang, Yongfeng; Qi, Tao. Underline Science Inc., 2021. (BASE)
6. PP-Rec: News Recommendation with Personalized User Interest and Time-aware News Popularity ...
The Joint Conference of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing 2021; Huang, Yongfeng; Qi, Tao. Underline Science Inc., 2021. (BASE)
© 2013 - 2024 Lin|gu|is|tik