Catalogue search
Refine your search:
Keyword:
Computational Linguistics (2)
Condensed Matter Physics (2)
Deep Learning (2)
Electromagnetism (2)
FOS Physical sciences (2)
Neural Network (2)
Semantics (2)
Information and Knowledge Engineering (1)
Creator / Publisher:
Oseki, Yohei (2)
The Joint Conference of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing 2021 (2)
Asahara, Masayuki (1)
Inui, Kentaro (1)
Ito, Takumi (1)
Kuribayashi, Tatsuki (1)
Noji, Hiroshi (1)
Yoshida, Ryo (1)
Hits 1 – 2 of 2
1
Effective Batching for Recurrent Neural Network Grammars ...
The Joint Conference of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing 2021; Noji, Hiroshi; Oseki, Yohei. Underline Science Inc., 2021
Abstract:
Read paper: https://www.aclanthology.org/2021.findings-acl.380
As a language model that integrates traditional symbolic operations and flexible neural representations, recurrent neural network grammars (RNNGs) have attracted great attention from both scientific and engineering perspectives. However, RNNGs are known to be harder to scale due to the difficulty of batched training. In this paper, we propose effective batching for RNNGs, where every operation is computed in parallel with tensors across multiple sentences. Our PyTorch implementation effectively employs a GPU and achieves a 6x speedup compared to the existing C++ DyNet implementation with model-independent auto-batching. Moreover, our batched RNNG also accelerates inference, achieving a 20-150x speedup for beam search depending on beam size. Finally, we evaluate the syntactic generalization performance of the scaled RNNG against the LSTM baseline, based on the large training data of 100M tokens from English Wikipedia and the broad-coverage ...
Keyword: Computational Linguistics; Condensed Matter Physics; Deep Learning; Electromagnetism; FOS Physical sciences; Neural Network; Semantics
URL:
https://dx.doi.org/10.48448/x6f1-3c03
https://underline.io/lecture/26471-effective-batching-for-recurrent-neural-network-grammars
BASE
2
Lower Perplexity is Not Always Human-Like ...
The Joint Conference of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing 2021; Asahara, Masayuki; Inui, Kentaro. Underline Science Inc., 2021
BASE
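The abstract of hit 1 describes computing every RNNG operation in parallel with tensors across multiple sentences. What follows is a minimal, hypothetical PyTorch sketch of that batching idea, not the authors' implementation: each sentence's stack is one row of a padded tensor, and a single masked tensor write pushes the next word embedding onto the stacks of all sentences that take a shift action. Every name, shape, and hyperparameter here (batched_shift, max_stack, hidden, and so on) is an illustrative assumption.

import torch

# Hypothetical sizes for this sketch (not taken from the paper).
batch_size, max_stack, hidden = 4, 8, 16

# stacks[b, i] holds the i-th stack element of sentence b; stack_len[b] is its current depth.
stacks = torch.zeros(batch_size, max_stack, hidden)
stack_len = torch.zeros(batch_size, dtype=torch.long)

def batched_shift(stacks, stack_len, word_emb, do_shift):
    """Push word_emb[b] onto stack b for every sentence b where do_shift[b] is True.

    One masked tensor write updates all sentences at once, instead of a
    Python loop over sentences as in an unbatched RNNG implementation.
    """
    idx = stack_len.clamp(max=stacks.size(1) - 1)           # write position per sentence
    update = torch.zeros_like(stacks)
    update[torch.arange(stacks.size(0)), idx] = word_emb    # new top element, per row
    mask = do_shift.view(-1, 1, 1).float()                   # 1.0 where the action is shift
    stacks = stacks + mask * update                          # masked, fully tensorized push
    stack_len = stack_len + do_shift.long()                  # depth grows only where shifted
    return stacks, stack_len

# Example step: sentences 0 and 2 shift, sentences 1 and 3 take a different action.
word_emb = torch.randn(batch_size, hidden)
do_shift = torch.tensor([True, False, True, False])
stacks, stack_len = batched_shift(stacks, stack_len, word_emb, do_shift)
print(stack_len)  # tensor([1, 0, 1, 0])

The masked write is what lets a GPU update all sentences in one tensor operation rather than looping over them one by one, which is presumably the kind of batching behind the speedups the abstract reports.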
Sources: Open access documents / BASE: 2 hits; all other catalogues, bibliographies, Linked Open Data catalogues, and online resource directories: 0 hits.