Catalogue search
Hits 1 – 2 of 2
1. Bi-directional Domain Adaptation Using Weighted MTL ...
The Joint Conference of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing 2021; Dakota, Daniel; Kübler, Sandra. - Underline Science Inc., 2021
BASE
2. Annotations Matter: Leveraging Multi-task Learning to Parse UD and SUD ...
The Joint Conference of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing 2021; Dakota, Daniel; Sayyed, Zeeshan. - Underline Science Inc., 2021
Abstract:
Read paper: https://www.aclanthology.org/2021.findings-acl.305
Abstract: Using multiple treebanks to improve parsing performance has shown positive results. However, what role similar, yet competing annotation decisions play in parser behavior is unclear. We investigate this within a multi-task learning (MTL) dependency parser setup on two parallel treebanks, UD and SUD, which, while possessing similar annotation schemes, differ in specific linguistic annotation preferences. We perform a set of experiments with different MTL architectural choices, comparing performance across various input embeddings. We find languages tend to pattern in loose typological associations, but generally the performance within an MTL setting is lower than single-model baseline parsers for each annotation scheme. The main contributing factor seems to be the competing syntactic annotation information shared between treebanks in an MTL setting, which is shown in experiments against differently annotated treebanks. This suggests ...
Keyword: Computational Linguistics; Condensed Matter Physics; Deep Learning; Electromagnetism; FOS Physical sciences; Information and Knowledge Engineering; Neural Network; Semantics
URL:
https://dx.doi.org/10.48448/qa0p-z888
https://underline.io/lecture/26396-annotations-matter-leveraging-multi-task-learning-to-parse-ud-and-sud
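The abstract describes a multi-task learning setup in which two annotation schemes (UD and SUD) share a parser's encoder while keeping scheme-specific output layers. A minimal, hypothetical numpy sketch of this kind of hard parameter sharing (all names and dimensions are illustrative, not the authors' implementation):

```python
import numpy as np

rng = np.random.default_rng(0)

# Shared encoder weights: hard parameter sharing across both tasks.
W_shared = rng.normal(size=(8, 16))

# Task-specific scoring heads, one per annotation scheme (hypothetical sizes).
W_ud = rng.normal(size=(16, 4))   # head for UD-style annotations
W_sud = rng.normal(size=(16, 4))  # head for SUD-style annotations

def encode(x: np.ndarray) -> np.ndarray:
    """Shared representation consumed by both parsing heads."""
    return np.tanh(x @ W_shared)

def ud_scores(h: np.ndarray) -> np.ndarray:
    return h @ W_ud

def sud_scores(h: np.ndarray) -> np.ndarray:
    return h @ W_sud

# Toy "sentence": 5 tokens with 8-dimensional input embeddings.
x = rng.normal(size=(5, 8))
h = encode(x)
print(ud_scores(h).shape, sud_scores(h).shape)  # (5, 4) (5, 4)
```

Because both heads read the same shared representation, gradients from the two competing annotation schemes would update `W_shared` jointly, which is the interaction the abstract identifies as the main source of degraded MTL performance.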
BASE
© 2013 - 2024 Lin|gu|is|tik | Imprint | Privacy Policy