
Search in the Catalogues and Directories

Hits 21 – 40 of 1,423

21. HIT - A Hierarchically Fused Deep Attention Network for Robust Code-mixed Language Representation ... (BASE)
22. Minimally-Supervised Morphological Segmentation using Adaptor Grammars with Linguistic Priors ... (BASE)
23. Bridging Subword Gaps in Pretrain-Finetune Paradigm for Natural Language Generation ... (BASE)
24. LearnDA: Learnable Knowledge-Guided Data Augmentation for Event Causality Identification ... (BASE)
25. Quotation Recommendation and Interpretation Based on Transformation from Queries to Quotations ... (BASE)
26. How Did This Get Funded?! Automatically Identifying Quirky Scientific Achievements ... (BASE)
27. Minimax and Neyman–Pearson Meta-Learning for Outlier Languages ... (BASE)
28. CLINE: Contrastive Learning with Semantic Negative Examples for Natural Language Understanding ... (BASE)
29. Towards Protecting Vital Healthcare Programs by Extracting Actionable Knowledge from Policy ... (BASE)
30. DYPLOC: Dynamic Planning of Content Using Mixed Language Models for Text Generation ... (BASE)
31. Automated Concatenation of Embeddings for Structured Prediction ... (BASE)
32. QASR: QCRI Aljazeera Speech Resource – A Large Scale Annotated Arabic Speech Corpus ... (BASE)
33. Code Generation from Natural Language with Less Prior Knowledge and More Monolingual Data ... (BASE)
34. On the Distribution, Sparsity, and Inference-time Quantization of Attention Values in Transformers ... (BASE)
35. Learning Disentangled Latent Topics for Twitter Rumour Veracity Classification ... (BASE)
36. Sequence Models for Computational Etymology of Borrowings ... (BASE)
Read paper: https://www.aclanthology.org/2021.findings-acl.353
Abstract: We computationally model the processes of word borrowing from a donor word to an incorporated word, and vice versa, by answering two questions: (1) what does a word look like if it is incorporated into another language, and, in the opposite direction, (2) where did a word come from? We employ neural sequence models, focusing on six specific borrowing relations: calques, partial calques, semantic loans, phono-semantic matches, transliterations, and generic borrowings. We experiment with several model variants, including LSTM encoder-decoders, copy attention, and Transformers, along with several data distribution and preprocessing scenarios. We find that the quality of the data strongly influences the performance of the models, with low character edit distance between model predictions and the gold words.
Keywords: Computational Linguistics; Condensed Matter Physics; Deep Learning; Electromagnetism; FOS Physical sciences; Neural Network; Semantics
URL: https://underline.io/lecture/26444-sequence-models-for-computational-etymology-of-borrowings
https://dx.doi.org/10.48448/xv8m-yf67
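The abstract above describes character-level neural sequence models (LSTM encoder-decoders, copy attention, Transformers) that map a donor word to its incorporated form. Below is a minimal sketch of one such character-level LSTM encoder-decoder; it is an illustrative assumption, not the paper's code, and the class name, dimensions, and toy data are invented for the example.

```python
# Illustrative sketch only: a character-level LSTM encoder-decoder of the kind
# the abstract mentions (donor word in, incorporated word out). Hypothetical
# names and toy data; not taken from the paper.
import torch
import torch.nn as nn

PAD, SOS, EOS = 0, 1, 2  # assumed special character indices

class CharSeq2Seq(nn.Module):
    def __init__(self, vocab_size, emb_dim=64, hid_dim=128):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim, padding_idx=PAD)
        self.encoder = nn.LSTM(emb_dim, hid_dim, batch_first=True)
        self.decoder = nn.LSTM(emb_dim, hid_dim, batch_first=True)
        self.out = nn.Linear(hid_dim, vocab_size)

    def forward(self, src, tgt_in):
        # Encode the donor word character by character.
        _, state = self.encoder(self.embed(src))
        # Decode the incorporated word conditioned on the encoder state
        # (teacher forcing: tgt_in is the gold target shifted right).
        dec_out, _ = self.decoder(self.embed(tgt_in), state)
        return self.out(dec_out)  # logits over target characters

# Toy usage with an assumed 30-symbol character vocabulary.
model = CharSeq2Seq(vocab_size=30)
src = torch.randint(3, 30, (4, 7))     # batch of 4 donor words, 7 chars each
tgt_in = torch.randint(3, 30, (4, 9))  # shifted gold targets
tgt_out = torch.randint(3, 30, (4, 9))
logits = model(src, tgt_in)
loss = nn.CrossEntropyLoss(ignore_index=PAD)(
    logits.reshape(-1, 30), tgt_out.reshape(-1))
loss.backward()
print(loss.item())
```

Copy attention and Transformer variants, as named in the abstract, would replace the plain decoder above; evaluation by character edit distance against the gold word fits the same setup.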
37. Scaling Within Document Coreference to Long Texts ... (BASE)
38. How to Split: the Effect of Word Segmentation on Gender Bias in Speech Translation ... (BASE)
39. Prefix-Tuning: Optimizing Continuous Prompts for Generation ... (BASE)
40. Chase: A Large-Scale and Pragmatic Chinese Dataset for Cross-Database Context-Dependent Text-to-SQL ... (BASE)


Hits by source type: Catalogues 0 · Bibliographies 0 · Linked Open Data catalogues 0 · Online resources 0 · Open access documents 1,423