
Search in the Catalogues and Directories


1
Statistical modelling in L3/Ln acquisition ...
Garcia, Guilherme. - : Open Science Framework, 2022
BASE
2
A Corpus-Based Sentence Classifier for Entity–Relationship Modelling
In: Electronics; Volume 11; Issue 6; Pages: 889 (2022)
BASE
3
Multi-National Topics Maps for Parliamentary Debate Analysis
BASE
4
Which data do elementary school teachers use to determine reading difficulties in their students?
In: Journal of Learning Disabilities 54 (2021) 5, pp. 349-364
BASE
5
Which data do elementary school teachers use to determine reading difficulties in their students? ...
Schmitterer, Alexandra; Brod, Garvin. - : Hammill Inst. on Disabilities; Sage Publ., 2021
BASE
6
Computation of transition matrices.pptx ...
Gagniuc, Paul. - : figshare, 2021
BASE
7
Finetuning Pretrained Transformers into Variational Autoencoders ...
BASE
8
Temporal social network reconstruction using wireless proximity sensors: model selection and consequences
In: EPJ Data Science 9 (1), EDP Sciences, 2020. ISSN: 2193-1127. DOI: 10.1140/epjds/s13688-020-00237-8. https://hal.inria.fr/hal-03117988
BASE
9
Inscriptions, Hieroglyphs, Linguistics… and Beyond! The Corpus of Classic Mayan as an Ontological Information Resource ...
BASE
10
Anemone: a Visual Semantic Graph
Ficapal Vila, Joan. - : KTH, Skolan för elektroteknik och datavetenskap (EECS), 2019
BASE
11
Using Bidirectional Encoder Representations from Transformers for Conversational Machine Comprehension ; Användning av BERT-språkmodell för konversationsförståelse
Gogoulou, Evangelina. - : KTH, Skolan för elektroteknik och datavetenskap (EECS), 2019
Abstract: Bidirectional Encoder Representations from Transformers (BERT) is a recently proposed language representation model, designed to pre-train deep bidirectional representations with the goal of extracting context-sensitive features from an input text [1]. One of the challenging problems in the field of Natural Language Processing is Conversational Machine Comprehension (CMC): given a context passage, a conversational question, and the conversational history, the system should predict the answer span of the question in the context passage. The main challenge in this task is how to effectively encode the conversational history into the prediction of the next answer. In this thesis work, we investigate the use of the BERT language model for the CMC task. We propose a new architecture, named BERT-CMC, using the BERT model as a base. This architecture includes a new module for encoding the conversational history, inspired by the Transformer-XL model [2]; this module serves as the memory throughout the conversation. The proposed model is trained and evaluated on the Conversational Question Answering dataset (CoQA) [3]. Our hypothesis is that the BERT-CMC model will effectively learn the underlying context of the conversation, leading to better performance than the baseline model proposed for CoQA. Our evaluation of BERT-CMC on the CoQA dataset shows that the model performs poorly (44.7% F1 score) compared to the CoQA baseline model (66.2% F1 score). In the interest of model explainability, we also perform a qualitative analysis of the model's behavior on questions exhibiting various linguistic phenomena, e.g. coreference and pragmatic reasoning. Additionally, we motivate the critical design choices made by performing an ablation study of the effect of these choices on model performance. The results suggest that fine-tuning the BERT layers boosts model performance. Moreover, increasing the number of extra layers on top of BERT leads to a larger capacity of the conversational memory.
Keyword: Computer and Information Sciences; conversational machine comprehension; language modelling; question answering; self-attention; transformers
URL: http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-265656
BASE
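The abstract above describes a concrete architecture: a BERT encoder extended with a Transformer-XL-inspired memory module that carries conversational history across turns, plus a span-prediction head for CoQA-style answers. The sketch below illustrates one way such a design could be wired up. It is a minimal, hypothetical illustration assuming PyTorch and the Hugging Face transformers library; the class names (ConversationalMemory, BertCMC) and all implementation details are invented here and are not the thesis's actual code.

    import torch
    import torch.nn as nn
    from transformers import BertModel

    class ConversationalMemory(nn.Module):
        # Transformer-XL-inspired memory (hypothetical sketch): cache the
        # hidden states of previous turns and let the current turn attend to them.
        def __init__(self, hidden_size, num_heads=8):
            super().__init__()
            self.attn = nn.MultiheadAttention(hidden_size, num_heads, batch_first=True)
            self.memory = None  # hidden states cached from earlier turns

        def reset(self):
            # Call between conversations so memory does not leak across dialogues.
            self.memory = None

        def forward(self, hidden):
            if self.memory is not None:
                # Attend from the current turn over [cached turns; current turn].
                context = torch.cat([self.memory, hidden], dim=1)
                hidden, _ = self.attn(hidden, context, context)
            # Detach so gradients do not flow back into previous turns,
            # mirroring Transformer-XL's segment-level recurrence.
            self.memory = hidden.detach()
            return hidden

    class BertCMC(nn.Module):
        # BERT encoder + conversational memory + answer-span head.
        def __init__(self, model_name="bert-base-uncased"):
            super().__init__()
            self.bert = BertModel.from_pretrained(model_name)
            h = self.bert.config.hidden_size
            self.memory = ConversationalMemory(h)
            self.span_head = nn.Linear(h, 2)  # start/end logit per token

        def forward(self, input_ids, attention_mask):
            out = self.bert(input_ids=input_ids, attention_mask=attention_mask)
            hidden = self.memory(out.last_hidden_state)
            start_logits, end_logits = self.span_head(hidden).unbind(dim=-1)
            return start_logits, end_logits

Under this sketch, a conversation is processed turn by turn and reset() is called when a new dialogue starts; the ablation findings reported in the abstract (fine-tuning the BERT layers, adding layers on top of BERT) would correspond to unfreezing self.bert and stacking further memory modules.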
12
Combining Concepts and Their Translations from Structured Dictionaries of Uralic Minority Languages
In: Proceedings of the Eleventh International Conference on Language Resources and Evaluation (LREC 2018), Miyazaki, 7 - 12 May 2018 (2018), 862-867
IDS OBELEX meta
13
Signbank: Software to Support Web Based Dictionaries of Sign Language
In: Proceedings of the Eleventh International Conference on Language Resources and Evaluation (LREC 2018), Miyazaki, 7 - 12 May 2018 (2018), 2359-2364
IDS OBELEX meta
14
Designing a Collaborative Process to Create Bilingual Dictionaries of Indonesian Ethnic Languages
In: Proceedings of the Eleventh International Conference on Language Resources and Evaluation (LREC 2018), Miyazaki, 7 - 12 May 2018 (2018), 3397-3404
IDS OBELEX meta
15
LSP lexicography and typology of specialized dictionaries
In: Languages for special purposes. An international handbook (2018), 71-95
IDS OBELEX meta
16
An Integrated Formal Representation for Terminological and Lexical Data included in Classification Schemes
In: Proceedings of the Eleventh International Conference on Language Resources and Evaluation (LREC 2018), Miyazaki, 7 - 12 May 2018 (2018), 593-598
IDS OBELEX meta
17
Semantic modelling and publishing of traditional data collection questionnaires and answers
In: Abgaz, Yalemisew; Dorn, Amelie; Piringer, Barbara; Wandl-Vogt, Eveline; Way, Andy (2018): Semantic modelling and publishing of traditional data collection questionnaires and answers. Information 9 (12), pp. 1-24. ISSN 2078-2489
BASE
18
Unlocking cultural conceptualisation in indigenous language resources: collaborative computing methodologies
In: Dorn, Amelie; Wandl-Vogt, Eveline; Abgaz, Yalemisew; Benito Santos, Alejandro; Therón, Roberto (2018): Unlocking cultural conceptualisation in indigenous language resources: collaborative computing methodologies. In: 11th Language Resources and Evaluation Conference, 7-12 May 2018, Miyazaki, Japan. ISBN 979-10-95546-22-1
BASE


Hits by resource type: Bibliographies 244; Open access documents 51; all other facets (Catalogues, Linked Open Data catalogues, Online resources) 0.