
Search in the Catalogues and Directories

Page: 1 2 3 4
Hits 1 – 20 of 69

1
MCSQ Translation Models (en-ru) (v1.0)
Variš, Dušan. - : Charles University, Faculty of Mathematics and Physics, Institute of Formal and Applied Linguistics (UFAL), 2022
2
MCSQ Translation Models (en-de) (v1.0)
Variš, Dušan. - : Charles University, Faculty of Mathematics and Physics, Institute of Formal and Applied Linguistics (UFAL), 2022
3
Evaluation computergestützter Verfahren der Emotionsklassifikation für deutschsprachige Dramen um 1800 [Evaluation of computational methods for emotion classification in German-language dramas around 1800] ...
4
Evaluation computergestützter Verfahren der Emotionsklassifikation für deutschsprachige Dramen um 1800 [Evaluation of computational methods for emotion classification in German-language dramas around 1800] ...
5
Transformer-Based Abstractive Summarization for Reddit and Twitter: Single Posts vs. Comment Pools in Three Languages
In: Future Internet; Volume 14; Issue 3; Pages: 69 (2022)
6
Speech Enhancement by Multiple Propagation through the Same Neural Network
In: Sensors; Volume 22; Issue 7; Pages: 2440 (2022)
7
FedQAS: Privacy-Aware Machine Reading Comprehension with Federated Learning
In: Applied Sciences; Volume 12; Issue 6; Pages: 3130 (2022)
8
A Pipeline Approach to Context-Aware Handwritten Text Recognition
In: Applied Sciences; Volume 12; Issue 4; Pages: 1870 (2022)
9
Correcting Diacritics and Typos with a ByT5 Transformer Model
In: Applied Sciences; Volume 12; Issue 5; Pages: 2636 (2022)
10
Research on Named Entity Recognition Methods in Chinese Forest Disease Texts
In: Applied Sciences; Volume 12; Issue 8; Pages: 3885 (2022)
11
Cross-Lingual Transfer Learning for Arabic Task-Oriented Dialogue Systems Using Multilingual Transformer Model mT5
In: Mathematics; Volume 10; Issue 5; Pages: 746 (2022)
12
AraConv: Developing an Arabic Task-Oriented Dialogue System Using Multi-Lingual Transformer Model mT5
In: Applied Sciences; Volume 12; Issue 4; Pages: 1881 (2022)
13
Retrieval-Based Transformer Pseudocode Generation
In: Mathematics; Volume 10; Issue 4; Pages: 604 (2022)
14
Hebrew Transformed: Machine Translation of Hebrew Using the Transformer Architecture
Crater, David T. - 2022
15
English machine reading comprehension: new approaches to answering multiple-choice questions
Dzendzik, Daria. - PhD thesis, Dublin City University (School of Computing; ADAPT), 2021
16
Transformer versus LSTM Language Models Trained on Uncertain ASR Hypotheses in Limited Data Scenarios
In: https://hal.inria.fr/hal-03362828 (2021)
17
Simulating reading mistakes for child speech Transformer-based phone recognition
In: Annual Conference of the International Speech Communication Association (INTERSPEECH), Aug 2021, Brno, Czech Republic ; https://hal.archives-ouvertes.fr/hal-03257870 (2021)
18
Breaking Down the Invisible Wall of Informal Fallacies in Online Discussions
In: ACL-IJCNLP 2021 - Joint Conference of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing, Aug 2021, Online, France ; https://hal.inria.fr/hal-03351649 ; https://2021.aclweb.org/ (2021)
19
End-to-end acoustic modelling for phone recognition of young readers
In: Speech Communication 134 (2021), pp. 71-84. Elsevier. ISSN 0167-6393, EISSN 1872-7182. ⟨10.1016/j.specom.2021.08.003⟩ ; https://hal.archives-ouvertes.fr/hal-03373156 ; https://www.sciencedirect.com/science/article/pii/S0167639321000959?via%3Dihub (2021)
20
Multitask Transformer Model-based Fintech Customer Service Chatbot NLU System with DECO-LGG SSP-based Data
In: Annual Conference on Human and Language Technology, Oct 2021, Seoul, South Korea, pp. 461-466 ; https://hal.archives-ouvertes.fr/hal-03603903 ; http://www.koreascience.or.kr/journal/OOGHAK.page (2021)
Abstract: National audience ; This study applies the Semi-automatic Symbolic Propagation (SSP) method, which uses the DECO (Dictionnaire Electronique du COréen) Korean electronic dictionary and local grammar graphs (LGG), to create an annotated training data set for a natural language understanding (NLU) chatbot for customer service (CS) in the field of financial technology (fintech). Using this data set, the authors implemented a fintech CS chatbot NLU system with the Dual Intent and Entity Transformer (DIET) architecture provided by the open-source RASA framework. Based on ten discourse-unit configurations, 32 fintech topic types, and 38 key events identified from real data, the DECO-LGG data-generation module effectively produces high-quality annotated training data for query and complaint dialogues. An end-to-end multi-task transformer DIET model jointly handles intent classification and named-entity recognition for slot filling. Training with DIET alone yielded F1-scores of 0.931 (intent) and 0.865 (slot/entity); with DIET + KoBERT, the F1-scores reached 0.951 (intent) and 0.901 (slot/entity). The DECO-LGG-based SSP-generated data is thus shown to be effective as training data, and the KoBERT-based DIET model outperforms the DIET-only model.
Keyword: [INFO.INFO-CL]Computer Science [cs]/Computation and Language [cs.CL]; [INFO.INFO-TT]Computer Science [cs]/Document and Text Processing; [SHS.LANGUE]Humanities and Social Sciences/Linguistics; Chatbot; E-dictionary; Language Resource; Local grammar graph; Multitask transformer; NLP dictionary; Semi-automatic annotation; Unitex
URL: https://hal.archives-ouvertes.fr/hal-03603903
https://hal.archives-ouvertes.fr/hal-03603903/file/yoo-et-al-2021.pdf
https://hal.archives-ouvertes.fr/hal-03603903/document

