1. Knowledge Augmented BERT Mutual Network in Multi-turn Spoken Dialogues
Abstract:
Modern spoken language understanding (SLU) systems rely on sophisticated semantic notions revealed in single utterances to detect intents and slots. However, they lack the capability to model multi-turn dynamics within a dialogue, particularly long-term slot contexts. Without external knowledge, relying on the limited linguistic evidence within a single word sequence may overlook deep semantic information across dialogue turns. In this paper, we propose to equip a BERT-based joint model with a knowledge attention module so that the two SLU tasks mutually leverage dialogue contexts. A gating mechanism is further utilized to filter out irrelevant knowledge triples and to avoid distracting comprehension. Experimental results on two complicated multi-turn dialogue datasets demonstrate that, by mutually modeling the two SLU tasks with filtered knowledge and dialogue contexts, our approach achieves considerable improvements over several competitive baselines. Published in ICASSP 2022.
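The gated knowledge-attention idea described in the abstract can be sketched roughly as follows. This is a minimal illustration, not the paper's actual formulation: the function name, the dot-product attention, and the scalar-gate parameterization (`W_gate`) are all assumptions for the sake of the example.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def knowledge_attention_with_gate(utterance, triples, W_gate):
    """Attend over knowledge-triple embeddings, then gate the result.

    utterance: (d,) utterance representation (e.g. a BERT [CLS] vector)
    triples:   (n, d) embeddings of n candidate knowledge triples
    W_gate:    (2d,) gate weights -- a hypothetical parameterization
    """
    # Attention scores: dot product between the utterance and each triple.
    scores = triples @ utterance              # shape (n,)
    attn = softmax(scores)
    # Attention-weighted summary of the knowledge triples.
    knowledge = attn @ triples                # shape (d,)
    # Scalar gate in (0, 1): decides how much knowledge to admit,
    # so irrelevant triples can be (softly) filtered out.
    g = sigmoid(np.concatenate([utterance, knowledge]) @ W_gate)
    fused = utterance + g * knowledge
    return fused, g
```

In practice the gate and attention would be learned jointly with the BERT encoder; here the weights are just random placeholders to show the data flow.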
Keywords:
Computation and Language (cs.CL); FOS: Computer and information sciences
URL: https://dx.doi.org/10.48550/arxiv.2202.11299 https://arxiv.org/abs/2202.11299
BASE
2. A Label-Aware BERT Attention Network for Zero-Shot Multi-Intent Detection in Spoken Language Understanding
6. Design of loss functions and feature transformation for minimum classification error based automatic speech recognition