
Search in the Catalogues and Directories

Hits 1 – 20 of 39

1
Fremdwortersatz im Russischen der Gegenwart ...
Lorenz, Marina. - : Universität Tübingen, 2022
2
Musik im Russischunterricht
Steinbach, Andrea; Birzer, Sandra. - : Dillingen a. d. Donau, 2022
3
Fremdwortersatz im Russischen der Gegenwart
Lorenz, Marina. - : Universität Tübingen, 2022
4
Von A wie Aspekt bis Z wie Zdvořilost. Ein Kaleidoskop der Slavistik für Tilman Berger zum 65. Geburtstag ...
Unknown. - : Tübingen Library Publishing, 2021
5
Musik im Russischunterricht
Steinbach, Andrea; Birzer, Sandra. - : Otto-Friedrich-Universität Bamberg, 2021
6
Transformer-based NMT : modeling, training and implementation
Xu, Hongfei. - : Saarländische Universitäts- und Landesbibliothek, 2021
Abstract: International trade and industrial collaboration allow countries and regions to concentrate their development on specific industries while benefiting from other countries' specializations, which significantly accelerates global development. Globalization, however, also increases the demand for cross-region communication: language barriers make deep collaboration between groups speaking different languages difficult and increase the need for translation. Language technology, specifically Machine Translation (MT), promises efficient, real-time communication between languages at minimal cost. Although modern parallel hardware delivers translations with very low latency, and although the evolution from Statistical Machine Translation (SMT) to Neural Machine Translation (NMT) built on deep learning has significantly boosted translation quality, current machine translation systems are still far from translating all input accurately. How to further improve state-of-the-art NMT models therefore remains a valuable open research question that has received wide attention.

In the research presented in this thesis, we first investigate the long-distance relation modeling ability of the state-of-the-art NMT model, the Transformer. We propose to learn source phrase representations and incorporate them into the Transformer translation model, aiming to enhance its ability to capture long-distance dependencies. Second, although previous work (Bapna et al., 2018) suggests that deep Transformers have difficulty converging, we empirically find that the convergence of deep Transformers depends on the interaction between the layer normalization and the residual connections employed to stabilize training. We conduct a theoretical study of how to ensure the convergence of Transformers, especially deep Transformers, and propose to ensure convergence by placing a Lipschitz constraint on the parameter initialization. Finally, we investigate how to dynamically determine proper and efficient batch sizes during Transformer training. We find that the gradient direction stabilizes as the batch size grows during gradient accumulation; we therefore propose to adjust the batch size dynamically by monitoring the change in gradient direction within gradient accumulation, stopping the accumulation once the direction only fluctuates, which yields a batch size that is both adequate and efficient.

For the research in this thesis we also implement our own NMT toolkit, Neutron, an implementation of the Transformer and its variants. Besides the fundamental features underlying the approaches presented here, it supports many advanced features from recent cutting-edge research; implementations of all approaches in this thesis are included and open-sourced in the toolkit. To compare with previous approaches, we conducted our experiments mainly on data from the WMT 14 English to German (En-De) and English to French (En-Fr) news translation tasks, except when studying the convergence of deep Transformers, where we used the WMT 15 Czech to English (Cs-En) news translation task in place of the WMT 14 En-Fr task to compare with Bapna et al. (2018). The sizes of these datasets range from medium (WMT 14 En-De, ~4.5M sentence pairs) to very large (WMT 14 En-Fr, ~36M sentence pairs), so we expect our approaches to help improve translation quality for widely used language pairs with sufficient data. ; China Scholarship Council
A brief code sketch of the dynamic batch-size heuristic follows this record.
Keyword: ddc:430; ddc:440; ddc:490; ddc:491.8; ddc:600; dynamic batch size; neural machine translation; optimization; parameter initialization; phrase representation; transformer translation model
URL: https://doi.org/10.22028/D291-34998
http://nbn-resolving.org/urn:nbn:de:bsz:291--ds-349988
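
The dynamic batch-size idea in the abstract above can be illustrated compactly. The following Python/PyTorch sketch is not the thesis' Neutron implementation; it is a minimal, hypothetical rendering of the described heuristic, assuming a standard training loop: gradients are accumulated micro-batch by micro-batch, and accumulation stops once the direction of the accumulated gradient no longer changes appreciably, measured here by cosine similarity. The names accumulate_until_stable, cos_threshold and max_accum are illustrative, not taken from the toolkit.

import torch
import torch.nn.functional as F

def accumulate_until_stable(model, loss_fn, micro_batches,
                            cos_threshold=0.999, max_accum=16):
    # Accumulate gradients over successive micro-batches and stop once the
    # direction of the accumulated gradient has stabilized, i.e. further
    # accumulation only produces small fluctuations. Returns the number of
    # micro-batches used, which defines the effective batch size.
    model.zero_grad()
    prev = None
    steps = 0
    for src, tgt in micro_batches:
        steps += 1
        loss = loss_fn(model(src), tgt)
        loss.backward()  # gradients of each micro-batch add up in p.grad
        flat = torch.cat([p.grad.reshape(-1)
                          for p in model.parameters() if p.grad is not None])
        if prev is not None:
            # Cosine similarity between the accumulated gradient before and
            # after this micro-batch: a value close to 1 means the direction
            # barely moved, so the batch is considered large enough.
            if F.cosine_similarity(flat, prev, dim=0) > cos_threshold:
                break
        if steps >= max_accum:
            break
        prev = flat.clone()
    return steps

A caller would then apply the optimizer step on the accumulated gradients (optionally rescaled by the number of micro-batches) and repeat; the threshold stands in for the direction-change criterion that the abstract describes only informally.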
7
Von A wie Aspekt bis Z wie Zdvořilost. Ein Kaleidoskop der Slavistik für Tilman Berger zum 65. Geburtstag
Brehmer, Bernhard; Perevozchikova, Tatiana; Gattnar, Anja. - : Tübingen Library Publishing, 2021
8
Comparing Comparatives: New Perspectives from Fieldwork and Processing ...
Berezovskaya, Polina. - : Universität Tübingen, 2020
9
Comparing Comparatives: New Perspectives from Fieldwork and Processing
Berezovskaya, Polina. - : Universität Tübingen, 2020
10
Das „Rechtschreib-Elend“: Der Umgang mit orthografischen Problemen im 19. und frühen 20. Jh. im deutsch-russischen Vergleich ...
Levinson, Kirill. - : Universität Tübingen, 2019
11
Ћирилица из нот дед: Cyrillic Script from a Sociolinguistic Perspective in Macedonia, Montenegro and Serbia
12
Information density and phonetic structure: Explaining segmental variability
Brandt, Erika. - : Saarländische Universitäts- und Landesbibliothek, 2019
13
Das „Rechtschreib-Elend“: Der Umgang mit orthografischen Problemen im 19. und frühen 20. Jh. im deutsch-russischen Vergleich
Levinson, Kirill. - : Universität Tübingen, 2019
14
The Learnability of Evidential Systems in the Case of L1 Bulgarian and L2 English
Ilchovska, Zlatomira Georgieva; Culbertson, Jennifer. - : Universität Tübingen, 2019
15
Information density and phonetic structure: Explaining segmental variability ...
Brandt, Erika. - : Universität des Saarlandes, 2018
16
Russische Sprache in Weißrussland
Norman, Boris. - 2018
17
Die Wahl von sprachlichen Varianten
18
Wieviel Grammatik der russischen Sprache brauchen wir?
19
Verben der Fortbewegung im Russischen: Wege und Modelle der semantischen Derivation
Röhrborn, Uta. - 2016
20
Osnovy pravoslavlja i pravoslavnye chramy Peterburga
