61. Rules for generating verb word forms for the new written variant of the Livvi dialect ... (BASE)
62. Rules for generating verb word forms for the new written variant of the Livvi dialect ... (BASE)
63. Rules for generating verb word forms for the new written variant of the Livvi dialect ... (BASE)
64. Rules for generating verb word forms for the new written variant of the Livvi dialect ... (BASE)
70. Formalization of AMR Inference via Hybrid Logic Tableaux ... (BASE)
71. Hebrew Transformed: Machine Translation of Hebrew Using the Transformer Architecture (BASE)

Abstract: This thesis presents the first known end-to-end application of Google's state-of-the-art Transformer architecture for natural language processing (NLP) to the Hebrew language. The state of the art in machine translation (MT) of Hebrew remains poor. Scholarly work in MT, deep learning (DL), and other areas of NLP began much later for Hebrew than for other languages and remains much less mature. The problem is difficult because Hebrew is a morphologically rich language (MRL), the total corpus of electronic Hebrew documents available as training material is small, and the worldwide Hebrew-literate computing community is small. Nonetheless, significant advances in Hebrew NLP tools, data, methods, and scholarly infrastructure over the last 15 years, combined with recent advances in general NLP and MT, especially the rise of neural networks and deep learning, create an enticing opportunity to advance the current state of Hebrew MT. More specifically, Google's Transformer neural network and associated technologies such as bidirectional encoder representations from Transformers (BERT) have revolutionized general MT and hold great promise for improving automatic Hebrew translation. This thesis demonstrates that, as measured by METEOR scores, a basic Hebrew Transformer trained in a few hours on a single GPU (graphics processing unit) exceeds the current performance of Google Translate on in-genre Hebrew translation tasks and is not far behind Google Translate on Hebrew translation tasks in general.

Keywords: artificial intelligence; bidirectional encoder representations from Transformers (BERT); computational linguistics; computer science; Hebrew; linguistics; machine translation; natural language processing (NLP); Transformer

URL: https://nrs.harvard.edu/URN-3:HUL.INSTREPOS:37370749
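The thesis in entry 71 reports its results as METEOR scores. As a rough illustration of what that metric rewards, here is a minimal sketch of METEOR's recall-weighted unigram F-mean; the function name is mine, and the exact-match simplification omits the stemming, synonym matching, and fragmentation penalty that full METEOR implementations (e.g. NLTK's) apply.

```python
from collections import Counter

def meteor_fmean(hypothesis: str, reference: str) -> float:
    """Recall-weighted harmonic mean over exact unigram matches.

    A simplified core of METEOR: real implementations also match
    stems and synonyms and multiply in a fragmentation penalty.
    """
    hyp, ref = hypothesis.split(), reference.split()
    # Multiset intersection: each surface token counts at most as
    # often as it appears in both hypothesis and reference.
    matches = sum((Counter(hyp) & Counter(ref)).values())
    if matches == 0:
        return 0.0
    precision = matches / len(hyp)
    recall = matches / len(ref)
    # METEOR weights recall 9:1 over precision, favoring coverage
    # of the reference translation.
    return 10 * precision * recall / (recall + 9 * precision)

print(meteor_fmean("the cat sat", "the cat sat down"))  # ~0.769
```

The heavy recall weighting is why METEOR tends to correlate better with human judgments than pure precision metrics: a hypothesis that drops reference content is penalized sharply.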
72. Supplementary materials for "Leveraging graph algorithms to speed up the annotation of large rhymed corpora" by Julien Baley, published in CLAO 51.1 (2022) ... (BASE)
73. Supplementary materials for "Leveraging graph algorithms to speed up the annotation of large rhymed corpora" by Julien Baley, published in CLAO 51.1 (2022) ... (BASE)
79. Human-like learning of syntactic islands by neural networks ... (BASE)
80. Study 1 - Fred and his dog (revised with author vs respondent conditions) ... (BASE)