
Search in the Catalogues and Directories

Hits 1 – 20 of 30

1
Creating Biographical Networks from Chinese and English Wikipedia
In: https://halshs.archives-ouvertes.fr/halshs-03217972 (2021)
2
Who cares about calling non-consensual sex "rape" in summaries of fictional narratives on Wikipedia? From a gender identity hypothesis to recurrent activist discursive practices
In: Exploring Gender Identities Online, Jul 2021, Greifswald / Constance (online), Germany ; https://hal-univ-bourgogne.archives-ouvertes.fr/hal-03293248 (2021)
3
Tabouid: un jeu de langage et de culture générale généré à partir de Wikipédia [Tabouid: a language and general-knowledge game generated from Wikipedia]
In: Actes de la 28e Conférence sur le Traitement Automatique des Langues Naturelles [Proceedings of the 28th Conference on Natural Language Processing], Volume 1: main conference, 2021, Lille, France, pp. 278-279 ; https://hal.archives-ouvertes.fr/hal-03265882 (2021)
4
Alfabetizzazione digitale, scrittura enciclopedica ed educazione linguistica democratica ... [Digital literacy, encyclopedic writing, and democratic language education ...]
Tavosanis, Mirko Luigi Aurelio. - : University of Salento, 2021
5
Comparable corpora of South-Slavic Wikipedias CLASSLA-Wikipedia 1.0
Ljubešić, Nikola; Markoski, Filip; Markoska, Elena. - : Jožef Stefan Institute, 2021
6
ViquiQuAD: an extractive QA dataset from Catalan Wikipedia ...
7
FAIR and Open multilingual clinical trials in Wikidata and Wikipedia ...
Rasberry, Lane; Kwok, Cherrie. - : Zenodo, 2021
9
WikiProject Clinical Trials for multilingual access to information ...
Rasberry, Lane. - : Zenodo, 2021
11
The Influence of Multilingualism and Mutual Intelligibility on Wikipedia Reading Behaviour: A Research Proposal ...
Meier, Florian. - : Universität Regensburg, 2021
15
Graphs, Computation, and Language ...
Ustalov, Dmitry. - : Zenodo, 2021
17
Extracting Relations from Italian Wikipedia using Self-Training ...
18
WATS-SMS: A T5-Based French Wikipedia Abstractive Text Summarizer for SMS
In: Future Internet, Volume 13, Issue 9 (2021)
Abstract: Text summarization remains a challenging task in natural language processing despite its many applications in enterprises and daily life. One common use case is summarizing web pages, which can provide an overview of a page to devices with limited features. Despite the increasing penetration rate of mobile devices in rural areas, most of those devices offer limited features, and these areas are often covered only by limited connectivity such as the GSM network. Summarizing web pages into SMS messages therefore becomes an important way to deliver information to such devices. This work introduces WATS-SMS, a T5-based French Wikipedia abstractive text summarizer for SMS, built through a transfer-learning approach: the English pre-trained T5 model is retrained on 25,000 Wikipedia pages to produce a French summarization model, which is then compared with different approaches from the literature. The objective is twofold: (1) to check the assumption made in the literature that abstractive models provide better results than extractive ones; and (2) to evaluate the performance of the model against other existing abstractive models. A score based on ROUGE metrics gave a value of 52% for articles up to 500 characters, against 34.2% for transformer-ED and 12.7% for seq2seq-attention, and a value of 77% for larger articles, against 37% for transformers-DMCA. Moreover, an architecture including a software SMS gateway has been developed to allow owners of mobile devices with limited features to send requests and receive summaries over the GSM network.
Keywords: fine-tuning; French Wikipedia; gateway; SMS; text summarization; transformers
URL: https://doi.org/10.3390/fi13090238
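The ROUGE scores cited in this abstract measure n-gram overlap between a generated summary and a reference summary. A minimal sketch of ROUGE-N (F1 variant) in Python, as an illustrative simplification and not the paper's actual evaluation code:

```python
from collections import Counter

def rouge_n(candidate: str, reference: str, n: int = 1) -> float:
    """Simplified ROUGE-N F1: n-gram overlap between a candidate
    summary and a reference summary (whitespace tokenization)."""
    def ngrams(text: str) -> Counter:
        toks = text.lower().split()
        return Counter(tuple(toks[i:i + n]) for i in range(len(toks) - n + 1))

    cand, ref = ngrams(candidate), ngrams(reference)
    if not cand or not ref:
        return 0.0
    # Clipped overlap: each n-gram counted at most as often as in either side.
    overlap = sum((cand & ref).values())
    precision = overlap / sum(cand.values())
    recall = overlap / sum(ref.values())
    if precision + recall == 0:
        return 0.0
    return 2 * precision * recall / (precision + recall)
```

Published ROUGE results are normally computed with a standard package (e.g. `rouge-score`) that adds stemming and ROUGE-L; this sketch only shows the core overlap idea.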


Catalogues: 0 · Bibliographies: 0 · Linked Open Data catalogues: 0 · Online resources: 0 · Open access documents: 30
© 2013 - 2024 Lin|gu|is|tik