Catalogue search
Refine your search:
Creator / Publisher: Fang, Yimai (5); Vandyke, David (3); Su, Yixuan (2); Wang, Sihui (2); ., Nigel (1); Cheng, Jianpeng (1); Collier, Nigel (1); The 2021 Conference on Empirical Methods in Natural Language Processing 2021 (1); Tseng, Bo-Hsiang (1)
Type: Book (2); Miscellaneous (2); Article (1)
Hits 1 – 5 of 5
1. Plan-then-Generate: Controlled Data-to-Text Generation via Planning ...
Su, Yixuan; Vandyke, David; Wang, Sihui. - : arXiv, 2021
BASE
2. Plan-then-Generate: Controlled Data-to-Text Generation via Planning ...
The 2021 Conference on Empirical Methods in Natural Language Processing 2021; ., Nigel; Fang, Yimai. - : Underline Science Inc., 2021
BASE
3. A Generative Model for Joint Natural Language Understanding and Generation ...
Tseng, Bo-Hsiang; Cheng, Jianpeng; Fang, Yimai. - : arXiv, 2020
BASE
4. Proposition-based summarization with a coherence-driven incremental model ...
Fang, Yimai. - : Apollo - University of Cambridge Repository, 2019
Abstract: Summarization models which operate on meaning representations of documents have been neglected in the past, although they are a very promising and interesting class of methods for summarization and text understanding. In this thesis, I present one such summarizer, which uses the proposition as its meaning representation. My summarizer is an implementation of Kintsch and van Dijk's model of comprehension, which uses a tree of propositions to represent the working memory. The input document is processed incrementally in iterations. In each iteration, new propositions are connected to the tree under the principle of local coherence, and then a forgetting mechanism is applied so that only a few important propositions are retained in the tree for the next iteration. A summary can be generated using the propositions which are frequently retained. Originally, this model was only played through by hand by its inventors using human-created propositions. In this work, I turned it into a fully automatic model using ...
Keyword: abstractive summarisation; abstractive summarization; coherence; computational linguistics; document understanding; natural language processing; NLP; summarisation; summarization; text understanding
URL: https://www.repository.cam.ac.uk/handle/1810/287468
https://dx.doi.org/10.17863/cam.34773
BASE
5. Proposition-based summarization with a coherence-driven incremental model
Fang, Yimai. - : University of Cambridge, 2018. : Computer Science and Technology, 2018. : Hughes Hall, 2018
BASE
© 2013 - 2024 Lin|gu|is|tik | Imprint | Privacy Policy