
Search in the Catalogues and Directories

Hits 1 – 20 of 44

1. Learning to Borrow -- Relation Representation for Without-Mention Entity-Pairs for Knowledge Graph Completion ... (BASE)
2. Learning Meta Word Embeddings by Unsupervised Weighted Concatenation of Source Embeddings ... Bollegala, Danushka. arXiv, 2022. (BASE)
3. Sense Embeddings are also Biased -- Evaluating Social Biases in Static and Contextualised Sense Embeddings. (BASE)
4. I Wish I Would Have Loved This One, But I Didn't -- A Multilingual Dataset for Counterfactual Detection in Product Reviews ... (BASE)
5. Detect and Classify – Joint Span Detection and Classification for Health Outcomes ... (BASE)
6. Unsupervised Abstractive Opinion Summarization by Generating Sentences with Tree-Structured Topic Guidance ... (BASE)
7. Fine-Tuning Word Embeddings for Hierarchical Representation of Data Using a Corpus and a Knowledge Base for Various Machine Learning Applications. In: Comput Math Methods Med (2021). (BASE)
8. RelWalk - A Latent Variable Model Approach to Knowledge Graph Embedding. Bollegala, Danushka; Kawarabayashi, Ken-ichi; Yoshida, Yuichi. Association for Computational Linguistics, 2021. (BASE)
9. Dictionary-based Debiasing of Pre-trained Word Embeddings. Bollegala, Danushka; Kaneko, Masahiro. Association for Computational Linguistics, 2021. (BASE)
10. Unsupervised Abstractive Opinion Summarization by Generating Sentences with Tree-Structured Topic Guidance. Sakata, Ichiro; Mori, Junichiro; Bollegala, Danushka. Massachusetts Institute of Technology Press, 2021. (BASE)
11. Unsupervised Abstractive Opinion Summarization by Generating Sentences with Tree-Structured Topic Guidance. (BASE)
12. Debiasing Pre-trained Contextualised Embeddings. Kaneko, Masahiro; Bollegala, Danushka. Association for Computational Linguistics, 2021. (BASE)
13. Autoencoding Improves Pre-trained Word Embeddings ... (BASE)
14. Autoencoding Improves Pre-trained Word Embeddings ... (BASE)
15. Graph Convolution over Multiple Dependency Sub-graphs for Relation Extraction ... (BASE)
16. Language-Independent Tokenisation Rivals Language-Specific Tokenisation for Word Similarity Prediction ... (BASE)
17. Graph Convolution over Multiple Dependency Sub-graphs for Relation Extraction. Mandya, Angrosh; Coenen, Frans; Bollegala, Danushka. International Committee on Computational Linguistics, 2020. (BASE)
18. Multi-Source Attention for Unsupervised Domain Adaptation. Bollegala, Danushka; Cui, Xia. Association for Computational Linguistics, 2020. (BASE)
19. Learning to Compose Relational Embeddings in Knowledge Graphs. Hakami, Huda; Chen, Wenye; Bollegala, Danushka. Springer Singapore, 2020. (BASE; an illustrative code sketch of the relation-composition idea follows this listing)
Abstract: Knowledge Graph Embedding methods learn low-dimensional representations for entities and relations in knowledge graphs, which can be used to infer previously unknown relations between pairs of entities in the knowledge graph. This is particularly useful for expanding otherwise sparse knowledge graphs. However, the relation types that can be predicted using knowledge graph embeddings are confined to the set of relations that already exists in the KG. Often the relations that exist between two entities are not independent, and it is possible to predict what other relations are likely to exist between two entities by composing the embeddings of the relations in which each entity participates. We introduce relation composition as the task of inferring embeddings for unseen relations by combining existing relations in a knowledge graph. Specifically, we propose a supervised method to compose relational embeddings for novel relations using pre-trained relation embeddings for existing relations. Our experimental results on a previously proposed benchmark dataset for relation composition ranking and triple classification show that the proposed supervised relation composition method outperforms several unsupervised relation composition methods.
URL: http://livrepository.liverpool.ac.uk/3099008/1/Chen_PACLING_2019%20%281%29.pdf
https://link.springer.com/chapter/10.1007%2F978-981-15-6168-9_5
http://livrepository.liverpool.ac.uk/3099008/
20. Tree-Structured Neural Topic Model. (BASE)
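The abstract of entry 19 describes supervised relation composition: learning to predict an embedding for a relation that is not yet in the knowledge graph from the pre-trained embeddings of relations that already are. The sketch below illustrates that general idea only; it is not the paper's actual model. The ridge-regularised linear compositor, the embedding dimensionality, and the random stand-in embeddings are all assumptions made for illustration, showing the shape of the task: fit a function that maps a pair of existing relation embeddings to the embedding of the relation they compose into.

# Minimal sketch, not the paper's model: supervised relation composition as a
# learned linear map from a pair of pre-trained relation embeddings to the
# embedding of a target (composed) relation. Dimensions and data are assumed.
import numpy as np

rng = np.random.default_rng(0)
dim = 50          # assumed embedding dimensionality
n_train = 200     # assumed number of supervised composition examples

# Stand-ins for pre-trained relation embeddings of existing relations.
rel_a = rng.normal(size=(n_train, dim))
rel_b = rng.normal(size=(n_train, dim))
# Supervision: embeddings of the relations each (rel_a, rel_b) pair composes into.
rel_target = rng.normal(size=(n_train, dim))

# Features are the concatenation [rel_a; rel_b]; the compositor is a single
# ridge-regularised linear map W, fitted in closed form.
X = np.concatenate([rel_a, rel_b], axis=1)                # (n_train, 2*dim)
lam = 1e-2                                                # ridge penalty
W = np.linalg.solve(X.T @ X + lam * np.eye(2 * dim), X.T @ rel_target)

def compose(r1, r2):
    """Predict an embedding for the unseen composition of two existing relations."""
    return np.concatenate([r1, r2]) @ W

print(compose(rel_a[0], rel_b[0]).shape)  # (dim,) vector for the novel relation

In practice the random arrays would be replaced by relation embeddings from a pre-trained knowledge graph embedding model, and the linear map could be swapped for a small neural network trained on known relation compositions.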
