1. Dependency Patterns of Complex Sentences and Semantic Disambiguation for Abstract Meaning Representation Parsing ...
BASE
2. One Semantic Parser to Parse Them All: Sequence to Sequence Multi-Task Learning on Semantic Parsing Datasets ...
3. Modeling Sense Structure in Word Usage Graphs with the Weighted Stochastic Block Model ...
4. InFillmore: Frame-Guided Language Generation with Bidirectional Context ...
6. Learning Embeddings for Rare Words Leveraging Internet Search Engine and Spatial Location Relationships ...
7. Evaluating Universal Dependency Parser Recovery of Predicate Argument Structure via CompChain Analysis ...
8. ParsFEVER: a Dataset for Farsi Fact Extraction and Verification ...
9. Did the Cat Drink the Coffee? Challenging Transformers with Generalized Event Knowledge ...
11. Teach the Rules, Provide the Facts: Targeted Relational-knowledge Enhancement for Textual Inference ...

Abstract:
We present InferBert, a method for enhancing transformer-based inference models with relevant relational knowledge. Our approach facilitates learning generic inference patterns that require relational knowledge (e.g. inferences related to hypernymy) during training, while injecting the relevant relational facts (e.g. pangolin is an animal) on demand at test time. We apply InferBert to the NLI task over a diverse set of inference types (hypernymy, location, color, and country of origin), for which we collected challenge datasets. In this setting, InferBert succeeds in learning general inference patterns from a relatively small number of training instances, while not hurting performance on the original NLI data and substantially outperforming prior knowledge-enhancement models on the challenge data. It further applies its inferences successfully at test time to previously unobserved entities. InferBert is more computationally efficient than most prior methods in terms of number of parameters, memory consumption and ...

Keywords:
Computational Linguistics; Data Management System; FOS Languages and literature; Linguistics; Natural Language Processing; Semantics

URL: https://dx.doi.org/10.48448/05ak-j010
https://underline.io/lecture/29793-teach-the-rules,-provide-the-facts-targeted-relational-knowledge-enhancement-for-textual-inference
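The on-demand fact-injection idea described in the abstract above can be loosely illustrated in code. This is a hypothetical sketch, not the paper's actual mechanism (InferBert enhances the model itself): here, relevant relational facts for entities mentioned in an NLI pair are simply prepended to the premise, so that a generic inference model could exploit them. The `HYPERNYMS` table and `inject_facts` helper are invented for illustration.

```python
# Hypothetical knowledge source: hypernymy facts keyed by entity.
# (Invented for illustration; a real system would query a knowledge base.)
HYPERNYMS = {
    "pangolin": "animal",
    "mango": "fruit",
}

def inject_facts(premise: str, hypothesis: str) -> str:
    """Prepend a relational fact for each known entity mentioned in the
    premise or hypothesis, so a downstream NLI model can use it."""
    mentioned = set((premise + " " + hypothesis).lower().split())
    facts = [
        # Choose "an" before a vowel-initial hypernym, "a" otherwise.
        f"A {entity} is an {hyper}." if hyper[0] in "aeiou"
        else f"A {entity} is a {hyper}."
        for entity, hyper in HYPERNYMS.items()
        if entity in mentioned
    ]
    return " ".join(facts + [premise])

augmented = inject_facts("A pangolin crossed the road.",
                         "An animal crossed the road.")
print(augmented)  # "A pangolin is an animal. A pangolin crossed the road."
```

The key property the abstract highlights is that the inference pattern (hypernymy licenses entailment) is learned once during training, while the specific fact is supplied at test time, so the approach generalizes to entities never seen in training.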
12. Multilingual Neural Semantic Parsing for Low-Resourced Languages ...
13. Inducing Language-Agnostic Multilingual Representations ...
14. Denoising Word Embeddings by Averaging in a Shared Space ...
16. Evaluating a Joint Training Approach for Learning Cross-lingual Embeddings with Sub-word Information without Parallel Corpora on Lower-resource Languages ...
17. NUS-IDS at CASE 2021 Task 1: Improving Multilingual Event Sentence Coreference Identification with Linguistic Information ...
18. Can Transformer Language Models Predict Psychometric Properties? ...