101. Emergent Communication for Understanding Human Language Evolution: What's Missing?
102. Learning Bidirectional Translation between Descriptions and Actions with Small Paired Data
103. Topic Discovery via Latent Space Clustering of Pretrained Language Model Representations
105. Learning Disentangled Representations of Negation and Uncertainty
106. Polish Natural Language Inference and Factivity -- an Expert-based Dataset and Benchmarks
108. Regional Negative Bias in Word Embeddings Predicts Racial Animus--but only via Name Frequency
109. WavThruVec: Latent speech representation as intermediate features for neural speech synthesis
111. Grounding Hindsight Instructions in Multi-Goal Reinforcement Learning for Robotics
112. Formal Language Recognition by Hard Attention Transformers: Perspectives from Circuit Complexity
113. An Information-theoretic Approach to Prompt Engineering Without Ground Truth Labels
114. A Slot Is Not Built in One Utterance: Spoken Language Dialogs with Sub-Slots
115. Towards Structuring Real-World Data at Scale: Deep Learning for Extracting Key Oncology Information from Clinical Text with Patient-Level Supervision
116. Improving Intrinsic Exploration with Language Abstractions
117. Shedding New Light on the Language of the Dark Web

Abstract: The hidden nature and limited accessibility of the Dark Web, combined with the lack of public datasets in this domain, make it difficult to study its inherent characteristics, such as its linguistic properties. Previous work on text classification in the Dark Web domain has suggested that deep neural models may be ineffective, potentially due to the linguistic differences between the Dark and Surface Webs. However, little work has been done to uncover the linguistic characteristics of the Dark Web. This paper introduces CoDA, a publicly available Dark Web dataset consisting of 10000 web documents tailored towards text-based Dark Web analysis. By leveraging CoDA, we conduct a thorough linguistic analysis of the Dark Web and examine the textual differences between the Dark Web and the Surface Web. We also assess the performance of various methods of Dark Web page classification. Finally, we compare CoDA with an existing public Dark Web dataset and evaluate their suitability for various use cases.

Comment: To appear at NAACL 2022 (main conference)

Keywords: Computation and Language (cs.CL); FOS: Computer and information sciences; Information Retrieval (cs.IR); Machine Learning (cs.LG)

URL: https://dx.doi.org/10.48550/arxiv.2204.06885
     https://arxiv.org/abs/2204.06885
118. HuSpaCy: an industrial-strength Hungarian natural language processing toolkit
119. On the Importance of Data Size in Probing Fine-tuned Models
120. Fine-grained Noise Control for Multispeaker Speech Synthesis