Deep Distant Supervision: Learning Statistical Relational Models For Weak Supervision In Natural Language Extraction
Solving Large Scale Learning Tasks: Challenges And Algorithms
Lecture Notes In Computer Science
One of the challenges in information extraction is the need for human-annotated examples, commonly called gold-standard examples. Many successful approaches alleviate this problem by employing some form of distant supervision, i.e., looking to knowledge bases such as Freebase as a source of supervision for creating more examples. While this is perfectly reasonable, most distant supervision methods rely on a given set of propositions as the source of supervision. We propose a different approach: we infer weakly supervised examples for relations from statistical relational models learned using knowledge outside the natural language task. We argue that this deep distant supervision creates more robust examples that are particularly useful when learning the entire model (both its structure and parameters). We demonstrate on several domains that this form of weak supervision yields superior results when learning structure compared to using distant supervision labels or a smaller set of labels.
S. Natarajan, A. Soni, A. Wazalwar, D. Viswanathan, and K. Kersting. "Deep Distant Supervision: Learning Statistical Relational Models for Weak Supervision in Natural Language Extraction". In: S. Michaelis, N. Piatkowski, and M. Stolpe (eds.), Solving Large Scale Learning Tasks: Challenges and Algorithms, Lecture Notes in Computer Science.