Label Noise Reduction in Entity Typing by Heterogeneous Partial-Label Embedding
Xiang Ren*, UIUC; Wenqi He, UIUC; Meng Qu, UIUC; Heng Ji, RPI; Clare Voss, ARL; Jiawei Han, UIUC
Current systems for fine-grained entity typing use distant supervision in conjunction with existing knowledge bases to assign categories (type labels) to entity mentions. However, the type labels so obtained from knowledge bases are often noisy (i.e., incorrect for the entity mention’s local context). We define a new task, Label Noise Reduction in Entity Typing (LNR), as the automatic identification of correct type labels (type-paths) for training examples, given the set of candidate type labels obtained by distant supervision with a given type hierarchy. The unknown type labels for individual entity mentions and the semantic similarity between entity types pose unique challenges for solving the LNR task. We propose a general framework, called PLE, to jointly embed entity mentions, text features, and entity types into the same low-dimensional space, where objects whose types are semantically close have similar representations. We then estimate the type-path for each training example in a top-down manner using the learned embeddings. We formulate a global objective for learning the embeddings from text corpora and knowledge bases, which adopts a novel margin-based loss that is robust to noisy labels and faithfully models type correlation derived from knowledge bases. Our experiments on three public typing datasets demonstrate the effectiveness and robustness of PLE, with an average improvement of 25% in accuracy over the next best method.
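The two key ideas in the abstract — a margin-based loss that ranks a mention's candidate types above non-candidate types, and top-down type-path inference in the learned embedding space — can be illustrated with a minimal sketch. All names, the toy hierarchy, and the hand-crafted vectors below are hypothetical, chosen only to show the mechanics; this is not the paper's actual implementation.

```python
import numpy as np

# Hypothetical toy type hierarchy: each node maps to its child types.
HIERARCHY = {
    "root": ["person", "organization"],
    "person": ["artist", "athlete"],
    "artist": [], "athlete": [], "organization": [],
}

# Hand-crafted 3-d type embeddings (illustrative only, not learned).
TYPE_VECS = {
    "person":       np.array([ 1.0,  0.0, 0.0]),
    "organization": np.array([-1.0,  0.0, 0.0]),
    "artist":       np.array([ 1.0,  1.0, 0.0]),
    "athlete":      np.array([ 1.0, -1.0, 0.0]),
}

def partial_label_margin_loss(mention_vec, cand_types, neg_types,
                              type_vecs, margin=1.0):
    """Sketch of a partial-label margin loss: the best-scoring
    candidate type should outrank every non-candidate type by a
    margin, so one wrong candidate among the noisy labels does not
    dominate the objective."""
    best_cand = max(mention_vec @ type_vecs[t] for t in cand_types)
    return sum(max(0.0, margin - best_cand + mention_vec @ type_vecs[t])
               for t in neg_types)

def infer_type_path(mention_vec, type_vecs, hierarchy, root="root"):
    """Top-down type-path estimation: starting at the root, repeatedly
    descend to the child type whose embedding is most similar (by dot
    product) to the mention embedding."""
    path, node = [], root
    while hierarchy[node]:
        node = max(hierarchy[node], key=lambda t: mention_vec @ type_vecs[t])
        path.append(node)
    return path

# A mention embedded near the "athlete" region of the space.
mention = np.array([0.9, -0.8, 0.1])
print(infer_type_path(mention, TYPE_VECS, HIERARCHY))  # ['person', 'athlete']
print(partial_label_margin_loss(mention, ["person", "athlete"],
                                ["organization"], TYPE_VECS))
```

The top-down walk reflects the constraint that a valid label is a path in the type hierarchy: committing to a coarse type first ("person") restricts which fine-grained types ("artist" vs. "athlete") are considered next.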
Source code: GitHub link
Presentation slides: PDF (Dropbox link)
Filed under: Mining Rich Data Types | Semi-Supervised Learning