
KDD 2020 Accepted Papers

Feature-Induced Manifold Disambiguation for Multi-View Partial Multi-label Learning

Jing-Han Wu: Southeast University; Xuan Wu: Alibaba Group; Qing-Guo Chen: Alibaba Group; Yao Hu: Alibaba Group; Min-Ling Zhang: Southeast University



In the conventional multi-label learning framework, each example is assumed to be represented by a single feature vector and associated with multiple valid labels simultaneously. Nonetheless, real-world objects usually exhibit complicated properties, admitting multi-view feature representations as well as false-positive labeling. Accordingly, this paper studies the problem of multi-view partial multi-label learning (MVPML), where each example is assumed to be represented by multiple feature vectors while being associated with multiple candidate labels, only some of which are valid. To learn from MVPML examples, a novel approach named FIMAN is proposed, which makes use of the multi-view feature representation to tackle the noisy labeling information. First, an aggregate manifold structure over the training examples is generated by adaptively fusing the affinity information conveyed by the feature vectors of the different views. Then, the candidate labels of each training example are disambiguated by preserving the feature-induced manifold structure in the label space. Finally, the resulting predictive models are learned by fitting the modeling outputs to the disambiguated labels. Extensive experiments on a number of real-world data sets show that FIMAN achieves highly competitive performance against state-of-the-art approaches in solving the MVPML problem.
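
The three stages described in the abstract (affinity fusion across views, manifold-based label disambiguation, and model fitting on the disambiguated labels) can be sketched as follows. This is not the authors' FIMAN implementation: it is a minimal illustrative sketch that assumes uniform view weights, heat-kernel kNN affinities, a simple iterative label-propagation step confined to each example's candidate set, and per-label logistic-regression models in place of FIMAN's adaptive fusion and learning procedures.

```python
# Hypothetical MVPML pipeline in the spirit of FIMAN (not the authors' code):
# 1) build a kNN affinity matrix per view, 2) fuse them into one manifold,
# 3) disambiguate candidate labels by propagating them over the fused manifold,
# 4) fit one predictive model per label on the disambiguated targets.
import numpy as np
from sklearn.neighbors import kneighbors_graph
from sklearn.linear_model import LogisticRegression

def view_affinity(X, k=10):
    """Symmetrized kNN affinity matrix for one feature view (assumed heat-kernel weights)."""
    W = kneighbors_graph(X, n_neighbors=k, mode="distance", include_self=False).toarray()
    sigma = np.mean(W[W > 0]) + 1e-12
    W = np.where(W > 0, np.exp(-(W ** 2) / (2 * sigma ** 2)), 0.0)
    return np.maximum(W, W.T)  # symmetrize

def fuse_views(views, k=10):
    """Aggregate manifold: a plain average of per-view affinities (FIMAN fuses adaptively)."""
    return sum(view_affinity(X, k) for X in views) / len(views)

def disambiguate(W, Y_candidate, n_iter=50, alpha=0.9):
    """Label propagation over the fused manifold, confined to each example's candidate set."""
    S = np.diag(1.0 / (W.sum(axis=1) + 1e-12)) @ W   # row-normalized transition matrix
    F = Y_candidate.astype(float)
    for _ in range(n_iter):
        F = alpha * S @ F + (1 - alpha) * Y_candidate
        F *= Y_candidate                              # non-candidate labels stay zero
    # keep a candidate label if its propagated confidence is high relative to the example's best
    return ((F >= 0.5 * F.max(axis=1, keepdims=True)) & (Y_candidate > 0)).astype(int)

def fit_models(views, Y_disamb):
    """One binary classifier per label on the concatenated views (stand-in for FIMAN's final stage)."""
    X = np.hstack(views)
    return [LogisticRegression(max_iter=1000).fit(X, Y_disamb[:, j])
            if len(np.unique(Y_disamb[:, j])) > 1 else None
            for j in range(Y_disamb.shape[1])]

# Toy usage: 2 views, 100 examples, 5 labels with noisy (false-positive) candidate sets.
rng = np.random.default_rng(0)
views = [rng.normal(size=(100, 20)), rng.normal(size=(100, 30))]
Y_true = (rng.random((100, 5)) < 0.3).astype(int)
Y_candidate = np.clip(Y_true + (rng.random((100, 5)) < 0.2), 0, 1)
W = fuse_views(views)
Y_hat = disambiguate(W, Y_candidate)
models = fit_models(views, Y_hat)
```

The property the sketch tries to preserve is the one the abstract emphasizes: label confidence spreads only along the feature-induced manifold and never leaves an example's candidate label set, so false-positive candidates that are poorly supported by similar examples are pruned before the final models are fit.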
