Learning From Very-Few Labeled Examples with Soft Labels
- Publisher:
- IEEE Computer Society
- Publication Type:
- Conference Proceeding
- Citation:
- 2010 IEEE International Conference on Image Processing (ICIP 2010) - Proceedings, 2010, pp. 3869-3872
- Issue Date:
- 2010-01
Closed Access
Filename | Description | Size
---|---|---
2009007894OK.pdf | | 940.44 kB
This item is closed access and not available.
In this paper we propose Softboost, a novel boosting algorithm that combines the merits of transductive and inductive learning to attack the problem of learning from very few labeled training examples. In the transductive stage, soft labels of both the labeled and unlabeled samples are estimated with a Markovian label-propagation procedure. In the subsequent inductive stage, to handle out-of-sample data efficiently, we learn a weighted combination of simple rules in boosting style, each of which maximizes the confidence-weighted inter-class Kullback-Leibler (KL) divergence under the current data distribution. Finally, experiments on a toy dataset and the USPS handwritten digits demonstrate the method's effectiveness.
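Since the paper itself is closed access, the exact propagation rule is not reproduced here; the following is only a minimal sketch of the kind of Markovian soft-label propagation the abstract describes, assuming a standard graph-based scheme (Gaussian affinities, row-normalized transition matrix, iterated diffusion with clamping toward the labeled points). The function name `propagate_soft_labels` and all parameters (`sigma`, `alpha`, `n_iter`) are hypothetical, not taken from the paper.

```python
import numpy as np

def propagate_soft_labels(X, y, labeled_idx, n_classes,
                          sigma=1.0, alpha=0.99, n_iter=50):
    """Sketch of graph-based soft-label propagation (assumed scheme,
    not the paper's exact algorithm).

    X : (n, d) array of all samples, labeled and unlabeled.
    y : (n,) array of class indices; entries outside labeled_idx are ignored.
    labeled_idx : indices of the labeled samples.
    Returns an (n, n_classes) array of soft labels (rows sum to 1).
    """
    # Gaussian affinity matrix between all pairs of points.
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(axis=-1)
    W = np.exp(-d2 / (2.0 * sigma ** 2))
    np.fill_diagonal(W, 0.0)

    # Row-normalize to obtain a Markov transition matrix.
    P = W / W.sum(axis=1, keepdims=True)

    # One-hot targets on the labeled points, zero elsewhere.
    Y0 = np.zeros((X.shape[0], n_classes))
    Y0[labeled_idx, y[labeled_idx]] = 1.0

    # Diffuse label mass along the graph while pulling back
    # toward the known labels (strength controlled by alpha).
    F = Y0.copy()
    for _ in range(n_iter):
        F = alpha * (P @ F) + (1.0 - alpha) * Y0

    # Normalize rows so each sample carries a soft label distribution.
    return F / F.sum(axis=1, keepdims=True)
```

In the full method, these soft labels would then serve as confidence weights for the boosting stage, where each weak rule is chosen to maximize the confidence-weighted inter-class KL divergence.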