Non-Negative Semi-Supervised Learning

Changhu Wang, Shuicheng Yan, Lei Zhang, Hongjiang Zhang
Proceedings of the Twelfth International Conference on Artificial Intelligence and Statistics, PMLR 5:575-582, 2009.

Abstract

The contributions of this paper are three-fold. First, we present a general formulation that reaps the benefits of both non-negative data factorization and semi-supervised learning; the resulting solution naturally possesses sparsity, robustness to partial occlusions, and greater discriminating power via extra unlabeled data. Second, an efficient multiplicative updating procedure is proposed, along with a theoretical justification of its convergence. Finally, the tensorization of this general formulation for non-negative semi-supervised learning is briefly described for handling tensor data of arbitrary order. Extensive comparisons with state-of-the-art algorithms for non-negative data factorization and semi-supervised learning demonstrate the algorithm's sparsity, classification power, and robustness to image occlusions.
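The paper's own semi-supervised update rules are not reproduced on this page. As background only, the following is a minimal sketch of the classical Lee & Seung multiplicative updates for plain NMF (Frobenius-norm objective), the baseline family this work extends; all function and variable names here are illustrative, not from the paper.

```python
import numpy as np

def nmf_multiplicative(V, r, n_iter=200, eps=1e-10, seed=0):
    """Classical multiplicative updates for V ~= W @ H with W, H >= 0.
    This is the standard NMF baseline, NOT the paper's semi-supervised
    variant. eps guards against division by zero."""
    rng = np.random.default_rng(seed)
    m, n = V.shape
    W = rng.random((m, r)) + eps
    H = rng.random((r, n)) + eps
    for _ in range(n_iter):
        # Elementwise multiplicative updates preserve non-negativity
        # by construction, since every factor in them is non-negative.
        H *= (W.T @ V) / (W.T @ W @ H + eps)
        W *= (V @ H.T) / (W @ H @ H.T + eps)
    return W, H
```

Because the updates only ever multiply non-negative quantities, no projection step is needed to maintain the non-negativity constraint.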

Cite this Paper


BibTeX
@InProceedings{pmlr-v5-wang09a,
  title     = {Non-Negative Semi-Supervised Learning},
  author    = {Wang, Changhu and Yan, Shuicheng and Zhang, Lei and Zhang, Hongjiang},
  booktitle = {Proceedings of the Twelfth International Conference on Artificial Intelligence and Statistics},
  pages     = {575--582},
  year      = {2009},
  editor    = {van Dyk, David and Welling, Max},
  volume    = {5},
  series    = {Proceedings of Machine Learning Research},
  address   = {Hilton Clearwater Beach Resort, Clearwater Beach, Florida USA},
  month     = {16--18 Apr},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v5/wang09a/wang09a.pdf},
  url       = {https://proceedings.mlr.press/v5/wang09a.html},
  abstract  = {The contributions of this paper are three-fold. First, we present a general formulation for reaping the benefits from both non-negative data factorization and semi-supervised learning, and the solution naturally possesses the characteristics of sparsity, robustness to partial occlusions, and greater discriminating power via extra unlabeled data. Then, an efficient multiplicative updating procedure is proposed along with its theoretic justification of the algorithmic convergency. Finally, the tensorization of this general formulation for non-negative semi-supervised learning is also briefed for handling tensor data of arbitrary order. Extensive experiments compared with the state-of-the-art algorithms for non-negative data factorization and semi-supervised learning demonstrate the algorithmic properties in sparsity, classification power, and robustness to image occlusions.}
}
Endnote
%0 Conference Paper
%T Non-Negative Semi-Supervised Learning
%A Changhu Wang
%A Shuicheng Yan
%A Lei Zhang
%A Hongjiang Zhang
%B Proceedings of the Twelfth International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2009
%E David van Dyk
%E Max Welling
%F pmlr-v5-wang09a
%I PMLR
%P 575--582
%U https://proceedings.mlr.press/v5/wang09a.html
%V 5
%X The contributions of this paper are three-fold. First, we present a general formulation for reaping the benefits from both non-negative data factorization and semi-supervised learning, and the solution naturally possesses the characteristics of sparsity, robustness to partial occlusions, and greater discriminating power via extra unlabeled data. Then, an efficient multiplicative updating procedure is proposed along with its theoretic justification of the algorithmic convergency. Finally, the tensorization of this general formulation for non-negative semi-supervised learning is also briefed for handling tensor data of arbitrary order. Extensive experiments compared with the state-of-the-art algorithms for non-negative data factorization and semi-supervised learning demonstrate the algorithmic properties in sparsity, classification power, and robustness to image occlusions.
RIS
TY  - CPAPER
TI  - Non-Negative Semi-Supervised Learning
AU  - Changhu Wang
AU  - Shuicheng Yan
AU  - Lei Zhang
AU  - Hongjiang Zhang
BT  - Proceedings of the Twelfth International Conference on Artificial Intelligence and Statistics
DA  - 2009/04/15
ED  - David van Dyk
ED  - Max Welling
ID  - pmlr-v5-wang09a
PB  - PMLR
DP  - Proceedings of Machine Learning Research
VL  - 5
SP  - 575
EP  - 582
L1  - http://proceedings.mlr.press/v5/wang09a/wang09a.pdf
UR  - https://proceedings.mlr.press/v5/wang09a.html
AB  - The contributions of this paper are three-fold. First, we present a general formulation for reaping the benefits from both non-negative data factorization and semi-supervised learning, and the solution naturally possesses the characteristics of sparsity, robustness to partial occlusions, and greater discriminating power via extra unlabeled data. Then, an efficient multiplicative updating procedure is proposed along with its theoretic justification of the algorithmic convergency. Finally, the tensorization of this general formulation for non-negative semi-supervised learning is also briefed for handling tensor data of arbitrary order. Extensive experiments compared with the state-of-the-art algorithms for non-negative data factorization and semi-supervised learning demonstrate the algorithmic properties in sparsity, classification power, and robustness to image occlusions.
ER  -
APA
Wang, C., Yan, S., Zhang, L. & Zhang, H. (2009). Non-Negative Semi-Supervised Learning. Proceedings of the Twelfth International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 5:575-582. Available from https://proceedings.mlr.press/v5/wang09a.html.