Robust Principal Component Analysis with Complex Noise

Qian Zhao, Deyu Meng, Zongben Xu, Wangmeng Zuo, Lei Zhang
Proceedings of the 31st International Conference on Machine Learning, PMLR 32(2):55-63, 2014.

Abstract

The research on robust principal component analysis (RPCA) has been attracting much attention recently. The original RPCA model assumes sparse noise and uses the L_1-norm to characterize the error term. In practice, however, the noise is much more complex, and it is not appropriate to simply use a certain L_p-norm for noise modeling. We propose a generative RPCA model under the Bayesian framework by modeling data noise as a mixture of Gaussians (MoG). The MoG is a universal approximator to continuous distributions, and thus our model is able to fit a wide range of noises, such as Laplacian, Gaussian, sparse noise, and any combination of them. A variational Bayes algorithm is presented to infer the posterior of the proposed model. All involved parameters can be recursively updated in closed form. The advantage of our method is demonstrated by extensive experiments on synthetic data, face modeling and background subtraction.
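The abstract's central claim is that a mixture of Gaussians can approximate a wide range of noise distributions. The sketch below is only a rough illustration of that claim, not the paper's method (which couples the MoG noise model with a low-rank component and variational Bayes inference): it fits a 1-D Gaussian mixture to heavy-tailed Laplacian samples via EM, using the closed-form responsibility and parameter updates, and shows that adding components improves the fit. All function and variable names are illustrative.

```python
import numpy as np

def fit_gmm_1d(x, k, iters=100, seed=0):
    """Fit a 1-D Gaussian mixture to samples x by EM; return mean log-likelihood."""
    rng = np.random.default_rng(seed)
    n = x.size
    mu = rng.choice(x, k)            # initialize means from random data points
    var = np.full(k, x.var())        # shared initial variances
    pi = np.full(k, 1.0 / k)         # uniform mixing weights
    for _ in range(iters):
        # E-step: log responsibilities, stabilized by subtracting the row max
        logp = (-0.5 * (x[:, None] - mu) ** 2 / var
                - 0.5 * np.log(2 * np.pi * var) + np.log(pi))
        logp -= logp.max(axis=1, keepdims=True)
        r = np.exp(logp)
        r /= r.sum(axis=1, keepdims=True)
        # M-step: closed-form updates for weights, means, variances
        nk = r.sum(axis=0)
        pi = nk / n
        mu = (r * x[:, None]).sum(axis=0) / nk
        var = (r * (x[:, None] - mu) ** 2).sum(axis=0) / nk + 1e-8
    # mean log-likelihood of the fitted mixture on the samples
    dens = (pi * np.exp(-0.5 * (x[:, None] - mu) ** 2 / var)
            / np.sqrt(2 * np.pi * var)).sum(axis=1)
    return np.log(dens).mean()

rng = np.random.default_rng(0)
noise = rng.laplace(scale=1.0, size=5000)  # heavy-tailed, non-Gaussian noise
ll1 = fit_gmm_1d(noise, k=1)
ll3 = fit_gmm_1d(noise, k=3)
print(ll1, ll3)  # the 3-component mixture typically fits the Laplacian better
```

A single Gaussian cannot capture the Laplacian's sharp peak and heavy tails, while a few components can, which is the intuition behind using an MoG error term in place of a fixed L_p-norm.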

Cite this Paper

BibTeX
@InProceedings{pmlr-v32-zhao14,
  title     = {Robust Principal Component Analysis with Complex Noise},
  author    = {Zhao, Qian and Meng, Deyu and Xu, Zongben and Zuo, Wangmeng and Zhang, Lei},
  booktitle = {Proceedings of the 31st International Conference on Machine Learning},
  pages     = {55--63},
  year      = {2014},
  editor    = {Xing, Eric P. and Jebara, Tony},
  volume    = {32},
  number    = {2},
  series    = {Proceedings of Machine Learning Research},
  address   = {Beijing, China},
  month     = {22--24 Jun},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v32/zhao14.pdf},
  url       = {https://proceedings.mlr.press/v32/zhao14.html},
  abstract  = {The research on robust principal component analysis (RPCA) has been attracting much attention recently. The original RPCA model assumes sparse noise and uses the L_1-norm to characterize the error term. In practice, however, the noise is much more complex, and it is not appropriate to simply use a certain L_p-norm for noise modeling. We propose a generative RPCA model under the Bayesian framework by modeling data noise as a mixture of Gaussians (MoG). The MoG is a universal approximator to continuous distributions, and thus our model is able to fit a wide range of noises, such as Laplacian, Gaussian, sparse noise, and any combination of them. A variational Bayes algorithm is presented to infer the posterior of the proposed model. All involved parameters can be recursively updated in closed form. The advantage of our method is demonstrated by extensive experiments on synthetic data, face modeling and background subtraction.}
}
EndNote
%0 Conference Paper
%T Robust Principal Component Analysis with Complex Noise
%A Qian Zhao
%A Deyu Meng
%A Zongben Xu
%A Wangmeng Zuo
%A Lei Zhang
%B Proceedings of the 31st International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2014
%E Eric P. Xing
%E Tony Jebara
%F pmlr-v32-zhao14
%I PMLR
%P 55--63
%U https://proceedings.mlr.press/v32/zhao14.html
%V 32
%N 2
%X The research on robust principal component analysis (RPCA) has been attracting much attention recently. The original RPCA model assumes sparse noise and uses the L_1-norm to characterize the error term. In practice, however, the noise is much more complex, and it is not appropriate to simply use a certain L_p-norm for noise modeling. We propose a generative RPCA model under the Bayesian framework by modeling data noise as a mixture of Gaussians (MoG). The MoG is a universal approximator to continuous distributions, and thus our model is able to fit a wide range of noises, such as Laplacian, Gaussian, sparse noise, and any combination of them. A variational Bayes algorithm is presented to infer the posterior of the proposed model. All involved parameters can be recursively updated in closed form. The advantage of our method is demonstrated by extensive experiments on synthetic data, face modeling and background subtraction.
RIS
TY  - CPAPER
TI  - Robust Principal Component Analysis with Complex Noise
AU  - Qian Zhao
AU  - Deyu Meng
AU  - Zongben Xu
AU  - Wangmeng Zuo
AU  - Lei Zhang
BT  - Proceedings of the 31st International Conference on Machine Learning
DA  - 2014/06/18
ED  - Eric P. Xing
ED  - Tony Jebara
ID  - pmlr-v32-zhao14
PB  - PMLR
DP  - Proceedings of Machine Learning Research
VL  - 32
IS  - 2
SP  - 55
EP  - 63
L1  - http://proceedings.mlr.press/v32/zhao14.pdf
UR  - https://proceedings.mlr.press/v32/zhao14.html
AB  - The research on robust principal component analysis (RPCA) has been attracting much attention recently. The original RPCA model assumes sparse noise and uses the L_1-norm to characterize the error term. In practice, however, the noise is much more complex, and it is not appropriate to simply use a certain L_p-norm for noise modeling. We propose a generative RPCA model under the Bayesian framework by modeling data noise as a mixture of Gaussians (MoG). The MoG is a universal approximator to continuous distributions, and thus our model is able to fit a wide range of noises, such as Laplacian, Gaussian, sparse noise, and any combination of them. A variational Bayes algorithm is presented to infer the posterior of the proposed model. All involved parameters can be recursively updated in closed form. The advantage of our method is demonstrated by extensive experiments on synthetic data, face modeling and background subtraction.
ER  -
APA
Zhao, Q., Meng, D., Xu, Z., Zuo, W. & Zhang, L. (2014). Robust Principal Component Analysis with Complex Noise. Proceedings of the 31st International Conference on Machine Learning, in Proceedings of Machine Learning Research 32(2):55-63. Available from https://proceedings.mlr.press/v32/zhao14.html.