Learnability of Non-I.I.D.

Wei Gao, Xin-Yi Niu, Zhi-Hua Zhou
Proceedings of The 8th Asian Conference on Machine Learning, PMLR 63:158-173, 2016.

Abstract

Learnability has always been one of the most central problems in learning theory. Most previous studies on this issue were based on the assumption that the samples are drawn independently and identically according to an underlying (unknown) distribution. The i.i.d. assumption, however, does not hold in many real applications. In this paper, we study the learnability of problems where the samples are drawn from the empirical process of a stationary β-mixing sequence, a widely used assumption implying that the dependence among training samples weakens over time. By utilizing the independent blocks technique, we provide a necessary and sufficient condition for learnability: average stability is equivalent to learnability with AERM (Asymptotic Empirical Risk Minimization) in the non-i.i.d. learning setting. We also discuss the generalization error when the test variable is dependent on the training sample.
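The independent blocks technique mentioned in the abstract can be illustrated with a minimal sketch (this is a toy illustration of the general block construction for mixing sequences, not the paper's exact analysis; the function name and block-selection scheme are our own):

```python
import numpy as np

def independent_blocks(sample, block_size):
    """Partition a sequence into consecutive pairs of blocks and keep
    only the first block of each pair.  For a stationary beta-mixing
    sequence, the kept blocks are approximately independent when the
    block size is large relative to the mixing rate, since each kept
    block is separated from the next by a discarded block."""
    n = len(sample)
    num_pairs = n // (2 * block_size)
    return [sample[2 * i * block_size : (2 * i + 1) * block_size]
            for i in range(num_pairs)]

# Toy usage: an AR(1) chain (a standard example of a beta-mixing
# sequence) of length 20, split with block size 2.
rng = np.random.default_rng(0)
x = np.empty(20)
x[0] = rng.normal()
for t in range(1, 20):
    x[t] = 0.5 * x[t - 1] + rng.normal()

blocks = independent_blocks(x, block_size=2)  # 5 blocks of length 2
```

Proofs built on this construction typically bound the cost of replacing the dependent blocks by truly independent copies using the β-mixing coefficients, and then apply i.i.d.-style concentration arguments to the kept blocks.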

Cite this Paper


BibTeX
@InProceedings{pmlr-v63-Gao09,
  title     = {Learnability of Non-I.I.D.},
  author    = {Gao, Wei and Niu, Xin-Yi and Zhou, Zhi-Hua},
  booktitle = {Proceedings of The 8th Asian Conference on Machine Learning},
  pages     = {158--173},
  year      = {2016},
  editor    = {Durrant, Robert J. and Kim, Kee-Eung},
  volume    = {63},
  series    = {Proceedings of Machine Learning Research},
  address   = {The University of Waikato, Hamilton, New Zealand},
  month     = {16--18 Nov},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v63/Gao09.pdf},
  url       = {https://proceedings.mlr.press/v63/Gao09.html},
  abstract  = {Learnability has always been one of the most central problems in learning theory. Most previous studies on this issue were based on the assumption that the samples are drawn independently and identically according to an underlying (unknown) distribution. The i.i.d. assumption, however, does not hold in many real applications. In this paper, we study the learnability of problems where the samples are drawn from the empirical process of a stationary β-mixing sequence, a widely used assumption implying that the dependence among training samples weakens over time. By utilizing the independent blocks technique, we provide a necessary and sufficient condition for learnability: average stability is equivalent to learnability with AERM (Asymptotic Empirical Risk Minimization) in the non-i.i.d. learning setting. We also discuss the generalization error when the test variable is dependent on the training sample.}
}
Endnote
%0 Conference Paper
%T Learnability of Non-I.I.D.
%A Wei Gao
%A Xin-Yi Niu
%A Zhi-Hua Zhou
%B Proceedings of The 8th Asian Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2016
%E Robert J. Durrant
%E Kee-Eung Kim
%F pmlr-v63-Gao09
%I PMLR
%P 158--173
%U https://proceedings.mlr.press/v63/Gao09.html
%V 63
%X Learnability has always been one of the most central problems in learning theory. Most previous studies on this issue were based on the assumption that the samples are drawn independently and identically according to an underlying (unknown) distribution. The i.i.d. assumption, however, does not hold in many real applications. In this paper, we study the learnability of problems where the samples are drawn from the empirical process of a stationary β-mixing sequence, a widely used assumption implying that the dependence among training samples weakens over time. By utilizing the independent blocks technique, we provide a necessary and sufficient condition for learnability: average stability is equivalent to learnability with AERM (Asymptotic Empirical Risk Minimization) in the non-i.i.d. learning setting. We also discuss the generalization error when the test variable is dependent on the training sample.
RIS
TY - CPAPER
TI - Learnability of Non-I.I.D.
AU - Wei Gao
AU - Xin-Yi Niu
AU - Zhi-Hua Zhou
BT - Proceedings of The 8th Asian Conference on Machine Learning
DA - 2016/11/20
ED - Robert J. Durrant
ED - Kee-Eung Kim
ID - pmlr-v63-Gao09
PB - PMLR
DP - Proceedings of Machine Learning Research
VL - 63
SP - 158
EP - 173
L1 - http://proceedings.mlr.press/v63/Gao09.pdf
UR - https://proceedings.mlr.press/v63/Gao09.html
AB - Learnability has always been one of the most central problems in learning theory. Most previous studies on this issue were based on the assumption that the samples are drawn independently and identically according to an underlying (unknown) distribution. The i.i.d. assumption, however, does not hold in many real applications. In this paper, we study the learnability of problems where the samples are drawn from the empirical process of a stationary β-mixing sequence, a widely used assumption implying that the dependence among training samples weakens over time. By utilizing the independent blocks technique, we provide a necessary and sufficient condition for learnability: average stability is equivalent to learnability with AERM (Asymptotic Empirical Risk Minimization) in the non-i.i.d. learning setting. We also discuss the generalization error when the test variable is dependent on the training sample.
ER -
APA
Gao, W., Niu, X.-Y. &amp; Zhou, Z.-H. (2016). Learnability of Non-I.I.D. Proceedings of The 8th Asian Conference on Machine Learning, in Proceedings of Machine Learning Research 63:158-173. Available from https://proceedings.mlr.press/v63/Gao09.html.