
Decontamination of Mutual Contamination Models

Julian Katz-Samuels, Gilles Blanchard, Clayton Scott; 20(41):1–57, 2019.

Abstract

Many machine learning problems can be characterized by mutual contamination models. In these problems, one observes several random samples, each drawn from a different convex combination of a set of unknown base distributions, and the goal is to infer these base distributions. This paper considers the general setting where the base distributions are defined on arbitrary probability spaces. We examine three popular machine learning problems that arise in this general setting: multiclass classification with label noise, demixing of mixed membership models, and classification with partial labels. In each case, we give sufficient conditions for identifiability and present algorithms for the infinite and finite sample settings, with associated performance guarantees.
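For concreteness, here is a minimal sketch of how data arise under such a model: each observed sample is drawn from a convex combination of unknown base distributions, and only these contaminated samples are available to the learner. Everything in the sketch (the Gaussian base distributions, the mixing matrix Pi, and all names) is an illustrative assumption, not taken from the paper.

```python
import numpy as np

# A minimal, hypothetical simulation of a mutual contamination model.
# The base distributions, mixing weights, and names are illustrative
# assumptions, not taken from the paper.

rng = np.random.default_rng(0)

L = 2  # number of unknown base distributions

def sample_base(j, n):
    """Draw n points from base distribution P_j (Gaussians for illustration)."""
    means = [-2.0, 2.0]
    return rng.normal(means[j], 1.0, size=n)

# Each row of Pi lies in the probability simplex: it gives the convex
# weights defining one observed (contaminated) distribution as a
# mixture of the base distributions.
Pi = np.array([[0.8, 0.2],
               [0.3, 0.7]])

def sample_contaminated(i, n):
    """Draw n points from the i-th contaminated distribution, i.e. the
    mixture with weights Pi[i, 0], ..., Pi[i, L-1] over the bases."""
    counts = rng.multinomial(n, Pi[i])
    return np.concatenate([sample_base(j, counts[j]) for j in range(L)])

# The learner observes only these contaminated samples; the inference
# goal is to recover the base distributions from them.
x0 = sample_contaminated(0, 1000)
x1 = sample_contaminated(1, 1000)
print(x0.mean(), x1.mean())  # each mean is pulled toward the dominant base
```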

© JMLR 2019.