The Information Structuralist

Counting bits with Vapnik and Chervonenkis

Posted in Information Theory, Statistical Learning and Inference by mraginsky on April 28, 2015

Machine learning is about enabling computers to improve their performance on a given task as they get more data. Can we express this intuition quantitatively using information-theoretic techniques? In this post, I will discuss a classic paper by David Haussler, Michael Kearns, and Robert Schapire that (to the best of my knowledge) took the first step in this direction, and describe some of their results, recast in a more explicitly information-theoretic way.