More learning-theoretic goodness:
- Case study: empirical quantizer design, where I discuss beautiful work by Tamás Linder et al. that uses VC theory to bound the performance of empirically designed vector quantizers (engineering jargon for the consistency of the k-means method).
- Binary classification: from the classic bounds for linear and generalized linear discriminant rules to modern techniques based on surrogate losses: voting methods, kernel machines, and convex risk minimization.
It’s that time again: time to savor the dreamy delights of divergence!
In this post, we will look at a powerful information-theoretic method for deriving concentration-of-measure inequalities (i.e., tail bounds) for general functions of independent random variables.
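To fix ideas, here is the flavor of result we are after; a standard example of such a tail bound is McDiarmid's bounded differences inequality (stated here for orientation; the post's information-theoretic derivation may proceed differently):

```latex
Let $X_1, \dots, X_n$ be independent random variables, and suppose
$f$ has the bounded differences property: for each $i$,
\[
  \sup_{x_1, \dots, x_n,\, x_i'}
  \bigl| f(x_1, \dots, x_i, \dots, x_n)
       - f(x_1, \dots, x_i', \dots, x_n) \bigr| \le c_i .
\]
Then for every $t > 0$,
\[
  \mathbb{P}\Bigl( \bigl| f(X_1,\dots,X_n)
     - \mathbb{E} f(X_1,\dots,X_n) \bigr| \ge t \Bigr)
  \le 2 \exp\!\left( - \frac{2 t^2}{\sum_{i=1}^n c_i^2} \right).
\]
```

Note that $f$ here can be any function satisfying the bounded differences condition, with no structural assumptions beyond independence of the $X_i$; this generality is exactly what makes such concentration results so useful.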