Thomas M. Cover (1938-2012)
Tom’s passing is a great loss. His work has left an indelible mark not only on information theory, but also on machine learning, statistics, probability, and finance (here is Sergio’s wonderful presentation of Tom’s work across all these fields). For instance, his result with Peter Hart, showing that the asymptotic probability of error of the nearest-neighbor rule is at most twice the Bayes rate, is one of the cornerstones of the theory of pattern recognition. His famous textbook, Elements of Information Theory, written with Joy Thomas, was the first to present, in a characteristically lucid manner that made them accessible to beginners, not only the standard topics, such as source and channel coding, but also more advanced material: multiterminal information theory, Kolmogorov complexity, the connections between information theory and statistics, the connections between information theory and gambling, and universal source coding. Everything Tom did was touched by elegance, simplicity, and grace. He will be missed, but never forgotten.
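For readers who have not seen the Cover–Hart result, it can be stated precisely as follows (a sketch of the 1967 theorem, with $R^*$ denoting the Bayes risk and $R$ the asymptotic risk of the nearest-neighbor rule over $M$ classes):

```latex
% Cover-Hart (1967): asymptotic nearest-neighbor risk vs. Bayes risk
R^* \;\le\; R \;\le\; R^*\left(2 - \frac{M}{M-1}\,R^*\right) \;\le\; 2R^* .
% In the two-class case (M = 2) the middle bound reads
R \;\le\; 2R^*(1 - R^*).
```

In other words, with unlimited data, the simplest conceivable classifier already comes within a factor of two of the best possible error rate, a fact that helps explain why the result became such a cornerstone.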