Cole Wyeth, University of Waterloo
Introduction to Algorithmic Probability and the Coding Theorem
Building on the prefix-free Kolmogorov complexity discussed at our last meeting, I will introduce the basic objects of algorithmic probability. In particular, with a theory of effective explanations in hand, it is natural to ask: which strings are more probable a priori? After all, it is harder to predict the data before you have seen it! The distributions generated by probabilistic Turing machines can be fully characterized as the (normalized) "lower semicomputable semimeasures," which naturally leads to the so-called "discrete universal distribution" m, obtained by simply mixing them all together. I will sketch a proof of Leonid A. Levin's coding theorem, which tells us that -lg m(x) = K(x) up to an additive constant, meaning that all of our work was, in the most satisfying possible sense, for nothing: we can take only the shortest algorithmic explanation without losing anything. However, this is all just a warm-up: we will find that the situation is much more intricate when we turn to the prediction of infinite sequences, which I hope to gesture at, time permitting.
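To make the objects in the abstract concrete, here is a minimal sketch using a hypothetical finite prefix-free machine (the table `U` below is invented for illustration; a genuine universal prefix machine has infinitely many programs). It computes the restricted analogue of m(x), the sum of 2^(-|p|) over programs p outputting x, and of K(x), the length of the shortest such program, and checks the easy half of the coding theorem, -lg m(x) <= K(x):

```python
import math

# Toy prefix-free machine U: a finite map from binary programs to outputs.
# (Hypothetical example; no program here is a prefix of another.)
U = {"0": "a", "10": "a", "11": "b"}

def m(x):
    # Discrete "universal" distribution restricted to this toy machine:
    # m(x) = sum over programs p with U(p) = x of 2^(-|p|).
    return sum(2 ** -len(p) for p, out in U.items() if out == x)

def K(x):
    # Prefix complexity relative to the toy machine:
    # length of the shortest program producing x.
    return min(len(p) for p, out in U.items() if out == x)

for x in ("a", "b"):
    # The shortest program alone contributes 2^(-K(x)) to m(x),
    # so -lg m(x) <= K(x) always holds; Levin's coding theorem says
    # the gap is bounded by a constant for a true universal machine.
    print(x, m(x), K(x), -math.log2(m(x)))
```

For example, "a" has two programs ("0" and "10"), so m("a") = 1/2 + 1/4 = 0.75 while K("a") = 1; the mixture over all explanations is never smaller than the contribution of the single best one.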
MC 5403