Hongchao Zhou, California Institute of Technology
Abstract
The generation of high-quality random bits from imperfect sources has a huge impact on computer science. Although many important problems have already been well studied, some fundamental questions still remain open. For example, Shannon's theory tells us that the efficiency limit for random bit generation is determined by the source's entropy. But it is still unknown how to achieve this limit when generating a 'stream' of random bits from biased coins, or when generating random bits from Markov chains. In this talk, we address these open questions and present a few algorithms that run in expected linear time and reach the information-theoretic upper bound on efficiency. Additionally, we demonstrate that information theory provides many powerful tools for generating random bits from imperfect sources. Such tools include linear transformations based on sparse random matrices, and data compression schemes like Lempel-Ziv codes.
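As a point of reference for the efficiency question raised above, the classical von Neumann scheme produces unbiased bits from a biased coin but falls well short of the entropy limit. The minimal Python sketch below (not the speaker's algorithm; the bias value p = 0.7 is purely illustrative) shows this baseline and why there is a gap to close.

```python
import random

def von_neumann_extractor(coin_flips):
    """Classical von Neumann scheme: group flips into pairs, map 01 -> 0 and
    10 -> 1, and discard 00/11.  The output bits are unbiased, but the
    expected yield is only p*(1-p) bits per input flip, below the source
    entropy H(p)."""
    output = []
    for a, b in zip(coin_flips[::2], coin_flips[1::2]):
        if a != b:
            output.append(a)  # pair (0,1) yields 0, pair (1,0) yields 1
    return output

# Simulate a coin biased toward 1 with probability p (illustrative value).
p = 0.7
flips = [1 if random.random() < p else 0 for _ in range(10_000)]
bits = von_neumann_extractor(flips)

# Yield is roughly p*(1-p) = 0.21 bits per flip, versus H(0.7) ~= 0.88.
print(len(bits) / len(flips))
```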
Biography: Hongchao Zhou is a Ph.D. candidate in Electrical Engineering at the California Institute of Technology. Before coming to Caltech, he received a bachelor's degree in physics and mathematics and a master's degree in control science and engineering from Tsinghua University. His current research focuses on information theory and coding theory, along with their applications in data storage, random number generation, and stochastic computing.