Algorithms and Complexity Seminar
*Note Special Date and Time*
Friday, September 15, 2006, 2-3:15pm in 32-G575.
Analytic Algorithmics, Combinatorics, and Information Theory
Analytic information theory aims to study problems of information
theory using analytic algorithmics and combinatorics. Following
Hadamard and Knuth's precept, we tackle these problems with complex
analysis methods such as generating functions, the Mellin transform,
Fourier series, the saddle point method, analytic poissonization and
depoissonization, and singularity analysis. This pursuit lies at the
crossroads of computer science and information theory.

In this talk, we concentrate on one facet of information theory,
namely the redundancy rate problem of source coding (better known as
data compression). The redundancy rate problem for a class of sources
is the determination of how far the actual code length exceeds the
optimal (ideal) code length. We examine the worst case (individual)
minimax redundancy for memoryless, Markov, and renewal sources.
Precise analysis of redundancy for such sources leads us to tree
generating functions (arising in counting labeled rooted trees),
integer partitions, enumeration of Eulerian paths in multigraphs, and
counting binary trees with a given path length.

In the last part of the talk we reflect on *information* in its
generality and muse on some problems at the interface of computer
science and information theory. In conclusion, we describe a few
pertinent challenges that bridge Shannon information with Boltzmann's
entropy, Maxwell's demon, Landauer's principle, Bennett's reversible
computations, and the timeliness and value of information.
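For concreteness, the worst-case (individual) minimax redundancy mentioned above admits, by Shtarkov's classical argument, a closed-form "maximum-likelihood sum" expression. The sketch below evaluates it for a binary memoryless source; the function name and the numerical check are illustrative assumptions, not material from the talk itself:

```python
import math

def shtarkov_redundancy(n: int) -> float:
    """Worst-case minimax redundancy (in bits) for binary memoryless
    sources over sequences of length n.

    Shtarkov's result gives
        R*_n = log2( sum over all x of sup_P P(x) ),
    and for binary i.i.d. sources the sup over the parameter is
    attained at the empirical frequency k/n, so the sum collapses to
    a sum over the number of ones k.
    """
    total = 0.0
    for k in range(n + 1):
        p = k / n
        # 0.0 ** 0 == 1.0 in Python, so the k = 0 and k = n terms
        # are handled correctly.
        total += math.comb(n, k) * (p ** k) * ((1 - p) ** (n - k))
    return math.log2(total)

# Asymptotically R*_n = (1/2) log2 n + (1/2) log2(pi/2) + o(1),
# a typical "precise analysis" result of the kind the talk describes.
```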
Host: Madhu Sudan
(The Algorithms
and Complexity Seminar series talks are usually held Mondays from
4-5:15pm in 32-G575.)