Theory Colloquium: Conditional Computational Entropy, or Towards Separating Pseudoentropy from Compressibility
Speaker: Leonid Reyzin, Boston University
Date: Tuesday, March 11, 2008
Computational entropy measures the amount of randomness a distribution appears to have to a computationally bounded observer. It is an open question whether two definitions of this entropy -- the so-called "HILL entropy" (based on indistinguishability from truly random distributions) and "Yao entropy" (based on incompressibility) -- are equivalent, as they are in the information-theoretic setting. We observe that most of the time the observer has some correlated information, and we therefore define and study _conditional_ computational entropy. By considering conditional versions of HILL and Yao entropies, we obtain:
-- a separation between conditional HILL and Yao entropies;
-- the first demonstration of a distribution from which extraction techniques based on Yao entropy produce more pseudorandom bits than appear possible via the traditional HILL-entropy-based techniques;
-- a new, natural notion of unpredictability entropy, which, in particular, can handle entropy of singleton distributions, and allows for known extraction and hardcore bit results to be stated and used more generally.
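For readers unfamiliar with the two notions, here is a hedged sketch of the standard (unconditional) definitions from the literature; the parameters $\varepsilon$ and $s$ (distinguishing advantage and circuit size) are the usual quality measures and are not specified in the abstract above:

```latex
% Sketch of the two computational-entropy notions compared in the talk.
% H_\infty denotes min-entropy; \approx_{\varepsilon,s} denotes
% indistinguishability by circuits of size s with advantage \varepsilon.

% HILL entropy: X looks like some distribution with high min-entropy.
H^{\mathrm{HILL}}_{\varepsilon,s}(X) \ge k
  \iff \exists\, Y \text{ with } H_\infty(Y) \ge k
       \text{ and } X \approx_{\varepsilon,s} Y.

% Yao entropy: X cannot be efficiently compressed below k bits.
% For every pair of circuits (c, d) of size s with
% c : \{0,1\}^n \to \{0,1\}^{\ell}:
H^{\mathrm{Yao}}_{\varepsilon,s}(X) \ge k
  \iff \Pr\bigl[d(c(X)) = X\bigr] \le 2^{\ell - k} + \varepsilon.
```

In the information-theoretic setting (unbounded $s$, $\varepsilon = 0$) both notions collapse to min-entropy; the talk's separation concerns their conditional, computationally bounded versions.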
Joint work with Chun-Yuan Hsiao and Chi-Jen Lu.