Extracting randomness via repeated condensing

Omer Reingold, Ronen Shaltiel, Avi Wigderson

Research output: Contribution to journal › Article › peer-review

Abstract

Extractors (as defined by Nisan and Zuckerman) are procedures that use a small number of truly random bits (called the seed) to extract many (almost) truly random bits from arbitrary distributions, as long as the distributions have sufficient (min-)entropy. A natural weakening of an extractor is a condenser, whose output distribution has a higher entropy rate than the input distribution (without losing much of the initial entropy). An extractor can be viewed as an ultimate condenser because it outputs a distribution with the maximal entropy rate. In this paper we construct explicit condensers with short seed length. The condenser constructions combine (variants or more efficient versions of) ideas from several works, including the block extraction scheme of [N. Nisan and D. Zuckerman, J. Comput. System Sci., 52 (1996), pp. 43-52], the observation made in [A. Srinivasan and D. Zuckerman, SIAM J. Comput., 28 (1999), pp. 1433-1459; N. Nisan and A. Ta-Shma, J. Comput. System Sci., 58 (1999), pp. 148-173] that a failure of the block extraction scheme is also useful, the recursive "win-win" case analysis of [R. Impagliazzo, R. Shaltiel, and A. Wigderson, Near-optimal conversion of hardness into pseudo-randomness, in Proceedings of the 40th Annual IEEE Symposium on Foundations of Computer Science, IEEE, Los Alamitos, CA, 1999, pp. 181-190; R. Impagliazzo, R. Shaltiel, and A. Wigderson, Extractors and pseudo-random generators with optimal seed length, in Proceedings of the 32nd Annual ACM Symposium on Theory of Computing, ACM, New York, 2000, pp. 1-10], and the error correction of random sources used in [L. Trevisan, J. ACM, 48 (2001), pp. 860-879]. As a by-product (via repeatedly iterating condensers), we obtain new extractor constructions. The new extractors give significant qualitative improvements over previous ones for sources of arbitrary min-entropy; they are nearly optimal simultaneously in the two main parameters of seed length and output length.
Specifically, our extractors can make either of these two parameters optimal (up to a constant factor) at only a polylogarithmic loss in the other. Previous constructions require a polynomial loss in both cases for general sources. We also give a simple reduction converting "standard" extractors (which are good for an average seed) into "strong" ones (which are good for most seeds), with essentially the same parameters. With this reduction, all the above improvements apply to strong extractors as well.
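To make the extractor interface concrete, here is a toy illustration of a seeded extractor based on 2-universal hashing (via the leftover hash lemma). This is a generic textbook construction for illustration only, not the condenser-based construction of the paper; the function name, the prime modulus, and the choice of hash family are all assumptions made for this sketch.

```python
def toy_extractor(source_bits, seed, m, p=(1 << 61) - 1):
    """Toy seeded extractor: hash the weak source with a 2-universal
    hash h_{a,b}(x) = ((a*x + b) mod p) mod 2^m, keyed by the seed.
    By the leftover hash lemma, if the source has enough min-entropy
    relative to m, the output is close to uniform (illustrative sketch,
    not the paper's construction)."""
    # Interpret the weak source's bits as an integer x.
    x = int("".join(str(b) for b in source_bits), 2)
    a, b = seed  # the seed is a pair (a, b) with 1 <= a < p, 0 <= b < p
    y = ((a * x + b) % p) % (1 << m)
    # Return the m extracted output bits (least significant first).
    return [(y >> i) & 1 for i in range(m)]

# Example: extract 16 bits from a 64-bit weak source with a fixed seed.
weak_source = [1, 0, 1, 1, 0, 1, 0, 1] * 8
output = toy_extractor(weak_source, (123456789, 987), 16)
```

A condenser has the same interface, but its output need only have a higher entropy rate than the input rather than be close to uniform; the paper obtains extractors by iterating such condensers until the rate is maximal.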

Original language: English
Pages (from-to): 1185-1209
Number of pages: 25
Journal: SIAM Journal on Computing
Volume: 35
Issue number: 5
DOIs
State: Published - 2006

Keywords

  • Derandomization
  • Randomness condensers
  • Randomness extractors

ASJC Scopus subject areas

  • Computer Science (all)
  • Mathematics (all)
