

Typically-Correct Derandomization for Small Time and Space

By William M. Hoza


Read the paper: arXiv · CCC proceedings

Abstract (for specialists)

Suppose a language $L$ can be decided by a bounded-error randomized algorithm that runs in space $S$ and time $n \cdot \text{poly}(S)$. We give a randomized algorithm for $L$ that still runs in space $O(S)$ and time $n \cdot \text{poly}(S)$ but uses only $O(S)$ random bits; our algorithm has a low failure probability on all but a negligible fraction of inputs of each length. As an immediate corollary, there is a deterministic algorithm for $L$ that runs in space $O(S)$ and succeeds on all but a negligible fraction of inputs of each length. We also give several other complexity-theoretic applications of our technique.
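For quick reference, here is a schematic LaTeX restatement of the main result, paraphrased from the abstract above; the exact failure probability and the precise meaning of "negligible fraction" are quantified in the paper and left unspecified here.

% Informal paraphrase of the abstract; the quantitative parameters
% (failure probability, "negligible" fraction of bad inputs) are
% specified in the paper, not here.
\begin{theorem}[informal]
  Suppose $L$ is decided by a bounded-error randomized algorithm
  running in space $S$ and time $n \cdot \text{poly}(S)$. Then:
  \begin{itemize}
    \item There is a randomized algorithm for $L$ that runs in space
      $O(S)$ and time $n \cdot \text{poly}(S)$, uses only $O(S)$
      random bits, and has low failure probability on all but a
      negligible fraction of inputs of each length.
    \item Consequently (by fixing the $O(S)$ random bits), there is a
      deterministic algorithm for $L$ that runs in space $O(S)$ and
      succeeds on all but a negligible fraction of inputs of each length.
  \end{itemize}
\end{theorem}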

Not-so-abstract (for curious outsiders)

⚠️ This summary might gloss over some important details.

A "randomized" algorithm tosses coins to make decisions, whereas a "deterministic" algorithm doesn't use any randomness. Most theoretical computer scientists believe that if a problem can be solved by a randomized algorithm, then it can also be solved by a deterministic algorithm that uses about the same amount of time and memory. But nobody knows how to prove it. In this paper, we prove that if a problem can be solved by a fast randomized algorithm, then it can also be solved by a deterministic algorithm that uses about the same amount of memory, with the caveat that the deterministic algorithm might give the wrong answer on a tiny fraction of inputs.

I posted a manuscript online in November 2017; I presented the paper at CCC in July 2019. The arXiv version and the CCC proceedings version are the same except for formatting.


Expository material:

Video from my presentation at CCC (July 2019). Here are the slides from that presentation. See also these longer slides from my presentation at HUJI's Theory of Computer Science Seminar (March 2018); I used very similar slides for my presentations at Weizmann Institute's Foundations of Computer Science Seminar (March 2018) and TAU's Theory of Computation Seminar (March 2018).