TY - THES
TI - Reducing randomness in computation via explicit constructions
DO - 10.7282/t3-de8n-zy96
AU - Zhou, Shiyu
PY - 1996
DA - 1996/10
AB - One of the central questions in both complexity theory and algorithm design is whether randomness can be more powerful than determinism. In this dissertation, we explore problems related to this question and study how to reduce the randomness used in computation via explicit combinatorial constructions. One approach to replacing randomness with determinism is to apply pseudorandom generators. We examine the relationship between constructing pseudorandom generators for space-bounded computation and designing space-efficient deterministic algorithms for approximating matrix powering. As a consequence, we show that any randomized algorithm that runs in space O(S) and always terminates can be simulated deterministically in space O(S^{3/2}); the best previously known simulation, which follows from Savitch's Theorem, required space O(S^2). Deterministic amplification is one of the major subjects in the study of reducing randomness. An issue in randomized computation that is essentially equivalent to deterministic amplification is the design of randomized algorithms that are robust with respect to deviations of the random source from true randomness. This issue is closely connected to the problem of efficiently constructing graphs with certain expansion properties, called dispersers. We give an improved construction of a family of dispersers that can be used to design randomized algorithms achieving maximal robustness. Our construction settles a conjecture of Sipser that the complexity class Strong Randomized Polynomial Time (Strong-RP) equals Randomized Polynomial Time (RP). Our effort at constructing small sample spaces that retain certain randomness properties was motivated by the design of efficient deterministic algorithms via derandomization. Here we look at the problem from a different perspective and investigate the application of small-bias sample spaces to the design of error-correcting communication protocols.
We study the construction of sample spaces with small bias on certain sets of bit positions. By way of derandomization, we give a new construction whose size matches the bound given by the probabilistic argument, improving on the best previously known result. As an application of our improved construction, we present an essentially optimal communication protocol for performing a type of data consistency check in distributed databases.
LA - English
ER -