Random self-reducibility

Random self-reducibility (RSR) is the ability to solve all instances of a problem by solving a large fraction of the instances. For a randomly self-reducible problem, a good algorithm for the average case therefore implies a good algorithm for the worst case.

Definition

If the value of a function f on an arbitrary instance x can be computed in polynomial time from the values of f on one or more random instances y1, ..., yk, then f is randomly self-reducible (this is also known as a non-adaptive uniform self-reduction). In a random self-reduction, an arbitrary worst-case instance x in the domain of f is mapped to a random set of instances y1, ..., yk in such a way that f(x) can be computed in polynomial time given the coin-toss sequence used by the mapping, x, and f(y1), ..., f(yk). Consequently, taking the average with respect to the distribution induced on the yi, the average-case complexity of f is the same (within polynomial factors) as the worst-case randomized complexity of f.

One special case of note is when each random instance yi is distributed uniformly over the entire set of elements in the domain of f that have length |x|. In this case f is as hard on average as it is in the worst case. Two features of the general definition are worth noting. First, the generation of y1, ..., yk is performed non-adaptively: y2 is picked before f(y1) is known. Second, the points y1, ..., yk need not be uniformly distributed.
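More formally, f is non-adaptively k-random self-reducible if there exist polynomial-time computable functions σ and φ such that, for every instance x and every coin-toss sequence r,

```latex
y_i = \sigma(i, x, r) \quad (1 \le i \le k),
\qquad
f(x) = \phi\bigl(x,\, r,\, f(y_1), \ldots, f(y_k)\bigr),
```

where the distribution of each y_i, taken over the choice of r, depends only on the length |x|. (This formalization follows the style of Feigenbaum and Fortnow; the symbols σ and φ are introduced here only to state the definition.)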

Application in cryptographic protocols

Problems that require some privacy in the data (typically "cryptographic problems") can use randomization to ensure that privacy. In fact, the only provably secure cryptographic system (the one-time pad) has its security relying totally on the "randomness" of the key data supplied to the system.

The field of cryptography utilizes the fact that certain number-theoretic functions are randomly self-reducible. This includes probabilistic encryption and cryptographically strong pseudorandom number generation. Also, instance-hiding schemes (where a weak private device uses a strong public device without revealing its data) are easily exemplified by random self-reductions.

Examples

* Discrete logarithm

Theorem: If a deterministic polynomial-time algorithm A computes the discrete logarithm for a 1/poly(n) fraction of inputs y (where n = |p|), then there is a randomized polynomial-time algorithm that computes the discrete logarithm for every input.

Given a generator g of a cyclic group G and an element x ∈ G, the discrete logarithm of x to the base g is the integer k (0 ≤ k < |G|) with x = g^k. Since the order of G is known, we can pick B uniformly at random from {0, ..., |G| − 1}; then g^B is distributed uniformly over G, and hence so is xg^B. (In the concrete case where G is the multiplicative group Zp* of a prime p, g^B is a nonzero residue, so the blinded instance xg^B can never be 0.) The instance xg^B is therefore a uniformly random group element, independent of x, and log_g(xg^B) ≡ B + log_g x (mod |G|), so recovering the logarithm of the random instance xg^B immediately gives the logarithm of x: the discrete logarithm is randomly self-reducible.
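The following Python sketch illustrates the blinding step just described. For simplicity it works in the full multiplicative group modulo a small prime, and the partially correct oracle flaky_dlog_oracle, the modulus 101, and the generator 2 are invented purely for illustration; the reduction itself is the mapping x ↦ xg^B together with the fact that a candidate logarithm can be checked by exponentiation.

```python
import random

p = 101          # small prime modulus (illustrative choice)
g = 2            # a generator of the multiplicative group mod 101
n = p - 1        # order of the group

def flaky_dlog_oracle(y):
    """Hypothetical 'average-case' solver: answers correctly only on part
    of the domain (here, when y is even) and returns garbage otherwise."""
    if y % 2 == 0:
        k, acc = 1, g % p
        while acc != y:          # brute force, fine for a toy group
            acc = acc * g % p
            k += 1
        return k
    return random.randrange(n)   # wrong answer on the rest of the domain

def worst_case_dlog(x, oracle, tries=200):
    """Compute log_g(x) for an arbitrary x by querying the oracle on
    uniformly random blinded instances x * g^B and un-blinding the result."""
    for _ in range(tries):
        B = random.randrange(n)           # uniform blinding exponent
        y = x * pow(g, B, p) % p          # uniformly distributed over the group
        answer = (oracle(y) - B) % n      # candidate logarithm of x
        if pow(g, answer, p) == x:        # a discrete log answer is verifiable
            return answer
    raise RuntimeError("oracle never answered correctly")

# Example: recover the exponent of an arbitrary group element.
k = 37
assert worst_case_dlog(pow(g, k, p), flaky_dlog_oracle) == k
```

Because a wrong oracle answer is detected by the final exponentiation check, repeating with fresh values of B until the check passes turns a solver that works on a noticeable fraction of instances into one that works on every instance.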

* Permanent

Given the definition of the permanent of a matrix, it is clear that PERM(M) for any n-by-n matrix M is a multivariate polynomial of degree n in the entries of M. Calculating the permanent of a matrix is a difficult computational task: PERM has been shown to be #P-complete. Moreover, if we can compute PERM(M) for most matrices, then there is a randomized program that computes PERM(M) for all matrices, which demonstrates that PERM is randomly self-reducible. We confine ourselves here to matrices whose entries are drawn from a finite field Fp for some prime p, with all arithmetic performed in that field.

Let "X" be a random "n"-by-"n" matrix with entries from "Fp". Since all the entries of any matrix "M" + "kX" are linear functions of "k", by composing those linear functions with the degree "n" mulitvariate polynomial that calculates "PERM"("M") we get another degree "n" polynomial on "k", which we will call "p"("k"). Clearly, "p"(0) is equal to the permanent of "M".

Suppose we know a program that computes the correct value of PERM(A) for most n-by-n matrices with entries from Fp, specifically for a 1 − 1/(3n) fraction of them. Since, for each fixed k ≠ 0, the matrix M + kX is uniformly distributed, a union bound over the n + 1 queries shows that with probability roughly two-thirds the program answers PERM(M + kX) correctly for every k = 1, 2, ..., n + 1. Given those n + 1 values, we can recover the coefficients of p(k) by interpolation (recall that p(k) has degree n, so n + 1 points determine it), and evaluating p(0) yields PERM(M).

Any single run therefore risks being wrong about one third of the time, but by picking fresh random matrices X, repeating the procedure several times, and returning the majority answer, the error probability can be driven arbitrarily low. A sketch of one run of the reduction follows.
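A minimal Python sketch of one run of this reduction, assuming a field Fp with p = 10007 and using a brute-force permanent routine in place of the "mostly correct" program (so the majority-vote repetition above is omitted), is:

```python
import itertools
import random

p = 10007                      # a prime comfortably larger than n + 1

def permanent_mod_p(M):
    """Brute-force permanent over F_p; stands in for the oracle."""
    n = len(M)
    total = 0
    for perm in itertools.permutations(range(n)):
        prod = 1
        for i in range(n):
            prod = prod * M[i][perm[i]] % p
        total = (total + prod) % p
    return total

def lagrange_at_zero(points):
    """Evaluate at 0 the unique polynomial of degree <= n through the
    given n + 1 (k, value) pairs, with all arithmetic in F_p."""
    result = 0
    for i, (xi, yi) in enumerate(points):
        num, den = 1, 1
        for j, (xj, _) in enumerate(points):
            if i != j:
                num = num * (-xj) % p
                den = den * (xi - xj) % p
        result = (result + yi * num * pow(den, p - 2, p)) % p   # Fermat inverse
    return result

def permanent_via_self_reduction(M, oracle=permanent_mod_p):
    """Compute PERM(M) from oracle answers on the random matrices M + kX."""
    n = len(M)
    X = [[random.randrange(p) for _ in range(n)] for _ in range(n)]
    points = []
    for k in range(1, n + 2):                       # k = 1, ..., n + 1
        Mk = [[(M[i][j] + k * X[i][j]) % p for j in range(n)]
              for i in range(n)]                    # uniformly distributed instance
        points.append((k, oracle(Mk)))
    return lagrange_at_zero(points)                 # p(0) = PERM(M)

# Example: the reduction agrees with a direct computation.
M = [[1, 2, 3], [4, 5, 6], [7, 8, 9]]
assert permanent_via_self_reduction(M) == permanent_mod_p(M)
```

Interpolation is done here with the Lagrange formula, using Fermat's little theorem for modular inverses; any method of interpolating a degree-n polynomial from n + 1 points over Fp would serve equally well.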

* Quadratic residuosity problem
* Inverse-RSA

Consequences

* If an NP-complete problem were non-adaptively randomly self-reducible, the polynomial hierarchy would collapse to Σ3; since such a collapse is considered unlikely, non-adaptive uniform random self-reductions for NP-complete problems are believed not to exist.
* If a coNP-hard problem is randomly self-reducible in O(log n/n), then Σ2 = Π2.

References

* M. Abadi, J. Feigenbaum, and J. Kilian, "On Hiding Information from an Oracle", J. Comput. System Sci., 1989
* S. Arora, "Around the PCP Theorem", 1996
* J. Feigenbaum, L. Fortnow, "On the Random-self-reducibility of Complete Sets", 1991

