
The decidability question, i.e., whether any mathematical statement can be computationally proven true or false, was raised by Hilbert and remained open until Turing answered it in the negative. Most efforts in theoretical computer science then turned to complexity theory and the need to classify decidable problems according to their difficulty. Among others, the classes P and NP are introduced.

The English mathematician Alan Mathison Turing (1912-1954) is well-known for his key role in the development of computer science, but he also made important contributions to the foundations of mathematics, numerical analysis, cryptography, quantum computing and biology. In the present paper we focus exclusively on the influence of Turing on computability and complexity theory.

Turing was an important player in the settlement of the decidability issue, which led naturally to the study of the complexity of decidable problems: once it was proved that there are problems unsolvable by a computer, research turned to the question of how long it takes to solve the problems that are solvable. This is an exciting research field with many open questions. In this paper we give a gentle introduction, from a historical perspective, to the search for efficient computation and its limitations.

Gottfried W. Leibniz (1646-1716) was an important mathematician, philosopher, jurist and inventor, whose dream was to devise an automatic reasoning machine, based on an alphabet of unambiguous symbols manipulated by mechanical rules, which could formalize any consistent linguistic or mathematical system. He produced a calculus ratiocinator, an algebra to specify the rules for manipulating logical concepts. More than a century later, George Boole (1815-1864) and Augustus De Morgan (1806-1871) converted the loose ideas of Leibniz into a formal system, Boolean algebra, from which Gottlob Frege (1848-1925) finally produced a fully developed system of predicate logic.

Frege set out to prove that all the laws of arithmetic could be logically deduced from a set of axioms. He wrote a two-volume book giving the formal development of the foundations of arithmetic, and when the second volume was already in print, Frege received the celebrated letter from Bertrand Russell (1872-1970) showing that his theory was inconsistent. The counterexample was the paradox of the set of all sets that are not members of themselves: Russell's paradox.

Frege's work was the spark for 30 years of research on the foundations of mathematics, spanning the end of the 19th century and the beginning of the 20th.

To better frame further developments, let us recall some notions. A formal system consists of a language of symbols, a set of axioms and a set of inference rules for deriving new true statements.

A system is said to be consistent if no contradiction can be derived within it, and complete if every true statement expressible in the system can be proved within it.

In 1928, at the International Congress of Mathematicians in Bologna, David Hilbert (1862-1943), a leading figure in the formalist school, presented to his fellow mathematicians three problems, the second of which would have a big impact on the development of computer science: the Entscheidungsproblem, or decision problem: is there an algorithm that, given a mathematical statement, decides whether or not it is provable?

In 1936 Turing completed the draft of his paper "On Computable Numbers, with an Application to the Entscheidungsproblem", where he proposed a formal model of computation, the Turing machine (TM), and showed that there exists a universal Turing machine (UTM) able to simulate any other TM.

In his model, Turing introduced two essential assumptions: the discretization of time and the discretization of the state of mind of a human calculator. A TM consists of an infinite tape divided into squares, a finite input over a finite alphabet, and a read-write head that can be in a finite number of states representing those of the human calculator's brain (Davis).
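To make the model concrete, here is a minimal sketch of a TM simulator; the transition-table encoding and the example machine are illustrative choices of ours, not taken from Turing's paper.

```python
# Minimal Turing machine simulator: a tape of squares, a read-write head,
# and a finite transition table (state, symbol) -> (new state, write, move).

def run_tm(transitions, tape, state="q0", blank="_", max_steps=1000):
    """Run the machine until no rule applies (halt) or max_steps is reached.
    move is -1 (left) or +1 (right). Returns the final tape contents."""
    cells = dict(enumerate(tape))
    head = 0
    for _ in range(max_steps):
        symbol = cells.get(head, blank)
        if (state, symbol) not in transitions:  # no applicable rule: halt
            break
        state, write, move = transitions[(state, symbol)]
        cells[head] = write
        head += move
    lo, hi = min(cells), max(cells)
    return "".join(cells.get(i, blank) for i in range(lo, hi + 1)).strip(blank)

# Example machine: flip every bit of the input, halting at the first blank.
flip = {("q0", "0"): ("q0", "1", +1),
        ("q0", "1"): ("q0", "0", +1)}
print(run_tm(flip, "1011"))  # -> 0100
```

Despite its simplicity, this model captures (by the Church-Turing thesis) everything any digital computer can compute.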

Relying on his UTM, Turing made a decisive contribution to computability, namely he produced a problem that is undecidable: the halting problem, i.e., deciding whether a given TM will eventually halt on a given input.

Turing's PhD thesis extended Gödel's incompleteness theorem by proving that when new axioms are added to an incomplete formal system, the system remains incomplete (Turing, 1939). In the same work, Turing introduced the oracle machine: a TM endowed with a black box that answers membership queries about a fixed set, even an undecidable one.

Emil Post (1897-1954) realized that Turing's oracle machine had important implications for computability, as it could be used to compare the relative difficulty of problems; thus he defined the concept of Turing-reducibility (T-reducibility). Given problems A and B, A is T-reducible to B if A can be solved by a TM with access to an oracle for B; intuitively, A is then no harder to solve than B.

We have seen that the work of Turing and others lets us ask, for any given problem, whether there is an algorithm to solve it (i.e., whether the problem is decidable). Let us now turn to the realm of solvable problems and ask how long it takes to solve a given decidable problem. This may seem a purely technological issue; however, we will see that there are problems for which today's computers would take more than the age of the universe to find a solution, even for moderately sized inputs.
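The "age of the universe" claim is easy to check with a back-of-the-envelope calculation; the machine speed below (10⁹ steps per second) is our own illustrative assumption.

```python
# How long would an exponential-time (2**n steps) algorithm run?
OPS_PER_SECOND = 1e9                 # assumed machine speed
SECONDS_PER_YEAR = 3600 * 24 * 365
AGE_OF_UNIVERSE_YEARS = 1.4e10       # ~13.8 billion years

def years_for(steps):
    """Wall-clock years to execute the given number of steps."""
    return steps / OPS_PER_SECOND / SECONDS_PER_YEAR

print(years_for(2**60))   # about 36 years
print(years_for(2**100))  # about 4e13 years
print(years_for(2**100) > AGE_OF_UNIVERSE_YEARS)  # True
```

So an input of size a mere n = 100 already puts a 2ⁿ-step algorithm thousands of times beyond the age of the universe, no matter how technology improves the constant.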

The time complexity of an algorithm is the number of steps it performs as a function of its input size n, usually measured in the worst case over all inputs of that size.

For decidable problems, the growth rate of this time complexity expression has a tremendous impact on the practical running time of an algorithm.

Let us try to get a feeling for the time-complexity of some problems. Given an n x n matrix A, its determinant is defined as

det(A) = Σ_{σ ∈ S_n} sgn(σ) · a_{1,σ(1)} · a_{2,σ(2)} · ... · a_{n,σ(n)},

where S_n is the set of all n! permutations of {1, ..., n} and sgn(σ) is the sign of the permutation σ. For example, if

A = ( 1 2 ; 3 4 ),

then det(A) = 1·4 − 2·3 = −2.

A direct implementation of the definition yields an algorithm performing on the order of n! operations, whereas Gaussian elimination computes the same determinant in O(n³) steps.
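The contrast between the two approaches can be seen in code; the sketch below implements both the permutation definition and Gaussian elimination.

```python
# Two ways to compute a determinant: the O(n!) definition vs O(n^3) elimination.
from itertools import permutations

def det_by_definition(A):
    """Sum over all n! permutations (the Leibniz formula)."""
    n = len(A)
    total = 0
    for perm in permutations(range(n)):
        # sign of the permutation = (-1)^(number of inversions)
        inversions = sum(1 for i in range(n) for j in range(i + 1, n)
                         if perm[i] > perm[j])
        prod = 1
        for i in range(n):
            prod *= A[i][perm[i]]
        total += (-1) ** inversions * prod
    return total

def det_by_elimination(A):
    """Gaussian elimination with partial pivoting: O(n^3) steps."""
    A = [row[:] for row in A]          # work on a copy
    n, det = len(A), 1.0
    for col in range(n):
        pivot = max(range(col, n), key=lambda r: abs(A[r][col]))
        if abs(A[pivot][col]) < 1e-12:  # singular matrix
            return 0.0
        if pivot != col:                # row swap flips the sign
            A[col], A[pivot] = A[pivot], A[col]
            det = -det
        det *= A[col][col]
        for r in range(col + 1, n):
            f = A[r][col] / A[col][col]
            for c in range(col, n):
                A[r][c] -= f * A[col][c]
    return det

M = [[1, 2], [3, 4]]
print(det_by_definition(M))             # -2
print(round(det_by_elimination(M), 6))  # -2.0
```

Already at n = 20 the first routine would need about 20! ≈ 2.4·10¹⁸ products, while the second needs a few thousand arithmetic operations.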

A problem is said to be feasible (tractable) if it can be solved by an algorithm whose running time is bounded by a polynomial in the size of the input; the class of feasible decision problems is denoted by P.

A particularly interesting kind of problems are those for which no efficient solving algorithm is known, but for which a candidate solution, once found, can be verified efficiently.

For example, consider the problem of factoring a large integer: no efficient algorithm is known to find the factors, yet a claimed factorization can be checked with a few multiplications.

In the early 1950's, some mathematicians started to realize that some decidable problems take too long to solve for large inputs. In a series of letters to the US National Security Agency, John Nash proposed a secure cryptographic encryption system based on the computational difficulty of problems (Nisan).

The second important historical event for the study of problem complexity was the letter Kurt Gödel sent to John von Neumann in March 1956. At the time, von Neumann was dying and he did not read the letter, which was lost until the 1980's. An English translation of the letter can be found in the appendix of Lipton's book (Lipton, 2010).

In his letter, Gödel considered the problem of deciding, given a first-order formula and an integer n, whether the formula has a proof of length at most n. He asked von Neumann whether this problem could be solved in time proportional to n or n²; in modern terms, he was anticipating the P versus NP question.

During the 1960's, the existence of easy and hard problems became more obvious to researchers. As computers were solving problems of increasing input size, researchers began to consider the effect of input size on the computation time (number of steps) and space (number of cells visited), using the Turing machine as the computational model. An important contribution was the 1965 paper by Hartmanis and Stearns (1965), which laid the foundations of computational complexity theory by classifying problems according to the time needed to solve them as a function of input size.

Consider the problem of the satisfiability of Boolean formulas (SAT): given a Boolean formula in conjunctive normal form (CNF), decide whether there is an assignment of truth values to its variables making the formula true. As an illustrative instance, take the formula (x1 ∨ ¬x2) ∧ (x3 ∨ x4) ∧ (¬x1 ∨ ¬x3).

A backtracking deterministic algorithm would search through assignments until finding one satisfying the formula. In a nondeterministic algorithm, a witness assignment A (for example, A(x1) = A(x4) = T, A(x2) = A(x3) = F) would be supplied and then verified in time linear in the size of the formula.
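The asymmetry between verifying a witness and searching for one can be sketched as follows; the clause encoding (a literal is a pair of variable name and polarity) is our own illustrative choice.

```python
# SAT: polynomial-time verification vs exponential-time backtracking search.
# A formula is a list of clauses; a clause is a list of (variable, is_positive).

def satisfies(formula, assignment):
    """Polynomial-time check that the assignment satisfies every clause."""
    return all(any(assignment[v] == pos for v, pos in clause)
               for clause in formula)

def backtrack(formula, variables, assignment=None):
    """Deterministic search: try both values for each variable in turn."""
    assignment = assignment or {}
    if len(assignment) == len(variables):
        return assignment if satisfies(formula, assignment) else None
    var = variables[len(assignment)]
    for value in (True, False):
        result = backtrack(formula, variables, {**assignment, var: value})
        if result is not None:
            return result
    return None

# (x1 v -x2) ^ (x3 v x4) ^ (-x1 v -x3)
phi = [[("x1", True), ("x2", False)],
       [("x3", True), ("x4", True)],
       [("x1", False), ("x3", False)]]

witness = {"x1": True, "x2": False, "x3": False, "x4": True}
print(satisfies(phi, witness))                   # True: verified in one pass
print(backtrack(phi, ["x1", "x2", "x3", "x4"]))  # a satisfying assignment
```

Verification visits each literal once; the backtracking search may explore up to 2ⁿ assignments in the worst case.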

In 1971 Stephen Cook showed that the SAT problem is a paradigmatic unfeasible problem, in the sense that any problem whose solutions can be verified in polynomial time can be reduced in polynomial time to SAT (Cook, 1971).

One attendee who understood perfectly the meaning of Cook's result was Richard Karp, who the following year published a seminal paper formally defining the classes P and NP and exhibiting 21 well-known combinatorial problems that are NP-complete (Karp, 1972).

We next give an intuitive description of the complexity classes. For a more technical description the reader is referred to any of the complexity or algorithmic books listed in the bibliography of this paper.

From the previous remarks on the SAT problem, it is clear that SAT ∈ NP: a satisfying assignment is a witness that can be verified in polynomial time. Note, however, that NP is a class of decision problems, while in practice one often wants to actually produce a solution.

To circumvent this difficulty, we consider the search version of a problem, in which the task is not merely to decide whether a solution exists but to output one when it does.

Continuing with the example, the search version of the SAT problem asks, given a CNF formula, to produce a satisfying assignment whenever the formula is satisfiable.

The search versions of optimization problems are obviously at least as hard as the corresponding decision versions.

Let us see some further examples of problems in NP.

The traveling salesman problem: given n cities and the distances between every pair of them, find a tour of minimum total length visiting each city exactly once.

The k-colorability problem: given a graph G, decide whether each vertex of G can be assigned one of k colors so that no two adjacent vertices receive the same color.

The problem of integer factorization: given a natural number N, find its decomposition into prime factors.

Finally, the graph isomorphism problem: given two graphs, decide whether there is a bijection between their vertex sets that preserves adjacency.

Notice that P ⊆ NP, since a problem solvable in polynomial time can in particular be verified in polynomial time.

The problem of deciding whether P = NP is arguably the most important open question in computer science, and one of the seven Millennium Prize Problems.

The answer is widely conjectured to be negative, i.e., that P ≠ NP.

To further study the relationship between search problems and the classes P and NP, we need the concept of polynomial-time reduction: a problem A reduces to a problem B if every instance of A can be transformed in polynomial time into an instance of B, in such a way that a solution to the transformed instance yields a solution to the original one.

In other words, if A reduces to B, then any efficient algorithm for B can be used as a subroutine to solve A efficiently; B is at least as hard as A.

A search problem is NP-complete if it belongs to NP and every problem in NP reduces to it; NP-complete problems are thus the hardest problems in NP.

A useful property of reductions is that they are transitive. This property, together with the previous remark about reductions as problem solvers, tells us that the class of NP-complete problems stands or falls as a whole: a polynomial-time algorithm for any one of them would yield polynomial-time algorithms for all problems in NP.

It is known that if P ≠ NP, then there exist problems in NP that are neither in P nor NP-complete (Ladner's theorem).

It is beyond the scope of this paper to show detailed reductions between NP-complete problems.

As already mentioned, SAT was the first problem shown to be NP-complete; since then, thousands of problems from all fields of science and engineering have been proved NP-complete by means of reductions.
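As a taste of what a reduction looks like, the sketch below encodes graph 3-colorability as a SAT instance using the standard clause encoding; the data representation is our own illustrative choice.

```python
# Polynomial-time reduction: graph 3-coloring -> SAT.
# The propositional variable (v, c) means "vertex v receives color c".
# A clause is a list of literals ((variable, polarity) pairs).

def coloring_to_sat(vertices, edges, colors=(0, 1, 2)):
    clauses = []
    for v in vertices:
        # each vertex gets at least one color
        clauses.append([((v, c), True) for c in colors])
        # and at most one color
        for c1 in colors:
            for c2 in colors:
                if c1 < c2:
                    clauses.append([((v, c1), False), ((v, c2), False)])
    for u, v in edges:
        # adjacent vertices must get different colors
        for c in colors:
            clauses.append([((u, c), False), ((v, c), False)])
    return clauses

# A triangle: 3-colorable, so the resulting CNF is satisfiable.
cnf = coloring_to_sat(["a", "b", "c"], [("a", "b"), ("b", "c"), ("a", "c")])
print(len(cnf))  # 3*(1+3) + 3*3 = 21 clauses
```

The transformation produces O(|V| + |E|) clauses, so it runs in polynomial time, and the graph is 3-colorable exactly when the formula is satisfiable.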

Consider the 3-SAT problem, the restriction of SAT to CNF formulas with exactly three literals per clause; 3-SAT remains NP-complete.

The following example is taken from Moore and Mertens (2011).

The class

This problem is

where

As a final example, we would like to present evidence of the influence of the P versus NP question beyond computer science.

What can be done when having to deal with an NP-complete problem? A first step is to understand how hard the typical instances are, as opposed to the worst-case ones. For 3-SAT this is studied on random formulas: to generate a random 3-SAT formula with m clauses over n variables, one chooses, for each clause, three distinct variables uniformly at random,

and then goes over the variables in each clause, negating each with probability 1/2. The density of the resulting formula is the ratio α = m/n of clauses to variables.

For example, the random 3-SAT formula

(x1 ∨ ¬x2 ∨ x3) ∧ (¬x1 ∨ x2 ∨ x4) ∧ (x2 ∨ ¬x3 ∨ ¬x4)

has density α = 3/4, as it contains m = 3 clauses over n = 4 variables.

Mitchell, Selman, and Levesque (1992) observed empirically that random 3-SAT instances undergo a phase transition: formulas of low density are almost surely satisfiable, formulas of high density are almost surely unsatisfiable, and the computationally hardest instances concentrate around the transition.

In 2002, using non-rigorous techniques from statistical physics (the replica method) on very large instances of 3-SAT, Mézard, Parisi and Zecchina (2002) located the satisfiability threshold at density α ≈ 4.27.

The fact that some instances of NP-complete problems are easy in practice does not contradict worst-case hardness; it shows only that the hard instances may be rare or concentrated in specific regions of the instance space.

Another practical alternative is using approximation algorithms. An algorithm is said to approximate a problem within a factor c if, for every instance, it produces a solution whose value is guaranteed to be within a factor c of the optimum.

Almost since the beginning of the development of complexity theory, there has been a parallel effort to develop approximation algorithms for NP-complete optimization problems: for some of them good approximation factors are achievable, while for others approximating beyond a certain factor is itself NP-hard.
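As an illustration (our example, not one discussed in the text), here is the classic factor-2 approximation for Vertex Cover, an NP-complete problem: repeatedly take an uncovered edge and add both of its endpoints.

```python
# Greedy 2-approximation for Vertex Cover: any maximal set of disjoint edges
# forces at least one endpoint each into the optimum, so taking BOTH endpoints
# yields a cover at most twice the optimum size.

def vertex_cover_2approx(edges):
    cover = set()
    for u, v in edges:
        if u not in cover and v not in cover:  # edge still uncovered
            cover.update((u, v))
    return cover

edges = [(1, 2), (1, 3), (2, 4), (3, 4), (4, 5)]
cover = vertex_cover_2approx(edges)
# Every edge has an endpoint in the cover:
print(all(u in cover or v in cover for u, v in edges))  # True
```

On this instance the optimum cover {1, 4} has size 2 and the algorithm returns a cover of size 4, matching the factor-2 guarantee exactly.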

Parallelism is another powerful tool, but it can only speed up computation by a constant factor. Notice that anything that can be done with 10 processors in time t can be done with one processor in time 10t, so parallelism cannot turn an exponential-time computation into a polynomial-time one.

In the same way, a result by Impagliazzo and Wigderson (1997) gives strong evidence that randomization does not provide more than a polynomial speed-up: under plausible hardness assumptions, every randomized polynomial-time algorithm can be derandomized, i.e., P = BPP.

At the present time, quantum computers are not an existing reality beyond small prototypes, although Shor's algorithm shows that a large quantum computer would factor integers in polynomial time; however, quantum computers are not believed to be able to solve NP-complete problems efficiently.

A fascinating line of research in theoretical computer science started in the mid 1980's, which led to fruitful results, namely the theory of interactive and zero-knowledge proofs.

The basic idea is how one researcher, the prover P, can convince another, the verifier V, that a statement is true without revealing any information beyond the validity of the statement itself.

To grasp the concept of zero-knowledge proof (ZKP), let us start with the very simple illustrative example from Quisquater, Quisquater, Quisquater, Quisquater, Guillou, Guillou, Guillou, Guillou, Guillou, Guillou, and Berson (1989): a prover shows she knows the secret word opening a door deep inside a circular cave by repeatedly emerging from whichever branch the verifier requests, without ever uttering the word.

ZKP are a particular case of the more general interactive proof systems, in which an all-powerful prover P exchanges messages with a probabilistic polynomial-time verifier V, trying to convince V that a given statement is true.

Let us describe a more interesting example. Consider a given graph G that P claims to be 3-colorable. In each round, P randomly permutes the three colors of a valid coloring and hides the color of every vertex; V then picks one edge of G at random and asks P to reveal the colors of its two endpoints, which must be different. If G is not 3-colorable, every purported coloring leaves some monochromatic edge, so each round exposes a cheating P with positive probability; repeating the protocol for many rounds convinces V with high probability, while the fresh random permutation in each round ensures that V learns nothing about the coloring itself.

Technically, the way P shows the colors to V is by means of a cryptographic commitment scheme: P commits to the colors before V chooses the edge, so that P cannot change them afterwards, yet the unopened colors remain hidden.
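One round of this protocol can be simulated as below; the hash-based commitment is a toy stand-in for a real cryptographic scheme, and all names are our own.

```python
# Toy simulation of one round of the zero-knowledge proof for 3-coloring.
import hashlib
import random

def commit(value, nonce):
    """Hash-based commitment (illustrative, not cryptographically strong)."""
    return hashlib.sha256(f"{value}:{nonce}".encode()).hexdigest()

def zk_round(graph_edges, coloring, rng=random):
    # P permutes the three colors and commits to every vertex's color.
    perm = rng.sample([0, 1, 2], 3)
    permuted = {v: perm[c] for v, c in coloring.items()}
    nonces = {v: rng.random() for v in permuted}
    commitments = {v: commit(permuted[v], nonces[v]) for v in permuted}
    # V picks a random edge; P opens only its two endpoints.
    u, v = rng.choice(graph_edges)
    opened = {w: (permuted[w], nonces[w]) for w in (u, v)}
    # V checks the openings match the commitments and the colors differ.
    ok = all(commit(c, n) == commitments[w] for w, (c, n) in opened.items())
    return ok and opened[u][0] != opened[v][0]

triangle = [("a", "b"), ("b", "c"), ("a", "c")]
good_coloring = {"a": 0, "b": 1, "c": 2}
print(all(zk_round(triangle, good_coloring) for _ in range(50)))  # True
```

With a valid coloring every round passes, while a monochromatic "coloring" fails each round, so repetition drives a cheater's success probability to zero.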

By applying reductions to the 3-colorability problem, it was shown in Goldreich, Micali and Wigderson (1991) that, assuming one-way functions exist, every problem in NP has a zero-knowledge proof.

The culmination of interactive proof systems research was one of the most beautiful and deep theorems in computer science, the PCP theorem.

The rough idea of probabilistically checkable proof systems is: "Given a conventional mathematical proof, in which a mistake could be hidden in any equation, transform the proof in such a way that the mistake is spread almost everywhere". This kind of proof is denoted a probabilistically checkable proof (PCP): a verifier can detect an erroneous proof with high probability by reading only a constant number of randomly chosen bits of it.

The truncated Entscheidungsproblem asks, given a mathematical statement and an integer n, whether the statement has a formal proof of length at most n; this is precisely the problem Gödel posed in his letter, and it is NP-complete.

Contrary to Hilbert's Entscheidungsproblem, it remains an open problem to decide whether the truncated Entscheidungsproblem is feasible, i.e., it remains open to decide whether P = NP.

There are at least 54 existing bogus proofs of the P versus NP question.

The aim of this manuscript was not to survey the complexity field or the status and future of the P versus NP question, but to give a historical account of how Turing's work led from computability to complexity theory.

Complexity theory has studied different models of computation (Turing machines with a bounded number of steps, Boolean circuits, quantum algorithms, randomized algorithms), the complexity of different resources (space, number of iterations, depth of circuits), and several measures of complexity (worst case, average, smoothed). All this work has created a whole cosmos of about 500 complexity classes (Aaronson, Kuperberg and Granade). For most of them, strict relations are not known and, in the end, the main issue boils down to the basic intuition expressed by Kurt Gödel in his 1956 letter.

We tried to convey the idea that this is one of the deepest questions of today's science, affecting not only computer science, mathematics and physics, but also biology, sociology, economics and most human activities. We see the broad reach of this question as a direct consequence of Turing's way of looking at the world through the algorithmic lens.

In parallel to Turing, A. Church also gave a negative answer to the Entscheidungsproblem using a logical formalism, the λ-calculus.

The complexity is expressed in asymptotic notation, i.e., for very large values of the input size. The notation O(f(n)) denotes the set of functions that, up to a constant factor, grow no faster than f(n).

It may happen that there is no feasible solution for input x, in which case the search algorithm must report that no solution exists.

The correct word is "recognized", as the problems are posed as recognition problems, i.e., determining whether a word belongs to a language over a finite alphabet.

A positive answer would render all transactions done using RSA insecure.

In a large part of the scientific bibliography, the prover and the verifier are respectively named Merlin and Arthur.

The authors frame their explanation in the Arabic tale "Ali Baba and the Forty Thieves" from the classic "One Thousand and One Nights".

The conference version of Goldwasser, Micali and Rackoff (1989) appeared in 1985.