John Nash, Cryptography, and Computational Complexity


Recently, the brilliant mathematician John Nash and his wife were killed in a car crash. While Nash was probably most famous for his pioneering work on game theory and his portrayal in Ron Howard’s popular film A Beautiful Mind, he also worked in the field of cryptography. Several years ago, the NSA declassified a series of letters from 1955 between Nash and the agency in which Nash describes some ideas for an encryption/decryption scheme. While the NSA was not interested in the particular scheme Nash devised, it seems clear that Nash foresaw the importance of computational complexity in cryptography. In the letters, Nash states:

Consider the enciphering process with a finite “key,” operating on binary messages. Specifically, we can assume the process [is] described by a function
\[
y_i = F(\alpha_1, \alpha_2, \ldots, \alpha_r; x_i, x_{i-1}, \ldots, x_{i-n})
\]
where the \(\alpha\)’s, \(x\)’s and \(y\)’s are mod 2 and if \(x_i\) is changed with the other \(x\)’s and \(\alpha\)’s left fixed then \(y_i\) is changed. The \(\alpha\)’s denote the “key” containing \(r\) bits of information. \(n\) is the maximum span of the “memory” of the process…

…We see immediately that in principle the enemy needs very little information to break down the process. Essentially, as soon as \(r\) bits of enciphered message have been transmitted the key is about determined. This is no security, for a practical key should not be too long. But this does not consider how easy or difficult it is for the enemy to make the computation determining the key. If this computation, although always possible in principle, were sufficiently long at best the process could still be secure in a practical sense.
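In modern terms, Nash is describing a key-dependent, finite-memory (stream-cipher-like) enciphering function. The following Python sketch is only a toy illustration of such a process; the particular mixing function and the names encipher, key_bits, and memory_span are my own choices for illustration, not anything taken from Nash's letters.

```python
def encipher(key_bits, plaintext_bits, memory_span):
    """Toy finite-memory enciphering process in the spirit of Nash's
    description: each output bit y_i depends on the key bits (the alphas)
    and on the plaintext bits x_i, x_{i-1}, ..., x_{i-n}, all mod 2."""
    r = len(key_bits)
    ciphertext = []
    for i, x_i in enumerate(plaintext_bits):
        # Window of the previous plaintext bits feeding the i-th output;
        # positions before the start of the message are treated as 0.
        window = [plaintext_bits[j] if j >= 0 else 0
                  for j in range(i - 1, i - memory_span - 1, -1)]
        # An arbitrary mixing function F, chosen only so that flipping x_i
        # (with the key and the other x's held fixed) always flips y_i,
        # as Nash requires.
        acc = x_i ^ key_bits[i % r]
        for k, w in enumerate(window):
            acc ^= key_bits[(i + k + 1) % r] & w
        ciphertext.append(acc)
    return ciphertext
```

Enciphering with a known key is a single linear pass over the message; the question Nash raises next is how much work it takes an enemy to recover the key from observed traffic.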

Nash goes on to say that

…a logical way to classify the enciphering process is the way in which the computation length for the computation on the key increases with increasing length of the key… Now my general conjecture is as follows: For almost all sufficiently complex types of enciphering…the mean key computation length increases exponentially with the length of the key.

The significance of this general conjecture, assuming its truth, is easy to see. It means that it is quite feasible to design ciphers that are effectively unbreakable.

To my knowledge, Nash’s letter is the earliest reference to using computational complexity to achieve practical cryptography. The idea is that while it is theoretically possible to decrypt an encrypted message without the key, doing so requires a prohibitive amount of computational resources. Interestingly, Nash’s letter predates an (in)famous 1956 letter from Kurt Gödel to John von Neumann which is widely credited as the first reference to the “P vs NP problem.” However, the essential idea of the P vs NP problem is already nascent in Nash’s conjecture: there are problems whose solutions can be verified efficiently, but for which finding a solution is computationally intractable. Specifically, a message can easily be decrypted if one knows the key, but finding the key needed to decrypt the message is hopelessly difficult.
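To make this asymmetry concrete, here is a brute-force key search against the toy encipher function sketched above (again, an illustrative stand-in rather than Nash's scheme): checking one candidate key against a known plaintext/ciphertext pair costs a single pass over the message, but there are \(2^r\) candidates to check.

```python
from itertools import product

def brute_force_key(plaintext_bits, ciphertext_bits, r, memory_span):
    """Exhaustive key recovery: try all 2**r key settings and keep those
    consistent with a known plaintext/ciphertext pair. The work grows
    exponentially in r, while enciphering with a known key is linear in
    the message length."""
    candidates = []
    for key in product([0, 1], repeat=r):
        if encipher(list(key), plaintext_bits, memory_span) == ciphertext_bits:
            candidates.append(list(key))
    return candidates

# Example: an 8-bit key and a short known message.
key = [1, 0, 1, 1, 0, 0, 1, 0]
msg = [1, 0, 1, 1, 1, 0, 0, 1, 0, 1]
ct = encipher(key, msg, memory_span=3)
print(brute_force_key(msg, ct, r=8, memory_span=3))  # contains the true key
```

Of course, exhaustive search is only an upper bound on the attacker’s work; Nash’s conjecture is that for “almost all sufficiently complex” enciphering processes, no shortcut reduces this cost to something far below exponential in the key length.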

The P vs NP problem was only formalized 16 years after Nash’s letters, by Stephen Cook in his seminal 1971 paper, The complexity of theorem-proving procedures. In fact, Nash’s conjecture is strictly stronger than the P vs NP problem; its formulation is more akin to the exponential time hypothesis, which was not formulated until 1999!

Nash was certainly aware of the difficulty of proving his conjecture:

The nature of this conjecture is such that I cannot prove it, even for a special type of cipher. Nor do I expect it to be proven. But this does not destroy its significance. The probability of the truth of the conjecture can be guessed at on the basis of experience with enciphering and deciphering.

Indeed, the P vs NP problem remains among the most notorious open problems in mathematics and theoretical computer science.

PDF of Nash’s Letters to the NSA.
