Category Archives: musings

computer science, expository, math, musings

John Nash, Cryptography, and Computational Complexity


Recently, the brilliant mathematician John Nash and his wife were killed in a car crash. While Nash was probably most famous for his pioneering work on game theory and his portrayal in Ron Howard’s popular film A Beautiful Mind, he also worked in the field of cryptography. Several years ago, the NSA declassified a series of letters from 1955 between Nash and the agency in which Nash describes some ideas for an encryption/decryption scheme. While the NSA was not interested in the particular scheme Nash devised, it seems that Nash foresaw the importance of computational complexity in the field of cryptography. In the letters, Nash states:

Consider the enciphering process with a finite “key,” operating on binary messages. Specifically, we can assume the process [is] described by a function
\[
y_i = F(\alpha_1, \alpha_2, \ldots, \alpha_r; x_i, x_{i-1}, \ldots, x_{i-n})
\]
where the \(\alpha\)’s, \(x\)’s and \(y\)’s are mod 2 and if \(x_i\) is changed with the other \(x\)’s and \(\alpha\)’s left fixed then \(y_i\) is changed. The \(\alpha\)’s denote the “key” containing \(r\) bits of information. \(n\) is the maximum span of the “memory” of the process…

…We see immediately that in principle the enemy needs very little information to break down the process. Essentially, as soon as \(r\) bits of enciphered message have been transmitted the key is about determined. This is no security, for a practical key should not be too long. But this does not consider how easy or difficult it is for the enemy to make the computation determining the key. If this computation, although always possible in principle, were sufficiently long at best the process could still be secure in a practical sense.

Nash goes on to say that

…a logical way to classify the enciphering process is the way in which the computation length for the computation on the key increases with increasing length of the key… Now my general conjecture is as follows: For almost all sufficiently complex types of enciphering…the mean key computation length increases exponentially with the length of the key.

The significance of this general conjecture, assuming its truth, is easy to see. It means that it is quite feasible to design ciphers that are effectively unbreakable.

To my knowledge, Nash’s letter is the earliest reference to using computational complexity to achieve practical cryptography. The idea is that while it is theoretically possible to decrypt an encrypted message without the key, doing so requires a prohibitive amount of computational resources. Interestingly, Nash’s letter predates an (in)famous 1956 letter from Kurt Gödel to John von Neumann which is widely credited as being the first reference to the “P vs NP problem.” However, the essential idea of the P vs NP problem is already nascent in Nash’s conjecture: there are problems whose solutions can be efficiently verified, yet finding such a solution is computationally intractable. Specifically, a message can easily be decrypted if one knows the key, but finding the key to decrypt the message is hopelessly difficult.
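To make this asymmetry concrete, here is a minimal Python sketch. It is my own toy stream cipher loosely in the spirit of Nash’s model (an \(r\)-bit key driving a shift-register keystream), not Nash’s actual scheme: decrypting with the key takes a single pass, while a known-plaintext attacker who brute-forces the key must try up to \(2^r\) candidates.

```python
import itertools

def keystream(key_bits, length):
    """Toy shift-register keystream seeded by the key (illustrative only, not secure)."""
    state = list(key_bits)
    out = []
    for _ in range(length):
        out.append(state[0])
        feedback = state[0] ^ state[-1]      # simple feedback tap
        state = state[1:] + [feedback]       # shift the register
    return out

def encrypt(key_bits, message_bits):
    """XOR the message with the keystream; bits are 0/1, as in Nash's mod-2 setting."""
    return [m ^ k for m, k in zip(message_bits, keystream(key_bits, len(message_bits)))]

decrypt = encrypt  # XOR is its own inverse, so decryption with the key is immediate

def brute_force_key(ciphertext, known_plaintext, r):
    """Exhaustive key search: up to 2**r trials, exponential in the key length r."""
    for guess in itertools.product([0, 1], repeat=r):
        if encrypt(list(guess), known_plaintext) == ciphertext:
            return list(guess)
    return None

key = [1, 0, 1, 1, 0, 1, 0, 0]                       # r = 8 key bits
message = [1, 0, 0, 1, 1, 1, 0, 1, 0, 1, 1, 0]
ciphertext = encrypt(key, message)
assert decrypt(key, ciphertext) == message            # easy with the key in hand
recovered = brute_force_key(ciphertext, message, len(key))
assert encrypt(recovered, message) == ciphertext      # found only by exhaustive search
```

With an 8-bit key the search finishes instantly, but each additional key bit doubles the work, which is exactly the exponential growth in “key computation length” that Nash conjectured for well-designed ciphers.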

The P vs NP problem was only formalized 16 years after Nash’s letters, by Stephen Cook in his seminal paper, The complexity of theorem-proving procedures. In fact, Nash’s conjecture is strictly stronger than the P vs NP problem; its formulation is more akin to the exponential time hypothesis, which was itself only formulated in 1999!

Nash was certainly aware of the difficulty of proving his conjecture:

The nature of this conjecture is such that I cannot prove it, even for a special type of cipher. Nor do I expect it to be proven. But this does not destroy its significance. The probability of the truth of the conjecture can be guessed at on the basis of experience with enciphering and deciphering.

Indeed, the P vs NP problem remains among the most notorious open problems in mathematics and theoretical computer science.

PDF of Nash’s Letters to the NSA.

computer science, math, musings

Match Day, 2015

Yesterday, March 20th, was Match Day 2015, when graduating medical school students are matched with residency programs. The problem of matching residents to hospitals was one of the primary applications that motivated Gale and Shapley to formalize the stable marriage problem back in 1962. In fact, the National Residency Matching Program uses a variant of Gale and Shapley’s original algorithm which they describe in their seminal paper, College Admissions and the Stability of Marriage.
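For the curious, here is a bare-bones Python sketch of Gale and Shapley’s deferred acceptance procedure. It is only an illustration of the core algorithm with made-up residents and hospitals, not the NRMP’s actual implementation, which must also handle couples, program quotas, and other real-world constraints.

```python
def gale_shapley(proposer_prefs, receiver_prefs):
    """Proposer-optimal stable matching via deferred acceptance.

    Each argument maps an agent to an ordered preference list over the
    other side (most preferred first). Returns a dict {receiver: proposer}.
    """
    # rank[r][p] = position of proposer p on receiver r's list (lower is better)
    rank = {r: {p: i for i, p in enumerate(prefs)} for r, prefs in receiver_prefs.items()}
    free = list(proposer_prefs)                      # proposers without a tentative match
    next_choice = {p: 0 for p in proposer_prefs}     # next receiver each proposer will try
    match = {}                                       # receiver -> tentatively held proposer

    while free:
        p = free.pop()
        r = proposer_prefs[p][next_choice[p]]
        next_choice[p] += 1
        if r not in match:
            match[r] = p                             # receiver holds the first offer it gets
        elif rank[r][p] < rank[r][match[r]]:
            free.append(match[r])                    # receiver trades up; old partner re-enters
            match[r] = p
        else:
            free.append(p)                           # rejected; p proposes further down its list
    return match

# Hypothetical example: residents propose to hospitals.
residents = {"ana": ["mercy", "city", "hope"],
             "ben": ["city", "mercy", "hope"],
             "cal": ["mercy", "hope", "city"]}
hospitals = {"mercy": ["ben", "ana", "cal"],
             "city":  ["ana", "cal", "ben"],
             "hope":  ["cal", "ben", "ana"]}
print(gale_shapley(residents, hospitals))            # {'mercy': 'ana', 'city': 'ben', 'hope': 'cal'}
```

The choice of which side proposes matters: deferred acceptance yields the best stable outcome for the proposing side.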

Azure Gilman just published a wonderful article (full disclosure: Azure interviewed me for the article) on this year’s Match Day festivities at Columbia University. One thing I find fascinating about Match Day is that it exposes a very tangible impact that algorithm design can have on our lives. Typically in the computer science setting, we discuss only the computational complexity of an abstract problem, or the efficiency of an algorithm designed to solve a problem. But Match Day gives a vivid example of how algorithm design can shape the very trajectories of people’s lives.

computer science, expository, math, musings

The Aura of Interactive Proofs


In his essay The Work of Art in the Age of Mechanical Reproduction, Walter Benjamin introduces the idea that original artwork has an aura — some ineffable something that the work’s creator imbues into the work, but which is lost in reproductions made by mechanical means. There is something unique about an original work. Let us imagine that Walter is able to read the aura of a work of art, or sense its absence. Thus, he has the seemingly magical ability to tell original art from mechanical forgery.

Andy Warhol is skeptical of Walter’s claims. Andy doesn’t believe in the aura. And even if the aura does exist, he sincerely doubts that Walter is able to detect its presence. In an attempt to unmask Walter for the fraud he undoubtedly is, Andy hatches a cunning plan to catch Walter in a lie. By hand, he paints an original painting; then, using the most advanced technology available, he makes a perfect replica of the original. Although the replica looks exactly like the original to the layman (and even to Andy himself), according to Walter, there is something missing in the replica.

Andy’s original plan was to present Walter with the two seemingly identical paintings and simply ask which is the original. He soon realized, however, that this approach would be entirely unsatisfactory for his peace of mind. If Walter picked the true original, Andy still couldn’t be entirely convinced of Walter’s powers: maybe Walter just guessed and happened to guess correctly! How can Andy change his strategy in order to (1) be more likely to catch Walter in his lie (if in fact he is lying) and (2) be more convinced of Walter’s supernatural abilities if indeed he is telling the truth?
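One natural fix is to repeat a randomized challenge: in each round Andy privately flips a coin, shows Walter either the original or the replica accordingly, and asks which it is. If Walter genuinely reads auras he answers correctly every time, while a bluffing Walter survives \(k\) rounds with probability only \(2^{-k}\). The toy simulation below is my own sketch of this idea, under the simplifying assumption that Walter is either a perfect detector or a pure guesser.

```python
import random

def run_protocol(walter_is_genuine, rounds=20):
    """Andy's repeated random-challenge test. Returns True if Walter is never caught."""
    for _ in range(rounds):
        showing_original = random.choice([True, False])     # Andy's secret coin flip
        if walter_is_genuine:
            answer = showing_original                        # the aura gives it away
        else:
            answer = random.choice([True, False])            # no aura to read: guess
        if answer != showing_original:
            return False                                     # caught in a lie
    return True

print(run_protocol(walter_is_genuine=True))    # always True
print(run_protocol(walter_is_genuine=False))   # True only about 1 time in 2**20
```

After twenty correct answers Andy’s residual doubt is about one in a million; this amplification through repeated randomized challenges is the engine behind interactive proofs.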

read more »

math, musings

The Categorization of Set Theory

I just read an article called Rethinking Set Theory by Tom Leinster in the most recent issue of the American Mathematical Monthly. I found the article very interesting and appealing, so I thought I’d mention it here.

Set theory is the basis of the vast majority of classical mathematics. As such, most mathematicians use set theory every day in their work. Yet most mathematicians don’t appeal to the formal properties of set theory, i.e. its axioms, in their work. Rather, mathematicians tend to rely on a handful of working principles about sets. There is nothing wrong with this approach, as the working principles can be rigorously justified, but it reveals a (perhaps philosophical) disconnect between the theory and practice of set theory.

Leinster’s article in the Monthly proposes an alternate set of axioms for set theory. Instead of starting from the standard ZFC axioms, he suggests formulating axioms that hew more closely to the way in which sets are used by most mathematicians. The inspiration for Leinster’s axioms (which he attributes to F. W. Lawvere) comes from category theory. They place equal emphasis on the concepts of “set” and “function,” whereas in the traditional ZFC theory, the only formal objects are sets.
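To give a rough sense of the flavor (a paraphrase of one representative idea, not a quotation of Leinster’s axiom list): an element of a set \(X\) is identified with a function \(x\colon 1 \to X\) from a fixed one-element set \(1\), and one axiom then asserts that a function is determined by what it does to elements:
\[
\text{for all } f, g \colon X \to Y: \qquad \bigl(f \circ x = g \circ x \ \text{ for every } x \colon 1 \to X\bigr) \implies f = g.
\]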

I find Leinster’s approach very appealing. From a practical standpoint, I very much like that his axioms directly appeal to common set-theoretic applications. Philosophically, I also appreciate that the axioms describe the way in which sets interact (via functions) rather than the internal structure of sets (as ZFC does). This emphasis on external interaction over internal structure seems analogous to humankind’s position in the universe: the internal structure of the universe is a mystery, so we must learn what we can by interacting with phenomena as diverse as possible. Perhaps, then, we should learn to understand sets the same way.

musings

Now I can recommence my weekend plans

A perfectly addictive mobile game needs the right balance of conceptual simplicity, strategy, and randomness. 2048 hits this balance exactly.


math, musings

The Paradox of the Proof

The Paradox of the Proof is an essay by Caroline Chen about Shinichi Mochizuki’s supposed proof of the abc conjecture. The article is accessible and very well written. I especially like that the article touches on the social aspect of mathematics. A common conception about math is that mathematical statements are either true or false, and that their truth or falsity is somehow “objective.” However, Mochizuki’s work shows that this is not the complete story. Mochizuki claims to have a proof of the abc conjecture, but literally no one (except presumably Mochizuki himself) understands the proof. So the truth of the abc conjecture still seems to be in limbo. This gives credence to my belief that a mathematical proof is not merely a mechanical verification that a statement is true. Rather, a proof is a narrative about why a statement is true — if there is no consensus in the mathematical community that a proof is valid, it may as well not exist. Chen’s article does a wonderful job of conveying this tension between the austere and rigorous abstractions of mathematics and the social nature that gives the field life.

art, musings

Yet another reason I like living in Los Feliz

This showed up on an electrical box at the end of my street overnight:

Larry David Bowie

math, musings

The Pop Quiz Paradox

This is one of my all-time favorite paradoxes:

On Friday, a teacher announces to his class that there will be a pop quiz one day during the following week. In order to uphold the integrity of the quiz, it must satisfy the following two conditions:

  1. The quiz will be handed out at the beginning of class one day the following week (Monday through Friday).
  2. The students will not be able to (logically) deduce which day the quiz will be held before they are actually given the quiz.

After making this announcement, he notices a lot of murmuring around the class, and after a minute a few students start sniggering. He asks the students what they find so amusing. One of the cleverer students in the class raises her hand and explains:

“You cannot possibly give us a pop quiz next week,” she says.

“Why not?” the teacher asks.

“Well, if we won’t be able to deduce which day the quiz is before we are given the quiz, it can’t be on Friday. For if after class on Thursday we still haven’t gotten the quiz, we will know that the quiz must be the following day. This contradicts the second requirement for a pop quiz.”

“That makes sense,” he agrees.

“So the quiz can only be given on one of the days from Monday through Thursday. By the same argument as before, the quiz can’t be on Thursday: since we’ve shown it can’t be held on Friday, if we haven’t gotten the quiz by the end of class on Wednesday, we will know it must be on Thursday. By the same token, the quiz can’t be held on Wednesday or Tuesday. The only possible day you could give the quiz is Monday. But I know now that the pop quiz must be on Monday, which again contradicts the second condition for pop quizzes. So you can’t give us a pop quiz next week!”

The teacher tells the class that he doesn’t see any flaw in the student’s argument, so it appears that he can’t satisfy the requirements for a pop quiz. The students seem pleased to have a weekend free of study, having deduced that a pop quiz is an impossibility. Imagine the students’ surprise when the teacher hands out the pop quiz on Tuesday at the beginning of class, thus fulfilling the previous week’s proclamation!

Commentary

I first encountered this paradox in a collection of Martin Gardner’s writings entitled The Colossal Book of Mathematics, where it is referred to as “The unexpected hanging.” A more detailed account of this paradox can be found in this article by Timothy Chow. The subtlety of the paradox is suggested by the sheer number of references cited in Chow’s essay.

music, musings

Revel in Ravel’s Repetition

Bolero has been stuck in my head for the past couple days.

You can download a recording of Bolero here. Bolero was one of Maurice Ravel’s last compositions, and is of a very different nature from his earlier work. This Radiolab podcast postulates that Ravel suffered from frontotemporal dementia. The podcast draws a parallel between the end of Ravel’s life and the life of biologist-turned-painter Anne Adams. See some of her paintings here.

computer science, math, musings

Humans are Streaming Algorithms

I liked this quotation from Muthukrishnan’s book on data streams:

As human beings, we perceive each instant of our life through an array of sensory observations (visual, aural, nervous, etc). However, over the course of our life, we manage to abstract and store only part of the observations, and function adequately even if we can not recall every detail of each instant of our lives. We are biological data stream processing machines.

Way to tie the study of theoretical computer science to understanding the human condition!
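To make the analogy concrete, here is a minimal sketch of reservoir sampling, a classic streaming technique (my own illustration, not an excerpt from the book): it reads each item exactly once, keeps only a small fixed-size memory, and yet at every moment holds a uniform random sample of everything that has streamed past.

```python
import random

def reservoir_sample(stream, k):
    """Keep a uniform sample of k items from a stream of unknown length,
    using a single pass and O(k) memory (the classic "Algorithm R")."""
    reservoir = []
    for i, item in enumerate(stream):
        if i < k:
            reservoir.append(item)            # fill the reservoir with the first k items
        else:
            j = random.randrange(i + 1)       # item i replaces a sample with probability k/(i+1)
            if j < k:
                reservoir[j] = item
    return reservoir

# A lifetime of observations, remembered as a handful of moments.
print(reservoir_sample(range(10**6), k=5))
```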