Notes for remarks by Ronald L. Rivest
for ACM Turing Centenary Celebration
Jun 16, 2012 Palace Hotel, San Francisco, CA
-- First, I'd like to thank ACM for organizing this
wonderful event and for inviting me to participate in
this panel. Second, thanks to Vint for his kind
introduction.
-- In the few minutes I have, I'd like to cover just two
items:
-- The first relates to Turing's contributions to
cryptography (I have an unusual point to make
here).
-- The second relates to the unhappy marriage of
cryptography and computer security moving forward.
----------------------------------------------------------
-- Turing's contributions to cryptography are well-known.
That is to say, it is well known that he played a major
role in the breaking of the German Enigma cipher by the
Allies during World War II. This cryptanalytic triumph
certainly shortened the war, and may have been decisive
in the Allies' victory.
-- However, Turing didn't publish papers on cryptography,
and his contributions to the unclassified academic field
of cryptography are at best indirect.
It is known that Turing travelled to the U.S., and may
have talked with Claude Shannon while visiting
Bell Labs; Shannon's seminal 1949 paper on cryptography,
"Communication Theory of Secrecy Systems",
was perhaps stimulated by
discussions he had with Turing. (We don't know.)
-- Some of Turing's classified papers on cryptography are
now becoming declassified -- indeed two were declassified
just this year. Viewing these papers, I suspect that the
heart of Turing's World War II cryptographic writings
still remains classified.
In any case, these papers have had little impact on
academic cryptography.
-- But I would like to argue, briefly and perhaps provocatively,
that Turing's work on *artificial* *intelligence* has had
a major impact on modern cryptography (!).
How so?
-- Consider the following two questions:
When is a computer system *intelligent*?
When is a computer or cryptographic system *secure*?
These are subtle and slippery, but very important questions.
-- Turing's genius was to provide an "operational"
definition of intelligence, in the form of a game. (To
be precise, he wasn't trying to "define intelligence" but
to identify critical properties it should have that were
testable.)
-- His game asks whether an examiner can distinguish between
a human and a computer by asking questions.
(Last night I asked Siri if she was intelligent, and she
said, "I can't answer that." I asked "Why not", and she
said, "You see things, and you say, 'Why?' But I dream
things that never were, and I say, 'Why not?'") Turing
would have enjoyed playing with Siri!
-- The notion of an indistinguishability test as a means of
*defining* a difficult notion was a brilliant contribution
by Turing. The end result is that something *is* (defined
to be) intelligent if it is *indistinguishable* from
something accepted as being intelligent.
-- The same approach has now pervaded academic cryptography;
this began with the work of Goldwasser and Micali, and
has been extended by Blum, Yao, Bellare, Rogaway, and
many others.
-- For example, we now say that a bit-generator is
(pseudo)-random if it is indistinguishable from a random
source. Similarly, we ask if an encryption function is
indistinguishable from a family of randomly chosen
permutations between the message space and the ciphertext
space. And so on. If an adversary can't distinguish
between the ideal object and our real-world instantiation
of it, using a guessing game that is very reminiscent of
Turing's Test, then our instantiation is ``good enough''
for all practical and theoretical purposes.
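As an illustration only (not part of these remarks), the distinguishing game can be sketched in a few lines of Python. The weak generator below -- an LCG whose low-order bit strictly alternates -- is a hypothetical stand-in for any bad PRG; a trivial distinguisher then wins the game with advantage near 1:

```python
import random

def bad_prg(seed, n):
    """A deliberately weak 'PRG' (hypothetical example): an LCG mod 2**32
    with odd multiplier and increment, so its low-order output bit
    strictly alternates 0,1,0,1,... -- easy to distinguish from random."""
    x, out = seed, []
    for _ in range(n):
        x = (1103515245 * x + 12345) % 2**32
        out.append(x & 1)
    return out

def true_random(n):
    return [random.getrandbits(1) for _ in range(n)]

def distinguisher(bits):
    """Guess 'PRG' (output 1) exactly when the bits strictly alternate."""
    return 1 if all(bits[i] != bits[i + 1] for i in range(len(bits) - 1)) else 0

# Estimate the distinguishing advantage over many independent games.
trials, n = 200, 64
p_prg = sum(distinguisher(bad_prg(random.getrandbits(32), n))
            for _ in range(trials)) / trials
p_rand = sum(distinguisher(true_random(n))
             for _ in range(trials)) / trials
advantage = abs(p_prg - p_rand)
print(f"advantage = {advantage:.2f}")  # near 1.0: this generator is NOT pseudorandom
```

A secure generator would drive this advantage to (essentially) zero for *every* efficient distinguisher, not just this one statistical test.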
-- So the Turing Test was not only a fundamental
contribution to AI, it also provided a paradigm for much
of modern cryptography.
----------------------------------------------------------
-- The second item I would like to cover concerns the
unhappy marriage between cryptography and computer
security.
-- Abraham Lincoln had a favorite riddle:
"If you call the tail of a dog a leg, how many legs does
the dog have?"
"Four. It doesn't matter what you call the tail; it is
still a tail."
-- Cryptographers commit the same error when they call a
bit string a "secret key". Calling a bit string secret
doesn't make it secret...
-- The old saying goes: "If wishes were horses, then beggars
would ride..."
-- Today's version might go: "If wishes were secure
software, then we would all vote securely on the
internet!"
-- But wishing doesn't make keys secret, or voting systems
secure. The label is not the reality.
(Unless of course you are in marketing or sales!)
-- Cryptographers tend to live a bit in a fantasy world where
Alice and Bob can do modular exponentiations in their
head, or where all of their digital devices are totally
trustworthy and would never betray them...
Cryptographers call these "assumptions" but they are
really just nice wishes...
-- In the real world, we depend on digital devices to do our
modular exponentiations *and* to keep our so-called
``secret keys'' actually secret. Somebody has to do a
lot of hard work along the lines of ``wish fulfillment''
for the cryptographers...
-- The marriage contract goes along the following lines: The
computer security folks depend on the cryptographers to
provide crypto methods that can guard and authenticate
information sent over communications channels; in return,
the computer security folks must give cryptographers
computer systems that can actually keep secrets secret!
-- This is not a fair trade, it seems! The cryptographer's
job is much the easier of the two. (Even though it is
itself highly nontrivial.)
-- We are seeing an increasing amount of evidence that our
computer systems are trying to do too much, so that (from
a security perspective) much of it is not done well enough.
There is a robust market for ``zero-days'' --
vulnerabilities that no one else has yet noticed, ready
to be exploited...
-- The recent disclosures of Stuxnet and Flame illustrate
further how far we really are from ``solving'' the
computer security problem... Perhaps it isn't
really a solvable problem...
-- It seems we should rethink and revise our assumptions:
Secret keys are kept secret -- most of the time.
People perform the correct protocols -- most of the time.
Operating systems protect parties from each other --
most of the time.
and so on...
-- We need an ``attitude adjustment'', so that we are
prepared to live with, detect, and recover from, *repeated*
*failures* of our security mechanisms, including the
repeated loss of cryptographic keys.
Some of this style can be modelled in a nice
game-theoretic manner; you can see some beginnings of
such a theory in my talks and slides on the game of
``FlipIt'' on my web site...
We also need new public-key infrastructure techniques
that are compatible with frequent key updates. This is
another of the projects I am currently working on...
-- MIT's aeronautics department recently released a
report entitled "Aircraft engineered with failure in mind
may last longer." That is, robust aircraft design is now
done where designers assume that other components may not
be functioning properly, and try to make the overall
design robust enough that it will still function. This is
not just the usual triple-redundant hydraulic systems, but
testing whether the system can be landed even when the
rudder fails completely.
-- I spend quite a bit of time these days working on voting
system integrity. These morals are doubly relevant for
voting systems, since the need to make ballots
anonymous precludes the usual sort of corrective feedback
available for financial systems.
You can give customers feedback on their checking
account, and allow them to make corrections, but you
don't want feedback on ballots to enable voters to sell
their votes!
-- A voting system must nonetheless be designed so that
errors in the tally, introduced unintentionally or even
by intentional fraud, will be caught and corrected.
Voting on paper ballots, followed by effective
post-election audits, seems a great way to go. People
are sometimes surprised by my endorsement of such a
``low-tech'' solution. Yet high-tech brings complexity,
and any security guru will tell you that complexity is
the enemy of security.
Voting over the Internet is at best a very interesting
research project with a long way yet to go; at worst it is
harmful vaporware and wishes trying to become secure
software.
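To make the audit idea concrete, here is a minimal illustrative sketch in Python (my own toy example, not an actual risk-limiting audit procedure): compare a uniform random sample of paper ballots against the reported electronic record, and treat any discrepancy as cause to escalate to a fuller count:

```python
import random

def audit_sample(paper, reported, sample_size, rng=random):
    """Draw a uniform random sample of ballot positions and count
    discrepancies between the paper record and the reported record."""
    idx = rng.sample(range(len(paper)), sample_size)
    return sum(paper[i] != reported[i] for i in idx)

# Toy election (hypothetical data): 10,000 two-candidate ballots,
# where the reported record silently flips 200 of them (a 2% error rate).
rng = random.Random(0)
paper = [rng.choice("AB") for _ in range(10_000)]
reported = list(paper)
for i in rng.sample(range(10_000), 200):
    reported[i] = "A" if reported[i] == "B" else "B"

discrepancies = audit_sample(paper, reported, sample_size=500, rng=rng)
print(f"{discrepancies} discrepancies found in the sample")
```

The point of the sketch: even a modest random sample of the paper trail is overwhelmingly likely to surface a 2% error rate, which is why paper ballots plus post-election audits give the corrective feedback that anonymity otherwise rules out.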
-- We have the technology to build enormously complex and
interesting systems, but we still have much to learn about
how to build software that is robust and resilient when
attacked. The promise of cryptography won't be realized
until these software foundations are much sturdier, and
secret keys are more than merely labelled as ``secret''...
This concludes my remarks...