Sold for US$1,025,000 inc. premium
TURING, ALAN MATHISON. 1912-1954.
Part 1: TURING, Alan. Autograph manuscript, 27 pp, on leaves 1r-14r. In blue ink, written primarily on rectos with occasional notes on versos. No date, but approximately 1944. Treating in general Peano's notation and axioms.
Part 2: GANDY, Robin. Autograph manuscript, 26 pp, on leaves 14v-32r. In blue, red, green and black ink, written primarily on rectos with a few notes on versos. May 26, 1956-October 24, 1956. The 70 pages from leaf 32v-67r are blank.
Part 3: TURING, Alan. Autograph manuscript, 29 pp, on leaves 67v-81v. In blue-black ink, written upside down from end of notebook forwards, primarily on rectos, with a few notes on versos. No date but approximately 1944. Titled "Notes on Notations."
HITHERTO UNKNOWN WARTIME MANUSCRIPT OF THE UTMOST RARITY, CONSISTING OF 56 PAGES OF MATHEMATICAL NOTES BY ALAN TURING, LIKELY THE ONLY EXTENSIVE HOLOGRAPH MANUSCRIPT BY HIM IN EXISTENCE.
Turing material of any kind is extremely rare, and anything with a direct personal connection to him even more so. This, Turing's wartime notebook on logic, is the first time a manuscript by him has ever come to public market. Written during the Second World War, it has its origins in Turing's well-documented dialogues with Wittgenstein in 1939 on the interpretation of mathematical symbols.4
Origin and history of the manuscript
In his will,5 written on February 11th, 1954, just a few months before his death, Turing left all of his mathematical books, articles, and manuscripts to his close friend, the British mathematician and logician Robin Oliver Gandy (1919-1995). Gandy studied for a PhD in mathematics at Cambridge under Turing's supervision and is best known for his important work in recursion theory. He would later take up Turing's mantle, making significant contributions of his own, including the Spector-Gandy theorem and the Gandy selection theorem, as well as to the understanding of the Church-Turing thesis; his generalization of the Turing machine is known as a Gandy machine. In 1977, Gandy deposited the material he had inherited in an archive at King's College, Cambridge, with the exception of one item: the present manuscript. Why? In 1952, following his conviction, Turing began seeing Dr. Greenbaum, a Jungian analyst who had Turing keep a series of dream journals. Along with the scientific papers, Gandy inherited these journals and returned them to Greenbaum, who later destroyed them. Doubtless inspired by Alan's dream journals, Gandy decided to keep his own – in the middle blank pages of the present manuscript, stating in the opening lines:
"It seems a suitable disguise to write in between these notes of Alan's on notation; but possibly a little sinister; a dead father figure, some of his thoughts which I most completely inherited." (f. 15r)
As one can imagine, dream journals such as these contain extremely private, and often embarrassing or painful, information. Gandy chose what he felt would be the safest place to record his most intimate thoughts and dreams, though the sentiment of placing his thoughts close to those of his late friend surely played a role in this choice. Indeed, Alan clearly made frequent appearances in Gandy's thoughts, and is mentioned more than once in the journal:
"The more sinister as I handed over Alan's dream book to Greenbaum who I certainly at one stage thought of as responsible for A's suicide." (f. 14v)
Keeping this dream journal hidden between Turing's notes must have been at once a comfort to Gandy and a painful reminder of the loss of a dear friend. As the material deposited in the Turing Archive was to be made publicly available to historians and scholars, and has indeed been consulted many times, it is clear that this item was simply too personal to be shared in the archive. Indeed, Gandy kept the journal among his personal effects for the rest of his life, and it was not seen by another person until his death in 1995. It was then inherited by one of Gandy's executors, from whom the current owner acquired it.
Dating the notes
There are a few clues that help us assign an approximate date to this manuscript. A careful comparison shows that certain pages of the manuscript are related to an unpublished typescript entitled "The Reform of Mathematical Notation and Phraseology"6 housed among the Turing papers held in the Archive Centre at King's College, Cambridge. This typescript was later published in Turing's collected works, where it is dated 1944-45.7 Additionally, Turing refers to the work of several mathematicians (including his own), the majority of it written between 1927 and 1940; the latest reference is to one of Turing's own papers, published in 1942.
First Part of the Manuscript: Peano's axioms
Giuseppe Peano (1858-1932) was an Italian mathematician, known as one of the founders of mathematical logic and set theory, who was the first to axiomatize the theory of the natural numbers. His five axioms were meant to provide a rigorously exact foundation for the natural numbers. These five axioms would go on to play a key role in a number of questions posed by logicians and mathematicians, most notably by the German mathematician David Hilbert (1862-1943). Hilbert, in his 1928 address to the International Congress of Mathematicians in Bologna, posed three famous challenges to the mathematical community:
1. To prove that all true mathematical statements could be proven, that is, the completeness of mathematics.
2. To prove that only true mathematical statements could be proven, that is, the consistency of mathematics.
3. To prove the decidability of mathematics, that is, the existence of a decision procedure to decide the truth or falsity of any given mathematical proposition.
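For reference, the five Peano axioms can be stated in one common modern formulation (with 0 as the initial number and S as the successor function; Peano's own 1889 presentation differs in detail):

```latex
\begin{align*}
&\text{(P1)} && 0 \in \mathbb{N} \\
&\text{(P2)} && n \in \mathbb{N} \implies S(n) \in \mathbb{N} \\
&\text{(P3)} && S(n) \neq 0 \text{ for all } n \in \mathbb{N} \\
&\text{(P4)} && S(m) = S(n) \implies m = n \\
&\text{(P5)} && \bigl(0 \in A \;\wedge\; \forall n\,(n \in A \Rightarrow S(n) \in A)\bigr)
               \implies \mathbb{N} \subseteq A
\end{align*}
```

The fifth axiom is the principle of mathematical induction.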
The third of these three problems became known as Hilbert's Entscheidungsproblem [Decision Problem]. The first two questions, of completeness and consistency, were famously answered two years later by the Austrian logician Kurt Gödel (1906-1978) in his "Über formal unentscheidbare Sätze der Principia Mathematica und verwandter Systeme I" [On Formally Undecidable Propositions of Principia Mathematica and Related Systems I].8 In this revolutionary paper, Gödel introduced his incompleteness theorem, which "showed that even powerful logical systems could not hope to encompass the full scope of mathematical truth."9 Gödel showed that, for any axiomatic system powerful enough to describe the natural numbers, such as the Peano axioms: (1) if the system is consistent, then it cannot be complete; and (2) the consistency of the axioms cannot be proven within the system. So what was Turing's interest in the Peano axioms? We know that Turing was introduced to Hilbert's Entscheidungsproblem, as well as Gödel's incompleteness theorem, in a lecture course given by the English mathematician and code-breaker Max Newman (1897-1984) at Cambridge in the spring of 1935; after learning about it, Turing immediately took up the challenge of solving the Entscheidungsproblem. It was the process of doing so that led to his development of a universal computing machine, as clearly expressed by Newman: "I believe it all started because he attended a lecture of mine on foundations of mathematics and logic ... I think I said in the course of this lecture that what is meant by saying that [a] process is constructive is that it's a purely mechanical machine and I may even have said, a machine can do it.
And this of course led [Turing] to the next challenge, what sort of machine, and this inspired him to try and say what one would mean by a perfectly general computing machine."10 Newman understood that the "machine" Turing was working on had applications far beyond the Entscheidungsproblem, and said that Turing's now famous paper On Computable Numbers, with an Application to the Entscheidungsproblem (1936) contained an "extraordinary definition of a perfectly general ... computable function, thus giving the first idea ... of a perfectly general computing machine."11
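Newman's "perfectly general computing machine" can be sketched in miniature: a finite table of (state, symbol) → (write, move, next state) rules driving a read/write head over an unbounded tape. The sketch below is a modern illustration of the idea, not code drawn from any Turing source; the bit-flipping rule table is an invented example.

```python
from collections import defaultdict

def run_turing_machine(rules, tape, state="start", max_steps=1000):
    """Run a one-tape Turing machine and return the written tape as a string."""
    cells = defaultdict(lambda: "_", enumerate(tape))  # "_" is the blank symbol
    head = 0
    for _ in range(max_steps):
        key = (state, cells[head])
        if key not in rules:              # no applicable rule: the machine halts
            break
        write, move, state = rules[key]
        cells[head] = write
        head += 1 if move == "R" else -1  # move the head right or left
    return "".join(cells[i] for i in sorted(cells)).strip("_")

# An invented example machine: flip every bit, moving right, until a blank.
flip = {
    ("start", "0"): ("1", "R", "start"),
    ("start", "1"): ("0", "R", "start"),
}

print(run_turing_machine(flip, "0110"))  # -> 1001
```

The point of the abstraction, as Newman observed, is that one fixed mechanism plus a rule table suffices for any effective procedure; a universal machine is simply one whose rule table interprets other rule tables supplied on its tape.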
With this in mind, it becomes clear that the notes contained in the manuscript are not simply an attempt to get non-logicians to use stricter notation, or simply a superficial study of Peano's axioms. Turing, like his intellectual forefathers Leibniz and Boole, was in fact working on a topic of profound importance: the development of a universal language; something that was to be at the core of modern computer science.
To this end, Turing not only examines the work of Peano, but also references and compares the work of other mathematicians who helped to lay the foundation for the development of a universal language, including Church, Descartes, Pell, Lagrange, and Arbogast amongst others. On page 5 of the notes, Turing states:
"Dots are used as brackets, as in my paper. The more dots, the more powerful."
Here, he refers to his paper "The Use of Dots as Brackets in Church's System,"12 in which he defines a new notation for the λ-calculus of Alonzo Church (1903-1995), who supervised Turing's PhD thesis at Princeton and who introduced the calculus as part of his investigation into the foundations of mathematics. "The Use of Dots... [shows] Turing's ability to reason about important issues in computer science before there were digital computers to reason about. In this case, Turing essentially studies an important aspect of programming languages, a syntax for trees."13 In the manuscript, Turing offers some critiques of the notation that are revealing of his passion for the subject, such as on page 5:
"a− = b − (a = b) Hateful!", and "There is no very clear notion of hypothesis. It is however suggested that on the first introduction of a variable it be specified what class it belongs to."
Second Part of the Manuscript: "Notes on Notations"
The second half of the Turing manuscript, which he entitles "Notes on Notations," consists of Turing's remarks and observations on the work of various prominent mathematicians, including Weyl (1885-1955), Leibniz (1646-1716), Hilbert (1862-1943), Courant (1888-1972), Titchmarsh (1899-1963), and Pontryagin (1908-1988), amongst others. In this section, it is clear that Turing analyzes the work of these mathematicians with the aim of improving and building upon it, as he writes of Weyl:
"The idea of an 'indeterminate' is distinctly subtle, I would almost say too subtle. It is not ... the same as variable. Polynomials in an indeterminate x, f1(x) and f2(x), would not be considered identical if f1(x) = f2(x) all x in k, but the coefficients, with rules for multiplication and addition suggested by their form."
It is clear that Turing is not merely criticizing the notational conventions used by Weyl; rather, he is analyzing them and proposing improvements, with the intent of building upon them, as he then writes:
"I am inclined to the view that this is too subtle and makes an inconvenient definition. I prefer the indeterminate k be just the variable."
Leibniz was Turing's intellectual predecessor in the development of a universal language, so it is only natural that Turing should examine his forms of logical notation. Leibniz, best known for developing the infinitesimal calculus independently of Isaac Newton, was also known for his "wonderful idea," a special alphabet whose characters represented concepts rather than sounds: "A language based on such an alphabet should make it possible to determine by symbolic calculation which sentences written in the language were true and what logical relationships existed among them."14 Of this, Turing notes:
"The Leibniz notation dy/dx I find extremely difficult to understand in spite of it having been the one I understood the best once! It certainly implies that some relation between x and y has been laid down ...."
Turing goes on to point out difficulties in dealing with the positioning of variables, and the pitfalls of the way they are commonly denoted. This was clearly a problem of importance to Turing, but one that he had not yet quite worked out, as he writes:
"What is the way out? The notation (d/dx f(x, y))x=y,y=x hardly seems to help in this difficult case."
While the mathematical content is surely the most historically significant part of the notebook, containing ideas that are, and will surely continue to be, relevant, the notebook also helps to tell a larger part of Turing's tragic life story, placing his work within the context of who he was as a person. He was not just one of the most influential mathematicians of the twentieth century, if not of any century; not just a hero who helped to bring the Second World War to an end with his solution to the Enigma codes; not just the father of the computer age. He was also a man who had fears and shortcomings like the rest of us, and who wanted, perhaps more than anything, to be free to live his life as he wished and to be himself without consequences. Because he was not permitted to do so, humanity was deprived of one of its greatest minds, and we will never know what other groundbreaking ideas he might have developed had he not been put in a situation where taking his own life was preferable to living a life in which he was not permitted to be his true self.
Alan Mathison Turing
Born in 1912 to a civil servant and the daughter of a railway engineer, Turing distinguished himself as a child, showing a natural inclination towards mathematics and science. At the age of 15 he was already solving advanced problems despite not having studied even elementary calculus. He completed his undergraduate work in mathematics at King's College, Cambridge, where he was subsequently elected a fellow. He then went on to complete his PhD at Princeton University, introducing the notions of ordinal logic and relative computing in his dissertation, Systems of Logic Based on Ordinals (1938). Turing is known as a hero for his work at Britain's code-breaking center at Bletchley Park, where he was the principal figure in solving the German Enigma codes, using his ingeniously designed bombe. He is considered the father of theoretical computer science for work including the discovery of the universal model of computation now known as the Turing machine, his development of the notion of "oracle" relativization, and his invention of the LU-decomposition method in numerical computation. His landmark paper "On Computable Numbers," written in 1936 when he was only 24 years old, led to his development of a universal computing machine and is now considered one of the most important scientific papers of the 20th century.
Turing had the type of mind that was able to see systems and patterns across fields, and his later work ventured into areas other than mathematics, most notably biology and chemistry. His hugely influential paper The Chemical Basis of Morphogenesis (1952), which some consider to be the spark of modern chaos theory, has very recently been validated by researchers at Brandeis University and the University of Pittsburgh, who experimentally confirmed its predictions. His tireless work in mathematics and science was tragically halted by his conviction in 1952 for the then crime of gross indecency (committing acts of homosexuality), followed by his sentence of chemical castration and his subsequent suicide in 1954. He received an official royal pardon for his "crime" in December 2013—60 years after his suicide, one of only four royal pardons granted under the Royal Prerogative of Mercy since the Second World War, a testament to Turing's status as a hero in the eyes of the British people. Many books have been written about Turing, and his life has been memorialized in the award-winning film The Imitation Game, adapted from Andrew Hodges' excellent biography Alan Turing: The Enigma,1 as well as in the Broadway play Breaking the Code by Hugh Whitemore. Despite all of the popular interest in his life, there are still many things about Turing that remain obscure. Some of these are illuminated by the contents of the present manuscript, which sheds light on the problems that were of fundamental importance to Turing's work in the field of computer science, including deep issues in the foundations of mathematics such as the interpretation of symbols, and on his quest to develop a universal language with the aim of allowing mathematics to be executed by machines rather than people.
More than anything, the manuscript sheds light on Turing's great potential, giving us a glimpse into the types of work that he might have gone into had his life not tragically been cut short.
After completing his PhD dissertation in 1939 at the Institute for Advanced Study at Princeton under the supervision of Alonzo Church (1903-1995), Turing returned to Cambridge. There, before the Second World War broke out, Turing attended a series of lectures given by the philosopher Ludwig Wittgenstein (1889-1951) on the foundations of mathematics and the interpretation of mathematical symbols. These lectures were to have a profound influence on Turing's work in the field of computer science and type theory: as Juliet Floyd points out in Alan Turing: His Work and Impact, "Turing is explicit that 'the statement of the type principle' in this essay [The Reform of Mathematical Notation and Phraseology] was suggested by lectures of Wittgenstein."2 These ideas would continue to be important to Turing's work for the duration of his life, and were very much on his mind as he wrote the present manuscript in which he, among other things, carefully examines the work of some of his predecessors who worked to develop a universal mathematical language, such as Leibniz and Boole, precisely because he himself was endeavoring to do the same thing. "In the 1800s Babbage wrote polemics about mathematical notation, and by the 1880s Frege, Peano and others were trying hard to create more systematic ways to represent mathematical processes. And no doubt that systematization was a necessary precursor to Hilbert's program, Gödel's theorem, and ultimately Turing's own work on defining what amounts to a universal mechanism for mathematical processes ... I suspect Turing was curious about what would be involved in creating a higher level representation: a full systematic language for mathematics at the level people actually do it."3
It is truly a remarkable item that lends an unparalleled insight into the workings of one of the greatest minds of the 20th century.
1. Andrew Hodges, Alan Turing: The Enigma. (Princeton and Oxford: Princeton University Press, 2014).
2. Juliet Floyd, "Turing, Wittgenstein, and Types: Philosophical Aspects of Turing's 'The Reform of Mathematical Notation and Phraseology' (1944-45)," Alan Turing: His Work and Impact. Edited by S. Barry Cooper and Jan van Leeuwen. (London: Elsevier, 2013), 250-253.
3. Stephen Wolfram, "The Reform of Mathematical Notation and Phraseology: Stephen Wolfram Connects Computation, Mathematical Notation and Linguistics," Alan Turing: His Work and Impact. Edited by S. Barry Cooper and Jan van Leeuwen. (London: Elsevier, 2013), 239-244.
4. Cora Diamond, ed., Wittgenstein's Lectures on the Foundations of Mathematics, Cambridge, 1939. (Ithaca: Cornell University Press, 1976).
5. Archive of Alan Mathison Turing at King's College, Cambridge. http://www.turingarchive.org/browse.php/A/5
6. Alan Mathison Turing, "The Reform of Mathematical Notation and Phraseology," ca. 1944. http://www.turingarchive.org/browse.php/C/12
7. Robin Gandy, ed., The Collected Works of A.M. Turing: Mathematical Logic. (London: Elsevier, 2001), 211.
8. Monatshefte für Mathematik und Physik, 38 (Leipzig: Akademische Verlagsgesellschaft, 1931).
9. Martin Davis, The Universal Computer: The Road from Leibniz to Turing. (New York: W.W. Norton, 2000), 100.
10. Max Newman, interview by Christopher Evans, The Pioneers of Computing: An Oral History of Computing, London Science Museum.
11. Max Newman, interview by Christopher Evans, The Pioneers of Computing: An Oral History of Computing, London Science Museum.
12. Journal of Symbolic Logic, Vol. 7 (1942), pp. 146-156.
13. Lance Fortnow, "Lance Fortnow Discovers Turing's Dots," Alan Turing: His Work and Impact. Edited by S. Barry Cooper and Jan van Leeuwen. (London: Elsevier, 2013), 227-228.
14. Martin Davis, The Universal Computer: The Road from Leibniz to Turing. (New York: W.W. Norton, 2000).