A few weeks back I wrote a blog post about an idea that, with much input from readers, I eventually called Knockout Primes. In the course of exploring the idea I mentioned that "if we allow the old-fashioned idea that 1 was a prime," and that opened the door to a number of (mostly, but not all, middle school) math teachers who courteously informed me that "One is never prime," often with supporting quotes from their textbooks.
One not-so-courteous complaint took me to task for destroying the hard work of thousands of math teachers everywhere.
I responded to each of the writers with something like what is written here, a too-brief history of one as a prime. But since I think there may be many more who are NOT aware, I wanted to make a more general plea for flexibility in definitions and for greater awareness of the historical development of our beautiful subject.
So here is a bit about the tainted history of one.
The Greeks who gave us much of our early mathematical framework didn't even consider one as a number. Euclid's famous Elements gives the following definitions:
Def. 1. A unit is that by virtue of which each of the things that exist is called one.
Def. 2. A number is a multitude composed of units.
(from David E. Joyce's online Elements)
So for many years (at least in the West) it was common to exclude one from the set of numbers, and so certainly not to treat it as a prime. Even after one became commonly accepted as a number, definitions of prime would read something like this one from ELEMENTARY ARITHMETIC, WITH BRIEF NOTICES OF ITS HISTORY (1876) by Robert Potts:
Prime numbers divisible only by one and a number equal to itself.
Potts was a Trinity College, Cambridge mathematician whose edition of Euclid's Elements was the standard geometry textbook in England for much of the 19th century. His definition is not very different from those in modern textbooks, nor from others of his period. (A photocopy of this book is still available on Amazon, and online at Google Books.)
He was certainly aware of most of the reasons anyone today would give for excluding one from the primes, such as the Fundamental Theorem of Arithmetic or Euler's Totient Function. But his table of primes, shown at the top of this page, clearly includes 1 as an entry.
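To make the first of those reasons concrete (this worked example is my own addition, not Potts's), here is the usual modern illustration of why admitting 1 spoils unique factorization:

```latex
% If 1 were admitted as a prime, factorizations would no longer be unique:
\[
  6 = 2 \cdot 3 = 1 \cdot 2 \cdot 3 = 1 \cdot 1 \cdot 2 \cdot 3 = \cdots
\]
% Excluding 1 lets the Fundamental Theorem of Arithmetic be stated cleanly:
% every integer n > 1 has exactly one factorization into primes, up to order.
% Similarly, Euler's totient satisfies \varphi(p) = p - 1 for every prime p,
% but \varphi(1) = 1, not 0, so 1 would be the lone exception.
```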
He was not alone, nor even unusual, for the period. Wikipedia includes in its entry: "In the 19th century however, many mathematicians did consider the number 1 a prime." I would hazard to suggest that "most" could have been used in place of "many." Even as late as the middle of the 20th century, the eminent prime-hunter Derrick Norman Lehmer's list of primes up to 10,006,721 started with 1 as its first prime.
[Image: Lehmer]
It is not uncommon at all for mathematicians to adjust definitions to suit particular purposes. One of the more common examples is the frequency with which the "Natural Numbers" are defined to include zero, or to exclude it, sometimes within the same book.
It is certainly not difficult to adapt theorems that need to exclude one as a prime with a phrase as simple as "primes greater than one," or any of myriad other such restrictions.
Teachers who think of math as having perfect, consistent, and unwavering definitions, and questions with exactly one right answer, may find it distressing that geometers disagree about something as simple as the definition of a trapezoid (trapezium for European readers): at least one pair of opposite sides parallel versus exactly one pair. I recently received a note from a teacher who was exasperated that nearly all of the thirty or so math teachers at an in-service he was attending protested that using synthetic division to divide by a quadratic was impossible, yet none had any idea how they might prove it, nor gave any indication that they felt the need to. He was planning to go in early the next day and work through an example from my blog on the subject to show that it is possible.
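For readers who are curious, here is a minimal sketch in Python of one way to carry out the extended synthetic-division tableau with a monic quadratic divisor. It is only an illustration of the idea, not a reproduction of the layout in that earlier blog post; the function name and the example polynomials are my own choices.

```python
def synthetic_division(dividend, divisor):
    """Divide a polynomial by a MONIC divisor using expanded synthetic division.

    Both polynomials are coefficient lists, highest degree first,
    e.g. x^3 - 12x^2 - 42  ->  [1, -12, 0, -42].
    Returns (quotient, remainder) as coefficient lists.
    """
    out = list(dividend)                      # working row of the tableau
    deg_div = len(divisor) - 1                # degree of the divisor
    for i in range(len(dividend) - deg_div):  # one pass per quotient coefficient
        coef = out[i]
        if coef:
            # "bring down" the coefficient, then add multiples of the
            # negated divisor tail into the columns to its right
            for j in range(1, len(divisor)):
                out[i + j] -= divisor[j] * coef
    return out[:-deg_div], out[-deg_div:]

# Example: (x^3 - 12x^2 - 42) / (x^2 + x - 3)
q, r = synthetic_division([1, -12, 0, -42], [1, 1, -3])
print(q, r)   # [1, -13] and [16, -81]: quotient x - 13, remainder 16x - 81
```

The same routine works for a monic divisor of any degree; dividing by x - r collapses it back to the familiar one-row synthetic division.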
One of the reasons I encourage math teachers to learn and share the history of math with students is that it exposes both teacher and student to this powerful flexibility of math in a way textbooks often do not.
I am prone to mathematical misconceptions myself, and frequently get corrected for misstatements of facts, dates, etc. I am (almost) always grateful for the correction, and try to include a hat tip for the information. I am grateful as well to every reader who glances by my page, and hope many will share some of the information with students, friends, or other mathematicians. I am especially grateful if it is done with some courtesy, as it most often is.
And now I will remove my mask of indignant righteousness, step off my soapbox, and return you to your previously scheduled program.
Addendum:
At the time I wrote this I was unaware of a nice article by Chris Caldwell and Yeng Xiong that gives a very well researched historical view of the different ways of defining primes.
And they point out that, for some, neither one nor two was prime:
Martinus and others such as Nicomachus (c. 100) and Iamblichus (c. 300) [26, p. 73], Boethius (c. 500) [53, pp. 89–95], and Cassiodorus (c. 550) [20, p. 5] make the primes a subset of the odds, excluding both one and two from the primes—so for them, the smallest prime was three. (It is easy to extend this list of those for whom the first prime was three well into the 16th century.) Most of the ancient Greeks however, like Euclid, began the sequence of primes with two.
In the 1950s & 60s I was taught that 1 is a prime and got quite annoyed, initially, when people later told me that it wasn't.
Whether you allow 1 to be a prime or not is actually a philosophical and not a mathematical decision.
Thony, the fun can begin when you ask "Why Not?" or ask for their definition and they give one that either doesn't rule out one, or else is impossible.
I think it is definitely a mathematical and not a philosophical discussion!
In the 19th century some mathematicians did indeed consider 1 to be a prime, but during the 20th century new notions were introduced and researched which challenge this.
The concept of a prime number in the integers has been generalized to a prime ideal in a commutative ring, in which (for various theorems to make sense) you cannot count the entire ring as a prime ideal. When applied to the integers, this corresponds to not counting 1 as a prime number.
Furthermore (not unrelated), in algebraic number theory, we study algebraic extensions of the rationals and the integers and primes in those extensions. This allows seeing primes in a broader perspective.
This includes the most compelling (in my opinion) reason not to consider 1 a prime: in rings of algebraic integers which have unique factorization, the factorization is always into a product of primes up to a unit (an invertible integer). In the regular integers the units are 1 and -1, and we usually "don't care about negative primes" (-2, -3, -5...). In other rings of algebraic integers, however, there may be more units. It is clear that for unique factorization to work correctly, we cannot count units as primes, and applied to the integers this means we cannot count 1 as a prime.
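To put a concrete worked example next to the commenter's point (the example is my own addition, not the commenter's): in the Gaussian integers the units are ±1 and ±i, and superficially different factorizations of the same number differ only by those units.

```latex
% In \mathbb{Z} the units (invertible elements) are \pm 1, so these count
% as "the same" factorization:
\[
  6 = 2 \cdot 3 = (-2) \cdot (-3).
\]
% In the Gaussian integers \mathbb{Z}[i] the units are \pm 1 and \pm i, and
\[
  5 = (2+i)(2-i) = (1+2i)(1-2i), \qquad 1+2i = i\,(2-i), \quad 1-2i = -i\,(2+i),
\]
% so the two factorizations of 5 agree up to units. Admitting units (such as 1)
% as primes would destroy this uniqueness, which is the commenter's point.
```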