Viewpoint

When Should We Celebrate the Millennium?

Bill Keir

I have found mass media discussions of this question simplistic, confusing and inconclusive. I have also found in social discussions that many people are puzzled about the matter and don't fully understand the issues.

I was hoping Stephen Jay Gould's latest book, Questioning the Millennium (Jonathan Cape, 1997), would be the definitive statement we needed, but I was disappointed.

Quite properly, Gould tries to establish the historical facts and describes the complexities of calendars and the unevenness of the astronomical facts upon which much calendaring is based. He also dwells amusingly, and rightly, on the absurdities of the irrational human compulsion to have parties and to attach significance to arbitrary numerical periods which have no meaning in nature.

His book is readable, informative, entertaining and, in the final paragraphs, very touching, with his account of a mentally handicapped young man with prodigious mental calculating abilities. But he never even mentions the subject most crucial to the debate -- the mathematics of measuring time.

I have found this to be the very point people are confused about and I have never seen a clear statement of it in any popular discussion forum.

Gould's summary of the historical facts is also a bit sloppy, but more of that later. Suffice it to reiterate that it is a well-documented historical fact that the sixth-century monk Dionysius Exiguus, who devised the Christian-era calendar, labelled the first year AD 1 -- he numbered the years as if counting objects.

The simple arithmetical consequence of this is that 2,000 years will not have elapsed until the end of the year 2000.
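
To make the arithmetic concrete, here is a minimal sketch in Python (the language and the function name are mine, chosen purely for illustration) of what Dionysius' labelling implies:

    def years_elapsed_at_end_of(year_label):
        """Full years elapsed at the end of a given AD year under Dionysius'
        labelling: the first year is AD 1, so the year labelled N is the Nth
        year, and N full years have elapsed only when it ends."""
        return year_label

    print(years_elapsed_at_end_of(1999))  # 1999 -- one year short of two millennia
    print(years_elapsed_at_end_of(2000))  # 2000 -- two full millennia have elapsed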

No mystery. So what's the problem? What puzzles people is why we need a year-number-zero when reckoning years, when it would be nonsense to have an apple-number-zero when counting apples.

The confusion is about the mathematical distinction between counting and measuring. We count objects like apples using the counting numbers (the natural numbers), where zero doesn't exist (zero is only needed to make addition and subtraction work). We measure a continuum, like time or distance, using the number line, as found on a ruler. Unlike objects, time and distance have a flowing nature and must be reckoned differently -- division into units is arbitrary, as is the starting point.

I acknowledge this oversimplifies. Philosophers of mathematics have long drawn attention to the fact that, when scrutinized, numbers, lines and points are more slippery concepts than they first appear. However, the value of mathematical methods is in their usefulness, and mathematicians realised long ago that if you want a precise method of reckoning time and distance for scientific purposes, simply counting the units as if they were apples is not good enough.

Dionysius can be excused for not realising this. The number line, although implicit in early Greek mathematics (such as Euclid's definition of lines and points, and Zeno's paradoxes), was not fully developed until the 17th century, when Descartes refined the concept into Cartesian coordinates and Newton and Leibniz built on it to devise that monument of mathematics, the calculus.

But the concept of the number line is not that rarefied. In school maths we introduce it to children aged about eight; they have their first practical experience of it when they learn to use a ruler to measure distance, and they encounter it as soon as they grasp that they are not eight years old until eight full years have elapsed since they were born.
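
As a small illustration of measuring rather than counting (a Python sketch with an invented birthday, used only for the example), an age in years is the whole number of years fully elapsed since birth -- you remain "zero years old" until your first birthday:

    from datetime import date

    def age_in_full_years(born, today):
        """Age as a measurement on the number line: the whole number of
        years that have fully elapsed since birth, the zero point."""
        years = today.year - born.year
        # If this year's birthday has not yet arrived, a full year has not elapsed.
        if (today.month, today.day) < (born.month, born.day):
            years -= 1
        return years

    # Hypothetical dates, chosen only to illustrate the point.
    print(age_in_full_years(date(1990, 6, 15), date(1998, 6, 14)))  # 7 -- not yet eight full years
    print(age_in_full_years(date(1990, 6, 15), date(1998, 6, 15)))  # 8 -- eight full years elapsed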

For the record then, let us set out some of the characteristics of the number line which make it so useful for measuring continuums.

First, it has a zero point. This is arbitrary -- it doesn't matter where it is on the continuum as long as it is defined.

Second, the units into which you choose to divide the continuum are arbitrary, as long as you define them and define the points at which one of them starts and ends. The important thing is the points; the division into units is artificial -- the number line is really a series of points.

Third, it follows from this that the number line and its artificial units are infinitely subdivisible.

Another characteristic is that, unlike when counting objects, the cardinal number label for the unit doesn't match its ordinal -- year one is the second year -- a strange business until you realise you are not really counting, you are measuring ("how much" rather than "how many").

Because a continuum flows, we are talking about points at which units have elapsed. The first unit has elapsed at a point labelled "one". Note this is a label for a point, not a unit.

Thus it comes about that points preceding point "one" are expressed as fractions of one, and thus it comes about that we are less than one year old (zero years old, so to speak) until our first birthday.

The usefulness of this for measuring is twofold. Firstly, the whole number for the unit indicates full units already elapsed. Secondly, the unit can be subdivided into fractions, so that any point on the continuum you are able to identify can be expressed as the units elapsed plus a fraction -- 2 1/4 years old, 0.99 metres long, Julian Day number 2449353.54167.

And it is a matter of definition. Julian Day numbers used in astronomy are defined as the number of solar days elapsed since zero hour Greenwich Mean Astronomical Time, January 1, 4713 BC (an arbitrary starting point), plus the decimal fraction of a day elapsed since that whole-number point. The number line in all its glory.
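
That definition can be sketched in a few lines of Python (an illustration only; it leans on the well-known anchor that noon UTC on January 1, 2000 is Julian Day 2451545.0, and it treats UTC as a stand-in for the time scale the formal definition uses):

    from datetime import datetime, timezone

    # Anchor point: noon UTC, January 1, 2000 corresponds to Julian Day 2451545.0.
    J2000_EPOCH = datetime(2000, 1, 1, 12, tzinfo=timezone.utc)
    J2000_JULIAN_DAY = 2451545.0

    def julian_day(moment):
        """Julian Day number: whole days elapsed since the zero point
        (noon, January 1, 4713 BC) plus the fraction of a day elapsed
        since the last whole-number point -- the number line applied to time."""
        elapsed_days = (moment - J2000_EPOCH).total_seconds() / 86400.0
        return J2000_JULIAN_DAY + elapsed_days

    # The figure quoted earlier, 2449353.54167, comes out at about 01:00 UT
    # on January 1, 1994 under this reckoning.
    print(julian_day(datetime(1994, 1, 1, 1, tzinfo=timezone.utc)))  # ~2449353.541667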

If Dionysius had used the number line, 2,000 years would have elapsed at the beginning of AD 2000. Obviously we will all be partying at that point, but to be pedantic, because of Dionysius, we will only be celebrating all those big fat zeroes clicking up.
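
To put the contrast in one place (again a purely illustrative sketch, with hypothetical function and parameter names), the years elapsed at the moment the year number changes over depend on what the very first year was labelled:

    def years_elapsed_at_changeover(new_label, first_year_label):
        """Full years elapsed at the instant the calendar flips to new_label,
        given the label of the very first year: 1 for Dionysius' counting-style
        scheme, 0 for a number-line scheme."""
        return new_label - first_year_label

    print(years_elapsed_at_changeover(2000, first_year_label=1))  # 1999 -- the calendar we actually have
    print(years_elapsed_at_changeover(2000, first_year_label=0))  # 2000 -- the number-line alternative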

This brings me back to Stephen Jay Gould. He makes the extraordinary claim that the question of when centuries and millennia end is an unresolvable issue. He then proceeds to resolve it by an arithmetical sleight-of-hand which is dubious to say the least.

The fact that 2,000 years will not have elapsed until the end of AD 2000 is not, as Gould insists, a matter of popular common sense versus rarefied learned opinion, and it is certainly not an unresolvable argument. It is a simple mathematical fact resulting from a well-documented historical fact.

And it simply will not do to solve an arithmetical shortfall by pretending, as Gould seems to, that the starting point might as well have been earlier, or that the first century might as well have had only 99 years. The arithmetical consequence of calling the first year AD 1 results from the measuring system used, not from the choice of starting point. If you are going to redefine, you need to redefine the system, not the starting point. But imagine the chaos in the history books if we renumbered all the years -- the confusions are bad enough as it is.

The problem Gould is trying to resolve is a social problem of people wanting to celebrate the passing of 2,000 years when the zeroes click up, but, to be mathematically correct, they can't because of a historical accident in the measuring method used.

The way to resolve such a social problem is by a social process, not by mathematical self-deception. All we need to do is say, "Okay, let's acknowledge the mathematical and historical truth, and have a party at the end of 1999 just to celebrate the number changeover."

What of Gould's historical sloppiness? I mention only the most glaring example.

He says Dionysius began counting years from the birth of Jesus, assumed to have been December 25, and called AD 1 the year beginning January 1 seven days after the birth.

I don't know Gould's sources (he mentions only one: Century's End by Hillel Schwartz), but nearly all the ecclesiastical history sources I have consulted, including an interesting mediaeval primary source, make it clear that Dionysius started counting from the incarnation (the conception of Jesus), assumed to have been March 25, AD 1.

The English Church, which adopted the Dionysian system at the Synod of Whitby in AD 664, treated March 25 as New Year's Day, but eventually reverted to January 1. But the Roman Church, adopting the Christian-era calendar later, aligned it with the Roman civil calendar, so that AD 1 started on the January 1 prior to the incarnation.

March 25 was, of course, a fictional date, because it was derived by subtracting nine months from December 25, itself only a traditional date for the birth of Jesus and not in common use until several hundred years after the event. We don't even know which year Jesus was born in, let alone which day.

The interesting thing about this, and the many other computational errors of Dionysius mentioned by Gould, is that they are irrelevant to the question of when to celebrate the millennium. What is relevant is Dionysius' method of measuring. If you want to know how much time has elapsed since a zero point, it doesn't matter when the zero point is. What matters is how you measure.

I would have thought Gould would at least have mentioned the number line -- that ingenious invention which marries geometry (lines and points) with arithmetic (numbers) to give us a precise mathematical tool for measuring a continuum.

As for parties, you can throw one whenever you like and for whatever reason. In this case, let's be clear about the reason -- at the end of 1999 we will not be celebrating the passing of 2,000 years, we will be celebrating the number changeover.

Bill Keir is a freelance journalist with an interest in astronomy and philosophy.