That’s because at precisely 23:31:30 GMT tonight (Friday), the clocks used by Unix computers - which include the servers that run everything from the internet to air traffic control - will read 1234567890: all ten decimal digits in sequence.
For computer geeks everywhere, this seemingly dubious milestone deserves celebrations just like those that greeted the end of the millennium. Parties are planned around the world, from London and New York to Yerevan in Armenia and Asunción in Paraguay.
But after the brief flash of joy comes the dread. Computer scientists fear the worst for the next major moment in Unix time - a day in January 2038, when the Unix clock will run out of seconds it can count. On that day, computers will fail to compute time, and crash. Your computer could shut down. Vehicles may pile up as traffic lights fail. Planes could fall out of the sky. The advice is to party now, because the digital apocalypse may soon be upon us.
Understanding how this will happen requires you to do away with your parochial understanding of time, and instead think more like a machine. You, being human, were under the impression that today was merely Friday February 13 in the year of our Lord, 2009.
Computers count time differently. They simply count seconds from a fixed starting point: midnight Coordinated Universal Time on January 1, 1970 - the digital equivalent of the birth of Christ. Unix time is how many seconds have elapsed since then (not including leap seconds, in case you were wondering).
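The count is easy to verify. Here is a minimal sketch in Python (chosen purely for illustration - Unix itself predates the language), showing that tonight's celebrated moment really does land on the milestone number:

```python
from datetime import datetime, timezone

# Unix time counts seconds since the epoch: midnight UTC, 1 January 1970.
epoch = datetime(1970, 1, 1, tzinfo=timezone.utc)

# The moment celebrated in this article: 23:31:30 GMT, Friday 13 February 2009.
moment = datetime(2009, 2, 13, 23, 31, 30, tzinfo=timezone.utc)

seconds_since_epoch = int((moment - epoch).total_seconds())
print(seconds_since_epoch)  # 1234567890
```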
But why is 1234567890 a more significant moment in time than any other sequence of numbers?
“All calendars are just arbitrary,” argues Julian Burgess, a web developer from London. “Celebrating the millennium - why do that? It was just like any other day, the Earth rotates on its axis and it moves around the Sun. All these things are arbitrary, so for geeks to celebrate Unix time is something for them to enjoy.”
Others said it was the beauty of the number sequence that was worthy of celebration. “If you can’t get excited about all those numbers lined up in a row, well then this will clearly be lost on you,” said Ben Doddington, a computer scientist from Bookham in Surrey.
Unix is an operating system - like Windows, which runs PCs - that was developed in the late 1960s at Bell Labs. Millions of modern PCs, including Apple's Macintosh computers, and entire computer systems still run on Unix or derivatives of it, such as Linux.
When Unix was first developed, computer storage was expensive, and with time being infinite, that posed a problem. The brains behind Unix needed to limit how much time a computer could store, so they created a time-counting system in which time is represented as a 32-bit integer. This means that every second is represented by a combination of 32 zeros and ones.
The problem with a 32-bit integer like this is that it can only count 4,294,967,296 seconds, or about 136 years. Because Unix uses a signed integer - one that can also run backwards from 1970 - that range covers the period between 1901 and 2038. Once Unix clocks reach that moment they will “overflow” and the fear is that many computers will stop working as a result, or at least suffer major problems. It's the same principle as the millennium bug, but one that many scientists believe should be taken more seriously, as only people who count in binary will see it coming.
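The exact moment of doom can be worked out from the signed 32-bit limit. A short Python sketch (illustrative only; `wrap_int32` is a hypothetical helper mimicking how a 32-bit counter rolls over):

```python
from datetime import datetime, timezone, timedelta

# A signed 32-bit integer tops out at 2**31 - 1.
MAX_INT32 = 2**31 - 1  # 2,147,483,647

epoch = datetime(1970, 1, 1, tzinfo=timezone.utc)
last_moment = epoch + timedelta(seconds=MAX_INT32)
print(last_moment)  # 2038-01-19 03:14:07+00:00

def wrap_int32(n):
    """Simulate signed 32-bit overflow: one tick past the max wraps negative."""
    return (n + 2**31) % 2**32 - 2**31

# One second later, a naive system would read a date in December 1901.
print(epoch + timedelta(seconds=wrap_int32(MAX_INT32 + 1)))  # 1901-12-13 20:45:52+00:00
```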
Fear not: the same computer scientists who were alone in celebrating the 1234567890 moment are the ones we will now rely on to update modern computer systems to a new counting system that will use a 64-bit integer. This will allow computers to count back about 20 times the age of the universe, and around 292 billion years into the future. At which point, if man and machine are still around, they will have to deal with the same problem all over again.
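The 64-bit figure is easy to check with the same back-of-the-envelope arithmetic (using an average year of 365.25 days for the estimate):

```python
# How far into the future can a signed 64-bit second counter reach?
MAX_INT64 = 2**63 - 1

SECONDS_PER_YEAR = 365.25 * 24 * 3600  # average year, including leap days
years = MAX_INT64 / SECONDS_PER_YEAR
print(f"{years:.3e} years")  # roughly 2.92e+11, i.e. around 292 billion years
```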
In case anyone wonders why the limit is what it is:
A 32-bit integer is something like a container with 32 slots. Each slot can hold one of two values - a zero or a one - hence a base-2, or binary, number system. The maximum number the container can represent is a by-product of how many unique combinations those slots allow.
Since there are 32 slots, and each can hold one of two values, the formula is 2^32, or 4,294,967,296 unique combinations. Since 0 is included, an unsigned counter actually only gets a range of 0 to 4,294,967,295; Unix time spends one bit on the sign, splitting that range roughly in half on either side of 1970.
Either way, you can represent 136 years' worth of seconds in a 32-bit number.
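The arithmetic above, worked out in a couple of lines of Python:

```python
# 32 slots, two values each: the total number of unique combinations.
total_values = 2**32
print(total_values)  # 4294967296

# Divide by seconds in an average year to get the span the counter covers.
years = total_values / (365.25 * 24 * 3600)
print(round(years))  # 136
```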