When you look at time on a computer, the numbers to the left of the decimal point are based on a sexagesimal numeral system (base 60), while the numbers to the right are based on a decimal system (base 10). Why on earth is that? It’s almost as though we need a computer to compute time. Chances are, if the measurement of time were invented today, we would be counting time on both sides of the decimal point using the decimal system. But the measurement of time was invented long before computers. It all has to do with your hands.
At first glance, your hands support the decimal approach to counting. After all, we have 10 fingers, which makes it easy for us to count in tens. But if you take a closer look at your hands, you’ll notice that each finger (not your thumb) has three segments, separated by joints; each segment’s bone is technically known as a phalanx. Now multiply 3 phalanges by four fingers and you get 12, which happens to be the base of the duodecimal numeral system. Next, multiply the five fingers of your other hand by 12 and you get 60, the base of the sexagesimal system. This all seems very complicated, unless you go back to around 2400 BC, to the ancient Sumerians, who are actually responsible for getting us into this sexagesimal mess.
Sumerians liked to count using their thumbs as pointers and marking off each phalanx of the remaining four fingers. Don’t ask us why. They just did. They also used their other hand to mark five multiples of 12, the maximum number being 60. The number 60 is also the smallest number divisible by the first six counting numbers as well as by 10, 12, 15, 20 and 30. So, in a time when you had to rely on your fingers rather than your computers to compute, there were many reasons to go sexagesimal.
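If you’d rather check that arithmetic with a computer than with your fingers, a quick sketch (Python, purely illustrative) confirms what made 60 so convenient:

```python
# Every number that divides 60 evenly - the wealth of divisors that
# made base 60 so handy for fractions before decimal notation existed.
divisors = [n for n in range(1, 61) if 60 % n == 0]
print(divisors)  # [1, 2, 3, 4, 5, 6, 10, 12, 15, 20, 30, 60]

# Confirm 60 is the smallest number divisible by the first six counting numbers.
smallest = next(m for m in range(1, 1000)
                if all(m % k == 0 for k in range(1, 7)))
print(smallest)  # 60
```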
When the ancient Babylonians became the dominant civilization, they too adopted the sexagesimal and duodecimal systems. They made astronomical calculations in the sexagesimal system, divided the sky into the 12 signs of the Zodiac, divided day and night into 12 hours each, determined that a circle should have 360 degrees and split the year into 12 months.
From this point on there were refinements and additions to the use of the sexagesimal and duodecimal systems.
The Egyptians decided to add day and night together to create a 24-hour day.
The Greeks devised systems for longitude and latitude using 360 degrees. The Romans went even further by subdividing each of the 360 degrees of latitude and longitude into smaller segments. Each degree was divided into 60 parts, each of which was again subdivided into 60 smaller parts. The first division became known as the “minute” and the second segmentation became known as the “second.”
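That same 60-within-60 subdivision is still how coordinates are written today. A minimal sketch (Python; the function name is ours) of converting a decimal angle into degrees, minutes and seconds:

```python
def to_dms(angle):
    """Split a decimal angle into degrees, minutes and seconds,
    following the Roman scheme: 60 parts, each split into 60 again."""
    degrees = int(angle)
    remainder = (angle - degrees) * 60      # first division: minutes
    minutes = int(remainder)
    seconds = (remainder - minutes) * 60    # second division: seconds
    return degrees, minutes, seconds

print(to_dms(30.5))  # (30, 30, 0.0)
```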
In terms of actual time measurement, minutes and seconds were not really used for timekeeping until the first mechanical clocks. While the hour was a recognized time measurement, clock displays were broken into halves, quarters or 12 parts. When the first mechanized clocks arrived at the end of the 16th century, they showed minutes, and from that point on we’ve been breaking down time into smaller and smaller increments. The second is currently defined as the duration of 9,192,631,770 cycles of radiation corresponding to the transition between two energy levels of the caesium-133 atom. Try counting that with your fingers.
What’s interesting is that as we’ve gone beyond measuring the second, we’ve transitioned to the decimal system. A microsecond is a millionth of a second, a nanosecond is a billionth of a second, and so on.
And that’s a good thing, because computer networks are much less concerned with the numbers to the left of the decimal point than with the numbers to the right. In time synchronization, it’s pretty easy to get the computers on your network in sync within a minute or a second of each other. What’s much harder, and more critical, is synchronizing computers to within milliseconds and microseconds of each other. And if you want to know what a microsecond looks like in a hand count, imagine 200,000 hands flashing at you at the same time. Just be thankful we have network time servers and the Network Time Protocol (NTP) to take care of that task.
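For the curious, the heart of that task is surprisingly simple arithmetic. An NTP client exchanges timestamps with a time server and estimates its clock offset from four of them. Here is a minimal sketch (Python, with made-up example timestamps) of the standard NTP offset and round-trip delay formulas:

```python
def ntp_offset_delay(t1, t2, t3, t4):
    """Standard NTP clock calculation.
    t1: client sends request        t2: server receives it
    t3: server sends reply          t4: client receives it
    Offset estimates how far the client's clock lags the server's;
    delay estimates the round-trip time spent on the network."""
    offset = ((t2 - t1) + (t3 - t4)) / 2
    delay = (t4 - t1) - (t3 - t2)
    return offset, delay

# Hypothetical timestamps in seconds: the client's clock runs 0.005 s slow
# and each network leg takes 0.010 s.
offset, delay = ntp_offset_delay(100.000, 100.015, 100.016, 100.021)
print(offset, delay)  # about 0.005 and 0.020, up to floating-point rounding
```

The client can then slew its clock by the estimated offset, which is how machines across a network end up agreeing to within microseconds despite every message between them taking time to travel.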