Time Standards

Royal Greenwich Observatory, London
The Royal Greenwich Observatory in London is the site of the Prime Meridian used in Greenwich Mean Time, the earliest internationally accepted time standard

Since 1967, when the General Conference on Weights and Measures adopted the current atomic definition of the second, the globally accepted standard of time measurement has been the SI system (see the section on Units of Measurement), which uses the second as the base unit of time. Timekeeping is so important in the modern world, both for scientific purposes and increasingly for more general purposes, that it is now coordinated at an international level, and synchronized using incredibly precise atomic clocks (see the section on Clocks).

However, even if we can agree on the unit of measurement, there are several different specifications that can be used for measuring the rate at which time passes and/or points in time and coordinating that time across the world, and these specifications are known as time standards. Many of these standards are linked or related but differ from one another in certain small details, and some are used for very specific purposes.

International Atomic Time (TAI)

International Atomic Time (Temps atomique international, or TAI) is a weighted average of the time kept by over 200 atomic (mainly caesium) clocks worldwide, synchronized using GPS signals and two-way satellite time and frequency transfers, resulting in far more stability than a reading from any single clock. Because of the extreme accuracy of the atomic clocks, the contributing clocks are even corrected for height above sea level (see the section on Relativistic Time). TAI is the basis for Coordinated Universal Time (UTC), which is used for civil timekeeping throughout the world, and for Terrestrial Time (TT), which is the main standard used for astronomical calculations, both of which are described below.

Universal Time (UT)

Universal Time (UT) is based on mean solar time (i.e. based on the rotation of the Earth), and so is the same everywhere on Earth. UT replaced the older telescope-based system Greenwich Mean Time (see below) in 1928 as the worldwide time standard for the setting of Standard Time (also see below), although the older term GMT is still often used informally to refer to UT. The principal form of Universal Time is UT1, which is computed from astronomical observations even more precise than measurements of the Sun, including observations of distant stars and quasars, laser ranging of the Moon and artificial satellites, and the determination of GPS satellite orbits, scaled and adjusted slightly to bring the result closer to solar time. UT2 (rarely used today) is a smoothed version of UT1, which filters out periodic seasonal variations.

Coordinated Universal Time (UTC)

Coordinated Universal Time (Temps Universel Coordonné, or UTC), which is kept within 0.9 seconds of UT1, is now the primary time standard by which the world’s civilian authorities regulate their clocks (although with some adjustments – see Standard Time below). Since 1964, under the auspices of the International Astronomical Union, international time broadcasts have been coordinated and adjusted as needed to conform to Coordinated Universal Time. UTC uses the same SI definition of the second as International Atomic Time, and differs from TAI only by an integer number of seconds, caused by the leap seconds added at irregular intervals to compensate for the gradual slowing and irregularities of the Earth’s rotation (a cumulative difference of 35 seconds as of 2012, made up of an initial 10-second offset plus 25 leap seconds).
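
As a rough illustration of how the UTC–TAI relationship works, here is a minimal sketch in Python. The function name and structure are purely illustrative, and the leap-second count is the figure quoted above for 2012; a real implementation would consult the published IERS leap-second table.

```python
# Minimal sketch of the TAI-UTC relationship: UTC lags TAI by a whole
# number of seconds, made up of the 10-second offset in place when the
# leap-second system began in 1972, plus one second per leap second since.

INITIAL_OFFSET_1972 = 10   # seconds

def tai_minus_utc(leap_seconds_added: int) -> int:
    """Total offset TAI - UTC after a given number of leap seconds."""
    return INITIAL_OFFSET_1972 + leap_seconds_added

print(tai_minus_utc(25))   # 25 leap seconds by mid-2012 -> TAI - UTC = 35 s
```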

Standard Time (ST)
The familiar time zones we now use throughout the world are part of Standard Time

Standard Time (ST) is the usual clock time that most people use in daily life, and the basis for official civil time. It is based on Universal Time (see above), but seeks to take into account the geographical position of different parts of the Earth in relation to the Sun (and their consequent day and night cycles). ST was originally formalized by the Scottish-Canadian Sir Sandford Fleming in 1879, prior to which each country, and even each town, followed its own time.

Standard Time divides the world into 24 time zones, each one covering 15 degrees of longitude (although, for reasons of practicality, some zones also follow country boundaries for part of their length), so that everyone on the planet has the Sun more or less at its highest point in the sky at noon. ST is defined in terms of offsets from the Prime Meridian, or 0° longitude, in Greenwich, England (also see Greenwich Mean Time below). All clocks within a given time zone are set to the same time, and differ by one hour from those in the neighboring zones, although in a few cases half-hour, or even quarter-hour, offsets are also observed. The International Date Line is a line in the mid-Pacific Ocean at 180° longitude, where a calendar day must be added when travelling westward, and a day dropped when travelling eastward.

Standard Time is also sometimes adjusted by daylight saving time (DST, or summer time), whereby clocks are advanced one hour during the lighter summer months so that evenings have more apparent daylight and mornings have less. DST was first implemented during the First World War, mainly to more closely match the hours that people are awake, thereby lowering the need for artificial light and conserving fuel. Many other countries have used it since then, and many still do, including most of Europe and North America.

Greenwich Mean Time (GMT)

Greenwich Mean Time (GMT) was the earliest internationally accepted time standard. It was established at the International Meridian Conference in 1884, when it was decided to place the Prime Meridian (0° longitude) at Greenwich, England, although Greenwich had already been widely used as a standard since the establishment there of the Royal Observatory in 1675. It is a telescope-based standard, with noon GMT defined as the average (mean) time at which the Sun crosses the Prime Meridian and reaches its highest point in the sky there. Other time zones across the world, as used in Standard Time (see above), are defined in terms of offsets from GMT, so that all clocks within each time zone are set to the same time as the others. GMT was superseded as the main international standard by Universal Time (see above) in 1928, although UT is still often referred to informally as GMT.

GPS Time

GPS Time is the time standard used by the Global Positioning System (GPS), the space-based satellite navigation system, first developed in 1973 and made fully operational in 1995, that provides location and time information to military, civil and commercial users around the world. GPS time is not corrected to match the rotation of the Earth, so it does not contain the leap seconds or other corrections that are periodically added to UTC (GPS time was set to match UTC in 1980, but has since diverged), although periodic corrections do have to be made to the on-board clocks to keep them synchronized with ground clocks. It differs from International Atomic Time by a constant offset of 19 seconds (GPS time trails TAI by exactly 19 seconds).
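
The fixed relationships described above can be captured in a few lines. The sketch below (Python) uses the offsets quoted in this section; the function names are illustrative, and the UTC conversion is only valid for the leap-second count in force in 2012.

```python
# Illustrative offsets between GPS time, TAI and UTC.
# GPS time has trailed TAI by a constant 19 seconds since 1980, while the
# GPS-UTC difference grows by one second with every leap second added to UTC.

GPS_TO_TAI = 19          # seconds: TAI = GPS + 19 (constant)
TAI_TO_UTC_2012 = 35     # seconds: TAI - UTC as of mid-2012 (10 s initial + 25 leap seconds)

def gps_to_tai(gps_seconds: float) -> float:
    return gps_seconds + GPS_TO_TAI

def gps_to_utc_2012(gps_seconds: float) -> float:
    # Valid only for 2012; other dates need the leap-second count
    # in force at that moment.
    return gps_seconds + GPS_TO_TAI - TAI_TO_UTC_2012   # i.e. GPS - 16 s
```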

Terrestrial Time (TT)

Terrestrial Time (TT) is a modern astronomical time standard, used primarily for time-measurements of astronomical observations made from the surface of the Earth. It is a dynamical time standard, meaning that it is defined implicitly, inferred from the observed positions of astronomical objects according to a theory of their motion, usually based on ephemerides, tables of the orbital positions of planets or satellites mapped over a period of time. Since the 1970s, it has superseded the similar Terrestrial Dynamical Time (TDT) and Barycentric Dynamical Time (TDB) standards, which were flawed, and the earlier Ephemeris Time (ET) standard before that. It uses standard SI seconds, and runs ahead of International Atomic Time (see above) by a fixed offset of 32.184 seconds, although it is not itself defined by atomic clocks: it is essentially a theoretical ideal which real clocks can only approximate.

Sidereal Time

Sidereal time is a timekeeping system used by astronomers to keep track of the direction to point their telescopes to view a given star in the night sky. It is based on the Earth’s rate of rotation measured relative to the “fixed stars”, as opposed to solar time which reckons the passage of time based on the Sun’s position in the sky. As a result of the orbit of the Earth around the Sun, a sidereal day is about 4 minutes less than a solar day, varying from 3 minutes 35 seconds to 4 minutes 26 seconds due to the elliptic path of the Earth’s orbit.
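
To give a feel for how sidereal time is calculated in practice, the sketch below (Python) applies a commonly quoted low-precision approximation for Greenwich Mean Sidereal Time. The function name is hypothetical and the constants are the standard approximate values; this is a rough sketch, not an observatory-grade computation.

```python
def gmst_hours(julian_date_ut1: float) -> float:
    """Approximate Greenwich Mean Sidereal Time, in hours (0-24).

    Uses the common low-precision formula
        GMST ~ 18.697374558 + 24.06570982441908 * D,
    where D is the number of days (including fractions) since the
    J2000.0 epoch (Julian Date 2451545.0).
    """
    d = julian_date_ut1 - 2451545.0
    return (18.697374558 + 24.06570982441908 * d) % 24.0

# At the J2000.0 epoch itself (noon UT, 1 January 2000), Greenwich
# sidereal time was roughly 18.7 hours.
print(round(gmst_hours(2451545.0), 3))   # -> 18.697
```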

System Time

System time on a computer is measured by a system clock, which is typically implemented as a simple count of the number of “ticks” that have transpired since some arbitrary starting date, called the epoch. Several such conventions are used in computing. The widely-used UNIX time is defined as the number of seconds elapsed since 00:00:00 UTC on 1 January 1970. Microsoft’s FILETIME uses multiples of 100 nanoseconds since 1 January 1601; OpenVMS counts 100-nanosecond intervals since 17 November 1858; and RISC OS uses the number of centiseconds since 1 January 1900. Most computers use the Network Time Protocol (NTP), one of the oldest Internet protocols, to synchronize their clocks to Coordinated Universal Time (see above).
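
Because these system clocks simply count ticks from different epochs, converting between them is a matter of scaling and offsetting. The sketch below (Python) shows an illustrative UNIX-to-FILETIME conversion; the function names are hypothetical, and it assumes the commonly used offset of 11,644,473,600 seconds between 1 January 1601 and 1 January 1970.

```python
# Illustrative conversion between two of the system-time conventions above:
# UNIX time (seconds since 1970-01-01 00:00:00 UTC) and Windows FILETIME
# (100-nanosecond ticks since 1601-01-01).

SECONDS_1601_TO_1970 = 11_644_473_600   # seconds between the two epochs
TICKS_PER_SECOND = 10_000_000           # one tick = 100 ns

def unix_to_filetime(unix_seconds: int) -> int:
    return (unix_seconds + SECONDS_1601_TO_1970) * TICKS_PER_SECOND

def filetime_to_unix(filetime_ticks: int) -> int:
    return filetime_ticks // TICKS_PER_SECOND - SECONDS_1601_TO_1970

# The UNIX epoch itself corresponds to FILETIME 116444736000000000.
print(unix_to_filetime(0))  # -> 116444736000000000
```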

 

Units of Measurement

Seconds
The second is the base unit of time, and other units like minutes, hours, etc, are derived from it

The measurement of time requires the specification of units, but there are many different units of time, some of which may be more appropriate in certain circumstances than others.

SI Units

The International System of Units (Système Internationale d’Unités or SI) defines seven base units of measurement from which all other SI units are derived. The base unit for time is the second (the other SI units are: metre for length, kilogram for mass, ampere for electric current, kelvin for temperature, candela for luminous intensity, and mole for the amount of substance). The second can be abbreviated as s or sec.

Historically, a second was defined by reference to longer periods of time – minutes, hours and days – e.g. as 1/86,400 of a mean solar day (one day being 24 hours x 60 minutes x 60 seconds = 86,400 seconds). Because the Earth’s rotation is slightly irregular, a refined definition based on the Earth’s orbit rather than its rotation – the ephemeris second, a specified fraction of the tropical year 1900 – was briefly used in the mid-20th Century (an ephemeris is a table showing the positions of the heavenly bodies on various dates in a regular sequence).

Since 1967, a second has been technically defined under the SI system in more precise and absolute atomic terms as “the duration of 9,192,631,770 periods of the radiation corresponding to the transition between the two hyperfine levels of the ground state of the caesium 133 atom”. In 1997, this definition was made even more specific with the stipulation that it refers to a caesium atom at rest at a temperature of 0 kelvin (absolute zero).

Given that the Earth is very gradually slowing, and the mean solar day on which the original definition of a second was based has not remained the same, its definition is arguably a historical and cultural choice, even an arbitrary one. But at least the atomic definition we now use, whatever its provenance, will always remain constant. All other units of time measurement are now derived from the second. In fact, because we can measure time more accurately than length, even the SI measurement of the metre is defined in terms of the distance travelled by light in 0.000000003335640952 seconds.

Multiples and Submultiples

Units for periods of time shorter or longer than a second can be derived by applying the standard metric SI prefixes to the second:

Submultiples of the second:

  • decisecond = 1/10 second
  • centisecond = 1/100 second
  • millisecond = 1/1,000 second
  • microsecond = 1/1,000,000 second
  • nanosecond = 10⁻⁹ second
  • picosecond = 10⁻¹² second
  • femtosecond = 10⁻¹⁵ second
  • attosecond = 10⁻¹⁸ second
  • zeptosecond = 10⁻²¹ second
  • yoctosecond = 10⁻²⁴ second

Multiples of the second:

  • decasecond = 10 seconds
  • hectosecond = 100 seconds
  • kilosecond = 1,000 seconds (about 16.7 minutes)
  • megasecond = 1,000,000 seconds (about 11.6 days)
  • gigasecond = 10⁹ seconds (about 31.7 years)
  • terasecond = 10¹² seconds (about 31,700 years)
  • petasecond = 10¹⁵ seconds (about 31.7 million years)
  • exasecond = 10¹⁸ seconds (about 31.7 billion years)
  • zettasecond = 10²¹ seconds (about 31.7 trillion years)
  • yottasecond = 10²⁴ seconds (about 31.7 quadrillion years)

Other Units
Time Units
Flowchart illustrating interrelationships among the major units of time

More commonly, outside of purely scientific usage, other units are used for longer periods of time. Although technically “non-SI” units, because they do not use the decimal system, these units are officially accepted for use with the International System.

  • minute (60 seconds)
  • hour (60 minutes, or 3,600 seconds)
  • day (24 hours, or 86,400 seconds)
  • week (7 days, or 604,800 seconds)
  • month (28-31 days, or 2,419,200-2,678,400 seconds)
  • year (about 365.25 days, or about 31,557,600 seconds)

For even longer periods, some multiples of years are commonly used, e.g. decade (10 years), century (100 years), millennium (1,000 years), mega-annum (1,000,000 years), etc.

In everyday speech, some less exact units of time are also commonly used, e.g. instant, moment, shake, jiffy, season, age, epoch, era, eon, etc. Some of these terms also have specific meanings in certain circumstances (e.g. in periodization), but in general usage their length is indefinite or ambiguous.

Quantum of Time

The chronon is a unit for a proposed discrete and indivisible unit of time in theoretical physics, known as a quantum of time. Such a unit may be used as part of a theory that proposes that time is not continuous but composed of many discrete units. It should be stressed that, according to our current understanding of physics, in both quantum mechanics and general relativity (which together make up most of modern physics), time does NOT come in quantized, discrete packages, but is smooth and continuous – see the section on Quantum Time. A discrete model may however be useful for some more obscure and largely hypothetical theories that try to combine quantum mechanics and relativity into a theory of quantum gravity.

It is not even clear what the value of a chronon might be. One candidate for it is Planck time (an infinitesimal 5.39 x 10⁻⁴⁴ seconds), which is the time required for light to travel in a vacuum a distance of 1 Planck length, and is regarded by most physicists as the smallest time measurement possible, even in principle. Although much too small to have many practical applications, Planck time is consistent with the other Planck units for length, temperature, mass, density, etc, which are sometimes used in theoretical physics. Another possible candidate for the chronon is the time needed for light to travel the classical radius of an electron.
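
As a quick check on the figure quoted above, Planck time can be computed directly from the fundamental constants. The short sketch below (Python) is illustrative, using standard approximate values for ħ, G and c.

```python
import math

# Planck time t_P = sqrt(h-bar * G / c^5), computed from fundamental constants.
hbar = 1.054571817e-34   # reduced Planck constant, J*s
G = 6.67430e-11          # gravitational constant, m^3 kg^-1 s^-2
c = 2.99792458e8         # speed of light, m/s

t_planck = math.sqrt(hbar * G / c**5)
print(f"{t_planck:.3e} s")   # approximately 5.39e-44 s
```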


Periodization

Periodization is the division of time into convenient periods or blocks

When dealing with time scales longer than those usually measured by calendars and clocks, the longer-term past can be divided up into convenient periods or blocks of time in a process known as periodization. Although records exist for most of human history, for longer time scales we must rely on geological and paleontological dating techniques.

To some extent, the start and end of identified periods are necessarily somewhat arbitrary (or at least imprecise), and periods may even overlap. But grouping together periods of time with relatively stable characteristics does provide at least some sort of framework to help us understand what would otherwise be a continuous stream of scattered and apparently random events.

Human Time Scale

The labels used for periodization of the more recent past may utilize many different references, including:

  • calendar dates (e.g. the 1960s, the 17th Century, etc);
  • prominent individuals (e.g. the Victorian Age, the Napoleonic Era, etc);
  • historical or political events (e.g. the post-War years, the pre-Columbian Era, etc);
  • cultural movements (e.g. the Renaissance, the Romantic Period, etc).

Some usages are necessarily geographically specific (e.g. the Meiji Era of Japan, the Merovingian Period of France, etc), and cultural references in one part of the world may be at odds with those in other parts (e.g. the Italian Renaissance of the 14th to 16th Century, the English Renaissance of the 16th and 17th Century, the Harlem Renaissance of the 1920s, etc). Some periods may even be entirely illusory or mythological in nature (e.g. the Golden Age, the Age of Aquarius).

The periodization of human prehistory typically relies on changes in material culture and technology (e.g. the Stone Age, the Bronze Age, the Iron Age, etc). Within these broad general periods, sub-periods can be identified (e.g. the Stone Age can be divided up into the Paleolithic, Mesolithic and Neolithic Periods; the Bronze Age is sometimes preceded by a separate Copper Age; etc). This kind of periodization relies to a large extent on dating techniques such as radiocarbon dating (which is quite accurate within a range of 500 to 50,000 years), and other even longer-scale radiometric dating techniques.

Geological Time Scale
For time scales longer than the human time scale, geological techniques like stratigraphy and the fossil record must be used

Further back into pre-human history, in what is sometimes referred to as “deep time”, there are no human or archaeological markers to use, and geologists, paleontologists and earth scientists use the geological time scale (or geologic time scale). This time scale divides the past into periods on the order of many millions of years, utilizing geological and paleontological techniques like stratigraphy (the study of rock layers) and the fossil record of major events like mass species extinctions.

The concept of geological time had its beginnings with the ground-breaking work of the early geologist James Hutton in the 18th Century, who was the first to realize that the Earth must be many millions of years old, not just the few thousands of years maintained by theologians. Hutton’s early work was supported in the 19th Century by fellow Scot Charles Lyell, and later still by Charles Darwin.

The geologic time scale is usually divided up as follows:

  • eons and supereons – broad timespans (rarely used) covering half a billion years or more, e.g. Hadean Eon, Archean Eon, Proterozoic Eon, Phanerozoic Eon.
  • eras – intermediate timespans of several hundred million years, e.g. the Phanerozoic Eon is split into the Paleozoic, Mesozoic and Cenozoic Eras.
  • periods – intermediate timespans of tens or hundreds of millions of years, e.g. the Mesozoic Era is split into the Triassic, Jurassic and Cretaceous Periods.
  • epochs – shorter timespans of tens of millions of years, e.g. the Early, Middle and Late Jurassic Epochs.
  • ages – even shorter timespans (also rarely used, apart from in highly technical circumstances) typically covering a few million years each, e.g. Hettangian, Sinemurian, Pliensbachian, etc.

Cosmological Time Scale

Outside of these time scales, and beyond even the age of the Earth, the cosmological time scale applies, using the Big Bang (the creation of the universe itself) as a reference point. Physicists have been able to model the time after the Big Bang with remarkable accuracy (see the section on Time and the Big Bang), and have identified various “epochs” depending on the ambient temperature of the universe and the physical phenomena that arose:

  • Planck Epoch – the first 10⁻⁴³ seconds after the Big Bang, theoretically the smallest time period it would ever be possible to measure, and about which we know absolutely nothing.
  • Grand Unification Epoch – 10⁻⁴³ to 10⁻³⁶ seconds after the Big Bang.
  • Inflationary Epoch – 10⁻³⁶ to 10⁻³² seconds.
  • Electroweak Epoch – 10⁻³⁶ to 10⁻¹² seconds.
  • Quark Epoch – 10⁻¹² to 10⁻⁶ seconds.
  • Hadron Epoch – 10⁻⁶ seconds to 1 second.
  • Lepton Epoch – 1 to 10 seconds.
  • Photon Epoch – 10 seconds to about 380,000 years.
  • Reionization Epoch – about 150 million to about 1 billion years after the Big Bang.

In The Five Ages of the Universe, a popular science book by Fred Adams and Gregory Laughlin, the evolution of the universe, from the deep past to the deep future, is split into five ages or eras (also see the section on Time and the Big Bang):

  • Primordial Era – from the Big Bang to about 350,000 years or, in terms of orders of magnitude, up to about 10⁵ years.
  • Stelliferous Era – stars and galaxies are formed, from 10⁶ to 10¹⁴ years.
  • Degenerate Era – stars gradually burn out and die, from 10¹⁵ to 10³⁹ years.
  • Black Hole Era – black holes dominate, but gradually start to break down and evaporate, from 10⁴⁰ to 10¹⁰⁰ years.
  • Dark Era – only random, isolated, low-energy particles remain, after 10¹⁰⁰ years.


Calendars

Calendars – whether stone, paper or electronic – have been used for millennia

A calendar is a system for organizing days and specifying dates, as well as the physical device (whether paper or electronic) used to record such a system. Calendars have historically been designed for social, religious, agricultural, commercial or administrative purposes (or a combination of all of these). A calendar can also be extended into the future and used as a reminder of future planned events.

Calendar and Date Organization

Calendar divisions are based on the movements of the Earth and the regular appearances of the Sun and the Moon, and a calendar typically works by dividing time up into units of days, weeks, months and years. Most of these units are based on objectively verifiable astronomical cycles, although the use of weeks within calendars is purely for administrative convenience and is not tied to any astronomical cycle.

Any calendar system also needs to have a starting or reference point, sometimes referred to as a “fiducial epoch” (or just epoch), from which to begin counting. For example, the old Roman calendar used the assumed founding date of the city of Rome; the widely-used modern Gregorian calendar uses the supposed date of the birth of Christ; the Hebrew calendar uses the estimated date of the creation of the world; etc. A particular date, or the occurrence of a particular event, can then be specified with reference to these unit divisions and the starting reference point (e.g. 3rd of May 2013 CE).

A variety of standard date formats are in use, differing in the order of date components, component separators, whether leading zeros are included, whether all four digits of the year are written, whether the month is represented numerically or by name, etc. The most commonly used date component sequence is the little-endian sequence day-month-year (e.g. 26/12/13), although the big-endian year-month-day is used in several Asian and European countries (e.g. 2013/12/26), and month-day-year is the norm in the United States and, usually, Canada (e.g. 12/26/13).

Types of Calendar

Almost all calendars divide up the days into months and years, but exactly how they do so varies. Most calendars synchronize their periods with the cycle of the Sun or Moon (or both), although some Ancient Egyptian calendars appear to have been synchronized to the motion of the planet Venus and/or the Dog Star Sirius. Because the period of the Moon does not neatly match the period of the Sun, no calendar can be truly based on both, and so a choice must be made, often incorporating periodic adjustments in order to match the two (a trade-off between accuracy and convenience).

The main types of calendar are:

  • Solar calendar (e.g. the Persian Calendar, the Gregorian Calendar) is synchronized to the apparent motion of the Sun over the year and thus remains in line with annual seasonal changes. It makes no attempt to match the changes in the Moon, and the division into months is purely nominal.
  • Lunar calendar (e.g. the Islamic Calendar) is synchronized to the phases of the Moon. Because a lunar month is not an even fraction of a year, a purely lunar calendar tends to drift against the seasons.
  • Luni-solar calendar (e.g. the Hebrew calendar, the Hindu calendar) is based on a combination of both lunar and solar reckonings (i.e. months are based on lunar months, but years are based on solar years), in which most years have 12 months but every second or third year has 13 (including a leap month) in order to realign with the annual seasons.

Intercalation

The insertion of additional leap days or leap months into some calendar years in order to synchronize the calendar to the seasons or moon phases is called intercalation or embolism.

In the case of lunar and luni-solar calendars, the months (known as lunar months or synodic months) approximate the cycle of the Moon’s phases, a period of about 29.5 days, for which many lunar calendars use alternating months of 29 and 30 days. However, because a lunar month is not an even fraction of a year (there are about 12.37 lunar months in a year), a purely lunar calendar tends to drift against the seasons unless adjusted periodically, such as by the addition of a leap month every two or three years.
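
The arithmetic behind this drift is simple enough to show directly. The snippet below (Python) uses round figures consistent with this section; the variable names are illustrative only.

```python
# Why lunar calendars need a leap month: twelve lunar months fall about
# eleven days short of a solar year, so the shortfall reaches a full
# month roughly every three years.

SYNODIC_MONTH = 29.53   # days, average lunar month
SOLAR_YEAR = 365.24     # days, tropical year

lunar_year = 12 * SYNODIC_MONTH          # about 354.4 days
annual_drift = SOLAR_YEAR - lunar_year   # about 10.9 days per year
years_per_leap_month = SYNODIC_MONTH / annual_drift   # about 2.7 years

print(round(lunar_year, 1), round(annual_drift, 1), round(years_per_leap_month, 1))
```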

In the case of solar calendars, the months are fractions of the tropical or solar year (i.e. the length of time the Sun takes to return to the same position in the cycle of seasons, as seen from Earth). Even with a solar calendar, though, the number of days in a year is not an exact whole number (approximately 365.242), so that a system of adding an extra leap day every fourth year (leap years) is instituted in many solar calendars, or more complex variations thereof.

Ancient Calendars
Stonehenge
Ancient megalithic structures like Stonehenge probably had some calendrical function

Ancient peoples used the apparent motion of the celestial bodies (the Sun, the Moon, the planets and the stars) through the sky to determine the seasons, the length of the month, and the length of the year. Many early civilizations developed calendars independently.

The oldest known calendar is a lunar calendar discovered near the town of Crathes in Scotland, which dates to around 8,000BCE. It consists of 12 pits in an arc 54 metres long that seem to correspond with 12 lunar months, plus an added correction to bring the calendar back into sync with the solar year on the date of the winter solstice. The megalithic standing stones of Stonehenge in southern England, begun around 3100BCE and rebuilt and added to many times over the succeeding 1,500 years, served various purposes, one of which was the determination of seasonal or celestial events, such as lunar eclipses, solstices, etc.

The Sumerian calendar is the earliest written calendar of which we have any evidence, dating back to as early as c. 3000BCE. The Sumerians used a luni-solar calendar, dividing the year into 12 lunar months of 29 or 30 days (for a total of 354 days), each beginning with the sighting of the new moon, plus an additional leap or intercalary month inserted as needed by decree of the priesthood, in order to synchronize with the 365-day solar year. The months were often referred to simply as “first month”, “second month”, etc, but also went by the names Nisanu, Aru, Simanu, Dumuzu, Abu, Ululu, Tisritum, Samna, Kislimu, Ṭebetum, Sabaṭu, Adar and Ve-Adar (the leap month).

The Babylonians carried on most of the ideas and knowledge of the Sumerians. They used a very similar luni-solar calendar to the Sumerians, with 12 lunar months plus an intercalary month inserted as needed. The Babylonians improved the overall accuracy of their calendar still further by using 12 years of 12 months followed by 7 years of 13 months, in a 19-year cycle. The artificial administrative concept of weeks was not introduced until the time of the late Babylonians and the Chaldeans, and they named the days after the Sun, Moon and the five known planets (the number seven was also widely held as auspicious by many ancient cultures, and enshrined in the creation myth of the Bible, thus assuring the legacy of the seven-day week).

The principal ancient Egyptian calendar was a solar calendar with a year that was 365 days long, divided into 12 months of 30 days each, with five extra festival days added at the end of the year. Prior to this system, the Egyptians used a lunar calendar but, realizing that it was not able to help predict important agricultural events like the annual flooding of the Nile, the Egyptians became the first to begin using a calendar based purely on the solar year. In fact, for a period of over 2000 years, Egypt may have had three different calendars working concurrently: a stellar calendar for agriculture, a solar calendar of 365 days for civil administration, and a quasi-lunar calendar for festivals. Initially, the Egyptian solar calendar was not intercalated with leap days, and so astronomical events gradually varied in their incidence over the years. But in about 238BCE, Ptolemy III ordered that an extra day be added to every fourth year for increased accuracy, similar to the Julian leap year. The months were divided into three weeks of ten days each, and the year as a whole was divided into 3 seasons, akhet  or Inundation (of the Nile), peret or Growth (Winter) and shemu or Harvest (Summer).

The ancient Hebrew or Jewish calendar, at least since the time of the Babylonian exile (538BCE), was a luni-solar calendar based on that used by the Sumerians and Babylonians, using twelve lunar months alternating between 29 and 30 days, with the addition of an intercalary month every two or three years to synchronize the lunar cycles with the longer solar year. The months were named Tishrei, Marcheshvan, Kislev, Tevet, Shevat, Adar, Nisan, Iyar, Sivan, Tammuz, Av and Elul (the intercalary month was referred to as Adar I), and the beginning of each lunar month was based on the appearance of the new moon. The starting point of Hebrew chronology is the year 3761BCE, the putative date of the creation of the world as described in the Old Testament.

The ancient Mayan calendar, along with those of other related Mesoamerican civilizations, was perhaps the most complex of all, and by some measures the most accurate, with an error margin of just 2 days over 10,000 years. They used two different parallel systems: the 260-day Sacred Round, and the 365-day Vague Year. The Sacred Round consisted of 13 numbered “months”, each of which contained 20 named days (Imix, Ik, Akbal, Kan, Chicchan, Cimi, Manik, Lamat, Muluc, Oc, Chuen, Eb, Ben, Ix, Men, Cib, Caban, Etznab, Cauac and Ahau), and this was the calendar used for purposes such as naming individuals, predicting the future, deciding on auspicious days for battles, marriages, etc. Some other Mesoamerican cultures used a series of twenty 13-day trecenas instead. The Vague Year, on the other hand, consisted of 18 named “months” (Pop, Uo, Zip, Zotz, Tzec, Xul, Yaxkin, Mol, Chen, Yax, Zac, Ceh, Mac, Kankin, Muan, Pax, Kayab and Cumku) of 20 numbered days each, with a five-day period at the end, known as Uayeb, which was considered unlucky. These two different cycles only coincided every 52 years, a period which was therefore considered in a similar way to modern centuries. The Maya fully expected to see history repeat itself every 260 years, after the full cycle of their calendar system.
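
The 52-year coincidence mentioned above is simply the least common multiple of the two cycle lengths, as the short check below (Python) illustrates.

```python
import math

# The Sacred Round (260 days) and the Vague Year (365 days) realign only
# after their least common multiple: the 52-year "Calendar Round".

sacred_round = 260
vague_year = 365

calendar_round_days = sacred_round * vague_year // math.gcd(sacred_round, vague_year)
print(calendar_round_days, calendar_round_days / vague_year)   # 18980 days = 52 Vague Years
```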

The ancient Chinese calendar (or Han calendar) was a luni-solar calendar dating back to the Han Dynasty of the 2nd Century BCE, although similar luni-solar calendars had been in use there for almost a millennium by that time. It, or a version of it, is still used for civil purposes today in China, Japan, Korea, Vietnam, etc. The Han calendar used 12 lunar months of 29 or 30 days with 7 intercalary months every 19-year cycle (similar to, though developed independently of, the Babylonian system). Under this system, the Sun and Moon returned to their exact original relative positions after every 76 years. A reform in 1281 fixed the Chinese calendar at the equivalent of 365.2425 days, the same accuracy as the Gregorian calendar established in the West some three centuries later.

The ancient Greek calendar (also known as the Attic or Athenian calendar) was a luni-solar calendar, consisting of 12 named months of 29 or 30 days each (totalling 354 days), with a leap month added every third year to synchronize with the solar year. The months were named Hekatombaion, Metageitnion, Boedromion, Pyanepsion, Maimakterion, Poseideon, Gamelion, Anthesterion, Elaphebolion, Mounichion, Thargelion and Skirophorion, and were grouped into the familiar four seasons of summer, autumn, winter and spring. The additional leap month was achieved by repeating an existing month, so that the same month name was used twice in a row. However, there was also a “conciliar calendar”, maintained parallel to the main “festival calendar”, which divided the year according to the 10 (or later 11, 12 or 13) phylai, or sub-divisions of the Athenian population. Because of this variation over time, documents or events dated by this method are notoriously difficult to translate into modern calendar dates. A third, less official, calendar of seasons (using star risings to fix points in time) was also used for agricultural or maritime purposes.

The Islamic calendar was (and still is today) a lunar calendar consisting of 12 months alternating between 29 and 30 days, totalling 354 days in a year. The months are named Muḥarram, Ṣafar, Rabi’ al-Awwal, Rabi’ al-Thani, Jumada al-Awwal, Jumada al-Thani, Rajab, Sha’aban, Ramaḍan, Shawwal, Dhu al-Qi’dah and Dhu al-Ḥijjah. In each thirty year cycle, the 2nd, 5th, 7th, 10th, 13th, 16th, 18th, 21st, 24th, 26th and 29th years are leap years of 355 days. The Islamic calendar is used by Muslims to determine the proper days on which to observe Ramadan (the annual fast), to attend Hajj (pilgrimage to Mecca), and to celebrate other Islamic holidays and festivals. However, as a purely lunar calendar, the months tend to drift against the seasons (and against solar-based calendars), so that it was not practical to use for agricultural purposes, and historically other calendars have been used for such purposes. Islamic years are counted from the Hijra in 622CE (the year when Muhammad emigrated from Mecca to Medina).

There were of course many other alternative calendars (e.g. Gaulish, Hindu, Zoroastrian, etc), each with their own underlying philosophy, and often with their own idiosyncratic twists and foibles. But the above-mentioned – along with the Roman calendar described below – are probably the most important and influential.

Roman, Julian and Gregorian Calendars
Ancient Rome
The Julian and Gregorian calendars most people use today were adapted from the Ancient Roman calendar

The earliest Roman calendar consisted of 304 days divided into 10 months (Martius, Aprilis, Maius, Iunius, Quintilis, Sextilis, September, October, November and December), with the winter days after the end of December and before the beginning of the following March not being assigned to any month. To account for these “unassigned” winter days, an additional two months, Ianuarius and Februarius, were added as a stop-gap measure by Numa Pompilius in the early 7th Century BCE, resulting in 12 lunar months of 29 or 30 days, totalling 355 days, with a leap month of 22 or 23 days, known as Mercedonius or Intercalaris, added from time to time as dictated by the ruling Roman High Priest.

Roman years were dated from the founding of the city of Rome, which was originally assumed to be about 750BCE, although this date was later revised. However, both in everyday conversation and usually in official records, dates were commonly referred to according to the names of ruling consuls. Complications arose in this system because there were two consuls ruling at any one time, and because identical or similar names were common. The Romans initially used an eight-day week (based on its market cycle, and inherited from the Etruscans of antiquity) for commercial and administrative purposes, although as the empire expanded it encountered more and more cultures using the seven-day week, which it eventually adopted. The Romans also used a rather convoluted system of calculating the days of the month backwards from the publicly-declared fixed days of Kalends (1st day of the month), Nones (5th or 7th day of the month, depending on the particular month’s length) and Ides (13th or 15th day of the month).

Later, when these conflicting and confusing systems became completely inoperable, the Roman calendar was drastically overhauled, on the advice of the Greek astronomer Sosigenes, by Julius Caesar during his third consulship in 46BCE, reforms that were completed by Augustus two years later. The resulting Julian calendar became the predominant calendar throughout most of Europe (as well as some Muslim countries), until superseded by the Gregorian calendar in 1582. The Julian calendar was a solar calendar consisting of a regular year of 365 days divided into 12 months, with a leap day added in February (Februarius) every four years. The month names were the same as the earlier Roman calendar, although the lengths of the months were adjusted to the 30 and 31 day months (with a rogue 28/29 day month in February) we are familiar with today.

Although the Julian calendar was widely used throughout the Middle Ages, different countries still used various local systems to count or identify years. These were usually regnal years, based on the reign of a particular sovereign or leader, in the Biblical fashion, although cyclical events (e.g. the 4-year Greek Olympiads, the 12-year Chinese animal cycle, etc) were used in some cases. Individual days of the year were usually indicated in relation to one of the church feasts, e.g. the 4th day after Easter in the 6th year of the reign of a particular king.

To replace this rather awkward system, the Christian monk Dionysius Exiguus introduced, in the 6th Century, the Anno Domini (AD) system of counting years from the putative (probably erroneous) birth of Christ. The new system caught on only gradually and was not widely used until the 11th to 14th Century, and the use of the BC sequence, counting backwards from the birth of Christ, was not introduced until as late as the 17th Century. The lack of a year zero (the year before AD1 is 1BC) and the awkward necessity of counting backwards for dates BC (before Christ) were drawbacks to this system, but it has nevertheless remained a much more popular reference point than other proposed alternatives, such as the Byzantine church’s use of the creation of the world (which they calculated to be 5509BC), or the Jewish creation date (calculated as 3761BC). The more recent (non-religious, and more politically correct) use of CE and BCE, using the abbreviations for Common Era (or Current Era or Christian Era) and Before Common Era instead of AD and BC, has not changed that reference date, which has now become the de facto standard worldwide.

The Gregorian calendar (also sometimes referred to as the Western calendar or Christian calendar or New Style calendar) was a minor refinement to the Julian calendar, introduced by Pope Gregory XIII in 1582, based on a proposal by Aloysius Lilius. The Julian year, with its leap year every four years, was still 11 minutes 14 seconds longer than the exact solar year on average, and this error had been accumulating for centuries. The Gregorian calendar reform reduced the number of leap years, so that years divisible by 100 (that would otherwise be leap years) were now not to be considered leap years, unless the year was also divisible by 400. This had the effect of correcting the length of a year from 365.25 days to a more accurate 365.2425 days, so that, while the Julian calendar carried forward an error of about 11 minutes each year, the Gregorian calendar was accurate to within about 26 seconds a year. This is accurate enough that an extra leap day to compensate for accumulating errors will not be needed until about the year 4000. The 1582 reform also dealt with the cumulative effect of the 11-minute Julian calendar errors (dating back to the First Council of Nicaea in 325CE) by completely skipping the 10 days from 5th to 14th October in the year 1582.
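
The century rule described above is easy to state in code. The snippet below (Python) is a minimal sketch of the Gregorian leap-year test; the function name is illustrative.

```python
def is_gregorian_leap_year(year: int) -> bool:
    """Gregorian rule: every 4th year is a leap year, except century years
    not divisible by 400 (so 1900 was not a leap year, but 2000 was)."""
    if year % 400 == 0:
        return True
    if year % 100 == 0:
        return False
    return year % 4 == 0

print([y for y in (1600, 1700, 1900, 2000, 2024) if is_gregorian_leap_year(y)])
# -> [1600, 2000, 2024]
```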

Although many Protestant and Eastern Orthodox countries continued to use the old Julian calendar for a time, the Gregorian calendar was slowly adopted throughout Europe (when the Gregorian calendar was finally adopted in Britain in 1752, for example, an 11-day correction needed to be made to account for further accumulated errors). Most branches of the Eastern Orthodox Church still use the Julian calendar (or the Revised Julian calendar) for calculating the dates of their moveable feasts, like Easter, and it is still used by the Berber people of North Africa. The Gregorian calendar, though, is now used throughout most of the Western world as well as in many parts of Asia, and it has become the unofficial global standard, recognized by most international institutions, including the United Nations.

Various “improvements” to the Gregorian calendar system have been proposed over the years, such as Marco Mastrofini’s proposed World Calendar or Moses Cotsworth’s International Fixed Calendar, but none have ever gained enough traction to merit the huge administrative inconvenience involved.


Clocks

Clocks of one sort or another have been around for thousands of years

A clock is any free-standing device or instrument for measuring or displaying the current time. The English word “clock” comes from the Celtic words clocca and clogan, both meaning “bell”. A chronometer is an exceptionally precise mechanical timepiece, designed to be accurate in all conditions of temperature, pressure, etc, especially one used at sea. Watches are sometimes distinguished from clocks in general, but really a watch is just a portable clock, usually worn in a pocket or on the wrist.

Really, a clock can be anything that repeats itself in a predictable way. The rotation of the Earth is a good example of something that is both repetitive and predictable, and indeed the turning of the Earth, as measured by the position of the Sun in the sky, was the first method mankind used to estimate the time of day. Other kinds of clock mechanisms, from the passage of water out of a vessel to the swinging of a pendulum to the vibration of a quartz crystal or the oscillation of a microscopic atom, are really just more manageable and more accurate variations on the same idea.

Although the principle behind a clock is relatively simple, the technological and engineering challenges involved in building a clock of any accuracy are relatively complex, and much of the history of clocks in the last two millennia has revolved around the invention of more and more complex machines that are able to overcome the intrinsic problems of friction, temperature differences, movement, magnetic fields, size, etc, in search of the ultimate in accurate, reliable and practical timekeeping devices.

Division of Time

We use clocks to divide the day into smaller increments. Although a 10-hour clock was briefly popular during France’s experiment with metric time measurement after the French Revolution, the 12-hour clock – a convention dating back to ancient Egypt and Babylon – has continued to be the norm, along with 24-hour clocks for some (mainly military and astronomical) applications. The standard sexagesimal system of time measurement – 60 minutes in an hour, 60 seconds in a minute – also owes its ancestry to the sexagesimal system used in Sumer and Babylonia from around 2000BCE, in which 60 was the base number for most mathematical and counting purposes.
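
That sexagesimal division survives in the way a raw count of seconds is still broken into hours, minutes and seconds. A two-line illustration (Python, with an illustrative function name) follows.

```python
# Breaking a count of seconds into the familiar sexagesimal
# hours : minutes : seconds, the legacy of Sumerian base-60 counting.

def to_hms(total_seconds: int) -> tuple[int, int, int]:
    minutes, seconds = divmod(total_seconds, 60)
    hours, minutes = divmod(minutes, 60)
    return hours, minutes, seconds

print(to_hms(86_399))   # -> (23, 59, 59), the last second of a day
```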

Our present system of dividing the day and night into 24 equal hours was instituted around the 14th Century. Prior to that, other than for some technical astronomical purposes, the day was usually split into 12 equal hours and the night into 12 equal hours, a practice dating back to ancient Egypt (although in practice, the night was often divided into 3 or 4 “watches” for security reasons). This meant of course that daytime hours were not necessarily of the same length as nighttime hours (except at the equinoxes), and hours at different times of the year would also vary in length. The advent of mechanical clocks institutionalized the use of 24 equal hours per day.

The first reference to clocks using 12 hours before noon and 12 hours after noon (all equal) comes from England in about 1380, although splitting the day into AM (ante meridiem, or before midday), and PM (post meridiem, or after midday) actually dates back to Roman days. The modern convention of starting the day at midnight, in the Roman style, is perhaps an arbitrary one, although a widely used one for all that, and the hour at which the day is considered to begin has varied throughout history and across different cultures. The ancient Greeks and Babylonians counted the day from sunset, as do orthodox Jews and Muslims even today; the Egyptians reckoned the hours from sunrise; in ancient Umbria the new day began at noon.

Ancient Clocks
Sundials were one of the earliest kinds of clock

The clock is one of the oldest human inventions, and a bewildering array of different mechanisms have been employed over the millennia.

The sundial, or shadow-clock, which measures the time of day by means of a shadow cast by the Sun onto a marked surface, was widely used in ancient times, and can give at least a reasonably accurate reading of the local solar time. In a typical horizontal sundial, the Sun casts the shadow of a gnomon (a thin vertical rod or shaft) onto a horizontal surface marked with lines indicating the hours of the day. Some sundials may also be set vertically on the sides of buildings, and a wide variety of different designs have been employed over the centuries. As the Sun moves across the sky, the edge of the shadow aligns with the different hour markings. Sundials are therefore only of use during daylight hours, and when sufficient sun is shining, although in the Middle Ages an instrument called a nocturnal was developed which measured the nighttime hours according to the positions of the North Star or some other combination of stars (although again only when atmospheric conditions allowed).

Obelisks may have been used for shadow timekeeping in ancient Egypt as early as 3500BCE, but the first purpose-built sundials date from around 1500BCE, also in Egypt, and in Babylonia not too much later. The ancient Greek, Roman and Chinese civilizations all used sundials extensively, mainly based on earlier Egyptian and Babylonian designs. The idea of using a gnomon parallel to the Earth’s axis so that the hour lines indicate equal hours on any day of the year was first developed in the Islamic Caliphate in the 14th Century, and began to be used throughout Europe and elsewhere thereafter to give a better and more accurate reading of the time regardless of the seasons. Sundials were still being widely used as late as the 19th Century. Our use of time measurement units like minutes and seconds, which were originally radial angle measurements in geometry, probably arises from this method of measuring the time of day.

The water clock, or clepsydra, is, along with the sundial, probably the oldest time-measuring instrument. A simple water clock measures time by measuring the regulated flow of water into or out of a vessel of some sort. Water clocks existed in ancient Egypt and Babylonia at least as early as 1600BCE, and possibly significantly earlier in India and China, although the documentation is vague. Early water clocks were probably not very accurate, and even differences in ambient temperatures could result in significant errors. The ancient Greeks and Romans are usually credited with improving their accuracy by the use of complex gearing and escapement mechanisms (which transfer rotational energy into intermittent motions) around the 3rd Century BCE. They passed these designs on to Byzantium and the Islamic world, and thence back to Europe, although the Chinese also independently made similar improvements to their water clocks by about the 8th Century CE. Water clocks were usually calibrated using a sundial and, while never reaching a level of accuracy comparable to today’s standards of timekeeping, they were the most accurate and commonly-used timekeeping devices for many centuries, until replaced by more accurate pendulum clocks in 17th Century Europe.

Candle clocks have also been used since antiquity, although it is difficult to establish any firm dates. A candle clock utilizes a slow-burning thin candle, with markings that indicate the passing of the hours as the candle burns down throughout the day (or night). A series of sticks of incense that burn down at a reasonably predictable speed were used in ancient Sparta, with each different smell denoting a particular hour of the day. In ancient China, the practice was to burn a knotted rope, noting the length of time required for the fire to travel from one knot to the next. Perhaps the most sophisticated candle clocks date from the Islamic Golden Age (12th – 13th Century).

The hourglass or sandglass, in which fine sand pours through a tiny hole at a constant rate, has long been used to indicate the passage of a predetermined period of time, even if it could not be used to tell the absolute time of day. Typically, an hourglass has two connected vertical glass bulbs, which allow a regulated trickle of sand (or sometimes crushed eggshell, which does not erode the glass as much) from the top to the bottom bulb. Once the top bulb is empty it can be inverted to begin timing again. Although the concept goes back to antiquity, the hourglass as we know it appears to have been invented in medieval Europe, and was in common use in the Middle Ages, particularly on board ships.

Mechanical Clocks
Mechanical Clock
Mechanical and astronomical clocks of great complexity were developed during the Middle Ages

Mechanical clocks, governed by continually repeated mechanical (“clockwork”) motion, began to be developed independently in China, the Middle East and Europe during the early Middle Ages. They used a simple controlled release of power or escapement mechanism (a method of gradually and smoothly translating rotational energy into an oscillating motion that can be used to count time), as well as all manner of toothed wheels, ratchets, gears and oscillating levers. Early mechanical clocks were huge devices housed in church towers, and most early models still did not utilize a clock face or hands, but just struck the hour for religious and administrative purposes.

By the late 14th Century, the convention of a rotating hour hand on a fixed dial was adopted. Initially, accuracy was low, and errors of 15 minutes to an hour each day were common (minute hands were therefore not used). Gradually, though, these clocks began to acquire more and more extravagant features such as automata (moving figures) and complex astronomical depictions of the phases of the Moon, star maps, etc. The abbot of St. Alban’s abbey, Richard of Wallingford, built a famous mechanical clock as an astronomical orrery (a mechanical model of the solar system) as early as about 1330. Another famous astronomical clock was built in Strasbourg in 1352.

Spring-driven clocks began to appear in the 15th Century in Europe, and gradually new innovations were developed in order to keep the clock movement running at a constant rate as the spring ran down. Once wound, a spring-driven clock conserves energy by means of a gear train, with a balance wheel regulating the motive force. Around 1500, Peter Henlein, a locksmith in Nürnberg, Germany, began producing portable timepieces known popularly as “Nürnberg eggs”. As accuracy increased (correct to within a minute a day), clocks began to appear with minute hands in the 16th Century, principally in Germany and France, one of the first being a 1577 clock made by Jost Burgi for the astronomer Tycho Brahe, who needed an accurate clock for his stargazing. However, minute hands only came into regular use around 1690. By the time of the scientific revolution, clocks and their workings had become miniaturized enough for sufficiently wealthy families to share a personal clock, or perhaps even a pocket watch.

In 1656, the Dutch scientist Christiaan Huygens developed the pendulum clock, following earlier ideas of Galileo, who had discovered the isochronism, or constant period, of a pendulum’s motion as early as 1583. The pendulum clock used a swinging bob to regulate the clock motion, achieving an accuracy of within 10 seconds per day. Such accuracy made the use of minute hands and even second hands a practical proposition. It had been known since 1644 that a pendulum with a shaft length of precisely 0.994 metres (about 39.1 inches) has a period of precisely two seconds, one second for a swing in one direction and one second for the return swing. During the 17th Century, the centre of clock-making production and innovation moved to England and, in 1670, William Clement created the anchor escapement as an improvement on the old verge (or crown wheel) escapement, which had been used since the 14th Century, and he also encased the pendulum in the classic long-case or “grandfather” clock.
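
The 0.994-metre figure quoted above follows directly from the standard small-angle formula for a pendulum’s period; the short check below (Python, assuming typical surface gravity) reproduces the two-second period.

```python
import math

# Small-angle pendulum period: T = 2 * pi * sqrt(L / g).
# A "seconds pendulum" (one-second swing each way, two-second period)
# needs a length of roughly 0.994 m at typical surface gravity.

g = 9.81          # m/s^2, approximate surface gravity
L = 0.994         # metres, the length quoted in the text

T = 2 * math.pi * math.sqrt(L / g)
print(round(T, 3))   # about 2.0 seconds
```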

In 1675, Huygens and Robert Hooke made another crucial advance with the spiral balance, or hairspring, to control the oscillating speed of the balance wheel. Englishman Thomas Tompion also successfully used this mechanism in pocket watches or fob watches (a fob is a pocket designed to hold a watch, or the chain or ribbon that attaches it). In 1761, another Englishman, John Harrison, made various improvements which allowed accurate clocks to be used at sea, an instrument known as the marine chronometer, which provided an important boost for navigation (the measurement of longitude requires an accurate knowledge of time). Harrison received a handsome £20,000 reward (equivalent today to around $4.5 million) from the British government for his solution to the intractable problem of longitude. Second hands began to be commonly added to long-case clock dials around 1780, and jewelled bearings to reduce friction and prolong the life of clockworks were introduced in the 18th Century. The cuckoo clock (containing a carved wooden bird that emerges and “sings” to tell the time) made its first appearance in the Black Forest region of Germany in the mid-17th Century, although the classic “chalet-style” cuckoo clock originated in Switzerland in the late 19th Century.

The accuracy and reliability of pendulum and spring-driven clocks, and their increasing cheapness and ubiquity, marked a big shift in everyday life in much of the developed world, as people moved from telling the time by natural signs or events to measuring it by the clock, with all the repercussions this had for work practices, productivity, industrial development, etc. Pendulum clocks continued to be widely used in the 18th and 19th Century, and Switzerland gradually established itself as the pre-eminent clock-making centre during the 19th Century.

Alarm clocks have been around almost as long as clocks themselves. Some water clocks in classical times were adapted to strike an alarm, and some medieval mechanical clocks were also capable of chiming at a fixed time every day. Early user-settable mechanical alarm clocks, in which the alarm was set by placing a pin in the appropriate hole of a ring of holes in the clock dial, date back at least to 15th Century Europe. But the traditional mechanical wind-up alarm clock, that could be set for any time and was small enough to use on a bedside table, was patented by the American Seth E. Thomas in 1875.

Mass production of clocks with interchangeable parts began in the United States in the late 18th Century, and in 1836 the Pitkin brothers of Connecticut produced the first American-designed watch, and the first containing machine-made parts. New innovations and the economies of mass production soon made the United States the leading clock-making country of the world, and competition reduced the price of a clock to $1 or less, so that for the first time most families could afford a clock. In 1884, at the International Meridian Conference, it was decided to place the Prime Meridian at Greenwich, England, establishing the international baseline of Greenwich Mean Time (GMT).

Electric clocks, which wind the mainspring using an electric motor, first arrived in 1840, patented by Scottish clockmaker Alexander Bain, and were further developed commercially by an American, Henry E. Warren, in the early 1900s. By the end of the 19th Century, the invention of the dry cell battery made electric clocks a practical proposition, and mechanical clocks gradually came to be largely powered by batteries, removing the need for daily winding. By the 1930s, electric clocks were the most widely-used type of clock.

Meanwhile, pocket watches began to be replaced by wristwatches after the First World War, when the success of wristwatches in military operations finally made them acceptable fashion accoutrements. The Swiss further developed their market dominance at this time through their mastery of the intricacies of high-quality wristwatches. The British Broadcasting Company began broadcasting its famous hourly time signal of six pips over the radio in 1924, allowing ordinary people to synchronize their watches with great accuracy.

Modern Clocks
Alarm Clock
During the 20th Century, electronic, digital and super-accurate atomic clocks have been developed

The development of electronics in the early decades of the 20th Century led to electronic clocks with no clockwork parts at all, with the timekeeping regulated by methods as varied as the vibration of a tuning fork, the piezoelectric behaviour of quartz crystals, and even the quantum vibration of atoms of caesium or rubidium (all of which are, however, examples of oscillatory motion, the same general method as that employed by simple pendulum clocks).

The development of the quartz clock in the late 1920s finally made electronic clocks more accurate and reliable than pendulum clocks. Using vibrating quartz crystals as oscillators allowed the production of clocks that were accurate to a few ten-thousandths of a second per day. The advances in microelectronics in the 1960s, particularly in Japan, made quartz clocks both compact and cheap to produce, so that by the 1980s they had become the dominant timekeeping technology for both clocks and wristwatches. Even in the digital age, though, Switzerland has managed to retain its reputation for quality mechanical watches, although ironically more as prestige jewellery items and status symbols than as ultra-accurate timepieces.
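To put the quartz figures above into perspective, the quoted drift of a few ten-thousandths of a second per day corresponds to a fractional frequency error of only a few parts per billion; later quartz wristwatch movements conventionally use a 32,768 Hz crystal divided down in binary to one pulse per second. A small illustrative sketch (the drift figure is the one quoted above, not a measured value):

```python
# Illustrative arithmetic only: the quartz-clock drift quoted above expressed as a
# fractional frequency error, plus the binary division used in later quartz watches.

seconds_per_day = 86_400
drift_per_day = 3e-4                     # "a few ten-thousandths of a second per day"
fractional_error = drift_per_day / seconds_per_day
print(f"fractional error ~ {fractional_error:.1e}")   # ~3.5e-09, a few parts per billion

crystal_hz = 32_768                      # conventional watch-crystal frequency, 2**15
print(crystal_hz / 2**15)                # 15 halvings give exactly 1.0 pulse per second
```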

Atomic clocks, which use the oscillation frequencies of the electromagnetic spectrum of atoms (principally caesium atoms) to regulate their timekeeping, are currently the most accurate clocks available (with an accuracy of around 10⁻⁸ seconds per day, or about 1 second in 316,000 years). Indeed, they keep time better and more consistently than the rotation of the Earth and the movement of the stars, and they have been used as primary standards for international time distribution services since the 1960s. A hydrogen maser clock at the US Naval Research Laboratory in Washington DC, which uses hydrogen atoms instead of caesium, is believed to be accurate to 1 second in 1.7 million years, currently the most accurate clock in use, although it is believed that super-cooled hydrogen maser clocks could reach accuracies approaching 1 second in 300 million years.
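The per-day and per-316,000-year figures are two expressions of the same thing: a fractional frequency stability of roughly one part in 10¹³, a commonly cited round number for caesium-beam standards and used here purely for illustration. A quick consistency check:

```python
# Consistency check on the atomic-clock figures quoted above, assuming a fractional
# frequency stability of 1 part in 10**13 (a round number used purely for illustration).

seconds_per_day = 86_400
seconds_per_year = seconds_per_day * 365.25

fractional_stability = 1e-13
print(f"drift per day: {fractional_stability * seconds_per_day:.1e} s")   # ~8.6e-09 s

years_to_drift_one_second = 1 / (fractional_stability * seconds_per_year)
print(f"years to drift 1 second: {years_to_drift_one_second:,.0f}")       # ~316,881, i.e. roughly 316,000 years
```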

The electronic revolution of the 20th Century has also made possible digital clocks, which display a numeric representation of the time using LCD, LED or VFD displays, and analog clocks (with hands) have declined in popularity ever since. Digital clocks are now found on all computers, cellphones, etc, as well as on the electronic timers for central heating systems, ovens, VCRs, etc. Indeed, the youth of today are more likely to use their cellphone to tell the time than to wear a wristwatch. Many newer clocks and computer-based applications even reset themselves based on radio or internet time servers that are tuned to ultra-accurate atomic clocks. Auditory clocks, tactile clocks and Braille watches are also available for those with limited sight.

Satellite navigation systems like the Global Positioning System (GPS) require an unprecedentedly accurate knowledge of time. For example, a time error of just 1 microsecond can translate to a spatial error of about 300 metres. GPS signals are now provided by a network of around 31 Earth-orbiting satellites, each carrying its own atomic clocks and kept synchronized with ground-based standards.
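The 300-metre figure is simply the distance light travels in a microsecond: a GPS receiver works out its position from signal travel times, so any timing error scales directly into a distance error at the speed of light. A minimal sketch of that conversion:

```python
# Why satellite navigation needs such precise time: a GPS receiver fixes its position
# from signal travel times, so a timing error becomes a distance error at the speed of light.

SPEED_OF_LIGHT_M_PER_S = 299_792_458

def position_error_m(time_error_s: float) -> float:
    return SPEED_OF_LIGHT_M_PER_S * time_error_s

print(f"{position_error_m(1e-6):.0f} m")    # 1 microsecond -> ~300 m
print(f"{position_error_m(1e-9):.2f} m")    # 1 nanosecond  -> ~0.30 m
```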

Atomic clocks and modern time standards (see the section on Time Standards) are able to keep incredibly precise time measurements consistent across the world. They are also able to account for the tiny and gradual changes needed to keep the time compatible with astronomical data. For example, the Earth’s motion is slightly perturbed by the gravitational attraction of the other planets, and there is a gradual shift in the orientation of Earth’s axis of rotation (precession), so that the length of the tropical year is slowly decreasing: at the end of the 19th Century, the tropical solar year was 365.242196 days; at the end of the 20th Century, it was 365.242190 days. Also, tidal friction is gradually slowing the Earth’s rotation, and lengthening the day by almost 2 milliseconds every century (so that, in roughly 200 million years, a day will actually be 25 hours long), and large natural cataclysms like earthquakes and hurricanes can also have small but measurable effects on the Earth’s rotation.
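The 25-hour-day claim can be checked with back-of-the-envelope arithmetic: at the stated rate of roughly 2 milliseconds of lengthening per century, accumulating a full extra hour takes on the order of 200 million years (the real rate varies over geological time, so this is only an order-of-magnitude estimate):

```python
# Order-of-magnitude check: at ~2 ms of day-lengthening per century, how long until
# the day is a full hour (3600 s) longer than today? (The real rate is not constant.)

lengthening_per_century_s = 0.002
extra_seconds_needed = 3600

centuries = extra_seconds_needed / lengthening_per_century_s
print(f"{centuries * 100:,.0f} years")   # 180,000,000 years at the assumed constant rate
```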

The Clock of the Long Now is a recent project to design and build a mechanical clock that will function and keep accurate time for at least 10,000 years, and that will tell the time in a way that would be intelligible to any future civilization. While not as accurate as an atomic clock, it is a good example of the long-term view of the importance of timekeeping that we humans are now starting to take.

>> Calendars

Measurement of Time

Watches
Watches are just one of many possible instruments of time measurement

Time provides us with a measure of change by putting dates on moments, fixing the durations of events, and specifying which events happen before which other events. In order to do that, some method of time measurement is needed. The science or art of the accurate measurement of time is known as chronometry (or, less formally, timekeeping). A similar concept, horology, usually refers to mechanical timekeeping devices or timepieces. Time can be measured either in terms of the absolute moment at which a particular event occurs, or in terms of a time interval, i.e. the duration of a continued event.

There are two main methods used in the everyday measurement of time, depending on the accuracy required or the interval covered. A clock is a physical mechanism that counts the ongoing passage of time, and is mainly used for more accurate timekeeping and for periods of less than a day. A calendar is a mathematical abstraction used for calculating more extensive periods of time (i.e. longer than a day). Typically, both methods are used together to specify when in time a particular event occurs (e.g. 12:30PM on 16 December 2013). Even before such methods were devised, mankind used more informal methods for basic timekeeping, such as the cycle of the seasons, the alternation of day and night, and the position of the Sun in the sky.

Chronology, as opposed to chronometry, is the science of arranging events in their order or sequence of occurrence in time, and is mainly used for studying the past. For convenience, events can be put into chronological groups, a process known as periodization. Chronology, periodization and the interpretation of the past are together known as the study of history.

The measurement of time involves the use of various different units of measurement, depending on the time scales and periods under consideration. These range from the almost infinitesimal units employed in physics, through the everyday units (e.g. seconds, minutes, hours, days, months, years, etc), to the much larger units used in geological and cosmological time scales.

Different time standards (specifications for the measurement of time) have been in use throughout history, although modern globalization and scientific internationalism have led to the adoption of highly accurate and largely universal standards of time measurement and central reference points.

>> Clocks