To use the calculator, place your cursor in the desired unit field and type a number. The calculator will automatically convert your number and display the result in the other unit fields. If needed, use the dot "." as the decimal separator.
Use the overview below to better understand the meaning and history of the different time units.
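The conversion behind a calculator like this can be sketched as a single lookup table that expresses every unit as an average length in seconds. This is a minimal illustration, not the calculator's actual code; the year here is assumed to be the Julian year of 365.25 days and the month 1/12 of it, so the month and year figures are averages.

```python
# Average length of each unit in seconds (year assumed to be a
# Julian year of 365.25 days; month assumed to be 1/12 of a year).
TO_SECONDS = {
    "millennium":  1000 * 365.25 * 86_400,
    "century":      100 * 365.25 * 86_400,
    "decade":        10 * 365.25 * 86_400,
    "year":               365.25 * 86_400,
    "month":              365.25 * 86_400 / 12,
    "week":                    7 * 86_400,
    "day":                         86_400,
    "hour":                         3_600,
    "minute":                          60,
    "second":                           1,
    "millisecond": 1e-3,
    "microsecond": 1e-6,
    "nanosecond":  1e-9,
}

def convert(value: float, from_unit: str, to_unit: str) -> float:
    """Convert a duration between two units by going through seconds."""
    return value * TO_SECONDS[from_unit] / TO_SECONDS[to_unit]

print(convert(1, "day", "hour"))     # 24.0
print(convert(90, "minute", "hour")) # 1.5
```

Going through a common base unit (seconds) keeps the table small: n units need n entries rather than one formula for every pair of units.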
A millennium (plural millennia or, rarely, millenniums) is a period equal to 1,000 years, also called a kiloyear. It derives from the Latin mille, thousand, and annus, year. It is often, but not always, related to a particular dating system.
A century (from the Latin centum, meaning one hundred; abbreviated c.) is a period of 100 years. Centuries are numbered ordinally in English and many other languages.
A decade is a period of 10 years. The word is derived (via French and Latin) from the Ancient Greek δεκάς (dekas), which means a group of ten. Other words for spans of years also come from Latin: biennium (2 years), triennium (3 years), quadrennium (4 years), lustrum (5 years), century (100 years), millennium (1000 years).
A year is the orbital period of the Earth moving in its orbit around the Sun. Due to the Earth's axial tilt, the course of a year sees the passing of the seasons, marked by change in weather, the hours of daylight, and, consequently, vegetation and soil fertility.
A month is a unit of time, used with calendars, which is approximately as long as a natural period related to the motion of the Moon; month and Moon are cognates. The traditional concept arose with the cycle of Moon phases; such months (lunations) are synodic months and last approximately 29.53 days. From excavated tally sticks, researchers have deduced that people counted days in relation to the Moon's phases as early as the Paleolithic age. Synodic months, based on the Moon's orbital period with respect to the Earth-Sun line, are still the basis of many calendars today, and are used to divide the year.
A week is a time unit equal to seven days. It is the standard time period used for cycles of rest days in most parts of the world, mostly alongside—although not strictly part of—the Gregorian calendar. The days of the week were named after the classical planets (derived from the astrological system of planetary hours) in the Roman era. In English, the names are Monday, Tuesday, Wednesday, Thursday, Friday, Saturday and Sunday.
A day, a unit of time, is approximately the period during which the Earth completes one rotation with respect to the Sun (a solar day). In 1960, the second was redefined in terms of the orbital motion of the Earth in the year 1900 and was designated the SI base unit of time. The unit of measurement "day" was redefined as 86 400 SI seconds and symbolized d. In 1967, the second, and with it the day, was redefined in terms of an atomic electron transition. A civil day is usually 86 400 seconds, plus or minus a possible leap second in Coordinated Universal Time (UTC), and occasionally plus or minus an hour in locations that change to or from daylight saving time.
An hour (symbol: h; also abbreviated hr.) is a unit of time conventionally reckoned as 1⁄24 of a day and scientifically reckoned as 3,599–3,601 seconds, depending on conditions. The seasonal, temporal, or unequal hour was established in the ancient Near East as 1⁄12 of the night or daytime. Such hours varied by season, latitude, and weather. It was subsequently divided into 60 minutes, each of 60 seconds. Its East Asian equivalent was the shi, which was 1⁄12 of the apparent solar day; a similar system was eventually developed in Europe which measured its equal or equinoctial hour as 1⁄24 of such days measured from noon to noon. The minor variations of this unit were eventually smoothed by making it 1⁄24 of the mean solar day, based on the measure of the sun's transit along the celestial equator rather than along the ecliptic. This was finally abandoned due to the minor slowing caused by the Earth's tidal deceleration by the Moon.
The minute is a unit of time or angle. As a unit of time, the minute is most often equal to 1⁄60 (the first sexagesimal fraction) of an hour, or 60 seconds. In the UTC time standard, a minute on rare occasions has 61 seconds, a consequence of leap seconds (there is a provision to insert a negative leap second, which would result in a 59-second minute, but this has never happened in more than 40 years under this system). As a unit of angle, the minute of arc is equal to 1⁄60 of a degree, or 60 seconds (of arc). Although not an SI unit for either time or angle, the minute is accepted for use with SI units for both. The SI symbol for minutes of time is min; the prime symbol after a number, e.g. 5′, denotes minutes of arc, and the prime is also sometimes used informally for minutes of time. In contrast to the hour, the minute (and the second) does not have a clear historical background. All that can be traced is that it began to be recorded in the Middle Ages with the construction of "precision" timepieces (mechanical and water clocks). However, no consistent records of the origin of the division of the hour into 60 minutes (and of the minute into 60 seconds) have ever been found, despite much speculation.
The second is the SI base unit of time, commonly understood and historically defined as 1⁄86400 of a day – this factor derived from the division of the day first into 24 hours, then into 60 minutes and finally into 60 seconds each. Another intuitive understanding is that it is about the time between beats of a human heart. Mechanical and electric clocks and watches usually have a face with 60 tickmarks representing seconds and minutes, traversed by a second hand and minute hand. Digital clocks and watches often have a two-digit counter that cycles through seconds. In common parlance, a "clock tick" is a second, though most modern clocks are digital electronic and do not actually tick. The second is also part of several other units of measurement, such as those of velocity, acceleration, and frequency. Though the historical definition of the unit was based on this division of the Earth's rotation cycle, the formal definition in the International System of Units (SI) is a much steadier timekeeper: 1 second is defined to be exactly 9 192 631 770 cycles of a caesium atomic clock. Because the Earth's rotation varies and is also slowing ever so slightly, a leap second is added to clock time to keep clocks in sync with Earth's rotation.
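The 1⁄86400 factor above can be checked directly: it is just the product of the three historical divisions of the day. A small sketch, with the caesium figure included as a named constant for reference:

```python
# The historical 1/86,400 factor: 24 hours x 60 minutes x 60 seconds.
hours_per_day = 24
minutes_per_hour = 60
seconds_per_minute = 60
seconds_per_day = hours_per_day * minutes_per_hour * seconds_per_minute
print(seconds_per_day)  # 86400

# The modern SI definition counts cycles of the caesium-133
# hyperfine transition instead of fractions of the Earth's rotation.
CAESIUM_CYCLES_PER_SECOND = 9_192_631_770
```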
A millisecond (from milli- and second; symbol: ms) is a thousandth (0.001 or 10−3 or 1/1000) of a second. Examples of millisecond-scale durations include the cycle time of a 1 kHz frequency, the duration of a typical photo flash strobe, the time taken for a sound wave to travel about 34 cm, and the repetition interval of the GPS C/A PN code.
A microsecond is an SI unit of time equal to one millionth (0.000001 or 10−6 or 1/1,000,000) of a second. Its symbol is μs. A microsecond is equal to 1000 nanoseconds or 1/1,000 of a millisecond. Because the next SI prefix is 1000 times larger, measurements of 10−5 and 10−4 seconds are typically expressed as tens or hundreds of microseconds. An example of a microsecond is the cycle time of a 1 × 106 hertz (1 MHz) signal, frequency being the inverse of the period. This corresponds to a radio wavelength of 300 m (AM medium wave band), as can be calculated by multiplying 1 µs by the speed of light (approximately 3.00 × 108 m/s) to determine the distance travelled.
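The 300 m figure follows from wavelength = speed of light × period. A short check, using the approximate value of c quoted in the text:

```python
# Wavelength of a 1 MHz radio wave: c times the period of one cycle.
SPEED_OF_LIGHT = 3.00e8  # m/s, approximate value used in the text
period = 1e-6            # 1 microsecond, the period of a 1 MHz signal
wavelength = SPEED_OF_LIGHT * period
print(wavelength)        # ~300 metres (AM medium wave band)
```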
A shake is an informal unit of time equal to 10 nanoseconds, or 10−8 seconds. It has applications in nuclear physics, helping to conveniently express the timing of various events in a nuclear explosion. The typical time required for one step in the chain reaction (i.e. the typical time for each neutron to cause a fission event, which releases more neutrons) is of the order of 1 shake, and the chain reaction is typically complete by 50 to 100 shakes.
A nanosecond (ns) is an SI unit of time equal to one thousand-millionth of a second (one billionth), that is, 1/1,000,000,000 of a second, or 10−9 seconds. The term combines the SI prefix nano- with the second, the SI base unit of time.
A picosecond is an SI unit of time equal to 10−12 or 1/1,000,000,000,000 (one trillionth) of a second; that is, one millionth of one millionth of a second, or 0.000 000 000 001 seconds. A picosecond is to one second as one second is to approximately 31,689 years. Several technical approaches achieve imaging within single-digit picoseconds: for example, streak cameras and intensified CCD (ICCD) cameras are able to picture the motion of light.
A femtosecond is the SI unit of time equal to 10−15 or 1/1,000,000,000,000,000 of a second; that is, one quadrillionth, or one millionth of one billionth, of a second. For context, a femtosecond is to a second as a second is to about 31.71 million years; a ray of light travels approximately 0.3 µm (micrometers) in 1 femtosecond, a distance comparable to the diameter of a virus.
An attosecond is 1×10−18 of a second (one quintillionth of a second). For context, an attosecond is to a second what a second is to about 31.71 billion years.