There’s a new record holder for the most accurate clock in the world. Researchers at the National Institute of Standards and Technology (NIST) have improved their atomic clock based on a trapped aluminum ion. Part of the latest wave of optical atomic clocks, it can perform timekeeping with 19 decimal places of accuracy.
Optical clocks are typically evaluated on two levels — accuracy (how close a clock comes to measuring the ideal “true” time, also known as systematic uncertainty) and stability (how efficiently a clock can measure time, related to statistical uncertainty). This new record in accuracy comes out of 20 years of continuous improvement of the aluminum ion clock. Beyond its world-best accuracy, 41% greater than the previous record, this new clock is also 2.6 times more stable than any other ion clock. Reaching these levels has meant carefully improving every aspect of the clock, from the laser to the trap and the vacuum chamber.
The team published its results in Physical Review Letters.
“It’s exciting to work on the most accurate clock ever,” said Mason Marshall, NIST researcher and first author on the paper. “At NIST we get to carry out these long-term plans in precision measurement that can push the field of physics and our understanding of the world around us.”
Indulge me in a rant. If we’re going to redefine the second because of advances in measurement sensitivity, isn’t this a good time to reconsider the SI structure?
Bad approximations of distances in the 18th century brought us the metric system. With the sort of precision we now have, not to mention the need for non-geocentric units as space increasingly becomes a field of research, why are we using a flawed system based on guesses from a few guys in France during the Enlightenment?
I’ve no issue with shorthand like AUs or light-years for large distances, but it feels like we should have the basic tenets of the universe as the basis. Like, the light-nanosecond for distance on the human scale (it’s about 11.8 inches or 29.98 cm), and then reconfigure the system from first principles.
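That figure is easy to sanity-check from the defined value of c alone. A minimal Python sketch (the speed of light is exact by definition; the 2.54 cm inch is the standard factor, not something from the article):

```python
# Length of one light-nanosecond, derived from the defined speed of light.
C = 299_792_458           # speed of light in m/s (exact by definition of the metre)
light_ns = C * 1e-9       # metres travelled by light in one nanosecond

print(f"1 light-nanosecond = {light_ns * 100:.4f} cm")        # ~29.9792 cm
print(f"1 light-nanosecond = {light_ns / 0.0254:.2f} inches") # ~11.80 inches
```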
I’m not saying we should throw out measuring systems each time they get more precise, but a lot of cruft is grandfathered into what we currently use. We can’t just go for further precision and then shrug and say “well, nothing we can do about it.”
Like, the light-nanosecond for distance
Americans would use anything but meter…
This record doesn’t redefine anything; the same thing just got measured more precisely.
So, if I understand correctly, your beef here is not that the metre is a flawed basis for measurement but rather that the U.S. refuses to use metric? That’s certainly a hill to die on, but using universal constants to define measurements seems the better route. The foot is just as arbitrary as the metre.
Per Wikipedia:
Since 2019, the metre has been defined as the length of the path travelled by light in vacuum during a time interval of 1/299792458 of a second, where the second is defined by a hyperfine transition frequency of caesium.
The metre was originally defined in 1791 by the French National Assembly as one ten-millionth of the distance from the equator to the North Pole along a great circle, so the Earth’s polar circumference is approximately 40000 km.
In 1799, the metre was redefined in terms of a prototype metre bar. The bar used was changed in 1889, and in 1960 the metre was redefined in terms of a certain number of wavelengths of a certain emission line of krypton-86. The current definition was adopted in 1983 and modified slightly in 2002 to clarify that the metre is a measure of proper length. From 1983 until 2019, the metre was formally defined as the length of the path travelled by light in vacuum in 1/299792458 of a second. After the 2019 revision of the SI, this definition was rephrased to include the definition of a second in terms of the caesium frequency ΔνCs. This series of amendments did not alter the size of the metre significantly – today Earth’s polar circumference measures 40007.863 km, a change of about 200 parts per million from the original value of exactly 40000 km, which also includes improvements in the accuracy of measuring the circumference.
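For what it’s worth, the ~200 ppm figure quoted there checks out against the two circumference values. A quick Python sketch using just the numbers above:

```python
# Relative difference between the intended polar circumference (exactly 40000 km)
# and the modern measured value (40007.863 km), in parts per million.
original_km = 40_000.0
measured_km = 40_007.863

ppm = (measured_km - original_km) / original_km * 1e6
print(f"{ppm:.0f} ppm")   # ~197 ppm, i.e. roughly 200 parts per million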
If you’re defining a unit as 1 over an arbitrary nine-digit number, it’s a pretty clear sign the base unit makes no sense and serves to make the mathematics more complex, not more cohesive.
It’s a meme: https://knowyourmeme.com/memes/americans-will-use-anything-except-the-metric-system
About the redefining part: we have this arbitrary number, 1/299792458, and you basically want to change that? What would it help? I constantly use metric and imperial units concurrently; if you don’t need accuracy to 19 decimal places, it’s not a big deal. 3 feet is 1 meter, 1 inch is 2.5 cm, 1 pound is 0.5 kg. The only one I can’t quite do in my head is that 1 mile is 1.6 km, but if I need it quickly I just use 1.5. For everyday life this accuracy is good enough. I’m an engineer, not a scientist.
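If anyone is curious how far off those mental-math factors actually are, here’s a minimal Python sketch (the “rough” values are the ones from the comment above; the “exact” ones are the standard defined conversion factors):

```python
# Rough mental-math conversion factors vs. the standard exact values.
conversions = [
    # (description, rough factor, exact factor)
    ("feet per metre", 3.0, 1 / 0.3048),     # exact: ~3.2808
    ("cm per inch",    2.5, 2.54),
    ("kg per pound",   0.5, 0.45359237),
    ("km per mile",    1.6, 1.609344),
]

for name, rough, exact in conversions:
    error_pct = (rough - exact) / exact * 100
    print(f"{name:14s} rough={rough:<4} exact={exact:.6g} error={error_pct:+.1f}%")
```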
The whole point of metric is to avoid absurdly precise calculations across different measuring systems; a nine-digit denominator doesn’t help the cause. I of course deal with a mix of customary and metric units, but the metre is a terrible starting point and no less arbitrary than barleycorns or the king’s foot.
What would we gain if we switched to another arbitrary-looking value? Also, the other units are based on other constants, not just the speed of light, so at some conversion you would still have to use similarly strange numbers; you wouldn’t solve your problem, you would just move it elsewhere.
And you rarely convert to light-years, so the strange numbers stay outside everyday use, and everyday units fit together nicely with base 10.
And 1 liter (or dm³) of water is roughly 1 kg, so the original meter is actually based on the density of water. It’s very convenient that you can convert between weight and volume (and hence length) in your head.
I don’t think we’re going to settle on a rational universal system of measurements on Beehaw. But the metre is most certainly not based on water; it’s the other way around.
We have so many systems at this point that traveling to space is an issue, which suggests the status quo isn’t working. What would you take as a base measurement instead of c?
Water is based on meter? They drank only wine in the dark middle ages, Lavoisier invented water, I knew it!
…
The point is that your proposition doesn’t make any sense; read the definitions of the other units, not just the meter, in the 2019 SI standard. The light-second would become a nice round number, but e.g. 1 light-year is 31557600 light-seconds, so there you already have a strange number, and that one traces back to the Earth’s orbit, something you can’t redefine.
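That 31557600 figure is just the number of seconds in a Julian year (365.25 days), which is the year conventionally used to define the light-year. A one-line check:

```python
# Seconds in a Julian year (365.25 days of 86400 s), the year the light-year is defined with.
print(int(365.25 * 86_400))   # 31557600, so 1 light-year = 31,557,600 light-seconds
```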
You’re dancing around my point that for human-scale units, basing something on a constant makes sense. Sure, at AUs and light-years the numbers get messy, but that has nothing to do with buying flour. And using a measurement system based on quantities that will never change seems wise.
Grace Hopper’s explanation about the light-nanosecond is still good for laypersons https://m.youtube.com/watch?v=9eyFDBPk4Yw
There are these, but I suspect their main benefit is that they make physics equations use nicer numbers, not as much for the layperson: https://en.wikipedia.org/wiki/Natural_units