So why not make the temperature go to the hottest? Let me guess, 0 isn’t the coldest either in America, right? It’s just so arbitrary, and pure cope to say it’s the best way to describe temperature.
All of them are. The decision to use water at all is completely arbitrary. Even Kelvin and Rankine are completely arbitrary: the “width” of their degrees isn’t dictated by any physical constant; it was inherited from the equally arbitrary Celsius and Fahrenheit degrees.
Technically all arbitrary, but Fahrenheit is definitely on a whole different level of arbitrary.
Celsius - 0 = precise freezing point of water and 100 = precise boiling point
Kelvin - same as C, but shifted so 0 is the precise lowest possible temperature
Fahrenheit - 0 is the imprecise freezing point of some random brine mixture, 100 is the imprecise average body temperature of the developer
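To put that comparison in concrete numbers, here is a minimal sketch of the standard conversions between the three scales (just the textbook formulas; the function names are mine):

```python
def celsius_to_fahrenheit(c):
    """Water freezes at 0 C / 32 F and boils at 100 C / 212 F (at standard pressure)."""
    return c * 9 / 5 + 32

def celsius_to_kelvin(c):
    """Same degree size as Celsius, shifted so 0 K is absolute zero."""
    return c + 273.15

# The fixed points of each scale, expressed in the others:
print(celsius_to_fahrenheit(0), celsius_to_fahrenheit(100))  # 32.0 212.0
print(celsius_to_kelvin(0), celsius_to_kelvin(100))          # 273.15 373.15
print(celsius_to_fahrenheit(-273.15))                        # ≈ -459.67 (absolute zero)
```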
That’s a myth. It’s no more true than the myth that it was the body temperature of horses, or that the scale was designed to reflect how humans experience the weather. (It happens to reflect how humans experience the weather, but this was an incidental characteristic and not the purpose for which the scale was designed.)
The Fahrenheit scale starts to make sense when you realize he was a geometrist. It turns out that a base-10 system of angular measurement objectively sucks ass, so the developer wasn’t particularly interested in geometrically irrelevant numbers like “100”, but in geometrically interesting numbers like “180”. He put 180 degrees between the freezing and boiling points of water. (212F - 32F = 180F)
After settling on the “width” of his degree, he measured down to a repeatable origin point, which happened to be 32 of his degrees below the freezing point of water. He wanted a dial thermometer to point straight down in ice water, straight up in boiling water, and to use the same angular degrees as a protractor.
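Here’s a tiny sketch of that dial idea (my own illustration of the reasoning above, not anything Fahrenheit published): if one scale degree equals one angular degree and the needle sits straight down at 32, then boiling water lands exactly straight up.

```python
def dial_angle(temp_f):
    """Angle of a hypothetical dial thermometer, in protractor degrees.
    32 F points straight down (0 deg); one Fahrenheit degree = one angular degree."""
    return temp_f - 32

print(dial_angle(32))   # 0   -> straight down in ice water
print(dial_angle(212))  # 180 -> straight up in boiling water
```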
The calibration point he chose wasn’t the “freezing point” of the “random brine mixture”. The brine was water, ice, and ammonium chloride, which together form a frigorific mixture due to the phase change of the water. As the mixture is cooled, it resists getting colder than 0F due to the phase change of the water to ice. As it is warmed, it resists getting warmer than 0F due to the phase change of ice to water. (Obviously, it can’t maintain this relationship indefinitely. But so long as there is ice and liquid brine, the brine will maintain this temperature.) This makes it repeatable in labs around the world.
And it wasn’t a “random” brine mixture: it was the coldest and most stable frigorific mixture known to the scientific community.
This criticism of Fahrenheit is borne of simple ignorance: people don’t understand how or why it was developed, and assume he was an idiot. He wasn’t. He had very good reasons for his choices.
That was a long way of saying what I said: you just don’t see Fahrenheit as ludicrously out of date, while I (and most of the world) do. Live your life as you wish, friend. It’s a random brine mixture. Maybe it was less random back then, but now it’s an arbitrary mixture of water and salts in arbitrary ratios. Deal with it. Fahrenheit sucks.
Every measurement system has had its formal definition changed several times. The kilogram, for example, was once formally defined as the mass of a specific block of metal in France, which was later determined to be losing mass, and thus made a pretty terrible standard. Now, the kilogram is formally defined in terms of the meter and the Planck constant.
Celsius was once defined by the freezing and boiling points of water, but those aren’t actually constant: Fahrenheit’s brine mixture is actually significantly more consistent. Kelvin’s degree spacing comes from that definition of Celsius, but it was eventually redefined to be more precise by using the triple point of water: pure water at a specific pressure and temperature where it can simultaneously exist as solid, liquid, and gas. Significantly more accurate, but not enough: Kelvin was redefined again in 2019 by fixing the Boltzmann constant, whose value is expressed in joules per kelvin; joules are in turn defined via kg, m, and s, which are ultimately tied to fixed constants such as the Planck constant.
Celsius is now formally defined in terms of Kelvin. Fahrenheit is also formally defined in terms of Kelvin. Fahrenheit’s brine story is just a piece of trivia.
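For reference, here’s what that modern definition chain looks like in numbers: the kelvin is fixed by assigning the Boltzmann constant an exact value, and Celsius and Fahrenheit are then exact offsets/scalings of kelvin. A minimal sketch (the constant is the published SI value; the helper names are mine):

```python
# SI (2019): the kelvin is defined by fixing the Boltzmann constant exactly.
BOLTZMANN_J_PER_K = 1.380649e-23

def kelvin_to_celsius(k):
    """Celsius is an exact offset from kelvin."""
    return k - 273.15

def kelvin_to_fahrenheit(k):
    """Fahrenheit is an exact scale-and-offset of kelvin."""
    return k * 9 / 5 - 459.67

print(kelvin_to_celsius(273.16))     # ≈ 0.01 (water's triple point, exactly 273.16 K pre-2019)
print(kelvin_to_fahrenheit(273.15))  # ≈ 32.0
```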
We live on a water planet. The weather we care about is water.
If you look at the overnight low you probably want to know if frost was likely. Guess what Celsius temperature frost happens at.
That factoid makes Celsius relevant for about 4 out of the 12 months, and humans lack the capacity to distinguish between 60 and 100 on the Celsius scale. Anything at those temperatures just feels like blisters.
The high end of 0 to 100 is nice for boiling: when I’m making beer, at the boiling stage the number on the scale goes from somewhere below 25 up to 100, so the end point is obvious.
We boil water quite a lot, though we often aren’t tracking the temperature
Most of the time the temperature scale that’s best is the one you know. I don’t know of any case where Fahrenheit is objectively best (like Celsius is when water is involved), but I think the best argument for Celsius is that it’s used in science, so American scientists start a step behind all the others by having to learn a new system. Given neither has any great advantage, I reckon it’s worth America changing to make things better for American scientists.
The records are -80°F and 134°F (the US record low and high)
That’s quite an error in a “whole human experience in zero to one hundred” system