For proof that this thread is just people justifying what they know as better somehow, look no further than Canada.
We do cooking temps in Fahrenheit, weather in Celsius. Human weights in pounds, but never pounds and oz. Food weights in grams, cooking weights in pounds and oz. Liquid volume in millilitres and litres, but cooking in cups, teaspoons and tablespoons. Speed & distance in kilometres, heights in feet and inches.
Try and give this any consistency and people will look at you like you’re fucked. The next town is 100km over, I’m 5ft 10in, a can of soda is 355ml, it’s 21c out and I have the oven roasting something at 400f. Tell me it’s 68f out and I will fight you.
People like what they are used to, and will bend over backwards to justify it. This becomes blatantly obvious when you use a random mix of units like we do, because you realize that all that matters is mental scale.
If Fahrenheit is “how people feel” then why are feet useful measurements of height when 90% of people are between 4ft and 6ft? They aren’t. You just know the scale in your head, so when someone says they’re 7ft tall you say “dang that’s tall”. That’s it.
Fahrenheit: let’s use “really cold weather” as zero and “really hot weather” as 100.
Celsius: let’s use “freezing water” as zero, and “boiling water” as 100.
Canucks:
I don’t really have a horse in this race but this logic doesn’t seem legit to me.
How is -17°C really cold weather AND 37°C really hot weather?
One is actively trying to kill you if you weren’t already dead by the time the weather got that bad. The other just makes your nuts stick to your thighs – if you’re in a humid place.
I’d agree with the logic if 100F was equal to something like 65°C. 🤷♂️
Thank you. That argument bugs the heck out of me.
maybe it’s a climate thing? Where do you live? Here in America it’s quite literally the best way to describe it. We see swings below 0f in the coldest parts of the year, and upwards of 100f in the hottest parts of the year.
So why not make the temperature go to the hottest? Let me guess, 0 isn’t the coldest either in America, right? It’s just so arbitrary, and pure cope to say it’s the best way to describe temperature.
All of them are. The decision to use water at all is completely arbitrary. Even Kelvin and Rankine are arbitrary in that sense: their zero points are physical, but the “width” of the degrees isn’t dictated by any physical necessity; it was inherited from the water-based Celsius and Fahrenheit conventions.
We live on a water planet. The weather we care about is water.
If you look at the overnight low you probably want to know if frost was likely. Guess what Celcius temperature frost happens at.
That factoid makes Celsius relevant for about 4 out of the 12 months, and humans lack the capacity to distinguish between 60 and 100 on the Celsius scale. Anything at those temperatures just feels like blisters.
Technically all arbitrary, but Fahrenheit is definitely on a whole different level of arbitrary.
Celsius - 0 = precise freezing point of water and 100 = precise boiling point
Kelvin - same as C, but shifted so 0 is the precise lowest possible temperature
Fahrenheit - 0 is the imprecise freezing point of some random brine mixture, 100 is the imprecise average body temperature of the developer
That’s a myth. It’s no more true than the myth that it was the body temperature of horses, or that the scale was designed to reflect how humans experience the weather. (It happens to reflect how humans experience the weather, but this was an incidental characteristic and not the purpose for which the scale was designed.)
The Fahrenheit scale starts to make sense when you realize he was a geometer. It turns out that a base-10 system of angular measurement objectively sucks ass, so the developer wasn’t particularly interested in geometrically irrelevant numbers like “100”, but in geometrically interesting numbers like “180”. He put 180 degrees between the freezing and boiling points of water. (212F - 32F = 180F)
After settling on the “width” of his degree, he measured down to a repeatable origin point, which happened to be 32 of his degrees below the freezing point of water. He wanted a dial thermometer to point straight down in ice water, straight up in boiling water, and to use the same angular degrees as a protractor.
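For anyone who wants that arithmetic spelled out, here’s a minimal sketch (plain Python, just the standard conversion formula, nothing specific to this thread): the 180 F-degrees between the two water points are exactly where the familiar 9/5 factor comes from.

```python
# The 180-degree span between freezing (32F) and boiling (212F) maps onto
# the 100-degree Celsius span, which is where the 9/5 (= 180/100) factor
# in the usual conversion comes from.

def f_to_c(f):
    return (f - 32) * 100 / 180   # equivalent to * 5/9

def c_to_f(c):
    return c * 180 / 100 + 32     # equivalent to * 9/5

print(212 - 32)                  # 180 F-degrees between the two water points
print(f_to_c(32))                # 0.0
print(f_to_c(212))               # 100.0
print(c_to_f(100) - c_to_f(0))   # 180.0
```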
The calibration point he chose wasn’t the “freezing point” of the “random brine mixture”. The brine was water, ice, and ammonium chloride, which together form a frigorific mixture due to the phase change of the water. As the mixture is cooled, it resists getting colder than 0F due to the phase change of the water to ice. As it is warmed, it resists getting warmer than 0F due to the phase change of ice to water. (Obviously, it can’t maintain this relationship indefinitely. But so long as there is ice and liquid brine, the brine will maintain this temperature.) This makes it repeatable, in labs around the world.
And it wasn’t a “random” brine mixture: it was the coldest and most stable frigorific mixture known to the scientific community.
This criticism of Fahrenheit is borne of simple ignorance: people don’t understand how or why it was developed, and assume he was an idiot. He wasn’t. He had very good reasons for his choices.
The records are -80°F and 134°F
That’s quite an error in a “whole human experience in zero to one hundred” system
Every time a heat wave brings 100F, the news starts reporting about old people dying. Every time the temperatures reach zero, same thing.
Personally, I can handle the cold much easier than the heat. I get stupid-brain after working more than 30 minutes at 95F. Another 15 minutes and I can’t catch my breath, lose fine motor control, and start feeling faint. Drenching myself in water - the colder the better - every 20 minutes or so is the only way I’ve found to be productive above 100F. I feel like 100F is actively trying to kill me.
0F is where it starts getting difficult for me to stay warm without an additional heat source.
Lmao are you a penguin or something? Please tell me that you’re exaggerating to make a point and aren’t seriously saying that you’re capable of staying warm at -10°C (14°F) “without an additional heat source.”
I mean, I have clothes. Long underwear? Layers? Coats, gloves, hats, scarves?
They say you can always put on more clothes if you’re cold, but that’s not really true. Insulation adds bulk, and bulk reduces mobility. Around 0F is where I start to have real trouble wearing enough clothing to stay warm while still being able to perform the activity that has me outside in that weather. Somewhere around 0F, clothing doesn’t really cut it, and I need shelter or additional heat.
That’s a lot of moved goalposts to justify the weird temperature scale logic but okay. Have a good day! :)
It makes no sense because that’s not what the 0 of the Fahrenheit scale is. The 0 point is the coldest an ammonium chloride brine mixture can be cooled to. The 90 point was an estimated average for human body temperature (it was adjusted up over time). These were chosen because the goal of the scale was to provide a way for people to have a defined temperature scale with a range and degree size that could be reliably reproduced without passing around standardized tools. 100 is really hot because human bodies were used as a reference for the high end, but the low end has nothing to do with the human body.
Geometric construction plays a role in there as well: the 180 degrees between the boiling point and the freezing point of water was not accidental.
but like isn’t that the whole point of celsius? all you need to calibrate a C thermometer is some water: when it starts freezing it’s 0°C and when it’s boiling it’s 100°C, super simple and accessible.
It’s not like “the estimated average human body temperature” is particularly accurate, and surely no matter what you mix into water it won’t magically boil at the same temperature regardless of air pressure?
You’re totally correct that Celsius is the more sensible scale with easier to replicate reference points (when using water). It was also invented almost 30 years after the Fahrenheit scale and with all the insights gained from that period of technological advancement. In fact in the modern day the Celsius degree size is defined in reference to the Boltzmann constant since Celsius is essentially the Kelvin scale with the numbers moved around.
It also used 100 as the freezing point of water and 0 as the boiling point when originally proposed, which changed after Anders Celsius died because everyone knew that was a weird way to do it.
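As a rough illustration of the “Kelvin with the numbers moved around” point (a sketch only; the 2019 SI redefinition fixes the Boltzmann constant exactly, and 273.15 is the standard offset):

```python
# Celsius is just the Kelvin scale shifted by a fixed offset. Since 2019 the
# kelvin itself is defined by fixing the Boltzmann constant to this exact value.
BOLTZMANN_CONSTANT = 1.380649e-23   # joules per kelvin, exact by definition
CELSIUS_OFFSET = 273.15

def celsius_to_kelvin(c):
    return c + CELSIUS_OFFSET

def kelvin_to_celsius(k):
    return k - CELSIUS_OFFSET

print(celsius_to_kelvin(0))     # 273.15
print(celsius_to_kelvin(100))   # 373.15
print(kelvin_to_celsius(0))     # -273.15 (absolute zero)
```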
At what molar concentration? Was it just as much NH4Cl as he could dissolve at ambient temperature and pressure?
0f is pretty fucking cold outside, your nose hairs start to freeze in this weather. It’s genuinely uncomfortable and you can die pretty easily if you aren’t prepared for it. 100f is similar, anything over 100f and you start to get into straight heat exhaustion and potential heat stroke region of danger. it’s really not that bad? Sure if you’re like, standing outside doing nothing in the shade, you’ll be fine, but do some labor and you might meet the fabled heat exhaustion fairy.
Obviously, when you convert it to celsius, it seems really fucking weird. That’s pretty normal for conversions though. Like just to be clear, if you round these numbers, they make more sense: round down to -20c and “damn it’s really cold out”, round up to 40c and “damn it’s really hot out”.
also im not really sure what you’re trying to say, but 0f isn’t like, going to kill you kill you, it’s not pleasant, but in the right attire you’ll be fine. -20 f and you start getting closer, -40f and you really start having to think about it. Are you aussie or something? This scale seems really shifted up to me. “nuts sticking weather” is like 80f and humid here.
I’m saying that 0F is waaaaaaay more dangerous than 100F so the logic of those particular temperatures being the 0-100 ends of the scale can’t be explained by how dangerous they each are.
Almost everyone would be fine staying outside for 30 minutes at 100F without any external help (shade, cool drinks etc). Almost nobody would be fine after staying outside for 30 minutes at 0F without external help (parka, thermals etc).
To me, with absolutely no data, it feels like:
0F is as dangerous as 140F (you’re long dead if you’re outside in both cases)
100F is as dangerous as 40F (mildly uncomfortable but safe for a while)
So calling 0F and 100F both “really dangerous” and using that to justify them being the respective points of 0 and 100 is disingenuous. Like, use Fahrenheit if that’s what you’re used to - I use it too because that’s what I’m used to. But I don’t explain the insane system with “it’s because the two ends are reallllly dangerous.”
idk about that though, i mean maybe if you go outside completely naked, sure. But idk who would be doing that. I’ve regularly been outside in close to 0f temperature in lighter clothing, it’s not pleasant, but im not going to freeze to death within twenty minutes. Plus you can also do physical activity, and as long as you regulate sweat, you’ll be fine. Although sweat can be particularly dangerous in colder weather.
i think that’s unreasonable though, you just wouldn’t be going outside at all in those clothes, in the same way that you wouldn’t go outside in 100f weather in a full winter get up. You would literally die.
140f as a relative measure is wild to me. In 140f, if you’re outside without an air conditioned vehicle (death valley) and you don’t have water, you will die within about a day. 100-130f is considered “extreme heat” in death valley, which has a website you can pull up for some relevant information. Once your body is over about 110f internal temperature, you’re fucking dead. Unless you have a way to keep sunlight from hitting you, and water to replenish what’s lost to sweat, you die really quickly.
0f isn’t considered “extreme cold”; that would be something like -40c (or f, they’re the same) where basically everything starts to freeze, and i’ve seen people do overnight camping in that weather. It’s perfectly doable, obviously not without gear, but who isn’t bringing gear? Hell, you can bring a space blanket with you; with the right gear you can easily exist in 0f weather for a prolonged period.
I’m not sure where you’re quoting the “really dangerous” from because i just said both of them are “really hot/cold”
did i say this anywhere??? I feel like i’m schizophrenic.
Celsius is for scientists and nerds, Fahrenheit is for normal idiots. It’s not rocket surgery.
As a Canadian idk why you’re using us as an example, we are wrong to do so and we blame Americans for giving us this bad habit.
I just see it positively and choose to believe you’re in the process of transitioning to enlightenment (metric). ;)
Outdoor temperature in °C, unless you’re talking about an outdoor pool then it’s often enough °F :-)
I think part of the reasons it’s so mixed might just be due to how many Amero-centric devices and parts are common between the two countries.
Y’all can take your shitty Phillips screws though. Robertson is by far superior ;-)
Note to self: High heat levels make Canadians cranky.
to be clear, we use feet and inches, and there is historical precedent for breaking things down once they get past a certain grouping; we only have 10 fingers after all. To me the difference between 200cm and 220 is literally fuck all. You ask me the difference between 4 ft and 6ft and i can pretty quickly tell you.
I find it weird that when measuring height in metric, people use cm exclusively. i’ve noticed this a lot actually, people will use cm or mm in places where it arguably doesn’t make any sense. I could see the justification for doing math maybe, but like, that defeats the whole point of it being metric no?
Shouldn’t you be using meters and cm for height specifically? Since most people are a good bit over one meter i feel like it would make sense to do it that way. But then again that’s just kind of a shit bucket worth of options you have, ideally you would use decimeters, but nobody uses those things for some reason.
Most of Europe just uses metres for people’s height. 1.67m, like that. I have no mental picture of that, so it doesn’t work for me. But they don’t seem to have any trouble, further evidence that it’s all just what you know.
Why is that defeating the whole point of being metric? If you know someone is 183 cm tall, you also know that they are 1.83 m tall. If it’s easier to say the length in cm, you do. No need for “one meter and eighty-three centimeters” or “one point eighty-three meters”, just “a hundred and eighty-three centimeters”. Often you just skip saying the “centimeters” part as well, because most people can see that you’re not the size of a skyscraper without getting a ruler out.
yeah idk, i guess it’s just weird to me, because here in the US if you measured someone’s height in inches alone, you would be chased out of a room. We strictly use feet and inches, and then yards if referring to a more “broad” range. So you can very safely assume something is in feet and inches if it’s just two numbers stuck together.
I feel like i could very easily get confused with metric if i’m not running a consistent rule for default units. Seems like an easy way to get a random x10 error in there to me.
As you pointed out previously, nobody uses decimeters, so x10 errors are not that common.
i’m just gonna say that the joke here is that it was a 10x error. But that’s retroactive, so.
To you. But you are aware that this is not the case for people (almost the rest of the world) who are using metric, right?
i mean i would assume so. But i have no direct reference to what 200cm is other than it’s somewhere about 6ft or 2 yards. something like 6’ 5" i think. I would need to know the height of like 50 other people to be able to make a relative distinction there.
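For what it’s worth, here’s a small sketch of that mental reference point (assuming the usual 2.54 cm per inch; the 200 cm figure is just the number from the comment above):

```python
# Convert a metric height to feet and inches, using 2.54 cm per inch.
CM_PER_INCH = 2.54

def cm_to_ft_in(cm):
    total_inches = cm / CM_PER_INCH
    feet = int(total_inches // 12)
    inches = total_inches - feet * 12
    return feet, inches

feet, inches = cm_to_ft_in(200)
print(f"200 cm is about {feet} ft {inches:.1f} in")   # about 6 ft 6.7 in
```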
Those are two different things. Hope this helps.
It doesn’t help at all, it’s being intentionally obtuse. You know what I mean, it’s unhelpful to pretend otherwise and pick a fight over it.
If an argument is being made for one thing, Fahrenheit, it’s not relevant to bring up a different thing. Why are feet a useful measurement? Maybe they’re not; we’re talking about temperature.
Yeah like the metric system has good arguments for why its measurements and weights are better, mainly conversion being easier, but for temperature there really isn’t an argument. I would make an argument for Fahrenheit as it gives more precision without having to use decimals, which at least in America aren’t a thing for temperature. But those are pretty minor things and I do tend to agree it comes down to what you grew up with.
This fear of decimals is a strictly American thing. Celsius achieves more precision with decimals than Fahrenheit without decimals. And this American fear of decimals is pretty funny, considering you will happily do advanced fractions as soon as you are doing length measurements.
I don’t mind decimals at all, it’s more that I don’t trust companies to actually deal with supporting decimals when making the switch. Plus the last time I discussed this on Lemmy someone was saying that decimals aren’t even universally used and it might depend on what you get whether you get that precision or not. Either way, the main point of my post was that these are minor arguments, and at the end of the day there isn’t really a reason to use Celsius over Fahrenheit or vice versa.
Can you feel the difference between 23.5° and 24? I can’t. You don’t often need precision to tenths.
In Australia most weather providers give you whole degrees; the Bureau of Meteorology gives temperatures to one decimal in reports and whole degrees in forecasts.
My coffee and beer boilers can hit set temperatures to 0.1° or 0.5° precision, variously. The beer boiler gives 3 digits - hundredths below 10°, tenths below 100°, whole numbers at 100° and over.
You can choose the precision of thermometers you wish to buy for yourself
I have seen fahrenheit thermometers which are hard to read to better precision than 5 degrees
1 cm³ of water weighs 1 g and needs 1 calorie to rise 1 °C.
But calories are now obsolete and the SI unit is the joule.
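Spelled out as a sketch (using the thermochemical calorie, 4.184 J, and treating water’s specific heat as roughly 1 cal per gram per degree):

```python
# Heating 1 g (about 1 cm^3) of water by 1 C takes about one calorie,
# which is about 4.184 joules.
CALORIE_IN_JOULES = 4.184          # thermochemical calorie, by definition
SPECIFIC_HEAT_WATER = 1.0          # cal per gram per Celsius degree (approx.)

mass_g = 1.0
delta_t_c = 1.0
energy_cal = SPECIFIC_HEAT_WATER * mass_g * delta_t_c

print(energy_cal)                          # 1.0 calorie
print(energy_cal * CALORIE_IN_JOULES)      # ~4.184 joules
```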
i still don’t see how this is intentionally obtuse, feet are a midpoint between inches and yards, it just makes sense to break things down over a certain amount to a much more palatable scale. Everyone knows roughly what 1 ft is, and everyone knows roughly what 1 inch is. Paired together you can get a pretty rough but serviceable guesstimate of height. I feel like it’s also pretty expected for it to be within the range of 4-6 ft. Most people don’t really measure feet outside of that range, unless you’re doing construction.
humans are a comparatively arbitrary height, so i feel like you’re just complaining about the height of humans being weirdly arbitrary? Out of all the systems you could use for height, ft and in is pretty well tuned to human nature; you’re not gonna do much better.