First of all, °C is not the metric SI unit for temperature. K (Kelvin) is.
Second, even with Kelvin, nearly all temperatures that matter for everyday human concerns fall below 4000K, usually way below that mark. And for most of those temperatures, just about every digit counts. A core body temperature of 310K versus 313K makes a BIG difference for the person involved.
Celsius is the SI unit of temperature. Kelvin is the SI unit of thermodynamic temperature. They’re both defined in SI.
You can say anything with confidence and people will believe it
https://en.m.wikipedia.org/wiki/International_System_of_Units
Kelvin is the base unit. Celsius is a derived unit, just like Watt or Newton. But they’re all SI.
I was agreeing with you, I was referring to OP saying it isn’t SI. But I think the downvotes are showing that wasn’t clear enough.
I’ve seen mK used numerous times, but I haven’t seen, say, MK for the internal temperatures of stars or the like. I imagine that’s because those are more “for fun” numbers, while the precise temperatures in a low-temperature physics lab are for technical purposes.
Isn’t Kelvin just Celsius+273.15?
Celsius uses an arbitrary reference point (freezing point of water). Kelvin uses the same sized units, but is referenced from absolute zero. While this seems just as arbitrary, it actually makes some scientific calculations a lot easier.
Basically, scientists have been working to slot the various base units together in a neat and orderly manner. Kelvin fits this far better than Celsius, and so became the baseline SI unit.
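In code, that offset is the whole difference between the two scales; here’s a minimal Python sketch (the helper names are made up for illustration):

```python
# The Celsius and Kelvin scales use the same unit size; Kelvin is just
# shifted so that 0 sits at absolute zero instead of water's freezing point.
KELVIN_OFFSET = 273.15

def celsius_to_kelvin(t_c: float) -> float:
    """Shift a Celsius reading onto the absolute (Kelvin) scale."""
    return t_c + KELVIN_OFFSET

def kelvin_to_celsius(t_k: float) -> float:
    """Shift a Kelvin reading back to Celsius."""
    return t_k - KELVIN_OFFSET

print(celsius_to_kelvin(0.0))     # freezing point of water, in K
print(kelvin_to_celsius(310.15))  # typical core body temperature, in C
```

Because there’s no scale factor, temperature *differences* come out identical in both units, which is part of why the conversion is so painless.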
Yep! Celsius does make sense for our everyday life
Fahrenheit is better for human-survivable temps.
Some people seem to have this misconception that “0F cold, 100F hot” is somehow an innate or intuitive concept for everyone. It’s not, brother; you just happen to be used to it. I have absolutely no idea whether I should wear a coat at 62F, or at any other F temperature for that matter.
At least 0C and 100C have very practical references that anyone can recognise, but what the hell even are 0F and 100F?
Also, not sure why you’re trying to shoehorn 0-100F to 0-100C.
When talking about weather, it’s going to be in a range like 0C (cold) / 20C (nice) / 40C (hot), which is equally arbitrary but probably more useful than 0F/50F/100F anyway, depending on where you live: my neck of the woods drops to 0C in a harsh winter and hits 40C at the peak of summer.
And do you use F for stuff like cooking? What purpose is 0F or 100F there?
How about stuff like chemistry or physics? I remember formulas in C or K, occasionally having to add 273.15. Is F used, or do you just use K/C and convert at the start?
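Converting to an absolute scale up front is the usual pattern in code too; a small Python sketch using the ideal gas law (the function names are invented for the example):

```python
R = 8.314  # molar gas constant, J/(mol*K)

def fahrenheit_to_kelvin(t_f: float) -> float:
    # F -> C first (subtract 32, scale by 5/9), then add the 273.15 offset.
    return (t_f - 32.0) * 5.0 / 9.0 + 273.15

def ideal_gas_pressure(n_mol: float, volume_m3: float, t_f: float) -> float:
    """P = nRT/V; the formula needs T in kelvin, so convert once at the start."""
    t_k = fahrenheit_to_kelvin(t_f)
    return n_mol * R * t_k / volume_m3

# 1 mol of gas in 22.4 L at 68 F (= 20 C = 293.15 K):
print(ideal_gas_pressure(1.0, 0.0224, 68.0))  # pressure in pascals
```

The formula itself never sees Fahrenheit: everything downstream of the conversion works in kelvin, because the physics cares about absolute temperature.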
I fully agree with that. It’s also quite easy to shift between the two. I just had the difference drilled into me way too much at university.