I’m not sure I agree with the take for Fahrenheit. It’s an arbitrary choice, and to me, having grown up in a country that uses Celsius, I find that far easier to understand; Fahrenheit may as well be random numbers to me.
Fahrenheit was not an entirely arbitrary choice: it was defined by two points of reference that could be measured at the time. The freezing temperature of an ammonium chloride brine was used as 0, and the best estimate for the average human body temperature was set at 96.
Over time, as the freezing and boiling points of water at sea-level atmospheric pressure proved to be more accurate reference points, the Fahrenheit scale was adjusted so that water freezes at exactly 32 °F and boils at exactly 212 °F, giving an exact conversion to Celsius.
Are you telling me they were able to measure those things, but not the boiling and freezing point of water?
Sure, let me just whip up that ammonium chloride mixture and travel somewhere where I can get it close to freezing so I can know the zero reference of that scale. What, did they just carry that NH₄Cl around for convenience?
Fahrenheit was proposed in 1724, Celsius dates back to 1742, so there wasn’t that much time between the two.
I grew up in Canada, but in a temperate climate area on the border with the US. Winter? Use Celsius. Summer? Use Fahrenheit. For me Celsius makes a lot more sense right around 0C. After about 15C my brain switches over and starts using Fahrenheit. I like the Fahrenheit scale from 60-100F for gauging the summer months. The Celsius scale isn’t granular enough: it feels like there’s a big difference between 18C and 22C versus the comparable 64F-72F. But I was also taught a quick and dirty conversion early on: C to F, double and add 30; F to C, subtract 30 and divide by 2.
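For anyone curious how far off that quick-and-dirty rule is, here’s a small sketch comparing it against the exact conversion formula (function names are mine, just for illustration):

```python
def c_to_f_exact(c):
    """Exact conversion: F = C * 9/5 + 32."""
    return c * 9 / 5 + 32

def c_to_f_quick(c):
    """The quick rule from the post: double and add 30."""
    return c * 2 + 30

def f_to_c_exact(f):
    """Exact conversion: C = (F - 32) * 5/9."""
    return (f - 32) * 5 / 9

def f_to_c_quick(f):
    """The quick rule from the post: subtract 30 and divide by 2."""
    return (f - 30) / 2

# Compare the two over a range of everyday temperatures.
for c in (0, 10, 20, 30):
    print(f"{c}C -> exact {c_to_f_exact(c):.0f}F, quick {c_to_f_quick(c):.0f}F")
```

The rule happens to be exact at 10C (both give 50F) and drifts by a couple of degrees per 10C step away from that point, which is why it works well enough for the temperate range people actually talk about.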