• ShaunaTheDead@kbin.social
      link
      fedilink
      arrow-up
      28
      arrow-down
      5
      ·
      1 year ago

      Yeah but C makes more sense. 0-10 is cold but not freezing, 10-20 is cool, 20-30 is warm, 30-40 is hot, 40+ is “you’re gonna die of heat exposure! Get inside, what are you doing?!” increasing in urgency with the number. If it’s in the negatives, it’s the same as the 40+ except “cold exposure”.

    • SaltyIceteaMaker@lemmy.ml
      link
      fedilink
      arrow-up
      5
      ·
      edit-2
      1 year ago

      298.15 - 273.15, so it’s 25°C? I’d argue that is a lot. But I may just be heat-sensitive.
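
      A minimal Python sketch, in case anyone wants to check the arithmetic (273.15 is the standard Kelvin-to-Celsius offset):

      # Kelvin to Celsius is a fixed offset of 273.15
      def kelvin_to_celsius(kelvin: float) -> float:
          return kelvin - 273.15

      print(kelvin_to_celsius(298.15))  # 25.0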

      Edit: fixed typo. Edit 2: fixed another typo. I gotta start proofreading before sending.

      • Dmian@lemmy.world
        link
        fedilink
        English
        arrow-up
        7
        ·
        edit-2
        1 year ago

        VERY generally speaking, 20s are warm, 30s are hot. Humidity changes this a lot. And yes, personal sensitivity to heat plays a role. I live in a dry climate, and I feel rather comfortable until we’re close to 30 ºC. I remember reading that the ideal room temperature for humans is around 20-22 ºC.

        For those using F, this is, more or less, the scale of C:

        Below 0: freezing (0 ºC being the freezing point of water, duh!)
        0 to 10: cold (don’t go out without a coat)
        10s: cool (a sweatshirt or light coat may do)
        20s: warm
        30s: hot
        40s: uncomfortably hot (stay in the shade and hydrate)
        50s: you’re dead (or you wish you were. Unsafe for humans)
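
        If you’d rather read code than prose, here’s a minimal Python sketch of the same rough guide (band edges and labels are just my paraphrase of the list above):

        import bisect

        # Upper edges of each band in °C, paired with the labels above
        EDGES = [0, 10, 20, 30, 40, 50]
        LABELS = ["freezing", "cold", "cool", "warm", "hot",
                  "uncomfortably hot", "unsafe for humans"]

        def describe(celsius: float) -> str:
            # bisect_right picks the band the temperature falls into
            return LABELS[bisect.bisect_right(EDGES, celsius)]

        print(describe(-5))  # freezing
        print(describe(25))  # warm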

        • uphillbothways@kbin.social
          link
          fedilink
          arrow-up
          2
          arrow-down
          1
          ·
          1 year ago

          Hot is still relative. Are you talking about soup, a cup of coffee/tea or outside temperature? People would probably answer differently in each instance.

          • Dmian@lemmy.world
            link
            fedilink
            arrow-up
            4
            ·
            edit-2
            1 year ago

            Well… I said VERY generally speaking. And as I’m defining a gradient of temperatures (clearly 30 °C is not the same as 38 °C), I’m also defining a gradient of “hot” sensations, from feeling a bit of heat in your body to feeling like an oven. That’s the thing with generalizations. I’m not trying to be precise here, just to give a general idea to those who are not used to Celsius (I’ve seen the same done with Fahrenheit and found it useful). Cheers.

        • Eufalconimorph@discuss.tchncs.de
          link
          fedilink
          arrow-up
          1
          ·
          edit-2
          1 year ago

          Eh, as a weirdo who uses Celsius a lot but lives in Buffalo, NY…

          -20s is cold. Coat, gloves, scarf, & hat. Long underwear. Not too much evaporation from the lake since it can freeze, so not much snow.
          -10s is chilly. Coat, probably zip it up towards the lower end of the range. Decent chance of apocalyptic snow.
          0-10s is cool. Wear a sweater.
          10s is nice. Maybe consider long sleeves & pants if it gets a bit cooler.
          20s is shorts & t-shirt weather.
          30s is all AC, all the time. Uncomfortably hot not too far into the range.
          40s is “the humidity is now so high the air is soup, filled with mosquitoes”.

  • r00ty@kbin.life
    link
    fedilink
    arrow-up
    13
    ·
    1 year ago

    I’m from the UK and part of the forgotten generation, so I was pretty much brought up on both systems. The “cheaper” way to do F from C is to double it and add thirty. It works reasonably well. Of course, the reverse is minus thirty, then halve.
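
    A minimal Python sketch of the “cheap” rule next to the exact conversion (the function names are mine):

    # "Cheap" rule: double it and add thirty (reverse: minus thirty, then halve)
    def c_to_f_cheap(c: float) -> float:
        return 2 * c + 30

    # Exact conversion, for comparison
    def c_to_f_exact(c: float) -> float:
        return c * 9 / 5 + 32

    for c in (0, 10, 20, 30):
        print(c, c_to_f_cheap(c), c_to_f_exact(c))
    # 0: 30 vs 32, 10: 50 vs 50, 20: 70 vs 68, 30: 90 vs 86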

    I’m definitely more native in deg C, but am fine with deg F too. Yes, I added deg to stop the coding jokes.

  • ALoafOfBread@lemmy.ml
    link
    fedilink
    arrow-up
    6
    arrow-down
    1
    ·
    edit-2
    1 year ago

    For easier mental C -> F conversion, take (2C) - (2C/10) + 32, where C = temp in °C. (Algebraically that’s 1.8C + 32, i.e. the exact formula.)

    16c × 2 = 32

    32/10 = 3.2

    32 - 3.2 = 28.8

    28.8+32 = 60.8, which is exactly correct.

    Or if you just want a rough estimate for weather, 2C+(22 to 32) gets you close enough with easier mental math

    16x2+30 = 62, just 1 degree off.

    37x2+30 = 104 is like 6 degrees off, though (exact is 98.6). The warmer it is, the smaller the number you should add.
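
    A minimal Python sketch of both tricks (the function names are mine):

    # Exact mental trick: 2C - (2C/10) + 32, algebraically the same as 1.8C + 32
    def c_to_f_mental(c: float) -> float:
        doubled = 2 * c
        return doubled - doubled / 10 + 32

    # Rough version: 2C plus an offset; the warmer it is, the smaller the offset
    def c_to_f_rough(c: float, offset: float = 30) -> float:
        return 2 * c + offset

    print(c_to_f_mental(16))  # 60.8, exactly correct
    print(c_to_f_rough(16))   # 62, close enough for weather
    print(c_to_f_rough(37))   # 104, vs the exact 98.6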

  • pinkdrunkenelephants@lemmy.cafe
    link
    fedilink
    English
    arrow-up
    2
    ·
    1 year ago

    You’re not supposed to memorize the formulas; you’re supposed to draw the relationship out on a graph, from which the formula is derived.

    It’s literally just a basic linear graph like from Algebra class. You can just trace your finger along the line and find the right temperature, no Egyptian hieroglyphics required.
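
    In that spirit, a tiny Python sketch that prints points along that line, like a table you can run a finger down:

    # F = 1.8 * C + 32 is a straight line; print points along it
    for c in range(-20, 41, 5):
        print(f"{c:>4} °C = {1.8 * c + 32:>5.1f} °F")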

  • Patches@sh.itjust.works
    link
    fedilink
    arrow-up
    11
    arrow-down
    34
    ·
    edit-2
    1 year ago

    Fahrenheit is the way to go.

    Taking this opportunity to argue why Fahrenheit makes way more sense than Celsius. Fahrenheit is essentially based on a scale of 0 to 100, which is a scale we use for most things in life. Yes, we can go below 0 or above 100, but it’s such a perfect scale for understanding exactly HOW hot or cold something is.

    It’s 70 degrees outside? Wow, we’re at 70% heat. I bet that feels really nice and not too hot. Oh no it’s 100 degrees so we’re at 100% heat? Probably want to stay inside today. Water freezes once we get to 32%? Makes sense, that’s pretty cold. I’ll need to wear a winter coat. 0% outside? No way I want to go out in that.

    In terms of understanding how hot or cold it is outside, reading the temperature as a number out of 100% gives you a perfect frame of reference.

    Celsius is so dumb. A 70 degree day in summer sounds great. 70% heat, not too hot, not too cold, just right. But convert that to Celsius? It’s 21 degrees in the summer? What does that even mean? Stupid.

    Also because of the way the math works, the scale for Celsius makes no sense. It’s 0 degrees out in the U.S., that’s -18 Celsius. But if it’s 100 in the U.S., that’s only 38 Celsius? What kind of stupid scale runs from -18 to 38? 0 to 100 is the way to go.

    Imagine if test scores ran from -18 to 38. Would you support this nonsensical scale then?

    To be clear, I’m on board with the metric system and I definitely don’t think the U.S. does everything right. But Celsius is trash.

      • Zerush@lemmy.ml
        link
        fedilink
        arrow-up
        8
        ·
        edit-2
        1 year ago

        With the current climate policy, the winters will soon be at -10ºC and the summers at 115ºC, but…

        • Patches@sh.itjust.works
          link
          fedilink
          arrow-up
          3
          arrow-down
          1
          ·
          edit-2
          1 year ago

          It means your temperatures are not compatible with human life. There is no confusion that 120% Hot would be “really fucking hot”.

          Both of those temperatures mean you’re outside the human scale and should limit your time outside.

          It’s pretty unclear that I can go for a walk outside at 40°C, but if I do that at 48°C I might die.

    • Kirby@discuss.tchncs.de
      link
      fedilink
      arrow-up
      11
      ·
      1 year ago

      To a European like me, 90 ‘%’ or 20 ‘%’ human comfort would be very confusing - you could probably guess that one is hot and the other’s not, but I’d have no point of reference until I converted it to Celsius. I think the numbers someone grows up with will always make more sense, no matter what.

      • Patches@sh.itjust.works
        link
        fedilink
        arrow-up
        3
        arrow-down
        5
        ·
        1 year ago

        I mean we use scales from 0 to 100 in every field. -18 to 38 is a scale used absolutely nowhere.

        But I agree Humans can get used to pretty much anything, and once they do - it’s all they will prefer over the unknown.

        • DarthFrodo@lemmy.world
          link
          fedilink
          arrow-up
          8
          ·
          edit-2
          1 year ago

          So 0°F was defined as the freezing temperature of a solution of brine made from a mixture of water, ice, and ammonium chloride (why in the world?). Originally, 90°F was set as human body temperature, which was later changed first to 96°F, and now it’s about 98.6°F.

          Celsius is just: 0°C is the freezing temperature of water; 100°C is the boiling temperature of water.

          Nobody uses a scale between -18 and 38. People in countries using Celsius just learned as a child that body temperature is 38°C, that’s all. -18°C has no special meaning to us.

          At 0°C outside it’s freezing (32°F). 10°C is quite cool (50°F), you’ll need a jacket. 20°C is a comfortable temperature for me, if it’s sunny (68°F). 30°C is getting rather warm (86°F). 40°C is hell outside, or a bad fever (104°F). To boil water, heat it to 100°C (212°F).
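
          A tiny Python sketch that double-checks those anchors with the exact formula:

          # Verify the °F equivalents quoted above: F = C * 9/5 + 32
          for c in (0, 10, 20, 30, 40, 100):
              print(f"{c} °C = {c * 9 / 5 + 32:g} °F")
          # prints 32, 50, 68, 86, 104 and 212, matching the figures above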

          I get that this seems confusing at first when you’re used to completely different orientation points, but for people who are used to C, it’s very intuitive.

    • RandomVideos@programming.dev
      link
      fedilink
      arrow-up
      5
      ·
      1 year ago

      Why does this matter? I have no idea what 0% heat means, but I know what -18° or 38° mean. For me, 32% heat doesn’t mean anything.

      How hot it feels can change: in summer, 10°C feels cold, but in winter it feels hot.

      A lot more people use °C, so more people would have to relearn how temperature is measured.

      I don’t think either of them is better than the other; it’s just that you got used to which numbers mean it’s hot or cool.

    • fastandcurious@lemmy.worldOP
      link
      fedilink
      arrow-up
      5
      ·
      1 year ago

      You are entitled to your opinion and obviously have your own preference, and your way of explaining the scale is very good as well, but saying that Celsius is objectively worse is just wrong.

    • set_secret@lemmy.world
      link
      fedilink
      arrow-up
      3
      ·
      1 year ago

      The argument for Fahrenheit based on a perceived “0 to 100 scale” representing a percentage of heat can be critiqued as it misunderstands how temperature scales work. Temperature is not a percentage system; it’s a measure of thermal energy. The notion that 70 degrees Fahrenheit represents “70% heat” is not scientifically accurate as it implies that temperature is a linear scale capped at 100, which it is not. The Fahrenheit scale was actually based on arbitrary points: the freezing point of brine (0°F) and the average human body temperature (96°F at the time, which has since been adjusted to 98.6°F).

      Celsius, on the other hand, is based on the freezing and boiling points of water at 0°C and 100°C respectively, under standard atmospheric conditions. This makes it a decimal and scientifically consistent system that is easier to relate to the states of water, an essential reference in science and daily life.

      Comparing temperatures to percentages, like test scores, is a flawed analogy because temperature doesn’t have an upper-limit “score” and is not designed to be read as a proportion. The -18 to 38 range isn’t a designed scale at all; it’s simply where 0-100 °F happens to land on a scale anchored to the physical properties of water, which is logical for scientific purposes.

      Moreover, many argue that Celsius is more intuitive for everyday weather-related use outside of the U.S.: a one-degree change in Celsius is large enough to be noticeable, and the scale aligns well with the metric system, which is used globally for scientific measurement.