👍👍👍

The tax breaks in the Inflation Reduction Act are crucial to making the deal economically feasible, according to Constellation. They provide a credit for every megawatt-hour of nuclear energy produced.
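
(A rough back-of-envelope on what a per-megawatt-hour credit could be worth here. None of these inputs come from the article: the credit rate is assumed to be in the neighborhood of the IRA's roughly $15/MWh zero-emission nuclear credit, and the plant is assumed to be roughly an 835 MW unit running at a ~90% capacity factor.)

```python
# Back-of-envelope sketch; every input is an assumption, not a figure from the article.
credit_per_mwh = 15.0     # USD per MWh; assumed for illustration
capacity_mw = 835         # assumed nameplate capacity of the restarted unit
capacity_factor = 0.90    # assumed; US nuclear plants typically run above 90%

annual_mwh = capacity_mw * 8760 * capacity_factor
annual_credit_usd = annual_mwh * credit_per_mwh
print(f"~{annual_mwh:,.0f} MWh/year -> ~${annual_credit_usd / 1e6:,.0f}M in credits per year")
```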

lmao so instead of this funding the energy transition it’s just subsidizing the AI grift

  • someone [comrade/them, they/them]@hexbear.net · 65 points · 3 days ago

    And of course the second tragedy is that the AI is absolute dogshit. They’re not powering an artificial general intelligence that could do useful things like help in running a modern global-scale Project Cybersyn. All this staggering amount of electricity wasted so that Github users don’t need to search Stackoverflow, so that people can say “hey google set a 4 minute timer” in their kitchens instead of hitting a half-dozen buttons on their microwave, so that people can tell Alexa to play Despacito.

    • iridaniotter [she/her]@hexbear.net · 29 points · edited · 3 days ago

      They’re not powering an artificial general intelligence that could do useful things like help in running a modern global-scale Project Cybersyn.

      You don’t need that for planning and in fact the People’s Commissariat for Energetics’ secret police would send you to super gulag for suggesting such a preposterous thing.

      • PorkrollPosadist [he/him, they/them]@hexbear.net · 1 point · edited · 23 hours ago

        People also put a lot of emphasis on the “I” in Artificial General Intelligence. It gives us the impression that we will have some kind of contraption with a button on it, and every time you push the button it conjures up a new, distinct digital agent of Albert Einstein. For a long, long time, at best, these things will conjure your average Redditor. People think if we create AGI we can tell computers to compose Mozart, but we’ll be lucky if we get anywhere farther than “I glued my balls to my butthole again.”

  • Infamousblt [any]@hexbear.net · 42 up / 1 down · 3 days ago

    Oh sure NOW we want nuclear power. Not because of global warming or the immense pollution that burning fossil fuels produces, no no, those aren’t good reasons to move to nuclear. But powering AI servers? That’s what we need nuclear for! That’s more important than the health of a population or the entire biosphere!

  • 12022081631 [he/him]@hexbear.net · 32 up / 1 down · 3 days ago

    ermmmmmmm. well. this is. really dumb. but its not as bad as if they were running the normal option of like 50 coal plants. can somebody sneak in and turn the AI off after its finished

    • hypercracker [he/him]@hexbear.net (OP) · 27 points · edited · 3 days ago

      On one hand the idea of some AI embodying a huge computer complex powered by its own reactor is straight out of sci-fi (yes I did just finish playing Rain World, how did you know?). On the other hand this vision is significantly undermined by the mundane reality of a radiation hazard powering a million confidently incorrect redditor chatbots

      • 12022081631 [he/him]@hexbear.net · 6 points · edited · 3 days ago

        last time i heard (old info) the three mile island release wasn’t ever confirmed to be significant. in reading the wikipedia section about its current status it seems like calling it a radiation hazard might not be any more accurate than any other nuke plant.

        e: see responses

        • iridaniotter [she/her]@hexbear.net · 6 points · edited · 3 days ago

          It’s a tossup between Three Mile Isle and Centralia for who gets to be called Pennsylvania’s Chernobyl. (Personally I vote Centralia since it’s still a hazard… I should visit)

          • chickentendrils [any, comrade/them]@hexbear.net · 9 points · 3 days ago

            Centralia is fun.

            Three Mile is the undisputed PA Chernobyl for me. My family were friends with another family whose mother and daughter were from Philly but just happened to be a few miles from Three Mile the day things went down. Both of them developed breast cancer decades apart, with no prior family history thinky-felix

        • HumongousChungus [she/her]@hexbear.net · 3 points · 3 days ago

          Current status: definitely not an ongoing hazard. At the time, though, a husband-and-wife team who had joined up as a radiation monitoring technician and a senior surveillance technician, the Thompsons, spoke out about a health/dosimeter badge coverup and had to flee town after a stranger warned them their lives were in danger. When they settled in NM and began working on a book about it with the wife's brother, he and the husband were run off the road; the brother was killed, and the manuscript that was in the trunk went 'missing'.

          Epidemiology links increased rates of health issues that stem from ionizing radiation both to the locations surrounding the incident and to the areas downwind. Jean Trimmer, who was in the area, reported a flash of heat and rain, followed by bad sunburns, her hair turning white and falling out, and an idiopathic atrophy of the kidney strange enough that it was presented to a symposium of nearby doctors. None of this is consistent with the official exposure estimates, but it does match the symptoms of acute exposure at a much higher dose.

          Of course, this was also a time when the Soviets presented an information warfare challenge. By the same token, disasters of any size and sort are often covered up when there's a cold war justification. See: the pandemic (ongoing, unabated)

          Potentially, the only difference between this and foreign radioactive disasters is the competency of US intelligence. I would not be surprised to learn much later that a coverup was instituted, which would have been perfectly possible, especially in the information environment of the time. I recommend nuclear energy advocates stop condescendingly using it as an example of nuclear panic and instead make an effort to compassionately address people's concerns over potential health hazards and lack of government support in the future, at the very least to avoid embarrassment and backlash if a "full story" ever comes out about the incident.

          • 12022081631 [he/him]@hexbear.net · 2 points · 3 days ago

            Hey thanks for this response. I’ll try to refrain from going too much farther down the rhetorical line I was kind of representing if this particular incident comes up again. Nuclear energy has always seemed boss to me but I like the way you frame this with these different contexts

              • 12022081631 [he/him]@hexbear.net · 2 points · 3 days ago

                I didn’t necessarily feel you were. I was riding on fumes of watching Penn & Teller: Bullshit! as a teen ( cringe ) and I knew it, so I wanted to cover my bases

  • UlyssesT [he/him]@hexbear.net · 24 points · 3 days ago

    So very fucking sick and tired of this treat printer shit, most of all because of the additional environmental devastation and wasted energy in an already environmentally collapsing world.

    I don’t want to hear about any more “interesting times ahead” about the “potential” of this shit. This shit’s doing “interesting” enough damage already. I hope one day it’s seen in the same light as leaded gasoline, CFCs in hair spray, and partially hydrogenated soybean oil.

  • InevitableSwing [none/use name]@hexbear.net · 21 points · edited · 3 days ago

    I didn't have the motivation to read the whole thing, so I scanned it for funny stuff. But it looked dreary, so the only thing I read was the final paragraph. The article ends on a funny note. Tech companies don't even make an effort to lie convincingly anymore. Look at this shit.

    Microsoft has signed a contract to purchase fusion energy from a start-up that claims it can deliver it by 2028.

    ---

    Edit

    Related - [“tech bro bullshit” news] Nuclear fusion startup Helion claims it will have a working power plant by 2028. Microsoft is already a customer. More in body. - Hexbear

      • InevitableSwing [none/use name]@hexbear.net · 12 points · 3 days ago

        It might happen by 2128

        Seriously though - I wonder how that firm chose four years. "About a decade" is equally bullshit, but to some people it would sound like a moonshot they might get to. But four years sounds like purely made-up bullshit to appeal to VC firms.

    • QuillcrestFalconer [he/him]@hexbear.net · 12 points · 3 days ago

      That startup (Helion or whatever it's called) claimed in 2013 they would be producing power by 2018, then in 2018 claimed they would be producing power by 2023, and then in 2023 claimed they would be producing power by 2028. I'm starting to see a pattern.

  • dom [he/him]@hexbear.net · 14 points · 3 days ago

    Reactivating a notorious nuclear power plant solely to run AI sounds like a story beat that was cut from a Kojima game.

    • ComradeKingfisher [he/him]@hexbear.net · 15 points · 3 days ago

      Yes

      spoiler

      Each time you use AI to generate an image, write an email, or ask a chatbot a question, it comes at a cost to the planet.

      In fact, generating an image using a powerful AI model takes as much energy as fully charging your smartphone, according to a new study by researchers at the AI startup Hugging Face and Carnegie Mellon University. However, they found that using an AI model to generate text is significantly less energy-intensive. Creating text 1,000 times only uses as much energy as 16% of a full smartphone charge.
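
      (To put the phone-charge comparison in ordinary units, here is a small conversion sketch. The battery size is an assumption, on the order of a typical phone's ~12 Wh, and is not a figure from the study.)

      ```python
      # Rough unit conversion of the phone-charge comparison above.
      # The battery size is an assumption, not a number from the study.
      phone_charge_wh = 12.0                              # Wh in one full charge, assumed

      energy_per_image_wh = phone_charge_wh               # one image ~ one full charge
      energy_per_1000_texts_wh = 0.16 * phone_charge_wh   # 16% of a charge per 1,000 generations
      energy_per_text_wh = energy_per_1000_texts_wh / 1000

      print(f"per image: ~{energy_per_image_wh:.1f} Wh")
      print(f"per text generation: ~{energy_per_text_wh:.4f} Wh "
            f"(~{energy_per_image_wh / energy_per_text_wh:,.0f}x less per text)")
      ```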

      Their work, which is yet to be peer reviewed, shows that while training massive AI models is incredibly energy intensive, it’s only one part of the puzzle. Most of their carbon footprint comes from their actual use.

      The study is the first time researchers have calculated the carbon emissions caused by using an AI model for different tasks, says Sasha Luccioni, an AI researcher at Hugging Face who led the work. She hopes understanding these emissions could help us make informed decisions about how to use AI in a more planet-friendly way.

      Luccioni and her team looked at the emissions associated with 10 popular AI tasks on the Hugging Face platform, such as question answering, text generation, image classification, captioning, and image generation. They ran the experiments on 88 different models. For each of the tasks, such as text generation, Luccioni ran 1,000 prompts, and measured the energy used with a tool she developed called Code Carbon. Code Carbon makes these calculations by looking at the energy the computer consumes while running the model. The team also calculated the emissions generated by doing these tasks using eight generative models, which were trained to do different tasks.
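
      (As an illustration of the kind of measurement loop described here, the sketch below wraps a small Hugging Face text-generation pipeline in the open-source codecarbon tracker. The model, prompt, and settings are placeholders, not the study's actual harness.)

      ```python
      # Illustrative sketch only: estimate the emissions of 1,000 text generations
      # with the codecarbon package. Model and prompt are placeholders.
      from codecarbon import EmissionsTracker
      from transformers import pipeline

      generator = pipeline("text-generation", model="gpt2")

      tracker = EmissionsTracker(project_name="text-generation-demo")
      tracker.start()
      for _ in range(1000):                      # the study ran 1,000 prompts per task
          generator("The sky is", max_new_tokens=20)
      emissions_kg = tracker.stop()              # estimated kg of CO2-equivalent

      print(f"Estimated emissions for 1,000 generations: {emissions_kg:.6f} kg CO2e")
      ```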

      Generating images was by far the most energy- and carbon-intensive AI-based task. Generating 1,000 images with a powerful AI model, such as Stable Diffusion XL, is responsible for roughly as much carbon dioxide as driving the equivalent of 4.1 miles in an average gasoline-powered car. In contrast, the least carbon-intensive text generation model they examined was responsible for as much CO2 as driving 0.0006 miles in a similar vehicle. Stability AI, the company behind Stable Diffusion XL, did not respond to a request for comment.
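
      (Converting those driving figures back into grams of CO2: the per-mile factor below is an assumption, roughly the EPA's ~400 g CO2 per mile for an average gasoline car; the mileage numbers are the study's.)

      ```python
      # Driving-distance comparison converted to grams of CO2.
      # Per-mile factor is assumed (~400 g CO2/mile for an average gasoline car).
      g_co2_per_mile = 400.0

      images_1000_g = 4.1 * g_co2_per_mile       # 1,000 images, Stable Diffusion XL
      texts_1000_g = 0.0006 * g_co2_per_mile     # 1,000 generations, lightest text model

      print(f"1,000 images: ~{images_1000_g:,.0f} g CO2 (~{images_1000_g / 1000:.1f} g each)")
      print(f"1,000 texts:  ~{texts_1000_g:.2f} g CO2")
      print(f"ratio: ~{images_1000_g / texts_1000_g:,.0f}x")
      ```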

      The study provides useful insights into AI’s carbon footprint by offering concrete numbers and reveals some worrying upward trends, says Lynn Kaack, an assistant professor of computer science and public policy at the Hertie School in Germany, where she leads work on AI and climate change. She was not involved in the research.

      These emissions add up quickly. The generative-AI boom has led big tech companies to integrate powerful AI models into many different products, from email to word processing. These generative AI models are now used millions if not billions of times every single day.

      The team found that using large generative models to create outputs was far more energy intensive than using smaller AI models tailored for specific tasks. For example, using a generative model to classify movie reviews according to whether they are positive or negative consumes around 30 times more energy than using a fine-tuned model created specifically for that task, Luccioni says. The reason generative AI models use much more energy is that they are trying to do many things at once, such as generate, classify, and summarize text, instead of just one task, such as classification.
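
      (To make that comparison concrete, here is a sketch of the two approaches to the same sentiment task: a small fine-tuned classifier versus prompting a general-purpose generative model. The checkpoints are common public ones picked for illustration, not necessarily the models benchmarked in the study.)

      ```python
      # Two routes to the same movie-review classification task (illustrative only).
      from transformers import pipeline

      review = "A gorgeous, heartbreaking film I will be thinking about for years."

      # Task-specific route: a small model fine-tuned for sentiment classification.
      classifier = pipeline("sentiment-analysis")   # DistilBERT-based default checkpoint
      print(classifier(review))                     # e.g. [{'label': 'POSITIVE', 'score': ...}]

      # Generative route: prompt a general text-generation model and read off its answer.
      # Per the study, this style of approach used ~30x more energy for classification.
      generator = pipeline("text-generation", model="gpt2")
      prompt = f"Review: {review}\nIs this review positive or negative? Answer:"
      print(generator(prompt, max_new_tokens=5)[0]["generated_text"])
      ```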

      Luccioni says she hopes the research will encourage people to be choosier about when they use generative AI and opt for more specialized, less carbon-intensive models where possible.

      “If you’re doing a specific application, like searching through email … do you really need these big models that are capable of anything? I would say no,” Luccioni says.

      The energy consumption associated with using AI tools has been a missing piece in understanding their true carbon footprint, says Jesse Dodge, a research scientist at the Allen Institute for AI, who was not part of the study.

      Comparing the carbon emissions from newer, larger generative models and older AI models is also important, Dodge adds. “It highlights this idea that the new wave of AI systems are much more carbon intensive than what we had even two or five years ago,” he says.

      Google once estimated that an average online search used 0.3 watt-hours of electricity, equivalent to driving 0.0003 miles in a car. Today, that number is likely much higher, because Google has integrated generative AI models into its search, says Vijay Gadepally, a research scientist at the MIT Lincoln lab, who did not participate in the research.
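
      (A quick sanity check on that old estimate, using assumed conversion factors of roughly 0.4 kg CO2/kWh for grid electricity and ~400 g CO2 per mile driven; neither number is from the article.)

      ```python
      # Sanity check: 0.3 Wh per search vs. 0.0003 miles driven (both factors assumed).
      grid_kg_co2_per_kwh = 0.4
      car_g_co2_per_mile = 400.0

      search_g_co2 = (0.3 / 1000) * grid_kg_co2_per_kwh * 1000   # Wh -> kWh -> grams CO2
      print(f"~{search_g_co2:.2f} g CO2 per search "
            f"~= {search_g_co2 / car_g_co2_per_mile:.4f} miles driven")
      ```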

      Not only did the researchers find emissions for each task to be much higher than they expected, but they discovered that the day-to-day emissions associated with using AI far exceeded the emissions from training large models. Luccioni tested different versions of Hugging Face’s multilingual AI model BLOOM to see how many uses would be needed to overtake training costs. It took over 590 million uses to reach the carbon cost of training its biggest model. For very popular models, such as ChatGPT, it could take just a couple of weeks for such a model’s usage emissions to exceed its training emissions, Luccioni says.

      This is because large AI models get trained just once, but then they can be used billions of times. According to some estimates, popular models such as ChatGPT have up to 10 million users a day, many of whom prompt the model more than once.
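
      (A toy break-even calculation in the spirit of that comparison: the 590 million figure is the article's BLOOM number, while the daily-prompt count is a loose assumption based on "10 million users a day, many of whom prompt the model more than once." ChatGPT's own training and per-query emissions differ from BLOOM's, so this only illustrates why heavy usage overtakes training so quickly.)

      ```python
      # Toy break-even sketch; the daily-prompt count is an assumption.
      break_even_uses = 590_000_000          # uses to match training emissions (BLOOM figure)
      assumed_prompts_per_day = 30_000_000   # assumed: ~3 prompts per daily user

      days_to_break_even = break_even_uses / assumed_prompts_per_day
      print(f"~{days_to_break_even:.0f} days of use to pass training emissions")
      ```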

      Studies like these make the energy consumption and emissions related to AI more tangible and help raise awareness that there is a carbon footprint associated with using AI, says Gadepally, adding, “I would love it if this became something that consumers started to ask about.”

      Dodge says he hopes studies like this will help us to hold companies more accountable about their energy usage and emissions.

      “The responsibility here lies with a company that is creating the models and is earning a profit off of them,” he says.

        • AlbedoORourke [he/him]@hexbear.net · 13 points · 3 days ago

          this

          There’s also the energy usage around obsessively gripe-posting about AI to consider.

          This is why we need to get a watts-used-per-unit-of-entertainment scale going, so we can determine a treat hierarchy.

          • UlyssesT [he/him]@hexbear.net · 2 points · edited · 3 days ago

            Reaching hard to run interference for those treat printers again, aren’t you?

            obsessively

            The billionaires that own the most environmentally devastating data centers and the techbro startups that keep propping up new ones will probably manage just fine without your stanning for them.

        • UlyssesT [he/him]@hexbear.net · 6 points · 3 days ago

          It’s probably still a problem, but I think a lot more overall entertainment value comes out of the same amount of electricity use and carbon waste in bideo bames than in hitting a prompt button over and over again to get a satisfactory cyberpunkerino waifu to go with the hundreds to thousands already in the spank bank folder.

            • UlyssesT [he/him]@hexbear.net · 1 point · 3 days ago

              I don't think Microsoft is trying to reopen a power plant because there's some sort of modernist gooning wave happening, but I could be wrong.

              It's a small part of it, but the larger part is coming primarily from expansions of the corporate surveillance state and even more data collection, not like that's better.