• KobaCumTribute [she/her]@hexbear.net · 3 points · 45 minutes ago (edited)

    A lot of people here are missing the funniest thing about this: SAI is floundering, has lost most of its tech talent, and suffered hard from the one-two punch of SD3 sucking complete shit and Flux showing up like a month later and being everything people had expected SD3 to be, but better. SAI has also been pivoting away from the open-source release model that got them literally all of the attention they’ve gotten in the first place.

    So it looks like James Cameron’s role in this would be trying to use his reputation to grift more investor money to keep afloat a company that no longer has the engineers responsible for all the popular Stable Diffusion models. I wonder if he knows he’s hopping onto a failing grift, or if they’ve successfully tricked him into thinking there’s anything of value left in SAI?

  • laziestflagellant [they/them]@hexbear.net · 9 points · 2 hours ago

    Isn’t he supposed to be an infamous perfectionist with his work?

    I guess he might be in the 0.001% of AI users who can actually get something usable out of the tools, because he’s willing to hogtie it and drag it through the streets until it does exactly what he wants, but like

    You have shitzillion-dollar CGI industries at your beck and call that can actually make 3D assets the regular way, lmao, so why would you bother browbeating the slop machine instead?

  • StalinStan [none/use name]@hexbear.net · 3 points · 1 hour ago

    Nah, now shit is really gonna pop off. I am excited. ’Cause as much as this hurts artists, it will eventually hurt the studios the most. And after all that settles, we can just buy a cheap Chinese solar panel to give us infinite cheap treats

    • UlyssesT [he/him]@hexbear.net · 22 points · 3 hours ago

      Imagine Skynet, except destroying the planet to make more movies about Skynet faster and more conveniently with fewer paid workers. brrrrrrrrrrrr

  • Huldra [they/them, it/its]@hexbear.net · 12 points · 2 hours ago

    I’m pretty sure he has already done like two “4K remasters” where he just took whatever the latest Blu-ray was, then upscaled and denoised/re-grained it with AI so it looks like dogshit.

    • UlyssesT [he/him]@hexbear.net · 9 points · 2 hours ago (edited)

      > I’m pretty sure he has already done like two “4K remasters” where he just took whatever the latest Blu-ray was, then upscaled and denoised/re-grained it with AI so it looks like dogshit.

      Sounds like what George Lucas did to his own Star Wars movies, except in some ways worse.

      • LaGG_3 [he/him, comrade/them]@hexbear.net · 3 points · 46 minutes ago (edited)

        Imagine all the little freak Glup Shittos that would be added in via AI if Lucas hadn’t sold to Disney lmao

        Edit: Well, I guess now we just have the digital corpses of OT actors edited into shit indefinitely

  • bazingabrain [comrade/them]@hexbear.net · 16 points · 3 hours ago

    CGI 3 decades ago has nothing to do with what CGI is today lol. It’s like comparing the fucking Lumière brothers’ cinématographe to a camera from RED.

    • AntiOutsideAktion [he/him]@hexbear.net · 9 points · 3 hours ago

      Moreover, CGI 3 decades ago had nothing to do with stealing people’s art today and presenting it as your own as long as it goes through a computer Rube Goldberg machine

    • UlyssesT [he/him]@hexbear.net · 10 points · 3 hours ago

      It’s like Chris Roberts claiming to be a cutting-edge “code whisperer” to his credulous Star Citizen cultists when he hasn’t actually coded a game since fucking 1994.

        • UlyssesT [he/him]@hexbear.net · 3 points · 2 hours ago

          Chris Roberts found a way to get rich GenX nerds to pay for his vacations.

          He even publicized a self-quote about how he’d work so hard and use that crowdfunded money so well that you’d never see him on a yacht.

  • UlyssesT [he/him]@hexbear.net · 17 points · 3 hours ago

    Oh boy even more slop and now the slop will derive itself from prior slop and crowd out almost everything else and burn the planet down faster than ever before, and if you don’t like it you’re an emotional Luddite! Bazinga!

  • bortsampson [he/him, any]@hexbear.net · 6 points · 3 hours ago (edited)

    I wonder if this has to do with integrating into modeling/VFX software like Houdini and Blender. Those “AI” tools are actually pretty good use cases for the tech.

    Edit: I mean the texture generators specifically.

    • UlyssesT [he/him]@hexbear.net · 4 points · 2 hours ago

      > I wonder if this has to do with integrating into modeling/VFX software like Houdini and Blender.

      How do the energy demands and carbon output of that compare to prior conventional methods? I don’t know; I’m actually asking.

      If it’s anything like the giant coal-powered hell-factory data centers that tech startups are using and expanding right now to chase the “AI” hype dragon, we don’t need more of that.

      • bortsampson [he/him, any]@hexbear.net · 2 points · 2 hours ago

        Depends on the texture size and model. The models from before the LLM and SDF hype are extremely efficient with the right libraries. Without getting too into the weeds, these older ones are no different than doing any other image processing (like applying a Gaussian blur to an 8K image or something). Newer stuff uses diffusion models, so something like a 500x500 texture is probably the equivalent of leaving a 25-watt lightbulb on for an hour, at worst. But you do it once and the texture repeats, so it’s efficient. I know the early models were trained on cropped animal patterns, zoomed-in materials, and other very vanilla datasets.
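        Rough back-of-the-envelope numbers, if it helps put that lightbulb comparison in perspective (this is a sketch only; the wattages and runtimes below are my assumptions for illustration, not measurements of any particular model):

        ```python
        # Back-of-the-envelope energy comparison for producing one texture.
        # All wattage and runtime figures here are illustrative assumptions.

        def energy_wh(power_watts: float, runtime_seconds: float) -> float:
            """Energy in watt-hours = power (W) * time (h)."""
            return power_watts * runtime_seconds / 3600.0

        # Older filter-style processing: a few seconds of CPU work,
        # comparable to running a Gaussian blur over a large image.
        classic_filter = energy_wh(power_watts=65, runtime_seconds=10)

        # Diffusion-based generation: assume a ~300 W GPU busy for ~5 minutes
        # per usable 500x500 texture, retries included.
        diffusion_pass = energy_wh(power_watts=300, runtime_seconds=5 * 60)

        print(f"classic filter:  {classic_filter:.2f} Wh")              # ~0.18 Wh
        print(f"diffusion pass:  {diffusion_pass:.1f} Wh")              # ~25 Wh
        print(f"same as a 25 W bulb for {diffusion_pass / 25:.1f} h")   # ~1 hour
        ```

        Done once per repeating texture, that ~25 Wh is trivial; it only becomes a problem when it runs constantly at scale.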

          • bortsampson [he/him, any]@hexbear.net · 2 points · 1 hour ago

            No problem. It’s appalling that companies are pushing this out at scale and virtually uncapped. These models use as much power as large-scale scientific simulations/models, and video even more so. You don’t want a lot of people running these concurrently. VERY BAD lol