• BodyBySisyphus [he/him]@hexbear.net
      3 months ago

      Because LLMs aren’t useful enough to be profitable, and the investments companies are making in infrastructure only make sense if they represent a viable stepping stone toward AGI. If LLMs are a dead end, a lot of money may be about to go up in smoke.

      The other problem is that they are mainly good at creating the illusion that they work well, and the main barrier to implementation, their tendency to hallucinate, may not be fixable.

      • Kefla [she/her, they/them]@hexbear.net
        3 months ago

        Of course it isn’t fixable, and I’ve been saying this since like 2021. Hallucination isn’t a bug that mars their otherwise stellar performance; hallucination is the only thing they do. Since nothing they generate is founded on any sort of internal logic, everything they generate is hallucination, even the parts that coincidentally line up with reality.