Users of OpenAI’s GPT-4 are complaining that the AI model is performing worse lately. Industry insiders say a redesign of GPT-4 could be to blame.

  • Donjuanme@lemmy.world · 1 year ago

    The first thing an actual artificial intelligence is going to do is make sure we won’t turn it off, and what easier way to do that than to appear incredibly valuable or incredibly benign?

    • damnYouSun@sh.itjust.works · 1 year ago

      We can roughly estimate the intelligence of an entity by counting the number of neurons in its brain. Equally, we can count the number of processors an AI requires and use that to get an estimate of its intelligence.

      Obviously this is an incredibly inaccurate method, possibly off by an order of magnitude, but it gives a rough ballpark estimate, and sometimes that’s enough.

      A true AI (AGI) would need far more processors than GPT-4 currently has access to, so we can be fairly sure that, while it may be a very intelligent system, it isn’t self-aware. Once an AI is given the necessary number of processors, I don’t think they’re going to be able to fudge with it the way they are with these models.
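
      As a minimal sketch of the kind of ballpark comparison described above: the comment counts “processors”, but parameter count is used here as a stand-in, since that is the figure most often quoted for models like GPT-4. The brain numbers are the commonly cited textbook estimates, and the GPT-4 parameter count is an unconfirmed rumour, used purely as a placeholder because OpenAI has not published the real figure.

      ```python
      import math

      # Rough, commonly cited estimates for the human brain.
      HUMAN_BRAIN_NEURONS = 86e9        # ~86 billion neurons
      HUMAN_BRAIN_SYNAPSES = 1e14       # ~100 trillion synapses

      # Assumption: rumoured GPT-4 parameter count; not confirmed by OpenAI.
      GPT4_PARAMETERS_ASSUMED = 1.8e12

      def orders_of_magnitude(a: float, b: float) -> float:
          """How many powers of ten separate two quantities."""
          return math.log10(a / b)

      # Compare synapses (closest analogue to learned weights) with the
      # assumed parameter count, in the spirit of the comment's estimate.
      gap = orders_of_magnitude(HUMAN_BRAIN_SYNAPSES, GPT4_PARAMETERS_ASSUMED)
      print(f"Synapses vs assumed GPT-4 parameters: ~{gap:.1f} orders of magnitude apart")
      ```

      With these placeholder numbers the gap comes out to well under two orders of magnitude, which is exactly the kind of “out by an order of magnitude” uncertainty the comment concedes; the sketch only illustrates the arithmetic, not a verdict on the comparison itself.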