Feel like we’ve got a lot of tech-savvy people here, so it seems like a good place to ask. Basically, as a dumb guy that reads the news, it seems like everyone that lost their mind (and savings) on crypto just pivoted to AI. In addition to that, you’ve got all these people invested in AI companies running around with flashlights under their chins like “bro this is so scary how good we made this thing”. Seems like bullshit.

I’ve seen people generating bits of code with it, which seems useful, but idk man. Coming from CNC, I don’t think I’d just send it with some ChatGPT code. Is it all hype? Is there something actually useful under there?

  • HamSwagwich@showeq.com · 1 year ago

    > They can assemble impressive stuff at a rapid speed but are incapable of completely novel “ideas” - everything that they output is built from a statistical model of existing data.

    You just described basically 99.999% of humans as well. If you are arguing for general human intelligence, I’m on board. If you are trying to say humans are somehow different than AI, you have NFC what you are doing.

    • nickwitha_k (he/him)@lemmy.sdf.org · 1 year ago

      I think we’re on a very similar page. I don’t mean that human intelligence is in a different category from potential artificial intelligence, or that it’s somehow impossible to approximate or achieve (we’re just evolutionarily designed, self-replicating meat-computers). I mean that LLMs are not intelligent and do not comprehend their inputs or datasets; they statistically model them (an important and significant difference). It would make sense to me that they could play a role in the development of AI but, by themselves, they are no more AI than PCRE is a programming language.
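
      To make the “statistically model” point concrete, here’s a deliberately toy sketch (nothing close to a real transformer, just bigram counts): a tiny Markov chain in Python that spits out plausible-looking word sequences purely from co-occurrence statistics in its training text. It has no notion of meaning at all; real LLMs are vastly more capable, but the family resemblance (sampling from learned statistics rather than comprehending) is the point.

      ```python
      # Toy bigram "language model": record which word follows which,
      # then sample from those counts. No comprehension anywhere.
      import random
      from collections import defaultdict

      corpus = (
          "the machine reads the data and the machine writes the report "
          "and the report reads well but the data reads badly"
      ).split()

      # The entire "model" is a table of observed followers for each word.
      followers = defaultdict(list)
      for current_word, next_word in zip(corpus, corpus[1:]):
          followers[current_word].append(next_word)

      def generate(start: str, length: int = 12) -> str:
          """Sample a plausible-sounding sequence with zero understanding."""
          words = [start]
          for _ in range(length - 1):
              options = followers.get(words[-1])
              if not options:
                  break
              words.append(random.choice(options))
          return " ".join(words)

      print(generate("the"))  # e.g. "the data and the machine writes the report reads well ..."
      ```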