Many workers are faking knowledge of AI to make sure they aren’t left behind: there’s a need for more AI training, report finds

  • r00ty@kbin.life · 10 months ago

    Seems to me that while companies are bullshitting by calling generic algorithms AI, it’s fine for prospective employees to do the same.

    • jj4211@lemmy.world · 10 months ago

      I read an article the other day where an airline was bragging about using AI to predict how many passengers will buy a meal in flight, based on how many people had historically bought a meal in flight.

      That’s… Literally just an average of how many people order a meal…

      • EvilBit@lemmy.world · 10 months ago

        Ehhhhh there are much more sophisticated models than just an average. What a neural network could do is derive inferences based on a wide variety of inputs like time of day, country of origin, individual passenger characteristics, and so on.
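
        To make that concrete, here’s a minimal, purely hypothetical sketch (the features, numbers, and labels are invented, not from the article) of a classifier that conditions its meal-purchase prediction on several inputs instead of one global rate:

```python
# Hypothetical sketch, not the airline's actual system: predict a meal
# purchase from several invented features rather than one global average.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 1000
X = np.column_stack([
    rng.integers(5, 23, n),      # departure hour (invented feature)
    rng.uniform(1.0, 10.0, n),   # flight duration in hours (invented)
    rng.integers(0, 3, n),       # loyalty tier (invented)
])
# Synthetic labels: longer flights make a purchase somewhat more likely
y = (rng.random(n) < 0.10 + 0.05 * X[:, 1]).astype(int)

model = LogisticRegression().fit(X, y)
# Purchase probability for one hypothetical passenger
print(model.predict_proba([[18, 6.5, 2]])[0, 1])
```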

        • jj4211@lemmy.world · 10 months ago

          Ultimately that application is just averaging over a smaller subset.

          While admittedly I don’t know that particular scenario firsthand, it looks like several I’ve seen where we imagined AI would deliver some magic insight beyond more limited statistics, yet not one of those scenarios ever predicted any better.

          That’s not to say AI approaches are useless, but when the dataset is pretty well organized and the required predictions are straightforward, a pretty simple statistical analysis is plenty, and declaring “AI” for such a simple scenario just undermines AI’s credibility where it really can do formerly infeasible things.
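
          For what it’s worth, the “averaging over a smaller subset” version needs nothing fancier than a group-by; a hypothetical sketch (the routes, time-of-day buckets, and records are made up):

```python
# Hypothetical sketch of the "simple statistical analysis" version:
# estimate the meal-purchase rate per (route, time-of-day) bucket by
# averaging over just that subset of historical flights.
from collections import defaultdict

history = [
    # (route, time_of_day, bought_meal) -- made-up records
    ("LHR-JFK", "morning", True),
    ("LHR-JFK", "morning", False),
    ("LHR-JFK", "evening", True),
    ("CDG-FRA", "morning", False),
    ("CDG-FRA", "morning", False),
]

totals = defaultdict(lambda: [0, 0])   # bucket -> [purchases, passengers]
for route, tod, bought in history:
    totals[(route, tod)][0] += int(bought)
    totals[(route, tod)][1] += 1

rates = {bucket: bought / seen for bucket, (bought, seen) in totals.items()}
print(rates[("LHR-JFK", "morning")])   # 0.5 -- just the subset's average
```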

          • P03 Locke@lemmy.dbzer0.com · 10 months ago

            AI models are averages, except in the form of weights over a large set of matrices. However, calling them “just averaging” is grossly oversimplifying how they work.
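
            As a rough illustration of that (a toy sketch, not any particular model): a single dense layer is just a learned weighted combination of its inputs, i.e. averaging generalized to arbitrary weights.

```python
# Toy sketch: one dense layer is a learned, weighted combination of its
# inputs -- averaging generalized to arbitrary weights. Numbers are made up.
import numpy as np

x = np.array([18.0, 6.5, 2.0])        # hypothetical passenger features
W = np.array([[0.01, 0.08, 0.05]])    # "learned" weights, shape (1, 3)
b = np.array([-0.4])                  # "learned" bias

logit = W @ x + b                     # weighted sum of the inputs
prob = 1.0 / (1.0 + np.exp(-logit))   # squash into a purchase probability
print(prob)                           # ~0.6 with these made-up numbers
```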

          • EvilBit@lemmy.world · 10 months ago

            You can basically think of AI as a massively multivariate analysis that can go far beyond a directly applied model. So while yes, technically averages are involved, they’re applied in a way that makes it incredibly naive to call it “just averages”.

            Edit: it is especially not “just an average of how many people order a meal” as you had said.

  • Rentlar@lemmy.ca · 10 months ago

    Many postings are like “10 years experience in ChatGPT” anyway, so it balances out.

    • afraid_of_zombies@lemmy.world · 10 months ago

      I am so glad I am no longer doing only software development. The interview process alone is hell: “We need someone with a decade of experience in this one particular framework that only five companies on Earth use.” If your tech stack is so far from the norm that you can’t find people who know it, consider changing it.

      • P03 Locke@lemmy.dbzer0.com · 10 months ago

        The hiring process should focus on expertise in the language and the ability to program, not the framework. Knowledge of a framework is just a nice bonus, and typically easy for an adept programmer to pick up.

        • XTornado@lemmy.ml · 10 months ago

          That usually works out…but to be honest some frameworks can be a big headache…

  • jj4211@lemmy.world · 10 months ago

    My management assigned me a title implying I’m a high-level AI person. Evidently they had a mandate for X% AI experts, so a bunch of us had our titles arbitrarily changed and the mandate was satisfied.

    No one can tell we aren’t, so I guess it worked out?

    • XTornado@lemmy.ml · 10 months ago

      Yeah… Not exactly the same, but I’ve seen companies, in order to win contracts, list workers who actually hold the required certificates or whatever. They get the contract, but those guys aren’t going to touch the work at all since they’re on other projects; somebody else or another team will do it.

      I guess it’s not that bad, since if they really needed something they could eventually ask those guys to help, but…

  • BleatingZombie@lemmy.world · 10 months ago

    It’s what I do with just about every other technology at work: “Fake it until you make it.” Hopefully that day comes.

  • Matriks404@lemmy.world · 10 months ago

    Well, they shouldn’t be. AI, while often helpful in some use cases, is not nearly as powerful as some marketers want you to believe, and it very often gives useless or outright false information (for example, when it comes to chatbots).

    If you believed everything they say, you’d come to the conclusion that we should all prepare for the singularity or some other shit arriving this year or next, which is just bullshit.