OpenAI just admitted it can’t identify AI-generated text. That’s bad for the internet and it could be really bad for AI models.

In January, OpenAI launched a system for identifying AI-generated text. This month, the company scrapped it.

  • Peanut@sopuli.xyz · 1 year ago

    again, the issue isn’t the technology, but the system that forces every technological development into functioning “in the name of increased profits for a tiny few.”

    that has been an issue for the fifty years prior to LLMs, and will continue to be the main issue after.

    removing LLMs or other AI will not fix the issue. why is it constantly framed as if it would?

    we should be demanding the system adjust for the productivity increases we’ve already seen, as well as for what we expect in the near future. the system should make every advancement a boon for the general populace, not the obscenely wealthy few.

    even the fears of propaganda fit this pattern: the wealthy can already afford to manipulate public discourse beyond the general public’s ability to keep up. the bigger issue is in plain sight, but is still being largely ignored in favor of the slant that “AI is the problem.”

    • P03 Locke@lemmy.dbzer0.com · 1 year ago

      Yep, the problem was never LLMs, but billionaires and the rich. The problem has always been the rich, for thousands of years, and yet they have been immensely successful at deflecting blame onto other groups for just as long. They will claim it’s Chinese immigrants, or Black people, or Mexicans, or gay people, or trans people. Now LLMs and AI are the new bogeyman.

      We should be talking about UBI, not LLMs.

      • agitatedpotato@lemmy.world · 1 year ago

        Sure, but let’s say you try to solve this problem. What’s the first thing you think a coordinated group could actually do: get sensible regulations about AI, or overthrow global capitalism? It’s framed the way it is because, unless you want to revolt, that’s the framework we’re going to have to use to deal with it. I suppose we could always do nothing about AI specifically and focus on just overthrowing capitalism, but during that time a lot of harm will come to a lot of workers because of AI use. I don’t think anti-capitalism has reached a critical mass (which we need for any real system-wide attack on, and alternative to, capitalism), so I think dealing with this AI problem, and trying to let everyone else know it’s really a capitalism thing, would do more to build support and avert harm to workers. I hate that it’s like that too, but those are basically the real options we have moving forward, from my POV.

        • Gutless2615@ttrpg.network · 1 year ago

          You tell me what “sensible regulations about AI” are that don’t hurt small artists and creators while centralizing the major players and enriching copyright-hoarding, copyright-maximalist corporations. (Seriously, this isn’t bait. I’ve been racking my brain on the issue for months, because the only serious proposals so far either expand the already far-too-broad copyright regime to things like covering training, or grant artists more rights to their work during their lifetime, something that will only hurt small artists.) We desperately need more fair use, not less. The only “sensible regulation” we should and could be talking about is some form of UBI. That’s it.

          • agitatedpotato@lemmy.world · 1 year ago

            UBI is a bandaid that doesn’t solve the core issues of production under capitalism: the people with capital still control production, still make more money than everyone else, and still have more money and power to use influencing the politicians who write the laws surrounding UBI. And expecting me to solve the AI problem in a comment section is like me asking you to implement UBI in a way that landlords don’t just jack up rent, or businesses don’t inflate prices with more cash and demand floating around. Also, what’s your plan for when the level of UBI legislated, or the planned increases in UBI, is no longer sufficient to pay for housing, food, and other necessities? What do you do to counter the fact that the capitalists still have more access to politicians, and media empires they can use to discredit and remove UBI?

            • Gutless2615@ttrpg.network · 1 year ago

              UBI is a bandaid, sure. But bandaids actually help; “sensible AI regulations”, a nothing phrase that will most likely materialize as yet another expansion of copyright, will actively make things worse. UBI is achievable, and it can be expanded on once it’s enacted. You establish protections and regulations that actually help people, and you dare the opposition to ever try to take them away, instead of carrying water for copyright maximalists along the way.

              • P03 Locke@lemmy.dbzer0.com · 1 year ago

                “a nothing phrase that will most likely materialize as yet another expansion of copyright”

                Exactly. We need to break apart copyright with a crowbar. It’s a broken system that only benefits the rich, and AI has the opportunity to turn the entire system into a pile of unenforceable garbage.

              • agitatedpotato@lemmy.world · 1 year ago

                Why does legislation or regulation surrounding AI necessarily have to be copyright maximalism, while UBI regulations are somehow, in some unspecified way, going to be strong enough to prevent lobbying from the people who still control the means of production? Your argument gets to use magic regulations that don’t get challenged or changed, but my argument is stuck with the one mainstream idea that has people worried?

                • Gutless2615@ttrpg.network · 1 year ago

                  Because those are the only “sensible AI regulations” seriously being talked about. Tell me any other actual regulatory schemes being proposed that aren’t, and I’ll be happy to talk about those, and likely support them. I’m not getting the hostility, btw. FWIW, this (getting stronger consumer protection laws passed) is literally my job; I’m going to go out on a limb and say we probably agree on more than we disagree, based on your comment history. Obviously UBI won’t be enough, will never be enough, to oust capitalists from having an outsized influence on policy, but what I don’t support at all are regulations that would further entrench the corporate IP holders and tech companies that would actually benefit from the copyright-maximalist proposals currently being bandied about by the fear-mongering anti-generative-AI discourse.

                  Fundamentally we’re not going to copyright our way out of the externalities AI brings with it.

                  • agitatedpotato@lemmy.world · 1 year ago

                    My argument isn’t limited to the regulations currently being talked about any more than your argument is limited to the kinds of UBI currently being talked about, because I’m not hearing any talk about UBI at all.

        • P03 Locke@lemmy.dbzer0.com · 1 year ago

          “get sensible regulations about AI”

          There’s no such thing as “sensible regulations” for AI. AI is a technological advantage. Any time you regulate that advantage, other groups that don’t have those regulations will fuck you over. Even if you start talking about regulations, the corpos will take over and fuck you over with regulations that only hurt the little guy.

          Hell, even without regulations, we’re already seeing this on the open-source vs. capitalism front. Google admitted that it lost some advantages because of open-source AI tools, and now these fucking removed are trying to hold on to their technology as closely as possible. This is technology that needs to be free and open-source, and we’re going to see a fierce battle, with multi-billion-dollar capitalist corporations clawing back whatever technological gains OSS has acquired, until you’re forced to spend hundreds or thousands of dollars to use a goddamn chess bot.

          GPLv3 is key here, and we need to force these fuckers into permanent copyleft licenses that they can’t revoke. OpenAI is not open, StabilityAI is not the future, and Google is not your friend.

          • agitatedpotato@lemmy.world · 1 year ago

            Isn’t forcing a copyleft license exactly a regulation that would be sensible, though? So why wouldn’t regulations and legislation work, if that’s your solution too?

            • P03 Locke@lemmy.dbzer0.com · 1 year ago

              There’s never been a bill with the words “copyleft” or “GNU General Public License” in it at all, and thanks to corpo lobbyists, there probably never will be. We have to be realistic here, and the only realistic option is to encourage as much protected open-source software on the subject as possible.

    • jackoneill@lemmy.world · 1 year ago

      This isn’t a technological issue, it’s a human one

      I totally agree with everything you said, and I know that it will never ever happen. Power is used to get more power. Those in power will never give it up, only seek more. They intentionally frame the narrative to make the more ignorant among us believe that the tech is the issue rather than the people that own the tech.

      The only way out of this loop is for the working class to rise up and murder these removed en masse

      Viva la revolucion!

    • Ostrichgrif · 1 year ago

      I completely agree with you; AI should be seen as a great thing, but we all know that the society we live in will not pass those benefits on to the average person. In fact, it’ll probably be used to make life worse. From a leftist perspective it’s very easy to see this, but from the average person’s position, at least in the US, people aren’t thinking about how our society slants AI toward being evil and scary; they just think AI is evil and scary. Again, I completely agree with what you’ve said, it’s just important to remember how reactionary the average person is.

    • glockenspiel@lemmy.world · 1 year ago

      It is a completely understandable stance in the face of the economic model, though. Your argument could just as easily be used to explain why firearms shouldn’t be regulated at all: it isn’t the technology, so we should allow the sale of actual machine guns (beyond the existing weird loopholes) and grenade launchers.

      The reality is that the technology is targeted by the people affected by it because we are hopeless in changing the broader system which exists to serve a handful of parasitic non-working vampires at the top of our societies.

      Edit: not to suggest that I’m against AI and LLMs. I want my fully automated luxury communism and I want it now. However, I get why people are turning against this stuff. They’ve been fucked six ways from Sunday and they know how this is going to end for them.

      Plus, a huge amount of AI doomerism is being pushed by the entrenched, monied AI players, like OpenAI and Meta, in order to use a captured government to regulate potential competition out of existence.