• JoeKrogan@lemmy.world · 68↑ 12↓ · 9 months ago

    No you shouldn’t. Google has enough data already. If it is not self hosted it can’t be trusted.

    • Carighan Maconar@lemmy.world · 33↑ 23↓ · 9 months ago

      The idea that you should rely exclusively on self-hosted approaches is just as absurd as the idea that you should blindly trust everyone.

      Plus, if they have, as you say, “enough” data already, then surely giving them more doesn’t actually hurt you in any way, shape or form?

      • notenoughbutter@lemmy.ml · 16↑ 1↓ · 9 months ago

        yeah, self-hosting may be a bit much for everyone, but they should at least make the training dataset open, since an AI is biased by whatever data it is trained on

        e.g. how some smart taps won’t work for Black people because the company just didn’t train the sensors to work with dark skin

        imo, nextcloud took the best approach here, letting users use ChatGPT 4 if needed while still making a totally in-house FLOSS option available

        • Carighan Maconar@lemmy.world · 4↑ 2↓ · 9 months ago

          Because it’s just unnecessary. By their nature, you want a few services reachable from anywhere anyway. There’s no reason for the average consumer to acquire hardware for this purpose. Just rent the service or the hardware elsewhere, which also reduces upfront cost; that’s ideal when you can’t know whether you’ll stick with the service.

          Again, either extreme is absurd. You don’t need your own video streaming platform, for example. In rare cases, sure. For the vast majority of people, however, Netflix is a much better service.

          • umbrella@lemmy.ml · 2↑ · edited · 9 months ago

            hard disagree on that one, the opposite is true. we end up with companies centralizing it in huge datacenters and not even being able to profit from it (services like youtube are unprofitable). the best solution would be a federated service. I digress though, because video platforms are a completely different beast.

            something as personal as an ai assistant should utilize the processing power i already have available; it’s wasteful not to.

            also it’s a BAD idea to hand data for something so personal over to google yet again. let’s not keep repeating that mistake if we can avoid it.

    • soulfirethewolf@lemdro.id · 4↑ · 9 months ago

      I would love to self-host something like that, but I don’t have a good enough GPU for it.

      • 👁️👄👁️@lemm.ee · 1↑ · 9 months ago

        Newer Pixels have hardware chips dedicated to AI, which could run these models locally. Apple is planning to do local LLMs too. There’s been a lot of development on “small LLMs”, which have a ton of benefits: they’re easier to study, they run on lower-spec hardware, and they save power.

        • httpjames@sh.itjust.works · 1↑ · 9 months ago

          Smaller LLMs come with huge performance tradeoffs, most notably in their ability to follow prompts. Bard has billions of parameters, so mobile chips wouldn’t be able to run it.

          • 👁️👄👁️@lemm.ee · 1↑ · 9 months ago

            That’s right now; small LLMs have only very recently become a focus of development. And judging by how fast LLMs have been improving, I can see that changing very soon.
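The back-and-forth above over whether billions of parameters can fit on a phone comes down to simple arithmetic: weight memory ≈ parameter count × bits per weight. A rough back-of-envelope sketch (the 3B parameter count and quantization levels are illustrative assumptions, not figures from the thread):

```python
# Approximate LLM weight storage at different quantization levels.
# Ignores activation memory and KV cache, so real usage is somewhat higher.

def weight_memory_gb(n_params: float, bits_per_weight: int) -> float:
    """Weight storage in GB: parameters * bits per weight / 8 bits per byte."""
    return n_params * bits_per_weight / 8 / 1e9

# A hypothetical 3-billion-parameter "small LLM":
fp16 = weight_memory_gb(3e9, 16)  # full half-precision weights
q4 = weight_memory_gb(3e9, 4)     # 4-bit quantized weights

print(f"fp16: {fp16:.1f} GB, 4-bit: {q4:.1f} GB")  # → fp16: 6.0 GB, 4-bit: 1.5 GB
```

This is why quantization is the usual route for on-device inference: 4-bit weights cut storage to a quarter of fp16, at some cost in output quality, bringing a small model within reach of a phone’s RAM.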

    • gelberhut@lemdro.id · 5↑ 21↓ · 9 months ago

      Yes, and that self-hosted code is written by someone else - it cannot be trusted.

      This self-hosted, self-written code is OK, but wait, the hardware was not designed by you - it cannot be trusted!

      • 👁️👄👁️@lemm.ee · 1↑ 2↓ · 9 months ago

        Did you write the driver for the keyboard you typed that on? It’s a silly and completely unrealistic take. The world relies on trust to operate; it’s not black and white.

        • gelberhut@lemdro.id · 2↑ 1↓ · 9 months ago

          This was a joke. I have a problem with neither the keyboard driver nor with cloud services as such. Either can be OK to use or not; one just needs to apply common sense.