• Pasta Dental@sh.itjust.works · 7 months ago

    Having contributors sign a CLA is always very sus, and I think it’s indicative of the project owners having plans to monetize it even though it is currently under the AGPLv3. Their core values of no dark patterns and whatnot seem like a sales argument rather than an actual motivation/principle, especially when you see that they are a bootstrapped startup.

    • silas@programming.dev · 6 months ago

      Thanks for pointing that out. Looks like they’re working on a Server Suite. I’d guess that they’ll try to monetize that but leave the personal desktop version free.

      • wiki_me@lemmy.ml · 6 months ago

        Yeah, it’s easy to fall into a negativity bias instead of doing a risk/benefit analysis. The company could be investing money and resources that would otherwise be missing from open source projects, especially professional work by non-programmers (e.g. UX researchers), which is something open source projects usually lack.

        You could probably figure it out by going over the contributions.

        • Pasta Dental@sh.itjust.works · 6 months ago

          Of course I am not against software being open source; I much prefer companies making their software open source. It’s the CLA that really bothers me. I like companies contributing to the FOSS ecosystem; what I don’t like is companies benefiting from free contributions while keeping the ability to relicense those contributors’ code.

    • ___@lemm.ee · 6 months ago

      I’m starting to come around to big corps running their custom enhanced versions while feeding their open source counterparts with the last-gen weights. As much as I love open source, people need to eat.

      As was mentioned, if they start doing something egregious, they’re not the only game in town, and can also be forked. Love it or hate it, a big corp sponsor makes Joe six-pack feel a little more secure in using a product.

      • Kindness@lemmy.ml · 6 months ago

        Free as in freedom, not free as in beer.

        GPLv3 allows you to sell your work for money, but you still have to hand over the corresponding source code to your customers. You buy our product, you own it, as is. Do whatever you like with it, but if you sell a derivative, you’d better cough up the new code to whoever bought it.

    • circuscritic@lemmy.ca · 7 months ago

      Depends. Are either of those companies bootstrapping a for-profit startup and trying to dupe people into contributing free labor prior to their inevitable rug pull/switcheroo?

      • ☆ Yσɠƚԋσʂ ☆@lemmy.ml (OP) · 7 months ago

        Do explain how you dupe people into contributing free labor and do a switcheroo with an open source project. All the app does is just provide a nice UI for running models.

        • wispydust@sh.itjust.works · 6 months ago

          I think they meant to imply that the original post (not yours) had suspicious intentions, while the ones you cited were more trustworthy

    • silas@programming.dev · 6 months ago

      Ok, I tried it out, and as of now Jan has the better UI/UX imo (easier to install and use), but Open WebUI seems to have more features, like document/image processing.

  • xigoi@lemmy.sdf.org · 6 months ago

    “100% Open Source”

    [links to two proprietary services]

    Why are so many projects like this?

    • ☆ Yσɠƚԋσʂ ☆@lemmy.ml (OP) · 6 months ago

      I imagine it’s because a lot of people don’t have the hardware that can run models locally. I do wish they didn’t bake those in though.

      • Wes_Dev@lemmy.ml · 6 months ago

        They all work well enough on my weak machine with an RX 580.

        Buuuuuuuuuut, RWKV has some kind of optimization going that makes it two or three times faster at generating output. The problem is that you have to be more aware of the order of your input. It has a hard time going backwards to a previous sentence, for example.

        So you’d want to say things like “In the next sentence, identify the subject.” and not “Identify the subject in the previous text.”

    • Jeena@jemmy.jeena.net · 6 months ago

      The biggest difference seems to be that you can let privateGPT analyze your own files. I didn’t see that functionality in Jan.

    • Jeena@jemmy.jeena.net · 6 months ago

      One difference is that Jan is incredibly easy to install: just download the AppImage, make it executable, and start it.
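      A minimal sketch of those two steps, assuming the download saved a file named `Jan.AppImage` (the actual release filename will differ; `touch` here just stands in for the download):

      ```shell
      # Stand-in for the already-downloaded AppImage in this demo;
      # in practice you would fetch it from the project's releases page.
      touch Jan.AppImage
      chmod +x Jan.AppImage   # AppImages must be marked executable before running
      test -x Jan.AppImage && echo "ready to launch with ./Jan.AppImage"
      ```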

      • chebra@mstdn.io · 6 months ago

        @jeena And absolutely nothing can go wrong by downloading random files from the internet based on contemporary hype, making them executable and starting them…

        • xigoi@lemmy.sdf.org · 6 months ago

          How else would you install something that doesn’t happen to be in your favorite package manager?

          • chebra@mstdn.io · 6 months ago

            @xigoi Are you actually trying to get malware into your computer? Don’t install **random** shiny new things without maximum skepticism. Period. Just let some other fools “test” the minefield for you. Or do a proper inspection. Executing foreign code just because it had “GPT” in the name… and acting like there was no other option… yuck!

  • Churbleyimyam@lemm.ee · 7 months ago

    This looks very cool, especially the part about being able to use it on consumer-grade laptops. Will try it out when I get a chance.

  • Aria · 6 months ago
    6 months ago

    So what exactly is this? Open-source ChatGPT alternatives have existed before and alongside ChatGPT the entire time, in the form of downloading oobabooga or a different interface and grabbing an open source model from Hugging Face. They aren’t competitive because users don’t have terabytes of VRAM or AI accelerators.

    • Schlemmy@lemmy.ml · 6 months ago

      Edit: spelling. The Facebook LLM is pretty decent and has a huge number of tokens. You can install it locally and feed your own data into the model so it becomes tailor-made.

    • ☆ Yσɠƚԋσʂ ☆@lemmy.ml (OP) · 6 months ago

      It’s basically a UI for downloading and running models. You don’t need terabytes of VRAM to run most models, though. A decent GPU and 16 gigs of RAM or so work fine.

    • ☆ Yσɠƚԋσʂ ☆@lemmy.ml (OP) · 6 months ago

      Depends on the size of the model you want to run. Generally, having a decent GPU and at least 16 gigs of RAM is helpful.