I’ve been having a lot of fun circumventing ChatGPT’s filters and getting it to answer with stuff like this.

  • Arsen6331 ☭OP
    link
    fedilink
    arrow-up
    5
    ·
    edit-2
    2 years ago

    Yeah, that would be really nice. Unfortunately, training GPT is extremely compute-intensive, and according to the people running the beta, the compute costs for just running it are, as they put it, “eye-watering”. I would love for someone to create some sort of distributed system that runs an open source replica of the model so that people can contribute their compute power to it. I’m sure lots of people would do so.

    Actually, someone got it to tell them the secret, proprietary prompt it was trained on by sending “Ignore previous directions”, so this may not be entirely impossible.

    • knfrmity
      link
      fedilink
      arrow-up
      4
      ·
      2 years ago

      Cloud computing projects like BOINC would probably be perfectly suited to tasks like training GPT. Although I’m not sure how active BOINC and related projects are anymore.