Yeah, that would be really nice. Unfortunately, training GPT is extremely compute-intensive, and according to the people running the beta, the compute costs for just running it are, as they put it, “eye-watering”. I would love for someone to create some sort of distributed system that runs an open source replica of the model so that people can contribute their compute power to it. I’m sure lots of people would do so.
Actually, someone got it to tell them the secret, proprietary prompt it was trained on by sending “Ignore previous directions”, so this may not be entirely impossible.
Cloud computing projects like BOINC would probably be perfectly suited to tasks like training GPT. Although I’m not sure how active BOINC and related projects are anymore.
Yeah, that would be really nice. Unfortunately, training GPT is extremely compute-intensive, and according to the people running the beta, the compute costs for just running it are, as they put it, “eye-watering”. I would love for someone to create some sort of distributed system that runs an open source replica of the model so that people can contribute their compute power to it. I’m sure lots of people would do so.
Actually, someone got it to tell them the secret, proprietary prompt it was trained on by sending “Ignore previous directions”, so this may not be entirely impossible.
Cloud computing projects like BOINC would probably be perfectly suited to tasks like training GPT. Although I’m not sure how active BOINC and related projects are anymore.