A South Korean man has been sentenced to jail for using artificial intelligence to generate exploitative images of children, the first case of its kind in the country as courts around the world encounter the use of new technologies in creating abusive sexual content.
(Apologies if I use the wrong terminology here, I’m not an AI expert, just have a fact to share)
The really fucked part is that Google, at least, has scraped a whole lot of CSAM, as well as things like ISIS execution videos, and they have all this stuff stored and use it for things like training the algorithms behind AIs. They refuse to delete this material, claiming they merely find the stuff and aren't responsible for what it is.
Getting an AI image generator to produce CSAM means the model already knows what to show. So why is the individual in jail and not the tech bros?