A very NSFW website called Pornpen.ai is churning out an endless stream of graphic, AI-generated porn. We have mixed feelings.
Could be interesting. I mean, for one thing, no real person is being exploited, and those with strange fetishes can be satisfied. But it could be very disturbing as well. Wonder how long until AI video porn?
I think it's just a matter of time before the disturbing stuff starts to circulate. If people can create their darkest desires, they will.
Then cue debates and political campaigns about AI in general and whether we should allow anyone or anything to create depraved images, pornographic or not.
That’s so good, sissy. You got even better after I amputated your legs.
“We don’t intend to police the use of developing technologies at this time.”
That’s so good, sissy, blind that billionaire with your acidic piss.
“We cannot allow our children to be exposed to such grotesque videos.”
Yeah, although I think part of the missing nuance is that people already did that, the difference being that now anyone can, in theory, create what’s inside their head, regardless of their actual artistic talent. Now that creation is accessible though, everyone’s having another moral panic over what should be acceptable for people to create.
If anything, moving the more disturbing stuff from the real world to the digital one seems like an absolute win. But I suppose there will always be the argument, much like the one about video games making people violent, that digital content will become real.
People have been able to draw for eons. Imagine trying to ban the public from drawing because the odd few people are a little mixed up.
AI is just a tool, like pencils, charcoal, or paint.
I know you aren’t suggesting we ban it from the public; I’m just arguing my side of the hypothetical debate that you’re right in saying will arrive.
“eh I’ll take a look”
first thing I see is a woman on her back with her legs behind her head, smooth skin where her genitals should be and nipples in the middle of her buttocks.
“alright then”
if xkcd was right about jpeggy porn being niche, i’d bank on terrible AI porn becoming a niche in the future too.
Yeh this site has nothing on some of the insane ai creators on pixiv
i prefer the pregnant woman with huge boobs instead of a pregnant stomach (and also less huge boobs where they normally are)
woman with a dinosaur and a woman without legs for me
🥵
deleted by creator
SFW content in an NSFW website. Nice!
I’d throw a hotdog down that corridor, if you know what I mean.
lewd
Finally some good technology
I’m not sure if I have the strength to clutch these pearls any harder over these articles. I peaked 8 articles ago.
Have a sandwich and a nap then I’m sure you’ll be ready to go again
Weakling.
I’ve been clutching so hard, the pearls fused into my flesh years ago. I’ve bankrupted myself buying more pearls, inserted one by one into my clenched fist.
Luckily the mere sight of me - a lurching pearlescent beast with glinting pearls for eyes - causes clams to voluntarily offer their own in reverence, my own unending supply.
You’re a poet.
Seems like the author is clutching them hard enough for everyone.
I personally can’t wait for AI to cause a thotmarket collapse.
But then who will you treat like shit?
We’ve programmed a robot to be treated like shit.
I also gave it clinical depression!
There’s always ‘self’.
The AI eGirl, but it’ll be alright because I’ll pay CeoGPT an extra $5/month for a model that’s into that shit.
Gotcha. You’re fine with paying someone to pretend they’d willingly fuck you; you’re just not comfortable with the money for it going anywhere except into an old white billionaire’s pocket.
I’m sure there’s nothing to unpack there.
Nah the old white billionaire will be replaced by AI too.
Technically, if you’re versed enough, you can already do that, but it takes some effort.
People who insist on real flesh porn will ultimately be viewed as weirdos out of touch with reality, like people who insist everything sounds better on vinyl.
Fast forward 25 years past the first AI war, and a ragged but triumphant humanity must rediscover the lost art of waxing.
1. Have you seen Demolition Man?
And, 2. Ew. 😅
“You mean … fluid transfer?!” 🤮
Why would I want to encourage the flesh trade, where real women get hurt and are limited to what humans are physically capable of?
When I can have AI-generated people who can do anything imaginable and no one gets hurt?
There’ll be arguments that ‘once people get used to the fantasies they’ll want to try it in real life’, but we all know from 40 years of video games that that just isn’t true. There hasn’t been any uptick in people eating mushrooms and jumping on turtles, or whatever the fuck a goomba is.
At what point was porn NOT graphic, and now this thing IS GRAPHIC? Are we talking all caps, or just a small difference between the live stuff and the AI shit? Inquiring minds want to know.
deleted by creator
“Are we ready”, in the sense that for now it’s 95% garbage and 5% completely generic but passable looking stuff? Eh.
But, as this increases in quality, the answer becomes… who cares. It will suffer from the same major issues as other large models: sourcing data, and deciding the rights to the output. As for it being porn… maybe there’s no point in focusing on that specific issue.
When I first heard Stable Diffusion was going open source, I knew this would happen. The only thing I’m surprised at is that it took almost 2 years.
It went quite a bit faster than that. StableDiffusion has only been out for about 13 months and this started about three months after that with Unstable Diffusion. What this article is reporting on is already quite a few months old and quite a bit behind what you can do with a local install of StableDiffusion/Automatic1111/ControlNet/etc. (see CivitAI).
I AM!
Meh. It’s all only women, and so samey samey. Not sexy IMO, but I don’t think fake is necessarily not hot; art certainly can be.
You can change it to men, but most of the results are intersex(?) or outright women anyway. I guess the training data is heavily weighted toward examples of women.
Went and had a look, and it’s some of the funniest stuff I’ve seen all day! A few images come close to realism, but a lot of them are the sort of AI fever-dream stuff that you could not make up.
AI has no fucks to give about whether people are ready.
I am a little surprised that no one has created a site like this for child pornography.
I am not a legal expert, but my layman’s understanding of Ashcroft v. Free Speech Coalition https://en.wikipedia.org/wiki/Ashcroft_v._Free_Speech_Coalition is that as long as no real person is harmed in making it, simulated CSAM is legal.
Maybe later rulings have changed this. One can hope.
CivitAI is a pretty perverted site at the best of times. But there’s a disturbing number of age-adjustment plugins for making images of children on the same site that has plugins for depicting sex acts. It’s clear some people definitely are doing it.
Some models also skew toward children for some reason, and then you have to put mature/adult in the positive prompt and child in the negative prompt.
I think part of the problem is that there is a lot of anime in the models, and when you don’t filter that out with negative prompts it can distort the proportions of realistic images (e.g. everybody gets huge breasts unless you negative-prompt it away). In general, models are always heavily biased towards what they were trained on, and when you use a prompt or LoRA that worked well on one model with another, you can get weird results. There is always a lot of nudging involved with keywords and weights to get the images to where you want them.
I remember reading that this may already be happening to some extent, e.g. people sharing tips on creating it on the deep web, maybe through prompt engineering, fine-tuning or pretraining.
I don’t know how those models are made, but I do wonder whether the ones that need retraining/fine-tuning on real CSAM can be classified as breaking the law.
deleted by creator
If a search engine cannot index it then it is the deep web. So yes, Discord chats are technically part of the deep web.
deleted by creator
Wikipedia on the deep web
The deep web, invisible web, or hidden web are parts of the World Wide Web whose contents are not indexed by standard web search-engine programs.
Try accessing a Discord channel through your browser without being logged in. They aren’t indexed by search engines because you have to be logged in.
deleted by creator
I don’t care about some arbitrary challenge to get money from you. I’m trying to get you to think critically. If search engines like Google don’t index it then it’s part of the deep web. Just because things like Discord aren’t what people typically mean when people talk about the deep web doesn’t make Discord chats not part of the deep web.
Hentai, maybe. But realistic shit is 100% illegal; even just making such an AI would require breaking the law, as you’d have to use real CSAM to train it.
Surely we should know, right? Cartoons or hentai or whatever must have gone through this at some point?
Typically, the laws get amended so that anything that looks like CSAM is now CSAM. Expect porn generators tuned for minor characters to get outlawed very quickly.
deleted by creator
Well, to develop such a service, you need training data, i.e. lots of real child pornography in your possession.
Legality for your viewers will also differ massively around the world, so your target audience may not be very big.
And you probably need investors, who likely have less risky projects to invest in.
Well, and then there’s also the factor of some humans just not wanting to work on disgusting, legal grey area stuff.
yup, just like the ai needed lots of pictures of astronauts on horses to make pictures of those…
Exactly. Some of these engines are perfectly capable of combining differing concepts. In your example, it knows basically what a horse looks like, and what a human riding on horseback looks like. It also knows that an astronaut is basically a human in a space suit, and it can put the two together.
Saying nothing of the morality, in this case I suspect that an AI could be trained using pictures of clothed children, perhaps combined with nude images of people who are of age but are very slim or otherwise have a youthful appearance.
While I think it’s repugnant in concept, I also think that for those seeking this material, I’d much rather it be AI-generated than an actual exploited child. Realistically, though, I doubt that this would actually have any notable impact on the prevalence of CSAM, and it might even make it more accessible.
Furthermore, if the generative AI gets good enough, it could make it difficult to determine whether an image is real or AI generated. That would make it more difficult for police to find the child and offender to try to remove them from that situation. So now we need an AI to help analyze and separate the two.
Yeah… I don’t like living in 2023 and things are only getting worse. I’ve put way more thought into this than I ever wanted to.
Aren’t AI-generated images pretty easy to detect from noise analysis? I know there’s no effective detection for AI-generated text, and it’s not that there won’t be projects to train AI to generate perfectly realistic images, but it’ll be a while before it does fingers right, let alone eliminates the invisible pixel artifacts.
As a counterpoint, won’t the prevalence of AI generated CSAM collapse the organized abuse groups, since they rely on the funding from pedos? If genuine abuse material is swamped out by AI generated imagery, that would effectively collapse an entire dark web market. Not that it would end abuse, but it would at least undercut the financial motive, which is progress.
That’s pretty good for 2023.
With StableDiffusion you can intentionally leave an “invisible watermark” that machines can easily detect but humans cannot see. The idea is that in the future you don’t accidentally train on images that were already AI-generated. I’d hope most sites are doing that, but it can be turned off easily enough. Apart from that, I’m not sure.
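For anyone curious, here’s a minimal sketch of what that watermarking looks like with the invisible-watermark package the Stable Diffusion reference scripts use. The “SDV2” payload and the dwtDct method are just the defaults from that repo, the file names are made up, and a given site could use any payload or none at all:

```python
# Minimal sketch of the invisible watermark idea (pip install invisible-watermark opencv-python).
import cv2
from imwatermark import WatermarkEncoder, WatermarkDecoder

# Embed: the encoder works on BGR arrays, so load the image with OpenCV.
bgr = cv2.imread("generated.png")  # hypothetical output image
encoder = WatermarkEncoder()
encoder.set_watermark('bytes', 'SDV2'.encode('utf-8'))  # 4-byte payload, as in the SD2 scripts
bgr_marked = encoder.encode(bgr, 'dwtDct')
cv2.imwrite("generated_marked.png", bgr_marked)

# Detect: 32 bits = the 4-byte payload embedded above.
decoder = WatermarkDecoder('bytes', 32)
payload = decoder.decode(cv2.imread("generated_marked.png"), 'dwtDct')
print(payload.decode('utf-8', errors='replace'))  # prints "SDV2" if the mark survived
```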
I could have sworn I saw an article talking about how there were noise artifacts that were fairly obvious, but now I can’t turn anything up. The watermark should help things, but outside of that it looks like there’s just a training dataset of pure generative AI images (GenImage) to train another AI to detect generated images. I guess we’ll see what happens with that.
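If anyone wants to poke at the detection side, here’s a toy sketch of what training such a detector could look like. It assumes a hypothetical data/train/real and data/train/generated folder layout (not GenImage’s actual structure), and it’s just the standard torchvision fine-tuning recipe, not anything from that project:

```python
# Toy sketch: fine-tune a small classifier to tell "real" photos from "generated" ones.
import torch
import torch.nn as nn
from torch.utils.data import DataLoader
from torchvision import datasets, models, transforms

# Hypothetical layout: data/train/real/*.jpg and data/train/generated/*.png
tfm = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
])
train_ds = datasets.ImageFolder("data/train", transform=tfm)
train_dl = DataLoader(train_ds, batch_size=32, shuffle=True)

# Start from an ImageNet-pretrained backbone and swap in a 2-class head.
model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
model.fc = nn.Linear(model.fc.in_features, 2)  # real vs. generated

opt = torch.optim.Adam(model.parameters(), lr=1e-4)
loss_fn = nn.CrossEntropyLoss()

model.train()
for epoch in range(3):
    for images, labels in train_dl:
        opt.zero_grad()
        loss = loss_fn(model(images), labels)
        loss.backward()
        opt.step()
```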
Unfortunately, no. You just need training data of children in general and training data with legal porn, and these tools can combine them.
It’s already being done, which is disgusting but not surprising.
People have worried about this for a long time. I remember a subplot of a sci-fi series that got into this. (I think it was The Lost Fleet, 15 years ago).
You’d also have to convince them that it’s not real. It’ll probably end up creating laws tbh. Then there are weird things like Japan where lolis are legal, but uncensored genitals aren’t, even drawn.
I’m sure they’re out there on the deep web.
Fuck yeah!