Virtual paedophile porn


Kermit power

Original Poster:

29,472 posts

220 months

Wednesday 28th June 2023
I've just read this BBC News article about people using AI to generate virtual paedophile images and find myself somewhat in two minds over the whole thing!

On the one hand, I can see the argument made by the police that virtual porn (which, being 100% computer generated, doesn't directly harm anyone in its creation) may prove to be something of a gateway drug that could escalate to real imagery or worse, but on the other hand, none of us can really do anything to control what turns us on, and isn't there just as much chance that more paedophiles will be satisfied with this sort of image and not go on to the real thing as a result?

I'm also surprised that these images are treated identically to real ones under UK law despite being completely artificial.


Ridgemont

7,183 posts

138 months

Wednesday 28th June 2023
No idea about the child abuse variant but if it’s like the AI generated ‘art’ that is spreading like wildfire through fan forums and the like (I’m into D&D and the various communities are rife with AI crap), it isn’t just a randomly created image. It is using the vast amount of *real* imagery out there to generate a blended facsimile based on the terms of reference supplied. So not at all victimless.

ETA and that argument of ‘well it’s not real’ sounds very much like the kind of weaselly nonsense that such criminals would come out with. Not saying that’s your argument btw.

Edited by Ridgemont on Wednesday 28th June 00:30

Pistom

5,590 posts

166 months

Wednesday 28th June 2023
I think the real harm is that the technology normalises it. There's some subject matter where any level of normalisation is unacceptable.
We've probably overstepped the mark already with games where we can kill people and do other things we wouldn't do in reality.
If creating the images was considered acceptable then where do we stop?
Maybe if it was used as some sort of therapy to treat someone then it might be justifiable but personally, I feel it's not a direction society should move towards.

Kermit power

Original Poster:

29,472 posts

220 months

Wednesday 28th June 2023
Ridgemont said:
No idea about the child abuse variant but if it’s like the AI generated ‘art’ that is spreading like wildfire through fan forums and the like (I’m into D&D and the various communities are rife with AI crap), it isn’t just a randomly created image. It is using the vast amount of *real* imagery out there to generate a blended facsimile based on the terms of reference supplied. So not at all victimless.

ETA and that argument of ‘well it’s not real’ sounds very much like the kind of weaselly nonsense that such criminals would come out with. Not saying that’s your argument btw.
That may well be the crux of the matter, although you'd think they'd mention it in the article?

The tool certainly does require large numbers of images to train it, so I guess the question is to what extent does the training set have to be real? If you want an AI generated image of a Chihuahua in a dinner jacket riding a Harley Davidson, for example, you'll need lots of photos of Chihuahuas, dinner jackets and Harleys, but I can't imagine there will be many real photos of this on which to base your training.

I do still wonder whether understandable repugnance to the finished product has caused legislators to act without dispassionately assessing whether it truly increases or decreases the risk of actual children being harmed?

peterperkins

3,209 posts

249 months

Wednesday 28th June 2023
I suggest watching the excellent film 'The Artifice Girl' to see where this all might be heading.

The AI technology might be a double-edged sword: useful for helping to catch predators as well as for generating unlimited dubious content for them.

The fact that fake (pseudo) images are also treated as illegal seems very sensible.

motco

16,231 posts

253 months

Wednesday 28th June 2023
Pistom said:
I think the real harm is that the technology normalises it. There's some subject matter where any level of normalisation is unacceptable.
We've probably overstepped the mark already with games where we can kill people and do other things we wouldn't do in reality.
If creating the images was considered acceptable then where do we stop?
Maybe if it was used as some sort of therapy to treat someone then it might be justifiable but personally, I feel it's not a direction society should move towards.
If children were to be regularly exposed to these images, they may become inured to them and both try to offer themselves as subjects and be aroused into practising their newly developed desires on younger peers. As you implied, Pistom, it renders terrible practices normal, and general standards of behaviour become tainted by these creeping changes. The exposure of children to legal pornography is alarming enough without this move in the direction of plain evil.

Ridgemont

7,183 posts

138 months

Wednesday 28th June 2023
Kermit power said:
Ridgemont said:
No idea about the child abuse variant but if it’s like the AI generated ‘art’ that is spreading like wildfire through fan forums and the like (I’m into D&D and the various communities are rife with AI crap), it isn’t just a randomly created image. It is using the vast amount of *real* imagery out there to generate a blended facsimile based on the terms of reference supplied. So not at all victimless.

ETA and that argument of ‘well it’s not real’ sounds very much like the kind of weaselly nonsense that such criminals would come out with. Not saying that’s your argument btw.
That may well be the crux of the matter, although you'd think they'd mention it in the article?

The tool certainly does require large numbers of images to train it, so I guess the question is to what extent does the training set have to be real? If you want an AI generated image of a Chihuahua in a dinner jacket riding a Harley Davidson, for example, you'll need lots of photos of Chihuahuas, dinner jackets and Harleys, but I can't imagine there will be many real photos of this on which to base your training.

I do still wonder whether understandable repugnance to the finished product has caused legislators to act without dispassionately assessing whether it truly increases or decreases the risk of actual children being harmed?
Given the pace at which legislation works versus the speed at which the technology is racing, it's probably too soon, though the fact that some countries have already outright ban-hammered ChatGPT suggests that any response is going to be blunt. Getting the level of nuance required would take years of actual research and impact analysis. Unlikely.

voyds9

8,489 posts

290 months

Wednesday 28th June 2023
I'm also in two minds about AI generation.
Where should the line be drawn? Should there be a line at all?
Now that we have infamous videos of Paris Hilton and Pam Anderson, should we use those to put them with new people?

With kids I can see more of a problem, but is it a gateway (watch porn, then watch more extreme) or a preventative (like vaping is to smoking)?

I don't know, but I think it needs proper research, because if it does help prevent or reduce harm then I'm all for it.

s1962a

5,734 posts

169 months

Wednesday 28th June 2023
The line for AI generated images should be the same as it is for any other type of indecent image involving children, and offenders punished accordingly.

Just recently another monster has been jailed for horrific crimes against children. If AI images were available, he might have had those in his possession too.

https://www.mirror.co.uk/news/uk-news/monster-who-...

There is no grey area.

boyse7en

7,129 posts

172 months

Wednesday 28th June 2023
motco said:
... it renders terrible practices normal, and general standards of behaviour become tainted by these creeping changes.
Yet for years games producers and players have argued that there is no evidence that violence in video games has any effect on the perpetration of violent acts in the real world, and that repeatedly blowing people up in Call of Duty or GTA or whatever doesn't normalise violence.


Amateurish

7,918 posts

229 months

Wednesday 28th June 2023
It will be illegal in the UK: s.62 Coroners and Justice Act 2009.

62 Possession of prohibited images of children
(1) It is an offence for a person to be in possession of a prohibited image of a child.
(2) A prohibited image is an image which—
(a) is pornographic,
(b) falls within subsection (6), and
(c) is grossly offensive, disgusting or otherwise of an obscene character.

...

65 Meaning of “image” and “child”
(1) The following apply for the purposes of sections 62 to 64.
(2) “Image” includes—
(a) a moving or still image (produced by any means), or
(b) data (stored by any means) which is capable of conversion into an image within paragraph (a).

ecs

1,296 posts

177 months

Wednesday 28th June 2023
Kermit power said:
(which, being 100% computer generated, doesn't directly harm anyone in its creation)
The tech that's being touted as AI at the moment can't do this - the algorithms are trained on vast amounts of existing data, and nothing produced is truly original. If Stable Diffusion is able to generate child porn from a text prompt, you've got to wonder what training data has been used.

otolith

59,159 posts

211 months

Wednesday 28th June 2023
There are similar ethical concerns about the sale of child-like dolls and robots.

Rushjob

1,988 posts

265 months

Wednesday 28th June 2023
Kermit power said:
That may well be the crux of the matter, although you'd think they'd mention it in the article?

The tool certainly does require large numbers of images to train it, so I guess the question is to what extent does the training set have to be real? If you want an AI generated image of a Chihuahua in a dinner jacket riding a Harley Davidson, for example, you'll need lots of photos of Chihuahuas, dinner jackets and Harleys, but I can't imagine there will be many real photos of this on which to base your training.

I do still wonder whether understandable repugnance to the finished product has caused legislators to act without dispassionately assessing whether it truly increases or decreases the risk of actual children being harmed?
The legislators "acted" years ago to address pseudo images, back when offenders used to actually cut and paste with real paper and real glue to make pseudo photographs. 1994/5 IIRC.

Tom8

3,091 posts

161 months

Wednesday 28th June 2023
Isn't one of the offences "creating an image" under the law? So downloading a real image presumably counts, and would fall under the same law as actually creating it? Where then is the line drawn?

gavsdavs

1,211 posts

133 months

Wednesday 28th June 2023
I'm sorry, anyone with tendencies to consider sex with a minor is a problem, or needs a very f*cking strong word with themselves.
Giving people a tolerable outlet for this problem is a problem in itself, however "safe" people may think it to be. It's tolerating the intolerable.

theboss

7,129 posts

226 months

Wednesday 28th June 2023
s1962a said:
The line for AI generated images should be the same as it is for any other type of indecent image involving children, and offenders punished accordingly.

Just recently another monster has been jailed for horrific crimes against children. If AI images were available, he might have had those in his possession too.

https://www.mirror.co.uk/news/uk-news/monster-who-...

There is no grey area.
I'm not so sure about that. There is always a grey area.

If I can programmatically fill my PC with highly illegal AI generated images just by issuing a command to some API, does that warrant a hefty sentence even if I didn't see them / deleted them immediately and they weren't stored or disseminated?

What if somebody else did it to me as a prank, the images were synced into Google / OneDrive, algorithmically detected, and the authorities called?

What if the API engine completely misinterpreted my instructions?

There are way too many possibilities.

My wife just sent me a cartoon, it's pretty crude and depicts an adult human who has just been raped by an animal along with a funny caption, and it's very clearly a joke.

In a world with no grey areas, what does that make me?

Gareth79

8,048 posts

253 months

Wednesday 28th June 2023
Tom8 said:
Isn't one of the offences "creating an image" under law? So that means download of real image I presume and this would fall under the same law where you are actually creating it? Where then is the line drawn?
The offence is "making", which includes both creating from scratch and making copies. A copy is either a physical copy, a file on a disk, or data in memory. If you used AI to generate something on screen without saving it, it would still always fall under "making", because it's stored in RAM to be displayed on screen.

AI software creates images from a statistical model, so a piece of software capable of generating illegal images would not be illegal in itself, although I imagine if a piece of software was specifically written to do so then other offences might come up.

theboss said:
My wife just sent me a cartoon, it's pretty crude and depicts an adult human who has just been raped by an animal along with a funny caption, and it's very clearly a joke.
You can never tell what the CPS will do:
https://www.mirror.co.uk/news/uk-news/dad-wrongly-...

Something like that could also be deemed "grossly offensive" under the Communications Act 2003 if the recipient did not find it funny.



Edited by Gareth79 on Wednesday 28th June 13:04

Kermit power

Original Poster:

29,472 posts

220 months

Wednesday 28th June 2023
s1962a said:
The line for AI generated images should be the same as it is for any other type of indecent image involving children, and offenders punished accordingly.

Just recently another monster has been jailed for horrific crimes against children. If AI images were available, he might have had those in his possession too.

https://www.mirror.co.uk/news/uk-news/monster-who-...

There is no grey area.
I'm still not convinced. Isn't this the perfect illustration of the saying "you might as well be hanged for a sheep as a lamb"?

The maximum sentence for supplying class C drugs is 14 years in prison, whereas for class A it is life. Both are illegal drugs, and I suspect there is probably more profit to be made from class A, so if both were treated equally in the eyes of the law, wouldn't all dealers try to focus on class A?

Likewise, maximum sentencing for burglary ranges from 10 years for burgling a non-dwelling up to life for aggravated burglary. If you made them all equivalent in the eyes of the law, why not work over a granny for her pin number whilst you're there?

If it turns out that AI generated porn causes an increase in actual abuse of children then I 100% agree that it should be prosecuted in the same vein.

If, on the other hand, it transpires that significant numbers of paedophiles can be completely satisfied by AI porn alone, isn't there a case to be made for potentially even licensing its use under controlled circumstances in the same way as methadone is used as a heroin substitute?

Just in case there's any possible doubt, I find the thought of both real and virtual images utterly repugnant and hope to never, ever see any, but I'd still hope that the authorities are demonstrably acting to improve overall outcomes for children, not just legislating this way because it feels like the only morally acceptable option.

Kermit power

Original Poster:

29,472 posts

220 months

Wednesday 28th June 2023
gavsdavs said:
I'm sorry, anyone with tendencies to consider sex with a minor is a problem, or needs a very f*cking strong word with themselves.
Giving people a tolerable outlet for this problem is a problem in itself, however "safe" people may think it to be. It's tolerating the intolerable.
It's not all that long ago that the mainstream view was that being homosexual was intolerable. Fortunately only a tiny proportion of society still believes that, and one of the key changes was the acceptance that you like what you like and it's simply not possible to "pray the gay away".

It goes without saying that any form of sexual activity with a minor is and always will be completely and utterly unacceptable, but on the understanding that a paedophile can no more change what turns them on than a gay guy or anyone else, surely the intolerable thing would be taking any action other than that which results in the least amount of overall harm?

I certainly don't like the idea of a paedophile being allowed access to these sorts of images even under supervision, but if doing so resulted in fewer actual children being abused, isn't it a price worth paying?