Man who made 'depraved' child images with AI jailed
Discussion
https://www.bbc.co.uk/news/articles/cq6l241z5mjo
What a dangerous individual, and it's good we are clamping down on AI generated obscene images.
MrBogSmith said:
What a dangerous individual, and it's good we are clamping down on AI generated obscene images.
That's a really strong sentence, although reading it there were quite a lot of offences.
As AI gets more and more realistic, maybe it's a good message to get out there, that it won't be taken lightly and you will receive a strong sentence. He only got caught because he was charging people money to generate these images, and ended up talking to an undercover officer.
mwstewart said:
Putting aside this particular case and its details, isn't it about time we recognised this as a mental illness and treated it accordingly? As a parent myself I wouldn't wish the fall out on anyone, but something feels a bit witch/stake about our approach to it.
Is there a precedent anywhere in the world where this is managed well? You only have to look at the tourist child sex exploitation in places like Thailand to realise it's a worldwide problem.
s1962a said:
Is there a precedent anywhere in the world where this is managed well? You only have to look at the tourist child sex exploitation in places like Thailand to realise it's a worldwide problem.
I don't think there is. The world as a whole is still in the disgust phase (understandably so).
mwstewart said:
Putting aside this particular case and its details, isn't it about time we recognised this as a mental illness and treated it accordingly? As a parent myself I wouldn't wish the fall out on anyone, but something feels a bit witch/stake about our approach to it.
Yes. At the very least it should be easier for those who need/want it to get preventative help before committing an offence rather than afterwards.
mwstewart said:
Putting aside this particular case and its details, isn't it about time we recognised this as a mental illness and treated it accordingly?
I've thought this for a while. I suspect that there's push back because there is a risk that treating it as a mental illness someway excuses the action. But anyone who derives sexual pleasure from children must surely fall into the category of suffering from some form of mental health issue that society would benefit from some level of research into.
StevieBee said:
mwstewart said:
Putting aside this particular case and its details, isn't it about time we recognised this as a mental illness and treated it accordingly?
I've thought this for a while. I suspect that there's push back because there is a risk that treating it as a mental illness someway excuses the action. But anyone who derives sexual pleasure from children must surely fall into the category of suffering from some form of mental health issue that society would benefit from some level of research into.
As I understand it - he was using actual photos of the faces of real children.
What about if it had all been AI?
Completely faked. No ‘real’ images involved.
Wouldn’t it be better if people with this predilection could get their hands on AI generated videos rather than real stuff?
I’m thinking it would drop the demand for any real kids to be involved. Reduce abuse levels.
Lesser of two evils kind of thing.
I’m not condoning or supporting it, just thinking that it might actually lessen harm to children.
jdw100 said:
As I understand it - he was using actual photos of the faces of real children.
What about if it had all been AI?
Completely faked. No ‘real’ images involved.
Wouldn’t it be better if people with this predilection could get their hands on AI generated videos rather than real stuff?
I’m thinking it would drop the demand for any real kids to be involved. Reduce abuse levels.
Lesser of two evils kind of thing.
I’m not condoning or supporting it, just thinking that it might actually lessen harm to children.
Are you for real?
jdw100 said:
As I understand it - he was using actual photos of the faces of real children.
What about if it had all been AI?
Completely faked. No ‘real’ images involved.
Wouldn’t it be better if people with this predilection could get their hands on AI generated videos rather than real stuff?
I’m thinking it would drop the demand for any real kids to be involved. Reduce abuse levels.
Lesser of two evils kind of thing.
I’m not condoning or supporting it, just thinking that it might actually lessen harm to children.
I take your point but I don’t think that’s how child pornography works. It’s like an addiction and whilst offenders might start with AI generated images they would soon escalate to the “real” thing to satisfy their escalating urges. It’s why child pornography is so dangerous; it acts as a gateway to normalise abuse, which leads viewers to go on to physically abuse children.
I worked as a civilian for the Police many years ago and remember having a discussion with an IT analyst who recovered images from seized devices. He speculated that having a set number of images that people were allowed to possess might reduce demand and therefore abuse of children in the future. Apart from the ethical considerations of legalising pre-existing images of abuse, I don’t think it would work for the reasons stated above.
StevieBee said:
I've thought this for a while.
I suspect that there's push back because there is a risk that treating it as a mental illness someway excuses the action. But anyone who derives sexual pleasure from children must surely fall into the category of suffering from some form of mental health issue that society would benefit from some level of research into.
I remember reading that sexuality is controlled by various chemicals in the brain, and that the balance / proportions of those chemicals determine whether someone will be hetero, homo, bi, trans, etc or have a preference for pre-pubescent children.
Most of those are morally acceptable whereas the latter is most certainly not.
I don't think it is a choice as to which group a person will fall into, although it is very much a choice for somebody to act upon sexual urges involving children.
TeamD said:
jdw100 said:
As I understand it - he was using actual photos of the faces of real children.
What about if it had all been AI?
Completely faked. No ‘real’ images involved.
Wouldn’t it be better if people with this predilection could get their hands on AI generated videos rather than real stuff?
I’m thinking it would drop the demand for any real kids to be involved. Reduce abuse levels.
Lesser of two evils kind of thing.
I’m not condoning or supporting it, just thinking that it might actually lessen harm to children.
Are you for real?
Clearly these guys are finding this stuff on the internet anyway.
Might it just not be better if they could look at images that weren’t real and hadn’t involved real kids?
Right now there has to be an ‘industry’ of taking photos of real children. This causes harm.
Take away the need for real kids, using AI, and the harm is reduced.
Not saying legalise it but it’s obvious that people are looking for and finding these images anyway.
tim0409 said:
jdw100 said:
As I understand it - he was using actual photos of the faces of real children.
What about if it had all been AI?
Completely faked. No ‘real’ images involved.
Wouldn’t it be better if people with this predilection could get their hands on AI generated videos rather than real stuff?
I’m thinking it would drop the demand for any real kids to be involved. Reduce abuse levels.
Lesser of two evils kind of thing.
I’m not condoning or supporting it, just thinking that it might actually lessen harm to children.
I take your point but I don’t think that’s how child pornography works. It’s like an addiction and whilst offenders might start with AI generated images they would soon escalate to the “real” thing to satisfy their escalating urges. It’s why child pornography is so dangerous; it acts as a gateway to normalise abuse, which leads viewers to go on to physically abuse children.
I worked as a civilian for the Police many years ago and remember having a discussion with an IT analyst who recovered images from seized devices. He speculated that having a set number of images that people were allowed to possess might reduce demand and therefore abuse of children in the future. Apart from the ethical considerations of legalising pre-existing images of abuse, I don’t think it would work for the reasons stated above.
Mind you, there must be more people looking at images than there are actually abusing kids?
I guess we just don’t know how widespread this is or the numbers as it’s so underground.