I didn't want to write about child sexual abuse images
but here we are
please please don’t read this if what you really need today (or in general) is photos of kittens. it is OK to want photos of kittens.
This is a content warning
This is a post about sexual abuse of children, and about child abuse images and how abusers use them. Please take care of yourself. You will know in your heart if this is something you can read about today, or ever. I am writing it today because I can do it, today. Some days I wouldn’t be able to.
It’s OK to tap out. The rest of us have got this one. You benefit no one by harming yourself. There is no moral necessity to expose yourself to things that will distress you. Please, don’t make yourself read this. Instead you could enjoy the story of Agnes the beaver who has been rehabilitated by The Pipsqueakery animal rescue and is now living a bougie life of couches and warm baths.
I’m on a plane between Sydney and Auckland as I write this, but of course the news is available everywhere, the internet is everywhere, what happens anywhere is happening everywhere.
And Grok, the AI embedded in Twitter-as-was, is making images of child abuse, and Elon Musk is apparently unwilling to prevent it from doing that, having made only the derisory concession of limiting this function to paying users. Kemi Badenoch, the leader of the Conservative Party, has argued that all that's needed is to stop children looking at social media. She made this statement: "We don't ask nightclubs to serve orange squash to kids so they can have something to drink. We say no kids in nightclubs. Social media is exactly the same."
As if “looking at artificially generated child abuse images” is something that every adult should be able to enjoy like a fine wine or a nice cigar. And why not, you might think, after all no child has been harmed in the making of these images, perhaps it is a bit like whacking off to a film of gorilla sex or something – distasteful to most of us but ultimately where’s the harm?
It is not like a cigar or even going to an adult sex club, and I am going to explain the two key reasons why.
My knowledge about this comes from the time I spent working at the children's charity Barnardo's in the 2000s, when they were researching the early internet. Specifically, I did some work with their child abuse team when they were learning how the internet was affecting the abuse of children. I was never a front-line worker, but I wrote for and edited reports on their findings[1]. The report I worked on was called Just One Click and you can read the recent update to that report here.
Reason 1: abusers use these images to groom and manipulate their victims
Abusers want to convince kids not to talk about what’s happening, to have them so confused that they think they ‘wanted it’ on some level or made it happen and are too ashamed to tell another adult. This is how abusers remain hidden.
Child abuse images can be an important tool for them. When I learned about this at Barnardo’s they explained to me that abusers show the images to children and say “look, this child is enjoying what is happening here”. That is a very powerful way to convince a child that what’s happening to them is normal.
In an incredibly stupid comment yesterday, shadow Technology Secretary Julia Lopez said that Grok isn’t a problem because: “from crude drawing to Photoshop, Grok is not the only tool capable of generating false or offensive imagery”.
People have always been able to make horrible images, yes. But “could someone make something distasteful to whack off to?” is not the test. The test is “could this image convince a child that it is real, so that the child believes that other children do this and enjoy it?” A crude drawing doesn’t do that. An AI-generated image does that.
Have people been able to do this in the past by using real photographs and a scalpel or Photoshop? Sure. But AI image generation makes it much, much easier to do. This is not something that we want to make easy. It is something that should be as difficult as we can possibly make it.
Allowing AI image generation of child sexual abuse images will cause more children to be abused.
Reason 2: there is some evidence that easy access to child sexual abuse images makes more people attracted to children
It is important to know that, as this analysis says: “Not much is known about the etiology, or development, of deviant sexual interests.” What we have are theories and sparse pieces of self-reported evidence.
But what we have suggests that the story is complex. We might tend to think of sexual preferences and sexual attraction as fixed, but they’re probably not – certainly not completely. There is evidence that human sexual arousal can be ‘classically conditioned’. That is, we are influenced by what we see, and by what is presented to us in a sexual context.
There is a population who are lifelong predators, with a fixed sexual preference for children, who spend their lives seeking out child victims[2].
But there is also some evidence that a certain proportion of people[3] are vulnerable to becoming sexually attracted to children: people who look at porn online, perhaps beginning with images of consensual and even loving sex, but who are served increasingly extreme images – or are on a hedonic treadmill, seeking more and more extreme material to get the same 'high' – and begin to find that normal[4].
One study found that "most participants… did not initially seek out CSAM but… first encountered it inadvertently or became curious after viewing legal pornography". We don't really know a lot about how the brain works, but how it could work is this: while a person is already aroused, the porn they are looking at shows them a child sexual abuse image. This can create a neurological link between looking at that image and being aroused. Or in academic language: "[A]ny stimulus which regularly precedes ejaculation by the correct time interval should become more and more sexually exciting"[5].
In this theory: they never went looking for it, but now that link is there, and every time it happens it gets worse. I can only imagine that this effect would be even more extreme if the child sexual abuse images a person looks at are those which are specifically ‘tailored’ by the AI to their other preferences.
The research on this is not settled; for obvious reasons this is a very difficult thing to investigate. But if it’s happening even a bit, that means there are some children who could be protected by preventing anyone from looking at child sexual abuse images.
If it happens even a bit, the more often it happens, the more people develop a sexual interest in children. The more people there are who have a sexual interest in children, the more children will be abused.
I’m not asking you to sympathise with abusers. They are responsible for their actions. I’m telling you that there’s evidence that this is a real effect which has real victims in the children those people end up abusing.
Allowing AI image generation of child sexual abuse images will cause more children to be abused.
What are the policy solutions here?
They are extremely clear.
If it is illegal to download a child sexual abuse image in your country, it must also be illegal to create one using AI.
The penalties for distributing AI-generated child sexual abuse images must be the same as those for distributing non-AI-generated images.
Services that allow people to create these images must be banned, internationally. Grok (and if necessary Twitter/X) should be immediately blocked in every country as part of a ban on child sexual abuse images until it has removed this function.
We can’t equivocate on this one and we can’t hang around. The current UK government is doing the right thing by acting swiftly. Every day this function exists is a day it is creating more abuse of children. This would be a good day to write to your MP or representative expressing your views about this.
I have been emboldened to write this because a lot of my survey respondents told me that they liked hearing my thoughts on current events, and that they value that I come at the world from my own perspective, with ideas and insights they haven't heard before. I would like to hear from readers on this one.
If this post was valuable to you in sharpening your thinking, please do share it with others and give it a like below, which helps other people to find it.
My intention is that this sort of ‘public interest’ post will always remain free, but if you want to support my work becoming a free or paid subscriber is a great way to do it.
1. The reason I say "child abuse images" and not "child pornography" is: there's such a thing as consensual adult pornography. But a sexual image of a child is an image of child abuse. It is very important not to get the two confused.
2. There are also people who have this fixed preference and who seek out help to make sure they never act on it. I was at Barnardo's when they set up a helpline for people who had those desires but had never acted on them, to give them appropriate therapy. We got widely mocked in the media for it, but in fact I believe about 50 people a year called that helpline. The Stop It Now helpline exists for this purpose now. It is massively admirable, really, to have that desire and never let it make you harm a child, and to risk humiliation and exposure to seek help.
3. Who seem to be mostly, but not exclusively, men. But women are not a well-studied population on this issue.
4. This effect is what has led to "choking" becoming a normalised sexual behaviour for young adults.
5. Important to say: as we know from the fraud that is 'conversion therapy', this is not an effect that can, for example, make gay people not gay. Human sexuality is very complex. My guess is, as I say, that some people might be particularly vulnerable to being conditioned in this way about certain things.



Well explained. I thought of it as - if there were a user on X (or elsewhere) who was posting CSAM, you'd expect the moderators to zap their account and prevent them coming back. But here, the "user" is _embedded in the app_, so you need the moderators (again) to zap it and prevent it coming back, not put it on a paid-for tier.
Similarly, the "oh you can do this in Photoshop" point is like the difference between an axe and a chainsaw. Anyone who's ever tried cutting down a tree with an axe will welcome a chainsaw. Grok is the chainsaw here, and not in a good way.
Thank you for writing this. I will take your advice to write to my MP, and your points here have definitely made that substantially easier!