
Could AI-Generated Images Be Harmful or Indecent?


The law in England and Wales contains various “indecent image” offences under legislation such as the Protection of Children Act 1978, the Criminal Justice Act 1988 and the Sexual Offences Act 2003. The strictest penalties are imposed when offences involve images of children, as described in section 1 of the Protection of Children Act 1978, which makes it an offence to take, make, distribute, show, possess with intent to distribute, or advertise an indecent photograph or computer-generated pseudo-photograph of a child.

Indecent images involving adults may fall under the Obscene Publications Act 1959 if the material is considered likely to “deprave and corrupt” those who see it. That said, none of these pieces of legislation contains an exhaustive definition of the term “indecent image”. Whether an image is harmful or indecent is determined by a jury or by magistrates when a case proceeds to court. The courts assess indecency based on the content and context of the image, applying the standard of whether a reasonable person would consider the image harmful.

This means that whether an image is deemed indecent or harmful is open to debate, and this is one area in which Tyler Hoffman’s experienced indecent images solicitors can mount a defence on behalf of someone who has been falsely accused. However, it is not a defence to show that a photograph is not real or was created by AI: in most cases, the possession, creation or distribution of these images is a criminal offence.

What does the law say about AI images?

AI image generation is a relatively new technology and the law has struggled to keep up with the pace of change, but that does not mean that there are no provisions in law to protect against the creation of indecent images using these tools.

The Coroners and Justice Act 2009 introduced additional offences relating to prohibited images of children, specifying that pseudo-photographs, as well as real photographs, are covered under the law. This category can cover a range of non-photographic pornographic images, including cartoons or computer-generated images, that the law deems grossly offensive, disgusting or otherwise of an obscene character.

The Crown Prosecution Service’s guidance on indecent and prohibited images of children explicitly states that: “Some high-quality computer-generated indecent images/AI-generated images can pass as photographs and it is possible to prosecute on the basis of [this] quality computer-generated images as pseudo-photographs. Technology exists to alter photographs to appear as though they are AI-generated images.”

The process of creating the image through an AI tool is less of a concern than the image itself, and the image does not need to be photorealistic to fall foul of the law, which applies equally to photographs, pseudo-photographs and other media. Under existing legislation, creating, possessing or distributing such material (or material that resembles it) may result in a charge or conviction.

Whether the image was made using AI or in another way is immaterial if it meets the test of being grossly offensive. It is also important to understand that you do not need to be the person who created the image, or who prompted an image-generation tool, to be guilty of an offence: “making” an indecent image includes downloading or saving such an image, even if you did not create it originally.

What makes an image “indecent”?

While it is clear that the law applies to AI and other computer-generated images, it may not be entirely clear what makes such an image “indecent”, and there is no statutory checklist that can be applied to a piece of media to make this judgement. Images that depict children in sexual situations are the most common type of indecent image, and there is rarely ambiguity about what such an image depicts.

For offences related to indecent images of children, a “child” is defined as a person under 18 years old. For an AI image, the primary element in determining whether the image justified prosecution would be how old the person in the image appears to be. Nudity alone does not automatically make an image indecent. For example, ordinary family photographs of children in non-sexual contexts are not criminal. However, it may be more difficult to defend the use of an AI image-generation tool to create a pseudo-photograph in the same context, as there is a clear intention behind the production of the image.

The key issue is whether the image, viewed objectively, would be considered indecent by contemporary standards, and whether it was created or shared for the purpose of sexual arousal. It is not a defence that the child consented to the image being taken, as consent is legally irrelevant in this context. AI images that meet this description are therefore likely to fall foul of these laws.

Can you defend indecent images offences?

Indecent images of children are categorised for sentencing purposes into three groups, of which Category A is the most serious. These categories do not define indecency, but they can affect sentencing if a conviction follows. However, there may be opportunities to mount a defence against allegations of making, possessing or distributing indecent images, and some statutory defences are available in certain cases. For example, for possession offences under section 160 of the Criminal Justice Act 1988, a person has a defence if they did not know, and had no cause to suspect, that they were in possession of an indecent image of a child. This commonly arises where images were automatically cached or received without the defendant’s awareness. The issue is factual and will turn on the evidence of knowledge and control.

A defence may also arise if the defendant had not seen the image and did not know or have cause to suspect that it was indecent. The court will consider surrounding circumstances, including how the image was obtained and stored, but a solicitor can put forward a strong argument in your favour.

In some cases, the defence will focus on whether the prosecution can prove the defendant “made” the image (for example, by downloading it). There may be doubt about whether the defendant had custody or control of the device, or whether the digital attribution evidence supplied by the prosecution is reliable. These are not statutory defences but may prevent a conviction if the prosecution cannot prove the elements of the offence.
