The deluge of images of partially clad women – undressed by the Grok AI tool – on Elon Musk’s X has raised fresh questions about the regulation of the technology. Is it legal to produce these images without the subject’s consent? Should they be removed from X?
Even in the UK, the answers to these questions remain unclear. Regulating social media is a nascent area, let alone controlling the deployment of artificial intelligence. There are laws that address the problem, such as the Online Safety Act, but the government has not yet introduced additional measures such as a ban on nudifying apps.
Is it illegal to post images of partially clothed people without their permission?
Sharing intimate images of a person without their consent is a criminal offence under the Sexual Offences Act in England and Wales, and this includes images created by AI. The law sets out what constitutes an intimate image, including engaging in a “sexual act”, doing anything “that a reasonable person would consider sexual”, and showing a person’s exposed genitals, buttocks or breasts.
This also covers wet or see-through underwear, or clothing that exposes these body parts. However, according to Clare McGlynn, professor of law at Durham University and an expert on pornography regulation, bikini images alone “would not be strictly covered” by the law.
Sending messages containing false information with the aim of causing “significant psychological or physical harm” to the recipient is also an offence under the Online Safety Act.
Changes to the law have had an impact. Brandon Tyler, from Braintree, Essex, was imprisoned for five years last year for posting deepfake pornography of women he knew on an online forum.
What about tech companies?
Under the Online Safety Act, which covers the whole of the UK, social media platforms must take action against intimate image abuse. They must assess the risk of this content appearing, put systems in place that reduce the likelihood of this content appearing in front of users, and remove it promptly when they become aware of it.
If the UK’s communications watchdog, Ofcom, deems that X has failed to meet these requirements, it can fine the platform up to 10% of its global turnover. Ofcom has made “urgent contact” with X and its parent company, xAI, to find out what steps have been taken to comply with the law. As a last resort, Ofcom can seek a court order to block websites or apps in the UK.
Grok, which like X is owned by Musk’s xAI, could also face scrutiny. After reports that it had been used to produce adult pornography, Ofcom is expected to investigate whether it has adequate age-checking procedures in place to ensure under-18s cannot access the tool to create extreme content.
Are nudifying apps and websites illegal in the UK?
Currently, it is the sharing of non-consensual intimate images that is illegal – an offence better known as posting “revenge porn”.
The government has legislated to ban creating such images, or requesting their creation, under the Data (Use and Access) Act for England and Wales. However, this provision is not yet in force, so no enforcement action can be taken against anyone creating or requesting the creation of such images.
A government spokesperson said: “We will not tolerate this degrading and harmful behaviour, which is why we have also introduced legislation banning their creation without consent.” It is not clear why, six months after the law was passed, the government has still not brought it into force.
A further complication is whether the UK authorities would have jurisdiction. An offence must have a “substantial connection with this jurisdiction”, and there could be practical difficulties in prosecuting a perpetrator based abroad.
What if Grok had been used to produce images of child sexual abuse?
The Internet Watch Foundation (IWF), a child safety watchdog, has reported that users of a dark web forum were boasting of using Grok to create indecent images of children. IWF analysts say the images they have seen constitute child sexual abuse material under UK law.
It is an offence to take, make, distribute, possess or publish an indecent photograph or pseudo-photograph – such as an AI-generated image – of a person under the age of 18. According to Ofcom guidance for social media platforms, “content depicting a child in erotic poses without sexual activity should be considered indecent”, and an image is indecent “when it is inferred that the child is… associated with something sexually suggestive.”
What can I do if an image of me is manipulated on X?
Images of individuals are protected under the UK GDPR. Individuals have the right to request that manipulated images shared on X be deleted from the platform. A photograph of an individual counts as personal data; when a platform processes this data, it must do so in accordance with the law, and non-consensual manipulation of an image will breach the UK GDPR.
Individuals have the right to lodge a complaint with the Information Commissioner’s Office if X fails to remove the images, as this may be a breach of UK data protection law.
A deepfake that misrepresents you in a way that damages your reputation could be grounds for a defamation claim – but pursuing one would be costly. You can also contact the Revenge Porn Helpline, a government-funded organisation that helps get non-consensual intimate images removed from the internet quickly.
