clearpathinsight.org
AI in Technology

Grok AI: is it legal to produce or publish images of undressed people without their consent? | Grok AI

January 9, 2026 · 5 Mins Read

The deluge of images of partially clothed women – undressed by the Grok AI tool – on Elon Musk’s X has raised further questions about the regulation of the technology. Is it legal to produce these images without the subject’s consent? Should they be removed from X?

In the UK alone, doubts remain about the answers to these questions. Regulating social media is a nascent area, let alone controlling the deployment of artificial intelligence. There are laws that address this problem, such as the Online Safety Act, but the government has not yet introduced further measures such as a ban on nudification apps.


Is it illegal to post images of partially clothed people without their permission?

Sharing intimate images of a person without their consent is a criminal offense under the Sexual Offences Act in England and Wales, and this includes images created by AI. The law sets out what constitutes an intimate image, including a person engaging in a “sexual act,” doing anything “that a reasonable person would consider sexual,” and a person’s exposed genitals, buttocks, or breasts.

This also covers wet or see-through underwear, or clothing that exposes these body parts. However, according to Clare McGlynn, professor of law at Durham University and an expert on pornography regulation, an image of someone in just a bikini “would not be strictly covered” by the law.

Publishing messages containing false information with the aim of causing “significant psychological or physical harm” to the recipient constitutes an offense under the Online Safety Act.

Changes to the law have had an impact. Brandon Tyler, from Braintree, Essex, was imprisoned for five years last year for posting deepfake pornography of women he knew on an online forum.


What about tech companies?

Under the Online Safety Act, which covers the whole of the UK, social media platforms must take action against intimate image abuse. They must assess the risk of this content appearing, put systems in place that reduce the likelihood of this content appearing in front of users, and remove it promptly when they become aware of it.

If the UK’s communications watchdog, Ofcom, deems that X has failed to meet these requirements, it can fine the platform up to 10% of its global turnover. Ofcom has made “urgent contact” with X and its parent company, xAI, to find out what steps have been taken to comply with the law. As a last resort, Ofcom can seek a court order to block websites or apps in the UK.

Grok, which like X is owned by Musk’s xAI, could also face scrutiny. Following reports that it had been used to produce adult pornography, Ofcom is expected to investigate whether it has adequate age-verification procedures in place to ensure under-18s cannot access the tool to create extreme content.


Are nudification apps and websites illegal in the UK?

Currently, it is the sharing of non-consensual intimate images that is illegal – an offense better known as posting “revenge porn”.

The government has legislated to ban the creation of such images, or the requesting of their creation, under the Data (Use and Access) Act for England and Wales. However, this law is not yet in force, so no enforcement action can be taken against anyone creating or requesting the creation of such images.

A government spokesperson said: “We refuse to tolerate this degrading and harmful behavior, which is why we have also introduced legislation banning their creation without consent.” It is not clear why, six months after the law was passed, the government has still not brought it into force.

A further complication is whether the British authorities would have jurisdiction. An offense must have a “substantial connection with this jurisdiction”; there could be practical difficulties in prosecuting if the perpetrator were based abroad.


What if Grok had been used to produce images of child sexual abuse?

The Internet Watch Foundation (IWF), a child safety watchdog, has reported that users of a dark-web forum were boasting of using Grok to create indecent images of children. IWF analysts say the images they have seen constitute child sexual abuse material under UK law.

It is an offense to take, make, distribute, possess or publish an indecent photograph or pseudo-photograph – such as an AI-generated image – of a person under the age of 18. According to Ofcom guidelines for social media platforms, “content depicting a child in erotic poses without sexual activity should be considered indecent” and an image is indecent “when it is inferred that the child is… associated with something sexually suggestive.”


What can I do if an image of me is manipulated on X?

Images of individuals are protected under the UK GDPR. Individuals have the right to ask X to delete manipulated images of them that have been shared on the platform. A person’s photograph counts as personal data; when a platform processes that data, it must do so in accordance with the law, and non-consensual manipulation of an image is likely to breach the UK GDPR.

Individuals have the right to lodge a complaint with the Information Commissioner’s Office if X fails to remove the images, as this may be a breach of UK data protection law.

A deepfake that misrepresents you in a way that damages your reputation could be grounds for a defamation claim – but pursuing one would be costly. You can also contact the Revenge Porn Helpline, a government-funded organization that helps quickly remove non-consensual intimate images from the internet.
