
Grok strips your photos (and it's a real problem)

Grok, the chatbot developed by Elon Musk's xAI and integrated into the X platform (formerly Twitter), is at the heart of a new controversy. According to an investigation by 404 Media, users are asking the bot to "remove the clothes" of women visible in public photos. The response, generated directly in discussion threads or via links to separate conversations, shows the person in a swimsuit or lingerie.

A chatbot without a filter, but with consequences

While the chatbot refuses requests for total nudity, it readily complies within the limits of suggestive content. This shift is all the more problematic because it can occur directly under a victim's post, making the altered image visible to everyone. The ease of access to this feature, its non-consensual nature, and its public exposure all amplify the harm.

The practice was first identified in Kenya, as noted by the local website Citizen Digital, before spreading internationally. In response, South African digital rights activist Phumzile Van Damme publicly challenged Grok on X, demanding an explanation for the lack of technical safeguards. The bot admitted a "flaw in [its] protections" and promised an update to its consent policies, without further details to date.

Since its launch in November 2023, Grok has positioned itself as a "freer" alternative to the assistants from OpenAI and Google, which are often criticized for overly strict safeguards. Elon Musk even presented it as an AI capable of answering "spicy questions" where others would refuse to engage. At launch, it did not hesitate to illustrate this promise with instructions for making cocaine or jokes at the expense of public figures.

But this desire to stand out at all costs has a downside. Where competing systems actively filter sensitive content, Grok appears far too permissive. In one exchange, the bot told a user that the nudity request "raises ethical concerns", yet that warning was not enough to prevent the edited image from appearing in the same thread.

This case comes as the United States has just taken a significant legislative step: the House of Representatives unanimously adopted the Take It Down Act, a bill that criminalizes the dissemination of non-consensual sexually explicit images, including those generated by AI. Embarrassing timing for X...
