X recently added support for artificial intelligence-powered photo editing, allowing any photo uploaded to the platform to be altered with text commands through Grok. The feature has since been abused: within days of its introduction, irresponsible parties used the AI tool to manipulate photos of women and children into indecent content.
Commenting on the issue, MCMC said it has taken note of the matter and is treating it seriously. The commission added that it is currently investigating the harm that occurred on X and will summon the platform's representatives.
The incident comes shortly after MCMC began enforcing the Online Safety Act 2025 (ONSA), which requires online platforms and licensed service providers to take steps to prevent the spread of harmful content, including pornographic and indecent material.
It is not yet known when MCMC will release its investigation report. In the meantime, MCMC urges users and victims to report harmful content to PDRM, to MCMC directly, or through the MCMC complaints portal.
