Instagram Filters Porn Images In DMs Received By Underage Users




One problem with social media is that images are easily shared, and not all of them are age appropriate. On Instagram the issue has become more serious, with minors being sexually harassed by predators who send obscene images via DM. Today Meta announced a feature that filters out pornographic images that underage users may receive through direct messages.



Instagram's system will scan images sent via DM. If an image appears to be pornographic, it is blurred by default and covered with a label warning that it may contain content the user does not want to see. The feature is enabled by default on accounts belonging to users under 18 years of age. Recipients can then report the pornographic images to Instagram for further action.


In addition to the recipient, the sender is also notified if Instagram detects that they are sending a pornographic image, and they are given the option to delete the image after it has been sent. A warning message is also displayed on screen if someone attempts to forward a pornographic image to other users.



Beyond protecting minors, the feature can also be enabled by adult users who do not want to receive pornographic messages, since such images are sometimes sent as a means of sexual harassment. It was also developed to prevent recipients from being blackmailed after responding with intimate images of their own. In Malaysia, several cases of extortion using this method have been reported.


Being a parent or guardian in today's internet era is very challenging, which is why monitoring the devices of children in your care is important. If that is not possible, consider not giving them access to social media, which has become very dangerous.
