
Like the 61-Second 'Nagita' Video, K-Pop Artists Are Often Targeted by Deepfakes

The 61-second video appears to show Nagita Slavina, but authorities later confirmed it was an edit. The issue is clearly troubling, because some people who saw the video believed it was genuine.

In South Korea, a number of artists have also had unpleasant experiences with fake sexual videos, becoming targets of criminals who exploit deepfake technology.


A study from Deeptrace Labs, a cybersecurity firm that detects and monitors deepfake videos, found that political deepfakes are a minor issue; the real threat falls on women in different parts of the world.



According to the study, 96% of deepfakes on the internet are non-consensual pornographic videos, meaning they were released without the women's permission. Of the female subjects of pornographic deepfakes, 41% are British or American actresses and nearly 25% are K-pop singers or musicians.


Henry Ajder, head of research analysis at Deeptrace Labs, argues that the overrepresentation of K-pop musicians indicates the increasingly global reach of deepfakes. K-pop deepfakes have also been described as an 'early trend' in AI and are most often, though not always, pornographic, as Rolling Stone reports.


Interestingly, Ajder said the data shows that most users on online forums who generate these deepfakes are not from South Korea but from China, home to one of the largest K-pop markets in the world. This is despite strained diplomatic relations between the two countries in recent years, with major Korean artists unable to perform in China since 2016.


Worse yet, one site caters exclusively to deepfake porn viewers by displaying the faces of female K-pop idols, even ranking them by view count, as reported by Koreaboo, Monday (17/1/2022).


These videos not only violate human rights and constitute a form of sexual harassment; they are also often sold on a pay-per-view basis, with the creators profiting from the K-pop idols' images. Worse still, the videos have been spread openly on social media platforms such as Twitter.
