SEOUL, Jan. 20 (Korea Bizwire) — Amid growing criticism of the online culture of sexually objectifying idol stars, deepvoice, a sound synthesis technology that uses artificial intelligence (AI), is emerging as a new source of digital sex crimes.
In recent days, a string of pornographic videos has been distributed that uses deepvoice technology to manipulate the audio of original content, replacing it with the voices of idol stars.
Deepfake videos that manipulate the faces of famous celebrities have mainly threatened female idols. Deepvoice content, however, also targets the voices of male idols.
Deepvoice content takes various forms, ranging from simple voice manipulation to synthesizing the voices of male idols into same-sex pornography.
There are even cases in which such videos are bought and sold between individuals through social media platforms such as Twitter.
“Some cases have been reported where the victims were not male idols or actors but ordinary men, including school teachers,” said Moon Sung-ho, head of a consulting center for false accusations of sex crimes.
Because this kind of technological abuse is a newly emerging trend, there are few precedents for legal intervention in South Korea.
However, lawyer Seo Hye-jin said, “If it’s acknowledged that deepvoice videos cause sexual humiliation, it could be punished under the Act on Sexual Crimes of Violence.”
J. S. Shin (email@example.com)