SEOUL, April 21 (Korea Bizwire) — Artificial intelligence (AI) technology is serving as eyes and ears for disabled persons, improving their access to media.
The technology is expected to give disabled persons the same level of access to media as non-disabled persons, particularly as media consumption continues to rise following the coronavirus outbreak.
The Korea Communications Commission (KCC) and the Ministry of Science and ICT held a technological demonstration on Tuesday in Yeouido, Seoul, in commemoration of the 41st National Disabled Persons’ Day, introducing jointly developed AI technology that automatically translates audio into subtitles and sign language.
The KCC’s newly developed automatic subtitle generation technology uses AI to analyze speech and display it as subtitles on mobile devices.
Users can access a media streaming application in which they can watch the news with automatic subtitles.
A voice-recognition browser extends the automatic subtitles to any video watched online. The voice recognition rate currently stands at around 80 percent.
The Ministry of Science and ICT also introduced an emotion expression service in which an avatar translates audio into sign language, along with another service that converts various emotions into audio.
“It isn’t easy to always have a sign language interpreter on standby in emergency situations, and it is believed that this could be one of the alternative solutions,” said Ahn Chung-hyun, senior researcher at the state-funded Electronics and Telecommunications Research Institute.
The two agencies plan to complete development of the automatic translation system for voice, subtitles, and sign language by 2023, and to connect it to a sign language avatar capable of expressing emotions.
H. M. Kang (hmkang@koreabizwire.com)