Chinese AI clones user voice in seconds

Baidu recently published technical documentation describing new developments in the field of artificial intelligence. The neural-network-based system can clone a person's voice from even a short recording. It not only imitates human speech convincingly, but can also add embellishments such as an accent. Examples of voices imitated by the neural network have already appeared online. Earlier versions of the same technology needed much longer samples: last year, Baidu developers demonstrated how their Deep Voice technology reproduces a person's speech from roughly half an hour of material.

At first glance, this development might look like a toy rather than a serious practical tool, but that is far from true. It is likely to find many applications. A person who has lost the ability to speak could regain a voice, at least an artificial one. A restless child who cannot fall asleep without a bedtime story from a family member would be easier to settle, no longer depending on that person's physical presence or a phone call. And this is only a small part of the possible uses: the technology could also power personalized digital assistants for gadgets, smart homes, cars, and so on.

Like any other technology, though, this development has a downside.

Attackers could abuse it for purposes that are far from legal. According to experts, the current version of the software produces a cloned voice that fooled a voice-recognition system in 95 cases out of 100 during testing, and human listeners rated the cloned samples at 3.14 points out of a possible 4. This means fraudsters may soon put such artificial intelligence to their own uses, especially since neural networks can already swap or mimic people's faces in video. Recently, for instance, the web was flooded with pornographic clips in which the performers' faces had been replaced with those of celebrities. For now this can be written off as a prank, but the moment is not far off when convincing fake news can be produced by imitating both a person's appearance and their voice at the same time.
