Do you still believe in what you see?
Politicians getting drunk and swearing, business people confessing to crimes, actors "wearing" other people's faces and doing strange things: the Internet is full of videos of celebrities behaving oddly. Yet none of them are real. A deepfake is a video in which a person's appearance and voice are altered using artificial intelligence. Such a hoax will eventually be exposed, but until it is, it can do serious damage to your nerves and your career. Experts have already named deepfakes the main information weapon of this year. How far have things gone? We asked Vitaly Demirov, Advisor and Consultant at the Belarusian Institute for Strategic Research.
Hoaxes have a long history, and they are here to stay, the expert says. Some shock the public with scandalous "revelations" in newspapers and on the Internet; others rely on no less sensational photos; still others flood messengers and social media with fake news. A couple of years ago in Mexico, two men were beaten and set on fire because of rumours spreading on Facebook and WhatsApp that they were kidnapping children. The messages warned, "Beware, the scourge of kidnapping has already struck this country." The same messages terrified readers with claims that the criminals might be involved in organ trafficking, that children had disappeared in the previous few days, and that their "bodies were found with organs harvested, and their bellies were cut open and gutted". The rumour turned out to be a hoax, but the angry crowd acted before the situation could be sorted out. A similar incident happened at around the same time in India.
"Attempts to manipulate through information have been known since ancient times," Vitaly Demirov notes. "This is shown, for example, in the ancient Chinese treatise 'The Art of War' by Sun Tzu. In the 13th century B.C., the battle between Egypt and the Hittite State ended in a tactical draw, yet each side took credit for the victory in inscriptions on the stones of its palaces. All of these are manipulations designed to steer public opinion and the behaviour of the crowd in the desired direction. It has always been like that; the only difference is the channel that carries the hoax, whether a stone on a palace wall, a newspaper, a broadcast or a social media account."
These days, another channel is gaining popularity: deepfakes. Such fake videos are created with generative adversarial networks (GANs). First, publicly available photographs of a potential victim are collected; then they are fed into a software application, and the rest is largely automatic. Vitaly describes the mechanism without going too deep into technicalities: one GAN algorithm (the generator) studies photos and videos of a given person and produces its own images, while a competing algorithm (the discriminator) tries to tell them apart from the originals; the two train against each other until the discriminator can no longer distinguish the fakes from the real thing. This means that a person's face in the original photo or video can be "masked" over with someone else's face. It is also possible to make a victim "say" whatever words you want: here, the software learns the relationship between the person's mouth movements and specific spoken words. Need to "glue" a face onto someone else's body? The neural networks will do that too, after learning the connections between them.
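The adversarial contest described above can be sketched in miniature. The toy example below is a hypothetical illustration, not the software Vitaly refers to: a one-dimensional GAN in Python with NumPy, where a linear "generator" learns to produce numbers that a logistic "discriminator" cannot tell apart from samples of a target distribution. Real deepfake systems apply the same game to deep convolutional networks and images.

```python
import numpy as np

# Toy 1-D GAN: the "real data" is a Gaussian; the generator learns to mimic it.
rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Generator G(z) = a*z + b maps random noise z to a candidate "fake" sample.
a, b = 0.1, 0.0
# Discriminator D(x) = sigmoid(w*x + c) outputs the probability that x is real.
w, c = 0.1, 0.0

TARGET_MEAN, TARGET_STD = 4.0, 0.5   # the distribution the generator must fake
lr, batch = 0.05, 64

for step in range(3000):
    # --- train the discriminator to separate real samples from fakes ---
    real = rng.normal(TARGET_MEAN, TARGET_STD, batch)
    z = rng.normal(0.0, 1.0, batch)
    fake = a * z + b
    d_real = sigmoid(w * real + c)
    d_fake = sigmoid(w * fake + c)
    # gradients of the binary cross-entropy loss w.r.t. w and c
    gw = np.mean(-(1 - d_real) * real + d_fake * fake)
    gc = np.mean(-(1 - d_real) + d_fake)
    w -= lr * gw
    c -= lr * gc

    # --- train the generator to fool the (frozen) discriminator ---
    z = rng.normal(0.0, 1.0, batch)
    fake = a * z + b
    d_fake = sigmoid(w * fake + c)
    # gradients of the non-saturating generator loss -log D(G(z))
    ga = np.mean(-(1 - d_fake) * w * z)
    gb = np.mean(-(1 - d_fake) * w)
    a -= lr * ga
    b -= lr * gb

# After training, generated samples typically cluster near TARGET_MEAN.
fake_mean = float(np.mean(a * rng.normal(0.0, 1.0, 1000) + b))
print(f"mean of generated samples: {fake_mean:.2f}")
```

As in the full-scale case, neither network is trained in isolation: the discriminator's mistakes are exactly what pushes the generator's output closer to the real data, which is why the finished fakes are hard to spot by eye.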
Full article is available in the Russian version of the website.