Volodymyr Zelensky appeared in a video during the third week of the Ukraine crisis earlier this year, wearing a dark green shirt and speaking slowly and deliberately while standing behind a white presidential podium bearing his country’s coat of arms. The Ukrainian president’s body barely moved as he spoke, with the exception of his head. As he appeared to exhort Ukrainians to surrender to Russia, his voice sounded warped and almost gravelly.
In the tape, which was quickly identified as a deep-fake, he appeared to say, in Ukrainian: “I ask you to lay down your weapons and go back to your families. This war is not worth dying for. I suggest you keep on living, and I am going to do the same.” This is a stark illustration of how deep-fakes can be used amid a crisis to manipulate the psychology of the people living through it.
Officials from U.S. intelligence agencies are monitoring the audio and video clips that have surfaced since the beginning of the Ukraine crisis for signs of alteration that could spread misinformation.
Before delving further, let us first understand what exactly a deep-fake is.
Deep-fakes – A deep-fake is fabricated media in which a person’s likeness in an existing image or video is replaced with someone else’s. While generating false information is nothing new, deep-fakes use machine learning and artificial intelligence algorithms to edit or generate visual and audio content that deceives far more easily. Deep-fakes were initially developed for legitimate purposes, such as marketing. Marketers who use deep-fakes can save on video advertising costs because they do not require an in-person performer: rather than hiring actors in person, a marketer may obtain permission to use an actor’s likeness, then draw on previous digital recordings of the actor to create a new video by inserting the pertinent phrases from the script.
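The face-swapping technique behind many classic deep-fakes can be sketched at a high level: two autoencoders are trained, one per identity, sharing a single encoder but keeping separate decoders, so that at inference time the decoder for identity B re-renders a face encoded from identity A. The toy NumPy sketch below illustrates only that shared-encoder/swapped-decoder architecture; the weights are random stand-ins for trained parameters, and all names and dimensions are illustrative assumptions, not a real deep-fake system.

```python
import numpy as np

# Toy sketch of the shared-encoder / per-identity-decoder idea behind
# face-swap deep-fakes. Random matrices stand in for trained weights.
rng = np.random.default_rng(0)

DIM_IMG, DIM_LATENT = 64, 8  # flattened image size, latent code size

W_enc = rng.standard_normal((DIM_LATENT, DIM_IMG)) * 0.1   # shared encoder
W_dec_a = rng.standard_normal((DIM_IMG, DIM_LATENT)) * 0.1  # decoder: identity A
W_dec_b = rng.standard_normal((DIM_IMG, DIM_LATENT)) * 0.1  # decoder: identity B

def encode(x):
    # Both identities map through the SAME encoder to a shared latent space.
    return np.tanh(W_enc @ x)

def decode(z, W_dec):
    # Each identity has its own decoder that renders the latent code.
    return W_dec @ z

face_a = rng.standard_normal(DIM_IMG)  # a (fake) flattened face image

# Normal reconstruction: encode face A, decode with A's decoder.
recon_a = decode(encode(face_a), W_dec_a)

# "Deep-fake" step: the same latent code, decoded with B's decoder,
# re-renders the pose/expression of A as identity B.
swapped = decode(encode(face_a), W_dec_b)

print(recon_a.shape, swapped.shape)  # both are (64,)
```

In a real system the encoder and decoders would be deep convolutional networks trained on many frames of each face, but the swap itself works exactly as above: encode with the shared encoder, decode with the other identity’s decoder.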
However, a new use of deep-fakes has emerged in recent years: influencing the political dynamics of a country or, more recently, mobilising people for or against a cause by fabricating videos of world leaders delivering false messages and speeches, the quintessential example being the case mentioned at the start of this paper. This is not the first time the threat of deep-fakes has been taken so seriously; during the 2020 US Presidential election, FBI officials warned about the use of deep-fakes to influence the outcome. Thankfully, deep-fakes were not employed extensively then, whether because the technology was less mature or because of the government’s vigorous efforts to combat the problem. This time, however, the stakes in the Ukraine crisis are far higher, as its outcome will shape global politics for many years to come. According to The Guardian, a Russian propaganda campaign named ‘Ukraine Today’ is promoting bogus news about the war using fake profiles on Facebook, Twitter, and Instagram. These platforms do little to authenticate the news and profiles of their users: anyone can create an account and upload almost any content without proper verification. The lack of potent authentication measures on these sites, combined with deep-fake technology, can therefore be catastrophic for world politics.
Ukraine Crisis – Thousands of people were exposed to fake footage of unrelated explosions within hours of Russia’s invasion. Several users promptly published videos of explosions in Tianjin, China, and Beirut, Lebanon, purporting to show Russian bombers striking “Ukrainian HQ.” The videos were shared widely on Facebook, Twitter, TikTok, and other platforms, the dramatic but unrelated footage attracting people’s attention. Simultaneously, other social media users began spreading fabricated folk tales of valiant Ukrainian deeds. The most well-known of these concerns the “Ghost of Kyiv,” a fighter ace said to have shot down six Russian planes within hours of the invasion’s start. Old video-game and military-exercise footage posted in support of the rumour received millions of views. Former Ukrainian President Petro Poroshenko backed the story until May, when the country’s military revealed that the “Ghost of Kyiv” was a “superhero mythology.” While inspiring stories of bravery may give residents hope during a battle, experts warn that pervasive disinformation can be harmful and present a distorted view of the conflict.
The Kremlin’s initial claim that the invasion of Ukraine is a “special military operation” to “denazify” and “demilitarise” a “Neo-Nazi state” has been echoed repeatedly by pro-Russian users. Many have dismissed allegations of Russian war crimes, calling the conflict a “hoax.” In one widely circulated video, a news reporter stands in front of rows of body bags, one of which appears to move. The footage, however, does not depict fabricated Ukrainian battle fatalities but a climate change protest in Vienna in February, three weeks before the invasion began. Other instances of Ukraine-conflict deception have centred on “crisis actors,” individuals allegedly hired to play terrified or dead victims. On March 9, one false report claimed that a well-known beauty blogger had “pretended” to be the pregnant victim of the attack on a maternity hospital in Mariupol.
As the first missiles were launched against Kyiv, President Volodymyr Zelenskyy declared on social media that he would not abandon the country. His presence in Ukraine’s capital, as well as his nightly video addresses, put an end to speculation that he had fled. Some nevertheless claimed that the Ukrainian president was in exile and appeared in Kyiv via a green screen or film studio. Many of the photographs offered as proof actually showed Zelenskyy appearing as a hologram at digital-technology conferences across Europe, and as the war progressed he became an ever more frequent target of Russian propaganda.
Geo-Political Angle – False claims about the Ukraine conflict have spread to neighbouring countries, as well as to the NATO military alliance. As the fighting continued into May, social media users falsely stated that European Union member countries were preparing to join the fight. One video, bearing a digitally generated BBC News logo, stated that Poland’s military commander had issued an order putting army troops on “maximum alert.” The BBC later confirmed that no such report existed and that its branding had been used to create a bogus video. Polish officials have also accused Moscow of carrying out cyber-attacks against the country. Another false video, intended to raise tensions, claimed that Finland was preparing to send hundreds of tanks to its eastern border with Russia; the footage actually showed a freight train hauling equipment to western Finland for annual military training. This shows how the effects of deep-fakes can spill over to other countries as well.
Effects of Deep-fakes on Business – Deep-fakes can also do serious damage to a business. Suppose a video is published in which a corporation’s CEO appears to express controversial views on crucial issues. This can quickly drive down the value of the company’s stock. Even if the public relations department responds quickly and denies the veracity of the video, stakeholders are under no obligation to believe it. By the time proof of the fabrication is available, significant damage to the company’s reputation may already have occurred. After all, studies show that the majority of reputational damage occurs within the first 24 hours of an incident. Deep-fakes pose a reputational risk similar to business fraud, but with far greater ramifications: if a well-made fake spreads online, credibility is hard to restore and a reputational disaster can follow.
In a political context, imagine a deep-fake circulating in which the POTUS appears to support Russia in the Ukraine crisis. It would change the whole dynamic of world politics for a moment, at least until an official statement confirmed the video was fake, and by then a great deal of damage would already have been done: Ukraine would lose every last bit of hope, Western countries would begin questioning the credibility of the United States, and so on. This example is admittedly extreme, but it shows how catastrophic the technology can be.
Conclusion – All of this clearly shows that the deep-fake is a very poor step in technological growth: its shortcomings vastly exceed its advantages. The latest example of those shortcomings is plainly visible in the Ukraine crisis, which has fostered a plethora of deception, ranging from images taken out of context to digitally edited videos that use artificial intelligence to spread lies.