Disinformation: A Silent Weapon Threatening Businesses and Elections

By Ashwani Singla, Founding Managing Partner, Astrum

Disinformation, the deliberate spread of false or misleading information, has emerged as a major threat in our interconnected world. The World Economic Forum's 2024 Global Risks Report ranks this concern above even climate change and war on its short-term risk list. Disinformation poses a significant danger to democratic processes, institutional trust, and the stability of businesses and their leaders across the globe.

The Amplification Machine: Social Media and Disinformation

Social media platforms, with their echo chambers and confirmation-bias-driven algorithms, act as powerful amplifiers of disinformation. A 2023 study by Astrum titled "Technology Shaping Communications" revealed the rise of Coordinated Inauthentic Behaviour (CIB) by malicious actors. These tactics involve building fake accounts with seemingly harmless content to establish trust, before using them to spread disinformation. Large follower bases then amplify these messages through likes, shares, and comments, further deceiving social media algorithms and propelling the false narratives.

Disinformation campaigns are designed to hijack our emotions with fear, anger, and "us vs. them" narratives, exploiting our natural biases and bypassing critical thinking, making us act and share without checking facts. The rise of AI-generated news outlets creates a veneer of legitimacy for fabricated stories, making it more difficult to identify and stop the spread of misinformation.

Disinformation: The New Kingmaker?

Advancements in generative artificial intelligence (GenAI) are lowering the cost of, and barriers to, producing manipulative content. Deepfakes, voice clones, and synthetic media are becoming increasingly accessible and realistic, posing challenges for verification and trust. This is particularly concerning in the context of elections. For example, YouTube in India has seen a surge in unlabelled content, including deepfakes targeting prominent politicians. The short-form video format encourages rapid scrolling and engagement, potentially exposing viewers to a flood of disinformation without proper context. Platforms like Facebook and WhatsApp further amplify these issues: even where regulatory guidelines require such fabricated content to be removed within 24 hours, that may not be fast enough to prevent significant damage.

Take the example of the Taiwanese elections. In the buildup to Taiwan's January 2024 vote, a flood of video content inundated social media platforms, all under the banner of 'the secret history of Tsai Ing-wen.' These videos, featuring news anchors fluent in both English and Chinese, propagated a series of false claims concerning missile activities and the outgoing president and her party. On election day itself, January 13th, an audio clip surfaced allegedly depicting Terry Gou, a candidate who had withdrawn from the race in November, endorsing the KMT party. It was later established that Mr. Gou had made no such endorsement.

According to media reports, operating behind these coordinated efforts were propaganda groups known under various aliases such as Spamouflage, Dragonbridge, and Storm-1376. According to disclosures by the threat intelligence team at Microsoft, this marked the first instance of a state-backed actor using AI-generated content to influence a foreign election. Notably, the videos' news anchors were created using CapCut, an app developed by ByteDance, the parent company of TikTok.
At their peak, these videos were shared at an astonishing rate of 100 times per minute. The Taiwan election serves as a warning of the future, where AI-driven disinformation campaigns pose a significant threat, particularly in a year when over 60% of the world's population will vote to elect their governments, including India and, later, the USA. The consequences of a disinformation-rigged election are dire: it can undermine democracy, deepen social divisions, weaken institutions and public trust in them, and strain international relations, further rupturing our RUPT(ured) world.

The Corporate Labyrinth: Disinformation as a Reputational Minefield

Companies and CXOs are not immune to the threat of disinformation. Malicious campaigns can damage corporate reputation and finances, and even lead to operational disruptions. A prominent example is the 2016 disinformation attack targeting PepsiCo and its CEO, Indra Nooyi: Trump supporters snubbed PepsiCo on account of false reports that Nooyi had told Americans who voted for him to "take their business elsewhere." The PepsiCo incident, prominent as it was, predates the era of GenAI. Between such attacks and cancel culture, companies and executives are navigating a minefield of reputational risk, inviting ever more scrutiny from regulators.

Publicly traded companies lose an estimated USD 39 billion annually to disinformation-related stock market losses. A single deepfake, like the one depicting an explosion at the Pentagon in May 2023, can cause a temporary stock market dip of half a trillion dollars. With the rise in phishing and targeted cybersecurity breaches, deepfakes can also be used to trick employees into giving away sensitive information. It is not hard to imagine the legal, reputational, and financial consequences of such acts.

Combating the Puppet Master: Strategies for a Disinformation-Resilient World

Fortunately, we are not powerless. In the face of disinformation campaigns, companies need both peacetime and wartime tools. During peacetime, it is crucial to build trust and understanding with stakeholders. This means proactively acting on and communicating the company's purpose and values, showcasing the 'walking our talk' leadership of the C-suite, and fostering a strong brand identity. By "being known for something" positive, companies establish a baseline reputation that can be defended when disinformation strikes.

Wartime demands a different approach. Since speed and responsiveness are of the essence, crisis preparedness comes first. Is the management team ready to respond effectively to a crisis? Are there established protocols? Designated roles and responsibilities? Scenarios gamed out? Content ready to distribute? These questions are just a glimpse of what it takes to avoid playing catch-up to the news and social media cycles. Readiness is the key to an effective response.

During wartime, a range of tactics come into play. A Crisis Response Team (CRT) can swiftly address false narratives, while support from credible external advocates, including the press, bolsters the flow of verified information and generates much-needed endorsements. The tables can also be turned by using the same tools and technology as the bad actors to narrowcast authentic content at scale. By being prepared and ready, companies can effectively counter disinformation and protect their reputations. Dealing successfully with disinformation must be a management responsibility as important as any business continuity plan.
"Character is like a tree, and reputation like its shadow. The shadow is what we think of it; the tree is the real thing." - Abraham Lincoln

Organizations need to focus on the tree; a strong shadow makes weathering the storms of wartime disinformation much more manageable.