Global security faces a novel form of threat as tweets, bots and artificial intelligence (AI) increasingly become as important as conventional munitions for states seeking advantage in warfare.
This was an ever-present theme at the NATO StratCom Dialogue in Latvia I attended earlier this month, where two days of disheartening panels and presentations forecast the decade of war ahead. As the dialogue made clear, wars are no longer just material; they are fought within the global flows of information and through the exchange of disinformation. This new paradigm operates at an expansive scale, is personally targeted, and is lethal.
The power of propaganda
Propaganda has always been a part of war. It is the tool that mobilises public sentiment, galvanises a population, and sustains the expenditure of resources and human lives – something that would be otherwise intolerable in times of peace.
It is symbolic that the battlefield of war has long been described as a theatre, for the violence that unfolds there has always been intensified by theme and story – a narrative with purpose.
Here stories of unity and righteous cause are critical to motivating a brutal consensus, pitting one group against another. And narratives of validation and triumph sustain momentum through adversity in pursuit of an ultimate conclusion.
But the new era of information warfare is not propaganda, at least not in the ancestral mass-media form. In the digital age it is disinformation and disruption. During the Vietnam War, America literally bombed the country with ideas, dropping billions of leaflets and actively occupying the airwaves in an attempt to instil disunity amongst the North Vietnamese people and persuade them of the virtue of the American cause.
It had little if any impact, but it served as a projection of American internal propaganda abroad. It was traditional wartime propaganda.
The new technologies of warfare are increasingly precise and intimate. They are drones and sophisticated tactical weapons, but they are also the weaponisation of large-scale social media platforms, where individuals are targeted en masse, closed groups are infiltrated, and the information ecosystem is flooded with unlimited voices of dissent.
These voices are indistinguishable from your neighbour or your local barista. They seem like a mum from your yoga group, and in the future, they will include sophisticated deep fakes that present as any trusted voice in your life. The disinformation created in an AI future will weaponise your social network against you.
The numbers crunch
Anthropologist Robin Dunbar proposed that the number of people with whom any one person can maintain stable social relationships is about 150. We now live in a world where our digital profile retains a perfect record of our 150 people (or fewer) and their relative impact on our beliefs and attitudes.
Imagine the voice of your best friend in a conversation that cannot be distinguished from reality convincing you of the merits of a candidate, cause or position. Imagine a world where you cannot tell if it is them, or a deep fake. That is the potential impact of AI – it is capable of disrupting the part of human interaction that binds us, corroding not only our national stories and narratives, but our ability to maintain our communities of trust.
In a multipolar world, we are in a new security environment where the global arms race is a battle over the information space in which democracy can survive. But the weapons are in the hands of only one side; the battle is asymmetrical.
The global alliance of democracies can create internal narratives, but so far we have not elected to develop the ability to wage a disinformation war. To do so contravenes the principles of governance that sit at the heart of our democratic institutional structures. We have handicapped ourselves with unequal rules of engagement, like a treaty signed by only one side.
We know foreign countries are moving seamlessly inside our borders, penetrating our conversations and reshaping our views, and we are defenceless. Not only will we not undertake the same actions in their countries, but we have done little to build an information shield and protect our systems for communication and news at home.
We have not built critical digital media skills, regulated social media platforms or enacted proper privacy legislation. Not in Australia and not in America. Instead of implementing these countermeasures we have relied on providence or some notion of moral deservedness to protect the values we wish to uphold.
Old foes, new plans
China and Russia are actively moving to reshape a global order they feel no longer suits their interests, if it ever did. A mutual alignment is forming, not from cultural ties or political ideology, but from a desire to challenge the existing state of play.
Both countries face dire demographic decline, both are unhappy as actors subservient to a system where the US dollar is the reserve currency, and both see each other as necessary allies in a war that increasingly pits centralised autocracies against ever more polarised democracies.
The cooperation between China and Russia is intensifying on every level: economic, military, and strategic. There is every reason to expect that collaboration and coordination will extend to the information war and the tactics of disinformation, with the aim of undermining the power of democratic states. In practice, this will include influencing elections, inciting social unrest, increasing polarisation, and undermining trust in public institutions.
Our great weakness
Democracies, with their culture of public discourse, empowerment of diverse viewpoints, and structural protections of free speech, are by design more vulnerable to disinformation warfare. This vulnerability is at least twofold:
Firstly, democratic protections against censorship result in a digital information ecosystem with minimal content moderation. Content selection and regulation are largely left to platform self-governance under free market forces; managed by commercially-oriented, attention-optimising algorithms, these platforms preferentially amplify provocative and engaging content.
This comes at the expense of socially-oriented goals such as upholding factual accuracy, fostering balanced discourse, and promoting social cohesion.
These open ecosystems are highly vulnerable, not only to the insertion of disruptive content by malicious actors but also to strategically crafted content that aims to exploit the inbuilt algorithms by appealing to their amplification preferences. By contrast, states that employ centralised regulated content control and engage in strategically coordinated information engineering appear to be less susceptible to external information insertion and have ecosystems that are less likely to amplify undesirable content.
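The amplification bias described above can be illustrated with a toy sketch (hypothetical scoring logic, not any real platform's algorithm): a ranker that orders posts purely by predicted engagement will surface outrage-bait above measured analysis, regardless of accuracy.

```python
# Toy illustration of engagement-optimised ranking (hypothetical scoring,
# not any platform's actual algorithm). Posts are ordered purely by
# predicted engagement, so provocative content outranks measured content.

posts = [
    {"text": "Measured policy analysis", "clicks": 120, "shares": 10},
    {"text": "Outrage-bait conspiracy claim", "clicks": 900, "shares": 400},
    {"text": "Local community notice", "clicks": 60, "shares": 5},
]

def engagement_score(post):
    # Shares weighted above clicks: resharing spreads content further.
    return post["clicks"] + 3 * post["shares"]

# Rank the feed by engagement alone; factual accuracy plays no part.
feed = sorted(posts, key=engagement_score, reverse=True)
for post in feed:
    print(engagement_score(post), post["text"])
```

Nothing in the scoring function considers truth or social value, which is precisely the gap that strategically crafted disinformation exploits.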
Secondly, by virtue of the political influence exerted through cyclical elections, democracies are more responsive to shifts in public sentiment. This enables them to adapt to changing tides and shape outcomes accordingly.
On one hand, voter responsiveness enables an agility of leadership and ensures continual renewal to address the priorities of constituents. However, it also renders a state more vulnerable to the potential fragmentation of its existing commitments within a context of civic discord.
While our democracies are indeed particularly vulnerable to disinformation warfare, it will ultimately be our failure to protect ourselves that will be our undoing. We can, and must, expand digital literacy, protect citizen privacy through legislation, and build the capability to effectively regulate social media platforms. The real challenge is finding the political will to do so.