In today's hyperconnected world, wars are no longer fought only on land, in the air, or at sea. They are also waged in the hidden realm of algorithms: the code that determines what billions of people see, read, and think every day. This contest has come to be called the "Algorithm War": a struggle over control of digital attention and opinion, and over the recommendation systems that mediate them.
From Washington to Beijing, and from New Delhi to Brussels, governments are increasingly aware that algorithms are no longer neutral infrastructure but instruments of power. They can influence electoral processes, mobilize protest movements, and even alter the course of conflicts.
What Is the Algorithm War?
The algorithm war is fought not through traditional kinetic conflict but through competition for control of data, platforms, and the software that determines which information becomes visible. The recommendation engines behind TikTok, Facebook, YouTube, and Instagram analyze how users behave and serve content engineered to keep them engaged with the screen for longer and longer.
This pursuit of prolonged engagement comes at a price. Studies have found that such algorithms disproportionately amplify outrage-inducing and emotionally charged content, because fear and anger generate higher click-through rates. These dynamics divide societies, accelerate the spread of misinformation, and manipulate people's moods. The Cambridge Analytica scandal was an early warning: user data harvested through Facebook was used to build psychographic profiles, which in turn powered highly targeted political advertising aimed at shaping electorates in the United States, India, and elsewhere.
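To see why engagement optimization structurally favors provocative content, consider a deliberately simplified sketch. This is a toy illustration only; the signal names and weights are hypothetical and do not represent any platform's actual ranking formula.

```python
# Toy engagement-weighted ranker. The weights and signals below are
# invented for illustration; real platforms use far more complex models.
def engagement_score(post):
    """Combine predicted behavioral signals into one ranking score."""
    return (
        0.4 * post["predicted_watch_time"]   # time kept on screen
        + 0.3 * post["predicted_shares"]     # spread to new users
        + 0.3 * post["predicted_comments"]   # replies, often driven by outrage
    )

def rank_feed(posts):
    """Sort candidate posts so the most 'engaging' appear first."""
    return sorted(posts, key=engagement_score, reverse=True)

feed = rank_feed([
    {"id": "calm",    "predicted_watch_time": 5, "predicted_shares": 1, "predicted_comments": 1},
    {"id": "outrage", "predicted_watch_time": 8, "predicted_shares": 6, "predicted_comments": 9},
])
```

Nothing in this scoring function measures accuracy or well-being; every signal is a proxy for attention. The emotionally charged post therefore ranks first, which is the structural bias the research cited above describes.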
The TikTok Deal: A Turning Point in Algorithmic Geopolitics
The algorithm war is perhaps best summed up by the TikTok deal now reshaping US-China technology relations. TikTok, owned by the Chinese technology firm ByteDance, has more than a billion users worldwide. Its "For You" feed is widely regarded as one of the most effective recommendation engines ever built, able to learn user preferences quickly and sustain long-term engagement.
U.S. Concerns Over Algorithmic Influence
Washington's concerns go beyond data privacy. Chief among them is the access the Chinese government might gain to TikTok's algorithms or user data. In theory, such access could let the state shape the feeds of millions of Americans, subtly promoting certain narratives and suppressing others to steer public opinion.
In 2025, the US administration brokered a deal requiring:
- Majority US ownership of TikTok's American operations.
- Algorithmic control by US-approved operators, notably Oracle.
- Data localization, so that American users' data is hosted within US borders.
ByteDance agreed to license TikTok's algorithm for US operations while retaining the intellectual property rights in China, an arrangement that raises significant technical and legal hurdles. The deal's ramifications extend far beyond TikTok itself: it sets a precedent for treating algorithms as strategic assets subject to regulation, audit, and even forced localization.
Algorithmic Influence on Conflicts and Politics
The power of algorithms is not limited to social entertainment apps; they have become key tools in modern politics, warfare, and information operations.
Ukraine–Russia War: Algorithms on the Battlefield
In the Russia-Ukraine war, algorithms have become strategic military assets. Artificial intelligence systems support intelligence analysis, target selection, and unmanned aerial systems. Human-machine teaming has accelerated the tempo of warfare, helping Ukrainian forces make decisions faster than traditional chains of command would allow. Russia, meanwhile, uses algorithms to orchestrate propaganda campaigns, spreading disinformation through bot networks and recommendation systems to influence how the world perceives the war.
India and Pakistan: Social Media as a Political Battleground
When India-Pakistan tensions spiked after the April attack in Pahalgam, social media became a battleground of its own. In Pakistan, the hashtag #IndianFalseFlag alone generated more than 14,000 posts in less than 24 hours, many featuring AI-generated content, doctored images, and other unreliable material.
In India, authorities suspended thousands of social media accounts and channels for spreading "inflammatory" or false information. The government ordered X (formerly Twitter) to remove more than 8,000 accounts, including those of Pakistani media outlets and public figures as well as some Indian journalists and critics. Local cybercrime cells flagged hundreds of misleading or hateful posts; in Nagpur alone, almost 50 posts were taken down and some 2,000 accounts investigated following a spate of cross-border disinformation.
Regulation and Pushback: The Global Response
Governments and regulators around the world are moving to curb the dominance of algorithms. The most comprehensive effort to date is the European Union's Digital Services Act (DSA), which requires transparency about how algorithms rank and recommend content, options for users to switch to non-personalized, chronological feeds, and regular audits for systemic risks such as disinformation and hate speech.
Simultaneously, litigation in the United States argues that TikTok's addictive design and algorithmic architecture harm young people's mental health. Several states have enacted or proposed outright bans or heavy restrictions on use by minors.
Algorithmic Arms Race
Despite regulatory interventions, states continue to compete in deploying proprietary, state-developed AI systems for influence operations. The United States, China, Russia, and others invest heavily in AI-powered psychological operations, targeted propaganda, and predictive policing tools. These systems can tailor messages to specific demographic groups, making them more effective than traditional media campaigns.
The algorithmic arms race is likely to intensify. As more countries recognize that control over narratives is a strategic advantage, they will attempt either to control the algorithms of other countries, as the US did with TikTok, or to build alternative platforms that give them sovereignty over digital influence.
The Human Cost: Attention, Identity, and Truth
Ultimately, the struggle over algorithmic systems is about people: their attention, their agency, and their ability to make judgments. While algorithms can drive connectivity, entertainment, and education, they can also fuel polarisation, addictive behavior, and manipulation.
For adolescents, the constant availability of highly curated audiovisual content can heighten anxiety, promote unrealistic expectations, and foster compulsive behavior. For voters, personalized feeds can create parallel realities in which different groups consume entirely different "truths." Left unchecked, this erodes the shared information space on which democracies depend.
The Future of Algorithmic Power
This is the real battlefield of our time. Algorithms are the most effective means of spreading perceptions and narratives; at the extreme, they can even shape elections. Every country modernizing its defense and security apparatus must integrate this domain as well. Those who combine hard defense with control over digital influence will be best placed to guard their sovereignty. In future conflicts, narratives, perceptions, and algorithms will matter as much as kinetic warfare.
Without a balanced system of governance, algorithmic authority left solely in the hands of commercial organizations or rival nation-states will tend to operate as an instrument of control rather than of empowerment. The challenge, then, is to build multidisciplinary frameworks, combining legal, technical, and civic expertise, that make the largely invisible arena of algorithmic competition accountable, ensure a level playing field, and keep it compatible with fundamental human values.
The views and opinions expressed in this article/paper are the author’s own and do not necessarily reflect the editorial position of Paradigm Shift.
Mohammad Urva Rind is a student of Defence and Strategic Studies at Quaid-i-Azam University, Islamabad, with a keen interest in South Asian security and diplomacy, along with painting a positive image of Pakistan.



