Information and communication technologies have opened broad space for hostile actors seeking to bend awareness toward engineered outcomes. Adversaries who understand human cognition now hold tools that shape perception directly and at scale. A modern smartphone notification cycle delivers emotional cues faster than any leaflet drop or loudspeaker broadcast from past conflicts. A social feed becomes a battlespace once hostile forces insert narratives that feel personal, urgent, and unavoidable.
Fabricated realities form the first pattern. A clear example appeared during the early phase of Russia’s full-scale assault on Ukraine, when Telegram channels pushed false videos claiming Ukrainian forces had abandoned major cities. A deepfake of Zelensky telling troops to surrender circulated during the defense of Kyiv. A fabricated strategic picture pushed despair, not information, into the minds of defenders and civilians.
Misinformation adds another layer. One example surfaced when pro-Kremlin networks told residents of Mykolaiv that their drinking water had been poisoned by retreating Ukrainian forces. Panic spread briefly until local authorities disproved the claim. A similar tactic appeared in Syria, when Russian channels circulated false warnings about “chemical attacks staged by the White Helmets,” shaping perception before events on the ground unfolded.
Propaganda aligns with a sharper military purpose. Chinese state outlets ran coordinated narratives during the Philippine reef confrontations, claiming Filipino vessels had attacked Chinese ships despite clear video evidence showing the reverse. A population that absorbs skewed imagery begins to question facts, and that confusion slows collective response.
Engineered fear and panic play out in cyber environments. A notable example emerged when Iranian-linked actors targeted Israeli civilians with SMS messages warning of incoming chemical rockets during periods of high tension. A message on a private device feels intimate, and that intimacy amplifies fear. Hamas-aligned information channels follow the same pattern when they push false evacuation orders into Israeli WhatsApp groups to trigger mass confusion.
Control of stable preferences becomes possible through repetition. Kremlin-linked Facebook pages targeted Eastern European audiences for years with slow-drip narratives claiming that NATO expansion guaranteed war. Older citizens exposed to years of such content shifted from cautious support to active opposition. A long-horizon attack creates hardened attitudes that survive fact checks.
Military aggression accelerates once populations lose confidence in their own understanding. An adversary steps into that cognitive vacuum and shapes the story that governs fear, loyalty, and action. A forthcoming section will move deeper into platform-specific case studies, such as Russian influence on VK, Iranian campaigns on Aparat, and PRC micro-targeting on Douyin, to map how each actor blends cyber operations with psychological pressure.
