Manufacturing Fear: How Data Became the Most Powerful Political Weapon

I come to this conversation from direct experience. I spent years working in marketing and advertising, specializing in social campaigns and audience behavior, and eventually became one of the top consultants in this space across several industries. Along the way I developed a deep fluency in how audience data, psychology, and targeted messaging can influence behavior and drive purchasing decisions. But over time, the work began to feel increasingly hollow, often pushing people toward things they didn’t need or creating desires that weren’t truly their own. Filmmaking had always been my real goal, so I eventually left brand work behind, using the skills I’d developed to transition into the entertainment industry and apply that same understanding of audiences toward something more meaningful: connecting people with stories rather than persuading them to buy something. Now working as a filmmaker, I’m grateful to be far removed from that world, though my time inside it revealed how powerful these tools can become in the wrong hands.

At our core, people are fairly simple. We gravitate toward our interests, our hobbies, and the communities that feel authentic to us. Marketing has always tapped into those affinities; that part isn’t new. What is new is the scale and precision made possible by social media platforms and data extraction. Companies led by figures like Mark Zuckerberg, Peter Thiel, and Elon Musk now operate with unprecedented access to personal behavioral data, shaping information flows and social dynamics in ways that often deepen division rather than strengthen communities. Many critics argue that powerful actors have exploited these systems for profit and influence, contributing to social fragmentation and mistrust, especially amid ongoing controversies and revelations surrounding elite networks and abuses of power. Whether through negligence, indifference, or active pursuit of profit above all else, the result feels the same: communities pulled apart while enormous wealth concentrates at the top. Below is a deeper look at the real harms created through the exploitation of our data.

THE PROBLEM

The exploitation of our data affinities and personal interests by powerful tech elites is one of the most dangerous developments of the modern digital age. When companies like Meta and Palantir construct empires built on the extraction and manipulation of human behavior, while billionaires increasingly consolidate influence through the ownership of major media institutions like The Washington Post and Paramount, the harm extends far beyond privacy violations; it strikes at the foundations of autonomy, democracy, and truth itself.

At the most basic level, our data affinities are intimate reflections of who we are: what we fear, what we desire, what we believe, and what we might become. When these signals are harvested at massive scale, corporations gain the ability not only to predict behavior, but to shape it. Platforms no longer simply reflect our preferences; they actively steer them, optimizing for engagement, profit, and power rather than well-being or social good. Human attention becomes a commodity, and people become instruments within systems designed to keep them reactive, predictable, and profitable.

The political danger emerges when this behavioral power is used to manipulate public emotion, particularly fear. Data-driven platforms learn precisely which messages provoke anxiety, outrage, tribal loyalty, or resentment… and then amplify them. Fear spreads faster than nuance. Anger generates more engagement than reason. Algorithms reward the most emotionally activating content, regardless of accuracy or social consequence.
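
To make that dynamic concrete, here is a deliberately simplified sketch in Python. It is purely illustrative: the Post fields, the engagement_score weights, and the example posts are all my own assumptions, not any platform’s actual code. It shows only the structural problem described above: when a feed is sorted purely by predicted engagement, and emotionally activating content reliably earns more of it, fear and outrage rise to the top while accuracy never enters the calculation.

```python
# Hypothetical, simplified sketch of engagement-optimized ranking.
# All fields and weights are illustrative assumptions, not any real
# platform's system.

from dataclasses import dataclass

@dataclass
class Post:
    text: str
    predicted_clicks: float   # model's estimate of click likelihood
    predicted_shares: float   # model's estimate of share likelihood
    outrage_score: float      # 0..1, how emotionally activating the post is
    accuracy_score: float     # 0..1, how factually reliable the post is

def engagement_score(post: Post) -> float:
    # An engagement-only objective: emotional activation is rewarded
    # because it correlates with clicks and shares, while accuracy_score
    # never enters the calculation at all.
    return post.predicted_clicks + 2.0 * post.predicted_shares + 1.5 * post.outrage_score

def rank_feed(posts: list[Post]) -> list[Post]:
    # Sort the feed purely by predicted engagement, highest first.
    return sorted(posts, key=engagement_score, reverse=True)

feed = rank_feed([
    Post("Nuanced policy analysis", 0.10, 0.02, outrage_score=0.1, accuracy_score=0.9),
    Post("THEY are coming for YOU", 0.30, 0.25, outrage_score=0.9, accuracy_score=0.2),
])
print([p.text for p in feed])  # the fear-driven post ranks first
```

Running this ranks the fear-driven post first, not because anyone chose that outcome for that post, but because the objective function never asks about truth.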

The result is a public sphere increasingly shaped by manipulation rather than deliberation. Citizens are pushed toward emotional reaction instead of reflection. Complex issues are reduced to outrage cycles. Political actors, both domestic and foreign, can exploit these systems to divide populations, suppress trust in institutions, and mobilize people around perceived threats rather than shared solutions. Democracy depends on citizens capable of independent thought and informed debate, yet these systems bypass reason and trigger reflex.

The danger intensifies when this influence is concentrated in the hands of a few unaccountable actors. Companies wielding data resources and algorithmic power rival nation-states in influence, yet operate without democratic oversight. Their platforms increasingly determine which information spreads, which narratives gain legitimacy, and which voices are amplified or buried. Shared reality fractures into personalized information bubbles, making consensus nearly impossible.

Even more troubling is how these systems exploit psychological vulnerabilities at scale. Platforms learn what makes individuals feel insecure, validated, or afraid, then feed them more of it to maintain engagement. Fear and identity-based conflict become business assets. Polarization deepens, radicalization accelerates, and populations become easier to manipulate politically and economically.

There is also a chilling effect on freedom itself. When people know their behaviors and beliefs are constantly monitored, they self-censor. Exploration narrows. Political dissent becomes riskier. Innovation and democratic participation decline as conformity becomes safer than questioning. Surveillance doesn’t need to be overt; its mere presence reshapes behavior.

Children and young people are especially vulnerable. Their psychological development now unfolds within systems designed to capture attention and shape behavior before they can meaningfully consent. Emotional manipulation becomes normalized from an early age, embedding patterns of dependency and comparison that persist into adulthood.

Geopolitically, data concentration creates new forms of power. Detailed psychological and demographic profiling enables targeted propaganda and information warfare across borders. Elections and social movements become vulnerable to invisible influence campaigns designed to inflame division and undermine trust.

Security risks add another layer: massive data stores inevitably attract breaches, exposing intimate details of millions of lives. Personal histories cannot be reset once stolen, creating long-term vulnerability.

Culturally, personalization fragments shared experience. Citizens increasingly inhabit separate informational realities. Without common reference points, public dialogue collapses into conflict, and compromise becomes impossible.

At the moral core lies a deeper problem: human emotion itself becomes raw material for profit. Loneliness, fear, hope, curiosity… every signal is captured and monetized. Emotional states become inputs in business models that reward systems capable of keeping people anxious, reactive, and engaged.

The next wave of artificial intelligence threatens to magnify these risks. As predictive systems grow more sophisticated, platforms will anticipate needs and influence decisions before individuals consciously recognize them. Persuasion becomes personalized, invisible, and continuous. Machines may understand human behavior better than humans understand the systems shaping them.

Ultimately, the exploitation of data affinities is dangerous because it shifts power away from individuals and communities and toward opaque systems optimized for control rather than dignity. Freedom becomes something performed rather than lived. Choice persists in form but not always in substance.

The deepest fear is political: that societies governed by citizens making independent decisions will slowly transform into populations guided by engineered emotion, especially fear… shaped by systems designed to influence rather than inform.

If left unchecked, this trajectory risks creating a world where algorithmic incentives outweigh human values, where shared reality dissolves, and where those with the most data know us better than we know ourselves, allowing bad actors to exploit these systems and use that knowledge not to empower us, but to manipulate us and pull communities apart.

THE RESPONSE

A meaningful response starts with recognizing that this problem isn’t abstract—it affects how we think, how we relate to each other, and how our societies function. So the call to action has to be both personal and collective.

First, we have to reclaim agency over our own attention. That means being conscious of how platforms try to provoke emotional reactions… especially fear and outrage, and refusing to let algorithms decide what we care about. Slowing down before sharing, questioning emotionally charged content, and seeking out diverse sources of information are small acts, but they restore independence of thought.

Second, we must demand transparency and accountability from the companies and legislators who shape our information ecosystems. Supporting stronger data privacy laws, algorithmic transparency, and regulation of behavioral targeting isn’t anti-technology; it’s pro-democracy. The digital world has grown faster than the rules governing it, and citizens must push policymakers to catch up.

Third, accountability must extend to the companies and leaders who have amassed unprecedented influence over how information moves through society. Companies like Meta, X, Google, and Palantir wield enormous power through platforms and technologies that shape public discourse, behavior, and access to information. Holding these companies accountable means demanding meaningful limits on how personal data is collected and used, increasing transparency around algorithmic systems, and reducing the unchecked access corporations have to our lives and attention. Rebalancing that relationship is not about rejecting technology, but about ensuring it serves the public interest rather than undermines it.

Fourth, we need to rebuild real-world community. Algorithmic systems thrive when people are isolated and angry. Shared physical spaces, local organizations, arts communities, schools, neighborhood initiatives, and cultural events… create bonds that can’t be easily manipulated by digital systems. Strong communities are harder to divide.

Fifth, those of us who work in media, storytelling, technology, and culture have a responsibility to use our skills ethically. Stories, films, journalism, and art can reconnect people rather than divide them. Technology itself isn’t the enemy; its purpose and governance are what matter.

The goal isn’t to reject technology or retreat from the digital world, because I believe we can find real community and build meaningful connections there. It’s to ensure that human dignity, truth, and community come before profit, engagement metrics, and systems that divide our communities. The future of public discourse, democracy, and even personal autonomy depends on whether we choose to take that responsibility seriously now, rather than later.
