QUESTIONING ANSWERS: how information diets shape our silicon curtains
Disclaimer: I am not immune to clickbait, gaslighting, manipulation, and biased narratives. I therefore, first and foremost, wrote this article for myself. But I have a sneaking suspicion you might also be caught in the same web of slanted views and silicon curtains. Please read with an honest approach and an open mind.
Have you ever wondered why critics of Donald Trump struggle to grasp how his supporters can find no fault in him, and why his supporters, in turn, are puzzled by the intense opposition he faces? Or perhaps, like me, you’ve been disturbed by how pro-Israel supporters seem unmoved by the suffering in Palestine, while those same supporters can’t comprehend why so many Christians seemingly support Hamas by not standing with Israel in its time of need.
This raises the question…
- Do we believe we are right and others are misinformed because of our moral superiority? Probably.
- Is this divide driven by ignorance, fake news or conspiracies? Rarely.
- Are our convictions shaped by moral principles, Biblical truths, or ideological beliefs? Hardly.
Whatever the reason, we can safely assume that our disconnect is mostly rooted in the different information diets we consume and in a toxic cycle of clicking, consuming, absorbing, and reacting. We click on the news that feeds our biases, only to be fed the same narrative the next time we click. We consume these selected narratives so often that what we absorb, we come to believe as truth and internalize as conviction. Over time, this shifts our reasoning and mental habits to the point where we can no longer react objectively. We think we’re doing research, but in reality we’re just reinforcing our own biases. We believe we’re open-minded, but we’re merely rearranging our existing opinions. And then the cycle repeats itself, again and again. We become conditioned to answer questions rather than to question the answers.
These aren’t just ideological and theological divides anymore. They’re symptoms of a deeper phenomenon: Algorithms that create a silicon curtain, which in turn establishes a distorted and lopsided reality that few can escape. This makes us unkind, defensive, self-righteous, and spiritually proud.
This condition is a combination of three components:
- Narrative diets
- Algorithm editors
- Silicon curtains
1. THE RISE OF NARRATIVE DIETS
In today’s digital age, we don’t just consume information—we’re fed it.
This is crucial to understand. Social media algorithms shape our news feeds based on engagement, not truth. The result? We’re served content that aligns with our existing beliefs, reinforcing them and filtering out dissenting views. It is a bias-feeder par excellence, making it ever harder to encounter alternate views and different opinions. The videos we watch are not shared to help us gain a broader understanding, but to confirm our biases and rack up more views. We are now being fed according to the diet we chose on previous occasions.
This creates what some call an informational diet—a steady stream of content that shapes not only what we know, but how we think. Over time, these diets become ideological echo chambers, where questioning our beliefs feels unnecessary because every answer we see confirms them.
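The feedback loop behind such an informational diet can be sketched as a toy simulation. Everything here is an invented illustration, not any real platform's code: the topics, the extra-weight-per-click rule, and the 20% chance of clicking non-matching items are all assumptions made for the sketch.

```python
import random

# Hypothetical sketch: a feed that weights topics by the user's past clicks.
TOPICS = ["left", "right", "neutral"]

def recommend(click_history, n=10):
    """Serve n items, weighted by past clicks plus a small baseline weight."""
    weights = [click_history.count(t) + 1 for t in TOPICS]
    return random.choices(TOPICS, weights=weights, k=n)

def simulate(rounds=50, bias="left"):
    """The user always clicks items matching their bias, others only rarely."""
    history = []
    for _ in range(rounds):
        feed = recommend(history)
        clicked = [t for t in feed if t == bias or random.random() < 0.2]
        history.extend(clicked)  # clicks feed back into the next round's weights
    return history

random.seed(42)
history = simulate()
share = history.count("left") / len(history)
print(f"Share of 'left' items clicked over time: {share:.0%}")
```

Even with only a mild initial preference, the click-then-reweight loop pushes the feed toward one topic: the diet the user chose on previous occasions becomes the diet they are served.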
2. ALGORITHMS: THE INVISIBLE EDITORS OF OUR REALITY
Most people today, especially those under 35, get their news from platforms like Facebook, YouTube, TikTok, Google, or news aggregators. Even AI provides information that will seldom contradict your already steadfast convictions. These platforms rely on algorithms to decide what content you see, and what content will win your future approval. But these algorithms aren’t neutral. They’re designed to maximize engagement, which often means prioritizing:
- Sensationalism over substance
- Echo chambers over diversity
- Emotionally charged content over balanced reporting
According to the Reuters Institute, this shift has made direct access to news websites increasingly rare, especially among younger audiences. Instead, we’re fed a personalized stream of content that reinforces our existing beliefs and biases.
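The prioritization described above can be shown in miniature. This is a hypothetical sketch with invented headlines and scores; it assumes a ranker that sorts purely by predicted engagement, ignoring accuracy entirely.

```python
# Invented example items: the sort key is predicted clicks, not accuracy.
items = [
    {"headline": "Calm, sourced analysis", "predicted_clicks": 120, "accuracy": 0.95},
    {"headline": "OUTRAGE: you won't believe this", "predicted_clicks": 900, "accuracy": 0.40},
    {"headline": "Balanced report with caveats", "predicted_clicks": 150, "accuracy": 0.90},
]

# An engagement-maximizing feed ranks by expected clicks alone.
feed = sorted(items, key=lambda it: it["predicted_clicks"], reverse=True)
for it in feed:
    print(it["headline"])
```

Because accuracy never enters the sort key, the least accurate but most sensational item lands at the top of the feed.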
3. THE “SILICON CURTAIN”: A NEW KIND OF DIVIDE
The term “Silicon Curtain” draws inspiration from the Cold War’s “Iron Curtain,” but it refers to the digital divide created by powerful tech platforms and data-driven systems. Unlike the Iron Curtain, this divide isn’t geographic—it’s algorithmic. It separates people not by borders, but by belief systems.
As described by Jason Perysinakis, it symbolizes how data colonialism—the extraction and control of personal data by tech giants—creates new forms of inequality and ideological control.
This curtain isn’t just about access to technology. It’s about who controls the flow of information, how that information is filtered, and what narratives are amplified or suppressed.
The Three Layers of the Silicon Curtain
Aurelis.org outlines three “Silicon Curtains” that shape our digital experience:
- Technological Echo Chambers: Algorithms create filter bubbles where we only see content that aligns with our views. They prioritize content that resonates with our past behaviour, creating a feedback loop of confirmation bias. This narrows our worldview and makes it harder to engage with differing perspectives.
- Global Digital Fragmentation: Countries like the U.S. and China follow vastly different digital paths, leading to technological nationalism and incompatible internet ecosystems. This, in turn, leads to incompatible worldviews.
- Superficiality and Speed: The demand for quick, clickable content discourages deep thinking and critical analysis, replacing nuance with noise.
FROM CONVICTION TO CONDITIONING
What’s striking is that these divisions aren’t necessarily rooted in deep moral convictions. They’re often the result of conditioning. When your digital environment consistently answers your questions with agreeable content, you stop questioning altogether.
We’ve shifted from questioning answers to answering questions—a subtle but profound change in how we engage with the world.
WHY THIS MATTERS
- Belief Formation: When algorithms curate our information diet, they shape not just what we know—but how we think.
- Democratic Erosion: A misinformed or polarized public is more vulnerable to manipulation, conspiracy theories, and authoritarian narratives.
- Cultural Isolation: The digital divide isn’t just about access—it’s about exposure. When we’re only shown what we already agree with, empathy and understanding erode.
- Polarization: When people live in separate informational realities, dialogue becomes nearly impossible.
- Manipulation: A public that doesn’t question is easier to influence—by politicians, corporations, or foreign actors.
- Loss of Empathy: Exposure to diverse perspectives fosters understanding. Without it, compassion is the first to disappear.
BREAKING THE CYCLE
To reclaim our cognitive autonomy, we must:
- Diversify our sources: Seek out perspectives that challenge our worldview.
- Understand the algorithms: Awareness is the first step toward resistance.
- Practice media literacy: Learn to spot bias, manipulation, and misinformation.
In a world shaped by invisible algorithms and curated realities, the most radical act might be this: to question the answers we’re given.
Sources:
Tech Growth Policy – The Silicon Curtain
Reuters Institute – Attitudes Toward Algorithms
Aurelis – Three Silicon Curtains