It’s Not You, It’s Me
The original (Spanish) version of this post can be found here.
About a year ago, I wrote a thread on Twitter that went viral within minutes. This article is the third (and final) piece in a small series where I share some of the deeper ideas underlying that thread. I believe these ideas are useful not only for understanding the world, but also for navigating life.
The first installment was about uncertainty, the second about our perception of time, and this third one is about how attention works.
If I ask you how your mind works, how you think, you’d probably tell me that you go through the world perceiving reality with your senses and analyzing it—“pretty well, actually!”—with your brain.
We believe we think similarly to a machine, collecting and scientifically analyzing all observable data (in fact, we’ve designed machines based on this belief). And we have the feeling that we’re pretty good at understanding our surroundings and what’s happening to us. That’s why we argue with others: it’s hard for us to reconcile our view of the world with someone else’s, especially when they insist on seeing it differently.
But if everyone has a different perspective and all are valid, how can reality be the same for all of us?
Because our mind doesn’t really work the way we think it does.
Every second, trillions of electromagnetic, auditory, thermal, tactile, barometric, olfactory, vestibular, proprioceptive, and many other stimuli are happening around you. Your brain can perceive only about 0.00001% of that, and it can consciously process an even tinier fraction.
Until recently, the prevailing scientific view was that the brain's job was to select certain stimuli it found relevant, discard the rest, blend the selection with existing ideas and knowledge, and invent... a story. This was George Lakoff's theory of mental frames: we have pre-existing notions that help us understand what our senses perceive. So, when I see a cow, I don't need to observe every attribute to understand it's a cow. With just a glance, I can pick up the key features to identify it.
But recent breakthroughs in neuroscience are telling a different story. We’ve long known that sensory organs, like the eyes or skin, communicate with the brain by sending electrical impulses. What we didn’t know is that the information flowing from the senses to the brain represents only 20% of all neurological communication. In reality, 80% of the signals go from the brain to the senses, not the other way around.
The Experience-Making Machine
Recent advances in neurophilosophy paint a different picture. The human brain actually perceives very little. Instead, it’s a prediction-generating machine that tells itself what it’s going to observe and fits a few sensory inputs into a narrative it created before experiencing reality.
It clings so tightly to these predictions that if stimuli don’t match the narrative, the brain just makes them up. That’s why we sometimes think our phone is ringing when it’s not. Our brain expects it to ring and generates the sensation. That’s why some people still feel a lost limb. That’s why we perceive imperfect shapes—like an unfinished circle—as perfect.
So, if I see a black and white dog in a field, I need to pay a lot of attention and get pretty close to realize it’s not a cow. The dog practically needs to bark in my face.
That’s why the brain is a storyteller that only speaks in first person: it’s making up the story. We’re not machines that observe and analyze—we’re narrators inventing a story and stubbornly trying to make reality match it. That’s why so many people think that if their view of the world doesn’t align with others’, it must mean everyone else is an idiot. The very idea that we’re impartial and accurate observers of reality was itself... a story!
That’s also why we struggle to see the world from someone else’s shoes. No matter how hard we try, we can’t truly grasp that at this very moment—right now, as you read these lines—there’s a woman giving birth in Beijing, a frightened child with a broken leg in Dakar, and an old man taking his last breath in Lima. We can only tell the world through our own story. It’s not you, it’s me. It’s always me and the story I’m telling myself in my head.
For a relatively brief stretch of history, the period between the rise of the printing press and the explosion of social media, mass media managed to create a shared stage for all our stories. We all watched the same TV shows, so even if our personal narratives differed, they played out in more or less the same setting. That was the consensus.
But the rise of algorithms that tailor content to each user based on their interests has blown that common stage apart. Today, every person watches a different “TV,” often completely different from what others around them are seeing. That’s why it’s so hard to capture the attention of large groups of people at once. Politics, the economy, opinion leaders, advertisers, and media outlets no longer have the public’s undivided attention.
The result? Increasingly fragmented parliaments, and more and more frequent conflicts in places where people with different worldviews converge—like parent groups or neighborhood communities. Hence the fierce battles over giving kids sugar, the brutal Twitter wars, and the misnamed “polarization.”
And it’s this unprecedented transformation of the human experience—much more than AI hype, the metaverse, or other buzzwords—that’s poised to shake the foundations of our society.
Is all of this a catastrophe? Only for those who, until now, held a monopoly on mass communication. For everyone else, it may well be a form of emancipation. But it’s true that in the meantime, we risk losing other valuable things that relied on that shared consensus—like democracy.
The way to prevent this is to take part in the global conversation that the world is having—but to remind each other of what unites us, rather than what separates us.
Photo by Isaac Benhesed on Unsplash