BPUR · March 18, 2026 · 4 min read

Debugging My Own Bias

By Oche

Chariots by Paper Route

There is a strange confidence that comes with believing you are right.

Not the quiet kind of confidence that comes from careful thinking.
The louder kind.

The kind where your conclusion feels so obvious that you begin to wonder how anyone could possibly see the world differently.

I have experienced that feeling more than once.

And each time it felt convincing.

Until it didn’t.

At some point in life, almost everyone has a moment where they realize something uncomfortable:

their internal logic was bugged.

Not maliciously.

Not because they were trying to deceive anyone.

But because the information they were consuming had quietly shaped the way they saw the world.

Bias rarely announces itself.

It doesn’t say, “Hello, I am about to distort your thinking.”

Instead, it builds slowly through repetition.

Through headlines.
Through conversations.
Through the media environments we spend time in.
Through the communities we identify with.

Over time, certain narratives become familiar.

And familiarity has a strange power.

The more often you hear something, the more reasonable it begins to feel.

Even if you have never seriously examined it.

This is why one of the most interesting scenes in the film Focus has nothing to do with romance or crime.

It is about attention.

In the scene, a man is constantly shown the number 77 throughout the day.

Not directly.

Just subtly.

On signs.
On clothing.
On small details in the environment.

By the time he has to make a decision, his mind has already been gently trained to notice that number.

So when he is asked to choose, he confidently picks 77.

He believes it is his choice.

But the choice was quietly engineered long before he realized it.

That scene is about confidence tricks.

But it is also a surprisingly good metaphor for how bias works.

Because most of us like to believe our opinions are purely the result of independent thinking.

We imagine ourselves carefully evaluating information and arriving at conclusions through reason.

But in reality, our minds are constantly being nudged by what we repeatedly see.

The news sources we follow.
The social media feeds we scroll through.
The conversations we have with people who already agree with us.

When you hear the same perspective again and again, it starts to feel like common sense.

Not because it is necessarily true.

But because it is familiar.

And familiarity can easily disguise itself as certainty.

That realization is uncomfortable.

Because it forces you to admit something most people resist:

You are not immune to influence.

No one is.

Even people who pride themselves on being “independent thinkers” are still shaped by the environments they inhabit.

The question is not whether influence exists.

The question is whether you are aware of it.

And awareness often begins with a moment of doubt.

A moment when you encounter an idea from the other side and, instead of dismissing it immediately, you pause.

You ask yourself a difficult question:

What if my internal reasoning contains a bug?

In software, debugging means examining a system to find the errors in its logic.

In life, it requires something similar.

Looking closely at your assumptions.

Tracing them back to their sources.

Asking where they came from.

And sometimes discovering that the conclusion you felt certain about was built on incomplete information or selective exposure.
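The analogy can be made literal with a toy sketch. Here is a hypothetical example of mine (the function names and scenario are not from this post): a confidence score that quietly filters its input before measuring agreement. The bug is not in the arithmetic; it is in the selective exposure one line earlier.

```python
def biased_confidence(evidence, prior):
    """Buggy: filters out disagreeing evidence before measuring agreement."""
    seen = [e for e in evidence if e == prior]  # selective exposure: the bug
    return len(seen) / max(len(seen), 1)        # always reports full certainty

def debugged_confidence(evidence, prior):
    """Fixed: measures agreement against everything, not just what confirms."""
    return sum(e == prior for e in evidence) / len(evidence)

evidence = ["A", "B", "A", "B", "B"]
print(biased_confidence(evidence, "A"))    # 1.0 -- feels certain
print(debugged_confidence(evidence, "A"))  # 0.4 -- the fuller picture
```

The buggy version never disagrees with the prior, because disagreement was discarded before it could be counted, which is roughly what a narrowed information diet does to a mind.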

This does not mean every opposing viewpoint is correct.

But it does mean that certainty deserves to be questioned.

Because certainty is often where bias hides most comfortably.

The world today makes this harder than it used to be.

Modern media environments are designed to reinforce beliefs, not challenge them.

Algorithms learn what you like.

They show you more of it.

Gradually your information environment becomes narrower.

Your ideas feel confirmed.

Your conclusions feel validated.

And slowly, without realizing it, your mental world begins to resemble that scene from Focus.

You start seeing 77 everywhere.

Not because it is the only number that exists.

But because it is the number your attention has been trained to notice.

That realization changed something for me.

It made me less confident in my first reactions.

Less comfortable dismissing people who see things differently.

And more curious about perspectives that initially feel wrong.

Because sometimes the difference between being correct and being biased is simply the difference between seeing more of the picture and seeing only the part that was repeatedly placed in front of you.

Debugging your own thinking is uncomfortable work.

It means admitting that the mind you trust most — your own — is capable of subtle errors.

But it also leads to something valuable.

Humility.

The humility to say:

I might be wrong.

And the willingness to examine the system again until the logic makes sense.
