
Want to Reclaim Your Attention? Establish a Ritual



Social media and news algorithms optimize for engagement and revenue, not human care or shared well-being.

Personalized feeds can fracture shared reality, weakening common norms and trust.

Belonging depends on shared attention over time, not private, individualized streams.

Platform tweaks may help, but belonging cannot be outsourced to better code.

In February, Meta—the parent company of Facebook and Instagram—unveiled “Dear Algo,” an AI-based feature that lets users “talk” to the algorithm that decides what appears in our personalized online worlds.

It’s the latest of several high-profile efforts to “humanize” the algorithms that govern so much of our lives these days. With features like these, tech giants are responding to a widespread feeling: that we’ve lost our sense of agency online. Market forces and opaque formulas decide what we notice, what we miss, and what we carry around in our thoughts and feelings throughout our days. These algorithms determine our news and information diets, our shopping and music preferences, dating matches, and even the emotional tone and tenor of our political discourse.

It’s good news that tech companies are beginning to recognize that there’s a problem. Still, is it really possible to “humanize” an algorithm?

The issue isn’t just what social media platforms are (or aren’t) showing us. The deeper question is what these systems do to our attention—and therefore to our lives.

The Real Purpose Algorithms Serve

It’s easy to think of algorithms as impossibly complex, even mysterious. Yet, in essence, they’re simple decision rules. They’re sets of instructions that sort and rank what you see next based on what the system predicts will keep you watching, clicking, liking, or sharing.

And they serve a purpose. These “decision rules” filter information to offer the convenience of fewer choices, faster results, and less effort. The trouble is that we often relate to these systems as if they’re objective and responsible. Sometimes we implicitly think they’re looking out for our best interests. In reality, they’re designed to serve financial incentives, and they’re inherently incapable of anything approaching real human care.

The true cost of algorithms isn’t just to our agency in making online decisions. They exact a serious toll on our experience of belonging in a community and a society. That’s because algorithms monopolize the basic currency of human connection: our attention.

Attention is the exercise of our free will—it’s where we allocate our mind, our time, our effort, our care. Yes, algorithms can amplify bias, reward outrage, and distort what feels true or common. Yet the bigger challenge, I’d argue, is simpler and more intimate: Many systems are built to keep attention “always on,” using novelty loops, autoplay, infinite scroll, and constant notifications.

You probably know the feeling. You open an app to check something quickly, maybe just to see a new notification. Within minutes, you’ve absorbed a fear-inducing headline, a perfect family photo, a political attack, a dreamy vacation ad—all designed to provoke feelings that keep you scrolling. Half an hour disappears. Time feels like a blur. You feel overstimulated but somehow undernourished—less interested in real life.

Algorithms don’t just personalize news and entertainment. They often divide our realities. Two people can live in the same neighbourhood, walk the same streets—and still feel as if they’re living in different countries, because they’ve been fed different values, different villains, different ideas about the common fact base and the common good. Community becomes a form of content—something we consume rather than something we build and maintain, or something to which we are accountable. Compare all of that to spending time in a real gathering: a neighbourhood walk, a choir rehearsal, a block party, a sweet dinner with friends.


What we attend to becomes what we value. And what we attend to together becomes what we share—our norms, our moral commitments, our sense of “us.” Belonging requires shared attention over time, as well as real encounters that build trust, mutual responsibility, and care.

How Rituals Can Help You Lessen the Pull of Algorithms

So, what’s the opposite of an algorithm? In a word: a ritual.

Ritual doesn’t have to mean burning incense or wearing ancient uniforms. A ritual is something that brings us together in shared attention for a set period of time. It’s something meaningful that we do together. The walk, the choir practice, the block party—those can all be rituals.

Think of a ritual as a structure for sharing our attention. The Korean philosopher Byung-Chul Han describes rituals as techniques that help us “feel at home in the world.” He explains that they make time feel less like a blur and more like “a place we can actually live inside.”

Algorithms, in contrast, lack “stopping cues.” One short video transitions into the next, and our attention slides forward, often without our ever making a conscious decision to be there.

Algorithms are on-demand, individualized, frictionless, and open-ended. Rituals are scheduled, shared, embodied, repetitive. Algorithms individualize attention: each person is fed a different, computer-generated world. Rituals socialize attention. We turn toward the same thing, at the same time, in the same place—so that a meaningful sense of “us” can emerge.

That’s why I believe that the antidote to the algorithm problem isn’t to smash our iPads or delete our Instagram accounts. It’s to adopt rituals that defend shared attention.

You might start small: taking part in a book club, cooking with a loved one, signing up for a monthly volunteer shift. You can make a tech boundary a ritual, too—for example, one hour each evening when the whole family’s phones go into a designated drawer.

Innovations from tech companies, like the one Meta recently rolled out, could potentially ease the attention crisis and make our social media feeds more sane and sensible. That’s certainly welcome. Yet these kinds of tweaks can’t rebuild our common fact base or restore the experience of being held in community.


© Psychology Today