August 27, 2024
When I reflect on what truly matters most in this world, I often think about its opposite: what does not matter? What will I regret? I find it very useful to view life through a regret-minimization lens. If you pay attention, you'll find that the majority of your regrets won't come from the things you've done, but rather from the things you haven't.
Leaps of faith never taken: acts of omission, born from indecisiveness masquerading as 'just keeping one's options open.' We pretend to preserve a decision for another day, failing to realize that by doing so, we have already made the decision today. Unchosen paths become our heaviest burden, captured perfectly by John Greenleaf Whittier:
"Of all words of tongue and pen, none are sadder than these: 'it could have been.'"
The most fascinating recent research I found on this topic was conducted by Bronnie Ware, a palliative care nurse who spent years caring for patients in the final weeks of their lives and asked them about their regrets. She compiled the following list, ranked from most to least common:
1. I wish I'd had the courage to live a life true to myself, not the life others expected of me.
2. I wish I hadn't worked so hard.
3. I wish I'd had the courage to express my feelings.
4. I wish I had stayed in touch with my friends.
5. I wish that I had let myself be happier.
I would very much like to avoid all of these, but if I am being honest, I can also see myself making some of these mistakes nonetheless. Overall, I am prudently preparing for the longest of time horizons. Still, at least three out of five scare me more than I’d like to admit. That said, I very much appreciate Paul Graham’s practical idea of putting them at the top of my to-do list.
These are the regrets of our elders today. However, just as when you listen to your parents’ advice on how to succeed, the reality is that, at best, you’re operating on 30-year-old, outdated software. You would be wise to rework the formula for your own era, which brings me to my next point:
What are the most devastating regrets that will torment my generation 60 years from now?
For a new generation, I believe a new widespread regret is inevitable. For this claim to hold, however, two things must be true: the regret must be predictable enough that a large share of people will be affected by it through inaction, yet non-intuitive enough that most will also fail to counteract it with action personally. What I suspect this new regret will be is rooted in a remarkable observation from 1890 by William James, recounted in Tim Wu's The Attention Merchants:
"We must reflect that, when we reach the end of our days, our life experience will equal what we have paid attention to, whether by choice or default."
I believe the following regret will not just be an addition to the list; it will be its frontrunner. The most dreaded regret of my whole generation will be: "I wish I had steered the spotlight of my attention more by choice, rather than by default."
The root cause behind this new regret is an economic phenomenon: the introduction of a seemingly impossible business model in which something once paid for at full price is suddenly sold dirt cheap, below cost, or even offered entirely free of charge. On the surface, everything appears fine. The quality appears intact, demand surges, and the architects behind this business are momentarily celebrated as visionaries. Yet, beneath the surface, history tells a different story: a warning about the hidden costs of what Tim Wu calls the attention economy.
The attention economy is a pervasive force that has corrupted industries for nearly two centuries. Interesting examples include the news industry, the Great Poster Problem in Paris, and the rise of passive entertainment consumption through television.
Benjamin Day is most famously known for founding The Sun newspaper in New York City on September 3, 1833. The Sun was one of the first successful “penny press” newspapers. Day priced his newspaper at just one cent per copy, significantly cheaper than other newspapers of the time, which usually sold for around six cents. This unheard-of price point made seemingly no economic sense, but it rendered The Sun dramatically more affordable and significantly broadened its readership to the working class. The Sun was somehow being sold below the marginal cost of production. How was this possible?
Instead of selling newspapers directly to readers, Day implemented a business model that was novel for the times: reselling readers’ attention indirectly to advertisers. This marked a significant departure, shifting revenue away from subscription sales toward advertising and giving rise to a new economic imperative Day could hardly have foreseen. By entering the business of reselling attention to advertisers, he became chained to the need not only to capture readers’ attention but, more importantly, to hold it indefinitely. What could go wrong?
Under increasing competitive pressure, in August 1835, The Sun published a series of sensational articles claiming that Sir John Herschel, a famous astronomer, had discovered life on the Moon using a powerful telescope. The articles described elaborate lunar landscapes inhabited by strange creatures, including bat-like humanoids and other fantastical beings. This led to a massive surge in sales. There was only one problem: everything was a lie. The falsehood spread incredibly far and wide, fueled by the self-reinforcing nature of social proof and the elaborate pseudo-scientific effort that went into making it look like real journalism. Many point to The Great Moon Hoax as the birth of sensational journalism, the erosion of the news industry, and the seed of the devastating post-truth era of “alternative facts” we face today.
A similar pattern emerged in late 19th-century Paris, where the development of lithography led to an overwhelming proliferation of advertising posters. This new technology enabled the mass production of colorful, high-quality images at relatively low cost. Without regulations to protect the city’s architectural integrity, public spaces became bombarded with advertisement posters covering buildings, fences, walls, kiosks, and even street furniture like lampposts and benches. Companies engaged in advertising wars, aggressively pasting over competitors’ posters in high-traffic areas. This littered the city’s beautiful streets with layers of torn, weathered paper, leaving public spaces disorganized, dirty, and cluttered, and leaving Parisians feeling they had lost their city’s beauty.
Lastly, the attention economy found a new medium. The major American broadcasting networks’ new programming could now be freely accessed, provided one purchased the receiver: a television set. By the mid-1950s, more than half of American households had one, and families would gather on the couch at dinner time to watch programs in silence in the living room. When asked, people reliably underestimate how much television they watch; when tracked, the average American household watches 4 to 5 hours per day, totaling around 1,770 hours annually, or approximately 74 continuous days of watching each year.
The incentives established by the attention economy in television are pervasive: episode storylines are artificially dragged out; unnecessary and unsatisfying cliffhangers are inserted to hook viewers into the next episode. Insightful documentaries and cinematic storytelling make way for programming that appeals to the lowest parts of ourselves in pursuit of ever-larger audiences. In every hour of content, 10 to 20 minutes are dedicated to advertisements, introduced with the phrase, “We will be right back after these messages from our sponsors.” What is most lamentable is that Hungarian-American psychologist Mihaly Csikszentmihalyi, among many others, found that people self-report this time as, on average, best described as “uneventful” or “mildly depressing.”
The pattern is always the same: instead of selling to end users directly, those who follow the attention economy model make a big splash by flaunting their magical ability to offer something for seemingly "free." They insist "everything you love stays the same" while boasting that this will be revolutionary, world-changing, humanistic. Proclaiming, perhaps with breathtaking hubris: "Our company was not originally created to be a company. It was built to accomplish a social mission—to make the world more open and connected."
Yet quietly, of course, these companies now tailor their entire operations toward servicing a new master: the advertiser. At first, on the surface, things might appear untouched, perhaps even improved by a sudden reduction or outright elimination of cost. However, looking at the history of American journalism, French architecture, and global entertainment, a clear pattern emerges. Shifting the customer base, and thus the revenue source, creates perverse incentives that corrupt the service. This creates the illusion of a gain, only to reveal a deeper loss of something at first difficult to define, something we failed to appreciate until we lost it. This separation disrupts the balance between supply and demand, leads to systematic pricing errors, and, in extreme cases, causes entire market failures. Ultimately, whatever you incentivize is what happens. So one is left to ask: is there no better way forward? The answer most likely sits in your pocket.
There is a reason why consumer-facing products, like those from Apple, reliably delight millions by offering new capabilities, conveniences, and noticeable sparks of joy through their premium experiential goods: they are massively incentivized to do so, generating $391 billion in annual revenue as of 2024, to be exact. This alignment exists because their customers and their end-users are one and the same, an arrangement we've taken for granted over the last two decades.
Now contrast that with how you avoid enterprise software like the plague. It makes you feel overwhelmed, actually distracts you from building things people want, and induces a state of mind that would make you eligible for prescription drugs. This, again, is because they are incentivized to do so. Their end-users are not their customers; their customers are the managerial class, who make decisions based on whether the quoted pricing looks low enough, or the list of supported API integrations long enough, to minimize their chances of being blamed by superiors. These feedback loops might make for good Excel sheets, but they will not make for great products.
Let’s take this example of the misaligned incentives of enterprise software a step further. Imagine you’re running a successful large-scale business, and a salesperson calls you with an ‘offer you can’t refuse.’ You are among the select few granted early access to their brand-new project management tool. ‘But wait, there is more.’ This particular tool miraculously comes free of charge and is ominously called ADvantage. You would hang up so fast your phone wouldn’t even have time to register the call in its history. No responsible person would consider running their business this way. Now, imagine a tool that you would use to manage something even more important than your business: your life. Would you pick up the phone?
Over the last two decades, the attention economy has unfortunately become the dominant business model behind the free and open Internet, the news, and our everyday toolset: products masquerading as tools meant to support our intentions, while their ulterior motive is to capture and hold our attention. Nearly every individual with internet access is a social media user; more than 4 billion people as of this writing, or over half the species, if you will.
No technology has ever gripped us so completely, spread so far, so fast, across so many. With such unprecedented scale and speed comes a catastrophe to match it, one that mirrors another existential crisis of similar magnitude we are currently tackling: climate change, a useful analogy I will end on in order to understand this crisis better.
Why has the attention economy's scale become so grotesque? Three forces emerged in sequence, each new one amplifying the last. First came the social pull of same-side network effects, where each new user made the platform ever more appealing to the next. Then came the technical enabler of ubiquitous access: smartphones and high-speed cellular internet transformed occasional desktop browsing into anytime-anywhere-anything connectivity, driving the transition to mobile-first in 2012. It is important to remember that this was a full 8 years after the launch on desktop, where usage was naturally contained and therefore kept within healthier boundaries.
Then, most crucially, with this unprecedented scale, the economic imperative of monetization kicked in, and it is here that things went astray; here that the thing 3 billion people had welcomed into their lives got corrupted. What exactly happened? Imagine for a moment that you are an executive tasked with profit maximization when you stumble upon an inconvenient realization: the slice of time users will tolerate ads is fairly fixed, at around 15-20%. The only way to increase revenue from that slice is to increase the size of the whole pie; hence, increasing overall screen time became an obsession, pursued, as we can now see so clearly, at all costs.
Setting aside whether increasing time spent was ever a net positive, or even healthy, increasing it is exactly what transpired next. We started at 6 hours per month, or 12 minutes per day, in 2010, then went to 20 minutes in 2012, to half an hour by 2015, and to approximately four hours by 2020 for Gen-Z: a staggering 20x increase in time spent in just one decade. Today, the average American spends more time in front of screens than sleeping. The average Gen-Z spends a total of 8 hours and 54 minutes on screens daily. That comes to roughly 3,250 hours per year, amounting to approximately 24.42 years over a lifetime. Ask yourself, what percentage of this time is truly time well spent?
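For readers who like to check the math, here is a minimal back-of-the-envelope sketch, in Python, of how these figures fit together. The 66-year span lived at the Gen-Z rate is my own illustrative assumption, not a sourced number.

# Back-of-the-envelope check of the screen-time figures above.
# Assumption (mine, for illustration only): roughly 66 years of life
# lived at the Gen-Z rate, with a 365-day year.
minutes_per_day_2010 = 6 * 60 / 30    # 6 hours/month is about 12 minutes/day
minutes_per_day_2020 = 4 * 60         # roughly 4 hours/day for Gen-Z by 2020
growth = minutes_per_day_2020 / minutes_per_day_2010  # about a 20x increase in one decade

genz_daily_hours = 8 + 54 / 60        # 8 hours and 54 minutes
annual_hours = genz_daily_hours * 365 # about 3,250 hours per year
years_of_use = 66                     # hypothetical span lived at that rate
lifetime_years = annual_hours * years_of_use / (24 * 365)  # about 24.5 years spent on screens

print(f"{growth:.0f}x increase, {annual_hours:,.0f} hours/year, "
      f"{lifetime_years:.1f} years over a lifetime")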
Hence, neuroscientist Adam Gazzaley remarks in his book The Distracted Mind that most of us inhabit an information-saturated world in which no one can keep up with the increasing volume and velocity of modern life. Even saying this aloud feels like a fish describing the world to its fellow fish, only to be greeted with the question, 'Wait, what is water?'
Herbert A. Simon, a Nobel Prize-winning economist and cognitive psychologist, commented on the unexpected side effect of information overabundance all the way back in 1971, before the existence of personal computers. At first glance, one might intuitively think that the wealth of information offered by, say, the internet would lead to something akin to a second scientific revolution. This belief was widespread among early internet proponents, who envisioned an inevitable explosion of wisdom among the populace. But history tells a different story, in line with Herbert A. Simon's famous warning: "In an information-rich world, the wealth of information means a dearth of something else: a scarcity of whatever it is that information consumes. What information consumes is rather obvious: it consumes the attention of its recipients. Hence, a wealth of information creates a poverty of attention."
Even Simon couldn’t have foreseen the scale at which this scarce commodity of human attention would be commercialized, and hence so systematically exploited, in the 21st century. The omnipresent attention economy has fundamentally rewired how human attention operates on a global scale, obliterating the endogenous attention that flows internally from our curiosity, values, and goals, and replacing it with exogenous, stimuli-driven, bottom-up attention in which outside forces continuously manipulate us for the entirety of our lives. To what end? To sell fucking eyeballs to advertisers, of course.
Returning to our climate change comparison: through an unforeseen negative externality, the fossil fuel economy depleted a commodity once so abundant we were blind to its value, namely the carbon-absorbing capacity of our atmosphere and oceans. It is a classic economics-textbook example of the tragedy of the commons. The attention economy follows the same pattern. We gained something, but simultaneously lost something invaluable along the way, something we could barely articulate until we had almost completely lost it. What undervalued, sacred, shared resources does the attention economy exploit today? Our human attention, our relationships, and our collective trust in one another and in society at large. The parallel is clear: just as we cannot afford to ravage our planet's environment, the runaway effects of an unchecked attention economy that depletes these three will see civilization come undone.
In a society more materially rich than our ancestors could ever have dreamed of, the data today point to a decline in well-being indicators across the board. Technology is miraculous. The internet is amazing. But social media, arguably functioning as the default operating system of most individuals' lives, has been a catastrophic mistake. I am more than willing to accept that if the supply of cool and interesting content increases, it follows that we would allocate more of our time to it. If, for example, those who were most engaged with online content experienced greater well-being, built better relationships, or found greater purpose and direction, it would make for an interesting discussion. Unfortunately, the reality is that, respectively, they do not, they have not, and the reverse is true.
“Each medium, like language itself, makes possible a unique mode of discourse by providing a new orientation for thought, for expression, for sensibility”, Neil Postman writes in Amusing Ourselves to Death. “[…] to take a simple example of what this means, consider the primitive technology of smoke signals. While I do not know exactly what content was once carried in the smoke signals of American Indians, I can safely guess that it did not include philosophical arguments.”
When used exactly as intended, the most avid social media users seeking connection end up feeling lonelier; those who follow the news become verifiably less informed; and those who look for role models are left to follow ‘influencers’, a nonsensical term from an era of confused individuals that no sane person should legitimize. Instead of exercising character, becoming valuable members of their communities, and helping their fellow human beings through their labor, these so-called 'influencers' primarily succeed in promoting pettiness, self-obsession, and get-rich-quick schemes. They lure young men into thinking that investing in yourself means betting on altcoins, while young women are constantly seduced by the thought that they could make millions if only they reduced themselves to a collection of objectified body parts for others to jerk off to. And worst of all, of course, it gave us Logan Paul.
As the Center for Humane Technology (CHT) stresses, despite what tech moguls claim, technology is not neutral at all. Because of the attention economy, today’s communication technology is not designed to foster understanding, nor is our information technology in the business of informing us. Instead, these technologies are crafted to capture and hold our attention indefinitely. Put bluntly, the internet in general, and social media in particular, have essentially become the technological narcotics of the 21st century.
People are working tirelessly to distract us, making each moment spent online more addictive than the last. Even worse, we are increasingly living behind screens, wearing our mobile devices like gloves and carrying them with us from cradle to grave. As a result, the world is becoming an increasingly addictive place, one for which we have yet to develop the antibodies needed to protect ourselves. We are now left to fight for our human attention, volition, and self-determination, 24/7/365, against algorithms that continually improve and never sleep. This is why I fear that, at the end of their lives, the most widely shared regret among a new generation will be the following:
"I wish I had steered the spotlight of my attention more by choice, rather than by default."
Inevitably, the majority of people who grew up in the 21st century will come to the devastating realization that their self-authorship was taken away from them, cradle to grave. That the operating system with which they lived was never on their team, but on someone else's. That, for all the things to which they could have devoted their lives, they have only succeeded—as Neil Postman put it—in "amusing themselves to death." At the end of their days, they must reflect and come to the agonizing truth: they paid the ultimate price by paying no attention at all.
The path forward is clear: we must move away from the attention economy as fast as possible. Discover how Telos plans to accelerate this transition by offering a viable alternative on our Mission page.
Below you can read the story of Samantha, a teenage girl who is part of the first generation raised by social media, and the pernicious effects it has had on her and her friendships.
Learn more about Telos and our story.