Joel T. Sanders

What's Our Plan to Overcome The Infobesity Epidemic?

Updated: Feb 28, 2021


I'm not the first person to have noticed the similarities between the epidemic of obesity and the epidemic of information overload.


The word "infobesity" was coined in 2014 to mean "an excess of information, especially when this makes it more difficult or impossible to make rational decisions."


We overconsume carby, fatty foods because of their ever-present availability and very low cost. We overconsume information for the same reasons.


Inside and outside of work, the way we encounter, process, and interact with information is akin to constantly snacking on the worst kinds of junk foods. Our email inbox is like a bag of irresistible potato chips: It's always at our side and we're constantly nibbling. Slack, Instagram, the news and similar snippets of information are the cookies, donuts, and candy bars we can't put down.


Alas, our poor diet and lack of exercise are making us fat, and our ironically named "smart" phones are, in a very real way, making us dumb.(1)


Unfortunately, knowledge of the problem in and of itself is of no help. Governments, businesses, and desperate consumers have thrown trillions of dollars at the obesity epidemic, and it's only getting worse. Everybody knows they need to eat better and exercise more. But we can't get ourselves to do it.


That's why I doubt that a growing awareness of the distraction caused by our computers and "smart" phones will stem the infobesity epidemic. Deep, focused thinking on hard problems is the antidote to the fractured workflows brought about by email. But deep, focused thinking is hard, just as training for a triathlon is hard. Very few of us have the discipline to pull it off, so we return to our email inboxes and life as usual.


Are Answers To Our Infobesity Challenges Lurking in Our Historical Past?

I have a hunch that clues to the way forward may exist in our historical past. I've written several times on simplifying tools for knowledge work, such as returning to thinking practices involving paper and pen. I'm also helping clients build systems, templates, and processes for coordinating knowledge work that, by their nature, assist with deep focused thinking and asynchronous collaboration outside of email.


But beyond these practices, I suspect that there could be more profound answers from those who actually helped to bring about the information age—specifically the researchers, academics, and business leaders who were the dissenters of the world as we've presently constructed it. Google, Instagram, and email, after all, were not inevitable consequences of technological determinism; they are the consequences of those companies and products that won the war for our attention. As they say, history is written by those who win wars, not by those who lose them.


Psychedelics: An Example of Buried Research That's Promising Once Again


As evidence that breakthrough thinking may be lurking in our historical past, I point to the renewed research into psychedelics. Michael Pollan helped bring public awareness to the new science of psychedelics in his fascinating book How to Change Your Mind.


In the 1950s and 1960s, research into the potential of LSD and other psychedelics was widespread and enthusiastic. And for good cause: psychedelics showed profound promise as treatments for smoking, alcoholism, depression, and other serious psychiatric illnesses. Well-funded teams at universities around the world were contributing to a growing body of promising academic work.


All of this came to a screeching halt as governments became increasingly frightened of the powerful effects of these substances. Parents, too, became fearful as young people, artists, hippies, and others on the cultural fringe recounted otherworldly, out-of-body experiences. Public service campaigns spread frightening stories, mostly untrue, of teens hurling themselves to their deaths because they "thought they could fly." By the early 1970s, funding had dried up completely and the research was buried, almost lost to history.


Yet research into psychedelics is once again making its way into such revered institutions as Johns Hopkins University, Stanford, and other top-flight research centers in the U.S. and around the world, due to a convergence of social and political opportunity and a well-crafted strategy by a group of committed activists.


I'll let you read Pollan's excellent book for the rest of the story. My point is that all of the new psychedelics researchers are relying on foundational work produced as far back as the 1930s and 40s. Might there be analogs for information technology and business process management?


Does Using Google As a Primary Research Tool Lead to Uncreative Answers?


Several weeks ago as I began exploring this line of thinking, I predictably came up short researching directly on Google and Google Scholar. After all, Google was on the winning side of the information war.


The use of Google as a primary research tool has created an overwhelming recency bias, favoring newer citations over research from past decades. While there's plenty of available research on the challenges we face as a society living with constant distraction, there's very little offering answers in the form of tools and business processes. Much of what I came across was the same clickbait titles regurgitating the same closed-loop concepts and citing the same sources.


Thus, my strategy shifted to books, and the references in those books. Of particular interest to me are the older research papers that might leave clues to the dissenters who lost the fight. There were, after all, competing information theories.


Information Without Meaning or Context is Junk Information


One interesting scientist from the early years of information science, and one who was on the losing side of history as I understand it, is Donald MacKay.


MacKay was a British physicist who suggested that the concept of "meaning" should be fundamental to information theory. His theory was intriguing but created a profoundly difficult problem: it brought the realm of human subjectivity into the ostensibly objective world of information science. The difficulties of incorporating MacKay's ideas proved too much for early computing to overcome, and his theory became something of a dead end at the time.


I don't pretend to know whether MacKay's ideas have since been attempted in information science. The cursory evidence says no, considering the overwhelming amount of information online that is divorced from any kind of meaning at all. Out-of-context "clickbait" snippets are a winning strategy online, perhaps THE winning strategy.


The world we live in treats humans as mere nodes on the great information highway, easily automated away and divorced from what's seen as the superior work of computers and artificial intelligence. Instead of being served by our tools, more often than not we are the product that feeds computer algorithms.


MacKay could be a theorist from our past who had answers we didn't pursue at the time but could pursue now. His ideas on information requiring depth and meaning came to light in the wake of World War II. Viktor Frankl's Man's Search for Meaning was published in 1946, so a theory of information with humanity at its core is more than understandable.


The fact that the problem of meaning was too difficult to tackle at the time, due to technological limitations, is no reason to consign these theories to the dustbin of history today.


Searching the Fringes to Compose New Ways of Working

My greater point is that breakthrough thinking most often comes from the fringes. It requires dogged persistence and disciplined, sustained concentration over hours, days, months, and even years.


Thus, I'm looking to books and research publications from 2010 and earlier, even into the late 19th century, for clues to alternative ways of working and a more human-centered future.


I've been searching not only within the information technology and business operations literature but also on the edges, in the realm of cultural and literary criticism. In that vein, I recently came across a book published in 1999 that dives deep into the theorizing that brought about our present information age, viewed through a multidisciplinary lens. I won't yet share the title, as I'm only getting started, but the early pages are promising. (If you're wondering, this is where I learned about Donald MacKay.)


My present line of thought may be a dead-end, but that's OK. On the edges of this line of thinking are more paths to potentially pursue, nicely tucked into the bibliographies of books published at the dawn of, and before, the unleashing of the World Wide Web. It's within the journey itself that the adventure lies, where we encounter and bathe ourselves in subjective meaning—not in a Google search that turns up instant answers.


At the very least, my iPhone is tucked away on airplane mode in another room, leaving me here to focus on writing what's become a somewhat lengthy blog post on a Saturday afternoon.


A post, by the way, that I struggled with mightily and that didn't fully come together until after a 40-minute nap. What might that teach us about cognition and focus? Do businesses need to consider defocusing activities like naps, recess, or game time as fundamental strategies that enable breakthrough focus work?


We have to be open to anything. Business life as we've constructed it for knowledge workers is a world full of anxiety and exhausting multitasking. It's making us depressed and leaving us feeling we have no agency over our own lives.


This is only made worse in our downtime, when all we can think to do is reconnect to the internet and take in new fragments of disconnected junk news, or snippets from our relatives and friends, devoid of any context, depth, or meaning.


The way forward just might be in looking to our past.



(1) For a summary of at least 12 studies making the case that our so-called "smart" phones are making us anxious, unfriendly, and dumb, see the afterword to the 2020 edition of Nicholas Carr's excellent book, The Shallows: What the Internet Is Doing to Our Brains.



