Reclaiming Reality

Doctoring and Discipleship in a Hyperconnected Age

Have we counted the cost?

While the many benefits of smartphones and the digital revolution they represent reveal themselves readily, I fear we fail to fully appreciate the toll they take.

My concerns echo those of past generations. Something about humanity’s indomitable drive “to strive, to seek, to find, and not to yield”1 has shepherded into the world a ceaseless cycle of technological revolutions. With each new wave of technology, some naysayers have bemoaned the passing era and looked with trepidation toward the future. Before the internet, we worried about the overpowering effects of television; in the early twentieth century, cultural critics lamented “talkies,” radio, and the emergence of “mass culture”; and long before that, philosophers and religionists fretted over the advent of the printed word and the end of memorizing our most important ideas.2

I am acutely aware of this history and that current concerns over the internet’s effect on society may seem like little more than a longing for a nonexistent golden yesterday. Still, I can’t shake the sense that society’s tectonic plates are moving beneath our feet in ways we will not fully appreciate for years, maybe decades. Some days it seems that “things are in the saddle, and ride mankind.”3 My persistent concerns persuade me to write them down.

But why should you care what I have to say?

Perhaps in part because I was born in 1980. This may seem a faint qualification, but hear me out. As a Xennial (not quite a Gen-Xer, not quite a Millennial), it’s as if I moved to the digital world while I was young, but aware. I’m a passable—even well-camouflaged—resident, but not really a native. I may seem to overstate the effect of my exact age, but sociologists and demographers have made a similar argument.4 My non-native discomfort keeps me keenly aware and grants me special insights into a culture I understand well but from which I will forever feel apart.

Beyond this, perhaps my strongest qualification is simply that the more I lean into the pursuits that matter most to me—evolving as a father and husband, doctoring, and discipleship—the more troubled I become. All around me I sense the effects of an infiltrating and nearly omnipresent technology that we often do not notice because it is our forest’s trees.

My experiences as a doctor have been particularly poignant in this regard. Facing down existential threats with my cancer patients brings me enormous satisfaction and adds great depth and meaning to my life. Doctoring is a deeply spiritual pursuit and an integral part of my Christian discipleship. In this sense, my professional and spiritual lives feed off each other—and I see the internet affecting them both.

Don’t get me wrong: the things my phone, in particular, does—and the speed and fluency with which it does them—stagger me. Without moving from my chair, I log into Facebook and look at photos of friends I have not seen for many years and watch birthday videos of a child born to a girl I taught in Mexico as a missionary. I watch my wife loop through the hills near our home in an app that tracks her training runs. I briefly log onto a webpage that contains the most up-to-date information on virtually every medical topic, and then I check my email to find an important message sent to me just two minutes ago by someone across the country and flick off an instant response. Later, my wife sends a video showing me our youngest son’s first steps, and I push a button on my phone and dictate an answer detailing my delight. At the same time, the nurse practitioner on my oncology team sends me a question about a chemotherapy calculation; I work out the answer on my phone and respond within moments.

Beyond even these magical abilities, the advent of the internet and widespread access to smartphones have unquestionably affected our lives in broader ways as well. The internet has shrunk the world and forever changed commerce. It has opened our eyes—often in real time—to corners of the globe that previously would have remained largely obscure to us. It has made citizens into reporters and allowed access to information in ways unimaginable even twenty years ago.

All this frequently leaves me feeling like I’ve slipped into the wizarding world of Harry Potter, where I hold a kind of magic in my hands. My smartphone tidily represents the technological transformation I have witnessed over twenty-five years—from plodding, earthbound, ugly computers to beautiful, sleek, and efficient technological marvels. My iPhone has become my constant companion and my handheld portal into an endless world of wonder, efficiency, and possibility.

And yet.

I sense, too, that this technology is changing me from the inside out. Neil Postman memorably argued—some thirty years ago, in Amusing Ourselves to Death—that Aldous Huxley’s Brave New World should worry Americans much more than 1984 because we are hardily independent and bristle at the slightest forcible attempt to withdraw our freedoms (à la Big Brother). Lull us to sleep, however, and the matter changes entirely. Ply us with comfort, convenience, and pleasure, and you can enwrap us in spider strings that, woven together, become strong enough to lead us wherever those wily enough to master such enticements want us to follow (see 2 Ne. 26:22).5

I fear that without noticing I may wake up one morning bound and mummified: a prisoner in my own Brave New World.

Part of me wonders, am I already there?

Part 1: Transformative Technology

Virtual Doctoring

I am not sure how concerned I should be, and I am not sure I want my patients to know, but having recognized it, I might as well say it: the internet now forms part of my brain.

I am a medical oncologist, which means I give chemotherapy to patients with cancer. Making appropriate and cutting-edge recommendations to my patients requires my staying abreast of an enormous, ever-changing body of medical literature. Keeping up with the constant flow of new information daunts me.

Consequently, I resort to the internet multiple times a day to fill in my knowledge gaps. Usually, this is a double-check. Sometimes, however, I simply don’t know—especially if the question lies outside my narrowly defined specialty. Many years ago, this situation would have required consultation with an enormous medical encyclopedia or, heaven forbid, going to a medical library to leaf through a stack of journals. Now, however, print journals seem superfluous, and I sometimes wonder why brick-and-mortar medical libraries exist at all. I simply pull up one of a few trusted medical websites, punch in the magic words, and—voilà!—the information I need appears.

What concerns me, or at least unnerves me, however, is the gnawing awareness that my relationship with online information is much more complicated and nuanced than it might at first appear. I wish I could believe that the things I need to look up online were encompassed in one tightly contained and contiguous area. Increasingly, however, I recognize it’s not really like that. More and more, the borders between the information in my physiological brain and that in my internet brain bleed into one another: sometimes I’m not sure which facts reside where.

When I was in medical school, I felt like I needed to know all the things. In retrospect, of course, I recognize the folly and hubris of thinking that would or could ever happen, but when the supervising physician on my team would pepper me with questions in front of a group of doctors, that was certainly how I felt. Compounding my insecurities, it seemed like everyone else on the team knew everything already anyway. When I didn’t, I felt a twinge of shame. Increasingly, however, I sense not only that I don’t know all of the things (that became glaringly obvious a long time ago), but that I’m not even really supposed to—at least not in the way I imagined ten years ago. Facts available in my internet brain, after all, don’t need to also reside in my physiological brain—do they?

Technology has begun to infiltrate not just what I know but how I know it. I sense that the technological portion of my brain has become like a symbiotic tumor that is slowly spreading fingerlike projections into my cerebral cortex. I doubt I could remove it if I wished. Stranger still, I don’t wish. I’m glad it’s there. I’m not sure I could fully function without it.

Well, you might counter, isn’t that all for the good? If medical literature is as complex and vast as you describe, Dr. Johnson, shouldn’t we be grateful that technology augments doctors’ brains to allow them to access the entirety of the data when making medical decisions? To this question, hesitantly, I answer yes. But even before the answer crosses my lips, it catches uncomfortably in my throat because I recognize that technology influences my doctoring in other ways too.

The internet also challenges my doctoring because it fractures my thinking. In hospitals where doctors are learning to doctor, “rounds” fill most mornings. Rounds are a complex didactic ritual in which doctors-in-training marshal all the information they have gleaned about a patient into a formal presentation that they deliver in front of a large group of medical professionals that includes other doctors-in-training of various classes as well as the “attending physician”—a senior doctor who leads the team and takes responsibility for the patient’s care. As you might imagine, this process can be deeply stressful and also immensely powerful for teaching young doctors. When I first began to “round” eleven years ago, the iPhone had not yet been invented, and its predecessors were poor enough that they did not seduce much attention. Now, of course, we live in the world of technological sirens like the iPhone X and the Google Pixel. As this technological evolution has unfurled, the very devices that so captivate us have increasingly and frustratingly inserted themselves into rounds, just as they have into almost all other classroom settings. It is now not uncommon to find medical students scrolling through various feeds while a doctor on the other side of the circle is presenting a patient, and many mornings the buzz of text messages and incoming calls punctuates the teaching process so frequently that it can be hard to proceed in a meaningful and linear fashion. Before I get ahead of myself, however, I hasten to admit I am the pot calling the kettle black. I recognize in myself that same fractured thinking: whereas ten years ago I could easily follow complex oral arguments (synthesizing a patient’s history or arguing for and against a particular treatment) for hours on end, doing so now requires greater sustained mental effort. I am accustomed to the online world, where I can and do jump back and forth endlessly between apps and information streams. Focusing on just one line of thought for hours is increasingly difficult.

Perhaps the effect that worries me the most, though, is not how the internet is changing our doctoring brains, but the insistent way the digital world pulls us apart from our patients. Increasingly, the patient herself is the last place many doctors look for important medical information—after all, everything I need to know is in the electronic medical record. When I care for a patient in the hospital, I can arrive in the morning, and within about seven minutes I can ascertain everything that happened to the patient overnight, the results of all scans and blood tests from the last twenty-four hours, every vital sign since I last saw the patient, the opinion of every other doctor caring for the patient, and every note from a nurse or other practitioner, all without ever doing something so prosaic as dialing a phone, calling a colleague, or actually seeing the patient. Indeed, perhaps we shouldn’t be surprised that this era has also seen the rise of the “virtual ICU,” where a health-care professional receives patient data remotely and largely manages patients’ care from afar.6

This consolidation of information dramatically increases our efficiency, but at a cost. One of the country’s best-regarded physicians captured this sense in his unforgettable essay, “Culture Shock,” ten years ago.7 In that piece, he described how, twenty years earlier, a doctor caring for patients in the hospital had spent virtually all her time with those patients. Increasingly, however, the embodied patient has faded into a secondary role, largely replaced by a digital avatar. Doctors in training now spend more time in front of computers and less time engaging with patients. When we make “rounds” (as described above), it becomes more and more of a chore to peel the young trainees away from their computer screens to “round” in the first place; after all, “everything that matters” seems to reside in the computer anyway. All of this has led to a startling irony—many patients admitted to the hospital see nurses, physical therapists, dieticians, and many other health-care practitioners frequently but are left wondering where all the doctors have gone.

This, again, causes me deep concern. Technology was supposed to augment our ability to care for patients by routinizing the busywork that previously kept us from them. In an existential sleight of hand that is both ironic and disturbing, however, instead of freeing us, technology demands increasingly more of doctors’ time.8 While causality would be virtually impossible to prove, I am nonetheless struck that the digital medical revolution just preceded a wave of doctorly stress, burnout, and disengagement.9 A profession that was once regarded by both the public and its practitioners as among the most noble of arts has recently seen diminishing public respect and a souring of its own doctors, with one recently and infamously labeling the practice of medicine “the most miserable profession.”10 Instead of carrying us to our patients, computers are carrying us away from them—we increasingly ignore the people in the beds to tend to the screens in our workrooms. Interacting with screens, it turns out—even if they are filled with important information—does not fulfill us doctors in the same way caring for people in beds does.

I was reminded of the potential seriousness of this toll on the very day I was preparing final edits to this essay. That afternoon, in the midst of a busy clinic, my team and I saw a woman with a serious cancer that had spread to her liver, lungs, and other organs. Diagnosed about a year earlier, she had subsequently received from us a sequence of chemotherapy drugs that had so far kept her cancer at bay. Recently, however, she had grown sicker, and we suspected the chemotherapy was no longer working. Two days before, she had undergone a CT scan; the day before, I had reviewed it and seen that it clearly demonstrated her tumor had continued growing in spite of the chemotherapy. That afternoon, we met in my office. We outlined the results of the scan, and, with the same unblinking stare with which she had viewed me every two weeks for the last year, she asked me what this meant. I explained that we had no further chemotherapy to offer.

And so there we sat, face-to-face, as tears began to brim over her eyelids and stream in rivulets down her cheeks.

What scene could more effectively underline the ultimate impotence of modern medicine? The drugs I have given her over the last year are really little more than carefully controlled poison, poison we hope will harm the cancer cells more than the healthy ones. And now even the poison would not work anymore. There was nothing more I could offer.

And yet, how untrue that is.

Because in that tearful moment, it was as if the world stopped spinning around us, and we sat, her hand in mine, eyes locked, in silence, as she cried. This is the moment that makes doctoring doctoring. The day may well come when my brain is all but replaced by a machine whose stores of knowledge will be vast and whose ability to sift through information to compose a coherent plan will far exceed mine. Already, we live in a world of iPatients and virtual ICUs. But none of that has taken or ever will take away this most fundamental of human and doctoring moments—the instant where we sit together, facing an unconquerable illness, and where I say to her: We are your doctors; we will always be here to care for you.

What we must ensure is that technology does not so alter medicine and the people who practice it that they become either unable or unavailable to engage in these crucial moments.

At the end of the day, then, what am I to make of the ways in which technology has changed me as a doctor? As with any transformative force, there is no easy answer. Technology has expanded my knowledge but shallowed my thinking. It has streamlined my work but lured me away from the very people to whom I need to attend. I fear it has made me more knowledgeable but less wise, more efficient but less present, more capable but less compassionate, more machine and less me.

Hyperconnected Discipleship

It is not just in my doctoring, however, that technology is changing me. I likewise worry that technology profoundly affects the way I live out other aspects of my Christian discipleship.

Part of this is a prioritization problem. One of the internet’s defining characteristics is its endless supply of what Elder Bednar called “digital distractions, diversions, and detours.”11 Even a person steering clear of sinister content can find his life consumed by the thick of ephemerally thin things. The problem is not a lack of meaningful information—you can just as easily access The Iliad or Shakespeare as you can BuzzFeed or 1,001 cat videos on YouTube. The problem is that the online universe is designed in ways that make the meaningful processing of long-form content more difficult. Multiple studies have shown that the vast majority of readers very rarely finish even a fairly simple online news article, let alone important long-form content that requires deep engagement over hours. Hyperlinks are the order of the day, and each click on one transports a reader to a different online world. Thus, the internet isn’t even content to allow us to peacefully peruse its own offerings—it is almost by definition a fractured and frenetic place where nearly constant pings, alerts, and interruptions intrude on whatever meaningful sustained engagement we might attempt there. It is as if the internet were a grocery store where the Doritos, Twinkies, and Swedish Fish are dispensed for free from bright bins just inside the door, while the fruits, vegetables, and whole grains are in the very back corner, hidden in an unmarked room.

Furthermore, the internet distracts us not only from the content we consume within its confines but also from the world around us; this sense that our phones increasingly invite us to devote significant time to insignificant things is not just anecdotal. Multiple studies show that the average adult checks her phone 80 to 160 times a day, and teens, especially, now spend some eight hours daily confronting a screen of some kind. Emerging data indicate this screen time may be linked to increased rates of teen depression,12 and it is concerning if not diagnostic that, if a common screening test for alcoholism is applied to smartphone use, virtually everyone I know would be classified as a phone-aholic.13 Studies have even shown that we don’t need to be directly engaging with an electronic device for it to sap our attention and presence; a phone buzzing on a table in a room where I am sitting distracts me even if I never touch it and cannot see its screen.14

And of course phones can be much more than just distracting.

I remember vividly sitting in general conference as a teen, before the internet’s ubiquity, and listening to President Hinckley implore “any within the sound of [his] voice” to eschew pornography.15 That advice was vital then but has become even more urgent in a world where the internet has facilitated the widespread dissemination of prurient content ranging from troubling to shocking to exploitative. In some ways, however, I worry that the manifest problems with pornography may lead us quietly and too contentedly to pass by other, perhaps even more pervasive, problems. This is because even though pornography elicits special concern through its sexual dimension, it is also the leading indicator of a broader problem with this brave new virtual world: as we increasingly wander the endless halls of the internet’s infinite maze, we can commensurately abandon the real world.

On the one hand, as I indicated in discussing the ways medical rounds have changed over the last ten years, our abandonment of the real world for a virtual one is changing the ways we think. In his unsettling book The Shallows, Nicholas Carr describes how the internet is robbing an entire generation of its ability to think deeply. Carr’s preferred metaphors are those of scuba diving and waterskiing. Whereas previous generations could freely do the former—meaning they had the ability to immerse themselves in lengthy manuscripts and to linger on words, phrases, and ideas—the millennial generation finds this a progressively more difficult task. Instead, they are often merely skimming across the top of information, imbibing endless streams of tweets and headlines but rarely even finishing an article, let alone sustaining attention over minutes, months, or years toward deeper understanding and long-term endeavors. This is not to imply, of course, that the generation has lost the ability entirely, but only that the cultural consciousness is migrating away from attention and toward quick informational fixes.16

I have felt that shift within myself.

During my junior year at Brigham Young University, I took the best class of my undergraduate education: “Studies in the American Experience.” So many aspects of the class—Professor Neil York among them—were superlative, but what lives most vibrantly in my memory were the nights spent in front of a fire with Tocqueville’s Democracy in America. Those evenings passed swiftly as I scoured the pages, sometimes perplexed, but often dazzled. I can still trace the way my emotions swelled—the way I very nearly held my breath—as I read one particularly erudite passage in which Tocqueville felt his way toward what he considered the wellspring of American democracy’s success. I heavily highlighted the pages leading up to that section, and the passage where he finally reveals the secret at the center of his explorations—our “habits of the heart!”—finds my margins erupting with exclamations.17 Reading that book demanded my sustained attention over weeks, maybe even months.

Sometimes I wonder if I am capable of such immersive learning anymore.

The dark side of immediately accessible information is that its very convenience robs me of the ability to have experiences like the one I describe above. One of a cell phone’s principal functions is to make everyone constantly, universally, and immediately accessible to everyone and everything else. This sounds wonderful until we remember that perhaps we are not designed to be so pervasively and ceaselessly accessible. I am sitting in my room typing, but within moments my eyes stray to the score of the NBA game I’ve been tracking, then my email pings and I’m distracted by an incoming message, after which a text arrives to which I am expected to reply promptly, and then I see my Facebook queue has filled up in the last ten minutes and demands to be checked, and by the time I circle back to my writing, I can’t even remember the subject of my paragraph, let alone the flow of the sentence. What masquerades as impressive efficiency is just as surely creeping distractedness. Yes, of course, our minds have always wandered, and daydreams predate the advent of the internet by millennia, but never before has a technology so comprehensively and effectively distracted us.

Research bears out these suspicions, and Carr lays out many of the findings. One researcher whose work he discusses attached tiny cameras to the glasses of study participants so he could track the movement of their eyes as they read. When participants read pages from a book, their eyes moved as you would expect: from left to right, in descending lines. When asked to read pages online, however, the movements changed dramatically. Instead of continuous descending lines, the participants’ eyes roughly traced large “Fs” over the surface of a page, skipping large chunks of content and skimming only a few lines to try to gather highlights, but without time for depth, analysis, or understanding. Unsurprisingly, then, Carr also cites multiple studies showing that participants consistently learn and understand less when reading online than when reading on paper.

Beyond even changing the way we read, however, consuming digital media also rewires our brains. In one of the most striking studies Carr cites, volunteers were sorted by their experience with online media into novices and experts. Both groups were asked to read online content while being monitored with fMRI (functional MRI is a way of imaging the brain that tracks changes in blood flow and oxygenation to show which areas of the brain are being used across time, rather like seeing wires glow as electricity passes across them). When the experts consumed the online content, certain brain circuits lit up quite brightly that did not light up in the novices’ brains. In other words, those users had trained themselves through practice to use those circuits more nimbly, just as a bodybuilder has larger biceps than a couch potato. Even more striking, however, when the novices were given just a couple of weeks to practice consuming content on the web and were then invited back for the same experiment, those same circuits had already begun lighting up quite brightly. That is to say: just a few weeks of online media consumption had already begun rewiring their brains.

We do not know, of course, the exact long-term implications of this phenomenon, but such fundamental changes occurring in such a short time should command our attention and make us at least stop to wonder what they mean. By the same token, while the study is small, a recent investigation demonstrating that internet addiction seems to atrophy certain critical brain areas should raise alarms.18 The take-home point is not that this research definitively proves that digital media consumption rots our neural circuits, but rather that it raises serious and profound questions about a technology that was virtually unknown ten years ago but without which we can now hardly imagine our lives.

All of this is to say that the attention we pay to the internet is not just a question of distraction. If it were, the answer would be simple: put away my phone. What all of the above indicates, however, is that cell phones and the digital revolution they represent don’t just distract us; they also warp our brains. Even when the phone is absent, long-term and consistent use of pervasive digital media makes us lastingly less capable of sustained concentration. They don’t just rob us of time but actually change our brains and dull our ability to think deeply.19

This matters, not because it is bad to be able to skim large amounts of information quickly; indeed, in the new information economy this may become a vital skill. Rather, it is a problem because those raised on this kind of learning may not fully develop the intellectual resources necessary for deeper dives. In a chapter outlining the advent of the written word and the widespread coming of literacy in the world, Neil Postman described the requirements of deep reading like this: “The reader must come armed, in a serious state of intellectual readiness. This is not easy because he comes to the text alone. In reading, one’s responses are isolated, one’s intellect thrown back on its own resources. To be confronted by the cold abstractions of printed sentences is to look upon language bare, without the assistance of either beauty or community. Thus, reading is by its nature a serious business. It is also, of course, an essentially rational activity.”20 This serious intellectual engagement cannot come from tracing large Fs across the surface of online screens filled with text.

Something is slipping away—and that something matters profoundly to us. We proclaim, after all, that “the glory of God is intelligence,” and we believe that the things we learn—and, one would assume, the way we learn—are among the few precious things we will carry with us into the eternities.

What worries me even more than how the internet is changing our brains is the way it is hardening our hearts. Just as Carr’s book left me unnerved, Sherry Turkle’s Reclaiming Conversation left me deeply saddened.21 In addition to describing other ways the internet impairs our ability to think, Turkle tackles the ways in which it handicaps our ability to feel. The book arose out of hundreds of hours of interviews with students who came of age during the millennial era and years spent researching the intersection between humans and our technology. The picture that emerges startles me. I might have thought that the compulsion to text, for instance, arose from (or perhaps caused) a sort of face-to-face social forgetfulness; texting is so easy, after all, that not placing a call or visiting a friend may simply be a matter of convenience. What Turkle found, however, was more than simply a drive for efficiency. Instead, apparently because of the rise of interpersonal technology, college students over the last ten years are both less willing and less able to have face-to-face conversations (especially difficult ones). One student, for instance, looks at Dr. Turkle incredulously when the author suggests discussing a thorny relationship question face-to-face with a friend. Doing so would require being party to the other person’s broken heart and wounded feelings, after all, and who would want to be present for that?22

But of course, that’s just the point. A parallel finding Turkle outlines in detail is that current college students are not simply communicating differently. Instead, those generational communication changes are profoundly warping the way college students relate to others in general. Most noticeably, students now are statistically (and clinically) less able to empathize with their peers. Who can be surprised at this? If you shy away from another’s suffering by hiding behind a text—how can it be any wonder you’re less able to relate to other people’s pain?

These effects are not peripheral or incidental to our Christian discipleship.

Chaim Potok’s The Chosen tells the story of two young Jews coming of age and coming to terms with their faith, their culture, and their intellects. One of the young men, Danny, is the son of a Hasidic rabbi. The rabbi, Reb Saunders, raises Danny in almost complete silence. Except for short phrases they exchange while studying the Talmud, he never speaks to his son. This practice baffles and frustrates nearly everyone around them and, near the book’s conclusion, the rabbi seeks out his son’s best friend, Reuven, to explain and implicitly apologize. Because the rabbi still refuses to speak directly with Danny, he instead engages Reuven and explains his reasoning within earshot of Danny to allow his son to hear without formally breaking the code of silence.

The rabbi explains how he recognized very early that Danny was frighteningly smart, but knew the intelligence came at the cost of caring for others. Danny had a mind like a “jewel,” a “pearl,” and a “sun” but initially seemed to his father to have no soul.

Reluctantly, after praying, the rabbi decided to raise his son as he himself was raised: in silence. Reuven does not understand how this could possibly help, and so the rabbi explains:

My father himself never talked to me. . . . He taught me with silence. He taught me to look into myself, . . . to walk around inside myself in company with my soul. When his people would ask him why he was so silent with his son, he would say to them that he did not like to talk, words are cruel, words play tricks, they distort what is in the heart, . . . the heart speaks through silence. One learns of the pain of others by suffering one’s own pain, he would say, by turning inside oneself, by finding one’s own soul. And it is important to know of pain, he said. It destroys our self-pride, our arrogance, our indifference toward others. It makes us aware of how frail and tiny we are and of how much we must depend upon the Master of the Universe.23

The rabbi’s extremism notwithstanding, there is a jewel of truth in his words. The heart needs purposeful silence—the cessation of input to the brain with an intention to reflect—to process pain and learn empathy. Smartphones in particular, and our hyperconnected world in general, relentlessly fill the spaces that might otherwise allow silence to flourish. This brings to the fore one of the internet’s many paradoxes: even as our digital world—especially as embodied in our smart devices—pulls us away from the people around us, our phones also make us progressively less capable of finding meaning in silence. The point in both cases, however, is that our phones pull us away from what matters most and trap us instead within the hypnotic glow of those tiny screens.

This matters for us as we seek to become like Jesus.

Mormonism—like most branches of Christianity—derives its power from being both a meditative and a communitarian religion. We must attend to the life of the soul but also remember that humankind, as Marley’s ghost reminded Ebenezer Scrooge, really is our business.24 We therefore derive our own spiritual succor from quiet moments spent drawing inspiration from holy texts, the best books, silence, and music, and then turn around and share that spiritual nourishment by serving others. Mormonism’s deepest meaning comes when we carry out our collective covenant to lift up the hands that hang down and strengthen the feeble knees. One of my defining covenants as a Mormon, after all, is to sorrow with those who are sad.

That is why Turkle’s observations about the upcoming generation so unnerve me. While our phones may keep us silent, it is most often a spiritually empty silence, bereft of meaningful solitude. At the same time, I fear that the rise of a ubiquitously “connected” world is paradoxically tearing us apart from those around us as well. On the one hand, the hopelessly idealized façades pervading social media foster jealousy and a deep sense of inadequacy, resentment, and spite. On the other hand, that very connectedness breeds such atomization that an important and recent social commentary (also written by Sherry Turkle) was titled Alone Together. It is unsurprising, in this context, that Elder Bednar warned of the “stifling, suffocating, suppressing, and constraining impact of some kinds of cyberspace interactions and experiences upon our souls.” He raised a warning cry: “Be careful of becoming so immersed and engrossed in pixels, texting, earbuds, twittering, online social networking, and potentially addictive uses of media and the Internet that you fail to recognize the importance of your physical body and miss the richness of person-to-person communication.”25 The more I read his address, the more it motivates me to keep the things that matter most at the center of my life.

Perhaps no anecdote has brought home this point quite as chillingly as a story Dr. Turkle shares in her book.26 She was called to consult at a middle school where the teachers were concerned about the effect technology was having on their students. One of the students there was a young boy whose father had recently committed suicide. One day at school the boy got into a spat with one of his classmates; in response to her frustration over the tiff, the classmate posted a picture of the young boy on her Facebook page with a caption saying, “I hope he ends up just like his father.” Horrified, the principal called the young girl into his office. What he discovered in the conversation that ensued was not so much that the young girl was callous to the boy’s feelings as that she was oblivious to the fact that her words might harm someone else—the façade of the internet had allowed her to operate under the belief that posting words like those online was an action in a void, without consequences. The technology placed her at a remove from the object of her taunt. Had she flung something like that at the boy on the playground, she would have immediately found herself, literally, face-to-face with the consequences of her action, but because she leveled the blow over the internet, it was as if she genuinely did not understand the words’ potential to wound. What was once inescapable had been rendered by mobile technology all but invisible. And that invisibility prevents the possibility of real empathy.

As Christian disciples, we are called to tend to each other. Our ministry is to care for the people around us: the actual, physical, imperfect, frustrating, beaming, suffering, crying, laughing, joyful people. If we are not careful, however, our phones can lure us into a world filled with our virtual avatars while diverting us away from the place where our actual fellow travelers live.

The point is not that virtual connections cannot be real or that they cannot provide our lives with additional meaning and depth—anyone who has seen a geographically distant grandfather interact with his grandchild by video chat knows they can do just that. Rather, the vital thing is to remember that virtual connections can never fully replace real ones, even though such a consuming technology may tempt us to think they can. While an encouraging text or a happy Facebook message can do good, they will never replace a warm hug or an actual shoulder to cry on. Virtual missives of any kind can constitute part, but not all, of our reaching out to those who need us. I cannot be meaningfully present in another’s suffering—even from afar—if I have forgotten how to be meaningfully present in the first place. The Mormon gospel is one of real and imperfect but striving Saints—no virtual representation can ever replace them.

Abandoning Truth

Just as troubling, the internet affects not only our relationship with other people but also our relationship with truth itself.

The rise of the internet was supposed to herald the arrival of better and more accurate reporting. In the 1950s, twenty-nine million Americans tuned in their televisions to get their news from figures like Edward Murrow and Walter Cronkite. In many circles, these anchors were considered the voice of authority.27 It was assumed they would report real stories with as little bias as possible. The 1960s and 1970s, however, saw a cultural rebellion against such centralized authority, and a desire for independent reporting ascended. The passion of this inclination perhaps sagged toward the end of the last century but came roaring back with the emergence of the internet in the early 2000s. People assumed that this democratization of access to information and the ability to report it would usher in an era of reportage that had greater fidelity to the facts on the ground.

What has happened instead is much more complex. In politics, the hyperconnected world has sown chaos. While the proliferation of blogs has democratized the publication of opinion, the internet has also given rise to an array of communication channels that report stories with no attribution, filled with apparent facts that may not be true at all. The monochromatic voice of authority of the 1950s may have lent itself to myopia and unacknowledged bias, but the rise of “every person a reporter” has so blurred the line between fact and fiction that one of the main weapons of hostile foreign states is now the seeding of misinformation. Amid this rising sea of disinformation, we are seeing a worldwide resurgence of the forces of autocracy, demagoguery, extremism, and spite. When culture comes unmoored from its ties to the truth, we reap the whirlwind in the vacuum left in truth’s place.

Reality, we must remember, is not a political issue; and while the LDS Church remains steadfastly nonpartisan, on this point our doctrine is unavoidably clear. We believe in truth. We encourage debate and acknowledge the complexity inherent in the interpretation of messy realities, but appeals to a factless world run counter to our theology and the best elements of our culture.

In the Doctrine and Covenants, section 88, comes some of our most stirring religious language: “Intelligence cleaveth unto intelligence; wisdom receiveth wisdom; truth embraceth truth; virtue loveth virtue; light cleaveth unto light; mercy hath compassion on mercy and claimeth her own” (v. 40). In other words, by using our limited, flawed, mortal means to gather what truth, wisdom, and light is within our power to collect, we invite God to grace us with the light, truth, and wisdom that are his alone to give. President Uchtdorf has likewise reminded us that while our imperfect understanding unavoidably limits our ability to grasp all truth, nonetheless, “our Father in Heaven is pleased with His children when they use their talents and mental faculties to earnestly discover truth,” and “Latter-day Saints are not asked to blindly accept everything they hear. We are encouraged to think and discover truth for ourselves.”28

All of this is to say, a dogged pursuit of truth should be one of Mormonism’s defining virtues. Appeals to “alternative facts” should deeply concern us, regardless of the political preferences of their proponents.

By the same token, it strikes me as troubling that the internet has (almost certainly) exacerbated—or at least facilitated—our inclinations toward tribalism, incivility, and the rhetorical savaging of our opponents. Perhaps it is the anonymity of internet chat forums, perhaps it is the internet’s propagation of confirmation bias, or perhaps it is the internet’s ability to allow us to remain ignorant of the effects our verbal barbs have on their targets that has so degraded our discourse. More precisely, the internet does not act as the agent here but is nonetheless the medium by which—out of cupidity or at least apathy—individuals and corporations have created digital conditions that have facilitated and hastened this cultural decline. Regardless of the exact origin of the effect, however, the last two decades have seen a serious defining down of what were once considered elemental components of civic and political discourse. It troubles me deeply that so many view the vitriol passing between politicians—and even neighbors—as normal.

Beyond even these effects, however, the internet’s most worrisome consequences on our search for truth may be all the more dangerous because they are less obvious. Perhaps the wired world’s most potent effects come because our online lives rob us of collective presence.

The Absence of Presence

Presence is the gift of being where you are. On the face of it, this seems tautological—how, after all, could you be anywhere else? But in the internet age, almost no one is really where they are. It strikes me, in the hospital where I work, for instance, that I can roam the halls during the day, with people passing in all directions and sun streaming through the windows, and find that so many of those I pass have their eyes fixed on their screens. We are still walking, but in a haunting foreshadowing we are devolving toward the immobile subhumans on the spaceship in Pixar’s prophetic WALL-E. G. K. Chesterton once theorized about a madman who believed the entire world revolved around him (in the form of a conspiracy). Chesterton imagined that if we were trying to dissuade such a man from his madness, we might plead: “How much larger your life would be . . . if you could really look at other men with common curiosity and pleasure . . . ! You would begin to be interested in them. . . . You would break out of this tiny and tawdry theatre in which your own little plot is always being played, and you would find yourself under a freer sky, in a street full of splendid strangers.”29

When I pass so many people whose minds are clearly tethered to their phones (and sometimes I am one of them), I can’t help but find that description—of a “tiny and tawdry theatre”—especially apt. This tethering troubles me in part because so much of what I consume on my phone places me at the center of my tiny virtual universe. I am like the madman not only because I am trapped within such a small space but because so much of what occupies that cosmos is myself.

Beyond this, even when I venture outside the universe of self, phones endlessly draw me to what doesn’t matter. Engineers designed smartphones to facilitate “multitasking.” I admired this ability before I had an iPhone, but what I now see as I use my phone is that what I thought of as multitasking turns out in large measure to be an endless stream of disruption, distraction, and discontinuity. Indeed, recent neuroscience demonstrates that even if we could multitask without extraneous interruptions, just trying to do two things at once makes us less efficient and less accurate.30 Smartphones excel at many things, but they are engineered to preclude presence.

This worries me in part because presence fundamentally undergirds all religious experience. Our common daily practices as Mormons make this apparent. Who has not spent his prescribed minutes of scripture study running over strings of words, only to find that intruding ideas rendered the sentences meaningless? Who among us has not attended the temple only to find her mind was elsewhere and that the session had no impact? And who has not listened to general conference while other demands distracted him, only to find that he hardly knows what was said, let alone what it really meant or what he should do with the counsel? Immediately apparent to the religious seeker is the fact that religion practiced pro forma is not religion. Only my presence—my active, hopeful, imperfect, but striving engagement—allows the Divine to expand my vision, deepen my knowledge, make real my empathy, and change who I am.

The importance of presence in understanding the divine saturates our doctrine as well as our daily experience. Alma’s allegory in Alma 32 reminds me of this. Alma goes to pains, as he talks of nurturing the word, to illustrate that the process requires careful and sustained cultivation. He says, “And behold, as the tree beginneth to grow, ye will say: Let us nourish it with great care, that it may get root, that it may grow up, and bring forth fruit unto us. And now behold, if ye nourish it with much care it will get root, and grow up, and bring forth fruit” (v. 37). Through repetition that echoes the allegory’s overall arc, Alma insists that this process requires presence, persistence, and care over a great expanse of time; indeed, he summarizes at the end of the chapter: “Ye shall reap the rewards of your faith, and your diligence, and patience, and long-suffering, waiting for the tree to bring forth fruit unto you” (v. 43).

Diligence.

Patience.

Long-suffering.

Waiting.

In Alma’s telling, a lightning-strike revelation is quite rare, and insufficient anyway. I am particularly struck that such gentle revelatory language comes from the recipient of one of our canon’s most dramatic spiritual epiphanies, a man who then grew to become the Lord’s prophet. Alma’s language here matters a great deal to us as we contemplate what revelation—to prophets and to each of us—usually looks like.

By the same token, one of our canon’s most telling verses concerning the receipt of personal revelation reads, “Let thy bowels also be full of charity towards all men, and to the household of faith, and let virtue garnish thy thoughts unceasingly; then shall thy confidence wax strong in the presence of God; and the doctrine of the priesthood shall distil upon thy soul as the dews from heaven” (D&C 121:45). The two foci of that verse are the verb “distil” and the analogy “dews from heaven”—both connote stillness, the kind of process and product that requires an inner quiet to observe. In parallel fashion, a telling verse in Doctrine and Covenants 6 finds the Lord gently reminding Oliver Cowdery: “Behold, thou knowest that thou hast inquired of me and I did enlighten thy mind; and now I tell thee these things that thou mayest know that thou hast been enlightened by the Spirit of truth” (v. 15; italics added). In other words, beyond the inspiration itself, Oliver needed to have the illumination pointed out to him; it had come so subtly he apparently did not recognize its provenance.

And he didn’t even own a smartphone.

For most of us, then, most of the time, revelation distills like dewdrops—quietly, subtly, even imperceptibly. As one poet penned, God reveals himself most often in a manner that is “unasked, unforced, unearned.”31

Thus, the flight of our collective presence matters. Its importance can be highlighted, perhaps, by recognizing what we lose when presence flees. In a beautiful passage in James Agee’s A Death in the Family, Agee writes of a father and son walking home from a movie:

Rufus had come recently to feel a quiet . . . contentment [here at the corner], unlike any other that he knew. He did not know what this was, in words or ideas, or what the reason was; it was simply all that he saw and felt. It was, mainly, knowing that his father, too, felt a particular kind of contentment, here, unlike any other, and that their kinds of contentment were much alike, and depended on each other.32

Then, a page later:

He knew these things very distinctly, but not, of course, in any such way as we have of suggesting them in words. There were no words, or even ideas, or formed emotions, of the kind that have been suggested here, no more in the man than in the boy child. These realizations moved clearly through the senses, the memory, the feelings, the mere feeling of the place they paused at . . . , and above them, the trembling lanterns of the universe, seeming so near, so intimate, that when air stirred the leaves and their hair, it seemed to be the breathing, the whispering of the stars.33

So much of what occurs in that scene—the irony being that nothing much “happens” at all—relies on the presence of the father and the son. The father is present with his boy, walking home from a Charlie Chaplin picture, and the son is present with his dad, his own skin, his five senses, and the canopy of stars. If the father were engrossed in the dim blue glow of his smartphone, the scene would immediately evaporate. Similarly, if the son were wound up in his Facebook feed, he wouldn’t even be cognizant of the outside world, let alone fully present to the miracle of the breathing stars. Presence necessarily precedes an appreciation of beauty and, similarly, all catalyzing religious experience. In a corollary vein, smartphones battle every microsecond against the contentment in which Agee revels above; a smartphone, by design, must never allow you to be content—it is ever at the horizon, beckoning through to infinity.

Part 2: Veiling Reality

Reaching—or Not—for a Reality beyond Our Grasp

All of the foregoing worries me deeply. The internet has changed the way I practice medicine—making me “smarter,” yet pulling me away from my patients and corroding my ability to determinedly approach intellectual problems. Likewise, our hyperconnected world has rendered us less present, while social media has paradoxically atomized modern culture. And, finally, truth has become a secondary concern in much of the virtual world, with our collective thinking becoming shallower and more focused on clicks than on meaning. Even beyond this grim tally, however, there are further, and perhaps subtler—but consequently all the more dangerous—ways in which the digital world mounts an assault on our spiritual well-being.

Part of the danger here is that social media entices us to prioritize appearance over substance and thus inverts the Christian paradigm of selflessly diving into the work of becoming more like Jesus. As Elder Oaks taught, the aim of the gospel is to facilitate our becoming who God wants us to be, but the internet is motivating us to appear to be whatever the cultural moment demands.34 This might be trivial (and morally neutral), except that sometimes that endless hunger to seem to measure up to some worldly standard directly detracts from our Christian quest to become new beings in Christ. These two aims do not always work at cross-purposes, but a generation weaned on preening for the internet may have trouble discerning its priorities when the time comes to choose between the two.

Beyond even this, however, the internet also keeps us from seeking to understand “things as they really are” (Jacob 4:13). To articulate fully why this so deeply concerns me, I need to take a bit of a detour here to talk about the way we conceptualize language and reality and about just what it is words can and cannot do. At the end of the detour, I will weave this explanation back into my concerns about our digital age.

To understand part of what the internet threatens to take away, we need to first recognize that some tremendously important ideas are, inherently, ineffable; these ideas defy words, not because a great poet has never tried to articulate them, but because, categorically, they cannot be contained by our limited vocabulary. Words, after all, no matter how beautiful, are but symbols, which, when arranged this way or that, attempt to communicate an idea’s essence. Yet, in spite of Shakespeare, Cervantes, Frost, and Fitzgerald, words will forever fail to fully capture truth, beauty, and the universe’s other elemental essences. Holy writ affirms this; of Jesus’s ministry to the Nephite children we read, “And no tongue can speak, neither can there be written by any man, neither can the hearts of men conceive so great and marvelous things as we both saw and heard Jesus speak; and no one can conceive of the joy which filled our souls at the time we heard him pray for us unto the Father” (3 Ne. 17:17; italics added).

That qualitative inadequacy notwithstanding, however, what strikes me about the best literature is that it tries. You can feel the strain as the words stretch themselves—hoping desperately to fully convey the divine idea. Yet in today’s world, we find this equation flipped. In the universe of Twitter, Facebook, and countless forms of social networking, often the words published or posted seem hardly to try to convey something ultimate or real. Instead, much of what is written is rhetorical flotsam—ephemeral bubbles that hardly hang together on their own, let alone represent some deep, unspeakable truth. Twitter, particularly, seems an almost nihilistic, Kafkaesque parody of probing language.

As a Mormon, this particularly concerns me because we believe a profoundly beautiful world shimmers just beneath the often drab visible reality surrounding us. Part of the reason we seek things that are “virtuous, lovely, or of good report” (A of F 1:13) is because they provide glimpses into that hidden world. Eliza R. Snow captured this succinctly: “Ofttimes a secret something whispered, ‘You’re a stranger here,’ and I felt that I had wandered from a more exalted sphere.”35

By the same token, one of Joseph Smith’s most meaningful doctrines is that a “veil” hides from us a heavenly host and a celestial world—and that that veil can be parted. Many Mormons thus speak easily of the veil being “thin” as a way of describing particularly visceral holy experiences, and our culture likewise boasts an unusually easy sense that there are supportive ancestors pulling for us “on the other side.”

Which brings me to another observation by Joseph Smith. In November of 1832, he wrote in a letter to W. W. Phelps, “Oh Lord when will the time come when . . . [we may] gase upon Eternal wisdom engraven upon the hevens. . . . Oh Lord God deliver us in thy due time from the little narrow prison almost as it were totel darkness of paper pen and ink and a crooked broken scattered and imperfect language.”36

That Joseph, whose revelatory rhetoric fills the pages of the Doctrine and Covenants, would complain in such vivid terms about the inadequacy of language—crooked, broken, scattered, and imperfect—to convey the full meaning of the Divine strikes me as telling. One of his most pressing messages seems to be just that: there is a fundamental difference between the thing and his description of the thing. In my mind’s ear, I can almost imagine him pleading with me: I can tell you about God, but my description is not God. Within the constraints of this broken thing called language, I will try to convey to you the majesty and empathy, the wisdom and unending love, the grandeur and filial compassion of God our Father and Heavenly Mother—and yet I will fail. My writings and sermons are more invitation than explanation. You must come and see for yourself—but please, please, please come!

While the preceding words are mine, they strike me as reflecting a theme that underlies much of Joseph Smith’s religious world-building. As Richard Bushman observed in the closing paragraphs of Joseph Smith: Rough Stone Rolling, Joseph’s followers “were happy to grant him the authority of a prophet if he would connect them with heaven, and that was the key to his success.”37 He connected them, but he also recognized the limitations of the bonds he could forge for others and so insisted they use the religion restored through him as a jumping-off point for developing a more personal feel for and understanding of revelation and the character of divinity. He reminded the world that no true religion is possible without a correct understanding of God’s character and then taught the world an enormous amount about that character. Beyond those explicit teachings, however, what he emphasized even more was our personal responsibility for coming to know God ourselves. A similar strain runs consistently through Joseph’s successors as prophets and presidents of the LDS Church; indeed, in this implicit plea, Joseph is joined and bookended by President Russell M. Nelson, who, in his first sermon to the entire Church as prophet, pled, “I urge you to stretch beyond your current spiritual ability to receive personal revelation, . . . [because] there is so much more that your Father in Heaven wants you to know.”38

There are parallels between Joseph Smith, born in 1805, and John Muir, born in 1838. Joseph opened to his people the mysteries of the heavens; Muir opened to the world the marvels of Yosemite and the American West. Joseph founded one of America’s great homegrown religions; Muir became one of history’s great naturalists and authors. Both men fairly quivered with an urgent sense of having glimpsed a great beyond, and both wore out their lives trying to bring others to see it too. Regarding a trip to Glacier Bay, Muir wrote:

We were startled by the sudden appearance of a red light burning with a strange, unearthly splendor on the topmost peak of the Fairweather Mountains. . . . It spread and spread until the whole range . . . was filled with the celestial fire. In color it was at first a vivid crimson, with a thick, furred appearance, . . . every mountain apparently glowing from the heart like molten metal fresh from a furnace. Beneath the frosty shadows of the fiord we stood hushed and awe-stricken, gazing at the holy vision; and had we seen the heavens open and God made manifest, our attention could not have been more tremendously strained. . . . Then the supernal fire slowly descending, . . . the cold, shaded region beneath, peak after peak, . . . caught the heavenly glow, until all the mighty host stood transfigured, hushed, and thoughtful, as if awaiting the coming of the Lord.39

In the immediacy and urgency of Muir’s language here, I hear echoes of Joseph Smith describing one of his many encounters with the Divine. What strikes me most about this passage, however—in spite of the stirring prose—is the gap between reading it and being there. Having seen Yosemite Valley, I’m acutely aware of the distance; and that awareness of language’s inadequacy in a realm I know well whets my appetite to experience just what divine reality will be like when we no longer need words.

Over many years, I have tried to narrate my own most profound spiritual experiences, and yet sufficient words forever elude me. Even the words of the renowned poet Emma Lou Thayne fail to capture fully the incandescence of those moments, but a description from her autobiographical The Place of Knowing is as close as I have ever found. When asked by a Jewish friend why she continued believing in Mormonism, Emma Lou wrote of going to the Salt Lake Tabernacle as a little girl to hear Helen Keller speak. After Ms. Keller finished her remarks, she asked if the “Mormon Prophet” (Heber J. Grant) would introduce her to the tabernacle organ so she could hear “your famous pioneer song.” Emma Lou watched, riveted, as President Grant led Ms. Keller to the base of the console and placed her hands so that she could feel the organ throb as Alexander Schreiner played “Come, Come, Ye Saints.”

So then—that tabernacle, that singing, my ancestors welling in me, my father beside me, that magnificent woman, all combined with the organ and the man who played it and the man who had led her to it—whatever passed between the organ and her passed on to me. I believed.

I believed it all—the seeing without seeing, the hearing without hearing, the going by feel toward something holy, . . . something that could move me, alter me, . . . something entering the pulse of a little girl, something that no matter what would never go away. . . .

I believe in it. I get impatient with people’s interpretations of it . . . , but somewhere deep inside me and far beyond impatience or indifference there is that insistent, confounding, so help me, sacred singing—“All is well! / All is well!” My own church, inhabited by my own people. With my own feel for its doctrines, it is my lamp, my song. . . . I would be cosmically orphaned without it.40

Taken together, these observations paint a foundational scene from Mormon theology and remind us of one of the internet’s most insidious dangers. We are trapped, as it were, in a world where we can see the true beauty of the universe only “through a glass, darkly” (1 Cor. 13:12). Joseph—by dint of a life saturated with visions, revelations, and divine whispers—parted the curtain veiling this deeper reality and returned to try to explain what he had seen. His words paint sometimes powerful, even visceral, pictures, but the words are not God, or celestial glory, or the whole of truth, or the love of Jesus Christ—they are symbols. This is not to say they are unimportant—far from it. Those words are necessary and can be phenomenally powerful catalysts, yet they must ultimately be the portal, not the destination.

An argument can be made that the aim of a Mormon life is to dig past layer upon layer of appearance, striving to come to the core that represents things as they really are. Our Christian discipleship is a journey beyond current understanding to a place where we will truly understand God, the universe, and our place in it. Thus, King Benjamin pleads with us to understand that a beggar is not a beggar, but an eternal soul, with divine potential, transiently dressed in rags; the Savior invited the people of his time to look beyond the social nothingness of children to see instead the ways in which young people innately embody some of the most vital Christian virtues; Nephi understood that nature was not just the wilderness but in its beauty could also become a temple; and the entire Christian canon rests on the belief that a Judean carpenter was not just a carpenter but the literal Son of God who bore the world’s every sin and then took up his own life again after suffering death by torture.

Whereas the gospel invites us to understand that things are not as they seem—that what we see on the surface is not all there is—the internet and the digital world obstruct our discipleship by placing filters between us and the Divine. Instead of uncovering truth, the internet can further obscure it; instead of bringing us to each other in vulnerability and sorrow, social media invites us to chronicle our lives as a kind of vaguely artificial performance art; instead of inviting us to a life of quiet virtue, if we are not careful, the internet may call us to live lives of puffed-up righteousness; and instead of helping us see things as they really are, the internet may convince us that seeming is more important than being.

It is as if, instead of working to part the veil, the internet hangs layer upon layer of curtains, each further obscuring our view of reality. If Joseph Smith is like a prophetic John Muir, pleading with his people to trek to a spiritual Yosemite Valley with its divine waterfalls and towering granite peaks, the virtual world stands in a place opposite, forever beckoning us away, alluring us with shiny convenience, trying to convince us that the valley is not really that beautiful anyway.

Thus, instead of talking face to worry-lined face with embodied friends, we “chat” with their disembodied avatars. Instead of embracing those we love in the midst of the messy glory of their cluttered homes, we interact with the Photoshopped, nearly perfect version of a life posted online—feeling at once further away and hopelessly inferior. And instead of being swallowed up in the meaning of a religious experience that first demands our attentive presence, our minds flit about from this to that, never in one place long enough for any scene to make a lasting impression. We seek likes more than revelation, exposure more than intimacy, followers more than friends, and the next link more than meaningful insight.

Reclaiming Reality

Thus, our mobile devices and the technological revolution they represent tap into some of our deepest, most instinctual desires—for connection, stimulus, and the new—and they do so too well. Their very success—and our susceptibility to their coaxing—can leave us at their mercy. We must devise techniques not to eliminate them from our lives, but to ensure they serve us in the ways that reflect their true value while leaving us free to attend to the things that matter most.

How do we do this?

First, we can recognize that the efficiency of a hyperconnected life is a mirage. While it may strike me initially as helpful to be available 24/7 to every social network, communication tool, and sports score in which I have interest, such unending availability limits not only my capacity to do any one of those things well but also my ability to think linearly at all. Part of standing up to the tide of hyperconnection involves trading the ephemeral efficiency of “available everywhere and always to everyone” for the paradoxically more efficient, single-minded commitment to doing this first and finishing it before moving on to that.

Second, I can recognize the primacy of the person in front of me. As an oncologist, when I see patients I am often accosted by a litany of competing thoughts: What does this new symptom mean? Is the patient’s loved one influencing her decisions? Should I be offering new chemotherapy? Am I worried about this change in the patient’s lab values? Is it time to order the next CT scan? The list goes on and on, and often these questions flit and dart about in my brain as I speak with the patient in the room. Every once in a while, however, I face a full-stop moment that should halt me in my tracks and demand my full attention. When such moments arise, I ought to put down my pen or stop typing entirely, square my shoulders to the patient, lock eyes, and listen.

While day-to-day life is rarely as dramatic as a visit to the oncologist, I am surprised by the number of moments that ask me to put away everything else and attend to them. These moments may be subtle: my three-year-old son approaching me with a newfound treasure; a sunset lighting the western sky ablaze; the silence of a moonlit house with the children asleep; our youngest son’s first knowing smile. These are my moments to channel James Agee and hear the breathing of the stars; I will miss them if I am mesmerized instead by the neon monotony of a smartphone.

Third, we must remember and honor the Sabbath. The Sabbath may initially strike us—terribly busy as we are—as paradoxical, inconvenient, and even frustratingly inefficient. How vital, though, this day apart has become in a world hurrying heedlessly on to the next thing. One element of our modern lack of presence is our inability to dwell in the now. We forget that the most meaningful spiritual and life experiences happen in the holy present. Perhaps that is one meaning of our Sabbath: it is a day for focusing on its own labors. It is a time to appreciate the family surrounding me now and to savor the strains arising from this moment’s song. It is a pause, a space, a solace. By the same token, our brave new technological world may also demand from us a new kind of Sabbath observance—times to completely unplug. Whether this means Sundays free from digital distractions, weeks spent in the mountains without technology, or a sacred space at the dinner table, we must find times to escape those tiny, tawdry theaters so that we can reconnect with those around us.

Likewise, we can embrace the haven afforded by the temple. Where else on earth can you go and see a large group of people sit for two hours without glancing at a smartphone? In our age of unending availability, the temple offers an oasis where we can disconnect from the pressing demands of the outside world.

Fourth, we can recapture the magic of thinking locally. One of the internet’s most powerful effects is making the global local. Yet, even as I learn about—and come to care vicariously for—sufferers in far-flung places, I must take care not to ignore the beggars I pass on my own streets and the sufferers with whom I rub shoulders every day. As we recently learned in general conference, part of the great work Latter-day Saints are about is ministering to those who immediately surround us. I can sit all day fretting over the tragedies I encounter virtually in the New York Times and yet do more to assuage the world’s suffering through a single ministering visit.

Fifth, as a Mormon, I cannot dwell in echo chambers, and I cannot accept willful falsehood or even a seeming apathy toward truth from public officials. No matter how strongly I may feel about a cause or a political figure, I cannot allow my allegiance to persuade me to accept anything less than the facts. While it may sometimes be both harder and more discomfiting, I must search out news sources that make accuracy their bedrock priority, even—perhaps especially—if that accuracy challenges me.

Sixth, we can simply admit that we are vulnerable. Vulnerable is a word Dr. Turkle uses throughout the last part of her book, and it is carefully chosen. Many of the people she interviews cop to being “addicted” to the internet, and to their mobile devices in particular. While some elements of our relationship with online technology mirror addictive behaviors, her experience suggests that claiming an addiction to technology often serves as an all-or-nothing excuse rather than an entryway into improving behavior. Since most of us cannot function in modern jobs and family life without any technology at all, if we resign ourselves to addiction, we may shrug, “Well, there’s not much I can do.” If, instead, we say, “I will need to access email/social media/my mobile phone/whatever, but I am vulnerable to spending too much time there,” that framing can spur us to modify our behavior creatively, within the constraints of reality, toward positive change.

All of this is to say, even as we embrace the marvels of technology, we can insist on the importance of the real and the now. We can seek meaningful, genuine encounters with the Divine by being present enough to receive revelation. We can ensure music does not become a droning backdrop to whatever we are really doing; instead we can stop, wait, and listen—lingering on the mastery of a virtuosic violinist or the dexterity and soul of a marvelous pianist. We can turn off our phones and engage meaningfully and wholeheartedly with family—dwelling silently with loved ones as they sorrow and cheering lustily as they succeed. We can leave our screens and venture off into the mountains, not content even with the rousing prose of John Muir but insistent instead on feeling that winter wind running through our own hair and seeing sunbeams dancing on snowdrifts with our own eyes. We can read Joseph Smith’s thrilling descriptions of the Divine and then wear out our lives endeavoring to come to know God ourselves. In all things, we can seek truth—and we can search ceaselessly to unveil the stunning reality that lies beneath the world as it seems to be.

About the Author

Tyler Johnson is a clinical assistant professor in the oncology division of the Stanford University School of Medicine. He received an MD from the University of Pennsylvania in 2009 and a BA in American Studies from Brigham Young University in 2005. He teaches institute in Palo Alto, California, and has focused most of his teaching on the prophets of the Book of Mormon.

Notes

1. Alfred, Lord Tennyson, “Ulysses,” Poetry Foundation, https://www.poetryfoundation.org/poems/45392/ulysses.

2. Nicholas Carr, The Shallows: What the Internet Is Doing to Our Brains (New York: W. W. Norton, 2010), 54, see also 69; Pamela Radcliff, “Defining Mass Society and Its Consequences,” ch. 8 in Interpreting the 20th Century: The Struggle over Democracy, The Great Courses, https://www.thegreatcourses.com/courses/interpreting-the-20th-century-the-struggle-over-democracy.html.

3. Ralph Waldo Emerson, “Ode, Inscribed to William H. Channing,” Poetry Foundation, https://www.poetryfoundation.org/poems/45874/ode-inscribed-to-william-h-channing.

4. Anna Garvey, “The Oregon Trail Generation: Life before and after Mainstream Tech,” Social Media Week, https://socialmediaweek.org/blog/2015/04/oregon-trail-generation/.

5. Neil Postman, Amusing Ourselves to Death: Public Discourse in the Age of Show Business (New York: Penguin Books, 1985), xix–xx.

6. “Anatomy of a Virtual ICU: Study Probes Teamwork among On-Site, Remote Staff,” June 2, 2015, VA Research Currents, https://www.research.va.gov/currents/june15/0615-1.cfm.

7. Abraham Verghese, “Culture Shock—Patient as Icon, Icon as Patient,” New England Journal of Medicine 359 (December 25, 2008): 2748–51.

8. One might argue that the delivery of better patient care might validate the need for increases in documentation requirements. It would be relatively difficult to prove such improvements conclusively since a randomized controlled trial with this as an intervention would be very difficult (and, in any case, impractical since virtually all health systems either have moved or are moving en masse to using electronic medical records). These caveats notwithstanding, I am not aware of any conclusive evidence that the advent of electronic medical records in general—let alone the volume and complexity of documentation they currently require—has improved patient outcomes.

9. Carol Peckham, “Medscape National Physician Burnout and Depression Report,” January 17, 2018, https://www.medscape.com/slideshow/2018-lifestyle-burnout-depression-6009235#3.

10. Daniela Drake, “How Being a Doctor Became the Most Miserable Profession,” The Daily Beast, April 4, 2014, https://www.thedailybeast.com/how-being-a-doctor-became-the-most-miserable-profession.

11. David A. Bednar, “Things as They Really Are,” Ensign 40 (June 2010): 19, https://www.lds.org/ensign/2010/06/things-as-they-really-are?lang=eng.

12. Jean M. Twenge and others, “Increases in Depressive Symptoms, Suicide-­Related Outcomes, and Suicide Rates among U.S. Adolescents after 2010 and Links to Increased New Media Screen Time,” Clinical Psychological Science 6 (January 1, 2018): 3–17, https://doi.org/10.1177/2167702617723376. It is worth noting that the correlation seen in this paper did not persist if the depressive symptoms were compared to use of nonscreen activities (for example, reading a book or doing homework) and persisted even when controlling for other variables such as race and socioeconomic status.

13. A common, quick screening test for alcoholism is to ask patients the “C.A.G.E.” questions: Do you feel the need to Cut down on your drinking? Have people Annoyed you by criticizing your drinking? Have you ever felt Guilty about your drinking? Have you ever felt you needed a drink first thing in the morning as an Eye-opener? While this has been scientifically validated only in the setting of alcohol use, the parallels to internet use seem intuitive. This is not to imply that it can or should be used as an instrument for diagnosing addiction to digital media, as such use would require its own validation in that context.

14. Cary Stothart, Ainsley Mitchum, and Courtney Yehnert, “The Attentional Cost of Receiving a Cell Phone Notification,” Journal of Experimental Psychology: Human Perception and Performance 41 (August 2015): 893–97, http://dx.doi.org/10.1037/xhp0000100.supp.

15. See, for instance, Gordon B. Hinckley, “A Tragic Evil among Us,” Ensign 34 (November 2004): 59–62, https://www.lds.org/general-conference/2004/10/a-tragic-evil-among-us?lang=eng.

16. Carr, Shallows, 115–48.

17. Alexis de Tocqueville, Democracy in America (New York: Alfred A. Knopf, 1994), 321–23.

18. Kai Yuan and others, “Microstructure Abnormalities in Adolescents with Internet Addiction Disorder,” PLOS One, June 3, 2011, http://journals.plos.org/plosone/article?id=10.1371/journal.pone.0020708.

19. Carr, Shallows, 115–48.

20. Postman, Amusing Ourselves to Death, ch. 4.

21. Sherry Turkle, Reclaiming Conversation: The Power of Talk in a Digital Age (New York: Penguin, 2015), Kindle.

22. Turkle, Reclaiming Conversation, 34.

23. Chaim Potok, The Chosen (New York: Simon and Schuster, 1967), 278.

24. Charles Dickens, A Christmas Carol (Cambridge, Mass.: Candlewick Press, 2006), 35.

25. Bednar, “Things as They Really Are,” 20–21.

26. In Turkle, Reclaiming Conversation, “Two Chairs: Friendship.”

27. Karlyn Bowman, “The Decline of the Major Networks,” Forbes, July 27, 2009, https://www.forbes.com/2009/07/25/media-network-news-audience-opinions-columnists-walter-cronkite.html#12e7afc47a5f.

28. Dieter F. Uchtdorf, “What Is Truth?” CES Devotional, January 13, 2013, https://www.lds.org/broadcasts/article/ces-devotionals/2013/01/what-is-truth?lang=eng.

29. Gilbert K. Chesterton, Orthodoxy, ch. 2.

30. Saraswathi Bellur, Kristine L. Nowak, and Kyle S. Hull, “Make It Our Time: In Class Multitaskers Have Lower Academic Performance,” Computers in Human Behavior 53 (December 2015): 63–70, https://doi.org/10.1016/j.chb.2015.06.027.

31. Jaroslav J. Vajda, “Where Shepherds Lately Knelt,” included in Glory to God (Louisville, Ky.: Westminster John Knox Press, 2014), no. 120.

32. James Agee, A Death in the Family (New York: Penguin, 1938), 18.

33. Agee, Death in the Family, 19.

34. Dallin H. Oaks, “The Challenge to Become,” Ensign 30 (November 2000): 32–34; see also David Brooks, “The Shame Culture,” New York Times, March 15, 2016, https://www.nytimes.com/2016/03/15/opinion/the-shame-culture.html.

35. Eliza R. Snow, “O My Father,” Hymns (Salt Lake City: The Church of Jesus Christ of Latter-day Saints, 1985), no. 292.

36. “Letter to William W. Phelps, 27 November 1832,” in Documents, Volume 2: July 1831–January 1833, ed. Matthew C. Godfrey and others, The Joseph Smith Papers (Salt Lake City: Church Historian’s Press, 2013), 320.

37. Richard Lyman Bushman, Joseph Smith: Rough Stone Rolling (New York: Alfred A. Knopf, 2005), 560.

38. Russell M. Nelson, “Revelation for the Church, Revelation for Our Lives,” Ensign 48 (May 2018): 95.

39. John Muir, “The Discovery of Glacier Bay by Its Discoverer,” in Wilderness Essays (Layton, Utah: Gibbs Smith, 1980), 18.

40. Emma Lou Warner Thayne, The Place of Knowing (Bloomington, Ind.: iUniverse, 2011), 45–46.

 
