Author’s Note: The busy summer months have now kicked in, and, predictably, I’m falling behind on my Substack. Currently, I’m about halfway through a post on Top Gun: Maverick. I hope to have it out next week. Meantime, here’s a repost of something I wrote in January 2021 on a different platform. Recently this piece came up in conversation, and I thought it worth cataloging here. Many thanks for your interest and, indeed, your patience!
Since the first iPhone appeared in June 2007, it has become fashionable to lament the smartphone’s deleterious influence on a number of cultural habits, practices, and traditions. People today, especially young people, are less likely to be happy, less likely to have sex, and less likely to engage in deep reading. A number of prominent academics and journalists have traced (if not reduced) the problem to the ubiquity of smartphones. For Nicholas Carr, whose book The Shallows: What the Internet Is Doing to Our Brains (2010) was a finalist for the Pulitzer Prize, smartphones inhibit the acquisition of knowledge and “diminish” our intelligence by constantly interrupting attention. For example, he cites a recent study that found an inverse relationship between smartphone proximity and student test scores: the closer the test taker’s smartphone, the worse her score on the test.
Carr has had his share of critics, but, as time has passed, many of his concerns have been borne out. For example, recent studies show that time spent online has doubled since the mid-2000s, and reading scores on the SAT have slipped accordingly. Other data is even starker: the number of high school seniors who read a book, magazine, or newspaper on a daily basis declined by almost 50 percent from the 1970s to 2016—a number that has undoubtedly worsened during the COVID-19 pandemic of 2020 and beyond.
Smartphone apologists have countered that, even if digital devices do contribute to the decline of the printed word, they support other media. For instance, digital cameras have fostered an unprecedented interest in photography, with the number of pictures taken annually now exceeding one trillion. Similarly, smartphones have led to a surge of “podcasts,” episodic series of spoken-word digital audio files to which users can subscribe. Indeed, some commentators have argued that podcasts, far from marking a technological breakthrough, are actually just updated versions of an older media form—namely, the radio shows of the pre-television era.
Yet, even in these cases, problems remain. It appears that podcasts are particularly popular with men whose average age is around 40, and young people do not seem especially taken with the medium. Likewise, the rampant use of smartphone photography has contributed to a veritable crisis in teenage self-esteem. As Charlotte Markey explains, “Body image research has found that any appearance-related information can be ‘triggering’ for individuals who already experience body dissatisfaction.” But to what extent are these phenomena related? Are increasing rates of teenage depression, narcissism, and suicide somehow connected with the decline of reading? Are young people more unhappy because they’re no longer perusing Ivanhoe (1819) in their downtime?
Such a thesis, at least on the face of it, almost sounds ridiculous. But what if reading is simply one of a number of related activities that are dying due to smartphone usage? Take movies. Since I’ve been at Villanova, I’ve carved out a reputation, however modest, as a cinephile. I developed a course called “Theology and Film,” co-hosted a small conference on the same subject, and have published fairly widely on the topic, including books on Terrence Malick and Martin Scorsese. And while I am something of a film buff, who knows his John Ford from his John Cassavetes, I did not gravitate toward the academic study of religion and film by accident. Rather, I was hoping to find a way of linking scholarly questions in philosophy and theology to popular movies that might resonate with undergraduate students. By no means has this been a fruitless venture, but, over the last several years, unexpected challenges have appeared. Whereas a decade ago I could assume a measure of cinematic “common ground,” I have learned that this is no longer the case. Gone are the days when cultural literacy required that one watch, say, The Godfather (1972) or The Big Lebowski (1998). At first, I reasoned that this was a historical accident. By the mid-2010s, streaming services such as Netflix were cultural presuppositions, and it seemed, at least for a time, that students preferred TV series such as Black Mirror (2011-) to proper films. But I’ve had to revise this thesis yet again. It’s now increasingly rare to find a student who has watched a TV series, apart from the occasional Game of Thrones (2011-19) diehard. It has gotten so bad that, during the Fall 2020 semester, I offered a “bonus” assignment to my undergraduate students: watch the hit Netflix series Cobra Kai (2018-) and write a brief analysis of how it contrasts pre- and post-Internet Western culture. No one took me up on the offer, though a few students did take a look at the series. One of them said she plans to finish it.
Notably, my anecdotal observations are in line with research on this subject. Already in 2015, a study done by Nielsen Co. led to a striking conclusion: “Smartphones are winning and traditional television is losing.” The situation has only intensified in the ensuing years. A more recent Nielsen report, derived from data culled during the first quarter of 2020, demonstrates how lopsided this comparison has become: “To give some context to the extent to which digital has supplanted traditional TV for youth, this latest report indicates that 18-34-year-olds spent almost three times as much time using apps and the web on smartphones alone than watching traditional TV.” And this is to say nothing of the movie industry, which was already dying before the pandemic of 2020. In a 2019 article in the New York Times, Kyle Buchanan asked a number of Hollywood insiders whether movies would survive the next decade. One of the most discerning answers was given by comedian and screenwriter Kumail Nanjiani:
I was at a bar with a friend who directs big movies, and while we were in line for the bathroom, he was saying that movie theaters were going to go away. He was like, ‘Kids don’t watch movies, they watch YouTube.’ Which I thought was crazy. So he goes, ‘Watch this.’ There was a girl in front of us in line, and he said, ‘Hey, excuse me, what’s your favorite movie?’ And she said, ‘I don’t watch movies.’ Just randomly, he picked someone — and she was like 25, she wasn’t a child or anything. We were like, ‘Well, do any of your friends watch movies?’ And she said, ‘Not really.’
Yet, while much ink has been spilled over the existential threat to movie theaters, not to mention the stranglehold that “theme-park movies” have on the medium, it appears that few have dug into the deeper issue mentioned by Nanjiani — that young people actually prefer YouTube to cinema. In other words, it’s not that the cost of going to the cinema or paying for a streaming service is forcing young people to migrate to YouTube; they simply don’t want to watch movies. But why? And what are the implications?
These are weighty questions, which could be analyzed in a variety of important ways. With that qualification, I will sketch out just a few provisional responses. First, it’s worth considering that YouTube’s basic function and interface align well with the cultural presuppositions of many millennials and iGenners. Though one can watch movies on YouTube, most prefer to flit around the site, clicking (often haphazardly) on an assortment of videos. In a single visit to YouTube, one can watch videos about installing lace wigs, the meaning of Eucharistic adoration, the ten weirdest injuries in NFL history, how to have sex on a plane, and what you need to build your own off-grid cabin. Needless to say, there is no crimson thread linking such heterogenous content. Other than generating ad revenue for YouTube (which, in fact, is a subsidiary of Google), these videos are ostensibly shorn of any larger meaning or final coherence. In this sense, YouTube is a paradigmatic example of what philosophers have dubbed “the postmodern condition,” in which all walks of life can be converted into a source of entertainment and, in turn, income. That YouTube can be easily accessed via an app—dipped into and, just as suddenly, out of—only heightens this aspect. If movies and, to a lesser extent, television appeal to the modern individual, who unwinds with a show or two after a 9-5 workday, YouTube abandons this neat dichotomy. It can be used, or discarded, as the mood strikes.
Second, and following on from the previous point, YouTube embraces postmodernity’s oft-mentioned incredulity toward “metanarratives.” Indeed, one might wonder if YouTube is not incredulous toward “narrative” as such. After all, movies and TV shows feature a handful of themes that are, in turn, organized and worked out within a plotted structure. As Gustav Freytag argues in Die Technik des Dramas (1863), dramatic narratives are inherently systematic, originating with an exposition and building towards a resolution. In contrast, many of the most popular videos on YouTube are “vines,” compilations of humorous and occasionally scatological clips that are strung together in random fashion.
The difference between “story” and “clip” is obvious: the former is a coherent whole, arranged so as to present certain truths or raise certain questions about the human condition or existence writ large; the latter does not belong to a logical whole but instead captures a fleeting moment, which has no significance other than entertainment. It turns out, then, that Nicholas Carr’s fears about the influence of smartphones on reading habits were not broad enough, because the same phenomenon is now at work in film. Traditional movies and long-form TV series are proving too methodical and protracted for those raised on YouTube, Instagram, and so on.
The implications of this situation are manifold, but I want to close with a Kierkegaardian observation. In a number of places in his authorship, Kierkegaard describes the human self as a synthesis of dialectical elements, including “necessity” and “possibility.” What he means, among other things, is that the self is shaped by the interplay of embodiment-finitude on the one hand and imagination-infinitude on the other. For example, an athlete is a combination of her physical talent and effort as well as her ability to think creatively and to set future-oriented goals. Indeed, for Kierkegaard, it is essential that the self strive to keep its dialectical structure in balance and harmony, lest the self lapse into disunity and, finally, despair. It is in this connection that the turn toward YouTube becomes especially concerning. Older forms of art require and even facilitate a kind of self-development: to read a novel is to “sit” with the text in and over time, to “curl up” and to “get into it”; however, reading is also an activity of psychological projection and, ideally, of existential revelation. Something similar could be said about movies. After all, to use a common example, what is “family movie night” but a social gathering in which various persons simultaneously engage with an ordered imaginative representation? With YouTube, however, it is different. The platform is sheer possibility, requiring little physical commitment from its user. Of course, this contingent variety is precisely what makes YouTube and other smartphone apps convenient and fun, but, if Kierkegaard is right, they ultimately bear many dangers—dangers that are already beginning to be felt and seen, certainly within the literary and cinematic worlds but, it seems increasingly likely, within the self too.
June 23, 2022
Good points. It’s challenging for many people to strengthen their intelligence when ever-increasing amounts of information fight every day for space in our already-limited attention spans. I think not merely smartphones but excessive engagement with YouTube, TikTok, Instagram, etc. stifles concentration, which, with respect to the Kierkegaard bit, may breed higher rates of angst, anxiety, and despair—perhaps similar to experiences of withdrawal from highly addictive substances? The appeal of social media, in general, seems born of an increasingly pervasive fear of missing out. Paradoxically, one’s heightened engagement does not squash but rather promotes the fear—similar, e.g., to the workings of nicotine addiction. My fear is that excessive interaction with fragmented media leads to more scattered thought processes and thus weaker interpersonal skills and less deep connection, among other socio-existential issues.
So, I agree that “…Carr’s fears about the influence of smartphones on reading habits were not broad enough...” I would argue (non-exhaustively) that the culprits behind diminished intelligence and increased distractibility are the fragmented contents of social media, coupled with the pervasive need or expectation for instant but brief communication.