“Did you see what happened on Twitter today?” For Twitter’s users, posters and lurkers alike, that breathless question might define the feel of the platform. “Users” is apt, because addiction is built into the website, or “the hellsite,” as the damned citizens of the Republic of Twitter say. To those not on Twitter, accounts of the latest blowup are inscrutable and disturbing — byzantine descriptions of the court politics of some remote, psychotic civilization in the midst of civil war. To the academics and journalists whose working lives are most closely involved with the happenings in Birdland, the strife is of professional interest — if only to learn how to stay out of the way.
But it’s not all compulsive self-torture! What peruser of academic Twitter has not stumbled across some article or book at just the right time? And the platform’s promotional usefulness is undeniable — a Twitter account can be the difference between an article’s sinking into obscurity and its elevation into the higher atmosphere of fame. At least, it sometimes feels like it.
We asked three scholars to take the measure of Twitter. For the prosecution, Katherine C. Epstein discusses what she sees as the platform’s formal incompatibility with the vocation of the scholar, and Irina Dumitrescu warns about its degrading effects on the way we think. For the defense, Rafael Walker speaks to the platform’s real possibilities for establishing community — and even lasting friendships.
Time for a Long Pause
Attention. Unless you are extremely disciplined about how and when you use it, social media is very likely affecting your ability to concentrate for substantial periods of time on endeavors that are challenging and meaningful. These platforms are engineered to keep you checking your accounts so often that it becomes a habit. If you have ever absent-mindedly picked up your phone to check your notifications, or started typing the address of Facebook or Twitter into your browser without intending to do so, then you have become their ideal product — your attention already sold to advertisers. It is common these days to complain that students are unable to maintain their focus long enough to read a book or follow a lecture. But consider: Is your concentration what it was 10 or 20 years ago? Are you able to lose yourself in a book or a movie as often as you would like? Can you sit down and work on an essay or a research problem for an hour without interrupting yourself?
To be fair, Facebook, Twitter, and their ilk are not solely responsible for the difficulty many of us feel concentrating these days. Other websites and applications share their addictive qualities. YouTube (which some consider a social-media platform) and Netflix keep users hooked through streams of content adapted to their interests. News sites deliver craveable hits of novelty and excitement. Even checking email can become a habit that interferes with deep work. But with targeted content, tagging, direct messaging, constant notifications, and the dopamine boosts provided by likes and reshares, social media is particularly good at encouraging us to leave whatever we are doing and log on, again and again and again.
Depth. Once back on the platform, we are in a system that accustoms us to react and behave in ways antithetical to the habits of thought cultivated by higher education. Academic training aims, at least in principle, to teach people to think slowly, deeply, and carefully. We teach our own students to pay attention to context, evaluate sources critically, and consider counterarguments and opposing views, even if only to argue against them more effectively. In the best-case scenario, this kind of thinking produces ideas that move beyond what is already obvious to most people.
The trend on social media is in the opposing direction. Twitter, TikTok, and Instagram demand content that is short, fast, and shareable without context. It is true that some platforms, such as Facebook and LinkedIn, allow for longer posts and extended discussions. (Perhaps not coincidentally, these are generally seen as being for an older demographic.) But even platforms that allow for longer material are now heavily promoting short-form videos to serve decreasing attention spans. We may be able to see memes, sound bites, and hot takes for what they are: entertaining, provocative, disposable. But the more time we spend each day on social media, the more we are habituated to think in the forms it has given us. We might share a nascent idea before it is ripe, or offer a quick reaction to an issue before thinking it through. Brevity can be a virtue, but only when it is the result of discernment. What social media offers is the fast-food version of thinking.
Truth. Social media can be deceptively useful. We may feel that staying online keeps us informed. The problem is that the quality of information and discussion varies dramatically, even on a single platform. On Twitter, for example, you may be able to read commentary on current events from someone with hard-won expertise: a journalist reporting on a crisis on location, say, or a scholar who can put a new discovery in context. This is part of what makes that platform so appealing.
At the same time, those insights will be mixed in with unreliable or deliberately deceptive posts. Profiles you follow might share dubious claims out of habit or a desire to seem supportive. Many people retweet links to articles they have not read. Online, people regularly claim expertise they do not have, assume fake identities, or impersonate prominent figures. Malicious bots spread misleading claims and engage human users in conversation. Consider the efforts to which you go to teach your students how to base their research on reliable data and sources. Now ask yourself how much of your day you spend taking in questionable data from questionable sources, and what that is doing to your judgment. It is possible to curate a social-media feed that mainly delivers reliable, sound information, but that takes energy, attention, and time. Do you have so much of those resources that it is worth spending them in this manner?
Independence. Perhaps worst of all, social media encourages its users to embrace opinions popular in their circles without thinking them through. Sometimes this happens organically. People instinctively want to fit in with their peers, to be seen to be championing the right causes, adopting the correct interpretation, supporting the right people. Academics like to think that they are above groupthink, but they are just as liable to it as anyone else. Sometimes people fall in line because of social pressure. In some circles it is common to demand immediate public statements from people on hot-button issues, and to assume that silence is tacit support for the opposing viewpoint. Online, the costs of holding a different opinion from your peers are immediate and public.
In this context, it is difficult for people to consider evidence and come to their own conclusions. There may be a diversity of ideas on social media, but because algorithms amplify posts that elicit strong emotions, it is the most extreme versions of those ideas that are boosted — and fast. It’s not just that nuance, ambiguity, and complexity are sidelined on these forums, though they are. It’s still possible to resist the urge to take a position on an issue thoughtlessly. But it is harder to come to an independent opinion when many in your circle have already lined up on one side, when you have read their answers before you even knew what the question was.
Community. One of the great boons of social media has been connection. Scholars who might not have anyone at their own university with whom they can discuss their research can do so easily online. It is hard to forgo this benefit, especially if you have few colleagues in your geographic area. There are subtle costs, however, to moving scholarly communities online, rather than maintaining them through conferences, direct communication, or the now-ancient-seeming listserv. Put simply: the negative emotions which social-media algorithms foster and reward undermine the very bonds these platforms once helped create.
Envy has been an inevitable part of scholarly life since the beginnings of the university. There are angels among us who feel no twinge when they see a colleague thrive, who never feel sad because of someone else’s accomplishments. But most of us are not angels. These feelings are more manageable when we remember each other’s humanity. In a conversation, we get to know a real person. On social media, we see people become brands, advertising their new jobs, fellowships, and prizes.
What does it mean to build our communities on platforms we do not control? Social-media algorithms promote the incendiary, the sensational, and the outrageous. Imagine organizing a conference where the microphones are turned up every time someone says something critical, angry, or accusatory. Social media is also notorious for facilitating harassment, which disproportionately affects marginalized people. Traditional academic events are not free of harassment either, a problem scholarly associations have often been slow to deal with. But scholars still have much more influence over their own organizations than they do over the policies of Twitter or Facebook. As Jeffrey Lawrence has pointed out in these pages, “digital platforms not only reproduce the racial and gender biases of society at large; they often refuse to modify the algorithms that promote such biases if they believe that doing so will substantially detract from their bottom line.”
It has never been so difficult to pull back from social media. The Covid-19 pandemic has made it harder than ever to keep in touch with scholarly communities. Even before the pandemic there were reasons online networks seemed like an indispensable way to connect with colleagues: lack of research budgets, physical or practical inability to travel, or sheer distance. Add to this the growing pressure (real or imagined) on academics to become managers of their own public brands, constantly updating the world on their work.
But the use of social media remains a choice. The era of endless Zoom meetings is precisely the right time to try to save what is left of our ability to think with clarity, independence, and depth. If you find that being online is ruining your attention span, your relationships, or your ability to think and speak for yourself, consider taking a break. A long one.
Irina Dumitrescu is a professor of English medieval literature at the University of Bonn.
A Vindication of Academic Twitter
Joyce Carol Oates has flippantly suggested that professors who tweet only embarrass themselves. Less flippantly, David Bromwich, in a recent interview with The Review, asseverated that tweeting “goes against the vocation of being a scholar” and predicted that “it’s going to reduce the prestige of professors” by making us seem “more like everyone else.” Assessing the platform with a more even hand, Irina Dumitrescu — in an interview stemming from her earlier essay in these pages condemning “professorial groupthink” — worries about the ease with which such avenues as Twitter enable toxic academics to disseminate their toxicity: defamation and other kinds of abuse.
For my part, I find the risks exercising Twitter’s critics much less concerning. For one thing, companies have been collecting and profiting from our personal information for a very long time, and Twitter, whether under Musk’s leadership or someone else’s, is anything but unique in this regard. We all voluntarily offer up what amounts to reams of information about ourselves with virtually every keystroke and click. Moreover, demagogues have shown that they will find venues for their rabble-rousing, no matter what (hence the proliferation of social-media platforms after Trump’s ejection from the site).
As to the toxicity question — in my view, the most legitimate criticism of Twitter — Dumitrescu provides the counterargument for us: “narcissists and sociopaths,” as she points out, will exist no matter what, and they will find some way or another to inflict the harm that their egos crave. I would round out her observation by noting that Twitter has both a “mute” and a “block” option (both of which I use liberally), tools that help users avoid injurious or annoying people more efficiently than is possible in face-to-face interactions.
Finally, Oates’s and Bromwich’s fear that tweeting academics risk embarrassing themselves or — Heaven forfend! — seeming too much like the common folk demonstrates how easily the adjective elite can slip into its ugly cousin, elitist. To me, the courage to descend from the ivory tower and expose the full range of your humanity is admirable, a welcome corrective to the snobbish aloofness that so many of us have learned to cultivate. Besides, only small, ungenerous minds would impugn someone’s scholarship solely on the basis of a dog photo or a lighthearted Twitter rant against mayonnaise. Such people ought not be pandered to.
Maybe academic Twitter isn’t as bad as many have suggested, but is it any good? I’m convinced that it is, and here’s why.
The professional networking benefits are unparalleled, and this advantage alone makes Twitter worth it. Precisely because of its open-air format, Twitter brings people to your attention, and you to theirs, with whom you would be very unlikely to have sustained contact otherwise. Purely through tweeting, for example, I’ve accrued important professional allies, received invitations to coveted panels and other speaking engagements, and been offered excellent publication opportunities (including a prospective publisher for my first book).
Those are some of the reasons I encourage friends and mentees not only to join the site but also to tweet regularly. It’s not enough to be a “lurker,” the craven voyeur who logs on to take but never to share. That habit, while safer, forfeits one of the greatest benefits of the site — its capacity to put you on the radars of potential collaborators or editors. You can’t appear on anyone’s radar if you are invisible. To be seen, you must exhibit.
I understand that many academics, especially more established ones, may find the platform’s rampant incivility and half-baked diatribes enough to tilt the scales in favor of abstention. This is no trivial concern. The internet is, generally, an uncivil place. Add to that the psychological finding that it requires three good cognitions to counteract just one bad one — that pleasant inputs to the brain have only a third of the staying power of bad ones — and it’s easy to grasp why some struggle to shake the Twitter jitters.
But the site can also yield fulfilling, long-term connections, and it is worth asking whether escaping the transient discord one encounters on Twitter — discord scarcely less prevalent elsewhere in academic life — is worth depriving oneself of the possibility of lifelong friendships. After all, we do not apply this logic in other areas of our lives. We have one-night stands, go on dates, and get married fully aware that so much could go wrong but optimistic that so much could go right. Sociability is risky, wherever it happens.
But we have a choice in how we approach those risks. We can take Sartre’s cynical view that “hell is other people” (indeed, “hellsite” is a favorite epithet for Twitter’s detractors). Or we can leap headlong into social life, social media included, with Tennyson’s sanguineness, accepting that “’Tis better to have loved and lost than never to have loved at all.”
I have chosen the Tennyson way, and I don’t regret it one bit. Despite Twitter’s wet blankets (most of whom I have blocked or muted), I have forged relationships that I couldn’t imagine having forged otherwise. When I accepted my job in New York, for example, I had plenty of friends in the city but knew virtually no gay academics in the area, and, much as I loved my existing friends here, I yearned for more simpatico company — friends who would understand what I meant when I referred to the overserved 20-something at the bar as Lydia Bennet.
Then 2020 threw an enormous wrench into the prospect of hanging out with strangers. But Twitter helped to pry out that wrench. I connected with two brilliant gay English professors around my age, we met for drinks and French fries in real life, and they are now so close to me that I can hardly remember not knowing them. Since then, I have continued to meet people from all over the country — for meals, dancing, Zoom-based happy hours, rooftop hangs — and even started a remote reading group focusing on Moby-Dick and Middlemarch with two scholars (now cherished friends) whom I had never met outside Twitter but now see almost weekly.
My story isn’t unique. I know many other academics who have developed friendships on Twitter that might not have stood much chance without the site. I have also come to understand that Twitter has been a lifeline not only for scholars socially starved by pandemic-related restrictions but also for those belonging to groups historically shut out for all kinds of structural reasons. For disabled scholars, the accessibility benefits afforded by Twitter defy enumeration. In discussing these benefits with two scholars of disability studies, Jason S. Farr and J. Logan Smilges (both initially Twitter friends, incidentally), I learned that Twitter enhances access significantly for hearing-impaired, mobility-impaired, and neurodivergent people. And it provides a forum for scholars with disabilities to build communities — in which they can, among other things, figure out ways of navigating our deeply ableist profession.
Scholars from other underrepresented backgrounds reap similar benefits from the platform. Often demographic loners in our departments (if not our entire institutions), many of us in the minority find on Twitter a source of reassurance that we are not alone in the profession and that we belong here.
Graduate students often feel less daunted corresponding with more-senior scholars on Twitter than they might by email, and, when they tweet civilly, they have the potential to form career-launching connections. Scholars at small, far-flung colleges — where, within a prohibitively wide radius, they may be the only person in their specialization — are better able to keep up with their scholarly communities.
Trained to be critical, and to believe that enthusiasm is weakness, academics are endemically uneasy about expressing fondness for anything, much less a corporatized site enjoyed by the masses. The profession’s tacit norms conspire to turn each of us into a Pococurante, the world-weary Venetian senator of Voltaire’s Candide, almost nihilistic in his bloodless detachment from the world’s delights. But, at least so far as Twitter is concerned, I refuse this perverse posture. A resource that unites me with inspiring, like-minded people from across the globe; that lets me evade voices noxious to my well-being; that delivers collaborations and contracts to my doorstep; that opens the profession to people unjustly excluded from it and from whom I want to hear — that, to me, seems about as good as it gets. So, until the site ceases to pay such prodigious dividends, you can continue to tweet me @raf_walk.
Rafael Walker is an assistant professor of English at Baruch College of the City University of New York.