by Mark Bauerlein

In Fall 2017, the president of Wesleyan University, Michael Roth, invited me to speak as part of a “difficult conversations” initiative. Wesleyan is a determinedly left-wing campus, and Roth saw the occasional conservative visitor as good for the intellectual climate. We were eight months into the Trump Administration, and I’d written pieces for Vox, CNN, the New York Times, and other liberal outlets that suggested I might praise President Trump in a way that would rise above naked partisanship.

I decided on a presentation of Donald Trump as a traditional American rogue figure, a model of Emersonian nonconformity, an outlandish character in a lineage of comic renegades. No other individual in my lifetime mobilized the entirety of respectable opinion in America against himself, I would tell them, and that very fact deserved analysis. Everybody in the elite denounced him—a strange uniformity for a social group that professes its admiration for thinking outside the box. Hollywood, Silicon Valley, the swamp, the art world, the media, academia . . . they hated him with a passion that revealed more about themselves than it did about the object of their enmity. He had to have something going for him to evoke such a monolithic pageant of slurs.

I laid this out before an audience of 200, and the faculty in the room more or less got the tongue-in-cheek element (though they asked some tough questions about Trump’s sexism).

Not the students, though. This wasn’t funny at all. They went after Trump’s racism and barked objections to “the wall.” A couple of them stormed out and slammed the doors after I answered their challenges. I had mentioned Emerson, Thoreau, and Walt Whitman in my short talk—classic voices that hailed the iconoclast and the irreverent—and the very first question I got chided me for selecting those three white males.

I wanted to reply, “Who ever taught you to think in this way?” but I didn’t want to insult my hosts. Instead, I went with a little sarcasm: “Oh, yes, it’s really bad, especially when Emerson says that society is in ‘conspiracy’ against the ‘manhood’ of its members” (“Self-Reliance”). I stressed the word “manhood” and waited for the questioner to respond. He remained quiet, so I gave the crowd the historical fact that the “manhood” element in Emerson didn’t stop thousands of strong women in the 1840s from reading him and recognizing, “Yeah, that’s me!” The questioner still didn’t reply—had he read one word of those three men before?—so another student shouted, “But were they black women?!”

It was becoming ridiculous. When I praised President Roth for his support of open debate (which he had declared in a Wall Street Journal column announcing his visiting speaker program), a student sitting right behind Roth announced that he didn’t think much of the op-ed, adding, “But then, I wished that the Vietcong won back in the [’60s].” As a professor in the room reminded him that the Viet Cong had won the war, I looked out at the students, baffled once again about why these kids attending a great school like Wesleyan and clearly on the way to success were so bitter and humorless.

Afterwards, a tall young man, part black and part Hispanic, came up to me and in plaintive tones said that young people of color were dying, yes, dying, from what was happening. He wasn’t angry; he was despondent and helpless. I gazed at him and tried to figure out what motives lay behind the overt distress at that otherwise relaxed after-the-event moment. He was sincere; he really meant it; he genuinely wanted to acquaint me with a world of hurt that I had missed, but he didn’t get specific. He wasn’t arguing; he was emoting. Pain was out there, he insisted, smothering and pitiless, and innocent young men and women were falling. I needed to accept that. This was no time to be ironic or to cite irrelevant writers from long ago. The exchange lasted only a couple of minutes, and it wasn’t contentious. It was mystifying.

My interlocutor walked off feeling no better than before. His expression of angst didn’t relieve him at all. I appeared to him as just another obtuse character comfortable in my “privilege” (which another student mentioned in the Q&A), and to his credit he had tried to open my eyes and ears, not attack. That he couldn’t be any more precise than “People are dying!” was my problem, not his. I didn’t tell him that while he was attending an elite private school whose professors and administrators showed an exquisitely solicitous regard for his feelings, I had never attended a private school of any kind in my life, had worked my way through college, and had had teachers who usually couldn’t attach my name to my face.

Instead, as he reiterated the affliction under way, I actually said something like, “Where?” or “Who?”—only confirming for him the exasperating incognizance of those in power. President Roth had conceived the dissenting-speaker program as a benefit for him and his peers, but when people are suffering so, intellectual diversity doesn’t count for very much. He was at Wesleyan, one of the topmost schools in the country, but academic ideals paled before the grand anguish he saw near and far.

Something has happened to Millennials, a change deep in their hearts. When they first became a national story in the mid-2000s, surveys and commentators tallied their habits, goals, and tastes, and drew a profile of them that every older American should envy. A 2000 volume bore the title Millennials Rising: The Next Great Generation, and 10 years later the Pew Research Center summarized them as “confident, self-expressive, liberal, upbeat and open to change.” They went to college in record numbers, ran circles around their elders when it came to digital tools, and helped elect our first African American president (they favored Obama 2-1). They worked hard, but insisted on a sensible work-life balance—no soul-killing, nine-to-five cubbyhole jobs for them! And when it came to social attitudes, Millennials rose up as the most tolerant generation in U.S. history, “tolerance” defined in one such survey as willingness to allow gays, Communists, atheists, militarists, and racists free speech in the public square.

How, then, did Millennials end up scoring as the most intolerant of age groups in a more recent survey, the American Worldview Inventory 2020? On the very virtue for which Millennials were hailed but a few years back, tolerance, the survey found a stark divergence by cohort:

[The] findings show Millennials—by their own admission—as far less tolerant than other generations. In addition, they are more likely to want to exact revenge when wronged, are less likely to keep a promise, and overall have less respect for others and for human life in general.

Those traits were echoed in another Pew study entitled “Trust and Distrust in America.” In it, nearly half of Millennials (46 percent) qualified as “low trusters” and only 11 percent as “high trusters” (30- to 49-year-olds scored 39 and 18 percent, respectively, and 50- to 64-year-olds scored 31 and 25 percent). When asked if they trusted figures in various areas of society, younger Americans likewise expressed much lower confidence. While 91 percent of Americans over 50 trusted the military, 18- to 29-year-olds came in at 69 percent. Religious leaders earned the trust of 71 percent of people 50 and up, but only half of the younger crowd. Business leaders earned 50 percent trust from the old but only 34 percent from the young.

In the American Worldview Inventory cited above, we find a related distinction between old and young that goes further than mistrust. There, Millennials acknowledged more than any other age group that they were “committed to getting even” with those who did them some form of injury. The difference was stark: “[I]n fact,” the summary stated, Millennials were “28 percentage points more likely than Baby Boomers to hold a vengeful point of view.”

And they’re not ashamed of those sour opinions, either. When they’re not in the mourning mode of our Wesleyan student, they’re angry, and they’re willing to act upon it, too. That came through in a July 2020 poll of registered voters by Morning Consult and Politico that included questions about cancel culture. Asked whether cancel culture had gone too far, not far enough, or neither, only 36 percent of respondents ages 18 to 34 answered “too far,” barely more than a third of the age group and a dismayingly low figure. Older cohorts showed greater concern: 41 percent of 35- to 44-year-olds believed it had gone too far, as did 47 percent of 45- to 64-year-olds and 59 percent of those 65 and older. Worse, nearly one in six of the younger respondents (16 percent) said that cancellation hadn’t yet claimed enough victims (the wording was “not far enough”), a bit higher than the next older age group (14 percent) and much higher than the other two (seven and five percent).

That “not far enough” number is even worse than it seems, for two reasons. First, 23 percent of 18- to 34-year-olds answered “neither”—that cancel culture had gone neither too far nor not far enough, implying that the current level of cancellation was just fine with them. So if we add together the rates for not-far-enough and currently-just-fine, the Millennials who wished to restrain or reverse cancel culture (36 percent) were outnumbered by those who didn’t (39 percent). Second, another 25 percent of the age group answered “don’t know/no opinion”—an apathy that favors the aggressors.

Given this breakdown, the advantage that the last group (the indifferent ones) gives to extremists helps explain the remarkable advent and persistence of cancel culture. It’s a pattern one sees all the time: in episodes of protest, an inspired minority often influences the course of events far more than its numbers would suggest. There is the “chill” phenomenon, of course, whereby one case of censorship and punishment compels others to stay quiet for good, but there is more to the power of the minority than that. Squeaky wheels get the grease.

If, for example, three students complain of a teacher’s prejudice while the other 32 students say nothing about it, the few complainers nonetheless take priority. The “Law of Group Polarization”—the tendency of groups in which most members hold to a particular opinion to drift toward more extreme versions of that opinion as they discuss it—is an instance of the same phenomenon. The law was outlined in a 1999 paper by Chicago law professor Cass Sunstein, who examined certain deliberative bodies such as trial juries and found the phenomenon repeated again and again.

In product liability cases, for instance, when an individual sues a company whose product resulted in an injury, if the jury generally regards the company as liable but 10 members initially favor a modest award to the plaintiff while two push for a stiff award, in the deliberations to follow the two will pull the 10 far in their direction. The final judgment will end up much higher than the average of the awards the jurors favored at the outset. The psychology is perfectly comprehensible. The extreme members feel more strongly about their opinion than the others. That’s the way extremists are. But the moderates basically agree with the extremists about (in this case) the culpability of the manufacturer. The difference lies in the intensity of feeling, and so the moderates don’t regard concessions to extremists as a compromise of principle. They are more flexible, willing to defer to people more passionate about a principle they share.

Thus the 16 percent of young Americans who want cancel culture to move farther and faster is worse than it seems, particularly in light of the 23 percent who regard cancel culture as at just the right level and the 25 percent who don’t know or don’t care. If young social justice warriors find someone who has raised objections to homosexuality on Facebook or once wrote on Twitter that Americans of African descent are low-IQ, the extremists spring into action, coaxing peers to sign petitions, send emails, and make phone calls to the culprit’s employer. The moderates may feel uncomfortable about getting someone fired, but they don’t like homophobia and racism, either, and so they sign and send; the pressure of the extremists works, and the ranks of protesters swell. What starts as distaste over a politically incorrect opinion turns into a campaign of intimidation. The moderates can’t resist joining in because the extremists paint the original crime in the lurid colors of historic oppression, which calls for a hammer response, and the moderates don’t want to argue over the severity of the original misdeed. In this hothouse zone, it’s impossible to say, “C’mon, it was just a stupid remark; we all do it sometimes.” They will add their signatures to a petition with 1,500 names already on it to get a person fired for uttering an un-P.C. opinion on Facebook. And so the extremists set the tone. Their “idealism” and commitment bring a fair number of their more easygoing peers on board.

Cancel culture proves that Millennials have moral fervor. They aren’t the sly relativists Allan Bloom warned of in The Closing of the American Mind. What they lack, instead, is a solid education in history, politics, religion, literature, languages, and art. The mentors didn’t give it to them: no real heroes, no gods, no glorious past, no great nations, no meetings with the beautiful and the sublime. They disallowed identification with vivid characters and larger-than-life personages, teaching critical thinking instead. No patriotism, no faith, no reverence—those are naïve, old-fashioned, uncritical. Let’s demystify and politicize and theorize, rather; just what rising adults need in this dynamic, disruptive 21st century, right?

The result isn’t, however, a youth of critical scrutiny, skeptical of absolutes. It is the opposite, a moral vision that is absolute and simplistic, a Millennial outlook that neatly divides sociopolitical sins (racism, voting for Trump . . .) from utopian wishes (“No person is illegal,” “Love is love” . . .). And it springs right out of the vacancy in their hearts and minds that the mentors failed to fill. We robbed them of greatness and meaning and purpose and transcendence. We gave them no role models. The universal desire for those inspirations didn’t go away, but nobody supplied them. I’d be angry, too.

Once, in the late ’80s, while visiting my mother in Palm Springs for a few days, I stumbled one afternoon upon a jazz concert in the park downtown. It was open to the public, casually organized, not too crowded, and I strode right in. People in their 60s and 70s sat on the grass and in lawn chairs, strolling back and forth from their spots to the concessions building while a band performed onstage. At the time I was just leaving behind the rock ’n’ roll and New Wave music of my early 20s and getting into jazz (my dissertation advisor was a Miles Davis and Horace Silver nut). The timing couldn’t have been better. This is fantastic, I thought, as I found an empty picnic table at the side of the platform. I listened and looked more closely at the players. Wait a minute, that’s Dizzy Gillespie! You couldn’t mistake the cheeks. Yes, that was he, on a Palm Springs lawn, of all places, where you didn’t even need a ticket. No one was dancing, no one clapping and calling his name, but there he was performing, like any other musician. I looked around and saw nothing else to signal that extraordinary fact. The blue sky, the palm trees, the meandering people, mountains in the distance . . . they were the same as always. The music was nice, a small group in a smooth groove, if I recall, not on a hard beat or hitting high crescendos. The identity of the one man had changed everything, though—his identity and the past he bore in his person. I felt it immediately; the sight struck me as a bit of history shooting into my trifling little life. Dizzy Gillespie was more than just an aged man blowing great sounds on a horn. He had played with Charlie Parker, toured the globe for the State Department, written “A Night in Tunisia,” and now he was 50 feet away and playing for real.

It got better. After 10 minutes the musicians took a break, and Gillespie wandered off to the side, came forward to the picnic table, sat at the other end opposite me, laid out a sandwich, and started to eat. He nodded at me, and I nodded back, hoping not to do anything to annoy him. He seemed content, with an easy air, while I was imagining a thousand legendary occasions buried in his memory. The other people in the crowd didn’t seem to notice; nobody sidled up for an autograph. Didn’t they know what was happening here? Maybe they had already lived through too much history to be impressed in the way I was, but for me it was confirmation.

The encounter had a reinforcing effect, though I couldn’t recognize it at that moment because Gillespie’s presence was too overwhelming. I see it now through the lens of a lifetime of teaching. Here was a figure of historic dimension presented to a twentysomething who had none, and it was great. He carried the weight of time—and did so with a smile. It gave me a feeling of destiny, not my own, really, but destiny “out there.” For an hour, at least, the time and place were not routine and uninspiring. I wanted to go home and listen to “After Hours” with him and Sonny Rollins and Sonny Stitt, and to read more novels, too, and some history and philosophy, and draw on a little more of the greatness that he carried all the time.

It’s the kind of meeting that should happen to 20-year-olds as often as possible. Gillespie made the past reverberate, and every youth deserves such encounters with the legends. When they brush by, the air thickens, your reflexes slow, and the course of time takes a pause.

I have a friend whose father was a famous intellectual and scholar, and he used to take my friend with him now and then to international events. My friend told me recently that in 1971, when he was 12, during a symposium in a palazzo in Venice, the father drew him forward to an old man sitting in a chair with some notes in his lap. They hesitated a few feet away. “Pardon me, Maestro,” the father said, “but let me introduce my son.” A smile and a nod, and my friend meekly nodded back. They turned and walked away, as the father muttered, “You just met Ezra Pound.” He remembers it vividly. He’s Jewish, and he knows what Pound said of the Jews during the War, but those hands wrote “The apparition of these faces in the crowd” and piloted more than anyone else the progress of Modernism, one of the standout explosions of literary talent in all of European history. Moral or immoral, villains or victims, such figures bring to life romance and greatness and mystery.

I cannot stress too much how salvific they are to a young sensibility tied to an iPhone and never off Instagram, one that inhabits a universe with no enchanted objects, a history with no climaxes and turning points, just regular ups and downs, a tradition with no masterpieces, only a heap of entertainments. These young people don’t know that experience can be so much better, that they don’t have to settle for crude consumptions, not when there is an immortal in Palm Springs with his horn, a poet in the hotel with his papers, or any of the items on a Western Civ syllabus.

A young man stopped in front of my townhouse the other day blasting his pop music with the top down, and I said to him, “Do you know that if you drove through your neighborhood playing Beethoven that loud you would be a renowned figure, instead of a guy just like a million other guys your age?” He gave me a puzzled look that implied, You some kind of a nut? and drove off, but one can hope that the notion stayed with him. There is greatness to be enjoyed, he should be told over and over; there are talents to revere; the past contains wonders.

Without that belief, our 30-year-olds are disappointed, uncertain, pessimistic, and resentful. It’s a natural response. We can’t all be liberal ironists skating through life free of deep commitments and supportive deities. They want a meaningful existence, and they look for it in false gods of social justice and the like, unable to find it where they should, in church, in tradition, in humanitas, in country, and in role models. As we watch them march against systemic racism, science-denial, and the rest, let’s attribute those errors to what didn’t happen in their classrooms 10 years earlier.

– – –

Mark Bauerlein is a senior editor at First Things and professor of English at Emory University, where he has taught since earning his Ph.D. in English at UCLA in 1989. For two years (2003-2005) he served as director of the Office of Research and Analysis at the National Endowment for the Arts. His books include Literary Criticism: An Autopsy, The Pragmatic Mind: Explorations in the Psychology of Belief, and The Dumbest Generation: How the Digital Age Stupefies Young Americans and Jeopardizes Our Future. His essays have appeared in PMLA, Partisan Review, Wilson Quarterly, Commentary, and New Criterion, and his commentaries and reviews in the Wall Street Journal, Washington Post, Boston Globe, The Guardian, Chronicle of Higher Education, and other national periodicals.
