There’s a scene in the Norwegian movie The Worst Person in the World in which one of the main characters, a comic book artist, is being interviewed on TV about his early work. The interview quickly turns into an inquisition. The comic that had made the middle-aged artist famous decades before was classic 90s-style underground pop culture: raunchy, rude, vulgar, obscene, and deliberately offensive, as well as trenchant and emotionally honest. Back in the cartoonist’s youth, the genre was embraced by artists, intellectuals, and others on the disaffected political left as both authentic and transgressive, and demonized by religious conservatives as decadent, debased, and culturally corrosive. But in the interview, the cartoonist is pilloried for all of these things from the left. His interviewers berate him for his crude sexualization of women. Doesn’t he understand that some of his readers could be victims of incest and rape? He’s using his male privilege to mock those weaker than him, they scold.
The cartoonist, exhausted from years of answering these same charges, loses his temper. After all this time, the allegations still perplex him. As an artist, his moorings remain what they’ve always been. But, like Don Draper, he has watched the world change radically around him.
The film is hardly the first to make the observation, but it’s an effective portrayal of how the cultural left and right have swapped places. The censors who once came with pitchforks after Robert Mapplethorpe, “Piss Christ,” 2 Live Crew, and “The Last Temptation of Christ” in the name of Christian family values are now the scolds forming online outrage mobs over Dave Chappelle and Dr. Seuss in the name of social justice. The conflict is exactly the same; only the characters have been scrambled and the script updated.
How did this happen?
Last week I wrote about political activism and how its professionalization has generated a whole range of corporate interests on the part of activist non-profits — interests that are entirely separate from those of the constituencies the activists purport to champion, and in some cases are in direct conflict with them. This dynamic has compounded the dysfunction of politics and government and made it virtually impossible to solve our most vexing social problems.
What I didn’t get into was how political activism became such a booming industry in the first place. It’s this process — the growth and development of a vast technocracy of professional demand-makers — that has warped the incentive structures of just about every industry populated by the professional managerial class, from Hollywood to Washington, DC, leaving in its wake a world that, in Bari Weiss’ words, has “gone mad.”
Over the last decade and a half, the proportion of Americans over 25 years old with a bachelor’s degree has increased by a little under five percentage points, continuing a steady climb in college graduation rates that began in the 1990s. (That climb has seen a brief reversal since the beginning of the pandemic. It remains to be seen whether the decline becomes a secular trend, but in any case it affects the elite stratum of four-year private colleges and universities least of any higher-education sector.)
On the surface, the reasons more and more Americans have chosen to go to college seem pretty straightforward: Traditionally, college graduates have made more money and been more employable than high school graduates. That reality became only more pronounced as high-wage, blue-collar manufacturing jobs disappeared at the end of the last century.
But that reality is, on the one hand, more complicated than it appears, and on the other, less true than it used to be. As for money, it’s true that, as a group, college graduates have higher lifetime earning potential than non-graduates. But there is so much overlap between the two groups that it is quite possible to enter the workforce with only a high school diploma and make more than you would have with a bachelor’s degree. As for employability, for decades college graduates had an easier time getting jobs than non-graduates. That advantage, however, reversed in 2018: since then, the unemployment rate for recent college graduates has exceeded the rate for workers in general, and it has done so in every one of the last twenty months.
College degrees have suffered a decline in value because there’s a glut of four-year college graduates on the labor market. Those bachelor’s degree holders who earn lower wages and salaries than many high school graduates tend either to be clustered in low-paying fields that nevertheless require college diplomas, like social work and teaching, or to have entered careers that don’t require a college education in the first place, like retail, office management, and customer service. In the first case, the abundance of college-educated candidates, combined with depleted public treasuries, has generated entire sectors of credentialed professions that command little to no bargaining power on the labor market. In the second case, the fierce competition for degree-worthy positions has inevitably pushed less competitive college-educated candidates into jobs they probably didn’t envision for themselves when they enrolled in a four-year degree program. In general, this buyer’s market for employers has chiseled away at the traditional advantages of a college degree and led, since 2018, to a stubborn unemployment problem among those whose educations once largely insulated them from it.
So over the last couple of decades we’ve been minting more college graduates than ever, but their career prospects are bleaker than they used to be. It’s a quandary that has forced these new job entrants to adapt in ways that have transformed the industries they’ve infiltrated.
Quite understandably, these young, educated professionals aspire to the upper-middle class lifestyles that they believed their college degrees promised to provide for them. But as less than a third of college degrees awarded each year are in STEM fields, they tend to lack clearly marketable skill sets. Their Comparative Literature and Political Science classes haven’t taught them how to build or design new products or how to plan and implement new business strategies. What they have in abundance, however, is cultural capital, and more specifically, its college-inculcated subvariant, moral capital.
For a couple of long stretches — during the dot com boom of the late 90s and then the pre-pandemic tech boom — many of these recently graduated job candidates found lucrative careers in the digital economy. And it wasn’t just the coders. For those who majored in the humanities and social sciences, there was a plethora of jobs in marketing, people management, and communications to fill. The tech sector, once idealistic and libertarian, came to embody the moral capital of this wave of new entrants, becoming, in a word, woke. Other, lesser industries, including my own, also did their part to absorb this surplus white collar labor, and were radically subverted in the process.
But probably no industry has scooped up more of this labor market overflow than the non-profit sector. Unlike in tech and media, in the world of progressive NGOs there is an actual organic demand for the moral capital that these job applicants have spent four years of college accumulating. There, one’s finely calibrated sensitivity to microaggressions, one’s native fluency in the obscure grammar and lexicon of social justice speak, and one’s acute ability to discern the structures of racism in literally anything are assets rather than liabilities. And from there, one can conjure the consumer market for those talents out of thin air, simply by inventing new social problems to solve.
From the perch of one’s non-profit post, one can diagnose the rest of the world’s afflictions, perceiving the white supremacy at work in corporation X’s failure to achieve racial parity in its recruitment practices, and recognizing the misogyny and transphobia in the gender lopsidedness of industry Y’s consumer base. One can divine the contours of new marginalized communities, from “voice-hearers” to “MAPs,” each desperate for new 501(c)(3)s to protect them from their oppressors. Defense against social exclusion, it turns out, is an infinitely elastic commodity, as it takes nothing more than a college-educated imagination to discover new threats, name new villains, and then sell your remedial services to the wrongdoers in the form of audits, trainings and consultations.
The business model is simple: extortion. The non-profit world’s moral technicians scan the landscape for organizations, whether public or private, that can afford their services, diagnose them as acutely infected by some form of structural oppression or another, and then offer up their suite of services to set them on their public path to healing and redemption. As Malcom Kyeyune has noted, the ideology of wokeness operates like a political protection racket, insinuating its practitioners into every industry and enterprise as intermediaries between creators and their creations. You can run your company, you can write your screenplay, you can draft your bill, you can bring your product to market — but not without bringing in an outside team of minders to ensure that it conforms to the ideological standards of the moral intelligentsia, for a hefty fee. For a class devoid of any discernible skills of actual economic value, this parasitical function is an enviable source of political and social power.
“Diversity, Equity and Inclusion” consulting jobs have thus metastasized into a global, multi-billion dollar industry. That industry has become not only a life raft for otherwise employment-challenged college graduates, but a job-producing juggernaut that can sustain the credentialed class indefinitely, whatever happens to the real economy.
The DEI confidence game faces little resistance from its marks because, as corporate America has found, the demands these consultant-activists make of their targets as penance for their misdeeds are hardly onerous; indeed, they’re barely even unwelcome. They usually come down to hiring an overpriced DEI consulting shop to run mandatory trainings in racial sensitivity and unconscious bias for the company’s hapless employees, and to write new procedures and policies governing how those workers relate to clients, customers, and one another in any area proximate to race or any other intersectional identity. In other words, the price corporations pay for their moral missteps is to be handed new tools of discipline and control over their workforces. It’s an arrangement that works out almost as well for the bosses as it does for the consultants, while the workers, as usual, get squeezed.
In the aggregate, the deal that the lumpenbourgeoisie has struck with the capitalists amounts to the equivalent of a small tax on corporate America to keep the college-educated class gainfully employed — and to forestall the political instability that tends to ensue when a segment of the elite is denied the fulfillment of its aspirations. As a bonus, those corporations get to look virtuous to their college-educated employees and, for certain companies, to their largely college-educated consumer base. It’s a pretty good deal for the elite all around. The only downside is that it’s making the entire world insane.
Liberal arts colleges could solve this by bringing back and reinvigorating core curricula to match the modern economy. Out of the 32-course load for a four-year degree, there might be no fewer than four math, two computer science, and four computational/hard science courses required (e.g., scope and methods in Political Science using R and GitHub), plus the existing required core in Western lit/philosophy and writing. This would (1) weed out many unqualified people who don't belong in college, (2) get rid of a lot of BS departments that manufacture professional grievance-havers, and (3) prepare graduates for economically productive careers.
That's quite the downside.
I can't help but think that if most of our workforce had to or could make things with their hands rather than being relegated to either being a disposable worker (retail, food service, etc.) or trying to justify a highly disposable job, we'd be much better off as a society.
Thank you for another great article.