The Supreme Court Killed the College-Admissions Essay

Nestled within yesterday’s Supreme Court decision declaring that race-conscious admissions programs, like those at Harvard and the University of North Carolina, are unconstitutional is a crucial carveout: Colleges are free to consider “an applicant’s discussion of how race affected his or her life.” In other words, they can weigh a candidate’s race when it is mentioned in an admissions essay. Observers had already speculated about personal essays becoming invaluable tools for candidates who want to express their racial background without checking a box—now it is clear that the end of affirmative action will transform not only how colleges select students, but also how teenagers advertise themselves to colleges.

For essays and statements to provide a workaround for pursuing diversity, applicants must first cast themselves as diverse. The American Council on Education, a nonprofit focused on the impacts of public policy on higher education, recently convened a panel dedicated to planning for the demise of affirmative action; admissions directors and consultants emphasized the need “to educate students about how to write about who they are in a very different way,” expressing their “full authentic story” and “trials and tribulations.” In other words, if colleges can’t use race as a criterion in its own right, because the Court has ruled doing so violates the Fourteenth Amendment, then high schoolers trying to navigate the nebulous admissions process may feel pressure to write as plainly as possible about how their race and experiences of racism make them better applicants.

Turning personal writing into a way to market one’s race means folding oneself into nonspecific formulas, reducing a lifetime to easily understood types. This flattening of the college essay in response to the long hospice of race-based affirmative action comes alongside another reductive phenomenon upending student writing: the ascendance of generative AI. High schoolers, undergraduates, and professional authors are enlisting ChatGPT or similar programs to write for them; educators fear that admissions essays will prove no exception. The pitfalls of using AI to write a college application, however, are already upon us, as the pressure to sell one’s race and race-based adversity to colleges will compel students to write like chatbots. Tired platitudes about race angled to persuade admissions officers will crowd out more individual, creative approaches, the result no better than a machine’s banal aggregation of the web. Writing about one’s race can be clarifying, even revelatory; de facto requiring that someone write about their racial identity, in a form that can veer toward framing race as a negative attribute in need of overcoming, is stifling and demeaning. Or, as the attorney and author Elie Mystal tweeted more bluntly yesterday, “Why should a Black student have to WASTE SPACE explaining ‘how racism works’?”

[Read: Elite multiculturalism is over]

Such essays can feel prewritten. Many Black and minority applicants “believe that a story of struggle is necessary to show that they are ‘diverse,’” the sociologist and former college-admissions officer Aya M. Waller-Bey wrote in this magazine earlier this month; admissions officers and college-prep programs can valorize such trauma narratives, too. Indeed, research analyzing tens of thousands of college applications shows that essay content and style predict income better than SAT scores do: Lower-income students were much more likely to write about topics including abuse, economic insecurity, and immigration. Similarly, another study found that girls applying to engineering programs were more likely to foreground their gender as “women in science,” perhaps to distinguish themselves from their male counterparts. These predictable scripts, which many students believe to be most palatable, are the kind of stale, straightforward narratives—about race, identity, and otherwise—that AI programs excel at writing. Language models work by analyzing massive amounts of text for patterns and then spitting out statistically probable outputs, which means they are adept at churning out clichéd language and narrative tropes but quite terrible at writing anything original, poetic, or inspiring.

To explore and narrativize one’s identity is of course important, even essential; I wrote about my mixed heritage for my own college essay. Race acts as what the cultural theorist Stuart Hall called a “floating signifier,” a label that refers to constantly shifting relationships, interactions, and material conditions. “Race works like a language,” Hall said, meaning that race provides a way to ground discussions of varying experiences, support networks, histories of discrimination, and more. To discuss and write about one’s race or heritage, then, is a way of finding and making meaning.

But molding race into what an admissions officer might want is the opposite of discovery; it means one is writing toward somebody else’s perceived desires. It’s not too dissimilar from writing an admissions essay with a language model that has imbibed and reproduced tropes that already exist. Both blight meaningful self-discovery on the part of impressionable young people, trapping them instead in unoriginal, barren, and even debasing scripts that humans and machines alike have prewritten about their identities. Chatbots’ statistical regurgitations cannot reinvent language, only cannibalize it; the programs do not reflect so much as repeat. When I asked ChatGPT to write me a college essay, it gave me boilerplate filler: “My journey as a half-Chinese, half-Italian individual has been one of self-discovery, resilience, and growth.” That sentence is broadly true, perhaps a plus for an admissions officer, but vapid and nonspecific—useless to me, personally. It doesn’t push toward anything meaningful, or really anything at all.

[Read: The college essay is dead]

A future of college essays that package race in canned archetypes reeking of a chatbot’s metallic touch could read alarmingly similar to the very Supreme Court opinions that ended race-conscious admissions yesterday: a framing of race “unmoored from critical real-life circumstances,” as Justice Ketanji Brown Jackson wrote in her dissent; a pathetic understanding of various Asian diasporic groups from Justice Clarence Thomas; a twisting of landmark civil-rights legislation, constitutional amendments, and court cases into a predetermined and weaponized crusade against any attempt to promote diversity or ameliorate historical discrimination. Chatbots, too, make things up, advance porous arguments, and gaslight their users. If race works like a language, then colleges, teachers, parents, and high-school students alike must make sure that that language remains a human one.

{"email":"Email address invalid","url":"Website address invalid","required":"Required field missing"}
>