How America Has Changed Since the First Affirmative-Action Case

This is an edition of Up for Debate, a newsletter by Conor Friedersdorf. On Wednesdays, he rounds up timely conversations and solicits reader responses to one thought-provoking question. Later, he publishes some thoughtful replies. Sign up for the newsletter here.


Question of the Week

If you were in charge of the admissions office at a top-50 college or university, how would you decide which applicants got accepted as undergraduates and which got rejected? (How would you weight grades? Test scores? Athletic ability? Musical prowess? Volunteer work? Parents willing to write a big check to the building-renovation fund? Other factors? Would you think merely of ranking individuals, or also of composing a whole class?)

Send your responses to [email protected] or simply reply to this email.


Conversations of Note

While I encourage wide-ranging responses to that Question of the Week, it is inspired by the Supreme Court’s consideration of cases challenging affirmative action in college admissions––and if you want fodder as you ponder the subject, a great deal of commentary has been published of late.

At The Washington Post, Megan McArdle provides a bit of historical context that suggests affirmative-action policies are a more awkward fit for America than they used to be:

One of my favorite statistics for shocking Washingtonians is to reveal that in 1960, more than five out of every six people accounted for in the census were White — and of the remainder, the overwhelming majority were Black, with the rest of the “non-White” population totaling less than 2 million … Unsurprisingly, our civil rights architecture was primarily structured to equalize the relations between a Black minority that had suffered centuries of state-sponsored racial oppression and a majority group that had perpetuated that manifestly unjust system.

But the country changed: 1965 heralded an era of mass immigration that would complicate racial preferences. Says McArdle:

A system that drew its political support from our desire to eradicate Jim Crow ended up covering a number of protected classes, though along somewhat arbitrary lines that were driven as much by political maneuvering as by any rational criteria.

This created various ad hoc absurdities — a Pakistani is “Asian,” but an Afghan born a few miles across the border might be coded “White”; the daughter of a Spanish doctor is Hispanic, eligible for various private and government-sponsored affirmative action programs, while the child of an Italian janitor, who might be visually indistinguishable from the doctor’s child, is presumably in no need of help. The more immigrants who arrived, the more these complications multiplied, even among Black Americans. American descendants of enslaved people are our most disadvantaged citizens, with enduring gaps in education, income and wealth, but African immigrants are much better educated than average.

The old system assumed a large White majority that was self-contained and thoroughly dominant; it was simply not built for a world where “biracial” was a meaningful category, or where some minority groups were more successful than the (rapidly shrinking) White majority.

Meanwhile, my colleague Adam Harris, author of The State Must Provide: Why America’s Colleges Have Always Been Unequal—and How to Set Them Right, complicates arguments for ending affirmative action by describing and commenting on an exchange that occurred during Monday’s Supreme Court oral arguments, as Justice Ketanji Brown Jackson questioned the lawyer Patrick Strawbridge.

Here’s Harris:

She offered a hypothetical to emphasize her point. There are two applicants who would like their family backgrounds recognized. One writes that their family has been in North Carolina since before the Civil War, and that if they were admitted to the university, they would be a fifth-generation student there. The other student is also a North Carolinian whose family has been in the state since before the Civil War—but their ancestors were enslaved and, because of years of systemic discrimination, were not allowed to attend the university. But now that they have the opportunity, they would like to attend. “As I understand your no-race-conscious-admissions rule, these two applicants would have a dramatically different opportunity to tell their family stories and to have them count.” Both applicants were qualified, Jackson offered, but the first applicant’s qualifications could be recognized in the process, whereas “the second one wouldn’t be able to [get credit for those qualifications] because his story is in many ways bound up with his race and the race of his ancestors.”

Strawbridge thought for a moment, then offered that UNC does not have to give a legacy benefit to the first applicant if it doesn’t want to. This is true, but it was not Jackson’s point: “No, but you said it was okay if they gave a legacy benefit.” Race, she said, would be the only thing that couldn’t be considered under that program. And that would disadvantage the Black student who, in a similar set of circumstances, wants “the fact that he has been in North Carolina for generations through his family” considered.

In a day filled with questions about the meaning of “true diversity” or the educational benefits of diversity, Jackson’s questions cut through the muck. Some students had historically been denied access to some of the nation’s most well-resourced institutions of higher education—feeder campuses for prominent roles throughout society—because of their race.

If the court rules against affirmative action, “that fact will be one of the only things a university cannot consider in its admissions process,” Harris concluded, “as though that history never happened—as though the system is fair enough already.”

The Future of the Music Business

Ted Gioia argues that mere individuals will begin to surpass even the biggest record labels in launching new artists to stardom. His explanation begins five decades in the past:

Over a fifty-year period, record labels relentlessly dumbed down their A&R departments. They shut down their recording studios, and let musicians handle that themselves—often even encouraging artists to record entire albums at home. Then they let huge streaming platforms control the relationship with consumers. At every juncture, they opted to do less and less, until they were left doing almost nothing at all.

The music industry’s unstated dream was to exit every part of the business, except cashing the checks. But … if you don’t add value, those checks eventually start shrinking …

The major labels would like to own the music stars of the future, but they won’t … And who will win if record labels lose? You think it might be the streaming platforms? Think again—because that’s not going to happen. Spotify and Apple Music are even less interested than the major labels in nurturing talent and building the careers of young artists.

Here’s my craziest prediction. In the future, single individuals will have more impact in launching new artists than major record labels or streaming platforms. Just consider this: There are now 36 different YouTube channels with 50 million or more subscribers—and they’re often run by a single ambitious person, maybe with a little bit of help. In fact, there are now seven YouTube channels with more than 100 million subscribers. By comparison, the New York Times only has nine million subscribers … Just ponder what it means when some dude sitting in a basement has ten times as much reach and influence as the New York Times. If you run one of these channels and have any skill in identifying talent, you can launch the next generation of stars.

Gioia goes on to argue that similar dynamics will apply to other creative industries. While pondering his thesis, I began to wonder if it might be a solution to the problem that Derek Thompson identifies in his insightful “What Moneyball-for-Everything Has Done to American Culture.”

Noticing Negative Polarization

Kat Rosenfield has a theory:

Politics no longer have anything to do with policy. Nor are they about principles, or values, or a vision for the future of the country. They’re about tribalism, and aesthetics, and vibes. They’re about lockstep solidarity with your chosen team, to which you must demonstrate your loyalty through fierce and unwavering conformity. And most of all, they’re about hating the right people. Politics in 2022 are defined not by whom you vote for, but by whom you wish to harm.

In her telling, that explains why conservatives keep mistaking her for one of their own rather than the liberal that she is:

Not because I argue for right-wing policies or from a right-wing perspective, but because progressives are often extremely, publicly mad at me for refusing to parrot the latest catechism and for criticizing the progressive dogmas that either violate my principles or make no sense. I look like a friend of the Right only because the Left wants to make me their enemy — and because I can’t bring myself to do the requisite dance, or make the requisite apologies, that might get me back in the Left’s good graces.

… It’s remarkably easy these days to be named an apostate on the left. Maybe you were critical of the looting and rioting that devastated cities in the wake of George Floyd’s murder by police in 2020. Maybe you were skeptical of this or that viral outrage: Covington Catholic, or Jussie Smollett, or the alleged racial abuse at a BYU volleyball game that neither eyewitness testimony nor video evidence could corroborate. Maybe you were too loud about the continued need for due process in the middle of #MeToo. Maybe you wouldn’t stop asking uncomfortable questions about the proven value of certain divisive brands of diversity training, or transgender surgeries for kids, or — come the pandemic — masking. Maybe you kept defending the right to free speech and creative expression after these things had been deemed “right-wing values” by your fellow liberals. This is a fraught moment for those of us who aren’t reflexive team players, who struggle with reading the room, who remain committed to certain values on principle even when they’ve become politically inexpedient. The present climate leaves virtually no room for a person to dissent and yet remain in good standing.

A similar phenomenon causes conservatives such as David French and Jonah Goldberg to be treated as apostates for sticking to their principles, even when that means criticizing Donald Trump.

Forgive Them, for They Knew Not What They Did

Here at The Atlantic, Emily Oster harkens back to the earliest months of the pandemic, when everyone was making decisions under conditions of significant uncertainty, and argues that our treatment of one another going forward ought to be informed by that context:

Given the amount of uncertainty, almost every position was taken on every topic. And on every topic, someone was eventually proved right, and someone else was proved wrong …  

The people who got it right, for whatever reason, may want to gloat. Those who got it wrong, for whatever reason, may feel defensive and retrench into a position that doesn’t accord with the facts … These discussions are heated, unpleasant and, ultimately, unproductive. In the face of so much uncertainty, getting something right had a hefty element of luck. And, similarly, getting something wrong wasn’t a moral failing. Treating pandemic choices as a scorecard on which some people racked up more points than others is preventing us from moving forward … We have to put these fights aside and declare a pandemic amnesty. We can leave out the willful purveyors of actual misinformation while forgiving the hard calls that people had no choice but to make with imperfect knowledge.

The Restaurant Industry’s Worst Idea

In my estimation, it is the QR-code menu:

Never mind dying peacefully in my sleep; I want to go out while sitting in a restaurant on my 100th birthday, an aperitif in my left hand and a paper menu in my right. And as eager as I’ll be for heaven if I’m lucky enough to stand on its threshold, I want one last downward glance at a paramedic prying the menu from my fist. In that better future, where old-school menus endure, I’ll go to my urn happy that coming generations will still begin meals meeting one another’s eyes across a table instead of staring at a screen.

QR-code menus are not really an advance. Even when everything goes just right––when everyone’s phone battery is charged, when the Wi-Fi is strong enough to connect, when the link works––they force a distraction that lingers through dessert and digestifs. “You may just be checking to see what you want your next drink to be,” Jaya Saxena observed in Eater late last year, “but from there it’s easy to start checking texts and emails.”

And wasn’t it already too easy?

The article goes on to express this hope for the future:

Rather than remembering the pandemic as a tipping point in the digitization of restaurants and bars, I hope we look back on its aftermath as the moment when an ever more atomized society understood the high costs of social isolation––and settled on mealtime norms as an especially vital way of mitigating them.

What if, three times a day, society was oriented toward replenishing what is growing more absent from the rest of our waking hours: undistracted human interactions unmediated by technology?


Provocation of the Week

John Tierney has been a skeptic of recycling plastic since at least 1996, when he wrote in The New York Times Magazine:

Believing that there was no more room in landfills, Americans concluded that recycling was their only option. Their intentions were good and their conclusions seemed plausible. Recycling does sometimes make sense—for some materials in some places at some times. But the simplest and cheapest option is usually to bury garbage in an environmentally safe landfill. And since there’s no shortage of landfill space (the crisis of 1987 was a false alarm), there’s no reason to make recycling a legal or moral imperative. Mandatory recycling programs aren’t good for posterity. They offer mainly short-term benefits to a few groups—politicians, public relations consultants, environmental organizations, waste-handling corporations—while diverting money from genuine social and environmental problems. Recycling may be the most wasteful activity in modern America: a waste of time and money, a waste of human and natural resources.

Last week, Greenpeace published a new report on recycling, announcing it in a press release that began:

Most plastic simply cannot be recycled, a new Greenpeace USA report concludes. Circular Claims Fall Flat Again, released today, finds that U.S. households generated an estimated 51 million tons of plastic waste in 2021, only 2.4 million tons of which was recycled.

That works out to a recycling rate of less than 5 percent. In City Journal, Tierney claims vindication:

The Greenpeace report offers a wealth of statistics and an admirably succinct diagnosis: “Mechanical and chemical recycling of plastic waste has largely failed and will always fail because plastic waste is: (1) extremely difficult to collect, (2) virtually impossible to sort for recycling, (3) environmentally harmful to reprocess, (4) often made of and contaminated by toxic materials, and (5) not economical to recycle.” Greenpeace could have added a sixth reason: forcing people to sort and rinse their plastic garbage is a waste of everyone’s time. But then, making life more pleasant for humans has never been high on the green agenda.

How we ought to think about and regulate plastic going forward is a complex issue that Tierney and Greenpeace disagree about, and that Up for Debate will return to. But I am convinced by both of their bodies of work that plastic recycling is not an effective “green” policy—which is depressing, given how much time well-intentioned people have spent on it all these years.

That’s all for this week––see you on Monday.
