Big data could help mitigate the affirmative action ban

It isn't perfect, but data and analytics could capture the disadvantages applicants face and the diversity they may represent, says Carlo Ratti

Published on
July 7, 2023
Last updated
July 7, 2023
[Image: data graphs projected onto a Black woman's face. Source: iStock]

When the US Supreme Court released its verdict outlawing race-based affirmative action last week, I was reminded of a young postdoc I hired for my lab at the Massachusetts Institute of Technology many years ago. The researcher – let's call her Sasha – hailed from a former Soviet republic, and my colleagues thought her résumé was not up to our standards. Yet, I reasoned, if Sasha had achieved so much in an environment that afforded her so little, why not see what she could do at MIT?

The bet paid off – she became one of our best researchers and today is a well-known professor at a leading US university.

This is not a tale about race-based affirmative action, yet it might teach us something about how to move forward.

Why is affirmative action so important? First, every individual should have equal opportunities regardless of their background. Admissions officers cannot just look at the finish line – grades and test scores – when the starting blocks are different. Second, diversity enriches educational environments, leading to better outcomes for everyone.

The Supreme Court argues that structural racism is no longer a sufficient disadvantage to justify positive discrimination. This is questionable at best. As Justice Ketanji Brown Jackson wrote in her dissent, “Deeming race irrelevant in law does not make it so in life.” So we need to make affirmative action work – but without specific reference to race.

Our first response might be to give increased weight to race-neutral variables, such as parental income and address. Such efforts are welcome, but they are not a panacea. Within the same zip code or income bracket, a poor Black student is more likely to come from generations of poverty, attend worse-off schools, and even breathe polluted air or drink contaminated water.

Studies have found that a variety of alternatives to race-based affirmative action, such as race-neutral holistic review or systems that focus on income or geographic diversity, do not work as well as expected. Strikingly, ignoring race not only makes a class less racially diverse, as expected, it even makes it less socio-economically diverse. No algorithm gets better when you restrict its access to information. Or, as our dean of admissions at MIT, Stu Schmill, recently put it, “If you take away a carpenter's tools, they will have a much harder time building.”
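The information point can be made concrete. For any fixed population, the best decision rule's accuracy can only stay flat or fall when a feature is hidden, because hiding it merges groups that the rule could previously treat differently. A minimal sketch, with entirely invented counts and a hypothetical pair of applicant attributes:

```python
from collections import defaultdict

# Hypothetical joint counts over two applicant attributes and a binary
# "faced significant disadvantage" label (all numbers invented).
# (attr_a, attr_b) -> {label: count}
population = {
    ("A0", "B0"): {0: 30, 1: 10},
    ("A0", "B1"): {0: 10, 1: 30},
    ("A1", "B0"): {0: 25, 1: 15},
    ("A1", "B1"): {0: 35, 1: 5},
}

def best_accuracy(cells):
    """Accuracy of the best possible rule that sees only these feature cells."""
    total = sum(sum(c.values()) for c in cells.values())
    correct = sum(max(c.values()) for c in cells.values())
    return correct / total

# Full information: both attributes visible to the rule.
full = best_accuracy(population)  # 120/160 = 0.75

# Restricted: attribute A is hidden, so its cells collapse together.
restricted = defaultdict(lambda: defaultdict(int))
for (a, b), counts in population.items():
    for label, n in counts.items():
        restricted[b][label] += n

reduced = best_accuracy(restricted)  # 100/160 = 0.625

# Hiding a feature can never help the best rule.
assert reduced <= full
```

The inequality holds for any numbers you plug in, which is the formal content of the "carpenter's tools" remark: the restricted rule is one the full-information rule could always have chosen to imitate.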

A further issue is that all the variables we have discussed so far – from race to census-based socio-economics – are coarse, focusing on groups rather than individuals. Yet the rise of big data is driving a revolution in the social sciences. In 20 years of research, I have seen a trove of huge empirical datasets emerge to describe urban areas, unlocking a view that allows us to understand the greatest metropolis down to the smallest block. Big data and analytics could also help admissions officers quantitatively capture the kinds of disadvantages applicants face and the kinds of diversity they may represent.
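What might such quantitative capture look like? One simple possibility, sketched here with invented variables and invented numbers, is a composite index that standardises several context measures and averages them, with each variable signed by whether higher values mean more or less disadvantage:

```python
from statistics import mean, stdev

# Hypothetical per-applicant context variables (all values invented):
# school_funding ($ per pupil, higher = better off),
# lead_ppb (water lead level, higher = worse off),
# pre_k_access (share of local children in pre-K, higher = better off).
applicants = {
    "applicant_1": {"school_funding": 9000,  "lead_ppb": 12.0, "pre_k_access": 0.35},
    "applicant_2": {"school_funding": 15000, "lead_ppb": 1.5,  "pre_k_access": 0.80},
    "applicant_3": {"school_funding": 11000, "lead_ppb": 6.0,  "pre_k_access": 0.55},
}

# +1 means higher values indicate MORE disadvantage; -1 means less.
direction = {"school_funding": -1, "lead_ppb": +1, "pre_k_access": -1}

def disadvantage_index(applicants, direction):
    """Average of signed z-scores across the context variables."""
    stats = {
        v: (mean(a[v] for a in applicants.values()),
            stdev(a[v] for a in applicants.values()))
        for v in direction
    }
    index = {}
    for name, a in applicants.items():
        zs = [direction[v] * (a[v] - stats[v][0]) / stats[v][1]
              for v in direction]
        index[name] = sum(zs) / len(zs)
    return index

scores = disadvantage_index(applicants, direction)
# applicant_1 (underfunded school, high lead, little pre-K) scores highest.
```

This is only a toy: a real system would need far more variables, careful weighting, and validation. But it shows how individual-level context could enter an admissions file as a number rather than a group label.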

Think about all the variables that impact a student's life but are invisible in a college application. A truly fair system would take into account not only an applicant's high school zip code but also the quality of their pre-kindergarten programme and the levels of lead in their water pipes. We are far from being able to obtain every relevant variable, but in 2023 we could get a lot more.

Just as important as bringing more data in is letting more data out. Before Harvard's admissions process was challenged, much of it was a black box to the public; it should not have taken a lawsuit before the Supreme Court to remedy that. As universities scramble to devise new admissions policies, every school could become a test bed for shared innovation.

Admissions offices could embrace experimentation and data collection, even within a single class. Instead of devising a single new policy, they could try different ones and track the results over time in terms of admissions, campus experiences and later careers. Even small differences would be measurable, and they could help to challenge a variety of misconceptions. Harvard has long argued that letting in too many poor students would compromise its academic excellence; what if it admitted a few more and tested this out?
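Tracking such an experiment need not be elaborate. A hedged sketch, with invented cohort numbers, of how two trial policies' graduation rates could be compared using a standard normal-approximation confidence interval for the difference in proportions:

```python
import math

# Hypothetical cohorts admitted under two trial policies (numbers invented):
# students admitted, and how many later graduated on time.
cohorts = {
    "policy_A": {"admitted": 400, "graduated": 352},
    "policy_B": {"admitted": 380, "graduated": 342},
}

def rate_diff_ci(a, b, z=1.96):
    """95% CI for the difference in graduation rates (normal approximation)."""
    pa = a["graduated"] / a["admitted"]
    pb = b["graduated"] / b["admitted"]
    se = math.sqrt(pa * (1 - pa) / a["admitted"] +
                   pb * (1 - pb) / b["admitted"])
    diff = pb - pa
    return diff, (diff - z * se, diff + z * se)

diff, (low, high) = rate_diff_ci(cohorts["policy_A"], cohorts["policy_B"])
# If the interval straddles 0, the cohorts show no clear difference --
# which would itself be evidence against the "compromised excellence" worry.
```

With these made-up numbers the two-point difference in graduation rates is well within the interval's noise, illustrating how a university could test, rather than assert, claims about whom it can afford to admit.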

Of course, a data-driven process cannot carry us all the way to justice. Data collection is hard, and privacy will always be a concern. We should also remember the controversies that swirled around the SAT "adversity score" – a proposal to supplement standardised testing scores according to socio-economic factors in a student's school and neighbourhood. The project drew criticism for being too opaque, simple and presumptuous in assuming that adversity could be so easily quantified (it was ultimately reworked into the more modest SAT Landscape). Open experimentation and transparency could be powerful curatives, but they might also generate new culture wars and lawsuits.

There is no perfect way to optimise elite university admissions – there are too many incredible applicants and too few slots. In the long term, the best solution is probably to ensure that there are better paths to a good education and social mobility beyond a seat on the shiny USS Princeton. But for now, in the absence of race-based affirmative action, universities owe it to themselves – and to their applicants – to innovate and find new algorithms.

More data will promote more justice – and help ensure that no Sasha goes unnoticed in the admissions pool.

Carlo Ratti is professor of practice of urban technologies and planning at the Massachusetts Institute of Technology, where he directs the Senseable City Lab.

