According to The Economist, researchers from the University of Pennsylvania led by Marius Guenzel used machine-learning models to analyze photos of 96,000 graduates. From facial features alone, they extracted what they call the "Photo Big Five" personality traits: agreeableness, conscientiousness, extraversion, neuroticism, and openness. The study then linked these facial analysis results to actual labor market outcomes and found predictive power for post-MBA earnings and job mobility. While the predictive power is described as incremental rather than definitive, the implications are profound. The research raises immediate questions about anti-discrimination laws and whether companies might eventually use such technology despite legal risks.
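The "incremental rather than definitive" framing has a concrete statistical meaning: you ask how much out-of-sample predictive accuracy improves when photo-derived trait scores are added on top of conventional covariates. Here is a minimal sketch of that comparison on synthetic data; every variable name, effect size, and covariate choice below is invented for illustration and is not from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5000

# Synthetic stand-ins for conventional covariates a hiring study might use.
gpa = rng.normal(3.4, 0.3, n)
experience = rng.normal(5.0, 2.0, n)

# Hypothetical photo-derived Big Five scores. One trait carries a small
# true effect so the "incremental" gain is visible in the comparison.
photo_big5 = rng.normal(0.0, 1.0, (n, 5))
log_earnings = (
    11.0
    + 0.20 * gpa
    + 0.03 * experience
    + 0.10 * photo_big5[:, 1]      # small effect from one photo trait
    + rng.normal(0.0, 0.5, n)      # everything the model can't see
)

def r2_oos(X_train, y_train, X_test, y_test):
    """Out-of-sample R^2 of an OLS fit with an intercept."""
    Xtr = np.column_stack([np.ones(len(X_train)), X_train])
    Xte = np.column_stack([np.ones(len(X_test)), X_test])
    beta, *_ = np.linalg.lstsq(Xtr, y_train, rcond=None)
    resid = y_test - Xte @ beta
    return 1.0 - resid.var() / y_test.var()

half = n // 2
base = np.column_stack([gpa, experience])
full = np.column_stack([base, photo_big5])

r2_base = r2_oos(base[:half], log_earnings[:half],
                 base[half:], log_earnings[half:])
r2_full = r2_oos(full[:half], log_earnings[:half],
                 full[half:], log_earnings[half:])
print(f"baseline R^2: {r2_base:.3f}, with photo traits: {r2_full:.3f}")
```

The gap between the two R² values is the "incremental" part: small in absolute terms, but nonzero, which is exactly what makes the technology tempting in competitive hiring.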
The uncanny valley of hiring
Here’s the thing that makes this research both fascinating and terrifying. Personality assessments are already common in hiring—we’ve all taken those surveys where you try to guess what the “right” answers are. But this takes it to a whole new level. Imagine walking into an interview where your face is being analyzed before you even speak. The algorithm might decide you’re not conscientious enough based on your facial structure, not your actual work history.
And let’s be real—this feels like it’s straight out of a dystopian sci-fi novel. The researchers admit they don’t even know exactly what the AI is picking up on. It’s basically a black box making judgments about your career potential. What happens when people start getting rejected for jobs because of their bone structure?
The fairness dilemma
Now, the researchers raise a chilling ethical question: “Among white male job candidates, is it ethical to screen out individuals whose faces predict less desirable personalities?” That’s a loaded way to frame it, isn’t it? They’re basically asking if it’s okay to discriminate when you’re not technically discriminating against protected classes.
But here’s where it gets really messy. We already know physical appearance affects hiring decisions—there’s research showing taller people get hired more often, for example. Some might argue that facial analysis is actually more meritocratic than favoring Ivy League graduates. Kelly Shue from Yale, one of the paper’s authors, is even looking at whether lenders could use this to assess loan repayment likelihood.
Yet the whole concept threatens something fundamental: our sense of agency. What’s the point of working on yourself, developing better habits, or gaining experience if an algorithm has already decided your face says you’re not management material?
Why this won’t go away
Despite the obvious legal and ethical landmines, companies will be tempted. The study suggests there’s real predictive value here, however small. And in competitive hiring environments, even incremental advantages get attention.
Manish Raghavan from MIT’s Sloan School notes that companies are currently wary of using AI for facial analysis in hiring. But he’s more worried about bias creeping into chatbot summaries of resumes and LinkedIn profiles. That’s telling—we’re already comfortable with some forms of algorithmic screening, just not this particular flavor.
The fundamental question is whether people will accept decisions based on immutable characteristics. We accept that younger drivers pay higher insurance premiums because there’s clear causal logic. But your face determining your career? That’s a much harder sell.
The human element survives
At the end of the day, I think there’s something irreplaceable about human judgment in hiring. However good these algorithms get, people will always want that chance to win over an interviewer face-to-face. There’s chemistry, there’s rapport, there’s that undefinable something that makes you think “I want to work with this person.”
And honestly, that’s probably our saving grace. The study authors themselves caution against overstating their findings. This is early research with plenty of caveats. But it’s a shot across the bow: the conversation about algorithmic hiring is just getting started, and it’s going to get uncomfortable.
