The café was one of those Palo Alto places with $7 oat milk lattes and exposed brick that probably cost more than the espresso machine. October rain streaked the windows. Chen Wei sat across from me in a faded hoodie—Google's, actually—which I thought was a nice touch given what she was about to tell me.
"Three years." She stirred a latte she never drank. "Three years teaching myself machine learning after my night shifts at Target. Built a fraud detection system that caught $2.3 million in chargebacks. Contributed to TensorFlow. Got 847 stars on a GitHub project." A pause. "You know what the recruiter told me when she rejected me?"
She didn't wait for me to ask.
"'We loved your technical skills, but we need candidates with a CS degree for this role. It's client-facing.' Client-facing." Chen Wei laughed—not a real laugh. "I asked her: will the client be reading my diploma during our meetings? She just... she didn't know what to say."
Two months later, that same company hired someone with a Stanford master's who—this is according to a friend on the team, and I've heard this story enough times to believe it—couldn't implement gradient descent without Stack Overflow open on his second monitor. He's still there.
Chen Wei? She's VP of Engineering at a startup now. But that's not the point. The point is she found one company willing to look past the missing piece of paper. One out of—
"How many did you apply to?"
"Before I stopped counting? Sixty-something."
Stories like this have been piling up in my inbox for eighteen months. Warehouse workers who taught themselves Python. Nurses who became data analysts. Veterans whose military training—years of high-stakes decision-making under pressure—translates to exactly nothing on a civilian resume.
Harvard Business School counts more than 27 million Americans as "hidden workers." Locked out because they don't have the right credential.
Here's what keeps me awake: I run an AI recruiting company. I built technology that's supposed to fix this. And when I pulled our own data last month—just to check, just to be sure—I found the same patterns. Our algorithms, trained on years of historical hiring decisions, still give subtle preference to candidates from certain schools. Certain backgrounds. Certain paths.
We're trying to fix a broken system while running on the same broken rails.
The Lie
Let me tell you something uncomfortable about how I got my first job.
2014. I had a degree from a name-brand university. Walked into interviews knowing my resume would clear the filters. Knowing my credential signaled something to hiring managers. Not that I could do the work—not exactly. It signaled that I was safe. That hiring me wouldn't be a risk.
The degree was just... how things worked. We didn't question it.
The bachelor's degree became a hiring filter in the mid-twentieth century. Companies needed to screen thousands of applicants. A four-year degree said: this person can stick with something, navigate bureaucracy, demonstrate baseline competence. The signal was imperfect but cheap. Scalable.
Then it got corrupted.
By 2017, Harvard found that 67% of production supervisor job postings required a bachelor's degree—even though only 16% of people actually doing that job had one. The degree stopped measuring capability. Started measuring access. Who could afford four years without earning? Whose family had that safety net?
Between 1980 and 2020, college tuition went up roughly 1,200%. Several times the overall rate of inflation.
A recruiter in Chicago—let's call her Dana—put it to me bluntly over drinks last summer. "Look, I know the degree requirement is bullshit for half our roles. Everyone knows it. But if I hire someone without a degree and they don't work out, that's on me. If I hire someone with a degree and they fail? Well, they had all the right credentials. Nobody questions my judgment."
She wasn't being cynical. She was being honest about incentives.
The result is a paradox so stupid it should be satire: millions of openings companies can't fill, sitting next to millions of capable workers those same companies won't hire. The "skills gap" is projected to cost $8.5 trillion a year in unrealized revenue by 2030. A third of that gap isn't a gap in skills. It's a gap in how we measure them.
The Press Releases
In 2018, Tim Cook said roughly half of Apple's U.S. employees didn't have a four-year degree. Google launched Career Certificates—six-month credentials positioned as equivalent to a bachelor's. IBM coined "new collar." Bank of America, Delta, Walmart followed.
I remember reading those announcements and feeling hopeful. Finally, I thought. The gates are opening.
I was wrong.
A friend—let's call him Raj—works in talent acquisition at a Fortune 100 tech company that made one of those splashy announcements. We got dinner in December 2024, a year after his company's "skills-first commitment."
"Want to know what actually changed?" He poured himself more wine. "We removed the degree requirement from job postings. That's it. Same ATS filters. Same hiring manager preferences. Same interview panels looking for people who feel—" he searched for the word "—familiar."
"So it was just PR."
"The announcement was real. The intention was real." He shrugged. "Nobody rebuilt the actual process. Nobody trained managers to evaluate differently. Nobody changed what metrics we track. So everyone reverted."
The data confirms this. A 2025 analysis found that while 85% of companies claimed skills-based hiring, only 0.14% of actual hires were affected by degree requirement removal.
Zero point one four percent.
Microsoft, Intel, Meta announced relaxed requirements. Their actual job postings barely changed. Corporate rhetoric sprinted ahead of corporate practice.
This is where my industry enters the story. Not with good intentions. With technology that might actually force the change those announcements couldn't.
What the Machines Actually Do
The problem with skills-based hiring has always been measurement. Degrees are binary. Skills are continuous, contextual, hard to verify. A resume claims "proficient in Python." What does that mean? Can you write production code? Debug someone else's mess? Architect systems at scale?
Without reliable measurement, hiring managers default to proxies. The most available proxy is still the credential.
AI assessment platforms try to break this loop. They don't ask candidates to describe abilities. They test them.
TestGorilla offers 350+ skill assessments. Their 2025 report: companies using skills tests before screening resumes made quality hires 96% of the time, versus 87% for traditional methods. Time-to-hire dropped 50%.
Those numbers are real. I've verified them with customers. The technology works.
For technical roles, platforms like HackerRank have candidates write actual code. Debug real systems. The AI evaluates not just whether the code works, but how—efficiency, approach, quality.
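Nobody publishes those scoring internals, but the idea of grading the how, not just the whether, fits in a few lines. Here is a toy rubric (my sketch, not any platform's real formula) that weights correctness heavily and discounts slow solutions:

```python
# A toy rubric for "evaluate the how, not just the whether": correctness
# dominates, speed breaks ties. My sketch, not any platform's real formula.
import time

def grade(solution, tests, time_budget=0.01):
    """Score a submission: fraction of tests passed, discounted if slow."""
    start = time.perf_counter()
    passed = sum(solution(arg) == expected for arg, expected in tests)
    elapsed = time.perf_counter() - start
    correctness = passed / len(tests)
    efficiency = 1.0 if elapsed <= time_budget else time_budget / elapsed
    return 0.8 * correctness + 0.2 * efficiency

# Two correct submissions for "sum the integers 1..n":
fast = lambda n: n * (n + 1) // 2        # closed form
slow = lambda n: sum(range(n + 1))       # brute force

tests = [(n, n * (n + 1) // 2) for n in (10, 1_000, 2_000_000)]
print(f"fast: {grade(fast, tests):.2f}")
print(f"slow: {grade(slow, tests):.2f}")
# Both are fully correct; the closed form scores at least as high,
# and typically higher once the big input blows the time budget.
```

Real platforms layer on much more, but the shape is the same: measure the work, not the resume line.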
A recruiter at a mid-sized fintech told me about a candidate they almost passed over. "No degree. Job-hopped a lot. Resume was a mess. But his code assessment? Top 3% of everyone we'd ever tested." She paused. "He's our best engineer now."
Stories like this should make me optimistic. Sometimes they do.
Then I remember what happened in our hallway last April.
What Our Algorithm Learned
Our lead ML engineer—I'll call her Min—caught me after an all-hands meeting. Fluorescent lights. That weird corporate carpet smell. She was holding her laptop like evidence.
"We need to talk about the model."
I was already late for a board call. "The bias audit came back clean. We passed everything."
"That's the problem." She didn't move. "The audit measures whether protected groups advance at equal rates. It doesn't measure whether they should advance at equal rates."
I stopped. The hallway felt smaller.
"Think about who applies," she continued. "By the time candidates from underrepresented backgrounds reach our system, they've already been filtered by years of bias. The ones who make it through are, on average, stronger than candidates who never faced those barriers." She let that land. "'Equal advancement rates' might mean we're holding them to a higher bar."
I didn't have a response. My mouth opened but nothing came out.
"There's more." She opened the laptop. "College prestige. We removed race and gender from training." She turned the screen toward me. "Look. The model learned to use school tier as a proxy. For what? Quality? Socioeconomic status?" She shook her head. "We can't tell."
"Can we remove it?"
"Tried. Accuracy drops 8%. I brought it to product last month." She closed the laptop. "They killed it."
We stood there in the hallway. Someone walked past us. The HVAC hummed.
"We're selling 'fair hiring,'" Min said quietly. "Admitting the model discriminates by proxy—that's not a conversation anyone wants to have."
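Min's point about the audit is easy to check with synthetic numbers. If one group has to clear an upstream filter before it ever reaches our system, then "advancing both groups at equal rates" quietly sets a higher absolute bar for the filtered group. A toy simulation, with invented distributions rather than our real data:

```python
# A toy check of Min's point: "equal advancement rates" after an unequal
# pre-filter means an unequal bar. Invented distributions, not our data.
import random
import statistics

random.seed(0)

# Both groups draw true skill from the same distribution.
pool_a = [random.gauss(0, 1) for _ in range(100_000)]
pool_b = [random.gauss(0, 1) for _ in range(100_000)]

# Group B faces an upstream filter (schools, networks, referrals):
# only candidates above some threshold ever reach our system.
applicants_a = pool_a                          # no pre-filter
applicants_b = [s for s in pool_b if s > 0.5]  # harsher path in

def bar_for_top_30pct(skills):
    """Skill level needed to land in a group's top 30%."""
    return statistics.quantiles(skills, n=10)[6]  # 70th percentile

# Our system "advances the top 30% of each group" -- equal rates.
bar_a = bar_for_top_30pct(applicants_a)
bar_b = bar_for_top_30pct(applicants_b)
print(f"skill needed to advance, group A: {bar_a:.2f}")
print(f"skill needed to advance, group B: {bar_b:.2f}")
# Group B's bar comes out markedly higher even though the rates match.
```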
I've thought about that conversation every day since. We still haven't removed college prestige from the model. The product team still says 8% is unacceptable. And we still send bias audits to the board showing we "passed."
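The prestige-as-proxy problem is just as easy to reproduce in miniature. Remove an attribute from the training data, and any feature correlated with it, like school tier, lets a model reconstruct it anyway. Another sketch, with made-up probabilities:

```python
# Why dropping a feature doesn't drop its influence: if school tier is
# correlated with the removed attribute, the attribute is still in the
# data. Made-up probabilities, purely illustrative.
import random

random.seed(1)

def candidate():
    advantaged = random.random() < 0.5
    # Access, not skill: the advantaged group is far more likely to have
    # attended a "top tier" school in this toy world.
    top_tier = random.random() < (0.7 if advantaged else 0.1)
    return advantaged, top_tier

sample = [candidate() for _ in range(100_000)]

# Best guess at the hidden attribute using ONLY school tier:
# call a candidate "advantaged" whenever the school is top tier.
rate = sum(adv == tier for adv, tier in sample) / len(sample)
print(f"school tier alone recovers the removed attribute {rate:.0%} of the time")
# Well above the 50% a blind guess would get. Nothing was removed.
```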
The University of Washington published research last year: leading AI models preferred white-associated names 85% of the time when evaluating identical resumes. Black male names? Zero percent.
Zero.
We're replacing human bias with algorithmic bias and calling it progress.
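Results like UW's come from a counterfactual audit: hold the resume text byte-for-byte identical, vary only the name, compare the scores. The harness is almost trivially small. Here is a minimal version with a deliberately biased stand-in scorer; a real audit would plug the model under test into `score_resume`:

```python
# The counterfactual audit behind results like UW's: identical resume
# text, only the name varies. `score_resume` is a deliberately biased toy
# stand-in; a real audit plugs the model under test into that seam.
RESUME = "Ops lead. 8 yrs logistics. Managed $40M in equipment across 3 countries."

def score_resume(name: str, body: str) -> float:
    # Hypothetical scorer that leaks the name into the score.
    name_bias = 1.0 if name.startswith("J") else 0.0
    return 0.01 * len(body) + name_bias

def preference_rate(names_a, names_b, body) -> float:
    """How often a group-A name strictly outscores a group-B name
    on byte-identical resume text."""
    pairs = [(a, b) for a in names_a for b in names_b]
    wins = sum(score_resume(a, body) > score_resume(b, body) for a, b in pairs)
    return wins / len(pairs)

group_a = ["Jake", "Jill", "Joan"]   # toy groups, not real demographics
group_b = ["Kofi", "Amina", "Wei"]

print(f"group A preferred {preference_rate(group_a, group_b, RESUME):.0%} of the time")
# → 100% with this rigged scorer; an unbiased one would produce all ties.
```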
The One Who Didn't Make It
I've told you about Chen Wei, who succeeded. There's also Rosa, forty-seven and fully credentialed, who eventually found work. The stories that end okay.
Let me tell you about James.
Former Army logistics specialist. Managed supply chains under conditions most civilian employers can't imagine. Separated in 2023. Applied to 200+ jobs over fourteen months. Got thirty-some interviews. No offers.
He emailed me after reading one of my earlier pieces on skills-based hiring. "I thought things were changing," he wrote. "I read all those articles about companies not caring about degrees anymore. So I applied. And applied. And applied."
His skills assessments, when companies bothered to send them, were strong. Top quartile in logistics. High problem-solving scores. Good communication.
The interviews went nowhere.
"They'd ask about my background," he wrote. "I'd talk about managing $40 million in equipment. Coordinating movements across three countries. They seemed impressed. Then I'd never hear back." A pause in his email—you could feel it. "I started wondering if my military background was the problem. Like maybe they couldn't imagine me fitting in. Or maybe they thought I'd be... difficult."
Last I heard, James was driving for DoorDash. He stopped applying for operations jobs. "Not worth the hope," he said.
James isn't in anyone's skills-based hiring statistics. He's not a success story about credentials mattering less. He's what happens when we announce change without building it. When we pass him through assessments and still reject him because something—we can't quite say what—doesn't fit.
I think about James when I hear companies brag about going "skills-first."
320 Million Badges
If credentials don't work and AI assessments carry hidden bias, maybe the answer is credentials that do measure skills. Verifiable instantly. No human judgment required.
Digital badges. Micro-credentials. The growth is staggering: 74.7 million issued globally in 2022. By 2025, 320.4 million.
These aren't participation trophies. Modern digital credentials carry metadata—what was demonstrated, how it was assessed, who verified it. They're specific, short, stackable.
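What that metadata looks like in practice: a badge, sketched here as a Python dict with invented values. The field names are loosely modeled on the Open Badges idea, not any issuer's exact schema.

```python
# A badge as data: what was demonstrated, how it was assessed, who says so.
# Every value here is invented for illustration.
badge = {
    "credential": "SQL for Analytics",
    "earner": "rosa@example.com",
    "skills_demonstrated": ["window functions", "query optimization"],
    "assessment": {"type": "proctored project", "score_percentile": 92},
    "issuer": "certs.example.org",                      # hypothetical issuer
    "issued_on": "2025-03-14",
    "verify_url": "https://certs.example.org/v/abc123", # hypothetical
}
```

The last field is the point: anyone can check the claim without trusting the resume that carries it.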
Accredible's 2025 research: 91% of employers actively look for digital credentials. 72% prefer them over traditional certificates.
Google treats its six-month Career Certificates as equivalent to four-year degrees. OpenAI plans to certify 10 million Americans in AI skills by 2030.
I want to believe this works.
But Rosa had credentials. She passed every assessment. Still got four callbacks from seventy-three applications.
James has skills that would translate beautifully to a civilian ops role. The credentials didn't help him either.
A credential—any credential—only works if the people making decisions actually trust it. And trust is harder to engineer than technology.
The Accenture Question
When people ask if skills-based hiring can work at scale, I point them to Accenture.
They started building skills infrastructure in 2014. Didn't just remove degree requirements—built the taxonomy, the assessment systems, the promotion criteria. Skills became what they call a "currency" within the organization.
IBM did something similar with "new collar" roles. Not just relabeling jobs. Building apprenticeship pathways, training programs, real assessment systems.
A friend who worked on Accenture's initiative told me: "We got 80% of the way there. The system works for most roles, most of the time." She took a breath. "But there's still this 20% where humans intervene and all the old biases flood back."
These transformations took a decade. Massive investment. Executive commitment that didn't waver when quarterly earnings got bumpy.
Most companies? They announce skills-based hiring. Remove the degree line from job postings. Call it done.
That's why the implementation gap persists.
Shenzhen Doesn't Care About Your Audit
Everything I've said so far is about the U.S. market. Here's what I don't know how to factor in: what's happening in China.
A contact at a Singapore recruiting firm told me ByteDance's internal hiring AI is years ahead of anything in the West. Faster. More integrated. Trained on datasets we can't access.
Alibaba supposedly screens a million applications monthly. Nobody publishes research. Nobody gets sued. Nobody audits for bias—or if they do, nobody talks about it.
The EU can regulate emotion recognition. They can require transparency. Doesn't touch Shenzhen.
I don't know what to do with this except note it. The skills-based hiring revolution we're arguing about in the West is one version. There are others. They're probably further along. And they're not playing by our rules.
The 59 Out of 100
The World Economic Forum says if the global workforce were 100 people, 59 would need training by 2030 just to remain employable.
Not to advance. Not to get promoted. Just to keep the jobs they have.
Of those 59, employers estimate 29 could be upskilled in current roles. Nineteen could be retrained and redeployed. But 11—11 out of every 100—won't receive the reskilling needed. Their prospects are, the report says, "increasingly at risk."
Hundreds of millions of people. Parents. Mortgage holders. People with fifteen years of experience suddenly worthless.
This is what's driving the skills-based hiring movement. Not idealism about meritocracy. Not equity concerns. Cold economic necessity. Companies literally cannot fill positions using traditional credential filters. They need more people in the pool.
I find myself torn.
On one hand: good that economic pressure is forcing change equity arguments couldn't. Doors opening for people locked out.
On the other: companies are doing this because they need bodies. The moment the labor market loosens, do they revert? I've learned not to trust press releases.
The Number I Can't Stop Thinking About
Last week I asked our data team to pull something. Probably shouldn't have.
Of the candidates who passed our skills assessments in 2025—genuinely qualified based on objective measures—how many actually got hired?
The answer was 23%.
Twenty-three percent. Three out of four qualified candidates, rejected. Not because they couldn't do the job. Because something else—interview "fit," salary expectations, a weird gap on the resume, the hiring manager's gut feeling—got in the way.
Our platform proved they had the skills. The humans behind the platform still found reasons to say no.
I don't know what to do with this number. We're building better filters. Better assessments. More sophisticated skill taxonomies. And 77% of qualified candidates still don't get through.
Maybe the problem isn't the technology. Maybe it's us.
What I Think I Know (And Don't)
Two months on this piece. Forty-seven interviews. Twelve platform demos. Too many research papers.
Here's what I think I know:
The credential system is broken. Degrees measure access more than ability. Skills-based hiring—the real kind, not the press release kind—works. It can give people like Chen Wei a chance to prove themselves.
But AI assessment carries risks we don't understand. The models learn from biased history. Our bias audits may miss the actual bias. Humans working alongside biased AI don't correct it—they absorb it.
Corporate announcements are mostly performative. The real change requires organizational transformation most companies won't undertake. Removing degree requirements without rebuilding the process just moves the bias somewhere else.
Here's what I don't know:
Whether my company is making things better or worse. Whether the 11 out of 100 who won't get retrained will find paths or get discarded. Whether skills-based hiring becomes real or stays a niche practice adopted when labor markets are tight.
Whether James ever gets that operations job. Whether the next Chen Wei finds her one company faster. Whether any of this matters or whether we're just building new walls in place of old ones.
Chen Wei Again
That October afternoon in Palo Alto, after the latte had gone cold and the rain had stopped, Chen Wei asked me something.
"Your company. The AI recruiting thing." She looked at me directly. "Does it actually work? For people like me?"
I wanted to say yes.
Instead I told her the truth: sometimes. For some people. When the humans behind the algorithms let it. When the company actually wants to find talent instead of just avoiding risk.
She nodded. Didn't look surprised.
"That's more honest than most people in your industry," she said. Then she laughed—a real laugh this time. "You know what got me my current job? Not an assessment. Not a credential. The CEO's teenage daughter uses one of my GitHub projects. He saw my name in the code and remembered it when my application came through."
Luck. Network effects. The right person seeing the right thing at the right time.
All this technology. All these platforms. All these debates about credentials versus skills. And sometimes it still comes down to whose daughter uses what software.
I don't know how to fix that. I'm not sure anyone does.
But somewhere right now, there's another Chen Wei sending another application into another automated system. Another Rosa studying for another certificate. Another James wondering if this time will be different.
We owe them better than what we've built so far.
Whether we'll actually build it—I honestly don't know.