A typical CV lands in a hiring manager's inbox. Within seconds — before reading a single sentence about the person's actual work — the brain has already processed: name, photo, university, current company, graduation year. Each of those signals can activate a bias. Most of the time unconsciously.
This isn't a character flaw. It's how human cognition works under time pressure and information overload. The problem is that we've built a recruitment process that feeds those triggers at every step.
The bias triggers in a standard CV
- Name — signals ethnicity, gender, class
- Photo — age, attractiveness, apparent ethnicity
- University — prestige bias, regional bias
- Current / previous employer — halo effect from brand names
- Graduation year — reveals age
- Address — neighbourhood signals class and commute
- LinkedIn activity — recency and visibility skew
Why "diversity initiatives" don't fix this
Many companies run unconscious bias training, add diversity statements to job ads, and set demographic targets. These are not without value — but they don't fix the pipeline problem. If you're starting from a CV with a photo and a name, the bias has already been triggered before any deliberate decision is made.
Bias training improves awareness, but it rarely changes outcomes, because awareness doesn't eliminate the cognitive shortcut that fires in under a second when you see a face or read a name.
What actually reduces bias
The research on what works is consistent: you have to change what information you start with, not just remind people to think harder about the information they already have.
Remove the photo. Remove the name. Remove the university name where possible. Evaluate on tasks and outcomes first. Then, if you want to meet someone, meet them.
The CV is a backwards document
A CV is a record of the past. It tells you where someone has been, not what they want to do next or how they think about the work you actually need done. Two people with identical CVs may be looking for completely different things. You won't know until you talk to them.
The real question is: can this person do the job we need done, and do they want to? The CV is a poor proxy for both.
CV-first process
- ✗ Starts with history and biography
- ✗ Bias triggers fire in the first 3 seconds
- ✗ Most talent never applies — too passive
- ✗ Shortlist = survivors of the filter, not best fit
- ✗ You're sorting people, not discovering them
Task-first process
- ✓ Starts with what someone wants to do
- ✓ No photo, no name, no employer brand
- ✓ Reaches talent that never applied anywhere
- ✓ You evaluate work, then meet the person
- ✓ You're discovering, not sorting
The EU AI Act changes the stakes
From August 2026, companies using AI to sort, rank, or exclude applicants must be able to document and explain every exclusion decision. Most AI recruitment tools — ATS keyword matching, CV scoring, video interview analysis — will be classified as high-risk AI systems under the regulation.
This isn't primarily about the bias risk, though that's real. It's about compliance exposure. "Our AI ranked them lower" is not a legally sufficient explanation.
ByeByeBias — compliant by design
ByeByeBias has no AI scoring, no ranking, and no exclusion logic. Companies browse an anonymous talent pool filtered only by what talent themselves have stated they're open to. There is no algorithm selecting who gets shown and who doesn't. Nothing to explain. Nothing to audit.
What unbiased hiring looks like in practice
You don't need to overhaul your entire hiring process overnight. Start with the filter that happens before anyone reviews a candidate:
1. Remove photos from your screening process
If your ATS or submission form allows photos, disable them. If candidates submit LinkedIn profiles, ask screeners not to view them until after an initial assessment.
2. Evaluate on outcomes, not employer names
What did they actually build, change, or deliver? The company they did it at is less relevant than what they did there.
3. Remove degree requirements that aren't genuinely required
Most degree requirements are habits, not necessities. What problem do you need solved? Can someone without a specific degree solve it? Usually yes.
4. Use structured interviews with consistent questions
Unstructured interviews are one of the worst predictors of job performance and one of the strongest activators of similarity bias. Same questions, same evaluation criteria, every time.
5. Start with passive talent
The 47% of candidates who are open to the right opportunity but not actively applying are disproportionately experienced, currently employed, and not visible on job boards. That pool is far less homogeneous than the active applicant pool.
The bottom line
Unbiased hiring isn't a diversity quota or an audit checklist. It's a structural decision to look at what someone can do before you process who they appear to be. That decision starts with what information you put in front of your hiring team in the first 30 seconds.

About Silje Sundal
Founder & CEO
Silje brings over 20 years of experience from leading enterprise software companies including HP, Citrix, and Workday. With deep expertise in Revenue Operations and GTM strategy, she's passionate about using AI to transform how people find work that makes them thrive.