This report is grounded in 21 qualitative interviews I conducted between March and April 2026. All participants are anonymised. A supplementary section places the findings against published quantitative research, to show where the numbers confirm what people said and where they diverge. The interviews came first; the published data corroborates them.
Seven things that are broken
These patterns appeared across nearly every interview, on both sides.
Candidates optimise CVs for bots and write AI cover letters nobody reads. Recruiters write AI-generated job descriptions and reject via automated templates. The system is performing for itself.
79% of senior placements at one major Stockholm firm happen through personal contacts. The best recruiters have rebuilt their process around personal networks. Job ads are a fallback.
Every recruiter uses LinkedIn as their primary tool. Every recruiter says it fails them: bad filters, incomplete profiles, no availability signal, too many irrelevant applications. No one has an alternative, and LinkedIn has little incentive to change.
"8-10 years required" is often driven by client procurement rules, not by genuine competency needs. The number inflates because recruiters know candidates apply anyway.
Candidates receive template rejections after five-round processes. Recruiters feel uncomfortable with automation but are overwhelmed by volume. The emotional damage shows up in nearly every interview: ghosting, opaque rejections, silence.
Multiple candidates filter on product ethics, culture, and personal values before applying. No existing platform surfaces this reliably. Candidates cold-message employees at target companies manually to get honest culture reads.
Candidates use AI to pass ATS filters. Recruiters receive AI-written cover letters. One candidate described an interviewer reading ChatGPT-generated questions verbatim, with no follow-up. Both sides feel increasingly alienated.
The candidate side
Job searching has become a full-time job. The effort required keeps going up. The signal coming back keeps going down.
Every candidate I spoke to had built a personal tracking system: Trello boards, Figma boards, colour-coded folders, Gmail tab systems. All self-built. All painstaking. All solving a problem a platform should solve. The workaround is now standard practice.
The company side
Recruiters are surviving, not screening. Volume replaced signal. Nobody fixed it.
"We have great consultants who don't get the assignments they should get because their CV doesn't reflect who they are."
Source: One major Stockholm recruiting firm. Consistent with patterns across all recruiter interviews.
AI in the hiring process
Both sides are using AI extensively. Neither side is happy with the result. What follows are specific, concrete adaptations described in detail by the people living them.
Calling this cheating misses the sequence of events. Employers deployed ATS keyword filtering first. Candidates adapted. Employers deployed AI screening. Candidates adapted again. One designer had spent a year training a dedicated ChatGPT thread on her own profile. Another built a personal AI filter to ask "does this job fit me?" before deciding whether to apply. A third carefully edits AI-generated text to remove traces, then reviews the output manually. The sophistication is real. The effort is exhausting. And none of it improves the underlying match quality.
Every recruiter I spoke to said AI-generated CVs and cover letters are now immediately recognisable, and that they make screening harder because the signal has collapsed. One recruiter said AI text "makes noise noisier." Another tried using AI to write rejection emails and gave up: automated rejections feel cold, and candidates react badly. A third described spending 3-5 hours a day processing applications despite using automated tools, because the tools cannot make the judgment calls that matter. The result: AI tools built to reduce screening time are creating more volume, which requires more screening time.
Job descriptions, cover letters, and CV summaries are all increasingly written by AI, and they are starting to sound the same. One candidate described an interviewer reading ChatGPT-generated questions word for word, with no follow-up. The human on the company side had also outsourced their judgment. When both sides use the same tools, the output converges. A modern hiring process is starting to look like a conversation between two AI systems, with humans on either end growing more alienated from a process they still nominally control. The signal recruiters actually need: working style, real capability, genuine motivation. That information disappears when everything runs through the same machine.
Only 8% of job seekers believe AI screening makes hiring fairer. That number reflects something specific. Candidates know they are being filtered by systems they cannot see, appeal, or understand. One candidate with 13 years of experience called ATS keyword matching "opaque and illogical." The opacity is structural: automated systems do not explain their decisions. The EU AI Act will require explanations from August 2026, but until then there is no obligation to provide them. Being filtered out by an invisible system, with no basis you can understand or respond to, changes how people behave. It makes honesty irrational and performance rational. It produces the exact dynamic every recruiter says they hate: candidates performing rather than showing up.
Despite widespread adoption on both sides, nobody I interviewed trusted AI for the parts that matter most. Candidates use AI to draft and filter, but do their own research into company culture, often by cold-messaging current employees because no platform offers honest culture signal. Recruiters use AI to manage volume but make their actual hiring decisions based on direct human conversations. One candidate said she wanted "an assistant, not an agent." Another described AI as useful for "filtering, not deciding." The consensus across 21 interviews: AI for logistics, humans for judgment. The frustration is that the current deployment runs in reverse. AI handles judgment: screening, scoring, filtering. Humans handle logistics: volume management, scheduling, tracking. Both sides are working against the grain.
From August 2026, the EU AI Act classifies AI systems used in hiring as high-risk. This requires documentation, human oversight at decision points, bias testing, and the right to an explanation for automated decisions. The current standard practice of opaque ATS filtering, with no human review and no explanation to rejected candidates, will not be legally defensible. Several recruiters I spoke to were already aware of this and had begun adjusting their processes. Most companies using commercial ATS tools are not.
The signals no tool currently captures
These are the gaps every recruiter named. No existing platform addresses them.
The noise problem
Two connected failures define the experience of job searching right now. Volume makes it impossible for companies to see clearly. Then comes the silence: you don't get through, and nobody can explain why.
AI tools make applying easier, so candidates apply to more roles. More applications overwhelm recruiters, who deploy more automated filtering. Filtered applications are less trusted, so candidates apply to more roles to compensate. Everyone is worse off. The system optimises for volume because volume is measurable. Quality is not.
When a recruiter receives 300 applications in 24 hours, selection becomes reactive. They look for reasons to exclude rather than reasons to include. The real filters: curiosity, working style, cultural fit, trajectory. These cannot be assessed at volume. So proxies take over: years of experience, job titles, recognisable company names. Everyone knows these proxies are imprecise. They are also the only thing that scales. The actual criteria for selection end up unstated, because the actual criteria cannot survive contact with the volume.
Several recruiters described the same pattern: the job description was written by HR without consulting the hiring manager. Real requirements only emerge during the process, sometimes only after seeing candidates who do not fit. One recruiter named "fejkrekrytering": the risk of convincing yourself a candidate is right because you want them to be right. A genuine failure mode, experienced personally. The criteria are often opaque because companies are still figuring them out, not because they are hiding them.
When criteria are unstated or discovered mid-process, giving honest feedback becomes very difficult. "You didn't match our culture" is the most honest answer available, and also the least useful one. Structured feedback requires clear criteria. Clear criteria require deliberate role definition. Deliberate role definition rarely survives the volume. The recruiters I spoke to wanted to give feedback. Several did, at real cost in time. The gap is structural: you cannot explain decisions that were made without explicit criteria.
One candidate went through a five-round process: two case studies, a whiteboard exercise, travel to Stockholm, a team meeting, a CEO meeting. Then a rejection with no explanation. Another described a process that ended in silence after four weeks of regular contact. A third rated their own interview honesty at 5 out of 10. Not because they lied, but because the format makes honesty irrational. When you have no signal about what someone is actually looking for, you perform the expected candidate profile regardless. The opacity creates the behaviour everyone hates.
"Curiosity and initiative matter more than years of experience or sector fit. These qualities are impossible to screen for via CV. I have to meet the person, which means I'm already spending time before I know anything real."
"The process is opaque in both directions. I don't know what they're really looking for. They don't know who I really am. We're both performing for a system that isn't designed to let us actually find each other."
Voice of the market
Direct quotes from the 21 interviews. All participants anonymised by role and context.
What the numbers confirm
Large-scale datasets confirm most of what I heard in the interviews. Where they diverge, that gap is worth noting.
Large-scale surveys show 60% of hiring managers say AI is helping them find candidates they would otherwise miss. Every recruiter I spoke to disagreed. They described AI tools as noise amplifiers. The gap may be what managers say in surveys versus what recruiters experience on a Tuesday afternoon.
The market in numbers *
Here is what the official data shows, and why it lines up with what I heard.
Arbetsförmedlingen reports rising unemployment alongside employer difficulty finding candidates with the right skills. Both are true at the same time. Sweden has more available consultants than ever. The shortage is not talent. The system cannot connect the right person to the right role. Every recruiter I spoke to described this.
* Data sources: Arbetsförmedlingen Arbetsmarknadsutsikterna (autumn 2025) for unemployment figures and labour market outlook · Yrkesbarometern (September 2025) for occupation-level job prospect ratings · Statistics Sweden (SCB) salary data for salary benchmarks by occupation · Swedish market research compiled from Konsultguiden and Kompetensförsörjning 2025 for consultant volume data.
The test nobody trusts, and everyone still uses
Every recruiter I interviewed was sceptical or contemptuous of standardised personality tests. The research agrees with them. The industry does not.
MBTI and DISC tests persist because they reduce uncertainty cheaply and quickly, and because the companies selling them have a strong incentive to keep doing so. MBTI alone generates over 20 million pounds annually for its foundation. The tests are a business. Every recruiter I spoke to who used them did so reluctantly, or had already abandoned them. None believed they predicted anything meaningful.
An independent view
After the research was done, a 30-year veteran of the Swedish design industry published a newsletter about what he believed the market needed. He had spoken with me once, briefly, before writing it. He arrived at almost the same conclusions on his own.
"The recruitment process is designed almost entirely for the employer. The job seeker is not the customer. The job seeker is the raw material being processed."
He described a service where professionals own their own data: CV, target role, long-term goals, values, salary expectations. Under their own control.
He observed that people move between urgent job search and long-term career planning. "A service worth building would work for both situations, and keep working in the background even when we are not thinking about it." The interviews confirmed the same thing.
He described an application tracker: Suggested, Applied, In review, Interview, Offer, Rejected, Accepted. "More sustainable than the spreadsheet you stopped updating three weeks in." Seven of the thirteen candidate interviews had built exactly this system themselves, manually.
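The tracker stages he lists form a small, well-defined state machine. As a minimal sketch (the class names, transition table, and `can_move` helper are illustrative, not from any existing product), it could look like this:

```python
from enum import Enum

class Stage(Enum):
    SUGGESTED = "Suggested"
    APPLIED = "Applied"
    IN_REVIEW = "In review"
    INTERVIEW = "Interview"
    OFFER = "Offer"
    REJECTED = "Rejected"
    ACCEPTED = "Accepted"

# Allowed forward moves; rejection can happen from any active stage,
# and Rejected/Accepted are terminal.
TRANSITIONS = {
    Stage.SUGGESTED: {Stage.APPLIED},
    Stage.APPLIED: {Stage.IN_REVIEW, Stage.REJECTED},
    Stage.IN_REVIEW: {Stage.INTERVIEW, Stage.REJECTED},
    Stage.INTERVIEW: {Stage.OFFER, Stage.REJECTED},
    Stage.OFFER: {Stage.ACCEPTED, Stage.REJECTED},
    Stage.REJECTED: set(),
    Stage.ACCEPTED: set(),
}

def can_move(current: Stage, target: Stage) -> bool:
    """True if an application may legally move from one stage to another."""
    return target in TRANSITIONS[current]
```

The point of the state machine over a spreadsheet is exactly the one he makes: the structure does the remembering, so the tracker keeps working when the user stops tending it.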
He ended: "I'm pretty sure I'm not the one to build this. But I'd sign up on day one. And I have a long list of people who would too." He wrote this without knowing this research existed.
When a researcher spends three months interviewing people and a practitioner with thirty years of experience writes the same conclusion independently after a single conversation, that is not coincidence. It means the problem is structural, that it is widely understood, and that no one has yet built what both sides are asking for.
What people actually want
I asked everyone the same question: if you could design this process from scratch, what would it look like? The answers were specific, consistent, and buildable. The same ideas came up again and again across very different people.
Both candidates and recruiters wanted salary expectations visible early, ideally before the first conversation. A basic respect-of-time principle. "If we're not in the same range, let's find out in five minutes, not five weeks." The technology to make this bilateral already exists. The culture has not caught up.
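The "find out in five minutes" check is computationally trivial, which is the point: the barrier is cultural, not technical. A minimal sketch of the bilateral range test (all names here are illustrative):

```python
from dataclasses import dataclass

@dataclass
class SalaryRange:
    low: int   # SEK per month
    high: int  # SEK per month

def ranges_overlap(candidate: SalaryRange, role_budget: SalaryRange) -> bool:
    """True if the candidate's expectation and the role's budget intersect.

    Two closed intervals overlap exactly when each one starts
    before the other one ends.
    """
    return candidate.low <= role_budget.high and role_budget.low <= candidate.high
```

Either side could run this against a disclosed range before any conversation starts; a mismatch ends the process in seconds rather than weeks.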
Every candidate described the absence of feedback as one of the most demoralising parts of the process. Detailed feedback for every application is unrealistic. But a real reason when you get far enough to matter: that is the ask. "I don't need to know why 40 companies didn't reply. I need to know why the one I cared about said no." Recruiters agreed. Several call every finalist regardless of outcome. They know it matters. The system does not support it at scale.
Multiple candidates proposed the same idea independently: a short, informal, unscripted first contact before the formal process starts. One called it "the first five minutes before the performance begins." Another described her best interview as "coffee with a friend." A third wanted "speed dating, not another form." The format people want is dialogue. An exchange, not an assessment.
Candidates filter for values before anything else, and do it manually: cold-messaging employees to find out what a company is actually like. Companies say they want culturally aligned candidates. Neither side has infrastructure for this conversation. One recruiter put it plainly: "I don't want to waste their time or mine if the product is something they would never work on."
Every recruiter named the same gap: they cannot see who is passively open to a move. LinkedIn shows employed or not employed. That binary is useless. What they need: "happy where I am but would consider the right thing." Or "open, available in two months." Candidates want this too: to be findable without broadcasting that they are looking. The technical solution is straightforward. Nobody has built it.
The most consistent signal across all 21 interviews: people want AI to handle the administrative parts: tracking, scheduling, first-pass filtering. They want it out of the parts that require judgment. "AI for logistics. Humans for decisions." One candidate said it clearly: "I want an assistant, not an agent." The distinction is autonomy. Nobody wants a machine deciding their career. Everybody wants a machine that clears the path so a human can.
None of the people I spoke to were cynical about work itself. They were frustrated by the process of finding it. The designers I interviewed are good at what they do and they know it. The recruiters genuinely want to find the right people and build long relationships with them. The failure is in the infrastructure between them.
What people want is not complicated. Salary transparency. Real feedback. A short honest conversation before the formal process. Visibility into values. A way to signal openness without broadcasting desperation. AI that removes friction rather than adding judgment. These are all solvable. None of them require breakthrough technology. They require someone to build a system that treats both sides as people rather than inputs.
The market is broken because the infrastructure was designed for a different era, optimised for a different goal, and nobody has rebuilt it from the ground up with both sides in mind.
About this research
I am a designer and product thinker based in Stockholm, with a background in UX, product leadership, and more recently, building with AI. Over the past several years I have worked across the Swedish and Nordic design market, both as a practitioner and as someone helping to shape products and teams.
I have been on both sides of the hiring conversation. I know how it feels to apply. I know how it feels to try to hire. Both experiences pointed to the same conclusion: the system is not designed for the people in it.
This research is a private, independent project. It is not sponsored by any company, platform, or recruiter. I have no financial interest in how the hiring market works today, and no product to sell you at the end of it.
I started interviewing people because I was curious whether what I personally experienced was shared. It was. The intention was to understand both sides honestly, surface what people actually feel but rarely say out loud, and contribute something useful to a conversation that is mostly happening in frustration rather than in public.
The goal of this research is not to expose anyone or advocate for a particular solution. It is to make the experience of hiring, and of being hired, a little more honest and a little more human. Hiring is how careers begin. It shapes who works where and with whom, and therefore what gets built and how. It deserves more care than it currently gets.
Sources and further reading
The primary research for this report is qualitative: 21 interviews I conducted in Sweden between March and April 2026. The quantitative data cited throughout draws on the following published sources. Where a source offers a deeper read on a topic covered in the report, that is noted.
Official Swedish labour market forecast. Source for unemployment figures (6.9%), employer hiring difficulty, and the matching paradox cited in chapter 09. Updated twice yearly. Available at arbetsformedlingen.se/statistik.
Occupation-level job prospect ratings for Sweden. Source for the "Små" (small) job prospect rating for visual and UX designers cited in chapter 09. Updated twice yearly. arbetsformedlingen.se/statistik/yrkes-och-kompetensanalyser.
Based on 1.78 million tracked job applications across the full year 2025. Source for time-to-offer data (57 to 83 days), ghosting rates, emotional state data, and candidate channel behaviour. The most comprehensive candidate-side dataset available. huntr.co.
Source for the 8% fairness figure (job seekers who believe AI screening makes hiring fairer) and the 44% figure on willingness to fabricate resume details. Also the origin of the "AI doom loop" framing. greenhouse.com.
Source for the 9,500 applications per minute figure and the 45%+ surge in application volume. Also covers the homogenisation problem from the platform side. linkedin.com/business/talent.
Source for the finding that both cost-per-hire and time-to-hire increased in 2025 despite widespread AI adoption. The Society for Human Resource Management publishes this annually. shrm.org.
The 50% retype rate and zero predictive validity figures cited in chapter 10 draw on decades of peer-reviewed psychometric research. For a clear overview: Adam Grant, "Say Goodbye to MBTI, the Fad That Won't Die," Psychology Today (2013). For the most recent synthesis: Erford et al., "A 25-year Review and Psychometric Synthesis of the MBTI Form M," Journal of Counseling and Development (2025). The MBTI Foundation's own disclaimer about use in hiring is available at themyersbriggs.com.
AI systems used in recruitment and candidate selection are explicitly classified as high-risk under Annex III of the EU AI Act. Core obligations for high-risk systems become enforceable from August 2, 2026. Guidance specific to Sweden: Lindahl Law (Stockholm), published via Lexology, November 2025. EU AI Act full text: eur-lex.europa.eu.
Tomas Chamorro-Premuzic, "AI Has Made Hiring Worse — But It Can Still Help," Harvard Business Review, January 2026. Dartmouth and Princeton research on AI cover letters reducing company trust in cover letters entirely, covered in CNN Business, December 2025. HeroHunt, "AI Adoption in Recruiting: 2025 Year in Review," herohunt.io.
Source for the SEK 5,000 monthly salary premium for UX specialists over generalist graphic designers. SCB publishes annual occupational salary statistics. scb.se/hitta-statistik/statistik-efter-amne/arbetsmarknad/loner-och-arbetskostnader.