Inside LinkedIn’s AI overhaul: Job search powered by LLM distillation

The advent of natural language search has encouraged people to change how they search for information, and LinkedIn, which has been working with numerous AI models over the past year, hopes this shift extends to job search. LinkedIn’s AI-powered jobs search, now available to all LinkedIn users, uses distilled, fine-tuned models trained on the professional social media platform’s knowledge base to narrow potential job opportunities based on natural language.

“This new search experience lets members describe their goals in their own words and get results that truly reflect what they’re looking for,” Erran Berger, vice president of product development at LinkedIn, told VentureBeat in an email. “This is the first step in a larger journey to make job-seeking more intuitive, inclusive, and empowering for everyone.”

LinkedIn previously stated in a blog post that a significant issue users faced when searching for jobs on the platform was an over-reliance on precise keyword queries. Often, users would type in a more generic job title and get positions that don’t exactly match. From personal experience, if I type in “reporter” on LinkedIn, I get search results for reporter jobs in media publications, along with court reporter openings, which require a totally different skill set.

LinkedIn vice president of engineering Wenjing Zhang told VentureBeat in a separate interview that the company saw a need to improve how people find jobs that fit them, and that this began with a better understanding of what they are looking for.

“So in the past, when we’re using keywords, we’re essentially looking at a keyword and trying to find the exact match. And sometimes in the job description, the job description may say reporter, but they’re not really a reporter; we still retrieve that information, which is not ideal for the candidate,” Zhang said.

LinkedIn has improved its understanding of user queries and now allows people to use more than just keywords. Instead of searching for “software engineer,” they can ask, “Find software engineering jobs in Silicon Valley that were posted recently.”
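To make the shift concrete, here is a toy sketch of what a query-understanding step might extract from that natural-language query. The `JobSearchIntent` schema and `parse_query` function are illustrative assumptions, not LinkedIn's actual system; a production parser would prompt a fine-tuned LLM rather than match strings.

```python
from dataclasses import dataclass
from datetime import timedelta
from typing import Optional

@dataclass
class JobSearchIntent:
    """Structured filters distilled from a free-text query (hypothetical schema)."""
    title: str
    location: Optional[str] = None
    max_age: Optional[timedelta] = None  # how recently the job was posted

def parse_query(query: str) -> JobSearchIntent:
    """Toy stand-in for an LLM-based query parser: real systems would infer
    these fields with a model; here we hard-code one example mapping."""
    q = query.lower()
    intent = JobSearchIntent(title="software engineer")
    if "silicon valley" in q:
        intent.location = "Silicon Valley"
    if "recently" in q:
        intent.max_age = timedelta(days=7)  # assumed definition of "recent"
    return intent

intent = parse_query("Find software engineering jobs in Silicon Valley that were posted recently")
print(intent.title, intent.location, intent.max_age)
```

The point of the structured output is that downstream retrieval can filter on fields like location and posting age instead of literal keyword overlap.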

How they built it

One of the first things LinkedIn had to do was overhaul its search function’s ability to understand queries.

“The first stage is when you’re typing a query, we need to be able to understand the query, then the next step is you need to retrieve the right kind of information from our job library. And then the last step is now that you have like couple of hundred final candidates, how do you do the ranking so that the most relevant job shows up at the top,” Zhang said.
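The three stages Zhang describes can be sketched in miniature as follows. The job library, the matching logic, and the location-based ranker are all invented for illustration; in production, each stage would be a learned model rather than a rule.

```python
# Minimal sketch of the pipeline: understand the query, retrieve a broad
# candidate set, then rank the finalists. All data and logic are assumptions.
JOB_LIBRARY = [
    {"title": "software engineer", "location": "Silicon Valley"},
    {"title": "court reporter", "location": "New York"},
    {"title": "software engineer", "location": "Austin"},
]

def understand(query):
    # Stage 1: turn free text into structured intent (an LLM's job in production).
    return {"title": "software engineer", "location": "Silicon Valley"}

def retrieve(intent, library):
    # Stage 2: pull every candidate whose title matches the intent.
    return [job for job in library if job["title"] == intent["title"]]

def rank(intent, candidates):
    # Stage 3: order candidates so the most relevant surfaces first;
    # a simple location match stands in for a learned ranking model.
    return sorted(candidates, key=lambda job: job["location"] != intent["location"])

intent = understand("software engineering jobs in Silicon Valley posted recently")
results = rank(intent, retrieve(intent, JOB_LIBRARY))
print(results[0])  # the Silicon Valley opening ranks first
```

Note how the court reporter posting never survives retrieval, which is exactly the keyword-collision failure the keyword-era system suffered from.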

LinkedIn had relied on fixed, taxonomy-based methods, ranking models, and older LLMs, which it said “lacked the capacity for deep semantic understanding.” The company then turned to more modern, already fine-tuned large language models (LLMs) to enhance its platform’s natural language processing (NLP) capabilities.

But LLMs also come with expensive compute costs. So, LinkedIn turned to distillation methods to cut the cost of running on expensive GPUs. It split the work into two steps: one model to handle data and information retrieval and another to rank the results. Using a teacher model to score query–job pairs, LinkedIn said it was able to align both the retrieval and ranking models.
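One common way such distillation works, sketched here under assumptions (the article does not specify LinkedIn's loss function), is to train a small student ranker to reproduce the teacher's relevance distribution over candidate jobs, for instance by minimizing the KL divergence between the two:

```python
import math

def softmax(scores):
    """Convert raw relevance scores into a probability distribution."""
    exps = [math.exp(s - max(scores)) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(teacher_scores, student_scores):
    """KL(teacher || student) over one query's candidate jobs: the smaller,
    cheaper student is pushed to mimic the large teacher's ranking."""
    p = softmax(teacher_scores)
    q = softmax(student_scores)
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q))

# Assumed teacher scores for three candidate jobs on one query.
teacher = [3.2, 1.1, 0.4]
aligned = distillation_loss(teacher, [3.0, 1.0, 0.5])      # student agrees
misaligned = distillation_loss(teacher, [0.4, 1.1, 3.2])   # student reversed
print(aligned, misaligned)
```

During training, gradients of this loss would update the student's parameters; a student that matches the teacher drives the loss toward zero, while a reversed ranking is penalized heavily.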

The method also allowed LinkedIn engineers to reduce the stages its job search system used. At one point, “there were nine different stages that made up the pipeline for searching and matching a job,” which were often duplicated.

“To do this we use a common technique of multi-objective optimization. To ensure retrieval and ranking are aligned, it is important that retrieval ranks documents using the same MOO that the ranking stage uses. The goal is to keep retrieval simple, but without introducing unnecessary burden on AI developer productivity,” LinkedIn said.
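The core of the MOO idea in that quote is that retrieval and ranking score jobs with the same weighted blend of objectives, so the two stages cannot disagree about what "good" means. A minimal sketch, with objectives and weights invented for illustration:

```python
# Hypothetical objectives and weights; LinkedIn does not disclose its actual
# MOO formulation. The key property is that one scoring function is shared.
WEIGHTS = {"semantic_relevance": 0.6, "apply_likelihood": 0.3, "freshness": 0.1}

def moo_score(job_signals):
    """Single scalar that both the retriever and the ranker optimize."""
    return sum(WEIGHTS[name] * job_signals[name] for name in WEIGHTS)

jobs = {
    "job_a": {"semantic_relevance": 0.9, "apply_likelihood": 0.4, "freshness": 0.8},
    "job_b": {"semantic_relevance": 0.5, "apply_likelihood": 0.9, "freshness": 0.2},
}
ranked = sorted(jobs, key=lambda j: moo_score(jobs[j]), reverse=True)
print(ranked)
```

Because retrieval filters with the same `moo_score` the ranker sorts by, a document the ranker would place first can never be dropped early for scoring poorly on a different, stage-local objective.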

LinkedIn also developed a query engine that generates customized suggestions for users.

LinkedIn is not alone in seeing the potential for LLM-based enterprise search. Google claims that 2025 will be the year when enterprise search becomes more powerful, thanks to advanced models.

Models like Cohere’s Rerank 3.5 help break down language silos within enterprises. The various “Deep Research” products from OpenAI, Google and Anthropic indicate a growing organizational demand for agents that can access and analyze internal data sources.

LinkedIn has been rolling out several AI-based features in the past year. In October, it launched an AI assistant to help recruiters find the best candidates.

LinkedIn Chief AI Officer Deepak Agarwal will discuss the company’s AI initiatives, including how it scaled its Hiring Assistant from prototype to production, during VB Transform in San Francisco this month. Register now to attend.


Author: Emilia David
Source: Venturebeat
Reviewed By: Editorial Team
