
LinkedIn’s AI generates candidate screening questions from job postings

LinkedIn is using AI and machine learning to generate screening questions for active job postings. In a paper published this week on the preprint server Arxiv.org, coauthors describe Job2Questions, a model that helps recruiters quickly find applicants by reducing the need for manual screening. This isn’t just theoretical research — Job2Questions was briefly tested across millions of jobs by hiring managers and candidates on LinkedIn’s platform.

The timing of Job2Questions’ deployment is fortuitous. Screening is a necessary evil — a LinkedIn study found that roughly 70% of manual phone screenings turn up basic qualifications the applicant is missing. But as the pandemic increasingly disrupts traditional hiring processes, companies are adopting alternatives, with some showing a willingness to pilot AI and machine learning tools. Job2Questions is designed to reduce the time recruiters spend asking questions they should already have answers to and to surface gaps candidates themselves can fill.

As the researchers explain, Job2Questions generates candidate screening questions from the content of a job posting. It first divides the posting into sentences and converts each sentence into a pair consisting of a question template (e.g., “How many years of work experience do you have using…” or “Have you completed the following level of education:”) and a parameter (“Java” or “Bachelor’s Degree”). To do this, it classifies each sentence into one of several templates designed by hiring experts and taps an entity linking system to detect the parameters corresponding to the chosen template, tagging specific types of entities in the sentence (such as education degrees, spoken languages, tool-typed skills, and credentials). A pretrained, fine-tuned deep averaging network within Job2Questions parses the posting text for semantic meaning. Lastly, a ranking model identifies the best questions of the bunch.
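To make the pipeline concrete, here is a minimal sketch of the template classification and parameter extraction step, assuming simple keyword rules in place of LinkedIn's deep averaging network and entity linking system. The template names, regular expressions, skill list, and question wording below are hypothetical stand-ins for illustration, not details taken from the paper.

```python
# Hypothetical sketch of the template + parameter extraction step described above.
# Keyword rules stand in for LinkedIn's template classifier (a fine-tuned deep
# averaging network) and entity linking system; all names here are illustrative.

import re
from dataclasses import dataclass


@dataclass
class ScreeningQuestion:
    template: str   # e.g. "years_of_experience" or "education_level"
    parameter: str  # e.g. "Java" or "Bachelor's"

    def render(self) -> str:
        if self.template == "years_of_experience":
            return f"How many years of work experience do you have using {self.parameter}?"
        if self.template == "education_level":
            return f"Have you completed the following level of education: {self.parameter}?"
        return f"Do you have the following: {self.parameter}?"


# Toy stand-ins for the expert-designed templates and the entity tagger.
TEMPLATE_RULES = {
    "years_of_experience": re.compile(r"\b\d+\+?\s*years?\b", re.I),
    "education_level": re.compile(r"\b(bachelor'?s|master'?s|phd)\b", re.I),
}
KNOWN_SKILLS = {"Java", "Python", "SQL"}


def extract_questions(posting: str) -> list[ScreeningQuestion]:
    """Split a posting into sentences, classify each into a template, and tag parameters."""
    questions = []
    for sentence in re.split(r"(?<=[.!?])\s+", posting):
        for template, pattern in TEMPLATE_RULES.items():
            match = pattern.search(sentence)
            if not match:
                continue
            if template == "years_of_experience":
                # Tag skill entities mentioned alongside the experience requirement.
                params = [s for s in KNOWN_SKILLS if s.lower() in sentence.lower()]
            else:
                params = [match.group(1).capitalize()]
            questions.extend(ScreeningQuestion(template, p) for p in params)
    return questions


if __name__ == "__main__":
    posting = ("We need 5+ years of experience with Java. "
               "A Bachelor's degree in computer science is required.")
    for question in extract_questions(posting):
        print(question.render())
```

Running the sketch on the sample posting yields one experience question about Java and one education question about a Bachelor's degree, mirroring the sentence-to-(template, parameter) mapping the paper describes.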

To collect data to train the machine learning models underpinning Job2Questions, the LinkedIn researchers had annotators label sentence-question pairs, which were used to train the template classifier to predict a template from a sentence. The team then collected 110,409 labeled triples — data samples each containing a job posting, a template, and parameters — submitted by job posters on LinkedIn, which served to train Job2Questions’ question-ranking model to predict whether a job poster would add a suggested screening question to a posting. Screening questions added and rejected by recruiters and posters served as ground-truth labels.
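As an illustration of that setup, the sketch below trains a toy question ranker on accepted/rejected triples using a TF-IDF plus logistic-regression pipeline; the features, model choice, and example triples are assumptions for demonstration, not LinkedIn's actual ranker.

```python
# Toy sketch of training a question ranker from poster accept/reject decisions.
# TF-IDF + logistic regression and the example triples below are illustrative
# assumptions, not LinkedIn's actual ranking model or features.

from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Each training example pairs a (job posting, template, parameter) triple with a
# label: 1 if the poster added the suggested question, 0 if they rejected it.
triples = [
    ("Backend engineer role, Java and SQL required", "years_of_experience", "Java", 1),
    ("Backend engineer role, Java and SQL required", "education_level", "PhD", 0),
    ("Bilingual support agent, Spanish preferred", "spoken_language", "Spanish", 1),
    ("Bilingual support agent, Spanish preferred", "tool_skill", "Photoshop", 0),
]

texts = [f"{job} [SEP] {template} [SEP] {param}" for job, template, param, _ in triples]
labels = [label for *_, label in triples]

ranker = make_pipeline(TfidfVectorizer(), LogisticRegression())
ranker.fit(texts, labels)

# At suggestion time, candidate questions for a posting are scored by the model's
# probability that the poster would keep them, then sorted.
candidates = [
    "Backend engineer role, Java and SQL required [SEP] years_of_experience [SEP] SQL",
    "Backend engineer role, Java and SQL required [SEP] education_level [SEP] PhD",
]
scores = ranker.predict_proba(candidates)[:, 1]
for text, score in sorted(zip(candidates, scores), key=lambda pair: -pair[1]):
    print(f"{score:.2f}  {text}")
```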


In the course of a two-week experiment involving 50% of LinkedIn’s traffic, the researchers claim that only 18.67% of applicants who didn’t answer screening questions correctly were rated as a “good fit” by recruiters, while those who answered at least one question correctly had a 23% higher ranking. They also claim that ranking candidates by their screening question answers improved the applicant good-fit rate by 7.45% and reduced the bad-fit rate by 1.67%; that applicants were 46% more likely to get a good-fit rating for job recommendations informed by their answers to questions; and that jobs with screening questions yielded 1.9 times more recruiter-applicant interactions in general and 2.4 times more interactions with screening-qualified applicants.

“We found that screening questions often contain information that members do not put in their profiles. Among members who answered screening questions, 33% do not provide their education information in their profile. More specifically, people who hold a secondary education degree are less likely to list it in their profile. As for languages, 70% of members do not list the languages they speak (mostly native speakers) in their profile. Lastly, 37% of members do not include experience with specific tools,” wrote the paper’s coauthors. “In short, we suspect that when people compose their professional profiles, they tend to overlook basic qualifications that recruiters value highly during screening. Therefore, screening questions are much better, more direct signals for applicant screening than member profiles.”


Author: Kyle Wiggers.
Source: VentureBeat

