Can an Algorithm Hire Better Than a Human?

Hiring and recruiting might seem like some of the least likely jobs to be automated. The entire process appears to demand human skills that computers lack, like making conversation and reading social cues.

But people have biases and predilections. They make hiring decisions, often unconsciously, based on similarities that have nothing to do with the job requirements - like whether an applicant has a friend in common, went to the same school or likes the same sports.

That is one reason that researchers say traditional job searches are broken. The question is how to make them better.

A new wave of startups - including Gild, Entelo, Textio, Doxa and GapJumpers - is trying various ways to automate hiring. They say software can do the job more effectively and efficiently than people can. Many people are beginning to buy into the idea. Established headhunting firms like Korn Ferry are incorporating algorithms into their work, too.

If these companies succeed, they say, hiring could become faster and less expensive, and their data could lead recruiters to more highly skilled people who are better matches for their companies. Another potential result: a more diverse workplace. The software relies on data to surface candidates from a wide variety of places and match their skills to the job requirements, free of human biases.

"Every company vets its own way, by schools or companies on résumés," said Sheeroy Desai, co-founder and chief executive of Gild, which makes software for the entire hiring process. "It can be predictive, but the problem is it is biased. They're dismissing tons and tons of qualified people."

Some people doubt that an algorithm can do a better job than a human at understanding people.

"I look for passion and hustle, and there's no data algorithm that could ever get to the bottom of that," said Amish Shah, founder and chief executive of Millennium Search, an executive search firm for the tech industry. "It's an intuition, gut feel, chemistry."

He compared it to first meeting his wife.

Yet some researchers say notions about chemistry and culture fit have led companies astray. That is because many interviewers take them to mean hiring people they'd like to hang out with.

"Similarity between the interviewer and interviewee - they're from the same region, went to the same school, wore the same shirt, ordered the same tea - is hugely influential, even though it's not predictive of how they perform down the road," said Cade Massey, who studies behavior and judgment at the Wharton School of the University of Pennsylvania.

Instead, researchers say, interviewers should look for collegiality and a commitment to the business's strategy and values.

"A cultural fit is an individual whose work-related values and style of work support the business strategy," said Lauren Rivera, who studies hiring at Northwestern's Kellogg School of Management. "When you get into a lot of the demographic characteristics, you're not only moving away from that definition but you're also getting into discrimination."

They recommend that companies use structured interviews, in which they ask the same questions of every candidate and assign tasks that simulate on-the-job work - and rely on data.

Gild, for instance, uses employers' own data and publicly available data from places like LinkedIn or GitHub to find people whose skills match those that companies are looking for. It tries to calculate the likelihood that people would be interested in a job and suggests the right time to contact them, based on the trajectory of their company and career.
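Gild's actual models are not described in the article, but the matching idea can be illustrated with a toy sketch. The function, candidate records and weighting below are hypothetical, invented for illustration rather than drawn from Gild's software; they only show how skills gathered from public profiles might be scored against a job's requirements.

```python
# Toy illustration of skill-to-requirement matching (hypothetical, not Gild's algorithm).
# Each candidate is represented by skills gathered from public profiles, and a job
# posting lists required and nice-to-have skills.

def skill_match_score(candidate_skills, required, preferred):
    """Return a 0..1 score in which required skills count twice as much as preferred ones."""
    candidate = {s.lower() for s in candidate_skills}
    req_hits = sum(1 for s in required if s.lower() in candidate)
    pref_hits = sum(1 for s in preferred if s.lower() in candidate)
    max_score = 2 * len(required) + len(preferred)
    return (2 * req_hits + pref_hits) / max_score if max_score else 0.0

# Hypothetical data for illustration only.
job_required = ["Python", "distributed systems"]
job_preferred = ["Go", "Kubernetes"]
candidates = {
    "candidate_a": ["python", "kubernetes", "public radio production"],
    "candidate_b": ["Python", "distributed systems", "Go"],
}

for name, skills in candidates.items():
    print(name, round(skill_match_score(skills, job_required, job_preferred), 2))
```

A real system would also have to normalize skill names and weigh evidence of skill (projects, code) rather than self-reported keywords, which is where the human-curated data the article mentions comes in.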

Desai said Gild finds more diverse candidates than employers typically do. In tech, it surfaces more engineers who are women, more who are older, and more who come from a wider variety of colleges and socioeconomic backgrounds.

"If you have white, young male engineers, who are they going to know?" Desai said. "White, young male engineers."

More than 80 percent of the technical employees at most tech companies are men, and less than 5 percent are black or Latino.

One engineer had applied twice to Rackspace, a cloud computing company, without luck. An Army veteran who had worked in public radio, with no high school diploma or professional programming experience, he did not fit the pattern that Rackspace looked for. But Gild suggested him based on the software he had been writing on his own, and he was hired.

The tech industry is a focus for some of the hiring startups in part because it has more jobs than it can fill, and tech companies are under pressure to make their work forces more diverse. At Twitter, for instance, just 10 percent of technical employees are women, and at Facebook and Yahoo, it's around 15 percent. Some women and minorities in tech describe an unwelcoming culture, and in response to the criticism, tech companies have begun publishing their diversity data and pledging to make changes.

Some of the software sounds as touchy-feely as the most empathetic personnel director. Doxa, a new service, plans to match candidates with tech companies and even specific teams and managers based on skills, values and compatibility - like whether a team has more solo work or collaboration, or whether women feel that their opinions are taken seriously.

"There are just so many limitations to the human part of hiring, and the way we're doing it now isn't working because people are unhappy with work," said Nathalie Miller, chief executive and co-founder of Doxa.

So far, Doxa has uncovered aspects of working at companies that are rarely made public to job seekers. The data, from anonymous employee surveys, includes what time employees arrive and leave, how many hours a week they spend in meetings, what percentage work nights and weekends and which departments have the biggest and smallest gender pay gaps.
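The article does not describe how Doxa processes these surveys, but the kind of aggregate it mentions, a per-department gender pay gap drawn from anonymous responses, is simple to compute. The field names and figures below are invented for illustration only.

```python
# Illustrative only: a per-department gender pay gap from anonymous survey
# responses. The field names and figures are invented, not Doxa's data.
from collections import defaultdict
from statistics import median

responses = [  # (department, gender, annual salary) from an anonymous survey
    ("engineering", "woman", 118_000), ("engineering", "man", 131_000),
    ("engineering", "woman", 122_000), ("engineering", "man", 127_000),
    ("design", "woman", 98_000), ("design", "man", 99_000),
]

salaries = defaultdict(lambda: defaultdict(list))
for dept, gender, salary in responses:
    salaries[dept][gender].append(salary)

for dept, by_gender in salaries.items():
    men, women = median(by_gender["man"]), median(by_gender["woman"])
    gap = (men - women) / men * 100
    print(f"{dept}: median pay gap {gap:.1f}%")
```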

Another service, Textio, uses machine learning and language analysis to analyze job postings for companies like Starbucks and Barclays. Textio uncovered more than 25,000 phrases that indicate gender bias, said Kieran Snyder, its co-founder and chief executive. Language like "top-tier" and "aggressive" and sports or military analogies like "mission critical" decrease the proportion of women who apply for a job. Language like "partnerships" and "passion for learning" attract more women.
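Textio's models are far more sophisticated, but the basic check the article describes, flagging phrases whose presence correlates with fewer women applying, can be sketched as a simple phrase scan. The phrase lists below contain only the examples quoted in the article, and the scan itself is a made-up heuristic, not Textio's method.

```python
import re

# Illustrative phrase scan, not Textio's model. The lists hold only examples
# quoted in the article; a real system learns thousands of such phrases and
# weights them by their observed effect on who applies.
MASCULINE_CODED = ["top-tier", "aggressive", "mission critical"]
FEMININE_CODED = ["partnerships", "passion for learning"]

def flag_phrases(posting: str) -> dict:
    text = posting.lower()
    found = lambda phrases: [p for p in phrases if re.search(re.escape(p), text)]
    return {"masculine_coded": found(MASCULINE_CODED),
            "feminine_coded": found(FEMININE_CODED)}

posting = ("We need an aggressive, top-tier engineer for mission critical work, "
           "with a passion for learning.")
print(flag_phrases(posting))
```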

So where do humans fit if recruiting and hiring become automated? Data is just one tool for recruiters to use, people who study hiring say. Human expertise is still necessary. And data is creating a need for new roles, like diversity consultants who analyze where the data shows a company is lacking and figure out how to fix it.

People will also need to make sure the algorithms aren't just codifying deep-seated biases or, by surfacing applicants who have certain attributes, making workplaces just as homogeneous as they were before.

"One of the dangers of these kinds of algorithms," Rivera said, "is people just get overconfident because they're relying on data."

© 2015 New York Times News Service
