Artificial intelligence (AI) is now applied across many industries, including recruitment. From small start-ups to large companies with established hiring teams, AI recruitment tools have become an increasingly important aid for finding and hiring new employees. These same tools, however, can also inadvertently exacerbate biases in the job market.
One example is Torrey Podmajersky, a user experience consultant in Seattle, who received a cold email from a recruiter named Jennie Johnson. The email said Johnson had created a career profile for Podmajersky and recommended her for several job openings. But the profile was vague and missing key details, and Podmajersky was not actively looking for a new job; at first she suspected the email was a scam. She later discovered that Jennie Johnson was not a human recruiter at all, but a “virtual career coach” persona run by an AI program from the company behind the email.
Douglas Hamilton, a retired sales and marketing executive in Albany, New York, had a similar experience. The job recommendations the AI recruiter Jennie Johnson compiled for him were just as mismatched: wrong roles, wrong industries, wrong management levels. Hamilton was initially intrigued because it is rare for a recruiting email to include a photo; he later realized that even the photo had been generated by AI.
The online job-hunting tool fronted by Jennie Johnson was created by Hyperleap, a company based in Park City, Utah. The tool is aimed at users who register and activate the service, automatically sending them job listings curated by AI. Kevin Holbert, Hyperleap’s chief operating officer, explained how Jennie Johnson works: the AI searches online job boards and company websites to build a searchable database, then analyzes people’s LinkedIn profiles, resumes, and work histories, matching that information against the database to recommend the most suitable openings.
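Hyperleap has not published how its matching actually works; the sketch below is only a rough, hypothetical illustration of the kind of pipeline Holbert describes, with invented job data and a simple skill-overlap score standing in for whatever proprietary ranking the real system uses.

```python
# Illustrative sketch only: a toy version of the pipeline described above.
# Job data, field names, and the overlap score are all hypothetical.

from dataclasses import dataclass

@dataclass
class JobPosting:
    title: str
    company: str
    skills: set[str]          # keywords extracted from the listing

@dataclass
class CandidateProfile:
    name: str
    skills: set[str]          # keywords pulled from a resume or LinkedIn profile

def match_score(job: JobPosting, candidate: CandidateProfile) -> float:
    """Fraction of the job's listed skills that the candidate also lists."""
    if not job.skills:
        return 0.0
    return len(job.skills & candidate.skills) / len(job.skills)

def recommend(jobs: list[JobPosting], candidate: CandidateProfile, top_n: int = 3):
    """Rank the indexed jobs for one candidate and return the best matches."""
    ranked = sorted(jobs, key=lambda j: match_score(j, candidate), reverse=True)
    return ranked[:top_n]

if __name__ == "__main__":
    # Stand-in for the database built by crawling job boards and company sites.
    jobs = [
        JobPosting("UX Writer", "ExampleCo", {"ux", "content design", "research"}),
        JobPosting("Sales Director", "OtherCo", {"sales", "forecasting", "crm"}),
    ]
    candidate = CandidateProfile("Jane Doe", {"ux", "research", "prototyping"})
    for job in recommend(jobs, candidate):
        print(job.title, job.company, round(match_score(job, candidate), 2))
```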
Although these AI recruitment tools show significant potential for improving efficiency, recruitment-industry experts are increasingly worried that, as the tools grow more complex and widespread, they will make interactions with job seekers more mechanized and impersonal, and that, without careful design and oversight, AI may inadvertently replicate and amplify existing biases in the job market.
Hyperleap’s service is open to anyone, but according to Holbert, the unsolicited emails received by people like Podmajersky and Hamilton were a side effect of early marketing campaigns. Those campaigns included partnerships with recruitment platforms, which left users who had registered on those platforms exposed to a stream of messages from the Jennie Johnson bot. Notably, neither Podmajersky nor Hamilton had actually registered on any of the platforms in question. Holbert added that Hyperleap has since ended the partnership and is continuing to remove inactive users from its mailing list.
In human resource management, automation has been in use for at least a decade, paving the way for services like Jennie Johnson. Regan Gross, an expert at the Society for Human Resource Management (SHRM), has shared her experience of using these automation tools with recruitment practitioners. The tools include applicant tracking software that can filter resumes based on keywords.
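Applicant tracking systems vary by vendor, but the keyword filtering Gross describes can be illustrated with a deliberately simple, hypothetical screen like the one below; note how it also rejects a candidate whose resume describes relevant experience in different words.

```python
# Hypothetical example of keyword-based resume screening; real ATS products
# use their own, often more elaborate, matching rules.

REQUIRED_KEYWORDS = {"python", "sql", "project management"}

def passes_keyword_screen(resume_text: str) -> bool:
    """Return True only if every required keyword appears in the resume."""
    text = resume_text.lower()
    return all(keyword in text for keyword in REQUIRED_KEYWORDS)

resumes = {
    "candidate_a": "Led project management for a data team using Python and SQL.",
    "candidate_b": "Managed cross-functional programmes and built analytics dashboards.",
}

for name, text in resumes.items():
    print(name, "passes" if passes_keyword_screen(text) else "filtered out")
```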
According to Fortune magazine, 97% of Fortune 500 companies worldwide rely on such technology to screen candidates. A 2022 SHRM survey likewise found that 42% of large organizations (those with more than 5,000 employees) use automation or artificial intelligence in HR functions such as recruitment and hiring.
While Gross sees the benefits of these automation tools, she also points out their limitations. AI can handle simple tasks such as organizing a job seeker’s list of skills, but it cannot identify personal characteristics and qualities, which matter just as much to recruiters. She therefore predicts that as AI becomes widespread, job seekers will find new ways to showcase their individuality.
With the continued development of AI, more of the recruitment process is becoming automated. Beyond tools like Jennie Johnson, major platforms such as LinkedIn can now automatically match candidates’ work experience with open positions, and some, like ZipRecruiter, offer chatbot services that simulate human interaction. The latest generation of generative AI even allows employers to outsource the entire recruitment process, from sourcing candidates to conducting preset interviews, to algorithms.
According to Holbert, Hyperleap’s strategy is to give the AI a personalized character to build rapport and make interactions feel more natural. However, Zahira Jaser, an organizational behavior researcher at the University of Sussex Business School, warns that the opacity of automated recruitment processes can have negative effects. Jaser questions whether candidates can tell if they are interacting with a real person; that uncertainty may not only change how they perform during the job search but also undermine recruiters’ ability to accurately assess candidates’ normal behavior.
Close observation of AI interviews has shown that interviewees tend to become stiff and unnatural when they realize they may be interacting with AI rather than a real person: they focus more intently on the screen and suppress natural movements such as hand gestures. For certain groups in particular, such as first-generation college students, people with distinct accents, or those from disadvantaged socio-economic backgrounds, AI interview settings can be detrimental.
Research indicates that without sufficient human oversight, AI-based recruitment processes could exacerbate existing biases and inequities. Researchers state: “While the use of AI systems may appear attractive from a cost-effectiveness standpoint, they could pose diversity and inclusiveness issues if they lack necessary verification.”
Developers of AI recruitment tools often claim that their products can assess candidates in a more objective manner than humans, provided that the training data they rely on is comprehensive and diverse. However, experts from the University of Cambridge specializing in AI ethics point out that this belief is misleading because it overlooks how biases operate in the real world.
Experts explain that AI recruitment systems are often built on historical hiring data, which is itself likely to be riddled with bias and discrimination. Such systems can easily learn discriminatory associations between words and concepts, perpetuating past unfair hiring patterns even when factors such as race and gender are explicitly excluded as inputs.
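The Cambridge experts describe a mechanism rather than code, but a deliberately simplified, hypothetical example can show it: even when gender is never supplied as a feature, a model trained on biased historical decisions learns to penalize words that merely correlated with it.

```python
# Hypothetical illustration of proxy bias: gender is never an input feature,
# yet a model trained on biased historical decisions penalizes words that
# merely correlate with it. All data and word lists here are invented.

from collections import defaultdict

# Historical outcomes from a fictional biased hiring process: equally
# qualified resumes, but those mentioning "netball" were mostly rejected.
history = [
    ({"python", "netball"}, 0), ({"python", "netball"}, 0),
    ({"python", "rugby"}, 1),   ({"python", "rugby"}, 1),
    ({"sql", "netball"}, 0),    ({"sql", "rugby"}, 1),
]

# A crude "model": score each word by the average historical hire rate
# of resumes containing it.
word_hits = defaultdict(int)
word_counts = defaultdict(int)
for words, hired in history:
    for w in words:
        word_counts[w] += 1
        word_hits[w] += hired

word_score = {w: word_hits[w] / word_counts[w] for w in word_counts}

def predict(resume_words: set[str]) -> float:
    """Average the learned word scores; higher means 'more hireable'."""
    scores = [word_score.get(w, 0.5) for w in resume_words]
    return sum(scores) / len(scores)

# Two new, equally skilled candidates differ only in a hobby word that
# happened to correlate with gender in the historical data.
print(predict({"python", "sql", "rugby"}))    # scored higher
print(predict({"python", "sql", "netball"}))  # scored lower, for no job-relevant reason
```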
Despite the current hype around AI recruitment tools, many HR departments remain in wait-and-see mode: they want AI to prove its effectiveness, fairness, and reliability before adopting it. Researchers add that only when AI systems can be shown to perform their tasks fairly and impartially can they be expected to see wider use in recruitment.