October 2022
VOLUME 41, ISSUE 9


Innovations: Fresh Thoughts for Managing

AI Can Make Your Hiring Process More Inclusive

Law firms often lack clarity around which candidates to hire, relying on intuition, resume data and preconceived notions about what makes someone successful. Attrition and burnout persist while efforts to promote diversity remain slow to progress — all of which cost firms cash and drag down morale and efficiency. 

Matt Spencer

As a result, many hiring teams wish they had a more scientific method for choosing the right candidates. This desire for a more data-driven approach has led many firms to turn to AI-powered tools to support these critical talent decisions.

When used properly, artificial intelligence (AI) can enable law firms to create high-performing, happy and diverse teams. But very few products offer a solution that is efficient, effective and ethical. Here are some ways firms can effectively and responsibly approach this valuable technology.

AI SHOULDN’T “SCALE DOWN” TALENT PIPELINES

Reviewing candidate applications is a time-consuming task that can be made more efficient with the right kind of tool. Resume scanners are one solution that many companies apply to the problem of large candidate pools. In fact, according to a Harvard Business School study, resume scanning tools are used by 75% of U.S. employers and 99% of Fortune 500 companies.

However, such software can lead to missed opportunities. The same Harvard Business School study found that automated Applicant Tracking Systems (ATS) take an overly simplistic view of what makes a good or bad candidate, rejecting applicants who would otherwise become high-performing, well-suited additions to an organization.

Resume scanners have also been flagged by the U.S. Justice Department and the Equal Employment Opportunity Commission (EEOC) as tools that could violate civil rights laws and existing EEOC guidelines.

The EEOC states that if the selection rate for a demographic group is less than 80% of the rate for the group with the highest selection rate, adverse impact is present and measures should be taken to resolve the disparity. In the legal industry, many firms use grade point average (GPA) requirements to narrow the hiring funnel. However, our legal industry data shows that a GPA cutoff of 3.5, for example, yields selection rates for Black and Hispanic candidates that are only 52% and 74%, respectively, of the rate for White candidates, well below the 80% threshold, making it an inequitable means of selection.
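To make the four-fifths rule concrete, here is a minimal sketch of the calculation in Python; the group names and counts below are hypothetical and chosen only to reproduce the 52% and 74% ratios cited above:

    # Minimal sketch of the EEOC "four-fifths" (80%) adverse impact check.
    # Group names and counts are hypothetical, for illustration only.
    def adverse_impact(outcomes, threshold=0.80):
        """outcomes maps group -> (selected, applied). Returns the groups whose
        selection rate falls below `threshold` of the highest group's rate."""
        rates = {g: sel / applied for g, (sel, applied) in outcomes.items()}
        best = max(rates.values())
        return {g: r / best for g, r in rates.items() if r / best < threshold}

    # Hypothetical screening results after applying a 3.5 GPA cutoff.
    outcomes = {
        "White": (250, 1000),     # 25.0% selected
        "Black": (130, 1000),     # 13.0% selected
        "Hispanic": (185, 1000),  # 18.5% selected
    }
    print(adverse_impact(outcomes))
    # {'Black': 0.52, 'Hispanic': 0.74} -- both below the 0.80 threshold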

For these reasons and more, using AI to screen resumes and scale down talent pipelines is not an effective or equitable way to streamline or democratize the hiring process. Instead, use AI to “scale up” your candidate consideration capabilities. When used in an inclusive (versus exclusive) manner, this technology can help recruiting teams consider more candidates on a deeper, more comprehensive basis.

To do so, firms must collect and assess additional candidate information. This can include personality, aptitude or skills-based data. AI can then detect patterns linking the traits of your current attorneys to those of candidates and predict each candidate's potential for success, which can dramatically improve hiring outcomes.
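As a rough sketch of the kind of pattern detection described here (not any particular vendor's method), a model can be fit on assessment scores for current attorneys and then used to score candidates on the same assessments; every feature name and number below is hypothetical:

    # Hypothetical sketch: fit a model on current attorneys' assessment scores
    # (aptitude, resilience, collaboration, each scaled 0-1), labeled by whether
    # the attorney is a high performer, then score candidates the same way.
    # Any real tool would also need the bias testing discussed below.
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    attorneys = np.array([
        [0.9, 0.7, 0.8], [0.6, 0.9, 0.7], [0.4, 0.3, 0.5],
        [0.8, 0.6, 0.9], [0.3, 0.5, 0.4], [0.7, 0.8, 0.6],
    ])
    high_performer = np.array([1, 1, 0, 1, 0, 1])
    model = LogisticRegression().fit(attorneys, high_performer)

    # Score two hypothetical candidates on the same assessments.
    candidates = np.array([[0.85, 0.75, 0.70], [0.45, 0.40, 0.55]])
    print(model.predict_proba(candidates)[:, 1])  # estimated success probability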


These improved outcomes include gains in diversity. For the legal industry to achieve adequate demographic representation, firms should recruit from schools outside their historical targets. Currently, only 3.6% of attorneys identify as Black/African American. For the number of Black law students in 2L summer programs across all of Big Law to reach parity with the U.S. population (12.4%), firms would have to consider students from the top 110 law schools, a dramatic increase considering most firms focus their recruiting on a set of 10 to 20 target schools.

Ultimately, any AI-powered hiring tool you deploy should provide explainable and supplemental insights that allow you to consider more, not fewer, candidates. These insights should be used in conjunction with all the other data collected and evaluated in your hiring process to arrive at AI-informed — but human-led — decisions.

TESTING AND REGULATION SHOULD BE WELCOMED

Because this technology has the potential to exclude candidates on inequitable bases, regulations surrounding it are cropping up. For example, a New York City law that goes into effect in January 2023 requires employers to conduct an independent audit of the AI hiring tools they use. It’s only a matter of time before more widespread regulations require thorough, documented, ongoing and third-party testing of any AI-powered hiring tool.

To be prepared, firms should test every AI model used in the hiring process against a large candidate pool spanning a range of demographics to ensure all outputs meet or exceed EEOC guidelines. This increased oversight will help ensure that candidates are not discounted because of biased factors. However, AI companies and the firms that employ them should not wait for regulations to require such testing; they should take it upon themselves to proactively ensure all candidates are given a fair shot in the hiring process.
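One way to operationalize such testing is to run every model over a demographically labeled test pool before deployment and fail the audit whenever the four-fifths rule is violated. The sketch below assumes a hypothetical screen function that returns True when the model advances a candidate:

    # Sketch of an automated pre-deployment audit: run the screening model over
    # a demographically labeled test pool and fail loudly if any group's
    # selection rate drops below 80% of the highest group's rate.
    from collections import defaultdict

    def audit(screen, pool, threshold=0.80):
        """pool is a list of (candidate_features, demographic_group) pairs."""
        selected, total = defaultdict(int), defaultdict(int)
        for features, group in pool:
            total[group] += 1
            selected[group] += int(screen(features))
        rates = {g: selected[g] / total[g] for g in total}
        best = max(rates.values())
        failures = {g: r / best for g, r in rates.items() if r / best < threshold}
        assert not failures, f"Adverse impact detected: {failures}"
        return rates

    # Hypothetical usage, rerun on every model update:
    # audit(screen, labeled_test_pool)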

 
