Sorted Out:
A Brief on Discriminatory Hiring Decision AI in Pennsylvania

April 17, 2024

Issue Brief Prepared By
Matthew McGinn
ENGL 138H
Overview
This brief aims to educate policymakers within the Commonwealth of Pennsylvania and urge them to institute change concerning possible discrimination caused by the use of artificial intelligence (AI) in the hiring process. It addresses the origins of that potential discrimination, surveys the policies needed to address bias, and recommends legislation for the PA General Assembly to regulate AI hiring decision tools.
Introduction

PA Workers’ Rights: A Keystone Issue

It has been almost seven decades since the Pennsylvania General Assembly took the progressive step of passing the PA Human Relations Act (HRLA) in 1955, providing protection against labor discrimination and creating the Pennsylvania Human Relations Commission to affirmatively further equality in employment. This legislation was a testament to Pennsylvania’s dedication to enshrining inalienable rights in law, and it predates the federal Civil Rights Act of 1964 by nine years. However, unlike the Civil Rights Act, the HRLA has never been amended in its long lifespan, even though the labor market has changed considerably since 1955.
One shift in human relations never imagined by the framers of the HRLA, and yet to be fully addressed by the current General Assembly, is the introduction of AI-based software known as automated employment decision tools (AEDTs). The technology has grown popular very quickly: ninety-nine percent of all Fortune 500 companies1 and twenty percent of leaders in the HR field have used AEDTs at least once.2 HR workers first input a job description and application materials, such as résumés and cover letters, into the AEDT. The embedded AI then sifts through the language of the material and calculates the probability of each candidate’s success from the words on their application and the entered job description, using patterns learned from previous applications. The final product is an AI-constructed list ranking candidates from best to worst.
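As an illustration only, the pattern-matching workflow described above can be sketched in miniature. Real AEDTs use far more complex models; the term weights, résumés, and candidate names below are entirely invented:

```python
# Hypothetical sketch of the AEDT workflow described above: score each résumé
# by the weights of terms the model "learned" from past successful hires,
# then output a list ranking candidates from best to worst. Real AEDTs are
# far more complex; these weights and résumés are invented for illustration.

def rank_candidates(resumes, learned_weights):
    """Return candidates sorted best-to-worst by learned-term score."""
    scored = []
    for name, text in resumes.items():
        words = set(text.lower().split())
        score = sum(w for term, w in learned_weights.items() if term in words)
        scored.append((name, score))
    return sorted(scored, key=lambda pair: pair[1], reverse=True)

# Patterns "learned" from past hires -- note how an arbitrary proxy
# ("lacrosse") can outweigh legitimate qualifications.
learned_weights = {"python": 1.0, "management": 1.0, "lacrosse": 3.0}
resumes = {
    "candidate_a": "ten years of management experience in python development",
    "candidate_b": "varsity lacrosse captain",
}
ranking = rank_candidates(resumes, learned_weights)
print(ranking)  # candidate_b outranks candidate_a on the proxy term alone
```

The toy example shows the core risk: the ranking is only as reasonable as the patterns the model happened to learn.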
Through AEDTs, résumé parsing has gone from a monotonous, time-consuming process skewed by the subjective opinion or fatigue of an HR worker to an automated, seemingly hyper-accurate task performed in a few minutes. In one study, AI-based systems reduced the time spent reading and ranking applications by 90% compared to human HR workers.3

Moreover, human HR workers look at a résumé for about seven seconds on average. AI reviews a résumé in a tiny fraction of that time and retains far more information.4

The “Black Box” Problem and a Discriminatory Machine

The artificial intelligence in AEDTs does not necessarily look for the same generic qualities that human HR workers seek in applicants, like “good experience” and “solid work ethic.” Instead, AI will cut as many corners and make as many broad assumptions as possible to quickly produce a list of quality candidates from the patterns it learned from existing data. This algorithmic thinking can let very obscure attributes become the leading cause of an applicant earning a high ranking. The phenomenon is best represented by the results of a 2018 audit of an American AEDT. The audit showed that the two most significant characteristics the AI used to predict high job performance in a candidate were the name “Jared” and whether the candidate had played high school lacrosse.5
Herein lies what is known as the “Black Box” problem. When treated like an opaque, unquestioned reading machine, AI can make unreasonable decisions that seem legitimate, and developers rarely code AI to “show its work.”6
If we apply the “Black Box” problem to AEDTs, we can see a huge risk of discrimination. AI typically learns the patterns it uses to find successful candidates from the pre-existing datasets it is “educated” on. For AEDTs, these datasets typically consist of past applications from current or former employees. If the AI is trained on datasets dominated by groups already well represented in the labor force, underrepresented populations can face more difficulty making it past the AI screening stage of the hiring process.
This is not just theory. From 2014 to 2018, Amazon failed in its attempt to create an AEDT because developers were unable to solve the system’s inherent gender discrimination.7 Furthermore, a recent experiment found that when the generative AI ChatGPT sifted through fabricated résumés, Black male applicants were top-ranked only 7.6% of the time, 10% less frequently than the best-performing group.8 Even in the aforementioned 2018 audit, the characteristics of “Jared” and “high school lacrosse” lending themselves to a high ranking immediately skew success toward white male applicants.9

Sorting Out the Next Step

Underrepresented groups currently encounter challenges finding legal recourse for discriminatory AI. Applicants may not even know that AI interacted with their application or led to the discrimination that cost them a job. Suing for algorithmic discrimination is also difficult under current federal law. In a Title VII Civil Rights Act lawsuit for labor discrimination, applicants would need to prove that an employer engaged in actions, intentional or not, that had a disproportionate effect on their hiring chances. However, most attempts to establish discriminatory intent, direct or otherwise, arising from algorithmic bias involve great uncertainty.10
As of early 2024, no U.S. state legislature has passed a law mitigating AEDT-based hiring discrimination.11 Thus, the Pennsylvania General Assembly is in the unprecedented position of acting first to comprehensively legislate on a widespread issue in employment inequality. The House Committee on Labor & Industry currently has before it a drafted bill that would amend the HRLA to include protections from AEDT discrimination, an important first step in generating discussion of algorithmic bias. However, the bill has not been touched by the Committee since its referral in the fall of 2023.
Regarding employment anti-discrimination law, Pennsylvania’s legislators have historically been ahead of the curve. To serve the civil rights of all Pennsylvanians and continue the Pennsylvanian legacy of progressivism in tackling employment discrimination, the Pennsylvania General Assembly should act to reform and regulate AI use in the hiring process as it currently stands. Allowing automated applicant tracking systems to go unsupervised could result in systemic discrimination against underrepresented groups in the labor market, and the current protections afforded by the PA Human Relations Act are not enough to fully safeguard Pennsylvanians from discriminatory AI. The purpose of this issue brief is to address how the General Assembly can take the opportunity to break ground on state regulation of AI in the hiring process and how PA legislators can shape the Commonwealth’s current position and proposals on AI into workable, enforceable law.

Proposals

Proposal 1: Looking Into the “Black Box”

If employers or software developers cannot be trusted to examine and remedy biases in AI algorithms, then the state government should handle oversight on discrimination. A common way to check AEDTs for bias is through independent auditing. According to the Brookings Institution, audits of AI systems should inspect the decision tool for “bias,…accuracy, robustness, interpretability, privacy characteristics, and other unintended consequences.”12 In essence, these criteria measure the AI’s degree of disparate impact: unintentional discrimination, with no clear justification, against a protected group by a process or technology deemed neutral at face value. Audits very commonly check the datasets the AI learns from for representativeness and feed the AI fabricated application materials to check for statistical differences in success between groups. Auditors also commonly check which factors play the greatest role in the algorithmic determination of candidate success, ensuring that reasonable and equitable characteristics matter most. Well-designed and well-explained audits can give both applicants and employers confidence in the safety and equity of the AI used for a certain job opening.
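The fabricated-application technique described above can also be sketched in miniature. Every number below is invented; a real audit would run a real AEDT over far larger matched samples:

```python
# Minimal sketch of one common audit technique described above: submit matched
# fabricated applications that differ only in a group attribute, record which
# pass the AI screen, and compare per-group selection rates. All outcomes here
# are invented stand-ins for a real AEDT's decisions.

def selection_rates(screen_results):
    """screen_results: dict mapping group -> list of 0/1 pass outcomes."""
    return {g: sum(r) / len(r) for g, r in screen_results.items()}

def impact_ratio(rates):
    """Ratio of the lowest to the highest group selection rate."""
    return min(rates.values()) / max(rates.values())

# Hypothetical outcomes for ten matched résumé pairs
screen_results = {
    "group_a": [1, 1, 1, 1, 0, 1, 1, 0, 1, 1],  # 8 of 10 pass the screen
    "group_b": [1, 0, 0, 1, 0, 1, 0, 0, 1, 0],  # 4 of 10 pass the screen
}
rates = selection_rates(screen_results)
print(rates, impact_ratio(rates))  # a low ratio flags possible disparate impact
```

The closer the ratio is to 1.0, the more evenly the tool treats the matched groups; a large gap is exactly the statistical difference auditors look for.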
Mandating audits for developers or employers has been done before, both in the United States and abroad. In January of 2023, New York City began enforcing Local Law 144, which stipulates that an independent third-party audit inspecting for discrimination must be performed on an AEDT before a company or HR firm uses it.13 In the European Union, the sweeping new 2024 AI reform law requires developers to provide the EU AI Office with evidence that reasonably explains the elimination or mitigation of bias before the AEDT can enter the market.14 Throughout an AEDT’s lifetime, its providers must also routinely check it for bias. Under New York City law, employers are legally liable for the use of unaudited AEDTs and may be subject to a schedule of fines or facilitated legal redress from applicants. In the EU, scrutiny falls more on AI developers, with similar penalties for the illicit sale of unaudited AI.

Proposal 2: Turning the “Black Box” Transparent

A solution that often runs concurrently with auditing is notifying applicants of the use of AI in the hiring process. A 2023 Pew study found that 66% of American job applicants would not apply for a job knowing that AI was used in the hiring process.15 Thus, a system that notifies applicants of AI and provides evidence of the system’s fairness would help applicants make well-informed decisions about where to apply and where to look for possible discrimination. Under the aforementioned New York City and EU AEDT laws, companies are required to notify applicants of possible AI interaction with their application materials, and in New York City, companies must post the results of the most recent AI audit on their official public website.16, 17 The results take the form of an executive summary, but some say the datasets that inform the AI are also needed to paint a fuller picture of the AEDT’s equity.
AI software developers oppose such laws, arguing that they reveal trade secrets and violate a corporation’s right to privacy under the law. Under the Pennsylvania Uniform Trade Secrets Act, practices or datasets considered “trade secrets” are protected if they are “the subject of efforts that are reasonable under the circumstances to maintain its secrecy.”18 The threat posed by unregulated AI provides clear grounds to override the “reasonable circumstances” clause that AI code or training datasets would otherwise be protected under. Thus, regulating AI makes legal sense and does not constitute an infringement of an organization’s right to the maintenance of trade secrets.

Analysis and Recommendation


Taking into consideration the current proposals set before the state, the Pennsylvania General Assembly ought to form and pass legislation mandating audits for AEDTs and requiring notification and transparency for applicants. As previously mentioned, the Pennsylvania General Assembly currently has a drafted bill, known as H.B. 1729, amending the PA Human Relations Act to include protection from AEDT bias.19 It would require any company based in Pennsylvania wishing to use AEDTs to have an independent audit conducted on the tool before use, with audit results posted to the company’s website, and would require obtaining the applicant’s consent before allowing AI to interact with their application. All complaints of violations of the bill are to be directed to the Human Relations Commission. The bill has been sitting in the House Committee on Labor and Industry since September of 2023 with no debate, hearing, or amendment on its contents.

Moving this bill forward is important, but definite amendments are needed. First, all liability falls upon employers using AI, who bear both the burden of conducting audits and culpability for discrimination. Very often, firms use ready-made AI tools created by third-party developers rather than an AEDT made in-house, yet the bill places no scrutiny on such independent AI providers. Further, no language indicates what exactly audits should look for when checking AEDTs for bias. The bill indicates the audit should “assess the disparate impact” but elaborates no further, with no statistical thresholds or guidelines for rectifying found bias. Important additions would be indicating that audits should look for adherence to the EEOC’s “80% rule” for the ratio of underrepresented to well-represented group hires and calling for the immediate suspension of any AEDT found to be biased until it is proven otherwise by a subsequent audit.20
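The EEOC’s 80% (four-fifths) rule mentioned above reduces to simple arithmetic, sketched here with invented selection rates:

```python
# Sketch of the EEOC "80% rule" check an amended bill could mandate: an AEDT
# shows adverse impact if any group's selection rate falls below 80% of the
# highest group's rate. The selection rates below are invented examples.

def passes_four_fifths_rule(rates, threshold=0.8):
    """rates: dict mapping group -> fraction of applicants passing the screen."""
    highest = max(rates.values())
    return all(rate / highest >= threshold for rate in rates.values())

biased_tool = {"well_represented": 0.50, "underrepresented": 0.30}
fair_tool = {"well_represented": 0.50, "underrepresented": 0.45}

print(passes_four_fifths_rule(biased_tool))  # False: 0.30/0.50 = 0.60 < 0.80
print(passes_four_fifths_rule(fair_tool))    # True:  0.45/0.50 = 0.90
```

A tool failing this check would, under the suggested amendment, be suspended until a subsequent audit proved otherwise.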
Even though AI may seem like dangerous machinery that can be incredibly haphazard in high-risk situations, artificial intelligence is the definitive future of human relations. Whether that future is shaped by an unquestioned algorithm making clear discrimination look like logical destiny, or by a transparent tool used in conjunction with good human judgment, is up to legislators to decide. To ensure the future is shaped in a manner that serves both better business and furthered civil rights, Pennsylvania should act first to manage hiring AI, as early as possible, before the road forks in the wrong direction.

Endnotes (Text)
1. Myers, Sydney. “2023 Applicant Tracking System (ATS) Usage Report: Key Shifts and Strategies for Job Seekers.” Jobscan, 2 Oct. 2023,
www.jobscan.co/blog/fortune-500-use-applicant-tracking-systems/#general-ats-distribution. Accessed 27 Mar. 2024.
2. Schultz, Sara. “70% of Talent Leaders Use AI – but Only 20% Use It in the Hiring Process, according to AMS.” AMS, 13 Oct. 2023,
www.weareams.com/news/70-percent-of-talent-leaders-use-ai-only-20-percent-use-it-in-the-hiring-process/. Accessed 27 Mar. 2024.
3. Black, J. Stewart, and Patrick van Esch. “AI-Enabled Recruiting: What Is It and How Should a Manager Use It?” Business Horizons, vol. 63,
no. 2, 1 Mar. 2020, pp. 215–226, www.sciencedirect.com/science/article/pii/S0007681319301612, https://doi.org/10.1016/j.bushor.2019.12.001.
Accessed 27 Mar. 2024.
4. Ladders, Inc. “Ladders Updates Popular Recruiter Eye-Tracking Study with New Key Insights on How Job Seekers Can Improve Their Resumes.” PR Newswire, 6 Nov. 2018, www.prnewswire.com/news-releases/ladders-updates-popular-recruiter-eye-tracking-study-with-new-key-insights-on-how-job-seekers-can-improve-their-resumes-300744217.html. Accessed 27 Mar. 2024.
5. Gershgorn, Dave. “Companies Are on the Hook If Their Hiring Algorithms Are Biased.” Quartz, 22 Oct. 2018,
qz.com/1427621/companies-are-on-the-hook-if-their-hiring-algorithms-are-biased. Accessed 27 Mar. 2024.
6. Blouin, Lou. “AI’s Mysterious “Black Box” Problem, Explained | University of Michigan-Dearborn.” Umdearborn.edu, 6 Mar. 2023,
umdearborn.edu/news/ais-mysterious-black-box-problem-explained. Accessed 27 Mar. 2024.
7. Dastin, Jeffrey. “Amazon Scraps Secret AI Recruiting Tool That Showed Bias against Women.” Reuters, 10 Oct. 2018, www.reuters.com/article/us-amazon-com-jobs-automation-insight/amazon-scraps-secret-ai-recruiting-tool-that-showed-bias-against-women-idUSKCN1MK08G/. Accessed 27 Mar. 2024.
8. Yin, Leon, et al. “OpenAI’s GPT Is a Recruiter’s Dream Tool. Tests Show There’s Racial Bias.” Bloomberg.com, 8 Mar. 2024,
www.bloomberg.com/graphics/2024-openai-gpt-hiring-racial-discrimination/#:~:text=The%20interviews%20and%20experiment%20show.
Accessed 27 Mar. 2024.
9. Gershgorn, Dave. “Companies Are on the Hook If Their Hiring Algorithms Are Biased.”
10. Chin-Rothmann, Caitlin. “Assessing Employer Intent When AI Hiring Tools Are Biased.” Brookings, 13 Dec. 2019,
www.brookings.edu/articles/assessing-employer-intent-when-ai-hiring-tools-are-biased/. Accessed 27 Mar. 2024.
11. NCSL. “Artificial Intelligence 2023 Legislation.” Www.ncsl.org, 20 July 2023,
www.ncsl.org/technology-and-communication/artificial-intelligence-2023-legislation. Accessed 27 Mar. 2024.
12. Engler, Alex. “Auditing Employment Algorithms for Discrimination.” Brookings, 12 Mar. 2021,
www.brookings.edu/articles/auditing-employment-algorithms-for-discrimination/. Accessed 27 Mar. 2024.
13. Francis, Simone R.D. “New York City to Restrict Use of Automated Employment Decision Tools: What Employers Should Know.” Ogletree, 6 Jan. 2022, ogletree.com/insights-resources/blog-posts/new-york-city-to-restrict-use-of-automated-employment-decision-tools-what-employers-should-know/. Accessed 27 Mar. 2024.
14. Future of Life Institute. “High-Level Summary of the AI Act | EU Artificial Intelligence Act.” EU Artificial Intelligence Act, 27 Feb. 2024,
artificialintelligenceact.eu/high-level-summary/. Accessed 27 Mar. 2024.
15. Nadeem, Reem. “AI in Hiring and Evaluating Workers: What Americans Think.” Pew Research Center: Internet, Science & Tech, 20 Apr.
2023, www.pewresearch.org/internet/2023/04/20/ai-in-hiring-and-evaluating-workers-what-americans-think/. Accessed 27 Mar. 2024.
16. Francis, Simone R.D. “New York City to Restrict Use of Automated Employment Decision Tools: What Employers Should Know.”
17. Future of Life Institute. “High-Level Summary of the AI Act | EU Artificial Intelligence Act.”
18. Buchanan Ingersoll & Rooney. “Primer on Pennsylvania Trade Secret Law Following Enactment of the Uniform Trade Secrets Act.”
Buchanan Ingersoll & Rooney PC, 5 Apr. 2011,
www.bipc.com/primer-on-pennsylvania-trade-secret-law-following-enactment-of-the-uniform-trade-secrets-act. Accessed 27 Mar. 2024.
19. DataGuidance. “Pennsylvania: Bill on AI Use in Hiring Process Introduced to House of Representatives.” DataGuidance, 30 Oct. 2023,
www.dataguidance.com/news/pennsylvania-bill-ai-use-hiring-process-introduced#:~:text=The%20bill%20would%20require%20employers.
Accessed 27 Mar. 2024.
20. Purdue Global Law School. “Automated Employment Decision Tools in the Crosshairs of New Law.” Purdue Global Law School, 23 Jan.
2024, www.purduegloballawschool.edu/blog/news/automated-employment-decision-tools. Accessed 27 Mar. 2024.

