How to Avoid Discrimination When Using AI


Given the prevalence and rising use of artificial intelligence for customer service, feedback and general information, it’s no surprise that HR teams are adopting AI-driven bots for workplace communication.

Companies are embracing bot tools as time and money savers to conduct and evaluate interviews, substituting them for face-to-face conversations. AI tools can also screen resumes, monitor employees and provide predictive analytics.

Jennifer Betts, an attorney with Ogletree Deakins, joined her colleague Joseph L. Beachboard during the session “AI, the Right Way: Avoiding Employment Discrimination with Artificial Intelligence” at the recent SHRM Annual Conference & Expo 2021 to discuss the trend.

The Emergence of AI

AI uses machines, or computers, to perform tasks intelligently, meaning the system can change course depending on the information it collects. These tasks are carried out through algorithms, or sets of instructions for the computer to follow.

AI-enabled content generation and other AI-based tools have been on the market for years. They have now moved from the “hype” of new technology into the adoption phase, driven by necessities resulting from the pandemic, Betts said.

There are many different forms of AI. The two most important forms for employers to understand are machine learning and natural language processing.

Machine learning involves AI systems whose performance improves as they are fed more data and predict more outcomes. In other words, they become more accurate over time and through wider use.
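The idea that a model sharpens as its training data grows can be shown with a minimal sketch. The scenario below is entirely hypothetical: a one-feature classifier learns a cutoff score as the midpoint between two simulated class averages, and the learned cutoff tends to land closer to the true boundary as the sample size grows.

```python
import random

def train(samples):
    """Fit a one-feature classifier: threshold = midpoint of the two class means."""
    lows = [x for x, y in samples if y == 0]
    highs = [x for x, y in samples if y == 1]
    return (sum(lows) / len(lows) + sum(highs) / len(highs)) / 2

def draw(n):
    """Simulate labeled training data: noisy scores around each class mean (3 and 7)."""
    data = []
    for i in range(n):
        y = i % 2  # alternate labels so both classes are always present
        data.append((random.gauss(7 if y else 3, 2), y))
    return data

random.seed(42)
for n in (10, 100, 1000):
    t = train(draw(n))
    print(f"{n} samples -> learned threshold {t:.2f} (true boundary is 5.0)")
```

With only a handful of samples the estimate can wander; with a thousand it settles near the true boundary, which is the sense in which machine-learning systems "improve with more data."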

Natural language processing is the branch of computer science—and more specifically, the branch of AI—concerned with giving computers the ability to understand text and spoken words in much the same way humans can.

AI today can be found in autonomous vehicles, injury prediction, fraud detection, precision medicine, photo tagging and “talk to text.”


Proper AI Programming Is Key

AI-powered analytics tools make it easier and cheaper to measure productivity, identify trends and recognize potential areas for improvement, all necessary enhancements in the workplace of the future.

There are many slow adopters and skeptics of AI. Betts said it’s important to realize that “Artificial intelligence itself is neither inherently good nor inherently bad. It’s critical to remember that AI’s effectiveness is all about how the AI and bots are programmed and maintained, not the concept of AI itself.”

AI and Hiring Practices

AI in hiring has grabbed a lot of headlines lately, with articles reporting on how applicants can “beat the system” by loading their resumes with specific words and phrases that suggest they are a better fit for the job description.

Many organizations use AI during the initial stages of the hiring process, such as having a bot deliver programmed questions to applicants, which can save time for hiring managers who may be considering dozens of candidates for a position.

AI also can decipher the responses given—or words and phrases used on a resume—so that companies can narrow their search for viable candidates.
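The keyword screening described above can be sketched in a few lines. The keywords and weights below are hypothetical, not from any real product; the point is that a tool which sums weighted keyword matches will rank resumes by whatever its word list rewards, which is exactly where bias can creep in if the weights correlate with protected characteristics.

```python
import re

# Hypothetical keyword weights such a screening tool might apply.
KEYWORDS = {"python": 3, "sql": 2, "teamwork": 1, "kubernetes": 2}

def score_resume(text):
    """Score a resume by summing the weights of matched keywords."""
    tokens = re.findall(r"[a-z]+", text.lower())
    return sum(KEYWORDS.get(tok, 0) for tok in tokens)

resume = "Built Python and SQL pipelines; led teamwork initiatives."
print(score_resume(resume))  # python(3) + sql(2) + teamwork(1)
```

This also illustrates the “beat the system” tactic from the hiring headlines: adding the right words raises the score regardless of actual fit.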

By looking at word choice, for example, these tools can uncover potential biases in the responses, or even in the way the company frames its questions to candidates.

Anti-Discrimination Legal Guidelines

This is where things can become legally complicated for employers. Employers must be aware of federal anti-discrimination laws enforced by the Equal Employment Opportunity Commission (EEOC), including Title VII of the Civil Rights Act of 1964, the Age Discrimination in Employment Act (ADEA), the Genetic Information Nondiscrimination Act of 2008 (GINA) and the Americans with Disabilities Act of 1990 (ADA).

Under most federal anti-discrimination laws, employers may be sued for disparate treatment or disparate impact. According to the EEOC, disparate treatment “occurs when an employer treats some individuals less favorably than other similarly situated individuals because of their race, color, religion, sex, or national origin. To prove disparate treatment, the charging party must establish that respondent’s actions were based on a discriminatory motive.”

Disparate impact occurs when “discrimination results from neutral employment policies and practices which are applied evenhandedly to all employees and applicants, but which have the effect of disproportionately excluding women and/or minorities.”
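One common way to audit a screening tool for disparate impact is the “four-fifths rule” from the EEOC’s Uniform Guidelines on Employee Selection Procedures: a group whose selection rate falls below 80 percent of the highest group’s rate is generally regarded as showing evidence of adverse impact. It is a rule of thumb, not a legal bright line. A minimal sketch with hypothetical numbers:

```python
def selection_rate(selected, applicants):
    """Fraction of a group's applicants who were selected."""
    return selected / applicants

def four_fifths_check(rates):
    """Flag groups whose selection rate is below 80% of the highest group's rate."""
    top = max(rates.values())
    return {group: rate / top < 0.8 for group, rate in rates.items()}

# Hypothetical applicant pools for two groups.
rates = {
    "group_a": selection_rate(48, 80),  # 0.60
    "group_b": selection_rate(12, 40),  # 0.30
}
print(four_fifths_check(rates))  # group_b is flagged: 0.30 / 0.60 = 0.5 < 0.8
```

Rerunning a check like this on live data over time is one concrete way to act on the validation and ongoing-audit advice in this article.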

To avoid claims of disparate treatment or disparate impact, Betts recommended validating any employment screening tool, including AI-enabled tools; establishing pre-employment testing scores; and documenting that validity.

State Law Summary

Employers using AI-powered tools must also be mindful of compliance with state law because the list of states with laws applicable to hiring and AI is growing. Among them:

  • Maryland and Illinois: Both states have passed laws relating to the use of facial recognition software with employment applicants.
  • Connecticut: Along with other states, Connecticut requires certain disclosures regarding any employer electronic monitoring.
  • California: California has passed a concurrent resolution promoting the use of AI in the employment setting and has a comprehensive data privacy law.
  • Illinois, Texas and Washington: All three have laws relating to biometric privacy.

Other AI Uses

Sensitive communication. Bots can be used to streamline communication between a company and its workers. For example, an employee might be hesitant to share certain details of a case of bias with a person, but could be more willing to respond to questions from a bot. There are a lot of comfort-level issues to be worked through with AI-human communications in general, though.

Employee monitoring. AI can be used in employee tracking, such as monitoring desktop activities, blocking distractions, and calculating work time and break time.
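The work-time and break-time calculation mentioned above reduces to summing intervals between activity events. The event-log format below is a hypothetical simplification, not any vendor’s actual schema:

```python
from datetime import datetime, timedelta

# Hypothetical desktop-activity log: (timestamp, "active" or "idle") events.
events = [
    (datetime(2021, 10, 25, 9, 0), "active"),
    (datetime(2021, 10, 25, 10, 30), "idle"),
    (datetime(2021, 10, 25, 10, 45), "active"),
    (datetime(2021, 10, 25, 12, 0), "idle"),
]

def tally(events):
    """Sum active and idle durations between consecutive events.

    The final event has no successor, so it only closes the prior interval.
    """
    work = brk = timedelta()
    for (t0, state), (t1, _) in zip(events, events[1:]):
        if state == "active":
            work += t1 - t0
        else:
            brk += t1 - t0
    return work, brk

work, brk = tally(events)
print(f"work: {work}, breaks: {brk}")  # work: 2:45:00, breaks: 0:15:00
```

Telling employees that this is the kind of data being collected, and how it is aggregated, is part of the disclosure the article recommends.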

When using employee monitoring software, the company should roll it out with an effective communications campaign. An employer should consider telling employees what is being tracked, how and why. Monitoring can be controversial and, if not communicated properly, may lead employees to believe they are on an episode of “Big Brother.”

Given today’s fiercely competitive hiring environment, in which so many employees (over 40 percent) are looking for other jobs, HR professionals don’t want to give workers a reason to leave the company.

Therefore, Betts said, “In most instances, state laws require you to protect employees’ privacy rights by giving them advance notice of your monitoring. The best practice is to get employees’ consent for monitoring in writing.”

HR Best Practices

Employers that are considering using AI-powered tools in their workplace may do well to keep the following in mind:

  • Develop multidisciplinary innovation teams that include legal and human resource staff.
  • Continue human review of AI-assisted decision-making.
  • Implement disclosure and informed consent when necessary and appropriate.
  • Audit what is being measured before implementing the program and on an ongoing basis.
  • Impose tight controls on data access.
  • Engage in careful external vendor contract reviews.
  • Work with vendors that take an inclusive approach to design. Consider whether the designers and programmers come from diverse backgrounds and have diverse points of view.
  • Insist on review of external validation studies.

Choosing an AI Software Partner

There are many firms that can provide AI software for the workplace. Betts said it is critical for employers to carefully review any contracts, and she suggested asking vendors the following questions when weighing their options:

  • What kinds of statistical analysis do you perform to test your products, and how and why did you select those methods?
  • What were the results of your analysis? Can I see a copy?
  • Do you retest for disparate impact over time? How frequently?
  • Can you give references for companies that have used your services or tool?
  • Do you have diversity consultants or similar staff with whom you consult regarding your tools?

Paul Bergeron is a freelance writer based in Herndon, Va.

