Episode 224: Eliminating Algorithmic Bias in Hiring and Employment

Episode Link: http://workology/ep224-wp
Intro: ​[00:00:01.02] ​Welcome to the Workology Podcast, a podcast for the disruptive workplace leader. Join host
Jessica Miller-Merrell, founder of Workology.com, as she sits down and gets to the bottom of trends, tools and case
studies for the business leader, HR and recruiting professional who is tired of the status quo. Now here's Jessica with
this episode of Workology.
Jessica Miller-Merrell: [00:00:26.99] This Workology podcast is sponsored by Workology. The business case for
artificial intelligence and HR in your workplace is growing by leaps and bounds every single day. Employers and HR
leaders, though, have real concerns about bias when hiring using AI, which is commonly referred to as algorithmic
bias. Say that three times fast. Algorithmic bias. In today's Workology podcast, we're going straight to the source and
talking to one AI company that works in HR about its approach to reducing algorithmic bias and the steps that
employers can take. This episode is part of our Future of Work series, which is powered by PEAT, the Partnership
on Employment and Accessible Technology. In
honor of the upcoming 30th anniversary of the Americans with Disabilities Act this July, we're investigating what the
next 30 years will look like for people with disabilities at work and the potential of emerging technologies to make
workplaces more inclusive and accessible. Today, I'm joined by Steve Feyer. He's the Director of Product Marketing
with HR technology company, Eightfold AI. Steve, welcome to the Workology podcast.
Steve Feyer: ​[00:01:41.69] ​Thank you so much for having me, Jessica. It's a pleasure to be here.
Jessica Miller-Merrell: ​[00:01:44.57] ​Talk a little bit about your background. Walk us through how you became the
director of product marketing for Eightfold AI.
Steve Feyer: ​[00:01:50.99] ​Sure. ​I’d be happy to. Well, we are a Silicon Valley based company providing an artificial
intelligence solution for talent. I found my way here by working with several of the current executives and founders of
the company and, in general, I've spent my whole career working in startup businesses, focused on cutting edge
technology and in particular on business solutions that, at least I feel, are going to have a positive impact on the
world. And so one of the reasons why Eightfold is very exciting for me is that we are helping to create better inclusion
in the world of employment.
Jessica Miller-Merrell: [00:02:31.73] Awesome. Well, I'm excited to have your expertise and background because, as
you might imagine, there are a lot of questions and a real need for information and resources on the subject of artificial
intelligence for human resources and recruitment. Can you walk us through the difference between some
different types of biases? These come up often when I'm talking with HR people. What's the difference
between conscious and unconscious bias when it comes to hiring and employment?
Steve Feyer: ​[00:03:07.58] ​That is such an important question. So if I may, I'd like to just talk very quickly about what
bias is in the first place. And a social scientist may have a much more nuanced definition, but in its most basic form,
bias is making any decision based on something that should be an irrelevant factor. So if we're thinking about hiring
decisions that a hiring manager would make, a bias would be taking place if that manager is making their hiring
decision based on a factor that doesn't matter for the job. The manager should be looking for the person who has the
Workology Podcast​ ​| www.workologypodcast.com | @workology
right skills, the right experience, the right capability to do the job. That's what matters. But if they're making a decision
based on that person's gender, that person's age, if that person is a person with a disability, those are examples of
bias. So a conscious bias would be any situation in which that decision was purposeful. The hiring manager
purposefully thinks, I don't want to hire female candidates, or I don't want to hire older candidates. I think if we
encountered that type of conscious bias in our lives, we would probably notice it right away because it may almost be
shocking to see, but unconscious bias can occur as well. And that would be any situation in which the decision maker
is making a decision based on that irrelevant factor. But they're not aware of it. That's not their intention. So in the
world of hiring, I think a really good example of unconscious bias that I've seen a lot of awareness of in recent years
is the use of language in job descriptions. So companies will write job descriptions for the jobs that they're hiring and
often the words that they use will stop describing the skills and experience they're looking for and actually start to
describe the personality of who they're looking for.
Steve Feyer: ​[00:05:04.63] ​So these are if you see words like, we're looking for a go-getter or we're looking for a
self-starter who powers through obstacles. Things like that. What I see a lot of in Silicon Valley is that companies will
say, we're looking for a ninja. I want a marketing ninja or an engineering ninja for this job. But the problem is, this kind
of language actually turns out to be, I think, unintentionally discriminatory, in particular discriminatory against female
candidates, because studies have shown that certain words and phrases discourage female candidates from
applying, but do not have that impact on male candidates. So this particular example would be one kind of
unconscious bias. As I said, I think fortunately companies are becoming very aware of it and starting to take action to
address it. But there's one example of unconscious bias, or potential unconscious bias, that I've been thinking about a
lot lately, Jessica, and that is video interviewing and how the rise of video interviewing could be impacting people with
disabilities. So many companies today are starting to require their job applicants to record a video of themselves as
part of an interview process, and they have many perfectly good reasons to do this. It can be very helpful for them to
understand who the candidates are that they're looking at. But the concern I have is that there may be
unintentional, unconscious bias against job candidates with disabilities who may not be able to use the video
interview on a level playing field with candidates without disabilities. Perhaps they would require additional time to
answer questions or require some other reasonable accommodation with the video interview. If that possibility is not
being considered, then there could be an unconscious bias being created.
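Steve's job-description example lends itself to a simple sketch. This is an illustrative flagger, not any vendor's tool; the word list is hypothetical, loosely echoing the research on gendered wording he mentions, and a real tool would use a vetted lexicon.

```python
# A toy scanner for gender-coded language in a job description.
# The word list is illustrative only; a real tool would use a
# research-backed lexicon.
MASCULINE_CODED = {"ninja", "go-getter", "self-starter", "rockstar",
                   "competitive", "dominant"}

def flag_gendered_terms(job_description: str) -> list:
    """Return the coded terms found in the description, sorted."""
    words = job_description.lower().replace(",", " ").split()
    return sorted(w for w in set(words) if w in MASCULINE_CODED)

print(flag_gendered_terms(
    "We want a marketing ninja, a self-starter who powers through obstacles"
))  # ['ninja', 'self-starter']
```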
Jessica Miller-Merrell: [00:07:03.1] Thank you for all this insightful discussion on this topic. I feel like oftentimes,
especially when we're looking at our HR tech, we don't hear all these things. It's just a few buzzwords and then it's
straight to the demo. So I appreciate all the insights here and the references to things like artificial
intelligence and video interviewing. One of the promises for many when it comes to artificial intelligence is that it is
designed to reduce or eliminate bias. But when you're talking about video interviewing specifically, that might
not necessarily be the case. Before we dive into that topic, I did want to have you share some
of the primary benefits of using artificial intelligence in hiring and employment.
Steve Feyer: ​[00:07:57.91] ​Absolutely. So artificial intelligence offers a tremendous possibility for efficiency and for
better decision making. Without getting into a long pitch for my company's products, suffice it to say that the AI we
offer, and that other companies are offering in many fields, is enabling companies to find the right information
right away in order to make a decision. So there could be a huge benefit there. But now there's still the issue of the
potential for bias, because if you see an artificial intelligence system that is analyzing information in order to give you
a recommendation, you want to be confident there is not some sort of unconscious bias behind it. So I really like to
think about something that our founder and CEO, Ashutosh Garg, who is a world expert on artificial intelligence, has
told me about this. And I think it's a really interesting way to think about how the issue of bias and AI intersect. And
the first point is, we're all human and we all have unconscious biases. I'm not suggesting that these particular biases
are related to issues such as gender or disability. It's just that our biases exist because we're wired to survive as
people. And that means that our brains process information before we can make a conscious decision. So that
means we're going to have a reaction as a human that could potentially trigger a bias. And we cannot forget the
information that we see.
Steve Feyer: ​[00:09:44.41] ​So, for example, if we meet someone and we observe that they're younger or older than
we are, we're going to notice that. Or if we meet someone and we are going to immediately make a judgment about
what gender we think that person is, we're going to make that judgment unconsciously and immediately. And we can't
forget that. Once we make that judgment call, we can't forget it. We can't turn off our ability to notice it either. So as
humans, we're primed for unconscious bias all the time. But the opposite is true of a computer. And that's where AI
can actually offer the opportunity to help us with our unconscious biases. Because, with a computer, we decide
exactly what information the computer has. And, what's more, we can tell the computer to forget certain
information. And once that computer forgets that information, it's impossible for it to remember. So we as humans
gather information whether we want to or not. And then we can't forget. Computers only know the information we give
them. And if we tell the computer to forget, it will. So with that combination of human and computer capabilities, I think
we could actually then have a meaningful impact on preventing bias, leading to our ability to offer fair work
opportunities and create inclusion in our organizations.
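Steve's "telling the computer to forget" step can be illustrated with a minimal sketch. The field names and the protected list here are hypothetical, and this is not Eightfold's implementation; the point is simply that an attribute removed before modeling cannot influence a score.

```python
# Protected attributes to "forget" before any model sees the data.
# Field names are hypothetical examples.
PROTECTED = {"name", "gender", "age"}

def forget(candidate: dict) -> dict:
    """Return a copy of a candidate record with protected fields removed."""
    return {k: v for k, v in candidate.items() if k not in PROTECTED}

candidate = {
    "name": "A. Smith",
    "gender": "F",
    "age": 52,
    "skills": ["python", "sql"],
    "years_experience": 12,
}

# Once dropped, the model literally has no way to recover these fields,
# unless another field acts as a proxy -- the harder problem Steve
# describes next.
print(sorted(forget(candidate)))  # ['skills', 'years_experience']
```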
Jessica Miller-Merrell: ​[00:11:07.02] ​So you're saying that the computer can forget bias? I feel like it's “Men in
Black.” Right. Like we click the little pen and then suddenly the light flashes and there's no more bias. But walk me
through that a little bit more. And then what happens if we don't flick the pen and the light and that algorithm still has
bias in it. Walk us through that.
Steve Feyer: ​[00:11:30.21] ​Well, I love the Men in Black example, Jessica, because it's funny, because humans
precisely don't work that way. You just can't tell someone to forget something and they'll do it. But yes, let me jump
into how a computer can actually be told to forget something and actually prevent bias. So, to start with, artificial
intelligence is data analysis at scale. An artificial intelligence algorithm gathers lots and lots of information and then
uses that information to set a target and provide recommendations against that target.
So in the case of talent, the case of what we do, AI is taking in millions and millions of resumes and job applications
and other information about job candidates and job applicants. This AI is then taking all of that information, running
billions of calculations and trying to figure out which factors are going to predict success in a job. So as it turns out,
our AI in doing this has found that there are more than five hundred factors it can identify that can predict success.
By predict success, I mean: is this someone who will take a job, whatever that job may be, and then be successful in
that job? Will they hold that job for a long period of time? Take another job like it? Get promoted? Factors like that
would show this person was successful.
Steve Feyer: ​[00:13:09.41] ​So of all of the factors that we can look at, if you think about the information that would
appear on a resumé or that can be inferred from a résumé, you want to be sure that those factors are the things that
matter. What's this person's educational background? What kind of skills do they have, what prior jobs did they hold
and so forth. And you want to be sure that the factors that don't matter, such as, is this person over the age of 50? Is
this person a member of a minority group? These are things that do not matter. And you want to be sure that the AI
does not consider those factors. So we're able to tell the computer, first of all, if you see gender or if you see
something that could infer gender like the person’s name, simply forget about it. That's the easy part. So, done. The
computer has forgotten about that information. But now there's the next step. Is it possible that you could
unintentionally predict that person's personal characteristics, the things that don't matter for a talent decision, based
on other information in the resume? For example, there are colleges that are female only. If you see that college, you
know, or you will believe, that the candidate is a female candidate. So does that information create
a bias, even though the college itself is potentially relevant to whether this is someone you may want to hire? So we're
able to address potential bias in that as well in the algorithm.
Steve Feyer: ​[00:14:43.27] ​And we do this with a methodology called Equal Opportunity Algorithms, which is a series
of statistical tests that we run against the AI. With this method, you take an input that you're concerned
about. In this case, let's consider the input of gender. And this method will look at all the factors you're
considering and tell you if any of them are actually creating a bias against gender. So it may look and say, you know
what, this factor of the college that someone went to, it turns out that there is still some gender bias in the output
based on that factor. So when you detect that, you can modify the data that the computer is allowed to use, tell it to
forget something, run the test again and confirm that the bias is no longer present. So we will conduct this
process. And before we begin to use any algorithm for real, before we deploy it, we will ensure that there is no
bias based on the three dimensions of gender, age and ethnicity. And then we can also offer statistical proof that our
AI results are not biased across these factors.
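Eightfold's Equal Opportunity Algorithms method is proprietary, but the test-modify-retest loop Steve describes can be sketched generically. One simple stand-in for whatever statistical test a vendor actually uses is a permutation test on model scores, asking: do the two groups score differently?

```python
import random

def mean(xs):
    return sum(xs) / len(xs)

def gap_p_value(scores_a, scores_b, trials=5000, seed=0):
    """Permutation test: how often does randomly shuffling the group
    labels produce a mean-score gap at least as large as the one
    observed? A small p-value suggests the model scores the two
    groups differently."""
    rng = random.Random(seed)
    observed = abs(mean(scores_a) - mean(scores_b))
    pooled = list(scores_a) + list(scores_b)
    n = len(scores_a)
    hits = 0
    for _ in range(trials):
        rng.shuffle(pooled)
        if abs(mean(pooled[:n]) - mean(pooled[n:])) >= observed:
            hits += 1
    return hits / trials

# Identical score distributions: the observed gap is 0, so every
# shuffle matches it and the p-value is 1.0 (no evidence of bias).
print(gap_p_value([0.6, 0.7, 0.8, 0.9], [0.6, 0.7, 0.8, 0.9]))  # 1.0
```

In the loop Steve describes, a significant gap for a protected group would trigger telling the model to forget the suspect factor (the college field, in his example), then re-running the same test to confirm the gap is gone.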
Jessica Miller-Merrell: [00:15:57.81] I love this. This sounds fantastic, right? Because I'm thinking about the case
study of Amazon and how what they created to review resumes wasn't intentional. But the
algorithm was giving preference to men, because that's who most of the resumes were coming from during that period
of time. And it was also looking at other things, like you said, women's universities or colleges, women's studies
activities, or extracurricular events that weren't necessarily mentioned consistently, and saying, hey,
these candidates don't fit what we're looking for.
Break: [00:16:45.72] Let's take a reset. This is Jessica Miller-Merrell and you are listening to the Workology podcast.
Today, we're talking with Steve Feyer about eliminating algorithmic bias in hiring and employment. This podcast is
sponsored by Workology and is part of our Future of Work series in partnership with the Partnership on Employment
and Accessible Technology or PEAT.
Break: ​[00:17:09.11] ​The Workology podcast Future of Work series is supported by PEAT, the Partnership on
Employment and Accessible Technology. PEAT's initiative is to foster collaboration and action around accessible
technology in the workplace. PEAT is funded by the U.S. Department of Labor's Office of Disability Employment
Policy, ODEP. Learn more about PEAT at Peatworks.org. That's Peatworks.org.
Jessica Miller-Merrell: [00:17:38.14] I love that the equal opportunity algorithm has created a process to
help remove, on an ongoing basis, bias that is found within the technology or within the parameters of what your HR
tech is doing. I wondered, as HR leaders are looking at other types of artificial intelligence, not
just yours, which I hope they take a look at and demo to learn more: what are
some questions that HR leaders can ask the HR technology companies they're considering, to determine if their tech
has a process designed to address and eliminate bias in artificial intelligence?
Steve Feyer: ​[00:18:26.72] ​Yeah, yeah, absolutely. So I think that there are a couple of questions that HR leaders
should ask when they're considering AI technologies. First of all, I think it's important to ask specifically about how
they address bias. Because AI technologies can be free of bias. So ask, how is this AI being used to prevent bias?
Steve Feyer: ​[00:18:54.83] ​How is it reaching a specific prediction without bias? And we use a methodology called
equal opportunity algorithms. There are other approaches that can be used. Just be sure that your AI vendor has
considered this issue and can offer proof that the results are unbiased against the appropriate factors. I would note
that the proof that we can offer is statistical. So if I show it to you, it's going to be a table with numbers. Perhaps it
doesn't mean all that much on its own. So we will typically work with a data scientist at the organizations we're working
with to help them understand what they're looking at. I think there's another, more general question that's important as well, which is to ask
the provider of AI if their AI can offer any explanation about how it's reached a given result. This is something that we
call explainable AI and it should be possible for the results that are offered to give a reason why this result was given.
Another advantage of asking this question is it will reveal just exactly how deep this company is in terms of its AI
sophistication.
Jessica Miller-Merrell: ​[00:20:12.17] ​Steve, I'm hearing you talk about the algorithm and how it is eliminating bias.
But I didn't hear you mention people with disabilities. So what are some ways that Eightfold AI is working to and
focusing on eliminating bias when it comes to hiring people with disabilities?
Steve Feyer: [00:20:31.99] Absolutely, Jessica. So let me address that. I did mention that our bias prevention in the
algorithm does not actually address people with disabilities. And the reason for that is that the algorithm actually
doesn't know anything about someone's disability status. You can, with the resumé, figure out
someone's age. Perhaps you can infer their ethnicity or gender. But if that person is a person with disabilities, that's
not something that our algorithm is actually able to detect. So to prevent potential bias against people with disabilities,
we've added another layer of technology to conduct that bias prevention. This is an analytics process that looks at
every stage in the hiring process and calculates whether there is any difference in the outcomes based on personal
characteristics. This can then be done for all kinds of personal characteristics, including those that may not be
present on a resume, but that someone may declare in their application. So let me give you an example,
hypothetically. Suppose that you have a company and you hire 10 percent of your job applicants. 500 people apply.
On average, you're going to hire about 50 of them. So that would mean that across every different kind of personal
characteristic, you're going to see about 10 percent or at least you should. You should hire about 10 percent of male
candidates and 10 percent of female candidates. And that also means you should hire about 10 percent of the people
with disabilities who apply for jobs and about 10 percent of people without disabilities who apply for a job. But now
suppose that the numbers show you something different. Suppose you see in the numbers that only 9 percent of the
people with disabilities who apply for a job ended up getting hired.
Steve Feyer: ​[00:22:35.87] ​Well, 9 percent and 10 percent are pretty close to each other. Maybe that's random. The
numbers do fluctuate up and down. After all, it will never be exactly the same in every month or every week or every
year. But maybe that number is different. Maybe it's 8 percent instead of 10 percent. Or maybe only 7 percent or 5
percent. At a certain point, that difference becomes statistically significant. At a certain point, you can say I can prove
that that number is different enough from the average that it probably represents a truly different outcome. I can't tell
you exactly why that outcome is different, but I can tell you it is different. So there is some cause for concern and we
do need to take corrective action somewhere because maybe there is a bias. So we run these analytics at the most
granular level possible. We run them for every single stage of the hiring process and we run them daily and we will
notify the people who need to know about that. Whoever the organization has identified needs to know, we can alert
them, so that as soon as something may be wrong, they can go take corrective action. Suppose people with
disabilities were dropping out of the hiring process at one particular stage, possibly just for a
specific department in the company, or in a specific region of the country, or a specific country in the world. We can
flag that. The company can then go identify what is causing that to happen, can take corrective action and see if the
results are different after they've taken that action.
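Steve's question of when a 9-percent-versus-10-percent gap becomes statistically significant is a standard two-proportion z-test. A minimal sketch, with made-up applicant counts (not from any real hiring data):

```python
import math

def two_proportion_z(hired_a, total_a, hired_b, total_b):
    """Two-proportion z-test: is group A's hire rate significantly
    different from group B's?"""
    p_a = hired_a / total_a
    p_b = hired_b / total_b
    p_pool = (hired_a + hired_b) / (total_a + total_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / total_a + 1 / total_b))
    return (p_a - p_b) / se

# Steve's hypothetical, with illustrative sample sizes: a 10 percent
# hire rate for applicants without disabilities versus 7 percent for
# applicants with disabilities.
z = two_proportion_z(hired_a=450, total_a=4500,  # 10 percent
                     hired_b=35, total_b=500)    # 7 percent

# |z| > 1.96 is significant at the 5 percent level, so this 3-point
# gap on these sample sizes would warrant investigation.
print(round(z, 2))  # 2.15
```

Whether a 9, 8, or 7 percent rate crosses the significance threshold depends on the sample sizes, which is why running the test at the most granular level, per stage, department, or region, needs enough applicants in each slice to be meaningful.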
Jessica Miller-Merrell: [00:24:15.25] When you say corrective action, are we thinking of something like an
investigation, where we do a deeper dive into the actual process and the candidates that are being analyzed? It's
more of an investigative process: we're alerted to that potential anomaly or concern, and then it forces us to make a
decision to say, hey, let's dive into this a little deeper and see if we can uncover what is potentially causing this change.
Steve Feyer: ​[00:24:51.66] ​That's exactly right. There could be many reasons. This could be very much an example
of a bias taking place, conscious or unconscious. You may identify that a specific department is not hiring people with
a certain background. So perhaps that department leader is doing the wrong thing. Perhaps their job descriptions are
written in a way that is creating a bias, and a certain group of individuals is not applying for those jobs. Or
perhaps in your series of technologies and tools and processes, perhaps there is something in that series of tools that
is unintentionally discouraging candidates. Perhaps you have required something which is challenging for people with
disabilities, and if you are able to identify when that's happening, you will be able to then correct that disparate
impact.
Jessica Miller-Merrell: ​[00:25:54.76] ​I think this is something that HR people can relate to in that this is really part of
our jobs. Somebody comes to our office and says, hey, this person is being treated differently or this is occurring or
we're notified by the EEOC with a charge or things like that, that result in further investigation and research needed to
be done in a particular area.
Steve Feyer: ​[00:26:17.77] ​Exactly. Exactly. We'll give you the information so that you can then go and do what you
want to do. Go and make that an inclusive playing field for everybody.
Jessica Miller-Merrell: ​[00:26:29.38] ​Steve, as we look into the next 30 years of work, what emerging trends or
technologies do you think will have the biggest impact on people with disabilities? This is especially in light of the 30th
anniversary for the Americans with Disabilities Act, which is happening later this year in July.
Steve Feyer: ​[00:26:49.39] ​Well, I think there are two very important trends. The first, naturally, is artificial
intelligence. And one, I think, really powerful thing about what we can do with AI is that, by taking bias out of the equation
as much as possible, we are enabling the decision makers to focus on what people are actually able to do and to
think about job candidacy, to think about people's internal transitions and internal mobility within companies, to think
about their careers in terms of their ability to do the job and really focus on their potential rather than any personal
characteristics that are irrelevant. I think this is so important because people with disabilities have that right to work.
And frankly, I think organizations are obligated to do everything they can to realize that right that everybody has. And
I think AI really can make a difference. I'm an optimist. I hope it will in time even change our mindset so that if we
each as individuals receive a negative outcome, "I'm sorry, you weren't chosen for the job," our reaction will not be, oh,
they were unfair, but rather, well, it wasn't right for me, but the next one will be. So we can maybe
even have a more optimistic reaction as individuals. But I think another important trend is the rise of accessibility
technology. I recall working with a colleague several years back who was hearing impaired, and he explained to
me that he had found it very helpful to have a voice-to-text technology that he could use so that he could make sure
he was able to work seamlessly with his colleagues. And we've seen this technology improve so dramatically. Voice
assistants, screen readers and all these other kinds of accessibility technologies now live in our pockets. And they
work really well. So I'm hopeful that these types of technologies will make it possible for more and more people with
disabilities to participate fully in every kind of activity and every community in the workplace.
Jessica Miller-Merrell: ​[00:29:12.62] ​I agree. And that's one of the reasons why I've partnered with PEAT and we've
focused so much on inclusion and accessibility when it comes to hiring and employment for people with disabilities. I
wanted to ask you one more question: what advice would you give HR leaders who maybe
want to offer a more inclusive recruiting experience or have a more inclusive recruiting strategy that's focused on
hiring all types of qualified individuals, including people with disabilities. Do you have any insights to share?
Steve Feyer: [00:29:49.67] I think just one, Jessica. I want to make a suggestion that isn't really about technology at
all. Technology is important, don't get me wrong. It can make a great difference, but I think
something that's so important to inclusion in hiring is to create a very standardized process and be very consistent
and really stick to that process as an organization. When you have determined that you're going to hire for a role, make
sure that you have very clear criteria for how you're going to decide who to hire, and a very clear process for a
candidate going through that hiring process. And then no matter what else happens, no matter who may show up late in the
hiring process or who gets referred by someone else in the business, no matter who has a cousin that they would like
to bring in. Stick to the process.
Steve Feyer: ​[00:30:43.93] ​Stick to the plan. Because by doing this, you're going to be able to include those
processes that remove bias in your process and really have it be fair for everybody. I think that's probably the single
most important thing that organizations could do from a process perspective.
Jessica Miller-Merrell: ​[00:31:05.24] ​Agreed. Consistency is everything. Steve, thank you so much for taking the
time to join us today. Where can people go to learn more about you and also Eightfold AI?
Steve Feyer: ​[00:31:18.04] ​Jessica, it's been my pleasure. Thank you so much for inviting me on to the Workology
podcast. So if you want to learn more about what we're doing, please visit our website at Eightfold.AI. The
word “eight”, the word “fold” dot AI. Or if you use your favorite search engine, search for Eightfold. Should bring us
up right away.
Jessica Miller-Merrell: [00:31:39.25] Awesome. We'll include a link to Eightfold AI as well as a link to Steve's
LinkedIn profile, if you want to connect with him and talk about algorithms and all the cool things that are
happening in the HR tech space and what Eightfold is doing. So thank you so much for taking the time to chat with us
today.
Closing: ​[00:31:57.43] ​Are you tired of putting your professional development on the backburner? It's time for you to
invest in yourself with UpskillHR by Workology. We're a membership community focused on personal development
for HR. Gain access to our elite community training, coaching and events. Learn more at UpskillHR.com.
Closing: [00:32:22.6] Eightfold AI's transparent approach is what we need from more AI tech companies, especially in
HR and recruiting. Because of employment laws, HR needs to be working hand-in-hand with our HR technology
companies, especially those that use AI. As HR leaders, I believe it's our responsibility to be educated and aware
about the potential benefits and potential pitfalls that exist when using artificial intelligence. This starts with educating
yourself on the fundamentals of artificial intelligence. I'm linking to a number of podcast interviews that we've done in
the past on AI, as well as additional resources to help you start diving into this fast moving technology. The Future of
Work series is in partnership with PEAT. And it's one of my favorites. Thank you to PEAT as well as our podcast
sponsor Workology.

Episode 224: Eliminating Algorithmic Bias in Hiring and Employment

  • 1.   Episode 224: ​Eliminating Algorithmic Bias  in Hiring and Employment     Episode Link: ​http://workology/ep​2​24-wp Intro: ​[00:00:01.02] ​Welcome to the Workology Podcast, a podcast for the disruptive workplace leader. Join host Jessica Miller-Merrell, founder of Workology.com, as she sits down and gets to the bottom of trends, tools and case studies for the business leader, HR and recruiting professional who is tired of the status quo. Now here's Jessica with this episode of Workology. Jessica Miller-Merrell: ​[00:00:26.99] ​This Workology podcast is sponsored by Workology. The business case for artificial intelligence and HR in your workplace is growing by leaps and bounds, every single day. Employers and HR leaders, though, have real concerns about bias when hiring using AI, which is commonly referred to as algorithmic bias. Say that three times fast. Algorithmic bias. In today's Workology podcast, we're going straight to the source and talking to one AI company that works in HR. And we're talking to them about their approach on reducing algorithmic bias and the steps that employers can take. This episode is part of the Workology podcast and it's part of our Future of Work series, which is powered by PEAT. They're the Partnership on Employment and Accessible Technology. In honor of the upcoming 30th anniversary of the Americans with Disabilities Act this July, we're investigating what the next 30 years will look like for people with disabilities at work and the potential of emerging technologies to make workplaces more inclusive and accessible. Today, I'm joined by Steve Feyer. He's the Director of Product Marketing with HR technology company, Eightfold AI. Steve, welcome to the Workology podcast. Steve Feyer: ​[00:01:41.69] ​Thank you so much for having me, Jessica. It's a pleasure to be here. Jessica Miller-Merrell: ​[00:01:44.57] ​Talk a little bit about your background. 
Walk us through how you became the director of product marketing for Eightfold AI. Steve Feyer: ​[00:01:50.99] ​Sure. ​I’d be happy to. Well, we are a Silicon Valley based company providing an artificial intelligence solution for talent. I found my way here by working with several of the current executives and founders of the company and, in general, I've spent my whole career working in startup businesses, focused on cutting edge technology and in particular on business solutions that, at least I feel, are going to have a positive impact on the world. And so one of the reasons why Eightfold is very exciting for me is that we are helping to create better inclusion in the world of employment. Jessica Miller-Merrell: ​[00:02:31.73] ​Awesome. Well, I'm excited to have your expertise and background because as you might imagine, there is a lot of questions and just need for information or resources on the subject of artificial intelligence for human resources and recruitment. Can you walk us through maybe the difference between some different types of biases? These are common when I'm thinking and I’m talking with HR people. But the difference between conscious and unconscious bias when it comes to hiring and employment. Steve Feyer: ​[00:03:07.58] ​That is such an important question. So if I may, I'd like to just talk very quickly about what bias is in the first place. And a social scientist may have a much more nuanced definition, but in its most basic form, bias is making any decision based on something that should be an irrelevant factor. So if we're thinking about hiring decisions that a hiring manager would make, a bias would be taking place if that manager is making their hiring decision based on a factor that doesn't matter for the job. The manager should be looking for the person who has the Workology Podcast​ ​| www.workologypodcast.com | @workology
  • 2. right skills, the right experience, the right capability to do the job. That's what matters. But if they're making a decision based on that person's gender, that person's age, if that person is a person with a disability, those are examples of bias. So a conscious bias would be any situation in which that decision was purposeful. The hiring manager purposefully thinks, I don't want to hire female candidates, or I don't want to hire older candidates. I think if we encountered that type of conscious bias in our lives, we probably noticed it right away because it may almost be shocking to see, but unconscious bias can occur as well. And that would be any situation in which the decision maker is making a decision based on that irrelevant factor. But they're not aware of it. That's not their intention. So in the world of hiring, I think a really good example of unconscious bias that I've seen a lot of awareness of in recent years is the use of language in job descriptions. So companies will write job descriptions for the jobs that they're hiring and often the words that they use will stop describing the skills and experience they're looking for and actually start to describe the personality of who they're looking for. Steve Feyer: ​[00:05:04.63] ​So these are if you see words like, we're looking for a go-getter or we're looking for a self-starter who powers through obstacles. Things like that. What I see a lot of in Silicon Valley is that companies will say, we're looking for a ninja. I want a marketing ninja or an engineering ninja for this job. But the problem is, this kind of language actually turns out to be, I think, unintentionally discriminatory, in particular discriminatory against female candidates, because studies have shown that certain words and phrases discourage female candidates from applying, but do not have that impact on male candidates. So this particular example would be one kind of unconscious bias. 
As I said, I think fortunately companies are becoming very aware of it. And starting to take action to impact it. But there's one example of unconscious bias or potential unconscious bias that I've been thinking about a lot lately, Jessica, and that is video interviewing and how the rise of video interviewing could be impacting people with disabilities. So many companies today are starting to require their job applicants to record a video of themselves as part of an interview process, and they have many perfectly good reasons to do this. It can be very helpful for them to understand who the candidates are, who they're looking at. But the concern I have about this is if there may be unintentional, unconscious bias against job candidates with disabilities who may not be able to use the video interview on a level playing field with candidates without disabilities. Perhaps they would require additional time to answer questions or require some other reasonable accommodation with the video interview. If that possibility is not being considered, then there could be an unconscious bias being created. Jessica Miller-Merrell: ​[00:07:03.1] ​Thank you for all this insightful discussion on this topic. I feel like oftentimes, especially when we're looking at our HR tech, we don't hear all these things. It's just a few buzzwords and you know, and then it's straight to the demo. So I appreciate all the insights here and the reference towards things like artificial intelligence and video interviewing. One of the promises for many when it comes to artificial intelligence is that it is designed to reduce or eliminate bias. But you're, when you're talking about video interviewing specifically, that might not be necessarily the case. Before we dive into that kind of topic, I did want to talk about and have you share some of the primary benefits of using artificial intelligence in hiring and employment. Steve Feyer: ​[00:07:57.91] ​Absolutely. 
So artificial intelligence offers a tremendous possibility for efficiency and for better decision making. Without getting into a long pitch for my company's products, suffice it to say that AI that we can offer, and that other companies are offering in many fields, are enabling companies to find the right information right away in order to make a decision. So there could be a huge benefit there. But now there's still the issue of the potential for bias, because if you see an artificial intelligence system that is analyzing information in order to give you a recommendation, you want to be confident there is not some sort of unconscious bias behind it. So I really like to think about something that our founder and CEO, Ashutosh Garg, who is a world expert on artificial intelligence, has told me about this. And I think it's a really interesting way to think about how the issue of bias and AI intersect. And the first point is, we're all human and we all have unconscious biases. Not suggesting that these particular biases may be related to issues such as gender or disability. It's just that our biases exist because we're wired to survive as people. And that means that our brains process information before we can make a conscious decision. So that means we're going to have a reaction as a human that could potentially trigger a bias. And we cannot forget the information that we see. Workology Podcast​ ​| www.workologypodcast.com | @workology
  • 3. Steve Feyer: ​[00:09:44.41] ​So, for example, if we meet someone and we observe that they're younger or older than we are, we're going to notice that. Or if we meet someone and we are going to immediately make a judgment about what gender we think that person is, we're going to make that judgment unconsciously and immediately. And we can't forget that. Once we make that judgment call, we can't forget it. We can't turn off our ability to notice it either. So as humans, we're primed for unconscious bias all the time. But the opposite is true of a computer. And that's where AI can actually offer the opportunity to help us with our unconscious biases. Because, for a computer, we can tell the computer all the information that that computer has. And, what's more, we can tell the computer to forget certain information. And once that computer forgets that information, it's impossible for it to remember. So we as humans gather information whether we want to or not. And then we can't forget. Computers only know the information we give them. And if we tell the computer to forget, it will. So with that combination of human and computer capabilities, I think we could actually then have a meaningful impact on preventing bias, leading to our ability to offer fair work opportunities and create inclusion in our organizations. Jessica Miller-Merrell: ​[00:11:07.02] ​So you're saying that the computer can forget bias? I feel like it's “Men in Black.” Right. Like we click the little pen and then suddenly the light flashes and there's no more bias. But walk me through that a little bit more. And then what happens if we don't flick the pen and the light and that algorithm still has bias in it. Walk us through that. Steve Feyer: ​[00:11:30.21] ​Well, I love the Men in Black example, Jessica, because it's funny, because humans precisely don't work that way. You just can't tell someone to forget something and they'll do it. 
But yes, let me jump into how a computer can actually be told to forget something and actually prevent bias. So, to start with, a computer working in AI. Artificial intelligence is data analysis that's scaled. An artificial intelligence algorithm is gathering lots of lots of information and then using that information to set a target and provide recommendations against that target. So in the case of talent, the case of what we do, AI is taking in millions and millions of resumes and job applications and other information about job candidates and job applicants. This AI is then taking all of that information, running billions of calculations and trying to figure out which factors are going to predict success in a job. So as it turns out, our AI in doing this has found that there are more than five hundred factors it can identify that can predict success. Specifically by predict success, I mean, is this someone who is going to be someone who will take a job, whatever that job may be, and then be successful in that job? Will they hold that job for a long period of time? Take another job like it? Get promoted? Factors like that, that would show this person was successful. Steve Feyer: ​[00:13:09.41] ​So of all of the factors that we can look at, if you think about the information that would appear on a resumé or that can be inferred from a résumé, you want to be sure that those factors are the things that matter. What's this person's educational background? What kind of skills do they have, what prior jobs did they hold and so forth. And you want to be sure that the factors that don't matter, such as, is this person over the age of 50? Is this person a member of a minority group? These are things that do not matter. And you want to be sure that the AI does not consider those factors. So we're able to tell the computer, first of all, if you see gender or if you see something that could infer gender like the person’s name, simply forget about it. That's the easy part. 
So, done. The computer has forgotten about that information. But now there's the next step. Is it possible that you could. unintentionally predict that person's personal characteristics, the things that don't matter for a talent decision, based on other information in the resume. For example, there are colleges that are female only. If you see that college, you know that the candidate, or you will believe that the candidate, is a female candidate. So does that information create a bias, even though that now is potentially relevant to whether this is someone who you may want to hire? So we're able to address potential bias in that as well in the algorithm. Steve Feyer: ​[00:14:43.27] ​And we do this with a methodology called Equal Opportunity Algorithms, which is a series of statistical tests that we run against the AI. What this method does is it, you take an input that you're concerned about. In this case, let's let's consider the input of gender. And this method will look at all the factors you're considering and tell you if any of them are actually creating a bias against gender. So it may look and say, you know Workology Podcast​ ​| www.workologypodcast.com | @workology
  • 4. what, this factor of the college that someone went to, it turns out that there is still some gender bias in the output based on that factor. So when you detect that, you can modify the data that the computer is allowed to use, tell it to forget something, run the test again and confirm that now the bias is no longer present. So we will conduct this process. And before we will begin to use any algorithm for real, before we will deploy it, we will ensure that there is no bias based on the three dimensions of gender, age and ethnicity. And then we can also offer statistical proof that our AI results are not providing bias based across these factors. Jessica Miller-Merrell: ​[00:15:57.81] ​I love this. This sounds fantastic, right? Because I am thinking about the case study of Amazon and how it wasn't intentional, what they were, what they created to review resumes.. But the algorithm was giving preference to men because that's most of the resumes that were coming in during that period of time. But it also was looking at other things, like you said, women's universities or colleges or women's studies activities or extra, additional curricular events that weren't necessarily mentioned consistently. But it was saying, hey, these candidates don't fit what we're looking for. Break: ​[00:16:45.72] ​Let's take a reset. This is Jessica Miller Merrell. And you are listening to the Workology podcast. Today, we're talking with Steve Feyer about eliminating algorithmic bias in hiring and employment. This podcast is sponsored by Workology and is part of our Future of Work series in partnership with the Partnership on Employment and Accessible Technology or PEAT. Break: ​[00:17:09.11] ​The Workology podcast Future of Work series is supported by PEAT, the Partnership on Employment and Accessible Technology. PEAT'S initiative is to foster collaboration and action around accessible technology in the workplace. PEAT is funded by the U.S. 
Department of Labor's Office of Disability Employment Policy, ODEP. Learn more about PEAT at Peatworks.org. That's Peatworks.org. Jessica Miller-Merrell: ​[00:17:38.14] ​I love that the equal opportunity algorithm allows and has created a process to help remove, on an ongoing basis, bias that is found within the technology or within the parameters of what the HR tech, your guys's tech is doing. I wondered, as other HR leaders are looking at other types of artificial intelligence, not just your guys's, which I hope they take a look at and do a demo and get with you guys to learn more. But what are some questions that HR leaders can ask their HR technology companies they're considering to determine if their tech has a process designed to address and eliminate bias in artificial intelligence. Steve Feyer: ​[00:18:26.72] ​Yeah, yeah, absolutely. So I think that there are a couple of questions that HR leaders should ask when they're considering AI technologies. First of all, I think it's important to ask specifically about how they address bias. Because AI technologies can be free of bias. So ask, how is this AI being used to prevent bias? Steve Feyer: ​[00:18:54.83] ​How is it reaching a specific prediction without bias? And we use a methodology called equal opportunity algorithms. There are other approaches that can be used. Just be sure that your AI vendor has considered this issue and can offer proof that the results are unbiased against the appropriate factors. I would note that the proof that we can offer is statistical. So if I show it to you, it's going to be a table with numbers. Perhaps it doesn't mean all that much. So we will typically work with a data scientist at the organizations we're working with to understand what they're looking at. I think there's another important question as well. More generally, which is to ask the provider of AI if their AI, can offer any explanation about how it's reached a given result. 
This is something that we call explainable AI and it should be possible for the results that are offered to give a reason why this result was given. Another advantage of asking this question is it will reveal just exactly how deep this company is in terms of its AI sophistication. Jessica Miller-Merrell: ​[00:20:12.17] ​Steve, I'm hearing you talk about the algorithm and how it is eliminating bias. But I didn't hear you mention people with disabilities. So what are some ways that Eightfold AI is working to and focusing on eliminating bias when it comes to hiring people with disabilities? Workology Podcast​ ​| www.workologypodcast.com | @workology
Steve Feyer: [00:20:31.99] Absolutely, Jessica. Let me address that. Our bias prevention in the algorithm does not actually address people with disabilities, and the reason is that the information available to the algorithm doesn't include anything about someone's disability status. From a resumé, you can figure out someone's age, and perhaps infer their ethnicity or gender. But whether a person has a disability is not something our algorithm is able to detect. So to prevent potential bias against people with disabilities, we've added another layer of technology to conduct that bias prevention. This is an analytics process that looks at every stage in the hiring process and calculates whether there is any difference in the outcomes based on personal characteristics. This can be done for all kinds of personal characteristics, including those that may not be present on a resumé but that someone may declare in their application. Let me give you a hypothetical example. Suppose your company hires 10 percent of its job applicants. If 500 people apply, on average you're going to hire about 50 of them. That means that across every different kind of personal characteristic, you should see about 10 percent. You should hire about 10 percent of male candidates and 10 percent of female candidates, and likewise about 10 percent of the people with disabilities who apply for jobs and about 10 percent of the people without disabilities who apply. But now suppose the numbers show you something different. Suppose you see that only 9 percent of the people with disabilities who applied for a job ended up getting hired.

Steve Feyer: [00:22:35.87] Well, 9 percent and 10 percent are pretty close to each other. Maybe that's random. The numbers do fluctuate up and down.
After all, it will never be exactly the same every month or every week or every year. But maybe that number is different. Maybe it's 8 percent instead of 10 percent, or only 7 percent, or 5 percent. At a certain point, that difference becomes statistically significant. At a certain point, you can say: I can prove that this number is different enough from the average that it probably represents a truly different outcome. I can't tell you exactly why that outcome is different, but I can tell you that it is different. So there is some cause for concern, and we do need to take corrective action somewhere, because maybe there is a bias. We run these analytics at the most granular level possible. We run them for every single stage of the hiring process, we run them daily, and we will notify whoever the organization has identified needs to know, so that as soon as something may be wrong, they can go take corrective action. Suppose there is one stage of the hiring process where people with disabilities are dropping out, possibly just in a specific department of the company, or in a specific region of the country, or in a specific country in the world. We can flag that. The company can then go identify what is causing that to happen, take corrective action, and see whether the results are different after they've taken that action.

Jessica Miller-Merrell: [00:24:15.25] When you say corrective action, are we thinking of something like an investigation, where we take a deeper dive into the actual process and the candidates being analyzed? We're alerted to a potential anomaly or concern, and then it forces us to make a decision to say, hey, let's dive into this a little deeper and see if we can uncover what is potentially causing this change.

Steve Feyer: [00:24:51.66] That's exactly right. There could be many reasons.
This could very much be an example of bias taking place, conscious or unconscious. You may identify that a specific department is not hiring people with a certain background, so perhaps that department leader is doing the wrong thing. Perhaps their job descriptions are written in a way that creates a bias, and a certain group of individuals is not applying for those jobs. Or perhaps somewhere in your series of technologies, tools, and processes, something is unintentionally discouraging candidates. Perhaps you have required something that is challenging for people with disabilities, and if you are able to identify when that's happening, you will be able to correct that disparate impact.

Jessica Miller-Merrell: [00:25:54.76] I think this is something HR people can relate to, because it's really part of our jobs. Somebody comes to our office and says, hey, this person is being treated differently, or this is occurring, or we're notified by the EEOC of a charge, all things that require further investigation and research in a particular area.
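The disparity check Steve describes, comparing a group's hire rate at each stage against everyone else's and flagging statistically significant gaps, can be sketched in code. The function below is an illustrative sketch only, not Eightfold's actual implementation; the function name, the use of a two-proportion z-test, and the 0.05 significance threshold are all assumptions for illustration.

```python
# Illustrative sketch of an adverse-impact check: compare a group's
# hire rate against the rate for all other applicants using a
# two-sided two-proportion z-test, and flag significant differences.
from math import sqrt, erf

def hire_rate_flag(group_hired, group_applied,
                   others_hired, others_applied, alpha=0.05):
    """Return True if the group's hire rate differs significantly
    from the rate for other applicants at this stage."""
    p_group = group_hired / group_applied
    p_others = others_hired / others_applied
    # Pooled rate under the null hypothesis that the rates are equal.
    pooled = (group_hired + others_hired) / (group_applied + others_applied)
    se = sqrt(pooled * (1 - pooled) * (1 / group_applied + 1 / others_applied))
    if se == 0:
        return False  # no variation at all, nothing to test
    z = (p_group - p_others) / se
    # Two-sided p-value from the standard normal CDF.
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return p_value < alpha

# Steve's example: 9% vs roughly 10% on a small sample is not
# statistically significant, so it is not flagged.
print(hire_rate_flag(9, 100, 41, 400))       # False
# A large, persistent gap (5% vs ~10.6% at scale) is flagged.
print(hire_rate_flag(50, 1000, 950, 9000))   # True
```

This is the intuition behind "at a certain point, that difference becomes statistically significant": small gaps on small samples look like noise, while the same gap sustained over many applicants crosses the significance threshold and warrants investigation.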
Steve Feyer: [00:26:17.77] Exactly. Exactly. We'll give you the information so that you can then go and do what you want to do: go and make it an inclusive playing field for everybody.

Jessica Miller-Merrell: [00:26:29.38] Steve, as we look into the next 30 years of work, what emerging trends or technologies do you think will have the biggest impact on people with disabilities? Especially in light of the 30th anniversary of the Americans with Disabilities Act, which is happening later this year in July.

Steve Feyer: [00:26:49.39] Well, I think there are two very important trends. The first, naturally, is artificial intelligence. One really powerful thing we can do with AI, by taking bias out of the equation as much as possible, is enable decision makers to focus on what people are actually able to do: to think about job candidacy, about internal transitions and internal mobility within companies, and about careers in terms of the ability to do the job, and to really focus on potential rather than on personal characteristics that are irrelevant. I think this is so important because people with disabilities have the right to work, and frankly, I think organizations are obligated to do everything they can to realize that right that everybody has. And I think AI really can make a difference. I'm an optimist. I hope it will in time even change our mindset, so that when we as individuals receive a negative outcome ("I'm sorry, you weren't chosen for the job"), our reaction will not be "oh, they were unfair," but rather "it wasn't right for me, but the next one will be." So we can maybe even have a more optimistic reaction as individuals. But I think another important trend is the rise of accessibility technology.
I recall working with a colleague several years back who was hearing impaired, and he explained to me that he had found it very helpful to have a voice-to-text technology he could use to make sure he was able to work seamlessly with his colleagues. We've seen this technology improve so dramatically. Voice assistants, screen readers, and all these other kinds of accessibility technologies now live in our pocket, and they work really well. So I'm hopeful that these types of technologies will make it possible for more and more people with disabilities to participate fully in every kind of activity and every community in the workplace.

Jessica Miller-Merrell: [00:29:12.62] I agree. And that's one of the reasons why I've partnered with PEAT and we've focused so much on inclusion and accessibility when it comes to hiring and employment for people with disabilities. I wanted to ask you one more question about the advice you would give HR leaders who want to offer a more inclusive recruiting experience, or to build a more inclusive recruiting strategy focused on hiring all types of qualified individuals, including people with disabilities. Do you have any insights to share?

Steve Feyer: [00:29:49.67] Just one, Jessica. I want to make a suggestion that isn't really about technology at all. The technology is important, don't get me wrong; technology can make a great difference. But something that's so important to inclusion in hiring is to create a very standardized process, be very consistent, and really stick to that process as an organization. When you are determined to hire for a role, make sure that you have very clear criteria for how you're going to decide whom to hire, and a very clear process for a candidate going through that hiring process.
And then, no matter what else happens, no matter who may show up late in the hiring process, who gets referred by someone else in the business, or who has a cousin they would like to bring in: stick to the process.

Steve Feyer: [00:30:43.93] Stick to the plan. Because by doing this, you're going to be able to include those processes that remove bias and really have it be fair for everybody. I think that's probably the single most important thing organizations could do from a process perspective.

Jessica Miller-Merrell: [00:31:05.24] Agreed. Consistency is everything. Steve, thank you so much for taking the time to join us today. Where can people go to learn more about you and about Eightfold AI?

Steve Feyer: [00:31:18.04] Jessica, it's been my pleasure. Thank you so much for inviting me onto the Workology podcast. If you want to learn more about what we're doing, please visit our website at Eightfold.AI.
The word "eight," the word "fold," dot AI. Or use your favorite search engine and search for Eightfold; that should bring us up right away.

Jessica Miller-Merrell: [00:31:39.25] Awesome. We'll include a link to Eightfold AI as well as a link to Steve's LinkedIn profile, if you want to connect with him and talk about algorithms and all the cool things happening in the HR tech space and what Eightfold is doing. Thank you so much for taking the time to chat with us today.

Closing: [00:31:57.43] Are you tired of putting your professional development on the backburner? It's time for you to invest in yourself with UpskillHR by Workology. We're a membership community focused on personal development for HR. Gain access to our elite community, training, coaching, and events. Learn more at UpskillHR.com.

Closing: [00:32:22.6] Eightfold AI's transparent approach is what we need from more AI tech companies, especially in HR and recruiting. Because of employment laws, HR needs to be working hand-in-hand with our HR technology companies, especially those that use AI. As HR leaders, I believe it's our responsibility to be educated and aware of the potential benefits and potential pitfalls that exist when using artificial intelligence. This starts with educating yourself on the fundamentals of artificial intelligence. I'm linking to a number of podcast interviews we've done in the past on AI, as well as additional resources to help you start diving into this fast-moving technology. The Future of Work series is in partnership with PEAT, and it's one of my favorites. Thank you to PEAT, as well as our podcast sponsor, Workology.