Cognitive biases in developing biased Artificial Intelligence recruitment system

Date
2021-01-05
Authors
Soleimani, Melika
Intezari, Ali
Taskin, Nazim
Pauleen, David
Starting Page
5091
Abstract
Artificial Intelligence (AI) in a business context is designed to provide organizations with valuable insight for decision-making and planning. Although AI can help managers make decisions, it may pose unprecedented issues, such as biases built into datasets and implicit biases built into algorithms. To assist managers in making unbiased, effective decisions, AI itself needs to be unbiased. It is therefore important to identify biases that may arise in the design and use of AI. One area where AI is increasingly used is the Human Resources recruitment process. This article reports the preliminary findings of an empirical study answering the question: how do cognitive biases arise in AI? We propose a model of the role people play in developing AI recruitment systems. Identifying the sources of cognitive biases can provide insight into how to develop unbiased AI. The academic and practical implications of the study are discussed.
Keywords
Judgement, Big Data-Analytics and Decision-making, artificial intelligence, cognitive biases, decision-making
Extent
9 pages
Related To
Proceedings of the 54th Hawaii International Conference on System Sciences
Rights
Attribution-NonCommercial-NoDerivatives 4.0 International