The Basis for No Bias
Algorithmic bias is unavoidable once we recognize that humans create the initial code upon which AI tools are built. By identifying and assessing key considerations in how algorithms are created and the data sets they use, businesses can avoid negative unintended consequences and establish trust in the products and services they create. This course reviews the following topics:
- The socio-technical nature of algorithmic systems
- Structural causes of unintended or unjustified bias
- Illustrative examples of algorithmic bias
- Key questions to answer when assessing bias in algorithmic systems
What you will learn:
- Explain why algorithmic system bias is not a purely technical problem
- Name the three categories of failure that lead to algorithmic system bias
- Ask key questions when addressing algorithmic system bias
This course is part of the following course program:
Artificial Intelligence and Ethics in Design: Responsible Innovation
Who should attend: Data engineers, AI/ML engineers, design engineers, computer engineers, security engineers, and software engineers
Instructor
Ansgar Koene
Ansgar Koene is a Senior Research Fellow at the Horizon Digital Economy Research Institute, University of Nottingham, and chairs the IEEE P7003 Standard for Algorithm Bias Considerations working group. As part of his work at Horizon, he is the lead researcher in charge of policy impact; he leads the stakeholder engagement activities of the EPSRC (UK research council) funded UnBias project, which develops regulation, design, and education recommendations for minimizing unintended, unjustified, and inappropriate bias in algorithmic systems; and he frequently contributes evidence to UK parliamentary inquiries related to ICT and digital technologies. Ansgar has a multidisciplinary research background, having previously worked and published on topics ranging from bio-inspired robotics, AI, and computational neuroscience to experimental studies of human behavior and perception.
Publication Year: 2018
ISBN: 978-1-5386-0041-2