Risks of using naïve approaches to Artificial Intelligence: A case study

Prabhu, Sunitha and Hunter, Riley (2019) Risks of using naïve approaches to Artificial Intelligence: A case study. The 10th annual CITRENZ conference, Nelson, New Zealand, 9-11 October, 2019.

Full text not available from this repository.

Official URL: https://www.citrenz.ac.nz/conferences/2019/pdf/201...

Abstract or Summary

In recent years, there has been a push in the Artificial Intelligence (AI) field to simplify the application of machine learning so that it can be more widely adopted. While this reduces barriers to entry for AI and machine learning, it also introduces the risk that people or organisations with insufficient expertise will use these systems to make decisions with significant societal impact based on discriminatory factors. Implementers and decision-makers need a good understanding of the features that a system might use, and infer from, to make predictions, and of how these can affect their stakeholders. In this paper, we outline the risks of this phenomenon occurring in a specific case: the application of machine learning to secondary school student grades. We demonstrate that naïve approaches can have unanticipated consequences and can generate predictions based on discriminatory factors such as gender or race. Applying such flawed decisions in matters such as awards and scholarships could entrench detrimental bias in education systems.
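
The following is a minimal, hypothetical sketch (not the authors' pipeline) of how a naïve "train on everything available" workflow can let a model lean on sensitive attributes. All column names and values are invented for illustration; inspecting feature importances is one simple way an implementer might notice that a prediction is effectively driven by gender or ethnicity rather than academic performance.

```python
# Hypothetical illustration only: toy data, invented column names.
import pandas as pd
from sklearn.tree import DecisionTreeClassifier

# Toy dataset in which sensitive attributes sit alongside academic features.
df = pd.DataFrame({
    "prior_grade": [62, 71, 55, 80, 68, 74, 59, 90],
    "attendance":  [0.82, 0.91, 0.75, 0.95, 0.88, 0.93, 0.70, 0.97],
    "gender":      [0, 1, 0, 1, 0, 1, 0, 1],   # encoded sensitive attribute
    "ethnicity":   [1, 0, 1, 0, 1, 0, 1, 0],   # encoded sensitive attribute
    "awarded":     [0, 1, 0, 1, 0, 1, 0, 1],   # target, e.g. scholarship awarded
})

X = df.drop(columns=["awarded"])
y = df["awarded"]

# Naïve approach: every available column is used as a predictor.
model = DecisionTreeClassifier(random_state=0).fit(X, y)

# Feature importances reveal whether the model is predicting from the
# sensitive attributes rather than from academic performance.
for name, importance in zip(X.columns, model.feature_importances_):
    print(f"{name}: {importance:.2f}")
```

In this contrived example the encoded gender column correlates perfectly with the award outcome, so the tree splits on it and its importance dominates, which is exactly the kind of unanticipated, discriminatory inference the paper warns about.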

Item Type: Paper presented at a conference, workshop or other event, and published in the proceedings
Keywords that describe the item: Artificial Intelligence, Machine Learning, Naïve approaches, Risks of AI
Subjects: Q Science > QA Mathematics > QA76 Computer software
Divisions: Schools > Centre for Business, Information Technology and Enterprise > School of Information Technology
ID Code: 6946
Deposited On: 18 Nov 2019 21:53
Last Modified: 09 Dec 2019 21:01
