Towards a Quantitative Evaluation Framework for Trustworthy AI in Facial Analysis

dc.contributor.author Schreiner, Annika
dc.contributor.author Kemmerzell, Nils
dc.date.accessioned 2023-12-26T18:55:48Z
dc.date.available 2023-12-26T18:55:48Z
dc.date.issued 2024-01-03
dc.identifier.isbn 978-0-9981331-7-1
dc.identifier.other 1d76aa7b-f80a-40ba-9bce-971b6e583a7a
dc.identifier.uri https://hdl.handle.net/10125/107326
dc.language.iso eng
dc.relation.ispartof Proceedings of the 57th Hawaii International Conference on System Sciences
dc.rights Attribution-NonCommercial-NoDerivatives 4.0 International
dc.rights.uri https://creativecommons.org/licenses/by-nc-nd/4.0/
dc.subject Trustworthy Artificial Intelligence and Machine Learning
dc.subject evaluation
dc.subject facial analysis
dc.subject trustworthy ai
dc.title Towards a Quantitative Evaluation Framework for Trustworthy AI in Facial Analysis
dc.type Conference Paper
dc.type.dcmi Text
dcterms.abstract As machine learning (ML) models are increasingly being used in real-life applications, ensuring their trustworthiness has become a growing concern. Previous research has extensively examined individual perspectives on trustworthiness, such as fairness, robustness, privacy, and explainability. Investigating their interrelations is a natural next step toward a better understanding of the trustworthiness of ML models. By conducting experiments within the context of facial analysis, we explore the feasibility of quantifying multiple aspects of trustworthiness within a unified evaluation framework. Our results indicate the viability of such a framework, achieved through the aggregation of diverse metrics into holistic scores. This framework can serve as a practical tool to assess ML models in terms of multiple aspects of trustworthiness, specifically enabling the quantification of their interactions and the impact of training data. Finally, we discuss potential solutions to key technical challenges in developing the framework and the opportunities for transferring it to other use cases.
dcterms.extent 10 pages
prism.startingpage 7821
Files
Original bundle: 0763.pdf (205.6 KB, Adobe Portable Document Format)
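
To illustrate the kind of aggregation the abstract describes, below is a minimal, hypothetical Python sketch (not taken from the paper) of how per-dimension metrics for fairness, robustness, privacy, and explainability might be normalized to [0, 1] and combined into a single holistic trustworthiness score via a weighted mean. All metric names, raw values, and weights are illustrative assumptions, not results from the paper.

# Hypothetical sketch: normalize per-dimension trustworthiness metrics
# and aggregate them into one holistic score with a weighted mean.
# Metric names, raw values, and weights are illustrative assumptions.

def normalize(value, worst, best):
    """Map a raw metric value onto [0, 1], where 1 is most trustworthy."""
    return max(0.0, min(1.0, (value - worst) / (best - worst)))

def holistic_score(scores, weights):
    """Weighted mean of already-normalized metric scores."""
    total_weight = sum(weights[name] for name in scores)
    return sum(weights[name] * s for name, s in scores.items()) / total_weight

# Example with made-up values for a facial-analysis model:
normalized = {
    "fairness": normalize(0.08, worst=0.30, best=0.0),       # e.g. demographic parity gap
    "robustness": normalize(0.72, worst=0.0, best=1.0),       # e.g. accuracy under perturbation
    "privacy": normalize(0.55, worst=1.0, best=0.0),          # e.g. membership-inference advantage
    "explainability": normalize(0.61, worst=0.0, best=1.0),   # e.g. explanation faithfulness
}
weights = {"fairness": 1.0, "robustness": 1.0, "privacy": 1.0, "explainability": 1.0}
print(f"Holistic trustworthiness score: {holistic_score(normalized, weights):.3f}")

A lower raw value is better for some metrics (e.g. a fairness gap) and worse for others (e.g. robust accuracy); swapping the worst and best anchors in normalize handles both directions, and the weights could be adjusted to reflect use-case priorities.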