Easy and Efficient Hyperparameter Optimization to Address Some Artificial Intelligence “ilities”

File: 0094.pdf (752.85 kB, Adobe PDF)

Item Summary

Title: Easy and Efficient Hyperparameter Optimization to Address Some Artificial Intelligence “ilities”
Authors: Bihl, Trevor
Schoenbeck, Joe
Steeneck, Daniel
Jordan, Jeremy
Keywords: Big Data and Analytics: Pathways to Maturity
machine learning
professional practice
Date Issued: 07 Jan 2020
Abstract: Artificial Intelligence (AI) has many benefits, including the ability to find complex patterns, automation, and meaning-making. Through these benefits, AI has revolutionized image processing, among numerous other disciplines. AI has the potential to revolutionize other domains as well; however, this will not happen until we can address the “ilities”: repeatability, explain-ability, reliability, use-ability, trust-ability, etc. Notably, many problems with the “ilities” stem from the artistic nature of AI algorithm development, especially hyperparameter determination. AI algorithms are often crafted products, with the hyperparameters learned experientially. As such, when the same algorithm is applied to new problems, it may perform poorly due to inappropriate settings. This research aims to provide a straightforward and reliable approach to automatically determining suitable hyperparameter settings for a given AI algorithm. Results show that reasonable performance is possible, and end-to-end examples are given for three deep learning algorithms and three different data problems.
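The abstract does not specify which optimization method the paper uses, so as a minimal illustration of what "automatically determining suitable hyperparameter settings" can look like, here is a random-search sketch over a hypothetical hyperparameter space. The space, the configuration names, and the stand-in objective are all assumptions for illustration, not the authors' method; a real run would train and validate the target deep learning model instead of evaluating a synthetic loss.

```python
import random

# Hypothetical search space for a neural-network-style model.
# The names and ranges here are illustrative assumptions only.
SPACE = {
    "learning_rate": [1e-4, 1e-3, 1e-2, 1e-1],
    "batch_size": [16, 32, 64, 128],
    "hidden_units": [32, 64, 128, 256],
}

def validation_loss(cfg):
    """Stand-in objective: a smooth bowl with a known optimum.

    Used only to make the sketch runnable; in practice this would
    train the model with `cfg` and return its validation error.
    """
    return ((cfg["learning_rate"] - 1e-2) ** 2
            + (cfg["batch_size"] - 64) ** 2 / 1e4
            + (cfg["hidden_units"] - 128) ** 2 / 1e5)

def random_search(n_trials=50, seed=0):
    """Sample configurations uniformly at random and keep the best one."""
    rng = random.Random(seed)
    best_cfg, best_loss = None, float("inf")
    for _ in range(n_trials):
        cfg = {key: rng.choice(values) for key, values in SPACE.items()}
        loss = validation_loss(cfg)
        if loss < best_loss:
            best_cfg, best_loss = cfg, loss
    return best_cfg, best_loss

best_cfg, best_loss = random_search()
```

Random search is one of the simplest "straightforward and reliable" baselines for this problem: it needs no gradient information and parallelizes trivially, at the cost of more trials than model-based methods.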
Pages/Duration: 10 pages
Rights: Attribution-NonCommercial-NoDerivatives 4.0 International
Appears in Collections: Big Data and Analytics: Pathways to Maturity

