Assessing the Risk of an Adaptation using Prior Compliance Verification

Date
2018-01-03
Authors
Marshall, Allen
Jahan, Sharmin
Gamble, Rose
Abstract
Autonomous systems must respond to large amounts of streaming information while complying with critical properties to maintain behavior guarantees. Compliance is especially important when a system self-adapts to perform a repair, improve performance, or modify decisions. Significant challenges remain in assessing the risk, with respect to critical property compliance, of adaptations that are dynamically configured at runtime. Assuming compliance verification was performed for the originally deployed system, the proof process holds valuable metadata about the variables and conditions that affect reusing the proof on the adapted system. We express this metadata as a verification workflow using Colored Petri Nets. As dynamic adaptations are configured, the Petri Nets produce alert tokens indicating the potential impact of an adaptation on proof reuse. Alert tokens hold risk values that feed a utility function used to determine the least risky adaptations. We illustrate the modeling and risk assessment with a case study.
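The abstract describes alert tokens that carry risk values and a utility function that ranks candidate adaptations by those values. Below is a minimal sketch, not the authors' implementation, of how such a ranking could look; the names (AlertToken, utility, choose_least_risky), the risk scale, and the aggregation rule are illustrative assumptions, not details taken from the paper.

```python
from dataclasses import dataclass
from typing import Dict, List


@dataclass
class AlertToken:
    """An alert raised when a configured adaptation touches a variable or
    condition used in the original compliance proof (assumed structure)."""
    concern: str   # proof variable or condition affected
    risk: float    # assumed risk value in [0, 1]


def utility(tokens: List[AlertToken]) -> float:
    """Assumed utility: higher is better, so risk lowers it.
    Here we simply take 1 minus the maximum token risk."""
    return 1.0 - max((t.risk for t in tokens), default=0.0)


def choose_least_risky(candidates: Dict[str, List[AlertToken]]) -> str:
    """Pick the adaptation whose alert tokens yield the highest utility."""
    return max(candidates, key=lambda name: utility(candidates[name]))


if __name__ == "__main__":
    # Hypothetical adaptations and the alert tokens their configuration produced.
    candidates = {
        "restart-service": [AlertToken("message-format precondition", 0.2)],
        "swap-component":  [AlertToken("state invariant", 0.7),
                            AlertToken("timing assumption", 0.4)],
    }
    print(choose_least_risky(candidates))  # -> "restart-service"
```

In this sketch the adaptation with the weakest worst-case proof-reuse concern wins; the paper's actual utility function may weight or combine token risk values differently.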
Keywords
IS Risk and Decision-Making, adaptive plans, colored Petri nets, risk assessment, self-adapting systems, verification concerns, verification process reuse
Extent
10 pages
Related To
Proceedings of the 51st Hawaii International Conference on System Sciences
Rights
Attribution-NonCommercial-NoDerivatives 4.0 International