Polycentric Generative‑Assurance Theory: Toward Adaptive Governance in Generative AI-Augmented Software Assurance
Starting Page: 780
Abstract
The integration of generative AI (GenAI) into software development is transforming how code is authored, reviewed, and assured. While GenAI boosts productivity and creativity, it disrupts longstanding assurance frameworks, introducing epistemic opacity, validation deficits, accountability ambiguities, and governance challenges. This paper introduces Polycentric Generative-Assurance Theory (PGAT), a sociotechnical framework explaining how trust in AI-generated code is sustained through five interdependent responsibilities: epistemic mapping, adversarial socio-technical analysis, meta-validation, computational ethics, and evolutionary governance. Our findings reveal that assurance is no longer linear or role-bound, but rather a distributed, adaptive, and emergent practice. PGAT reframes assurance as a responsible process of trust orchestration, where multiple responsibilities coalesce to ensure the reliability, maintainability, and ethical integrity of software development practices.
Extent: 10 pages
Type: Conference Paper
Related To: Proceedings of the 59th Hawaii International Conference on System Sciences
Rights: Attribution-NonCommercial-NoDerivatives 4.0 International
