Data Systems Fault Coping for Real-time Big Data Analytics Required Architectural Crucibles
Author: Money, William H.
Issue Date: 04 Jan 2017
Abstract: This paper analyzes the properties and characteristics of unknown and unexpected faults introduced into information systems while processing Big Data in real time. The authors hypothesize that such systems face new faults and new fault-handling requirements, and propose an analytic model and architectural framework to assess and manage these faults, mitigate the risks of correlating or integrating otherwise uncorrelated Big Data, and ensure the source pedigree, quality, set integrity, freshness, and validity of the data being consumed. We argue that new architectures, methods, and tools for Big Data systems operating in real time must address and mitigate faults arising from real-time streaming processes, while ensuring that variables such as synchronization, redundancy, and latency are accounted for. The paper concludes that, with improved designs, real-time Big Data systems can continuously deliver the value and benefits of streaming Big Data.
Rights: Attribution-NonCommercial-NoDerivatives 4.0 International
Appears in Collections: Big Data and Analytics: Concepts, Methods, Techniques and Applications Minitrack