Data Systems Fault Coping for Real-time Big Data Analytics Required Architectural Crucibles

File: paper0125.pdf (1.11 MB, Adobe PDF)

Item Summary

Title: Data Systems Fault Coping for Real-time Big Data Analytics Required Architectural Crucibles
Authors: Cohen, Stephen
Money, William H.
Keywords: Analytics
Issue Date: 04 Jan 2017
Abstract: This paper analyzes the properties and characteristics of unknown and unexpected faults introduced into information systems while processing Big Data in real time. The authors hypothesize that these systems face new faults and new fault-handling requirements, and propose an analytic model and architectural framework to assess and manage the faults, to mitigate the risks of correlating or integrating otherwise uncorrelated Big Data, and to ensure the source pedigree, quality, set integrity, freshness, and validity of the data being consumed. We argue that new architectures, methods, and tools for handling and analyzing real-time Big Data systems must address and mitigate faults arising from real-time streaming processes while also accounting for variables such as synchronization, redundancy, and latency. The paper concludes that, with improved designs, real-time Big Data systems can continuously deliver the value and benefits of streaming Big Data.
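The abstract enumerates several data-quality dimensions that such a framework would check on incoming streams (source pedigree, freshness, validity). A minimal sketch of per-record checks along those dimensions is shown below; all names, thresholds, and structures here are hypothetical illustrations, not taken from the paper itself:

```python
from dataclasses import dataclass

# Hypothetical streaming record; the fields mirror three of the
# quality dimensions named in the abstract (none of this code is
# from the paper).

@dataclass
class Record:
    source: str        # pedigree: where the datum originated
    timestamp: float   # freshness: when it was produced (epoch seconds)
    payload: dict      # validity: the fields consumers expect

TRUSTED_SOURCES = {"sensor-a", "sensor-b"}   # assumed pedigree whitelist
MAX_AGE_SECONDS = 5.0                        # assumed freshness window
REQUIRED_FIELDS = {"value", "unit"}          # assumed payload schema

def faults(record: Record, now: float) -> list:
    """Return the fault labels raised by a single record."""
    found = []
    if record.source not in TRUSTED_SOURCES:
        found.append("pedigree")           # untrusted origin
    if now - record.timestamp > MAX_AGE_SECONDS:
        found.append("stale")              # freshness violated
    if not REQUIRED_FIELDS <= record.payload.keys():
        found.append("invalid")            # schema incomplete
    return found
```

In a real-time pipeline, a filter of this kind would sit ahead of any correlation or integration step, so that faulty records are flagged or quarantined before they can contaminate downstream analytics.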
Pages/Duration: 10 pages
ISBN: 978-0-9981331-0-2
DOI: 10.24251/HICSS.2017.121
Rights: Attribution-NonCommercial-NoDerivatives 4.0 International
Appears in Collections: Big Data and Analytics: Concepts, Methods, Techniques and Applications Minitrack

