Wei Xu, Peter Bodík, and David Patterson
Modern computer systems are instrumented to generate huge volumes of system log data. These logs contain information valuable for managing the system, localizing failures, and guiding recovery. However, the complexity of such systems far surpasses what human operators can track, so automated statistical analysis is increasingly used. Because the statistical algorithms require substantial preprocessing, the extremely high volume of log data cannot be handled with ad-hoc scripts. We present a flexible, modular, and scalable architecture for statistical learning from large data streams. We built a prototype and evaluate it on system log data from a commercial on-line service; the results of the analysis proved genuinely useful to the service's operators.
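The abstract describes a pipeline of preprocessing and statistical-learning stages over log streams. As a minimal sketch (the stage names and feature choice here are illustrative assumptions, not the paper's actual components), such a modular pipeline can be composed from independent stages, each of which can be swapped or scaled on its own:

```python
# Illustrative sketch of a modular log-analysis pipeline: a parsing stage
# feeds a feature-extraction stage via generators. Component names are
# hypothetical, not taken from the paper.
from collections import Counter

def parse(lines):
    # Parse raw log lines into (level, message) tuples.
    for line in lines:
        level, _, message = line.partition(" ")
        yield level, message

def count_by_level(events):
    # Aggregate a simple feature: message counts per log level,
    # a stand-in for the preprocessing a learning algorithm would need.
    counts = Counter()
    for level, _ in events:
        counts[level] += 1
    return counts

logs = [
    "INFO request served",
    "ERROR disk timeout",
    "INFO request served",
]
features = count_by_level(parse(logs))
print(features["INFO"], features["ERROR"])  # 2 1
```

Because each stage consumes and produces a stream, stages can be rearranged or replaced without touching the others, which is the kind of flexibility the abstract claims for the architecture.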
Published in: Temporal Data Mining: Algorithms, Theory and Applications