What Does Real-time Really Mean In Data Analytics?

Before jumping into real-time data analytics, organizations should define what they mean by ‘real time’ in their specific business use case.

Paige Roberts, Open Source Relations Manager, Vertica

July 9, 2023


Is an analytical response within 300 milliseconds on data generated yesterday considered real-time? In today’s fast-paced digital landscape, the concept of real-time data analysis is increasingly prevalent and essential to business success. Yet there’s a lot of confusion about what “real-time” really means.

Agreeing on that definition when discussing real-time data analysis is crucial to unlocking the potential of real-time analytics to propel business growth in this data-driven era.

One refinement I propose is to differentiate between end-to-end real-time data analysis and fast response from already-prepared data. Response latency is the time it takes for a system to process a request or query and respond. End-to-end real-time data analysis covers the full span from the generation of new data through the time it takes to transport, transform, and prepare it for analysis, plus the time for the analysis itself.
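
To make the difference concrete, here is a minimal Python sketch (the timeline values are hypothetical) that computes both figures for a single data point, echoing the 300-milliseconds-on-yesterday’s-data question above:

    from dataclasses import dataclass

    @dataclass
    class DataPointTimeline:
        generated_at: float    # moment the source system produced the data
        ready_at: float        # data transported, transformed, and prepared
        query_sent_at: float   # moment a query is issued against it
        answered_at: float     # moment the system responds

    # Hypothetical timeline, in seconds: data generated hours before it is queried.
    t = DataPointTimeline(generated_at=0.0, ready_at=14_400.0,
                          query_sent_at=14_400.5, answered_at=14_400.8)

    response_latency = t.answered_at - t.query_sent_at   # 0.3 s: feels real-time
    end_to_end = t.answered_at - t.generated_at          # 4 hours: the data was stale

    print(f"response latency:   {response_latency:.1f} s")
    print(f"end-to-end latency: {end_to_end / 3600:.1f} h")

A sub-second response latency can coexist with an end-to-end latency of hours, which is exactly the ambiguity the two categories below untangle.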

Low Latency Real-Time Data Analysis

The first category of definitions is related to response latency:

Sub-second response: Real-time often refers to a response anywhere from a few hundred milliseconds, common in a good analytical database, down to a few microseconds or even nanoseconds, which only a few highly specialized technologies can attain. Applications like cybersecurity or stock exchange bidding systems necessitate this near-instantaneous category of response. Fraud detection, by contrast, generally works fine with a response measured in milliseconds.

Interactive response: This definition takes the analytics user’s perspective. Systems that respond promptly to queries or actions, such as clicking to drill down for more detailed information on an analytic graph, qualify as real-time. While a few seconds of latency might be acceptable at times, exceeding that threshold can result in user frustration or lost opportunities.

End-to-End Real-time Data Analysis

The second category of definitions includes processing the data from the source, not just getting a fast response from already-prepared data:

Streaming: As opposed to “batch” processing, where data accumulates and is then processed all at once, streaming involves processing and analyzing data as it flows in continuously, usually one unit at a time. Often, “micro-batches” process the data from a small time window, such as a few seconds or minutes; many popular streaming data processing technologies actually work this way, and it is still considered streaming (a minimal micro-batch sketch follows this list). Monitoring or acting on data from sensors and other Internet of Things (IoT) devices is a common use case; predictive maintenance and network optimization are good examples, as is sentiment analysis on social media streams.

Event-driven: This revolves around triggers or actions that initiate data analysis and response. Rather than adhering to scheduled intervals, the goal is to respond promptly to specific events. Examples include change data capture, which picks up changes from source databases as they happen, and loading, processing, and analyzing data as soon as it arrives from a third party. Performance expectations in event-driven scenarios hinge on completing the processing before the next event occurs, so that the system is ready for the new data (see the dispatch sketch after this list).
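
Here is the micro-batch sketch mentioned above: a minimal Python illustration, with a hypothetical sensor feed standing in for an IoT stream, that groups a continuous flow of events into small time windows and processes each window as one unit:

    import time

    def micro_batches(source, window_seconds):
        """Group a continuous event stream into small time-window batches."""
        batch, window_end = [], time.monotonic() + window_seconds
        for event in source:                    # events flow in continuously
            batch.append(event)
            if time.monotonic() >= window_end:  # window closed: emit the batch
                yield batch
                batch, window_end = [], time.monotonic() + window_seconds
        if batch:                               # flush any trailing events
            yield batch

    def sensor_feed():
        """Hypothetical IoT sensor readings arriving twice per second."""
        for i in range(10):
            time.sleep(0.5)
            yield {"sensor": "pump-7", "reading": 100 + i}

    for batch in micro_batches(sensor_feed(), window_seconds=2.0):
        avg = sum(e["reading"] for e in batch) / len(batch)
        print(f"window of {len(batch)} readings, average {avg:.1f}")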
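
And the dispatch sketch for the event-driven style: rather than polling on a schedule, handlers run the moment a trigger fires. The event names and handlers below are hypothetical stand-ins for change data capture or a third-party data drop:

    # Minimal event-driven dispatch: handlers run as soon as an event arrives,
    # rather than on a schedule. Event types and payloads are hypothetical.
    handlers = {}

    def on(event_type):
        def register(fn):
            handlers[event_type] = fn
            return fn
        return register

    @on("row_changed")           # e.g., change data capture from a source database
    def apply_change(event):
        print(f"applying change to {event['table']}: {event['row']}")

    @on("file_arrived")          # e.g., a third-party data drop
    def load_file(event):
        print(f"loading and analyzing {event['path']}")

    def dispatch(event):
        # Processing must finish before the next event so the system stays ready.
        handlers[event["type"]](event)

    dispatch({"type": "row_changed", "table": "orders", "row": {"id": 42}})
    dispatch({"type": "file_arrived", "path": "partner_feed.csv"})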

Unlock Real Value From Real-Time Analytics

The ability to swiftly process incoming data and deliver insights in a timely manner enables businesses to seize opportunities, detect anomalies, and drive proactive decision-making. To harness the real value of real-time data analysis, organizations must establish a strong foundation:

Real-time definition clarity: Consider the requirements of your use cases: whether you need sub-second or human-interactive latency, and whether you need streaming or event-driven processing to get the data ready within a short time window. It’s not uncommon to need one strategy from each category, one to prepare data rapidly for analysis and one to analyze it at the speed the use case demands.

Infrastructure readiness: Invest in a robust infrastructure that supports your chosen definition of real-time processing. This includes selecting the right technologies such as streaming data platforms, analytical databases, and hardware or cloud instances.

Performance optimization: Fine-tune your analytical systems to meet end-to-end processing and latency requirements. Any good data processing or analysis technology should give you extensive options for monitoring, locating, and refining workloads that aren’t meeting latency needs (a simple timing sketch follows this list). Throwing more hardware at the problem is not an ideal solution; in the end, it will increase both costs and energy burned in a world that needs energy conservation.

Pipeline speed focus: Fast response on stale data is no longer acceptable for business use cases with modern real-time requirements. Instead of slow batch transformation in a staging area, best practices are moving toward automated, continuous loading of data into analytical databases (see the loading sketch below).
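
The timing sketch mentioned under performance optimization: a minimal way to start monitoring is to wrap each workload in a timer and flag anything that misses its latency target. The target and the workload here are hypothetical:

    import time

    LATENCY_TARGET_SECONDS = 0.5   # hypothetical SLO for interactive queries

    def timed(workload_name, fn, *args, **kwargs):
        """Run a workload, record its latency, and flag SLO misses for tuning."""
        start = time.perf_counter()
        result = fn(*args, **kwargs)
        elapsed = time.perf_counter() - start
        status = "OK" if elapsed <= LATENCY_TARGET_SECONDS else "SLOW, investigate"
        print(f"{workload_name}: {elapsed * 1000:.0f} ms [{status}]")
        return result

    # Hypothetical workload standing in for an analytical query.
    timed("drill_down_query", lambda: sum(range(1_000_000)))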
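
And the loading sketch for pipeline speed: a simulated stand-in (not any specific product’s API) in which each arriving record is lightly transformed and loaded the moment it arrives, so it is queryable immediately rather than after a staged batch job:

    import time

    analytical_store = []   # stand-in for an analytical database table

    def event_stream():
        """Simulated feed; in practice this would be a streaming platform."""
        for i in range(5):
            time.sleep(0.2)
            yield {"order_id": i, "amount": 10.0 * i, "arrived": time.time()}

    for record in event_stream():
        # Load on arrival: the record becomes queryable immediately, not after
        # a nightly batch transformation in a staging area.
        record["amount_with_tax"] = round(record["amount"] * 1.08, 2)  # light transform
        analytical_store.append(record)
        print(f"loaded order {record['order_id']}, rows queryable: {len(analytical_store)}")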

Today’s organizations are sitting on massive amounts of data, but in the absence of a proper analytics foundation, much of this valuable data stays unusable. One obvious piece of a robust real-time analytics foundation is a complete understanding of what customers expect when it comes to real-time. The other critical piece is a platform that can shorten the time it takes to make data ready for analysis and then execute the analysis itself quickly.

Embracing real-time analysis will empower organizations to respond swiftly, make informed decisions, and deliver exceptional experiences in an increasingly dynamic and interconnected world.

About the Author(s)

Paige Roberts

Open Source Relations Manager, Vertica

Paige Roberts is Open Source Relations Manager for Vertica by OpenText. With 25 years of experience in data management and analytics, she has worked as an engineer, trainer, support technician, technical writer, marketer, product manager, and consultant. She contributed to “97 Things Every Data Engineer Should Know” and co-authored “Accelerate Machine Learning with a Unified Analytics Architecture,” both published by O’Reilly.
