
Real-Time Analytics: Examples and Benefits

By Dhruba Borthakur

Technology behemoths like Netflix, Uber, and Meta set the standard for how application users experience data. Users expect data to be integrated into the application, making it easier to find relevant content, track deliveries, provide a spam-free internet experience, and make quick, informed operational decisions. Up until this point, the speed and scale of real-time analytics have been challenging to achieve in applications.

Real-time analytics has required custom-built technologies and armies of data and infrastructure engineers to manage them. That’s changing with the widespread adoption of real-time streaming data and cloud services that reduce operational burden and improve resource efficiency.

This article explains real-time analytics, contrasts it with batch analytics, and provides examples and benefits across industries.

What Is Real-Time Analytics?

Real-time analytics is all about using data as soon as it is produced to answer questions, make predictions, understand relationships, and automate processes. Gartner defines it as “the discipline that applies logic and mathematics to data to provide insights for making better decisions quickly.” The core requirements of real-time analytics are access to fresh data and fast queries, which are essentially two measures of latency: data latency and query latency.

Data Latency: Data latency is a measure of the time from when data is generated to when it is queryable. There is usually a time lag during this process, and real-time analytics databases are designed to minimize that lag, allowing for changes in data to be quickly reflected.

Low data latency can be challenging to deliver, as the database must be able to write incoming data while simultaneously allowing the application to make queries on the most recent data. That means having a database that can handle high write rates and is optimized for real-time data processing, not batch analytics jobs, which have traditionally been the data processing method for analytics.
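As a rough illustration of the metric itself, data latency can be tracked by comparing each event's generation timestamp with the moment it becomes queryable. The sketch below assumes an `event_ts` field stamped at generation time and a generic `write_to_db` ingest call; both are illustrative placeholders, not any particular product's API.

```python
import time

def record_data_latency(event: dict, write_to_db) -> float:
    """Write one event and return its data latency in seconds.

    Assumes `event` carries an `event_ts` field (epoch seconds) stamped
    when the event was generated, and that `write_to_db` is whatever
    ingest call your pipeline exposes. Both are illustrative placeholders.
    """
    write_to_db(event)                       # ingest the raw event
    queryable_ts = time.time()               # moment the write is acknowledged
    return queryable_ts - event["event_ts"]  # generation-to-queryable lag
```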

Query Latency: Query latency is the time required to execute a query and return a result. Applications want to minimize query latency for snappy, responsive user experiences, and teams are increasingly setting sub-second query latency standards for their data applications. That said, massaging data and optimizing indexes to deliver consistently low query latency can be time-consuming, making it challenging for teams to iterate and expand on their analytical features.
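One simple way to hold a team to a sub-second standard is to time every query against a latency budget. The sketch below uses a 500 ms budget and a generic `run_query` callable; both are assumptions for illustration rather than features of any specific database client.

```python
import time

QUERY_LATENCY_BUDGET_S = 0.5   # example sub-second target, not a standard

def timed_query(run_query, sql: str):
    """Execute a query and report whether it stayed inside the budget.

    `run_query` is a placeholder for whatever call your database
    driver provides.
    """
    start = time.perf_counter()
    result = run_query(sql)                     # hypothetical query call
    latency = time.perf_counter() - start
    if latency > QUERY_LATENCY_BUDGET_S:
        print(f"slow query ({latency:.3f}s): {sql[:60]}")
    return result, latency
```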

Real-Time vs. Batch Analytics

Real-time analytics is optimized for low latency, making data available for querying within seconds of its generation. Batch analytics, by contrast, is high-latency: queries return results on data that is at least tens of minutes, and often hours, old.

One use case for batch analytics is business intelligence reporting, which uses historical data to report on business trends and answer strategic questions. In these scenarios, the goal is to use data to craft strategy, not to take immediate action. Real-time data would not generally change the result of the trend analysis, making this work better suited to batch analytics. Batch use cases like business intelligence, reporting, and data science have less stringent latency requirements and can therefore tolerate ETL pipelines that homogenize and enrich data for analytics. In contrast, real-time use cases have low latency requirements and aim to reduce or remove the need for ETL processes.

Many analytics systems, such as Hadoop and data warehouses, were designed for batch analytics. Batch systems process data in batches: data is collected and loaded into the system over a period of time before being processed. Rather than keeping an “always on” processing pipeline, they can restrict data processing to specific time intervals to reduce costs. Batching also helps with data compression, reducing the overall storage footprint and making periodic analytics on large-scale data economical.
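To make the contrast concrete, here is a minimal sketch comparing a batch-style hourly aggregation with a per-event streaming update. The `user_id` and `amount` fields are illustrative, and the sketch stands in for the general pattern rather than for Hadoop, a warehouse, or any specific streaming system.

```python
from collections import defaultdict

# Batch style: events accumulate, then a scheduled job processes the window.
def hourly_batch_job(events_last_hour):
    """Runs once per hour over everything collected in that window,
    so results are already up to an hour stale when they land."""
    totals = defaultdict(int)
    for event in events_last_hour:
        totals[event["user_id"]] += event["amount"]
    return totals

# Real-time style: each event updates the aggregate as it arrives,
# so the running total is queryable immediately.
running_totals = defaultdict(int)

def on_event(event):
    running_totals[event["user_id"]] += event["amount"]
```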

On the other hand, databases designed for real-time analytics have native support for semi-structured data and other modern data formats to avoid ETL processes and achieve low data latency. They are also optimized for compute efficiency to reduce the resources required to constantly process incoming data and execute high-volume queries.
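As a rough sketch of what avoiding ETL can look like in practice, the example below ingests a nested JSON event in its raw shape and queries a nested field directly. The `db` client, its `ingest_document` and `query` methods, and the SQL dialect are all assumptions for illustration, not a specific product's API.

```python
import json

def ingest_and_query(db):
    """Ingest a raw nested JSON event as-is, then query a nested field.

    `db` stands in for a hypothetical real-time analytics client whose
    `ingest_document` and `query` methods are illustrative names only.
    """
    raw_event = json.loads(
        '{"order_id": "A-1042", '
        '"user": {"id": 7, "region": "us-west"}, '
        '"items": [{"sku": "X1", "qty": 2}, {"sku": "Y9", "qty": 1}]}'
    )

    # No upfront ETL: the nested document is written in its raw shape.
    db.ingest_document("orders", raw_event)

    # Nested fields are queryable moments after ingestion.
    return db.query(
        "SELECT user.region, SUM(i.qty) AS units "
        "FROM orders, UNNEST(items) AS i "
        "GROUP BY user.region"
    )
```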

Use Cases for Real-Time Analytics

The increasing demand for real-time analytics is being driven by several benefits. 

Snappy, responsive experiences: Snappy, responsive experiences increase user adoption. One investment management firm increased its application usage by 350% by lowering the latency of its user-facing analytics. As a result, the application’s insights became embedded in the organization’s day-to-day decision-making.

Faster decision-making: If every question you ask of your data takes seconds or minutes to answer, you dig less deeply into the information and lean more on intuition. Seesaw, an edtech company used by more than 10 million K-12 teachers, created a data-driven culture in which sales, support, and product teams use real-time analytics to quickly improve the experience of schools and teachers.

Semi-automated and automated intelligence: Automated or semi-automated intelligence can reduce the cognitive load of decision-making. Whatnot, a live video marketplace, uses a real-time ranking engine to show users viral videos, relevant social interactions, and personalized shopping recommendations, keeping them engaged on the site.

Time-sensitive interventions: Time-sensitive interventions save on operational costs and increase revenue. Command Alkon, a construction logistics company, tracks concrete deliveries across the North American market, ensuring that construction sites are prepared for deliveries. Because concrete has a short lifespan, sites need to be ready to use it immediately or risk jeopardizing the entire construction project.

Growth in Real-Time Analytics

Real-time analytics databases have matured, making it easier for engineering teams to access streaming data and achieve low-latency analytics. Engineering teams are no longer required to custom-build or self-manage complex, distributed systems to achieve real-time analytics. 

The most fundamental change enabling the growth in adoption of real-time analytics is the cloud. Companies can scale up and down resources to meet changing application demands, avoiding overpaying for excess capacity when traffic slows down. Real-time analytics databases have also separated storage and compute so you no longer need to overprovision resources, achieving better price-performance at scale. The cloud offers new levels of operational simplicity and resource efficiency that will put real-time analytics within reach of even more companies in 2023.