Streaming data has become extremely popular lately, thanks to its many use cases and the growing number of tools and databases built for stream analytics. Having at least a basic idea of how it works matters more than ever, and that's what this blog post is all about.
Questions that this blog post will answer:
- What is streaming data?
- What are the sources of streaming data?
- What are some examples of streaming data?
- How is it collected, processed, and analyzed?
- How does it help companies make data-driven decisions?
This blog post will also explain how companies know more about YOU than you do.
Data you're already familiar with: product data (product_id, name, price, ...), customer data (customer_name, id, ...). This kind of data is typically stored in SQL or NoSQL databases, depending on the business problem.
Streaming data, on the other hand, is completely different from the data most of you are familiar with.
Streaming data: data that arrives continuously from various sources and needs to be processed either in real time or at some later point.
Let's break it down further.
Stream data = data generated from events (most of the time)
Event: anything that happens in the real world at a particular timestamp.
Continuous stream of events: never-ending succession of individual events.
They keep flowing, just like water through a river, and the same goes for stream data.
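To make that concrete, here's a tiny sketch of what one event could look like as a record. The field names are illustrative assumptions, not a fixed schema:

```python
from datetime import datetime, timezone

# One event: what happened, to whom, and when.
# Field names here are illustrative assumptions, not a fixed schema.
event = {
    "event_type": "product_clicked",
    "user_id": "user_123",
    "product_id": "prod_456",
    "timestamp": datetime.now(timezone.utc).isoformat(),
}
print(event)
```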
Now, let's come back to the world of programming.
Applications run 24x7, serving millions of users. These applications generate both kinds of data: the data you're familiar with, and streaming data.
Examples of streaming data:
1) Logs: they tell developers what's happening during code execution (you know about them already).
Logs are generated continuously by applications. They capture what happened, what didn't, where a problem arose in the code, which edge case you missed, and so on.
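For a rough feel of it, here's a minimal sketch using Python's standard logging module; the service name and messages are made up for illustration:

```python
import logging

# A long-running service keeps appending lines like these, 24x7.
logging.basicConfig(
    level=logging.INFO,
    format="%(asctime)s %(levelname)s %(name)s - %(message)s",
)
logger = logging.getLogger("checkout-service")  # hypothetical service name

logger.info("order 98412 placed by user_123")
logger.warning("payment retry 2/3 for order 98412")
logger.error("payment gateway timeout for order 98412")
```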
2) Events data
User X clicked product Y. User X liked tweet Y. User X added product Y to the cart. User X spent Y minutes watching video Z.
These are all events that are being sent to servers for data analysis.
This is how companies know more about you than you.
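Here's a rough sketch of how a client might ship one such event to a collection endpoint. The URL and payload fields are assumptions, and a real tracker would batch and retry:

```python
import json
import time
import urllib.request

# Hypothetical collection endpoint; a real tracker would batch and retry.
ENDPOINT = "https://analytics.example.com/events"

event = {
    "event_type": "add_to_cart",
    "user_id": "user_123",
    "product_id": "prod_456",
    "timestamp": time.time(),
}

req = urllib.request.Request(
    ENDPOINT,
    data=json.dumps(event).encode("utf-8"),
    headers={"Content-Type": "application/json"},
    method="POST",
)
# urllib.request.urlopen(req)  # actually fire the request in a real client
```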
3) Time-series data
Data generated by sensors and IoT devices also counts as streaming data.
Sensor X reads Y value at Z time.
Timestamps play a big role in streaming data because we need to know exactly when each event occurred.
Streaming data brings a different way of looking at data.
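As a small illustration, "Sensor X reads Y value at Z time" maps naturally onto a timestamped record; this simulated sensor is purely hypothetical:

```python
import random
import time

# A simulated sensor: "Sensor X reads value Y at time Z", once per second.
def sensor_readings(sensor_id="sensor_42"):
    while True:
        yield {
            "sensor_id": sensor_id,
            "value": round(20 + random.random() * 5, 2),  # e.g. temperature in °C
            "timestamp": time.time(),
        }
        time.sleep(1)

for reading in sensor_readings():
    print(reading)
    break  # print just one reading for illustration
```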
Collection of stream data
Stream data is generated by event producers and then usually sent to Apache Kafka topics for downstream processing.
Apache Kafka decouples event producers from event consumers.
You can read more about Kafka here.
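Here's a minimal producer sketch using the kafka-python client; the broker address and the topic name "user-events" are assumptions for illustration:

```python
import json
from kafka import KafkaProducer  # pip install kafka-python

# Assumes a Kafka broker on localhost:9092 and a topic named "user-events".
producer = KafkaProducer(
    bootstrap_servers="localhost:9092",
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
)

event = {"event_type": "add_to_cart", "user_id": "user_123", "product_id": "prod_456"}
producer.send("user-events", value=event)
producer.flush()  # make sure the event actually leaves the client buffer
```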
Processing of stream data
The raw event data is processed using either the Kafka Streams API or Apache Samza.
The analytics part is mainly done using a database called Apache Druid, which is built for applications with high data-ingestion rates and for stream analytics.
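Kafka Streams and Samza are JVM libraries, so as a rough Python stand-in (not their actual APIs), here's a plain consumer that keeps a running count per product from the assumed "user-events" topic:

```python
import json
from collections import Counter
from kafka import KafkaConsumer  # pip install kafka-python

# Kafka Streams and Samza are JVM libraries; this plain-Python consumer is
# only a stand-in that keeps a running click count per product.
consumer = KafkaConsumer(
    "user-events",                       # assumed topic name
    bootstrap_servers="localhost:9092",  # assumed broker address
    value_deserializer=lambda v: json.loads(v.decode("utf-8")),
    auto_offset_reset="earliest",
)

clicks_per_product = Counter()
for message in consumer:
    event = message.value
    if event.get("event_type") == "product_clicked":
        clicks_per_product[event["product_id"]] += 1
        print(dict(clicks_per_product))
```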
How companies use stream analytics
The data generated from stream events is GOLD for companies. This is where most of the value and the insights about users lie.
Numerous use cases:
- Deployed some UX improvements? User activity will tell you whether they worked or not.
- A/B testing, tweaked some things on the website? Again, user activity comes to the rescue.
- Users spending less time on the website now? Time to look at what went wrong.
- Heap memory almost full and the container about to terminate? Alert on Slack and fix it (a rough sketch of this follows below).
And many more.
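Take the last one above, the heap-memory alert: at its core it's just a threshold check over a metric stream. A toy sketch, with a placeholder Slack webhook URL:

```python
import json
import urllib.request

# Toy alerting rule over a stream of container memory metrics.
# The webhook URL is a placeholder, not a real endpoint.
SLACK_WEBHOOK = "https://hooks.slack.com/services/XXX/YYY/ZZZ"
HEAP_THRESHOLD = 0.9  # alert when heap usage crosses 90%

def check_heap(metric):
    if metric["heap_used_ratio"] > HEAP_THRESHOLD:
        text = f"Heap at {metric['heap_used_ratio']:.0%} on {metric['container']}"
        req = urllib.request.Request(
            SLACK_WEBHOOK,
            data=json.dumps({"text": text}).encode("utf-8"),
            headers={"Content-Type": "application/json"},
            method="POST",
        )
        # urllib.request.urlopen(req)  # send the alert in a real setup
        print("ALERT:", text)

check_heap({"container": "api-7f9c", "heap_used_ratio": 0.93})
```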