tech-docs

Living documentation for evolving technologies

Event Streaming


Event streaming is a data-processing approach in which data is transmitted continuously, in real time, as a sequence of events. It is designed for handling large volumes of data that are generated, ingested, and processed at high speed, and it underpins applications such as real-time analytics, monitoring, and distributed systems.

Key Concepts and Components

Events

Events are discrete pieces of data that represent a change in state or an occurrence in a system. They can be as simple as a timestamp or as complex as a structured message. Events are generated by various sources, such as sensors, applications, or users.
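As a minimal sketch, an event can be modeled as an immutable record carrying a type, a payload, and a timestamp (the field names and the `sensor.reading` example here are illustrative, not a standard):

```python
import json
import time
from dataclasses import dataclass, field, asdict

@dataclass(frozen=True)
class Event:
    """A discrete record of something that happened in the system."""
    event_type: str   # e.g. "sensor.reading", "user.signup"
    payload: dict     # structured message body
    timestamp: float = field(default_factory=time.time)

    def to_json(self) -> str:
        # Events are typically serialized for transport (JSON, Avro, Protobuf, ...)
        return json.dumps(asdict(self))

reading = Event("sensor.reading", {"sensor_id": "t-17", "celsius": 21.4})
decoded = json.loads(reading.to_json())
```

Serializing events to a schema-friendly format is what lets heterogeneous producers and consumers exchange them.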

Event Streams

Event streams are sequences of events, typically organized chronologically. These streams are often persisted and can be replayed for historical analysis. Event streams are durable, meaning that they retain data even if consumers are not actively processing them.
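The append-only, replayable nature of a stream can be sketched with a simple in-memory log (a real system would persist the log to disk; the class and method names are illustrative):

```python
class EventStream:
    """An append-only, chronologically ordered log of events.

    'Durable' here means appended events remain available for replay
    regardless of whether any consumer is currently reading.
    """
    def __init__(self):
        self._log = []  # in production this would be persisted storage

    def append(self, event) -> int:
        self._log.append(event)
        return len(self._log) - 1  # offset of the new event

    def replay(self, from_offset: int = 0):
        # Consumers can re-read history from any offset.
        yield from self._log[from_offset:]

stream = EventStream()
for temp in (20.1, 20.7, 21.3):
    stream.append({"type": "sensor.reading", "celsius": temp})

history = list(stream.replay())            # full history
tail = list(stream.replay(from_offset=2))  # only the latest event
```

Replaying from an offset is what enables historical analysis and lets a late-joining consumer catch up.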

Publish-Subscribe Model

Event streaming often employs a publish-subscribe model, where producers (those generating events) publish events to specific topics or channels, and consumers (applications or services) subscribe to those topics to receive events of interest. This decouples the producers from the consumers, allowing for scalability and flexibility.
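The decoupling can be seen in a minimal topic-based hub: the producer only names a topic and never addresses consumers directly (this is an illustrative in-process sketch, not a networked broker):

```python
from collections import defaultdict

class PubSub:
    """Minimal topic-based publish-subscribe hub."""
    def __init__(self):
        self._subscribers = defaultdict(list)  # topic -> list of callbacks

    def subscribe(self, topic, callback):
        self._subscribers[topic].append(callback)

    def publish(self, topic, event):
        # The producer names a topic, not a consumer; that indirection
        # is what decouples the two sides.
        for callback in self._subscribers[topic]:
            callback(event)

hub = PubSub()
alerts, audit = [], []
hub.subscribe("orders", alerts.append)
hub.subscribe("orders", audit.append)
hub.publish("orders", {"order_id": 42, "status": "shipped"})
hub.publish("payments", {"order_id": 42})  # no subscribers: dropped
```

Because neither side knows about the other, producers and consumers can be added, removed, or scaled independently.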

Event Brokers

Event brokers are the infrastructure components responsible for managing the distribution of events. They handle the routing, storage, and delivery of events to the appropriate consumers. Popular event brokers include Apache Kafka and Apache Pulsar.
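One routing responsibility of Kafka-style brokers is spreading a topic across partitions. A hedged sketch of key-based routing (the class is illustrative; real brokers use stable hash functions and persisted partitions):

```python
class PartitionedTopic:
    """Sketch of key-based routing across a topic's partitions.

    Events with the same key always land in the same partition,
    which preserves per-key ordering.
    """
    def __init__(self, num_partitions: int = 3):
        self.partitions = [[] for _ in range(num_partitions)]

    def route(self, key: str, event) -> int:
        index = hash(key) % len(self.partitions)  # deterministic within a run
        self.partitions[index].append(event)
        return index

topic = PartitionedTopic()
p1 = topic.route("customer-7", {"action": "add_to_cart"})
p2 = topic.route("customer-7", {"action": "checkout"})
```

Partitioning is also what lets a broker parallelize a topic across machines while still guaranteeing order per key.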

Event Processing

Event processing involves consuming and acting on events in real time. Consumers can transform, aggregate, and filter events, or trigger actions based on event content.
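These operations compose naturally as a pipeline over a stream; a small sketch using generator stages (the temperature scenario and function names are illustrative):

```python
def to_fahrenheit(events):
    # transformation: enrich each event with a derived field
    for e in events:
        yield {**e, "fahrenheit": e["celsius"] * 9 / 5 + 32}

def over_threshold(events, limit=75.0):
    # filtering: keep only events of interest
    return (e for e in events if e["fahrenheit"] > limit)

readings = [{"celsius": c} for c in (20.0, 30.0, 25.0)]
hot = list(over_threshold(to_fahrenheit(readings)))
count = len(hot)  # aggregation: how many hot readings so far
```

Because each stage is lazy, the same pipeline shape works whether the source is a finite list or an unbounded stream.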

Fault Tolerance and Scalability

Event streaming systems are designed to be fault-tolerant and highly scalable. They typically use distributed architectures, replicating data across nodes and partitioning work, so that they remain available through failures and can absorb large workloads.
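On the consumer side, fault tolerance often comes from committing offsets only after successful processing, giving at-least-once delivery. A hedged sketch of that pattern (function names and the retry policy are illustrative):

```python
def consume_with_retry(events, handler, max_attempts=3):
    """At-least-once processing sketch: the committed offset advances only
    after a successful handle, so a crash between handle and commit would
    cause redelivery. Handlers should therefore be idempotent."""
    committed = 0
    for offset, event in enumerate(events):
        for attempt in range(max_attempts):
            try:
                handler(event)
                committed = offset + 1  # commit: safe to resume past here
                break
            except Exception:
                if attempt == max_attempts - 1:
                    return committed    # give up; resume later from here
    return committed

seen = []
failures = {"b": 1}  # make event "b" fail once, then succeed

def flaky_handler(event):
    if failures.get(event, 0) > 0:
        failures[event] -= 1
        raise RuntimeError("transient failure")
    seen.append(event)

done = consume_with_retry(["a", "b", "c"], flaky_handler)
```

Retrying transient failures while tracking a committed offset is what lets a restarted consumer pick up where it left off without losing events.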

