In more sophisticated scenarios, event-driven architectures can help to cope with complexity and accelerate data processing. The idea is that a central event hub serves as a router between various event providers and event consumers. Events are stored in message queues (topics) that can be subscribed to and read by multiple consumers asynchronously. The major benefits of such an approach are:
It is easy to integrate new event providers and consumers. Moreover, a topic can be subscribed to by arbitrary consumers without interfering with other systems.
The asynchronous nature of an event-driven architecture makes it very fault-tolerant, as a failure in one node does not bring down the system as a whole. Moreover, event logs such as Apache Kafka persist the message queues, which shifts the persistence responsibility from the consumer to the event log and enables new use cases.
Modern event logs such as Apache Kafka or Azure Event Hubs support distributed operation and are therefore easily scalable to process large data volumes.
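The interplay of these benefits can be illustrated with a minimal, in-memory sketch in Python. This is not Kafka's actual API; the class and method names (`EventLog`, `publish`, `poll`) are hypothetical, but the sketch captures the core idea of a persisted, append-only topic that decouples producers from consumers, each of which reads at its own offset:

```python
from collections import defaultdict

class EventLog:
    """Minimal in-memory event log: each topic is an append-only list,
    and every consumer tracks its own read offset per topic."""

    def __init__(self):
        self._topics = defaultdict(list)   # topic name -> list of events
        self._offsets = defaultdict(int)   # (consumer, topic) -> next index to read

    def publish(self, topic, event):
        # Producers only append to the log; they know nothing about consumers.
        self._topics[topic].append(event)

    def poll(self, consumer, topic):
        # Each consumer reads from its own offset, so consumers never
        # interfere with each other, and a late subscriber can replay
        # the full persisted history of the topic.
        offset = self._offsets[(consumer, topic)]
        events = self._topics[topic][offset:]
        self._offsets[(consumer, topic)] = len(self._topics[topic])
        return events

hub = EventLog()
hub.publish("orders", {"id": 1, "amount": 42})
hub.publish("orders", {"id": 2, "amount": 7})

print(hub.poll("billing", "orders"))    # billing sees both events
hub.publish("orders", {"id": 3, "amount": 99})
print(hub.poll("billing", "orders"))    # billing only sees the new event
print(hub.poll("analytics", "orders"))  # a late subscriber replays everything
```

Because the log, not the consumer, is responsible for persistence, a consumer that crashes and restarts simply resumes polling from its last offset; a real event log like Kafka additionally distributes topics across brokers for scalability.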
Streaming is becoming an increasingly important part of modern software systems. Depending on the use case, a streamlined pipeline or a more sophisticated event-driven architecture can provide an adequate solution. Given the manifold technologies and products, it is important to analyze the requirements carefully and build a future-proof software landscape.