
3 advantages of Event-Driven Architecture

My latest posts have put a lot of focus on cloud-native technologies. The last few have mentioned things like CloudEvents and Knative Eventing, and it got me thinking… why might people want to implement event-driven ecosystems in the first place?

I’ve decided to put together three advantages that I think make the Event-Driven Architecture (EDA) pattern a pretty attractive prospect.

True Decoupling of Producers and Consumers

The nature of an Event-Driven Architecture ecosystem lends itself to microservices, and in this type of system there is (hopefully) loose coupling between the services. Depending on how the microservices communicate, there may still be dependencies between them (e.g. an HTTP request/response approach).

In the excellent book ‘Designing Event-Driven Systems’, Ben Stopford tells us that the core mantra of event-driven services is “Centralize an immutable stream of facts. Decentralise the freedom to act, adapt and change”.

Because the ownership of data is separated by domain, there is a clean logical separation between the production and consumption of events. As a producer, I do not need to concern myself with how the events I produce are going to be consumed, and vice versa for the team consuming them: they are free to figure out for themselves what to do with the events, without being instructed. The message structure is also not important; it can be JSON, XML, Avro or anything else.
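To make that concrete, here is a minimal sketch in Python of a producer emitting a self-describing fact. The envelope roughly follows the CloudEvents shape mentioned earlier, but the event type, source and payload fields are made up for illustration:

```python
import json
import uuid
from datetime import datetime, timezone

# A CloudEvents-style envelope; the event type, source and payload
# below are illustrative, not from any particular system.
event = {
    "specversion": "1.0",
    "id": str(uuid.uuid4()),
    "source": "/orders-service",
    "type": "com.example.order.created",
    "time": datetime.now(timezone.utc).isoformat(),
    "datacontenttype": "application/json",
    "data": {"orderId": "1234", "amount": 42.50},
}

# The producer only records the fact; it has no knowledge of who
# will consume it or what they will do with it.
print(json.dumps(event))
```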

The broker, together with some kind of trigger between it and the services, enables messages to be ingested into the event-driven ecosystem and then broadcast to whichever services are interested in receiving them.
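As a rough sketch of that broker-and-trigger idea (an in-memory toy, not Knative Eventing itself), producers and consumers only ever know about the broker, and each consumer registers the event types it cares about:

```python
from collections import defaultdict
from typing import Callable, Dict, List

class Broker:
    """Toy in-memory broker: services register interest by event type
    (a stand-in for a trigger) and receive matching events."""

    def __init__(self) -> None:
        self._subscribers: Dict[str, List[Callable[[dict], None]]] = defaultdict(list)

    def subscribe(self, event_type: str, handler: Callable[[dict], None]) -> None:
        self._subscribers[event_type].append(handler)

    def publish(self, event: dict) -> None:
        # Broadcast to every service interested in this event type.
        for handler in self._subscribers[event["type"]]:
            handler(event)

broker = Broker()
broker.subscribe("com.example.order.created",
                 lambda e: print("billing saw", e["data"]))
broker.subscribe("com.example.order.created",
                 lambda e: print("shipping saw", e["data"]))
broker.publish({"type": "com.example.order.created",
                "data": {"orderId": "1234"}})
```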

Business narrative of what has happened that can’t be changed

We have all heard the term ‘single source of truth’, and it is usually just a myth (like the treasure chest hidden at the end of the rainbow). Well, in an event-driven ecosystem it really exists!

As mentioned above, an event stream should be an immutable stream of facts. This is very representative of how our daily lives unfold: as a series of events. These events happened, and it is not possible to go back and change them unless you own a time machine (and remember, terrible things can happen to those who meddle with time)…

This is an advantage for business data governance, as you can always look back through the log for auditing or to see what happened.

It is becoming more and more common for companies to need to explain their ‘data-derived’ decisions, e.g. why a customer’s application for finance or insurance has been rejected. The log of immutable events that EDA gives us can be a key component of this auditing.
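Here is a toy sketch of that idea: an append-only log, where ‘auditing’ a decision is just replaying the facts recorded for that customer. The event types and details are invented for illustration:

```python
from dataclasses import dataclass, field
from typing import List

@dataclass(frozen=True)
class Event:
    """An immutable fact: once recorded it is never updated or deleted."""
    type: str
    subject: str
    detail: str

@dataclass
class EventLog:
    """Append-only log; auditing means replaying the recorded facts."""
    _events: List[Event] = field(default_factory=list)

    def append(self, event: Event) -> None:
        self._events.append(event)

    def audit(self, subject: str) -> List[Event]:
        # Reconstruct the narrative for one customer/application.
        return [e for e in self._events if e.subject == subject]

log = EventLog()
log.append(Event("application.received", "customer-42", "loan application submitted"))
log.append(Event("credit.check.failed", "customer-42", "score below threshold"))
log.append(Event("application.rejected", "customer-42", "rejected due to credit check"))

for e in log.audit("customer-42"):
    print(e.type, "-", e.detail)
```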

Real-time event streams for Data Science

One of the reasons I am enthusiastic about EDA is that it is particularly well suited to in-stream processing. It lends itself to fast decision making: situations where milliseconds count.

Business logic can be applied while the data is in motion, rather than waiting for the data to land somewhere and then doing the analysis. This is good for things like fraud detection and predictive analytics; oftentimes, we need to know whether a transaction is fraudulent before it completes.
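For example, a stream processor can run a rule over each transaction as it flows past, before the transaction completes. The rule below is deliberately simplistic and made up for illustration:

```python
from typing import Iterable, Iterator

def flag_suspicious(transactions: Iterable[dict], limit: float = 1000.0) -> Iterator[dict]:
    """Apply business logic to each transaction as it streams past,
    rather than after it has landed in a database."""
    for txn in transactions:
        txn["suspicious"] = txn["amount"] > limit  # placeholder rule
        yield txn

stream = [
    {"id": "t1", "amount": 25.00},
    {"id": "t2", "amount": 5400.00},
]

for txn in flag_suspicious(stream):
    if txn["suspicious"]:
        print(f"hold transaction {txn['id']} for review")
```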

Further Reading

There are many reasons you might want to use eventing as the backbone of your system, and if you want to find out more about Event-Driven Architecture then I recommend the following resources as a start:

  • Designing Event-Driven Systems by Ben Stopford
  • Building Event-Driven Microservices by Adam Bellemare (pre-release)
  • Cloud Native Patterns by Cornelia Davis
  • I wrote a longer follow-up article on this topic here on IBM Developer.
