My Event Sourcing journey so far

2021, Dec 23    

Almost a year has passed since my last article on Event Sourcing. I left the SuperSafeBank repository untouched for a while; I definitely needed some fresh air. In the meantime, I noticed with pleasure that there was some interest: it got forked and received a decent number of stars.

In March I’ll be giving a talk about Event Sourcing at the Worldwide Software Architecture Summit ‘22, so I decided it was the right time to do some house cleaning.

Now, like every other time I went through some “old-ish” codebase of mine, I kept having those recurring WTF moments.

WTF per minute

“Did I really write that?”. In total honesty, I find this to be a really good sign. It means that over this time I’ve grown as a developer, learned new techniques, acquired new knowledge.

It would be a personal defeat realising that I’ve learned nothing new in a whole year.

So, what did I change? Well, a few bits here and there. Let’s start with the easiest one: I ported everything to .NET 6. From the code perspective, this resulted in a few changes, from record types to the new console app template.

After that, I also decided to stop reusing Domain Events as Integration Events, and introduced proper classes for the latter. Why? Well, I gave it a try after a conversation on LinkedIn and some googling. The resulting design got definitely cleaner, and it also gives me an easier way to manually pick which events I want to be broadcast to subscribers.

The previous implementation, instead, was publishing every Domain Event directly, right after persisting it to the Event Store.

Now, I don’t see this necessarily as a big issue. Publishing Integration Events in a reliable way also means introducing a retry mechanism (Polly, anyone?) and/or an Outbox. Publishing Domain Events, instead, leverages the fact that those events are by definition already persisted in our Event Store, which makes things somewhat simpler.
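To make the idea a bit more concrete, here’s a minimal sketch of the separation. The Integration Event class and its mapping below are illustrative placeholders, not necessarily the exact types from the repository:

// Domain Event: raised by the aggregate and persisted on the Event Store
public record CustomerCreated(Guid CustomerId);

// Integration Event: a separate, intentionally smaller contract exposed to subscribers
public record CustomerCreatedIntegrationEvent(Guid CustomerId)
{
    // mapping happens explicitly, so we can pick exactly what gets broadcast
    public static CustomerCreatedIntegrationEvent From(CustomerCreated @event)
        => new(@event.CustomerId);
}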

Next one: the Event Consumers. I decided to split both the On-Premise and the Azure projects into two parts each:

  • one API to handle Queries and Commands
  • one Worker to process incoming Events (e.g. for refreshing Materialized Views), bootstrapped roughly as sketched below
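For example, the Worker bootstrap with the .NET 6 console template could look more or less like this. It’s a sketch: EventsConsumerWorker and the exact registrations are placeholders, not the real ones from the repository:

using MediatR;
using Microsoft.Extensions.DependencyInjection;
using Microsoft.Extensions.Hosting;

var host = Host.CreateDefaultBuilder(args)
    .ConfigureServices(services =>
    {
        // scan the assembly containing our event handlers (e.g. CustomerDetailsHandler)
        services.AddMediatR(typeof(CustomerDetailsHandler));

        // placeholder: the background service pulling events off the bus
        services.AddHostedService<EventsConsumerWorker>();
    })
    .Build();

await host.RunAsync();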

Now, let’s have a look at one of these Workers, for example the one responsible for rebuilding the Customer Details view. This is what our view looks like:

public record CustomerAccountDetails(Guid Id, Money Balance);
public record CustomerDetails(Guid Id, string Firstname, string Lastname, string Email, CustomerAccountDetails[] Accounts, Money TotalBalance);

As you can see, it exposes some basic customer details and the list of all their Accounts.
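The handler below uses a small CustomerAccountDetails.Map() factory that isn’t shown in the snippet; assuming the Account aggregate exposes Id and Balance, it could boil down to a static factory on the record itself:

public record CustomerAccountDetails(Guid Id, Money Balance)
{
    // assumed shape: projects an Account aggregate onto its read-model representation
    public static CustomerAccountDetails Map(Account account)
        => new(account.Id, account.Balance);
}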

Here’s our handler (a redacted version of it):

public class CustomerDetailsHandler : INotificationHandler<CustomerCreated>
{
    // constructor and injected dependencies (_logger, _customersRepo, _accountsRepo, _currencyConverter, _db) omitted for brevity
    public async Task Handle(CustomerCreated @event, CancellationToken cancellationToken)
    {
        _logger.LogInformation("creating customer details for customer {CustomerId} ...", @event.CustomerId);

        var customerView = await BuildCustomerViewAsync(@event.CustomerId, cancellationToken);
        await SaveCustomerViewAsync(customerView, cancellationToken);
    }

    private async Task<CustomerDetails> BuildCustomerViewAsync(Guid customerId, CancellationToken cancellationToken)
    {
        var customer = await _customersRepo.RehydrateAsync(customerId, cancellationToken);

        var totalBalance = Money.Zero(Currency.CanadianDollar);
        var accounts = new CustomerAccountDetails[customer.Accounts.Count];
        int index = 0;
        foreach (var id in customer.Accounts)
        {
            var account = await _accountsRepo.RehydrateAsync(id, cancellationToken);
            accounts[index++] = CustomerAccountDetails.Map(account);

            totalBalance = totalBalance.Add(account.Balance, _currencyConverter);
        }

        var customerView = new CustomerDetails(customer.Id, customer.Firstname, customer.Lastname, customer.Email.Value, accounts, totalBalance);
        return customerView;
    }

    private async Task SaveCustomerViewAsync(CustomerDetails customerView, CancellationToken cancellationToken)
    {
        var filter = Builders<CustomerDetails>.Filter
                        .Eq(a => a.Id, customerView.Id);

        var update = Builders<CustomerDetails>.Update
            .Set(a => a.Id, customerView.Id)
            .Set(a => a.Firstname, customerView.Firstname)
            .Set(a => a.Lastname, customerView.Lastname)
            .Set(a => a.Email, customerView.Email)
            .Set(a => a.Accounts, customerView.Accounts)
            .Set(a => a.TotalBalance, customerView.TotalBalance);

        await _db.CustomersDetails.UpdateOneAsync(filter,
            cancellationToken: cancellationToken,
            update: update,
            options: new UpdateOptions() { IsUpsert = true });

        _logger.LogInformation("updated customer details for customer {CustomerId}", customerView.Id);
    }
}

When we receive the CustomerCreated event, we start by rehydrating the Customer model from our Event Store. Then we loop over all of the customer’s accounts, map them to CustomerAccountDetails and, at the same time, compute the customer’s total balance.

Now that we have a fully populated instance of CustomerDetails, we can proceed with the upsert operation. This example uses MongoDB as backing storage, which exposes a very convenient API for this.

Some of you might have noticed that I’m using a Money class to represent, well, money. I’m doing the same for Emails as well. This is done to avoid Primitive Obsession (thanks, Vladimir).
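For context, this is roughly the shape such a value object can take. It’s a simplified sketch; in particular, the ICurrencyConverter signature is assumed here, not lifted from the repository:

public record Money(decimal Value, Currency Currency)
{
    public static Money Zero(Currency currency) => new(0m, currency);

    // converts 'other' to this instance's currency before summing,
    // so we never add amounts expressed in different currencies
    public Money Add(Money other, ICurrencyConverter converter)
    {
        var converted = converter.Convert(other, this.Currency);
        return new Money(this.Value + converted.Value, this.Currency);
    }
}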

That’s it for today. Event Sourcing is a complex pattern and should be handled with care. The examples provided here are simple, and this code is definitely not battle-tested for production. Still, they should provide enough guidance and food for thought.

Ad maiora!
