Consuming message queues using .net core background workers – part 2: background workers

2019, Jul 04    

In the previous article of this series we talked a bit about Message Queues. This time I'll be introducing Background Workers.

Just to recap, Message Queues can be used to handle asynchronous communication between services, improving resiliency and scalability.

Now, suppose you have an API for handling blog posts and tags. Every post can be assigned to one or more tags. Let’s say that you’re using MongoDB with a single “Posts” collection. Something simple, like this:

    {
        title: ...,
        description: ...,
        creationDate: ...,
        tags: ["lorem", "ipsum", "dolor"]
    }

It works fine, and your API can handle a huge number of requests. All is good, everybody is happy.

One day you’re asked to add a “tag-cloud” functionality: the API has to expose a new endpoint that returns a list of all the tags plus the post count for each. Something like this:

    [
        {tag: "lorem", posts_count: 42},
        {tag: "ipsum", posts_count: 13},
        {tag: "dolor", posts_count: 71}
    ]

Again, nothing fancy. Now the question is: how do you collect the data?

One option would be to update the counts with an upsert operation every time a blog post is added or updated.
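A minimal sketch of this approach, assuming the official MongoDB.Driver package; the `TagCloudItem` class, the `tagCloud` collection name, and the `OnPostSavedAsync` hook are all made up for this example:

```csharp
using System.Collections.Generic;
using System.Threading.Tasks;
using MongoDB.Driver;

// One document per tag; by convention MongoDB.Driver maps "Id" to "_id",
// so the tag itself acts as the document key.
public class TagCloudItem
{
    public string Id { get; set; }
    public int PostsCount { get; set; }
}

public class TagCloudUpdater
{
    private readonly IMongoCollection<TagCloudItem> _tagCloud;

    public TagCloudUpdater(IMongoDatabase db)
        => _tagCloud = db.GetCollection<TagCloudItem>("tagCloud");

    // Called whenever a post is created: upsert a counter for each tag.
    public async Task OnPostSavedAsync(IEnumerable<string> tags)
    {
        foreach (var tag in tags)
        {
            await _tagCloud.UpdateOneAsync(
                Builders<TagCloudItem>.Filter.Eq(t => t.Id, tag),
                Builders<TagCloudItem>.Update.Inc(t => t.PostsCount, 1),
                new UpdateOptions { IsUpsert = true });
        }
    }
}
```

Note that this runs one write per tag inside the request: every extra round-trip is an extra chance for the request to slow down or fail halfway through.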

This works, but it’s not exactly scalable: the whole operation could take time, or fail halfway through, leaving you with an inconsistent state.

Yes, you could add a Circuit Breaker, but that alone won’t be enough.

Another option would be to use a Background Worker! In a nutshell, the application will spin up a new thread and execute whatever operation it has to.

Of course this thread runs outside the HTTP request/response cycle, so you won’t have access to anything like the logged-in user, cookies, and so on.
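In .NET Core this is usually done with a hosted service. A bare-bones sketch, assuming the Microsoft.Extensions.Hosting package (available since .NET Core 2.1); the worker name and the 6-hour interval are just placeholders:

```csharp
using System;
using System.Threading;
using System.Threading.Tasks;
using Microsoft.Extensions.Hosting;

// BackgroundService runs ExecuteAsync on its own task when the host
// starts, completely outside any HTTP request.
public class TagCloudWorker : BackgroundService
{
    protected override async Task ExecuteAsync(CancellationToken stoppingToken)
    {
        while (!stoppingToken.IsCancellationRequested)
        {
            // do the actual work here...

            // wait before the next run; the token cancels the delay
            // when the application shuts down.
            await Task.Delay(TimeSpan.FromHours(6), stoppingToken);
        }
    }
}

// Registered in Startup.ConfigureServices:
// services.AddHostedService<TagCloudWorker>();
```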

Going back to our small example: at application bootstrap we could spin up a Background Worker that recreates the tag cloud from scratch at regular intervals (say every 6 hours or so, depending on how frequently new posts are published). Since we’re on MongoDB it could use map/reduce or the aggregation pipeline; the details don’t matter.
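For the aggregation-pipeline variant, the rebuild boils down to a `$unwind` on the tags array followed by a `$group` that counts posts per tag. A sketch, again assuming MongoDB.Driver; the collection handle is passed in for simplicity:

```csharp
using System.Collections.Generic;
using System.Threading.Tasks;
using MongoDB.Bson;
using MongoDB.Driver;

public static class TagCloudRebuilder
{
    // Recomputes {_id: tag, posts_count: N} documents from scratch
    // by scanning the whole Posts collection.
    public static async Task<List<BsonDocument>> RebuildAsync(
        IMongoCollection<BsonDocument> posts)
    {
        var pipeline = new[]
        {
            // one output document per (post, tag) pair
            new BsonDocument("$unwind", "$tags"),
            // then count the pairs per tag
            new BsonDocument("$group", new BsonDocument
            {
                { "_id", "$tags" },
                { "posts_count", new BsonDocument("$sum", 1) }
            })
        };

        return await posts.Aggregate<BsonDocument>(pipeline).ToListAsync();
    }
}
```

Because the worker rebuilds everything from the source of truth, a failed run is harmless: the next run produces a consistent result anyway.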

A Background Worker can also be used to consume messages published on a queue. For example we could react to an “Order fulfilled” event and have the worker send notification emails to the customer.

Or we could have a Background Worker consume a “Blog post updated” event and refresh a de-normalized version of the Post in the Queries db, which will eventually be used by our Query engine in a CQRS architecture.
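To give an idea of the shape this takes, here is a sketch of a worker consuming from a RabbitMQ queue, using the RabbitMQ.Client package (5.x-era API). The queue name, the message format, and the refresh step are assumptions for the example:

```csharp
using System.Text;
using System.Threading;
using System.Threading.Tasks;
using Microsoft.Extensions.Hosting;
using RabbitMQ.Client;
using RabbitMQ.Client.Events;

public class PostUpdatedConsumer : BackgroundService
{
    private readonly IConnection _connection;

    public PostUpdatedConsumer(IConnection connection)
        => _connection = connection;

    protected override Task ExecuteAsync(CancellationToken stoppingToken)
    {
        var channel = _connection.CreateModel();
        channel.QueueDeclare("post-updated", durable: true,
                             exclusive: false, autoDelete: false,
                             arguments: null);

        var consumer = new EventingBasicConsumer(channel);
        consumer.Received += (sender, args) =>
        {
            var json = Encoding.UTF8.GetString(args.Body);
            // ...refresh the de-normalized Post in the Queries db...

            // ack only after the refresh, so a crash re-delivers the message
            channel.BasicAck(args.DeliveryTag, multiple: false);
        };

        channel.BasicConsume("post-updated", autoAck: false, consumer);
        return Task.CompletedTask;
    }
}
```

The consumer callback is driven by the client library, so the worker’s `ExecuteAsync` only has to set up the subscription and return.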

That’s all for today. Next time I’m going to show some code so stay tuned!
