There and back again: from monolith to microservices and back


Now, this is a story all about how, Makro got flipped turned upside down and I’d like to take a minute, just sit right there. I’ll tell you how I became the user of C# for good I swear!

Yeah. That right there was so cringeworthy, I know. I just tried to come up with something fresh (pun intended) and fun. To be honest, sometimes it feels like I spend too much time trying to find a nice pun or suitable memes to keep these posts fun-ish, especially when I know that the post is going to be a rather long one like this. But I digress.

As I have written a few times before, Makro’s backend was made with Node.js. At first it was a monolith, but then I slowly made my way towards a microservices architecture by separating functionality into its own, well, services.

While I do like how fast Node.js lets you spin up some kind of backend, I have slowly started to like it less. Don’t get me wrong; I don’t dislike it per se, but I enjoy other technologies more. That is one of the reasons that led me to rewrite the backend. The other reason started out as a joke but ended up affecting the decision quite a lot.

A bit over a month ago, my wife was checking out some summer courses for school. After a while, she stumbled upon programming courses and started reading them aloud, since she is well aware of what I do for a living (and as a hobby). So she went down the list: “Ruby, Java, Python, C” etc. Suddenly she stopped, her eyes lit up, and she said (quite enthusiastically, might I add), “Look, they have a C hashtag programming class”. I was confused for a second, and then it hit me: she was talking about C#, aka C Sharp. I burst into laughter. For me, C# has of course always been C Sharp, but naturally she wouldn’t know what it is called and would just assume that the #-sign stands for hashtag, since that is what it is used for in her world, so to speak. Since then, we have had a nice little inside joke about that misunderstanding, and since she got so excited about the now-infamous C hashtag programming language, I had to start learning it.

C hashtag!

So I started rewriting the backend. Even though C# is a lot like Java, I had a rough start. But after a while I got into it and started really enjoying the work, even though I was just rewriting all the stuff I had done previously in a different language. Like all things in life, it wasn’t all sunshine and rainbows, though. Previously, Makro’s database of choice was MongoDB, but I wanted to change that too, and that posed some challenges, especially since I had some technical debt carrying over from the first iteration of Makro.

First, I had to design a relational model for the new database so I would know what the data should look like, and create the corresponding entities in C#. To create the database, handle migrations, and access the data, I am using Entity Framework Core. Then, to get all the data out of Mongo and into the new database of choice, PostgreSQL, I had to write a script that fetched the data and formatted it correctly so that I could load it into the new database in a usable form. For that I used Node.js, since writing the script was a lot faster for me with it than with anything else (C#, for example). All in all, it took me quite a while to get the script working properly and the data into PostgreSQL. Since the initial migration, I have had to repeat the process multiple times because I have made some minor changes to the database.
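To give an idea of what those entities look like, here is a minimal sketch of the kind of plain C# classes Entity Framework Core maps to tables. The names (Meal, FoodEntry) are illustrative, not Makro’s actual schema; with EF Core, the collection and reference properties below become foreign-key relationships by convention.

```csharp
using System;
using System.Collections.Generic;

// Hypothetical entity classes for a relational model. EF Core treats a
// property named "Id" as the primary key and infers the one-to-many
// relationship from the navigation properties.
public class Meal
{
    public int Id { get; set; }                 // primary key by convention
    public string Name { get; set; }
    public DateTime AddedAt { get; set; }
    public List<FoodEntry> Entries { get; set; } = new List<FoodEntry>();
}

public class FoodEntry
{
    public int Id { get; set; }
    public string FoodName { get; set; }
    public double Amount { get; set; }
    public int MealId { get; set; }             // foreign key back to Meal
    public Meal Meal { get; set; }              // navigation property
}
```

With a `DbContext` exposing `DbSet<Meal>` and `DbSet<FoodEntry>`, the schema and its migrations would then be generated with the `dotnet ef migrations` tooling.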

I do have to admit that there have been some struggles with using a relational database after coming from NoSQL. Sometimes everything feels just so much harder and unnecessary. It has had its moments, though. Now it is a lot easier to get all the data needed with just one request than it was before (and without callback hell), and, at least locally, ASP.NET Core performs really well. It has actually been faster than Node.js, which came as a bit of a surprise to me. Not that I care that much about the speed, though, since Makro is fast enough as it is.

As I have been rewriting everything, I have to admit that I have come to really, really like C#. To me, it feels a bit better than Java (I can’t really say why exactly) and a lot better than JavaScript. It is not as fast to write the code initially compared to Node.js and MongoDB (at least not yet), where one could just get the data and spit it out to the frontend. Now I usually have to format the data a bit before sending it. That usually means converting entities to DTOs (data transfer objects), but I found a nice tool for that called AutoMapper.
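The entity-to-DTO conversion looks roughly like this hand-written sketch (the UserEntity/UserDto names are made up for illustration). The point is the shape of the mapping: only the fields the frontend needs get copied over. AutoMapper replaces this kind of boilerplate with convention-based configuration, e.g. `CreateMap<UserEntity, UserDto>()` and `mapper.Map<UserDto>(entity)`.

```csharp
using System.Collections.Generic;

// Hypothetical entity as stored in the database.
public class UserEntity
{
    public int Id { get; set; }
    public string Username { get; set; }
    public string PasswordHash { get; set; }   // must never be sent to the frontend
    public List<string> Roles { get; set; } = new List<string>();
}

// The trimmed-down shape that actually goes over the wire.
public class UserDto
{
    public int Id { get; set; }
    public string Username { get; set; }
}

public static class UserMapper
{
    // Manual mapping; with AutoMapper this method disappears and the
    // mapping is generated from a CreateMap<UserEntity, UserDto>() profile.
    public static UserDto ToDto(UserEntity e) =>
        new UserDto { Id = e.Id, Username = e.Username };
}
```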

The new backend feels that much more robust, and more like a real backend compared to the old one that consisted of a few microservices. I was eager to deploy it to production and see how it performed there. On that note, I was quite surprised when I tried running it in a Docker container locally. I was expecting it to be the same kind of memory hog as the JVM, but it seemed to eat up only around 50 megabytes: about the same as each of my Node microservices. That looked very promising. After deploying, however, it seems to take about 250 megabytes of RAM, which is interesting, to say the least. A quick Google search suggested that docker stats vastly underestimates memory usage on macOS compared to Linux.

The backend is not the only thing that has changed, though. I have had to rewrite a lot of the frontend as well for it to work with the new APIs. Take the Q&A as an example. Before, I had to do multiple HTTP requests to get all the data needed, but now one is enough, since I can easily get all the questions and include their answers and comments in the same query, and add the likes before returning the data. Mucho bueno, as Pablo Escobar would say (at least according to Narcos).
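The "one request instead of many" idea can be sketched with plain in-memory collections and LINQ. In the real backend the same nested shape would come from a single EF Core query using `.Include(q => q.Answers).ThenInclude(a => a.Comments)`; the record names and sample data below are made up for the example.

```csharp
using System.Collections.Generic;
using System.Linq;

// Flat inputs, as they would sit in separate tables.
public record Comment(int AnswerId, string Text);
public record Answer(int Id, int QuestionId, string Text);
public record Question(int Id, string Title);

// Nested output: the whole Q&A payload the frontend gets in one response.
public record AnswerDto(int Id, string Text, List<string> Comments);
public record QuestionDto(int Id, string Title, int Likes, List<AnswerDto> Answers);

public static class QaComposer
{
    // Builds questions -> answers -> comments, plus like counts, in one pass.
    public static List<QuestionDto> Compose(
        List<Question> questions, List<Answer> answers,
        List<Comment> comments, Dictionary<int, int> likesByQuestion) =>
        questions.Select(q => new QuestionDto(
            q.Id,
            q.Title,
            likesByQuestion.TryGetValue(q.Id, out var n) ? n : 0,
            answers.Where(a => a.QuestionId == q.Id)
                   .Select(a => new AnswerDto(a.Id, a.Text,
                       comments.Where(c => c.AnswerId == a.Id)
                               .Select(c => c.Text).ToList()))
                   .ToList()))
        .ToList();
}
```

The frontend then needs a single fetch and a single render pass over the nested result, instead of stitching together the responses of several requests.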

After I was about 99% done with everything, I decided to scrap a lot of features. Makro had become slightly bloated with (unused) features, so I made a harsh decision and removed them altogether. The features that got removed were articles, the Q&A section, and the most recent addition, the help dev section.

In the end, it has been a challenging but enjoyable journey. Most of the challenges during this operation came from migrating the data from MongoDB to PostgreSQL, but through trial and error I did manage to get it working properly. Now that it is done, Makro will (hopefully) be that much easier to maintain and develop further. If nothing else, at least the data is much more structured now, thanks to the relational database.
