The new MooCow architecture

So in April 2019, what is this website running on now?

In the last post, I mentioned that I previously had a Virtual Private Server (VPS) through a New Zealand web hosting company to power my website, citing my need for a Java-based web host and the ability to tinker. Everything ran on that VPS: the application environment, the database, the large image server, etc. This cost me around NZ$27.50/month (the equivalent plan on OpenHost is much cheaper now - I was on a legacy plan that sat somewhere between their current offerings).

With the move to various cloud services, I unfortunately lose the ability to keep everything in the country (no more supporting my local businesses 😢), but I've managed to remove the server management work that I didn't enjoy while saving myself some money:

Java environment: Heroku

Outside of work, I write most of my code in Groovy, a language that runs on the Java Virtual Machine and so lets me take advantage of the large Java ecosystem. I'm still using Thymeleaf, and since most people come to Thymeleaf via the widely popular Spring framework, I'm also running Spring Boot. The Spring Boot app is all built, bundled, and deployed on Heroku's Java containers, which I can get by with on their free tier.
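
To make that a bit more concrete, here's a minimal sketch of the kind of Groovy + Spring Boot + Thymeleaf app described above - it's not this site's actual code, and the class, template, and file names are made up:

```groovy
// Minimal sketch (not this site's actual code) of a Spring Boot app written in
// Groovy that renders a Thymeleaf template (assumes spring-boot-starter-thymeleaf
// is on the classpath). Heroku's Java buildpack then runs the packaged jar via a
// Procfile entry, something like: web: java -jar build/libs/app.jar
import org.springframework.boot.SpringApplication
import org.springframework.boot.autoconfigure.SpringBootApplication
import org.springframework.stereotype.Controller
import org.springframework.ui.Model
import org.springframework.web.bind.annotation.GetMapping

@SpringBootApplication
@Controller
class BlogApplication {

    // Renders src/main/resources/templates/home.html via Thymeleaf
    @GetMapping('/')
    String home(Model model) {
        model.addAttribute('title', 'My blog')
        'home'   // view name resolved by the Thymeleaf view resolver
    }

    static void main(String[] args) {
        SpringApplication.run(BlogApplication, args)
    }
}
```

The caveat of the free tier is that the environment goes to sleep after 30 minutes of inactivity, but I've managed to get around that with...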

Site status checker: Uptime Robot

When I was a more prolific blogger, my mum was a regular visitor to this website, kindly e-mailing me whenever it was down (thanks mum!). With Uptime Robot, I've set up a monitor that pings this website every 5 minutes and notifies me if it goes down, which also does double duty of keeping the site active enough that Heroku doesn't put it to sleep!
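
The monitor doesn't need anything fancy to hit, either - a tiny health endpoint along these lines (the path and class name are hypothetical, not necessarily what this site exposes) is enough to confirm the app is alive and to generate the traffic that stops Heroku idling it:

```groovy
// Hypothetical health-check endpoint for an uptime monitor to ping.
// With @RestController, Spring Boot serializes the returned map to JSON.
import org.springframework.web.bind.annotation.GetMapping
import org.springframework.web.bind.annotation.RestController

@RestController
class HealthController {

    @GetMapping('/health')
    Map<String, String> health() {
        [status: 'UP']   // returned as {"status":"UP"}
    }
}
```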

Database: MongoDB Atlas

I think this particular choice of tech was one of those hype-bandwagon ones - my workmates at the time were using it as the data store in a lot of prototypes, so I thought I'd give it a try myself. Luckily for me and this website, I really liked what I saw and have stuck with it ever since. I'm also glad to find that you can run a MongoDB cluster for free for certain combinations of cloud provider and region.
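
For a rough idea of what this looks like from the application side, here's a sketch of a Spring Data MongoDB document and repository (the names are made up, not this site's actual model); pointing it at Atlas is mostly a matter of setting spring.data.mongodb.uri to the cluster's mongodb+srv:// connection string:

```groovy
// Illustrative Spring Data MongoDB mapping - a blog post document plus a
// repository interface. The collection, class, and method names are examples only.
import org.springframework.data.annotation.Id
import org.springframework.data.mongodb.core.mapping.Document
import org.springframework.data.mongodb.repository.MongoRepository

@Document('posts')
class Post {
    @Id
    String id
    String title
    String body
    Date published
}

interface PostRepository extends MongoRepository<Post, String> {
    // Derived query: Spring Data builds the Mongo query from the method name
    List<Post> findByPublishedAfter(Date date)
}
```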

Large image hosting: AWS S3

Well, there had to be at least 1 big-name cloud provider in all of this, and AWS' S3 service became my go-to for the larger images that I delegate to their own static file server. AWS' free tier for S3 is mighty generous: 20,000 GET requests and 2,000 PUT/COPY/POST/LIST requests per month. I have hit the PUT limit a few times while I'm still messing around with the website, and if I keep doing so I might actually have to put down some money so I don't have to wait for the month to roll over!
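
For illustration, uploading an image with the AWS SDK for Java (v1) looks roughly like this - the bucket, key, region, and file path are placeholders, and each putObject call is one of those precious free-tier PUT requests:

```groovy
// Sketch of pushing a large image up to S3 with the AWS SDK for Java v1.
// Credentials come from the default provider chain (environment variables,
// ~/.aws/credentials, etc.); the bucket and key below are placeholders.
import com.amazonaws.services.s3.AmazonS3
import com.amazonaws.services.s3.AmazonS3ClientBuilder

AmazonS3 s3 = AmazonS3ClientBuilder.standard()
        .withRegion('ap-southeast-2')   // whichever region suits
        .build()

// One PUT request against the free-tier monthly allowance
s3.putObject('my-blog-images', 'photos/2019/harbour.jpg',
        new File('/tmp/harbour.jpg'))
```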

Real-time image optimization: Cloudinary

Serving large images, particularly to people viewing my site on metered data connections, is pretty irresponsible. I was thinking of editing the images directly and saving them in more optimized formats with higher compression, but found that this is something services like Cloudinary can do in real time! Their free plan requires a little bit of mental math: they provide 25 "credits" per month, where 1 credit roughly translates to 1,000 image transformations or 1GB of viewing bandwidth.
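
As a sketch of what "real time" means here, Cloudinary's Java SDK can build delivery URLs that ask for q_auto (automatic compression) and f_auto (automatic format, e.g. WebP) on the fly - the cloud name and image ID below are placeholders:

```groovy
// Sketch of generating an optimized Cloudinary delivery URL with its Java SDK.
// The cloud name and public ID are placeholders, not this site's real values.
import com.cloudinary.Cloudinary
import com.cloudinary.Transformation
import com.cloudinary.utils.ObjectUtils

def cloudinary = new Cloudinary(ObjectUtils.asMap('cloud_name', 'my-cloud'))

String url = cloudinary.url()
        .transformation(new Transformation().quality('auto').fetchFormat('auto'))
        .generate('blog/harbour')

// e.g. https://res.cloudinary.com/my-cloud/image/upload/f_auto,q_auto/blog/harbour
println url
```

By that credit math, the free plan works out to roughly 25,000 transformations or 25GB of viewing bandwidth a month.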

Logging: Papertrail

Reading large text files is never a fun way to spend your time, so I'm glad there are a lot of services out there now which take the bore out of interrogating your application log files. Papertrail's free plan as a Heroku add-on lets you store 10MB of logs per day, gives you 48 hours of searchable logs in their dashboard, and keeps 7 days of archives if you really need to download them.
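
Nothing special is needed on the application side for this to work: Heroku treats anything the app writes to stdout as its log stream, which the Papertrail add-on then drains, so plain SLF4J logging does the job. The class below is just an illustrative example, not code from this site:

```groovy
// Illustrative example: standard SLF4J logging, which Spring Boot sends to
// stdout by default - on Heroku that's what ends up in Papertrail.
import org.slf4j.Logger
import org.slf4j.LoggerFactory

class ImageUploadService {

    private static final Logger log = LoggerFactory.getLogger(ImageUploadService)

    void upload(String key) {
        log.info('Uploading image {} to S3', key)
        // ... upload work ...
    }
}
```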

Content Delivery Network: Cloudflare

Serving all the content needed for a person to view a web page is hella slow when it has to travel from the other side of the world. With parts of this website now hosted in all sorts of locations, I really felt how slow that was during my testing. Cloudflare helps by caching as much as it can on servers closer to the viewer and serving those items up instead of making the round trip to wherever they are originally held. Their free plan lets you put just 1 website through them (you list 1 domain, and that plus any subdomains count as the website), with no limit on the amount of data they will cache and serve on your behalf.
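
Cloudflare works out what it's allowed to cache partly from its own rules about file types and partly from the Cache-Control headers the origin sends, so a sketch like the one below (a hypothetical endpoint, not necessarily anything this site serves) lets rarely-changing responses sit at the edge instead of round-tripping back to Heroku every time:

```groovy
// Sketch of setting Cache-Control headers from Spring so downstream caches
// (like Cloudflare) can hold a response. The endpoint and content are made up.
import java.util.concurrent.TimeUnit
import org.springframework.http.CacheControl
import org.springframework.http.ResponseEntity
import org.springframework.web.bind.annotation.GetMapping
import org.springframework.web.bind.annotation.RestController

@RestController
class FeedController {

    @GetMapping('/feed.xml')
    ResponseEntity<String> feed() {
        // public, max-age=3600: safe for shared caches to keep for an hour
        ResponseEntity.ok()
                .cacheControl(CacheControl.maxAge(1, TimeUnit.HOURS).cachePublic())
                .body(buildFeedXml())
    }

    private String buildFeedXml() {
        '<rss version="2.0"><channel><title>My blog</title></channel></rss>'
    }
}
```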


This has basically been one huge ad for all of the above products 😛 So far I've managed to run everything without spending any money, all the while improving my own quality of life when it comes to maintaining this website.

Aside from the hard requirement of keeping my existing setup (a Java-based website running on MongoDB), I tried to pick things that I didn't already use at work, just so I could flex my brain a bit and see how things are run elsewhere. Sure, there are similarities to what I use at my job, but for much of the above I liked the simplicity of offerings that don't force you to view everything through some kind of "enterprise-level dashboard" aimed at much larger companies (although the AWS Console is still stupid overwhelming when I'm using it for just 1 thing).

Overall, I'm very happy with how things are running for now while I make further changes under the hood to help with my intention of returning to blogging. Those plans will be in the next post 😉