How I finally migrated my whole website to the cloud
This is going to be a tale of blood, sweat and tears, describing what I experienced over multiple years while trying to stop maintaining my own servers to host my website. This battle lasted, literally, over a decade, until I was finally able to migrate every piece of code that comprises it (not just this blog). As of this week, it runs on cloud infrastructure spread across multiple providers, gives me little to no maintenance worries, offers more reliability than I actually need, and is also pretty cheap.
The myhro.info domain was registered in April 2007. Initially I had no real intention of hosting anything on it; it was more like "oh, now I have my first domain!". In the early years, web hosting was provided by 000webhost. It was free, but it came with no availability guarantees and I faced some downtime every once in a while. This continued until, after finding some interesting offers on Low End Box, I migrated the site to its own VPS by the end of 2010. I remember the year because it was my first in the Information Systems course, around the same time I got my first part-time System Administrator job. The experience I gained maintaining my own Linux + Apache + PHP + MySQL (LAMP) server was crucial at the beginning of my professional career, and some lessons from that time are still useful to me these days.
In April 2011 this blog was started on a self-hosted WordPress installation, on the same previously mentioned server. At first I had almost no service whose availability I really had to care about; probably the only exception was the Myhro.info URL Shortener (hosted under the myhro.net domain). The problem is that, after starting a blog, I had to worry about it being online at all times, otherwise people would not be able to read what I had spent hours writing.
Maintaining your own WordPress instance is not an easy job, even for small blogs. I spent endless hours fighting comment spam and keeping the installation secure and up-to-date. It was such a hassle that in less than two years, at the beginning of 2013, it was migrated to Octopress, a static site generator in blog format. Publishing posts was now a matter of copying HTML files over rsync, but I still had to maintain an HTTP server for them. That's why this blog was moved to GitHub Pages in 2014 and to Jekyll in 2015, where it is still hosted today. I was now free from maintaining a web server for it; that became GitHub's problem. At the same time the blog was migrated to Jekyll, its HTTPS support was re-enabled using Cloudflare (something that had been lost in the GitHub Pages migration).
Migrating blog.myhro.info to GitHub Pages + Cloudflare was marvelous and I haven't worried about its maintenance ever since; not to mention that it didn't cost me a cent. Next I had to take care of the other parts of the website that required server-side scripts, like myhro.info/ip: a page that shows the visitor's IP address and user agent in a simple plain-text format. It's really handy to use from the command line with curl and, in my experience, faster than ifconfig.me. The main issue with this service was that it was written in PHP.
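To give an idea of the typical usage (the exact output format here is my own illustration, not necessarily what the page returns):

```
$ curl https://myhro.info/ip
IP: 203.0.113.42
User-Agent: curl/7.58.0
```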
I don't remember exactly when my first attempt to migrate the IP page to a cloud service happened, but it was probably between 2015 and 2016, when I tried AWS Lambda, rewriting the page in a supported language. This didn't work: to make a Lambda function available via HTTP, one has to use Amazon API Gateway, and it didn't offer the possibility of using a simple endpoint like myhro.info/ip. I think this could be achieved with Amazon CloudFront, routing a specific path to a different origin, but it seemed like too much work (and involved a bunch of different services) to achieve something that is really simple in nature. Trying to do the same with Google Cloud Functions yielded a similar experience.
After these frustrating experiences, I stopped looking for alternatives. Maybe the technology to host a few dynamic pages (in this case, only one) for a mostly static website wasn't there yet. Then, after two hopeless years, I read the announcement of Cloudflare Workers, which seemed to be exactly what I wanted: run code on a cloud service to answer specific requests. Finally, after it reached open beta and then general availability in 2018, I could truly and easily deploy small "serverless" applications tightly integrated with an already existing website. For that I just had to learn a little bit of JavaScript.
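A Worker that replaces a page like the IP one can be quite small. The following is just a minimal sketch of what such a worker might look like, not the actual code behind myhro.info/ip; the response format is my assumption:

```js
// Minimal sketch of an IP-echo Worker (not the actual myhro.info/ip code).
// Cloudflare adds the CF-Connecting-IP header with the visitor's address.
addEventListener('fetch', (event) => {
  event.respondWith(handleRequest(event.request));
});

async function handleRequest(request) {
  const ip = request.headers.get('CF-Connecting-IP');
  const userAgent = request.headers.get('User-Agent');
  // Plain-text response, mirroring what the original PHP page offered.
  return new Response(`IP: ${ip}\nUser-Agent: ${userAgent}\n`, {
    headers: { 'Content-Type': 'text/plain' },
  });
}
```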
It took me years of waiting and a few hours on a weekend to write JavaScript replacements for the PHP and Python implementations (in the end I also migrated heroku.myhro.info, a service that returns random Heroku-style names), but I had finally reached the Holy Grail. Now it was just a matter of moving the static parts of the website to Amazon S3, which is quite straightforward. S3 doesn't offer HTTPS connections for static websites hosted there, but as I already used Cloudflare, this was a no-brainer.
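A Heroku-style name generator is just as easy to express as a Worker. Here's a hypothetical sketch; the actual word lists and format used by heroku.myhro.info may differ:

```js
// Hypothetical Heroku-style name generator (adjective-noun-number);
// the real word lists behind heroku.myhro.info may differ.
const ADJECTIVES = ['autumn', 'calm', 'fierce', 'quiet', 'shiny'];
const NOUNS = ['meadow', 'river', 'sunset', 'forest', 'wave'];

function pick(list) {
  return list[Math.floor(Math.random() * list.length)];
}

function herokuName() {
  const number = Math.floor(Math.random() * 9000) + 1000;
  return `${pick(ADJECTIVES)}-${pick(NOUNS)}-${number}`;
}

addEventListener('fetch', (event) => {
  event.respondWith(
    new Response(herokuName() + '\n', {
      headers: { 'Content-Type': 'text/plain' },
    })
  );
});
```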
Cloudflare Workers aren't free (the minimum fee is $5/month), nor are they perfect. There are some serious limitations, like the "one worker per domain" restriction on non-Enterprise accounts, that can be a blocker for larger projects. But in my case, where I wanted a couple of dynamic pages on a mostly static website, they fit perfectly. I'm also happy to pay an amount I consider reasonable for a service I've been using for free for years. Looking at the company's recent innovations, they may become even better in the future.