Seems interesting; I've always appreciated the simplicity of a Jekyll-type approach.
Apologies for semi-hijacking to mention it, but I take a slightly different approach in my personal blog (custom, messy code): I rsync markdown documents to my webserver, where they are compiled into HTML and stored in an in-memory Redis collection that the server uses to render the blog. That gives you in-memory caching for free and avoids having to regenerate a whole bunch of static files every time. I use Node on the backend and Angular on the frontend to allow for a single-page website.
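A minimal sketch of that import step, assuming the `marked` and `ioredis` npm packages (the file layout and key names are made up, not the author's actual code):

```typescript
// import-posts.ts — sketch: compile markdown posts into Redis.
import { readFileSync, readdirSync } from "fs";
import { basename, join } from "path";
import { marked } from "marked";
import Redis from "ioredis";

const redis = new Redis(); // localhost:6379 by default

// Compile one markdown file and store the HTML in a Redis hash.
export async function importPost(path: string): Promise<void> {
  const slug = basename(path, ".md");
  const html = await marked.parse(readFileSync(path, "utf8"));
  await redis.hset(`article:${slug}`, "html", html, "updated", Date.now().toString());
}

// Import every post in a folder; the rsync-triggered script would
// just call importAll("./posts") after each sync.
export async function importAll(dir: string): Promise<void> {
  for (const f of readdirSync(dir).filter((n) => n.endsWith(".md"))) {
    await importPost(join(dir, f));
  }
}
```

The server then renders pages straight out of those `article:*` hashes, so there's no static-file generation step at all.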
Currently the solution regenerates all the HTML each time files are rsynced, via a script run remotely over SSH. Ultimately, though, I intend to use an inotify-style approach that only imports files that have actually changed, running both locally and remotely, so that publishing an article only requires writing some markdown and saving it in a particular folder.
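That inotify-style step could look something like this sketch, assuming the `chokidar` package and the hypothetical `importPost` helper from above:

```typescript
// watch-posts.ts — sketch of the intended inotify-style importer.
import chokidar from "chokidar";
import { importPost } from "./import-posts"; // hypothetical helper from above

chokidar.watch("./posts", { ignoreInitial: true }).on("all", (event, path) => {
  // Re-import only the file that actually changed, instead of everything.
  if ((event === "add" || event === "change") && path.endsWith(".md")) {
    importPost(path);
  }
});
```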
Though of course all this (currently) requires a server, such as a Linode, that you can rsync to and run a serving script on that watches a folder. I mention it to ask: would anybody be interested in me cleaning it up and open sourcing it?
If you have compiled it to HTML, wouldn't it be more efficient to serve it up as static HTML files via nginx (or any web server)? Why store the HTML in Redis and serve it via an application?
With aggressive disk caching on servers, static files are probably near enough in-memory anyway (and I'm sure you could configure a modern web server to cache them too), so yeah, it's probably pretty fast.
However, simply serving static files is more limiting: you can only serve exactly what you've generated. With Redis you can serve the content as JSON data and use it dynamically, e.g. for search, or for showing all articles with a given tag or in a given date range.
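For instance, a tag index in Redis might look like this sketch (again assuming `ioredis`; the key names are made up):

```typescript
// tag-index.ts — sketch: query articles by tag via Redis sets.
import Redis from "ioredis";

const redis = new Redis();

// At import time, record each article under its tags.
export async function indexTags(slug: string, tags: string[]): Promise<void> {
  for (const tag of tags) {
    await redis.sadd(`tag:${tag}`, slug);
  }
}

// At request time, the server can answer "all articles tagged X" directly,
// without a pregenerated page for every possible tag.
export async function articlesByTag(tag: string): Promise<string[]> {
  return redis.smembers(`tag:${tag}`);
}
```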
Additionally, I'm not a big fan of a whole bunch of static files sat in a folder somewhere that needs to be regenerated every time I change something. Personal preference, perhaps :-)
I host it in DigitalOcean's New York datacenter (on a 512 MB VPS). They're pretty alright, but any VPS would do the trick. In fact, I only use a VPS because I can't properly set caching or gzip headers on GitHub Pages.
As far as I know, the only thing I do that most static sites don't is precompile gzip files for the HTML pages, and minify pretty much everything (including the HTML and images). PageSpeed, Pingdom and RedBot were very helpful for suggesting web server optimizations; I would just observe and implement every tweak they mentioned. There's a lot you can do managing your own server that you can't with AWS or GitHub.
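Precompiling the gzip files might look roughly like this sketch using Node's built-in zlib (the paths are illustrative); the server can then send the `.gz` sibling directly, e.g. via nginx's `gzip_static on;` directive, instead of compressing on every request:

```typescript
// precompress.ts — sketch: write .gz siblings for generated HTML files.
import { readFileSync, writeFileSync, readdirSync } from "fs";
import { join } from "path";
import { gzipSync, constants } from "zlib";

function precompress(dir: string): void {
  for (const file of readdirSync(dir).filter((f) => f.endsWith(".html"))) {
    const path = join(dir, file);
    const gz = gzipSync(readFileSync(path), {
      level: constants.Z_BEST_COMPRESSION, // compress once, at max level
    });
    writeFileSync(`${path}.gz`, gz); // e.g. index.html.gz next to index.html
  }
}

precompress("./public");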
I love how crazy fast your site is on my slow browser (Chrome on iOS seems to drag when you've got 100+ tabs). I'm starting a blog network as a side project and setting it up as Jekyll for exactly this kind of performance.
Also love the honest assessment in your repo's description that your setup is "far more neckbeard and far more work" :)
Interesting. It's still useful to have the data in Redis so you can easily search through it server-side, but of course that data could simply be reloaded from static JSON if and when Redis is restarted or new data is added.
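A rehydration step along those lines might look like this sketch (the file name and key layout are made up):

```typescript
// rehydrate.ts — sketch: repopulate Redis from a static JSON dump.
import { readFileSync } from "fs";
import Redis from "ioredis";

const redis = new Redis();

async function rehydrate(): Promise<void> {
  // Only reload if the index is missing, e.g. after a Redis restart.
  if (await redis.exists("articles:index")) return;

  const articles: { slug: string; html: string }[] = JSON.parse(
    readFileSync("./articles.json", "utf8")
  );
  for (const { slug, html } of articles) {
    await redis.hset(`article:${slug}`, "html", html);
    await redis.sadd("articles:index", slug);
  }
}

rehydrate().then(() => redis.quit());
```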