I think it’s perfectly fine to do whatever you want because it’s fun, but some of the other justifications for accepting and rejecting dependencies seem a bit weird.
For example, one of your reasons for rejecting Hugo is that you can’t read all the code including dependencies and future updates yourself. However, you have no problems with using Go, which has a pretty large standard library and a lot of code for its implementation. Have you read through all of it? Sure, choosing Hugo is strictly worse by this metric (because it’s written in Go), but is even the goal of reading all source code and keeping up with changes tractable if Go is a dependency?
Another one is your rejection of GitHub pages for uptime and future maintenance. There are many reasons you might not want to use it, but I don’t think those in particular are compelling ones. If you just use it to host plain or even templated HTML, then the migration cost if and when they fail is pretty low, right? Just switch to one of the other HTTPS hosts, or do the self-hosting thing then?
The maintenance tradeoff for not using GitHub pages is quite high too. I see your website is using HTTPS. In my opinion managing certificates is more maintenance than using GitHub pages. And then there are security updates.
Dynamically generating websites has a security tradeoff too. Are you keeping your toolchain up to date? Go has had vulnerabilities in the past.
BTW, I’m not criticising your approach, just some of the justification for it. It sounds pretty neat to have a single-binary website, but I’m not too sure about all the other justifications.
I'm a big fan of single-binary websites like this one which rely on as few dependencies as possible (small, well-written, auditable libraries).
What I've found in the go ecosystem over the past few years is this enormous graveyard of projects that didn't really get the zeitgeist of staying as close to the standard lib as possible, and instead included all the batteries. Now you open those old projects written in such frameworks and the batteries have leaked all over the place, you've got some of the juice on your hands, and the github issues page is just a bunch of people asking if the project is still alive. You really don't see that with the ones that kept things as simple as possible and relied on as little as possible. My old stuff written that way is still chugging along.
My only dependencies to track in this case are go, my webserver frontend (openbsd httpd) and the OS updates. Wow!
The applications I've written to this end aren't the kind of thing you can host on github pages; they're really dynamic and do all this image processing or handle tons of money a day. It turns out people throw more money around at stuff like that, and that alone justifies the means.
Hell I know people who even do this in C or C++. That's right, web applications, usually in even more constrained environments like embedded or for game consoles. They get paid more than me, too.
Generating static files has a smaller attack surface than running an executable on a server.
Keeping a service up and running over a long period of time is much more work than using generic static site hosting.
HTML that works today will likely still work in 10 years' time, but a binary you compile today most likely won't, or at least, you'll have wanted to change up your server by then.
I've got some golang projects I've maintained for a long time now, very nearly a decade for the oldest one, which I deliberately wrote to stick to the stdlib and a few small dependencies. All of these are either web-facing or web-consuming. They definitely still compile today. Nine years feels like a pretty sweet benchmark to me.
The biggest thing to deal with, really, was the mod system. I think that fucked over a few dependency-heavy projects out there which you're going to have a hard time compiling today, yet which hilariously still get recommended as if the github issues page isn't full of confused people.
That said, I love static site generation when I don't need anything special, but Hugo definitely has that bloated rotten whale smell to it for me after reading others' experiences with it. SSG is the kind of thing I've even done with shell scripts. At its most basic, it's just slapping html together, with maybe some templating.
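To be concrete about how little there is to it, here's a rough stdlib-only Go sketch of that "slap html together with a template" idea (the content/ and public/ directory names are just invented for the example, it's not anybody's actual setup):

    // ssg.go: a minimal "slap html into a template" generator, stdlib only.
    // Directory names (content/, public/) are made up for the example.
    package main

    import (
        "html/template"
        "log"
        "os"
        "path/filepath"
        "strings"
    )

    // One shared layout wraps every content fragment.
    var layout = template.Must(template.New("layout").Parse(`<!DOCTYPE html>
    <html><head><title>{{.Title}}</title></head>
    <body>{{.Body}}</body></html>`))

    func main() {
        pages, err := filepath.Glob("content/*.html")
        if err != nil {
            log.Fatal(err)
        }
        if err := os.MkdirAll("public", 0o755); err != nil {
            log.Fatal(err)
        }
        for _, src := range pages {
            body, err := os.ReadFile(src)
            if err != nil {
                log.Fatal(err)
            }
            out, err := os.Create(filepath.Join("public", filepath.Base(src)))
            if err != nil {
                log.Fatal(err)
            }
            err = layout.Execute(out, struct {
                Title string
                Body  template.HTML
            }{
                Title: strings.TrimSuffix(filepath.Base(src), ".html"),
                Body:  template.HTML(body), // it's my own content, so no escaping
            })
            out.Close()
            if err != nil {
                log.Fatal(err)
            }
        }
    }

Run it with go run ssg.go and you've got a folder of HTML for whatever dumb file server you like. That's the whole category of tool.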
There's a difference between trusting Go without reading its source (no dependencies other than the previous version of Go and maybe a C compiler if you want to bootstrap it from version 1.4, run in production at big tech companies, actively maintained by Google) and trusting Hugo (1200 deps, huge plugin ecosystem with less battle-tested packages, unaudited templates that could inject anything they want into your site unless you read each release carefully).
Golang is "only" 2 million lines of code, and it's quite easy to go look at the implementation of something, as the docs link directly to the code. It wouldn't be impossible to read through it, particularly if you skipped architectures other than x86 and parts of the standard library you didn't use.
Getting really familiar with the golang source code is a great way to learn how to write better go code. It's one of the things I recommend to new programmers in this language all the time, because it's so accessible from a properly configured editor that will jump to the definition in the standard lib with a couple of keystrokes.
Definitely agree with all the above. However, personal websites are neat because the only requirements are your own, so you don't need to justify it.
I love seeing solar powered websites, retro computers hosting modern sites, etc. Having a single binary is a neat idea too, even if it's a bit pointless.
It almost reminds me of the woman who came up with binaries that can run unchanged on different OSes. Definitely neat stuff!
Yeah, it would have been perfectly fine to just write "I wanted to build my website as a binary, so I did it, and here it is" - but for some reason the author felt they needed to justify it, and I agree with the OP that those justifications feel a little bit artificial. I mean, if you're scared of Hugo having too many dependencies, I advise you to steer well clear of any Node.js projects...
I feel you're looking a few levels down the decision tree for Hugo and then using that level as the root of a decision tree for reviewing all the other choices. There's not really a feasible way an individual can cut out a programming language or rebuild large portions of the stdlib (HTTPS, say) and end up with something that's easy to maintain or reliable (or time-feasible). There ARE easy ways an individual can generate static HTML without pulling in 1200 dependencies. Had there been no reasonable way to generate static pages that better aligned with the author's stated goals, then sure, Hugo+Github would make just as much sense as pulling in the Linux, bash, and go dependencies did.
With GitHub, I didn't read the reasons as revolving around SLA/uptime percentages so much as around non-breaking-change intervals and the probability that the service will still exist, usable as-is, in 10 years. Migrating to another host doesn't necessarily fix this unless there's a particular one to point to that guarantees its exact method of operation for the next decade, with a solid commitment to offer the service as-is for that long. Golang's stdlib and Linux are reasonably solid choices in this regard: neither is going to change in a way that breaks deploying/running the site unless it _absolutely_ has to, and both have about as large a support base as one could reasonably expect to find, so they're unlikely to pull the rug out from under the site just to move on to something newer and shinier, the way business offerings often get changed or deprecated.
Certificates are where I see a closer balance. On one hand, the Github Pages way is really easy: they do it for you. On the other hand, the author is trying to avoid services that can disappear out from under them, and is probably using ACME, which seems to be pointed at Let's Encrypt right now. If for some reason Let's Encrypt did disappear (which, I'll be honest, I consider less likely than some breaking change in how an automated Github Pages hosted static site functions, though I can see that being debated either way), at least it's not particularly hard to change the ACME provider. There may also be some personal sway on whether it's better to manage your own security certificates or not, but I can see that falling to either side depending on the person.
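For what it's worth, if the site terminates TLS itself in Go using golang.org/x/crypto/acme/autocert (a guess on my part, the article doesn't say), swapping the ACME provider is roughly a one-field change. A sketch, not anyone's real code:

    // Hypothetical sketch: a static-file server doing its own HTTPS via ACME,
    // with the CA's directory URL pulled out so the provider can be swapped.
    package main

    import (
        "crypto/tls"
        "log"
        "net/http"

        "golang.org/x/crypto/acme"
        "golang.org/x/crypto/acme/autocert"
    )

    func main() {
        m := &autocert.Manager{
            Prompt:     autocert.AcceptTOS,
            HostPolicy: autocert.HostWhitelist("example.com"), // placeholder domain
            Cache:      autocert.DirCache("certs"),            // keep certs across restarts
            // Defaults to Let's Encrypt; change DirectoryURL to use another CA.
            Client: &acme.Client{DirectoryURL: acme.LetsEncryptURL},
        }

        srv := &http.Server{
            Addr:      ":443",
            TLSConfig: &tls.Config{GetCertificate: m.GetCertificate},
            Handler:   http.FileServer(http.Dir("public")), // placeholder content dir
        }

        // Port 80 answers the ACME HTTP-01 challenge and redirects the rest to HTTPS.
        go func() { log.Fatal(http.ListenAndServe(":80", m.HTTPHandler(nil))) }()

        log.Fatal(srv.ListenAndServeTLS("", ""))
    }

Point DirectoryURL at another CA's ACME endpoint and nothing else changes, which is part of why Let's Encrypt disappearing doesn't worry me much.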
Regardless of the choice, the toolchain is going to have to be kept up to date to stay secure. There isn't any option that is indefinitely secure and free of security bugs.
I struggled with this issue when I realized that it was too much work to update my blog to use the latest version of Hugo. I wasn't able to run the old hugo binary on my new version of MacOS (segfault). And I couldn't even compile the old code from source, because I couldn't get older versions of Go to work. Some of the old Go dependencies had even been renamed or deleted.
Eventually I realized that I could continue using my old version of Hugo forever (0.21) if I run it in a Linux Docker container. The output is still static HTML and CSS, and all the static files are cached and served with a CDN (CloudFlare), so I'm not particularly worried about security updates.
I'm not sure how long this will last though. It would be very nice if new versions of Docker would always provide backwards-compatibility for older Docker images, even in 20-30 years.
I worked at a company that acquired expage.com, which was a late 90s/early 00s homepage builder.
The entire site was a single C source file littered with embedded HTML, a few separate HTML files for things like an FAQ, and a directory of compressed user page data in some proprietary format (including images) that was decompressed on-the-fly. Served over Apache with some CGI magic.
While not a single C file, Fossil SCM also uses C with HTML inside it (via a custom preprocessor[0]), e.g. check the chat.c file[1]. The entire program ends up in a single binary file too.
> The entire site was a single C source file littered with embedded HTML, a few separate HTML files for things like an FAQ, and a directory of compressed user page data in some proprietary format (including images) that was decompressed on-the-fly. Served over Apache with some CGI magic.
I know that it is a very slender hope but it would be cool to see the source of this.
> [...] a directory of compressed user page data in some proprietary format (including images) that was decompressed on-the-fly. Served over Apache with some CGI magic.
This is pretty neat. I believe that was done to save on server storage, and before client side gunzip was a thing?
Author and I have surprisingly overlapping values and surprisingly non-overlapping outcomes. And I’m here for it!
Also, props for a brutally-minimalist design that’s actually usable on a phone. Most of the sites I encounter like this feel like “this is a motherfucking website”[1] in principle twisted (intentionally or not) to be completely the opposite in spirit.
It looks better than that, for sure, but only actually looks good if my phone is in landscape orientation.
In portrait, the section headers / titles are aligned right instead of center, which is odd, but the page title itself breaks onto two lines with the second aligned left.
There is also something odd about line breaks in general: I would swear that some lines break early, as if the browser can't figure out that the first word of the next line could obviously have fit. I actually switched to landscape orientation just to check whether the weird breaks were some odd artistic choice.
I've got to admit, though, that other than those complaints, I really do like the way there is content, just content.
I've tried hand rolling my own blog a number of times, but I always end up in the same position- I remember that I don't actually have anything to say that merits a standalone blog post that I actually would want anyone else to read.
I appreciate weird sites like this. Makes me want to dust off one of my web projects.
It's interesting that you generate the site dynamically, but then drop everything into a pre tag, which makes it very anti-responsive. As others have mentioned, the word-wrapping is choppy on smaller screens.
I like the idea and I’m likely to try the concept on something. This website has wrapping issues on my phone; since it’s a binary, that’s probably easy to solve in a single function. I’m really curious how data is stored in this system. Is all this text inside the code base, set as variables or something? Is it tacked onto the end of the bin, which the bin then reads and processes somehow? Or is it reading plain text stored alongside the binary?
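My guess, without having looked at the repo, is Go's embed package: the text lives in the repo next to the code and gets compiled into the binary, so nothing needs to sit alongside it at runtime. A rough sketch of that approach (the directory name is hypothetical):

    // Hypothetical sketch of "content inside the binary" using Go's embed
    // package; not necessarily how this particular site does it.
    package main

    import (
        "embed"
        "log"
        "net/http"
    )

    // Everything under content/ is baked into the binary at build time.
    //
    //go:embed content
    var content embed.FS

    func main() {
        // Serve straight from the embedded filesystem; no files on disk needed.
        // (Pages come out under /content/...; fs.Sub could strip that prefix.)
        http.Handle("/", http.FileServer(http.FS(content)))
        log.Fatal(http.ListenAndServe(":8080", nil))
    }

With that, the binary really is the whole site: go build, copy one file to the server, done.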
There are a lot of static site hosting options, all of which have a different idea of "easy".
What I'm not seeing is what the easy dynamic hosting options are, other than a little VPS. Things like FaaS look easy once you've learned their lengthy setup and configuration process, and there's always the risk of a cloud billing nightmare.
Basically I think there should be a cloud version of Visual Basic.
The entire index loads every time, and you can download it. It might be the only truly private search engine because your request is handled in the browser.
Oh, oooohhh, this is so neat. You guys need to see its source too (in Firefox, for me, it's right click -> View Page Source). I mean, I could read the same content from the source, with line breaks in the same places. Try doing that with any modern website and you get everything on one line most of the time.
I should add that I've written my own Nim web stack on top of Jester. I'm getting it release ready. It integrates with a Flutter front-end engine that will be released first. https://nexusdev.tools/
I upvoted because this person is so different from me. I would not use a .sh file for something like that even if I was paid to do so.
But if it works for them, who am I to argue. I do have a website which uses a static html generator that I have not maintained in years, mostly because I don't enjoy having to tangle with the more Operations aspects of it (which are very simple! but still not enjoyable).
I think the lesson that I got from this writeup is: try to optimize for enjoyment on your own personal stuff. Otherwise it will get deprioritized the moment you struggle with time.
The easiest website to maintain is a website maintained by someone else, where you pay a few bucks or find a free offering. The author wants both full control and minimum maintenance. This is a well-known contradiction.
> I decided that i would host a website on a machine that i controlled
If you host yourself then you have to maintain that server, but you're in full control. If you pay someone else to do so, you give up full control, but you don't have to maintain the server. It's a tradeoff.
Having separate data and code never causes problems either? Just because you bundle them together, doesn't mean the data and code are together during development. They get bundled together in the build process.
It's not as bad as, say, writing HTML in JavaScript (JSX).
thanks for sharing! Starting with values is so essential!
However, I'd opt for the self-made static site generator, and even more so for annual partitions (for styling etc.), so that things can safely be heterogeneous over time.
I'm sure this is late, but OP might like Zola. It's an SSG contained in one binary (written in Rust); it's lightweight and more barebones than most SSGs imo.
Sorry, I don't get it. Looking at the source tree, there's a folder of html. So you're compiling it down to a single binary, for what purpose? If you love golang, run caddy and point it to a folder of html.
I used to maintain a web site with a Guile script. It read in s-expression data files and spat out a whole bunch of HTML, which I would then ftp to the server.
There are plenty of good reasons to roll your own solution. A blog is about as risk-free of a situation as you will get when it comes to rolling your own.
It's not like the author is rolling their own encryption library.
me too. this must have been studied linguistically somewhere? when i'm in lowercase mode i'll actually even go back and delete an accidental uppercase letter! so it's not just about laziness. maybe it has something to do with formality register? like how it can even be inappropriate to use a too-formal register with a close friend, imagine if you suddenly called your spouse _mister_ or _missus_ smith!