It's an interesting difference in backgrounds, maybe? I tried to build this on OpenBSD where we don't have docker and we use 'make' instead of random shell scripts.
For what it's worth, this is from the build log:
error code 1
error path /home/holsta/3rdparty/fusion/frontend/node_modules/@sveltejs/kit
error command failed
error command sh -c node postinstall.js
error /home/holsta/3rdparty/fusion/frontend/node_modules/rollup/dist/native.js:84
error throw new Error(
error ^
error
error Error: Your current platform "openbsd" and architecture "x64" combination is not yet supported by the native Rollup build. Please use the WASM build "@rollup/wasm-node" instead.
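In case anyone else hits this: the error itself points at `@rollup/wasm-node`, and (assuming npm 8.3+) an `overrides` entry in package.json can force `rollup` to resolve to the WASM build; yarn's equivalent field is `resolutions`. A sketch, untested on OpenBSD:

```json
{
  "overrides": {
    "rollup": "npm:@rollup/wasm-node"
  }
}
```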
Looks good! I'm curious why you recommend deploying with docker; since it's a single binary with no external dependencies, I find deployment simple enough as-is.
I write all my personal projects in Go, and one of the things I like most is that it compiles to a single binary without external dependencies.
The SQLite driver uses cgo, so we run CI on both Ubuntu and Windows Server to avoid cross-compiling. Even so, we can't confirm it's 100% OK on Windows, and if weird bugs show up there, we don't have the experience or energy to deal with them.
The Docker image is based on Debian, which we're more familiar with.
It's still easier to manage docker containers if they're 50 MB instead of 300 MB, and if the rest of the fleet is being managed via docker-(whichever), then there's something to be said for consistency. Managing everything through one interface is easier than remembering all the special cases. But to each their own.
I don't even like docker, but it still doesn't sound that terrible to me. It's an option. Use docker or use the single binary, but presumably if you like docker and have it set up for other things, you'll just use that rather than rolling your own startup scripts etc.
I do something a bit similar for my own project: it's a single-binary REST server, but I still package it up with dpkg-deb and deploy it to a private apt repo so I can update it easily on the servers with "apt-get update && apt-get install blah". That fits nicely with my existing processes, and I can just add the repo and the dependency to my cloud-init setup. If I used docker, I'm sure I'd find his docker image the easiest path to getting it installed and updated.
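For anyone who hasn't done this before, the packaging step is roughly the following. A sketch with made-up names (myserver, the version, the maintainer address); a real package would also want a systemd unit and proper file modes:

```shell
set -e
# Minimal Debian package layout for a single-binary server.
pkg="$(mktemp -d)/myserver_1.0.0_amd64"
mkdir -p "$pkg/DEBIAN" "$pkg/usr/local/bin"

# Stand-in for the real compiled binary.
printf 'fake binary' > "$pkg/usr/local/bin/myserver"
chmod 755 "$pkg/usr/local/bin/myserver"

# The control file is the only mandatory metadata.
cat > "$pkg/DEBIAN/control" <<'EOF'
Package: myserver
Version: 1.0.0
Architecture: amd64
Maintainer: Example <ops@example.com>
Description: Single-binary REST server
EOF

# Build the .deb if dpkg-deb is available (Debian/Ubuntu hosts).
if command -v dpkg-deb >/dev/null 2>&1; then
  dpkg-deb --build "$pkg"
fi
echo "layout ready: $pkg"
```

The resulting .deb then gets pushed to the apt repo; tooling like reprepro or aptly handles the repo side.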
Consistency is key. If you're using docker to run all the things, then docker ps shows all the things running, instead of having to check docker and then also check this other thing over here that's different.
I ran into the same problem last year. My tech stack was React Native, since I was mostly building apps with few interactions and it was easy to get something decent on both platforms. Then I got a notice from Google about the API version. Updating the project was a nightmare due to compatibility issues: some libraries were abandoned, some had breaking changes. It was easier to write the two native versions than to deal with the npm mess.
For any project you depend on that has opted into the NPM quagmire, you really should run `git add --force ./node_modules` and periodically push a copy to a branch/repo that you control, instead of depending on upstream. Most projects that ostensibly use Git thwart its entire raison d'être, hobbling its ability to do effective version control by abusing .gitignore for their overlay VCS of choice (i.e. the "package manager").
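Concretely, the vendoring step is just this. A self-contained sketch using a throwaway repo and a fake package so it can run anywhere; in a real project you'd do the `git add --force` step in your existing checkout:

```shell
set -e
# Throwaway repo standing in for a real project checkout.
repo="$(mktemp -d)"
cd "$repo"
git init -q
git config user.email vendor@example.com
git config user.name vendor

# Typical project setup: node_modules is ignored.
echo "node_modules/" > .gitignore
mkdir -p node_modules/left-pad
echo '{"name":"left-pad"}' > node_modules/left-pad/package.json

git add .gitignore
# --force overrides the .gitignore entry that normally hides node_modules,
# so the exact dependency tree gets checked in.
git add --force node_modules
git commit -qm "snapshot node_modules"
```

After that, a periodic `git push` of the branch gives you the faithful-reproduction checkout mentioned below.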
The issue was not finding the libraries' code. The issue was the churn. One day you have a (in my case, small) set of libraries to get things going. Two years later, you get compilation issues because they all have different requirements, so you have to find another common intersection between them and the node/react-native versions. I should have vendored some of them into the project.
On Android, libraries are much more stable. Deprecated functions are flagged by the IDE, and an alternative is often suggested in the comments. I'd much rather have big libraries (since we have tree-shaking), especially when dealing with frameworks, than the bazillion packages you need for anything with npm.
> One day you have a (in my case, small) set of libraries to get things going. Two years later, you get compilation issues because they all have different requirements, so you have to find another common intersection between them and the node/react-native versions.
That's what the whole version-control thing I mentioned is good for. Check out a two-year-old copy that resolves to a faithful reproduction of whatever you were able to use successfully when you first checked it in.
It explicitly states a small VPS as a target, so, yes, Linux. I realize you can deploy a non-Linux image on a VPS, but hosting docker containers on Linux is the default.
And the documentation literally specifies it requires docker.
How did this become "minimal dependencies"?