Hacker News
Xmake: A cross-platform build utility based on Lua (xmake.io)
67 points by fanf2 on March 15, 2020 | 40 comments


I don't understand the motivation behind an imperative build system. Why is it "add_rules(a,b)" instead of just "rule(a,b)"? It's like someone discovered the benefits of a domain specific language and butchered the concept by making the internal implementation the public interface. Why does it matter to the user that something is getting added to something? Why doesn't it look closer to this [0]?

[0] https://repl.it/repls/DeadCalmPrinter
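For reference, the actual xmake DSL reads like this (a minimal sketch; the target name and file paths are illustrative, not from the xmake docs):

```lua
-- xmake.lua: minimal example of the "add_" style the parent objects to
add_rules("mode.debug", "mode.release")

target("hello")
    set_kind("binary")
    add_files("src/*.c")
```

The `add_` prefix reflects that these description functions accumulate settings into the current scope as the script runs, rather than declaring a complete static configuration up front.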


It's imperative in order to allow arbitrary control structures instead of bending over backwards with 80% solutions and stupid fragile tricks like you would need with Make.
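As a concrete sketch of what that buys you (hedged; `os.files` and `path.basename` are xmake built-ins, but treat the layout as illustrative), an xmake.lua can use an ordinary Lua loop where Make would need `$(foreach ...)` and pattern-rule tricks:

```lua
-- Generate one test binary per source file with a plain Lua loop.
-- (Illustrative: assumes test sources live under tests/.)
for _, testfile in ipairs(os.files("tests/*.c")) do
    target(path.basename(testfile))
        set_kind("binary")
        add_files(testfile)
end
```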


I'm not sure we need yet another build system. We pay a real price for this sort of fragmentation. If I have to pick up your cross-platform C++ codebase, I shouldn't have to learn yet another build system.

CMake/autotools/Premake/SCons/waf/Ninja/Meson/GYP/qmake/Bazel... and now xmake.

I'd rather CMake just be improved, than see endless incompatible reinventions. If you'll forgive a little snark: on the plus side, it's unlikely this new challenger will see any real uptake.


I feel you're grossly exaggerating the current state of affairs.

From your list alone, CMake is the de facto standard. All the others are either defunct, abandoned, or fallen out of favour, or never saw any significant adoption. For instance, autotools is largely a relic of the days when C++ was C with classes, and qmake has already been dropped by its makers in favour of CMake.

Additionally, some of your entries don't make much sense. For example, Ninja is at most a low-level alternative to raw makefiles, and in fact tools like CMake can generate either makefiles or Ninja files, and GYP is just a high-level format designed to generate other build systems' project files, especially IDE-specific projects such as Xcode and Visual Studio project files.


"For instance autotools is largely a relic of back in the days where C++ was C with classes..."

Minor point: autotools is a relic of the days when there were multiple, only somewhat compatible implementations of Unix in common use. Now that the world is a Linux monoculture and most software is installed by end users in the form of pre-built binary packages, the form of portability provided by autotools is no longer needed.


I don't agree with you at all. Autotools boil down to a bunch of shell scripts that enable projects to abstract away the OS/distribution while following a somewhat standard structure.

And binary packages are completely irrelevant and unrelated, because autotools is a build system, thus designed to help build software. Although the availability of source packages has enabled build systems to also be used as installers, that does not mean the need for build systems is replaced by plain old installers.


That doesn't sound right. The world isn't a Linux monoculture.

Most C++ code I've written has been Linux/Windows. I've never targeted other Unices, but they're out there, and I like to think my portable code would build under FreeBSD with minimal effort.


> All the others are either defunct, abandoned, or fallen out of favour, or never saw any significant adoption.

So you're agreeing that, if we see a new challenger succeed, we pay the price of fragmentation?

I agree that these sorts of projects rarely gain any real traction.

I don't see that the history of autotools is relevant here. It's still very common to see in the wild.


I fully agree. The more fragmented cross-platform build tools become, the less useful they are. CMake is really pretty good and has some support from Microsoft in Visual Studio and Visual Studio Code.


With years of experience with Scons, I can say quite confidently that I don’t ever want to work with a Turing-complete build system ever again.


> I don’t ever want to work with a Turing-complete build system ever again.

So do you instead write additional custom executables, possibly requiring a cross-compilation step, whenever you have to perform custom pre-/post-processing tasks on your code (e.g. parsing it)?


I also have had bad experiences with scons and much prefer declarative build systems with options to escape to external executables when you really can't help it. Because the vast majority of the time, you can actually help it.

Having friction around bespoke compilation steps is a feature in my experience: it means that devs will use standard build elements whenever possible, and if a problem occurs the "wildcard" external binaries can more easily be probed and reviewed.
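In xmake terms, such an escape hatch to an external binary might look like this (a sketch; the `gen_tables` tool and its paths are hypothetical, while `before_build` and `os.exec` are part of xmake's script scope):

```lua
target("parser")
    set_kind("static")
    add_files("src/*.c")
    -- Escape hatch: shell out to an external, separately reviewable
    -- tool before compiling, rather than writing the codegen logic
    -- inside the build system itself.
    before_build(function (target)
        os.exec("tools/gen_tables --out src/tables.c")
    end)
```

The external tool stays a plain program that can be tested and reviewed on its own, which is exactly the property the parent describes.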


And your point is that it would be better to write pre/postprocessing logic in the build system? That sounds like a recipe for disaster and slow build times.


> And your point is that it would be better to write pre/postprocessing logic in the build system

Well, it has to live somewhere, and I'd rather write my build configuration in one language instead of N. So yes, that's what I do currently, and it causes me exactly zero issues.


What I would advise is to think carefully about your custom pre-/post-processing tasks in light of how they will affect componentization and reuse. Working with many teams and codebases, it’s been my experience that having a lot of custom code generation or code parsing ties your code to your build system in ways that make it harder to re-use elsewhere, on different platforms or for different products. It also imposes a level of astonishment when analyzing dependencies: libraries that may appear to be largely independent of each other can actually be effectively coupled if one of them has a funky prebuild codegen step that depends on the other. Or coupled to the core build logic and source tree.

If you really need the custom build step, and it really needs to be implemented in your build system, I’d encourage you to think about the tools and language you’re using.

If you build it, they will come. Don’t be surprised to find out that a good chunk of your business logic winds up in build files rather than what you would typically think of and analyze as your ‘source’.


Could you elaborate a bit on what got you to that conclusion? That's not meant as a challenge - just curious to get more perspective on your conclusion.


Coming from maven I can say the opposite.


> Coming from maven I can say the opposite.

To be fair, Maven is in a class of its own with regard to designing a tool that is the worst of both worlds.


I think this is sort of the creator/maintainer conflict. Creators hate maven because they have their own idea about how their project should be structured and it’s different from maven’s. Maintainers love maven because they inherit a code base with a canonical structure.


As a creator I love Maven for that same reason: I appreciate having a defined structure for my project that makes it easy for anybody on a team to pick it up without needing to navigate a build script of arbitrary complexity.


That ^^^


Looks pretty similar to Premake[1], but I couldn't find a comparison on the site. Could you provide a pitch or comparison? From what I can tell so far, it looks like xmake directly invokes the build tools, whereas Premake generates a Makefile/Xcode project/Visual Studio project from which you then need to use those tools to perform the final build.

[1] https://premake.github.io/


I've only kicked the tyres of both, but Xmake seems to be more opinionated than Premake. It gives you quite a lot out of the box that you have to build yourself in Premake, e.g. (basic) dependency management.
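For instance, xmake's built-in dependency handling can be sketched like this (hedged: `add_requires`/`add_packages` are documented xmake directives, but treat the target and sources as illustrative), whereas in Premake you would wire up third-party libraries yourself:

```lua
-- Fetch and link a third-party package without hand-written find logic.
add_requires("zlib")

target("demo")
    set_kind("binary")
    add_files("src/*.c")
    add_packages("zlib")
```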

Tbh, I was impressed by Xmake. All I need is a meaningful project to use it on.


Benefits over meson? I found no unique features from skimming the docs.


It doesn't depend on Python.


People are downvoting, but it's 100% true. A build system targeting primarily C/C++ that requires an interpreter and runtime dependencies is a no-go for quite a number of projects. There's a reason they were written in C/C++ in the first place, and the fact that a lot of people install via packages (separating build from distribution) is scarcely solace.


But lua requires a runtime and an interpreter, right?


Lua is unique in that it can be embedded into C applications as a library; think the difference between SQLite and MySQL.

For example, there's an nginx module that lets you use Lua to configure the server logic. I'm presuming you don't need to have Lua installed to use xmake (i.e. batteries included); otherwise you're a hundred percent correct.


I don’t get it honestly and never did. On Linux-likes, deps are installed by a package manager, and chances are you already have Python for various reasons. The same goes for Lua. On Windows there is more legwork, since you don’t have a package manager and have to install everything by hand (+PATH), or choose a package manager and start dealing with its nuances. The same goes for Lua and xmake. What’s the difference, beyond non-C/C++ impurity?


I believe xmake doesn't "require" Lua. Instead, xmake embeds Lua. Specifically it embeds LuaJIT 2.1, based on what I could tell in their repo source. So, when you run xmake, you're actually running a full-blown Lua implementation that is self-contained in the xmake binary, with some automatic imports for xmake's embedded build DSL. If you want to run arbitrary Lua using xmake, there is a built-in "lua" plugin, shorthand "xmake l", described here:

https://github.com/xmake-io/xmake/blob/master/README.md#run-...

So, it would be a bit like if you built a build language atop Python, but rather than depending on Python, you made your whole build tool embed the entire CPython interpreter, pinned to some version. Lua is used in this way more commonly than Python is, of course.

So: xmake itself is just a C program with large submodules written in Lua, running inside that embedded Lua context. The upside here is that xmake's releases are just plain platform-specific native binaries that bundle everything you need, and that make no assumptions about your base system. This is a nice benefit in a build tool, as it aids in reproducible builds.
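Per the README linked above, arbitrary scripts can be fed to that embedded interpreter; a sketch (the script contents are made up):

```lua
-- hello.lua: run with `xmake lua hello.lua` (shorthand: `xmake l`),
-- which executes the script inside xmake's bundled LuaJIT runtime,
-- no system-wide Lua installation required.
print("running inside xmake's embedded Lua")
```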


I have used Lua this way myself, but I don't understand how installing Python is a barrier. I understand that depending on a secondary programming language to build your software is suboptimal, because only the person setting up the build system gets familiar with the "build language", and when something breaks that person is the only one who can fix the build. Switching away from Python to Lua only makes that problem worse, because Python is more popular than Lua.


Yes, and this is a big problem. People are unaware of how other languages' environments work. I have this exact issue with the AutoHotkey app. Instead of adopting something popular, it reinvented its own cryptic language, and its author seems so attached to it that they won't even consider supporting other runtimes. There are many things I'd like to do in AHK, but this wall is hard. Even if you learn a language, without practice it fades rapidly, and next time you return it's still the same cryptic copy-paste cookbook.


I embedded Lua both ways (.a vs .dll). Not sure now, but I think it then becomes a thing in itself, and require()-ing external DLLs will mess with your runtime badly; but again, it was some time ago, and maybe there was a way to export the .exe API to third-party DLLs instead of the lua5x.dll found somewhere in PATH.

Anyway, a single executable makes it hard to argue, and in this case I agree with you and ggp.


I come from FreeBSD where we have an amazing package manager but a “keep it lean” mentality. If a package isn’t on the package manager (the repo is great but nowhere near as expansive as Debian or Fedora) or if I want to build from source, I shouldn’t need to install such a huge system dependency.


Oh, that made me nostalgic for that portupgrade-oniguruma-ruby 20+ minute dependency build phase.


In case you’re not aware, the FreeBSD package manager switched to binary packages a while back. No more compilation unless you want to change from the defaults.


You obviously never had Python PATH hell on windows.


Nope, and I’m pretty sure that I have/had both 3 and 2 installed, both in PATH: 3 for experiments and 2 as part of the node-gyp toolchain. I can think of why this hell exists (some program calling python.exe gets an incompatible version), but that’s an issue with that program. Linux also had issues with /usr[/local]/bin/python pointing randomly, IIRC, didn’t it?


Python can be embedded into C applications as a library.


The build file is a full Lua script, unlike Meson's limited DSL, which you might see as an upside or downside.



