John Cowan writes:

> This is the outcome of many years struggling with autotools, and
> several more struggling with CMake. (Chicken compiles Scheme to C and
> is written in Scheme, so it has to bootstrap itself, something CMake
> doesn't or didn't like.) [Supposedly CMake has fixed this problem now.]

In my opinion this in-between step to CMake is a mistake many made. I've gone through cmake, scons, setuptools, and waf, with a short involuntary detour to the numpy build system, to end up with autotools, because all the other tools had too narrow a view of what was necessary — and when I hit their limits, I was pretty much lost. Once I actually hacked systemwide-installed numpy sources to add support for the Intel Fortran compiler.

Nowadays I watch maven and gradle fail to deliver what autotools already provide, with IntelliJ filling the gap of incremental rebuilds by integrating that into the (huge) IDE.

> By contrast, I did a bad restore from backup the other day, and Guile
> (which uses autotools, small blame to the maintainers — the FSF insists
> on it) got confused because I hadn't restored the file timestamps
> properly, so the byte-compiled Scheme looked out of date relative to
> the source. No biggie, I just rebuilt Guile from scratch. Ouch. That
> took longer than it would have to *re-restore the backup three times
> over*, and the hard time was struggling with autotools; I had to run
> ./configure about twelve times before it reported all the missing
> C-level dependencies (not really missing, just out of date).

Didn't `make -B` work? It force-rebuilds everything: GNU make's `--always-make` treats every target as out of date. Seeing that this was 8 years ago: possibly it didn't, because back then autotools did not regenerate configure automatically. It does that nowadays.

> Most of which no longer exist. Autotools is the complete opposite of
> Scheme: it piles feature on top of feature rather than removing the
> weaknesses and restrictions that make additional features appear
> necessary.

While autotools does have more features than I personally need, using it replaces many instances of "spin up a whole OS to run a shell script" for me. `make distcheck` is what every build system should provide: create a tarball, then verify that this tarball can be used to build your tool in a separate build directory, with all tests passing (a minimal sketch follows at the end of this mail).

There is some unclean stuff where abstractions got broken (which hurts me, but then I haven't reported it as a bug yet — that's on me), but autotools actually map the complexity you get in real development. I personally saw the presumably simpler tools fail in real deployment.

You say that all this complexity no longer exists. I experienced that complexity break every other build tool less than ten years ago while working on cutting-edge university clusters. Autotools just worked.

I think configure runs should be slimmed down by first checking whether they are running in a known environment, so that 90% of the tests can be skipped. But that is not a general complaint about the approach of autotools.

And I think that the autotools documentation is still abysmal for newcomers: there is no "how to solve this task correctly" for real-life tasks, and most answers to those questions you find online are wrong, so most autotools users actually resort to cargo-culting ("this worked in project X"). All the while the world has moved on, and most new tools (like npm, maven, go, cargo, …) auto-generate their (often very complex) build structures.
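For comparison, the skeleton such a generator would have to emit for autotools is small. A minimal project that already supports `make distcheck` can look like this (a sketch, not taken from a real project; the name "hello", the version, and the file names are placeholders):

    # configure.ac (sketch; project name and version are placeholders)
    AC_INIT([hello], [0.1])
    AM_INIT_AUTOMAKE([foreign])  # "foreign": don't require GNU files like NEWS
    AC_PROG_CC
    AC_CONFIG_FILES([Makefile])
    AC_OUTPUT

    # Makefile.am (sketch)
    bin_PROGRAMS = hello
    hello_SOURCES = hello.c

Given a hello.c next to these two files, `autoreconf -i && ./configure && make distcheck` builds a tarball, unpacks it, configures and builds it in a separate build directory, runs the test suite, and checks that install and uninstall behave. And on slimming down configure runs: `./configure -C` caches test results in config.cache, and a site-wide CONFIG_SITE file can pre-answer tests for a known environment. That is not the skip-90%-of-the-tests shortcut I'd like, but it points in that direction.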
I started a project to provide something similar for autotools, but it still has too little support for different languages: https://hg.sr.ht/~arnebab/conf ("initialize modern autotools projects").

Yet this is where we can actually improve the state of build systems without failing at the same problems over and over again: make it easier to start clean autotools projects.

Best wishes,
Arne

--
Unpolitisch sein heißt politisch sein ohne es zu merken
(Being apolitical means being political without noticing it)