
The scripting language is bad, most cmake scripts are much more complex than they should be, but cmake does a lot of things under the hood that make multi- and cross-platform development easier (for example: being able to build with the Visual Studio compiler without running in the "Visual Studio Developer Command Prompt").
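For example, this works from a plain cmd.exe prompt, no developer environment needed (the generator name below is just an illustration; pick whichever Visual Studio version is installed):

```shell
# Generate Visual Studio project files and build, without a
# "Developer Command Prompt" -- CMake locates the MSVC toolchain itself.
cmake -S . -B build -G "Visual Studio 17 2022" -A x64
cmake --build build --config Release
```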

A lot of fancy new build tools don't even have proper Windows support.



I keep running into CMake based things not being portable, and being broken. And pretty much every single time cmake fails, it does not have ANY log AT ALL of what it did, and why it thinks libfoo is not there.

Take autotools. You get EXACTLY what it did, and you see why it failed. With CMake it just goes "you don't have X installed". But I do. It's right there. CMake refuses to say under what cmdline or whatever it tried, and what the error was. Just "X IS NOT THERE!".

And then it depends on some version of cmake being installed, which it may not be.

I don't do windows coding, but shouldn't autotools be more viable now that Windows ships with bash?

But even if not, CMake stuff as a rule is not even portable between different flavors of Unix, so why even use CMake if it's only going to work on Ubuntu more recent than 18.04 or whatever? It's not a silly suggestion. OpenBSD generally just has raw Makefiles.


Could you explain how cmake stuff isn't portable or broken between different flavors of Unix? I comfortably maintain CMake builds that work on linux (with centos, ubuntu and arch), macos and windows without much trouble.

If you mean cmake versions, I'm not sure what you expect, should cmake just freeze in time and stop adding features so someone gets to use cmake from 5 years ago?

Also, the bit about autotools on windows is silly. If it can't build native windows stuff, it's not at all viable for windows.

I also dislike cmake (particularly dependency management) and wish for something better, but I think our criticisms should be well-founded.


I've never seen an autotools- or make-based project which builds out of the box on Windows in cmd.exe. Telling people to install cygwin just to build a project simply isn't viable when the default compiler toolchain on Windows is Visual Studio. Also it's not about building projects alone, cmake is also a project file generator for IDEs (in my case that was the actual reason why I switched to cmake from my own project file generator for Visual Studio and Xcode).


Interesting. Why do they fail with autotools?


Autoconf requires GNU m4 at build time and POSIX sh at runtime. It also requires all the standard UNIX command-line tools you would run from the shell. Automake requires Perl at build time and Make (preferably GNU Make) to run the generated Makefiles.

A stock Windows system doesn't have these tools. Cygwin and MinGW/MSYS provide them, but then you're using stuff which is non-standard for the platform, and which tends to be incompatible when you need to integrate with other libraries and tools.

If you want to use MSVC, the Autoconf/Automake support is poor. Generating output other than Makefiles is possible, but limited and quite the undertaking.

CMake supports all these other use cases out of the box. Which is why it gets used. It works on every platform, and with every compiler, build system and IDE of note.


I don't know, I guess each project fails in a different way (my experience was mostly with libcurl before it came with cmake support), but IME projects which come with a Makefile or use autoconf don't care about non-UNIX-y operating systems, while (again: IME) a CMakeLists.txt file is often a sign that the project will also build on Windows with MSVC.


> But even if not, CMake stuff as a rule is not even portable between different flavors of Unix,

Not sure what your experience has been but I've had zero problems with cmake portability on nixes.

Raw makefiles are so much worse from a maintainability point of view.

I find it kinda fascinating to be honest. Two devs come to totally opposite conclusions.

Also to get more logging try this: https://stackoverflow.com/a/22803821/3988037


Everyone's experience is different, but that's very different from my memory of autotools - a Turing-complete macro processor which generates several thousand lines of pre-modern shell script with embedded C source code. When anything goes wrong, it's pretty much impossible to match it to the original macro definition. Just so that one can hypothetically build the project on an ancient UNIX system so old that its shell doesn't even have functions.


While I've not had that problem, I can see that it's non-obvious. But it's been years since I last looked at the actual compiled output. I look at assembly output more often, and most people never even do that.

What I mean though is that ./configure outputs a config.log that says exactly what was done. Exactly what command line was run and exactly what the error was.
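To illustrate (the probe and error lines here are made up, but the shape is what config.log actually records):

```shell
# Every probe lands in config.log verbatim: the exact compiler
# command line that was run and the exact diagnostic it produced.
grep -A 3 'checking for' config.log
# configure:3581: checking for foo_init in -lfoo    (hypothetical excerpt)
# configure:3604: gcc -o conftest conftest.c -lfoo
# /usr/bin/ld: cannot find -lfoo
```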

I guess I'm confused why you're even looking at the compiled scripts. Would you not look at logfiles and stderr output before you start looking at assembly?

CMake doesn't. The logs are completely useless, even when I get a CMake expert to come and agree, yes that's useless.

With autotools you can see that it failed to build because it couldn't find library foo when building with "gcc blah blah blah", and you go "well yeah, you need -L/opt/foo/lib", so you just add that. (or pkg-config equiv).
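For the record, that fix is a one-liner at configure time (the paths here are hypothetical):

```shell
# Tell configure about a library installed outside the default search paths
./configure LDFLAGS="-L/opt/foo/lib" CPPFLAGS="-I/opt/foo/include"

# Or, if foo ships a pkg-config .pc file, point pkg-config at it instead:
PKG_CONFIG_PATH=/opt/foo/lib/pkgconfig ./configure
```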

CMake, on the other hand, seems to force you to look at CMake "source code" (CMakeLists.txt), which is a horrible mess. I'm not saying autotools isn't (yay, m4 :-( ), but the point is you don't have to, because it actually logs what it does.

So this is one of the many many reasons CMake sucks.


> What I mean though is that ./configure outputs a config.log that says exactly what was done. Exactly what command line was run and exactly what the error was.

... so does CMake? You get a log of the errors in CMakeFiles/CMakeError.log, and you can trace what happens line by line in a very verbose way with cmake --trace (or cmake --trace-expand if you want variables to be expanded)
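Concretely (assuming a source tree in . and a build dir in ./build; the -S/-B flags need CMake 3.13+):

```shell
# Failed compile/link probes from configure-time checks end up here:
cat build/CMakeFiles/CMakeError.log

# Trace every CMake command line by line as it executes:
cmake --trace -S . -B build

# Same, but with variable references expanded to their values:
cmake --trace-expand -S . -B build
```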


I just spent some time searching for m4 opinions on HN and it's more like a love-hate than hate-hate.

It looks like if you dive head first into learning autotools, then m4 is going to look like a nuisance.

But if you allocate time to learn m4 for what it is, instead of just something you have to put up with as part of your autotools adventure, then you'll hate m4.

m4 is the only well-known general-purpose, language-agnostic macro processor I know of (or one of very few).


I don't hate m4. Could be worse, could be better.

I used to build websites using m4 back in the 90s. Back when static site generators were popular, before the latest slight resurgence.

To misquote Mitch Hedberg: People either love it or hate it, or they think it's ok.


correction: ... autotools adventure, then you might not hate m4.


It's difficult to match things back to the original macro, but dead easy to match them in the shell script.

If you're just trying to get something to build as a user, it's actually quite easy to read the configure script and see why it's failing. The accompanying config.log is also quite detailed.

Autotools are not the best, but I always prefer building autotools packages over cmake. Worst case, I can modify the configure script directly.


I never managed to get CMake to work out-of-the box on Windows.

I always end up vendoring the dependencies, because it's too tedious to find the correct installed libraries in a cross-platform way.

autotools are great, but at the same time they're a monstrosity, and easy to misuse (you should never commit the configure script, because the autotools are supposed to generate it according to the platform it's running on).

For C++ dev, I did take a look at buck[1], but it doesn't really support Windows platforms.

I wish we had something like cargo (IMHO, the tooling is one of the top reasons for Rust's success); in the meantime simple Makefiles do the job perfectly.

1 - https://buck.build/


You may want to use Bazel. Buck's Windows support at Facebook was fine (almost all the PC Oculus stuff is built with Buck), but I don't know what state the OSS version is in (maybe it technically has support, but the internal pieces that make it work well don't have OSS equivalents decoupled from other FB-internal things).


> autotools are great, but at the same time it's a monstrosity, and easy to misuse (you should never commit the configure script, because it's the autotools that are supposed to generate it according to the platform it's running on).

The configure script is not generated according to the platform it is running on. The whole reason the configure script is such a monstrosity is because it's supposed to be portable; it's written in the lowest common denominator of shell. You are supposed to distribute it. It's also fine to check in if you want end users building directly from VC rather than from tarballs. Just don't modify it by hand.


From my experience, `autoreconf -i` is responsible for generating the configure script from a configure.ac file, so I assumed some platform-specific logic was happening.

I always wrapped it up in an autogen.sh script that I did commit. Also, end users generally prefer prebuilt binaries; if someone wants to compile the software themselves, I expect them to have autotools installed, and I mention it as a requirement in the README.
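A minimal wrapper of that sort (assuming an automake-based project) is just:

```shell
#!/bin/sh
# autogen.sh: regenerate configure & friends from configure.ac / Makefile.am.
# Needs autoconf and automake installed (and libtool, if the project uses it).
set -e
autoreconf --install --verbose
```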


Autotools and the configure script come from a history of distributing software purely as source code instead of binaries. From the early GNU days.

https://www.gnu.org/prep/standards/standards.html#Managing-R...

The intention was that software would be released as source code "tarballs" and contain a configure script, written in the lowest common denominator scripting language, to configure that source code to compile on the user's system. Additionally, by distributing the configure script itself instead of configure.ac, users tend to need fewer dependencies they don't already have.

It's _less_ applicable now that most users get pre-built binaries from package managers, and it just feels pretty antiquated overall. But yeah, the intention is that configure is platform-agnostic and prepares your source code tree to build on the current platform. Whereas autoconf/autoreconf is intended as a tool for the developer to make writing "configure" a lot easier.


I would say that best practice is to NOT commit the generated files, but DO include them in release tarballs. Because release tarballs are exactly what need to be portable to all your users.
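With automake this is the path of least resistance anyway: `make dist` rolls the generated configure into the tarball, so users building from a release never need autotools themselves (foo-1.0 below is a stand-in name):

```shell
# Maintainer side: regenerate the build system and roll a release tarball.
autoreconf -i
./configure
make dist            # produces foo-1.0.tar.gz, with ./configure inside

# User side: just a POSIX shell, a compiler and make -- no autotools.
tar xzf foo-1.0.tar.gz
cd foo-1.0 && ./configure && make
```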


> I keep running into CMake based things not being portable, and being broken. And pretty much every single time cmake fails, it does not have ANY log AT ALL of what it did, and why it thinks libfoo is not there.

Google "cmake --trace-expand"

Is that your main gripe with CMake?



