Every time I wanted to start something new I'd spend the first hour writing CMakeLists.txt, figuring out find_package, copying boilerplate from my last project, and googling why my library isn't linking. By the time the project was actually set up I'd lost all momentum.
So, I built Craft - a lightweight build and workflow tool for C and C++. Instead of writing CMake, your project configuration goes in a simple craft.toml:
[project]
name = "my_app"
version = "0.1.0"
language = "c"
c_standard = 99
[build]
type = "executable"
Run craft build and Craft generates the CMakeLists.txt automatically and builds your project. Want to add dependencies? That's just a simple command:
craft add --git https://github.com/raysan5/raylib --links raylib
craft add --path ../my_library
craft add sfml
Craft will clone the dependency, regenerate the CMake, and rebuild your project for you.
Other Craft features:
- craft init - adopt an existing C/C++ project into Craft or initialize an empty directory.
- craft template - save any project structure as a template to initialize later.
- craft gen - generate header and source files with starter boilerplate code.
- craft upgrade - keeps Craft itself up to date.
- CMakeLists.extra.cmake - an escape hatch for anything that Craft does not yet handle.
- Cross platform - macOS, Linux, Windows.
It is still early (I just got it to v1.0.0) but I am excited to be able to share it and keep improving it.
Would love feedback. Please also feel free to make pull requests if you want to help with development!

Discussion (166 comments)
- (1) Provide a way to compile without internet access and to specify dependency paths manually. This is absolutely critical.
Most 'serious' multi-language package managers and integration systems build in a sandbox without internet access, for security and reproducibility reasons.
If your build system does not allow building offline with manually specified dependencies, you will make the lives of integrators and package maintainers miserable, and they will avoid your project.
(2) Never ever build with '-O3 -march=native' by default. This is always a red flag and a sign of immaturity. People expect code to be portable and shippable.
Good default options would be the CMake equivalent of "RelWithDebInfo" (meaning: -O2 -g -DNDEBUG).
-O3 can be argued. -march=native is always a mistake.
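For a tool that generates CMake, the defaults the commenter describes could be sketched like this (a hypothetical fragment for generated output; the `MYAPP_NATIVE_ARCH` option name is illustrative, and the flag expansions are what CMake itself uses for these build types on GCC/Clang):

```cmake
# Default to RelWithDebInfo when the user did not choose a build type.
# On GCC/Clang this expands to: -O2 -g -DNDEBUG
if(NOT CMAKE_BUILD_TYPE AND NOT CMAKE_CONFIGURATION_TYPES)
  set(CMAKE_BUILD_TYPE RelWithDebInfo CACHE STRING "Build type" FORCE)
endif()

# -march=native stays strictly opt-in, never the default.
option(MYAPP_NATIVE_ARCH "Tune for the build machine (non-portable!)" OFF)
if(MYAPP_NATIVE_ARCH)
  add_compile_options(-march=native)
endif()
```

With this shape, a portable binary is what you get by doing nothing, and the non-portable fast path requires an explicit `-DMYAPP_NATIVE_ARCH=ON`.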
- (3) Allow your build tool to be built by another build tool (e.g. CMake).
Anybody caring about reproducibility will want to start from sources, not from a pre-compiled binary. This also matters for cross-compilation.
- (4) Please offer a compatibility with pkg-config (https://en.wikipedia.org/wiki/Pkg-config) and if possible CPS (https://cps-org.github.io/cps/overview.html) for both consumption and generation.
They are what will allow interoperability between your system and other build systems.
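The minimum for pkg-config interoperability is installing a `.pc` file alongside the library; a sketch with illustrative names (`mylib`, the paths, and the version are placeholders):

```
# mylib.pc - installed to <prefix>/lib/pkgconfig/
prefix=/usr/local
libdir=${prefix}/lib
includedir=${prefix}/include

Name: mylib
Description: Example library built by Craft
Version: 0.1.0
Cflags: -I${includedir}
Libs: -L${libdir} -lmylib
```

Any consumer - Make, CMake, Meson, or a shell script - can then recover the flags with `pkg-config --cflags --libs mylib`, without knowing anything about the build system that produced the library.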
- (5) Last but not least: seriously consider the cross-compilation use case.
It is common in the world of embedded systems to cross compile. Any build system that does not support cross-compilation will be de facto banned from the embedded domain.
I have an even stronger sentiment regarding cross compilation though - In any build system, I think the distinction between “cross” and “non-cross” compilation is an anti-pattern.
Always design build systems assuming cross compilation. It hurts nothing if it just so happens that your host and target platform/architecture end up being the same, and saves you everything down the line if you need to also build binaries for something else.
This is one of the huge wins of Zig. Any Zig host compiler can produce output for any supported target. Cross compiling becomes straightforward.
Also, the problem isn't creating a cargo-like tool for C and C++ - that is the easy part. The problem is getting a larger user base than vcpkg or conan, so that it matters to those communities.
Perhaps you can see how there are some assumptions baked into that statement.
Shipping anything built with -march=native is a horrible idea. Even on homogeneous targets like one of the clouds, you never know if they'll e.g. switch CPU vendors.
The correct thing to do is use microarch levels (e.g. x86-64-v2) or build fully generic if the target architecture doesn't have MA levels.
I am willing to hear arguments for other approaches.
However I'm not sure about -O3. I know it can make the binary larger, not sure about other downsides.
It is completely fine to use -march=native, just do not make it the default for someone building your project.
That should always be opt-in.
The main reason is that software is a composite of (many) components. It quickly becomes a maintainability nightmare if any tiny library somewhere tries to sneak in '-march=native': the final binary will randomly crash with an illegal-instruction error when executed on any CPU that is not exactly the same as the build host.
When you design a build system configuration, think of others first (the users of your software), and of yourself after.
IME -O3 should only be used if you have benchmarks that show -O3 actually produces a speedup for your specific codebase.
I fully concur with that whole post as someone who also maintained a C++ codebase used in production.
Gentoo user: hold my beer.
- skipping cmake completely? would this be feasible?
- integration of other languages in the project?
- how to handle qt?
Feasible but difficult. CMake has a tremendous user base, so you do want to be able to use a CMake-based project as a dependency. The CMake target/config export system exposes CMake internals, which makes it difficult to consume a CMake-built project without CMake.
The cleanest way to do that is probably what xmake is doing: calling cmake and extracting target information from CMake into your own build system with some scripting. It is flaky, but xmake has proven it is doable.
That said: CPS should make this easier over the longer term.
Please also consider that CMake does a lot of work under the hood to contain compiler quirks that you would otherwise have to handle manually.
> integration of other languages in the project?
Trying to integrate higher level languages (Python, JS) in package managers of lower level languages (C, C++) is generally a bad idea.
The dependency relation is inverted, and interoperability between package managers is always poor. Diamond dependencies and conflicting versions quickly become a problem.
I would advise just exposing your build system properly with the properties I described, and using a multi-language package manager (e.g. Nix) or, failing that, the higher-level language's package manager (e.g. uv with a scikit-build-core equivalent) on top of that.
This will be one order of magnitude easier to do.
> how to handle qt?
Qt is nothing special to handle.
Qt is a multi-language framework (C++, MOC, QML, JS, and even Python for PySide) and needs to be handled as such.
15000 what?
The 15000 was a typo on my side. Fixed.
https://github.com/xmake-io/xmake
The reason why I like it (beyond ease-of-use) is that it can spit out CMakeLists.txt and compile_commands.json for IDE/LSP integration and also supports installing Conan/vcpkg libraries or even Git repos.
As an example of what I mean, say I want to link against the FMOD library (or any library I legally can't redistribute as an SDK). Or I want to enable automatic detection on Windows, where I know the library/SDK is an installer package. My solution, in CMake, is to just ask the registry. In XMake I still can't figure out how to pull this off. I know that's pretty niche, but still.
The documentation gap is the biggest hurdle. A lot of the functions/ways of doing things are poorly documented, if they are documented at all - including a CMake library that isn't in any of the package managers, for example. It also has some weird quirks: automatic/magic scoping (which is NOT a bonus) along with a hacky "import" function instead of using native require.
All of this said, it does work well when it does work. Especially with modules.
e.g. from their docs:
Other than that, both "python layer" and "over the ninja builder" are technically wrong. "python layer" is off since there is now a second implementation, Muon [https://muon.build/], in C. "over the ninja builder" is off since it can also use Visual Studio's build capabilities on Windows.
Interestingly, I'm unaware of other build-related systems that have multiple implementations, except Make (which is in fact part of the POSIX.1 standard.) Curious to know if there are any others.
It's similar, but designed for an existing ecosystem. Cargo is designed for `cargo`, obviously.
But `pyproject.toml` is designed for the existing tools to all eventually adopt. (As well as new tools, of course.)
I'm sorry I have to be a downer, but the fact is if you can use the word "I" your package manager is obviously not powerful enough for the real world.
* Standards committee is allergic to standardizing anything outside of the language itself: build tools, dependency management, even the concept of a "file" is controversial!
* Existing poor state of build systems is viral - any new build system is 10x as complex as a clean room design because you have to deal with all the legacy "power" of previous build tooling. Build system flaws propagate - the moment you need hacks in your build, you start imposing those hacks on downstream users of your library also.
Even CMake should be a much better experience than it is - but in the real world major projects don't maintain their CMake builds to the point you can cleanly depend on them. Things like using raw MY_LIB_DIR variables instead of targets, hacky/broken feature detection flags etc. Microsoft tried to solve this problem via vcpkg, ended up having to patch builds of 90% of the packages to get it to work, and it's still a poor experience where half the builds are broken.
My opinion is that a new C/C++ build/package system is actually a solvable problem now with AI. Because you can point Opus 4.6 or whoever at the massive pile of open source dependencies, and tell it for each one "write a build config for this package using my new build system" which solves the gordian knot of the ecosystem problem.
The world is rich in complexity, subtlety, and exceptions to categorization. I don't think this should block us from solving problems.
your initial comment above focuses on the difficulty of the task, I didn't see mention of the need for collaboration to solve it.
maybe you had that in mind? the comment doesn't spell it out though...
If it's really bad, at least the easy 20%.
The problem is one of Tooling & Methodology. Not just tooling.
There is no divide between tools/methods - although there is often a massive conflict over methods, and thus tools, implying a relationship - but in fact both tooling and methodology need to be aligned or else poor tools are created.
There is much discussion about ‘the build problem’, but it is just as much an issue of solving ‘the method problem’.
For example, a vast majority of the problems of building software can be solved by using Docker (a method), or indeed by adding new methods for various situations - i.e. embedded: compiler-on-board vs. cross-compiling; games: -march, etc.
I think the “I and Your” part of your position is ineffective - better to think of it as ‘a tool I/others want to use’ and ‘a method I/others agree to use’, where: tools and methods are aligned in equilibrium for the duration of the project (in an ideal case).
It’s ineffective because of the difference in effect between ‘want’ and ‘agree’ on that equilibrium.
Some methods, a person may not necessarily want to use - but agree to do so, because there is a local tool to assist with the method. Some tools, people will violently refuse to use because they despise the implied method and will never agree to it; sometimes, people get fired for not aligning methodology - but also often for writing a tool nobody (or hardly ever anybody) ever uses/can use/wants to use. (There’s a whole world of software that only one guy knows how to wrangle because they ‘built a better build system’…)
Apropos the “Yet Another C/C++ Configuration/Combobulator” problem, I tend to think the ‘problem with build systems’ is due to the avid desire to just build a new tool to force a new method rather than adopting an existing method and thus using existing tools, and, therefore in my opinion, the fewer new tools in this department, the better… but that’s just me, I’ve been wrangling C/C++ codebases since they were new languages, and I’ve seen some shit, kids ..
The solution is _always_ to align your methods with your fellow user-developers first and then work together on the tooling, old or new… because all software is social, and has a social responsibility to the user, and user-developer tools built to construct software socially, even more so.
Not sure how big your plans are.
My thoughts would be to start as a cmake generator but to eventually replace it. Maybe optionally.
And to integrate support for existing package managers like vcpkg.
At the same time, I'd want to remain modular enough that it's not all or nothing. I also don't like lock-in.
But right now package management and build system are decoupled completely. And they are not like that in other ecosystems.
For example, CMake can use vcpkg to install a package, but then I still have to write more CMake to actually find and use it.
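Concretely, the "more cmake" after vcpkg has installed a package looks roughly like this (using vcpkg's fmt port as the example; the `my_app` target is illustrative):

```cmake
# Configure with the vcpkg toolchain, e.g.:
#   cmake -B build -DCMAKE_TOOLCHAIN_FILE=<vcpkg-root>/scripts/buildsystems/vcpkg.cmake
find_package(fmt CONFIG REQUIRED)          # vcpkg installed it, but you still wire it up
add_executable(my_app main.cpp)
target_link_libraries(my_app PRIVATE fmt::fmt)
```

The install step and the consume step are decoupled: vcpkg puts the package on disk, and you still have to know the right `find_package` name and imported target for every dependency.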
I have this solved at our company. We have a tool built on top of vcpkg, to manage internal + external dependencies. Our cmake linker logic leverages the port names and so all you really do is declare your manifest file (vcpkg.json) then declare which one of them you will export publicly.
Everything after that is automatic including the exported cmake config for your library.
And yet it will insist on only giving you binaries that match exactly. Thankfully there are experimental extensions that allow it to automatically fall back.
I wish there was a dead simple installer TUI that had a common API specification so that you could host your installer spec on your.domain.com/install.json - point this TUI at it and it would understand the fine grained permissions required, handle required binary signature validation, manifest/sbom validation, give the user freedom to customize where/how things were installed, etc.
This is now a build system generator generator. This is the wrong solution imho. The right solution is to just build a build system that doesn’t suck. Cmake sucks. Generating suck is the wrong angle imho.
That’s an existence proof that a new tool that doesn’t suck can take over an ecosystem.
Here's my feeble attempt using Deno as base (it's extremely opinionated though and mostly for personal use in my hobby projects):
https://github.com/floooh/fibs
One interesting chicken-egg-problem I couldn't solve is how to figure out the C/C++ toolchain that's going to be used without running cmake on a 'dummy project file' first. For some toolchain/IDE combos (most notably Xcode and VStudio) cmake's toolchain detection takes a lot of time unfortunately.
CMake is a combination of a warthog of a specification language and mechanisms for handling a zillion idiosyncrasies and corner cases of everything.
I doubt that < 10,000 lines of C code can cover much of that.
I am also doubtful that developers are able to express the exact relations and semantic nuances they want to, as opposed to some default that may make sense for many projects, but not all.
Still - if it helps people get started on simpler or more straightforward projects - that's neat :-)
I don't like it. Such a format is generally restricted (not Turing-complete), which doesn't allow doing anything non-trivial, for example choosing dependencies or compilation options based on non-trivial conditions. That's why CMake is basically a programming language with variables, conditions, loops, and even arithmetic.
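For instance, a platform-dependent choice like the following is trivial in CMake but has no natural encoding in a flat TOML table (the target, source files, and libraries here are illustrative):

```cmake
# Pick an audio backend based on the target platform - the kind of
# condition a declarative [dependencies] section cannot express directly.
if(WIN32)
  target_sources(my_app PRIVATE src/audio_wasapi.c)
  target_link_libraries(my_app PRIVATE winmm)
elseif(APPLE)
  target_sources(my_app PRIVATE src/audio_coreaudio.c)
  target_link_libraries(my_app PRIVATE "-framework AudioToolbox")
else()
  target_sources(my_app PRIVATE src/audio_alsa.c)
  target_link_libraries(my_app PRIVATE asound)
endif()
```

A declarative front-end either grows conditional syntax of its own or needs an escape hatch into real script for cases like this.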
In Rust, you have Cargo.toml, in go, it's a rather simple go.mod.
And even in embedded C, you have PlatformIO, which manages to make do with a few .ini files.
I would honestly love to see the cpp folks actually standardizing a proper build system and dependency manager.
Today, just building a simple Qt app is usually a daunting task, and other compiled ecosystems show us it doesn't have to be.
That's a nice experience as long as you stay within predefined, simple abstractions that somebody else provided. But it is very much a scripted build system, you just don't see it for trivial cases.
For customizations, let alone a new platform, you will end up writing python scripts, and digging through the 200 pages documentation when things go wrong.
Reverse engineering the Xcode and Visual Studio project file formats for each IDE version alone isn't fun, but this "boring" grunt work is what makes cmake so valuable.
The core ideas of cmake are sound, it's only the scripting language that sucks.
Build systems don't plan to converge in the future =)
If you're happy to bake one config in a makefile, then cmake will do very little for you.
You need to define a CMake toolchain[1] and pass it to CMake with --toolchain /path/to/file in the command-line, or in a preset file with the key `toolchainFile` in a CMake preset. I've compiled for QNX and ARM32 boards with CMake, no issues, but this needs to be done.
[1]: https://cmake.org/cmake/help/latest/manual/cmake-toolchains....
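A minimal toolchain file for an ARM32 Linux target might look like this (the compiler triple and sysroot path are illustrative; adjust for your cross toolchain):

```cmake
# arm-linux.toolchain.cmake - pass via: cmake --toolchain arm-linux.toolchain.cmake
set(CMAKE_SYSTEM_NAME Linux)
set(CMAKE_SYSTEM_PROCESSOR arm)

set(CMAKE_C_COMPILER   arm-linux-gnueabihf-gcc)
set(CMAKE_CXX_COMPILER arm-linux-gnueabihf-g++)

# Search headers and libraries only in the target sysroot, never on the host.
set(CMAKE_FIND_ROOT_PATH /opt/arm-sysroot)
set(CMAKE_FIND_ROOT_PATH_MODE_PROGRAM NEVER)
set(CMAKE_FIND_ROOT_PATH_MODE_LIBRARY ONLY)
set(CMAKE_FIND_ROOT_PATH_MODE_INCLUDE ONLY)
```

The `CMAKE_FIND_ROOT_PATH_MODE_*` settings are what keep `find_package`/`find_library` from accidentally picking up host libraries during a cross build.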
Cmake has a lot of warts, but they have also put a lot of effort into finding and fixing all those weird special cases. If your project uses CMake odds are high it will build anywhere.
Fighting the standard often creates its own set of problems and nightmares that just aren't worth it. Especially true in C++, where you often have to integrate with other projects and their build systems. Way easier if you just use cmake like everyone else.
Even the old holdouts, Boost and Google open source, now use cmake for their open-source stuff.
It signals that the speaker doesn't understand that the two are different languages with very different communities.
I don't really think that C users are entirely immune to dependency hell, if that's what OP meant, though. It is orthogonal.
As a user, I do believe it sucks when you depend on something that is not included by default on all target platforms(and you fail to include it and maintain it within your source tree*).
How does craft handle these 'diamond' patterns where 2 dependencies may depend on versions of the same library as transitive dependencies (either for static or dynamic linking or as header-only includes) without custom build scripts like the Conan approach?
What exactly is it you do/need that can't be reasonably solved using the FetchContent module?
https://cmake.org/cmake/help/latest/module/FetchContent.html
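For reference, a FetchContent-based pull of a Git dependency (using raylib, as in the original post; the tag shown is one of raylib's release tags, pinned here for illustration) looks roughly like:

```cmake
include(FetchContent)

FetchContent_Declare(
  raylib
  GIT_REPOSITORY https://github.com/raysan5/raylib
  GIT_TAG        5.0   # pin a release for reproducible builds
)
FetchContent_MakeAvailable(raylib)

add_executable(my_app main.c)
target_link_libraries(my_app PRIVATE raylib)
```

This covers the clone-and-wire-up workflow without an external tool, though it builds the dependency from source inside your tree and offers no offline caching or version resolution on its own.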
What I've been doing to manage dependencies in a way that doesn't depress me much has been Nix flakes, which allows me a pretty straightforward `nix build` with the correct dependencies built in.
I'm just a bit curious though; a lot of C libraries are system-wide, and usually require the system package manager (e.g. libsdl2-dev) does this have an elegant way to handle those?
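The Nix-flake approach mentioned above can be sketched as follows - a minimal flake.nix for a C project pulling SDL2 from nixpkgs instead of relying on the system's libsdl2-dev (the project name, nixpkgs pin, and the hard-coded x86_64-linux system are all assumptions for the sake of a short example):

```nix
{
  inputs.nixpkgs.url = "github:NixOS/nixpkgs/nixos-24.05";

  outputs = { self, nixpkgs }:
    let pkgs = nixpkgs.legacyPackages.x86_64-linux; in {
      packages.x86_64-linux.default = pkgs.stdenv.mkDerivation {
        pname = "my_app";
        version = "0.1.0";
        src = self;
        nativeBuildInputs = [ pkgs.pkg-config ];
        buildInputs = [ pkgs.SDL2 ];   # replaces the apt-installed libsdl2-dev
      };
    };
}
```

With this, `nix build` fetches the pinned dependencies itself, so system-wide packages never enter the picture.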
A C++ build system, at its core, boils down to calling gcc foo.c -o foo.obj / link foo.obj foo.exe (please forgive me if I got the syntax wrong).
Sure, you have more .c files, and you pass some flags but that's the core.
I've recently started a new C++ program from scratch.
What build system did I write?
I didn't. I told Claude:
"Write a bun typescript script build.ts that compiles the .cpp files with cl and creates foo.exe. Create release and debug builds, trigger release build with -release cmd-line flag".
And it did it in minutes and it worked. And I can expand it with similar instructions. I can ask for release build with all the sanitize flags and claude will add it.
The particulars don't matter. I could have asked for a makefile, or cmake file or ninja or a script written in python or in ruby or in Go or in rust. I just like using bun for scripting.
The point is that in the past I tried to learn cmake and, good lord, it's days spent learning something that I'll spend one hour using.
It just doesn't make sense to learn any of those tools when Claude can give me a working build in any build system in minutes.
It makes even less sense to create new build tools. Even if you create the most amazing tool, I would still choose spending a minute asking claude than spending days learning arbitrary syntax of a new tool.
https://cmkr.build/
But how this tool figures out where the header files and build instructions for the libraries are that are included? Any expected layout or industry wide consensus?
https://github.com/randerson112/craft/blob/main/CMakeLists.t...
...and for custom requirements a manually created CMakeLists.extras.txt as escape hatch.
Unclear to me how more interesting scenarios like compiler- and platform-specific build options (enable/disable warnings, defines, etc...), cross-compilation via cmake toolchain files (e.g. via Emscripten SDK, WASI SDK or Android SDK/NDK) would be handled. E.g. just trivial things like "when compiling for Emscripten, include these source files, but not those others".
Yes, config packages are better. But I think doing find_package everywhere is better. Assuming you install an SDK for others to use your project. If you're a "product", vendor away. The issue comes when you want to vendor X and Y and both vendor Z independently. Then you're stuck de-vendoring at least one and figuring out how to provide it yourself internally. IMO, better to just let Z make its own install tree and find it as a package from there.
One can write good Find modules, but there is some "taste" involved. I wish we had more good examples to use as templates.
Another comment suggested batch files - these do zero out of three.
Once you appreciate the vastness of the problem, you will see that having a vibrant ecosystem of different competing package managers sucks. This is a problem where ONE standard that can handle every situation is incalculably better than many different solutions which solve only slices of the problem. I don't care how terse craft's toml file is - if it can't cross compile, it's useless to me. So my project can never use your tool, which implies other projects will have the same problem, which implies you're not the one package manager / build system, which means you're part of the problem, not the solution. The Right Thing is to adopt one unilateral standard for all projects. If you're remotely interested in working on package managers, the best way to help the human race is to fix all of the outstanding things about Conan that prevent it from being the One Thing. It's the closest to being the One Thing, and yet there are still many hanging chads:
- its terribly written documentation
- its incomplete support for editable packages
- its only nascent support for "workspaces"
- its lack of NVIDIA recipes
If you really can't stand to work on Conan (I wouldn't blame you), another effort that could help is the common package specification format (CPS). Making that a thing would also be a huge improvement. In fact, if it succeeds, then you'd be free to compete with conan's "frontend" ergonomics without having to compete with the ecosystem.
Is it though?
When I read the tutorial: https://docs.conan.io/2/tutorial/consuming_packages/build_si...
It says to hand write a `CMakeLists.txt` file. This is before it has me create a `conanfile.txt` even.
I have the same complaint about vcpkg.
It seems like it takes: `(conan | vcpkg) + (cmake | autotools) + (ninja | make)` to do the basics what cargo does.