In the recent discussion of package handling I realized that I (and others) may have misunderstood the capabilities of nimble. I'm trying to make up my mind about how this could be avoided. In my opinion, presenting nimble from a different perspective could help.
Before trying Nim I spent a few months with Rust, and I think it is interesting to compare nimble to Rust's cargo. In Rust, one of the first things the tutorial teaches you is that you create your projects by calling cargo new. This creates a cargo.toml for you, with the basic required content. You then write some source code and use either cargo build (to just build) or cargo run (to build and run) your project. You want to run all the unit tests of your project? Call cargo test. If your project requires some lib, you just add a dependency statement to your cargo.toml. You do not have to manually install anything; everything happens on the next cargo run/build/test. And these patterns always work in Rust: you can pull anything from GitHub and all you have to do is cargo run/build/test. Imho cargo is a very strong plus for Rust. If you are interested, here is a nice overview: http://doc.crates.io/guide.html
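To make this concrete, the basic workflow looks roughly like this (commands abbreviated from the guide linked above; the rand dependency is just an example):

    $ cargo new hello_world --bin   # creates the project and its cargo.toml
    $ cd hello_world
    $ cargo build                   # fetches dependencies and builds
    $ cargo run                     # builds if necessary, then runs
    $ cargo test                    # builds and runs the unit tests

And adding a dependency is a single line in the cargo.toml:

    [dependencies]
    rand = "0.3"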
For me, the surprising thing about nimble is that it actually offers similar functionality (nimble build), but it took me almost two months to find out. I think the main problem is that nimble is advertised as a package manager and not as the powerful build tool it is. As a result, it offers a low-level interface compared to the high-level interface of cargo. The suggested workflow for nimble is mainly that you use nimble to install or uninstall packages. My immediate thought regarding this workflow (and presumably not only mine) was: I do not want to install packages globally. My projects will depend on individual versions of certain packages, so I want to handle dependencies at the project level. It took me a while to see that this actually is not a problem. But why would I want to run install manually at all? If the idea is that dependencies are stored in the Requires field (see the example file below), then I do not need (or want) to communicate with nimble at the "install this and that package version" level. As a user, I want to tell nimble to "build" (or maybe run/test) and have it do its thing (installing if necessary). Another example: I do not want to tell nimble to "update" the package list manually. From a user perspective this is a low-level instruction; it should just be performed when necessary (e.g. when a package is not found). Overall I think nimble and cargo are not too different in functionality. I would only prefer to establish and advertise a similar high-level interface. Currently, nimble's strongest feature (nimble build) is even discouraged in the documentation.
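For reference, this is roughly what such a declarative '.nimble' file looks like (all values made up; check the nimble documentation for the exact format):

    [Package]
    name        = "foobar"
    version     = "0.1.0"
    author      = "Some Author"
    description = "An example package."
    license     = "MIT"

    [Deps]
    Requires: "nim >= 0.10.0"

Given such a file, nimble build can resolve and fetch the dependencies on its own, which is exactly the high-level workflow I am arguing for.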
What do you guys think?
I agree with you completely. I have actually recently tried Cargo myself and was surprised by just how similar it is to Nimble.
I suppose the workflow needs to shift and nimble needs to become a requirement when developing software in Nim. This is rather difficult though; personally (even though I wrote Nimble), I find the idea of using Nimble exclusively strange. This may be due to the fact that I have been using the compiler directly for such a long time. But now that you can compile any of your project's files the same way you would with the compiler (via nimble c), it is perhaps time to break our habits.
IMHO a Turing-complete build system is overkill and dangerous. A build system should be declarative: I want this built this way, period. Furthermore, a build system should interact with IDEs in some way, and by making it imperative it becomes harder for an IDE to know what the build system will produce.
+1 @Jehan Since Nim has good tools for automatically wrapping C/C++ libraries, an automatic build task for those libraries could be a nice addition.
+1 @all Having a common way of creating a project and building it is a must-have for a modern language. Otherwise the only way we can improve the situation is through an IDE with Nimble integration (a feature that I would like to develop).
@Araq I hope my comment didn't sound rude, because I didn't mean it to. That said, I cannot speak regarding SCons; however, I don't know of any good IDE with CMake integration. I use CMake daily with both QtCreator and Visual Studio, and I have also tested it with CLion. To be honest, none of them has good integration, and I'm saying "good"; let's forget "perfect". I don't know your experience, but having to invoke cmake every time something changes, or to recreate Visual Studio solutions, or to edit the CMakeLists.txt file just to add a file, is what I call bad integration. CLion is somewhat better, but not perfect, and cannot handle subprojects or lots of imperative code inside CMake. Furthermore, CMake shouldn't be held up as a nice or pleasant build system.
Talking about declarative syntax: I have used most of the languages/tools/technologies that you listed and never found them that weak, but maybe I just didn't reach your level of usage.
What I'm noticing (but it's just my view) is that newer build systems try to be less imperative and more declarative. Take a look, for example, at the Qt Build System.
What I mean is that there may be imperative parts inside the build system (some kind of "modules"), but from the outside they are used declaratively. So a module for building C++ files has imperative logic, but its options are plugged in declaratively. To me, clients of the build system should only use declarative syntax, while plugins or extensions of the build system could use either declarative or imperative syntax; see the sketch below. In this way complexity is moved into the backends.
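A tiny Nim sketch of what I mean (all names are hypothetical, just to illustrate the split): the client only supplies data, while the imperative logic is hidden inside the module:

    import os, strutils

    type
      CppOptions = object    # the declarative surface: plain data
        sources: seq[string]
        flags: seq[string]

    proc buildCpp(opts: CppOptions) =
      # the imperative backend: loops, shell commands, error handling
      for src in opts.sources:
        let cmd = "g++ -c $1 $2" % [opts.flags.join(" "), src]
        echo "running: ", cmd
        if execShellCmd(cmd) != 0:
          quit("compilation of " & src & " failed")

    # what the client of the build system writes: options plugged in
    # declaratively, nothing more
    buildCpp(CppOptions(sources: @["main.cpp", "util.cpp"],
                        flags: @["-O2", "-Wall"]))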
Lastly, I would like not to have to write scripts for building a project. If I dream of working with a hypothetical build system, I see myself writing some kind of options and a list of files. Nothing more... but hey, maybe it's just me :)
@filcuc In general, my experience with declarative systems is that they work quite well... until you need to do something that they don't support. That's when you need some way to extend them, usually using procedural/imperative code.
That Qt Build System is oddly reminiscent of WiX or Buildbot (though at least Buildbot allows mixing in procedural hooks).
@Arrrrrrrrr: I wasn't aware of nake, thanks for the info, very interesting! And yes, what I'm looking for here is pretty much a unification of nake and nimble.
Some nitpicking: what I'm missing in nake is the declarative feel, since you just manually invoke "nim c". That is maybe a bit too low-level/imperative to integrate e.g. dependency handling (you probably do not want to assemble the nimble search path on your own). A certain layer of abstraction would also allow decoupling the behavior of the compiler from the build files. For instance, Araq considered changing the behavior of -o to make it relative to the build path instead of the source path. In nimnimble I tried to make an abstraction for this, i.e., when the -o behavior changes, the build files do not have to change. Also, I think it would be good to standardize tasks like in SBT or cargo.
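For readers who have not used it, a nakefile is essentially this (written from memory of nake's README, so treat the exact API as an assumption); note how the nim c invocation is spelled out by hand:

    import nake

    task "build", "compile the main module":
      direShell "nim", "c", "-o:bin/app", "src/main.nim"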
I'm still not sold on the idea of an imperative .nimble file.
I am always open to suggestions for new ways to improve Nimble, so please continue discussing anything that you think may have merit. My thoughts however are that imperative .nimble files could create the following disadvantages:

- Bugs in the .nimble files leading to the need to debug a package's .nimble file.
- Security concerns: an imperative .nimble file could potentially erase your hard drive.
- Backwards compatibility: there are 170+ packages using the current declarative .nimble file format.
I would be more open to the idea of a build script as an addition to the current .nimble files that we have. The number of packages which will actually need it should be minimal anyway.
dom96: I'm still not sold on the idea of an imperative .nimble file.
That's why I suggested having a build = ... entry analogous to cargo, where you can use the build tool of your choice if necessary and don't have to care about it otherwise. I'm not really keen on familiarizing myself with yet another build tool, and the ones that exist do the job just fine. Nim as an option is okay if you want to minimize dependencies, but you can already use Nim with nim c -r; it's more a question of providing suitable libraries (e.g. logic that knows how to build a dynamically loadable library from C code on arbitrary architectures, or how to figure out #include dependencies).
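For comparison, cargo's build = "build.rs" entry in the [package] section of cargo.toml points to an arbitrary build script. A nimble analogue (purely hypothetical syntax) could be as small as:

    [Package]
    name    = "native_wrapper"
    version = "0.1.0"
    build   = "build.nim"    # hypothetical: run this script before building

Packages without special build needs would simply omit the entry and keep the purely declarative format.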
I'm less worried about the security of nimble; you're going to install executables or libraries that can already do pretty much whatever they do on your computer. Much more important would be the ability to sign packages cryptographically.
dom96: Bugs in the .nimble files leading to the need to debug a package's .nimble file.
In my opinion that is exactly the advantage of "using Nim to build Nim". Users can potentially produce bugs with any syntax. Currently nimble has to parse the '.nimble' syntax manually, and its error reporting is very basic. Obviously the Nim compiler is much more sophisticated when it comes to reporting syntax errors. Also, Nim developers are familiar with the syntax errors the compiler produces, whereas we are not aware of the error messages from nimble. I tried a few possible syntax errors with both nimble and nimnimble, and for me the compiler output was just much easier to understand. Furthermore, Nim's metaprogramming features like ``when declared`` are imho perfect for doing the validation and can produce easy-to-understand, DSL-specific errors, e.g. for missing build information.
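As a sketch of the idea (a hypothetical Nim-based build DSL, not an existing nimble feature): the DSL module can check at compile time that the build file defined everything it needs and emit a readable, domain-specific error otherwise:

    # imagine these consts come from a user's Nim-based build file:
    const
      name = "foobar"
      version = "0.1.0"

    # ...and these checks live in the build-DSL module; if the build file
    # forgot e.g. `version`, the user gets a clear DSL-specific message
    # instead of a generic parse error:
    when not declared(name):
      {.error: "build file error: the package 'name' is missing".}
    when not declared(version):
      {.error: "build file error: the package 'version' is missing".}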
dom96: Security concerns: an imperative .nimble file could potentially erase your hard drive.
I fear this will always be a problem. Isn't it already possible today to run malicious code at compile time in a ``static`` block? Even if we could avoid all dangers at build time, malicious code could eventually be run at runtime.
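For example, this already executes an arbitrary shell command while the module is being compiled (ls is Unix-specific; substitute dir on Windows), so a declarative '.nimble' file alone does not close that hole:

    static:
      # runs at compile time, not at runtime
      echo staticExec("ls")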
dom96: Backwards compatibility: there are 170+ packages using the current declarative .nimble file format.
I'm fully with you on this point: backwards compatibility would definitely be very nice to have. However, I see many things that the current format does not offer, and I'm just not sure if it is possible to add them without breaking anything at all. Even if we just add an optional build= entry, the result would be pretty messy: the build logic would be split into three parts, namely a '.nimble' file, a '.cfg' file for compiler options, and (optionally) an additional build script. In my opinion having three files is ugly, considering that in other build systems there is just one place to store everything. Also, some things might even be worth deprecating just for the sake of beauty (e.g. putting all deps in one string). I guess as soon as the solution provided by nimble is not convenient (or not flexible) enough, users will not adopt it or will start to use other solutions. Some will write their own shell build scripts, others will use nake or yet something else. I think we already have many '.nimble' files which exist purely because they are required for publishing a package but not for actual building, since nimble does not offer a certain feature (e.g. separating bin from src). In the end, building in Nim could become fragmented. Like I said: I think it is a huge plus that I can build anything in Scala/Rust by just running sbt compile/cargo build and never have to check specific build instructions. So from my point of view a breaking change might be worth it in this case (btw: the Rust team deprecated the predecessor of cargo in a similar situation). But in the end it is entirely up to you :), and I really hope my suggestions do not come off as rude.
bluenote: Like I said: I think it is a huge plus that I can build anything in Scala/Rust by just running ``sbt compile/cargo build`` and never have to check specific build instructions.
Eh, Rust may be one story, but one of the problems with the JVM ecosystem is that build tools are so fragmented. No, you won't be able to build everything in Scala with sbt compile, and you may not want to. That's why zinc is still a thing and everybody has their own build tool. You'll see people using Maven instead, Twitter using Pants (which they developed themselves), and still others using Gradle or Buildr. And part of the reason why it turned out that way is the deep integration of package management and the actual build process for JVM languages. (Granted, there were also other problems, such as the XMLified mess that Maven is.)
The current crop of build tools likes to try and do it all, i.e. solve all of the following four problems in one integrated package:
That can be nice and convenient when it works and that's all you need. It's not nice and convenient if expectations break down. E.g. if you have only restricted internet access, are dealing with a multi-language setup, your use of external packages needs to be filtered through corporate auditing rules, etc.
I do see pros and cons with both declarative and imperative, but at the end of the day, I must say I would prefer imperative more. I wouldn't mind if it goes through a compilation stage first, until it can be handled by the VM fully.
The security concern would be the same as always, but you don't run make with sudo privileges. And of course it can wipe your home dir anyway (I assume both of these hold true on Windows/Mac/etc. too, with different wording), but as always it comes down to trust: you don't run sudo make install on just anything. (Where "make" translates to "nimble"...)
Regarding the whole concept, I do think it might be a good shift. Making all the dependency solving etc. less of a concern, and producing great apps the main focus, is all good with me. Apart from C++ (which of course sucks bollocks in this department), the only package managers/build helpers I've worked with are those of io.js: "npm" and "node-gyp" (for building C++ extensions, which is the closest thing ever to a C package manager), and that's a dream compared to not having anything at all.
So I'll be very happy to see what it turns into, and no matter what, I'll be happier with it than with C++ :-)
And of course it can wipe your home dir anyway
This is off-topic, but it cannot be stressed enough. IMHO the definition of "security" for operating systems is completely broken, and so most concerns about "security" are downright ridiculous.