Lately package management comes up in all corners of the community.
What I personally want from a package manager:
It must install packages, and install the newest, freshest version. So when I type pk install mypkg
I must be able to just do:
import mypkg
nim c -r myscript.nim
It must be able to install a package from a local directory (aka dev mode)
When I list the packages it must tell me:
mypkg version [devpath: path]
It must be super clear where a package came from: where it's loaded from, why it's loaded from that location, etc.
It must be fast and it must inform the user what it's doing. Currently, for example, Nimble just "hangs" and does something that takes a long time, and I have no idea what it does (and this annoys me).
And it must stay out of my way when hacking or writing simple throwaway scripts (which I do very often): when a package was installed with the package manager, it must be found by the compiler!
So this MUST work:
nimble install mypkg
# myscript.nim
import mypkg
nim c -r /some/folder/myscript.nim
AND uninstalling a package must actually uninstall it. The current situation with Nimble: I have ~3-5 different versions of a lib, some ancient, in the pkg or pkg2 folder. Then I compile stuff and it chooses the wrong installed lib, or who knows what it does. Then uninstalling does not work, since some other libs depend on it. Then I get angry and try to mess with Nimble internals. This is not good.
Little (packaging/dependency) problems like this initially drove me away from python years ago.
And nowadays every time I need to interact with Nim's packaging system I get a bad gut feeling, which is not good. And it has gotten a little worse over the years.
I think the tooling we have is not bad per se; I think some small polishing could do wonders here:
- Tune the package manager for the most common case first!
- Most people do not need tagged/locked dependencies, or such things.
- They write simple scripts with simple modules and this use case must work -> flawlessly <- .
- They put their modules in a project folder, and want to include stuff from their project folder, while developing it. This use case must work -> flawlessly <- .
- Just look at Nimble's GitHub issues front page, and you see the problem: "...Nimble develop...", "...nimble get stuck...", "develop", "not installing globally", again "develop", etc.
- Make it possible to easily use advanced mechanisms, like tagging etc.
- But keep those things for advanced users.
Uh oh ... now you got me started about package management. As somebody who tries to keep multiple packages maintained, everything that is more than "I pushed to master" is a liability. I love Nimby's ideas here; with some refinement it would nail package management. It's the one "I know it when I see it" solution I've been looking for for quite some time.
I think the system can scale to much larger ecosystems once it grows an override for HEAD. Here's the key insight: when packages have conflicting requirements for the same dependency, use the dependency chain's depth to resolve conflicts. A package could demand a specific commit of a dependency, and when these requirements conflict, prefer the requirement from the dependency that is earlier (closer to the root) in the dependency chain. No SAT solver is required and arguably the resulting system is much easier to understand:
The rule is: proximity to root in the dependency graph determines priority, independent of declaration order.
YourApp (depth 0)
└── A (depth 1) → wants D@commit-x
    └── B (depth 2) → wants D@commit-y
        └── C (depth 3)
A's requirement for D wins over B's, because A is closer to the root. A chose to depend on B, so A's constraints take precedence over B's.
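As a sketch of how the depth rule could be implemented (the `Requirement` type, its field names, and `pickWinners` are made up for illustration, not part of any actual package manager):

```nim
import std/tables

type Requirement = object
  dep: string      # name of the dependency, e.g. "D"
  commit: string   # requested commit
  depth: int       # distance from the root package

proc pickWinners(reqs: seq[Requirement]): Table[string, string] =
  ## For every dependency, keep the requirement with the smallest depth,
  ## i.e. the one declared closest to the root.
  var best = initTable[string, Requirement]()
  for r in reqs:
    if r.dep notin best or r.depth < best[r.dep].depth:
      best[r.dep] = r
  for name, r in best:
    result[name] = r.commit

let reqs = @[
  Requirement(dep: "D", commit: "commit-x", depth: 1),  # A wants D@commit-x
  Requirement(dep: "D", commit: "commit-y", depth: 2)]  # B wants D@commit-y
doAssert pickWinners(reqs)["D"] == "commit-x"  # A is closer to the root, so A wins
```

A linear scan per dependency is all the "resolution" there is; no constraint solving happens at any point.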
You can also directly list an otherwise indirect dependency in your dependency file to override what would otherwise be chosen by the depth rule. This lets you pin a specific commit (or version) in place of whatever a deeper dependency's requirement would have selected. Instead of commits, versions can be used too, as they are easier on the eyes, but these get resolved to commits much like Atlas/Nimble do today.
No special [overrides] section. No resolutions field. No patch mechanism. Just: list it yourself if you care.
This also means lock files are just "every transitive dep promoted to depth 0" - a snapshot of the fully resolved graph as direct dependencies.
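For example, a hypothetical project file and its lock could look like this (assuming Nimble's `requires` syntax with `#commit` pinning; all names are invented):

```nim
# app.nimble – only direct dependencies are listed; D is pulled in
# transitively via A and B and chosen by the depth rule.
requires "A", "B"

# app.lock (hypothetical) – the fully resolved graph written out as
# direct dependencies: every transitive dep promoted to depth 0 and
# pinned to the commit the resolver chose.
requires "A#commit-a", "B#commit-b", "D#commit-x"
```

Since a depth-0 entry always wins, the lock file needs no mechanism beyond the ordinary dependency list.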
The dependency depth can also be used to create a "natural" directory structure that keeps things inspectable:
workspace/
  project A                        # depth 0: project you work on directly
  project B                        # depth 0: project you work on indirectly
  deps/
    deps of project A              # depth 1: direct dependencies of project A
    deps of project B              # depth 1: direct dependencies of project B
    deps/
      deps of deps of project A/B  # depth 2: transitive dependencies of project A/B
Want to move a project into the workspace/ directory directly? List it as an explicit dependency in your .nimble file, which promotes it to depth 0.
The algorithm for the initial clone and an update is identical: a breadth-first traversal (processing dependencies level by level) of the dependency graph:
import std/sets  # for HashSet

proc resolve(root: string) =
  var seen = initHashSet[string]()
  var queue = @[(loadNimble(root), 0)]
  var i = 0
  while i < queue.len:
    let (pkg, d) = queue[i]   # walk the queue front to back: breadth-first
    inc i
    for dep in pkg.dependencies:
      if not seen.containsOrIncl(dep.name):
        pullOrClone(dep.name, dep.url, dep.commit, d + 1)
        queue.add((dep, d + 1))
It's always a good sign if the design simplicity translates into a simple algorithm.
This way dependencies can easily use multiple versions at the same time.
Can you elaborate? Suppose App -> (A, B), A -> C(v1), B -> C(v2). If C isn't a direct dependency of App (so A and B don't interact via common types from C), I guess it is safe to have two versions of C at the same time. But how will this work with path resolution during imports in A and B? C(v1) and C(v2) still share the same name.
Well C's HEAD is used or a specific commit according to the rules that I outlined. Regardless of what is checked out, let's assume C made a breaking change so it ended up with this directory layout:
src/
  foobar.nim
  v2/foobar.nim
foobar.nim then contains:
{.deprecated: "use v2/foobar instead".}
import v2 / foobar
var globalState = setupState()
proc oldApi() = api(globalState)
And things keep working. The versioning is now in the Nim code, it's not in the git history.
But everything I described here is entirely optional, separate from the rest of my proposal! You don't have to like this part! ;-)
The rule is: proximity to root in the dependency graph determines priority, independent of declaration order.
Unfortunately that fails pretty quickly. It's pretty common to require the same dep at the same level:
YourApp (depth 0)
├── A (depth 1) → wants D < 2.7
└── B (depth 1) → wants D > 3.0
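To make the objection concrete: with both requirements at depth 1, a pure minimum-depth rule yields a tie, and something else (an explicit depth-0 entry, declaration order, or an error) has to break it. A minimal sketch, with an invented `Req` type:

```nim
# Hypothetical illustration: the depth rule alone cannot break a tie.
type Req = object
  source: string       # which package declared the requirement
  constraint: string   # the requested version range
  depth: int           # distance from the root package

let reqs = @[
  Req(source: "A", constraint: "D < 2.7", depth: 1),
  Req(source: "B", constraint: "D > 3.0", depth: 1)]

# Find the minimum depth among the conflicting requirements.
var minDepth = high(int)
for r in reqs:
  if r.depth < minDepth: minDepth = r.depth

# Count how many requirements sit at that minimum depth. If more than
# one does, depth gives no winner and a fallback is needed.
var atMin = 0
for r in reqs:
  if r.depth == minDepth: inc atMin
doAssert atMin == 2  # both A and B are at depth 1: a tie
```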
No SAT solver is required and arguably the resulting system is much easier to understand:
The general consensus in other languages has been to move towards tools with SAT. For example uv is super popular in Python now. Cargo uses it now and generally Cargo works very well.
I've used uv some lately and it's way better than previous Python tools which didn't use SAT. It seems, after decades of saying they didn't need deterministic SAT solvers, the Python world has moved fully onto uv. Unlike pip, which used greedy resolution with fallbacks and backtracking, which isn't too different from the proposed algorithm. It's simple, but it breaks just like Nimble did.
What would be nice for Atlas (and Nimble) IMHO would be to implement or use PubGrub, which is based on a paper and produces clear error messages from its solver.
Unfortunately the SAT solvers in both Atlas and Nimble don't give good failure messages and just spin.
Outside of specific conflicts or requirements, SAT just selects the latest versions of packages, so eh?
What I personally want from a package manager:
Matches what I generally want, and Atlas has been working well for me. Just be sure to install the latest Atlas! It's cool to see other ideas and PMs, but they all seem built for a specific author's use case.
It must install packages, and install the newest freshest. So when I type pk install mypkg
Yep, it'll use the latest. Unless there are constraints, which are pretty rare with Nimble packages. Only Status deps are complex enough to really run into that.
I must be able to just do:
import mypkg
nim c -r myscript.nim
Exactly! That's why I don't use Nimble anymore. Unfortunately it's hard to predict where packages will come from. The newer Nimble releases are much better, yet somehow I still seem to get weird packages.
Atlas sets the nim.cfg and I run my nim c ... commands. I also put tasks in my config.nims so I can do nim test or whatnot.
It must be able to install a package from a local directory (aka dev mode)
Atlas supports "linking", e.g. atlas link ../mydeps/. It creates a deps/mydeps.nimble-link file which shows where it's coming from.
When I list the packages it must tell me: mypkg version [devpath: path]
Atlas does show the version, but not the location, though it'd be easy to add. However, you can cat nim.cfg to see the locations. This works with newer Nimble via nimble.paths.
It must be super clear where a package came from: where it's loaded from, why it's loaded from that location, etc.
That's a bit lacking; see my previous comment. Generally Atlas shows which version it selected among the nearest options, and whether it selected HEAD, using ^ next to the version and git hash.
However something like the PubGrub algorithm would be super awesome for those (rare) times conflicts happen.
IMHO, Atlas does need a bit more polish on things like deleting repos. There's currently no command for that; instead I just delete the deps/ folder or the package in deps/foo. The documentation could use a bit of polishing too. It'd be nice to have an atlas install --update to update and install in one go.
Unfortunately that fails pretty quickly. It's pretty common to require the same dep at the same level: ...
Yeah, well, I don't care about the complex solutions that make semver somewhat work when semver itself is annoying crap.
work when semver itself is annoying crap:
Who's talking about SemVer? :P
Most projects tend to use just a form of ZeroVer. I'd agree SemVer is hard and rarely done well. We all know that Knuth's TeX versioning scheme is approaching infinite perfection.
Numeric versions are a handy human friendly way to communicate progress that indicates some effort of verification / testing on the package devs side. They're simple and monotonic so it's easy for devs to see "version 50" > "version 33".
For example I recently updated my ChatGPT Codex in my package manager. It was easy to see that my version of 0.40 was pretty far behind the current 0.60 release.
Yeah, well, I don't care about the complex solutions that make semver somewhat
Irrespective of SemVer or even a numeric scheme, some deps need > #abc123 for some feature while others require < #abc222, where #abc123 is before #abc222. BTW, supporting > #abc111 would be nice in Atlas.
Three numbers are not enough information to convey the complexity of the software, so you're better off reading changelogs, trying an update and running the test suite. And when you do that (and you have to do that) you might as well ignore the version numbers.
Sure, numbers are a lossy mechanism, but good enough for many cases. Besides, who gets excited about trying out Nim ab00c56904e3126ad826bb520d243513a139436a! :P
To be honest, I fully agree with the way nimby manages packages, i.e. pulling the repo and using git as the version picker (just checkout tag X). That being said, I feel the most reasonable versioning scheme for Nim is calendar (25.12 or 2025.10) because it effectively just says "this was last updated in this year.month", so it adds information without adding complexity.
Even if a library is "finished" the language isn't, and this gives you a good idea of how compatible the library might be ("Oh, this predates Nim 2.2"). Obviously saying "You shouldn't want to use anything other than calendar ver" is extremely forceful, but I do think it would be better to have it as the default of the package manager over, say, SemVer, which in practice becomes ZeroVer.
You never truly know that requires P >= 12 holds, as it's an open set; all you know is "it worked for version 12". So the choice is between "this one version I tested" and HEAD, which keeps everything neat. If there is a conflict, try HEAD; it's at least as reasonable as any other choice.
Also, you can run a more elaborate algorithm (SAT) to detect "works with version 12, 13 but not with 14" and encode that in your .nimble file directly due to the depth-based prioritization. But there is no reason to run a SAT solver every time when there is a conflict, it's based on the packages in question. This can be left to an external SAT tool, it doesn't have to burden every package author with its complexity! (Though in practice making it an opt-in switch for the package manager is more convenient than a separate tool.)
"Half of publishing is maintanance"
If you publish, it comes with the obligation / reponsibility to maintain. I get the feeling that part of discussion on versions and packaging is about moving that resposibility to somewhere else. But in the end you can't.
Quality is a lot about user expectation. I, as a user, expect, within a reasonable amount of time, that your library works with the latest version of Nim and the libraries you used to build upon. That also implicates when I update to a new version, the old one goes away. When there are incompatible changes they should be clearly announce upon installation. It's up to me to fix my stuff so it works with updated building blocks.
But not just publishing. I had about 60 projects in python, all in venvs. In the end this requires script upon script just to keep everything up to date. Let alone testing if every thing keeps working. A quick analysis showed that some packages where in more than 50% of the venvs. End of venvs, back to global. A lot less strain on updating stuff.
I always wondered why not use a local git, or fossil, as the location of libraries to use. Why duplicate the same thing(s) numerous times in projects? I only very recently learned about nimby and had high hopes, but it copies (duplication) instead of using a local VCS. Version control is what a VCS is for, not a package tool?
"Half of publishing is maintanance"
Yeah, so let's make maintanance much easier.
I always wondered why not use a local git, or fossil, as the location of libraries to use. Why duplicate the same thing(s) numerous times in projects? I only very recently learned about nimby and had high hopes, but it copies (duplication) instead of using a local VCS. Version control is what a VCS is for, not a package tool?
I feel this creates extra external dependencies. Now I MUST have git, and I MUST have fossil, and I MUST have svn, etc. So Nim and the upmw (ultimate package manager or whatever) become dependent on a huge number of VCSes. jj is a new VCS; should it be supported? Maybe, because it is compatible with git, but are there use cases where it makes sense, etc.?
A good reason to use versioning is that you can use either curl, or better wget (fewer deps), to fetch the tarball and then untar it, so you don't have to worry about bloating your system with multiple copies of the same .git dir, etc.
So nim and the upmw (ultimate package manager or whatever) become dependent on a huge number of VCS.
You're solving the wrong problem. VCSes should "talk with each other": have a translation layer, the same API, etc., whatever is needed for easy access and exchange.
The new package manager on the block was created because the other ones couldn't keep up with the library development speed (as I understood it), so why not base it on the one source of truth, or a fork thereof?
...but they all seem built for a specific author's use case.
Yes and no. Percy is largely modeled after PHP's Composer, except in some sense more simplified because it uses git worktrees. I mostly just want to be able to work out of my vendor directories without issues. That means:
Nimble has some insane caching issues which, as the OP of this thread points out, are basically incomprehensible/indiscernible. Trying to get things to update and use the latest is some kind of black magic ritual (particularly in early dev). It has a legacy dependency on having a file that commits your version number, not to mention a package name inherently tied to a file name.
Atlas just makes a mess of the directory; even with name overloads you're going to end up with a bunch of domain noise. It still, in that sense, cares about nimble file/package names inherently, so much so that I squashed multiple repos because it was loading up some nimble file from some old commit where the repo URL was different, which was forcing me to add pkgOverrides just to reconcile my own previously migrated repo.
Anyway... Percy is in a pretty good state, and is working well. I'll continue to work on it, but if anyone is curious the README is fairly extensive and, so far, has mostly "just worked" (except when nimble couldn't install it, cause nimble).