As Nim is approaching the 1.0 release, I think it's worthwhile to ask the community (the users) this question.
For me, it's the missing/partial debugging functionality. To a lesser extent libraries, but that is unavoidable, I guess. In other words, it's the tools for using the language more effectively, not the language itself.
I am completing current projects which began in a different language, and changing now is not an option.
I am strongly considering using Ada, in particular the SPARK 2014 dialect, because I am interested in provably correct software. After that, I am impressed by Eiffel's Design-by-Contract (which is essentially what SPARK 2014 uses), SCOOP, and void-safety. I am not aware of those latter two being so well-handled in any other language. (Kotlin does nicely with void-safety, but according to the Eiffel mailing list, Kotlin's developers admit they don't have it down pat.)
Nim has some great features, and I love the efficiency, but a lot of the safety aspects are still being worked out, including some fairly fundamental ones such as pointers and, in particular, pointer safety. So it's in the mix, and I dabble with it from time to time, but it's not my top choice for my next project.
I'd say immature libraries; you're buying into a complete ecosystem, not just the language. Looking back at https://xmonader.github.io/nim/2018/12/06/nim-good-ok-hard.html I find there have been lots of improvements, like plenty of effort on documentation and clearer error messages, but concurrency is still rough, and communication and orchestration aren't nice.
Lots of the available projects on Nimble seem like weekend projects, which is fine, but know your risks.
Aside from that, Nim is a great chance to reinvent the wheel for lots of things; e.g. I wrote a redisparser, a Redis client, and terminaltable, which I would never do in the Python or Go world.
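To give an idea of the kind of wheel I mean, here is a toy fragment of a RESP (Redis protocol) reply parser in Nim; it is a simplified sketch for illustration, not the code of the actual redisparser package:

```nim
import strutils

type RespKind = enum respSimple, respError, respInt, respBulk

proc parseReply(data: string): (RespKind, string) =
  ## Parse a single, complete RESP reply such as "+OK\r\n" or "$5\r\nhello\r\n".
  case data[0]
  of '+': result = (respSimple, data[1 .. ^3])            # strip the trailing \r\n
  of '-': result = (respError, data[1 .. ^3])
  of ':': result = (respInt, data[1 .. ^3])
  of '$': result = (respBulk, data.split("\r\n")[1])      # element 0 is the "$<len>" prefix
  else: raise newException(ValueError, "unknown RESP type: " & $data[0])

echo parseReply("+OK\r\n")          # (respSimple, "OK")
echo parseReply("$5\r\nhello\r\n")  # (respBulk, "hello")
```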
This is a great question and I thank you for asking it.
In relation to version 1.0, however, many of the answers here point to things which I don't believe belong in a 1.0 release. For example, good debugging functionality is an important feature, but it isn't something that must be done before 1.0 (implementing it can be done at any point in the future, as it will not need breaking changes).
What I would be interested in more is features/bugs that are preventing you from using Nim as your main language, and require backwards incompatible changes to fix. This is our last chance to make these changes.
What I would be interested in more is features/bugs that are preventing you from using Nim as your main language, and require backwards incompatible changes to fix.
I think I did answer that question, but in case it wasn't clear:
- void / null safety (as in Eiffel or Kotlin; the current pointer proposal doesn't address that)
- design-by-contract (with support in compiling and debugging, as in Eiffel and Ada 2012 / SPARK 2014); a rough sketch of what I mean by both follows below
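To make those two items concrete, here is roughly how I approximate both in Nim today with Option[T] and plain asserts (the procs are made up for illustration); what I'm asking for is a compiler-checked version of the same ideas:

```nim
import options

proc findUser(id: int): Option[string] =
  ## Instead of a possibly-nil ref, callers are forced to handle absence.
  result = if id == 1: some("admin") else: none(string)

proc withdraw(balance: var int; amount: int) =
  assert amount > 0, "precondition: amount must be positive"
  balance -= amount
  assert balance >= 0, "postcondition: balance must stay non-negative"

var account = 100
if findUser(1).isSome:
  withdraw(account, 40)
echo account   # 60
```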
I don't think you can concentrate on those at the moment, so I don't expect you to.
The change in heap management was the one thing that really gave me pause, because people had sung the old system's praises so thoroughly that I had the impression it was unique to Nim, and worked without a hitch.
Then Araq proposed the new model, and while he by no means trashed the old setup, it became clear that everyone had been quiet about some really serious problems. Araq's points on this have all been very well laid out, and it's a good thing he's making them; it impressed me quite a bit. But it did drive home the point that there are some other very mature, tried-and-true languages that address these problems, and while I do want to use Nim, it isn't quite ready for me.
I have used Nim for many small experiments, but the main reason why I am not using Nim as my main programming language is that it is not popular enough. I know this is circular, but where I work I am the Nim person. If anything should go wrong on a project where Nim fails to deliver on some points, the burden of having chosen it lies on me. If a project goes well, I am bound to maintain it indefinitely because I am the local Nim expert. All this means that using Nim in anger is too risky for me right now, although I like to use it for experiments of many kinds.
Other than that, I am worried about the new runtime. Starting to plan the move to a new runtime just before 1.0 is... well, not a good sign. I also don't especially like that this new runtime requires a complex set of lifetime annotations (sink, move, owned...), which makes code less understandable, whereas before that Nim could be considered a benchmark of readability. Finally, the new runtime does not seem to be based on sound research on formal type systems (there's just Bacon and Dingle, but that seems to be an abandoned approach), and new special cases that evade the analyses done so far keep popping up. I think that if one wants to really follow such an approach, it must be tried on paper and proved correct with an actual demonstration before jumping to the implementation.
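For reference, a minimal sketch of the annotations in question, as I understand the destructors/owned-refs documents (the names here are mine, and the exact semantics may still change):

```nim
type Node = ref object
  data: string

proc store(dest: var seq[Node]; n: sink Node) =
  ## `sink` marks a parameter whose argument may be moved in instead of copied.
  dest.add n

proc demo() =
  var nodes: seq[Node]
  var n = Node(data: "payload")
  store(nodes, move(n))   # `move` transfers ownership; `n` is reset to nil
  # Under --newruntime there is additionally `owned Node` vs. a plain,
  # dangling-checked Node, which is the part I find hardest to keep readable.
  echo nodes.len          # 1

demo()
```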
I am pushing for Nim in production, though it is still a year away, I guess. All of the existing production code base is in C++.
The main reason I am trying to migrate to Nim is a couple of business user requests that I can't satisfy in C++ in any reasonable way. C++ simply can't do it, period. In Nim I can do it (still rather difficult) via 10k to 30k lines of AST macro code. Effectively it is a whole new codegen pass.
Reasons for choosing Nim: Nim has AST macros, and it integrates well with existing C++ infrastructure. I can use C++ templates when I have to, I can catch C++ exceptions when I have to, and I can export Nim functions to C++ when I have to.
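To make that concrete, here is a toy sketch of two of those interop points, wrapping a C++ template via importcpp and exposing a Nim proc via exportc; the names CppVector and nim_hook are made up for illustration, and it assumes compilation with the C++ backend (nim cpp):

```nim
type
  CppVector[T] {.importcpp: "std::vector", header: "<vector>".} = object

proc pushBack[T](v: var CppVector[T]; x: T) {.importcpp: "#.push_back(#)".}
proc size[T](v: CppVector[T]): int {.importcpp: "#.size()".}

# Expose a Nim proc so the existing C++ code base can call it as `nim_hook`.
proc nimHook(x: cint): cint {.exportc: "nim_hook", cdecl.} =
  x * 2

when isMainModule:
  var v: CppVector[cint]   # maps onto std::vector<int>
  v.pushBack(1)
  v.pushBack(2)
  echo v.size()            # 2
  echo nimHook(21)         # 42
```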
My five cents: stop branding Nim as a faster Python with types or a C++ with more readable syntax. Focus on the features that are different from C++, Python, D, and Rust, and you will find your audience.
@andrea I 100% agree with your first point. It is one of the main reasons I can't use Nim as my main language, but use it as my main hobby language. The community just needs to grow, in all senses. We need more projects, more libraries, more users, and more marketing. These are hard things, and trying to grow too fast can be very detrimental, so I respect the situation.
This is your personal opinion of course, but I have to disagree with your other points about the new runtime, as I feel that they misrepresent Nim.
First, you seem to be implying that the GC is going to be completely removed for 1.0. AFAIK, this is definitely not the case. The new runtime and the GC are going to co-exist for a long time. (Core maintainers, correct me if I am wrong here.)
You don't like the new lifetime annotations. This is personal preference, and I respect that. I personally have been burned by Rust and find this syntax easier to reason about than Rust's (by a large margin), while it still gives many of the same benefits. But that is just my opinion.
Finally, the new runtime does not seem to be based on sound research on formal type systems... I think if one wants to really follow such an approach, it must be tried on paper and proved correct with an actual demonstration before jumping to the implementation.
I think this is a completely unreasonable thing to say. By this reasoning, the only languages you should use are languages that have been formally verified, like Ada SPARK, Idris, or APL.
There are many counter examples to your argument:
You talk about Bacon and Dingle as if that were a bad thing. Rust took a very similar approach, and is highly successful.
The Rust borrow checker was based on Cyclone, an extremely dead academic language. Rust took that idea and improved it by fixing many edge cases. A project that took years. People still see Rust as a "safe language based on sound research".
The Midori project at Microsoft is a "dead" project that produced important research on async systems, which inspired many mainstream async concepts.
Haskell was implementing state-of-the-art lazy evaluation and garbage collection algorithms based on research paper drafts (not final publications) back in the '80s. Haskell is explicitly a research language that incorporates unfinished research even today. This is something Haskell is very proud of, yet Haskell is still very popular in some industries and is considered a fundamentally safe language based on "sound research".
The Nim GC, as well as the allocator under it, is based on research papers that were considered cutting-edge when Araq implemented them. The Nim GC is not just some generic GC based on Java from the 90's. It is actually quite advanced.
After several years of real-world use of the GC algorithm, serious flaws were discovered in the GC with regard to multi-threaded programs. Instead of trying to "hack" the GC to fix the problem, Araq searched the latest peer-reviewed research to find a better solution. This is an extremely principled approach!
Academia is not some "magic place" that researchers go to commune with the computer science gods, and come down with the perfect answer to a problem. That is the worst case of "waterfall" software development. This is a known anti-pattern, that does not work well.
From what I have seen, out of the "new languages" craze of the last few years, Nim is one of the most principled when it comes to making decisions based on academic papers. Araq spends much of his time keeping up with the latest research. Compared to many of these other "new languages" that base their design decisions purely on hype, by committee, or by uneducated personal hunches, I personally find this very comforting.
That is unfortunate, as many times you really want a cross-platform solution for something as essential as a performant HTTP server.
It is not about the dev experience but about the deployment target, where Windows desktop or Windows Server is still a thing.
Man, I work with large Go codebases. Go is as strict as or stricter than Rust, and you can still find unmaintainable code so extreme that your head explodes.
From my extensive experience, bad developers can use the strictest language ever created and the codebase will still be an enormous pile of shit.
And good developers can code in the most random languages (e.g. PHP), and you can find million-line codebases with the most beautiful, simple, and understandable code you have ever seen.
So it's a matter of people, not languages.