It would be possible and quite easy ... Overriding malloc so that it counts allocations, combined with -d:useMalloc, is one option, for example.
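A simpler way to spot-check allocations than interposing on malloc itself (this is a rough sketch, not the exact mechanism suggested above) is to sample `getOccupiedMem` around a block of code, assuming the default Nim allocator is in use:

```nim
# Rough spot-check: compare heap usage before and after running a block.
# getOccupiedMem comes from the system module (default allocator only).
template checkAllocs(label: string, body: untyped) =
  let before = getOccupiedMem()
  body
  echo label, ": ", getOccupiedMem() - before, " bytes allocated"

proc sumFixed(xs: array[4, int]): int =
  for x in xs: result += x     # stack only, no heap allocation

checkAllocs "sumFixed":
  discard sumFixed([1, 2, 3, 4])
```

This only measures, it doesn't enforce; a real counting-malloc build would catch allocations in foreign code too.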
Interesting, I could see how maintaining some effects could become tedious. For this case, wouldn't implementing a test on every bit of no-alloc code be more work than just having alloc0/alloc/... implement an allocates effect and checking for it? I'll have to look into overriding malloc.
Though I do find dealing with gcsafe pretty annoying, mainly because I need to do the whole {.cast(gcsafe).}: block thing.
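For context, the pattern being complained about looks like this (a minimal sketch; the `logBuf`/`log` names are made up for illustration):

```nim
var logBuf = ""   # a GC'd global: touching it makes a proc not gcsafe

proc log(msg: string) {.gcsafe.} =
  # the cast block tells the compiler "trust me, I handle thread
  # safety myself" and suppresses the gcsafe violation for its body
  {.cast(gcsafe).}:
    logBuf.add msg
```

Wrapping every such access in its own cast block is exactly the boilerplate that gets tedious.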
Also, if you cannot allocate on the heap, the solution is not to use large arrays on the stack instead; that's usually even worse... So you would need a "no big stack allocations either" effect too.
Yah, that becomes an issue. Usually what I'd want is to avoid allocations in a tight loop, or say a small section of code like an interrupt handler. In those cases you'd usually use preallocated memory or arrays.
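The preallocation pattern mentioned here is just hoisting the allocation out of the hot path and reusing the storage (a generic sketch, not code from this thread):

```nim
var buf = newSeq[float](1024)      # one allocation, up front
for step in 0 ..< 100:
  for i in 0 ..< buf.len:          # reuse the same buffer every step:
    buf[i] = float(step + i)       # no allocation inside the hot loop
```

An allocation effect would let the compiler verify that the inner loop really stays allocation-free.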
Maybe I'll play around with both ways at some point just to see.
effect system
how do you override / disable an effect though? basically, try/except kills the exception effect and says "it's fine, I'm dealing with it" which is what makes the effect "workable" - how do you do that generally?
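To make the try/except analogy concrete, this is how the exception effect gets "killed" today (sketch; `parsePort`/`parseOrDefault` are illustrative names):

```nim
import std/strutils

proc parsePort(s: string): int {.raises: [ValueError].} =
  result = parseInt(s)        # may raise ValueError

proc parseOrDefault(s: string): int {.raises: [].} =
  # try/except removes ValueError from this proc's effect set:
  # the compiler accepts the empty raises list because it is handled here
  try:
    result = parsePort(s)
  except ValueError:
    result = 8080
```

The open question is what the equivalent "border" construct would be for an allocation effect, since you can't "handle" an allocation after the fact the way you handle an exception.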
Many problems are of this nature - containing certain difficult constructs to certain regions of the code so as to treat them with extra care and maintain the ability to find them quickly in a large codebase - whether that be mutation, global variables, raw pointer access etc: you do that somewhere, then manually introduce a border.
Exceptions, for example, are an excellent place to go when auditing for bugs - you find one every time you look in most codebases that use them - likewise for many other similar constructs.
I think this is a cool idea. Certainly adding these annotations shouldn't hurt us, so why don't we add them and see how things go? :)
In general I love the effect system and would like to see it expanded plus used more. I would also like to add a BlockingIO effect that async can forbid.
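Something close to the BlockingIO idea can already be sketched with the existing tag system; `BlockingIO` here is a hypothetical user-defined effect, not anything built into Nim:

```nim
# Hypothetical effect: mark procs that may block the thread on IO.
type BlockingIO = object of RootEffect

proc blockingRead(path: string): string {.tags: [BlockingIO, ReadIOEffect].} =
  readFile(path)

proc asyncFriendly() {.tags: [].} =
  # discard blockingRead("cfg")  # would not compile: BlockingIO isn't listed
  discard
```

An async framework could then require `tags: []` (or a list excluding BlockingIO) on handlers to forbid blocking calls at compile time.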
Certainly adding these annotations shouldn't hurt us, so why don't we add them and see how things go? :)
Simple, because they do hurt.
In general I love the effect system and would like to see it expanded plus used more. I would also like to add a BlockingIO effect that async can forbid.
It got significantly expanded in version 1.6 with its effectsOf annotation.
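For readers who haven't seen it, effectsOf lets a higher-order proc inherit the effects of the callback it is given instead of being pessimistically assumed to raise anything (a small sketch mirroring how the stdlib sort procs use it):

```nim
# `each` has whatever effects its callback `f` has, thanks to effectsOf.
proc each(xs: openArray[int], f: proc (x: int)) {.effectsOf: f.} =
  for x in xs: f(x)

proc demo(): int {.raises: [].} =
  var total = 0
  # OK: the closure raises nothing, so this call to `each` raises nothing
  each([1, 2, 3], proc (x: int) = total += x)
  result = total
```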
Nim's original design philosophy was to be a lean, simple, powerful, elegant and efficient garbage collected programming language. ...
Personally I'm relatively new to Nim and have a bit of a different perspective. Initially I was a bit confused by all the "features" and different configurations of Nim. However, I think Nim has been finding more of a theme and direction since 1.0, and especially with ARC/ORC.
Overall I still find Nim and the compiler impressively nimble, while being surprisingly stable given how flexible it is.
This may have been the wrong "hill to die on", but I'm not going to pretend this is an isolated incident where the attitude of - let's add feature X, Y or Z to Nim because it would be cool or for shits and giggles or whatever - has played out.
I actually agree with the Nim team not adding features for the sake of adding features. But from the outside I see something that could be an almost trivial change that might add a lot of value to an existing underused feature.
Maybe an "allocation effect" would be usable or maybe not; however, tracking allocations is a valuable use case in systems programming, including embedded or most any high performance code (game dev, scientific, graphics, etc). Oddly, tracking allocations is not something most languages are capable of, despite many efforts to optimize memory allocation.
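A rough sketch of what the proposed effect could look like using today's tag machinery, if the allocating procs were annotated; the `Alloc` effect and both proc names are hypothetical:

```nim
# Hypothetical effect: mark procs that may allocate on the heap.
type Alloc = object of RootEffect

proc newBuffer(n: int): seq[byte] {.tags: [Alloc].} =
  newSeq[byte](n)

proc hotLoop(buf: var seq[byte]) {.tags: [].} =
  # discard newBuffer(64)  # would not compile: Alloc isn't in the tag list
  for i in 0 ..< buf.len:
    buf[i] = byte(i and 0xff)
```

The missing piece is that the real allocators (alloc0, newSeq, string ops, ...) would have to carry the tag for the check to mean anything, which is what the original proposal is about.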
To rephrase my question: if Nim has an effect system and it's not going away, then it'd be nice to use it to solve problems that I and others run into. Or rather, enable community members like myself to try out ideas to see if they'd even be feasible or useful. Maybe it's a PR to enable others to discuss whether creating an RFC would even be appropriate.
I actually like that Araq's response was a bit skeptical and suggested a non-effects solution which is feedback I wanted on my original question/idea. He didn't block the overall notion and rather suggested an alternative solution. Seems reasonable to me.
I respect your experience and opinion. You stated you were new to Nim and that your impression of the language differs from mine - that's fine. I've been using Nim since pre-2015 and my impression of the language is different from when I first started.
I'm not sure what you mean by "nimble" when you describe the compiler. I'm not referring to its user experience; I'm referring to how difficult the compiler backends are to extend / maintain. If you find Nim's compiler easy to hack on - bravo to you, you should join the core dev team, or consider contributing to the compiler if you can!
I'm not really interested in newcomers' impressions of Nim, to be honest, and I don't say that to sound callous; it's just that I know what attracted me to Nim, and I know which direction I want to see Nim taken in and which direction I don't. I believe you lack a lot of historical knowledge and context when it comes to this project's history.
One final note - the hill to die on comment is important, as I don't have an issue with these effects per se. I just think there are far more important fish to fry / things to fix in regards to Nim as a whole.
Nim's original design philosophy was to be a lean, simple, powerful, elegant and efficient garbage collected programming language. It has drifted so far from that original design premise, because it has had all of these experimental and R&D'd features to it that have eventually gone out of maintenance.
I don't see how we are not on track here, the old Nim v1 was stuck in a local optimum -- fast, but threading was a pain. So we had to do something, we got destructors and ORC. It took us long enough. Everything else was tweaking the existing language, very broadly speaking.
You can say that v1 already had too much cruft in it we should have removed for v1, but that would have delayed the version 1 release even further. Time to plan 2.0, I guess.
You can say that v1 already had too much cruft in it we should have removed for v1
There was (at least) this forum discussion back around the time. It is often hard work to remove features while giving workarounds to users. This indeed would have delayed the long-delayed 1.0, as Araq mentions.
There are also RFCs from time to time to simplify the language/compiler (that often get push back).
It sounds like we probably have close to a year-ish until 1.8 and finally 2.0. I would encourage "language/stdlib minimizers" to open RFCs or Forum threads about each individual simplification to solicit more focused discussion (even though, yes, features can interact a bit). Now-ish is probably a good time.
To rephrase my question: if Nim has an effect system and it's not going away, then it'd be nice to use it to solve problems that I and others run into. Or rather, enable community members like myself to try out ideas to see if they'd even be feasible or useful. Maybe it's a PR to enable others to discuss whether creating an RFC would even be appropriate.
Indeed, please don't be dissuaded from doing so. In fact since this is such a simple change I would go straight to creating a PR to implement it. We can discuss details there. No need for an RFC (this thread is already kind of an RFC anyway).
That said, it's definitely worth considering Araq's suggestion. I'm much more enamored by the effects system than he is, though I speculate his skepticism is mainly due to how annoying the effect system is to keep stable (not that that's a bad reason).
there are contexts where being able to assert "no allocation" would be powerful.
Agree. A hint, disabled by default, alerting you to where allocations will happen would be nice.