Hey all, I've hit a performance issue in server code: it spends a lot of time in Nim's GC/runtime machinery, mostly in the nimFrame stack-trace bookkeeping. I'm compiling with -d:release --stackTrace:on --checks:on, and it's very handy to get nicely symbolized errors from the production server. However, I'm only interested in errors originating from my own code; the dependencies and the runtime don't need to be instrumented as heavily. So the open-ended question is: how do we get finer-grained optimization controls, i.e. compile the runtime and (some) dependencies with no checks, no stack traces and maximum optimization, while compiling our own code with checks and stack traces?
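For reference, the closest thing I've found so far are the per-module push/pop pragmas, but that means hand-editing every dependency, which doesn't really scale. A rough, untested sketch of what I mean, for a single hot module I control:

```nim
# Turn off runtime checks and stack-trace bookkeeping for just this module,
# while the rest of the project keeps -d:release --stackTrace:on --checks:on.
{.push checks: off, stackTrace: off.}

proc copyPixels(src: openArray[byte]): seq[byte] =
  ## hot loop: no bounds/overflow checks, no nimFrame calls
  result = newSeq[byte](src.len)
  for i in 0 ..< src.len:
    result[i] = src[i]

{.pop.}
```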
One could argue that I shouldn't be using checks and stack traces in release mode at all, but the same issue shows up in debug mode: we suffer long loading times of a debug build just because image decompression is slow when built without optimization, even though we're not expecting any bugs in it. Likewise for game engines, which we want to run at the maximum possible performance even in debug builds.
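For that debug-build case, the only per-module knob I'm aware of is localPassc, and again it requires touching the dependency's source. A sketch, assuming a gcc/clang backend (the module name is hypothetical and I haven't measured this):

```nim
# imgdecode.nim -- hypothetical hot dependency module.
# If I read the manual right, localPassc passes extra flags to the C compiler
# only for the C file generated from this module, so the slow decompression
# could be optimized even inside an otherwise unoptimized debug build.
{.localPassc: "-O2".}

proc decompress*(src: openArray[byte]): seq[byte] =
  ## placeholder for the real decompression work
  result = newSeq[byte](src.len)
  for i in 0 ..< src.len:
    result[i] = src[i]
```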