I just recompiled a small tool that reads a binary file and converts it into a different (text-based) format using Nim 1.4 and the new ORC GC. This tool did not compile with the ARC GC in 1.2.6 (due to some problem with the docopt library), but it compiles just fine (both with ARC and ORC) in Nim 1.4.
I used the tool to convert a file that took 19.5 seconds to convert with the regular GC in 1.2.6. In 1.4, with the regular GC, the tool takes the same amount of time to convert the file (i.e. no performance difference). However, with --gc:orc, it only takes 12.4 seconds (and I get the same result with --gc:arc). These results are very consistent, and I also confirmed that the file output is identical.
This is really impressive! Just add a small switch at compile time and get a ~36% reduction in run time! Awesome job, guys!
I don't think it makes too much of a difference if you're using --exceptions:goto, but it might provide a considerable boost if --exceptions:setjmp is used.
--panics:on allows the compiler to stop assuming that all procs can throw, as every Defect is then compiled down to a quit() call. This should reduce the number of try-finally pairs generated, which are "slow" with setjmp.
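For concreteness, here is how the flag combinations being discussed would look on the command line (a sketch; app.nim is just a placeholder file name):

```shell
# setjmp-based exceptions: every proc that can raise pays for
# try/finally bookkeeping at runtime
nim c --exceptions:setjmp app.nim

# goto-based exceptions: error propagation becomes ordinary branches
nim c --exceptions:goto app.nim

# --panics:on turns Defects into process termination, so the compiler
# can assume fewer procs raise and emit fewer try/finally pairs
nim c --exceptions:goto --panics:on -d:danger app.nim
```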
With that said, not much code actually uses the new destructors or defer yet, so the perf gains might not be huge.
ORC build: BUILD PROD termScr.nim -> termScr, size: 578K
ARC build: BUILD PROD termScr.nim -> termScr, size: 568K
nim c -f --gc:arc (or --gc:orc)
  -d:useMalloc --deadCodeElim:on
  --verbosity:0 --hints:off --hint[Performance]:off --warning[UnusedImport]:off
  --threads:on --opt:size
  --passc:-flto -d:release -o:$projet_bin $projet_src
With ORC I get lower memory usage at runtime. The project is more than 8000 lines, with dynamic tables.
"with ORC i have less memory used when working." this is probably because your app has cycles and ARC can't collect them :)
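A minimal sketch of the kind of reference cycle ARC leaks but ORC's cycle collector reclaims (type and proc names are made up for illustration):

```nim
type
  Node = ref object
    next: Node   # a ref field that can point back to its owner

proc makeCycle() =
  let a = Node()
  let b = Node(next: a)
  a.next = b
  # a -> b -> a: when the proc returns, the refcounts never drop to
  # zero. Compiled with --gc:arc this pair leaks; with --gc:orc the
  # cycle collector finds and frees it.

for _ in 0 ..< 100_000:
  makeCycle()
```

Running such a loop under --gc:arc versus --gc:orc (e.g. watching RSS) is an easy way to check whether an app's memory growth comes from cycles.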
Also, why do you care about --opt:size? And why do you use -d:useMalloc (it's really only useful for debugging memory leaks, or on platforms Nim's TLSF allocator isn't ported to) and -d:release instead of -d:danger for more performance?
Another tip - you can put a lot of these in a config file (.nims or .cfg) instead of having a very big command
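For example, a config.nims next to the project file could carry most of those switches (a sketch based on the command shown above; adjust to taste):

```nim
# config.nims -- picked up automatically when compiling files in this directory
--gc:orc            # or --gc:arc
--threads:on
--opt:size
--verbosity:0
--hints:off
--define:release
switch("passc", "-flto")
```

The build command then shrinks to something like nim c -f -o:$projet_bin $projet_src.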
By removing -d:useMalloc and --opt:size:
BUILD PROD termScr.nim -> termScr, size: 696K (ARC)
BUILD PROD termScr.nim -> termScr, size: 713K (ORC); on the other hand, the runtime memory footprint is the same: ORC -> 648K used
By removing only --opt:size and keeping -d:useMalloc:
BUILD PROD termScr.nim -> termScr, size: 684K (ORC), 666K (ARC), and the runtime memory footprint shrinks by about half: ORC -> 384K used
"tip - you can put a lot of these in a config file (.nims or .cfg) instead of having a very big command"
I have a procedure included in VS or Geany, as a batch file or a standard makefile; for the moment everything goes through this procedure, and I didn't see the point of making a .cfg except when publishing.
If someone could help me, I would like to make "termkey" (which is on GitHub) available; my English being too weak, I'm afraid of making a blunder.