I've got a project that converts strings of hex digits into unsigned 32-bit numbers.
Under C / C++ this has been easy and straightforward using casting and/or other techniques. But under JavaScript I can't seem to find a method that works.
let hex = "7c000000"
var a: uint
when defined(js):
  a = ???(hex)
I suppose I could make use of importjs to get around this, but I was wondering if anyone has come up with a more "nimic" answer.
John
This works just fine with both the C and JS backends, so I'm not sure what your question is :)
import std/strutils
let hex = "7c000000"
var a = fromHex[uint](hex)
echo a
I had trouble with parseHexInt and unsigned integers under certain circumstances on the JS backend.
Ah, but fromHex. How the heck did I not see that? That also simplifies my code under C.
Thanks!
Sadly, this did lead to an odd JS bug. I'll look into posting an issue.
var mask = 0x80000000'u32
if mask == (mask and mask):
  echo "good"
else:
  echo "not good"
One gets different results depending on the compiler target. I suspect the lack of native unsigned integer support in JS itself causes it to mishandle the high bit being set in an and. But that is just a guess on my part.
The generated JS isn't emitting the >>> 0 it ought to. This workaround convinces it to do so (although it then emits it twice):
var mask = 0x80000000'u32
if mask == (mask and mask) shr 0:
  echo "good"
else:
  echo "not good"
Spec: https://262.ecma-international.org/5.1/#sec-11.10. Bitwise ops in JS operate on signed 32-bit integers (which seems like a bad design decision).
See also https://stackoverflow.com/questions/8936523/javascript-bitwise-operator-confusion
The correct fix is for the Nim compiler to insert >>> 0 itself.
Note that with the shr 0 workaround it indeed inserts >>> 0 twice, which is another bug.