Why does this work:

var
  tab: array[19, uint16]
for i, v in [16.uint16, 17, 18, 0, 8, 7, 9, 6, 10, 5, 11, 4, 12, 3, 13, 2, 14, 1, 15]:
  tab[i] = v
but not this:

var
  tab: array[16, uint32]
for i, v in [0x00000000.uint32, 0x1DB71064, 0x3B6E20C8, 0x26D930AC, 0x76DC4190, 0x6B6B51F4, 0x4DB26158, 0x5005713C, 0xEDB88320, 0xF00F9344, 0xD6D6A3E8, 0xCB61B38C, 0x9B64C2B0, 0x86D3D2D4, 0xA00AE278, 0xBDBDF21C]:
  tab[i] = v
which always fails with
Error: type mismatch: got 'int64' for '0x00000000EDB88320'i64' but expected 'uint32'
so I have to go round the houses with this:
import strutils  # for split, strip, parseHexInt

var
  tab: array[16, uint32]
  i: int
# split on ',' alone so the line breaks inside the string don't end up in the tokens
for v in """0x00000000, 0x1DB71064, 0x3B6E20C8, 0x26D930AC,
0x76DC4190, 0x6B6B51F4, 0x4DB26158, 0x5005713C, 0xEDB88320,
0xF00F9344, 0xD6D6A3E8, 0xCB61B38C, 0x9B64C2B0, 0x86D3D2D4,
0xA00AE278, 0xBDBDF21C""".split(','):
  let s = v.strip
  tab[i] = s.parseHexInt.uint32
  inc i
which is ok-ish but irritating?

I don't know why the hex literals above high(int32) are locked into int64, but you do not have to parse hex strings here; you can just use an array of int64:
var tab: array[16, uint32]
for i, v in [0x00000000.int64, 0x1DB71064, 0x3B6E20C8, 0x26D930AC, 0x76DC4190, 0x6B6B51F4, 0x4DB26158, 0x5005713C, 0xEDB88320, 0xF00F9344, 0xD6D6A3E8, 0xCB61B38C, 0x9B64C2B0, 0x86D3D2D4, 0xA00AE278, 0xBDBDF21C]:
  tab[i] = v.uint32
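Another way to sidestep the int64 lock-in entirely (a sketch, not tested against every Nim version) is to give the first literal an explicit 'u32 type suffix; the remaining elements of the array literal then adopt that type, just as the 17, 18, ... do after 16.uint16 in the first snippet:

```nim
# The 'u32 suffix types the literal as uint32 directly, so values above
# high(int32) such as 0xEDB88320 never pass through int64 at all.
var tab: array[16, uint32]
for i, v in [0x00000000'u32, 0x1DB71064, 0x3B6E20C8, 0x26D930AC,
             0x76DC4190, 0x6B6B51F4, 0x4DB26158, 0x5005713C,
             0xEDB88320, 0xF00F9344, 0xD6D6A3E8, 0xCB61B38C,
             0x9B64C2B0, 0x86D3D2D4, 0xA00AE278, 0xBDBDF21C]:
  tab[i] = v
```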