Hi,
I'm a Nim newbie, so I'm really confused by what I'm seeing in the code below. Can anyone help with an explanation?
import typetraits

let g = @[
  [true, false],
  [false, true],
  [true, false]]

for i, row in g:
  echo "i = ", i, " of type ", i.type
  echo "row = ", row, " of type ", row.type
  for j, col in row:
    echo "j = ", j, " of type ", j.type
    echo "col = ", col, " of type ", col.type
Running it prints the following:
i = 0 of type int
row = [true, false] of type array[0..1, bool]
j = 0 of type range 0..1(int)
col = true of type bool
j = 1 of type range 0..1(int)
col = false of type bool
i = 1 of type int
row = [false, true] of type array[0..1, bool]
j = 0 of type range 0..1(int)
col = false of type bool
j = 1 of type range 0..1(int)
col = true of type bool
i = 2 of type int
row = [true, false] of type array[0..1, bool]
j = 0 of type range 0..1(int)
col = true of type bool
j = 1 of type range 0..1(int)
col = false of type bool
Why does the inner loop index have type range 0..1(int), while the outer loop's index is a plain int? This is especially confusing to me: I thought both levels were regular sequences, with the outer loop iterating over a sequence of sequences, so I expected the index types to match.
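For contrast, here is a minimal variant where every level really is a seq (the only change is the @ prefix on the inner literals). With this version, both loop indices print as plain int:

import typetraits

# Same data, but each row is now a seq rather than a bracket literal.
let s = @[
  @[true, false],
  @[false, true],
  @[true, false]]

for i, row in s:
  echo "i = ", i, " of type ", i.type    # int
  for j, col in row:
    echo "j = ", j, " of type ", j.type  # also int, not range 0..1(int)

So the difference seems tied to whether the inner rows are seqs or the bare [true, false] literals, but I don't understand why that changes the index type.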
Thanks for your help. I've hit this early on, so I just want to make sure I'm not bringing baggage with me to a new language.