x = {'a'..'z', '0'..'9'}
it works! But:
x = {1..9, 15, 45..78}
does not. What is wrong?
The error message for the second case isn't as clear as it could be, at least if you don't know what to look for. It says:
Error: type mismatch: got <set[range 0..65535(int)]> but expected 'CharSet = set[int16]'
See the (int) after the range? That tells you that the type it gets is actually int. Your set, however, takes int16. So to make it work, you have to give explicit int16 literals:
x = {1'i16..9'i16, 15'i16, 45'i16..78'i16}
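For context, here is a complete sketch (assuming the CharSet declaration named in the error message above):

type CharSet = set[int16]

var x: CharSet
# x = {1..9, 15, 45..78}                      # rejected: typed as a set of int
x = {1'i16..9'i16, 15'i16, 45'i16..78'i16}    # accepted: element type is int16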
The line type CharSet = set[int16] is just a set type definition, and it works for every acceptable type.
The line var x: CharSet is just a variable declaration, nothing unusual.
x = {1..9, 15, 45..78}
This is a set constructor, and it does not work with anything but char... It is a pity...
If so...
1. There is no need for type inference here, since the type of x is well known beforehand, and it cannot be changed implicitly under Nim's rules.
It's the job of the compiler to ensure that the type of the value on the right-hand side matches the type on the left-hand side.
I cannot rely on the user always doing the Right Thing. It's quite likely the error we see in your second example is a real error, because the user put non-int16 values in the set by mistake :)
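For instance (hypothetical values, not from the thread):

var x: set[int16]
x = {1..9, 15, 45..78_000}   # 78_000 does not fit in int16

If the compiler silently adopted set[int16] from the left-hand side, a slip like this could only be caught by a separate range check; the explicit type mismatch surfaces it immediately.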
2. Why?
You have to accept that the compiler may not see x = {1..9, 15, 45..78} as one entity, but as an assignment, where the right side is evaluated first and then assigned to the var on the left. And {1..9, 15, 45..78} is evaluated as a set of integers, which is not compatible with a set of int16. x is a set[int16], so the assignment cannot work.
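You can see the two steps directly (a small sketch; the exact inferred type name varies by compiler version):

let s = {1..9, 15, 45..78}   # on its own, typed as a set of int (range 0..65535), per the error above
var x: set[int16]
# x = s                                       # the same mismatch: set of int vs. set[int16]
x = {1'i16..9'i16, 15'i16, 45'i16..78'i16}    # here the RHS is already set[int16], so it type-checks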