const
  AlphaNum: set[char] = {'a' .. 'z', 'A' .. 'Z', '0' .. '9'}
  MathOp = {'+', '-', '*', '/'} # set[char]
  ANMO = AlphaNum + MathOp # union

var x: char = 's'
echo x in AlphaNum
echo x in MathOp

type
  CharRange = set['a' .. 'f']

# var y: CharRange = {'x'} # invalid
var y: CharRange = {'b', 'd'}
echo 'c' in y

type
  ChessPos = set[0'i8 .. 63'i8]

var baseLine: ChessPos = {1.int8}
var p: int8
echo p in baseLine
type
  H = set[0 .. 63]

# var h: H = H({1}) # does not compile, see below
echo typeof({1})
echo typeof({-1})
echo typeof({1'i8})
Sets of char are easy, but sets of numbers are more difficult. The operations with the ChessPos type work fine. But with
var h: H = H({1})
I cannot get the assignment to var h to work. None of the following variants seems to compile:
var h: H = {1}
var h: H = H({1})
var h: H = {1'i8}
var h: H = {1'}
And
echo typeof({-1})
is
set[range 0..65535(int)]
which is also a bit strange.
Yes, this is rather strange. I was able to make it work using an explicit range.
type
  R = range[0..63]
  H = set[R]

var h: H = {R(1)}
Evidently, the compiler expects the values in the set literal to be of the "right" type. Apparently that means a bare 1 alone cannot work here.
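A workaround that avoids converting the literal elements at all is to start from the empty literal, which adopts the expected type, and fill the set with incl from the system module. Just a sketch reusing R and H from above (h2 is only an illustrative name):

var h2: H = {}       # the empty literal takes its type from the declaration
h2.incl(R(1))        # incl adds one element of the set's base type
h2.incl(R(5))
echo 1 in h2         # true
echo card(h2)        # 2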
I think in Modula and Oberon we had symmetric difference (/) and complement (unary minus) for sets.
Are those not available in Nim? And how can we emulate them efficiently?
Complement: {'\0'..'\255'} - s
Symmetric difference: no known applications; it used to be the -+- operator, IIRC, but was removed.
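So both can be emulated with the built-in set operators, which compile down to bitwise operations on the underlying bit vector. A small sketch for set[char] (complement and symDiff are made-up helper names, not stdlib procs):

proc complement(s: set[char]): set[char] =
  {low(char) .. high(char)} - s       # same idea as {'\0'..'\255'} - s

proc symDiff[T](a, b: set[T]): set[T] =
  (a + b) - (a * b)                   # equivalently (a - b) + (b - a)

let vowels = {'a', 'e', 'i', 'o', 'u'}
let letters = {'a' .. 'f'}
echo card(complement(vowels))         # 251
echo symDiff(vowels, letters)         # {'b', 'c', 'd', 'f', 'i', 'o', 'u'}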
You should always define your sets like this:
type
  Foo = <type definition>
  Foos = set[Foo]
So here,
type
  ChessSquare = range[0..63]
  ChessSquares = set[ChessSquare]

var baseLine = {0.ChessSquare .. 7.ChessSquare}
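With a named element type, later literals, membership tests and updates all stay properly typed. A short usage sketch continuing from baseLine above (the square numbers are only illustrative):

baseLine.incl(8.ChessSquare)                      # add a single square
echo 4.ChessSquare in baseLine                    # true
echo card(baseLine)                               # 9
baseLine.excl(0.ChessSquare)                      # remove a square
echo baseLine * {0.ChessSquare .. 3.ChessSquare}  # intersection: {1, 2, 3}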