Hi,
While trying to solve a Project Euler problem, I declared something like this:
import tables
const N = 16
type Digit = range['0'..'9']
var counters: array[N, CountTable[Digit]]
var exclusion: array[N, set[Digit]]
. . .
and got the usual warning message that comes with a range type which doesn't include the default value (here the null character '\0'):
Warning: Cannot prove that 'counters' is initialized. This will become a compile time error in the future. [ProveInit]
Of course, I could use char instead of Digit, but for the sets a char would be less efficient (256 bits per set instead of 9 bits rounded to 16). In this case, the problem would not be the memory cost but the performance. And anyway, using char instead of Digit is ugly.
Using {.noInit.} doesn't change anything, so, for now, I have to live with the warning.
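For reference, the attempt looked roughly like this, with the pragma attached to each declaration (the placement shown is illustrative):

var counters {.noinit.}: array[N, CountTable[Digit]]   # still warns: Cannot prove that 'counters' is initialized
var exclusion {.noinit.}: array[N, set[Digit]]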
But, is there a better way to do this?
Thanks, it works!
But, as this only hides the warning and doesn't suppress the check, I hope that, despite what the message says, this warning will never become a compile time error.
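For the record, the workaround amounts to switching the ProveInit warning off locally; something along these lines (a sketch, not necessarily the exact form that was proposed):

{.push warning[ProveInit]: off.}
var counters: array[N, CountTable[Digit]]   # no ProveInit warning inside the push/pop block
var exclusion: array[N, set[Digit]]
{.pop.}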
It seems that I was not in great shape yesterday: there are some errors in my message.
Indeed, a set of range['0'..'9'] will occupy 58 bits rounded to 64 (Nim allocates set bits from 0 up to ord(high), and ord('9') is 57), not 9 bits rounded to 16. I will have to declare Digit as range[0..9] to use only 10 bits (not 9!) rounded to 16, and in that case the warning is no longer emitted, as Digit then includes the default value 0.
For my problem I can (and will) use range[0..9]. In some cases it may not be the right solution, so I think my question is still valid, and the solution proposed by mratsim is the way to hide the warning.
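To make the size argument concrete, here is a small check (the sizes in the comments are what one would expect from Nim's bit-vector set representation):

type
  CharDigit = range['0'..'9']   # set bits run from 0 up to ord('9') = 57
  IntDigit  = range[0..9]       # set bits run from 0 up to 9

echo sizeof(set[char])        # 32 bytes: 256 bits
echo sizeof(set[CharDigit])   # 8 bytes: 58 bits rounded up to 64
echo sizeof(set[IntDigit])    # 2 bytes: 10 bits rounded up to 16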