I'm exploring Nim and so far I really like what I've seen. Being able to look into the generated C code and see how efficient it is under the covers is really interesting and helps a lot in understanding how the language is implemented.
I like to follow the language manual and play along with short programs as concepts are introduced. While experimenting with structured types I've run into this apparent inconsistency:
type
  Position = object
    x, y, z: float32
  PositionTuple = tuple
    x, y, z: float32

var pos: Position
# no initialization and works OK
pos.x = 4

var posT: PositionTuple
# no initialization and works OK
posT.x = 4

var positionsA: array[0..5, Position]
# no initialization and works OK
positionsA[2] = Position(x: 1, y: 2, z: 3)

var positions: seq[Position]
# no initialization, crashes, no warning from the compiler
positions.add(Position(x: 1, y: 2, z: 3))
Looking under the covers I can see why: all the other vars become global statics of the concrete object (or array), but positions is implemented as a pointer, which never gets initialized. So without a positions = @[] in the Nim code it just points somewhere invalid. Looking at the implementation of seq I understand the reason (it needs realloc for growing, so always being a pointer makes it more efficient), but the docs don't mention this difference and the compiler lets you use positions without a warning.
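For reference, here is a minimal sketch of the initialization that avoids the crash (the Position type is repeated so the snippet stands alone); either an empty literal or newSeq should do:

type
  Position = object
    x, y, z: float32

var positions: seq[Position] = @[]        # explicit empty seq, no longer nil
# or: var positions = newSeq[Position]()  # same effect, allocates an empty seq
positions.add(Position(x: 1, y: 2, z: 3)) # works now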
Maybe a note in the docs would be appropriate? Is there something I am missing?
You can use
var positions: seq[Position] not nil
Then you get a warning like this at compile time:
gotchas.nim(20, 4) Warning: Cannot prove that 'positions' is initialized. This will become a compile time error in the future. [ProveInit]
If you want to make sure to use initialized seqs at all times:
type safeSeq[T] = seq[T] not nil
var positions: safeSeq[Position]
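Just to make it concrete, a rough sketch of how that alias could be used so ProveInit is satisfied (assuming a compiler version that accepts not nil on seq types, as above; the int element type is only for illustration):

type safeSeq[T] = seq[T] not nil
var xs: safeSeq[int] = @[]   # initialized at declaration, so the compiler can prove it
xs.add(42)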
Sequence variables are initialized with nil. However, most sequence operations cannot deal with nil (leading to an exception being raised) for performance reasons. Thus one should use empty sequences @[] rather than nil as the empty value. But @[] creates a sequence object on the heap, so there is a trade-off to be made here.
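Under those (pre-v0.19) semantics the difference looks roughly like this; a small sketch, not taken from the docs:

var a: seq[int]          # nil by default
var b: seq[int] = @[]    # empty seq allocated on the heap
echo a.isNil             # true
echo b.isNil             # false
b.add(1)                 # fine
# a.add(1)               # crashes at runtime, as in the original example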
I've found similar gotchas with ref types pointing to nil by default. I know it makes complete sense in the context of the language and its implementation, as def and Araq explained, mind you. It's just that, with all the hype around Rust and how often it's compared to Nim, I was kind of expecting the compiler to hold my hand much harder when it comes to memory and references, especially since the language has a GC. Instead, at a basic language level it feels much closer to C semantics a lot of the time in those areas.
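For anyone else tripping over the ref case, here is a minimal illustration (the Node type is made up for the example):

type Node = ref object
  value: int

var n: Node          # nil by default, like the uninitialized seq above
# echo n.value       # would crash with a nil dereference
n = Node(value: 1)   # or: new(n)
echo n.value         # prints 1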
Now I want to see how far I can stretch the type system and/or object variants to do nil-free programming (or at least programming with mandatory checks). It sounds like a fun project, and it's very good that I can have many paradigms in the same language.
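As a first experiment in that direction, here is a rough sketch of an Option-style object variant (all names invented for illustration); accessing the value with the wrong discriminator raises a field error at runtime instead of dereferencing nil:

type
  OptionKind = enum okNone, okSome
  Opt[T] = object
    case kind: OptionKind
    of okNone: discard
    of okSome: val: T

proc some[T](v: T): Opt[T] = Opt[T](kind: okSome, val: v)
proc none[T](): Opt[T] = Opt[T](kind: okNone)

let x = some(42)
case x.kind
of okSome: echo x.val      # only read val when a value is present
of okNone: echo "nothing"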