Don't get me wrong here: Nim concepts introduce wonderful advantages.
Unfortunately, very much like {.base.} methods, they also introduce a strange kind of boilerplate:
from math import PI, pow

type
  ShapeKind = enum Circle, Rectangle
  ShapeProps = tuple[shape: string, area, perimeter: float]
  CircleShape = concept a
    a.r
    buildCircleProps(a)
  RectangleShape = concept a
    a.w
    a.h
    buildRectangleProps(a)
  Shape = object
    case kind: ShapeKind
    of Circle: r: float
    of Rectangle: w, h: float

# Boilerplate start -->
func buildCircleProps[T](ob: T): ShapeProps = discard
func buildRectangleProps[T](ob: T): ShapeProps = discard
# Boilerplate end

func buildCircleProps(circle: CircleShape): ShapeProps =
  ($Circle, PI * circle.r.pow(2), 2.0 * PI * circle.r)

func buildRectangleProps(rectangle: RectangleShape): ShapeProps =
  ($Rectangle, rectangle.w * rectangle.h, 2.0 * rectangle.w + 2.0 * rectangle.h)
Nim is one of the least verbose languages I have encountered, which is a delight for someone who's been tortured by Java for years...
But once in a blue moon I run into these very strange decisions. My thought was: incredible, Go interfaces in Nim via concepts, the absolutely correct choice of technology!
And then I realized that in Nim, of all languages, I had to go through example code in an obscure repo and write a novel to get there. What a shame...
Any good reason for this (and {.base.} methods) other than bureaucracy?
Your concepts are wrong: saying that

type CircleShape = concept a
  a.r
  buildCircleProps(a)

means that for anything to match CircleShape, it must already have a buildCircleProps defined for it. If you do not want that, remove the buildnnnn() lines and it will compile properly, without the boilerplate (see the sketch at the end of this post).

The lengths to which people go to preserve extreme levels of OO abstraction and inheritance baffle me. I've found zero use for any of it in the real world. I'm going to be "that guy" and suggest that Nim's objects map to structs fairly closely for a reason, and that maybe learning to simplify your abstractions will help you adapt to Nim more readily (and construct more maintainable software). Your example contrivance ironically demonstrates the contrived ridiculousness of those extra layers of abstraction.
If my comment strikes you as hostile or ill-mannered, then please ignore me.
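To illustrate the point, here is a minimal sketch with the requirement dropped, reusing the OP's definitions (`Disc` is a made-up type for the example):

from math import PI, pow

type
  ShapeProps = tuple[shape: string, area, perimeter: float]
  CircleShape = concept a
    a.r   # the only requirement left: some `r` the body below can use

func buildCircleProps(circle: CircleShape): ShapeProps =
  ("Circle", PI * circle.r.pow(2), 2.0 * PI * circle.r)

# No forward-declared generic stubs needed; anything exposing an `r`
# now matches, including a hypothetical standalone type:
type Disc = object
  r: float

assert buildCircleProps(Disc(r: 1.0)).shape == "Circle"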
Try this:
const shapes = [
  Circle: Shape(kind: Circle, r: 10).buildCircleProps,
  Rectangle: Shape(kind: Rectangle, w: 10, h: 10).buildRectangleProps,
]
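If I read that right, it precomputes a ShapeKind-indexed array of props at compile time. A quick usage sketch, assuming the definitions above are in scope:

# `shapes` is an array[ShapeKind, ShapeProps], so lookup is plain enum
# indexing; the values are whatever the formulas above produce.
echo shapes[Circle].area          # ~314.159 (PI * 10^2)
echo shapes[Rectangle].perimeter  # 40.0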
I no longer get the point of this forum post and should no longer participate, but... Thanks to leorize's https://github.com/alaviss/union there is a usable union solution.
import union

type
  CodeCmd = distinct string
  ReplaceCmd = object
    elementId, content: string
  Command = union(CodeCmd | ReplaceCmd)

proc eval(code: CodeCmd): lent string = string code
proc eval(replace: ReplaceCmd): lent string = replace.content

var c = CodeCmd"alert('hi')" as Command
assert unpack(c, eval(it)) == "alert('hi')"
c = ReplaceCmd(content: "meh") as Command
assert unpack(c, eval(it)) == "meh"
You're doing manual boxing/unboxing there. Here's the diff with union types; judge for yourself which one is better.
The one that doesn't imply a language change for the loud minority consisting of one being who will use TypeScript anyway in 3 ... 2 ... 1. ;-)
Variants have problems. For example, many functions need just a single variant, but you have to pass the whole variant type, and you end up with lots of unneeded checks like ``variant.kind == 'a'`` etc., plus some other type-safety problems.
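A small sketch of that friction, reusing the Shape variant from the opening post (`diameter` is a made-up proc for illustration):

# `diameter` only makes sense for circles, yet it must accept the
# whole variant and guard the kind at runtime:
proc diameter(s: Shape): float =
  assert s.kind == Circle   # the "unneeded check"; a Rectangle gets
                            # past the compiler and fails only at runtime
  2.0 * s.r

echo diameter(Shape(kind: Circle, r: 5.0))      # 10.0
# diameter(Shape(kind: Rectangle, w: 1, h: 1))  # compiles, dies at runtime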
I do believe that my example, the one that sparked this whole thing, demonstrates (if nothing else) that you can use Nim concepts to render object variants type-safe...
If you feel the need for it, feel free to use some interface macro. And not this "it needs to be in the stdlib!" again. Point (2) applies to "must be in stdlib" too.
Well, I'm the one who suggested that an interface macro should be in the stdlib, and I feel somewhat bad about it.
In fact, what I'd like to see in the stdlib is standard types (but not necessarily implementations) that every implementation could agree on, in particular for I/O streams. I believe we need good stream types that could be implemented by an HTTP server library, by the operating system's files, or by anything in between, so that they suit everyone's needs. std/streams, last time I checked, is not very good at that. The reason I said iface should be in the stdlib is that it is the one way I know of for multiple parties to agree on an interface not tied to a specific implementation. But anything that could do the job of standard types would work equally well, in my opinion.
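To make that concrete, here is a deliberately tiny sketch; `Readable`, `StringSource` and `drain` are all hypothetical names, not the std/streams API:

type
  Readable = concept var s
    s.read(int) is string   # read up to n bytes; "" signals end of stream

# An in-memory source standing in for a socket, file or HTTP body:
type
  StringSource = object
    data: string
    pos: int

proc read(s: var StringSource, n: int): string =
  result = s.data[s.pos ..< min(s.pos + n, s.data.len)]
  s.pos += result.len

# A consumer written only against the agreed-on type, not any
# particular implementation:
proc drain(s: var Readable): string =
  while true:
    let chunk = s.read(4)
    if chunk.len == 0: break
    result.add chunk

var src = StringSource(data: "hello stream")
assert drain(src) == "hello stream"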
Preventing boilerplate, in a language with macros as a focus, is not a good enough reason to justify direct union/sum types over object variants. And if you want type safety, you will still experience a similar amount of friction with either design.
I also know of only one language that implements them, Crystal. So it's a lot of responsibility to get it right.
That being said, there are (fairly tentative) recent RFCs with a lot of relevant discussion, https://github.com/nim-lang/RFCs/issues/527 and https://github.com/nim-lang/RFCs/issues/525.
in a language with macros as a focus
That's the problem in itself. Do I want a language with "macros as a focus"? Absolutely not. Why would I want extra complexity in my project if I can avoid it? Macros are nice to have, but they should be avoided unless you absolutely need them, like in 0.1% of cases, to implement specific things like test or assert etc. Using macros "as a focus" is a fundamentally wrong approach.
Would you want to manage a complicated project filled with custom macros? (Scala, by the way, is famous for that. I know many companies that mistakenly did some projects in Scala and then tried to get rid of it, rewriting the Scala in Java or Kotlin, ha ha.) You would have a hard time shifting teams around, because working with such a project requires a long learning time. Definitely not; it's a big no for any software company.
TypeScript implements union types.