I'd say it's much less useful, especially considering that 1-char "numbers" are just single digits, so you can get a number from an ASCII character easily:
echo ord('5') - ord('0')  # digit characters are contiguous in ASCII, so this prints 5
I don't mean to sound catty when I say this, but is Nim not meant for beginner programmers as well? This is something I struggled with while learning Nim. The solution I eventually found was to turn it into a string using $ and then parse that (see the sketch at the end of this post); however, how to do this was not immediately obvious to me, and I didn't know the ord trick either. I love Nim, and I love most of the design decisions behind it. However, it's small stuff like this, where I have to spend twenty minutes googling to solve something that could have been fixed by either putting an extra two lines into the standard library or posting the answer somewhere accessible by Google.
Again, if Nim is not meant for beginners, please advertise it as such and don't let poor fools like me fall in love only to have our hearts broken by non-obvious complexity (this sounds a bit hyperbolic, but I used to have a passion for reading about languages until I found Nim, which I vibe with on a fundamental level).
I would also be completely fine with it if my experience improved the experience of future programmers like me.
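For reference, this is roughly what I ended up with: the $-to-string route next to the ord trick from above (the variable names are just for illustration):

import std/strutils  # parseInt lives here

let c = '7'

# Route 1: turn the char into a one-character string with `$`, then parse it.
let viaString = parseInt($c)    # 7

# Route 2: the ord trick; digit characters are contiguous,
# so subtracting ord('0') gives the numeric value directly.
let viaOrd = ord(c) - ord('0')  # 7

echo viaString, " ", viaOrd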
The best way to have this happen is to document your issues formally in a pull request to update the docs.
It's not as concrete as the forum, but the real-time Nim chat is generally pretty helpful for solving these small issues, so if you want to get past them quickly it's the best place to go, in my opinion.
As your programs grow bigger, avoid the string datatype. string indicates a lack of structure; it's effectively dynamically typed code. Use the static type system to the best of your abilities.
Nim is not optimized for beginners, but it's not a bad choice either.
Sure, the documentation could be better and sometimes "2-line helper procs" are missing from the standard library, but those are details that won't help you as much as you seem to think. You don't need "more fish"; you need to learn how to fish.
All great points (although I'm not entirely sure what you have against PEGs), but even as an experienced programmer, both in Nim and other languages, I do sometimes want to just have the fish handed to me. Having a parseInt overload for char that just does the ord trick Yardanico showed (maybe with an assert to check that the character is actually in the range '0'..'9') would be nice. I know how to do it myself, but sometimes it's nice to have things in a library.
And @xigoi, while parseInt might not be the perfect name for it, I don't think anyone will see parseInt(c: char): int and be confused about what that function does. And in a way it does actually do parsing, since it checks whether c is in the correct range and converts it to a different value than a normal ord would.
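Something like this, perhaps (not in the standard library today, just a sketch of what I mean; a real version might raise a ValueError instead of asserting):

func parseInt(c: char): int =
  ## Sketch of the proposed overload: the ord trick plus a range check.
  assert c in {'0'..'9'}, "not a decimal digit: " & c
  ord(c) - ord('0')

echo parseInt('5')  # 5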
"As your programs grow bigger, avoid the string datatype. string indicates a lack of structure, it's effectively dynamically typed code. Use the static type system to the best of your abilities."
Another example of a simple sentence that, as an amateur (beginner) programmer, will have me scratching my head and searching Google, etc., for a few hours. My amateurish programs are filled with string-type variables...
you don't start learning to read with the Bible
I don't really disagree, but I would add that I think this phrasing speaks to the modern age of perfectly tailored educational content, which has us all somewhat spoiled, culminating in "programming by StackOverflow copy-paste". Specifically, the Gutenberg Bible revolutionized literacy in Medieval Europe. OTOH, it sure can be nice to be spoiled. :-)
FWIW, I think that book by Jeff Erickson, while nice, is more a 2nd/3rd course for most. It assumes readers know about balanced trees, hash tables, etc. I'm also not sure "fancy graph algorithms" vs. "just using libs/concepts" is most helpful for true beginners who are just learning how to "decompose problems". I also like Sedgewick as a first/early course { though I had the original Pascal edition. I haven't really looked at the Java editions. Maybe they become more dry, corporate, verbose & roundabout, like Java itself. ;-) } Cormen et al. seems too formal, IMO, and it is also helpful to use "real" prog.langs so learners can just type stuff in and try it out (like Stefan's book) without "hand translating". I personally also loved Knuth's TAOCP, but I agree that it is not for everyone.
scratching my head and searching
This is roughly how "Unix shells" and other command languages like Tcl and command.com work. Any "unitype-driven" system has all the same issues as an "everything is a PyObject" system, for all the same reasons. Bob Harper of CMU likes to talk this way. I don't know if that blog post helps or is better than your Googling or is just a wall of text for the already knowledgeable, but perhaps it makes this thread more self-contained.
There is basically a natural complexity sequence from unitype/strings to "flat structures" like database tables/spreadsheets/struct-like objects, to trees (such as file hierarchies), to more general graphs. A beginner might think that by "sticking to the simple string/PyObject" they are "saving complexity", but really they are just squeezing complexity Jell-O - it oozes into the logic of how they produce/consume those strings { and similarly for the next steps. DB/spreadsheet people "link" together tables for "joining"/zippering/etc. }.
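To make that first step concrete, a small made-up Nim sketch (the Person type and the "name,age" format are invented for illustration): parse the string once at the boundary, and the rest of the program works with typed fields instead of re-parsing strings everywhere.

import std/strutils

type
  Person = object   # a "flat structure": one step up from a raw string
    name: string
    age: int

proc parsePerson(line: string): Person =
  ## Parse a "name,age" line once, at the edge of the program.
  let fields = line.split(',')
  Person(name: fields[0].strip, age: parseInt(fields[1].strip))

let p = parsePerson("Ada, 36")
echo p.name, " is ", p.age  # the rest of the code never re-parses the string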
It is difficult to express how to keep life simple "in the large" (in this and honestly most areas of life!) without many examples and real shared experience to reference - the true cost of learning. That cost needs personal motivation which is often, well, personal, as is how "end-to-end/complete" things must be to inspire. E.g., maybe graph algos are the best starting point for a social network junkie. :-)
Maybe textbooks themselves are dinosaurs. Maybe lots of examples with a "Wikipedia-driven way" is the new, best way? Or YouTube's with nice animations for those who "reason more graphically/visually"?
Nah, read Sedgewick first. ;-)
But I still don't know what static type I should use to replace strings (pre-defined arrays of characters?).
type
Filename = distinct string
Displayname = distinct string
Option = enum ...
Options = set[Option]
etc.
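To illustrate the payoff (with a made-up openFile proc, purely for illustration): once the wrappers are distinct types, mixing them up becomes a compile error instead of a run-time surprise.

type
  Filename = distinct string      # same wrappers as above
  Displayname = distinct string

proc openFile(f: Filename) =      # hypothetical proc, just to show the check
  echo "opening ", string(f)      # convert back explicitly where needed

let path  = Filename("/tmp/report.txt")
let label = Displayname("Report (draft)")

openFile(path)     # compiles
# openFile(label)  # compile-time error: got Displayname, expected Filename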
This is roughly how "Unix shells" and other command languages like Tcl and command.com work. Any "unitype-driven" system has all the same issues as an "everything is a PyObject" system, for all the same reasons.
There is actually a big difference between Unix shells and the PyObject system. The PyObject system does not introduce a hairy mess of error-prone quoting rules.
perfectly tailored educational content [...], culminating in "programming by StackOverflow copy-paste"
I think you're making a big leap here; I don't think these two things are really related.
Specifically, the Gutenberg Bible revolutionized literacy in Medieval Europe.
I knew that comparison was not very strong as soon as I typed it. To be frank, you don't really need to get into the stuff you're reading if all you need is the reading skill itself. Not so much with learning to program. What I was emphasizing is just the sheer amount of information, which can surely overwhelm the student, especially if you take into account that the Bible is ~800k words and TAOCP 1-4A is ~2500k words (and code!). And don't get me wrong, this book is a treasure and a blessing for us all, and I loved those parts of it that I read.
Maybe textbooks themselves are dinosaurs. Maybe lots of examples with a "Wikipedia-driven way" is the new, best way? Or YouTube's with nice animations for those who "reason more graphically/visually"?
The answers are: no; it's the old best way (good textbooks have lots of examples); and generally no, though yes for some domains/specific topics.
I've skimmed over the old edition of Sedgewick; it looks like a different book altogether. The later editions (at least the 4th ed. from Addison-Wesley) have much more helpful diagrams and visualizations.
@cagyul, Just supplement different sources, check yourself meticulously with practical exercises and soon you'll get what works best for you personally. Don't treat any single suggestion as set in stone.
What Araq says is very fair in practice, but maybe requires clarification in theory. Quoting-rule complexity is not intrinsic to the command-language problem (which might almost be defined as "everything is a string literal, but without quotes unless they are 'necessary', whatever that means"). At least it doesn't/didn't need to be quite so error-prone.
E.g., the admittedly obscure Plan 9 rc shell (PDF), which has a Unix port (dating back to the early 90s), made simple quoting a priority and came up with much simpler rules than the Bourne/Korn/Zsh/Bash tradition (perhaps command.com, too): just the single quote (and also backslash escapes). It is very true that what becomes popular seems to be hairy messes, in this and many things. :-)
Also, Python just has enough other syntax going on to require quoting of string literals, even if semantically everything is a PyObject underneath. There is still a lot of variety: single-, double-, and triple-quoted, each combinable with r-, f-, u-, and b-prefixes (and maybe I'm missing one). That's 3*5=15 ways to spell a string literal (including unprefixed). So I wouldn't say Python fully escaped "quoting hell". :-) { I think Nim does better here by just being more flexible. }
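For example, a minimal sketch of the Nim side (Nim also has generalized raw literals like re"...", not shown here):

let escaped = "col1\tcol2"         # ordinary literal with backslash escapes
let raw = r"C:\temp\data.txt"      # raw literal: backslashes kept as-is
let multi = """first line
second line"""                     # triple-quoted: spans lines, no escapes needed
echo escaped
echo raw
echo multi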
@Zoom - I think we also don't disagree, but to clarify I almost qualified that StackOverflow bit with "but I'm not sure I'd call that 'learning'". Maybe. Depends on what you mean by "education", "depth of knowledge", etc. Programming by just looking up answers/trial and error is a real thing, esp. for beginners, though - enough to be a sales pitch.
I fully support/reiterate "no suggestion as set in stone" and also recommend @cagyul (or anyone) muster the most specific questions they can, in other threads, and with the most problem context possible. Abstract questions have only abstract answers which are often unsatisfying for beginners. One reason Rosetta Code can be nice is its "fully worked out" nature, but the problems/scope are often a bit more limited than Sedgewick/algo books/etc.