If you look from far enough away, yeah, they are both programming languages.
If you actually look closer, you'll find apples and oranges.
Already discussed in https://forum.nim-lang.org/t/10159.
And it's Nim, not NIM (it's NOT an abbreviation!!!)
Serious answer:
You can actually download and use the Nim compiler. It is mature and will hit 2.0 soon.
Nim is open-source.
Mojo appears to use manual memory management with borrow checking, like Rust, whereas Nim has fully automatic, yet deterministic, memory management via ARC (https://nim-lang.org/blog/2020/10/15/introduction-to-arc-orc-in-nim.html)
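To make "deterministic" concrete, here is a minimal Nim sketch (the `Resource` type is invented purely for illustration). Compiled with `--gc:arc` (spelled `--mm:arc` on newer compilers), the destructor runs exactly at scope exit rather than at some later GC pass:

```nim
type
  Resource = object
    id: int

proc `=destroy`(r: var Resource) =
  # runs deterministically when `r` goes out of scope under ARC,
  # no tracing-GC pause involved
  if r.id != 0:
    echo "releasing resource ", r.id

proc work() =
  let r = Resource(id: 42)
  echo "using resource ", r.id
  # the destructor fires right here, at the end of `work`

work()
echo "after work"
```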
Why is the marketing always targeted at A.I.?
> Mojo 🔥 — a new programming language for all AI developers.
Too bad. I am an HPC researcher, so probably not for me ;)
Yes, Mojo is a whitespace-significant language. It introduces `let` and `var` statements. It has simple algebraic data types with `enum`s. It just copies too much Python: why did it keep the `__special_methods__`, the `self` keyword and other ugly syntaxes? It is a dynamically typed language, and no, the developer has to type in the size of the elements used for computing, since that depends on the use-case constraints.
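To make the contrast concrete, here is a small Nim sketch (the `Vec` type is made up for illustration): what Python-style languages express with `__add__`/`__str__` and an explicit `self` is just ordinary procs in Nim:

```nim
type
  Vec = object
    x, y: float

# operator overloading is a plain proc, no __add__ / self boilerplate
proc `+`(a, b: Vec): Vec =
  Vec(x: a.x + b.x, y: a.y + b.y)

# stringification is the `$` proc instead of __str__ / __repr__
proc `$`(v: Vec): string =
  "(" & $v.x & ", " & $v.y & ")"

echo Vec(x: 1.0, y: 2.0) + Vec(x: 3.0, y: 4.0)  # prints (4.0, 6.0)
```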
I am not adding anything substantial beyond the post linked above. Mojo will probably be a better language than Python, but it is in a toy-project state right now, and it will never be able to make the same claims as Nim.
> Why is the marketing always targeted at A.I.?
Because that's where the VC money is at? It's best to think of what drives VCs: FOMO and not looking bad. AI is all the rage, and Lattner is definitely an "IBM name" at this point. To be fair, the state of AI programming seems subpar.
One of the main ideas behind Mojo's syntax is that Mojo aims to be compatible with Python, enabling an easier transition from Python to Mojo. Similar to how C++ relates to C, it is meant to be a superset of Python.
In other words, you can copy and paste your existing Python code into a Mojo file and it should run. If you want performance benefits, then you need to use the added syntax for types and compiled procedures/functions, etc. This seems to be due to lessons learned from Swift, i.e. the low adoption of Swift for TensorFlow because researchers did not want to transition. This probably wasn't the only reason, but it was probably one of them. For large companies with existing large Python code bases this makes the transition to Mojo trivial (batch-rename the files), allowing for optional, opt-in performance features.
On the backend side it aims to be similar to Julia, with REPL/Jupyter support implemented via the LLVM ecosystem. It differs from Julia in building on MLIR, with GPU and other device support to be implemented and extended over time.
The language is in a pretty early state. We’ll have to see how it goes when they open source it.
We'll see how it goes...
Swift for TensorFlow had large projects planning to migrate to it, and it still died, even with Google/Apple backing.
I'm not saying Mojo will suffer the same fate (they do have large funding), but it would require overwhelming advantages to overcome the weekly output of AI researchers around the world, in particular those funded by Facebook, Google, Nvidia, Microsoft and OpenAI.
That ecosystem advances very fast.