Are there any packages / issues that would be good candidates for Hacktoberfest? That is, issues that are pretty self-contained or good introductions to the package?
I have my eye on NimData, Neo, and ArrayMancer for myself, but it might be worth adding the "Hacktoberfest" label to issues on your project so that others participating can find them.
I know we already have the "Easy" label for the Nim project itself, so I think it shouldn't be too hard to find good Hacktoberfest issues from there.
@perturbation, Thanks for your interest.
Many of the low-hanging fruit in Arraymancer are done, but I could use your help implementing reduction functions like "max" and "min", and in-place functions like msqrt and msin, as mentioned here: https://github.com/mratsim/Arraymancer/issues/58
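To make the shape of the request concrete: an axis-wise reduction collapses one dimension of a tensor while keeping the others. Here is a minimal Python sketch of that idea for a rank-2 input (illustration only; the real implementation would be in Nim, and the linked issue defines the actual requirements):

```python
def max_along_axis(matrix, axis):
    """Reduce a 2-d list-of-lists along the given axis (0 = columns, 1 = rows)."""
    if axis == 0:
        # Maximum of each column: result length == number of columns.
        return [max(col) for col in zip(*matrix)]
    elif axis == 1:
        # Maximum of each row: result length == number of rows.
        return [max(row) for row in matrix]
    raise ValueError("axis must be 0 or 1 for a rank-2 input")

m = [[1, 5, 3],
     [4, 2, 6]]
print(max_along_axis(m, 0))  # [4, 5, 6]
print(max_along_axis(m, 1))  # [5, 6]
```

The in-place variants (msqrt, msin, ...) would instead mutate each element of the buffer directly rather than allocating a result.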
From their docs, it looks like neo and Arraymancer are both intended to be substitutes for NumPy. Can someone explain why I would prefer one over the other?
Nim could be very useful in data science and scientific computing; it's good to see some libraries sprouting up.
I will reply for Neo and then let mratsim speak for ArrayMancer.
Neo was born first (at least, its predecessor linalg was). I tried to give it an interface more familiar to people coming from linear algebra (everything there is expressed in vectors and matrices), as opposed to the popular Numpy approach where everything is an n-dimensional tensor. Not that tensors don't come from mathematics, but there are many kinds of tensors in mathematics, and they are a bit more complicated than simply an n-dimensional table, so I left that part for later. I focused on providing bindings to the BLAS, LAPACK and CUDA libraries, so that the actual computations run fast, and postponed abstractions such as tensors (I have not decided on the interface yet).
For similar reasons, it is only recently that I introduced operations such as the element-wise product of vectors, which do not have a natural mathematical interpretation (but it is there at last). I also tried to design Neo in such a way that it could accommodate matrices and vectors allocated on the stack.
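For readers unfamiliar with the term: the element-wise (Hadamard) product multiplies two vectors component by component, unlike the dot product, which sums those component products into a scalar. A tiny Python sketch, purely for illustration:

```python
def hadamard(u, v):
    """Element-wise (Hadamard) product of two equal-length vectors."""
    assert len(u) == len(v), "vectors must have the same length"
    return [a * b for a, b in zip(u, v)]

print(hadamard([1, 2, 3], [4, 5, 6]))  # [4, 10, 18]
```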
My recent interest is in adapting Neo to work with sparse matrices, on both the CPU and GPU side - which is complicated, since, unlike BLAS for dense matrices, there is no common approach across the two. I would also like to focus on matrix decompositions and other features from LAPACK (unfortunately, these are not always implemented on CUDA).
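For context on what a sparse representation looks like: one widely used layout is CSR (compressed sparse row), which stores only the non-zero values plus two index arrays. This is an illustrative Python sketch of the encoding, not Neo's actual design:

```python
def to_csr(dense):
    """Convert a dense list-of-lists matrix to (values, col_indices, row_ptr)."""
    values, col_indices, row_ptr = [], [], [0]
    for row in dense:
        for j, x in enumerate(row):
            if x != 0:
                values.append(x)
                col_indices.append(j)
        # row_ptr records, for each row, the running count of stored non-zeros.
        row_ptr.append(len(values))
    return values, col_indices, row_ptr

dense = [[5, 0, 0],
         [0, 0, 3],
         [0, 2, 0]]
print(to_csr(dense))  # ([5, 3, 2], [0, 2, 1], [0, 1, 2, 3])
```

The difficulty mentioned above is that, while CSR itself is common, the CPU and GPU libraries expose different operations and conventions around it, so a single abstraction is harder to design than for dense BLAS.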
I also moved the actual library bindings into separate packages (CUDA, BLAS, LAPACK) so that they can be used by other libraries in the Nim ecosystem.
All that said, work on Neo has slowed down considerably. In general, I have not had much time to work with Nim lately - I still plan to continue with Neo, but I have made no actual progress in the last few months. ArrayMancer is currently more active, but I will let mratsim speak about its features.
Arraymancer was born between linalg and Neo and is targeted at machine learning, with a focus on deep learning and computer vision (and hopefully speech and language later).
I started Arraymancer because you need 4-dimensional tensors for deep learning on images (Number of images, RGB colors, height, width) and 5d for videos or 3D images.
It takes a Numpy/Julia/Matlab-like multidimensional array approach, meaning operations like reshaping, concatenating arrays over arbitrary dimensions, permuting dimensions, and working with stacked 2d matrices in a 3d tensor are possible and efficient.
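What makes those operations cheap in a multidimensional array design is the shape/strides scheme: a 4-d (N, C, H, W) tensor lives in one flat buffer, and permuting dimensions just permutes shape and strides without copying data. A Python sketch of the row-major stride arithmetic (illustration only; Arraymancer's internals are in Nim):

```python
def row_major_strides(shape):
    """Strides (in elements) for a contiguous row-major buffer of this shape."""
    strides = [1] * len(shape)
    for i in range(len(shape) - 2, -1, -1):
        strides[i] = strides[i + 1] * shape[i + 1]
    return strides

def flat_index(indices, strides):
    """Map an n-d index to a position in the flat buffer."""
    return sum(i * s for i, s in zip(indices, strides))

shape = (2, 3, 4, 5)               # 2 images, 3 color channels, 4x5 pixels
strides = row_major_strides(shape)
print(strides)                      # [60, 20, 5, 1]
print(flat_index((1, 2, 3, 4), strides))  # 60 + 40 + 15 + 4 = 119
```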
There are a few twists:
Like Nim, I chose value semantics for my tensors, meaning assignment makes a copy. You can opt in to "unsafe" operations like "unsafeReshape" or "unsafeTranspose" that provide a view over the same data.
This avoids a lot of beginner gotchas and, as with broadcasting, makes it easy to grep for potential issues.
Note: for CudaTensors I currently can't do this, due to = overloading issues.
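The copy-vs-view distinction can be shown with a small Python sketch. Python lists naturally behave like the view case, so an explicit deep copy stands in for Arraymancer's default value semantics (illustration only, not the Nim API):

```python
import copy

a = [[1, 2], [3, 4]]

b = copy.deepcopy(a)   # value semantics: an independent copy, like plain tensor assignment
b[0][0] = 99
print(a[0][0])         # 1 -- mutating the copy does not affect the original

c = a                  # view semantics: like the "unsafe" operations, data is shared
c[0][0] = 99
print(a[0][0])         # 99 -- both names refer to the same underlying data
```

With value semantics as the default, any aliasing in the codebase is marked by an explicit "unsafe" call, which is what makes potential sharing bugs greppable.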
My short-term plans (consolidating the base):
Mid-term plans (focus on data):
Long-term plans (hardware and format):
Very long-term plans (wishlist):
Very (very !) long-term plans (wishlist reloaded):
What is not a focus: