I've been looking at using Nim to create extension modules for Python. I would typically do this using Cython and C/C++ with the workflow: write in Python -> profile -> rewrite slow functions in Cython -> compile to a shared object (via C). Using the nimpy library, I've been doing essentially the same thing with Nim, which after the initial set-up actually seems easier!
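To give an idea of the Python side of that workflow: once the Nim file (which exports procs via nimpy's exportpy pragma) is compiled with something like nim c --app:lib --out:fastmath.so fastmath.nim (fastmath.pyd on Windows), it imports like any other extension module. The module and function names here are hypothetical:

# fastmath is a hypothetical nimpy-built extension module;
# once compiled, it imports like any other Python extension
import fastmath

print(fastmath.sum_squares(10_000))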
But I'm confused about how to approach distribution of the extensions as Python libraries. With Cython-written extensions, I would:
1. create a source distribution of .py and .c files. The end user builds this so it matches their system architecture/OS (see the setup.py sketch after this list).
2. and/or create platform-specific wheels for the most obvious platforms from .py and .so/.pyd files.
3. and/or create a general wheel from .py and .pyx files, with a Cython dependency.
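For reference, option 1 in a Cython project is usually just a setup.py that lists the pre-generated .c file; a minimal sketch, with hypothetical package and module names:

from setuptools import setup, Extension

setup(
    name="mypkg",
    version="0.1.0",
    packages=["mypkg"],
    # the .c file was generated by Cython ahead of time and shipped in
    # the sdist; the end user's pip compiles it on their own machine
    ext_modules=[Extension("mypkg._fast", sources=["mypkg/_fast.c"])],
)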
With Nim, the generated .c is platform-specific; does this mean 1) is out?
Creating a Nim dependency for 3) doesn't seem feasible to me, even if the end user were willing to install Nim (Cython is a PyPI package, so the installer can grab it and put it in the correct place automatically).
So that just leaves 2): creating all the combinations (e.g. 22(!) for numpy) of binaries for platform-specific wheels.
Are there any better methods to do this? The only Nim extension PyPI libraries I've seen (e.g. faster_than_requests) are not cross-platform enough even for my limited uses (Linux and Windows, 32- and 64-bit), so I can't say how well that approach works.
Otherwise, I think the best way to do that is to make producing a .dll/.so part of your CI (i.e. continuous deployment). There are a couple of examples of CI set-ups here: https://github.com/nim-lang/Nim/wiki/BuildServices, but I'm not sure there are any for auto-deployment.
https://github.com/yglukhov/nimpy/wiki#publish-to-pypi
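As a concrete sketch of the CI idea, the build step can be a small Python script that shells out to the Nim compiler to produce the shared library that the wheel tooling then packages. The module name and output paths here are assumptions:

import subprocess
import sys

# Python extension modules need a .pyd suffix on Windows, .so elsewhere
ext = ".pyd" if sys.platform == "win32" else ".so"
subprocess.run(
    ["nim", "c", "--app:lib", "-d:release", f"--out:mymodule{ext}", "mymodule.nim"],
    check=True,
)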
I use Docker for Windows on Windows, and most people seem to be using Docker anyway. For other OSes, both languages have an if construct that can be used to branch the build.
For example, in a setup.py:

import sys
from glob import glob

if sys.platform == "win32":
    sources = glob("windows/*.c")
elif sys.platform.startswith("linux"):
    sources = glob("linux/*.c")
^ Basically, you see even crazier stuff in Python setup.py files anyway.
I’ve been thinking about this problem for a while too.
I copied Jonesmartinze's three options here, numbered for easier reference.
1. create a source distribution of .py and .c files. The end user builds this so it matches their system architecture/OS.
2. and/or create platform-specific wheels for the most obvious platforms from .py and .so/.pyd
3. and/or create a general wheel from .py and .pyx files, with a Cython dependency.
Number 1 is out because the C code Nim generates is platform-dependent.
Number 2 is possible and numpy does it. I haven't found a simple package to use as a pattern, though.
If I’m understanding correctly, juancarlospaco is pointing out a fourth option:
4. generate C code for the target platforms ahead of time, include each set of C files in the Python package, and at install time compile the correct set for the target platform.
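If it works the way I'm reading it, option 4 is essentially juancarlospaco's pseudocode fleshed out into a full setup.py. A sketch, assuming the package ships pre-generated C under per-platform directories (all names here are hypothetical):

import sys
from glob import glob
from setuptools import setup, Extension

# pick the set of Nim-generated C files matching the installing machine
if sys.platform == "win32":
    c_dir = "generated_c/windows"
elif sys.platform == "darwin":
    c_dir = "generated_c/macos"
else:
    c_dir = "generated_c/linux"

setup(
    name="mypkg",
    version="0.1.0",
    packages=["mypkg"],
    ext_modules=[
        Extension(
            "mypkg._nimext",
            sources=glob(c_dir + "/*.c"),
            # nimbase.h must be shipped alongside the generated C,
            # since the end user won't have Nim installed
            include_dirs=[c_dir],
        )
    ],
)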
However, the yglukhov nimpy wiki example linked above only shows one platform, ubuntu-latest, not the Windows and Mac builds I was expecting. What am I missing?
Assuming 4 works, which is better: 2 or 4?