I moved away from Python after eventually, almost grudgingly, realising that types help more than they hinder - although far less than popularly imagined during the data science & prototype-exploration phase... anyway.
For the mundane & semi-mundane part of the week I am now fully invested in TypeScript, on both the front and back end. One homogeneous language across the stack works better than JS & Django for the web stuff that is so commercially ubiquitous these days. Deno is probably gonna be great.
In creative work I can also get back to an old love of digital romanticism / visualisations (p vs np). But of course I now want more speed. Much more. And I adore Nim - but I think it is chasing the wrong targets; imo its closest rivals are TypeScript & Node.
What I've seen so far of GPU control (regardless of language choice) has a very high learning curve, and I want to bust out of the browser sandbox from time to time anyway because some GIFs are going to take hours to process... is there a decent DSL that helps get GPU leverage? Pointers / tips gratefully received.
I have been learning GPUs and how best to use them, and it's not easy. You can't just point a GPU at your CPU code to make it faster; CPUs and GPUs are good at different things. Many algorithms are awkward and actually slower on the GPU, so it makes sense to keep those on the CPU, while highly parallelizable tasks are a great fit for the GPU. Most workloads end up as a hybrid, where the CPU sets up the data in a way that makes it easy and fast for the GPU to do its thing.
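To make that hybrid pattern concrete, here is a tiny Nim sketch (plain CPU code, no GPU library; `brighten` and the flat buffer are just illustrative names). The per-element proc is exactly the shape of work that ports well to a compute shader, and the surrounding proc is the CPU-side setup:

    # Sketch of the hybrid pattern: the CPU lays data out in a flat buffer,
    # and the per-element "kernel" is the part you would move to the GPU.
    proc brighten(pixel, amount: float32): float32 =
      ## Per-element work: no neighbours, no shared mutable state,
      ## so every element can be processed independently (GPU-friendly).
      min(pixel + amount, 1.0'f32)

    proc process(frame: seq[float32]): seq[float32] =
      ## CPU side: prepare a flat, contiguous buffer and fan the work out.
      result = newSeq[float32](frame.len)
      for i in 0 ..< frame.len:   # on a GPU this loop becomes thousands of threads
        result[i] = brighten(frame[i], 0.1'f32)

    when isMainModule:
      echo process(@[0.2'f32, 0.5'f32, 0.95'f32])

Anything shaped like that inner loop (no dependencies between iterations) is the easy win; anything where each step needs the previous step's result tends to stay on the CPU.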
I have written a nim-to-glsl compiler that might start you off: https://github.com/treeform/shady . It does both the vertex and fragment shaders you might use for games, and compute shaders you might use for more "general" computing.
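Very roughly, the workflow is: write the shader as an ordinary Nim proc and compile it to a GLSL string you hand to OpenGL/WebGL. A stripped-down sketch (simplified from memory rather than copied from the repo, so the exact names like `toGLSL` may not match the current README):

    import vmath   # vector types (Vec4, vec4, ...) that shady builds on
    import shady

    # Write the shader as a normal Nim proc; `var` params act as shader outputs.
    proc solidRed(fragColor: var Vec4) =
      fragColor = vec4(1.0, 0.0, 0.0, 1.0)

    # Compile the Nim proc to a GLSL fragment-shader string.
    # (Check the repo README for the current entry point and conventions.)
    echo toGLSL(solidRed)

The nice part is you get Nim's tooling and type checking while authoring, and plain GLSL out the other end.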