I’ve just released v1.3.0 of my chess engine Gyatso, written entirely in Nim, and this version marks a major milestone: NNUE evaluation is now fully integrated and working.
🔹 What’s new
Added a working NNUE (Efficiently Updatable Neural Network) evaluation
Current architecture: (768 → 256) × 2 → 1
Trained on ~242 million positions
Optimized for fast CPU inference
Fully integrated with the existing search (LMR, pruning, move ordering, etc.)
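For readers new to NNUE, the "efficiently updatable" part means the first-layer activations (the accumulator) are patched incrementally as pieces move, rather than recomputed per position. A minimal Nim sketch of that idea, sized to the (768 → 256) × 2 → 1 architecture above — the proc names and weight layout here are illustrative, not Gyatso's actual code:

```nim
const
  InputSize = 768    # piece-square features (12 piece types x 64 squares)
  HiddenSize = 256   # first-layer neurons, one accumulator per perspective

type
  Accumulator = object
    values: array[HiddenSize, int16]

# Hypothetical weight tables; a real engine loads these from the trained net file.
var
  hiddenWeights: array[InputSize, array[HiddenSize, int16]]
  hiddenBiases: array[HiddenSize, int16]

proc refresh(acc: var Accumulator, activeFeatures: openArray[int]) =
  ## Full recomputation from scratch, for positions where no
  ## incrementally-updated parent accumulator is available.
  acc.values = hiddenBiases
  for f in activeFeatures:
    for i in 0 ..< HiddenSize:
      acc.values[i] += hiddenWeights[f][i]

proc addFeature(acc: var Accumulator, f: int) =
  ## A piece arriving on a square touches only one weight column.
  for i in 0 ..< HiddenSize:
    acc.values[i] += hiddenWeights[f][i]

proc removeFeature(acc: var Accumulator, f: int) =
  ## A piece leaving a square subtracts the same column.
  for i in 0 ..< HiddenSize:
    acc.values[i] -= hiddenWeights[f][i]
```

Making a move then reduces to a handful of add/remove calls on the accumulator, which is what keeps per-node evaluation cost low during search.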
This replaces/augments the previous handcrafted evaluation and has already shown promising improvements in strength and scalability.
🔹 Project context
Gyatso has been under active development as a learning-focused but performance-driven engine. Before NNUE, it already had:
Advanced search techniques (LMR, pruning, heuristics)
Strong handcrafted eval (~2800 Elo range)
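As a quick illustration of the search side: late move reductions (LMR) search moves ordered late in the move list at a reduced depth, re-searching at full depth only if they unexpectedly raise alpha. A common log-based reduction formula looks like this in Nim — the thresholds and scaling constant are generic tuning parameters, not Gyatso's actual values:

```nim
import std/math

proc lmrReduction(depth, moveNumber: int): int =
  ## No reduction at shallow depth or for the first few (likely best) moves;
  ## otherwise reduce more the later the move and the deeper the node.
  if depth < 3 or moveNumber < 4:
    return 0
  int(0.5 + ln(depth.float) * ln(moveNumber.float) / 2.0)
```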
With NNUE now in place, I’m shifting focus toward:
Better architectures
Training improvements
Efficient Nim-based inference optimizations
🔹 Why I’m posting here
Since this project is written in Nim, I’d really appreciate feedback and contributions from the Nim community, especially around:
Performance optimizations (SIMD, memory layout, etc.)
NNUE inference improvements in Nim
Better training pipelines or architecture ideas
General code quality and idiomatic Nim suggestions
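On the SIMD/memory-layout point, one pattern worth discussing: with flat, fixed-size arrays and a simple inner loop, Nim's C backend can often auto-vectorize NNUE inference without explicit intrinsics (e.g. compiling with `--passC:"-march=native"`). A sketch of an output-layer dot product in that style, using the clipped-ReLU activation common in NNUE implementations — the names and quantization ranges are assumptions, not Gyatso's code:

```nim
const HiddenSize = 256

proc clippedRelu(x: int16): int16 {.inline.} =
  ## Clamp activations into [0, 127], the usual NNUE int8-friendly range.
  clamp(x, 0.int16, 127.int16)

proc outputLayer(acc: array[HiddenSize, int16],
                 weights: array[HiddenSize, int8], bias: int32): int32 =
  ## Contiguous fixed-size arrays and a branch-free loop body give the
  ## C compiler a straightforward vectorization target.
  result = bias
  for i in 0 ..< HiddenSize:
    result += clippedRelu(acc[i]).int32 * weights[i].int32
```

Whether this beats hand-written SSE/AVX intrinsics in practice is exactly the kind of measurement where community benchmarking help would be useful.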
🔹 Links
Repository: https://github.com/GyatsoYT/GyatsoChess
Release: https://github.com/GyatsoYT/GyatsoChess/releases/tag/v1.3.0
🔹 Contributing
The project is fully open-source, and contributions, suggestions, or even just code reviews are welcome.
If you're interested in chess engines, NNUE, or high-performance Nim code, your input would be valuable.