Hello!
I wanted to use Ternary Bonsai—a highly capable, compact local LLM—in a chat interface, but since I couldn’t find a suitable chat tool, I built one myself using the Nim language.
Ternary Bonsai: https://prismml.com/news/ternary-bonsai
nimchat: https://github.com/nimmer-jp/nimchat
It uses the Crown framework and Tiara components, which I’m currently developing.
Please feel free to give it a try!
Sorry, some parts are still in Japanese🙏
## Run the Ternary Bonsai model

```shell
$ python3 -m mlx_lm server --model prism-ml/Ternary-Bonsai-8B-mlx-2bit --port 8082
```
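The `mlx_lm` server exposes an OpenAI-compatible HTTP API, so nimchat (or any other client) can talk to the model over plain JSON. As a rough sketch of what such a request looks like, here is a minimal Python client; the port and model name match the command above, and the `/v1/chat/completions` path follows the OpenAI convention (the message content and `max_tokens` value are just illustrative):

```python
import json
import urllib.request

# Endpoint served by the mlx_lm command above (OpenAI-compatible API).
URL = "http://localhost:8082/v1/chat/completions"


def build_request(user_message: str) -> dict:
    """Build an OpenAI-style chat completion payload."""
    return {
        "model": "prism-ml/Ternary-Bonsai-8B-mlx-2bit",
        "messages": [{"role": "user", "content": user_message}],
        "max_tokens": 256,  # illustrative limit
    }


def send(payload: dict) -> dict:
    """POST the payload to the local server and return the parsed JSON reply."""
    req = urllib.request.Request(
        URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)


if __name__ == "__main__":
    # Requires the mlx_lm server from the command above to be running.
    reply = send(build_request("Hello!"))
    print(reply["choices"][0]["message"]["content"])
```

Any other OpenAI-compatible chat client should work against the same endpoint.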
## Run nimchat

```shell
$ nimble install
$ crown dev
```
