# Getting Started

## Install

Build and install the Yule CLI.

### Build from Source
```shell
git clone https://github.com/visualstudioblyat/yule.git
cd yule
cargo install --path yule-cli
```

### Requirements
- Rust 1.85+ (2024 edition)
- Windows 10+, Linux, or macOS (sandbox currently Windows-only)
- A GGUF model file (see below)
## Get a Model
Yule runs any GGUF model file. For testing, grab TinyLlama:
```shell
# ~600MB download
curl -L -o tinyllama.gguf \
  https://huggingface.co/TheBloke/TinyLlama-1.1B-Chat-v1.0-GGUF/resolve/main/tinyllama-1.1b-chat-v1.0.Q4_0.gguf
```

For real usage, any Llama, Mistral, Phi, Qwen, or Gemma model in GGUF format works. Check Supported Models for the full list.
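Interrupted downloads are a common source of confusing load errors, and every valid GGUF file begins with the 4-byte ASCII magic `GGUF`, so a quick header check can catch a truncated or mislinked download before you point the CLI at it. A minimal sketch (`check_gguf` is a hypothetical helper for illustration, not part of the yule CLI):

```shell
# Sanity-check a downloaded model file: a real GGUF file starts
# with the 4-byte ASCII magic "GGUF".
check_gguf() {
  if [ "$(head -c 4 "$1" 2>/dev/null)" = "GGUF" ]; then
    echo "$1: GGUF magic OK"
  else
    echo "$1: not a GGUF file" >&2
    return 1
  fi
}

# usage: check_gguf tinyllama.gguf
```

A download that fails this check (for example, an HTML error page saved under a `.gguf` name) should be re-fetched before moving on.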
## Verify the Install
```shell
yule --version
yule inspect tinyllama.gguf
```

The `inspect` command should print the model's metadata (architecture, parameters, vocab size, etc.) without running inference.
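Metadata-only inspection is cheap because GGUF is self-describing: per the GGUF spec, the file opens with a fixed little-endian header (magic, format version, tensor count, metadata key/value count). As a rough illustration of what can be read without any inference, that fixed header can be dumped with standard Unix tools (`gguf_header` is a hypothetical helper, not a yule command; `od` interprets integers in host byte order, so this assumes a little-endian machine, which matches GGUF's default encoding):

```shell
# GGUF fixed header (little-endian, per the GGUF spec):
#   bytes 0-3   magic  "GGUF"
#   bytes 4-7   uint32 format version
#   bytes 8-15  uint64 tensor count
#   bytes 16-23 uint64 metadata key/value count
gguf_header() {
  printf 'magic:    %s\n' "$(head -c 4 "$1")"
  printf 'version:  %s\n' "$(od -An -t u4 -j 4 -N 4 "$1" | tr -d ' \n')"
  printf 'tensors:  %s\n' "$(od -An -t u8 -j 8 -N 8 "$1" | tr -d ' \n')"
  printf 'metadata: %s\n' "$(od -An -t u8 -j 16 -N 8 "$1" | tr -d ' \n')"
}

# usage: gguf_header tinyllama.gguf
```

The named metadata fields that `inspect` prints (architecture, vocab size, and so on) live in the key/value section that follows this header.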