kraai is an agentic runtime for LLM tool calling. The project's main goals are improving token efficiency and model accuracy.
- TOON-Formatted Tool Calls: All tool calls use a compact serialization that consumes less context
- Dynamic System Prompt: Prompt caching keeps working even as the system prompt changes
- Stateful Tools: Tools can inject content into the system prompt, updating context every turn
- Small Tool Set: `open_file`, `edit_file`, `bash`, `search_files`, `list_files`, `close_file`
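To illustrate why a compact tool-call serialization saves tokens, the sketch below compares a conventional JSON tool call with a hypothetical TOON-like rendering. The `open_file{path,line}` syntax and the field names are invented for illustration and are not kraai's actual wire format:

```python
import json

# A conventional JSON tool call: every key is quoted,
# nested, and repeated on each call.
json_call = json.dumps({
    "tool": "open_file",
    "arguments": {"path": "src/main.rs", "line": 42},
})

# A hypothetical compact, TOON-like rendering of the same call
# (illustrative syntax only, not kraai's actual format).
compact_call = "open_file{path,line}: src/main.rs,42"

# The compact form needs noticeably fewer characters,
# and therefore fewer tokens of context per call.
print(len(json_call), len(compact_call))
```

Because the same saving applies to every tool call in a long agentic session, the per-call difference compounds over a conversation.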
Run in the terminal:

```sh
kraai
```

Run through Nix:

```sh
nix run github:kraai-io/kraai
```

Build with Cargo:

```sh
cargo run
```

Most configuration can be done through the UI. Currently all configuration is unstable and might change at any time.
Apache-2.0