## Prerequisites
Before you begin, make sure you have:

- Rust 1.70 or higher installed
- An Anthropic API key, Google AI API key (Gemini), or OpenAI API key
- Basic familiarity with async Rust and Tokio
## Installation
Add PiCrust to your project’s `Cargo.toml`:
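A minimal sketch of the dependency block; the crate name `picrust` and the versions shown are assumptions, so check crates.io or the project README for the exact values:

```toml
[dependencies]
# Crate name and version are assumptions -- verify against the project README.
picrust = "0.1"
# Async runtime (this guide assumes Tokio).
tokio = { version = "1", features = ["full"] }
```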
## Environment Setup
Set up your LLM provider credentials:
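For example, in your shell: `ANTHROPIC_API_KEY` and `OPENAI_API_KEY` are the conventional variable names for those providers, while the Gemini variable name is an assumption (some SDKs use `GOOGLE_API_KEY` instead):

```shell
# Set the key for whichever provider you plan to use.
export ANTHROPIC_API_KEY="sk-ant-..."   # Anthropic (Claude)
export GEMINI_API_KEY="..."             # Google Gemini (may be GOOGLE_API_KEY)
export OPENAI_API_KEY="sk-..."          # OpenAI
```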
## Your First Agent

Create a new file `src/main.rs` and add the following code:
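The published API isn’t shown here, so the following is a sketch reconstructed from the type names this guide mentions (`LlmProvider`, `AgentConfig`, `StandardAgent`, `OutputChunk::TextDelta`, `Done`); constructor and method names are assumptions:

```rust
use picrust::{AgentConfig, AgentRuntime, OutputChunk, StandardAgent};
use picrust::providers::AnthropicProvider;

#[tokio::main]
async fn main() -> anyhow::Result<()> {
    // 1. Provider: reads ANTHROPIC_API_KEY from the environment (assumed constructor).
    let provider = AnthropicProvider::from_env()?;

    // 2-3. Runtime + persistent session (stored under ./sessions/{id}/).
    let runtime = AgentRuntime::new(provider);
    let session = runtime.create_session().await?;

    // 4-5. Configure and spawn the agent.
    let config = AgentConfig::default()
        .with_system_prompt("You are a helpful assistant.");
    let agent = runtime.spawn::<StandardAgent>(&session, config).await?;

    // 6. Subscribe to output before sending input.
    let mut output = agent.subscribe();
    agent.send("Hello! What can you do?").await?;

    // 7. Stream the reply chunk by chunk.
    while let Some(chunk) = output.recv().await {
        match chunk {
            OutputChunk::TextDelta(text) => print!("{text}"),
            OutputChunk::Done => break,
            _ => {}
        }
    }
    Ok(())
}
```

The numbered comments correspond to the seven steps explained below.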
## Run Your Agent
Execute your agent:
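Run the binary with Cargo from the project root:

```shell
cargo run
```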
## Understanding the Code

Let’s break down what’s happening:

### 1. LLM Provider
The provider connects your agent to an LLM backend. Each provider implements the `LlmProvider` trait.
### 2. Agent Runtime

The runtime manages sessions and spawns agents.
### 3. Session

Sessions persist conversation state to disk (under `./sessions/{id}/`). This enables conversation continuity across restarts.
### 4. Agent Configuration
`AgentConfig` defines agent behavior. `StandardAgent` is the main agent implementation that handles the request-response loop.
### 5. Spawning

Spawning creates a running agent from a configuration within a session.
### 6. Communication

You send input to the agent and receive its output as a stream of chunks.
### 7. Output Processing
`TextDelta` contains text tokens; `Done` signals completion.
## Adding Tools
Let’s make the agent more useful by adding file access:
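A hedged sketch: the tool types and the `with_tool` builder method below are assumptions modeled on the configuration style above, not confirmed PiCrust API:

```rust
use picrust::{AgentConfig, StandardAgent};
use picrust::tools::{ListDirTool, ReadFileTool, WriteFileTool};

// Hypothetical builder calls -- check the crate docs for the real tool API.
let config = AgentConfig::default()
    .with_tool(ReadFileTool::new())
    .with_tool(WriteFileTool::new())
    .with_tool(ListDirTool::new());

let agent = runtime.spawn::<StandardAgent>(&session, config).await?;
```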
## Switching Providers

All three providers are interchangeable; just swap one line:
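For example (the constructor names are assumptions):

```rust
// Each provider implements the LlmProvider trait, so they are drop-in
// replacements for one another.
let provider = AnthropicProvider::from_env()?;
// let provider = GeminiProvider::from_env()?;
// let provider = OpenAiProvider::from_env()?;
```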
## Handling Permissions

By default, tools require user permission. Handle permission requests:
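One way to handle them in the output loop; `approve()`, `deny()`, and `description()` are assumed method names on the request type:

```rust
while let Some(chunk) = output.recv().await {
    match chunk {
        // A tool call is waiting for approval.
        OutputChunk::PermissionRequest(request) => {
            println!("Agent requests: {}", request.description());
            request.approve().await?; // or request.deny().await?
        }
        OutputChunk::TextDelta(text) => print!("{text}"),
        OutputChunk::Done => break,
        _ => {}
    }
}
```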
## Viewing Conversation History

All conversations are automatically saved to disk. View them:
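Sessions live under `./sessions/{id}/`; a history accessor might look like this (method names are assumptions):

```rust
// Hypothetical API for replaying a stored conversation.
let history = session.history().await?;
for message in history {
    println!("[{}] {}", message.role(), message.text());
}
```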
## Complete Example

Here’s a complete interactive agent:
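A sketch of an interactive loop combining the pieces above; as before, all PiCrust identifiers are assumptions reconstructed from this guide:

```rust
use std::io::{self, BufRead, Write};

use picrust::{AgentConfig, AgentRuntime, OutputChunk, StandardAgent};
use picrust::providers::AnthropicProvider;

#[tokio::main]
async fn main() -> anyhow::Result<()> {
    let provider = AnthropicProvider::from_env()?;
    let runtime = AgentRuntime::new(provider);
    let session = runtime.create_session().await?;
    let agent = runtime
        .spawn::<StandardAgent>(&session, AgentConfig::default())
        .await?;

    // Subscribe once, then loop: read a line, send it, stream the reply.
    let mut output = agent.subscribe();
    let stdin = io::stdin();
    loop {
        print!("> ");
        io::stdout().flush()?;
        let mut line = String::new();
        if stdin.lock().read_line(&mut line)? == 0 {
            break; // EOF: user closed the input stream
        }
        agent.send(line.trim()).await?;
        while let Some(chunk) = output.recv().await {
            match chunk {
                OutputChunk::TextDelta(text) => print!("{text}"),
                OutputChunk::Done => break,
                _ => {}
            }
        }
        println!();
    }
    Ok(())
}
```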
## Next Steps

Now that you have a working agent, explore more features:

- Core Concepts - Learn about Runtime, Sessions, and Agent States
- Built-in Tools - Explore all available tools
- Permission System - Understand the three-tier permission system
- Hooks - Intercept and modify agent behavior
## Common Issues
### “Permission denied” errors
Make sure to handle `OutputChunk::PermissionRequest`, or use:
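For example, a permissive mode for local experimentation; the `PermissionMode` type and `with_permission_mode` method are assumptions:

```rust
// Hypothetical: auto-approve all tool calls.
// Convenient locally, but unsafe for untrusted input.
let config = AgentConfig::default()
    .with_permission_mode(PermissionMode::AllowAll);
```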
### No output streaming
Remember to subscribe before sending input:
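The subscription must exist before the first `send`, or early chunks can be missed:

```rust
let mut output = agent.subscribe(); // subscribe first...
agent.send("Hello").await?;         // ...then send input
```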
### API key not found

Set the environment variables for your chosen provider:
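For example (the Gemini variable name is an assumption; some SDKs use `GOOGLE_API_KEY`):

```shell
export ANTHROPIC_API_KEY="sk-ant-..."
export GEMINI_API_KEY="..."
export OPENAI_API_KEY="sk-..."
```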
## What’s Next?

You’ve built your first agent! Continue learning:

- Architecture Overview - Understand the system design
- Streaming & History - Critical patterns for UIs
- Custom Tools - Build your own tools
- Tauri Integration - Build desktop apps
Ready to dive deeper? Explore the Core Concepts next!