Each provider implements a common interface that returns an `AsyncIterable<StreamEvent>`.
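A minimal sketch of that contract. Only the `AsyncIterable<StreamEvent>` return shape is taken from the docs; the event fields (`type`, `text`), the `complete()` method name, and the toy provider are assumptions for illustration:

```typescript
// Hypothetical shape of a stream event; real events likely carry more fields.
interface StreamEvent {
  type: "text" | "done";
  text?: string;
}

// Hypothetical provider contract: a completion call yields events asynchronously.
interface Provider {
  id: string;
  complete(prompt: string): AsyncIterable<StreamEvent>;
}

// Toy provider that streams the prompt back word by word.
const echoProvider: Provider = {
  id: "echo",
  async *complete(prompt: string) {
    for (const word of prompt.split(" ")) {
      yield { type: "text", text: word } as StreamEvent;
    }
    yield { type: "done" } as StreamEvent;
  },
};

// Consumers drain the stream with for-await and concatenate text events.
async function collect(provider: Provider, prompt: string): Promise<string> {
  const parts: string[] = [];
  for await (const ev of provider.complete(prompt)) {
    if (ev.type === "text" && ev.text) parts.push(ev.text);
  }
  return parts.join(" ");
}
```

The async-generator form lets every provider, regardless of transport, be consumed with the same `for await` loop.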
## Available Providers
| Provider | ID | Default Model | Auth |
|---|---|---|---|
| Nebius | `nebius` | `MiniMaxAI/MiniMax-M2.1` | `NEBIUS_API_KEY` env var or `nebiusApiKey` |
| Anthropic | `anthropic` | `claude-sonnet-4-5-20250929` | `ANTHROPIC_API_KEY` or `anthropicApiKey` |
| Fireworks | `fireworks` | `accounts/fireworks/models/minimax-m2p5` | `FIREWORKS_API_KEY` or `fireworksApiKey` |
| GitHub Copilot | `github-copilot` | `claude-sonnet-4.5` | `lukan copilot-auth` |
| OpenAI Codex | `openai-codex` | `gpt-5.3-codex` | `lukan codex-auth` |
| z.ai | `zai` | `glm-5` | `ZAI_API_KEY` or `zaiApiKey` |
| Ollama Cloud | `ollama-cloud` | Fetched from API | `OLLAMA_CLOUD_API_KEY` or `ollamaCloudApiKey` |
| OpenAI Compatible | `openai-compatible` | Fetched from endpoint | `OPENAI_COMPATIBLE_API_KEY` or `openaiCompatibleApiKey` |
| Lukan Cloud | `lukan-cloud` | Inherited from Anthropic | Managed by Lukan Cloud account |
## Setting Up a Provider

### Interactive Setup

### Manual Configuration
1. Set the API key via an environment variable, or
2. Add it to `~/.config/lukan/credentials.json`.
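For example, with the key names from the table above, a minimal `~/.config/lukan/credentials.json` for Nebius might look like the following (the flat key layout is an assumption; check your installation's docs):

```json
{
  "nebiusApiKey": "nb-..."
}
```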
## Selecting Models

### Per-Channel Configuration
You can override the provider per channel (WhatsApp, Email) in `config.json`.
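A minimal sketch of such an override, assuming a `channels` section in `config.json` (the exact schema is an assumption; the provider IDs come from the table above):

```json
{
  "channels": {
    "whatsapp": { "provider": "zai" },
    "email": { "provider": "anthropic" }
  }
}
```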
## OAuth Authentication

### GitHub Copilot

### OpenAI Codex

## Ollama Cloud
Ollama Cloud fetches models dynamically from `https://ollama.com/api/tags`. Vision-capable models are detected automatically by name patterns: `vision`, `-vl`, `multimodal`, `llava`, `llama-4`.
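A sketch of that detection check, using the name patterns listed above (the function name is hypothetical; the patterns are from the docs):

```typescript
// Substring patterns, per the docs, that mark a model as vision-capable.
const VISION_PATTERNS = ["vision", "-vl", "multimodal", "llava", "llama-4"];

// Case-insensitive substring match against the model name.
function isVisionModel(modelName: string): boolean {
  const name = modelName.toLowerCase();
  return VISION_PATTERNS.some((p) => name.includes(p));
}
```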
## OpenAI Compatible
Generic provider for any OpenAI-compatible endpoint (vLLM, local Ollama, LM Studio, etc.). Requires a base URL; the model list is fetched from its `/models` endpoint. The API key is optional.
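A sketch of how such a model list might be fetched. The `Authorization: Bearer` header and `data[].id` response shape follow the usual OpenAI convention; the helper names are hypothetical:

```typescript
// Join the base URL and the /models path, tolerating a trailing slash.
function modelsUrl(baseUrl: string): string {
  return baseUrl.replace(/\/+$/, "") + "/models";
}

// Fetch model IDs; the bearer header is only sent when a key is configured.
async function listModels(baseUrl: string, apiKey?: string): Promise<string[]> {
  const headers: Record<string, string> = {};
  if (apiKey) headers["Authorization"] = `Bearer ${apiKey}`;
  const res = await fetch(modelsUrl(baseUrl), { headers });
  const body = (await res.json()) as { data: { id: string }[] };
  return body.data.map((m) => m.id);
}
```

For a local vLLM or LM Studio server the base URL is typically something like `http://localhost:8000/v1`.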
## Lukan Cloud

Managed proxy service that forwards requests to Anthropic. The base URL defaults to `https://api.lukan.cloud` and can be overridden with the `LUKAN_CLOUD_URL` environment variable.
A 402 response indicates quota exceeded. Visit your Lukan Cloud account to upgrade.
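A sketch of mapping that status code to a user-facing message, per the quota behavior described above (the function name and message wording are illustrative, not the tool's actual output):

```typescript
// Translate a Lukan Cloud HTTP status into an error message, or null on success.
function cloudError(status: number): string | null {
  if (status === 402) {
    // 402 Payment Required signals an exhausted quota.
    return "Lukan Cloud quota exceeded; visit your account to upgrade.";
  }
  if (status >= 400) {
    return `Lukan Cloud request failed with status ${status}.`;
  }
  return null;
}
```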
