doc2anki searches for configuration files in the following order:

1. CLI argument - `--config /path/to/config.toml`
2. Current directory - `./config/ai_providers.toml`
3. User config directory - `~/.config/doc2anki/ai_providers.toml`
For first-time setup, place your configuration in the user config directory so it is available globally:

```bash
mkdir -p ~/.config/doc2anki
cp config/ai_providers.example.toml ~/.config/doc2anki/ai_providers.toml
```

Each provider is defined as a TOML table:

```toml
[provider_name]
enable = true              # Whether to enable this provider
auth_type = "env"          # Authentication method
api_key = "OPENAI_API_KEY" # API key or variable name
default_base_url = "https://api.openai.com/v1"
default_model = "gpt-4"
```

With `auth_type = "direct"`, the API key is stored directly in the configuration file:
```toml
[local_llm]
enable = true
auth_type = "direct"
base_url = "http://localhost:11434/v1"
model = "qwen2.5:14b"
api_key = "ollama"
```

Note: ensure proper file permissions on the config file to prevent key exposure.
With `auth_type = "env"`, the API key is read from an environment variable:

```toml
[openai]
enable = true
auth_type = "env"
api_key = "OPENAI_API_KEY"   # Environment variable name
base_url = "OPENAI_BASE_URL" # Optional: env var for base URL
model = "OPENAI_MODEL"       # Optional: env var for model
default_base_url = "https://api.openai.com/v1" # Fallback value
default_model = "gpt-4o"     # Fallback value
```

Set the environment variable before use:

```bash
export OPENAI_API_KEY="sk-xxxxxxxxxxxxxxxx"
```

With `auth_type = "dotenv"`, credentials are loaded from a `.env` file:
```toml
[deepseek]
enable = true
auth_type = "dotenv"
dotenv_path = "/home/user/.env" # Path to .env file (required)
api_key = "DEEPSEEK_API_KEY"    # Key name in the .env file
base_url = "DEEPSEEK_BASE_URL"  # Optional: key name for base URL
model = "DEEPSEEK_MODEL"        # Optional: key name for model
default_base_url = "https://api.deepseek.com"
default_model = "deepseek-chat"
```

Example `.env` file content:

```
DEEPSEEK_API_KEY=sk-xxxxxxxxxxxxxxxx
```
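A `.env` file is just `KEY=VALUE` lines with optional `#` comments. As a mental model (this is a minimal illustrative parser, not the one doc2anki actually uses):

```python
def parse_dotenv(text: str) -> dict[str, str]:
    """Tiny .env parser: KEY=VALUE per line, '#' comments and
    blank lines ignored, surrounding quotes stripped from values."""
    env: dict[str, str] = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue
        key, _, value = line.partition("=")
        env[key.strip()] = value.strip().strip('"').strip("'")
    return env
```

Real dotenv loaders handle more edge cases (export prefixes, multiline values), but the `KEY=VALUE` shape shown in the example above is all doc2anki's config needs.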
| Field | Description |
|---|---|
| `enable` | Whether to enable this provider (true/false) |
| `auth_type` | Authentication method: `direct`, `env`, or `dotenv` |
| `api_key` | API key value or variable name (depends on `auth_type`) |
| Field | Description |
|---|---|
| `base_url` | API endpoint URL (or env var name for `env`/`dotenv`) |
| `model` | Model name (or env var name for `env`/`dotenv`) |
| `default_base_url` | Fallback base URL for `env`/`dotenv` modes |
| `default_model` | Fallback model name for `env`/`dotenv` modes |
| `dotenv_path` | Path to `.env` file (required for `dotenv` auth) |
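The fallback behavior described in the table - resolve the named variable first, then fall back to the `default_*` value - can be sketched like this. The function name `resolve` and the dict shape are illustrative assumptions, not doc2anki's internals:

```python
import os


def resolve(cfg: dict, field: str) -> str | None:
    """For env-style auth: look up the environment variable whose name
    is stored under `field`, falling back to cfg["default_<field>"]
    when the variable is unset or empty."""
    var_name = cfg.get(field)
    if var_name is not None:
        value = os.environ.get(var_name)
        if value:
            return value
    return cfg.get(f"default_{field}")


cfg = {
    "base_url": "OPENAI_BASE_URL",
    "default_base_url": "https://api.openai.com/v1",
}
# Falls back to default_base_url when OPENAI_BASE_URL is unset.
resolve(cfg, "base_url")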
```toml
[openai]
enable = true
auth_type = "env"
api_key = "OPENAI_API_KEY"
default_base_url = "https://api.openai.com/v1"
default_model = "gpt-4o"

[deepseek]
enable = true
auth_type = "env"
api_key = "DEEPSEEK_API_KEY"
default_base_url = "https://api.deepseek.com"
default_model = "deepseek-chat"

[qwen]
enable = true
auth_type = "env"
api_key = "DASHSCOPE_API_KEY"
default_base_url = "https://dashscope.aliyuncs.com/compatible-mode/v1"
default_model = "qwen-plus"

[zhipu]
enable = true
auth_type = "env"
api_key = "ZHIPU_API_KEY"
default_base_url = "https://open.bigmodel.cn/api/paas/v4"
default_model = "glm-4-flash"

[moonshot]
enable = true
auth_type = "env"
api_key = "MOONSHOT_API_KEY"
default_base_url = "https://api.moonshot.cn/v1"
default_model = "moonshot-v1-auto"

[ollama]
enable = true
auth_type = "direct"
api_key = "ollama" # Ollama doesn't require a real key
base_url = "http://localhost:11434/v1"
model = "qwen2.5:14b"

[openrouter]
enable = true
auth_type = "env"
api_key = "OPENROUTER_API_KEY"
default_base_url = "https://openrouter.ai/api/v1"
default_model = "anthropic/claude-3.5-sonnet"

[together]
enable = true
auth_type = "env"
api_key = "TOGETHER_API_KEY"
default_base_url = "https://api.together.xyz/v1"
default_model = "meta-llama/Meta-Llama-3.1-70B-Instruct-Turbo"
```

Check all enabled providers:

```bash
doc2anki validate
```

Check a specific provider:

```bash
doc2anki validate -p openai
```

List all providers (including disabled):

```bash
doc2anki list --all
```

You can configure multiple providers and switch between them:
```toml
[openai]
enable = true
auth_type = "env"
api_key = "OPENAI_API_KEY"
default_base_url = "https://api.openai.com/v1"
default_model = "gpt-4o"

[deepseek]
enable = true
auth_type = "env"
api_key = "DEEPSEEK_API_KEY"
default_base_url = "https://api.deepseek.com"
default_model = "deepseek-chat"

[ollama]
enable = true
auth_type = "direct"
api_key = "ollama"
base_url = "http://localhost:11434/v1"
model = "qwen2.5:14b"
```

Use the `-p` flag to select a provider:
```bash
doc2anki generate notes.md -p openai
doc2anki generate notes.md -p deepseek
doc2anki generate notes.md -p ollama
```

- Don't commit keys to version control - `ai_providers.toml` is already in `.gitignore`
- Use environment variables or dotenv - recommended for production environments
- Restrict file permissions - `chmod 600 ~/.config/doc2anki/ai_providers.toml`
- Rotate keys regularly - follow each provider's security best practices
- Use different configs for different projects - pass `--config` to isolate credentials
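To audit the "restrict file permissions" point, a quick check like the following can flag a config that is readable by other users. The function name is illustrative; doc2anki itself may or may not perform such a check:

```python
import os
import stat


def world_readable(path: str) -> bool:
    """True if group or others can read the file - a sign the config's
    permissions should be tightened (e.g. with `chmod 600`)."""
    mode = os.stat(path).st_mode
    return bool(mode & (stat.S_IRGRP | stat.S_IROTH))
```

On POSIX systems, `world_readable("~/.config/doc2anki/ai_providers.toml")` (after expanding `~`) returning `True` means the key material is exposed to other local users.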
```
Error: Provider 'xxx' not found in configuration
```

Check that:

- The provider section exists in your config file
- `enable = true` is set
- The config file path is correct

```
Error: Failed to authenticate with provider
```

Verify:

- The API key is correct
- For `env` auth: the environment variable is set
- For `dotenv` auth: the `.env` file path and key name are correct

```
Error: Connection refused
```

For local providers (Ollama), ensure:

- The service is running
- The port is correct
- No firewall is blocking the connection
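A quick way to test the first two points - service running, port correct - is a plain TCP connect against the endpoint from the config. This helper is an illustrative diagnostic, not part of doc2anki:

```python
import socket


def port_open(host: str, port: int, timeout: float = 1.0) -> bool:
    """Return True if a TCP connection to host:port succeeds within
    the timeout - i.e. something is listening there."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        # Covers connection refused, timeouts, and unreachable hosts.
        return False


# For the default Ollama endpoint used in the examples above:
port_open("localhost", 11434)
```

If this returns `False` for `localhost:11434`, start Ollama (or whatever local server you configured) before retrying; if it returns `True` but doc2anki still fails, the problem is above the TCP layer (wrong path, model name, or auth).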