# Configuration

seekr reads its configuration from a TOML file located at `~/.config/seekr/config.toml`.
## Configuration Options

The config file is divided into three main sections: `[[providers]]`, `[agent]`, and `[ui]`.
### [[providers]]

Settings for your configured LLM API connections. seekr supports multiple models via the `providers` array of tables, used together with the top-level `active_provider` key.
| Key | Default | Description |
|---|---|---|
| `name` | `"Seekr AI"` | A display name for the provider. |
| `key` | `""` | Your API key for the provider. |
| `model` | `"gpt-4o"` | The model ID to use (e.g., `deepseek-chat`, `gpt-4o`). |
| `base_url` | `"https://api.openai.com/v1"` | The base URL for the API. |
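A single provider entry using the defaults above might look like this (the `key` value is a placeholder for your own API key):

```toml
[[providers]]
name = "Seekr AI"
key = ""                                 # your API key
model = "gpt-4o"
base_url = "https://api.openai.com/v1"
```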
### `active_provider`
| Key | Default | Description |
|---|---|---|
| `active_provider` | `0` | The zero-based index of the currently active provider in the `providers` array. |
### [agent]

Controls the behavior and constraints of the AI agent.
| Key | Default | Description |
|---|---|---|
| `max_iterations` | `25` | Maximum number of tool-use steps per task. |
| `auto_approve_tools` | `false` | If `true`, the agent runs tools without asking for permission. |
| `working_directory` | `"."` | The default directory in which the agent operates. |
### [ui]

Customizes the look and feel of the terminal interface.
| Key | Default | Description |
|---|---|---|
| `theme` | `"dark"` | The UI theme (currently only `"dark"` is supported). |
| `show_reasoning` | `true` | Whether to show the agent's internal reasoning steps. |
## Example Config
```toml
# ~/.config/seekr/config.toml

# active_provider must come before the [[providers]] tables;
# placed after them, TOML would attach it to the last provider.
active_provider = 0

[[providers]]
name = "DeepSeek"
key = "sk-..."
model = "deepseek-reasoner"
base_url = "https://api.deepseek.com"

[[providers]]
name = "OpenAI"
key = "sk-..."
model = "gpt-4o"
base_url = "https://api.openai.com/v1"

[agent]
max_iterations = 50
auto_approve_tools = true
working_directory = "/home/user/projects"

[ui]
theme = "dark"
show_reasoning = true
```
