## Overview
Creor supports two approaches for using your own keys:
- Direct API keys -- use your own credentials for any of the 19+ built-in providers (Anthropic, OpenAI, Google, etc.).
- Custom endpoints -- connect any service that implements the OpenAI-compatible chat/completions API format.
Both methods give you full control over billing, rate limits, and data routing. Your API keys are stored securely in the OS keychain and never transmitted to Creor servers.
## Adding Provider API Keys
For any supported provider, you can use your own API key instead of the Creor Gateway.
### Via Settings UI
- Open Creor and go to Settings.
- Navigate to the Providers section.
- Find the provider you want to configure.
- Enter your API key in the key field.
- The key is encrypted and stored in the OS keychain immediately.
### Via Environment Variables
Each provider has a standard environment variable for its API key:
| Provider | Environment Variable |
|---|---|
| Anthropic | ANTHROPIC_API_KEY |
| OpenAI | OPENAI_API_KEY |
| Google AI Studio | GOOGLE_GENERATIVE_AI_API_KEY |
| Google Vertex | GOOGLE_CLOUD_PROJECT (+ gcloud ADC) |
| AWS Bedrock | AWS_ACCESS_KEY_ID + AWS_SECRET_ACCESS_KEY |
| Azure OpenAI | AZURE_OPENAI_API_KEY + AZURE_OPENAI_ENDPOINT |
| OpenRouter | OPENROUTER_API_KEY |
| Groq | GROQ_API_KEY |
| Together AI | TOGETHER_AI_API_KEY |
| DeepInfra | DEEPINFRA_API_KEY |
| Cerebras | CEREBRAS_API_KEY |
| Mistral | MISTRAL_API_KEY |
| Cohere | COHERE_API_KEY |
| Perplexity | PERPLEXITY_API_KEY |
| xAI | XAI_API_KEY |
| Vercel | VERCEL_API_KEY |
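For example, in a POSIX shell you can export the relevant variables before launching Creor. The key values below are placeholders, not real credentials:

```shell
# Placeholder values -- substitute your real credentials.
export ANTHROPIC_API_KEY="sk-ant-xxxx"
export OPENROUTER_API_KEY="sk-or-xxxx"

# Azure OpenAI needs both a key and an endpoint.
export AZURE_OPENAI_API_KEY="xxxx"
export AZURE_OPENAI_ENDPOINT="https://my-resource.openai.azure.com"
```

Add these to your shell profile (e.g. `~/.zshrc`) to make them persistent, and remember that Creor must be restarted to pick up changes.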
### Via creor.json
> **Warning:** Do not put API keys in creor.json. This file is typically checked into version control. Use the Settings UI (OS keychain) or environment variables for credentials. Use creor.json only for non-secret provider configuration like base URLs, timeouts, and model filtering:
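A sketch of what such non-secret configuration might look like. The field names here are assumptions for illustration, not Creor's documented schema:

```json
{
  "provider": {
    "anthropic": {
      "options": {
        "baseURL": "https://anthropic.proxy.internal/v1",
        "timeout": 60000
      }
    }
  }
}
```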
## Custom OpenAI-Compatible Endpoints
Any service that implements the OpenAI chat/completions API format can be used as a provider in Creor. This includes LLM proxies, corporate API gateways, and OpenAI-compatible inference servers.
### Defining a Custom Provider
Add a custom provider in creor.json using the @ai-sdk/openai-compatible npm adapter:
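A minimal sketch, assuming a `provider` block keyed by an ID of your choosing. The field names (`npm`, `options`, `models`, etc.) are assumptions based on the adapter-style configuration described here, not an authoritative schema:

```json
{
  "provider": {
    "my-proxy": {
      "npm": "@ai-sdk/openai-compatible",
      "name": "Corporate Gateway",
      "options": {
        "baseURL": "https://llm-gateway.example.com/v1"
      },
      "models": {
        "llama-3.1-70b": {
          "name": "Llama 3.1 70B"
        }
      }
    }
  }
}
```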
Then reference it like any other model:
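Assuming models are addressed as `provider-id/model-id` (an assumption here, not confirmed by this page), a custom provider with the ID `my-proxy` serving a model `llama-3.1-70b` could be selected like any built-in model:

```json
{
  "model": "my-proxy/llama-3.1-70b"
}
```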
### Authentication for Custom Endpoints
Set the API key for your custom provider via the Settings UI (it will be stored under the provider ID you chose) or via a provider-specific environment variable. You can also add headers directly in the provider options for token-based auth:
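As a sketch, a bearer token could be injected through a `headers` map in the provider options. Every name below, including the `{env:...}` interpolation syntax, is an assumption rather than documented Creor behavior:

```json
{
  "provider": {
    "corp-gateway": {
      "npm": "@ai-sdk/openai-compatible",
      "options": {
        "baseURL": "https://llm-gateway.example.com/v1",
        "headers": {
          "Authorization": "Bearer {env:GATEWAY_TOKEN}"
        }
      }
    }
  }
}
```

Keeping the token in an environment variable rather than inlining it preserves the rule above: no secrets in creor.json.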
## Self-Hosted Models
If you run your own LLM inference server (vLLM, Ollama, llama.cpp, TGI, etc.), you can connect it to Creor as long as it exposes an OpenAI-compatible API.
### Example: Ollama
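Ollama exposes an OpenAI-compatible API at `http://localhost:11434/v1` by default. A minimal sketch, with the caveat that the field names below are illustrative assumptions, not Creor's documented schema:

```json
{
  "provider": {
    "ollama": {
      "npm": "@ai-sdk/openai-compatible",
      "name": "Ollama (local)",
      "options": {
        "baseURL": "http://localhost:11434/v1"
      },
      "models": {
        "llama3.1": { "name": "Llama 3.1 8B" }
      }
    }
  }
}
```

No API key is required for a local Ollama instance, so only the base URL needs to be configured.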
### Example: vLLM
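vLLM's OpenAI-compatible server (started with `vllm serve <model>`) listens on port 8000 by default. A sketch under the same caveat that the field names are assumptions:

```json
{
  "provider": {
    "vllm": {
      "npm": "@ai-sdk/openai-compatible",
      "name": "vLLM (local)",
      "options": {
        "baseURL": "http://localhost:8000/v1"
      },
      "models": {
        "meta-llama/Llama-3.1-8B-Instruct": { "name": "Llama 3.1 8B Instruct" }
      }
    }
  }
}
```

Note that the model ID must match the model the vLLM server was launched with, or requests will be rejected.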
## Credential Storage
Creor stores API keys securely using the operating system's native credential manager:
| OS | Storage Backend |
|---|---|
| macOS | Keychain (via SecretStorage API) |
| Linux | libsecret / GNOME Keyring / KWallet |
| Windows | Windows Credential Manager |
Keys entered through the Settings UI are encrypted at rest and never written to disk in plaintext. They are not included in creor.json or any other configuration file.
## Provider Configuration Reference
The full provider configuration schema in creor.json:
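This page does not reproduce the exact schema, so treat the annotated sketch below as illustrative only: every field name is an assumption consistent with the configuration options discussed on this page, not the authoritative reference.

```jsonc
{
  "provider": {
    "<provider-id>": {
      "npm": "@ai-sdk/openai-compatible",  // adapter package; built-in providers may omit this
      "name": "Display name",
      "options": {
        "baseURL": "https://host/v1",      // endpoint root
        "timeout": 60000,                  // request timeout in ms
        "headers": {}                      // extra headers, e.g. for token-based auth
      },
      "models": {
        "<model-id>": {
          "name": "Display name",
          "limit": { "context": 128000, "output": 8192 }
        }
      }
    }
  },
  "disabled_providers": [],  // hide specific providers
  "enabled_providers": []    // or allow-list only these
}
```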
## Troubleshooting
### Provider not showing up
- Make sure the API key is set (via Settings UI or environment variable).
- Check that the provider is not in your disabled_providers list.
- If using enabled_providers, make sure your provider is included.
- Restart Creor after changing environment variables.
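For reference, a hypothetical provider filter in creor.json might look like the following (the keys come from the list above; typically you would set only one of the two):

```json
{
  "disabled_providers": ["openrouter"],
  "enabled_providers": ["anthropic", "ollama"]
}
```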
### Authentication errors
- Verify your API key is valid and has not expired.
- Check that your account has billing/credits set up with the provider.
- For custom endpoints, verify the base URL is correct and the server is running.
- Check Creor logs for detailed error messages.
### Custom endpoint returning errors
- Ensure the endpoint implements the OpenAI chat/completions format.
- Verify the model ID matches what your server expects.
- Check that the context and output limits in your config match the server's capabilities.
- Try increasing the timeout if the server is slow to respond.
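When a custom endpoint misbehaves, it can help to bypass Creor and query the server directly to confirm it speaks the chat/completions format. The URL, token variable, and model ID below are placeholders:

```shell
curl -s http://localhost:8000/v1/chat/completions \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer $GATEWAY_TOKEN" \
  -d '{
        "model": "meta-llama/Llama-3.1-8B-Instruct",
        "messages": [{"role": "user", "content": "ping"}]
      }'
```

A well-formed JSON response containing a `choices` array indicates the endpoint is OpenAI-compatible; an HTML error page or a 404 usually means the base URL or path is wrong.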