Agentkube lets you configure custom API keys for various LLM providers to use your own accounts when making AI requests, giving you flexibility and control over your usage and billing.

Getting Started with Custom API Keys

You can connect your own provider accounts to extend Agentkube’s capabilities beyond the included quota limits.

Custom API keys allow you to send as many AI messages as you want at your own cost, using your preferred LLM providers.

Currently, adding API keys will store them but won’t yet override the default keys, as this feature is still under development. Full functionality will be rolled out soon.

Managing API Keys

Things to know about API Keys in Agentkube

  • API keys are stored in ~/.agentkube/settings.json in base64 format
  • You can add keys through the Agentkube Desktop app or by manually editing the settings file
  • Some Agentkube features require specialized models and won’t work with custom API keys
  • Custom API keys only work for features that use standard models from providers

Supported Providers

OpenAI API Keys

You can get your own API key from the OpenAI platform.

OpenAI’s reasoning models (o1, o1-mini, o3-mini) require special configuration and are not currently supported with custom API keys.

Anthropic API Keys

As with OpenAI, you can set your own Anthropic API key to use Claude models at your own cost.

Google API Keys

Likewise, you can set your own Google API key to use Google models such as gemini-1.5-flash-500k at your own cost.

Azure Integration

You can also set your own Azure API key to use Azure OpenAI models at your own cost.

AWS Bedrock

You can now connect to AWS Bedrock using an access key and secret key, and enterprises can authenticate using IAM roles.

Configuring API Keys

To use your own API key:

  1. Go to Agentkube Settings > Models
  2. Enter your API keys in the appropriate fields
  3. Click on the “Verify” button
  4. Once validated, your API key will be enabled

You can also manage keys manually by editing the ~/.agentkube/settings.json file directly.
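If you script the manual edit, remember to base64-encode the key before writing it. A minimal sketch, again assuming a top-level "apiKeys" object (that field name is an illustrative assumption):

```python
import base64
import json
from pathlib import Path

SETTINGS = Path.home() / ".agentkube" / "settings.json"

def set_key(provider: str, key: str, path: Path = SETTINGS) -> None:
    """Base64-encode an API key and store it in settings.json.

    NOTE: the "apiKeys" field name is an assumption for illustration;
    mirror whatever structure your settings.json already uses.
    """
    data = json.loads(path.read_text()) if path.exists() else {}
    data.setdefault("apiKeys", {})[provider] = base64.b64encode(key.encode()).decode()
    path.write_text(json.dumps(data, indent=2))
```

Writing the whole file back with json.dumps preserves any other settings already present, rather than clobbering them.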

Debugging API Key Issues

If you’re having issues with your API keys:

  • Verify the API key is valid and has the correct permissions
  • Check that the key is for the correct environment (production vs development)
  • Ensure you have sufficient credits/quota with the provider
  • Review the Agentkube logs for any error messages
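For the first three checks, a quick probe against the provider's API can tell you whether the key itself is the problem. This sketch queries OpenAI's models endpoint; a 401 means the key was rejected, while a 429 points at quota or rate limits (other providers return comparable status codes):

```python
import urllib.error
import urllib.request

def check_openai_key(api_key: str) -> str:
    """Probe the OpenAI models endpoint to see whether a key is accepted."""
    req = urllib.request.Request(
        "https://api.openai.com/v1/models",
        headers={"Authorization": f"Bearer {api_key}"},
    )
    try:
        with urllib.request.urlopen(req, timeout=10) as resp:
            return classify_status(resp.status)
    except urllib.error.HTTPError as e:
        return classify_status(e.code)

def classify_status(status: int) -> str:
    """Map an HTTP status from the key probe to a human-readable verdict."""
    if status == 200:
        return "valid"
    if status == 401:
        return "invalid key"
    if status == 429:
        return "quota exhausted or rate limited"
    return f"error: {status}"
```

If the key checks out here but Agentkube still fails, the problem is more likely configuration on the Agentkube side, which is where the logs come in.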

FAQ

Will my API key be stored or leave my device?

Your API key is not stored in plain text; it is base64-encoded in your settings file. It is sent to our server with every request, since all requests are routed through our backend, where final prompt building takes place.
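Keep in mind that base64 is an encoding, not encryption: anyone who can read your settings file can trivially recover the key, as this snippet shows:

```python
import base64

# Encoding a key obscures it, but decoding needs no secret at all.
encoded = base64.b64encode(b"sk-example-key").decode()
print(encoded)                             # c2stZXhhbXBsZS1rZXk=
print(base64.b64decode(encoded).decode())  # sk-example-key
```

So treat ~/.agentkube/settings.json with the same file permissions you would give any credentials file.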

What custom LLM providers are supported?

Agentkube only supports API providers that are compatible with the OpenAI API format (like OpenRouter). We do not provide support for custom local LLM setups or other API formats. If you’re having issues with a custom API setup that isn’t from our supported providers, we unfortunately cannot provide technical support.

Can I use multiple API keys simultaneously?

Currently, Agentkube will prioritize using one API key per provider. You cannot specify different keys for different features within the same provider.

Remember that when using custom API keys, you are responsible for any charges incurred through your provider accounts.