Virtual Keys
Portkey’s virtual key system securely stores your LLM API keys in our vault and replaces them with a unique virtual identifier, streamlining API key management.
This feature is available on all Portkey plans.
This feature also provides the following benefits:
- Easier key rotation
- The ability to generate multiple virtual keys for a single API key
- The ability to impose restrictions based on cost, request volume, and user access
These can be managed within your account under the “Virtual Keys” tab.
Creating Virtual Keys
- Navigate to the “Virtual Keys” page and click the “Add Key” button in the top right corner.
- Select your AI provider, give the key a unique name, and add a note describing its intended use if needed.
Tip: You can register multiple keys for one provider or use different names for the same key for easy identification.
Azure Virtual Keys
Azure Virtual Keys allow you to manage multiple Azure deployments under a single virtual key. This feature simplifies API key management and enables flexible usage of different Azure OpenAI models. You can create multiple deployments under the same resource group and manage them using a single virtual key.
Configure Multiple Azure Deployments
To use a specific deployment, pass its alias as the model in the LLM request body. If the model field is left empty or the specified alias does not exist, the default deployment is used.
How are the provider API keys stored?
Your API keys are encrypted and stored in secure vaults, accessible only at the moment of a request. Decryption is performed exclusively in isolated workers and only when necessary, ensuring the highest level of data security.
How are the provider keys linked to the virtual key?
We randomly generate virtual keys and link them separately to the securely stored keys. This means your raw API keys cannot be reverse-engineered from the virtual keys.
Using Virtual Keys
Using the Portkey SDK
Add the virtual key directly to the initialization configuration for Portkey.
import Portkey from 'portkey-ai'
const portkey = new Portkey({
apiKey: "PORTKEY_API_KEY", // defaults to process.env["PORTKEY_API_KEY"]
virtualKey: "VIRTUAL_KEY" // Portkey supports a vault for your LLM Keys
})
Alternatively, you can override the virtual key during the completions call as follows:
const chatCompletion = await portkey.chat.completions.create({
messages: [{ role: 'user', content: 'Say this is a test' }],
model: 'gpt-3.5-turbo',
}, {virtualKey: "OVERRIDING_VIRTUAL_KEY"});
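If you call the gateway over raw HTTP instead of the SDK, the virtual key is typically passed as a request header. The header names below follow Portkey's gateway convention but should be verified against the current API reference before use:

```javascript
// Hedged sketch: calling the Portkey gateway directly over HTTP.
// Verify the header names against Portkey's current API reference.
function buildPortkeyRequest(portkeyApiKey, virtualKey, body) {
  return {
    method: 'POST',
    headers: {
      'Content-Type': 'application/json',
      'x-portkey-api-key': portkeyApiKey,
      'x-portkey-virtual-key': virtualKey, // the raw provider key never appears in the request
    },
    body: JSON.stringify(body),
  };
}

const req = buildPortkeyRequest('PORTKEY_API_KEY', 'VIRTUAL_KEY', {
  model: 'gpt-3.5-turbo',
  messages: [{ role: 'user', content: 'Say this is a test' }],
});
// e.g. fetch('https://api.portkey.ai/v1/chat/completions', req)
```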
Using an alias with Azure virtual keys:
const chatCompletion = await portkey.chat.completions.create({
messages: [{ role: 'user', content: 'Say this is a test' }],
model: 'gpt-4o', // This will be the alias of the deployment
}, {virtualKey: "VIRTUAL_KEY"});
Setting Budget Limits
Portkey lets you set budget limits on any of your virtual keys, helping you manage spending across AI providers (and LLMs) and giving you confidence and control over your application’s costs.
Prompt Templates
Choose your Virtual Key within Portkey’s prompt templates, and it will be automatically retrieved and ready for use.
Langchain / LlamaIndex
Set the virtual key when using Portkey’s custom LLM, as shown below:
# Example in Langchain
llm = PortkeyLLM(api_key="PORTKEY_API_KEY", virtual_key="VIRTUAL_KEY")