Production-grade features to take your LLM apps to production, easily!
Portkey serves as a strategic gateway between your app and LLM providers, enhancing existing workflows with production capabilities without compromising on speed. Portkey natively includes:
1. An AI Gateway
2. An Observability layer
3. A Live Feedback layer
4. A Prompt Manager
5. An Experimentation & Evals framework
6. Security & Compliance protocols
Here’s a detailed overview of how each feature contributes to productionizing your app:
- Weighted Feedback: Obtain nuanced information by attaching weights to user feedback values.
- Multi-provider Support: Portkey supports an extensive range of AI providers, including OpenAI, Anthropic, Cohere, Azure, and Huggingface.
- Prompt Versioning: Track modifications, make adjustments, and deploy different versions to your production environment as needed.
- Raw Mode: Define prompts with full JSON flexibility, catering to the most complex and specific requirements.
- A/B Test: Run A/B tests programmatically, using custom weights to compare any model parameter: model name, provider, temperature, top_p, and more.
- Redact Sensitive Data: Automatically remove sensitive data from your requests to prevent inadvertent exposure.
- Access Control & Inbound Rules: Control which IPs and Geos can connect to your Portkey deployments and stay compliant with your SOC2 policies.
- Compliant with Global Standards: Portkey complies with security and privacy best practices with standard certifications.
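To make the Weighted Feedback idea concrete, here is a minimal, illustrative Python sketch (not Portkey's SDK) of how weights change an aggregated feedback score: a single heavily weighted vote can outweigh several unweighted ones.

```python
def weighted_feedback_score(feedback):
    """Aggregate user feedback where each entry carries a weight.

    `feedback` is a list of (value, weight) pairs; a vote from a
    trusted reviewer can, for example, count more than an anonymous one.
    """
    total_weight = sum(w for _, w in feedback)
    if total_weight == 0:
        return 0.0
    return sum(v * w for v, w in feedback) / total_weight

# Two positive votes (weight 1 each) vs. one negative vote with weight 3:
score = weighted_feedback_score([(1, 1), (1, 1), (-1, 3)])
print(score)  # -0.2 — the weighted negative vote dominates
```

The same principle applies whatever scale your feedback values use; the weight simply scales each vote's contribution to the mean.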
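The weighted A/B testing described above boils down to weighted random routing between candidate configurations. The sketch below shows the underlying mechanism only; the target names are hypothetical, and this is not Portkey's routing code.

```python
import random

def pick_target(targets, rng=random.random):
    """Choose one target in proportion to its weight (weighted A/B split).

    `targets` maps a variant name (hypothetical here) to a relative
    weight; weights need not sum to 1, only their ratios matter.
    """
    total = sum(targets.values())
    r = rng() * total
    cumulative = 0.0
    for name, weight in targets.items():
        cumulative += weight
        if r < cumulative:
            return name
    return name  # fallback for floating-point edge cases

# Route ~90% of traffic to the current model, ~10% to a candidate:
choice = pick_target({"model-prod": 0.9, "model-candidate": 0.1})
```

Because each parameter set (model, provider, temperature, top_p, …) is just a named target with a weight, the same mechanism can compare any combination of them.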