ZhipuAI / ChatGLM / BigModel

ZhipuAI develops the GLM series of LLMs, which includes some of the best-performing open models available today. Portkey provides a robust and secure gateway to seamlessly integrate these LLMs into your applications through the familiar OpenAI spec with just a 2-line code change!

With Portkey, you can leverage powerful features like a fast AI gateway, caching, observability, prompt management, and more, while securely managing your LLM API keys through a virtual key system.

Provider Slug: zhipu

Portkey SDK Integration with ZhipuAI

1. Install the Portkey SDK

Install the Portkey SDK in your project using npm (Node.js) or pip (Python):

npm install --save portkey-ai

pip install portkey-ai

2. Initialize Portkey with the Virtual Key

Create a virtual key in the Portkey vault by adding your ZhipuAI API key. Then, initialize the Portkey client with your virtual key:

import Portkey from 'portkey-ai'
 
const portkey = new Portkey({
    apiKey: "PORTKEY_API_KEY", // defaults to process.env["PORTKEY_API_KEY"]
    virtualKey: "VIRTUAL_KEY" // Your ZhipuAI Virtual Key
})
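
Portkey's gateway features (such as the caching mentioned above) are attached through gateway configs. The snippet below is a minimal sketch, assuming the standard gateway config format: the inline config object enabling simple caching is illustrative, and a saved config ID from the Portkey dashboard can be passed in its place.

import Portkey from 'portkey-ai'

// Sketch: initialize a client with a gateway config attached.
// The cache settings shown are an illustrative assumption, not part of the walkthrough above.
const portkeyWithCache = new Portkey({
    apiKey: "PORTKEY_API_KEY",     // defaults to process.env["PORTKEY_API_KEY"]
    virtualKey: "VIRTUAL_KEY",     // Your ZhipuAI Virtual Key
    config: {
        cache: { mode: "simple" } // serve repeated requests from the gateway cache
    }
})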

3. Invoke Chat Completions

Use the Portkey instance to send chat completion requests to your GLM model:

const chatCompletion = await portkey.chat.completions.create({
    messages: [{ role: 'user', content: 'Who are you?' }],
    model: 'glm-4'
});

console.log(chatCompletion.choices);

Example response:

I am an AI assistant named ZhiPuQingYan (智谱清言), you can call me Xiaozhi 🤖
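
Because requests follow the OpenAI-compatible spec, streaming works the same way as with other providers. Here's a minimal sketch, assuming streamed responses are available for glm-4 through your virtual key:

// Sketch: request a streamed response and print tokens as they arrive.
const stream = await portkey.chat.completions.create({
    messages: [{ role: 'user', content: 'Who are you?' }],
    model: 'glm-4',
    stream: true // ask the gateway to stream the completion
});

for await (const chunk of stream) {
    // each chunk carries an incremental delta, mirroring the OpenAI spec
    process.stdout.write(chunk.choices[0]?.delta?.content || '');
}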


Next Steps

The complete list of features supported in the SDK is available at the link below.

SDK

You'll find more information in the relevant sections of the Portkey documentation.
