zai

glm-4.5

GLM-4.5 is a 355B-parameter Mixture-of-Experts model with 32B active parameters. It offers a 128K context window, hybrid reasoning modes (Thinking and Non-Thinking), and generation speeds above 100 tokens per second, and it ranks first among open-source models globally.

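The Thinking and Non-Thinking modes are selected per request. Below is a minimal sketch of calling the model over REST with the mode toggled on, assuming an OpenAI-compatible chat-completions payload; the endpoint URL, auth header, and the exact `thinking` field are illustrative assumptions, not details taken from this page.

```python
import os
import requests

# Placeholder endpoint and field names: the real base URL, auth scheme, and the
# exact parameter that switches Thinking vs. Non-Thinking mode may differ.
API_URL = "https://api.example.com/v1/chat/completions"
API_KEY = os.environ["ZAI_API_KEY"]

payload = {
    "model": "glm-4.5",
    "messages": [
        {"role": "user", "content": "Summarize the trade-offs of MoE models."}
    ],
    # Assumed toggle for hybrid reasoning; "disabled" would request the faster
    # Non-Thinking mode without an explicit reasoning phase.
    "thinking": {"type": "enabled"},
    "max_tokens": 512,
}

resp = requests.post(
    API_URL,
    headers={"Authorization": f"Bearer {API_KEY}"},
    json=payload,
    timeout=60,
)
resp.raise_for_status()
print(resp.json()["choices"][0]["message"]["content"])
```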

Provider: zai

Model type: chat

Location: rest

Context Window: 128K

Ratings: Intelligence, Speed, Cost Efficiency

Pricing

Input: $0.60 per million tokens
Output: $2.20 per million tokens
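
At these rates, a single call costs input_tokens × $0.60 / 1,000,000 plus output_tokens × $2.20 / 1,000,000. A small helper to estimate this; the token counts in the example are made up for illustration.

```python
def estimate_cost(input_tokens: int, output_tokens: int) -> float:
    """Estimate USD cost of one GLM-4.5 call at $0.60/M input, $2.20/M output."""
    return input_tokens / 1_000_000 * 0.60 + output_tokens / 1_000_000 * 2.20

# Example: a 4,000-token prompt with a 1,000-token reply:
# 4,000 * 0.60 / 1M = $0.0024 and 1,000 * 2.20 / 1M = $0.0022, so $0.0046 total.
print(f"${estimate_cost(4_000, 1_000):.4f}")  # -> $0.0046
```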

Features

Tool Calling: Supported
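
Tool calling is commonly exposed through a `tools` array in the request and `tool_calls` entries in the response. The sketch below assumes an OpenAI-style function-tool schema; the field names and the `get_weather` tool are illustrative assumptions, since this page only states that tool calling is supported.

```python
import json

# Hypothetical OpenAI-style tool definition; the exact schema GLM-4.5 expects
# may differ, so treat this as an illustration of the general request shape.
tools = [
    {
        "type": "function",
        "function": {
            "name": "get_weather",
            "description": "Look up the current weather for a city.",
            "parameters": {
                "type": "object",
                "properties": {"city": {"type": "string"}},
                "required": ["city"],
            },
        },
    }
]

request_body = {
    "model": "glm-4.5",
    "messages": [{"role": "user", "content": "What's the weather in Berlin?"}],
    "tools": tools,
    "tool_choice": "auto",
}

# This body would be POSTed to the chat endpoint as in the earlier sketch; any
# tool call the model makes comes back under choices[0].message.tool_calls.
print(json.dumps(request_body, indent=2))
```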

Create an account and start building today.
