
glm-5
GLM-5 is a 744B-parameter Mixture-of-Experts model with 40B active parameters. It offers a 200K-token context window, up to 128K output tokens, and is designed for agentic engineering. Trained on 28.5T tokens, it has enhanced coding, reasoning, and agent capabilities.
Provider: zai
Model type: chat
Location: rest
Context window: 204,800 tokens
Pricing
Input: $1.00 per million tokens
Output: $3.20 per million tokens
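As a worked example of the rates above (the per-token rates come from this page; the request sizes below are hypothetical):

```python
# Cost estimate at the listed rates: $1.00 per million input tokens,
# $3.20 per million output tokens. Token counts are illustrative only.
INPUT_RATE = 1.00 / 1_000_000   # USD per input token
OUTPUT_RATE = 3.20 / 1_000_000  # USD per output token

def request_cost(input_tokens: int, output_tokens: int) -> float:
    """Return the USD cost of a single request at the listed rates."""
    return input_tokens * INPUT_RATE + output_tokens * OUTPUT_RATE

# A request with 100K input tokens and 8K output tokens:
cost = request_cost(100_000, 8_000)
print(f"${cost:.4f}")  # $0.1256
```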
Features
Tool Calling: supported
JSON Mode: supported
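The tool-calling and JSON-mode features listed above are typically exercised through an OpenAI-style chat-completions request body. The sketch below only constructs such a payload; the exact endpoint, the `get_weather` tool, and the surrounding field names are assumptions for illustration, not details from this page:

```python
import json

# Hypothetical chat-completions request body. The model name comes from
# this page; the tool schema and message content are illustrative.
payload = {
    "model": "glm-5",
    "messages": [
        {"role": "user", "content": "What's the weather in Berlin?"}
    ],
    # Tool calling: declare a function the model may choose to invoke.
    "tools": [
        {
            "type": "function",
            "function": {
                "name": "get_weather",  # hypothetical tool
                "description": "Look up current weather for a city",
                "parameters": {
                    "type": "object",
                    "properties": {"city": {"type": "string"}},
                    "required": ["city"],
                },
            },
        }
    ],
    # JSON mode: constrain the reply to valid JSON.
    "response_format": {"type": "json_object"},
}

print(json.dumps(payload, indent=2))
```

If the model decides to call the declared tool, the response would carry a tool-call entry with JSON-encoded arguments rather than plain text; with `response_format` set to `json_object`, plain-text replies are constrained to valid JSON.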
