zai

glm-4.7

GLM-4.7 is a 355B-parameter Mixture-of-Experts model with 32B active parameters. It features a 200K context window, 128K maximum output tokens, and enhanced agentic coding capabilities, delivering comprehensive upgrades in general conversation, reasoning, and agent capabilities, including a 38% improvement over GLM-4.6 on the HLE benchmark.

Provider: zai

Model type: chat

Location: rest

Context Window: 204,800 tokens

Pricing

$0.60 per million input tokens
$2.20 per million output tokens
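The published rates make per-request cost easy to estimate. A minimal sketch, using only the input and output prices listed above:

```python
# Estimate request cost from GLM-4.7's published per-million-token rates.
INPUT_RATE = 0.60 / 1_000_000   # $ per input token
OUTPUT_RATE = 2.20 / 1_000_000  # $ per output token

def request_cost(input_tokens: int, output_tokens: int) -> float:
    """Return the dollar cost of a single request."""
    return input_tokens * INPUT_RATE + output_tokens * OUTPUT_RATE

# Example: 10,000 input tokens and 2,000 output tokens.
print(round(request_cost(10_000, 2_000), 4))  # 0.0104
```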

Features

Tool Calling: Supported
JSON Mode: Supported
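A sketch of how these two features typically appear in a request body. This assumes an OpenAI-compatible chat-completions schema; the field names (`response_format`, `tools`) and the `get_weather` tool are illustrative assumptions, not the provider's confirmed API, so consult the official API reference before use:

```python
import json

# Hypothetical request body for an OpenAI-compatible chat endpoint.
# Field names and the example tool are assumptions for illustration.
payload = {
    "model": "glm-4.7",
    "messages": [
        {"role": "user", "content": "List three prime numbers as JSON."}
    ],
    # JSON mode: constrain the model to emit valid JSON.
    "response_format": {"type": "json_object"},
    # Tool calling: declare a function the model may choose to invoke.
    "tools": [{
        "type": "function",
        "function": {
            "name": "get_weather",  # hypothetical example tool
            "description": "Look up current weather for a city.",
            "parameters": {
                "type": "object",
                "properties": {"city": {"type": "string"}},
                "required": ["city"],
            },
        },
    }],
}

print(json.dumps(payload, indent=2))
```

The payload is built but not sent here; in practice it would be POSTed to the provider's chat-completions endpoint with an API key.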

Create an account and start building today.
