
Hanzo Zen Models
Language · Code · Vision · Audio

55+ open-weight models spanning 0.6B to 1T+ parameters, built on the Zen MoDE (Mixture of Diverse Experts) architecture. All models are available on HuggingFace.

55+ Models · 1.04T Max Params · 2M Max Context
Language + Code + Vision + Audio
Zen MoDE — Mixture of Diverse Experts
OpenAI-compatible API
Open-weight on HuggingFace

Full Model Catalog

55+ models · Available via API and HuggingFace

Browse on HuggingFace
zen4
Ultra · 744B (40B active) · 202K context
zen4-pro
Ultra · 80B (3B active) · 131K context
zen4-max
Ultra · 1M context
zen4.1
Ultra · 1M context
zen4-mini
Pro · 128K context
zen4-ultra
Ultra · 744B (40B active) · 262K context
zen4-thinking
Max · 80B (3B active) · 131K context
zen4-coder
Ultra · 480B (35B active) · 163K context
zen4-coder-flash
Max · 30B (3B active) · 262K context
zen4-coder-pro
Ultra · 480B · 131K context
zen3-omni
Max · ~200B · 202K context
zen3-vl
Max · 30B (3B active) · 262K context
zen3-nano
Pro · 8B · 128K context
zen3-guard
Pro · 4B · 65K context
zen3-embedding
Max · 3072 dimensions · 8K context
zen3-embedding-medium
Pro · 4B · 40K context
zen3-embedding-small
Pro · 0.6B · 32K context
zen3-reranker
Max · 8B · 40K context
zen3-reranker-medium
Pro · 4B · 40K context
zen3-reranker-small
Pro · 0.6B · 40K context
zen3-image
Vision · text-to-image
zen3-image-max
Vision · text-to-image
zen3-image-dev
Vision · text-to-image
zen3-image-fast
Vision · text-to-image
zen3-image-sdxl
Vision · text-to-image
zen3-image-playground
Vision · text-to-image
zen3-image-ssd
Vision · 1B · text-to-image
zen3-image-jp
Vision · text-to-image
zen3-audio
Max · 1.5B · multi-language
zen3-audio-fast
Pro · 809M · multi-language
zen3-asr
Max · streaming
zen3-asr-v1
Pro · streaming
zen3-tts
Max · 82M · 40+ voices
zen3-tts-hd
Ultra · HD quality
zen3-tts-fast
Pro · 82M · low latency
zen5 (Research Preview)
Ultra · TBA · 1M context
zen5-pro (Research Preview)
Ultra · TBA · 524K context
zen5-max (Research Preview)
Ultra · TBA · 2M context
zen5-ultra (Research Preview)
Ultra · TBA · 1M context
zen5-mini (Research Preview)
Pro · TBA · 262K context
zen-nano
Pro · 0.6B · 32K context
zen-eco
Pro · 4B · 32K context
zen
Pro · 8–32B · 32K context
zen-pro
Max · 32B · 32K context
zen-max
Ultra · 235B (22B active) · 131K context
zen-next
Ultra · TBD · 256K context
zen-coder
Max · 32B · 131K context
zen-coder-flash
Pro · 7B · 32K context
zen-code
Pro · 14B · 32K context
zen-vl
Max · 32B · 32K context
zen-omni
Multimodal · 72B · 131K context
zen-guard
Pro · 8B · 32K context · content moderation
zen-embedding
Pro · 3072 dimensions · 8K context
zen-reranker
Pro · 568M · 8K context · cross-encoder
zen-agent
Ultra · 32B · 131K context · tool use
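
The embedding entries in the catalog use the familiar OpenAI-style embeddings format. A minimal sketch of a request and response for zen3-embedding follows; the field names come from the standard OpenAI embeddings schema, and the 3072-vector dimension comes from the catalog entry above (the response here is a mocked example, not real API output).

```python
import json

# Sketch: OpenAI-style embeddings request for zen3-embedding (3072 dims).
request_body = {
    "model": "zen3-embedding",
    "input": ["first document", "second document"],
}
print(json.dumps(request_body))

# A response in this format carries one 3072-float vector per input.
# This is a mocked example response, not real output:
example_response = {
    "object": "list",
    "model": "zen3-embedding",
    "data": [
        {"object": "embedding", "index": i, "embedding": [0.0] * 3072}
        for i in range(len(request_body["input"]))
    ],
}
vectors = [item["embedding"] for item in example_response["data"]]
print(len(vectors), len(vectors[0]))  # one vector per input, 3072 dims each
```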

Why Zen Models?

OpenAI-Compatible API

Drop-in replacement for GPT/Claude. Same SDK, same format, lower cost.
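
Because the API follows the OpenAI chat format, a request is just the standard chat-completions payload pointed at a different base URL. The sketch below builds such a request with the standard library only; the base URL `https://api.hanzo.ai/v1` and the API-key placeholder are assumptions for illustration, not documented values.

```python
import json
import urllib.request

# Assumed base URL for illustration; substitute the real Hanzo endpoint
# and your API key. The payload shape is the standard OpenAI chat format.
BASE_URL = "https://api.hanzo.ai/v1"

payload = {
    "model": "zen4-pro",  # model id from the catalog above
    "messages": [
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Summarize mixture-of-experts in one sentence."},
    ],
    "max_tokens": 128,
}

req = urllib.request.Request(
    f"{BASE_URL}/chat/completions",
    data=json.dumps(payload).encode("utf-8"),
    headers={
        "Content-Type": "application/json",
        "Authorization": "Bearer $HANZO_API_KEY",  # placeholder key
    },
)
# urllib.request.urlopen(req) would send it; equivalently, any OpenAI SDK
# configured with base_url=BASE_URL works unchanged, since the wire format
# is identical.
print(req.full_url)
```

Swapping an existing GPT integration over is then a matter of changing the base URL and model name, not rewriting request code.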

Open Weight

All Zen models are open-weight on HuggingFace. Run them yourself.

Full Spectrum

82M edge to 1T+ frontier. Text, code, vision, audio, image — one API.

MoE Efficiency

Mixture-of-Experts activates only a fraction of params per token.
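
The catalog's own numbers make the savings concrete. A small sketch, using the total/active parameter counts listed above:

```python
# Active-parameter fraction per token for MoE models in the catalog.
# (total, active) counts are in billions, taken from the entries above.
models = {
    "zen4": (744, 40),
    "zen4-coder": (480, 35),
    "zen-max": (235, 22),
}

for name, (total, active) in models.items():
    frac = active / total
    print(f"{name}: {active}B of {total}B params active per token ({frac:.1%})")
```

So zen4 routes each token through roughly 5% of its 744B parameters, which is where the inference-cost advantage over a dense model of the same size comes from.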

Every Modality

Text, vision, code, speech — all through a single unified API.

Language

Chat, reasoning, RAG

zen4-max
zen4-ultra
zen4-pro

Code

Generation, review, debug

zen4-coder-flash
zen4-coder
zen4-mini

Vision

Image, video, OCR

zen-vl-4b
zen-vl-8b
zen-vl-30b

Audio

Speech-to-speech <300ms

zen-omni
Coming Soon
Next Generation

Zen 5 Ultra

2T+ parameter MoDE, trained on-chain via NVIDIA TEE confidential compute on hanzo.network, and set to be the largest open-weight model released to date.

2T+
Parameters
MoDE
Mixture of Diverse Experts
TEE
On-Chain Verifiable Training
䷀ ䷸ ䷹ ䷺ ䷻ ䷼ ䷽ ䷾ ䷿ ䷡

The Philosophy Behind the Models

These models are built on ten engineering principles drawn from the 64 hexagrams of the I-Ching: orthogonality, smallness, completeness, clarity, composability. An ancient pattern language for systems that last.

易經 · Explore the Full Philosophy

Ready to build with Zen?

55+ models, full spectrum of modalities, one API. Start building today.