Hanzo Gateway
Unified AI inference proxy
Route to 100+ LLM providers through a single API. Load balancing, fallbacks, caching, and cost optimization.
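The fallback behavior mentioned above can be sketched client-side: try a preference-ordered list of models and fall through on failure. The helper and model names below are illustrative only, not the gateway's actual routing API.

```python
def complete_with_fallback(call_model, models):
    """Return the first successful response from a preference-ordered
    list of models.

    `call_model` is any callable that takes a model name and either
    returns a response or raises. This is a sketch of the fallback
    idea, not the gateway's internal implementation.
    """
    last_error = None
    for model in models:
        try:
            return call_model(model)
        except Exception as exc:
            last_error = exc  # remember the failure, try the next model
    raise RuntimeError(f"all models failed: {last_error}")


# Example with a stub standing in for a real API call:
def stub(model):
    if model == "gpt-4o":
        raise TimeoutError("primary provider unavailable")
    return f"answered by {model}"


print(complete_with_fallback(stub, ["gpt-4o", "claude-3-5-sonnet"]))
# → answered by claude-3-5-sonnet
```

With the real SDK, `call_model` would wrap `client.chat.completions.create` and the gateway itself would apply this logic server-side.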
Request:
{
  "model": "gpt-4o",
  "messages": [
    {"role": "user", "content": "Hello!"}
  ]
}

Response:
{
  "choices": [{
    "message": {
      "content": "Hi!"
    }
  }],
  "usage": {...}
}

Gateway Quick Start
Get started in minutes with your language of choice
Use the OpenAI SDK with Hanzo Gateway
from openai import OpenAI

client = OpenAI(
    base_url="https://api.hanzo.ai/v1",
    api_key="your-hanzo-key"
)

response = client.chat.completions.create(
    model="gpt-4o",  # or claude-3-5-sonnet, gemini-pro, etc.
    messages=[{"role": "user", "content": "Hello!"}]
)

print(response.choices[0].message.content)

Features
Everything you need to get started
Official Gateway SDKs
Use our official SDKs to integrate Gateway into your application
Join the Gateway Community
Get help, share ideas, and contribute to the project
Want to Contribute?
We welcome contributions of all kinds: bug reports, feature requests, documentation improvements, and code contributions.
Read our Contributing Guide

Related Products
More from Hanzo Compute
Powered by LiteLLM
Hanzo Gateway is built on top of LiteLLM, an open-source project. Call 100+ LLMs with a unified interface. LiteLLM is the most popular LLM proxy, with 18k+ GitHub stars.
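LiteLLM's unified interface keys on model-name strings, where a provider prefix such as `anthropic/claude-3-5-sonnet` selects the backend. A minimal sketch of that routing idea (the default-to-`openai` rule here is an assumption for illustration, not LiteLLM's exact behavior):

```python
def split_model_name(name):
    """Split a 'provider/model' string into (provider, model).

    Names without a prefix default to 'openai' in this sketch; the
    real library has its own resolution rules.
    """
    provider, sep, model = name.partition("/")
    if not sep:
        return "openai", name
    return provider, model


print(split_model_name("anthropic/claude-3-5-sonnet"))
# → ('anthropic', 'claude-3-5-sonnet')
print(split_model_name("gpt-4o"))
# → ('openai', 'gpt-4o')
```

A proxy built this way can dispatch one OpenAI-shaped request to any provider by looking up the parsed prefix in a table of backend clients.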
Licensed under MIT
We're grateful to the LiteLLM maintainers and community for their incredible work.
Ready to get started with Gateway?
Deploy in minutes with Hanzo Cloud or self-host with our open-source release.