The habitat for AI models. Publish, version and discover model weights, datasets and interactive demos — with on-demand inference, sandboxed Spaces and signed leaderboards, all in one place.
```python
# Push a model, pull weights, run inference
from koder_zoo import Client

zoo = Client.login()

# Publish a model with weights and a card
zoo.models.push(
    "acme/helios-7b",
    weights="./out/weights.safetensors",
    card="./MODEL_CARD.md",
)

# Pull someone else's model
model = zoo.models.pull("koder/aurora-base")

# Run inference on the hosted endpoint
out = zoo.inference.run(
    "koder/aurora-base",
    prompt="Describe a monstera leaf",
    stream=True,
)
```
Weights, data, demos and evaluations — four first-class inhabitants sharing the same registry, access control and storage layer.
**Models:** Versioned model weights with rich metadata, signed provenance and content-addressed storage. Push via the Git LFS protocol or the Python SDK. *safetensors · gguf · onnx*

**Datasets:** Schema-aware datasets with train/val/test splits, row-level preview and streaming download. Reuses the same storage engine as models. *parquet · splits · streaming*

**Model cards:** Structured docs covering intended use, limitations, training data, evaluation metrics and ethical considerations — rendered from markdown with frontmatter. *markdown · schema · signed*

**Spaces:** Interactive demos running in sandboxed Firecracker microVMs — one permanent URL per Space, cold-started on request and frozen while idle. *Firecracker · sandboxed · per-URL*

**Inference:** On-demand, streaming and batched inference for any hosted model. Integrated with gateway quotas and usage-based billing out of the box. *streaming · batched · quotas*

**Leaderboards:** Task-scoped benchmarks computed by Koder Eval and published as signed attestations. Rank models per metric across common tasks. *MMLU · HumanEval · attested*

A registry that scales from a single notebook experiment to tenant-wide production models — without changing tools.
Push and pull weights with the familiar Git LFS protocol. Any existing tooling that speaks LFS works unchanged.
Every blob is addressed by its hash, deduplicated across versions and tenants. Rollbacks are instant.
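To see why content addressing makes deduplication and rollback cheap, here is a minimal sketch — a toy in-memory store, not the Koder Zoo implementation (the `BlobStore` class and its fields are illustrative):

```python
import hashlib

class BlobStore:
    """Toy content-addressed store: blobs keyed by their SHA-256 digest."""

    def __init__(self):
        self.blobs = {}     # digest -> bytes (each unique blob stored once)
        self.versions = {}  # (repo, version) -> digest

    def push(self, repo, version, data):
        digest = hashlib.sha256(data).hexdigest()
        self.blobs.setdefault(digest, data)       # identical bytes deduplicate
        self.versions[(repo, version)] = digest   # a version is just a pointer
        return digest

    def pull(self, repo, version):
        return self.blobs[self.versions[(repo, version)]]

store = BlobStore()
d1 = store.push("acme/helios-7b", "v0.1.0", b"weights-A")
d2 = store.push("acme/helios-7b", "v0.2.0", b"weights-A")  # same bytes, same digest
assert d1 == d2 and len(store.blobs) == 1
# Rolling back means re-reading an older version's pointer — no data is copied:
assert store.pull("acme/helios-7b", "v0.1.0") == b"weights-A"
```

Because a version is only a pointer to a digest, re-publishing identical weights stores nothing new, and reverting to any earlier version is a metadata update.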
Access control is delegated to Koder ID. Public, private and org-scoped repositories with fine-grained roles.
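Public, private and org-scoped visibility can be pictured as a simple check like the following — the role and visibility names are illustrative, not the actual Koder ID schema:

```python
def can_read(repo, user):
    """Public repos are readable by anyone; org repos need org membership;
    private repos need an explicit member grant."""
    if repo["visibility"] == "public":
        return True
    if repo["visibility"] == "org":
        return repo["org"] in user["orgs"]
    return user["id"] in repo["members"]  # private

helios = {"visibility": "org", "org": "acme", "members": []}
assert can_read(helios, {"id": "alice", "orgs": ["acme"]})
assert not can_read(helios, {"id": "bob", "orgs": []})
```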
Every push carries a signed attestation — who uploaded what, when, from which runtime. Tamper-evident by design.
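The shape of such an attestation can be sketched with an HMAC over a canonical manifest. This is a simplification for illustration — a production registry would more likely use asymmetric keys, and the field names here are assumptions:

```python
import hashlib
import hmac
import json

SIGNING_KEY = b"registry-secret"  # illustrative; a real registry would hold a keypair

def attest(uploader, repo, digest, runtime, timestamp):
    """Record who uploaded what, when, and from which runtime — then sign it."""
    manifest = json.dumps(
        {"uploader": uploader, "repo": repo, "blob": digest,
         "runtime": runtime, "at": timestamp},
        sort_keys=True,
    ).encode()
    sig = hmac.new(SIGNING_KEY, manifest, hashlib.sha256).hexdigest()
    return manifest, sig

def verify(manifest, sig):
    """Any change to the manifest invalidates the signature: tamper-evident."""
    expected = hmac.new(SIGNING_KEY, manifest, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, sig)

manifest, sig = attest("alice", "acme/helios-7b", "sha256:ab12",
                       "ci-runner-3", "2025-01-01T00:00:00Z")
assert verify(manifest, sig)
assert not verify(manifest.replace(b"alice", b"mallory"), sig)
```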
Blobs live in a MinIO cluster you can back up, replicate and scale independently from metadata.
Inference usage flows straight into Koder Billing. Per-tenant quotas enforced by the gateway in front of every endpoint.
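Gateway-side metering of this kind can be sketched as a per-tenant counter that both enforces the quota and records the usage billing consumes — the quota numbers and tenant names below are made up:

```python
class Gateway:
    """Toy usage meter: enforce a per-tenant token quota, record billable usage."""

    def __init__(self, quotas):
        self.quotas = quotas  # tenant -> allowed tokens per billing period
        self.used = {}        # tenant -> tokens consumed so far

    def charge(self, tenant, tokens):
        spent = self.used.get(tenant, 0)
        if spent + tokens > self.quotas.get(tenant, 0):
            raise PermissionError(f"quota exceeded for {tenant}")
        self.used[tenant] = spent + tokens
        return self.used[tenant]  # this running total is what billing reads

gw = Gateway({"acme": 1000})
gw.charge("acme", 600)
gw.charge("acme", 300)
try:
    gw.charge("acme", 200)  # would exceed the 1000-token quota
    rejected = False
except PermissionError:
    rejected = True
assert rejected  # the request is refused before it ever reaches the model
```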
Each Space runs in its own Firecracker microVM, with strict CPU, memory and egress limits.
Leaderboard results come from Koder Eval as signed attestations, not self-reported numbers.
Everything in Koder Zoo is scriptable. The SDK and the Git LFS protocol give you two paths to the same store.
Upload weights, the model card and a manifest in a single operation. Koder Zoo hashes the blobs, deduplicates against existing versions and signs the manifest.
```python
# Publish a model
from koder_zoo import Client

zoo = Client.login()

zoo.models.push(
    repo="acme/helios-7b",
    version="v0.3.0",
    weights=[
        "out/weights-00001-of-00002.safetensors",
        "out/weights-00002-of-00002.safetensors",
    ],
    card="MODEL_CARD.md",
    tags=["text-generation", "en", "apache-2.0"],
)
```
Run inference on any hosted endpoint. Requests go through the Koder AI gateway, which enforces quotas and bills the calling tenant automatically.
```python
# Stream tokens from a hosted model
for chunk in zoo.inference.stream(
    model="koder/aurora-base",
    prompt="Write a haiku about monstera leaves",
    max_tokens=128,
    temperature=0.7,
):
    print(chunk.text, end="", flush=True)
```
Every layer is independently scalable. Metadata in KDB, blobs in MinIO, Spaces in Firecracker microVMs, inference routed by the Koder AI gateway.
Koder Zoo is a private-first, tenant-scoped model hub for the Koder AI platform. Here is how it stacks up against common alternatives.
| Capability | Koder Zoo | Hugging Face | S3 + Git LFS | MLflow |
|---|---|---|---|---|
| Versioned model weights | ✓ | ✓ | ✓ | ✓ |
| Dataset hub with previews | ✓ | ✓ | — | — |
| Sandboxed interactive Spaces | ✓ | ✓ | — | — |
| On-demand streaming inference | ✓ | ✓ | — | — |
| Signed leaderboards | ✓ | — | — | — |
| Signed provenance on every push | ✓ | — | — | — |
| Tenant-scoped quotas & billing | ✓ | — | — | — |
| Private-first, self-hostable | ✓ | — | ✓ | ✓ |
| SSO via central identity | ✓ | — | — | — |
Start publishing models, datasets and Spaces on the Koder AI platform. Get Started takes you to the Koder ID login; no credit card required.
Get Started