Self-Hosted AI API

OpenAI-compatible API running on local hardware. No data leaves your infrastructure. Pay only for what you use.

View Docs

Available Models

All models run locally on our servers; the model list reflects real-time availability.

Why Choose Us

Enterprise-grade AI infrastructure you can trust.

🔒
100% Private
All data stays on our local servers. No third-party API calls. Complete data sovereignty.
OpenAI Compatible
Drop-in replacement for the OpenAI API. Works with LangChain, LiteLLM, and any OpenAI SDK (see the client sketch after this list).
💰
Pay As You Go
No subscriptions required. Only pay for the tokens you actually use. Transparent pricing.
🧠
Multiple Models
Text, Vision, and Audio models available. Switch models with a single parameter change.
📊
Usage Dashboard
Track your usage, costs, and performance in real-time. Export data anytime.
🛠️
Tool Calling
Full function/tool calling support. Build agents and complex workflows (see the tool-calling sketch after this list).
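
Because the gateway speaks the OpenAI API, existing tooling only needs its base URL and key swapped in. Below is a minimal sketch, assuming the endpoint, "your-api-key", and "model-name" placeholders from the Quick Start below and the standard LiteLLM and LangChain OpenAI integrations; adapt the identifiers to your account.

# Minimal sketch: pointing OpenAI-compatible clients at the gateway.
# "model-name" and "your-api-key" are placeholders.

# LiteLLM: the "openai/" prefix routes the call through its OpenAI adapter.
from litellm import completion

resp = completion(
    model="openai/model-name",
    messages=[{"role": "user", "content": "Hello!"}],
    api_base="https://gateway.simplecore.app/v1",
    api_key="your-api-key",
)
print(resp.choices[0].message.content)

# LangChain: ChatOpenAI accepts the same base_url/api_key overrides.
from langchain_openai import ChatOpenAI

llm = ChatOpenAI(
    model="model-name",
    base_url="https://gateway.simplecore.app/v1",
    api_key="your-api-key",
)
print(llm.invoke("Hello!").content)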
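
Tool calling follows the standard OpenAI chat-completions schema. A hedged sketch, again with the placeholder endpoint, key, and model name, and assuming the chosen model supports tool use; get_weather is a hypothetical example tool.

# Sketch of OpenAI-style tool calling against the gateway; placeholders as above.
import json
from openai import OpenAI

client = OpenAI(
    base_url="https://gateway.simplecore.app/v1",
    api_key="your-api-key",
)

tools = [{
    "type": "function",
    "function": {
        "name": "get_weather",  # hypothetical example tool
        "description": "Get the current weather for a city",
        "parameters": {
            "type": "object",
            "properties": {"city": {"type": "string"}},
            "required": ["city"],
        },
    },
}]

response = client.chat.completions.create(
    model="model-name",
    messages=[{"role": "user", "content": "What's the weather in Berlin?"}],
    tools=tools,
)

# If the model chooses to call the tool, its arguments arrive as a JSON string.
for call in response.choices[0].message.tool_calls or []:
    print(call.function.name, json.loads(call.function.arguments))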

Quick Start

Get started in under a minute.

Python
# Install the OpenAI SDK first:
#   pip install openai

# Use our endpoint
from openai import OpenAI

client = OpenAI(
    base_url="https://gateway.simplecore.app/v1",
    api_key="your-api-key"
)

response = client.chat.completions.create(
    model="model-name",
    messages=[{"role": "user", "content": "Hello!"}],
    stream=True
)

for chunk in response:
    # The final streamed chunk carries no content, so guard against None
    if chunk.choices[0].delta.content:
        print(chunk.choices[0].delta.content, end="", flush=True)

Simple Pricing

Transparent, usage-based pricing. No hidden fees. A worked cost estimate and a rate-limit handling sketch follow the plan details below.

Pay As You Go
€0.0001 / 1K tokens
Perfect for getting started
  • No minimum commitment
  • All models included
  • Streaming support
  • 5 requests/minute
  • Pay via bank transfer
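
At the listed rate, costs scale linearly with token usage. A quick back-of-the-envelope helper, assuming the €0.0001 / 1K-token rate applies uniformly to input and output tokens across all models:

# Back-of-the-envelope cost estimate at the listed pay-as-you-go rate.
# Assumes the rate covers both prompt and completion tokens.
RATE_EUR_PER_1K_TOKENS = 0.0001

def estimate_cost(total_tokens: int) -> float:
    """Estimated cost in euros for a given token count."""
    return total_tokens / 1000 * RATE_EUR_PER_1K_TOKENS

# Example: one million tokens comes to roughly €0.10.
print(f"€{estimate_cost(1_000_000):.2f}")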
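
The 5 requests/minute cap means bursty clients should expect HTTP 429 responses. A minimal retry sketch, assuming the gateway signals the limit with a standard 429 (raised by the OpenAI SDK as RateLimitError) and reusing the placeholder endpoint, key, and model name:

# Retry with a simple backoff when the gateway reports a rate limit.
import time
from openai import OpenAI, RateLimitError

client = OpenAI(
    base_url="https://gateway.simplecore.app/v1",
    api_key="your-api-key",
)

def chat_with_backoff(messages, retries=3):
    for attempt in range(retries):
        try:
            return client.chat.completions.create(
                model="model-name",
                messages=messages,
            )
        except RateLimitError:
            # Wait out part of the one-minute window before retrying.
            time.sleep(15 * (attempt + 1))
    raise RuntimeError("Still rate-limited after retries")

reply = chat_with_backoff([{"role": "user", "content": "Hello!"}])
print(reply.choices[0].message.content)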