Claude Opus 4.7 + Sonnet 4.6 + GPT-5.5 — up to 7.7× value
Prepay credits, plug into Claude Code / Codex / Cursor, and every credit goes up to 7.7× further. No subscription, no monthly cap.
Which credits get consumed?
One API key works for both. Routing is decided by the model you call.
Claude Opus 4.7, Claude Sonnet 4.6, and GPT-5.5 drain Coding Credits first (FIFO), then spill to General Credits if Coding Credits run out.
Image, video, audio, 3D, and all other LLMs drain General Credits only. Coding Credits are never touched — enforced at the SQL layer.
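The routing rule above can be sketched in a few lines. This is a hypothetical illustration, not the service's actual schema: the `CODING_MODELS` set, the `deduct` function, and the balance shape are all invented for the example.

```python
# Hypothetical sketch of the routing rule: coding models drain Coding
# Credits first, then spill to General; everything else touches General only.
CODING_MODELS = {"claude-opus-4-7", "claude-sonnet-4-6", "gpt-5.5"}

def deduct(balances: dict, model: str, cost: float) -> dict:
    """balances = {"coding": float, "general": float}; returns new balances."""
    b = dict(balances)
    if model in CODING_MODELS:
        from_coding = min(b["coding"], cost)   # drain Coding Credits first
        b["coding"] -= from_coding
        b["general"] -= cost - from_coding     # spill the remainder
    else:
        b["general"] -= cost                   # Coding Credits never touched
    return b
```

So a $1.50 Opus call against balances of 1.0 coding / 5.0 general empties the coding bucket and takes the remaining $0.50 from general, while a video call of any size leaves the coding bucket untouched.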
Pick a pack
One-time purchase. Credits expire 365 days after purchase.
coding-starter: Bonus +25%
coding-pro: Bonus +50%
coding-max: Bonus +75%
Coding Credits are locked to the 3 eligible models — they cannot be spent on video / image / other LLMs.
Which models do these credits work on?
Exactly three model families. Hard-coded — not configurable.
How far each credit goes
Same upstream models. Same tokens. A lot more value per dollar.
| Model | Our $ / MTok (in / out) | Public API rate (in / out) | Your discount |
|---|---|---|---|
| Claude Opus 4.7 / 4.6 | $3.40 / $16.96 | $15.00 / $75.00 | −77% |
| Claude Sonnet 4.6 | $0.68 / $3.40 | $3.00 / $15.00 | −77% |
| GPT-5.5 | $0.42 / $2.52 | $1.20 / $7.20 | −65% |
Per 1M tokens. Output dominates real coding workloads — most of the value shows up there.
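The table's discounts and the headline figure can be checked with quick arithmetic. One plausible reading of "up to 7.7×" combines the coding-max +75% bonus with the Opus output-token discount; the sketch below assumes that reading.

```python
# Verify the discounts in the table above, per MTok output rates.
RATES = {  # model: (our_out, public_out)
    "claude-opus-4-7":   (16.96, 75.00),
    "claude-sonnet-4-6": (3.40, 15.00),
    "gpt-5.5":           (2.52, 7.20),
}

for model, (our_out, pub_out) in RATES.items():
    discount = 1 - our_out / pub_out        # output-side discount
    print(f"{model}: {discount:.0%} off")   # 77%, 77%, 65%

# With the coding-max +75% bonus, each prepaid dollar buys 1.75 credits,
# and each credit buys output tokens at the discounted Opus rate:
value = 1.75 * (75.00 / 16.96)
print(f"up to {value:.1f}x")                # up to 7.7x
```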
How to use it
Drop Hypereal into any tool that speaks OpenAI or Anthropic-compatible APIs.
Claude Code
The official Anthropic Claude Code CLI talks to /v1/messages — point it at Hypereal and pin Opus or Sonnet.
# 1. Get your API key
# https://hypereal.cloud/manage-api-keys
# 2. Point Claude Code at Hypereal (Anthropic-compatible endpoint)
export ANTHROPIC_BASE_URL="https://hypereal.cloud/v1"
export ANTHROPIC_API_KEY="sk-hyp-..." # your Hypereal key
# 3. Pin a coding-plan model (opus 4.7 or sonnet 4.6)
export ANTHROPIC_MODEL="claude-opus-4-7"
# 4. Run
claude
OpenAI Codex CLI
OpenAI Codex CLI uses /v1/chat/completions. Set OPENAI_BASE_URL to Hypereal and pin GPT-5.5.
export OPENAI_BASE_URL="https://hypereal.cloud/v1"
export OPENAI_API_KEY="sk-hyp-..."
export OPENAI_MODEL="gpt-5.5"
codex --provider openai
Cursor
Cursor accepts any OpenAI-compatible endpoint. Override the OpenAI host in Settings → Models.
# Cursor → Settings → Models → "Add OpenAI-compatible"
# Base URL: https://hypereal.cloud/v1
# API Key: sk-hyp-...
# Model: gpt-5.5 (or claude-opus-4-7 / claude-sonnet-4-6)
#
# Enable "Override OpenAI" in the same panel so Cursor routes
# completions through Hypereal instead of api.openai.com.
OpenCode / Cline / Aider
OpenCode, Cline (VS Code), and Aider all accept ANTHROPIC_API_BASE / ANTHROPIC_API_KEY env vars or equivalent settings.
# OpenCode / Cline (VS Code) — Anthropic-compatible
{
"anthropic.baseURL": "https://hypereal.cloud/v1",
"anthropic.apiKey": "sk-hyp-...",
"anthropic.model": "claude-opus-4-7"
}
# Aider — pass through env
export ANTHROPIC_API_BASE="https://hypereal.cloud/v1"
export ANTHROPIC_API_KEY="sk-hyp-..."
aider --model anthropic/claude-opus-4-7
Hermes Agent
Hermes Agent works with any OpenAI-compatible endpoint. Configure once and run.
# Hermes Agent supports OpenAI-compatible endpoints out of the box.
hermes config set provider openai-compatible
hermes config set base_url https://hypereal.cloud/v1
hermes config set api_key sk-hyp-...
hermes config set model gpt-5.5
hermes run "Refactor auth.ts to use jose"
Raw curl
For agents and scripts: hit /v1/messages directly with your API key.
curl https://hypereal.cloud/v1/messages \
  -H "x-api-key: sk-hyp-..." \
  -H "content-type: application/json" \
  -H "anthropic-version: 2023-06-01" \
  -d '{
    "model": "claude-opus-4-7",
    "max_tokens": 1024,
    "messages": [{"role": "user", "content": "Write a Python quicksort."}]
  }'
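The same call can be built from Python with only the standard library. A sketch: the URL, headers, and body mirror the curl example, and the final (commented) `urlopen` line is what would actually send the request.

```python
import json
import urllib.request

# Build the same /v1/messages request as the curl example above.
body = {
    "model": "claude-opus-4-7",
    "max_tokens": 1024,
    "messages": [{"role": "user", "content": "Write a Python quicksort."}],
}
req = urllib.request.Request(
    "https://hypereal.cloud/v1/messages",
    data=json.dumps(body).encode("utf-8"),
    headers={
        "x-api-key": "sk-hyp-...",   # your Hypereal key
        "content-type": "application/json",
        "anthropic-version": "2023-06-01",
    },
    method="POST",
)
# resp = urllib.request.urlopen(req)   # uncomment to send
```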
