Proxy API

The Proxy API is the public surface of api_gateway projects. It accepts any HTTP method and any tail path:

ANY /api/{uuid}/proxy/{slug}/{any?}

Requires the proxy scope. Requests are forwarded to endpoint.upstream_url + tail_path, applying the endpoint's method allowlist, header policies, and OAuth injection.

What the client sends:

GET /api/{uuid}/proxy/weather/data/2.5/weather?q=Berlin
POST /api/{uuid}/proxy/github/repos/owner/repo/issues
PUT /api/{uuid}/proxy/myapi/users/42
DELETE /api/{uuid}/proxy/myapi/users/42

What gets forwarded upstream:

GET https://api.openweathermap.org/data/2.5/weather?q=Berlin
POST https://api.github.com/repos/owner/repo/issues
PUT https://my-api.example.com/users/42
DELETE https://my-api.example.com/users/42

The exact transformation depends on the endpoint config — see API Gateway for the full set of toggles.
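The URL mapping itself is plain concatenation of the endpoint's base URL and the tail path. A minimal sketch (the `build_upstream_url` helper is illustrative, not a PromptGate internal):

```python
from urllib.parse import urlsplit, urlunsplit

def build_upstream_url(upstream_url: str, tail_path: str, query: str = "") -> str:
    """Join the endpoint's upstream base URL with the client's tail path."""
    scheme, netloc, base_path, _, _ = urlsplit(upstream_url)
    path = base_path.rstrip("/") + "/" + tail_path.lstrip("/")
    return urlunsplit((scheme, netloc, path, query, ""))

# /api/{uuid}/proxy/weather/data/2.5/weather?q=Berlin with
# upstream_url = https://api.openweathermap.org becomes:
print(build_upstream_url("https://api.openweathermap.org",
                         "data/2.5/weather", "q=Berlin"))
# → https://api.openweathermap.org/data/2.5/weather?q=Berlin
```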

| Direction | Header | Behaviour |
| --- | --- | --- |
| Client → PromptGate | `Authorization: Bearer pg_…` | Required for auth. Always stripped before forwarding. |
| Client → PromptGate | Anything else | Forwarded if in the `forward_headers` allowlist (or if no allowlist). |
| PromptGate → Upstream | `Authorization: Bearer <oauth>` | Injected if the endpoint binds an OAuth Service Connection. |
| PromptGate → Upstream | `inject_headers` content | Server-side static headers, always added. |
| PromptGate → Upstream | `Host`, `Cookie`, `Content-Length` | Always blocked. |
| Upstream → Client | Most response headers | Forwarded. |
| Upstream → Client | `Transfer-Encoding`, `Connection`, `Content-Length` | Blocked (let the HTTP layer manage them). |

Body is passthrough. Whatever the client sends, PromptGate forwards verbatim. No JSON parsing, no transformation. Same for the response body.

So you can proxy any content type — JSON, form-encoded, multipart, binary.

```sh
curl "$PG_URL/api/$PG_UUID/proxy/weather/data/2.5/weather?q=Berlin&units=metric" \
  -H "Authorization: Bearer $PG_TOKEN"
```

```sh
curl -X POST "$PG_URL/api/$PG_UUID/proxy/github/repos/me/repo/issues" \
  -H "Authorization: Bearer $PG_TOKEN" \
  -H "Content-Type: application/json" \
  -d '{"title": "Bug report", "body": "Found an issue."}'
```

(The endpoint has an OAuth Service Connection bound, so Authorization: Bearer <github-oauth> is injected upstream automatically.)

```python
import os, requests

# Read
r = requests.get(
    f"{os.environ['PG_URL']}/api/{os.environ['PG_UUID']}/proxy/weather/data/2.5/weather",
    headers={"Authorization": f"Bearer {os.environ['PG_TOKEN']}"},
    params={"q": "Berlin"},
)
print(r.json())

# Write
r = requests.post(
    f"{os.environ['PG_URL']}/api/{os.environ['PG_UUID']}/proxy/github/repos/me/repo/issues",
    headers={"Authorization": f"Bearer {os.environ['PG_TOKEN']}"},
    json={"title": "Bug report"},
)
print(r.status_code, r.json())
```
```js
const url = `${process.env.PG_URL}/api/${process.env.PG_UUID}/proxy/myapi/users/42`;
const r = await fetch(url, {
  method: 'PUT',
  headers: {
    'Authorization': `Bearer ${process.env.PG_TOKEN}`,
    'Content-Type': 'application/json',
  },
  body: JSON.stringify({ name: 'Sam' }),
});
console.log(r.status, await r.json());
```

Request → auth (proxy scope) → rate limit → SSRF check on upstream URL → header build (forward + inject + OAuth) → forward to upstream → log to gateway_logs.

Note: the SSRF check runs again at proxy time, even though it already ran when the endpoint was created. This guards against DNS rebinding.
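One common way such a proxy-time guard works is to re-resolve the upstream host on every request and reject private, loopback, or link-local targets, so a DNS record that flips to an internal IP after endpoint creation is still caught. An illustrative sketch, not PromptGate's actual implementation:

```python
import ipaddress
import socket
from urllib.parse import urlsplit

def passes_ssrf_guard(upstream_url: str) -> bool:
    """Resolve the host and reject private, loopback, or link-local targets."""
    host = urlsplit(upstream_url).hostname
    if host is None:
        return False
    try:
        infos = socket.getaddrinfo(host, None)
    except socket.gaierror:
        return False
    for info in infos:
        ip = ipaddress.ip_address(info[4][0])
        if ip.is_private or ip.is_loopback or ip.is_link_local:
            return False  # re-checked per request to defeat DNS rebinding
    return True

print(passes_ssrf_guard("http://169.254.169.254/latest/meta-data/"))  # → False
```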

| Situation | Status | Body |
| --- | --- | --- |
| Successful proxy (upstream 2xx/3xx) | 200/201/etc. | passthrough |
| Method not in `allowed_methods` | 405 | `{ok:false, error:"Method 'POST' is not allowed for this endpoint."}` |
| SSRF guard blocks upstream URL | 422 | `{ok:false, error:"Upstream URL blocked by SSRF guard.", detail:"..."}` |
| OAuth refresh fails | 502 | `{ok:false, error:"Upstream OAuth connection failed.", detail:"..."}` |
| Upstream times out | 502 | `{ok:false, error:"Upstream unreachable.", detail:"..."}` |
| Upstream returns 4xx/5xx | passthrough | upstream's response body verbatim |
| Endpoint inactive / unknown slug | 404 | endpoint not found |
| Wrong scope | 403 | scope mismatch |
| Wrong project type | 400 | type mismatch |
| Per-minute rate limit hit | 429 | (with `Retry-After` header) |
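Because gateway-generated errors carry an `{ok: false, ...}` envelope while upstream errors pass through verbatim, a client can triage responses with a simple check (sketch; assumes the response shapes in the table above):

```python
def classify_proxy_response(status: int, body) -> str:
    """Rough client-side triage of a Proxy API response."""
    if isinstance(body, dict) and body.get("ok") is False:
        # Gateway-generated error: 405/422/502 etc. with an error envelope
        return f"gateway error: {body.get('error')}"
    if status >= 400:
        return "upstream error (body passed through verbatim)"
    return "ok"

print(classify_proxy_response(
    405, {"ok": False, "error": "Method 'POST' is not allowed for this endpoint."}))
# → gateway error: Method 'POST' is not allowed for this endpoint.
```

The 502 case is ambiguous on status alone (OAuth failure, upstream timeout, or an upstream's own 502), which is why checking the envelope rather than the status code is the reliable signal.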

Every proxy hit lands in gateway_logs:

| Column | Value |
| --- | --- |
| `provider_key` | `"http"` |
| `provider_model` | `"<METHOD> <STATUS>"` (e.g. `"GET 200"`, `"POST 502"`) |
| `latency_ms` | full round trip, including the upstream call |
| `status` | `"ok"` for 2xx/3xx, `"error"` for 4xx/5xx |
| `error_message` | populated on SSRF block, OAuth failure, upstream unreachable |

So Live Logs and Metrics work the same way they do for AI traffic — same UI, same filters.
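Since proxy hits share the log shape of AI traffic, the same aggregations apply to both. For example, a rough error-rate and latency summary over rows shaped like the columns above (the row dicts here are assumptions, not a real query result):

```python
def summarize(rows: list[dict]) -> dict:
    """Error rate and mean latency over gateway_logs-shaped rows."""
    if not rows:
        return {"count": 0, "error_rate": 0.0, "avg_latency_ms": 0.0}
    errors = sum(1 for r in rows if r["status"] == "error")
    return {
        "count": len(rows),
        "error_rate": errors / len(rows),
        "avg_latency_ms": sum(r["latency_ms"] for r in rows) / len(rows),
    }

rows = [
    {"provider_key": "http", "provider_model": "GET 200", "status": "ok", "latency_ms": 120},
    {"provider_key": "http", "provider_model": "POST 502", "status": "error", "latency_ms": 3000},
]
print(summarize(rows))
# → {'count': 2, 'error_rate': 0.5, 'avg_latency_ms': 1560.0}
```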

The proxy does not stream: the full upstream response is buffered before the client receives it. Streaming proxies are on the roadmap.

For SSE-emitting upstreams (e.g. proxying an OpenAI streaming endpoint via API Gateway), use the Wrapper API instead — it has true SSE support.


Next: MCP API.


© Akyros Labs LLC. All rights reserved.