Framework Examples

The Radicalbit AI Gateway is fully compatible with the OpenAI standard. This means you don't need to change your application code — just point your LLM client to the Gateway by updating three values:

Parameter   Value
base_url    Your Gateway URL (e.g., http://localhost:9000/v1)
api_key     Your Gateway API Key (generated from the UI)
model       project-name/route-name — the project and route defined in your config.yaml

The examples below show how to do this with the most common Python frameworks.


Gateway Configuration

All the framework examples on this page work with the same config.yaml. You only need to define one model and one route:

chat_models:
  - model_id: gpt-5.1-assistant
    model: openai/gpt-5.1
    credentials:
      api_key: !secret OPENAI_API_KEY
    params:
      temperature: 0.7
      max_tokens: 500

routes:
  my-assistant:
    chat_models:
      - gpt-5.1-assistant

Routes are accessed using the format project-name/route-name. If your project is called my-project and your route is my-assistant, you pass my-project/my-assistant as the model parameter. The Gateway handles the rest.
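Because the Gateway speaks the OpenAI chat-completions protocol, a route can also be called with plain HTTP, no SDK required. The sketch below uses only the Python standard library; the URL, API key, and project/route names are the placeholder values used throughout this page.

```python
# Sketch: an OpenAI-style chat completion request to the Gateway, built
# with the standard library only. URL and API key are placeholders.
import json
import urllib.request

GATEWAY_URL = "http://localhost:9000/v1/chat/completions"
API_KEY = "your-gateway-api-key"

def build_request(project: str, route: str, messages: list) -> urllib.request.Request:
    """Build a chat completion request addressed to a Gateway route."""
    payload = {
        "model": f"{project}/{route}",  # project-name/route-name
        "messages": messages,
    }
    return urllib.request.Request(
        GATEWAY_URL,
        data=json.dumps(payload).encode(),
        headers={
            "Authorization": f"Bearer {API_KEY}",
            "Content-Type": "application/json",
        },
    )

req = build_request(
    "my-project", "my-assistant",
    [{"role": "user", "content": "What is the capital of France?"}],
)
# urllib.request.urlopen(req) would send it; this requires a running Gateway.
```

Any other OpenAI-compatible client works the same way: it only has to send the standard request body to the Gateway URL with the Gateway API key.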


Chat Completion

The OpenAI Python SDK works out of the box. Pass base_url and api_key to the client constructor.

from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:9000/v1",
    api_key="your-gateway-api-key",
)

response = client.chat.completions.create(
    model="my-project/my-assistant",  # project-name/route-name
    messages=[
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "What is the capital of France?"},
    ],
)

print(response.choices[0].message.content)

Streaming

The Gateway supports streaming responses. Here is an example using the OpenAI SDK — the pattern is identical in other OpenAI-compatible frameworks:

from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:9000/v1",
    api_key="your-gateway-api-key",
)

stream = client.chat.completions.create(
    model="my-project/my-assistant",
    messages=[{"role": "user", "content": "Tell me a short story."}],
    stream=True,
)

for chunk in stream:
    delta = chunk.choices[0].delta.content
    if delta:
        print(delta, end="", flush=True)
</chunk reformat>
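Beyond printing as chunks arrive, a common pattern is to accumulate the deltas into the full reply. The helper below is a hypothetical utility (not part of the OpenAI SDK) that makes the pattern explicit; note it skips the None deltas the API emits for role and finish events.

```python
from typing import Iterable, Optional

def accumulate_deltas(deltas: Iterable[Optional[str]]) -> str:
    """Join streamed content deltas into the full message text,
    skipping the None chunks emitted for role/finish events."""
    return "".join(d for d in deltas if d)

# With a real stream you would pass:
#   (chunk.choices[0].delta.content for chunk in stream)
print(accumulate_deltas(["Once", " upon", None, " a time."]))
# → Once upon a time.
```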

Next Steps