o3 Pro
The o-series of models are trained with reinforcement learning to think before they answer and perform complex reasoning. The o3-pro model uses more compute to think harder and provide consistently better answers. Note that BYOK is required for this model. Set up here: https://openrouter.ai/settings/integrations
Pricing
Quick Start
To use o3 Pro through the BazaarLink API, simply change the base_url:

Python:
from openai import OpenAI
client = OpenAI(
base_url="https://bazaarlink.ai/api/v1",
api_key="sk-bl-YOUR_API_KEY",
)
response = client.chat.completions.create(
model="openai/o3-pro",
messages=[{"role": "user", "content": "Hello!"}],
)
print(response.choices[0].message.content)

TypeScript:

import OpenAI from "openai";
const client = new OpenAI({
baseURL: "https://bazaarlink.ai/api/v1",
apiKey: "sk-bl-YOUR_API_KEY",
});
const response = await client.chat.completions.create({
model: "openai/o3-pro",
messages: [{ role: "user", content: "Hello!" }],
});
console.log(response.choices[0].message.content);

Why Use o3 Pro Through BazaarLink?
- ✓ USD billing quoted in TWD, with Taiwan unified invoices (統一發票): no overseas credit card needed
- ✓ OpenAI-compatible API: no code changes required
- ✓ Automatic failover: multi-provider redundancy for the same model
- ✓ Customer support in Chinese: real-time help from a local team
Frequently Asked Questions
What is o3 Pro?
The o-series of models are trained with reinforcement learning to think before they answer and perform complex reasoning. The o3-pro model uses more compute to think harder and provide consistently better answers. Note that BYOK is required for this model. Set up here: https://openrouter.ai/settings/integrations
How much does the o3 Pro API cost?
o3 Pro costs $20.00 per million input tokens and $80.00 per million output tokens when accessed through BazaarLink.
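At those rates, the cost of a single request is straightforward to estimate from the token counts returned in the API response. A minimal sketch (the helper name and example token counts are hypothetical; rates are expressed per million tokens):

```python
# Hypothetical cost estimator at o3 Pro's published per-million-token rates
INPUT_PRICE_PER_M = 20.00   # USD per 1M input tokens
OUTPUT_PRICE_PER_M = 80.00  # USD per 1M output tokens

def estimate_cost(input_tokens: int, output_tokens: int) -> float:
    """Return the estimated USD cost for one request."""
    return (input_tokens * INPUT_PRICE_PER_M
            + output_tokens * OUTPUT_PRICE_PER_M) / 1_000_000

# e.g. a request with 5,000 input tokens and 2,000 output tokens:
print(round(estimate_cost(5_000, 2_000), 2))  # 0.26
```

In practice you would read the real counts from `response.usage.prompt_tokens` and `response.usage.completion_tokens` on the chat completion response.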
How do I use o3 Pro with the OpenAI SDK?
Set base_url to "https://bazaarlink.ai/api/v1" and use model ID "openai/o3-pro". All OpenAI SDK methods (chat.completions, embeddings, streaming) work without code changes.
What is the context window for o3 Pro?
o3 Pro supports a context window of 200,000 tokens.
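To stay inside that window, you can sanity-check a prompt's size before sending it. A rough sketch using a 4-characters-per-token heuristic (this ratio is an assumption, not a real tokenizer; use a library such as tiktoken for accurate counts):

```python
CONTEXT_WINDOW = 200_000  # tokens supported by o3 Pro
CHARS_PER_TOKEN = 4       # rough English-text heuristic, not a tokenizer

def rough_token_count(text: str) -> int:
    # Crude estimate: real token counts vary by language and content
    return max(1, len(text) // CHARS_PER_TOKEN)

def fits_in_context(prompt: str, reserved_output: int = 10_000) -> bool:
    # Leave headroom for the model's reasoning and output tokens
    return rough_token_count(prompt) + reserved_output <= CONTEXT_WINDOW

print(fits_in_context("Hello, o3 Pro!"))  # True
```

Reserving output headroom matters for reasoning models like o3 Pro, since their hidden reasoning tokens also count against the window.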
Is o3 Pro available for free?
o3 Pro is a paid model. BazaarLink offers free trial credits on registration so you can test it without a credit card.