
OpenAI-Compatible API

OpenAI SDK

Introduction

The OpenAI SDK (Software Development Kit) is a set of programming interfaces and tools, provided by OpenAI or the community, for calling the OpenAI API (GPT-series models, DALL·E, Whisper, and so on) from applications.

Installation

bash
pip install openai

OpenAI-Compatible API

API requests

The OpenAI-compatible /v1/chat/completions endpoint supports both streaming and non-streaming responses.

Request body format

Details

  • Endpoint: https://api.openai.com/v1/chat/completions
  • Method: POST
  • Authentication: request header Authorization: Bearer YOUR_API_KEY
  • Content type: Content-Type: application/json
  • Core request parameters: model (required, e.g. gpt-3.5-turbo), messages (required, an array of role messages: system/user/assistant), temperature (0–2, controls randomness), max_tokens (maximum generation length), stream (whether to stream the response), tools/function_call (tool calling)
  • Response structure: fields such as id, object, created, model, choices (containing the assistant message), and usage (token statistics)

Example request body

json
{
  "model": "gpt-3.5-turbo",
  "messages": [
    {"role": "system", "content": "You are a concise assistant"},
    {"role": "user", "content": "What is RAG?"}
  ],
  "stream": false,
  "temperature": 0.7
}
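For clients that do not use the SDK, the same body can be POSTed directly over HTTP. A minimal sketch using only the Python standard library (the API key is a placeholder, and the actual network call is left commented out):

```python
import json
import urllib.request

API_KEY = "YOUR_API_KEY"  # placeholder

payload = {
    "model": "gpt-3.5-turbo",
    "messages": [
        {"role": "system", "content": "You are a concise assistant"},
        {"role": "user", "content": "What is RAG?"},
    ],
    "stream": False,
    "temperature": 0.7,
}

# Build the POST request with the required headers
req = urllib.request.Request(
    "https://api.openai.com/v1/chat/completions",
    data=json.dumps(payload).encode("utf-8"),
    headers={
        "Authorization": f"Bearer {API_KEY}",
        "Content-Type": "application/json",
    },
    method="POST",
)

# Sending it requires a valid key and network access:
# with urllib.request.urlopen(req) as resp:
#     print(json.load(resp))
```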

Non-streaming responses

stream: false is the default.

Non-streaming request

python
from openai import OpenAI

MODEL = "deepseek-v3.2"
API_KEY = "sk-"
BASE_URL = "https://dashscope.aliyuncs.com/compatible-mode/v1"

# Create the OpenAI API client
client = OpenAI(api_key=API_KEY, base_url=BASE_URL)

# Call the model
response = client.chat.completions.create(
    model=MODEL,
    messages=[
        {
            "role": "system",
            "content": "You are a Python programming expert who wastes no words",
        },
        {
            "role": "assistant",
            "content": "OK, I am a programming expert of few words. What do you want to ask?",
        },
        {"role": "user", "content": "Compute and print the sum from 1 to 10 in Python"},
    ],
)

# Process the result
# print(response.choices[0].message.content)
# print(response)
# Print the response as JSON
print(response.to_json())

Example non-streaming response

json
{
  "id": "chatcmpl-37a1bacd-cff2-995f-ab95-21b3941b3148",
  "choices": [
    {
      "finish_reason": "stop",
      "index": 0,
      "logprobs": null,
      "message": {
        "content": "```python\n# 计算 1 到 10 的和\nresult = sum(range(1, 11))\nprint(result)\n```",
        "role": "assistant"
      }
    }
  ],
  "created": 1770544823,
  "model": "deepseek-v3.2",
  "object": "chat.completion",
  "system_fingerprint": null,
  "usage": {
    "completion_tokens": 29,
    "prompt_tokens": 42,
    "total_tokens": 71,
    "prompt_tokens_details": {
      "cached_tokens": 0
    }
  }
}
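In practice you usually read two things out of this structure: the assistant message and the token usage. A minimal sketch over the sample response above (the dict is trimmed to the fields used here; json.loads on the raw body would yield the same structure):

```python
# The sample response, trimmed to the fields read below
response = {
    "id": "chatcmpl-37a1bacd-cff2-995f-ab95-21b3941b3148",
    "choices": [
        {
            "finish_reason": "stop",
            "index": 0,
            "message": {
                "content": "result = sum(range(1, 11))\nprint(result)",
                "role": "assistant",
            },
        }
    ],
    "usage": {"completion_tokens": 29, "prompt_tokens": 42, "total_tokens": 71},
}

# The assistant's reply is the first choice's message content
answer = response["choices"][0]["message"]["content"]
# Token accounting for billing/monitoring
total_tokens = response["usage"]["total_tokens"]

print(answer)
print(total_tokens)  # 71
```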

Streaming responses

A streaming response (stream: true) uses the Server-Sent Events (SSE) format and returns the result chunk by chunk.

  • OpenAI's streaming endpoint pushes data over the SSE protocol; the SSE specification requires the response's Content-Type to be text/event-stream.
  • On the server side, FastAPI can implement this with StreamingResponse; a Vue frontend can receive it with EventSource.
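The server side boils down to a generator that formats each chunk as an SSE data: event. A sketch of that core (the names sse_stream and chunks are illustrative; the FastAPI wiring, which assumes FastAPI is installed, is shown in comments):

```python
import json

def sse_stream(chunks):
    """Yield each chunk dict as an SSE 'data:' event, terminated by [DONE]."""
    for chunk in chunks:
        yield f"data: {json.dumps(chunk, ensure_ascii=False)}\n\n"
    yield "data: [DONE]\n\n"

# With FastAPI, the generator would be wrapped like this:
# from fastapi.responses import StreamingResponse
# return StreamingResponse(sse_stream(chunks), media_type="text/event-stream")
```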

Streaming request

python
from openai import OpenAI

MODEL = "deepseek-v3.2"
API_KEY = "sk-"
BASE_URL = "https://dashscope.aliyuncs.com/compatible-mode/v1"

# Create the OpenAI API client
client = OpenAI(api_key=API_KEY, base_url=BASE_URL)

# Call the model
response = client.chat.completions.create(
    model=MODEL,
    messages=[
        {
            "role": "system",
            "content": "You are a Python programming expert",
        },
        {
            "role": "assistant",
            "content": "OK, I am a programming expert. What do you want to ask?",
        },
        {"role": "user", "content": "Compute and print the sum from 1 to 10 in Python"},
    ],
    stream=True,  # return the result as a stream
)

# Process the result chunk by chunk
for chunk in response:
    # print(chunk.choices[0].delta.content, end="", flush=True)
    print(chunk.to_dict())

Example streaming response

shell
data: {"id":"chatcmpl-123","object":"chat.completion.chunk","created":1677652288,"model":"my-llm","choices":[{"index":0,"delta":{"role":"assistant","content":""},"finish_reason":null}]}

data: {"id":"chatcmpl-123","object":"chat.completion.chunk","created":1677652288,"model":"my-llm","choices":[{"index":0,"delta":{"content":"你好"},"finish_reason":null}]}

data: {"id":"chatcmpl-123","object":"chat.completion.chunk","created":1677652288,"model":"my-llm","choices":[{"index":0,"delta":{"content":"!"},"finish_reason":null}]}

data: {"id":"chatcmpl-123","object":"chat.completion.chunk","created":1677652288,"model":"my-llm","choices":[{"index":0,"delta":{},"finish_reason":"stop"}]}

data: [DONE]
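A raw SSE client (without the SDK) strips the data: prefix from each line, stops at [DONE], and concatenates the delta contents. A sketch over the chunks shown above:

```python
import json

# The sample SSE lines from the example above, as a client would receive them
lines = [
    'data: {"choices":[{"index":0,"delta":{"role":"assistant","content":""},"finish_reason":null}]}',
    'data: {"choices":[{"index":0,"delta":{"content":"你好"},"finish_reason":null}]}',
    'data: {"choices":[{"index":0,"delta":{"content":"!"},"finish_reason":null}]}',
    'data: {"choices":[{"index":0,"delta":{},"finish_reason":"stop"}]}',
    "data: [DONE]",
]

def assemble(lines):
    """Strip the 'data: ' prefix, stop at [DONE], and join the delta contents."""
    parts = []
    for line in lines:
        if not line.startswith("data: "):
            continue  # skip blank keep-alive lines
        payload = line[len("data: "):]
        if payload == "[DONE]":
            break
        delta = json.loads(payload)["choices"][0]["delta"]
        parts.append(delta.get("content", ""))  # final chunk has no content
    return "".join(parts)

print(assemble(lines))  # 你好!
```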
