OpenAI's official Python SDK ships with a built-in asynchronous client, AsyncOpenAI. Its usage is almost identical to the synchronous client; the only difference is that you await the method calls.
1. A single asynchronous call
import os
import asyncio
from openai import AsyncOpenAI

# Set the API key; AsyncOpenAI() reads OPENAI_API_KEY from the environment by default
os.environ["OPENAI_API_KEY"] = "your key"

client = AsyncOpenAI()

async def main():
    resp = await client.responses.create(
        model="gpt-5",  # swap in the model you actually use
        input="Explain async/await in one sentence.",
    )
    # The SDK's output_text property concatenates the text output for you
    print(resp.output_text)

if __name__ == "__main__":
    asyncio.run(main())
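For comparison, here is a minimal sketch of the same call using the synchronous OpenAI client; the structure is identical, just without async/await (the model name and prompt are placeholders):

from openai import OpenAI

client = OpenAI()  # synchronous client; also reads OPENAI_API_KEY from the environment

resp = client.responses.create(
    model="gpt-5",  # swap in the model you actually use
    input="Explain async/await in one sentence.",
)
print(resp.output_text)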
2. Concurrent calls for multiple questions
import asyncio
from openai import AsyncOpenAI

client = AsyncOpenAI()

async def ask(prompt: str) -> str:
    resp = await client.responses.create(
        model="gpt-5",
        input=prompt,
    )
    return resp.output_text

async def main():
    prompts = [
        "Suggest three team-building activities",
        "Explain what a Transformer model is",
        "Give me five candidate names for a product",
    ]
    # Build a batch of coroutines and run them concurrently
    tasks = [ask(p) for p in prompts]
    results = await asyncio.gather(*tasks)
    for i, answer in enumerate(results):
        print(f"\n=== Question {i+1} ===")
        print(answer)

asyncio.run(main())
ask() is itself an async function, so the API call inside it can be awaited. asyncio.gather(*tasks) runs all of the requests concurrently, as long as your backend and rate limits allow it.
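If you need to cap how many requests are in flight at once (for example, to stay under a rate limit), a minimal sketch is to wrap the API call in an asyncio.Semaphore; the limit of 3 below is an arbitrary placeholder:

import asyncio
from openai import AsyncOpenAI

client = AsyncOpenAI()
semaphore = asyncio.Semaphore(3)  # at most 3 requests in flight at a time (placeholder value)

async def ask(prompt: str) -> str:
    # Once 3 requests are already running, this waits until one finishes
    async with semaphore:
        resp = await client.responses.create(
            model="gpt-5",
            input=prompt,
        )
        return resp.output_text

async def main():
    prompts = [f"Question {i}" for i in range(10)]
    results = await asyncio.gather(*(ask(p) for p in prompts))
    for answer in results:
        print(answer)

asyncio.run(main())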