{ "cells": [ { "cell_type": "markdown", "metadata": {}, "source": [ "# 模型客户端\n", "\n", "AutoGen 提供了 {py:mod}`autogen_core.components.models` 模块,其中包含一套用于使用 ChatCompletion API 的内置模型客户端。所有模型客户端都实现了 {py:class}`~autogen_core.components.models.ChatCompletionClient` 协议类。\n", "\n", "## 内置模型客户端\n", "\n", "目前有两个内置模型客户端:{py:class}`~autogen_ext.models.OpenAIChatCompletionClient` 和 {py:class}`~autogen_ext.models.AzureOpenAIChatCompletionClient`。两个客户端都是异步的。\n", "\n", "要使用 {py:class}`~autogen_ext.models.OpenAIChatCompletionClient`,您需要通过环境变量 `OPENAI_API_KEY` 或通过 `api_key` 参数提供 API 密钥。" ] }, { "cell_type": "code", "execution_count": 2, "metadata": {}, "outputs": [], "source": [ "from autogen_core.components.models import UserMessage\n", "from autogen_ext.models import OpenAIChatCompletionClient\n", "\n", "# Create an OpenAI model client.\n", "model_client = OpenAIChatCompletionClient(\n", " model=\"gpt-4o\",\n", " api_key=\"sk-\", # Optional if you have an API key set in the environment.\n", ")" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "您可以调用 {py:meth}`~autogen_ext.models.OpenAIChatCompletionClient.create` 方法来创建一个聊天完成请求,并等待返回一个 {py:class}`~autogen_core.components.models.CreateResult` 对象。" ] }, { "cell_type": "code", "execution_count": 3, "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "The capital of France is Paris.\n" ] } ], "source": [ "# Send a message list to the model and await the response.\n", "messages = [\n", " UserMessage(content=\"What is the capital of France?\", source=\"user\"),\n", "]\n", "response = await model_client.create(messages=messages)\n", "\n", "# Print the response\n", "print(response.content)" ] }, { "cell_type": "code", "execution_count": 4, "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "RequestUsage(prompt_tokens=15, completion_tokens=7)\n" ] } ], "source": [ "# Print the response token usage\n", "print(response.usage)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "### 流式响应\n", "\n", "您可以使用 {py:meth}`~autogen_ext.models.OpenAIChatCompletionClient.create_streaming` 方法来创建一个带有流式响应的聊天完成请求。" ] }, { "cell_type": "code", "execution_count": 5, "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "Streamed responses:\n", "In a hidden valley surrounded by misty mountains, there lived a wise and gentle dragon named Zephyr. Unlike other dragons, Zephyr had shimmering emerald scales and eyes that sparkled like starlit skies. He spent his days guarding a secret garden brimming with vibrant flowers and ancient trees. \n", "\n", "Villagers from nearby towns whispered tales of Zephyr's garden, believing it to be enchanted. But only a pure heart could find the path through the dense woods. One day, a lost child named Amara stumbled upon the narrow trail leading to the garden. \n", "\n", "Zephyr, sensing her innocence, emerged gracefully, his wings casting a protective shadow. Instead of fear, Amara felt a warmth that echoed the kindness she saw in the dragon's eyes. Together, they played among the wildflowers, with Zephyr teaching her the language of the wind and secrets of the stars.\n", "\n", "As dusk fell, Zephyr led Amara back to her village, invisible to all eyes but hers. 
"As dusk fell, Zephyr led Amara back to her village, invisible to all eyes but hers. Grateful and eager to share her tale, Amara spoke of the dragon's gentle heart, and from that day forward, Zephyr's legend was told not as a tale of terror, but one of unlikely friendship and the magic of believing.\n", "\n", "------------\n", "\n", "The complete response:\n", "In a hidden valley surrounded by misty mountains, there lived a wise and gentle dragon named Zephyr. Unlike other dragons, Zephyr had shimmering emerald scales and eyes that sparkled like starlit skies. He spent his days guarding a secret garden brimming with vibrant flowers and ancient trees. \n", "\n", "Villagers from nearby towns whispered tales of Zephyr's garden, believing it to be enchanted. But only a pure heart could find the path through the dense woods. One day, a lost child named Amara stumbled upon the narrow trail leading to the garden. \n", "\n", "Zephyr, sensing her innocence, emerged gracefully, his wings casting a protective shadow. Instead of fear, Amara felt a warmth that echoed the kindness she saw in the dragon's eyes. Together, they played among the wildflowers, with Zephyr teaching her the language of the wind and secrets of the stars.\n", "\n", "As dusk fell, Zephyr led Amara back to her village, invisible to all eyes but hers. Grateful and eager to share her tale, Amara spoke of the dragon's gentle heart, and from that day forward, Zephyr's legend was told not as a tale of terror, but one of unlikely friendship and the magic of believing.\n", "\n", "\n", "------------\n", "\n", "The token usage was:\n", "RequestUsage(prompt_tokens=0, completion_tokens=0)\n" ] } ], "source": [ "messages = [\n", "    UserMessage(content=\"Write a very short story about a dragon.\", source=\"user\"),\n", "]\n", "\n", "# Create a stream.\n", "stream = model_client.create_stream(messages=messages)\n", "\n", "# Iterate over the stream and print the responses.\n", "print(\"Streamed responses:\")\n", "async for response in stream:  # type: ignore\n", "    if isinstance(response, str):\n", "        # A partial response is a string.\n", "        print(response, flush=True, end=\"\")\n", "    else:\n", "        # The last response is a CreateResult object with the complete message.\n", "        print(\"\\n\\n------------\\n\")\n", "        print(\"The complete response:\", flush=True)\n", "        print(response.content, flush=True)\n", "        print(\"\\n\\n------------\\n\")\n", "        print(\"The token usage was:\", flush=True)\n", "        print(response.usage, flush=True)" ] },
{ "cell_type": "markdown", "metadata": {}, "source": [ "```{note}\n", "The last response in a streaming response is always the final response of type {py:class}`~autogen_core.components.models.CreateResult`.\n", "```\n", "\n", "**Note: the default usage response is to return zero values.**" ] },
{ "cell_type": "markdown", "metadata": {}, "source": [ "### A Note on Token Usage Counts in the Streaming Example\n", "Comparing the usage returns of the non-streaming `model_client.create(messages=messages)` above with the streaming `model_client.create_stream(messages=messages)`, we see a difference.\n", "The non-streaming response by default returns valid prompt and completion token usage counts.\n", "The streamed response by default returns zero values.\n", "\n", "As described in the OpenAI API reference, an additional parameter `stream_options` can be specified to return valid usage counts. See [stream_options](https://platform.openai.com/docs/api-reference/chat/create#chat-create-stream_options).\n", "\n", "Only set this parameter when using streaming, i.e. when using `create_stream`.\n", "\n", "To enable this in `create_stream`, set `extra_create_args={\"stream_options\": {\"include_usage\": True}}`.\n", "\n", "- **Note that although other APIs, such as LiteLLM, also support this, full or correct support is not always guaranteed.**\n", "\n", "#### Streaming example with token usage" ] },
{ "cell_type": "code", "execution_count": 7, "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "Streamed responses:\n",
"In a lush, emerald valley hidden by towering peaks, there lived a dragon named Ember. Unlike others of her kind, Ember cherished solitude over treasure, and the songs of the stream over the roar of flames. One misty dawn, a young shepherd stumbled into her sanctuary, lost and frightened. \n", "\n", "Instead of fury, he was met with kindness as Ember extended a wing, guiding him back to safety. In gratitude, the shepherd visited yearly, bringing tales of his world beyond the mountains. Over time, a friendship blossomed, binding man and dragon in shared stories and laughter.\n", "\n", "As the years passed, the legend of Ember the gentle-hearted spread far and wide, forever changing the way dragons were seen in the hearts of many.\n", "\n", "------------\n", "\n", "The complete response:\n", "In a lush, emerald valley hidden by towering peaks, there lived a dragon named Ember. Unlike others of her kind, Ember cherished solitude over treasure, and the songs of the stream over the roar of flames. One misty dawn, a young shepherd stumbled into her sanctuary, lost and frightened. \n", "\n", "Instead of fury, he was met with kindness as Ember extended a wing, guiding him back to safety. In gratitude, the shepherd visited yearly, bringing tales of his world beyond the mountains. Over time, a friendship blossomed, binding man and dragon in shared stories and laughter.\n", "\n", "As the years passed, the legend of Ember the gentle-hearted spread far and wide, forever changing the way dragons were seen in the hearts of many.\n", "\n", "\n", "------------\n", "\n", "The token usage was:\n", "RequestUsage(prompt_tokens=17, completion_tokens=146)\n" ] } ], "source": [ "messages = [\n", "    UserMessage(content=\"Write a very short story about a dragon.\", source=\"user\"),\n", "]\n", "\n", "# Create a stream.\n", "stream = model_client.create_stream(messages=messages, extra_create_args={\"stream_options\": {\"include_usage\": True}})\n", "\n", "# Iterate over the stream and print the responses.\n", "print(\"Streamed responses:\")\n", "async for response in stream:  # type: ignore\n", "    if isinstance(response, str):\n", "        # A partial response is a string.\n", "        print(response, flush=True, end=\"\")\n", "    else:\n", "        # The last response is a CreateResult object with the complete message.\n", "        print(\"\\n\\n------------\\n\")\n", "        print(\"The complete response:\", flush=True)\n", "        print(response.content, flush=True)\n", "        print(\"\\n\\n------------\\n\")\n", "        print(\"The token usage was:\", flush=True)\n", "        print(response.usage, flush=True)" ] },
{ "cell_type": "markdown", "metadata": {}, "source": [ "### Azure OpenAI\n", "\n", "To use the {py:class}`~autogen_ext.models.AzureOpenAIChatCompletionClient`, you need to provide the deployment id, Azure Cognitive Services endpoint, api version, and model capabilities. For authentication, you can either provide an API key or an Azure Active Directory (AAD) token credential. To use AAD authentication, you need to first install the `azure-identity` package." ] },
{ "cell_type": "code", "execution_count": 1, "metadata": { "vscode": { "languageId": "shellscript" } }, "outputs": [], "source": [ "# pip install azure-identity" ] },
{ "cell_type": "markdown", "metadata": {}, "source": [ "The following code snippet shows how to use AAD authentication. The identity used must be assigned the [**Cognitive Services OpenAI User**](https://learn.microsoft.com/en-us/azure/ai-services/openai/how-to/role-based-access-control#cognitive-services-openai-user) role." ] },
{ "cell_type": "code", "execution_count": 15, "metadata": {}, "outputs": [], "source": [ "from autogen_ext.models import AzureOpenAIChatCompletionClient\n", "from azure.identity import DefaultAzureCredential, get_bearer_token_provider\n", "\n", "# Create the token provider\n",
\"https://cognitiveservices.azure.com/.default\")\n", "\n", "az_model_client = AzureOpenAIChatCompletionClient(\n", " model=\"{your-azure-deployment}\",\n", " api_version=\"2024-06-01\",\n", " azure_endpoint=\"https://{your-custom-endpoint}.openai.azure.com/\",\n", " azure_ad_token_provider=token_provider, # Optional if you choose key-based authentication.\n", " # api_key=\"sk-...\", # For key-based authentication.\n", " model_capabilities={\n", " \"vision\": True,\n", " \"function_calling\": True,\n", " \"json_output\": True,\n", " },\n", ")" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "```{note}\n", "参见[此处](https://learn.microsoft.com/en-us/azure/ai-services/openai/how-to/managed-identity#chat-completions)了解如何直接使用 Azure 客户端或获取更多信息。\n", "```\n", "\n", "## 使用模型客户端构建代理\n", "\n", "让我们创建一个简单的 AI 代理,它可以使用 ChatCompletion API 响应消息。" ] }, { "cell_type": "code", "execution_count": 4, "metadata": {}, "outputs": [], "source": [ "from dataclasses import dataclass\n", "\n", "from autogen_core.application import SingleThreadedAgentRuntime\n", "from autogen_core.base import MessageContext\n", "from autogen_core.components import RoutedAgent, message_handler\n", "from autogen_core.components.models import ChatCompletionClient, SystemMessage, UserMessage\n", "from autogen_ext.models import OpenAIChatCompletionClient\n", "\n", "\n", "@dataclass\n", "class Message:\n", " content: str\n", "\n", "\n", "class SimpleAgent(RoutedAgent):\n", " def __init__(self, model_client: ChatCompletionClient) -> None:\n", " super().__init__(\"A simple agent\")\n", " self._system_messages = [SystemMessage(\"You are a helpful AI assistant.\")]\n", " self._model_client = model_client\n", "\n", " @message_handler\n", " async def handle_user_message(self, message: Message, ctx: MessageContext) -> Message:\n", " # Prepare input to the chat completion model.\n", " user_message = UserMessage(content=message.content, source=\"user\")\n", " response = await self._model_client.create(\n", " self._system_messages + [user_message], cancellation_token=ctx.cancellation_token\n", " )\n", " # Return with the model's response.\n", " assert isinstance(response.content, str)\n", " return Message(content=response.content)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "`SimpleAgent` 类是 {py:class}`autogen_core.components.RoutedAgent` 类的子类,用于方便地自动将消息路由到适当的处理程序。它有一个单一的处理程序 `handle_user_message`,用于处理来自用户的消息。它使用 `ChatCompletionClient` 生成对消息的响应。然后它按照直接通信模型将响应返回给用户。\n", "\n", "```{note}\n", "类型为 {py:class}`autogen_core.base.CancellationToken` 的 `cancellation_token` 用于取消异步操作。它与消息处理程序内的异步调用相关联,调用者可以使用它来取消处理程序。\n", "```" ] }, { "cell_type": "code", "execution_count": 6, "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "Seattle is a vibrant city with a wide range of activities and attractions. Here are some fun things to do in Seattle:\n", "\n", "1. **Space Needle**: Visit this iconic observation tower for stunning views of the city and surrounding mountains.\n", "\n", "2. **Pike Place Market**: Explore this historic market where you can see the famous fish toss, buy local produce, and find unique crafts and eateries.\n", "\n", "3. **Museum of Pop Culture (MoPOP)**: Dive into the world of contemporary culture, music, and science fiction at this interactive museum.\n", "\n", "4. **Chihuly Garden and Glass**: Marvel at the beautiful glass art installations by artist Dale Chihuly, located right next to the Space Needle.\n", "\n", "5. 
{ "cell_type": "code", "execution_count": 6, "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "Seattle is a vibrant city with a wide range of activities and attractions. Here are some fun things to do in Seattle:\n", "\n", "1. **Space Needle**: Visit this iconic observation tower for stunning views of the city and surrounding mountains.\n", "\n", "2. **Pike Place Market**: Explore this historic market where you can see the famous fish toss, buy local produce, and find unique crafts and eateries.\n", "\n", "3. **Museum of Pop Culture (MoPOP)**: Dive into the world of contemporary culture, music, and science fiction at this interactive museum.\n", "\n", "4. **Chihuly Garden and Glass**: Marvel at the beautiful glass art installations by artist Dale Chihuly, located right next to the Space Needle.\n", "\n", "5. **Seattle Aquarium**: Discover the diverse marine life of the Pacific Northwest at this engaging aquarium.\n", "\n", "6. **Seattle Art Museum**: Explore a vast collection of art from around the world, including contemporary and indigenous art.\n", "\n", "7. **Kerry Park**: For one of the best views of the Seattle skyline, head to this small park on Queen Anne Hill.\n", "\n", "8. **Ballard Locks**: Watch boats pass through the locks and observe the salmon ladder to see salmon migrating.\n", "\n", "9. **Ferry to Bainbridge Island**: Take a scenic ferry ride across Puget Sound to enjoy charming shops, restaurants, and beautiful natural scenery.\n", "\n", "10. **Olympic Sculpture Park**: Stroll through this outdoor park with large-scale sculptures and stunning views of the waterfront and mountains.\n", "\n", "11. **Underground Tour**: Discover Seattle's history on this quirky tour of the city's underground passageways in Pioneer Square.\n", "\n", "12. **Seattle Waterfront**: Enjoy the shops, restaurants, and attractions along the waterfront, including the Seattle Great Wheel and the aquarium.\n", "\n", "13. **Discovery Park**: Explore the largest green space in Seattle, featuring trails, beaches, and views of Puget Sound.\n", "\n", "14. **Food Tours**: Try out Seattle’s diverse culinary scene, including fresh seafood, international cuisines, and coffee culture (don’t miss the original Starbucks!).\n", "\n", "15. **Attend a Sports Game**: Catch a Seahawks (NFL), Mariners (MLB), or Sounders (MLS) game for a lively local experience.\n", "\n", "Whether you're interested in culture, nature, food, or history, Seattle has something for everyone to enjoy!\n" ] } ], "source": [ "# Create the runtime and register the agent.\n", "from autogen_core.base import AgentId\n", "\n", "runtime = SingleThreadedAgentRuntime()\n", "await SimpleAgent.register(\n", "    runtime,\n", "    \"simple_agent\",\n", "    lambda: SimpleAgent(\n", "        OpenAIChatCompletionClient(\n", "            model=\"gpt-4o-mini\",\n", "            # api_key=\"sk-...\", # Optional if you have an OPENAI_API_KEY set in the environment.\n", "        )\n", "    ),\n", ")\n", "# Start the runtime processing messages.\n", "runtime.start()\n", "# Send a message to the agent and get the response.\n", "message = Message(\"Hello, what are some fun things to do in Seattle?\")\n", "response = await runtime.send_message(message, AgentId(\"simple_agent\", \"default\"))\n", "print(response.content)\n", "# Stop the runtime processing messages.\n", "await runtime.stop()" ] },
{ "cell_type": "markdown", "metadata": {}, "source": [ "## Manage Model Context\n", "\n", "The above `SimpleAgent` always responds with a fresh context that contains only the system message and the latest user message. We can use model context classes from {py:mod}`autogen_core.components.model_context` to make the agent \"remember\" previous conversations. A model context supports storage and retrieval of chat completion messages. It is always used together with a model client to generate LLM-based responses.\n", "\n", "For example, {py:class}`~autogen_core.components.model_context.BufferedChatCompletionContext` is a most-recently-used (MRU) context that stores the most recent `buffer_size` number of messages. This is useful to avoid context overflow in many LLMs.\n", "\n", "Below, a short standalone sketch first shows the buffer behavior, and then we update the previous example to use {py:class}`~autogen_core.components.model_context.BufferedChatCompletionContext`." ] },
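{ "cell_type": "markdown", "metadata": {}, "source": [ "The following sketch (not executed here) shows the buffer behavior on its own, using only the `add_message` and `get_messages` methods that the updated agent below also relies on; the messages and the `buffer_size` value are illustrative only." ] },
{ "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "# A minimal sketch of BufferedChatCompletionContext used on its own.\n", "from autogen_core.components.model_context import BufferedChatCompletionContext\n", "from autogen_core.components.models import AssistantMessage, UserMessage\n", "\n", "context = BufferedChatCompletionContext(buffer_size=2)\n", "await context.add_message(UserMessage(content=\"What is the capital of France?\", source=\"user\"))\n", "await context.add_message(AssistantMessage(content=\"The capital of France is Paris.\", source=\"assistant\"))\n", "await context.add_message(UserMessage(content=\"And of Japan?\", source=\"user\"))\n", "\n", "# With buffer_size=2, only the two most recent messages should be returned.\n", "print(await context.get_messages())" ] },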
model_client\n", " self._model_context = BufferedChatCompletionContext(buffer_size=5)\n", "\n", " @message_handler\n", " async def handle_user_message(self, message: Message, ctx: MessageContext) -> Message:\n", " # Prepare input to the chat completion model.\n", " user_message = UserMessage(content=message.content, source=\"user\")\n", " # Add message to model context.\n", " await self._model_context.add_message(user_message)\n", " # Generate a response.\n", " response = await self._model_client.create(\n", " self._system_messages + (await self._model_context.get_messages()),\n", " cancellation_token=ctx.cancellation_token,\n", " )\n", " # Return with the model's response.\n", " assert isinstance(response.content, str)\n", " # Add message to model context.\n", " await self._model_context.add_message(AssistantMessage(content=response.content, source=self.metadata[\"type\"]))\n", " return Message(content=response.content)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "现在让我们尝试在第一个问题之后提出后续问题。" ] }, { "cell_type": "code", "execution_count": 10, "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "Question: Hello, what are some fun things to do in Seattle?\n", "Response: Seattle offers a wide variety of fun activities and attractions for visitors. Here are some highlights:\n", "\n", "1. **Pike Place Market**: Explore this iconic market, where you can find fresh produce, unique crafts, and the famous fish-throwing vendors. Don’t forget to visit the original Starbucks!\n", "\n", "2. **Space Needle**: Enjoy breathtaking views of the city and Mount Rainier from the observation deck of this iconic structure. You can also dine at the SkyCity restaurant.\n", "\n", "3. **Chihuly Garden and Glass**: Admire the stunning glass art installations created by artist Dale Chihuly. The garden and exhibit are particularly beautiful, especially in good weather.\n", "\n", "4. **Museum of Pop Culture (MoPOP)**: Dive into the world of music, science fiction, and pop culture through interactive exhibits and memorabilia.\n", "\n", "5. **Seattle Aquarium**: Located on the waterfront, the aquarium features a variety of marine life native to the Pacific Northwest, including otters and diving birds.\n", "\n", "6. **Seattle Art Museum (SAM)**: Explore a diverse collection of art from around the world, including Native American art and contemporary pieces.\n", "\n", "7. **Ballard Locks**: Watch boats travel between the Puget Sound and Lake Union, and see salmon navigating the fish ladder during spawning season.\n", "\n", "8. **Fremont Troll**: Visit this quirky public art installation located under the Aurora Bridge, where you can take fun photos with the giant troll.\n", "\n", "9. **Kerry Park**: For a picturesque view of the Seattle skyline, head to Kerry Park on Queen Anne Hill, especially at sunset.\n", "\n", "10. **Take a Ferry Ride**: Enjoy the scenic views while taking a ferry to nearby Bainbridge Island or Vashon Island for a relaxing day trip.\n", "\n", "11. **Underground Tour**: Explore Seattle’s history on an entertaining underground tour in Pioneer Square, where you’ll learn about the city’s early days.\n", "\n", "12. **Attend a Sporting Event**: Depending on the season, catch a Seattle Seahawks (NFL) game, a Seattle Mariners (MLB) game, or a Seattle Sounders (MLS) match.\n", "\n", "13. **Explore Discovery Park**: Enjoy nature with hiking trails, beach access, and stunning views of the Puget Sound and Olympic Mountains.\n", "\n", "14. 
"14. **West Seattle’s Alki Beach**: Relax at this beach with beautiful views of the Seattle skyline and enjoy beachside activities like biking or kayaking.\n", "\n", "15. **Dining and Craft Beer**: Seattle has a vibrant food scene and is known for its seafood, coffee culture, and craft breweries. Make sure to explore local restaurants and breweries.\n", "\n", "There’s something for everyone in Seattle, whether you’re interested in nature, art, history, or food!\n", "-----\n", "Question: What was the first thing you mentioned?\n", "Response: The first thing I mentioned was **Pike Place Market**, an iconic market in Seattle where you can find fresh produce, unique crafts, and experience the famous fish-throwing vendors. It's also home to the original Starbucks and various charming shops and eateries.\n" ] } ], "source": [ "runtime = SingleThreadedAgentRuntime()\n", "await SimpleAgentWithContext.register(\n", "    runtime,\n", "    \"simple_agent_context\",\n", "    lambda: SimpleAgentWithContext(\n", "        OpenAIChatCompletionClient(\n", "            model=\"gpt-4o-mini\",\n", "            # api_key=\"sk-...\", # Optional if you have an OPENAI_API_KEY set in the environment.\n", "        )\n", "    ),\n", ")\n", "# Start the runtime processing messages.\n", "runtime.start()\n", "agent_id = AgentId(\"simple_agent_context\", \"default\")\n", "\n", "# First question.\n", "message = Message(\"Hello, what are some fun things to do in Seattle?\")\n", "print(f\"Question: {message.content}\")\n", "response = await runtime.send_message(message, agent_id)\n", "print(f\"Response: {response.content}\")\n", "print(\"-----\")\n", "\n", "# Second question.\n", "message = Message(\"What was the first thing you mentioned?\")\n", "print(f\"Question: {message.content}\")\n", "response = await runtime.send_message(message, agent_id)\n", "print(f\"Response: {response.content}\")\n", "\n", "# Stop the runtime processing messages.\n", "await runtime.stop()" ] },
{ "cell_type": "markdown", "metadata": {}, "source": [ "From the second response, you can see that the agent can now recall its own previous responses." ] } ], "metadata": { "kernelspec": { "display_name": ".venv", "language": "python", "name": "python3" }, "language_info": { "codemirror_mode": { "name": "ipython", "version": 3 }, "file_extension": ".py", "mimetype": "text/x-python", "name": "python", "nbconvert_exporter": "python", "pygments_lexer": "ipython3", "version": "3.12.3" } }, "nbformat": 4, "nbformat_minor": 2 }