Apigee AI Gateway for ADK Agents

Supported in ADK: Python v1.18.0, Java v0.4.0

Apigee provides a powerful AI Gateway that transforms how you manage and govern your generative AI model traffic. By exposing your AI model endpoint (like Vertex AI) through an Apigee proxy, you immediately gain enterprise-grade capabilities:

  • Model security: Apply security policies, such as Model Armor for threat protection.

  • Traffic governance: Enforce rate limits and token limits to manage cost and prevent abuse.

  • Performance: Improve response times and efficiency with semantic caching and advanced model routing.

  • Monitoring and visibility: Get fine-grained monitoring, analytics, and auditing of all AI requests.
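Because the proxy fronts the model's own interface, a request through the gateway has the same shape as a direct generateContent call; only the host changes. Below is a minimal sketch of building such a request. The proxy URL, API-key header name (x-apikey), and payload values are illustrative assumptions, not fixed contracts; substitute your deployment's actual values.

```python
import json

# Hypothetical proxy base URL and model route; use your deployment's values.
proxy_url = "https://my-org.apigee.net/v1/llm"
model = "gemini-2.5-flash"

# Standard Gemini generateContent payload. The proxy forwards it unchanged
# after its policies (security, rate limiting, logging) have run.
payload = {
    "contents": [
        {"role": "user", "parts": [{"text": "Hello!"}]}
    ]
}

request_url = f"{proxy_url}/models/{model}:generateContent"
headers = {"Content-Type": "application/json", "x-apikey": "YOUR_API_KEY"}

# Sending requires a deployed proxy, e.g.:
#   requests.post(request_url, headers=headers, data=json.dumps(payload))
```

In practice you rarely build these requests by hand; the ApigeeLlm wrapper shown below does this for you.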

Note

The ApigeeLlm wrapper is currently designed to work with Vertex AI and the Gemini API (generateContent). We are continually expanding support to additional models and interfaces.

Example Implementation

Integrate Apigee's governance into your agent's workflow by instantiating an ApigeeLlm wrapper object and passing it to an LlmAgent or another agent type.

Python

from google.adk.agents import LlmAgent
from google.adk.models.apigee_llm import ApigeeLlm

# Instantiate the ApigeeLlm wrapper
model = ApigeeLlm(
    # Specify the Apigee route to your model. For more info, see the ApigeeLlm documentation (https://github.com/google/adk-python/tree/main/contributing/samples/hello_world_apigeellm).
    model="apigee/gemini-2.5-flash",
    # The proxy URL of your deployed Apigee proxy including the base path
    proxy_url=f"https://{APIGEE_PROXY_URL}",
    # Pass necessary authentication/authorization headers (like an API key)
    custom_headers={"foo": "bar"}
)

# Pass the configured model wrapper to your LlmAgent
agent = LlmAgent(
    model=model,
    name="my_governed_agent",
    instruction="You are a helpful assistant powered by Gemini and governed by Apigee.",
    # ... other agent parameters
)

Java
import com.google.adk.agents.LlmAgent;
import com.google.adk.models.ApigeeLlm;
import com.google.common.collect.ImmutableMap;

ApigeeLlm apigeeLlm =
        ApigeeLlm.builder()
            .modelName("apigee/gemini-2.5-flash") // Specify the Apigee route to your model. For more info, see the ApigeeLlm documentation
            .proxyUrl(APIGEE_PROXY_URL) // The proxy URL of your deployed Apigee proxy, including the base path
            .customHeaders(ImmutableMap.of("foo", "bar")) // Pass necessary authentication/authorization headers (like an API key)
            .build();
LlmAgent agent =
    LlmAgent.builder()
        .model(apigeeLlm)
        .name("my_governed_agent")
        .description("my_governed_agent")
        .instruction("You are a helpful assistant powered by Gemini and governed by Apigee.")
        // tools will be added next
        .build();

With this configuration, every API call from your agent will be routed through Apigee first, where all necessary policies (security, rate limiting, logging) are executed before the request is securely forwarded to the underlying AI model endpoint. For a full code example using the Apigee proxy, see Hello World Apigee LLM.
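As the code comments above note, authentication details for the proxy travel in custom_headers. One way to wire those up from the environment is sketched below; the x-apikey header name and the environment variable names are assumptions, so check what your proxy's security policy actually expects.

```python
import os

# Hypothetical: many Apigee proxies verify an API key passed in an
# "x-apikey" header; confirm the header name your proxy is configured for.
proxy_url = os.environ.get("APIGEE_PROXY_URL", "https://my-org.apigee.net/v1/llm")
custom_headers = {"x-apikey": os.environ.get("APIGEE_API_KEY", "dummy-key")}

# These values would then be passed to the ApigeeLlm constructor shown above:
#   model = ApigeeLlm(model="apigee/gemini-2.5-flash",
#                     proxy_url=proxy_url,
#                     custom_headers=custom_headers)
```

Keeping the key in an environment variable (rather than hard-coding it, as the "foo": "bar" placeholder above stands in for) keeps credentials out of source control.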