Google Cloud's Apigee includes AI gateway capabilities for managing and securing generative AI API traffic, with model routing, token-based rate limiting, content moderation, and comprehensive analytics.
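From the application's point of view, an Apigee-style AI gateway is just another HTTPS endpoint: the client calls the managed proxy, and routing, quotas, and moderation are enforced by policies on the gateway side. The sketch below illustrates that shape under stated assumptions; the proxy URL, the `x-apikey` header, and the request payload are placeholders that depend entirely on how the API proxy is configured.

```python
# Minimal sketch of calling an LLM through an Apigee-fronted proxy.
# The proxy hostname, path, API-key header, and payload shape are
# placeholders -- they depend on how the API proxy is configured.
import os

import requests

APIGEE_PROXY_URL = "https://api.example.com/v1/llm/chat"  # hypothetical proxy endpoint
API_KEY = os.environ["APIGEE_API_KEY"]                     # key issued by the API team

response = requests.post(
    APIGEE_PROXY_URL,
    headers={"x-apikey": API_KEY},      # a common Apigee convention; configurable per proxy
    json={
        "model": "gemini-1.5-flash",    # model routing and token quotas enforced by gateway policies
        "messages": [{"role": "user", "content": "Summarize our Q3 report."}],
    },
    timeout=30,
)
response.raise_for_status()
print(response.json())
```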
Helicone is an open-source LLM observability and proxy platform. By changing a single line of code (typically pointing the SDK's base URL at Helicone's proxy), developers get request logging, cost tracking, caching, rate limiting, and analytics for their LLM applications. Helicone supports all major LLM providers and can function as both a gateway proxy and a logging-only integration.
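For the proxy-style integration with the OpenAI Python SDK, that one-line change amounts to swapping the base URL and passing a Helicone auth header. The sketch below follows Helicone's documented pattern, but the exact base URL and header values for your provider should be taken from the current docs.

```python
# Sketch of Helicone's proxy-style integration with the OpenAI Python SDK.
# Base URL and Helicone-Auth header follow Helicone's documented pattern;
# verify the exact values for your provider in the current docs.
import os

from openai import OpenAI

client = OpenAI(
    api_key=os.environ["OPENAI_API_KEY"],
    base_url="https://oai.helicone.ai/v1",  # route requests through Helicone's proxy
    default_headers={
        "Helicone-Auth": f"Bearer {os.environ['HELICONE_API_KEY']}",
    },
)

# Requests are logged, cached, and metered by Helicone before reaching OpenAI.
reply = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[{"role": "user", "content": "Hello from behind the proxy!"}],
)
print(reply.choices[0].message.content)
```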
Unified API platforms and proxies aggregate multiple LLM providers behind a single endpoint, providing model routing, fallback, caching, rate limiting, cost optimization, and access control.
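To make the "single endpoint, many models" idea concrete, the sketch below shows a client talking to a hypothetical OpenAI-compatible gateway and falling back to an alternate model on failure. The gateway URL and model identifiers are assumptions; real platforms differ in naming, and many can perform this routing and fallback server-side instead.

```python
# Minimal sketch of the "single endpoint, many models" pattern behind unified
# LLM gateways. The gateway URL and model identifiers are hypothetical.
import os

from openai import OpenAI

client = OpenAI(
    api_key=os.environ["GATEWAY_API_KEY"],      # one credential for every provider
    base_url="https://gateway.example.com/v1",  # hypothetical unified endpoint
)

def ask(prompt: str) -> str:
    # Try a primary model, then fall back to an alternate one. Many gateways
    # can also apply this kind of fallback via server-side routing rules.
    for model in ("openai/gpt-4o", "anthropic/claude-3-5-sonnet"):
        try:
            resp = client.chat.completions.create(
                model=model,
                messages=[{"role": "user", "content": prompt}],
            )
            return resp.choices[0].message.content
        except Exception:
            continue  # provider outage or rate limit: try the next model
    raise RuntimeError("all models failed")

print(ask("Explain what an LLM gateway does in one sentence."))
```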