In this guide, I'll show you how to configure goose to work with OpenAI through Zuplo AI Gateway in just a few minutes.
## What is goose?
goose (yes, that's deliberately lower case) is an impressive AI agent and CLI tool for automating engineering tasks. It's completely open-source with no vendor lock-in, supports local LLMs, has extensive MCP (Model Context Protocol) support, and offers powerful extensibility through recipes. Whether you're working solo or in a team, goose provides a flexible foundation for AI-powered automation.
## Why Use an AI Gateway?
While goose works great out of the box, routing it through an AI gateway like Zuplo gives you several advantages:
- Cost visibility: Track token usage and spending in real-time
- Usage controls: Set budget limits and thresholds
- Team management: Control access across your organization
- Security policies: Add rate limiting and other protections
- Observability: Monitor all LLM requests in one place
## Setting Up goose with Zuplo AI Gateway

### Step 1: Configure Your AI Gateway
First, set up your Zuplo AI Gateway with your OpenAI provider. Create an app for goose and select your desired model (such as GPT-4o). This is also where you can optionally configure usage limits, budget warnings, or other policies.
Once created, grab your app's API key; you'll need it in the next step.
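If you want to sanity-check the key before touching goose, you can hit the gateway directly. This is a minimal sketch assuming a hypothetical gateway URL (`https://my-gateway.example.com`) and a placeholder key; since the gateway exposes the standard OpenAI chat completions shape, a plain curl request is enough to confirm the app is wired up.

```bash
# Hypothetical values: substitute your gateway URL and the app API key from Step 1.
GATEWAY_URL="https://my-gateway.example.com"
ZUPLO_APP_KEY="your-app-api-key"

# The gateway speaks the OpenAI chat completions format, so a standard
# request body works unchanged.
curl -s "$GATEWAY_URL/v1/chat/completions" \
  -H "Authorization: Bearer $ZUPLO_APP_KEY" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "gpt-4o",
    "messages": [{"role": "user", "content": "Say hello"}]
  }'
```

A successful JSON response here means the key and endpoint are ready to paste into goose in the next step.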
### Step 2: Configure goose
Run the configuration command:
```bash
goose configure
```
Then follow these steps:
- Select Configure providers
- Choose OpenAI as your provider
- Replace the API key with the one from your Zuplo AI Gateway app
- Update the host URL to point to your AI Gateway endpoint instead of OpenAI directly
- Keep the base path as the default (`v1/chat/completions`)
- Confirm your default model selection
That's it! goose will validate the configuration and save it.
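If you prefer to script the setup instead of walking through the interactive prompts, goose can also read provider settings from its environment. The variable names below mirror the values the OpenAI provider asks for during `goose configure`, but treat them as an assumption and confirm them against the goose documentation for your version; the gateway URL is a placeholder.

```bash
# Assumed variable names for goose's OpenAI provider; verify against your
# goose version before relying on them.
export GOOSE_PROVIDER="openai"
export GOOSE_MODEL="gpt-4o"
export OPENAI_API_KEY="your-zuplo-app-api-key"        # key from your Zuplo AI Gateway app
export OPENAI_HOST="https://my-gateway.example.com"   # hypothetical gateway URL instead of api.openai.com
export OPENAI_BASE_PATH="v1/chat/completions"         # same default base path as the interactive flow
```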
### Step 3: Test It Out
Run goose and try a query:
```bash
goose
```
Now, you're ready to start working with goose as your trusty AI agent. Ask questions, make plans, or simply start building.
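For a quick smoke test, start an interactive session and give goose something small to do; every prompt and tool call it makes will now flow through the gateway. The subcommand and prompt below are illustrative; check `goose --help` for the exact commands in your version.

```bash
# Start an interactive goose session (requests are routed via the gateway).
goose session

# Then try a small prompt at the session prompt, for example:
# > Summarize the open TODOs in this repository and suggest next steps.
```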
## See Your Usage in Real-Time
Head over to your Zuplo AI Gateway dashboard while goose is working. You'll immediately see:
- Number of requests made
- Total tokens consumed
- Cost per request and cumulative spending
- Time to first byte and other performance metrics
This visibility is invaluable for understanding your AI usage patterns and controlling costs, especially in team environments.
## The Benefits
By routing goose through Zuplo AI Gateway, you get:
- Immediate observability into all your LLM interactions
- Cost control with the ability to set budgets and alerts
- Team governance when multiple people use goose, each with their own Zuplo AI Gateway API key
- Security policies like rate limiting and prompt injection detection
- No changes to your workflow: goose works exactly as before
The setup takes less than five minutes, and you gain enterprise-grade management capabilities for your AI agent usage.
## Get Started
Ready to add visibility and control to your goose setup? Try Zuplo AI Gateway and see how easy it is to manage your LLM usage across all your tools and applications.
## More from AI Week
This article is part of Zuplo's AI Week: a week dedicated to AI, LLMs, and, of course, APIs, centered on the release of our AI Gateway.
You can find the other articles and videos from this week below:
- Day 1: AI Gateway Overview with Zuplo CEO, Josh Twist
- Day 2: Is Spec-Driven AI Development the Future? with Guy Podjarny, CEO & Founder of Tessl
- Day 2: Using AI Gateway with LangChain & OpenAI with John McBride, Staff Software Engineer at Zuplo
- Day 3: Your AI Models Aren't Learning From Production Data with Gideon Mendels, CEO & Co-Founder of Comet ML
- Day 3: Using Claude Code with Zuplo's AI Gateway with Martyn Davies, Developer Advocate at Zuplo
- Day 4: What Autonomous Agents Actually Need from Your APIs with Emmanuel Paraskakis, CEO of Level250
- Day 4: Using AI Gateway with goose AI agent with Martyn Davies, Developer Advocate at Zuplo