How to Use Block's goose AI Agent with Zuplo AI Gateway

Martyn Davies
October 3, 2025
4 min read

Configure Block's open-source goose AI agent to route through Zuplo's AI Gateway for token-cost visibility, rate limits, and per-team usage analytics.

In this guide, I’ll show you how to configure goose to work with OpenAI through Zuplo AI Gateway in just a few minutes.

What is goose?

goose (yes, that’s deliberately lower case) is an open-source AI agent built by Block (the company behind Square and Cash App). It’s a free alternative to commercial AI coding assistants, released under the Apache 2.0 license. goose works as both a CLI tool and a desktop app for automating engineering tasks — from writing and debugging code to running shell commands and managing infrastructure. It supports 15+ LLM providers including OpenAI, Anthropic, and Google, has extensive MCP (Model Context Protocol) support, and offers powerful extensibility through recipes. Whether you’re working solo or in a team, goose provides a flexible foundation for AI-powered automation without vendor lock-in.

Why Use an AI Gateway?

While goose works great out of the box, routing it through an AI gateway like Zuplo gives you several advantages:

  • Cost visibility: Track token usage and spending in real-time
  • Usage controls: Set budget limits and thresholds
  • Team management: Control access across your organization
  • Security policies: Add rate limiting and other protections
  • Observability: Monitor all LLM requests in one place

Setting Up goose with Zuplo AI Gateway

Step 1: Configure Your AI Gateway

First, set up your Zuplo AI Gateway with your OpenAI provider. Create an app for goose and select your desired model (such as GPT-4o). This is also where you can optionally configure usage limits, budget warnings, or other policies.

Once created, grab your app’s API key because you’ll need this in the next step.
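Your gateway app key works just like an OpenAI key, because the gateway speaks the standard OpenAI chat-completions wire format. As a reference point, here's a minimal sketch of the request goose will end up sending through the gateway (the URL, key, and model below are placeholders, not real values):

```python
import json

# Placeholders -- substitute your own gateway endpoint and app API key.
GATEWAY_URL = "https://my-gateway.example.com/v1/chat/completions"
API_KEY = "YOUR_GATEWAY_APP_KEY"

# The OpenAI-compatible body sent on each completion request.
payload = {
    "model": "gpt-4o",
    "messages": [{"role": "user", "content": "ping"}],
}
headers = {
    "Authorization": f"Bearer {API_KEY}",  # gateway app key, not your OpenAI key
    "Content-Type": "application/json",
}

body = json.dumps(payload)
print(body)
```

Posting this body to your gateway endpoint (with curl or any HTTP client) should return a normal OpenAI-style completion; if it does, goose will work through the same URL.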

Step 2: Configure goose

Run the configuration command:

```bash
goose configure
```

Then follow these steps:

  1. Select Configure providers
  2. Choose OpenAI as your provider
  3. Replace the API key with the one from your Zuplo AI Gateway app
  4. Update the host URL to point to your AI Gateway endpoint instead of OpenAI directly
  5. Keep the base path as default (v1/chat/completions)
  6. Confirm your default model selection

That’s it! goose will validate the configuration and save it.
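For reference, depending on your goose version the saved settings land in `~/.config/goose/config.yaml` and look roughly like the sketch below (the gateway URL is a placeholder, and goose may keep the API key itself in your system keyring rather than in this file):

```yaml
# Sketch of ~/.config/goose/config.yaml -- exact keys vary by goose version
GOOSE_PROVIDER: openai
GOOSE_MODEL: gpt-4o
OPENAI_HOST: https://my-gateway.example.com   # your AI Gateway endpoint
OPENAI_BASE_PATH: v1/chat/completions
```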

Step 3: Test It Out

Run goose and try a query:

```bash
goose
```

Now, you’re ready to start working with goose as your trusty AI agent. Ask questions, make plans, or simply start building.

See Your Usage in Real-Time

Head over to your Zuplo AI Gateway dashboard while goose is working. You’ll immediately see:

  • Number of requests made
  • Token consumption broken down by input and output tokens
  • Current spending per app and per team
  • Time to first byte and other performance metrics

This visibility is invaluable for understanding your AI usage patterns and controlling costs, especially in team environments. If you’ve set daily or monthly budget limits, you’ll see when an app approaches its threshold — the gateway can enforce limits or notify you automatically so there are no surprise bills.

For even deeper analysis, you can integrate with observability platforms like Comet Opik to get full trace-level visibility: latency breakdowns, prompt and response payloads, and LLM-as-a-judge evaluation scoring on your agent’s output quality.

The Benefits

By routing goose through Zuplo AI Gateway, you get:

  • Immediate observability into all your LLM interactions
  • Cost control with the ability to set budgets and alerts
  • Team governance when multiple people use goose, each with their own Zuplo AI Gateway API key
  • Security policies like rate limiting and prompt injection detection
  • No changes to your workflow: goose works exactly as before

The setup takes less than five minutes, and you gain enterprise-grade management capabilities for your AI agent usage. If you’re also using Anthropic’s Claude Code, the same gateway handles that too — see our guide to using Claude Code with Zuplo AI Gateway.

Frequently Asked Questions

Is goose free?

Yes. goose is completely free and open-source under the Apache 2.0 license. You can run it with a local LLM through Ollama at zero cost. If you use a cloud provider like OpenAI or Anthropic, you pay their standard API fees — which is exactly where an AI Gateway helps you track and control that spend.

Can I run goose against Anthropic and OpenAI through one gateway?

Absolutely. Zuplo AI Gateway supports multiple providers including OpenAI, Anthropic, Google, and Mistral. You can create separate apps for each provider under the same team, giving you unified cost tracking and rate limits across all of them.
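One way to keep this organized in your own scripts (the app names and URLs below are hypothetical, not values the gateway assigns) is a small per-provider table, with each entry holding that app's gateway base URL and the env var for its key:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class GatewayApp:
    name: str       # the app you created in the gateway dashboard
    base_url: str   # that app's OpenAI-compatible endpoint
    key_env: str    # env var holding the app's API key

# Hypothetical layout: one gateway app per upstream provider, same team.
APPS = {
    "openai": GatewayApp("goose-openai",
                         "https://my-gateway.example.com/v1",
                         "ZUPLO_OPENAI_KEY"),
    "anthropic": GatewayApp("goose-anthropic",
                            "https://my-gateway.example.com/anthropic/v1",
                            "ZUPLO_ANTHROPIC_KEY"),
}

print(sorted(APPS))  # -> ['anthropic', 'openai']
```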

What happens when a budget limit is reached?

When an app hits its daily or monthly spend limit, the gateway blocks further requests. goose will stop making calls until the limit resets or an admin raises the threshold — so you’re never caught off guard by runaway costs.
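For clients you script yourself, a block surfaces as an HTTP error from the gateway. Here's a minimal sketch of telling a hard budget stop apart from a transient rate limit; the error-body shape is an assumption (an OpenAI-style `error` object), so check your gateway's actual responses:

```python
import json

# Hypothetical error body a gateway might return when a budget limit is hit
# (shape assumed -- verify against your gateway's real responses).
raw = '{"error": {"type": "budget_exceeded", "message": "Monthly spend limit reached"}}'

def is_budget_block(status_code: int, body: str) -> bool:
    """Treat a 429 carrying a budget-style error type as a hard stop, not a retry."""
    if status_code != 429:
        return False
    try:
        err = json.loads(body).get("error", {})
    except json.JSONDecodeError:
        return False
    return err.get("type") in {"budget_exceeded", "rate_limit_exceeded"}

print(is_budget_block(429, raw))  # -> True: back off instead of retrying
```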

Get Started

Ready to add visibility and control to your goose setup? Try Zuplo AI Gateway and see how easy it is to manage your LLM usage across all your tools and applications.

More from AI Week

This article is part of Zuplo’s AI Week: a week dedicated to AI, LLMs, and, of course, APIs, centered around the release of our AI Gateway.

You can find the other articles and videos from this week below: