
LangChain

LangChain is a framework for developing applications that are powered by Large Language Models (LLMs). It implements a standard interface for large language models and related technologies, such as embedding models and vector stores, and integrates with hundreds of providers.

Prerequisites

To use the AI Gateway with any LangChain-powered application, you will need to complete these steps first:

  1. In the AI Gateway, create a new provider entry for the LLM provider you want to use with LangChain

  2. Set up a new team

  3. Create a new app to use specifically with LangChain and assign it to the team you created

  4. Copy the API Key for the app you created, as well as the Gateway URL
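The key and URL from the steps above are typically stored in a .env file that the application loads at startup. The values below are placeholders, not real credentials; the variable names match those read by the code later in this guide:

```shell
# .env — placeholder values; substitute your own app API key and Gateway URL
ZUPLO_AI_GATEWAY_API_KEY=zpka-your-app-key-here
BASE_URL=https://my-gateway.example.com/v1
```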

Configure LangChain

In this example, we will configure LangChain to work with OpenAI (or OpenAI-compatible models).

To work with OpenAI in LangChain, the recommended approach is to use the ChatOpenAI integration.

Code
import os

from langchain_openai import ChatOpenAI


def init_chat_model():
    """Initialize the ChatOpenAI model"""
    api_key = os.getenv("ZUPLO_AI_GATEWAY_API_KEY")
    if not api_key:
        print("❌ Error: Please set your ZUPLO_AI_GATEWAY_API_KEY in a .env file")
        exit(1)

    # Check for custom BASE_URL - this is the AI Gateway URL from Zuplo
    base_url = os.getenv("BASE_URL")
    if base_url:
        return ChatOpenAI(api_key=api_key, model="gpt-4o", base_url=base_url)
    else:
        return ChatOpenAI(api_key=api_key, model="gpt-4o")

The code above checks for two environment variables:

  • ZUPLO_AI_GATEWAY_API_KEY - This is the API key of the app you have configured to use with LangChain in Zuplo
  • BASE_URL - This is the Gateway URL of your AI Gateway project in Zuplo

When BASE_URL is set, both values are passed to ChatOpenAI at instantiation, switching the configuration from the default OpenAI API to Zuplo's AI Gateway for all gpt-4o requests that the LangChain SDK makes.
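This environment-driven switch can be sketched without any network calls. The helper below is a hypothetical illustration (not part of the LangChain or Zuplo APIs) of the same fallback logic; it returns the keyword arguments that would be passed to ChatOpenAI, and the key and URL values are placeholders:

```python
import os


def gateway_kwargs(model: str = "gpt-4o") -> dict:
    """Build ChatOpenAI kwargs, routing through the AI Gateway when BASE_URL is set."""
    kwargs = {
        "api_key": os.environ["ZUPLO_AI_GATEWAY_API_KEY"],
        "model": model,
    }
    base_url = os.getenv("BASE_URL")
    if base_url:
        # BASE_URL set: requests go to Zuplo's AI Gateway instead of api.openai.com
        kwargs["base_url"] = base_url
    return kwargs


# Placeholder values for demonstration; in practice these come from your .env file
os.environ["ZUPLO_AI_GATEWAY_API_KEY"] = "zpka-example"
os.environ["BASE_URL"] = "https://my-gateway.example.com/v1"

print(gateway_kwargs()["base_url"])  # the Gateway URL, since BASE_URL is set
```

Unsetting BASE_URL simply omits the base_url argument, so ChatOpenAI falls back to the default OpenAI endpoint.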
