Policies

Archive Request to GCP Storage Policy

Custom Policy Example

Zuplo is extensible, so we don't have a built-in policy for archiving requests to GCP Storage. Instead, we have a template here that shows you how you can use your superpower (code) to achieve your goals. To learn more about custom policies, see the documentation.

This example shows how you can archive the body of incoming requests to Google Cloud Storage. This can be useful for auditing, logging, or archival scenarios. You could also use this policy to save the body of a request and then enqueue some async work that uses it.

```typescript
import { HttpProblems, ZuploContext, ZuploRequest } from "@zuplo/runtime";

type GoogleStoragePolicyOptions = {
  bucketName: string;
};

export default async function policy(
  request: ZuploRequest,
  context: ZuploContext,
  options: GoogleStoragePolicyOptions,
  policyName: string,
) {
  // NOTE: policy options should be validated, but to keep the sample short,
  // we are skipping that here.

  // Because we will read the body, we need to create a clone of this
  // request first; otherwise there may be two attempts to read the body,
  // causing a runtime error.
  const clone = request.clone();

  // In this example we assume the body could be text, but you could also
  // request the blob() to handle binary data types like images.
  //
  // This example loads the entire body into memory. This is fine for
  // small payloads, but if you have a large payload you should instead
  // save the body via streaming.
  const body = await clone.text();

  // Generate a unique object name based on the date and requestId
  const objectName = `${Date.now()}-${context.requestId}`;

  const authHeader = request.headers.get("Authorization");

  // This uses simple uploads where the parameters are in the query string; you
  // could also use multipart uploads to set more properties.
  // See: https://cloud.google.com/storage/docs/uploading-objects#rest-upload-objects
  const url = new URL(
    `https://storage.googleapis.com/upload/storage/v1/b/${options.bucketName}/o`,
  );
  url.searchParams.set("uploadType", "media");
  url.searchParams.set("name", objectName);

  const response = await fetch(url.toString(), {
    method: "POST",
    body: body,
    headers: {
      // Using the authorization header generated by the previous policy
      authorization: authHeader,
      // Change to whatever content type you want to save
      "Content-Type": "text/plain",
    },
  });

  if (response.status > 201) {
    const text = await response.text();
    context.log.error(
      `Error saving file to storage in policy ${policyName}.`,
      text,
    );
    return HttpProblems.internalServerError(request, context, {
      detail: text,
    });
  }

  // Continue
  return request;
}
```
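The sample above deliberately skips option validation. A minimal sketch of what that check could look like is below; the `validateOptions` helper is hypothetical (not part of the Zuplo runtime) and simply fails fast on a misconfigured policy instead of surfacing a confusing error on the first request:

```typescript
type GoogleStoragePolicyOptions = {
  bucketName: string;
};

// Hypothetical helper: throw early if the required option is missing,
// empty, or of the wrong type.
export function validateOptions(
  options: GoogleStoragePolicyOptions,
  policyName: string,
): GoogleStoragePolicyOptions {
  if (
    typeof options?.bucketName !== "string" ||
    options.bucketName.length === 0
  ) {
    throw new Error(
      `Policy ${policyName}: 'bucketName' must be a non-empty string.`,
    );
  }
  return options;
}
```

Calling this at the top of the policy function turns a misconfiguration into an immediate, clearly attributed error.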

Configuration

The example below shows how to configure a custom code policy in the 'policies.json' document that utilizes the above example policy code.

```json
{
  "name": "my-archive-request-gcp-storage-inbound-policy",
  "policyType": "archive-request-gcp-storage-inbound",
  "handler": {
    "export": "default",
    "module": "$import(./modules/YOUR_MODULE)",
    "options": {
      "bucketName": "my-bucket"
    }
  }
}
```

Policy Options

The options for this policy are specified below. All properties are optional unless specifically marked as required.

  • bucketName <string> -
    The name of the bucket to archive the request to.

Using the Policy

In order to use this policy, you'll need to set up Google Cloud Storage, create an IAM Service Account, and configure the Upstream GCP Service Auth Policy. You'll find instructions on how to do that below.

Set up a Google Service Account

In order to authorize your Zuplo API to upload files to Google Storage, you will need to create a Service Account. Instructions for doing so can be found here: https://cloud.google.com/iam/docs/service-accounts-create

The service account you create will also need permissions to write objects to the storage bucket you will use. The easiest way to do that is to assign the account the Storage Object Creator (roles/storage.objectCreator) role. However, you can also scope the permissions to a single bucket if you like.

Download the service account JSON and create an environment variable secret with its contents. In this example, the variable is named SERVICE_ACCOUNT_JSON.

Set up Google Cloud Storage

In order to use Google Cloud Storage, you will need a bucket. If you don't have one, you can create one by following this guide: https://cloud.google.com/storage/docs/creating-buckets

Upstream GCP Service Auth Policy

In order to authorize your Zuplo API to upload to the GCP bucket, you will need to configure the Upstream GCP Service Auth Policy. It is important that the auth policy runs before this custom policy.
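For illustration, the ordering of the route's inbound pipeline might look like this (the policy names here are placeholders, and the exact route file shape depends on your project setup):

```json
{
  "policies": {
    "inbound": [
      "my-gcp-service-auth-policy",
      "my-archive-request-gcp-storage-inbound-policy"
    ]
  }
}
```

The key point is simply that the service auth policy appears first, so the Authorization header is already set when the archive policy reads it.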

The service auth policy sets the Authorization header of the request to a JWT with the requested permissions. In order to generate the correct JWT, you must set the scopes to https://www.googleapis.com/auth/devstorage.read_write as shown below.

```json
{
  "export": "UpstreamGcpServiceAuthInboundPolicy",
  "module": "$import(@zuplo/runtime)",
  "options": {
    "expirationOffsetSeconds": 300,
    "scopes": ["https://www.googleapis.com/auth/devstorage.read_write"],
    "serviceAccountJson": "$env(SERVICE_ACCOUNT_JSON)",
    "tokenRetries": 3
  }
}
```

Tip

You can have multiple Upstream GCP Service Auth Policies on the same request. For example, you might generate a JWT that has permission to upload to GCP Storage, then have a second policy that runs after this one and authorizes your Zuplo API to call a downstream Cloud Run service.

Each auth policy caches its JWT tokens for an hour by default, so having multiple policies has virtually no impact on your API's latency.

Read more about how policies work
