
S3 Presigned POST in NodeJS and React

By: Turner Houghton
Published on April 12, 2023, last updated on April 12, 2023 at 6:32 AM

If you're like me, you've probably explored a lot of different options for uploading files to S3. S3 offers many ways to interact securely with your buckets. Presigned requests are one way to give your end users the ability to upload files using short-lived URLs that can be scoped down to specific actions.

Presigned GETs and PUTs are great options for simple use cases, but they have their own issues. A presigned PUT can restrict the key a user can upload to, but it cannot do any kind of validation on the size of the upload. This is a problem if you want to enforce some kind of file upload limit so your users aren't uploading insanely large files that you end up having to pay for in storage and access costs.
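For comparison, here is a minimal sketch of generating a presigned PUT with the AWS SDK v3 (the bucket name and key below are placeholders). Notice that there is nowhere to express a size limit:

import { S3Client, PutObjectCommand } from "@aws-sdk/client-s3";
import { getSignedUrl } from "@aws-sdk/s3-request-presigner";

const s3 = new S3Client({ region: "us-east-1" });

// A presigned PUT pins the bucket and key, but the signature covers no
// size constraint -- the URL will accept an upload of any size.
const putUrl = await getSignedUrl(
  s3,
  new PutObjectCommand({ Bucket: "my-cool-bucket", Key: "cat-photo.png" }),
  { expiresIn: 600 } // seconds
);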

This is where presigned POSTs are a lot better, even if they involve slightly more work. This article will show you the necessary server-side code for creating a presigned POST, and the frontend code (in React) for uploading a file to that URL.

Step 1: Prerequisites

You will need to create an S3 bucket, and whatever server code creates the presigned POST URL will need PutObject permissions on that bucket. If you are using the AWS CDK to define your infrastructure, that could look something like this:

import path = require("path");
import * as s3 from "aws-cdk-lib/aws-s3";
import * as lambda from "aws-cdk-lib/aws-lambda-nodejs";
import { Runtime } from "aws-cdk-lib/aws-lambda";

...

const myLambdaFunction = new lambda.NodejsFunction(this, "myLambda", {
  entry: path.join(__dirname, "./myLambda.ts"),
  runtime: Runtime.NODEJS_18_X,
});

this.imagesBucket = new s3.Bucket(this, "imagesBucket", {
  bucketName: "my-cool-bucket",
  cors: [
    {
      allowedMethods: [
        s3.HttpMethods.GET,
        s3.HttpMethods.POST,
        s3.HttpMethods.PUT,
      ],
      allowedOrigins: ["http://localhost:3001", "https://example.com"],
      allowedHeaders: ["*"],
    },
  ],
});

this.imagesBucket.grantPut(myLambdaFunction);

Note the last line, where we use the CDK helper to give our Lambda function permission to call PutObject. If your function does not have the permission set up correctly, none of your generated URLs will work. It's like how you have to have your car keys on you to be able to lend them to a friend.
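If you aren't using the CDK, the grantPut call boils down to an IAM policy statement attached to the function's execution role that looks roughly like this (grantPut also adds a few related Put* actions; the bucket name is a placeholder):

{
  "Effect": "Allow",
  "Action": "s3:PutObject",
  "Resource": "arn:aws:s3:::my-cool-bucket/*"
}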

Step 2: Generate the URL and fields

Install packages

In your backend project, install the S3 client and the presigned POST libraries:

npm install @aws-sdk/client-s3 @aws-sdk/s3-presigned-post

Call createPresignedPost

You can use whatever backend framework you wish. In my case I have a GraphQL API that has a requestImageUpload mutation that my authenticated users can call. After simplifying a bit, the code for that mutation looks like this:

import { S3Client } from "@aws-sdk/client-s3";
import { createPresignedPost } from "@aws-sdk/s3-presigned-post";

const s3 = new S3Client({ region: process.env.BUCKET_REGION });

const minFileSize = 1;
const maxFileSize = 10485760; // 10 MB

export async function handler(event: any): Promise<any> {
  // `logger` is assumed to be configured elsewhere in the project
  logger.info(event);

  const { input: { fileName, contentType } } = event.arguments;
  const bucketName = process.env.IMAGES_BUCKET_NAME!;
  // here we use the file name as the object key directly
  const key = fileName;

  const { url, fields } = await createPresignedPost(s3, {
    Bucket: bucketName,
    Key: key,
    Conditions: [
      { bucket: bucketName },
      ["eq", "$key", fileName],
      ["eq", "$Content-Type", `image/${contentType.toLowerCase()}`],
      ["content-length-range", minFileSize, maxFileSize],
    ],
    Expires: 60 * 10, // 10 minutes
  });

  logger.info({ url, fields, msg: "got presigned URL back" });

  return {
    uploadKey: key,
    uploadUrl: url,
    fields: Object.entries(fields).map(([name, value]) => ({ name, value })),
  };
}

Note that by specifying the content-length-range condition, we can effectively limit how large a file can be uploaded to this endpoint.
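Conditions can express more than size. As a hypothetical variant (the presignForUser helper below is not part of my actual mutation), you could scope uploads to a per-user key prefix with a starts-with condition while keeping the same size cap:

import { S3Client } from "@aws-sdk/client-s3";
import { createPresignedPost } from "@aws-sdk/s3-presigned-post";

const s3 = new S3Client({ region: process.env.BUCKET_REGION });

// Hypothetical: scope uploads to a per-user prefix instead of one exact key.
// Any key starting with the prefix satisfies the starts-with condition.
export async function presignForUser(userId: string, fileName: string) {
  return createPresignedPost(s3, {
    Bucket: process.env.IMAGES_BUCKET_NAME!,
    Key: `uploads/${userId}/${fileName}`,
    Conditions: [
      ["starts-with", "$key", `uploads/${userId}/`],
      ["content-length-range", 1, 10485760], // same 10 MB cap
    ],
    Expires: 60 * 10,
  });
}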

You may also notice that alongside the URL we get back from the createPresignedPost call, we also get back a fields object. This fields object has all of the FormData entries that we need to send in our request. Because GraphQL doesn't like returning objects with unknown structure, I'm sending the fields back as an array of key/value pairs, but you are free to do it differently if you are using a different kind of API.
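For reference, the raw fields object looks roughly like this before I convert it to an array (every value below is illustrative, and the exact set of fields can vary with your SDK configuration):

// Illustrative shape only -- real values are generated per request.
const fields = {
  bucket: "my-cool-bucket",
  key: "cat-photo.png",
  Policy: "eyJleHBpcmF0aW9uIjoi...", // the base64-encoded policy document
  "X-Amz-Algorithm": "AWS4-HMAC-SHA256",
  "X-Amz-Credential": "AKIA.../20230412/us-east-1/s3/aws4_request",
  "X-Amz-Date": "20230412T063200Z",
  "X-Amz-Signature": "9d2c...",
};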

Step 3: Frontend Upload

In your React application, create a new component. I called mine UploadInput and it looks like the following:

import { useMutation } from "@apollo/client";
import axios from "axios";
import { gql } from "graphql/__generated__";
import { ContentType } from "graphql/__generated__/graphql";
import React, { FormEvent, useRef, useState } from "react";

const REQUEST_UPLOAD = gql(`
  mutation RequestImageUpload($input: RequestImageUploadInput!) {
    requestImageUpload(input: $input) {
      uploadKey
      uploadUrl
      fields {
        name
        value
      }
    }
  }
`);

const UploadInput = () => {
  const [showUploadButton, setShowUploadButton] = useState(false);
  const [requestUploadMutation] = useMutation(REQUEST_UPLOAD);
  const fileInputRef = useRef<HTMLInputElement>(null);

  const handleFileInputChange = () => {
    setShowUploadButton(true);
  };

  const requestFileUpload = async (file: File) => {
    let type: ContentType;
    switch (file.type) {
      case "image/jpeg":
        type = ContentType.Jpeg;
        break;
      case "image/png":
        type = ContentType.Png;
        break;
      case "image/webp":
        type = ContentType.Webp;
        break;
      default:
        throw new Error("Unsupported file type!");
    }

    // replace with your API call to get the presigned POST url
    const { data, errors } = await requestUploadMutation({
      variables: {
        input: {
          contentType: type,
          fileName: file.name,
          contentLength: file.size,
        },
      },
    });

    if (errors) {
      alert(JSON.stringify(errors));
    }

    const { uploadKey, uploadUrl, fields } = data!.requestImageUpload;
    return { file, uploadKey, uploadUrl, fields };
  };

  const uploadFile = async (input: Awaited<ReturnType<typeof requestFileUpload>>) => {
    const { file, uploadUrl, fields } = input;

    const formData = new FormData();
    formData.append("Content-Type", file.type);
    fields.forEach(({ name, value }) => {
      if (name === "bucket") return;
      formData.append(name, value);
    });

    // NOTE: the file has to be the last field in the formData.
    // Any fields passed in after the file itself will be ignored.
    formData.append("file", file);

    // for some reason axios works but fetch doesn't
    // copied from https://bobbyhadz.com/blog/aws-s3-presigned-url-react
    await axios.post(uploadUrl, formData, {
      headers: { "Content-Type": "multipart/form-data" },
    });
  };

  const uploadFiles = async (event: FormEvent) => {
    event.preventDefault();
    const formData = new FormData(event.target as HTMLFormElement);

    const files: File[] = [];
    for (const file of formData.values()) {
      // sanity check to make sure all formData values are actually files
      if (!(file instanceof File)) {
        continue;
      }
      files.push(file);
    }

    const fileRequests = files.map((file) =>
      requestFileUpload(file).then(uploadFile)
    );
    await Promise.allSettled(fileRequests);

    setShowUploadButton(false);
    if (fileInputRef.current) {
      fileInputRef.current.value = "";
    }
  };

  return (
    <form onSubmit={uploadFiles}>
      <label>
        <span className="mr-2">Upload Files</span>
        <input
          ref={fileInputRef}
          name="files"
          type="file"
          multiple={true}
          onChange={handleFileInputChange}
          accept=".png,.jpeg,.jpg,.webp"
        />
      </label>
      <button
        className={"block mt-2 px-4 py-2 bg-white" + (!showUploadButton ? " hidden" : "")}
        type="submit"
      >
        Upload
      </button>
    </form>
  );
};

export default UploadInput;

Your component may look a bit different if you don't use GraphQL to interact with your API, but the bulk of the code should still be pretty similar. What's nice is that this approach easily supports multiple uploads at the same time, and as long as each file is under the 10 MB limit, they will all get uploaded.
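One aside about the "axios works but fetch doesn't" comment in the component above: fetch does work with presigned POSTs, but only if you let the browser set the Content-Type header itself, since the multipart boundary in the header has to match the body. If you'd rather drop the axios dependency, a sketch of the same upload with fetch might look like this:

// Sketch: the same upload using fetch. Crucially, do NOT set a Content-Type
// header yourself -- the browser must generate the multipart boundary.
async function uploadWithFetch(uploadUrl: string, formData: FormData) {
  const response = await fetch(uploadUrl, { method: "POST", body: formData });
  if (!response.ok) {
    // S3 returns an XML error body when a policy condition fails
    // (e.g. a file larger than the content-length-range allows)
    throw new Error(`Upload failed with status ${response.status}`);
  }
}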

Step 4: Give yourself a round of applause!

That's all there is to it! When I looked at implementing this myself, info on presigned POSTs was scattered across StackOverflow, GitHub, and countless other sites, but I hope this article is a good collection of all of the steps you need to take in order to implement presigned S3 POSTs.