
How to build a serverless app for on-demand image processing

A Cloud Guru News

I’ve been exploring AWS Lambda functions for a couple of months — and I’m starting to see the huge benefits of building small serverless applications.

First of all, you don’t need a server — obviously. The problem with building server-side applications has always been the cost of maintaining a server to deploy and host the application, and of paying for idle resources while waiting for someone to interact with it.

Of course, you can find free NodeJS hosting providers around the web like Heroku — but you’ll face account restrictions, won’t realistically be able to deploy hundreds of small microservices, and won’t pay only for actual usage.

AWS Lambda comes to the rescue

  1. You can deploy as many Lambda functions as you want — essentially for free with an AWS free tier account.
  2. You are charged only for the actual usage of your functions — without needing to maintain and pay for an entire server.

Another interesting advantage is speed. Lambda functions execute incredibly quickly — typically between 100 and 500ms.

Unlike using Docker, you’ll never have to wait for the virtual environment to be bootstrapped for your code to be executed. Using Lambda functions is really like having a very powerful server machine always up and running — but without the cost.

With the AWS free tier account, you get 1 million requests and 400,000 GB-seconds of Lambda compute time per month completely for free. For a function with 1 GB of memory averaging 500ms per invocation, that works out to 800,000 free invocations per month. Not bad at all.
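To make that arithmetic concrete, here’s a quick sketch of the back-of-the-envelope calculation (the 1 GB memory size is an assumption chosen to keep the numbers simple):

```javascript
// Rough free-tier math: how many invocations fit in the monthly
// compute allowance (all figures illustrative, not billing advice).
const freeComputeGbSeconds = 400000; // monthly free-tier compute allowance
const memoryGb = 1;                  // assumed function memory size
const avgDurationSeconds = 0.5;      // ~500ms per invocation
const freeInvocations = freeComputeGbSeconds / (memoryGb * avgDurationSeconds);
console.log(freeInvocations); // 800000
```

A smaller memory size or shorter average duration would stretch the allowance even further.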

Creating a serverless app with AWS Lambda image resize

My need was to create a NodeJS application responsible for delivering images to my client application.

The app needs the ability to automatically scale my images up and down according to the client’s screen size — so I can avoid creating and storing multiple variations of the same image for mobile, tablet, and desktop versions. The quality and format of the image also need to change on demand — again without storing all the different variants.

This seemed like a perfect job for using AWS Lambda functions.

Setting up NodeJS

My first step was to set up NodeJS so I could test my code locally, without having to redeploy every time I wanted to validate changes. To get going, start by creating a new folder on your local machine to host the new project.

$ mkdir serverless-image-rendering && cd $_

Then initialize a new npm project and press enter to accept the defaults.

$ npm init

Now we’re going to create an old school Express app to listen on your local port 3000. So create a new app.js file and paste the following code inside:

const app = require('express')();
const bodyParser = require('body-parser');
const PORT = 3000;

app.use(bodyParser.json());

const displayStatus = () => ({
  status: 'OK',
});

app.get('/status', (req, res) => res.json(displayStatus()));

const server = app.listen(PORT, () =>
  console.log('Listening on ' + PORT));

For this app, we’re going to need two npm packages: express and body-parser.
These two packages are only required for testing the app locally, so we’re going to install them as development dependencies — this avoids including them in your Lambda function.

$ npm i -D express body-parser
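After that install, the relevant part of your package.json should look roughly like this (version numbers are illustrative):

```json
{
  "devDependencies": {
    "body-parser": "^1.18.0",
    "express": "^4.16.0"
  }
}
```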

I normally also install nodemon globally on my machine — it monitors file changes and automatically restarts the app.

$ npm i -g nodemon

Then you can bootstrap your local server application:

$ nodemon app.js

You should now be able to open your browser to http://localhost:3000/status and be able to see a "status": "OK" message.

How to fetch your images from S3 bucket

I like to use S3 to store all the images, and have the function fetch an image from the S3 bucket for resizing and delivery to the client app. So I’m going to use AWS to create an S3 bucket and name it images-bucket.

Then I will need an image-fetcher class to open my S3 bucket, find my target image, and return it to my app. To do this, simply create an image-fetcher.js file inside a src folder and paste the following code inside:

const AWS = require('aws-sdk');

const getS3 = (s3, bucketName, fileName) =>
  new Promise((res, rej) => {
    s3.getObject({
      Bucket: bucketName,
      Key: fileName,
    }, (err, data) => {
      if (err) {
        return rej(err);
      }
      const contentType = data.ContentType;
      const image = data.Body;
      return res({ image, contentType });
    });
  });

class ImageFetcher {
  constructor(bucketName) {
    this.S3 = new AWS.S3();
    this.bucketName = bucketName;
  }

  fetchImage(fileName) {
    if (!fileName) {
      return Promise.reject(new Error('Filename not specified'));
    }
    return getS3(this.S3, this.bucketName, fileName);
  }
}

module.exports = ImageFetcher;

This ImageFetcher class will attempt to read a file stored inside the bucketName and return the image if found.

Ok, now we can set our app.js file to consume this class for fetching and delivering an image to the browser. So, let’s create a /fetch-image endpoint!

// app.js
const ImageFetcher = require('./src/image-fetcher');

app.get('/fetch-image', (req, res) => {
  const imageFetcher = new ImageFetcher(process.env.BUCKET);
  const fileName = req.query && req.query.f;
  return imageFetcher
    .fetchImage(fileName)
    .then(data => {
      const img = Buffer.from(data.image);
      res.writeHead(200, {
        'Content-Type': data.contentType,
      });
      res.end(img);
    })
    .catch(error => {
      res.status(400).send(error.message || error);
    });
});

Now you should be able to fetch and display an image present inside your previously created images-bucket S3 bucket.

Note that we’re passing a process.env.BUCKET variable into our ImageFetcher constructor. This variable is read from your system environment — so we’ll need to pass it to our application manually. From now on, we’ll need to launch our app.js file from the terminal like this:

$ BUCKET=images-bucket nodemon app.js

This will make sure that a BUCKET environment variable will be present and set to our S3 bucket name.

Now we can open a browser to our new http://localhost:3000/fetch-image endpoint and pass a file name as a query string — although we don’t have any images in our bucket at the moment.

Manually upload a new image called sample.jpg to your images-bucket and open your browser to http://localhost:3000/fetch-image?f=sample.jpg.

An error message should appear on the screen. This is because you probably don’t have read access to your S3 bucket yet.

Create an AWS user
You’ll need to create a new IAM user in AWS, and configure your local machine to use those credentials for accessing your S3 bucket.

First of all, create a new credential from your AWS IAM dashboard: click Users, then Create a new user.

Create a new user called serverless-image-rendering and make sure the Programmatic access option is selected — this will be required for Lambda in later steps.

Create and name a New Group, and check “AdministratorAccess” in the list of policies. Now all you have to do is create a credentials file under your ~/.aws folder, and paste your IAM credentials inside using the following format:
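The standard AWS credentials file uses an INI-style layout — the profile name below matches the IAM user created above, and the key values are placeholders to replace with your own:

```ini
[serverless-image-rendering]
aws_access_key_id = YOUR_ACCESS_KEY_ID
aws_secret_access_key = YOUR_SECRET_ACCESS_KEY
```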


You can set your local preferences to use that profile using your terminal with the following command:

export AWS_PROFILE=serverless-image-rendering

Now you should be able to start your NodeJS app, open your browser to http://localhost:3000/fetch-image?f=sample.jpg, and see your S3 image appear on your screen!

Create the function for AWS Lambda image processing

The core piece for our app is the image processor responsible for dynamically scaling and changing the quality of your source image.

To serve this purpose, I’m going to use Sharp. The implementation is really straightforward — this is the class I created inside a new src/image-resizer.js file:

class ImageResizer {
  constructor(Sharp) {
    this.Sharp = Sharp;
  }

  resize(image, size, quality) {
    if (!image) throw new Error('An image must be specified');
    if (!size) throw new Error('Image size must be specified');
    return new Promise((res, rej) => {
      this.Sharp(Buffer.from(image))
        .resize(size.w, size.h)
        .webp({ quality: quality })
        .toBuffer()
        .then(data => {
          return res({
            image: data,
            contentType: 'image/webp',
          });
        })
        .catch(err => rej(err));
    });
  }
}

module.exports = ImageResizer;

The resize method receives an image buffer, a size object containing the width and height values for the new image, and a quality attribute.

First, let’s install Sharp in our project.

$ npm i -S sharp

Next, let’s create a new resize-image endpoint inside our Express app to consume ImageResizer.

// app.js
const Sharp = require('sharp');
const ImageResizer = require('./src/image-resizer');

app.get('/resize-image', (req, res) => {
  const imageFetcher = new ImageFetcher(process.env.BUCKET);
  const imageResizer = new ImageResizer(Sharp);
  const fileName = req.query && req.query.f;
  const quality = (req.query && +req.query.q) || 80;
  const size = {
    w: (req.query && +req.query.w) || 800,
    h: (req.query && +req.query.h) || null,
  };
  return imageFetcher
    .fetchImage(fileName)
    .then(data => imageResizer.resize(data.image, size, quality))
    .then(data => {
      const img = Buffer.from(data.image);
      res.writeHead(200, {
        'Content-Type': data.contentType,
      });
      res.end(img);
    })
    .catch(error => {
      console.error('Error:', error);
      res.status(400).send(error.message || error);
    });
});

Cool! Let’s give it a go.

Bootstrap your Node app once again, and this time open your browser to http://localhost:3000/resize-image?f=sample.jpg

By default the image width is going to be 800px and the quality 80%. However, we can now change the size and quality by simply passing a query string in the URL: specify the image width with the w key, the height with the h key, and a custom quality with the q key.
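The defaulting logic boils down to the `value || fallback` pattern used in the endpoint above. Here’s a standalone sketch of how those query parameters resolve (parseOptions is a hypothetical helper for illustration, not part of the app):

```javascript
// Resolve the w, h and q query parameters, falling back to defaults.
// +undefined is NaN, which is falsy, so `|| fallback` kicks in.
const parseOptions = (query) => ({
  quality: (query && +query.q) || 80,
  w: (query && +query.w) || 800,
  h: (query && +query.h) || null,
});

console.log(parseOptions({}));                    // { quality: 80, w: 800, h: null }
console.log(parseOptions({ w: '600', q: '10' })); // { quality: 10, w: 600, h: null }
```

Note that a height of null is deliberate: Sharp then scales the height automatically to preserve the aspect ratio.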

We can now display our image resized to 600 pixels wide with a quality of 10% just by adding our preferred values as parameters in the address bar.

Serverless image handler

So far, we just created a normal NodeJS app — so nothing is working serverless yet. Is this just a typo in the article name? Of course not!

Adding serverless is something you can easily do on top of your conventional NodeJS app. All we need is a serverless configuration file called serverless.yml that we’re going to create inside our project’s root directory.

For this specific project we’re also going to install two serverless plugins: serverless-apigw-binary and serverless-apigwy-binary. The Serverless framework will configure the AWS API Gateway to serve responses in application/json format by default, but we need to deliver an image — so we’ll rewrite the response Content-Type to image/webp instead.

Let’s start with installing all the Node modules we require for this final step

$ npm i -S serverless-apigw-binary serverless-apigwy-binary

Now open your new serverless.yml file and paste the following configuration inside:

service: serverless-image-rendering

provider:
  name: aws
  runtime: nodejs6.10
  stage: dev
  region: us-east-1
  timeout: 5 # optional, in seconds, default is 6
  role: ImageRenderingRole
  environment:
    BUCKET: images-bucket

plugins:
  - serverless-apigw-binary
  - serverless-apigwy-binary

custom:
  apigwBinary:
    types:
      - '*/*'

functions:
  resizeImage:
    handler: handler.resizeImage
    events:
      - http:
          path: resize-image
          method: get
          contentHandling: CONVERT_TO_BINARY

resources:
  Resources:
    ImageRenderingRole:
      Type: AWS::IAM::Role
      Properties:
        RoleName: ${self:service}-S3-ACCESS
        AssumeRolePolicyDocument:
          Version: "2012-10-17"
          Statement:
            - Effect: Allow
              Principal:
                Service:
                  - lambda.amazonaws.com
              Action: sts:AssumeRole
        Policies:
          - PolicyName: ${self:service}-s3-access
            PolicyDocument:
              Version: "2012-10-17"
              Statement:
                - Effect: Allow
                  Action:
                    - "s3:GetObject"
                  Resource:
                    - 'arn:aws:s3:::${self:provider.environment.BUCKET}/*'

This configuration is going to create a Lambda function called “resizeImage”, which invokes a resizeImage function located inside a handler.js file:

 handler: handler.resizeImage

It will also configure your API Gateway to invoke that function on any GET request to a resize-image path and return the response in binary format.

    events:
      - http:
          path: resize-image
          method: get
          contentHandling: CONVERT_TO_BINARY

Serverless will also create a new AWS IAM Role for you called “serverless-image-rendering-S3-ACCESS” for allowing the Lambda function to read from your S3 Bucket.

While you can also create all of this manually from your AWS dashboard, the serverless framework will save you a lot of time and manual configuration.

From Express to AWS Lambda

In the previous step, I mentioned a handler.js file — but we now have an app.js file instead. This is because we cannot run our Express app on Lambda, so we need to create a new file to upload to AWS. It will be similar to our previous app.js but without Express.

So, let’s create a new handler.js file in your project’s root folder. We can just paste inside the previous resize-image logic, and convert it into Lambda code like this:

const Sharp = require('sharp');
const ImageFetcher = require('./src/image-fetcher');
const ImageResizer = require('./src/image-resizer');

module.exports.resizeImage = (event, context, callback) => {
  const imageFetcher = new ImageFetcher(process.env.BUCKET);
  const imageResizer = new ImageResizer(Sharp);
  const fileName = event.queryStringParameters && event.queryStringParameters.f;
  const quality = (event.queryStringParameters && +event.queryStringParameters.q) || 80;
  const size = {
    w: (event.queryStringParameters && +event.queryStringParameters.w) || 800,
    h: (event.queryStringParameters && +event.queryStringParameters.h) || null,
  };
  return imageFetcher
    .fetchImage(fileName)
    .then(data => imageResizer.resize(data.image, size, quality))
    .then(data => {
      const img = Buffer.from(data.image);
      callback(null, {
        statusCode: 200,
        headers: { 'Content-Type': data.contentType },
        body: img.toString('base64'),
        isBase64Encoded: true,
      });
    })
    .catch(error => {
      console.error('Error:', error);
      callback(null, error);
    });
};

This is pretty much the same code we wrote before — but we need to set the isBase64Encoded flag so that API Gateway can decode the base64 body and return our image correctly.
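To see why the base64 step is needed, here’s a tiny self-contained round trip: Lambda hands API Gateway a base64 string, and API Gateway decodes it back into the original bytes for the client:

```javascript
// Simulate the body round trip: bytes -> base64 (Lambda response)
// -> bytes again (what API Gateway sends to the browser).
const original = Buffer.from([0xff, 0xd8, 0xff]); // first bytes of a JPEG
const body = original.toString('base64');          // Lambda's response body
const decoded = Buffer.from(body, 'base64');       // API Gateway's decode step
console.log(body, decoded.equals(original)); // '/9j/' true
```

Without isBase64Encoded, API Gateway would treat that base64 string as plain text and the browser would receive garbage instead of an image.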

Deploying code using Serverless CLI

Ok, we’re now ready to deploy our code live! The first step requires you to install Serverless globally on your machine with the following command:

$ npm i -g serverless

Now, we can easily deploy all the code we’ve created:

$ serverless deploy

This operation will take a couple of minutes. Serverless is going to package your local application with all its node dependencies inside a zip file, and upload it to a new S3 bucket. Then it’s going to create a new IAM role, an API Gateway and a Lambda function.

When the deployment process is finished, you’ll see your new Lambda endpoint in your terminal. You can also retrieve your AWS information at any time with the following command:

$ serverless info

And you’ll see something like this in response:

Service Information
service: serverless-image-rendering
stage: dev
region: us-east-1
stack: serverless-image-rendering-dev
api keys:
  None
functions:
  resizeImage: serverless-image-rendering-dev-resizeImage

Now you should be able to copy the GET endpoint and paste it inside your browser. You’ll be able to see your new serverless app work by passing the same parameters used earlier within your local app and environment.

For example:

Node modules issues
You might encounter a problem when deploying your Lambda function: your node modules were installed for a different environment than the one AWS runs — so native packages like Sharp may not work inside your Lambda function. For this reason, you can use a Docker image called lambci/lambda, which replicates the Lambda environment, to install your node modules before running serverless deploy.


GitHub repository code

I created a GitHub repository where you can see the code related to this article — feel free to clone and create your own image processing app using a Lambda function. I’m looking forward to your feedback and comments!

