Serverless Cron Jobs with AWS Batch

AWS Lambda functions can run for a maximum of 15 minutes. For cron jobs that need more than 15 minutes, there is AWS Batch.



When we think about serverless jobs in AWS, the first thing that comes to mind is AWS Lambda. AWS Lambda is a fantastic compute service that lets you run code without provisioning or managing servers. But AWS Lambda has one important limitation: a function can run for a maximum of 15 minutes. For cron jobs that need more than 15 minutes, there is AWS Batch.

What is AWS Batch

AWS Batch is a set of batch management capabilities that enables developers, scientists, and engineers to quickly and efficiently run hundreds of thousands of batch computing jobs on AWS.

AWS Batch integrates with AWS Fargate, a serverless container service, and can be set up so that we only pay for what we use.

Build Batch

First of all, we need to build a working batch job.

Write the code

The easiest way to build a batch job, in my opinion, is to create a Docker image. As long as it runs in a Docker container, you can use any language with AWS Batch. In this example, we will use Python.


To create a Docker image, we need a Dockerfile. We will create a simple Python container and install the packages listed in our requirements.txt.


FROM python:3.9

ADD . /app
WORKDIR /app

RUN pip install -r requirements.txt
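Before pushing anything to AWS, it is worth checking that the image builds and runs locally. A minimal sketch, assuming the sample script below is saved as main.py and the image tag is my-batch-job (both names are just examples):

```shell
# Build the image from the Dockerfile in the current directory.
# "my-batch-job" is an example tag; use whatever you like.
docker build -t my-batch-job .

# Run the script once inside the container, the same way AWS Batch will.
# Assumes the sample code is saved as /app/main.py in the image.
docker run --rm my-batch-job python main.py
```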


Sample Code

For this example, we will use the downloaded package and confirm that it works in the execution logs.



import requests

# Placeholder URL -- point this at whatever endpoint you want to check.
res = requests.get("https://example.com")
print(res.status_code)

Put Image in ECR

The next step is putting our image in ECR.

Create New Repository

  1. Go to AWS ECR (Don't forget to check your AWS region)
  2. From the sidebar, go to "Repositories" and click the "Create repository" button.
  3. Set "Visibility settings" to "Private," and enter the repository name.


Push Image

Select the created repository and click the "View push commands" button.


Run those four commands and check if the repository is updated from the AWS console.
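For reference, the four commands typically look like the following. The account ID 123456789012, the region ap-northeast-1, and the repository name my-batch-job are all placeholders; copy the real commands from the console:

```shell
# 1. Authenticate Docker against your private ECR registry.
aws ecr get-login-password --region ap-northeast-1 | \
  docker login --username AWS --password-stdin 123456789012.dkr.ecr.ap-northeast-1.amazonaws.com

# 2. Build the image.
docker build -t my-batch-job .

# 3. Tag it with the full repository URI.
docker tag my-batch-job:latest 123456789012.dkr.ecr.ap-northeast-1.amazonaws.com/my-batch-job:latest

# 4. Push it.
docker push 123456789012.dkr.ecr.ap-northeast-1.amazonaws.com/my-batch-job:latest
```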


P.S. If you face any errors while pushing the image, there is a good chance it is an authentication problem; re-running the "aws ecr get-login-password" login step from the push commands usually fixes it.

Execute Batch in AWS Batch

There are four steps before we can run our job.

Create Compute Environment

As the name suggests, a compute environment is where our jobs will be executed. There are four types of compute environments:

  • EC2
  • EC2 Spot
  • Fargate
  • Fargate Spot

If you choose EC2, AWS will start a new EC2 server and run your job there. Fargate is AWS's serverless container service. Spot is an option to use surplus AWS capacity at a low price, in exchange for the risk that the capacity might suddenly become unavailable. We will be using Fargate Spot.
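To get a feel for the Spot trade-off, here is a back-of-the-envelope cost sketch. The per-hour prices and the roughly 70% Spot discount are example figures, not guaranteed ones; check the AWS Fargate pricing page for your region.

```python
# Rough Fargate cost model. PRICES ARE EXAMPLE FIGURES ONLY --
# look up the real per-vCPU-hour and per-GB-hour rates for your region.
VCPU_PER_HOUR = 0.04048   # example on-demand price per vCPU-hour
GB_PER_HOUR = 0.004445    # example on-demand price per GB-hour
SPOT_DISCOUNT = 0.70      # Fargate Spot is advertised at up to ~70% off

def job_cost(vcpus: float, memory_gb: float, minutes: float, spot: bool = False) -> float:
    """Estimated cost in USD of one Fargate task run."""
    hours = minutes / 60
    cost = (vcpus * VCPU_PER_HOUR + memory_gb * GB_PER_HOUR) * hours
    return cost * (1 - SPOT_DISCOUNT) if spot else cost

# A 30-minute job on 1 vCPU / 2 GB, on-demand vs. Spot:
on_demand = job_cost(1, 2, 30)
spot = job_cost(1, 2, 30, spot=True)
```

For a job this small the absolute numbers are tiny either way, which is exactly why Fargate Spot is a comfortable default for non-urgent cron jobs.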

  • Go to AWS Batch Page
  • Click "Compute Environment" from the sidebar
  • Click the "Create" button
  • Fill out Create Form
    • For "Compute environment configuration":

| Key | Value | Explanation |
| --- | --- | --- |
| Compute environment type | Managed | Let's make AWS do the complicated stuff for us 😀 |
| Compute environment name | (anything) | Up to you 👍 |
| Enable compute environment | True | We want to use it! 👊 |

  • For "Instance configuration":

| Key | Value | Explanation |
| --- | --- | --- |
| Provisioning model | Fargate Spot | Fargate for serverless! Spot for a low price! 💰 |
| Maximum vCPUs | 1 | We are only doing a simple execution, so we do not need much compute power |

  • For "Networking," just leave the defaults! The default VPC has a public subnet that can connect to the internet.
  • Click "Create."

Just like that, we created our computing environment.

Create Job Queue

Jobs are submitted to a job queue where they reside until they can be scheduled to run in a Compute Environment.

  • Go to AWS Batch Page
  • Click "Job queues" from the sidebar
  • Click the "Create" button
  • Fill out Create Form
    • For "Job queue configuration":

| Key | Value | Explanation |
| --- | --- | --- |
| Job queue name | (anything) | Again, up to you 👍 |
| Priority | 1 | Priority of this queue against other queues. Since we only create one, it doesn't matter. |
  • For "Connected compute environments," select the compute environment that we created before.
  • Click "Create."
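The priority field only matters when several queues feed the same compute environment: jobs from the queue with the higher number get scheduled first. A toy sketch of that idea (the real scheduler lives inside AWS Batch; the queue priorities and job names here are made up):

```python
import heapq

# Min-heap of (negated queue priority, job name): negating the priority
# makes jobs from the highest-priority queue pop first.
_pending: list[tuple[int, str]] = []

def submit(job_name: str, queue_priority: int) -> None:
    heapq.heappush(_pending, (-queue_priority, job_name))

def next_job() -> str:
    return heapq.heappop(_pending)[1]

submit("nightly-report", queue_priority=1)    # low-priority queue
submit("urgent-backfill", queue_priority=10)  # high-priority queue
```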

Create Job Definition

Job Definition is a blueprint for creating jobs.

  • Go to AWS Batch Page
  • Click "Job Definition" from the sidebar
  • Click the "Create" button
  • Fill out Create Form
    • For "Job type," Choose "Single Node" just because we don't need parallel execution.
    • For "General configuration," Enter any name you want and timeout of 300 (5 minutes).
    • For "Platform compatibility," choose Fargate with the latest version, check "Assign Public IP," and leave the execution role at the default.
    • For "Job configuration":

| Key | Value | Explanation |
| --- | --- | --- |
| Image | (your ECR image URI) | Point it to your ECR repository |
| Command syntax | Bash | Because we love Bash 💖 |
| vCPUs | 1.0 | We don't need much |
| Memory | 2GB | Leave it at the default |

And leave the rest to default.

  • Click the "Create" button

Create Job

Preparation is done, and let us test-run it.

  • Stay on the "Job Definition" Page
  • Select created job definition
  • Click on the "Submit new job" Button
  • Enter job name (again, whatever you want) and choose the job queue that we created
  • Click "Submit"

Congratulations! You have created your first AWS Batch job 🎉. Let's sit back and wait for our execution logs (hint: they are not real-time, and there are delays).
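The same submission can also be scripted with the AWS CLI (the three names below are placeholders for whatever you chose in the previous steps):

```shell
aws batch submit-job \
  --job-name my-first-job \
  --job-queue my-job-queue \
  --job-definition my-job-definition
```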



Scheduled Jobs

The last thing we need to do is set a schedule because, without it, this will be a "job," not a "cron job." 😆

Set Rule in AWS EventBridge

  • Go to AWS Eventbridge
  • Click the "Create Rule" button
    • For "Define rule detail," Enter "Name" and "Description" with anything you like. Leave "Event Bus" to default and Rule type to "Schedule."
    • For "Define schedule," Select what schedule you want to run the job. See AWS Schedule Expressions
    • For "Select target(s)," Select target types to "AWS Service" and "Select a target" to "Batch job queue." Enter your resource ARN for Job Queue and Job Definition. Enter any name you like in the Job Name.
    • Skip "Configure tags."
    • Review and create

If done correctly, it will appear in the "Rules" list.
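One gotcha with EventBridge schedules: a cron() expression has six fields (minute, hour, day-of-month, month, day-of-week, year), not the five of classic Unix cron, and rate() is the simpler alternative. A small sanity check of the two shapes (not a full validator; EventBridge does the real validation when you save the rule):

```python
import re

def looks_like_eventbridge_schedule(expr: str) -> bool:
    """Cheap shape check for EventBridge schedule expressions.

    Accepts rate(value unit) and six-field cron(...) expressions.
    """
    if re.fullmatch(r"rate\(\d+ (minute|minutes|hour|hours|day|days)\)", expr):
        return True
    m = re.fullmatch(r"cron\((.+)\)", expr)
    # AWS cron has SIX fields: minute hour day-of-month month day-of-week year
    return bool(m) and len(m.group(1).split()) == 6
```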



This time, we created a serverless job in AWS. Compared to AWS Lambda, AWS Batch needs some work to set up, but both have unique use cases, and in my opinion, no tool fits all scenarios. Knowing many tools will help us in our journey. Besides, I am happy there are so many tools we can use 😃.


Support Alvin Endratno by becoming a sponsor. Any amount is appreciated!