LocalStack: Simulate AWS S3, Lambda, DynamoDB Locally — Completely Free

Development tutorial - IT technology blog

2 AM, the CI/CD pipeline goes red. A Lambda function deployed to staging throws a ResourceNotFoundException — the DynamoDB table doesn’t exist right when you need it most. I sat there debugging, tracing logs, and eventually realized: our entire team’s test workflow was hitting a real AWS dev account directly. Every test run was a prayer that the account wouldn’t get rate-limited or rack up a surprise bill.

That’s when I started taking LocalStack seriously.

Three Ways to Develop AWS Applications — and Why the First Two Fall Short

Before diving into LocalStack, let’s walk through the alternatives to understand why it’s become so widely adopted.

Option 1: Use Real AWS (dev account)

Most teams start here. Create a dedicated AWS account for development, and have developers connect to it directly. Sounds reasonable, but the reality is messier:

  • Cost: even in a dev environment, Lambda invocations + S3 storage + DynamoDB reads/writes add up. A team of 5 running 500 integration tests per day can easily see $30–50 on the monthly bill just for the dev environment.
  • Internet dependency: no network means no testing — I’ve been stuck at an airport for 3 hours unable to get anything done.
  • Conflicts: two developers running tests in parallel → DynamoDB state collisions, flaky tests with no obvious cause.
  • Cleanup: someone forgets to delete an S3 bucket after testing, and stale data accumulates over time.

Option 2: Mock with libraries (moto, unittest.mock)

Mocking AWS calls in Python with moto is a fairly popular choice. I’ve used moto on a project and it works well for pure unit tests. The problems only surface as the codebase grows.

Moto operates at the Python level, not the network level. Boto3 calling S3? Moto handles it. But if you have a Node.js or Go service that also needs to call the same S3 bucket in an integration test — moto can’t help. No matter how good your mocks are, they can’t replace a real HTTP endpoint that every language can call into.

Option 3: LocalStack — Simulate the Entire AWS Stack at the Network Level

LocalStack runs as a Docker container, exposing HTTP endpoints that emulate a wide range of AWS services. Any SDK (Python boto3, the Node.js AWS SDK, the AWS CLI) just needs to point its endpoint at http://localhost:4566 to interact with it as if it were real AWS.

Fully offline, zero cost, no conflicts between developers, and a container restart gives you a clean slate. That’s what LocalStack delivers.

It’s not perfect, of course: some complex services like Cognito, EKS, or certain Lambda runtimes require LocalStack Pro (the paid tier). The free tier covers S3, Lambda, DynamoDB, SQS, SNS, and API Gateway — enough for 80% of real-world use cases.

Set Up LocalStack in 5 Minutes

Prerequisites

  • Docker installed and running
  • Python 3.8+ (if using awscli-local)
  • AWS CLI (for terminal interaction)

Run LocalStack with Docker Compose

Create a docker-compose.yml file in your project. If you’re already familiar with containerized deployments, the setup will feel natural — similar to how blue-green deployment with Docker Compose works for production environments:

version: '3.8'
services:
  localstack:
    image: localstack/localstack:latest
    ports:
      - "4566:4566"
    environment:
      - SERVICES=s3,lambda,dynamodb,sqs
      - DEBUG=0
      - LAMBDA_EXECUTOR=docker   # ignored by LocalStack v2+, which runs Lambda in Docker by default
      - DOCKER_HOST=unix:///var/run/docker.sock
    volumes:
      - "/var/run/docker.sock:/var/run/docker.sock"
      - "./localstack-data:/var/lib/localstack"
Start the container and check that it is healthy:

docker-compose up -d
# Check if LocalStack is running
curl http://localhost:4566/_localstack/health

The JSON response will list the services currently running.

Install awslocal — a Handy Wrapper for the AWS CLI

pip install awscli-local
# awslocal = aws --endpoint-url=http://localhost:4566 --region us-east-1

Hands-On: S3, DynamoDB, and Lambda with LocalStack

S3 — Upload and Download Files

# Create a bucket
awslocal s3 mb s3://my-test-bucket

# Upload a file
echo "Hello LocalStack" > test.txt
awslocal s3 cp test.txt s3://my-test-bucket/

# List objects
awslocal s3 ls s3://my-test-bucket/

# Download it back
awslocal s3 cp s3://my-test-bucket/test.txt downloaded.txt

In Python with boto3:

import boto3

# Point the endpoint to LocalStack
s3 = boto3.client(
    's3',
    endpoint_url='http://localhost:4566',
    region_name='us-east-1',
    aws_access_key_id='test',      # LocalStack accepts any value here
    aws_secret_access_key='test'
)

# Upload
s3.put_object(
    Bucket='my-test-bucket',
    Key='data/config.json',
    Body=b'{"env": "local"}'
)

# Download
response = s3.get_object(Bucket='my-test-bucket', Key='data/config.json')
print(response['Body'].read().decode('utf-8'))

DynamoDB — Create a Table and Perform CRUD

# Create a table
awslocal dynamodb create-table \
  --table-name Users \
  --attribute-definitions AttributeName=userId,AttributeType=S \
  --key-schema AttributeName=userId,KeyType=HASH \
  --billing-mode PAY_PER_REQUEST

# Add an item
awslocal dynamodb put-item \
  --table-name Users \
  --item '{"userId": {"S": "u001"}, "name": {"S": "Nguyen Van A"}, "email": {"S": "[email protected]"}}'

# Query
awslocal dynamodb get-item \
  --table-name Users \
  --key '{"userId": {"S": "u001"}}'

Lambda — Deploy and Invoke a Function

Create a handler.py file:

import json

def lambda_handler(event, context):
    name = event.get('name', 'World')
    return {
        'statusCode': 200,
        'body': json.dumps({'message': f'Hello, {name}!'})
    }
Package and deploy it from the shell:

# Package it
zip function.zip handler.py

# Deploy to LocalStack
awslocal lambda create-function \
  --function-name hello-function \
  --runtime python3.11 \
  --handler handler.lambda_handler \
  --role arn:aws:iam::000000000000:role/lambda-role \
  --zip-file fileb://function.zip

# Invoke
awslocal lambda invoke \
  --function-name hello-function \
  --payload '{"name": "LocalStack"}' \
  --cli-binary-format raw-in-base64-out \
  output.json

cat output.json
# {"statusCode": 200, "body": "{\"message\": \"Hello, LocalStack!\"}"}  

Integrating with pytest — The Part Most People Skip

Running LocalStack manually without pulling it into your test pipeline only solves half the problem. Here’s how to set up pytest fixtures that automatically provision resources before each test — no manual setup required. For a broader look at Python testing strategies, this guide to unit tests with pytest covers the fundamentals worth knowing before going deeper:

# conftest.py
import boto3
import pytest

ENDPOINT = 'http://localhost:4566'
REGION = 'us-east-1'
CREDS = {'aws_access_key_id': 'test', 'aws_secret_access_key': 'test'}

@pytest.fixture(scope='session')
def s3_client():
    return boto3.client('s3', endpoint_url=ENDPOINT, region_name=REGION, **CREDS)

@pytest.fixture(scope='session')
def dynamodb_client():
    return boto3.client('dynamodb', endpoint_url=ENDPOINT, region_name=REGION, **CREDS)

@pytest.fixture(autouse=True)
def setup_s3_bucket(s3_client):
    """Create a fresh bucket before each test, delete it after"""
    bucket = 'test-bucket'
    s3_client.create_bucket(Bucket=bucket)
    yield bucket
    # Cleanup: delete all objects, then delete the bucket
    objects = s3_client.list_objects_v2(Bucket=bucket).get('Contents', [])
    for obj in objects:
        s3_client.delete_object(Bucket=bucket, Key=obj['Key'])
    s3_client.delete_bucket(Bucket=bucket)

# test_s3.py
def test_upload_and_retrieve(s3_client, setup_s3_bucket):
    bucket = setup_s3_bucket
    s3_client.put_object(Bucket=bucket, Key='test.txt', Body=b'content')
    response = s3_client.get_object(Bucket=bucket, Key='test.txt')
    assert response['Body'].read() == b'content'

Run the tests with LocalStack running in the background:

pytest tests/ -v

Things to Know Before Using It for Real

Data doesn’t persist by default. Restarting the container wipes everything: S3 buckets, DynamoDB tables, and Lambda functions all reset to zero. If you need to retain state between runs, mount a volume as shown in the docker-compose example above and enable persistence; note that in recent LocalStack versions, snapshot persistence is a Pro feature.

Lambda cold starts are slower than usual. LocalStack uses Docker-in-Docker to run Lambda, so the first cold start can take 10–30 seconds. Subsequent invocations are much faster. This trade-off is entirely acceptable compared to deploying to real AWS just to run a test.

Use environment variables to switch endpoints. Hardcoding http://localhost:4566 in your code is the fastest way to introduce a hard-to-find bug at deploy time. Do this instead:

import os

import boto3

endpoint = os.environ.get('AWS_ENDPOINT_URL')  # unset means use real AWS
s3 = boto3.client('s3', endpoint_url=endpoint)

When running locally, launch the app with AWS_ENDPOINT_URL=http://localhost:4566 python app.py. When deploying to staging or production, leave the variable unset; boto3 connects to real AWS automatically, with no code changes needed.

It works with GitHub Actions too. Add the localstack/setup-localstack action to your workflow and you’re done. Integration tests run entirely offline inside the runner, with no dependency on any AWS account. If you haven’t set up CI/CD pipelines yet, this beginner’s guide to GitHub Actions is a solid starting point.
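If you prefer not to add another action, a plain GitHub Actions service container works too. A minimal workflow sketch (the file name, job name, and dependency list are illustrative):

```yaml
# .github/workflows/integration-tests.yml
name: integration-tests
on: [push]

jobs:
  test:
    runs-on: ubuntu-latest
    services:
      localstack:
        image: localstack/localstack
        ports:
          - 4566:4566
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-python@v5
        with:
          python-version: '3.11'
      - run: pip install boto3 pytest
      # Tests hit http://localhost:4566 exposed by the service container
      - run: pytest tests/ -v
```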

Wrapping Up

After that 2 AM incident, I migrated our entire team’s integration test suite to LocalStack. The CI/CD pipeline got noticeably faster. The AWS dev account bill dropped by nearly 70%. More importantly — developers can test fully offline, whether they’re on a train or at a café with spotty wifi.

LocalStack doesn’t solve everything. Complex IAM policies, service-specific edge-case behavior — those still need to be tested against real AWS before a release. But for day-to-day work: writing features, debugging logic, running integration tests — this is the most pragmatic solution I know. For teams going further with cloud infrastructure automation, automating AWS infrastructure with Terraform pairs naturally with LocalStack as your local testing layer.
