How to Write Serverless Python Functions? [+code snippets]

Jim Kutz
November 28, 2025

You don't need to babysit servers to run Python code anymore. When you work with serverless Python functions, you upload a small handler to AWS Lambda, Google Cloud Functions, or Azure Functions. This lets you deploy production-ready logic in minutes while paying only for the time, memory, number of requests, and data transfer your code uses. The cloud provider handles scaling, patching, and availability automatically, so you focus on the business problem instead of infrastructure.

The pay-per-use model works perfectly for lightweight APIs, file-processing jobs, scheduled automation, and other bursty workloads. This guide shows you how to write a minimal handler, package dependencies, and deploy across all three major clouds using concise, real-world code samples. 

TL;DR: Serverless Python Functions at a Glance

  • Run Python code on demand with no servers to manage, triggered by HTTP requests, file uploads, or scheduled events.
  • Pay only for execution time, making serverless ideal for bursty workloads, lightweight APIs, and automation tasks.
  • AWS Lambda, Google Cloud Functions, and Azure Functions use the same core flow: write a handler, bundle dependencies, deploy, and test the endpoint.
  • Ship dependencies by bundling them with your code or using layers, auto-installed requirements.txt files, or container images.
  • Test locally with SAM CLI, Functions Framework, or Azure Functions Core Tools to catch issues before deploying.

What Is a Serverless Python Function?

A serverless Python function is a small piece of code you deploy to a cloud service that runs only when an event (an HTTP request, a file upload, a message) triggers it. You never manage the underlying servers; the provider spins up the runtime, executes your handler, and tears everything down again. This event-driven model means you focus on business logic while the platform handles scaling, patching, and high availability.

Because you pay only for the milliseconds your code runs, serverless often costs less than maintaining always-on infrastructure, especially for spiky or unpredictable workloads. Auto-scaling happens instantly, so a single function can process one request or thousands without extra configuration. These advantages make serverless ideal for APIs, scheduled tasks, and lightweight data pipelines, where execution time is short and traffic varies. If you need long-running jobs or full OS control, traditional hosting still wins.

When your workload is bursty, latency-tolerant, and modular, serverless functions deliver faster releases and lower operational overhead than managing your own servers.

What Do You Need Before You Start Building Serverless Python Functions?

Set up your local machine and cloud accounts before writing code. This preparation prevents deployment issues and "works-on-my-laptop" surprises.

You'll need these tools installed on your development machine:

  • Python 3.x – every major provider supports modern Python; AWS Lambda currently runs Python 3.8 – 3.12, with new versions documented in the Lambda developer guide
  • AWS CLI – install with pip so you can create and invoke Lambda functions
  • Google Cloud SDK – the gcloud tool deploys and logs Cloud Functions
  • Azure Functions Core Tools – gives you the func command for local runs and pushes
# Install the three CLIs
pip install awscli
brew install --cask google-cloud-sdk      # macOS; use apt/yum on Linux
npm install -g azure-functions-core-tools@4

Authenticate each CLI after installation: aws configure, gcloud init, and az login (the last requires the Azure CLI, a separate install from Core Tools). This stores credentials locally for deployment access.
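
For reference:

aws configure     # prompts for access key, secret key, and default region
gcloud init       # pick your account, project, and region
az login          # opens a browser for Azure sign-in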

Create separate IAM users or service principals to keep API keys scoped to just what your functions need. Start every project in an isolated virtual environment (python -m venv .venv && source .venv/bin/activate) so dependency versions match the Linux runtimes your serverless platform uses.

How Do You Write a Minimal Serverless Python Function?

A serverless Python function needs just a handler that receives an event, processes it, and returns a response the platform understands.

# handler.py
def handler(event, context):
    return {
        "statusCode": 200,
        "body": "hello world",
    }

That signature works for AWS Lambda, and other clouds follow similar patterns with renamed parameters or wrapped HTTP objects. The platform injects the incoming payload into event and passes execution metadata (such as request ID or remaining memory) through context.
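
For instance, an HTTP event delivered through API Gateway carries query parameters, and the Lambda context object exposes the request ID and remaining time. A minimal sketch (field names follow the API Gateway proxy format):

import json

def handler(event, context):
    # Query parameters arrive in the event payload (API Gateway proxy format)
    params = event.get("queryStringParameters") or {}
    name = params.get("name", "world")
    # The context object carries execution metadata
    return {
        "statusCode": 200,
        "body": json.dumps({
            "message": f"hello {name}",
            "request_id": context.aws_request_id,
            "ms_remaining": context.get_remaining_time_in_millis(),
        }),
    }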

Organize your project with a clean structure so nothing accidental ships to the cloud:

  • handler.py – your function's entry point
  • requirements.txt – third-party libraries
  • src/ – additional modules, helpers, assets

For small dependencies like requests, install them into the project directory, zip everything, and upload. Heavier libraries such as NumPy need packaging in a Linux container or Lambda Layers to avoid incompatible binaries and bloated archives.

Pin exact versions in your requirements.txt. Use requests==2.31.0 instead of requests so every developer, CI job, and deployment gets identical dependencies. Version pinning prevents surprise breakages when upstream maintainers release changes.

With a handler, clean directory structure, and locked dependencies, you can zip and deploy without surprises.

How Do You Deploy a Python Function on AWS Lambda?

You can push a working Python function to AWS Lambda in minutes: write a small handler, zip the code, run one CLI command, and test it instantly. Because Lambda handles scaling and maintenance for you, the only moving pieces are your code, its dependencies, and an IAM role that grants execution rights.

1. Write the Lambda Handler

A Lambda handler is a plain Python function that accepts event and context objects, then returns a JSON-serializable value. The minimal pattern looks like this:

def handler(event, context):
    message = "Hello from Lambda!"
    return {"statusCode": 200, "body": message}

The runtime passes incoming data in event, while context exposes metadata such as request ID and remaining memory. Supported runtimes currently span Python 3.8 through 3.12, so specify a version that matches your code and libraries.

2. Create a Deployment Package

If your function uses third-party libraries, install them into a local folder and bundle everything into a ZIP file:

pip install requests -t package
cd package && zip -r ../function.zip .
cd .. && zip -g function.zip handler.py

This ZIP must contain your source files plus the dependencies unpacked at the archive root (not inside a subfolder), so Python can import them inside Lambda's ephemeral runtime.

3. Deploy Using AWS CLI

Upload the package with one command:

aws lambda create-function \
  --function-name hello-python \
  --runtime python3.12 \
  --handler handler.handler \
  --zip-file fileb://function.zip \
  --role arn:aws:iam::<account-id>:role/lambda-basic-execution

The handler parameter points to filename.function, and the role must reference an IAM role granting any resource access your code needs (such as access to S3 or CloudWatch Logs). You can also deploy through the Lambda console if you prefer a UI.
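
If that role doesn't exist yet, you can create a basic one from the CLI. A sketch (the role name is illustrative):

# Trust policy letting the Lambda service assume the role
aws iam create-role \
  --role-name lambda-basic-execution \
  --assume-role-policy-document '{"Version": "2012-10-17", "Statement": [{"Effect": "Allow", "Principal": {"Service": "lambda.amazonaws.com"}, "Action": "sts:AssumeRole"}]}'

# Attach the managed policy that grants CloudWatch Logs access
aws iam attach-role-policy \
  --role-name lambda-basic-execution \
  --policy-arn arn:aws:iam::aws:policy/service-role/AWSLambdaBasicExecutionRole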

4. Invoke the Function

Invoke the deployed function from the CLI:

aws lambda invoke --function-name hello-python out.json

The command writes the response to out.json; logs stream automatically to CloudWatch for deeper inspection. To expose the function over HTTP, attach an API Gateway trigger and use the generated endpoint in your application.
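
As a lighter-weight alternative to API Gateway, Lambda function URLs give you an HTTPS endpoint with one command. A sketch (auth-type NONE makes the endpoint public, so restrict it for real workloads):

aws lambda create-function-url-config \
  --function-name hello-python \
  --auth-type NONE
aws lambda add-permission \
  --function-name hello-python \
  --statement-id public-url \
  --action lambda:InvokeFunctionUrl \
  --principal "*" \
  --function-url-auth-type NONE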

How Do You Deploy a Python Function on Google Cloud Functions?

Moving from AWS to Google's platform requires minimal adjustments. Write a handler function, run gcloud functions deploy, and hit the generated URL. Google Cloud Functions handles the rest.

1. Write the Python Function

Create a file named main.py and add a handler that accepts an HTTP request and returns a response.

def hello_world(request):
    """Simple HTTP handler for Cloud Functions."""
    return "Hello, World!"

The request argument is a Flask-style object that lets you read headers, query parameters, and JSON bodies. Whatever you return becomes the HTTP response body.
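
A sketch of reading input from that object (the field names here are illustrative):

def hello_world(request):
    # Query string: ?name=Ada
    name = request.args.get("name", "World")
    # Optional JSON body; silent=True returns None instead of raising
    body = request.get_json(silent=True) or {}
    return f"Hello, {body.get('name', name)}!"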

2. Deploy With gcloud

From the same directory run:

gcloud functions deploy hello_world \
  --runtime python312 \
  --trigger-http \
  --allow-unauthenticated

This packages your code, uploads it, and tells Google Cloud Functions to invoke hello_world for every HTTP request. The --runtime flag chooses the Python version, --trigger-http registers an HTTP trigger, and --allow-unauthenticated makes the endpoint public. Omit it if you need IAM protection. You can swap the trigger for Cloud Storage, Pub/Sub, or scheduler events, following the same pattern common to other serverless architecture platforms.

3. Test the Function

Once deployment finishes, you'll receive a URL similar to https://REGION-PROJECT.cloudfunctions.net/hello_world. Open it in a browser or curl it and you'll see "Hello, World!". Runtime logs stream to Cloud Logging in near real time, so you can inspect output or stack traces without redeploying.
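
For example:

curl https://REGION-PROJECT.cloudfunctions.net/hello_world
# Hello, World!
gcloud functions logs read hello_world   # inspect recent log entries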

How Do You Deploy a Python Function on Azure Functions?

Azure takes a slightly different approach with its deployment model. You can deploy a Python function on Azure Functions in three phases: initialize the project, test locally, and publish to the cloud. Azure Functions Core Tools handles the entire process from your terminal.

Start by initializing the project. The CLI generator creates a function app, a parent folder containing one or more individual functions. You'll find __init__.py (the function's entry point), a minimal function.json that defines the trigger, and requirements.txt for dependencies. Azure's binding model lets you configure HTTP, queue, or timer triggers declaratively in function.json without writing boilerplate code.
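
A sketch of the init commands (app and function names are illustrative; newer Core Tools versions may generate the v2 programming model, which uses function_app.py instead of function.json):

func init hello-azure --python
cd hello-azure
func new --name hello --template "HTTP trigger" --authlevel anonymous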

import azure.functions as func

def main(req: func.HttpRequest) -> func.HttpResponse:
    return func.HttpResponse("Hello from Azure Functions")

Next, test the function locally. The same CLI spins up an emulator that mirrors the production runtime and exposes an HTTP endpoint. You can attach a debugger from Visual Studio Code and watch logs stream in real time, showing request IDs and environment variables exactly as they appear in the cloud.
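
A sketch:

func start                              # local runtime on http://localhost:7071
curl http://localhost:7071/api/hello    # invoke the function locally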

Finally, publish the function. The CLI packages your code and requirements.txt, uploads everything to your target Function App, and configures the bindings automatically. Visual Studio Code offers the same workflow through its Azure extension if you prefer a graphical interface.
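
A sketch (the app name is a placeholder for an existing Function App):

func azure functionapp publish my-function-app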

How Do You Add Dependencies to Serverless Python Functions?

A dependency that isn't bundled with your code will simply not exist at runtime, so each platform offers its own workflow for getting libraries to the cloud. Understanding these differences helps you choose the right approach for your project's complexity.

AWS

You package every third-party library alongside the handler or place them in a shared Layer. The direct approach looks like this:

pip install -r requirements.txt -t ./python
cd python && zip -r ../function.zip .
cd .. && zip -g function.zip handler.py

The resulting ZIP is uploaded with aws lambda create-function.

When several functions need the same stack (such as NumPy or Pandas), drop those libraries into a Lambda Layer and reference the layer ARN instead of re-zipping every time.
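
A sketch of building and publishing a layer (libraries go under a top-level python/ directory, which Lambda adds to the import path):

pip install -r requirements.txt -t layer/python
cd layer && zip -r ../layer.zip python && cd ..
aws lambda publish-layer-version \
  --layer-name shared-deps \
  --zip-file fileb://layer.zip \
  --compatible-runtimes python3.12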

Tools such as the serverless-python-requirements plugin automate both steps and can compile packages inside Docker so the binaries match Lambda's Linux runtime.

GCP

Google Cloud Functions reads a plain requirements.txt, builds the environment for you, and caches it per revision. You deploy with gcloud functions deploy and the service installs each pinned version before the first cold start.

Keep the file lean. Unused libraries slow builds and warm-up times. After deployment, verify everything installed by opening the Logs Explorer; any missing module error will surface immediately.

For heavier stacks consider moving to Cloud Run, but for most event functions the automatic installer is faster and cheaper.

Azure

Azure Functions follows the same requirements.txt convention and handles dependency installation during deployment. Running func azure functionapp publish <app> uploads your code, triggers a remote pip install, and stores the installed dependencies in a platform-managed directory accessible to your function.

Complex native packages can push build times past the default timeout, so split them into smaller helper functions or pre-build a custom container image when needed.

For shared libraries in Azure Functions, package your common code as a Python library and reference it in each project's requirements.txt, instead of copying packages into every project.

How Do You Handle Environment Variables and Secrets?

Serverless functions spin up fresh containers every time they run, so you need a repeatable way to pass configuration without hard-coding credentials. Environment variables keep simple values close to the code, while cloud-native secret stores handle sensitive tokens.

AWS Lambda

Add key-value pairs through the Lambda console or your infrastructure-as-code setup. The runtime exposes them through os.environ. For sensitive data, store only the Secrets Manager ARN as an environment variable, then fetch the actual secret inside your handler.

import os, json, boto3

def handler(event, context):
    secret_arn = os.getenv("DB_SECRET_ARN")
    sm = boto3.client("secretsmanager")
    creds = json.loads(sm.get_secret_value(SecretId=secret_arn)["SecretString"])
    return {"statusCode": 200, "body": f"Connected as {creds['user']}"}

GCP Cloud Functions

The gcloud functions deploy command accepts --set-env-vars KEY=VALUE, and your function retrieves them with os.getenv. Pair variables with Secret Manager references so actual secrets never appear in plaintext:

import os

def hello_world(request):
    api_key = os.getenv("API_KEY")
    return f"Key length: {len(api_key)}"

Azure Functions

Azure stores configuration under Application Settings, automatically injecting each key into os.environ. Link a setting to an Azure Key Vault secret by prefixing the value with @Microsoft.KeyVault(SecretUri=…).
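
For example, an Application Setting whose value resolves to a Key Vault secret looks like this (vault and secret names are placeholders):

SQL_CONN_STR = @Microsoft.KeyVault(SecretUri=https://my-vault.vault.azure.net/secrets/SqlConnStr/)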

import os
import azure.functions as func

def main(req: func.HttpRequest) -> func.HttpResponse:
    conn = os.getenv("SQL_CONN_STR")
    return func.HttpResponse(f"Using: {conn[:10]}…")

How Do You Test Serverless Python Functions Locally?

You don't need to push every change to the cloud. Local emulators let you run, debug, and iterate on Python functions straight from your laptop.

AWS offers the SAM CLI, which runs a Docker container that replicates the Lambda runtime. Google provides Functions Framework, and Azure ships Functions Core Tools. Pair these with full-stack emulators like LocalStack's AWS service replica or the Serverless Framework's testing suite to mock S3 events, API Gateway calls, or DynamoDB streams.

Local testing saves money, shortens feedback loops, and gives you full step-through debugging. Live platforms can't match this capability. It also catches IAM mistakes and missing environment variables before they become production outages.
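
For Google Cloud, the Functions Framework gives you the same loop. A minimal sketch, assuming the hello_world handler from earlier lives in main.py:

pip install functions-framework
functions-framework --target hello_world --port 8080   # serves main.py locally
curl localhost:8080                                    # -> Hello, World!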

Here's a quick SAM workflow you can set up in minutes:

sam init --runtime python3.11 --name hello-sam
cd hello-sam
sam build                        # bundles code and dependencies
sam local invoke "HelloWorldFunction" \
  -e events/event.json           # feeds a test event into the function

The same event file works in CI pipelines, giving you a consistent test harness from laptop to deployment.

What Are Common Mistakes to Avoid?

Serverless functions punish sloppy details. I've watched teams burn hours and budgets on avoidable errors. Keep an eye on these traps:

  • Packaging dependencies on your laptop: skip local builds; use a Linux container or a plugin such as serverless-python-requirements to avoid incompatible binaries
  • Selecting an unsupported Python runtime: double-check the list of supported versions so you don't ship code that won't start
  • Hard-coding secrets: store credentials in a secrets manager and pull them at runtime
  • Ignoring cold starts: large deployment packages and heavyweight frameworks slow first invocation. Trim what you ship
  • Writing one giant function: smaller, single-purpose handlers scale faster and isolate failures
  • Skipping exception handling: uncaught errors vanish into logs, leaving you guessing. Wrap calls and return meaningful errors
  • Forgetting timeouts and memory limits: tune both or your function may silently terminate mid-work
  • Relying on print statements: structured logging with correlation IDs is the best practice for debugging production issues, since it enables efficient querying and tracing; print output is still captured reliably, but JSON logs are far easier to search (see the sketch after this list)
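
As a sketch of that last point, emitting one JSON object per invocation makes logs queryable in CloudWatch, Cloud Logging, or Application Insights (the logged fields here are illustrative):

import json
import logging

logger = logging.getLogger()
logger.setLevel(logging.INFO)

def handler(event, context):
    # One structured line per invocation, keyed by request ID
    logger.info(json.dumps({
        "request_id": context.aws_request_id,
        "event_keys": sorted(event.keys()),
    }))
    return {"statusCode": 200, "body": "ok"}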

What Should You Do Next?

Serverless Python functions let you ship features faster without managing servers. Once your handler runs, the rest comes down to discipline: keep functions small, pin dependencies, handle exceptions cleanly, and watch your logs so you can fix issues before users notice. As your system grows, split work into separate functions instead of expanding one large file. Smaller units deploy faster, scale better, and roll back more safely.

Serverless also fits naturally into the rest of your stack. When you need to run data operations, you can use the PyAirbyte Python SDK inside your functions. PyAirbyte lets you read from sources, write to destinations, or trigger syncs with only a few lines of Python, and it works across AWS, GCP, and Azure.
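
A minimal sketch of that pattern inside a handler, using PyAirbyte's demo source-faker connector (the config keys and stream handling are illustrative; check the PyAirbyte docs for your connector's options):

import airbyte as ab

def handler(event, context):
    # Configure a source connector; PyAirbyte can install it on first use
    source = ab.get_source("source-faker", config={"count": 100})
    source.check()                 # validate config and connectivity
    source.select_all_streams()    # read every stream the source offers
    result = source.read()         # records are cached locally by default
    return {"statusCode": 200, "body": f"Streams read: {list(result.streams)}"}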

From here, automate deployments with SAM, CDK, or the Serverless Framework. Connect your functions to event sources and hook them into downstream systems as your architecture grows. Keep handlers focused and packages lean, and you will ship reliable serverless code with far less operational overhead.

Ready to try this pattern in your own stack? Try Airbyte for free, spin up PyAirbyte, and connect your first source in a few minutes.

Frequently Asked Questions

How long can a serverless Python function run?

Each platform enforces maximum execution times. AWS Lambda allows up to 15 minutes, Google Cloud Functions caps event-driven functions at 9 minutes (HTTP-triggered functions can run up to 60 minutes on 2nd gen), and Azure Functions runs up to 10 minutes on the Consumption plan. If your workload exceeds these limits, consider breaking it into smaller functions, using Step Functions or a similar orchestrator, or switching to container-based services like AWS Fargate or Cloud Run.

What causes cold starts and how do I reduce them?

Cold starts occur when the platform provisions a new container to run your function after a period of inactivity. The delay depends on your deployment package size, runtime initialization, and any imports or connections your code makes at startup. Reduce cold starts by keeping packages small, lazy-loading heavy libraries, using provisioned concurrency (AWS), min instances (GCP), or pre-warmed instances (Azure), and avoiding unnecessary imports in your handler's global scope.

Can I use serverless functions for machine learning inference?

Yes, but with constraints. Serverless works well for lightweight models or when calling external ML APIs. For larger models, package size limits (250 MB unzipped on Lambda) and cold start latency become problematic. AWS offers container image support up to 10 GB, and all platforms support calling managed ML endpoints. For latency-sensitive inference with large models, consider dedicated inference endpoints or container services instead.

How do I debug a serverless function that fails in production?

Start with the platform's native logging: CloudWatch Logs for Lambda, Cloud Logging for GCP, and Application Insights for Azure. Add structured logging with correlation IDs to trace requests across function invocations. Use distributed tracing tools like AWS X-Ray, Google Cloud Trace, or Azure Monitor to visualize request flows. For hard-to-reproduce issues, enable detailed error reporting and consider deploying a canary version that logs additional context.
