Serverless Functions, Made Simple.

Serverless on your terms — stable, portable, built for production.

Deploy code to Kubernetes with full control, portability, and commercial support.

Run anywhere

Deploy your functions on-premises or in the cloud, with portable OCI images.

Any code

Write functions in any language, and bring your existing microservices along too.

Any scale

Pro features scale your functions to meet demand, and down to zero when idle.

Run your code anywhere with the same unified experience.

Deploy OpenFaaS to any Kubernetes cluster. Write a new function and ship it to production in minutes, knowing it will scale to meet demand.

OpenFaaS Pro adds autoscaling, event connectors, monitoring dashboards, SSO, RBAC, and direct-to-engineering support.

Templates that just work

Get code into production within minutes using our templates, or create your own.

Efficient scaling

Your functions can be fine-tuned to scale to match the type of traffic they receive, including to zero to save on costs.
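As an illustrative sketch, this tuning is done per function with labels in its stack.yml. The label names below are the OpenFaaS Pro scaling labels; the function name and image are placeholders:

```yaml
functions:
  resize-image:                          # hypothetical function name
    image: ghcr.io/example/resize-image:latest
    labels:
      com.openfaas.scale.min: "1"        # minimum replicas to keep warm
      com.openfaas.scale.max: "10"       # cap replicas under load
      com.openfaas.scale.zero: "true"    # allow scale to zero (Pro)
      com.openfaas.scale.zero-duration: "15m"  # idle period before scaling down
```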

Event-driven workloads

Invoke functions through events from Apache Kafka, AWS SQS/SNS, GCP Pub/Sub, RabbitMQ, PostgreSQL, Cron, and MQTT.
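As a sketch, event connectors bind functions to topics through annotations in stack.yml. The topic, function name, and image below are placeholder assumptions:

```yaml
functions:
  process-payment:                  # hypothetical function name
    image: ghcr.io/example/process-payment:latest
    annotations:
      topic: payment.received       # topic the connector subscribes this function to
```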

How teams use OpenFaaS

A sandbox for your customers' code

The challenge

You want customers to extend your product with custom code — but running untrusted code safely is hard to get right.

With OpenFaaS

Turn code snippets into isolated functions, built and deployed through a REST API. Each tenant runs in its own namespace with security boundaries enforced at every layer.
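As a minimal sketch of what that looks like, a tenant's function can be created through the gateway's REST API, with the namespace field providing per-tenant isolation. The gateway URL, image, and namespace here are placeholder assumptions, and real deployments also need the gateway's credentials:

```python
import json
from urllib import request

GATEWAY = "http://127.0.0.1:8080"  # assumption: default local gateway address

def deploy_request(service: str, image: str, namespace: str) -> request.Request:
    """Build (but don't send) a deployment request for the OpenFaaS gateway.

    POST /system/functions creates a function; the namespace field keeps
    each tenant's functions isolated from the others.
    """
    body = json.dumps({
        "service": service,
        "image": image,
        "namespace": namespace,
    }).encode()
    return request.Request(
        f"{GATEWAY}/system/functions",
        data=body,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = deploy_request("tenant-fn", "ghcr.io/example/tenant-fn:latest", "tenant-a")
```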

Used in production by Waylay (IoT), LivePerson, Cognite (ML/data-science), and RudderStack.

Add a FaaS capability to your product →

Build a function editor for your customers →

ETL and data pipelines

The challenge

You're processing large volumes of data but managing the infrastructure for parallel execution, retries, and scaling is a project in itself.

With OpenFaaS

Fan out to thousands of parallel function executions that scale automatically. Batch jobs and ML models run as functions with built-in retries — whether they take milliseconds or hours.
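A minimal Python sketch of fanning out work, assuming the default gateway address: requests sent to `/async-function/<name>` are queued and acknowledged immediately, and an optional `X-Callback-Url` header receives each result later. This only constructs the requests rather than sending them, and the function name is a placeholder:

```python
from urllib import request

GATEWAY = "http://127.0.0.1:8080"  # assumption: default local gateway address

def async_invocation(function: str, payload: bytes, callback_url=None) -> request.Request:
    """Build (but don't send) an async invocation for the OpenFaaS gateway.

    The gateway queues POSTs to /async-function/<name> and replies with
    202 Accepted; results can be delivered to an X-Callback-Url.
    """
    req = request.Request(
        f"{GATEWAY}/async-function/{function}",
        data=payload,
        method="POST",
    )
    if callback_url:
        req.add_header("X-Callback-Url", callback_url)
    return req

# Fan out one queued invocation per work item
reqs = [async_invocation("resize-image", f"item-{i}".encode()) for i in range(1000)]
```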

Surge runs async jobs at scale — importing financial and government data to power background checks for customers through Salesforce.

Queue-based scaling with OpenFaaS →

Kubernetes, made usable

The challenge

Kubernetes has the building blocks, but wiring up auto-scaling, queuing, retries, and event triggers from scratch takes months of work.

With OpenFaaS

Get all of that out of the box. Run functions on-demand or on a schedule, trigger them from Kafka or AWS, and benefit from async retries — no extra code. Teams ship to production in hours, not weeks.
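As a sketch, running a function on a schedule is a matter of annotations in stack.yml read by the cron-connector; the function name, image, and schedule below are placeholders:

```yaml
functions:
  nightly-report:                  # hypothetical function name
    image: ghcr.io/example/nightly-report:latest
    annotations:
      topic: cron-function         # subscribe to the cron-connector
      schedule: "0 2 * * *"        # run at 02:00 every day
```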

Teams migrating from AWS Lambda to Kubernetes use OpenFaaS to keep the scaling and developer experience they had before.

Integrate with OpenFaaS →

PDF generation at scale →

Dedicated installs of your SaaS

The challenge

Your SaaS product needs to run on your cloud, your customer's cloud, and on-premises — with the same experience everywhere.

With OpenFaaS

Functions are portable OCI images that run anywhere Kubernetes does. Ship the same code to every customer environment without vendor-specific wrappers or per-cloud rewrites.

  • Corva deploys AI analytics into oil and gas companies including Baker Hughes and Nabors.
  • Waylay built their multi-cloud low-code platform on OpenFaaS to ship across customer environments.

Low-code automation with OpenFaaS →

Write in any language. Bring your frameworks too.

Use official templates for Node.js, Python, Go, Java, C#, Ruby, and PHP — or create your own.

Already have microservices built with Express.js, Flask, FastAPI, Django, or ASP.NET Core? Deploy them to OpenFaaS as-is and stop managing Kubernetes by hand.

Browse the official templates →

$ faas-cli new --lang java11 java-fn

Handler.java
package com.openfaas.function;

import com.openfaas.model.IHandler;
import com.openfaas.model.IResponse;
import com.openfaas.model.IRequest;
import com.openfaas.model.Response;

public class Handler implements IHandler {

    public IResponse Handle(IRequest req) {
        Response res = new Response();
        res.setBody("Hello, world!");

        return res;
    }
}
$ faas-cli template store pull golang-middleware
$ faas-cli new --lang golang-middleware go-fn
handler.go
package function

import (
	"fmt"
	"io"
	"net/http"
)

func Handle(w http.ResponseWriter, r *http.Request) {
	var input []byte

	if r.Body != nil {
		defer r.Body.Close()
		body, _ := io.ReadAll(r.Body)
		input = body
	}

	w.WriteHeader(http.StatusOK)
	w.Write([]byte(fmt.Sprintf("Body: %s", string(input))))
}
$ faas-cli template store pull python3-http
$ faas-cli new --lang python3-http python3-fn
handler.py
def handle(event, context):
    return {
        "statusCode": 200,
        "body": "Hello from OpenFaaS!"
    }
$ faas-cli new --lang node22 javascript-fn
handler.js
"use strict"

module.exports = async (event, context) => {
  const result = {
    status: "Received input: " + JSON.stringify(event.body)
  };

  return context
    .status(200)
    .succeed(result);
}
$ faas-cli template store pull bash-streaming
$ faas-cli new --lang bash-streaming bash-fn
handler.sh
#!/bin/sh
for i in $(seq 1 100)
do
    sleep 0.001
    echo "Hello" $i
done
$ faas-cli new --lang dockerfile ruby
Dockerfile
FROM ruby:3.3-alpine

WORKDIR /home/app
COPY    .   .

RUN bundle install

EXPOSE 8080

CMD ["ruby", "main.rb"]

Trusted in production

Start your Serverless Journey

Get started with Python

Build and scale your first Python function with a step-by-step walkthrough.

READ TUTORIAL

Serverless For Everyone Else

Alex's hands-on eBook for building and deploying functions with Node.js.

GET THE EBOOK

Run OpenFaaS in Production

Deploy functions to production with autoscaling, SSO/IAM, and GitOps.

VIEW PRICING