Can you self-host Langflow in 2026?

Quick Answer: Yes. As of April 2026, Langflow is open source under the MIT license and can be self-hosted via Docker, Docker Compose, or Kubernetes (official Helm chart). Production deployments use Postgres for the metadata database and a persistent volume for flow storage.

Self-Hosting Langflow

Langflow is open source under the MIT license, and self-hosting is the most common deployment option for teams that want full control over data and models.

Docker (Single Host)

The fastest path is the official Docker image:

docker run -d \
  -p 7860:7860 \
  -v langflow_data:/app/data \
  -e LANGFLOW_DATABASE_URL=postgresql://user:pass@host:5432/langflow \
  langflowai/langflow:latest

For development, the SQLite default is fine. For production, configure a Postgres database via LANGFLOW_DATABASE_URL, persist the data volume, and pin a specific image tag rather than :latest so upgrades are deliberate.
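As a concrete illustration of the connection-string format, the URL can be assembled from its parts. All values below are placeholders; substitute your own database credentials:

```shell
# Build a Postgres connection URL for LANGFLOW_DATABASE_URL (placeholder values)
DB_USER=langflow
DB_PASS='s3cret'
DB_HOST=db.internal
DB_NAME=langflow
export LANGFLOW_DATABASE_URL="postgresql://${DB_USER}:${DB_PASS}@${DB_HOST}:5432/${DB_NAME}"
echo "$LANGFLOW_DATABASE_URL"
```

Keeping the parts in separate variables makes it easy to source credentials from a secrets manager instead of hard-coding them.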

Docker Compose

A docker-compose.yml that runs Langflow plus Postgres in one stack is published in the official repo. This suits single-VM deployments with up to roughly 50 concurrent users.
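A minimal sketch of such a stack is shown below. Service names, image tags, and credentials are placeholders; prefer the file published in the official repo as your starting point:

```yaml
services:
  langflow:
    image: langflowai/langflow:latest
    ports:
      - "7860:7860"
    environment:
      # Point Langflow at the Postgres service below (placeholder credentials)
      - LANGFLOW_DATABASE_URL=postgresql://langflow:langflow@postgres:5432/langflow
    volumes:
      - langflow_data:/app/data
    depends_on:
      - postgres

  postgres:
    image: postgres:16
    environment:
      - POSTGRES_USER=langflow
      - POSTGRES_PASSWORD=langflow
      - POSTGRES_DB=langflow
    volumes:
      - pg_data:/var/lib/postgresql/data

volumes:
  langflow_data:
  pg_data:
```

Both services use named volumes, so flows and database state survive `docker compose down` and image upgrades.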

Kubernetes

The official Helm chart (langflow-ai/langflow-helm-charts) deploys Langflow with autoscaling, ingress, and a managed Postgres reference. Deployments beyond the single-VM range (roughly 50+ concurrent users) typically run on Kubernetes for resilience and zero-downtime upgrades.
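A hedged install sketch follows. The repo URL and the chart name are taken from the langflow-helm-charts project; verify both against the chart's README for your version before running:

```shell
# Add the Langflow Helm repo and install the IDE chart into its own namespace.
# The chart name "langflow-ide" is an assumption -- confirm with:
#   helm search repo langflow
helm repo add langflow https://langflow-ai.github.io/langflow-helm-charts
helm repo update
helm install langflow langflow/langflow-ide \
  --namespace langflow \
  --create-namespace
```

Database and ingress settings are configured through the chart's values file rather than on the command line.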

Configuration Notes

  • Authentication — Set LANGFLOW_AUTO_LOGIN=false and LANGFLOW_SUPERUSER / LANGFLOW_SUPERUSER_PASSWORD to enable login. Workspace API keys are managed in-app.
  • Secrets — LLM provider API keys live in environment variables or in Kubernetes secrets, not in flows.
  • Telemetry — Set DO_NOT_TRACK=true to disable anonymous usage telemetry.
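Taken together, a production environment might export the variables from the notes above. The variable names come from the bullets; every value here is a placeholder:

```shell
# Production-style Langflow environment (placeholder values)
export LANGFLOW_AUTO_LOGIN=false               # require login instead of auto-login
export LANGFLOW_SUPERUSER=admin                # initial admin account
export LANGFLOW_SUPERUSER_PASSWORD='change-me' # rotate via your secrets manager
export DO_NOT_TRACK=true                       # disable anonymous usage telemetry
# Provider keys belong in the environment (or Kubernetes secrets), not in flows
export OPENAI_API_KEY='sk-...'                 # example provider secret, placeholder
```

In Kubernetes, the same variables would be injected from a Secret via `envFrom` rather than exported in a shell.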

When to Use Managed

DataStax operates a managed Langflow service with Astra DB integration. It suits teams that want vector storage and Langflow operated together. Self-hosting remains free; managed pricing follows Astra DB consumption.

By Rafal Fila