How to Install FlowiseAI with Docker Compose

Learn how to install FlowiseAI with Docker Compose and a Postgres database, and take advantage of no-code AI flows.


FlowiseAI is an open-source platform (v3.0+) for building and deploying custom AI workflows with a drag-and-drop interface. Recent additions include Agentflows for multi-step autonomous agents, Ollama Cloud integration, API key permissions for access control, and improved security through input and MIME type validation. It’s built on Node.js and React, so it’s straightforward to extend.

In this tutorial, we are going to see how easy it is to host FlowiseAI on your VPS with Docker Compose and secure it with an SSL certificate. I will also include an option to back up the database with Docker DB Backup, so you have SQL dumps in case something goes wrong and you can’t use the volumes.

We are going to use Dockge to manage the Docker Compose file, and CloudFlare Tunnels as the reverse proxy. You can also deploy with Docker Compose directly, as it works the same with whatever reverse proxy you prefer.

If you are interested in free, cool, open-source self-hosted apps, you can check the toolhunt.net self-hosted section.

How to Install FlowiseAI with Docker and Docker Compose

If you want to monitor server resources like CPU, memory, and disk space, check out: How To Monitor Server and Docker Resources

1. Prerequisites

Before you begin, make sure you have the following prerequisites in place:

  - A VPS or server with Docker and Docker Compose installed
  - Dockge (optional) to manage your Compose stacks
  - A domain name and a CloudFlare account with a tunnel created, for SSL and domain access

You can also use Traefik as a reverse proxy for your apps. I have created a full tutorial, including a Dockge install to manage your containers, on: How to Use Traefik as A Reverse Proxy in Docker

With all of this in place, you are ready to move on to the next step and add the containers in Dockge.

2. Create Docker Compose File

The first step is to create a Docker Compose file that defines the services required to run FlowiseAI. Here’s an example docker-compose.yml file:

services:
  flowise-db:
    image: postgres:16-alpine
    hostname: flowise-db
    environment:
      POSTGRES_DB: ${POSTGRES_DB}
      POSTGRES_USER: ${POSTGRES_USER}
      POSTGRES_PASSWORD: ${POSTGRES_PASSWORD}
    volumes:
      - ./flowise-db-data:/var/lib/postgresql/data
    restart: unless-stopped
    healthcheck:
      test: ["CMD-SHELL", "pg_isready -U ${POSTGRES_USER} -d ${POSTGRES_DB}"]
      interval: 5s
      timeout: 5s
      retries: 5

  flowise:
    image: flowiseai/flowise:latest
    container_name: flowiseai
    hostname: flowise
    healthcheck:
      test: wget --no-verbose --tries=1 --spider http://localhost:${PORT}
    ports:
      - 5023:${PORT}
    volumes:
      - ./flowiseai:/root/.flowise
    environment:
      DEBUG: false
      PORT: ${PORT}
      FLOWISE_USERNAME: ${FLOWISE_USERNAME}
      FLOWISE_PASSWORD: ${FLOWISE_PASSWORD}
      APIKEY_PATH: /root/.flowise
      SECRETKEY_PATH: /root/.flowise
      LOG_LEVEL: info
      LOG_PATH: /root/.flowise/logs
      DATABASE_TYPE: postgres
      DATABASE_PORT: 5432
      DATABASE_HOST: flowise-db
      DATABASE_NAME: ${POSTGRES_DB}
      DATABASE_USER: ${POSTGRES_USER}
      DATABASE_PASSWORD: ${POSTGRES_PASSWORD}
    restart: on-failure:5
    depends_on:
      flowise-db:
        condition: service_healthy
    entrypoint: /bin/sh -c "sleep 3; flowise start"

This Compose file defines two services:

  1. flowise-db: A PostgreSQL database service used by FlowiseAI to store data.
  2. flowise: The FlowiseAI service itself, which depends on the flowise-db service.

The flowise service uses the official flowiseai/flowise:latest Docker image and listens on the port set by the PORT environment variable (3000 in our .env file), which the Compose file maps to port 5023 on the host. It also mounts a volume at /root/.flowise to persist data across container restarts.

The Compose file also includes a healthcheck for the flowise service, which verifies that the FlowiseAI application is up by requesting http://localhost:${PORT} inside the container.
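Once the stack is up, you can confirm from the host that the healthchecks are passing (the container name `flowiseai` comes from the `container_name` in the Compose file above):

```shell
# Show the health status Docker reports for the FlowiseAI container
# (should print "healthy" once the app has started)
docker inspect --format '{{.State.Health.Status}}' flowiseai

# List all flowise-related containers with their status
docker ps --filter "name=flowise" --format '{{.Names}}: {{.Status}}'
```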

If you want to include a backup solution for the PostgreSQL database, you can add a third service to the Compose file:

  flowise-db-backup:
    container_name: flowise-db-backup
    image: tiredofit/db-backup
    volumes:
      - ./backups:/backup
    environment:
      DB_TYPE: postgres
      DB_HOST: flowise-db
      DB_NAME: ${POSTGRES_DB}
      DB_USER: ${POSTGRES_USER}
      DB_PASS: ${POSTGRES_PASSWORD}
      DB_BACKUP_INTERVAL: 720
      DB_CLEANUP_TIME: 72000
      CHECKSUM: SHA1
      COMPRESSION: GZ
      CONTAINER_ENABLE_MONITORING: false
    depends_on:
      flowise-db:
        condition: service_healthy
    restart: unless-stopped

This service uses the tiredofit/db-backup image to create regular backups of the PostgreSQL database. The backups are stored in the ./backups directory on the host machine.
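Backups then run on the configured interval, and the tiredofit/db-backup image also ships a helper script for on-demand dumps. A quick sketch, assuming the container name from the snippet above:

```shell
# Trigger an immediate backup inside the running backup container
docker exec flowise-db-backup backup-now

# List the dumps written to the host directory mounted at /backup
ls -lh ./backups
```

This is handy to run once right after deployment, so you know the backup pipeline works before you need it.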

3. Create .env file with credentials

Next, create a .env file in the same directory as your docker-compose.yml file and add the required environment variables:

PORT=3000
POSTGRES_USER='user'
POSTGRES_PASSWORD='pass'
POSTGRES_DB='flowise'
FLOWISE_USERNAME=bitdoze
FLOWISE_PASSWORD=bitdoze

Replace the values with your desired credentials and database name.
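Rather than inventing passwords by hand, you can generate random ones on the server, for example with openssl:

```shell
# Generate a random 24-byte password, base64-encoded (about 32 characters)
openssl rand -base64 24
```

Paste the output into POSTGRES_PASSWORD and FLOWISE_PASSWORD in the .env file.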

4. Deploy FlowiseAI

With the Docker Compose file and the .env file in place, you can now deploy FlowiseAI using Docker Compose:

docker compose up -d

This command will start the services defined in the Compose file in detached mode (running in the background).

To check if the containers are running, use the following command:

docker ps

You should see the flowise and flowise-db containers listed as running.
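If a container is missing or keeps restarting, the logs usually show why (for example, a wrong database password in the .env file):

```shell
# Follow the FlowiseAI service logs; press Ctrl+C to stop
docker compose logs -f flowise

# Check the database logs too if Flowise cannot connect
docker compose logs flowise-db
```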

5. Configure the CloudFlare Tunnels for SSL and Domain access

To access FlowiseAI securely over the internet, you can set up CloudFlare Tunnels. CloudFlare Tunnels provide a secure way to expose your FlowiseAI instance to the internet without exposing your server’s IP address.
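If the cloudflared connector is not yet running on your server, the Zero Trust dashboard gives you a run command with a unique token when you create the tunnel; it can also run as a container next to the stack. A minimal sketch (the token below is a placeholder you copy from the dashboard):

```shell
# Run the Cloudflare Tunnel connector as a container.
# Replace YOUR_TUNNEL_TOKEN with the token from the Zero Trust dashboard.
docker run -d --name cloudflared --restart unless-stopped \
  cloudflare/cloudflared:latest tunnel --no-autoupdate run --token "YOUR_TUNNEL_TOKEN"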

In the CloudFlare Zero Trust dashboard, go to Networks → Tunnels, choose the tunnel you created, and add a public hostname that maps a domain or subdomain to the service and port (http://localhost:5023 in our setup, since that is the host port we published).

Cloudflare Tunnel setup

You can also check Setup CloudPanel as Reverse Proxy with Docker and Dokge to use CloudPanel as a reverse proxy to your Docker containers or How to Use Traefik as A Reverse Proxy in Docker.

6. Create your first FlowiseAI flow

Once your FlowiseAI instance is up and running, you can access the web interface at http://your-server-ip:5023 (or the domain you configured with CloudFlare Tunnels).

From the FlowiseAI interface, you can start creating your first AI workflow by dragging and dropping nodes, connecting them, and configuring their settings.

FlowiseAI add flow

There is also a marketplace with ready-made flows you can use.

7. Use the FlowiseAI Chatflow Externally

After you finish designing your flow, you can use it externally through a few options like Embed, Python, JavaScript, cURL, or simply share it.
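For example, the cURL option calls the chatflow's prediction REST endpoint. A sketch against our local port mapping, where CHATFLOW_ID is a placeholder for the ID shown in the flow's API dialog:

```shell
# JSON body with the question to send to the chatflow
PAYLOAD='{"question": "Hey, how are you?"}'

# Call the prediction endpoint (replace CHATFLOW_ID with your flow's ID)
curl -s -X POST "http://localhost:5023/api/v1/prediction/CHATFLOW_ID" \
  -H "Content-Type: application/json" \
  -d "$PAYLOAD"
```

The response is a JSON object containing the flow's answer, so the same call works from Python or JavaScript with any HTTP client.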

Use the FlowiseAI Chatflow Externally

Conclusions

That covers deploying FlowiseAI v3.0+ with Docker Compose. You’ve set up the Docker Compose file, configured environment variables, and started the instance. You also saw how to use CloudFlare Tunnels for secure access.

FlowiseAI has matured into a solid platform with Agentflows, Ollama Cloud integration, granular API key permissions, and better security defaults. Running it in Docker makes management and scaling easier. The Compose setup also lets you add services like database backups alongside your deployment.

If you want to explore more Docker containers for your home server, check out our guide on Best 100+ Docker Containers for Home Server.