How to back up databases using n8n?

I’m hosting more and more tools on my server, and in the back of my mind there’s always a nagging thought:

What will happen to this data if my server goes down?…

Of course, you can write scripts that do automatic backups, but they are annoying to maintain. I wanted something easier, preferably with a simple interface where I could configure everything by clicking.

I already use the n8n platform for various automations, and I knew it would be perfect for this.

Custom n8n docker image

After my first attempts at integration, I quickly discovered that the problem with the base n8n image was the lack of the command-line tools needed for backups. I mainly needed pg_dump or mysqldump. Fortunately, after a brief online search and a little help from Claude AI, I was able to prepare a Docker image extended with these tools:

FROM n8nio/n8n:latest

USER root

# Install necessary clients and tools
RUN apk add --no-cache \
    postgresql-client \
    mysql-client \
    mariadb-client \
    mariadb-connector-c \
    sqlite \
    mongodb-tools \
    gzip \
    tar

# Create a directory for backups
RUN mkdir /backups && chown node:node /backups

USER node

# Set the entrypoint back to the original
ENTRYPOINT ["tini", "--", "/docker-entrypoint.sh"]
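As a quick sanity check after building, you can run the dump clients inside the image to confirm they were installed correctly (the tag below is the one from my build command; use whatever you tag yours as):

```shell
# Override the n8n entrypoint with a shell and print each tool's version;
# any missing binary will make the command fail with a non-zero exit code.
docker run --rm --entrypoint sh marekbrze/custom-n8n:latest -c \
  "pg_dump --version && mysqldump --version && mongodump --version"
```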

Then all I had to do was build this image for my Linux server and push it to Docker Hub:

`docker buildx build --no-cache --platform linux/amd64 -t marekbrze/custom-n8n:latest --push .`

Finally, I just had to swap in the new image in my application on the CapRover server. At first I was wary of all this Docker tinkering, but everything went painlessly. If you want to use my image, go ahead -> Link to repository on Dockerhub.

The n8n workflow

The n8n workflow itself is very simple; I use the following nodes:

  1. Schedule Trigger - Fires the workflow at set intervals.
  2. Execute Command - The main terminal command that connects to the database and dumps the result to a file. Example for MySQL -> `mysqldump -h [host] -u [username] -p[password] [database_name] > [backup_file.sql]`
  3. Read/Write Files from Disk - Reads the file created by the previous command from disk.
  4. Google Drive: Upload file - Uploads the file to the selected folder on Google Drive.
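The Execute Command step above can be fleshed out a bit; here is a sketch of what I put in that node, with the host, user, database name, and password all placeholders you would swap for your own. Compressing the dump and timestamping the filename keeps successive runs from overwriting each other:

```shell
# Timestamped, gzip-compressed dump path; /backups is the directory
# created in the custom Dockerfile.
BACKUP_FILE="/backups/mydb_$(date +%Y-%m-%d_%H-%M-%S).sql.gz"

# MySQL/MariaDB: note there is deliberately no space after -p.
mysqldump -h db-host -u backup_user -p"$DB_PASSWORD" mydb | gzip > "$BACKUP_FILE"

# PostgreSQL equivalent (pg_dump reads the password from PGPASSWORD):
# PGPASSWORD="$DB_PASSWORD" pg_dump -h db-host -U backup_user mydb | gzip > "$BACKUP_FILE"
```

The Read/Write Files from Disk node then picks up `$BACKUP_FILE` (or a matching glob like `/backups/*.sql.gz`) for the upload step.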

And that’s about it. In the future I’ll be adding more options, such as notifications about successful/failed backups sent to Slack/Discord/Telegram, or triggering backups on demand via a webhook or a Telegram bot command. For now, the basic functionality is definitely enough for me.