Restoring/Backing up Postgres Database in a Docker Container

Install s3cmd on the host first, so we can download the backup from S3:

apt-get install -y s3cmd

Restore

In the previous tutorial we got Postgres running in a container. Now we should restore the database from a backup. First, fetch the backup from S3:

cd /mydata && mkdir db_restore && cd db_restore
s3cmd ls s3://db_dumps/
s3cmd get s3://db_dumps/dump2015-05-22T09:15:13+0800.txt.gz
gunzip dump2015-05-22T09\:15\:13+0800.txt.gz
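When the bucket holds several dumps, the ISO-8601 timestamps in the filenames sort lexicographically, so the newest dump is simply the last one in sorted order. A small sketch, using placeholder files in place of real downloads:

```shell
# ISO-8601 timestamps sort lexicographically: the newest dump sorts last.
# These touch'ed files stand in for dumps downloaded with s3cmd get.
touch 'dump2015-05-21T08:00:00+0800.txt.gz' 'dump2015-05-22T09:15:13+0800.txt.gz'
LATEST=$(ls dump*.txt.gz | tail -n 1)
echo "$LATEST"
```

This is why a sortable timestamp format in the dump filename is worth keeping.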
docker run -d --name myredispgclient --link mypostgres:postgres -v /mydata/db_restore/:/tmp/ -i -t myredispgclient /bin/bash
# docker exec -i -t myredispgclient bash
$ env | grep POSTGRES_
$ psql -h $POSTGRES_PORT_5432_TCP_ADDR -p $POSTGRES_PORT_5432_TCP_PORT -U postgres
> \l # List all databases. Make sure mynodeappdb exists
> \q # Quit
$ psql -h $POSTGRES_PORT_5432_TCP_ADDR -p $POSTGRES_PORT_5432_TCP_PORT -U postgres -d mynodeappdb < /tmp/dump2015-05-22T09\:15\:13+0800.txt
$ exit # Exit the console of the container
# docker stop myredispgclient

Backup

Let’s build a mys3cmd image with s3cmd 1.0 installed. The latest version is 1.5.2, but 1.5.2 fails to upload big files for me, so I have to use 1.0. Here is the Dockerfile; it is based on the myredispgclient image.

FROM myredispgclient
RUN apt-get update \
&& wget -O- -q http://s3tools.org/repo/deb-all/stable/s3tools.key | apt-key add - \
&& wget -O/etc/apt/sources.list.d/s3tools.list http://s3tools.org/repo/deb-all/stable/s3tools.list \
&& apt-get update \
&& apt-get install -y --force-yes --no-install-recommends s3cmd=1.0.0-4 \
&& apt-get clean \
&& apt-get autoremove \
&& rm -rf /var/lib/apt/lists/*
Build the image:

cd /DockerizingWebAppTutorial/dockerfiles/mys3cmd
docker build -t mys3cmd .
Next, let’s build the mydbbackup2s3 image, which dumps the database and uploads the dump to S3. The Dockerfile:

FROM mys3cmd
COPY .pgpass /root/
COPY .s3cfg /root/
COPY dump_db_and_upload.sh /root/
RUN chmod 600 /root/.pgpass
VOLUME /db_dumps
CMD ["/root/dump_db_and_upload.sh"]
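The .s3cfg copied into the image holds the S3 credentials for s3cmd. A minimal sketch (the key values are placeholders you generate in the AWS console):

```
[default]
access_key = YOUR_ACCESS_KEY
secret_key = YOUR_SECRET_KEY
```

Like .pgpass, this file contains credentials, so keep it out of version control.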
The .pgpass file has one line:

postgres:5432:mynodeappdb:postgres:postgres

And here is dump_db_and_upload.sh:
#!/bin/bash
# A script to dump db and compress it and then upload the file to S3.
# Should be made executable, e.g. 'chmod 755 dump_db_and_upload.sh'
FILENAME=$(TZ=Asia/Hong_Kong date +"dump%Y-%m-%dT%H:%M:%S+0800.txt.gz")
FULLDIR="/db_dumps/"
FULLPATH="$FULLDIR$FILENAME"
S3PATH="s3://db_dumps/"
echo "Begin to dump mynodeappdb to $FULLPATH"
# We don't use $POSTGRES_PORT_5432_TCP_ADDR for host, but use postgres which is linked
# $POSTGRES_PORT_5432_TCP_ADDR will change, but link name postgres does not change.
# We also use the link name postgres in .pgpass
pg_dump -h postgres -U postgres mynodeappdb | gzip > "$FULLPATH"
echo "Done"
echo "Begin to upload the dump to $S3PATH"
s3cmd put "$FULLPATH" "$S3PATH"
echo "Done"
echo "Delete the local dump"
rm "$FULLPATH"
echo "Finished dump and upload"
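One caveat with the `pg_dump | gzip` pipeline above: by default a pipeline's exit status is that of its last command, so a failing pg_dump still looks like success and a broken dump would be uploaded. bash's `pipefail` option surfaces the failure; a sketch, with `false` standing in for a failing pg_dump:

```shell
#!/bin/bash
# Without pipefail, 'false | gzip' exits 0 (gzip's status); with it, non-zero.
set -o pipefail
if false | gzip > /dev/null; then
  STATUS="backup ok"
else
  STATUS="backup failed"
fi
echo "$STATUS"
```

Adding `set -o pipefail` near the top of dump_db_and_upload.sh would let the script skip the upload when the dump itself fails.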
Build the image and run the backup container:

cd /DockerizingWebAppTutorial/dockerfiles/mydbbackup2s3
docker build -t mydbbackup2s3 .
docker run -d --name mydbbackup2s3 --link mypostgres:postgres -v /mydata/db_dumps:/db_dumps mydbbackup2s3

Auto-backup

We should back up the database regularly and automatically. Let’s create a cron job by running this command:

crontab -e
#MIN (0-59)  HOUR (0-23)  DoM (1-31)  MONTH (1-12)  DoW (0-7)     CMD
#Dump mynodeappdb and upload to S3 every Tuesday at 10:00 Hong Kong time
0 2 * * 2 docker start mydbbackup2s3
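The server clock here is assumed to be UTC, which is why the job fires at 2:00 rather than 10:00: 02:00 UTC is 10:00 in Hong Kong (UTC+8). A quick check with GNU date:

```shell
# Convert 02:00 UTC on an arbitrary date to Hong Kong time (GNU date -d)
HK_TIME=$(TZ=Asia/Hong_Kong date -d '2015-05-26 02:00 UTC' +%H:%M)
echo "$HK_TIME"
```

If your server runs in a different timezone, adjust the cron hour accordingly.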

--

Aijin Yuan (Vince)

iOS/Android/Web developer. UX engineering lead at Grab. Singapore/Hong Kong/Shanghai