As you can see, backing up your database is easy in PostgreSQL. Unfortunately, a backup is of little value unless it is taken on a regular schedule: if the database is lost or corrupted, any work done since the last backup is lost with it. You should therefore perform backups at intervals that keep the amount of work at risk acceptably small; the ideal interval depends on how frequently the database changes.
The pg_dump utility can be scheduled to run at regular intervals by adding a job to the operating system's task scheduler, such as cron on Linux or Task Scheduler on Windows; instructions for doing this are available in the PostgreSQL wiki at http://wiki.postgresql.org/wiki/Automated_Backup_on_Windows and http://wiki.postgresql.org/wiki/Automated_Backup_on_Linux.
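As a minimal sketch of what such a scheduled job might look like on Linux, the following crontab entry runs pg_dump every night at 02:00 and writes a compressed, date-stamped dump. The database name mydb, the postgres user, and the /var/backups/postgresql directory are placeholder values, and it is assumed that a ~/.pgpass file (or an equivalent mechanism) lets the command run without a password prompt:

```
# Nightly pg_dump at 02:00, custom (compressed) format, date-stamped filename.
# Note that % must be escaped as \% inside a crontab entry.
0 2 * * * pg_dump -U postgres -Fc mydb > /var/backups/postgresql/mydb_$(date +\%Y\%m\%d).dump
```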
The pg_dump utility is not adequate for every situation. If your database undergoes constant change, or is larger than a few tens of gigabytes, you will need a backup mechanism far more robust than the one discussed in this recipe, such as continuous archiving with point-in-time recovery. These mechanisms are described in the PostgreSQL documentation at http://www.postgresql.org/docs/current/static/backup.html.
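As a rough illustration of one such mechanism, a physical base backup of the entire cluster can be taken with the pg_basebackup utility. The following sketch assumes a reasonably recent PostgreSQL release, a local server, a role named postgres with replication privileges, and /var/backups/base as a placeholder target directory:

```
# Take a plain-format base backup of the whole cluster into /var/backups/base,
# streaming the WAL needed to make the backup consistent, with progress output.
pg_basebackup -h localhost -U postgres -D /var/backups/base -Fp -X stream -P
```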
The following third-party backup tools are available for establishing robust and advanced backup schemes:
- Barman, which is available at http://www.pgbarman.org
- pg-rman, which is available at http://code.google.com/p/pg-rman
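As a rough sketch of how such a tool is typically driven once it has been installed and configured, Barman's command-line interface works along the following lines; the server name main is a placeholder that would be defined in Barman's own configuration file:

```
# Verify that Barman can reach the configured server and that WAL archiving works,
# then take a new backup and list the backups held for that server.
barman check main
barman backup main
barman list-backup main
```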