Backup Script Issues

First, here is the script I'm running into issues with for my custom game server.

```sh
#!/bin/sh

# backup sqlites

for db in map players; do
    sqlite3 $db.sqlite ".timeout 1000" ".backup $db-backup_`date '+%m-%d-%Y'`.sqlite"
done

# moves backuped sqlites

mv *-backup_*.sqlite /home/user/backups

# backup the file based contents

tar czf /home/user/backups/world_`date '+%m-%d-%Y'`.tar.gz --exclude='*.sqlite' *
```

The script is located where it should be, and both manual runs and cron jobs produce a few issues:

1. If the game server is active (players exploring the map), the backups get completely corrupted; for some reason the .backup command doesn't cope with this well.
2. The database backups come out at only a few KB, even though one database is currently 3xx MB and the other xx KB.
3. The tar.gz also backs itself up (so it contains ~/backup/own.tar.gz instead of the files in the script's current directory).
4. I also get "0 bytes" sqlites in a parent directory.

The only "error" I was able to get out of running it manually, while auto-walking in the server, was:

"tar: map.sqlite-journal: file changed as we read it"

Even though, as shown above, I "excluded" it and the other DBs from the tarring step using wildcards, since the sqlite3 .backup commands were supposed to handle the database backups instead, as commented. Honestly, I'm not sure why the script falls apart like this whenever the game server is active.

Can anyone help me troubleshoot my backup script so it stops getting this messed up?

1 Reply

Imagine taking a picture. When the subject of your photo is not moving, your picture comes out very clear. When the subject of your photo is moving, your photo can be blurry. This is very similar to how a backup runs. When your server is running and there is lots of activity, you may not get a clear snapshot.

This addresses the first issue you're experiencing:

> If the game server is active (players exploring the map), the backups get completely corrupted; for some reason the .backup command doesn't cope with this well.

Something that may help is using an interactive connection to the database, as recommended in this StackOverflow post. That approach accounts for users working in your database while the backup is being taken: locks are placed on the SQLite database, allowing the backup to be performed more safely.
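For example, something along these lines (a minimal sketch based on your script; the 30-second busy timeout is an assumed value, not a tested one):

```sh
#!/bin/sh
# Sketch: run each backup inside a single sqlite3 session with a longer
# busy timeout, so the backup waits out writes from the live server
# instead of failing partway through. Tune the 30000 ms to taste.
for db in map players; do
    sqlite3 "$db.sqlite" <<EOF
.timeout 30000
.backup '$db-backup_$(date '+%m-%d-%Y').sqlite'
EOF
done
```

Because everything runs in one session, the .timeout setting is actually in effect when .backup executes, rather than being applied and discarded command by command.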

> The database backups come out at only a few KB, even though one database is currently 3xx MB and the other xx KB.

tar with the z flag gzip-compresses its contents, which accounts for some size difference in the .tar.gz. Note, however, that sqlite3's .backup writes an uncompressed, page-by-page copy of the database, so a 3xx MB database should not shrink to a few KB; a backup that small more likely means the copy never completed, or ran against the wrong file.
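If you want to confirm a given backup actually completed, a quick check like this can help (file names here are hypothetical stand-ins for your dated backup names):

```sh
# Sketch: verify each backup opens cleanly, and compare its size with
# the live database. A healthy .backup copy is not compressed, so it
# should be roughly the same size as the source.
for db in map players; do
    sqlite3 "$db-backup.sqlite" 'PRAGMA integrity_check;'
    ls -lh "$db.sqlite" "$db-backup.sqlite"
done
```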

> The tar.gz also backs itself up (so it contains ~/backup/own.tar.gz instead of the files in the script's current directory).

This is most likely a working-directory issue. The trailing * archives whatever directory the script happens to run from, and cron jobs start in your home directory rather than in the script's directory, so the backups folder itself can get swept into the archive. Using explicit directory variables (and cd-ing to the source directory) ensures the correct files are picked up.
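A sketch of that idea (the directory paths are assumptions; substitute your actual server and backup locations):

```sh
#!/bin/sh
# Sketch: make both directories explicit so manual and cron runs behave
# identically, and cd into the source so the trailing * only picks up
# the server's files. /home/user/world is an assumed path.
BACKUP_DIR=/home/user/backups
WORLD_DIR=/home/user/world

cd "$WORLD_DIR" || exit 1
tar czf "$BACKUP_DIR/world_$(date '+%m-%d-%Y').tar.gz" --exclude='*.sqlite' *
```

Because the archive is written outside $WORLD_DIR, tar can no longer pick it up with the trailing *.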

> I also get "0 bytes" sqlites in a parent directory.

This is probably the same working-directory problem rather than anything about the exclude pattern. If the script runs from the wrong directory, sqlite3 is pointed at map.sqlite and players.sqlite files that don't exist there, quietly creates them as brand-new empty databases, and backs those up, leaving 0-byte .sqlite files behind.
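One way to guard against that (again with an assumed path) is to pin the working directory at the top of the script and bail out if it's missing:

```sh
# Sketch: fail loudly instead of letting sqlite3 create empty databases
# in whatever directory cron happened to start the script from.
cd /home/user/world || { echo "server directory missing" >&2; exit 1; }
```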

"tar: map.sqlite-journal: file changed as we read it"

This one, on the other hand, really is about the exclude pattern. tar's --exclude takes a shell-style wildcard pattern, not a regex, and '*.sqlite' only matches names that end in .sqlite, so map.sqlite-journal slips through and gets read while the server is writing to it. You'll need a pattern that accounts for every file containing '.sqlite', including the journal file from your error message; see the sketch below. An alternative could be to create two separate backups, one for your file-based content and one for your databases, kept in different directories. This would remove the need to exclude specific files at all.
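For the first option, widening the pattern is enough (an untested sketch of your existing command):

```sh
# Sketch: '*.sqlite*' also matches map.sqlite-journal and any -wal/-shm
# sidecar files, unlike '*.sqlite', which only matches names that END
# in .sqlite.
tar czf "/home/user/backups/world_$(date '+%m-%d-%Y').tar.gz" --exclude='*.sqlite*' *
```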
