[TOP TIP] Useful cron jobs
Without further ado:
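All of the jobs below are meant to live in root's crontab, which you edit with crontab -e as root. As a quick reminder (the schedule here is just a placeholder), a crontab entry is five schedule fields followed by the command:

# min hour day-of-month month day-of-week command
0 3 * * * /path/to/command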
Detect AVC errors
This command reports new SELinux AVC denials. It uses ausearch's checkpoint feature, which makes the search a lot faster because only events recorded since the previous run are examined, so only new AVC errors are reported. You may extend the -m message-type list to include more types if you prefer.
/usr/sbin/ausearch --format text -m AVC,USER_AVC --checkpoint /var/log/audit/ausearch_checkpoint.txt 2>&1 | grep -v '<no matches>'
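As a scheduling sketch, a daily entry in root's crontab could look like the line below (the 06:00 run time is my assumption). Since cron mails any output to the crontab owner, the job stays silent unless new denials appear; note that the very first run has no checkpoint file yet, so it will report the existing backlog while creating one.

0 6 * * * /usr/sbin/ausearch --format text -m AVC,USER_AVC --checkpoint /var/log/audit/ausearch_checkpoint.txt 2>&1 | grep -v '<no matches>'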
Detect filesystem over-usage
This command reports any mounted filesystem that is more than 80% full. You may change the 80 threshold to something that suits your needs. The -P flag forces POSIX output so that long device names do not wrap onto a second line and break the column count.
/usr/bin/df -hP | /usr/bin/tail -n +2 | /usr/bin/tr -d '%' | /usr/bin/awk '{ if ($5 > 80) print "FS Size Alert "$0; }'
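If pseudo-filesystems such as tmpfs add noise to the report, a variant restricted to a single real filesystem type might look like this (ext4 is an assumption; substitute whatever your Linode uses):

/usr/bin/df -hP -t ext4 | /usr/bin/tail -n +2 | /usr/bin/tr -d '%' | /usr/bin/awk '{ if ($5 > 80) print "FS Size Alert "$0; }'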
ClamAV scan
Here we use ClamAV as a daily scanner. If you run clamscan directly, your Linode's CPU usage will explode and you'll start receiving warning emails (if they are enabled). Since this scan is not time-critical, we use ionice and nice to run the process at the lowest priority, so it never causes any CPU usage issues. Another trick is to limit the scan to files smaller than 5MB, which covers most typical websites.
/usr/bin/ionice -c3 /usr/bin/nice -n 19 /usr/bin/clamscan --exclude-dir='^/sys' --exclude-dir='^/proc' --exclude-dir='^/dev' --max-filesize=5M --pcre-max-filesize=5M --no-summary --heuristic-scan-precedence=yes -ri /
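As a scheduling sketch, a nightly crontab entry might be the following (the 02:30 start time is an assumption). Thanks to -i and --no-summary, the job only produces output, and therefore a cron mail, when an infected file is actually found:

30 2 * * * /usr/bin/ionice -c3 /usr/bin/nice -n 19 /usr/bin/clamscan --exclude-dir='^/sys' --exclude-dir='^/proc' --exclude-dir='^/dev' --max-filesize=5M --pcre-max-filesize=5M --no-summary --heuristic-scan-precedence=yes -ri /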
Detect PHP errors
This one may be of interest to those running PHP scripts. We scan for error_log files, which PHP leaves behind when errors occur. In my case I only get an error or two per year; they don't happen very often, but the moment they do I want to know about it. You may change the /home directory to something that suits your needs.
/usr/bin/find /home -iname error_log
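If you would rather see the last few lines of each error_log in the alert mail instead of just the path, a variant using find's -exec could look like this (the five-line tail is an arbitrary choice; -v makes tail print each file name as a header):

/usr/bin/find /home -iname error_log -exec /usr/bin/tail -v -n 5 '{}' +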
MariaDB/MySQL backup
In case you didn't know, Linode backups do not work with live databases; if you restore a Linode, chances are that the database will be corrupt. You therefore need a daily database dump, taken right before the actual Linode backup happens. Here is a one-liner that works if you have properly set up root's /root/.my.cnf file. It dumps all databases and compresses the output at the same time. While this line keeps only one backup archive, you may expand it to write separate daily files, as shown below. By default it stores the archive in root's home directory, but you may modify that to suit your needs. It works well with small databases, but if you have 50GB of data, then I suggest a different method, such as a database cluster or a live binary log backup.
/usr/bin/mysqldump --opt --all-databases | bzip2 > /root/all_databases.sql.bz2
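If you prefer the separate daily files mentioned above, here is a sketch; note that inside a crontab every % has to be escaped as \%, because cron treats an unescaped % as a newline. For InnoDB tables you may also want --single-transaction, which gives a consistent dump without locking the tables:

/usr/bin/mysqldump --opt --single-transaction --all-databases | bzip2 > /root/all_databases_$(date +\%F).sql.bz2

Dated archives accumulate, so pair this with a cleanup job, for example:

/usr/bin/find /root -name 'all_databases_*.sql.bz2' -mtime +30 -delete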