Looking to hire an admin for a single task
I'm looking for someone who can configure the VirtualHosts for several sites that are hosted on my Linode.
It's a single file that we need to configure with the correct NameVirtualHost directives.
Anyone interested? How much would this cost?
Feel free to PM me.
NameVirtualHost 173.230.151.109:80
BTW, I noticed in your first post that the port in the error log is 0. You should change it to 80.
With the NameVirtualHost directive defined, the individual site VirtualHost directives will follow it. So you need blocks resembling these:
<VirtualHost 173.230.151.109:80>
    ServerName www.bzzzd.com
    ServerAlias bzzzd.com
    DocumentRoot /path/to/bzzzd/site
    ErrorLog /path/to/bzzzd/site/logs/error.log
    CustomLog /path/to/bzzzd/site/logs/access.log combined
</VirtualHost>

<VirtualHost 173.230.151.109:80>
    ServerName www.americaselitevillas.com
    ServerAlias americaselitevillas.com
    DocumentRoot /path/to/americaselitevillas/site
    ErrorLog /path/to/americaselitevillas/site/logs/error.log
    CustomLog /path/to/americaselitevillas/site/logs/access.log combined
</VirtualHost>
Save the file in the /etc/apache2/sites-available directory as something like all-sites. You'd enable the sites by issuing a "sudo a2ensite all-sites" command and reload the web server with "sudo service apache2 reload". I'd review the /etc/apache2/sites-enabled directory and disable anything other than all-sites. Use the a2dissite command to disable sites.
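For reference, here's a minimal sketch of that workflow on Debian/Ubuntu, assuming the combined vhost file is saved as all-sites (the file name is just an example):

# enable the combined vhost file and disable the stock default site, if it's enabled
sudo a2ensite all-sites
sudo a2dissite 000-default

# check the syntax and the resulting name-based vhost mapping before reloading
sudo apache2ctl configtest
sudo apache2ctl -S

# pick up the new configuration
sudo service apache2 reload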
I normally don't put multiple sites in one file as it makes enabling and disabling individual sites more difficult. I make an exception to this rule when serving up demo and production versions of a site.
Hope this helps.
@Bartster:
Thanks. Would you be interested in doing something for me as a one-time thing? I want to download copies of my sites so I can have a backup with me and possibly put them on another server down the road. Just enabling FTP access would do.
Please don't use FTP; it's very insecure. If you want to copy files from one server to another, SSH into them and then use scp to transfer the files.
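For example, from your local machine you could pull a site down over SSH with something like this (the user, IP and paths are placeholders; adjust them for your Linode):

# copy a site's files from the Linode to a local backup folder over SSH
scp -r user@your-linode-ip:/path/to/bzzzd/site ~/site-backups/bzzzd

# or use rsync over ssh, which only transfers changed files on re-runs
rsync -avz -e ssh user@your-linode-ip:/path/to/bzzzd/site/ ~/site-backups/bzzzd/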
@Bartster:
Thanks for your advice. I want to get a copy of my sites as they are now, and I want to have it on my hard drive. If it's done through SSH rather than FTP, even better. I'll provide access and I'll pay anyone who's interested in doing it. Anyone?
How about I share a script that I use that you can run? It will back up all your websites and MySQL databases and copy them to another server using a cron job.
#!/bin/bash
# bash script to backup website
BACKUP_LOG=/home/mike/Backup/website.log
date +"%Y-%m-%d %X" > $BACKUP_LOG

# bash script to backup mysql
MYSQL_USER=           # fill in your MySQL user
MYSQL_PASSWORD=       # fill in your MySQL password
MYSQL_BACKUP_DIR=/home/mike/Backup

# backup mysql databases
/usr/bin/mysqldump --user=$MYSQL_USER --password=$MYSQL_PASSWORD --single-transaction --skip-lock-tables mybb | bzip2 -c > $MYSQL_BACKUP_DIR/data-$(date -I).sql.bz2
/usr/bin/mysqldump --user=$MYSQL_USER --password=$MYSQL_PASSWORD --single-transaction --skip-lock-tables inhome | bzip2 -c > $MYSQL_BACKUP_DIR/data-$(date -I).sql.bz2
/usr/bin/mysqldump --user=$MYSQL_USER --password=$MYSQL_PASSWORD --single-transaction --skip-lock-tables blacktop | bzip2 -c > $MYSQL_BACKUP_DIR/data-$(date -I).sql.bz2
/usr/bin/mysqldump --user=$MYSQL_USER --password=$MYSQL_PASSWORD --single-transaction --skip-lock-tables mycloud | bzip2 -c > $MYSQL_BACKUP_DIR/data-$(date -I).sql.bz2
/usr/bin/mysqldump --user=$MYSQL_USER --password=$MYSQL_PASSWORD --single-transaction --skip-lock-tables fog | bzip2 -c > $MYSQL_BACKUP_DIR/fog-$(date -I).sql.bz2
/usr/bin/mysqldump --user=$MYSQL_USER --password=$MYSQL_PASSWORD --single-transaction --skip-lock-tables ps3 | bzip2 -c > $MYSQL_BACKUP_DIR/data-$(date -I).sql.bz2

# remove old MySQL database backups (older than 30 days)
find $MYSQL_BACKUP_DIR -maxdepth 1 -type f -name "*.sql.bz2" -mtime +30 -exec rm -Rf {} \;

# backup file system to the remote machine over ssh
rsync -avh -gopt --progress --delete /etc/httpd/conf /etc/httpd/conf.d /srv/www /home/mike/Backup kyrunner@192.168.3.56:/Users/user/Backup

# append the finish timestamp to the log
date +"%Y-%m-%d %X" >> $BACKUP_LOG

# send email
mailx -s "Micro: website log" @gmail.com < $BACKUP_LOG
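To run it on a schedule, a crontab entry along these lines would do (assuming the script is saved as /home/mike/backup.sh and made executable with chmod +x; both are just example choices):

# crontab -e: run the backup script every night at 2:30 am
30 2 * * * /home/mike/backup.sh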
It would appear you have 5 databases (mybb, inhome, blacktop, mycloud and ps3) being backed up to the same dest file, $MYSQL_BACKUP_DIR/data-$(date -I).sql.bz2, each one overwriting the previous so that only the last of the 5 (ps3) is actually backed up.
Your fog database is going into a different dest file.
I do something similar, but I dump all the databases into a single compressed file. That way I don't need to change the backup script when I add a new database.
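For instance, something along these lines (a rough sketch rather than my exact script; the backup directory is a placeholder, and --single-transaction assumes transactional tables such as InnoDB):

# dump every database into a single compressed file per day
/usr/bin/mysqldump --user=$MYSQL_USER --password=$MYSQL_PASSWORD --single-transaction --skip-lock-tables --all-databases | bzip2 -c > /path/to/backups/all-databases-$(date -I).sql.bz2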
He updated the post with the fix.
@Dweeber:
Are you sure you are using that script?
It would appear you have 5 databases (mybb, inhome, blacktop, mycloud and ps3) being backed up to the same dest file, $MYSQL_BACKUP_DIR/data-$(date -I).sql.bz2, each one overwriting the previous so that only the last of the 5 (ps3) is actually backed up.
Your fog database is going into a different dest file.
I do something similar, but I dump all the databases into a single compressed file. That way I don't need to change the backup script when I add a new database.
Ref:
# Same dest file
# backup mysql databases
/usr/bin/mysqldump --user=$MYSQL_USER --password=$MYSQL_PASSWORD --single-transaction --skip-lock-tables mybb | bzip2 -c > $MYSQL_BACKUP_DIR/data-$(date -I).sql.bz2
/usr/bin/mysqldump --user=$MYSQL_USER --password=$MYSQL_PASSWORD --single-transaction --skip-lock-tables inhome | bzip2 -c > $MYSQL_BACKUP_DIR/data-$(date -I).sql.bz2
/usr/bin/mysqldump --user=$MYSQL_USER --password=$MYSQL_PASSWORD --single-transaction --skip-lock-tables blacktop | bzip2 -c > $MYSQL_BACKUP_DIR/data-$(date -I).sql.bz2
/usr/bin/mysqldump --user=$MYSQL_USER --password=$MYSQL_PASSWORD --single-transaction --skip-lock-tables mycloud | bzip2 -c > $MYSQL_BACKUP_DIR/data-$(date -I).sql.bz2
/usr/bin/mysqldump --user=$MYSQL_USER --password=$MYSQL_PASSWORD --single-transaction --skip-lock-tables ps3 | bzip2 -c > $MYSQL_BACKUP_DIR/data-$(date -I).sql.bz2

# Diff dest file
/usr/bin/mysqldump --user=$MYSQL_USER --password=$MYSQL_PASSWORD --single-transaction --skip-lock-tables fog | bzip2 -c > $MYSQL_BACKUP_DIR/fog-$(date -I).sql.bz2
I used "data" just as an example. $MYSQL_BACKUP_DIR/data-$(date -I).sql.bz2 is only an example; where you see "data" after the directory, it's really one of my database names. It should look like this: $MYSQL_BACKUP_DIR/ps3-$(date -I).sql.bz2.
# backup mysql databases
/usr/bin/mysqldump --user=$MYSQL_USER --password=$MYSQL_PASSWORD --single-transaction --skip-lock-tables mybb | bzip2 -c > $MYSQL_BACKUP_DIR/mybb-$(date -I).sql.bz2
/usr/bin/mysqldump --user=$MYSQL_USER --password=$MYSQL_PASSWORD --single-transaction --skip-lock-tables inhome | bzip2 -c > $MYSQL_BACKUP_DIR/inhome-$(date -I).sql.bz2
/usr/bin/mysqldump --user=$MYSQL_USER --password=$MYSQL_PASSWORD --single-transaction --skip-lock-tables blacktop | bzip2 -c > $MYSQL_BACKUP_DIR/blacktop-$(date -I).sql.bz2
/usr/bin/mysqldump --user=$MYSQL_USER --password=$MYSQL_PASSWORD --single-transaction --skip-lock-tables mycloud | bzip2 -c > $MYSQL_BACKUP_DIR/mycloud-$(date -I).sql.bz2
/usr/bin/mysqldump --user=$MYSQL_USER --password=$MYSQL_PASSWORD --single-transaction --skip-lock-tables fog | bzip2 -c > $MYSQL_BACKUP_DIR/fog-$(date -I).sql.bz2
/usr/bin/mysqldump --user=$MYSQL_USER --password=$MYSQL_PASSWORD --single-transaction --skip-lock-tables ps3 | bzip2 -c > $MYSQL_BACKUP_DIR/ps3-$(date -I).sql.bz2
@Azathoth:
BTW, --single-transaction and --skip-lock-tables are effective only for InnoDB tables (or any other transactional engine). If you use MyISAM you have to lock the tables or risk taking a partial (corrupt) snapshot.
Good point. If you want to see what database engine you are using, run these commands:
1) switch to root: sudo su - root
2) run this command: mysql -u root -p (then hit enter and put your password in)
3) run this command: show databases;
4) after you see your databases, select one
5) use <database name>;
6) this will show what engine the tables use:
SELECT table_name, engine FROM INFORMATION_SCHEMA.TABLES WHERE table_schema = DATABASE();
That should be it. If I missed something, please give your 2 cents.
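If you don't want an interactive session, the same check can be done in one line from the shell ("mydb" is a placeholder for your database name):

# list each table's storage engine for a given database
mysql -u root -p -e "SELECT table_name, engine FROM information_schema.tables WHERE table_schema = 'mydb';"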
@Azathoth:
BTW, --single-transaction and --skip-lock-tables are effective only for InnoDB tables (or any other transactional engine). If you use MyISAM you have to lock the tables or risk taking a partial (corrupt) snapshot.
Here is one for MyISAM:

# backup mysql databases
/usr/bin/mysqldump --user=$MYSQL_USER --password=$MYSQL_PASSWORD --all-databases --lock-all-tables --flush-logs --master-data=2 | bzip2 -c > $MYSQL_BACKUP_DIR/all-$(date -I).sql.bz2
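And if you ever need to restore from one of those dumps, something like this works (the file name is just an example; substitute the actual date):

# decompress the dump and feed it back into MySQL
bunzip2 -c $MYSQL_BACKUP_DIR/all-YYYY-MM-DD.sql.bz2 | mysql -u root -p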