Question about maintaining files on distributed nodes

One of my clients is paranoid about the traffic they might get on their site. While I run a dozen or so clients on a single 512 Linode, they have requested that I set up a new system just for them that includes a NodeBalancer with two 512 Linodes behind it sharing the traffic.

I have set up this configuration, and to get started I cloned one 512 to the other. But as updates happen over the course of time – to the file system and the database – how do I make sure that each Linode gets updated? Is it possible to push updates to the first one and have them automagically pushed to the second?

I'd appreciate any advice on maintaining file systems for distributed nodes.

By the way, I already have dev, stage, and live environments established; I'm just concerned about the live site being propagated in multiple places.

I'm running a LAMP (PHP-MySQL) stack on Debian 6.

Thanks.

3 Replies

Use rsync to update your files from one server to the other. You could run it from a cron job, or if you use git/svn you could set up a hook to trigger it, for example:
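
A minimal sketch of the cron approach, assuming the second node is reachable as web2.example.com with passwordless SSH keys for root, and the web root is /var/www (adjust the hostname, user, and paths to your setup):

```bash
# /etc/cron.d/sync-webroot -- push the web root to the second node every 5 minutes.
# Format: minute hour day month weekday user command
# Assumes root on this box can ssh to web2.example.com without a password.
*/5 * * * * root rsync -az --delete /var/www/ web2.example.com:/var/www/
```

The `--delete` flag removes files on the second node that no longer exist on the first, so the two stay identical rather than accumulating stale files.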

You could also use inotifywait or a similar tool to watch for changes and trigger rsync.
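
Something like this would work as a watcher loop, assuming the inotify-tools package is installed and using the same placeholder host and paths as above:

```bash
#!/bin/bash
# Sketch: watch the web root and push changes to the second node as they happen.
# Requires: apt-get install inotify-tools
while inotifywait -r -e modify,create,delete,move /var/www; do
    rsync -az --delete /var/www/ web2.example.com:/var/www/
done
```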

You can use csync2: http://oss.linbit.com/csync2/

It was created for exactly this use case :)
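
A rough sketch of what the config could look like, assuming the two nodes are named web1 and web2 and you have generated a shared key with `csync2 -k /etc/csync2.key` (host names and paths are placeholders):

```
# /etc/csync2.cfg
group web {
    host web1 web2;
    key /etc/csync2.key;
    include /var/www;
    exclude *.log;
    auto younger;
}
```

You would then run `csync2 -x` from cron on each node to perform the actual synchronization.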

You can also use some kind of VCS and deploy the files from it.
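
For example, with git you could push to a bare repository and let a post-receive hook deploy to both live nodes. A sketch, again assuming placeholder host names and /var/www as the web root:

```bash
#!/bin/bash
# hooks/post-receive in a bare repo: check out the pushed code
# and rsync it to both nodes behind the balancer.
TMP=$(mktemp -d)
git --work-tree="$TMP" checkout -f master
for host in web1.example.com web2.example.com; do
    rsync -az --delete "$TMP"/ "$host":/var/www/
done
rm -rf "$TMP"
```

This keeps a single source of truth in the repository and guarantees both nodes get the same deploy at the same time.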
