Optimizing a webserver for progressive download

Hi everyone, I searched the Performance and Tuning section for "Progressive download" and turned up nothing. So I'm under the impression this discussion hasn't happened yet, but if it has, please point me in the right direction.

I'm looking to optimize my Linode for delivery of .FLV video using progressive download (my project cannot retrieve the video content via RTMP or other streaming protocols).

The project consists of many videos, and the download of the next video actually begins while the user is watching the one before it. Ideally, the next video is playing from cache by the time it's needed.

I'm wondering how much I can do to achieve some of the benefits of a streaming server, without actually using one. Specifically:

setting cache headers to expire (in, let's say, 1 hour); I've put a rough sketch of what I mean just below this list

throttling bandwidth down to the bitrate that I know my videos require, e.g. with mod_throttle or mod_bandwidth? These come up in searches, but I've never used them and don't know whether they're appropriate. iptables also comes up when searching on throttling, but again I don't know if that's the right way to go.

I don't want to limit the number of connections; I only want to prevent users from sucking up too much bandwidth.
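
For the cache headers, this is roughly what I have in mind. It's just an untested sketch using Apache's mod_expires and mod_headers, and the directory path is a placeholder for wherever the .flv files actually live:

```
# Requires mod_expires and mod_headers to be enabled
<Directory "/var/www/videos">
    AddType video/x-flv .flv
    ExpiresActive On
    # Let clients and proxies cache each video for one hour
    ExpiresByType video/x-flv "access plus 1 hour"
    Header append Cache-Control "public"
</Directory>
```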

Are there other optimizations for normal web servers that would be helpful here? Akamai sells an entire service/product for this, but of course they don't go into detail on what they actually do, presumably until you're on the phone with a rep, which I'm not ready for yet.

thanks in advance

4 Replies

On the server side, I usually treat it as I would any static file: get the request, serve it out as fast as possible. If you have control over the application receiving the video stream, you can do some additional tricks there, perhaps delaying packet acknowledgements to maintain a target bitrate, or using the Range header to pull the file in chunks. TCP is darned good at throttling itself if it needs to.

Only major server-side advice I would give: ensure that the videos are static, flat files being served directly by the web server, and if you're using Apache, do not use mpm-prefork (which precludes the use of mod_php).
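
For what it's worth, here's a rough client-side sketch of that Range-header idea. It's Python rather than whatever your player actually is, the URL, chunk size, and target rate are all made up, and it simply paces the requests to hold a rough average bitrate:

```
import time
import urllib.request

# Placeholder values -- swap in your real URL, chunk size, and bitrate
URL = "http://example.com/videos/next.flv"
TARGET_BPS = 80_000          # ~640 kbit/s, expressed as bytes per second
CHUNK = 256 * 1024           # bytes fetched per Range request

def fetch_paced(url, out_path):
    """Download a file in Range-sized chunks, sleeping between requests
    so the average transfer rate stays near TARGET_BPS."""
    offset, total = 0, None
    with open(out_path, "wb") as out:
        while total is None or offset < total:
            req = urllib.request.Request(url)
            req.add_header("Range", f"bytes={offset}-{offset + CHUNK - 1}")
            start = time.time()
            with urllib.request.urlopen(req) as resp:
                data = resp.read()
                if resp.status != 206:
                    out.write(data)   # server ignored Range; we got the whole file
                    return
                # Content-Range looks like "bytes 0-262143/5242880"
                size = resp.headers.get("Content-Range", "").rsplit("/", 1)[-1]
                # If the total size is unreported, bail out after this chunk
                total = int(size) if size.isdigit() else offset + len(data)
            out.write(data)
            offset += len(data)
            # Sleep off the difference between the target pace and reality
            wanted = len(data) / TARGET_BPS
            elapsed = time.time() - start
            if wanted > elapsed:
                time.sleep(wanted - elapsed)

fetch_paced(URL, "next.flv")
```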

@hoopycat:

On the server side, I usually treat it as I would any static file: get the request, serve it out as fast as possible. If you have control over the application receiving the video stream, you can do some additional tricks there, perhaps delaying packet acknowledgements to maintain a target bitrate, or using the Range header to pull the file in chunks. TCP is darned good at throttling itself if it needs to.

Only major server-side advice I would give: ensure that the videos are static, flat files being served directly by the web server, and if you're using Apache, do not use mpm-prefork (which precludes the use of mod_php).

The concern expressed by my superiors is specifically that 'serve it out as fast as possible' bit. I believe my goal is to throttle down to a sweet spot where the largest video being pre-loaded still has enough time to finish downloading before it needs to start playing.
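
To put rough numbers on that sweet spot (these figures are invented, not from our actual videos):

```
# Invented numbers: how fast must the next video come down so it
# finishes before the current one ends?
next_video_mb = 40        # size of the largest "next" video, in MB
current_secs  = 8 * 60    # playback length of the current video, in seconds

required_kb_per_s = next_video_mb * 1024 / current_secs
print(f"need at least {required_kb_per_s:.0f} kB/s")   # ~85 kB/s in this example
```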

I've also discovered mod_cband ( http://www.howtoforge.com/mod_cband_apache2_bandwidth_quota_throttling ).

Anyone with experience using this?

Hmm… haven't really looked into throttling, per se. (This tends to be the realm of CDNs here.)

What's the concern with serving it out at maximum speed? Is it one of wasted bandwidth, e.g. if someone watches 30 seconds of a 2-hour video, you just blew 1:59:30? Or is it something else? Mostly curious, on the off chance that I'm doing it wrong…

If you're serving static content (which it seems like you are, what with the progressive downloads), you may want to consider a webserver better suited to efficiently serving static files, like nginx or lighttpd. nginx supports per-connection throttling as a core feature via the limit_rate directive, and since it's applied per location, different rate limits can apply to different directories.
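
As a rough sketch (the server name, paths, and rates are placeholders, and limit_rate_after is an extra directive I'd pair with it so the first chunk goes out at full speed and playback can start quickly):

```
server {
    listen 80;
    server_name example.com;

    location /videos/ {
        root /var/www;            # serves files from /var/www/videos/
        limit_rate_after 1m;      # first 1 MB at full speed
        limit_rate 100k;          # then ~100 kB/s per connection
        expires 1h;               # plays into the 1-hour cache idea above
    }
}
```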

Remember that with any webserver, even the more memory-efficient ones, throttling bandwidth like this can substantially increase your total number of concurrent connections, which in turn increases your CPU and memory usage. For example, if you have a file that can be downloaded in 30 seconds, and 10 clients request it, one every 30 seconds, you would only ever have one transfer in flight at a time. But if you slow everybody down to 10% speed, each download now takes 300 seconds, and you'll hit a point where 10 connections are open at once, consuming roughly 10 times the resources.

The only reason to throttle the connection is to save bandwidth on either the client or server side, since you're sacrificing other resources to do it.
