I'm trying to understand the essential differences between a server architecture designed to stream media (e.g. FLV or H.264) to Flash players and an ordinary HTTP server (e.g. Apache serving static content) that is also capable of feeding data from offsets within big files.
I'm talking about static movie content, not streaming from a live camera or similar. My impression has always been that an ordinary web server can do this too. The requirements are the usual play/pause, but also seeking forward into non-preloaded parts of the video. My impression is that the latter is also possible with ordinary servers.
Is it that real streaming servers are simply more efficient at this, or can they do things that an ordinary server can't?
thx
The deal with live and/or dynamic streaming is that the plain HTTP protocol has no mechanisms for it built in (unlike something like RTMP), so the intelligence has to live on one end or the other. It's actually better to put the "brains" in the client, so to speak, and keep the web servers dumb. This is good for scalability.

Microsoft, Apple, and Adobe all now have pretty solid streaming-over-HTTP solutions where the client knows how to ask the server for different resolutions, bitrates, and, most importantly, different time segments of a video stream. This makes them very bandwidth-, cache-, and CDN-friendly. Microsoft and Adobe do require a server-side module to "chunk" a big file up into segments, while Apple has you pre-chunk the files and use a totally standard HTTP server. But otherwise the intelligence is all in the client plug-in, and with any of these solutions you can put proxy caches or CDNs in origin-fetch mode into the mix to scale things up very rapidly.
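To make Apple's pre-chunked approach concrete: the client fetches a plain-text playlist over ordinary HTTP and then GETs the listed segment files one by one. The segment names and durations below are made up for illustration, but the directives are the standard HTTP Live Streaming ones:

```
#EXTM3U
#EXT-X-TARGETDURATION:10
#EXT-X-MEDIA-SEQUENCE:0
#EXTINF:10,
segment000.ts
#EXTINF:10,
segment001.ts
#EXTINF:10,
segment002.ts
#EXT-X-ENDLIST
```

To seek, the client just sums the #EXTINF durations to find the right segment and issues a plain GET for it - the server never needs to know it is serving video at all.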
The traditional connection-oriented streaming protocols require dedicated servers and simply don't scale very well. You have to pay an Akamai or Limelight big dollars to run a large number of streams to a huge audience. They have tens of thousands of servers to handle that sort of thing.
The relatively new HTTP-based options mentioned above actually work with the huge HTTP caching infrastructure already out there on the web in organizations and ISPs, as well as the huge HTTP caching infrastructure offered by CDNs (which typically offer lower prices for HTTP delivery than connection-based streaming).
Apple has even submitted their Live HTTP streaming solution to the IETF for consideration as an open standard (although I suspect there are patents to worry about from Move Networks who pioneered this sort of thing).
"which is capable of providing of feeding the data also from offsets within big files"
This requires some additional smarts on the client and the server - essentially you are layering another protocol on top of HTTP. But there are lots of tools available which do exactly that. However, I'm not aware of any that operate on a live streaming source (e.g. a camera) - only on files, which have a defined start to measure an offset from. OTOH it's going to be very difficult to navigate through a live stream feed anyway.
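A minimal sketch of that layered protocol, assuming the player has already downloaded a keyframe index (as Flash players typically read from FLV metadata): the client maps a seek time to a byte offset and asks a completely ordinary HTTP server for that slice with a standard Range header. The keyframe table and helper names here are hypothetical illustrations, not any particular player's API:

```python
def byte_offset_for_seek(seek_seconds, keyframes):
    """keyframes: sorted list of (timestamp_seconds, byte_offset) pairs.
    Return the offset of the last keyframe at or before seek_seconds,
    since a decoder can only start playback cleanly at a keyframe."""
    offset = keyframes[0][1]
    for ts, pos in keyframes:
        if ts > seek_seconds:
            break
        offset = pos
    return offset

def range_header(start, end=None):
    """Build a standard HTTP/1.1 Range header value (bytes unit)."""
    return "bytes=%d-%s" % (start, "" if end is None else end)

# Hypothetical keyframe index for a ~2.5 MB clip with keyframes every 10 s:
keyframes = [(0.0, 0), (10.0, 1_200_000), (20.0, 2_450_000)]

start = byte_offset_for_seek(14.0, keyframes)  # seek to 14 s -> keyframe at 10 s
print(range_header(start))                     # bytes=1200000-
```

The server side stays dumb: any HTTP server that honors Range requests (Apache does for static files) replies with 206 Partial Content from that offset, and the client's plug-in handles the rest.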