In general, no, because there are typically only dozens of CDN nodes that will make requests directly to your origin. Even Akamai, which has tens of thousands of edge nodes, typically uses comparatively few of them to make origin requests, in a sort of multi-layer hierarchy.
Also, unlike some "dumber" caching software, a CDN will typically "hold" multiple requests for the same file until the first is in cache, rather than passing multiple requests for the same file through to the back end. Even off-the-shelf proxy caching tools like Varnish and Nginx do this properly now.
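If you are fronting your origin with your own cache, that coalescing behaviour is a one-directive switch in Nginx, for example. A minimal fragment (the directives are real Nginx ones; the rest of the `proxy_cache` setup is assumed to exist elsewhere in your config):

```nginx
# Serialize cache fills: only the first request for a given cache key is
# passed to the origin; concurrent requests for the same file wait for
# that response to land in the cache instead of hitting the back end.
proxy_cache_lock on;
proxy_cache_lock_timeout 5s;  # waiting requests give up and go to origin after this
```

Varnish does the equivalent (request coalescing) by default, with no configuration needed.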
That said, I suppose if you have a hugely diverse set of content with very low temporal correlation, and a very underpowered origin... even 12 nodes requesting thousands of different files in rapid succession could be problematic. But if you're putting a 256 MB cheapo VPS behind a CDN, well, you're also being too cheap. My advice would be to use your CDN logs to get an idea of the worst-case scenario you might face, in terms of how many unique URLs are requested in a short time span, and from how many CDN nodes. Then load-test your origin for exactly that scenario and mix of files. Good numbers from a realistic test beat conjecture every time, and are usually not that difficult to obtain.
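A minimal sketch of that log-mining step, assuming a combined-log-style format (the sample lines are made up, and the field positions are an assumption; adjust the parsing for whatever your CDN actually emits):

```python
from collections import Counter

def url_mix(log_lines):
    """Tally requests per URL so the busiest window can be replayed
    proportionally in a load test."""
    counts = Counter()
    for line in log_lines:
        fields = line.split()
        if len(fields) > 6:  # combined-log style: the request path is field 7
            counts[fields[6]] += 1
    return counts

# Hypothetical log lines standing in for a real CDN origin-pull log.
sample = [
    '203.0.113.9 - - [10/Oct/2023:13:55:36 +0000] "GET /a.jpg HTTP/1.1" 200 512',
    '203.0.113.9 - - [10/Oct/2023:13:55:37 +0000] "GET /b.jpg HTTP/1.1" 200 1024',
    '198.51.100.4 - - [10/Oct/2023:13:55:38 +0000] "GET /a.jpg HTTP/1.1" 200 512',
]
print(url_mix(sample).most_common())  # -> [('/a.jpg', 2), ('/b.jpg', 1)]
```

Feed the resulting URL mix, weighted by those counts, into whatever load-testing tool you prefer.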
Apparently yes, if you end up with a CDN assembled out of dumb proxies, at any rate:
http://www.jet-stream.com/blog/downsides-of-http-adaptive-bit-rate-streaming/
Quite how you are supposed to determine whether your CDN does this or not isn't clear to me, but you can hopefully assume that the bigger players get it right.