I have a PHP script that uses cURL
to load a file from another server.
The file is about 24MB. I understand why the script itself takes a while to execute while it loads the file, but any other request to the site while the script is running hangs until the script finishes.
This did not happen on our old shared server with the same script. The new server is a cloud server. I took it up to 10 nodes (6GHz dedicated CPU, 3760MB RAM, 2500GB bandwidth) and it had no effect on this issue.
I don't mind the script itself taking a long time to execute since it is going to be an automated task for a data feed. I can't have the whole site locking up while it runs though.
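The fetch itself is essentially just a cURL download, something like this (simplified; the URL and local path are placeholders):

```php
<?php
// Stream the remote file straight to disk so the 24MB never sits in memory.
$url  = 'https://feeds.example.com/products.xml'; // placeholder
$dest = '/tmp/feed.xml';                          // placeholder

$fp = fopen($dest, 'w');
$ch = curl_init($url);
curl_setopt($ch, CURLOPT_FILE, $fp);            // write the body directly to the file handle
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
curl_setopt($ch, CURLOPT_TIMEOUT, 600);         // give the download plenty of time

if (curl_exec($ch) === false) {
    error_log('Feed download failed: ' . curl_error($ch));
}

curl_close($ch);
fclose($fp);
```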
Any ideas why this could be happening?
Update: it looks like this is only happening locally. If I try to load the site from a separate computer while the script is running, it works as expected.
You might be able to call ignore_user_abort(true) and then send an HTTP redirect so the visitor is taken to another page while the script keeps running.
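Something along these lines (a rough sketch; whether the connection is actually released early depends on the SAPI and output buffering, and run_feed_import() stands in for your actual download/import code):

```php
<?php
ignore_user_abort(true);   // keep running even if the browser disconnects
set_time_limit(0);         // don't let max_execution_time kill the import

// Answer the request immediately with a redirect...
header('Location: /feed-started.php');  // placeholder target page
header('Content-Length: 0');
header('Connection: close');
flush();                   // push the response out to the client

// ...then carry on with the slow work after the response has been sent.
run_feed_import();         // placeholder for the cURL download + processing
```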
PHP doesn't have built-in threading, but you can push the work into a separate background process (for example by forking with the pcntl extension from the CLI), which might help as well.
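A sketch of the forking approach (assumes the CLI SAPI with the pcntl extension enabled; run_feed_import() is a placeholder again):

```php
<?php
$pid = pcntl_fork();

if ($pid === -1) {
    die("Could not fork\n");
} elseif ($pid === 0) {
    // Child process: do the slow download/import here.
    run_feed_import();
    exit(0);
} else {
    // Parent process: free to return immediately.
    echo "Import started in background (pid $pid)\n";
}
```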
Now, the key question is: what else does your script do besides fetching that file? Does it process it and insert the data into a database? Perhaps your old environment had MySQL using InnoDB tables and your new environment has MySQL with MyISAM tables.
MyISAM takes a full table lock during write operations instead of the row-level locks InnoDB uses, so that could cause your site to hang if the site reads from the same tables the cURL script is writing to.
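A quick way to check is to list the engine each table uses, and convert any MyISAM tables the feed writes to (connection details and table name below are placeholders):

```php
<?php
$db = new mysqli('localhost', 'user', 'pass', 'your_database'); // placeholders

// List every table and its storage engine.
$result = $db->query('SHOW TABLE STATUS');
while ($row = $result->fetch_assoc()) {
    echo $row['Name'] . ' => ' . $row['Engine'] . "\n";
}

// Convert a table the feed writes to so writes take row-level locks:
// $db->query('ALTER TABLE products ENGINE=InnoDB');
```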