I need to download a large CSV file (300 MB uncompressed) and import it into MySQL via cron.
Now, I had a PHP script that downloaded the file in chunks and saved it to the /tmp directory using:
PHP Code:
// Stream the download in 8 KB chunks so the whole 300 MB file
// never has to sit in memory at once
$handle  = fopen($url, 'rb');
$handle2 = fopen($file, 'wb'); // open the local file once, not on every chunk
while (!feof($handle)) {
    fwrite($handle2, fread($handle, 8192));
}
fclose($handle2);
fclose($handle);
Then I'd import it into MySQL using:

Code:
LOAD DATA LOCAL INFILE '/tmp/filename.csv' REPLACE INTO TABLE table_name FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"' LINES TERMINATED BY '\n' IGNORE 1 LINES

And finally I'd delete the file in /tmp.
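Roughly, the import and cleanup part of the script looks like this (connection details are placeholders, and local_infile needs to be enabled on the server for LOAD DATA LOCAL to work):

PHP Code:
// Sketch of the import + cleanup step; host/user/pass/db names are placeholders
$db = mysqli_init();
$db->options(MYSQLI_OPT_LOCAL_INFILE, true); // allow LOCAL INFILE on the client side
$db->real_connect('localhost', 'db_user', 'db_pass', 'db_name');

$db->query("LOAD DATA LOCAL INFILE '/tmp/filename.csv' REPLACE INTO TABLE table_name
            FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '\\\"'
            LINES TERMINATED BY '\\n' IGNORE 1 LINES");

unlink('/tmp/filename.csv'); // delete the temp file once the import finishes
$db->close();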
This was running fine on my test box, but when I tested it on the live server I got an email warning: "CPU usage persists, continued running of this cron job will result in suspension".
The last thing I want to do is harm server performance for others, so what would be the best way to achieve this, or is it not possible in a shared environment?
Ideally the cron would run once a week (time/day unimportant).
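Something like this crontab entry is what I had in mind (the day, time, and script path are just examples):

Code:
# example: run the import script every Sunday at 03:00
0 3 * * 0 /usr/bin/php /home/username/scripts/import_csv.php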
Thanks for any help