I wrote a PHP script that scrapes particular data from a website, extracts the relevant information, and stores it in a database. The script works fine on my local machine, but when I run it on the live server via a cron job, it stops after 10 database inserts with:
"Fatal error: Allowed memory size of 67108864 bytes exhausted (tried to allocate 35 bytes) in /home/content/36/78632936/html/scripts/simple_html_dom.php on line 809"
I profiled the same script locally with memory_get_peak_usage(), and the peak comes to around 8 MB.
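For reference, this is roughly how I take the local measurement (the shutdown hook is just one convenient way to capture the peak; the exact wiring in my script may differ):

```php
<?php
// Print the script's peak memory usage when it finishes.
// memory_get_peak_usage(true) reports the real allocated size in bytes.
register_shutdown_function(function () {
    printf("Peak memory: %.1f MB\n", memory_get_peak_usage(true) / 1048576);
});
```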
I am puzzled as to why the memory usage is so much higher on the live machine. Any help will be appreciated.
A glimpse of my code:
DB_table1: contains 60,000 rows of data
The main code starts by querying DB_table1, then uses each row to build a URL. Each URL is loaded with simple_html_dom(), the script looks for particular information on each page, and the results are stored in another table, DB_table2.
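To make the flow concrete, here is a minimal sketch of the loop described above. The table names, column names, URL pattern, and selector are placeholders, not my actual code. One detail worth noting: the simple_html_dom documentation says parsed documents hold circular references and must be released with clear(), otherwise each page stays in memory for the life of the script.

```php
<?php
// Sketch of the scraping loop: DB_table1 -> URL -> simple_html_dom -> DB_table2.
// All identifiers (mydb, keyword, div.result, example.com) are placeholders.
require_once 'simple_html_dom.php';

$pdo = new PDO('mysql:host=localhost;dbname=mydb', 'user', 'pass');

$rows   = $pdo->query('SELECT keyword FROM DB_table1')->fetchAll(PDO::FETCH_ASSOC);
$insert = $pdo->prepare('INSERT INTO DB_table2 (keyword, info) VALUES (?, ?)');

foreach ($rows as $row) {
    $url  = 'http://example.com/search?q=' . urlencode($row['keyword']);
    $html = file_get_html($url);   // simple_html_dom page loader
    if ($html === false) {
        continue;                  // skip pages that fail to load
    }

    // Placeholder for the real extraction logic
    $node = $html->find('div.result', 0);
    $info = $node ? trim($node->plaintext) : '';

    $insert->execute([$row['keyword'], $info]);

    $html->clear();                // break circular refs so the page is freed
    unset($html);
}
```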
Please let me know if you need anything else. Thanks :)