Re: [PHP] Allowed memory size exhausted
- Date: Sun, 11 Oct 2015 21:27:22 -0500
- From: Karl DeSaulniers <karl@xxxxxxxxxxxxxxx>
- Subject: Re: [PHP] Allowed memory size exhausted
On Oct 11, 2015, at 9:21 PM, Aziz Saleh <azizsaleh@xxxxxxxxx> wrote:
> On Sun, Oct 11, 2015 at 10:14 PM, Karl DeSaulniers <karl@xxxxxxxxxxxxxxx> wrote:
> Getting this error:
> Fatal error: Allowed memory size of 103809024 bytes exhausted (tried to allocate 523800 bytes) in [...]/html/wp-includes/cache.php on line 113
> It is being generated by a script that reads an Excel file and then updates a database with the new info.
> I went in and unset all the arrays that were set along the way, but to no avail. Got it down to 9 bytes once, though.
> However, my question isn't about allocating memory. I know how to do that server side, but I want to avoid that.
> This is a simple order database with fields filled with order info. Nothing complicated.
> What I am wondering is what is the best way to avoid this memory issue knowing there will be a lot of entries to go through.
> Read each field from the spreadsheet and insert it individually? Read all fields from the spreadsheet and then insert individually?
> Any known ways of reading all and inserting all at the same time? Suggestions?
> Karl DeSaulniers
> Design Drumm
> To avoid memory issues, I would do each row individually, reading the file line by line, using something like fgetcsv/fgets.
> If you attempt to read the entire file and then insert, you are hostage to the file's size. Doing it line by line avoids this issue even if you have a 10GB file.
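> As a minimal sketch of what I mean, assuming the spreadsheet were exported to CSV (the helper name and the callback are just illustrations, not your actual code):
>
> ```php
> <?php
> // Read a CSV one line at a time and hand each row to an insert callback.
> // Only one row is ever held in memory, so file size stops mattering.
> function insertOrdersFromCsv($path, $insertRow) {
>     $handle = fopen($path, 'r');
>     if ($handle === false) {
>         return 0;
>     }
>     $header = fgetcsv($handle);            // first line: column names
>     $count  = 0;
>     while (($row = fgetcsv($handle)) !== false) {
>         // array_combine keys the row by column name, e.g. ['item' => 'widget']
>         $insertRow(array_combine($header, $row));
>         $count++;
>     }
>     fclose($handle);
>     return $count;
> }
> ```
>
> In $insertRow you would run a single parameterized INSERT per row instead of accumulating anything.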
That is what I was thinking, since at the moment it works the way you describe: reading everything from the sheet first and then inserting.
My task now is figuring out how to iterate each row individually; however, I have a dependency: PHPExcel.
Not sure how to use it to read only one line at a time.
I appreciate the corroboration.
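For what it's worth, PHPExcel does support reading in chunks via a read filter, so you don't have to load the whole workbook at once. A rough sketch, assuming a file named orders.xlsx and an .xlsx ("Excel2007") reader; the chunk size, row bounds, and the insert placeholder are all assumptions to adapt:

```php
<?php
require_once 'PHPExcel/IOFactory.php';

// Read filter that only admits one chunk of rows (plus the heading row).
class ChunkReadFilter implements PHPExcel_Reader_IReadFilter {
    private $startRow = 0;
    private $endRow   = 0;
    public function setRows($startRow, $chunkSize) {
        $this->startRow = $startRow;
        $this->endRow   = $startRow + $chunkSize;
    }
    public function readCell($column, $row, $worksheetName = '') {
        // Always read row 1 (headings), plus the rows in the current chunk.
        return ($row == 1) || ($row >= $this->startRow && $row < $this->endRow);
    }
}

$inputFile = 'orders.xlsx';   // assumption: your spreadsheet
$chunkSize = 100;             // tune to taste

$filter = new ChunkReadFilter();
$reader = PHPExcel_IOFactory::createReader('Excel2007');
$reader->setReadDataOnly(true);
$reader->setReadFilter($filter);

for ($startRow = 2; $startRow <= 10000; $startRow += $chunkSize) {
    $filter->setRows($startRow, $chunkSize);
    $excel = $reader->load($inputFile);   // loads only the filtered rows
    foreach ($excel->getActiveSheet()->getRowIterator($startRow) as $row) {
        // ... build and run one INSERT for this row ...
    }
    $excel->disconnectWorksheets();       // free memory before the next chunk
    unset($excel);
}
```

Each pass through the loop re-reads the file but only materializes $chunkSize rows, which keeps the peak memory flat regardless of how many orders are in the sheet.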