Help with server-side processing of a huge text file
I am currently trying to parse a huge text file (over 1,000,000 lines) and display it in a DataTable. I am using server-side processing with sAjaxSource pointing to a PHP file, whose task is to read the text file line by line and parse it into an array to pass to the DataTable.

Obviously it is not practical to read the whole file into one massive array, so I have to be a little cleverer and combine the reading with pagination: read only x lines at a time (using PHP's fopen and fgets) to build the DataTable array, then stop. This works fine until I get deep into the file. Since the sAjaxSource script is called on every request, I have to fopen the file each time and then fgets line by line until I reach the first line of the requested page. Deep in the file, this takes a long time.

My question is: is it possible to store the file handle from fopen somewhere between requests to my sAjaxSource script, so that I can resume going through the file with fseek the next time sAjaxSource is called? Hope that makes sense.
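A file handle cannot survive between HTTP requests, but the same effect can be had by persisting byte offsets instead of the handle: scan the file once, record the ftell() position at the start of each page, and on later requests fseek() straight to the requested page. A minimal sketch of that idea, assuming a plain newline-delimited file (the function names `buildOffsetIndex` and `readPage` are made up for illustration):

```php
<?php
// One-time pass: record the byte offset where each page of $pageSize
// lines begins. The resulting array is cheap to cache (session,
// serialized file, etc.) and replaces the per-request fgets() scan.
function buildOffsetIndex(string $path, int $pageSize): array
{
    $offsets = [0];                    // page 0 starts at byte 0
    $fh = fopen($path, 'r');
    $line = 0;
    while (fgets($fh) !== false) {
        if (++$line % $pageSize === 0) {
            $offsets[] = ftell($fh);   // start of the next page
        }
    }
    fclose($fh);
    return $offsets;
}

// Per-request read: seek directly to the page, then fgets() only the
// lines needed for the DataTable response.
function readPage(string $path, array $offsets, int $page, int $pageSize): array
{
    $fh = fopen($path, 'r');
    fseek($fh, $offsets[$page]);       // jump straight to the page start
    $rows = [];
    for ($i = 0; $i < $pageSize && ($row = fgets($fh)) !== false; $i++) {
        $rows[] = rtrim($row, "\n");
    }
    fclose($fh);
    return $rows;
}
```

The index only needs rebuilding when the file changes, so each Ajax call then costs one fopen, one fseek, and pageSize calls to fgets, regardless of how deep into the file the page is.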
Replies
Allan