Help with server-side processing of a huge text file

_dp_ · Posts: 2 · Questions: 0 · Answers: 0
edited November 2011 in General
I am currently trying to parse a huge text file (over 1,000,000 lines) and display it in a DataTable. I am using server-side processing with sAjaxSource pointing to a PHP script that reads the text file line by line and parses it into an array to pass to the DataTable. Obviously it is not practical to read the whole file into one massive array, so I have to be a little cleverer and combine this with the pagination, reading only x lines at a time (using PHP's fopen and fgets) to build the DataTable array and then stopping.

This is fine until I get deep into the file. Since the sAjaxSource script is called on every request, I also have to do an fopen every time and then call fgets repeatedly until I reach the first line of the requested page. Deep in the file, that takes a long time.

My question is: can I store the file handle from fopen/fgets in my sAjaxSource script, so that I can resume reading the file with fseek the next time sAjaxSource is called? Hope that makes sense.
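A minimal sketch of the pattern described above (function name and tab-delimited format are my assumptions, not from the original post). Every Ajax call reopens the file and skips the lines before the requested page with fgets, which is why the cost grows the deeper the page is:

```php
<?php
// Hypothetical sketch of the per-request pagination described above.
// Each call reopens the file and discards $start lines with fgets(),
// so the work grows linearly with how deep into the file the page is.
function readPage(string $path, int $start, int $length): array
{
    $fh = fopen($path, 'r');
    // Skip the lines before this page -- this is the slow part.
    for ($i = 0; $i < $start && fgets($fh) !== false; $i++) {
    }
    $rows = [];
    for ($i = 0; $i < $length && ($line = fgets($fh)) !== false; $i++) {
        // Assuming a tab-delimited file; adjust the separator as needed.
        $rows[] = str_getcsv(rtrim($line, "\r\n"), "\t");
    }
    fclose($fh);
    return $rows;
}
```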

Replies

  • allan · Posts: 63,535 · Questions: 1 · Answers: 10,475 · Site admin
    What you might be able to do is store the resource handle in a $_SESSION parameter so that it is available each time the user makes a call. That will speed things up a lot - but it doesn't take into account filtering and sorting. Are those features you want? If so, you might be best off dumping your text file into an SQL database table (even a temporary one) and letting the SQL engine do all the grunt work.

    Allan
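One caveat worth noting when trying the session idea: a file handle is a PHP resource and cannot be serialized into $_SESSION between requests. A common adaptation is to store the byte offset from ftell() in the session instead, and fseek() straight to it on the next call. A sketch (function and variable names are hypothetical; in a real sAjaxSource script the array would be $_SESSION after session_start()):

```php
<?php
// Sketch of the session idea, adapted: the handle itself cannot survive
// in $_SESSION, but the byte offset from ftell() can. When the next
// request continues where the last one stopped, fseek() jumps straight
// there instead of re-reading the file from the top.
function readNextPage(string $path, int $length, array &$session): array
{
    $fh = fopen($path, 'r');
    // Resume from the offset saved on the previous call (0 on the first).
    fseek($fh, $session['offset'] ?? 0);
    $rows = [];
    for ($i = 0; $i < $length && ($line = fgets($fh)) !== false; $i++) {
        $rows[] = rtrim($line, "\r\n");
    }
    $session['offset'] = ftell($fh); // persist for the next request
    fclose($fh);
    return $rows;
}
```

This only helps when the user pages forward sequentially; jumping to an arbitrary page, sorting, or filtering still re-reads the file, which is where the SQL-table approach above pays off.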
This discussion has been closed.