Long loading time for extremely large data (server-side processing)

ixqfeakc Posts: 10, Questions: 0, Answers: 0
edited July 2013 in General
Hello,

I am currently using the newest DataTables with a huge SQLite database (around 2 million rows, 10+ columns).
I am using my own PHP code to connect to SQLite for server-side processing, and I have also enabled bDeferRender, the filtering-delay plug-in, etc., but it is still relatively slow: it takes 20+ seconds to load the initial data or to scroll down.

Has anyone worked with this amount of data and still had satisfactory speed? If so, please give me some hints on how to improve the performance.

As for test cases, I guess they will not help much, but I will put them here anyway:

http://live.datatables.net/iwigex/edit#javascript,html
http://debug.datatables.net/aqosun

Replies

  • fbas Posts: 1,094, Questions: 4, Answers: 0
    I would guess that your server side script is returning all the rows (rather than paginating, as it should). This seems like the only explanation for the long processing times.
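
    Something like this is roughly what the paging part should boil down to (just a sketch, not your actual script; the database path and table name are placeholders), so that each request only pulls one page of rows out of SQLite:

    [code]
    // Build the page window from the parameters DataTables sends.
    $start  = isset( $_GET['iDisplayStart'] )  ? intval( $_GET['iDisplayStart'] )  : 0;
    $length = isset( $_GET['iDisplayLength'] ) ? intval( $_GET['iDisplayLength'] ) : 10;

    $db = new SQLite3( '/path/to/data.sqlite' );   // placeholder path
    $rResult = $db->query( "SELECT * FROM my_table LIMIT $start, $length" );   // placeholder table

    $aaData = array();
    while ( $aRow = $rResult->fetchArray( SQLITE3_NUM ) )
    {
    $aaData[] = $aRow;   // only one page of rows ends up in the JSON
    }
    [/code]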
  • ixqfeakc Posts: 10, Questions: 0, Answers: 0
    Thank you for the response! So is it expected to be as fast as a typical table with a small data set?

    I am using Scroller, and it seems to fetch only around 80 rows at a time (observed in Firebug from the returned JSON). Whenever I scroll down or up, it sends another request, which takes another 20+ seconds to return the expected 80 rows; fetching 80 rows at a time seems pretty reasonable to me...

    Any thoughts on where I should start working on this?
  • allan Posts: 63,516, Questions: 1, Answers: 10,472 (Site admin)
    > So, is it expected to be as fast as the usual ones with small data?

    If you are using server-side processing, then yes. I've seen server-side processing working with tens of millions of rows returning results in fractions of a second.

    I'd agree with fbas - it sounds like your script is taking a long time to reply, so you might want to look at optimisations there.
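
    As a quick check, you could time each query inside the script (a rough sketch, assuming the SQLite3 connection is in $db and the generated SQL is in $sQuery) to see where the 20+ seconds are going:

    [code]
    // Log how long the query takes, so you can tell whether the time
    // is spent in SQLite or elsewhere in the script.
    $t0 = microtime( true );
    $rResult = $db->query( $sQuery );
    error_log( 'Query took '.round( microtime(true) - $t0, 3 ).' seconds' );
    [/code]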

    Allan
  • ixqfeakc Posts: 10, Questions: 0, Answers: 0
    Great! At least I know I am not attempting mission impossible :)
    After another day of trying, I still cannot figure out the solution. Could someone look at my PHP code and give some suggestions?

    I have found that $iFilteredTotal is behaving strangely, but I cannot find a solution for it.

    [code]
    /*
    * Paging
    */
    $sLimit = "";
    if ( isset( $_GET['iDisplayStart'] ) && $_GET['iDisplayLength'] != '-1' )
    {
    $sLimit = "LIMIT ".intval( $_GET['iDisplayStart'] ).", ".
    intval( $_GET['iDisplayLength'] );
    }


    /*
    * Ordering
    */
    $sOrder = "";
    if ( isset( $_GET['iSortCol_0'] ) )
    {
    $sOrder = "ORDER BY ";
    for ( $i=0 ; $i<intval( $_GET['iSortingCols'] ) ; $i++ )
    {
    if ( $_GET[ 'bSortable_'.intval($_GET['iSortCol_'.$i]) ] == "true" )
    {
    $sOrder .= $aColumns[ intval( $_GET['iSortCol_'.$i] ) ]." ".($_GET['sSortDir_'.$i]==='asc' ? 'asc' : 'desc').", ";
    }
    }
    $sOrder = substr_replace( $sOrder, "", -2 );
    if ( $sOrder == "ORDER BY" ) { $sOrder = ""; }
    }

    /* Filtering - global search across all columns (as in the standard DataTables server-side example) */
    $sWhere = "";
    if ( isset( $_GET['sSearch'] ) && $_GET['sSearch'] != "" )
    {
    $sWhere = "WHERE (";
    for ( $i=0 ; $i<count($aColumns) ; $i++ )
    {
    $sWhere .= $aColumns[$i]." LIKE '%".SQLite3::escapeString( $_GET['sSearch'] )."%' OR ";
    }
    $sWhere = substr_replace( $sWhere, "", -3 );
    $sWhere .= ')';
    }

    /* Main query - get the page of data to display */
    $sQuery = "
    SELECT ".implode(", ", $aColumns)."
    FROM $sTable
    $sWhere $sOrder $sLimit
    ";
    $rResult = $db->query( $sQuery ) or fatal_error( 'MySQL Error: ' . $db->lastErrorMsg() );

    /* Data set length after filtering */
    $sQuery = "
    SELECT COUNT($sIndexColumn)
    FROM $sTable
    $sWhere $sLimit
    ";
    $rResultFilterTotal = $db->query( $sQuery ) or fatal_error( 'MySQL Error: ' . $db->lastErrorMsg() );
    $aResultFilterTotal = $rResultFilterTotal->fetchArray();
    $iFilteredTotal = $aResultFilterTotal[0];

    /* Total data set length */
    $sQuery = "
    SELECT COUNT($sIndexColumn)
    FROM $sTable
    ";
    $rResultTotal = $db->query( $sQuery ) or fatal_error( 'MySQL Error: ' . $db->lastErrorMsg() );
    $aResultTotal = $rResultTotal->fetchArray();
    $iTotal = $aResultTotal[0];


    /*
    * Output
    */
    $output = array(
    "sEcho" => intval($_GET['sEcho']),
    "iTotalRecords" => "$iTotal",
    "iTotalDisplayRecords" => "$iFilteredTotal",
    "aaData" => array()
    );

    while ( $aRow = $rResult->fetchArray() )
    {
    $row = array();
    for ( $i=0 ; $i<count($aColumns) ; $i++ )
    {
    $row[] = $aRow[ $aColumns[$i] ];
    }
    $output['aaData'][] = $row;
    }

    echo json_encode( $output );
    [/code]
  • fbas Posts: 1,094, Questions: 4, Answers: 0
    It would help to see your site, or a test case demonstrating the problem, or even just the outbound AJAX call that your code is making in your browser.
  • allan Posts: 63,516, Questions: 1, Answers: 10,472 (Site admin)
    The other thing would be to run the SQL statements that are generated by the PHP script, directly against the database. If they are slow (which I suspect they will be) then you need to look at tuning your database (adding indexes etc).
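
    With SQLite you can check the plan for the generated query and add indexes on the columns you sort and filter on. This is just a sketch - the index name and column are placeholders, and it assumes the variables from your script above:

    [code]
    // "SCAN TABLE ..." in the plan means a full table scan;
    // "SEARCH TABLE ... USING INDEX ..." means an index is being used.
    $plan = $db->query( "EXPLAIN QUERY PLAN $sQuery" );
    while ( $row = $plan->fetchArray( SQLITE3_ASSOC ) )
    {
    error_log( print_r( $row, true ) );
    }

    // Index the columns used in ORDER BY / WHERE (placeholder column name).
    $db->exec( "CREATE INDEX IF NOT EXISTS idx_my_sort_col ON $sTable (my_sort_col)" );
    [/code]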

    Allan
  • ixqfeakc Posts: 10, Questions: 0, Answers: 0
    @fbas
    My data is currently on an internal server, and I do not really have a public website where I can put a 500 MB database... :/

    @allan
    Thank you for the suggestion. The problem IS in the database... I guess I will need to examine the performance of SQLite.

    Just a question on SQLite: it seems SQLite will not do a COUNT on a SQL statement with a LIMIT, but in the PHP script the count obviously needs to be done to report the filtered result total, etc. Is there a way to get around this?
  • ixqfeakc Posts: 10, Questions: 0, Answers: 0
    Oops, never mind. I just removed the LIMIT clause from that count query... it works well so far :)
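
    For anyone else hitting this, the filtered count now boils down to something like this (same WHERE clause as the data query, but no LIMIT) - just a sketch using the variables from the script above:

    [code]
    /* Filtered total: count over the same WHERE, without the LIMIT */
    $sQuery = "SELECT COUNT($sIndexColumn) FROM $sTable $sWhere";
    $iFilteredTotal = $db->querySingle( $sQuery );
    [/code]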
  • allan Posts: 63,516, Questions: 1, Answers: 10,472 (Site admin)
    Yup. LIMIT is a funny one in SQL - you'd think it would be an obvious one to be standardised, but they all seem to do it slightly differently!

    Allan
This discussion has been closed.