Scrolling - "bServerSide": true with "bDeferRender": true

Scrolling - "bServerSide": true with "bDeferRender": true

ZoltarHawk Posts: 5 · Questions: 0 · Answers: 0
edited September 2011 in General
Greetings,

Just a quick question to ensure that I am not missing something obvious.

Firstly, the issue: I am using DataTables with data supplied from a server-side MySQL database. Everything works, but it is very slow when scrolling through a 442-record display, with empty-screen pauses while it shows "Processing...".

From looking at the background server routine, it clearly completes the SELECT for all required data and parses it into the $output array before the echo json_encode statement. At this point it has the 442 records loaded and displayed ("iTotalRecords":"442", "iTotalDisplayRecords":"442"), and it is possible to scroll very quickly to the bottom of the display list.

Now at this point it gets confusing: firstly it shows "Processing..." and returns with line 24 as the top line, with the scroll bar indicator at the halfway point. If I move the scroll bar indicator to the top, I get an empty screen with more "Processing..." until it finally returns with the first record.

The question: when it is "Processing...", what exactly is it doing? From my analysis it is re-reading the MySQL table, and I can't understand the logic of this approach when the data has already been fetched once and nothing has changed. Am I missing something obvious?

Thanks.

Replies

  • fbas Posts: 1,094 · Questions: 4 · Answers: 0
    I was under the impression that bDeferRender was intended for Ajax-sourced data, not for bServerSide processing.

    With bServerSide, you can use paging to control performance. Rather than returning all 442 records, you could return records 1 through 10, 11 through 20, and so on: whatever is needed for that page (rough sketch below).
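
    Something like this, just as an untested sketch (the name 'server_processing.php' is only a placeholder for your own server-side script):

    [code]
    $(document).ready(function() {
        $('#example').dataTable( {
            "bProcessing": true,                 // show "Processing..." while waiting for the server
            "bServerSide": true,                 // each draw asks the server for just one page of rows
            "sAjaxSource": 'server_processing.php',
            "bPaginate": true,                   // paging on, so only a handful of rows travel per request
            "iDisplayLength": 10
        } );
    } );
    [/code]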
  • ZoltarHawk Posts: 5 · Questions: 0 · Answers: 0
    fbas - thanks for the comment.

    Again, my understanding is that to use server-side data you have to use the sAjaxSource call, as in "sAjaxSource": 'login-history.php'. If this is incorrect then I would appreciate some info on the alternative method.

    I will spend some time looking at paging to see if the performance can be improved. However I would still like to know if my assumption regarding the continual re-accessing of the database is correct.

    Thanks again.
  • allan Posts: 63,534 · Questions: 1 · Answers: 10,475 · Site admin
    fbas is quite correct in saying that bDeferRender is for client-side processing with an Ajax source - it will make absolutely no difference in terms of performance when you are using server-side processing (since server-side processing is effectively doing the same thing already!).

    The idea with server-side processing is that it will only load the records that you need for the current page. However, if you have scrolling enabled, do you also have paging disabled? Fewer than 500 rows isn't much, but you can start to feel the performance hit a little at that point. If you aren't using server-side processing, then deferred rendering will make a huge difference to this (see the sketch below).
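
    For comparison, the case where bDeferRender does help is client-side processing with an Ajax source. Roughly, and only as a sketch (the source file name is a placeholder), where the whole data set is fetched once and row nodes are created only as they are displayed:

    [code]
    $(document).ready(function() {
        $('#example').dataTable( {
            "sAjaxSource": 'sources/all_rows.txt',   // placeholder: full data set, fetched once
            "bDeferRender": true,                    // only create TR/TD nodes when rows are shown
            "sScrollY": "335px"
        } );
    } );
    [/code]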

    What you might want to consider is the Scroller plug-in for DataTables: http://datatables.net/extras/scroller/ . That will allow you to combine server-side processing and full scrolling.
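
    As a rough, untested sketch (the selector and scroll height here are placeholders, and the Scroller script needs to be included on the page), combining the two looks something like:

    [code]
    $(document).ready(function() {
        $('#example').dataTable( {
            "bProcessing": true,
            "bServerSide": true,
            "sAjaxSource": 'login-history.php',
            "sScrollY": "200px",
            "sDom": 'frtiS'        // the trailing 'S' enables the Scroller plug-in
        } );
    } );
    [/code]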

    Allan
  • fbas Posts: 1,094 · Questions: 4 · Answers: 0
    You need to set sAjaxSource as well as bServerSide: true.

    Your sAjaxSource script also needs to accept the parameters listed on this page: http://www.datatables.net/usage/server-side
    and return a valid JSON object containing the data (aaData) and a few other properties, as described on that page (rough illustration below).
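
    Something like this (the row values are invented just to show the shape; sEcho must echo back the draw counter the client sent):

    [code]
    {
        "sEcho": 3,
        "iTotalRecords": 442,
        "iTotalDisplayRecords": 442,
        "aaData": [
            [ "2011-09-01 09:15", "jsmith", "OK" ],
            [ "2011-09-01 09:42", "jdoe",   "OK" ]
        ]
    }
    [/code]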

    Here's a sample server-side script in PHP:
    http://www.datatables.net/release-datatables/examples/server_side/server_side.html

    Here's a blog post I wrote that goes line-by-line through the server-side script:
    tote-magote.blogspot.com/2011/08/serverprocessingphp-for-datatables.html

    This blog post might give you ideas for enhancing the basic script.
  • ZoltarHawk Posts: 5 · Questions: 0 · Answers: 0
    Greetings and thank you for your responses.

    Having gone through all the suggested links, I believe it may be more beneficial at this point to post the code I am working with; perhaps, if anyone has the time, they can advise me on any changes or enhancements that will speed up the drawing of the screen.

    What I have is a template that I have used for a number of reports; until now the test data never had more than 20 records, so speed was never an issue. At the moment it is structured as:

    1) Include the fnSetFilteringDelay plug-in.
    2) Set variables: depending on the selection criteria I allow two formats of the display output, so I need options to control some aspects of the display (in the example below only sScrollXInner is set differently).
    3) Main section: $(document).ready(function() { ... } ); (includes FixedColumns and TableTools).

    Again, this all works, except that it is noticeably slow with 442 records. Any assistance is appreciated.

    -----------------------------------------------------------
    [code]

    jQuery.fn.dataTableExt.oApi.fnSetFilteringDelay = function ( oSettings, iDelay ) {
        /*
         * Inputs:  object:oSettings - dataTables settings object - automatically given
         *          integer:iDelay - delay in milliseconds
         * Usage:   $('#example').dataTable().fnSetFilteringDelay(250);
         * Author:  Zygimantas Berziunas (www.zygimantas.com) and Allan Jardine
         * License: GPL v2 or BSD 3 point style
         * Contact: zygimantas.berziunas /AT\ hotmail.com
         */
        var
            _that = this,
            iDelay = (typeof iDelay == 'undefined') ? 250 : iDelay;

        this.each( function ( i ) {
            $.fn.dataTableExt.iApiIndex = i;
            var
                $this = this,
                oTimerId = null,
                sPreviousSearch = null,
                anControl = $( 'input', _that.fnSettings().aanFeatures.f );

            anControl.unbind( 'keyup' ).bind( 'keyup', function() {
                var $$this = $this;

                if (sPreviousSearch === null || sPreviousSearch != anControl.val()) {
                    window.clearTimeout(oTimerId);
                    sPreviousSearch = anControl.val();
                    oTimerId = window.setTimeout(function() {
                        $.fn.dataTableExt.iApiIndex = i;
                        _that.fnFilter( anControl.val() );
                    }, iDelay);
                }
            });

            return this;
        } );
        return this;
    };

    // Set Some Variables //

    var param1;
    var width_val;
    var dom_selection;
    var fixedc;
    var widthc;

    param1 = "<?php echo $company_code;?>";
    if (param1.length == 0) // no company selected, so default to all companies
    {
        width_val = "1950";
        dom_selection = 'T<"toolbar"><"clear">frtiS';
        fixedc = '1';
        widthc = '150';
    }
    else
    {
        width_val = "1200";
        dom_selection = 'T<"toolbar"><"clear">frtiS';
        fixedc = '1';
        widthc = '150';
    }

    // Main Section

    $(document).ready(function() {

        var oTable = $('#loginhistory').dataTable( {

            // pass the selected company through to the server-side script
            "fnServerParams": function ( aoData ) {
                aoData.push( { "name": "param1", "value": param1 } );
            },

            "bProcessing": true,
            "bServerSide": true,
            "sAjaxSource": 'login-history.php',
            "sScrollY": "335px",
            "sScrollX": "100%",
            "sScrollXInner": width_val,

            "bScrollCollapse": true,
            "bPaginate": false,

            // fixed left-hand columns are added once the table has initialised
            "fnInitComplete": function () {
                new FixedColumns( oTable, {
                    "sLeftWidth": 'relative',
                    "iLeftColumns": fixedc,
                    "iLeftWidth": widthc
                } );
            },

            "sDom": dom_selection,
            "oTableTools": {
                "sSwfPath": "swf/datatables/copy_cvs_xls.swf",
                "aButtons": [
                    { "sExtends": "copy", "sButtonText": "Copy to clipboard" },
                    { "sExtends": "xls", "sButtonText": "Save for Excel" }
                ]
            }
        } );

        $("div.toolbar").html(' Login History');
        $('#loginhistory').dataTable().fnSetFilteringDelay(500);
    } );

    [/code]
    ___________________________________________________________________

    Thanks.
  • fbas Posts: 1,094 · Questions: 4 · Answers: 0
    1) If you're using bServerSide and want pagination to work, don't set bPaginate: false.

    Pagination will likely give a huge performance increase. My projects don't run well at all displaying 500 records, and it will be even worse for people with low bandwidth between themselves and the server.

    2) I don't think you can call FixedColumns on oTable in fnInitComplete, because the oTable var isn't set until after fnInitComplete is done. Instead, move it to the line after the dataTable() call (sketch below).
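
    i.e. something like this (a sketch only, with most of your options elided):

    [code]
    $(document).ready(function() {
        var oTable = $('#loginhistory').dataTable( {
            "bProcessing": true,
            "bServerSide": true,
            "sAjaxSource": 'login-history.php'
            // ... the rest of your options, minus the fnInitComplete callback
        } );

        // oTable has been assigned by this point, so it is safe to hand to FixedColumns
        new FixedColumns( oTable, {
            "sLeftWidth": 'relative',
            "iLeftColumns": fixedc,
            "iLeftWidth": widthc
        } );
    } );
    [/code]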
  • ZoltarHawk Posts: 5 · Questions: 0 · Answers: 0
    fbas - Thanks for your comments.

    Firstly (1), I set bPaginate: false because that is how it is defined in the vertical scrolling example at http://www.datatables.net/examples/basic_init/scroll_y.html (see below). The description states it can be set either way, but does not indicate that it would work faster if set to true. I will change it to true and see if it helps.

    [code]
    $(document).ready(function() {
        $('#example').dataTable( {
            "sScrollY": "200px",
            "bPaginate": false
        } );
    } );
    [/code]

    (2) Your point regarding the positioning of the FixedColumns call made perfect sense, except that it does work as expected for me, apart from when I get mixed line heights with some data wrapping to a second line on some records. So I went and re-read some info on FixedColumns and fnInitComplete, and the reason it works is explained at http://datatables.net/beta/1.7/examples/api/api_in_init.html.

    What I would truly like to achieve is results similar to the server-side example at http://datatables.net/release-datatables/extras/Scroller/server-side_processing.html: 10,000 records with scrolling and sorting in what feels like nanoseconds. I can only assume that this is down to the server's processing power. Allan, if it's not an unreasonable question, could you give me some indication of the processing power available to this server-side scrolling example, so I can evaluate where I am against where I need to be?

    Thanks.
  • allan Posts: 63,534 · Questions: 1 · Answers: 10,475 · Site admin
    The server that this site is hosted on has two quad-core 2.66GHz Intel Xeon processors - but it is a VPS, so it is not dedicated to serving only this site. Server-side processing should really be able to cope with millions of rows without too much difficulty.

    A few things to check:

    1. Make sure you have an indexed primary key column in your table, and you are using that in the server-side script for counting rows etc.

    2. Echo out the SQL that is being generated and use "EXPLAIN" in your database to see what is slowing things down.

    3. Make sure that there isn't any network overhead - the Chrome developer tools are excellent for this.

    Allan
  • ZoltarHawk Posts: 5 · Questions: 0 · Answers: 0
    Allan - thanks for the comments.

    Firstly, I should say thanks to fbas for his comment on "bPaginate" being set to false. I set it to true and the result was much faster, more fluid and very acceptable page scrolling.

    Allan, can I suggest that you change the example, as per my previous comment, to show the bPaginate setting as 'true', to assist others going forward.

    In reply to your comments above, 1) the table is indexed and the $sIndexColumn is set accordingly, 2) this I will try just to see the results, 3) I have not noticed anything in regard to network overhead when testing using the Chrome developer tools, which I agree are excellent.

    For my part, this issue is now resolved. Thank you all for your help.
This discussion has been closed.