If a lot of lines, speed is very slow!

Tony Posts: 2 Questions: 0 Answers: 0
edited March 2009 in General
If a lot of lines, speed is very slow!

Replies

  • allan Posts: 63,498 Questions: 1 Answers: 10,471 Site admin
    Hi Tony,

    Perhaps you could provide a little bit more information so I can look at addressing this? How many records is it that you have in your table, how many columns and what initialisation are you using. Also the browser (or more specifically the Javascript engine) can make a big difference.

    If you are using more than a thousand rows (which it most certainly should be capable of dealing with) then you might want to consider using server-side processing to speed things up - this is now available in the 1.5 beta series.

    And one final thing: as you switch off features you should notice things speed up - obviously the more features you have enabled, the longer it takes to process things. The feature that makes the biggest difference on large tables is bSortClasses ( http://datatables.net/usage#bSortClasses ). I'd strongly recommend disabling this for very large client-side tables.
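A minimal initialisation along these lines might look as follows (a sketch, assuming a table with id `example` and the 1.x Hungarian-notation API; this is browser-side jQuery code):

```javascript
$(document).ready(function () {
  $('#example').dataTable({
    // Disable the per-cell sorting classes - the single biggest
    // performance win on large client-side tables.
    "bSortClasses": false
  });
});
```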

    Allan
  • Tony Posts: 2 Questions: 0 Answers: 0
    Hi Allan,

    Tools: Visual Studio 2005
    Language: C#
    Controls: GridView
    Rows: more than 2000
    Columns: around 20 or so

    It is very slow to load the data, and filtering is very slow as characters are typed!

    Would you help me?
    Thanks!

    Tony
  • allan Posts: 63,498 Questions: 1 Answers: 10,471 Site admin
    Hi Tony,

    Yup - 40'000 cells is quite a large amount of data, so this isn't too surprising :-). I would certainly disable the sorting classes as I recommended above. You might also want to consider using the fnSetFilteringDelay API plug-in from Zygimantas Berziunas to put a delay into the filtering, such that it doesn't filter immediately on every key stroke: http://datatables.net/plug-ins#api_fnSetFilteringDelay . The final solution is to use server-side processing.
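Typical usage of that plug-in is along these lines (a sketch, assuming the 1.x API, a table with id `example`, and that the plug-in script is already included; 250 ms is an arbitrary delay):

```javascript
$(document).ready(function () {
  var oTable = $('#example').dataTable();
  // Wait 250 ms after the last key stroke before filtering, rather
  // than re-filtering the whole table on every single key press.
  oTable.fnSetFilteringDelay(250);
});
```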

    Do you have a link to your page that you could provide so I can profile DataTables with this amount of data?

    When you say you are using GridView with Visual Studio, how does this relate to DataTables?

    Thanks,
    Allan
  • vex Posts: 30 Questions: 0 Answers: 0
    Looking at the code, it seems like all the DOM elements are created when processing the data for storage (i.e. at init/when adding rows). Changing it so it generates the DOM elements on display (and stores them) would probably speed up DataTables quite a bit. Though that is probably a lot of work and might require quite a bit of restructuring, I believe (I've only looked at select bits of the code).

    If you have 2000 rows and the user only ends up viewing ~100, then you've wasted a lot of CPU/memory creating 1900 rows' worth of unused DOM elements. Even for searching I don't see why you'd need DOM elements - just having the data that will reside in the element should be enough.
  • allan Posts: 63,498 Questions: 1 Answers: 10,471 Site admin
    @vex: You are suggesting creating the DOM elements that are required on-the-fly and then storing them? The issue with that is handling events and other options that require DOM access to the whole table.

    For example, in my event demos I currently apply a tooltip event to each row. If this were done dynamically I would need to check whether each row had an event or not, and then assign it if required - which seems like a lot of overhead and counter-intuitive for the developer. As such I think the current behaviour is correct.

    DataTables 1.3 (and before) used to create the DOM on demand, and it led to no end of trouble with events, attributes and other things.

    Regards,
    Allan
  • kolya Posts: 4 Questions: 0 Answers: 0
    Hi Allan.

    I've come across this issue too, and I think there should at least be an option to create rows on demand. Here is my point.
    There is already an interface which allows one to control all kinds of events on cells: a user of your lib can use fnRender to render any object into a cell, with any events attached, and this is compatible with an on-demand way of generating rows. Moreover, it is compatible with dynamic data sources and will work fine if data in the table changes during the page's lifetime. The examples you are referring to, on the other hand, won't work correctly if new rows are added.
    Conversely, it looks like there is no way around the slowness when it's required to show several thousand rows (paginated) in the table. And there are cases where it's preferable to return everything from the server on page load once, and then just change pages on the client. For example, if you have an external data source which is very slow and cannot paginate data, it's preferable to fetch everything at once, push it all to the client, and let the browser deal with it. Hitting the server for each page in this case is just a suboptimal solution. After all, several thousand rows should not be a huge number in 2010.
    So, after all, it seems to me that it should be possible to add on-demand rendering without losing any functionality or flexibility, and with minimal compatibility impact.
    I'll probably try to look into this.
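For reference, fnRender in the 1.x API is wired up per column, roughly like this (a sketch, assuming a two-column table with id `example`; under the on-demand scheme discussed here, the function could return a prepared DOM element rather than an HTML string):

```javascript
$('#example').dataTable({
  "aoColumns": [
    null,  // first column: default rendering
    {
      // Render the second column's cell content on demand; client
      // code decides what markup (and which events) the cell gets.
      "fnRender": function (oObj) {
        return '<strong>' + oObj.aData[oObj.iDataColumn] + '</strong>';
      }
    }
  ]
});
```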
  • kolya Posts: 4 Questions: 0 Answers: 0
    Hi.

    I've looked into this and here is what I've accomplished.
    First of all, I've changed aoData[].nTr from an element reference to a function. This function creates the element on demand. There will probably be incompatibilities after this change: the whole table is no longer generated up front, so it'll break existing examples. Also, there is some stuff happening with aoData[]._anHidden[], and it is possible that my change breaks it. I don't use hidden columns, so it's hard for me to check, but it should be fixable.
    Next, aoColumns[].fnRender can now return an element, not only HTML text. This allows client code to prepare an element, with all its events handled, on demand. The current implementation will probably break code which uses bUseRendered: true. I don't use this, so I didn't check, but it should be fixable too.
    One more thing. A custom filter routine in $.fn.dataTableExt.afnFiltering is called for each row without an opportunity to prepare for filtering. This can degrade performance when there are several UI elements which change filter values (like sliders, checkboxes, etc.), since accessing the values of those elements for every row can consume a lot of time. Instead, I suggest that $.fn.dataTableExt.afnFiltering contain routines which return the routine used for the actual filtering (called once per full table filter pass). This gives an opportunity to fetch all the required data before filtering actually begins. The change is trivial, but incompatible with the existing implementation. There are probably other ways to achieve this.
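The factory idea can be sketched in plain JavaScript, independent of DataTables (the names `filterFactories` and `applyFilters` are illustrative, not part of any API):

```javascript
// Filter factories: each is called once per filter pass and returns
// the per-row predicate, so any expensive setup (e.g. reading the
// current value of a slider or checkbox) happens once, not per row.
var filterFactories = [];

filterFactories.push(function () {
  var threshold = 10;  // imagine this read once from a UI control
  return function (rowData) {
    return Number(rowData[0]) >= threshold;
  };
});

function applyFilters(rows) {
  // Instantiate each predicate once, then test every row against all.
  var predicates = filterFactories.map(function (factory) {
    return factory();
  });
  return rows.filter(function (row) {
    return predicates.every(function (test) { return test(row); });
  });
}
```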
    All this 'works for me' and gives huge performance improvement on 2000 rows.
    Please let me know if anyone is interested in these changes, so there is a reason to clean up the issues they create. The patch against 1.6.2 seems to be too big to post here, so I've posted it on pastebin: http://pastebin.com/GJ149nrV .

    Thanks.
  • lukess Posts: 3 Questions: 0 Answers: 0
    I would be very interested in being able to handle 2000+ records. I am already doing it and dealing with the load time. IE hates it, but Chrome can chew through it pretty quickly. Server-side is an option, but I believe it would be a very difficult undertaking in my case. I am leery of putting an outside patch in place to make it work this way. Allan, is this something you would have an interest in investigating? kolya, is this something you could provide as a .js file? I want to try plugging it into my project and see what kind of difference I see.
  • kolya Posts: 4 Questions: 0 Answers: 0
    The patch I've provided can be applied against the latest version of DataTables.
This discussion has been closed.