downloading/rendering large data sets - possible to prevent unresponsiveness and have progress bar?
dessloch
Posts: 5 · Questions: 0 · Answers: 0
Probably the only hiccup DataTables has for me is when I fetch a large data set and it makes the browser unresponsive while it loads everything into its array and then renders a portion of it (pagination). After that, it's blazing fast. I usually throw at least 100,000 rows in JSON format at it. That's typically ~30MB sent over the wire, which by itself is actually bearable, because the browser remains responsive during the download. Either way, while I do have a placeholder loading notification before the table renders, its animation stops because the browser hangs for a while between download completion and rendering completion.
Is it at all possible right now to prevent DataTables from hanging the browser during rendering, and maybe even display progress (X out of Y rows loaded, etc.)? Maybe something akin to threading, or leaving enough browser resources free (not so much CPU/RAM as responsiveness) so users can still type and switch to other tabs?
Replies
It should be perfectly possible to use fnAddData in combination with your streamed data source to load massive tables while keeping the page responsive.
Allan
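A minimal sketch of what that could look like with the legacy API: batches of rows are added on a timer so the browser can repaint (and update a progress counter) between chunks. The chunk size, progress callback and table handle are illustrative assumptions, not part of the reply above.

```js
// Assumes a DataTables 1.x legacy-API table that has already been initialised.
// CHUNK_SIZE, onProgress and onDone are made-up names for this sketch.
var CHUNK_SIZE = 2000;

function addRowsInChunks(table, rows, onProgress, onDone) {
    var added = 0;

    function addNextChunk() {
        var chunk = rows.slice(added, added + CHUNK_SIZE);
        table.fnAddData(chunk, false);        // false = skip the redraw for now
        added += chunk.length;

        if (onProgress) {
            onProgress(added, rows.length);   // e.g. "12,000 of 100,000 rows loaded"
        }

        if (added < rows.length) {
            setTimeout(addNextChunk, 0);      // yield so the UI stays responsive
        } else {
            table.fnDraw();                   // one redraw once everything is in
            if (onDone) { onDone(); }
        }
    }

    addNextChunk();
}
```

Calling this from the XHR success handler with the parsed JSON keeps the loading animation moving and makes an "X out of Y rows" counter straightforward, at the cost of a somewhat longer total load time than a single bulk insert.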
The dataset is about 200-300 rows. The added complexity is that each td may contain one of various input types (checkboxes, text inputs, radio buttons), has a "+" icon, and has some jQuery-added formatting (background colour, an on('click') handler for the + button, etc.).
The data is completely loaded on the backend, so the HTML arrives already built.
The goal was to display the "loading" icon until the table was completely rendered. However, the loading animation stops and the browser becomes unresponsive. Debugging the JavaScript shows that this happens during the DataTables initialization block.
That's a very interesting suggestion though, since I could easily extend my progress XHR callback to accommodate piece-by-piece parsing of the entire chunk. But I am slightly concerned about performance, as the slight pause of 2-5 seconds when loading everything at once is probably better than 100,000 calls to fnAddData.
Is calling fnAddData X times after initialization comparable to setting aaData before initialization? I'm also concerned because I've noticed that when I call fnAddData while there are already a ton of rows in the table, there's a noticeable 1-2 second delay PER CALL. Now, that might go away if I set the re-render flag to false (2nd param IIRC?). But what I mean is, if I click the button 5 times really fast, it will take 5-10 seconds for all 5 rows to be added, and they will all be added at once. This leads me to believe that loading the entire chunk into aaData pre-init is probably faster than individual fnAddData calls. Do you have any additional insight/benchmarks on this, before I set aside time to find out the hard way? :)
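For comparison, the two approaches look roughly like this; the option names follow the 1.x API, and the selectors and variable names are assumptions:

```js
// Option A: hand the whole parsed array to DataTables at initialisation.
$('#grid').dataTable({
    "aaData": parsedRows,     // full data set up front
    "bDeferRender": true      // only build DOM nodes for rows that are shown
});

// Option B: initialise empty, then add rows afterwards without redrawing each time.
var table = $('#grid').dataTable({ "bDeferRender": true });
table.fnAddData(parsedRows, false);   // second argument is the redraw flag
table.fnDraw();                       // redraw once, not once per call
```

Most of the per-call delay described above is the redraw itself, so passing false and drawing once at the end should narrow the gap between the two approaches considerably; a 1-2 second delay per click is what you'd expect when every fnAddData call triggers a full re-sort/re-filter/redraw of a large table.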
Currently, the memory usage is actually not bad at all. It hovers around 200-300MB for Chrome and 400-500MB for FF, which is more than acceptable for my purposes. Your little plugin is the best heavy-duty-oriented JS grid I've seen so far! The app I'm writing is not really meant to be a white-label product (though it might turn into one in the future), but the responsive UX is nice to have as it's intended to be used by non-tech people.
EDIT: Modern browsers are mem hogs in their own right - FF takes up 150-200MB by itself, Chrome around 70-100MB, both just sitting at google.com. So it's actually more like 150-200MB attributable to just the page that loads/renders that dataset, in both browsers. Same for Safari, too. IE obviously takes up more RAM (because IE fails at everything), but it loads at about the same speed and is also blazing-fast and responsive once it's all in memory.
I have made a server-side two-phase cache: the first phase for the large data set and the second for the part of the table that is visible. The worst case, when parameters are changed, the cache is destroyed, and about 100 columns times 118K rows of calculation is needed (and cached), is about 20 seconds. Very acceptable. Sorting takes about 5 seconds the first time and a fraction of a second the second time. Normal page load time when data comes from the cache is about 0.2-0.3 seconds.
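Not from the post above, but as an illustration of that two-phase idea, a JavaScript sketch; buildFullDataset() and sliceForView() are hypothetical helpers, and the keying scheme is an assumption:

```js
// Two-phase cache sketch: one entry for the expensive full result set,
// one per visible window (page / sort / filter combination).
var fullCache = new Map();   // params key -> full computed data set
var pageCache = new Map();   // params key + view key -> visible slice

function getPage(params, view) {
    var fullKey = JSON.stringify(params);
    var pageKey = fullKey + '|' + JSON.stringify(view);

    if (pageCache.has(pageKey)) {
        return pageCache.get(pageKey);                      // fast path, ~0.2-0.3 s
    }
    if (!fullCache.has(fullKey)) {
        fullCache.set(fullKey, buildFullDataset(params));   // worst case, ~20 s
    }
    var slice = sliceForView(fullCache.get(fullKey), view); // e.g. sort + paginate
    pageCache.set(pageKey, slice);
    return slice;
}
```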
I attach an example of the issue here: https://www.dropbox.com/s/s9iu90s5s0e3t94/DataTablesTest.zip
This table is 221 rows by 24 columns. The problem happens when each row has more content than just a simple text label.
In this case the HTML is already completely built (by the backend server) - the unresponsiveness happens only while the DataTables element is being initialised.
Do you have any suggestion on how to prevent this unresponsiveness?
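One generic workaround for the spinner problem, sketched below with assumed element IDs, is to show the indicator, give the browser a moment to paint it, and only then run the blocking initialisation; it doesn't shorten the freeze, but the icon at least appears first:

```js
// Show the loading indicator, then let the browser paint it before the
// (synchronous) DataTables initialisation locks up the main thread.
// '#loading' and '#bigTable' are made-up IDs for this sketch.
$('#loading').show();

setTimeout(function () {
    $('#bigTable').dataTable({
        "bSortClasses": false   // skipping per-cell sorting classes can noticeably
                                // speed up initialisation of wide DOM-sourced tables
    });
    $('#loading').hide();
}, 50);                         // small delay so the spinner gets a frame to render
```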
@dessloch and @janiscmbp - let me consume and reply to your posts separately :-) Rushing a bit atm...
Allan
Yes, indeed! And in this scenario (where each cell can have different elements and styles), outputting something else (JSON) and relying on JavaScript to render it correctly would probably be worse than it already is.
Let me know what I can do to help figure this out! I've been scratching my head over this for a while!
I take it what you're doing is probably something like an ad-serving front-end for (potentially really dumb) customers. :)
Yeah, there isn't really anything you can do if you're trying to keep a lot of stuff in the browser DOM all at once. The reason I can go up to 2000+ rows in pagination on my table is that most of the data is plain text/numbers, no fancy elements/images/etc. I do use inputs to edit/save back to DB, but I use inputs sparingly, removing them once the user is done editing a cell.
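A rough sketch of that edit-in-place pattern, with made-up selectors and a hypothetical saveCell() for persisting the value:

```js
// Swap a cell's text for an input only while it is being edited, then put
// plain text back, so the table never holds thousands of live form controls.
$('#grid tbody').on('click', 'td.editable', function () {
    var cell = $(this);
    if (cell.find('input').length) { return; }   // already in edit mode

    var original = cell.text();
    var input = $('<input type="text">').val(original).on('blur', function () {
        var value = $(this).val();
        cell.text(value);                        // back to lightweight text
        saveCell(cell, value);                   // hypothetical: persist to the DB
    });

    cell.empty().append(input);
    input.focus();
});
```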
The DOM tree in a browser is the biggest bottleneck/limitation for everything, as it's very inefficient at the more intensive operations (animation, moving things around, rendering of images/colors/shapes, etc.). Web sites were originally intended just to be a way to share simple text information, and it'll be a while before they start becoming more like 3D engines such as Unreal/ID-Tech.
Did you have a chance to take a look at this?