big tables ... local cache?
I'm now at a point where I need to do server-side processing, and I'm not looking forward to it.
I wonder if anyone has thought about caching the whole table's data in a JSON blob on the client, but only showing a window of that data, instead of going back and forth to the server for each window.
Seems like that might be a huge speed advantage...
Maybe it's even possible to consume a DOM-based table into a JSON blob and eliminate all but the currently visible rows from the DOM.
Is this a crazy idea?
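To make the idea concrete, here is a minimal sketch (not DataTables code, just an illustration of the approach): scrape the DOM table once into an in-memory array (the "JSON blob"), then render only one page-sized window of it at a time. Function names here are made up for the example.

```javascript
// Sketch of the "cache everything, render a window" idea.
// scrapeTable runs in the browser and reads an existing table into plain data;
// windowOf then selects just the rows for the currently visible page.

// Read every row of a DOM table's tbody into an array of cell-text arrays.
function scrapeTable(tableEl) {
  return Array.from(tableEl.tBodies[0].rows).map(function (tr) {
    return Array.from(tr.cells).map(function (td) {
      return td.textContent;
    });
  });
}

// Return only the rows for one page of the cached data.
function windowOf(rows, pageIndex, pageSize) {
  var start = pageIndex * pageSize;
  return rows.slice(start, start + pageSize);
}
```

Paging, sorting and filtering would then operate on the cached array, and only `windowOf`'s result ever needs to exist as DOM nodes.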
I think this has some advantages over the Gears/html5 local storage bits discussed here:
http://datatables.net/forums/comments.php?DiscussionID=36
Replies
Allan
http://www.neb.com/polymerases
It's worse with 2000 records
http://polbase.neb.com/references
I have an internal page with 6000 rows that's getting close to unusable.
Maybe for tables of up to 20,000 rows it's reasonable to try to scrape the table and modify a set of DOM objects for paging, filtering and sorting.
Or maybe I'm just "doing it wrong".
For client-side processing, regardless of whether you are using Ajax-sourced data or DOM-sourced data, 2,000 rows is typically the point at which server-side processing becomes attractive (mainly because of IE; 10,000 rows if you can discount IE).
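For reference, switching a table to server-side processing with the DataTables 1.x API looks roughly like the snippet below; the table id and the Ajax URL are placeholders for this example.

```javascript
// Sketch of a DataTables 1.x server-side initialisation.
// "#example" and "/ajax/rows" are placeholders; the server endpoint must
// implement the DataTables server-side protocol (paging/sorting/filtering
// parameters in, a JSON page of rows out).
$(document).ready(function () {
  $('#example').dataTable({
    "bProcessing": true,        // show the "Processing..." indicator during requests
    "bServerSide": true,        // delegate paging, sorting and filtering to the server
    "sAjaxSource": "/ajax/rows" // endpoint that returns one window of rows as JSON
  });
});
```

The trade-off is exactly the one discussed above: each page change becomes a round trip, but the browser only ever holds one window of rows.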
Allan
I'm going to experiment a bit with this JSON idea. I'll let you know if I come up with anything interesting.
Allan