Long loading time for extremely large data (server-side processing)
Hello,
I am currently using the newest DataTables with a huge SQLite database (around 2 million rows, 10+ columns).
I am using my own PHP code to connect to SQLite for server-side processing, and I have also enabled bDeferRender, the filtering-delay plug-in, etc., but it is still relatively slow: it takes 20+ seconds to load the initial data or to scroll down.
Has anyone worked with this amount of data and still had satisfactory speed? If so, please give me some hints on how to improve the performance.
As for test cases, I guess they will not help much, but I will still put them here:
http://live.datatables.net/iwigex/edit#javascript,html
http://debug.datatables.net/aqosun
This discussion has been closed.
Replies
I am using Scroller, and it seems to be fetching only around 80 rows at a time (observed in Firebug from the returned JSON). Whenever I scroll down or up, it sends another request, which takes another 20+ seconds, to get the expected 80 rows; that behaviour itself seems pretty reasonable to me...
Any thoughts on where I should start working on this?
If you are using server-side processing, then yes. I've seen server-side processing working with tens of millions of rows returning results in fractions of a second.
I'd agree with fbas - it sounds like your script is taking a long time to reply, so you might want to look at optimisations there.
Allan
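A quick way to narrow this down is to time each statement the PHP script runs and see which one is eating the 20+ seconds. A minimal sketch, assuming a SQLite3 handle in $db; $sQuery and $sCountQuery are only placeholders for the script's own data and count queries:
[code]
// Illustrative timing sketch: log how long each query takes so the slow
// statement can be identified before optimising anything.
$t0 = microtime( true );
$rResult = $db->query( $sQuery );                    // paged data query
error_log( 'data query:  '.round( microtime( true ) - $t0, 3 ).'s' );

$t0 = microtime( true );
$rCount = $db->query( $sCountQuery );                // COUNT() query
error_log( 'count query: '.round( microtime( true ) - $t0, 3 ).'s' );
[/code]
If the data query dominates, an ORDER BY over two million unindexed rows is the usual suspect; if the count queries dominate, they are the place to optimise.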
After another day of trying, I still cannot figure out a solution. Could someone look at my PHP code and give me some suggestions?
I have found that $iFilteredTotal is behaving strangely, but I cannot find a fix for it.
[code]
/*
 * Paging
 */
$sLimit = "";
if ( isset( $_GET['iDisplayStart'] ) && $_GET['iDisplayLength'] != '-1' )
{
	$sLimit = "LIMIT ".intval( $_GET['iDisplayStart'] ).", ".intval( $_GET['iDisplayLength'] );
}

/*
 * Ordering
 */
$sOrder = "";
if ( isset( $_GET['iSortCol_0'] ) )
{
	$sOrder = "ORDER BY ";
	for ( $i=0 ; $i<intval( $_GET['iSortingCols'] ) ; $i++ )
	{
		if ( $_GET[ 'bSortable_'.intval($_GET['iSortCol_'.$i]) ] == "true" )
		{
			$sOrder .= $aColumns[ intval( $_GET['iSortCol_'.$i] ) ]." ".( $_GET['sSortDir_'.$i] === 'asc' ? 'asc' : 'desc' ).", ";
		}
	}
	$sOrder = substr_replace( $sOrder, "", -2 );
	if ( $sOrder == "ORDER BY" ) { $sOrder = ""; }
}

/* Filtering (global search across all columns) */
$sWhere = "";
if ( isset( $_GET['sSearch'] ) && $_GET['sSearch'] != "" )
{
	$sWhere = "WHERE (";
	for ( $i=0 ; $i<count($aColumns) ; $i++ )
	{
		$sWhere .= $aColumns[$i]." LIKE '%".SQLite3::escapeString( $_GET['sSearch'] )."%' OR ";
	}
	$sWhere = substr_replace( $sWhere, "", -3 ).')';
}

/* SQL queries: data to display */
$sQuery = "SELECT ".implode(", ", $aColumns)." FROM $sTable $sWhere $sOrder $sLimit";
$rResult = $db->query( $sQuery ) or fatal_error( 'SQLite error: ' . $db->lastErrorMsg() );

/* Data set length after filtering */
$sQuery = "SELECT COUNT($sIndexColumn) FROM $sTable $sWhere";
$rResultFilterTotal = $db->query( $sQuery ) or fatal_error( 'SQLite error: ' . $db->lastErrorMsg() );
$aResultFilterTotal = $rResultFilterTotal->fetchArray();
$iFilteredTotal = $aResultFilterTotal[0];

/* Total data set length */
$sQuery = "SELECT COUNT($sIndexColumn) FROM $sTable";
$rResultTotal = $db->query( $sQuery ) or fatal_error( 'SQLite error: ' . $db->lastErrorMsg() );
$aResultTotal = $rResultTotal->fetchArray();
$iTotal = $aResultTotal[0];

/*
 * Output
 */
$output = array(
	"sEcho" => intval($_GET['sEcho']),
	"iTotalRecords" => "$iTotal",
	"iTotalDisplayRecords" => "$iFilteredTotal",
	"aaData" => array()
);

while ( $aRow = $rResult->fetchArray() )
{
	$row = array();
	for ( $i=0 ; $i<count($aColumns) ; $i++ )
	{
		$row[] = $aRow[ $aColumns[$i] ];
	}
	$output['aaData'][] = $row;
}
echo json_encode( $output );
[/code]
Allan
My data is currently on an internal server, and I do not really have a website where I can put a 500 MB database... :/
@allan
Thank you for the suggestion. The problem IS in the database... I guess I will need to examine the performance of SQLite.
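One way to examine it: EXPLAIN QUERY PLAN shows whether a statement uses an index or scans and sorts the whole table. A minimal sketch, assuming the $db handle from the script above; the table, column and index names are only placeholders:
[code]
// Illustrative only: a plan containing "SCAN TABLE ... USE TEMP B-TREE FOR
// ORDER BY" means every request sorts all 2M rows before returning 80 of them.
$plan = $db->query( "EXPLAIN QUERY PLAN SELECT * FROM mytable ORDER BY name LIMIT 80" );
while ( $row = $plan->fetchArray( SQLITE3_ASSOC ) )
{
	print_r( $row );
}

// An index on the column(s) DataTables sorts on lets ORDER BY ... LIMIT walk
// the index instead of sorting the full table on every draw.
$db->exec( "CREATE INDEX IF NOT EXISTS idx_mytable_name ON mytable (name)" );
[/code]
Note that a leading-wildcard filter (LIKE '%term%') still cannot use an index, so the global search path will remain slow regardless; an index mainly speeds up the sorted, unfiltered paging.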
Just a question on SQLite: it seems that SQLite will not return a count for a SQL statement with a LIMIT, but in the PHP script the count obviously needs to be computed to report the filtered result, etc. Is there a way to get around this?
Allan
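For reference, the usual workaround is to run the count as a second query that reuses the same WHERE clause as the data query but drops the ORDER BY and LIMIT, or to wrap the filtered SELECT in a COUNT(*) subquery. A minimal sketch, assuming the $db, $sTable, $aColumns and $sWhere variables from the script above:
[code]
// Illustrative only: SQLite will not report a total for a LIMITed query, so
// count the filtered rows with a separate statement that omits the LIMIT.
$aCount = $db->query( "SELECT COUNT(*) FROM $sTable $sWhere" )->fetchArray();
$iFilteredTotal = $aCount[0];

// Equivalent form: wrap the filtered SELECT in a COUNT(*) subquery.
$sCountQuery = "SELECT COUNT(*) FROM ( SELECT ".implode(", ", $aColumns)." FROM $sTable $sWhere )";
[/code]
Either way the paged data query and the count stay consistent, which is what iTotalDisplayRecords expects.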