JSON from Editor is different when using join
Link to test case: Not available (on private network)
Debugger code (debug.datatables.net): https://debug.datatables.net/awayil
Error messages shown:
Description of problem:
I have a DB table with a couple hundred thousand rows that I am displaying to the user.
I have a few custom filters that, when changed, just do a table.ajax.reload(), which works perfectly.
But with the next version I'm releasing, I've added another filter which needs to do a GROUP BY, which I know is not possible with Editor yet, so I'm using raw SQL here.
When the new filter is applied and the response comes from the raw SQL, I cannot do an ajax.reload() because the columns change slightly, which gives me an ajax/json error. So I need to destroy and then reinitialise the table. This works, but it doesn't look good on the client side at all.
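For context on why the columns break: DataTables treats a dotted string in columns.data (e.g. "performance_table.date") as a nested property lookup, so a flat row from the raw SQL response cannot satisfy it. A minimal sketch — getCellData below is a simplified stand-in for DataTables' internal resolution, not the actual library code:

```javascript
// Simplified stand-in for how DataTables resolves a dotted
// columns.data string against a row object (hypothetical helper).
function getCellData(row, dataSrc) {
  return dataSrc.split('.').reduce(
    (obj, key) => (obj == null ? undefined : obj[key]),
    row
  );
}

// Row shape returned by Editor (fields nested under the table name):
const editorRow = {
  DT_RowId: 'row_18',
  performance_table: { date: '2022-05-28', cli: '+27716877497' }
};
// Row shape returned by the raw SQL endpoint (flat):
const rawRow = { date: '2022-05-28', cli: '+27716877497' };

console.log(getCellData(editorRow, 'performance_table.date')); // '2022-05-28'
console.log(getCellData(rawRow, 'performance_table.date'));    // undefined
console.log(getCellData(rawRow, 'date'));                      // '2022-05-28'
```

The undefined lookup in the middle case is what triggers DataTables' "unknown parameter" warning after a reload with mismatched columns.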
Here is my code for the two instances:
Without the filter (read directly from table - no group by needed):
JSON:
{"data":[{"DT_RowId":"row_18","performance_table":{date: "2022-05-28", tenant: "Coledon", cli: "+27716877497", attempts: "3", successful: "0",…}]
Is there a way to remove the "performance_table":{ wrapper from the response?
PHP:
$editor = Editor::inst($db, 'performance_table');
$editor->field(
    Field::inst('performance_table.date'),
    Field::inst('performance_table.tenant'),
    Field::inst('performance_table.cli'),
    Field::inst('performance_table.attempts'),
    Field::inst('performance_table.successful'),
    Field::inst('performance_table.cancelled'),
    Field::inst('performance_table.noanswer'),
    Field::inst('performance_table.busy'),
    Field::inst('performance_table.failed'),
    Field::inst('performance_table.asr')
); // End fields

$editor->where(function ($q) use ($startDate, $endDate) {
    $q->where('performance_table.date', $startDate, '>=')
        ->and_where('performance_table.date', $endDate, '<=');
});

if ($truecaller) { /* Defined at the top of my script */
    $editor->leftJoin('master', 'master.cli', '=', 'performance_table.cli');
    $editor->where('master.truecaller', '1', '=');
}
$editor->tryCatch(false);
$editor->debug(true);
$editor->process($_POST);
$editor->json();
With the new filter (raw SQL - group by needed):
JSON:
{"data":[{date: "2022-05-28", tenant: "Coledon", cli: "+27716877497", attempts: "3", successful: "0",…}]
PHP:
$sql = "SELECT performance_table.date AS date,
               performance_table.tenant AS tenant,
               performance_table.cli AS cli,
               SUM(performance_table.attempts) AS attempts,
               SUM(performance_table.successful) AS successful,
               SUM(performance_table.cancelled) AS cancelled,
               SUM(performance_table.noanswer) AS noanswer,
               SUM(performance_table.busy) AS busy,
               SUM(performance_table.failed) AS failed,
               ROUND(AVG(performance_table.asr), 0) AS asr
        FROM performance_table
        JOIN master ON master.cli = performance_table.cli
        WHERE master.truecaller = 1 AND performance_table.date BETWEEN '$startDate' AND '$endDate'
        GROUP BY performance_table.cli";
$data = $db->sql($sql)->fetchAll();
echo json_encode([
    "data" => $data
]);
How I define my columns in JavaScript:
columns: groupby == "cli" && gb_state == "cli" ?
    [
        { data: "date" },
        { data: "tenant" },
        { data: "cli" },
        { data: "attempts" },
        { data: "successful" },
        { data: "cancelled" },
        { data: "noanswer" },
        { data: "busy" },
        { data: "failed" },
        { data: "asr" }
    ] :
    [
        { data: "performance_table.date" },
        { data: "performance_table.tenant" },
        { data: "performance_table.cli" },
        { data: "performance_table.attempts" },
        { data: "performance_table.successful" },
        { data: "performance_table.cancelled" },
        { data: "performance_table.noanswer" },
        { data: "performance_table.busy" },
        { data: "performance_table.failed" },
        { data: "performance_table.asr" }
    ],
Is there a way I could prevent the table name appearing in the JSON response, so I could do a reload instead?
This question has an accepted answer.
Answers
The Field->name() method can be used to alter the JSON property name. Field::inst('performance_table.date') is really a shorthand for:

Field::inst('performance_table.date')->name('performance_table.date')

So what you could do is:

Field::inst('performance_table.date')->name('date')

That will still read from the performance_table.date column in the database, but output it simply as date in the JSON, matching the shape of your raw SQL response.

Regards,
Allan
Thanks for that Allan! This worked perfectly, cheers!