Data tables slow when deployed in windows server and accessed outside the server.

viralb Posts: 8 · Questions: 5 · Answers: 0

We are using the download version and have developed a DataTables view. On a local machine it works fast, but when we use the same view on our sandbox, the data takes anywhere from 5 minutes to 60 minutes to load. Can you help us?

This question has accepted answers.

Answers

  • allan Posts: 63,523 · Questions: 1 · Answers: 10,473 · Site admin
    Answer ✓

    Can you link to a test page showing the issue so we can debug it, please?

    Allan

  • viralb Posts: 8 · Questions: 5 · Answers: 0

    Here is the link to our project that you asked for:
    http://180.149.247.135:8080/aHealthSaverz/viewBrand.htm.

    Please debug the page and let us know the issue.
    The backend call that loads data into the DataTable is fine.
    We are ready to provide any support if it is required.
    Your help will be highly appreciated.
    Please inform us once the testing is done.

  • allan Posts: 63,523 · Questions: 1 · Answers: 10,473 · Site admin
    edited October 2014

    edit Ignore that - there was a trailing dot. I'm looking at it now.

    Allan

  • allan Posts: 63,523 · Questions: 1 · Answers: 10,473 · Site admin

    Request processing failed; nested exception is org.springframework.webflow.conversation.impl.LockTimeoutException: Unable to acquire conversation lock after 30 seconds

    Sounds like a server-side issue. I don't know what your code looks like, so I don't know why it would fail to acquire a lock. I would suggest looking into that as the first step.

    Allan

  • viralb Posts: 8 · Questions: 5 · Answers: 0

    Hi Allan,
    Please ignore the previous DataTable link for your debugging.

    Use this link instead: http://180.149.247.135:8080/DataTableTest/viewBrand.htm — this table has 38,000 records. Please ignore the session exception and refresh the link if the error appears.

    I am also giving a link to the same table with only 1,000 records, in case the error comes up repeatedly: http://180.149.247.135:8080/DataTableTest/viewBrandFew.htm

    The second link will work for sure, but the first link has 38,000 rows, which take more time to load.
    Please debug that issue.

    Any help is highly appreciated.

    Thanks and Regards.
    Viral B

  • tangerine Posts: 3,365 · Questions: 39 · Answers: 395

    Please debug that issue.

    Could you not look at your browser's console info?

    For example:
    "Error: Graph container element not found"
    in the file morris-0.4.3.min.js (line 1, col 1301)

    which may not solve your issue, but is not DataTables' problem.

  • allan Posts: 63,523 · Questions: 1 · Answers: 10,473 · Site admin
    Answer ✓

    Thanks for the updated link.

    A number of points:

    1. The data being loaded by Ajax is taking ages to load.
    2. You are loading almost 7MB up front. Requiring such a huge amount of data just to display the page is probably a really bad move.
    3. The transfer rate is extremely poor: 7MB of data in ~160 seconds. That's a transfer rate of about 45KB/s! You might want to look into what is causing that — whether your server is on a slow connection, or it is processor bound when creating the JSON data.
    4. Once the data has been loaded, the table render is very fast (about 100 ms for me).

    So basically the issue is in the loading of the data, which is not something DataTables itself can do anything about.

    However, I would very strongly suggest you look into using server-side processing which will massively improve performance.

    Allan
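    A minimal sketch of what Allan's server-side processing suggestion looks like on the client, assuming a hypothetical endpoint and column names (the option names `serverSide`, `processing`, and `ajax` are standard DataTables options). With `serverSide: true`, DataTables sends paging, sorting, and filtering parameters with each request, and the server returns only the current page of rows rather than all 38,000 at once:

    ```javascript
    // Hypothetical initialisation: only the visible page of rows (10-25 or so)
    // crosses the wire on each draw, instead of the full data set up front.
    $('#brandTable').DataTable({
        serverSide: true,   // delegate paging/sorting/filtering to the server
        processing: true,   // show a "Processing..." indicator during Ajax calls
        ajax: {
            url: '/DataTableTest/brandData',  // hypothetical endpoint
            type: 'POST'
        },
        columns: [
            { data: 'brandName' },    // hypothetical field names --
            { data: 'company' },      // match these to your JSON
            { data: 'createdDate' }
        ]
    });

    // The endpoint is then expected to answer each request with JSON like:
    // {
    //   "draw": 1,                 // echo of the draw counter DataTables sent
    //   "recordsTotal": 38000,     // rows before filtering
    //   "recordsFiltered": 38000,  // rows after filtering
    //   "data": [ { "brandName": "...", "company": "...", "createdDate": "..." } ]
    // }
    ```

    The server side of this protocol (reading the paging/sorting parameters and building the response above) has to be implemented in your Spring backend; the DataTables server-side processing documentation describes the exact request and response parameters.
    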

  • viralbviralb Posts: 8Questions: 5Answers: 0

    Thanks for your reply and suggestions.

  • john_l Posts: 45 · Questions: 0 · Answers: 12
    Answer ✓

    On top of everything Allan says (the transfer took 4.5 minutes for me), I see you have a number of extra fields in the JSON data over what you are displaying - sessionID, clientID, message, created_By, extra_1, extra_2, extra_3, extra_4, and extra_5. You could probably cut the data in half or less if you removed them. You should also make sure the data gets compressed in transit to cut it down even more.

    If it is taking a long time to generate the data on the server (rather than just the transfer being slow), and there is nothing you can do to speed it up, then consider caching the data on the server.

    That said, with that many rows you really should look into server-side processing.
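    The field-trimming idea above can be sketched as a plain function, runnable in Node. The dropped field names are taken from john_l's list; the keep-list of displayed columns is a hypothetical example:

    ```javascript
    // Keep only the fields the table actually displays before serialising;
    // everything else (sessionID, clientID, extra_1...extra_5, etc.) is dropped,
    // which can roughly halve the payload before compression even starts.
    function trimRows(rows, keep) {
        return rows.map(row =>
            Object.fromEntries(keep.map(key => [key, row[key]]))
        );
    }

    const rows = [
        { brandName: 'Acme', company: 'Acme Ltd', sessionID: 'abc123',
          clientID: 42, extra_1: null, extra_2: null }
    ];

    const trimmed = trimRows(rows, ['brandName', 'company']);
    console.log(trimmed);  // [ { brandName: 'Acme', company: 'Acme Ltd' } ]
    ```

    In a Java backend the same effect is usually achieved by serialising a DTO that contains only the displayed fields, rather than the full entity.
    
    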

This discussion has been closed.