Support for caching + static page URLs?

daniel437

I am working on a WordPress website for stock market data. The site will display data for ~5,000 stocks. I would like the homepage to have a leaderboard similar to https://coinmarketcap.com. I would then sort stocks by market cap.

DataTables looks perfect for my needs, but I would like to make sure of two things before getting started:
1. My fields will update on a regular basis. Is it possible to prerender / cache the table sorting? I don't want my sorted table to regenerate every time a page is loaded.
2. I would like my table pagination to have static URLs that can be linked to, i.e. website.com/page/2 would always show results 101-200. Is there a way to do this?

I looked through the documentation + the forums and have so far been unable to find an answer to 2). Any guidance is greatly appreciated!

Replies

  • allan (Site admin)
    edited September 2022

    Hi,

    1) When you say "regenerated", I'm not sure I fully understand. If ordering is enabled in the DataTable (which it is by default) then the browser will sort the data. Is the problem that you don't want the client side to have to perform that processing? How large is your data set? I would be really surprised if that takes more than a few tens of milliseconds on data sets of less than 10k rows. Is that a problem?

    You could preorder the data, which would allow the ordering algorithm to basically just run over the data and check it is in order, but that seems like a really minor performance gain (at least with the limited information I have so far).
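
    For reference, here is a minimal sketch of that approach (the `preSortedStocks` array, its fields and the `#stocks` selector are all hypothetical): data that is already sorted on the server can be loaded with an empty `order` array, so no initial client-side sort runs at all.

    ```js
    // Rows are assumed to arrive from the server already sorted by
    // market cap, so the table can skip its initial ordering pass.
    var preSortedStocks = [
        { symbol: 'AAA', marketCap: 2500000000 },
        { symbol: 'BBB', marketCap: 1200000000 }
        // ...in practice, ~5,000 rows
    ];

    $('#stocks').DataTable({
        data: preSortedStocks,
        order: [],            // empty array = no initial client-side sort
        columns: [
            { data: 'symbol', title: 'Symbol' },
            { data: 'marketCap', title: 'Market cap' }
        ]
    });
    ```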

    2) We don't really have something that will do exactly that at the moment. We do have a deep linking plug-in which provides the basics for that, but it doesn't generate the URL links to click on. A URL rewriter in your HTTP server could make the URLs more attractive, such as .../page/2.
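
    To sketch the second half of that (the `/page/N` path convention, the 100-row page length and the `#stocks` selector are assumptions): once the rewriter maps `/page/2` onto the table page, a few lines of script can translate the path segment into DataTables' `displayStart` option.

    ```js
    // Hypothetical convention: /page/2 shows rows 101-200.
    // Parse the page number from the path and turn it into a row offset.
    var pageLength = 100;
    var match = window.location.pathname.match(/\/page\/(\d+)\/?$/);
    var start = match ? (parseInt(match[1], 10) - 1) * pageLength : 0;

    $('#stocks').DataTable({
        pageLength: pageLength,
        displayStart: start    // e.g. page 2 -> start at row 100
    });
    ```

    The deep linking plug-in takes a similar approach, but driven by query string parameters rather than a path segment.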

    One thing to keep in mind about that kind of linking is that your data might change before the search engine reindexes your page. I certainly find it frustrating when I use Google and land on a page that looks perfect, only to discover that the data has changed since it was indexed and what I was looking for is no longer there!

    My own preference for handling this sort of thing is to disable paging when the page is being crawled for a search engine index. It means dynamically generating the JS, or adding an extra check into it, but it might be worth it. It also depends a bit on how much data you have - if we are talking millions of rows, then this approach is sub-optimal, since it would take a while for the page to load and Google would penalize you for that.
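
    As a rough illustration of that extra check (the user agent pattern is only an example, and this relies on the crawler executing JavaScript, which Googlebot does these days):

    ```js
    // Hypothetical crawler check: bots get the whole table on one page,
    // so every row is present in the rendered HTML that gets indexed.
    var isCrawler = /bot|crawl|spider/i.test(navigator.userAgent);

    $('#stocks').DataTable({
        paging: !isCrawler    // humans get paging, crawlers get all rows
    });
    ```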

    Regards,
    Allan

  • daniel437

    Thanks for this, Allan! This is exactly what I'm looking for. I need the deep linking so that Google can easily see all 5,000 entries. A direct link is very important for this. Outdated info should not be a major issue; Google just needs to see the links.

    As for regeneration, I think I can solve this by controlling how often my data updates and leveraging caching.
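
    Something along these lines is what I have in mind (the endpoint URL here is hypothetical - the server would regenerate the JSON on its own schedule and serve it with normal HTTP caching headers):

    ```js
    // Hypothetical cached JSON feed, regenerated server-side on a schedule.
    // `cache: true` tells jQuery never to append a cache-busting `_`
    // parameter, so ordinary HTTP caching headers can do their job.
    $('#stocks').DataTable({
        ajax: {
            url: '/wp-json/stocks/v1/leaderboard',   // hypothetical endpoint
            dataSrc: 'data',
            cache: true
        },
        order: []    // feed is already sorted by market cap server-side
    });
    ```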
