I built a JS tool that I named "tuples" in an effort to minimize both data generation effort and data transfer. Each page has its own temporary cache of data. The cache expires with a page reload, which is admittedly a rudimentary expiration technique, but we still get a performance boost for real-time-update systems that correlates with the effort required to generate the corresponding data.
Why "tuples"? Because the format is based on tuples: ordered lists of elements. All data is transferred as ordered arrays. The server responds with JSON representing an ordered array in which each element is either an integer/string or a sub-array whose first element is an integer/string. The first element of each sub-array is the primary key for the cache, and a bare integer/string in place of a sub-array indicates that the cached value for that key should be used.
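The client-side decoding can be sketched roughly like this. The names `cache` and `mergeResponse` are my own hypothetical choices, not the library's actual API; this is a minimal sketch of the principle, assuming a per-page cache object keyed by the primary key:

```javascript
// Per-page temporary cache; cleared implicitly on page reload.
const cache = {};

// Hypothetical decoder: expands a server response into full tuples,
// storing new tuples and substituting cached ones for bare keys.
function mergeResponse(rows) {
  return rows.map((row) => {
    if (Array.isArray(row)) {
      // Full tuple: first element is the primary key; remember it.
      cache[row[0]] = row;
      return row;
    }
    // Bare integer/string: the tuple was sent earlier, reuse the cached copy.
    return cache[row];
  });
}
```

Feeding it the two example responses shown below would return fully expanded rows both times, with key 5 resolved from the cache on the second call.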
On the server side, the list of keys that have already been sent to the client's cache is simply stored in the $_SESSION variable, which simplifies expiration and segmentation. If a key has already been sent, the server does not need to regenerate or resend the full data for that key.
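The server-side bookkeeping amounts to a set of already-sent keys. The original implementation lives in PHP's $_SESSION, but the idea is language-agnostic; here is a hypothetical sketch in JavaScript, where `buildResponse`, `sentKeys`, and the `{ key, fields }` record shape are all my own illustrative assumptions:

```javascript
// Hypothetical server-side encoder. `sentKeys` stands in for the
// per-session set of keys the client has already cached (kept in
// $_SESSION in the original PHP implementation).
function buildResponse(sentKeys, records) {
  return records.map(({ key, fields }) => {
    if (sentKeys.has(key)) {
      // Already in the client cache: send only the bare key.
      return key;
    }
    sentKeys.add(key);
    // New key: send the full tuple, primary key first.
    return [key, ...fields];
  });
}
```

On a repeat request, rows whose keys are in the set collapse to bare keys, producing exactly the shape of Response 2 below.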
Here is an example of how a complex autocomplete can benefit from this library:
- Response 1: [[1,"complex data", "really complex"],[5,"complex data for 5","and even more complex."]]
- Response 2: [5,[7,"complex for 7","more complex for 7"]]
As you can see, the advantages boil down to two things:
- The server does not have to recompute the data for key 5 on the second request, since the client already has it.
- Data transfer is minimized since there are no keys transmitted, no unnecessary white space (due to JSON optimization), and no redundant data.
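The saving from dropping keys is easy to see by comparing the same row serialized as a keyed object versus as a tuple. The property names `id`, `title`, and `extra` are invented for illustration:

```javascript
// The same row from Response 1, encoded two ways. The keyed-object form
// repeats property names in every row; the tuple form omits them.
const asObject = JSON.stringify({
  id: 5,
  title: "complex data for 5",
  extra: "and even more complex.",
});
const asTuple = JSON.stringify([5, "complex data for 5", "and even more complex."]);
console.log(asObject.length, asTuple.length); // the tuple form is shorter
```

The gap grows with the number of rows, since the object form repeats the keys per row while the tuple form pays for them zero times.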
This library has a very specific use case and is tightly coupled to the server code, which is why I have not bothered packaging the JS library as open source. However, the basic minimization principle is sound, and we have seen the benefits in a production environment.