Reply To: page pollution


Anonymous
September 2, 2014 at 7:46 pm #3340

Hmm, I don’t understand all of that, although the write permissions part is obvious.

I’m not even entirely sure about my own host, but I know cURL can write out a cookie file to any location I desire, so it would probably work for me. I do believe finding a WordPress directory location would never be harder than knowing the location of the installed plugin, since your plugin needs to know where its files are anyway.

Also, you would not write out the CSS and JS to statically named files. You would give these two files a dynamic or unique part in their names and save those names to the WordPress options table, or something of the kind. That options table always gets cached by WP, which I discovered much to my dismay when I tried to manually rearrange some category IDs: WP actually caches a term hierarchy list that you need to specifically update (or delete) or your changes won’t show. Not exactly the same thing, but the point is that performance in reading that data won’t be a problem, and you are probably already reading from that table. Your plugin then inserts the CSS/JS links it gets from that table, which is why I called it “dynamically linked”. Any stale browser caching is then mitigated.
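Something like this is what I have in mind (the option and function names are just made up by me, untested):

// On options save: store file names with a unique part in the options table.
$suffix = wp_generate_password( 8, false ); // random alphanumeric part
update_option( 'fancybox_css_file', 'fancybox-' . $suffix . '.css' );
update_option( 'fancybox_js_file', 'fancybox-' . $suffix . '.js' );

// On page load: read the names back (get_option() is served from the
// options cache, so it is cheap) and print the links.
function fancybox_print_asset_links() {
    $base = plugins_url( '', __FILE__ );
    echo '<link rel="stylesheet" href="' . esc_url( $base . '/' . get_option( 'fancybox_css_file' ) ) . '" />' . "\n";
    echo '<script src="' . esc_url( $base . '/' . get_option( 'fancybox_js_file' ) ) . '"></script>' . "\n";
}
add_action( 'wp_head', 'fancybox_print_asset_links' );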

I guess there could be many WP installations where the only file update mechanism is FTP: there is a MySQL db with write permissions, and it ends there.

I don’t think multi-site would make this more complex than it already is. A multi-site can share the same code while having a different configuration for each site, which makes file-based configuration very troublesome. Nevertheless, that only means you would see these semi-randomly named files being stored in that single plugin location. Multi-site seems to be so complex that I never intend to use it myself, but I figure WordPress takes care of having distinct tables for each site, which means the config of each site (for the plugin) is automatically distinct from site to site. So basically you only need write permissions to your plugin folder.

I have checked into some of those issues. Writing to your plugin folder is probably a problem. WordPress can write into themes, plugins and uploads, but the plugin folders themselves are then not group-writable.
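(A run-time check is simple enough, by the way. Untested, but wp_is_writable() exists for exactly this:)

if ( wp_is_writable( plugin_dir_path( __FILE__ ) ) ) {
    // write the generated CSS/JS as static files into the plugin folder
} else {
    // fall back to serving them through PHP, see below
}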

However…

If you generate the CSS/JS through PHP the way Jetpack seems to do…

// Jetpack's approach: serve the stored CSS straight from PHP.
if ( isset( $_GET['custom-css'] ) ) {
    header( 'Content-Type: text/css', true, 200 );
    header( 'Expires: ' . gmdate( 'D, d M Y H:i:s', time() + 31536000 ) . ' GMT' ); // 1 year
    Jetpack_Custom_CSS::print_css();
    exit;
}

…and if you then parameterize that script using a token that is regenerated on every options save, then a fresh token URL can never be stale in the browser cache. You just store the token with the options. The script that outputs the link into the page is aware of the token, as is the script that receives the parameter (the one that generates the CSS and JS). Both run at request time (except for the token itself). Having random tokens also takes care of multi-site, if need be. You can then enable far-future browser caching for the script output, and since each of these URLs is fixed in terms of the output it generates, response header management is also very simple. Basically, if a browser inquires about changes, you always output the same:

header( 'HTTP/1.1 304 Not Modified' );
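
In context, that revalidation branch would be something like this (untested):

// A tokened URL never changes its content, so every conditional request
// from the browser can be answered with a blanket 304.
if ( isset( $_SERVER['HTTP_IF_MODIFIED_SINCE'] ) || isset( $_SERVER['HTTP_IF_NONE_MATCH'] ) ) {
    header( 'HTTP/1.1 304 Not Modified' );
    exit;
}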

From https://www.mnot.net/cache_docs/:

If a resource (especially a downloadable file) changes, change its name. That way, you can make it expire far in the future, and still guarantee that the correct version is served; the page that links to it is the only one that will need a short expiry time.

The token is really only required for cache busting and perhaps for multi-site differentiation (but not necessarily so).

I mean, this seems like a perfect solution to me. That way you don’t need to deal with HTTP ETags, which seem overly complex; cache validation and responses stay extremely simple.

Then, the CSS/JS generation still happens at run time; it just gets cached. Almost nothing changes, except that it is sourced through one or two additional HTTP requests that the browser subsequently caches. It seems perfect. It is even extremely simple for me to implement if I want to: probably no more than an hour or two. If you want, I can even do it for you, so you can just check the result and see if it is any good. It would probably not require more than, say, 20-30 lines of code.
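Roughly, putting the pieces together (the hook, option, parameter and function names are all placeholders, and I have not run this):

// 1. Regenerate the token whenever the plugin settings are saved.
add_action( 'update_option_easy_fancybox_settings', 'fancybox_new_token' );
function fancybox_new_token() {
    update_option( 'fancybox_asset_token', wp_generate_password( 12, false ) );
}

// 2. Print the tokened URL; a new token means a new URL, so stale
//    browser caches are bypassed automatically.
add_action( 'wp_head', 'fancybox_asset_link' );
function fancybox_asset_link() {
    $token = get_option( 'fancybox_asset_token', '0' );
    echo '<link rel="stylesheet" href="' . esc_url( home_url( '/?fancybox-css=' . $token ) ) . '" />' . "\n";
}

// 3. Serve the CSS at run time, with a far-future expiry and a blanket 304.
add_action( 'init', 'fancybox_maybe_serve_css' );
function fancybox_maybe_serve_css() {
    if ( ! isset( $_GET['fancybox-css'] ) ) {
        return;
    }
    if ( isset( $_SERVER['HTTP_IF_MODIFIED_SINCE'] ) || isset( $_SERVER['HTTP_IF_NONE_MATCH'] ) ) {
        header( 'HTTP/1.1 304 Not Modified' );
        exit;
    }
    header( 'Content-Type: text/css', true, 200 );
    header( 'Expires: ' . gmdate( 'D, d M Y H:i:s', time() + 31536000 ) . ' GMT' ); // 1 year
    echo fancybox_generate_css(); // hypothetical: builds the CSS from the saved options
    exit;
}

That is the whole mechanism: one option write on save, one link in the head, and one early-exit handler.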

Kudos, B.