This post, submitted as news on the WeblogToolsCollection.com forums, has been well received by the plugin developers who have taken notice. The article explains how to include CSS and JavaScript conditionally so that the code is not loaded on every page of the site.
If you think about it, there are many plugins that only do something once in a blue moon. Table of contents, text manipulators, galleries, sliders, etc, etc. If only they loaded their frontend code strictly when necessary, most page loads would suddenly become much lighter.
This technique, if implemented by plugin authors, sounds like it could have a significant impact on end users’ websites, especially when it comes to loading times. I’m always impressed with the speed of WordPress when I install a fresh copy without any plugins. WordPress loads very quickly on both the front and back ends. However, once I activate 30 or so plugins, most of which add functionality to the front end of the site, I see page load times increase significantly. It’s a shame, too, since I routinely hear people claim WordPress is sluggish software, only to find out they have over 30 plugins activated on their site.
If you’re a plugin author, can you please tell me what some of the drawbacks are with Artem’s approach?
Thanks for mentioning this, guys. It is really important to spread the word about the plugin bloat and make authors more aware of what they’re doing and what they shouldn’t be doing.
I looked at a few popular blogs as well as my own and saw everything from duplicate jQuery loads to styles and scripts that load on the front page when the features they belong to are nowhere to be seen (contact forms, polls, code syntax highlighting, bookmarks, files that should only be loaded in the admin, etc.).
It’s a mess and it needs to be cleaned up.
Artem
http://beerpla.net
http://twitter.com/ArtemR
It’s always good to include what you need only when you need it.
The only drawback to loading JS and CSS conditionally is that it requires more work on the developer’s part. 🙂
Also, your question was already answered in the comments on the original article.
Well, I know quite a few plugin authors read this site so I figured I’d put the question here as well just to generate input.
I’ve used a similar method for look ahead before, but I didn’t think about using the_posts. Clever.
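For anyone who hasn’t seen the technique, here is roughly what the the_posts lookahead looks like. This is a minimal sketch, not Artem’s exact code: the my_plugin_* names and the '[my_shortcode' needle are placeholders, and the two stubs at the top stand in for WordPress’s hook functions so the sketch runs outside WordPress.

```php
<?php
// Lookahead sketch: hook the_posts, scan the queried posts for the
// plugin's shortcode, and only register the enqueue callback when found.
// Minimal stand-ins so this runs outside WordPress:
if ( ! function_exists( 'add_filter' ) ) {
    function add_filter( $hook, $cb, $priority = 10, $args = 1 ) { /* no-op */ }
}
if ( ! function_exists( 'add_action' ) ) {
    function add_action( $hook, $cb, $priority = 10, $args = 1 ) { /* no-op */ }
}

function my_plugin_posts_have_shortcode( $posts ) {
    foreach ( (array) $posts as $post ) {
        // Strict check against false: stripos() returns 0 when the match
        // starts at position zero, which a loose "if" would treat as "not found".
        if ( false !== stripos( $post->post_content, '[my_shortcode' ) ) {
            return true;
        }
    }
    return false;
}

function my_plugin_lookahead( $posts ) {
    if ( my_plugin_posts_have_shortcode( $posts ) ) {
        // Only now schedule the assets for this page load.
        add_action( 'wp_enqueue_scripts', 'my_plugin_enqueue_assets' );
    }
    return $posts; // the_posts is a filter, so always pass the posts through.
}
add_filter( 'the_posts', 'my_plugin_lookahead' );
```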
Also note that, generally speaking, it’s actually better to NOT optimize in this case and simply include your script all the time.
Why? Browser caching.
If we assume that your visitor is going to see at least one page that requires the script, then at some point he’ll have to load the script into the browser. But that only happens one time, the rest of the time the script is already in their browser cache. So how much time do you really save them from doing this conditional loading? They have to load the thing anyway, generally. All that adding this conditional loading really does for you is to add a bunch of PHP execution in order to save some time that really isn’t saved…
This is unnecessary optimization. It’s probably a de-optimization on most setups. I’d avoid doing lookahead unless you have a specific use case where it fits. It doesn’t fit the general case.
Even with the browser caching, you’re still sending an extra HTTP request for each script.
No, you’re not. If it’s cached, it’s cached. No HTTP request goes out for a file that doesn’t need to be retrieved.
Actually, that may or may not be true, depending on the conditions and the browser. Sometimes no request goes out; sometimes a request goes out and the server sends back a 304, which means “not modified”, but an HTTP round trip is still spent in that case.
That sort of request only goes out if the cache is from a previous session, which is not the current session. Also, HTTP requests don’t add a lot of overhead with modern servers, because you’re not using a new TCP/IP connection. Look up “HTTP Pipelining”.
Heh, http://en.wikipedia.org/wiki/HTTP_pipelining
Internet Explorer 8 does not pipeline requests, due to concerns regarding buggy proxies and head-of-line blocking.[2]
Mozilla Firefox 3 supports pipelining, but it’s disabled by default. It uses some heuristics, especially to turn pipelining off for IIS servers. Instructions for enabling pipelining can be found at How do I enable Pipelining. Camino does the same thing as Firefox.
Google Chrome is not believed to support pipelining, although it may be implemented in the near future.[3]
Not much of a gain there if it either doesn’t work or is disabled for most users.
Good info though. I’ll go enable mine in FF.
So if someone is running 10 plugins, each outputting loads of JavaScript that is only necessary on a handful of pages… you think it is unnecessary optimization to avoid outputting all this JavaScript on every single page? I strongly disagree.
If plugins do not selectively output Javascript/CSS that means that it can also be output on the Admin and cause unnecessary conflicts and issues with plugins that utilize Javascript within the admin. It’s the source of a lot of plugin conflicts that occur with WordPress.
You also make the assumption that everybody caches everything. That’s not true. Personally, as a developer, I hate browser cache. With the rise of “privacy mode” and “incognito mode” functionality built into browsers, more and more people are NOT caching everything.
I strongly disagree with your strong disagreement.
First, there’s a difference between being selective in the admin side of things vs. being selective in the public facing portion. Obviously you wouldn’t include the same scripts site-wide, you’d restrict it to one or the other. Rarely both. And the admin behavior is different than what this optimization is talking about in the first place, since he’s talking about using the_posts, which only occurs on the query, which is very specifically only on the non-admin side of things.
Secondly, all modern browsers do caching. Period. If not on disk, then in memory.
And if you hate browser caching, then my considered opinion is that you’re a lousy web developer. Nothing personal, but caching is *critical* to proper web app operation. Without caching, the entire web would be hideously slow. Every single page you went to would have to load everything instead of reusing the shared common elements.
Also, even private modes in browsers use session-based memory caching.
Otto,
I think Carl means that he hates browser cache as a developer, meaning when you’re writing code and the browser doesn’t refresh some of it, it’s a pain in the ass. I usually disable caching when developing too, but that’s really beside the point here.
As far as admin scripts, what Carl meant was that some plugins would load the same scripts and styles on both post pages and admin pages, which is clearly unnecessary but that can be fixed without what we are discussing here. For admin-only scripts, it’s already easy to restrict them to only needed admin pages, as I mentioned at http://beerpla.net/2010/01/15/.....itionally/.
Ah. Okay, I can see that a bit better, I guess. I generally use script src type stuff for including libraries only. My jscript stuff lives in the page itself until I’m done developing it. So caching is not generally a problem there.
And yes, I see his point with admin vs. post page, but that wasn’t what the original article was discussing at all. Admin and viewer-side are two different things, you don’t load stuff on both of them in the same way unless you’re new to plugin programming and hook everything to init.
Wow. No need to get into name calling.
I don’t like browser caching specifically because it interferes with the development process.
As a developer, there have been numerous times during a late night when I would bang my head against the desk because a change wasn’t working properly, only to realize it was a caching issue.
I develop with caching turned off specifically to avoid silly things like this.
If you look at blog traffic, especially of one popular in search engines, you’ll find that there are vast amounts of visitors who visit once and don’t come back for a bit – they just found what they wanted, so they go on their way.
However, that initial page load was more bloated than it should have been. They have no benefit from browser caching.
Hell, your regular visitors might not either, if they only visit once in a while.
See, the problem here is that you have your generalizations and I have mine. Most of my visitors look at several pages on my sites, they’re not one-and-gone.
What I’m really saying is that people think of optimization as a “do this, then it’s more optimal” or something equally insane. Optimization depends on the specific case in question. What is optimal for one site might not be for another. And any “optimization” that adds overhead is a matter of tradeoffs.
This particular optimization of scanning the post content adds a fair amount of overhead. You’re doing a lot of text comparisons and such. It could be more complicated than a simple [code] search type of thing… And in many cases, that overhead can be much larger than the gains you get from not including the script in question.
That’s all I’m saying. It’s crazy to generalize, and it’s crazy to prematurely optimize for cases when you don’t know that that case is actually causing trouble.
Somebody who really wants to optimize their site needs to first perform an analysis of that specific site and determine the weak points, then address those points. You can’t simply plug a bunch of code-concepts in, call it “optimized” and call it a day.
Drawbacks? None. It’s best practice.
We only include JavaScript and CSS when absolutely necessary. We go as far as to only include JS when the output of the plugin needs it, not just if the plugin is present.
For instance, we have Javascript associated with the Datepicker field and Conditional Logic. If your form doesn’t have a Datepicker on it, our plugin doesn’t output the necessary JS. If your form doesn’t have Conditional Logic, our plugin doesn’t output the necessary JS.
It is also best to use wp_enqueue_script when doing Javascript.
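A toy model of why enqueuing beats echoing script tags (this stand-in mimics the dedup behavior; it is not the real WordPress API): WordPress keys scripts by handle, so two plugins that both ask for 'jquery' result in a single include rather than the duplicate jQuery loads mentioned earlier in this thread.

```php
<?php
// Toy version of handle-based script registration. The real
// wp_enqueue_script() works similarly: repeated requests for the same
// handle do not produce duplicate <script> tags.
$enqueued = array();

function toy_enqueue_script( $handle, $src = '' ) {
    global $enqueued;
    if ( ! isset( $enqueued[ $handle ] ) ) {
        $enqueued[ $handle ] = $src; // first registration wins
    }
    // Repeat requests for an already-enqueued handle are no-ops.
}
```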
If more plugin developers adhered to these best practices, there would be fewer conflicts and problems.
I don’t know how many plugins i’ve seen that output Javascript on EVERY SINGLE PAGE. This includes the Admin. I’m not sure why any plugin needs to output Javascript on EVERY SINGLE ADMIN PAGE. This is the source of many headaches when it comes to plugin conflicts in the Admin.
Definitely with Carl and scribu (who is a core contributor btw) here – great points, guys.
I just posted a follow-up here: http://beerpla.net/2010/01/15/.....itionally/
I like the idea of only loading what you need when necessary. I didn’t know that some plugins were loading scripts into every page in the admin side.
If anyone’s still reading, here’s a method that doesn’t involve parsing the posts twice: http://scribu.net/wordpress/op.....ading.html
+1
This is a significantly better method and will speed up page loads as well.
I use this method in my SyntaxHighlighter plugin. When you use one of its shortcodes, it marks down that you need one of its JavaScript files and then outputs it in the footer. 🙂
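The shortcode-flag approach can be sketched like this, stripped down to plain PHP so it runs anywhere. The file path and markup are made up for illustration; in a real plugin the handler would be registered via add_shortcode() and the footer callback via the wp_footer hook.

```php
<?php
// Shortcode-flag technique: the shortcode handler sets a flag while the
// page content is rendered, and a footer callback prints the script only
// if the flag was set on this page load.
$my_plugin_needs_js = false;

function my_plugin_shortcode( $atts ) {
    global $my_plugin_needs_js;
    $my_plugin_needs_js = true; // remember that this page used the shortcode
    return '<pre class="highlight">...</pre>';
}

function my_plugin_print_footer_script() {
    global $my_plugin_needs_js;
    if ( ! $my_plugin_needs_js ) {
        return ''; // shortcode never ran on this page: emit nothing
    }
    return "<script src='/wp-content/plugins/my-plugin/highlight.js'></script>";
}
```

Because the footer renders after the content, no second pass over the posts is needed.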
I am with Otto on this one. I would rather rely on, and take my chances with, the browser cache and load on every page, rather than increase my PHP CPU/memory usage on every page load.
For the client side, there is no API in WordPress to load JS/CSS conditionally. But for the WP-Admin side, there are APIs to load only on selected admin pages, available after WP 2.8; see http://lesterchan.net/wordpres.....dpress-28/ (at the bottom)
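As a sketch of the admin-side APIs Lester is referring to: the admin_enqueue_scripts hook passes the current admin page’s hook suffix, so a plugin can bail out everywhere except its own page. The 'settings_page_my-plugin' suffix and function name here are placeholders, not from any real plugin.

```php
<?php
// Admin-side conditional loading: check the hook suffix that WordPress
// passes to admin_enqueue_scripts and load assets only on our own page.
// Registered in a real plugin with:
// add_action( 'admin_enqueue_scripts', 'my_plugin_admin_assets' );
function my_plugin_admin_assets( $hook_suffix ) {
    if ( 'settings_page_my-plugin' !== $hook_suffix ) {
        return false; // not our settings page: load nothing
    }
    // wp_enqueue_script( 'my-plugin-admin', ... ); would go here.
    return true;
}
```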
Scribu’s way seems to be better – http://scribu.net/wordpress/op.....ading.html
On the client side, you could use conditional tags, but that’s not much help, most of the time.
Always test stripos() strictly against false. If the needle you’re looking for is the first character, then it will return 0, which will fail the if test: if ( false !== stripos( $haystack, 'needle' ) ) { … }
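To make the pitfall concrete, here is a tiny runnable example; contains_needle() is just an illustrative wrapper, not part of any plugin.

```php
<?php
// stripos() returns the integer position of the match, or false if there
// is none. When the match starts at position 0, a loose truthiness check
// wrongly reports "not found", so compare strictly against false.
function contains_needle( $haystack, $needle ) {
    return false !== stripos( $haystack, $needle );
}
```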
I’m still shaking my head that some plugin developers who have commented on this post actually think it is okay to load JavaScript on every single page despite the fact it is unnecessary. Very disappointed to see Lester Chan chime in that he agrees. I’ll think twice about using his plugins in the future now that I know he condones this practice.
If you have 100 pages and only 1 of those needs the JavaScript… why in the hell would you load the JavaScript on the 99 pages that don’t need it?
It’s attitudes like this that help contribute to plugin conflicts. Developers recklessly throwing code around.
To those that are worried about PHP processing overhead… ever heard of server caching? WP SuperCache, Batcache, etc. Problem solved.
+1 to scribu, carl, artem and viper for actually getting it
The more usual case is not 1 in 100, it’s 95 in 100.
What you’re missing is the subtle fact that generally your check for loading conditions takes more time/resources than simply loading the script every time.
It makes no sense to optimize a site for a fractionally small percent of cases.
This post is very helpful to me, but I still have a problem with my blog. My blog on blogdetik runs on the WordPress engine, and I want to add site statistics like Histats and Alexa, but I can’t get them into my blog. I have tried using HTML code, but without success. Can you help me solve this problem?
I’m sorry if my English is still bad. I’m from Indonesia. Thanks.
This is something that WP plugin developers should definitely consider. While building a client site recently, I was happy to notice that the cForms plugin had this feature enabled, allowing me to only include it on certain pages.
What would REALLY improve efficiency and reduce the MySQL server load is some way of caching the values of bloginfo(‘name’), bloginfo(‘url’), etc. that are often used multiple times in header.php and footer.php files so that there’s only one database access for each of these values per day (or hour).
Yes, hard-coding content would reduce the database calls but then a site’s php code would have to be checked if anything changed. It would also thwart using a theme for multiple domains since each domain would have to be hardcoded differently.
Consider a 10,000-page site where a typical visitor sees 6-10 pages, and each of those pages has a dozen or so instances of bloginfo(‘name’) in a header or footer.
Imagine if such fields could be determined only once per day or hour. It would also DRASTICALLY reduce server loads during search engine spidering.
I have a server with hundreds of sites with thousands of pages, each. Spidering hammers us.
You’re way offtopic, but I’m going to answer your concern:
bloginfo() _is_ cached already, since it calls get_option(), which caches the values from wp_options.
The same is true for most template tags in WordPress.
Seems to me that if one looks at the query count, each of those bloginfo() calls is counted separately.
And do the values of bloginfo() carry forward from one page to another?
+1 for scribu’s answer. WP loads all options values in one go and caches them for the duration of page execution, unless they are added via autoload = false, which is the 4th parameter to add_option().
An obvious optimization like that – do you think WP team is this naive? 😛
For more info, take a look at get_alloptions() which is called by get_option() the first time.
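A toy model of the caching pattern being described (an illustration only, not WordPress code): all autoloaded options are fetched with one simulated query, and every later lookup is served from the in-memory array for the rest of the page load.

```php
<?php
// Toy version of the get_option()/get_alloptions() pattern: one query
// populates a cache, and repeat lookups never touch the "database" again.
class ToyOptionCache {
    private $cache = null;
    public $queries = 0; // counts simulated wp_options queries

    private function load_alloptions() {
        $this->queries++; // one simulated "SELECT * FROM wp_options"
        return array(
            'blogname' => 'Example Blog',
            'siteurl'  => 'http://example.com',
        );
    }

    public function get_option( $name ) {
        if ( null === $this->cache ) {
            $this->cache = $this->load_alloptions(); // first call only
        }
        return isset( $this->cache[ $name ] ) ? $this->cache[ $name ] : false;
    }
}
```

So a dozen bloginfo(‘name’) calls in a template cost one query, not twelve.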
Hi Jeff!
Thanks for this Post!!
I was looking all over for this until I finally got here 🙂
Now you’re bookmarked for faster finds in the future!
I love speed, so it’s great when code is not loaded on every page of a site.