Defining variables as parameters saves bytes (", x" vs "var x") but shouldn't increase performance. Passing document into the function saves writing "document" twice or doing a "var d = document".
Creating a new script tag with JS lets you get asynchronous script loading for older browsers that don't support the async attribute.
"As for your question, we pass in `document` that way to prevent unnecessary scope lookups. We could’ve used `var d = document` inside the IIFE, but that would trigger an additional scope lookup. For `'script'` it doesn’t matter; you could initialize that inside of the IIFE if you want."
Is it premature or unnecessary, if the methodology comes naturally? Why not get into the habit of reading and writing JavaScript with optimization naturally ingrained into the first pass? The optimization(s) may not be immediately necessary here, but there's no harm in practicing these patterns in all facets of language use, unless we want to have a separate conversation about whether the patterns are detrimental to readability.
It looks like the kind of optimization that a compiler could do pretty trivially, and that you shouldn't have to be concerned with as a human, since it makes the code less readable.
Then again, JavaScript is a strange animal here, as the code is provided to the execution environment as plain text instead of bytecode/assembly. I guess a JS-to-JS compiler such as Closure Compiler could handle it.
- Define g and s as parameters even though they're only given a value inside the function
- Pass document into the function instead of using it from global scope
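As a concrete illustration of the byte-counting argument (the exact savings depend on the minifier), compare the two minified forms as strings:

```javascript
// Declaring g and s as extra, unfilled parameters is shorter, after
// minification, than declaring them with `var` inside the body.
var withVar    = "(function(d){var g,s;}(document))";
var withParams = "(function(d,g,s){}(document))";

// The parameter form saves a few bytes per declared name.
console.log(withVar.length - withParams.length); // 4
```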
This is not necessarily true: JavaScript passes arguments by reference, and thus g and s are actually "given back" to the caller, who can then do different things with those objects.
d is not necessarily a document, but can be something different too: consider, for example, the document of an iframe's contentWindow.
Then the only thing I can think of is that accepting it as an additional parameter in the function call is less overhead (as in, bytes) than defining 'd' as a var:
Yeah, how dare they say the lowest-performing option first in an in-passing comment! What if the user knows nothing about performance and stops there! Their page will load 8 milliseconds less quickly! Amateur hour!11!
The ironic thing about Google's +1, which is meant to get you more clicks on SERPs (as you'll see your friends' +1s [1]), is that since Google now uses site speed in its ranking algorithm [2], adding the +1 button to your site could actually make it rank lower.
There's another performance review of +1 here [3], which compares it to the Facebook Like button. Steve Souders (who's written books on web performance) has detailed and compared a few other 3rd party JavaScript buttons and widgets here [4].
Google Instant's previews show the +1 button, so I see no reason why Googlebot wouldn't render it too. I guess Google waits until window.onload fires instead of just timing how long the server takes to send the initial HTML file. Google could also use data from Google Toolbar users, but I doubt they do.
As the article says, even with a primed cache, the +1 code slows down page load, and the overhead can be up to 2 seconds. If you're aiming for fast, that could double your page load time. External assets are very likely to be counted toward a site's speed, as they generally account for quite a bit of the page load time.
Of course, no one knows how much of an effect page speed has in Google's algorithms. My bet, however, is that adding +1 to a fast site could easily make it drop a few places in the SERPs. If your site is already quite slow, then an extra 2 seconds may only make it a few percent slower.
You're assuming search ranking is primarily based on speed. It might weight the ranking, but relevancy must still be dominant.
I mean, come on. Sites with maps, ads, or even big images can take up more than two seconds. It's not a big deal. Your analytics tracking code slows everything even more.
And it's all the same network; the crawler might just be fetching from the next server room.
The page speed that Google uses in its ranking is (in part?) collected from the Google Toolbar installed on local clients. The JS is only cached for 6 minutes. So it seems this will indeed negatively impact your search rankings.
Another problem is that you can't directly access the div after the iframe is loaded if you want to apply styles to it. I had to use JavaScript to watch for when the div loaded and then position it.
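One workaround is to poll until the widget's container exists and then apply the styles. This is a hedged sketch — the helper name, selector, and interval are assumptions, not a documented API — with a stand-in document so it runs outside a browser:

```javascript
// Poll `doc` until `selector` matches, then run `apply` on the element.
function styleWhenReady(doc, selector, apply, intervalMs) {
  var el = doc.querySelector(selector);
  if (el) { apply(el); return; }          // already in the DOM
  var timer = setInterval(function () {
    el = doc.querySelector(selector);
    if (el) {
      clearInterval(timer);
      apply(el);
    }
  }, intervalMs || 100);
}

// In a real page: styleWhenReady(document, '.g-plusone', fn).
// Here, a stand-in document whose element already exists:
var stubDoc = { querySelector: function () { return { style: {} }; } };
var styled;
styleWhenReady(stubDoc, '.g-plusone', function (el) {
  el.style.position = 'absolute';   // e.g. reposition the widget
  styled = el;
});
console.log(styled.style.position); // 'absolute'
```

A MutationObserver or a load event would be cleaner where available, but polling is the lowest-common-denominator approach for older browsers.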
I always see suggestions to put script tags just before </body>, or to use the inline script that generates a new <script> node. Why does nobody ever suggest using the defer attribute? My understanding is that defer does precisely what you want: start downloading the script now, without blocking anything else on the page.
Google has added the +1 option to all their search requests/results. I'm sure that if performance were such an issue, they would have found a way to fix it, as it would degrade their own page load times; granted, I'm sure it's not the exact same code.
I find it mildly irritating that a guy writes a several-page article explaining individual, specific performance problems together with measurements and suggested fixes, and you respond that you're sure there's no issue because Google is smart.
The +1 widget on the search results page and other first-party pages is different from the +1 widget for third-party publishers. Third parties have to use an iframed widget to prevent click fraud.
Does it prevent click fraud? I wonder if you could place +1 inside a div with style="opacity:0" and position it on top of a link saying "Free cheese" or similar.
Edit: you can. This example [1] has opacity:0.3 so you can still see it. It seems it would be trivial for an attacker to put the +1 button over every link on a page. You might even be able to have one track the mouse cursor.