3 reasons why you should let Google host jQuery for you
By Dave Ward on December 10th, 2008

All too often, I find code similar to this when inspecting the source of public websites that use jQuery:
<script type="text/javascript" src="/js/jQuery.min.js"></script>
If you’re doing this on a public facing website, you are doing it wrong.
Instead, I urge you to use the Google AJAX Libraries content delivery network to serve jQuery to your users directly from Google’s network of datacenters. Doing so has several advantages over hosting jQuery on your server(s): decreased latency, increased parallelism, and better caching.
In this post, I will expand upon those three benefits of Google’s CDN and show you a couple examples of how you can make use of the service.
Update: Since you’re reading this post, you may also be interested to know that Google also hosts full jQuery UI themes on the AJAX APIs CDN.
Decreased Latency
A CDN — short for Content Delivery Network — distributes your static content across servers in various, diverse physical locations. When a user’s browser resolves the URL for these files, their download will automatically target the closest available server in the network.
In the case of Google’s AJAX Libraries CDN, what this means is that any users not physically near your server will be able to download jQuery faster than if you force them to download it from your arbitrarily located server.
There are a handful of CDN services comparable to Google’s, but it’s hard to beat the price of free! This benefit alone could decide the issue, but there’s even more.
Increased parallelism
To avoid needlessly overloading servers, browsers limit the number of connections they make simultaneously. Depending on the browser, this limit may be as low as two connections per hostname.
Using the Google AJAX Libraries CDN eliminates one request to your site, allowing more of your local content to be downloaded in parallel. It doesn’t make a gigantic difference for users whose browsers allow six concurrent connections, but for those still running a browser that only allows two, the difference is noticeable.
Better caching
Potentially the greatest benefit of using the Google AJAX Libraries CDN is that your users may not need to download jQuery at all.
No matter how well optimized your site is, if you’re hosting jQuery locally then your users must download it at least once. Each of your users probably already has dozens of identical copies of jQuery in their browser’s cache, but those copies of jQuery are ignored when they visit your site.
However, when a browser sees references to CDN-hosted copies of jQuery, it understands that all of those references refer to the exact same file. Because every one of those CDN references points to exactly the same URL, the browser can trust that the cached file is identical and won't waste time re-requesting it. Instead, it reuses the single copy cached on disk, regardless of which site the CDN reference appears on.
This creates a potent "cross-site caching" effect which all sites using the CDN benefit from. Since Google's CDN serves the file with headers that attempt to cache the file for up to one year, this effect truly has amazing potential. With many thousands of the most trafficked sites on the Internet already using the Google CDN to serve jQuery, it's quite possible that many of your users will never make a single HTTP request for jQuery when they visit sites using the CDN.
Even if someone visits hundreds of sites using the same Google-hosted version of jQuery, they will only need to download it once!
Implementation
By now, you’re probably convinced that the Google AJAX Libraries CDN is the way to go for your public facing sites that use jQuery. So, let me show you how you can put it to use.
Of the two methods available, this option is the one that Google recommends:
The google.load() approach offers the most functionality and performance.
For example:
<script type="text/javascript"
src="http://www.google.com/jsapi"></script>
<script type="text/javascript">
// You may specify partial version numbers, such as "1" or "1.3",
// with the same result. Doing so will automatically load the
// latest version matching that partial revision pattern
// (e.g. 1.3 would load 1.3.2 today and 1 would load 1.6.2).
google.load("jquery", "1.6.2");
google.setOnLoadCallback(function() {
// Place init code here instead of $(document).ready()
});
</script>While there’s nothing wrong with this, and it is definitely an improvement over hosting jQuery locally, I don’t agree that it offers the best performance.

In a network waterfall view, you can see that loading, parsing, and executing jsapi delays the actual jQuery request. Not usually by a very large amount, but it’s an unnecessary delay. Tenths of a second may not seem significant, but they add up very quickly.
Worse, you cannot reliably use a $(document).ready() handler in conjunction with this load method. The setOnLoadCallback() handler is a requirement.
Back to basics
In the face of those drawbacks to the google.load() method, I’d suggest using a good ol’ fashioned <script> declaration. Google supports this method as well.
For example:
<script type="text/javascript" src="//ajax.googleapis.com/ajax/libs/jquery/1.6.2/jquery.min.js"></script> <script type="text/javascript"> $(document).ready(function() { // This is more like it! }); </script>
Not only does this method avoid the jsapi delay, but it also eliminates three unnecessary HTTP requests. I prefer and recommend this method.
If you're curious why the script reference is missing the leading http:, that's a helpful trick which allows you to use a single reference that works on both HTTP and HTTPS pages. For more information about that and why it matters, be sure to check out this follow-up post: Cripple the Google CDN’s caching with a single character.
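To make the difference concrete, here’s a quick sketch of the two forms (the version number is just illustrative):

<!-- On an HTTPS page, an explicit http: reference may trigger mixed-content warnings: -->
<script type="text/javascript" src="http://ajax.googleapis.com/ajax/libs/jquery/1.6.2/jquery.min.js"></script>

<!-- The protocol-relative form resolves to http: or https: to match the hosting page: -->
<script type="text/javascript" src="//ajax.googleapis.com/ajax/libs/jquery/1.6.2/jquery.min.js"></script>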
Conclusion
According to a recent study, Google will consume 16.5% of all consumer Internet capacity in the United States during 2008. I think it’s fair to say that they know how to efficiently serve up some content.
The opportunity to let the pros handle part of your site’s JavaScript footprint free of charge is too good to pass up. Given how often even returning users experience the “empty cache” load time of your site, it’s important to take advantage of an easy optimization like this one.
What do you think? Are you using the Google AJAX Libraries CDN on your sites? Can you think of a scenario where the google.load() method would perform better than a simple <script> declaration?
I wonder why this hasn’t been done for MicrosoftAjax.js?
Update: Microsoft now provides a similar service for MicrosoftAjax.js.
What do you think?
We tried to do this, for all the reasons you state. Unfortunately, for the few weeks that we had it that way, our local development internet connection was a bit unpredictable. Every time the internet connection came down, testing ground to a screeching halt; it just wasn’t considered worth it in the end.
There are ways around this. Not certain which framework you’re using… but in Django you set a debug setting, and you have access to it in your templating framework.
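For example, something like this in a Django template (a sketch, assuming the debug context processor is enabled so the debug flag is available in templates):

{% if debug %}
  {# Local copy while developing offline; the path is just a placeholder #}
  <script type="text/javascript" src="/js/jquery.min.js"></script>
{% else %}
  <script type="text/javascript" src="//ajax.googleapis.com/ajax/libs/jquery/1.6.2/jquery.min.js"></script>
{% endif %}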
Dave,
The only thing that bugs me about this is that the intellisense in VS2008 doesn’t pick up the google js file. This isn’t a big deal as I just add a local reference for working with it and then remove it before deployment.
Would be good if the intellisense worked on remote JS files.
Jon
If you’re working in JavaScript include files, you can use:
/// <Reference Path="/path/to/jquery-vsdoc.js"/>This is a great post which expands a post I made on my blog called Differences Between Local And Google Hosted jQuery Installations so hopefully anybody who comes here needing help with jQuery on Google will find all the information they need!
Keep up the good work, Ian.
@Paul: When doing development, you should have a local copy anyway IMHO. You’re not developing/testing on your live server are you?
If you’re doing this on a public facing* website, you are doing it wrong…
Strong words.
What if you are serving up content over https?
https://ajax.googleapis.com/ajax/libs/jquery/1.2.6/jquery.min.js
What if you’re serving up content hosted in and most commonly accessed from within the PRC ( think GFW )?
If you have some users in China (or Iran), you’ll want to use a fallback approach like this one: http://weblogs.asp.net/jgalloway/archive/2010/01/21/using-cdn-hosted-jquery-with-a-local-fall-back-copy.aspx
If your users are primarily in those blocked regions, treat it as an intranet and just use a locally hosted copy.
I’d say 80-90% of our users are inside “The Zone”, so we host all necessary info and force anything that requires “not our” content to use AJAX… Developing a site aimed at the PRC has some interesting challenges – we can’t assume ANY service will be available bar ours.
Using the google.load() method will also allow you to use Google’s ClientLocation :)
http://code.google.com/apis/ajax/documentation/#ClientLocation
True, but caution: ClientLocation is often null even when it’s clear that location information is available.
It’s a nice API and I’m glad they make it free, but just know that even with corporate, fixed-IP addresses it often doesn’t know anything about location.
What if the user’s network (for some reason) is blocking Google’s CDN?
This could happen if some crazy corporate policy exists that stops the downloading of some other file on the CDN (or blocks Google entirely).
These users will no longer be able to use your site correctly.
I don’t think you’re very likely to find that combination in practice. If they’re willing to block 16.5% of the Internet in one shot, they’ve probably blocked your site too.
I can speak from experience – that this happens.
It’s a royal pain, too.
Indeed it does. And the people on the other end generally won’t be back.
If you’re coding the site correctly, the experience will degrade but not lose any functionality in this scenario. Since some users turn JS off at the browser level, you should be doing this anyway.
yeah yeah, but let’s face it – not everybody has time to pour into perfect degradation when < 1% of their users fall into the no-js category
Thanks for the write-up, Dave. I’ve been using the Google hosted jQuery for a while and wondering if I shouldn’t be. Your points remind me why I started doing it in the first place and makes me feel more confident that I should continue doing so.
Couldn’t you use JS to verify google’s jQuery script loaded… Then fallback to a locally hosted version if it’s unavailable?
<script src="https://ajax.googleapis.com/ajax/libs/jquery/1.2.6/jquery.min.js"></script>
<script>
  if (typeof jQuery == 'undefined') {
    // load local version…
  }
</script>
Nice one fella!
My only concern would be breaking changes with a new version. I know, not probable, but definitely possible…….
If you’re using the <script/> method, you don’t have to worry about that. The 1.2.6 URL will always load 1.2.6, even after 1.3 is released.
If you used the google.load() method and didn’t specify the full version explicitly (e.g. "1" or "1.2"), that could definitely be a problem.
^^^
Over engineering?
This is probably the most worthless thing Google has come up with. Yeah sure it’s nice that they are supporting the libraries, but really, how is this beneficial? All we do to handle the problem you describe in your post is to download the minified versions of all the plugins and then cut and paste the code into one file.
Now we don’t have to worry about sending 15 files across the pipe (we use a lot of plugins) and all we’re really sending is the minified files and our custom javascript file. 2 files and that’s it.
Why would you want to force me to re-download jQuery in your combined include? That’s an unnecessary waste of time and bandwidth.
Script combining is generally a good idea, but rolling jQuery into your monolithic JS include will never match the performance of using Google’s CDN; not even for users with an empty cache.
You’d be very, very surprised at how much it does matter. Rolling it into 1 big JS is the preferred way to package a JS-packed site.
If you roll jQuery and all your plugins into one JS include and I use the Google hosted jQuery in conjunction with a combined include with just my plugins, my site will load faster than yours every single time. Even for users with an empty cache.
I could argue this all day long since you seem overly stuck in your ways, but 95% of the time it’s far faster to roll up your own JS file, jsmin it, and gzip compress it to your users.
Unless you’re running a heavily loaded server (and even then), it would still be faster to load one file than it would be to do 2 full remote calls.
Of course the timing is also dependent upon a few things: your distance to the server hosting the files (hops), your connection, the call order, and how loaded your server is.
I’m not a big fan of someone proposing an idea as the one and only way to accomplish something — and all other ways of doing it are wrong (which is basically what you said, exempting LAN of course). Also by comparing via one tool on one browser is the wrong way to go about showing performance gains.
Anyway, I’d suggest you check out some other methods before claiming a dependency as the best way to do it. I call it a dependency because you’re now fully relying on a third-party. There’s many ways to deliver a payload faster — through things such as dynamic JS importing/loading, etc.
For what it’s worth, so you don’t think I’m picking one way of doing things and arbitrarily claiming it’s best, I’ve been testing it both ways for quite awhile. Even using the old Google Code hosted version, which was less optimized for serving users, was faster than rolling it up.
That’s before you even consider the users who will show up and already have it cached locally.
I’d love to see a post making the counterpoint, with numbers. If you made one, I would be happy to link to it from this post, to offer more complete information.
Interesting post. Thanks for sharing this.
One reason not to use it is privacy:
Google will know all of your client’s IP-addresses, referrers, etc. It’s like using google analytics, but only Google seeing your statistics.
In European data protection laws it may even be illegal.
Yeah, that’s one aspect that I’m not crazy about. At this point though, I’ve mostly given up on Google not knowing what’s going on with a public facing site’s traffic.
++
Privacy was my first thought while reading this article. One should try to not give Google more information than absolutely necessary.
That’s not true. Given that a client makes a request once a year if the browser keeps the caching contract, I would hardly call it intrusive.
Chances are your clients will be using google to search for stuff more than once a year.
It would still make the request to get the 304; it just wouldn’t incur a full data transfer. Google’s servers could still be logging those 304 requests.
Google currently sets an “expires” header of one year in the future on these files. If your browser has cached Google’s jQuery-1.2.6.min.js on the client side and you visit a new site that uses it within a year, the browser doesn’t even have to check for a 304.
If Google really were as prying as some of the more paranoid among us would suggest, then they wouldn’t set that expires header. They’d happily pay the bandwidth bill to log the 304s.
Thanks for this post! But why do you initialize your script with google.setOnLoadCallback() instead of $(document).ready()?
Because I still use $(document).ready() even when I load jQuery with Google’s jsapi, and it works well.
I’ve found that it depends on how fast google.load() loads jQuery and where your $(document).ready() is.
It would probably be of more value if Google also stored copies of frequently used plugins as well as the main jquery file.
Even with the delay, there is one advantage of the google.load() method.
Remember that SCRIPT tags block the download of other components and the loading of the dom. This blocking nature is primarily designed to support document.write() and other synchronous features of the language.
When a script is loaded dynamically, it is not a blocking download. (Document.write is also broken, but you shouldn’t be using that anyhow.)
So, even though the total time for jQuery’s load might be 200ms longer, if the jQuery loading takes a while, then your page is functional that whole time, rather than freezing up waiting for it.
If you’re just including jQuery, a mere 16723 bytes gzipped, it’s probably not too terrible. If you were loading lots of different scripts and modules, or if perceived load time was absolutely critical, then it could be more significant.
Isaac, that’s a great point. Depending on your existing page’s logic, switching to google.load() may or may not be an advantage if you have onload code that (un)intentionally depends on the blocking behavior of regular script tags. Definitely something to consider and test for if making the switch.
It probably bears more thorough testing, but I’ve found that google.load() exhibits the same blocking behavior as a normal script element.
I’m assuming (dangerous!) that google.load() works by injecting a script element via document.createElement(), which would be subject to the same blocking issues.
Hi Dave. But if there truly was blocking with google.load(), then this isn’t consistent with your response to Julien’s comment above where you said “I’ve found that it depends on how fast google.load() loads jQuery and where your $(document).ready() is.” when he asked about why you would need to use setOnLoadCallback() rather than jQuery’s $(document).ready().
I haven’t done any testing, but if there truly was blocking with google.load(), then I’d think there would be no reason you couldn’t use $(document).ready().
Try this, for example:
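Something along these lines (a sketch; site.js stands in for any external script of your own):

<script type="text/javascript" src="http://www.google.com/jsapi"></script>
<script type="text/javascript">
  google.load("jquery", "1.3.2");

  // jQuery hasn't been injected yet within this same script block,
  // so this line throws "$ is not defined".
  $(document).ready(function() { });
</script>

<!-- Yet this external script still waits until jQuery finishes loading. -->
<script type="text/javascript" src="site.js"></script>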
Attempting to access the jQuery object in the same block as the google.load will fail because jQuery hasn’t had time to load. Yet, if you watch in something like Firebug, the google.load() of jQuery will still block site.js until it completes loading (after the early $ access already threw an error anyway).
Just to underscore this point, this post has been receiving traffic from this search query:
http://www.google.com/search?q=%22%24+is+not+defined%22+jquery+google+ajax+api
So, people are definitely running into this issue in practice.
The google.load() function supports a callback parameter, to let you execute a function when the script has loaded. You should be using that instead for this sort of thing.
function jq_init() {
  alert("jQuery just finished loading");
}
google.load("jquery", "1.2.6", {"callback": "jq_init"});

Hi, thanks.
You’re right, but not for all conditions.
read my blog to know why you should not host it on Google.
http://tajary.blogspot.com/2008/12/1-reason-why-you-should-not-let-google.html
thank you
Any comments on this? Seems Google code might be blocked in certain countries. If so, that would be a strong argument against this method. Anyone have more info on this?
Alireza’s problem is due to the US embargo against Iran, as absurd as that is.
Anyone knowledgable as to which countries are blocked from Google code due to this embargo?
Absurd and ridiculous as it may be, it might be a production-stopper if you plan on doing business in any of those countries (and not already hindered from doing so if your company resides in the U.S).
Once again, good point: this should NOT be done. I would rather 100,000 users have .5 sec of extra load time than block a single user. Point made.
It’s like .. I want it to be a good idea .. I’m an optimization freak, but it practice, this is just over engineering that doesn’t help.
This is a good piece of code for testing. It demonstrates that while loading jQuery via google.load(), the browser will continue to execute/process the page until it hits another piece of external content to retrieve. I do get the “$ is not defined” error.
In contrast, there’s no error when using a script tag for jQuery since the browser completely halts execution of the page until jQuery has fully downloaded and been parsed.
But with this, as Isaac pointed out, there may be a small delay in the page fully loading compared to google.load(). Assuming all your onload jQuery logic is currently wrapped up in $(document).ready(), switching to Google’s setOnLoadCallback() method seems safe to do. If Google’s CDN serves jQuery quickly, I don’t think I’d be worried about just using the script tag and avoiding google.load().
It’s all very well arguing the point of using Google’s setOnLoadCallback() to initiate your code while not hindering page loads, but at the end of the day it is quite likely that you’re calling a third-party jQuery plugin locally, and the fact that this WILL be done using a script tag will no doubt result in the script loading fully before the DOM continues to load.
What’s the recommended pattern for lazy-loading the jquery framework (and associated plugins)? I am thinking of scenarios like webparts, where you require the use of jquery, but don’t want to cause another http request and subsequent parsing of the same library.
You could use something like this before first use of jQuery in each of your webparts:
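A minimal sketch of that sort of guard, reusing the typeof check shown in the comments above (the version number is just illustrative):

<script type="text/javascript">
  if (typeof jQuery === 'undefined') {
    // jQuery hasn't been loaded by another webpart yet, so pull it in once.
    document.write(unescape("%3Cscript src='http://ajax.googleapis.com/ajax/libs/jquery/1.3.2/jquery.min.js' type='text/javascript'%3E%3C/script%3E"));
  }
</script>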
I’m not sold on the benefits of Google-hosted jQuery.
What’s your assumption re your visitors’ browser cache settings (on/off)?
If a majority of your visitors have a clear cache then sure, CDN hosting of ALL objects would accelerate loads, but otherwise there are basic principles to improving client-side render performance without introducing a reliance on an external host:
- use minimized versions of jquery (remove white space, decrease file size)
- gzip (compress whats left to send over wire)
- apache mod_expires (cache for X timeframe)
- host jquery on separate host (deliver static objects and dynamic by separate hosts)
- locate JS files at bottom (prevent script blocking)
- Load all required JS files early in the visit process; don’t double up. Static object loads shouldn’t be coupled with query responses.
Also worth noting: once you start a browser session and your browser either confirms the objects’ statuses or re-downloads them, you won’t notice a benefit for the rest of that session because the local cache is used.
Even if you ignore the caching angle, Google’s CDN is going to serve files faster than you will be able to. Their CDN is one of the best in the world. Users potentially hitting the page with it pre-cached is just icing on the cake.
Sure, CDN hosting all static assets is even better, but how many sites really do that? The vast majority of sites in the wild use no CDN.
Even for sites that do use a CDN, using Google’s jQuery is an opportunity for cost cutting and potentially increased parallelism (assuming more than just JS is distributed via CDN).
* gzip (compress whats left to send over wire)
Totally in agreement on this. The difference between compressed and uncompressed scripts really matters a lot.
Check the headers. Google’s hosted versions are all compressed, minimized, everything. They’ve basically taken it upon themselves to optimize the hell out of these javascript libraries for the fastest possible hosting of them, with far-future expires headers and everything else.
Using these will almost always be faster than not using them, in virtually all cases.
…but what if the user has this fine plugin named “NoScript”?
NoScript won’t load scripts from other domains by default – the user has to accept them. :(
Are there possibilities like Zach said?
Excuse my bad English, and greetings from North Rhine-Westphalia,
mathias
btw, this is a great article!
This is interesting. What if you are using custom jQuery plugins? Can you still use this method?
Sure. It’s just like using a local copy of jQuery, only faster.
You know all those sites that fetch jQuery from Google, Dave? Well, 9 times out of 10 you will see calls to ajax.googleapis.com that take ages to reply, so I think that using a different host to fetch jQuery is slower – overall – than including it from the server that one is already connected to.
I’ve always found the opposite to be the case (Google serves it faster than my servers can). Do you have any examples of sites that hang on the dependency?
I’ve seen this with slow DNS servers.
Regarding the “$ is not defined” issue when loading using googles ajax method. I use the following and have had 0 issues with it so far…
google.setOnLoadCallback(init);

function init() {
  $(function() {
    // initialise…
  });
}
So, uh, Google’s having *serious* latency issues (at least from my part of the world), and large and small sites alike that rely on the hosted jquery are hanging completely. Awesome! Granted, this will hopefully happen only rarely, but…
And that’s exactly why I don’t use Google for my jQuery stuff. Local is best. I started seeing issues with Gmail from yesterday onwards. Who knows, maybe their maps no longer reduce (or the other way round). Again, a CDN is nice in theory, but don’t think that it beats local (well, depending on most sites’ target audience).
Looks to have more to do with network routing and less with Google: http://blogs.zdnet.com/BTL/?p=18064
None of my sites were very affected by this. Remember that the CDN hosted files are served with a +1 year expires header. Returning users don’t even require a 304 ping.
It would be a problem for new users, but how many new users (who don’t have access to Google search to find you) affected by the routing issues are you likely to bounce in that couple-hour window? It’s a pain, but nowhere near a catastrophe.
One big surprise is that with Firefox 3 fetching jQuery 1.3.2 from Google, I get a 200 status code every time instead of 304.
So there is no caching benefit, just the benefit from the CDN.
Can anybody confirm this ?
thanks
Because it’s served with a far-future expires header, you’ll only see a request if your cache doesn’t contain a copy of the file. For up to a year after it’s cached, no request is made at all when the browser encounters a reference to it, not even to check for a 304.
So, the only time you’ll ever see a request, it will have a 200 response. You shouldn’t see subsequent requests (and 200 responses) though, unless something’s preventing your browser from caching the file.
Firefox’s disk cache is so ineffective that you should assume it doesn’t exist. See https://bugzilla.mozilla.org/show_bug.cgi?id=559729 and its many, many dependencies, especially bugs 175600 (limit of 8192 cached items), 193911 (space limit of 50MB), and 290032 (some files are never cached due to a shitty hash function – may have been fixed in FF3.6).
FF4 may be better – at least, some of those bugs have been marked fixed – but I would want to see test results.
What you mention is true for jquery-1.2.6 but not for 1.3.2
Try the simplest page and check with firebug. If you reload the simple page with 1.2.6 you will see a 304 status code but a 200 status code for 1.3.2:
I think the issue is that the response adds a http header Age not zero which is not included when you request 1.2.6.
Try it! I was quite surprised !
It’s normal to see a 200 response on the first request.
It would be abnormal to see a 304. With a far-future expires header, the browser shouldn’t even be pinging for a 304 if it has it cached.
When it’s being properly cached, you shouldn’t even see it appear in the Firebug net tab.
Sure, the previous post skipped the basic html. I compared a page with jquery 1.2.6 and another with 1.3.2. The page with the old jquery caches properly but not the one with 1.3.2.
At least with firebug 1.3 I see the 304 because if you look at the header responses it returns a Last-Modified, which takes precedence over expires
It takes 5 minutes to do this test in firebug but I assume you haven’t even tested what I am saying.
Your script snippet in Back to Basics references 1.3.2, so you should update the article. I will also try with IE and report the findings.
I use the Google hosted 1.3.2 on several sites. I double checked them this morning, after reading your comment, to make sure it’s still caching properly in Firefox.
It is for me.
The browser never makes a request for jQuery (1.3.2), not even a ping for 304, unless I clear my cache or force an update with a shift/ctrl reload.
I also verified it in Live HTTP headers and Wireshark. Firefox is using the locally cached copy and isn’t sending even a single byte over my connection when it hits a reference to http://ajax.googleapis.com/ajax/libs/jquery/1.3.2/jquery.min.js
Do you have a publicly available page that reproduces what you’re seeing? I’ll take a look at it.
Thanks for checking.
My issue is minor, I was refreshing both a page with 1.2.6 and a page with 1.3.2. The page with 1.2.6 checks the 1.2.6 link and gets a 304. The page with 1.3.2 gets a 200
Bravo, Google :)
If you happen to run a WordPress site, there’s a plugin that will easily do this for you, for all available JS libraries:
http://wordpress.org/extend/plugins/use-google-libraries/
Just install and activate. Operation is automatic, no configuration needed.
Thanks for sharing your knowledge!
If you are using OpenDNS and have set the protection level to highest, there is a good chance that Google will be blocked for your entire network. These are some wise words, though, on CDNs and general website performance.
Thanks a lot! You have opened my eye and now I am surely going to put the Google hosted jQuery on my blogger blog! :) thanks once again!
I’ve been doing this as a matter of course since Google first announced they were hosting the files. I’ve not personally seen any performance increase, as I always keep my pages as lightweight as possible in any case, but neither have I had any problems. I agree that this is a best practise missed by most, and it’s always worthwhile exploiting any optimisations you can!
Sorry, the first code should be:
google.load("jquery", "1.3.2");
google.setOnLoadCallback(function() {
//this worked
});
And the second block of code should be:
$(function() {
//This didn’t work on IE but it did on FF
});
I am considering using google api, but I have a question.
What is the likelihood that Google will host old versions indefinitely? I have made websites for clients for almost 15 years now and one of the oldest is still running today. I would hate to think that a site I make today would not run in 20 years’ time because the JS is no longer hosted on Google.
That’s a good question that I don’t know the answer to. My guess would be that they would continue hosting it as long as it was actively serving requests. There’s so little overhead in it, I’d be surprised if they went out of their way to break something being used like that.
Worst case, it’s a very quick search-replace to globally update a site to use a locally hosted legacy version instead of Google’s.
Hmm, not very reassuring. But I still would like to use it. Best solution I can think of is to check to see if jquery has loaded, if not load a local version?
Have any thoughts on how that could be done best?
You can do this:
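Roughly along these lines (a sketch; the local path is just a placeholder):

<script type="text/javascript" src="http://ajax.googleapis.com/ajax/libs/jquery/1.3.2/jquery.min.js"></script>
<script type="text/javascript">
  if (typeof jQuery === 'undefined') {
    // Google's copy didn't load, so fall back to a locally hosted copy.
    document.write(unescape("%3Cscript src='/js/jquery-1.3.2.min.js' type='text/javascript'%3E%3C/script%3E"));
  }
</script>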
Because script elements block rendering until they’ve loaded and been executed, you can assume if jQuery isn’t present in the subsequent script block that it has failed to load from Google and then react accordingly.
You confirmed my thoughts. Thank you very much for the help.
I’ve tried the check-and-eventually-load-locally method above; I intentionally used a bad URL for Google’s jQuery to see if the local loading of jQuery was really done. And indeed it was, but looking at the Firebug network tab, I see that other JS libraries requiring jQuery load before the local copy of jQuery, breaking any jQuery-dependent functionality on the page.
Here’s my test code:
Firefox loads the second library as soon as Google gives the 404 error, and then loads the local copy of jQuery after some other resource files, such as CSS or GIFs.
Do you know a way to correct this behaviour? Am I doing something wrong?
Thanks!
Sorry, I’ve lost some piece of code. Hope this works:
if (typeof jQuery === 'undefined') {
  document.write(unescape("%3Cscript src='js/jquery-1.4.2.min.js' type='text/javascript'%3E%3C/script%3E"));
}

Thanks for your advice,
keep it up
Yeah, I just found out from another programmer out there that this is definitely better than hosting the jQuery script yourself. It was nice to see exactly why it is better after reading your article.
Thanks a lot.
It would be nice to use Google’s CDN but as google is blocked in some countries and some networks … Anyhow, it is a good idea to use some CDN, perhaps build a very own to increase parallelism and improve caching. What about latency? well, we are talking about 20kb of jquery minified and gzipped, just make sure to use less images and optimize their size :)
Thanks Dave, Encosia hasn’t been in my feeds for very long, but it never fails to teach me something new and interesting. I like this post and I adore the discussion opened up by the readers; you could add the pros & cons that have arisen so far to your post, that would be a good summary :)
thanks again and keep it up!
I have an ASP.NET app that dynamically combines the various scripts (jQuery, plugins, etc.) – a total of ~110Kb, combined and gzipped.
When I serve jQuery combined with the other scripts (1 script tag), latency is around 500ms.
When I used Google’s CDN-hosted version of jQuery (1.3.2), Firebug shows latency of around 828ms for the scripts in the net tab.
All tests are done on the local VS development server.
What do you think? Will these results change when I host my application on some online server?
Yes, you should expect to see the extra HTTP request be much more significant in a local setting. That won’t be the case once deployed to a live server.
I personally have other stuff within my “jQuery” file, like jQuery Tools and JSHTML, which I use on almost every page where I might use jQuery.
So that’s one reason I wouldn’t use it, since you can’t have any other files in it. Also, it’s yet another HTTP request, so if you’re already loading other stuff, why not have jQuery be loaded in there as well?
I can see myself using this for stuff like tutorials though, where all I would need is jQuery.
Thanks for the article! Firebug tells me, Google’s response is about 50 ms (jquery) and 150 ms (jquery ui) faster than my remote dev machine. It’s also gzipping, so I don’t have to take care of that. Seems to me like there is an advantage on using Google’s hosting the way you suggest. I also used your fallback code, thanks for that!
Although – right now I’ve got only 4 JS files, which might double or triple by the end of development. From past experience I still suspect that nothing is faster than combining and compressing all JS into one file (as well as CSS). As far as my limited experience goes, this is always the fastest solution, although it’s more tedious to maintain: the process has to be repeated for every update to any JS or CSS file. One could still use a CDN though, by just buying some cloud space. It should not be too expensive per year for a couple of text files which are often cached.
The fallback code works like a charm if remote JS files are blocked by NoScript. That leaves only the “user lives in Iraq”-scenario, unfortunately. Maybe a country specific IP range can be considered.
For https I’m thinking it would be possible to use the Analytics code:
var gaJsHost = (("https:" == document.location.protocol) ? "https://ssl." : "http://www.");
document.write(unescape("%3Cscript src='" + gaJsHost + "google-analytics.com/ga.js' type='text/javascript'%3E%3C/script%3E"));
Obviously changing the URL. Not tried yet, but it should work.
It is good for most shared hosting users to offload extra scripts and resources as much as possible, but when you have a low latency, high bandwidth server, you are just as well off since you no longer require an extra DNS request, which can waste lots of time.
Only if you assume that all of your users are closer to your server than they are one of Google’s CDN edge servers and that the user doesn’t have the Google CDN copy of jQuery already cached (which would lead to the browser not making a DNS lookup or HTTP request at all).
Was doing some research into this trying to pick whether to use
jsapi+google.load or the direct library path. I noticed that with the former
method, the js lib comes with a 1yr-in-the-future expiration date while with
the latter, a 1hr. Here are the (relevant) HTTP response headers from FireBug:
using google.load():
Content-Type text/javascript; charset=UTF-8
Last-Modified Thu, 14 Jan 2010 01:36:01 GMT
Date Fri, 22 Jan 2010 20:27:03 GMT
Expires Sat, 22 Jan 2011 20:27:03 GMT
Content-Encoding gzip
Cache-Control public, must-revalidate, proxy-revalidate, max-age=31536000
Age 176
Content-Length 23807
using straight url:
Content-Type text/javascript; charset=UTF-8
Last-Modified Thu, 14 Jan 2010 01:36:01 GMT
Date Fri, 22 Jan 2010 20:33:37 GMT
Expires Fri, 22 Jan 2010 21:33:37 GMT
Content-Encoding gzip
Cache-Control public, must-revalidate, proxy-revalidate, max-age=3600
Age 161
Content-Length 23807
Turns out, the confusion is that this reference:
http://ajax.googleapis.com/ajax/libs/jquery/1.4/jquery.min.js
Is not a reference to 1.4, but the latest 1.4.x release. Since it would defeat the point of a “latest” reference to cache it for a year without requiring a 304 check, they use shorter expires headers on those.
Using this reference:
http://ajax.googleapis.com/ajax/libs/jquery/1.4.0/jquery.min.js
Supplies the +1 year expires header as expected.
Paul Irish has more info on that here: http://paulirish.com/2009/caching-and-googles-ajax-libraries-api/
This is a great article, but when I try this approach the Intellisense will not work. Is there a workaround to fix that?
Assuming your code is in .js includes, download the vsdoc and place this at the top of your includes:
/// <Reference Path="~/path/to/jQuery-1.x.x-vsdoc.js" />For Intellisense inline in ASPX pages, you can use this trick:
It will never be rendered to the page, but Visual Studio will still parse the file for Intellisense.
Ok I tried this but I don’t seem to get it to work. Here is what I did
Did I put it in the wrong place? As long as I don't reference the Google page my page works but as soon as I point to Google - no intellisense.
Thanks for your help.
I don’t have VS2008 handy to test with, but that code does give me jQuery Intellisense in inline code blocks in VS2010.
@Terry: I had the same problem.
This worked for me on VS2008:
http://nitoprograms.blogspot.com/2010/02/jquery-hosted-on-googles-cdn-with.html
This method would be great if everyone used it. The problem is that only something like < 5% of sites use it, meaning the extra DNS lookup to Google costs more time than just hosting the file yourself. Since you are only pulling one file from Google, you never beat the initial DNS overhead that using their CDN costs.
Having total control of your source code outweighs all of the advantages listed and discussed above. Why should you rely on someone else, even Google, who may have an impact, beyond your control, on your company/business website?
For a personal website I can see the benefit, but not for company/business websites. Foolish thinking, not practical.
Unless you planned on modifying jQuery.js, you still have total control of your source code. If something were to happen, you could switch the reference to a local copy in minutes. It’s no less practical than using any other CDN, which business sites have been doing for years. Unfounded paranoia is no reason to slow the web down as a whole.
The difference is that the Google CDN for javascript libraries doesn’t offer an SLA.
Automatically falling back to a local copy isn’t difficult (see the examples of that throughout the comments here). In the bigger picture, I tend to agree with Nate that SLAs are empty anyway. I’m less interested in who to blame during the .0001% than I am how to improve performance/user-experience during the other 99.9999%.
I think if you agree to an SLA which is ‘empty’ then absolutely, there’s no point to it. But if you properly set an SLA with your hosting provider which guarantees your servers, with fines etc for any problems, then to bring a key part of your page engine away from that definitely negates the whole purpose and would not be justifiable in terms of business accountability.
Switching back to a local copy is your solution if something happens. Then why bother with that headache in the first place? Besides, your suggestion shows that you did not consider the business implication. Redirecting your customers to Google equates to Google accessing your customers’ information. Furthermore, it is practical to point out that jQuery is only about 70k. What do you actually gain? NOTHING.
Therefore, it is foolish and ill business practice, still.
Falling back to a local copy is automatic. Why bother? Eliminating a quarter-second or so of page-blocking JavaScript is well worth understanding how to best use the resources available.
As long as the majority of sites are using services such as AdSense, Google Analytics, Google Maps, etc, I have a hard time buying the FUD and paranoia. Of all the services like that, using the CDN is one of the most benign, since the browser may not even contact Google again for a full year after jQuery has been downloaded once.
I think Billy Nguyen’s concern cannot be dismissed. He points out that there are differences between commercial and non-commercial websites, and that they must be considered, especially the ones about privacy, and especially if you’re gaining about a millisecond in performance. A business should have the possibility to invest in a safe CDN. Just as more and more businesses can’t rely on Google Analytics anymore, because then you’re sending certain 2-party data to a 3rd party (Google). In some countries, like here in Germany, this is starting to go to the courts already. The solution for this problem, too, is relying on software which can be hosted on the company’s own property.
On the other hand, since when everyone points to Google for hosting of files, they get cached a lot and Google’s servers are not contacted anymore, so it’s even harder to make “sense” of this info.
Why not do it the other way around for business sites? Use a local copy, and when that is not available, fall back to a 3rd party CDN. Seems like an unusual case (unless you do have your own CDN or CDN-like setup), but if it does happen, it seems like good service to your customers not to bother them with the tech problems.
Privacy on the Internet is a nice notion, but largely an illusion. Of all the actively invasive methods that advertisers and ISPs use to track users, focusing on the potentially once-per-year jQuery CDN accesses is a non-starter for me. If Google were really out to leverage the service for “evil”, they definitely would not serve the file with a +1 year expires header.
Sorry, but I seriously disagree with you here. This cavalier “well, we don’t have privacy anyway” attitude is the very reason that privacy is such an issue in this day and age. When they came for the trade unionists, I said nothing.
You are wrong to just dismiss these claims. At best, you are being disingenuous by not giving security in this matter an equal discussion. At worst, you’re ignorant to the real problems here. In a below response to John Sanders, you state that, “if Google’s CDN were compromised, you’d probably find that out much quicker than if your own hosting was.” I don’t necessarily agree with that. When you use the Google APIs, you don’t have an SLA with them. They are under no contractual obligation to report exposure to you.
More importantly in this day and age, and the real reason I take serious issue with your “this is the best approach for everybody, except maybe intranets” attitude, Google does not need to be compromised in order for the content to be exploited. Man-in-the-middle attacks are extremely easy to pull off, especially when the content is being served via HTTP mechanisms.
Here’s the real problem.
If somebody visits a website that uses the Google-hosted jQuery and their session is compromised just once, every single session they have on any and EVERY site they subsequently visit that uses Google’s jQuery is also compromised. If someone were to inject a modified jQuery to Joe Sixpack while he was browsing the web at Starbucks (or even as he passed by a rogue access point with wifi enabled on his phone or laptop – on a freaking plane, even), the response can be set to cache that content for decades. At which point, every single request to a Google jQuery-enabled site is now affected. The very caching that you espouse as one of your main three benefits also introduces a serious potential vulnerability.
This isn’t being overly-paranoid, this is being realistic. As long as we’re using plaintext protocols, depending upon an outside CDN to provide portions of executable code that are shared among thousands of sites is just an awful idea.
Everybody is free to make their own decisions, but you should really be fair here and give discussion to the fact that in order to enable “Eliminating a quarter-second or so of page-blocking JavaScript,” site managers are opening up a new potential avenue for attack. When explained in these terms, many businesses may make a different decision.
Parallelism is also a dubious argument. Anybody can enable parallel loading of JS content on their site without much difficulty. And if one is concerned about the two-concurrent-sessions-per-host browsers as you discuss, you can just as easily serve the content from a different hostname. Yes, it’s another DNS lookup, which adds latency (assuming they already have the Google DNS entries cached).
Everybody needs to make this decision on their own, weighing all of the benefits against all of the downsides. This is far from a “one size fits all” solution, and there are many factors to consider. Risk assessment is a real concern here.
You have a heavily-linked page on this topic. You would do well to treat the topic honestly and fairly. I’m sorry, but I don’t feel you are currently doing that.
Interesting. I doubt the benefit of users not having to download jQuery at all because it’s cached exceeds that of minifying all your JavaScript files, including jQuery, into one file and serving it from CloudFront, which is what I do.
Thanks for the article Dave!
What makes me most uncomfortable about this approach is security. If Google’s servers are ever hacked (improbable, but not impossible), then think about how many websites will be running malicious code. Who knows what kind of information a bad person could grab by reading your cookies, or scraping your screen. For this reason, I would only use this strategy for websites that contain no private user information.
Hacking happens.
John
That’s definitely a valid concern.
Of course, if Google’s CDN were compromised, you’d probably find that out much quicker than if your own hosting was.
Exactly. This is the primary reason I’m hosting my JavaScript libraries locally.
However, I’m starting to use Google’s hosted Web fonts, so maybe using their hosted jQuery would be no worse. Also, they’re probably better at protecting the integrity of their CDN than I am — I’m a developer, not a sysadmin.
Dave, have you seen this article?
http://zoompf.com/blog/2010/01/should-you-use-javascript-library-cdns
It says using CDN’s for javascript isn’t worth it – the DNS adds 1/3 second delay verses your own domain, and jQuery only takes 1/3 sec to download.
He also says only 2% of Alexa 2000 websites use a JavaScript CDN, so it’s unlikely the visitor has it cached already.
In short he says:
What do you think?
I don’t agree with their assessment about the likelihood of a cache hit. With such a wide gamut of high-traffic sites (like Time, SlideShare, ArticlesBase, Break, Stack Overflow, Examiner, Woot, and ArmorGames, to name a few) using the Google CDN, it’s becoming more and more likely that the minority of your users are the group that don’t have a local copy cached.
I totally agree D Yeager. 300%.
I tested this on my website and loading MooTools (well it’s not jQuery, but the situation remains the same) from Google CDN adds latency due to the DNS request. We switched to a locally hosted copy and the site is faster now.
My advice is to test before diving into such advice. It may not be as good as it seems…
I agree that testing is important to any optimization. For example, my browser took 678ms just now to download MooTools from your server, but only 91ms from ajax.googleapis.com.
Of course, that datapoint might be irrelevant if you don’t expect to reach a global audience with your particular site. It’s a good reminder of how effective geographically dispersed edge servers are though.
Yes, the location (geographically) of the user plays an important role here. Our server would certainly perform better in France and nearby countries, while you see a penalty from America. Your remark just made me check how many folks visit our site from the other side of the Atlantic. It looks like we have 4%, so it’s reasonable not to use a CDN.
I guess a very good option here would be to have those major libraries hosted locally, or even embedded directly in the browser, for instance. That would definitely save some time for everyone!
I’m also wondering how this could affect the new criteria in Google’s PageRank: site load time is taken into account. If the Googlebot fetches everything from America, that would be a load-time penalty for the crawler. Even more so if the DNS is cached: the CDN version would perform a lot better. Any thoughts on this?
I think I would rather site visitors accessed this locally on my servers than kicking off another DNS request. I think I’ll still run some tests to see if there’s really much difference. Maybe better to use the hosted version if you’re working with Google tools a lot such as Maps?
Cheers.
It's smart to consider the DNS lookup. However, it turns out that so many of the most trafficked sites reference the ajax.googleapis.com domain now that the DNS lookup is likely to be cached already at the browser or OS level for most users.
What happens when Skynet(Google) becomes self aware?
Is it possible to use Jquery with Google Sites?
I think I'll stick to serving scripts locally. I have a fast server, and when I ran tests the improvement was so minute that it was outweighed by the extra DNS request to another server. Still, it's a great post to highlight the point, and it will be better for most sites, I'm sure ;-) (lol at Jim Kennelly's Skynet comment)
Does it help make a decision knowing twice this month I’ve had issues getting Google Code hosted jQuery to work appropriately?
It’s hard to find information out there, but I know it happened.
Great article!
Awesome article. I was always against grabbing jQuery from Google for risk of it being slower, but this is great work.
I was doing this today, until I noticed that http://ajax.googleapis.com/ajax/libs/jquery/1.4.2/jquery.min.js was returning "Not Found" and would not load on my website. I reverted to a local copy; it is more reliable.
@James P.
If you agree to use jQuery hosted externally and your concern is the reliability of the external link, you can do a little check like this:
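A minimal sketch of that kind of check, assuming a hypothetical local path for the fallback copy:

<script src="http://ajax.googleapis.com/ajax/libs/jquery/1.4.2/jquery.min.js"></script>
<script>
// If the CDN copy failed to load, window.jQuery is undefined, so write in a local copy instead.
window.jQuery || document.write('<script src="/js/jquery-1.4.2.min.js"><\/script>');
</script>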
In this way you serve a local copy of jQuery only if the Google link fails.
Where would I place this backup code in case Google’s CDN was down and didn’t load? In my header.php or function.php?
Neither. JavaScript doesn’t belong in PHP or HTML files.
Seems like Google is moving away from direct inclusion and wants you to get an API key. Using the API method, would the js file still be cached?
To be more specific, will this still be cached?
Scripts loaded through the jsapi loader are cached.
I still prefer the direct reference for pulling a single script off the CDN. It only requires a single HTTP request, where jsapi/google.load() results in two.
The place where using google.load() makes sense is when you’re pulling several scripts off their CDN. Then, its dynamic script injection will probably be faster than static script references.
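For reference, a rough sketch of the two approaches (version numbers are just illustrative):

<!-- Direct reference: a single HTTP request (or none, if the file is already cached) -->
<script src="http://ajax.googleapis.com/ajax/libs/jquery/1.4.2/jquery.min.js"></script>

<!-- jsapi loader: one request for the loader itself, plus one per library it loads -->
<script src="http://www.google.com/jsapi"></script>
<script>
google.load("jquery", "1.4.2");
google.load("jqueryui", "1.8.2");
</script>

The loader approach only begins to pay off when several libraries are pulled through it, as described above.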
I would do this if it weren’t for the fact that I’ve corrected bugs inside jQuery that would otherwise cause errors to be served to my clients.
Thanks for the handy article. :)
Great article! Thanks for sharing.
Great article dude
It's a great idea; recommended for everyone. Pages load faster and bandwidth is saved.
Looking for your donation button. Finest decision support via comments that I have ever seen.
Hi Dave,
I only receive gzipped jQuery from the Google and Microsoft CDNs when I use "https://"
and not when I use "http://" in Firefox. I have not yet tested in other browsers.
example: “https://ajax.googleapis.com/ajax/libs/jquery/1.4.3/jquery.min.js” – gzipped
“http://ajax.googleapis.com/ajax/libs/jquery/1.4.3/jquery.min.js” – Only Minified not gzipped.
Thank You,
Vishal Astik
It’s coming in gzipped for me, as of a few minutes ago:
It’s possible you’ve got a proxy somewhere between you and the CDN that’s altering request headers on regular HTTP traffic but not SSL. Do you see gzip or deflate in the Accept-Encoding line here? http://www.xhaus.com/headers
Hi Dave,
Thank you very much.
You were right; the problem was due to the proxy.
When I try it from home, everything works fine.
Thank You,
Vishal Astik.
Hello Dear,
I have tried to make a jQuery plugin for dynamically loading HTML. Please refer to the following link.
http://aspnet-ajax-aspnetmvc.blogspot.com/2010/10/dyamic-html-loader.html
Please suggest how I could improve it further.
Thanks,
Mohan
If you’re following this advice, you are doing it wrong.
There is one single reason that outweighs the three given in the article:
Google will track and save information about all of your site visitors, without them even noticing. By using the APIs, you just help feed Google's databases.
And if some alert users have blocked scripts from Google for exactly this reason, they will not be able to use your site.
It is completely naive to think that Google hosts these APIs for the good of mankind. It's all about harvesting personal data: who visits which site at which time. Using the APIs surrenders your unsuspecting visitors to Google. Don't do that.
Luckily, that’s not very realistic. The CDN’s files are served with a far-future expires header, which dramatically minimizes how many HTTP requests are actually made to Google’s servers – exactly the opposite of how they would configure it for “evil”.
I developed a site for a client that is available worldwide (in languages I don't speak), and ran into issues with people in other countries not being able to load the jQuery plugins from Google. I was not able to get exact error messages. We switched to hosting the jQuery files locally and it resolved our issue… for what it's worth to anyone.
If you are using type=’text/javascript’, you are doing it wrong.
In 2008, when this was written, no one was using an HTML5 doctype. In HTML 4.01 and XHTML 1.0 [Transitional], script types are required. For that matter, they’re optional in HTML5, not forbidden.
Right, consider that an update comment for future readers. Optional as in ‘you could put anything there and it would still be ignored’.
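For illustration, the same reference with and without the attribute (the version number is arbitrary):

<!-- HTML 4.01 / XHTML 1.0: the type attribute is required -->
<script type="text/javascript" src="//ajax.googleapis.com/ajax/libs/jquery/1.4.2/jquery.min.js"></script>

<!-- HTML5: the type attribute is optional and defaults to text/javascript -->
<script src="//ajax.googleapis.com/ajax/libs/jquery/1.4.2/jquery.min.js"></script>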
I have a hard time believing Google is not interested in logging the IPs. They are willing to give anything away for free as long as they get a ping. Last time (a few years ago) I looked at the Google Analytics JS drop-in, it had a cute little one-pixel image wrapped in a noscript tag. PING.
Google may not have that much more market share yet as far as search goes, but they saturate the web with their 'free services' that track nearly every site you go to. Don't fool yourselves. It is not benign.
Personally, I find it unethical to host 3rd-party content on a site. When a user types or clicks an address, they should be able to expect that is where they are going. There should not be 15 different hosts hiding in the background JS.
Constructively, this could be avoided if there were a way to include a hash of the file along with the path in the script tag. It would then be obvious to the browser that the file was indeed the same library it had already loaded, and it wouldn't bother with it again. Very simple, without having your every click tracked by big G.
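A rough sketch of that idea, essentially what browsers later standardized as Subresource Integrity; the hash value below is only a placeholder for a real base64-encoded digest of the exact file:

<!-- The browser verifies the downloaded file against the hash before executing it. -->
<script src="https://ajax.googleapis.com/ajax/libs/jquery/1.4.2/jquery.min.js"
        integrity="sha384-PLACEHOLDER_BASE64_HASH_OF_THE_EXACT_FILE"
        crossorigin="anonymous"></script>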
Since the AJAX Libraries CDN serves content from a cookie-less domain, with a +1 year expires header, it’s extremely poorly suited to any sort of per-click tracking. The idea that AdSense or Google Analytics tracks you from site to site is a legitimate concern, but this CDN is configured to optimize performance at the direct loss of “trackability”.
Thanks for this great article and the good trick ;)
This is a great article. Thank you for taking the time to write it. The commenting area is also very valuable.
Regarding this statement in your article:
While this is indeed a nice trick, I wanted you to know that it did not work for me and probably does not work, period. When I include the link to the jquery without the HTTP that way, jquery simply isn’t loaded.
Thanks again.
Ricardo
It definitely works. The one place you’ll run into trouble is if you’re editing/testing a file locally on Windows, without using a development server. From a page opened at file:///yourPage.html, the protocol-less reference will attempt to load jQuery from your local filesystem instead of using regular HTTP.
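For reference, the protocol-relative reference in question looks something like this (the version is illustrative):

<!-- Resolves to http:// or https:// to match however the page itself was loaded,
     which is why it breaks on a page opened straight from the filesystem (file://). -->
<script src="//ajax.googleapis.com/ajax/libs/jquery/1.4.2/jquery.min.js"></script>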
I installed a plugin called “Use Google Libraries” that did all the work for me :) however, although it works, I find that it didn’t provide benefit #3 that you describe. For some reason, my website calls jquery 2x from Google’s CDN (probably some other plugin makes the call), and the site attempts to download it. Do you believe that if I hard coded my site like you recommend that it would stop the 2nd call? Or do I need to get rid of that plugin (assuming I can find it, I’ve got 19 installed).
Thanks!
That sounds like either two separate plugins are injecting the jQuery CDN reference or maybe you’ve got the “Use Google Libraries” plugin in addition to a theme that has the hard coded link already included.
I’d say try disabling the “Use Google Libraries” plugin first, and see if there’s still a single request being made for jQuery on the CDN. Also take a look at footer.php and header.php in your theme to see if jQuery is already being included there. If so, you shouldn’t need the plugin.
Good article, thank you.
This quote is actually not correct, although it may have been changed since the article was written. The Google CDN site currently says:
and
Great article!
I’m glad to see they changed that. Thanks for pointing it out.
Dear encosia,
It is very hard to remember the path to jQuery on Google, so every time I want to include the Google-hosted jQuery, I search for "jquery on google" and your blog always comes up on top.
I have done this about 25 times now, and every time I have to scroll all the way to the bottom of the page to get this bit.
I am sure a lot of people do this. Please insert a similar text right on top of the blog for people like me.
Thank you very much.
Done.
Excéllenté
Thank you very much for listening; this ought to save a lot of time for me in the future :)
I have used Google-hosted jQuery on my blog. It works like a charm; thanks for sharing the valuable trick.
We were doing this on a client ecommerce site, and as unlikely as it sounds, the Google-hosted jquery *did* go down. Even going down for a short time means a loss in revenue. We don’t use Google’s hosted jquery anymore.
I visit this site a couple of times a month just for the ajax.googleapis.com link. Saved me lots of time. Thanks!
*Never ever* make yourself dependent on third parties if you are serious about your website and you have alternative solutions. Google's servers in particular are unreliable in my experience (I am getting server errors all the time, for instance on Google Groups, and emails on Gmail have simply disappeared or not been forwarded in the past).
Thomas
This is an amazing tutorial, sir. Thanks a lot; it's very helpful for beginners like me. I just want to ask one thing, though: after finishing the jQuery code and referencing the Google-hosted copy, how would I install the code on my site and make it work there? I am using the Blogspot platform. I already have the code for my particular purpose; I just don't know how to install it on my site.
Please help me, sir. Thanks and more power to you.
Just wondering why, if I leave the https off (//ajax.google.apis.com/…), jQuery fails to load. I just end up with a "$ is not defined" error.
That’ll happen when you load the page from your local filesystem (as opposed to using a development server and localhost-based address), since the page’s base protocol isn’t HTTP or HTTPS.
I prefer not to use scripts hosted by other parties. In the event that Google is unavailable (which rarely happens), our web pages might break, or might even expose a security hole because the library cannot be loaded. IMHO it's better to put it on our own hosting. If our hosting is down, then no one can access the page at all, instead of reaching a page with a security problem.
Wait, did you just say the absence of jQuery might leave a security hole?
It looks like you are just making things up out of thin air. I mean, seriously, who depends on JS for security? Take it out of there and put it where it belongs. And please stop worrying that Google going down would leave you without the JS that secures your website.
Yes, it's true we don't depend on JS for security. Yet if a website uses JS for some of its processes and it breaks because the JS is unavailable, it might show some data that isn't suitable for the public to see.
And how will you be sure that Google or another script provider doesn't add some other malicious script to their hosted copy?
http://www.networkworld.com/news/2007/040207-javascript-ajax-applications.html
and
http://www.clerkendweller.com/2011/1/7/Widespread-JavaScript-Vulnerabilities
If you’re relying on JavaScript to hide data that’s not suitable for public visibility, you’ve probably already been compromised and don’t know it yet. JavaScript can’t hide anything from view source or even a simple wget.
That said, you can use a fallback approach (linked on this page several times) to automatically load a local copy of jQuery if the CDN’s copy fails to load.
Actually, I'm not hiding any data, nor reading data using JavaScript; I never do that in my projects.
CMIIW, but I have seen several websites that look fine, yet displayed that kind of data when I disabled JavaScript in my browser. That's what I pointed to as the security hole. Maybe this is a different context, though :)
Anyway, I agree on the fallback approach when using a CDN. By the way, do you think there's a file hash checker to make sure the file on the CDN is the same as the one on the jQuery website?
Cheers
I've switched from using a specific version of jQuery (say v1.6.2) to a generic version (i.e. v1), which is also possible with the script tag:
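Something like this, assuming the version-less paths the CDN offered at the time:

<!-- Pinned to an exact release -->
<script src="//ajax.googleapis.com/ajax/libs/jquery/1.6.2/jquery.min.js"></script>

<!-- "Latest 1.x" reference, served with a very short cache lifetime as noted in the reply below -->
<script src="//ajax.googleapis.com/ajax/libs/jquery/1/jquery.min.js"></script>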
Even if you trust that new versions of jQuery won’t break your site (and the last two major versions had breaking changes that affected many sites), you shouldn’t use the latest version reference for performance reasons. In order to ensure that a browser seeing that reference will always use the latest version, that reference is served with an extremely short expires header. So, you lose not only the cross-site caching benefit, but for repeat visitors this aspect of your site will be even slower than if you’d self hosted jQuery with proper expires headers yourself.
Excellent points. I used an old version of jQuery for some sites and discovered later that there were errors with IE6. These were solved in a more recent version of jQuery. Hence I thought it might be a good idea to let it update automatically. For these sites, performance wasn’t too much of an issue, but breaking the site would be of course. Thanks for the pointers.
This shows what a great topic this is, in that it has spanned over two and a half years! For me personally there is still an element of suspicion and concern in using a third party and relying on them, no matter how big or dedicated they may be. I guess where revenue and performance are concerned, however, this is a deal breaker. Just to be contradictory, the idea of automatic updating does appeal, which I assume the hosted option would provide?
You should avoid the auto-updating usage, because it also comes with a very, very short expires header which hamstrings caching for repeat visitors. It’s also somewhat dangerous to assume that jQuery updates will not break old code these days – jQuery 1.5 and 1.6 included breaking changes (in the interest of the long-term good).
I’m not so sure that I like the idea of Google or anybody else having the ability to track exactly who is requesting my site’s pages. Since every request for the API libraries is accompanied by the key, then Google knows the site being accessed, and can also know who the visitor is and what page he has come from. I’m not paranoid, nor (I hope) are my visitors, but still giving someone that much information on my site and its user’s activities gives me something to think about.
Don’t worry much about that. Since Google’s serving the script with a +1 year expires header, they’re committing to a best-case scenario that’s extremely sub-optimal for tracking your users. With the huge cross-domain caching potential that the Google CDN has accumulated at this point, there’s a good chance they’ll never know that your users hit your site because the user will load jQuery from disk and not even need to make a request for a 304 result.
Great post. I only just got setup with jQuery today and am amazed by the great results you can get with such little code. I implemented the datepicker into my site which used a fraction of the code of my old JavaScript one and has much improved validation. I didn’t really know if I should host the files myself, but one quick Google search and 5 mins later and I now have a good understanding. Thanks!
It’s amazing how the comments have been going on for so long. I too came here in search of the quick link, thought it would be quicker than opening finder and grabbing a copy from one of my sites. Thirty minutes later and I’m not so sure it was quicker.
While I'm here, I figure I might as well put in my few cents, as there are clearly a few designers and web developers on here looking for real clarity on the matter.
For those who just skip to the end for the answer, here it is: hosting locally is always a better choice than using Google to host one jQuery file.
The most basic of reasons is control, which you lose by allowing someone else to add whatever they choose to your site. In addition, you are at the mercy of their network. Can you have faith in their abilities? Sure, nothing wrong with that. Should you base a business on letting an easily mitigated risk go on? No, no you shouldn't.
Looking for the technical side of it? Excellent…
To start with the most obvious reason: no matter whether you have jQuery hosted on Google or your own server, all subsequent requests are pulled from cache, meaning any suggested benefit of using Google is pointless after users have been to your site. If your site only loads once, you may have a point, but I know from experience that my clients like users to get past the home page, and the majority do.
Secondly, using the Google version results in a DNS lookup. Sure, your computer may have that cached, but if you're relying on the assumption that it already exists, you might as well rely on the assumption that it doesn't, since assuming it doesn't allows you to reduce a risk.
From a CDN standpoint, you normally look to use one when your customers are located across vast distances and you have a large amount of traffic. If this is the situation you find yourself in, then using a CDN for your own assets would be more beneficial than using one for just a single small jQuery file.
Additionally, users won't always get the closest response from a data center. For me, I get sent up to Mountain View, CA even after first hopping to the data center in LA. It's not a perfect science but more of a best guess, and guessing should have no part in logical conclusions.
As such, I don't see any valid use in allowing Google to host a jQuery file for me or the companies I work for. At best some users may experience a noticeable benefit (say, your customers in Montana accessing with their free month of AOL). The potential gains can't offset the downsides associated with that practice… it's good sense, in my professional opinion.
But by all means, don't take my advice; live and let live, I say. But if you build and maintain enough sites for enough time… you'll realize this was sound advice.
Cheers!
Dev Head
I am a little curious about why no one here discusses your user’s privacy. Google doesn’t make this web space available for altruistic reasons, they are a business whose model is to gather as much data about everyone as possible and sell access to that information to the highest bidder. When a user pulls javascript from GoogleAPIs, this allows Google to track who is pulling the script and for what website it is being pulled. This is a valuable piece of information to add to all of the other information in the user profiles.
I (as a user) mistrust any website that requires me to make a connection to Google’s servers (which we firewall at the router), and have gone so far as to close bank accounts where the programmers have been lazy enough to rely on Google instead of serving their own javascript. Admittedly, most users don’t care about being tracked (there’s a never-ending argument between the young and the old!), but for those of us who do, relying on GoogleAPIs is a strong signal that the website programmers, and by extension the company owning the website, have no regard whatsoever for user privacy.
That’s been discussed a few times throughout the comments. Due to the far-future expires header and cookie-less domain, Google could only gather extremely spotty tracking data if they tracked it all. Many, if not most, of the script references to Google’s CDN don’t result in any HTTP request to Google’s servers, and they lack tracking cookies when they do.
Compared to the even more ubiquitous services like AdSense and Google Analytics that do actively track viewers, the public CDN is harmless.
Compared to the even more ubiquitous services like AdSense and Google Analytics that do actively track viewers, the public CDN is harmless.
That's a silly argument; compared to Gmail, the CDN is harmless, too (not to imply Google tracking is harmful). But I don't use Gmail (hell, our mail server won't accept connections from Gmail machines). I don't allow connections to google-analytics.com. And none of that is relevant. We are talking solely about the possibility of combining website and user through GoogleAPIs, IP addresses, and existing Google cookies, which is possible.
You say they receive, “extremely spotty tracking data.” Since Google has never touched any data they didn’t mine and maintain, I don’t personally want to give them anything, even what you consider “spotty.”
Please don’t take this the wrong way, but if this weren’t profitable for Google in providing them additional data to mine, they wouldn’t be doing it. Google is a business, whose only product is information; they are not a not-for-profit giving things away, and regardless of how most geeks see them, they are not altruistic. Gmail, Google Maps, all of the services are designed to acquire data they can sell. That isn’t “evil,” but I prefer not to participate.
If you as a website require me to load data from any Google server, you are requiring me to participate in G’s tracking whether I wish to or not, and this makes no sense particularly if you are a service that should be providing privacy, like a bank or a domain registrar. I am actually changing registrars because my current one requires me to accept Google AJAX files for only one object; the verify code for the credit card. They “require” me to accept that JS file. I won’t, so I can’t pay them. How seriously stupid is that?
The problem isn’t Google’s CDN. It’s that your credit card company doesn’t offer a fallback when you block connections to the CDN. There’s no reason that both can’t coexist. When you loaded this page, it first attempted to load jQuery via the Google CDN too, but if you blocked that connection then it resorted to loading a local copy. Any site can do that in one line of code. I totally agree with you that it’s not necessary for them to require you to allow a CDN connection to use the service at all.
You’re drastically overestimating how useful IP and referrer data is when you only receive it in up to one year intervals though. With most connections coming from dynamic IPs or from behind NAT, a sporadic sampling of logged IPs without tracking cookies is worse than worthless. Trying to integrate that into the data collected by cookie-tracked services like AdSense, Analytics, and Gmail would only degrade Google’s ability to track you, not improve it.
Google does a lot of things that don’t directly add to their bottom line. From their contributions to charity to things like the Summer of Code each year, they certainly spend a lot of money that only benefits them by improving the web platform in general (so they can continue to serve ads on it) or by buying them goodwill. I can be as cynical as the next guy, but sometimes there aren’t monsters in every shadow. It makes perfect sense that they’d host these common JavaScript libraries on their CDN in order to help speed up the web and improve their advertising business in the long run.
The problem isn’t Google’s CDN.
I agree. And so there is no misunderstanding, it isn’t my credit card company but rather my domain registrar who is requiring the code, only in the payment screen, and only for the CVV entry object – everything else works on their site without problem, except that one tiny entry field. Still, it prevents me from paying them, so I am transferring my domains. (*shrug*) Most people would just bypass their firewalls or use a proxy, I suppose, but I can’t trust a company that’s using an allegedly secure connection and then offloading connections to a non-trusted third party without informing the user.
You’re drastically overestimating how useful IP and referrer data is when you only receive it in up to one year intervals though.
You are drastically underestimating the number of times user browser caches are cleared (in my case multiple times per week), browsers or systems reinstalled, et al. But the level of usefulness is irrelevant in any event; I do not believe Google has ever acquired any data that it didn’t maintain and eventually put to use, so even a small amount of usefulness is something I prefer not to provide.
(*sigh*) We will need to agree to disagree. While it is impossible to eliminate tracking (both on and off the Internet), it is possible to keep one monolithic company from knowing a frighteningly large amount of data about one and maintaining it past the 180-day legal limit in the United States. I chose to follow this path. You as a web programmer seem to think it’s a great idea to allow this company to monitor your users. I think that is the height of irresponsibility, while you believe it to be not only practical for the website but a benefit to the entire infrastructure. I expect they will do whatever is technically possible as their entire business model is predicated on violating the privacy of Internet users, while you believe they have only the noblest intentions and can be trusted with this massive amount of data, so inadvertently providing them more is not a problem.
And since we’re never going to agree on this matter, I leave the last word to you with my thanks for the enjoyable conversation.
Hello, I tried using your link, with the // thing at the start of the URL, and Chrome interpreted it as file://. I didn't check other browsers, though.
That will happen if you’ve loaded the page itself via a file:// URL, since the // URL just loads the resource with the same protocol that the page was loaded with. That won’t happen if you access the page via HTTP or HTTPS (e.g. once deployed or using a server on localhost during development).