SharePoint Myths – Part 2


Last week I had the privilege to hear the well-known author Dan Brown speak at a dinner for my alma mater, Phillips Exeter Academy. You undoubtedly have heard about a little book he wrote called The Da Vinci Code. Not only is Dan a talented author, but he’s also a deep thinker on topics of religion vs. science, and by all accounts a genuinely nice guy. Dan talked about a concept I don’t think I’d heard about before: Proof by Incredulity. Simply put (and probably paraphrasing Dan unfairly), the idea is that there are times when we look at two known (often opposing) possibilities and because one seems absolutely impossible to us (we’re incredulous), we decide that the other must be true. Thus the endless debate about creationism vs. the Big Bang. We decide that the one which seems far-fetched to us must be false, thus the other is true. We usually don’t consider the fact that there may be “middle truths”, or blends of the two truths, which are the actual case.

About a year and a half ago I did a post called SharePoint Myths – Part One, thoroughly expecting that it would be a long series of posts. I haven’t gotten back to it until now, though I certainly try to dispel myths whenever and wherever I can. Sometimes this makes me seem like I’m pooh-poohing what others believe, but that’s not it at all. I just want to try to add to the discussion by avoiding proofs by incredulity or any other lack of hard facts.

So what does all this flowery (hopefully not precious) rhetoric have to do with SharePoint Myths? Well, a few days ago I spotted yet another discussion about where to store scripts within SharePoint, this time over at SharePoint Overflow in the thread Where do you deploy scripts that are loaded in a masterpage? There have been many threads on this topic that I have spotted over the last year or so as jQuery, in particular, has become more popular. (The best thread, IMO, was at the old SharePoint Overflow. It seems that the Stack Overflow overlords have decided to disappear all of the old content in the migration to SE 2.0. Bad move. The upshot in this case is that I can’t point you to a really good thread.)

<UPDATE dateTime="2011-06-08">
The overlords restored all of the old content since I posted this, so here is that really good thread I wanted to point to originally.
</UPDATE>

The discussion usually devolves into a lot of absolute statements about how large script files are and what a huge expense they add to page loads. I’ve always believed that this wasn’t the case, based on everything I understand about how browsers work. In most cases, browsers cache files like script files and images locally to reduce unnecessary HTTP traffic. But might I be wrong? Could it be true that SharePoint wasn’t smart enough to allow caching of frequently downloaded files?

I’m a big fan of empiricism. Facts are far more immutable than innuendo. I’ve especially enjoyed some of James Love’s recent work on benchmarking where he doesn’t take things for granted. If he doesn’t see the data, he ain’t buyin’ it. That’s just plain good thinking. Check out his recent blog posts SharePoint 2010 Performance with Item Level Permissions and SharePoint 2010 Performance with Item Level Permissions Part 2.

I’m going to apply nowhere near that sort of rigor here, but I am going to show you some real, live, actual facts based on some small tests I’ve run.

I like to store my scripts and CSS in Document Libraries, generally at the root of the Site Collection. So if I do that, am I incurring a huge overhead in the browser? Well, I turned to my environment where I develop SPServices. It’s a WSS 3.0 site hosted by FPWeb “in the cloud”. (When I first set it up, it was just hosted by FPWeb. Now it’s “in the cloud”.) Given that it’s WSS 3.0 and it’s multi-tenant, one would expect that it is sort of the lowest common denominator when it comes to sophistication. I’m not saying that the good folks over at FPWeb aren’t doing a great job keeping it running well (they are). I’m just saying that there’s less you can do to tune that environment than some fancy-schmancy SharePoint 2010 Server farm.

I have a test page I’ve been using lately for developing the new $().SPServices.SPComplexToSimpleDropdown function (still in alpha). It’s a basic EditForm which I’ve customized, and of course I’ve added references to the jQuery library and SPServices. I took a look at the network traffic when I load that page. First I loaded it in IE and watched the traffic with Fiddler, since IE8 doesn’t give me any capability to do so on its own:


Huh, the requests returned a 304 (Not Modified), which means that the file hasn’t changed since the last download and therefore the browser will use the cached version.
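That 304 round trip is what HTTP calls a conditional GET: the browser re-requests the file but includes an If-Modified-Since header (and/or an If-None-Match header with an ETag), and the server sends back headers only, no body, when nothing has changed. Here’s a minimal sketch of the server-side decision — the header names are real HTTP, but the `respond` function itself is just an illustration, not anything SharePoint-specific:

```python
from email.utils import parsedate_to_datetime

def respond(if_modified_since, last_modified):
    """Decide between 200 and 304 for a conditional GET.

    if_modified_since: the request's If-Modified-Since header value, or None
    last_modified: the file's Last-Modified timestamp (RFC 1123 string)
    """
    if if_modified_since is not None:
        cached = parsedate_to_datetime(if_modified_since)
        current = parsedate_to_datetime(last_modified)
        if current <= cached:
            # The file hasn't changed: send headers only, no body
            return 304
    # First visit, or the file has changed: send the full file
    return 200

# Browser already has the current copy -> 304, no download
print(respond("Wed, 08 Jun 2011 12:00:00 GMT",
              "Wed, 08 Jun 2011 12:00:00 GMT"))  # 304
# First visit, no cached copy -> full 200 response
print(respond(None, "Wed, 08 Jun 2011 12:00:00 GMT"))  # 200
```

So a 304 still costs a round trip to the server, but the file itself — the expensive part — isn’t re-sent.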

Well, that could be due to Microsoft’s affinity for the Level 1 IE browsers, right? Let’s take a look with Firefox. Firefox with the Firebug extension gives me the capability to watch the network traffic:


Huh. Cached again.

Let’s make a change to the SPServices file and see what happens. As expected, both browsers downloaded fresh copies of SPServices from the Document Library.



Refresh again in each:



Cached again.

Even when the file is reloaded, the byte count is about 45k. This is also interesting because I’m loading the unminified version, which actually weighs in at 143.88KB (147334 bytes). The smaller download size is because both IE8 and Firefox advertise Gzip support, so IIS compresses the file before sending it. (Sure, there are some older server OSs which may not support that, but we’re talking about SharePoint and IIS here.)
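That roughly 3:1 reduction is just standard HTTP compression at work; script source, being plain text, compresses well. A quick illustration with Python’s gzip module — the sample text here is artificial and far more repetitive than real script source, so its ratio will come out better than the 45k/144k seen above:

```python
import gzip

# A rough stand-in for an unminified script file: plain, repetitive text.
# Real JavaScript compresses less dramatically, but the idea is the same.
script = ("function SPComplexToSimpleDropdown(options) { /* ... */ }\n" * 2000).encode("utf-8")

compressed = gzip.compress(script)
ratio = len(compressed) / len(script)
print(f"original:   {len(script):>7} bytes")
print(f"compressed: {len(compressed):>7} bytes ({ratio:.0%} of original)")
```

The browser signals that it can handle this with an Accept-Encoding: gzip request header, and the server marks the compressed response with Content-Encoding: gzip — all invisible unless you’re watching with Fiddler or Firebug.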

Again, the point here is not my rigor. I’ve only done a couple of refresh-change-refresh cycles. I’m not going anywhere near as far as James has investigating performance for item level permissions. However, I have at least tried to see what the real situation is.

Script files in Document Libraries in my test environment are indeed cached. Now I know for a fact that there are all kinds of ways to make this not be the case. Most of the instances I’ve run into were just plain mistakes, caused by a lack of understanding of the tools involved. For instance, an F5 load balancer set to never allow caching (yeah, they’re built for load balancing, so…) or forced browser policy settings which reduce the possibility of caching. The point is that you need to look at real data in your own environment. As with everything in SharePoint (and really all software use and development, plus much of life), it depends. Telling people with authority that something is an absolute truth without taking a look into the details is abhorrent in my book.

Statements like “I would think that…” or “One time I saw…” are fine because they are essentially disclaimers. Statements like “It is always the case that…” or “You have to…” imply authority based upon actual knowledge. If one doesn’t have actual knowledge, then one shouldn’t present one’s thoughts as such.

Ultimately, it becomes a discussion based upon proof by incredulity. “SharePoint does all of these things I know about wrong or oddly, so I know it can’t do this part right.” Uh-uh. Do some testing. Gather some data. And use language that indicates what your *actual* level of knowledge is. We’ll all be better off because we’ll have better information to work with.

BTW, I think storing scripts in Document Libraries is a grand idea.



  1. Performance can be improved even further by enabling the BLOB cache, which sets the cache header to public and gives a value for max-age… which in turn avoids the 304s. Each 304 status request involves a database query even though no content is returned to the client.

  2. “storing scripts in Document Libraries is a grand idea”
    If you’re a fan of caching, using a CDN for scripts like the jQuery library is even better (when you have internet access).

  3. Marc,

    You should be able to find that old post from SharePoint Overflow now as everything has been imported.

    I really like your “myths” series!

