From the SPServices Discussions: Why Should I Bother to Learn XSLT?

Here’s a question from gdavis321 from the good old SPServices discussions. In most cases, I just answer the questions in situ, but occasionally – as you frequent readers know – I realize what I’m writing feels more like a blog post, so I move it over here.

It seems that SPServices (thank you for this godsend) can do anything that you might want to do with a Content Query Web Part or other Web Parts… am I missing something? Why should I bother to learn XSLT?

It’s a fair question. As someone who loves using the Data View Web Part (DVWP) – which is XSL-based – the answer is, as always, it depends.

Doing everything client side isn’t always a great idea. For each specific requirement, you should look at whether it makes more sense to do the heavy processing on the server or client. There are many variables which go into this, like client machine capabilities, data volume, number of data sources, need to join data from different sources, skills on the development team, desired time to market, etc.

In SharePoint 2013, much of the processing has moved to the client side (often called Client Side Rendering, or CSR), so many people think that we should just process everything client side now. I worry about this sort of myopia.

Client machines – meaning the machine sitting on your desk or in your lap or hand rather than the server, which sits in a data center somewhere – have better and better capabilities. Yet in many organizations, you may still see Windows XP machines with 1GB of RAM – or worse!

Additionally, client side processing may mean moving a huge volume of data from the server to the client in order to roll it up, for example. If you’re moving thousands of rows (items) to make one calculation, it *can* feel sluggish on a client machine without a lot of oomph.
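
To make that concrete, here is a rough sketch of the pattern – pulling every item down with SPServices’ GetListItems just so the client can compute one number. The list and column names are hypothetical, and jQuery and SPServices are assumed to be loaded on the page.

```javascript
// A sketch only: every item crosses the wire to produce a single total.
$().SPServices({
  operation: "GetListItems",
  listName: "Orders", // hypothetical list
  CAMLViewFields: "<ViewFields><FieldRef Name='Total' /></ViewFields>",
  CAMLRowLimit: 0, // 0 = no row limit: all items come down
  completefunc: function (xData, Status) {
    var grandTotal = 0;
    $(xData.responseXML).SPFilterNode("z:row").each(function () {
      grandTotal += parseFloat($(this).attr("ows_Total")) || 0;
    });
    // Thousands of rows crossed the network to produce this one value
    console.log("Grand total: " + grandTotal);
  }
});
```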

So client side libraries like SPServices or KnockoutJS or AngularJS (among many others) are fantastic for some things, not so fantastic for others.

When it comes to the JavaScript versus XSL question, both approaches should be part of your toolset. Sometimes good old XSL is an excellent call if you can do some of the heavy lifting on the server and then pass the results to the client for further processing. The output needn’t be just nicely formatted list views; it can be emitted in a format that makes the client side processing easy to kick off on page load. My approach is often to ship the data to the browser via a DVWP and then process it further with jQuery on the client.
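
As a minimal sketch of that hand-off, assuming the DVWP’s XSL has already rendered the rolled-up data into the page (the class and attribute names here are hypothetical markers the XSL would emit):

```javascript
// The server did the heavy lifting; the client just decorates the result
// on page load, with no extra round trip to SharePoint.
$(document).ready(function () {
  $("table.dvwp-rollup tr[data-status='Overdue']").addClass("highlight-overdue");
});
```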

The mix of what happens where is on a spectrum, too, depending on what you need to do. You might roll up tens of thousands of items in a DVWP and then request items from two or three other lists from the client as the page loads to optimize your architecture.

Yup, there’s that architecture word. A good SharePoint architect will know when to use each of the tools in the toolkit, and JavaScript and XSL are just two of many.

Keep in mind that because “it depends”, your mileage may vary in any and all of this: caveat architectus.

13 Comments

  1. Funny coincidence, I sent an e-mail to one of my clients yesterday, explaining just that.

    One thing that is missing in your equation is the deprecation factor. At this point, XSLT is only half supported in SP 2013. The SPD design view has disappeared in 2013, and Microsoft has never moved to XSLT 2.0.

    So when I am asked about XSLT, my first step is to inquire about the client’s current environment and migration plans. I find it hard to recommend XSLT to somebody who is currently on Office 365 and will have a forced migration in less than 3 years.

    We definitely need to find the right balance between server side and client side. But there are other server side options than XSLT.

    1. Christophe:

      The problem is that the only real user-accessible options for server side code are the DVWP, CQWP, or other XSL-based functionality. By removing the Design View in SharePoint Designer 2013, Microsoft has made it harder – but not impossible – to work with these powerhouse Web parts. As long as client machine capabilities are there, the new model of Display Templates and other Client Side Rendering (CSR) techniques *can* work, but they aren’t always the best idea.

      M.

      1. Marc, you have a point here. As soon as you need more than item counts, it’s hard to find a clean alternative to the DVWP/CQWP.

        Still, there are situations where a DVWP would have been a no-brainer 3 years ago, but for which today I am going to collect more details about the volumes (number of lists and items) before making a decision. And in any case I like to be transparent and make sure the client understands the mid-term risks (again, my concern here is Office 365; on-premises deployments don’t have the same migration constraints).

  2. The only thing I see as missing is that where you put this processing, and how, depends totally on the client. If they are using IE10+, then going the JS route and even leveraging web workers will give a massive advantage. Of course, there are also better, more efficient ways of processing large amounts of data quickly, even on the slowest of machines.

    This brings me to a new point. It is time we learnt to optimise our code again, and not knowing the client machine is the primary reason for this.

    Programmers need to learn optimisation again as we enter this brave new SP+JS World.

    I’m down for this and I’m ready.

    1. Hugh:

      I always enjoy our optimization chats on Twitter. It seems to be a lost art to care about code efficiency. At the risk of sounding like the old fogey that I am, learning to code in 64K (or less!) of memory taught us to be *extremely* frugal. That frugality has gone by the wayside because memory is cheap and disk is free (Dire Straits could redo the song with those lyrics).

      I’ve tried never to let go of my coding frugality, though sometimes I do cut corners. Good developers should always be aware of the underlying inefficiencies in their code and make adjustments accordingly. Things like looping through an array twice where once will do actually matter. If client side development re-teaches SharePoint developers some frugal approaches, then “Amen”.
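
      A toy example of the kind of thing I mean – finding a minimum and a maximum in one pass instead of two:

      ```javascript
      var values = [12, 7, 42, 3, 19];

      // Wasteful: two full passes over the array
      var min = Math.min.apply(null, values);
      var max = Math.max.apply(null, values);

      // Frugal: a single pass finds both
      var lo = values[0], hi = values[0];
      for (var i = 1; i < values.length; i++) {
        if (values[i] < lo) { lo = values[i]; }
        if (values[i] > hi) { hi = values[i]; }
      }
      ```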

      The scary question this raises is, of course, what kind of code have they been deploying to servers?!?!?!?

      M.

    2. Hugh, client side processing is not the only concern here; another key issue is passing data over the network. Imagine that you want the number of overdue tasks across multiple sites: are you going to pull all overdue items from each list, then do the math on the client side? Or would you rather do the aggregation on the server side and just send a number? For such questions, your client machine and browser don’t really matter.

      1. @Christophe
        The answer to this is that bandwidth is fast becoming a cheap commodity. This means that we have to balance the number of connections we make against the amount of data we transmit.

        For example, an efficient data grid would load only the page in view plus the pages above and below it; the other pages would be kept out of memory unless required for aggregation. You could also do this with just the DOM elements. This is exactly how Netflix does it in their user interface, to great effect – the interface can even run on television systems.
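
        Something like this rough sketch (all names hypothetical; a real grid would also insert spacer elements to preserve the scrollbar height):

        ```javascript
        var ROW_HEIGHT = 30;  // assumed fixed row height, in pixels
        var BUFFER = 10;      // extra rows rendered above and below the viewport
        var allRowsHtml = []; // imagine thousands of "<tr>…</tr>" strings held in memory

        function renderWindow($viewport) {
          var first = Math.max(0, Math.floor($viewport.scrollTop() / ROW_HEIGHT) - BUFFER);
          var count = Math.ceil($viewport.height() / ROW_HEIGHT) + BUFFER * 2;
          // Only this slice touches the DOM; every other row stays out of it
          $viewport.find("tbody").html(allRowsHtml.slice(first, first + count).join(""));
        }

        $("#grid").on("scroll", function () { renderWindow($(this)); });
        ```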

        Of course, if you are processing vast amounts of data and performance is an issue, you would keep the totals in a separate location – a value that is updated whenever items are updated.

        In reality, a couple of thousand objects held in memory are quick to manipulate. It is the manipulation of the DOM that takes over 95% of the processing, and this is where I am aiming my personal optimisation efforts at the moment.
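
        For example, the difference between one DOM write per item and one write in total (the #grid id is hypothetical):

        ```javascript
        var items = [{ title: "Task 1" }, { title: "Task 2" }]; // imagine a couple of thousand of these

        // Slow: one DOM write per item
        $.each(items, function (i, item) {
          $("#grid tbody").append("<tr><td>" + item.title + "</td></tr>");
        });

        // Fast: build the markup in memory, then touch the DOM once
        var rows = [];
        for (var i = 0; i < items.length; i++) {
          rows.push("<tr><td>" + items[i].title + "</td></tr>");
        }
        $("#grid tbody").html(rows.join(""));
        ```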

        After this, let us see where ECMAScript 6 takes us – we need to jump on board fast, before front end developers take the reins and we become obsolete.

        1. Sure, bandwidth is becoming a cheap commodity, and so is client side processing. Marc made it clear in his post that one of his concerns is organizations with older systems.

          Paging is of course an answer, for both DVWP/XSLT and client side. But as I said, the real issue is aggregation. Here is a live example that hopefully will make my point clearer:
          http://blog.pathtosharepoint.com/2012/10/02/teaser-real-time-business-intelligence-in-sharepoint/
          Thousands of items are used to build that map, but what you really need in the end is 50 numbers. Do a SOAP test with SharePoint – or, even worse, a REST test – and you’ll see the ridiculous amount of data you need to pass over the wire for client side processing!

          Tracking changes is a good point (I think Marc wrote another article on this recently), but it becomes difficult when you need more than just counts or totals.

  3. I think the DVWP is still a vital tool in SP2013. It can connect to external data sources that JavaScript can’t (cross-domain security, etc.). A couple of weeks ago I needed to display an RSS feed in a specific format from an external website, and the RSS feed web part wasn’t gonna cut it. So I used a DVWP and XSL to output the exact markup I needed. Even if a DVWP is not a first-class citizen, XSL is still used quite a bit for displaying data in 2013, so it’s a good skillset to have.

  4. I’ve found the combination of web parts and JavaScript to be extremely powerful, much more so than either by themselves.

    For example, retrieving list data and generating markup from it is often very fast using a DVWP. You can use that to your advantage to produce the markup that something like a jQuery UI widget, such as tabs or accordion, expects. The page renders almost instantly and you get the nice interactive jQuery elements. Best of both worlds.
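
    A minimal sketch of that pattern, assuming the DVWP’s XSL emits the structure jQuery UI tabs expects – a container with a ul of tab links and matching panel divs (the id is hypothetical):

    ```javascript
    // The markup arrived fully rendered from the server, so switching the
    // widget on is a single call – no client side data fetch needed.
    $(document).ready(function () {
      $("#dvwp-tabs").tabs();
    });
    ```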

    1. Well put, Carlos. I know that some other folks have had great luck using DVWPs to emit JSON for further processing once the page loads. The important thing is to decide where best to do each piece of processing so that the user experience is as good as possible.

      M.
