Visual Studio Intellisense for SPServices

Be careful what you ask for!

Well, no more than two hours later Daniel had sent me the stub of a vsdoc file for SPServices.

Now I don’t use Visual Studio, but I suppose if you like this kind of hand-holding, Intellisense could be useful. If you’re game, give it a try and let me know how it works out.

jquery.SPServices-2014.02 Intellisense BETA

If you have suggestions, additions, whatever, please send them along. If I get a general thumbs up, I’ll include the vsdoc file with the 2014.02 release, which I’m hoping to get out there in the next few weeks.

 

SPServices in the Office 365 Developer Podcast

Today Randy Drisgill (@drisgill) alerted me to the fact that Jeremy Thake (@jthake) and Steve Walker (@sharepointing) were talking about me in the Office 365 Developer Podcast: Episode 018 with Steve Walker on SharePoint UX developer guidance. While they do indeed say some very nice things about me (those payments to Redmond are working out), there’s actually a *lot* of great material covered in the podcast. If you just want to hear the part about SPServices, it starts around 28:00.

It’s good to hear that Jeremy, Steve, and I think about some things the same way:

  • I’m awesome (just kidding)
  • Everyone loves SPServices, even Steve
  • “The SOAP services are deprecated, but unlikely to go away any time soon” (Jeremy)
  • “Deprecated doesn’t mean you can’t use it; it means [Microsoft] isn’t going to invest in the future and [they] will take it away at some point.” (Steve)
  • “Deprecated means that ‘no engineer is going to touch that’. It’s still running, it’s still in there, but we’re not going to be adding additional methods or functions.” (Jeremy)
  • Changes to the DOM can break SPServices and Microsoft *will* change the DOM on Office365 – expect it
  • Governance for script-based solutions is just as important as for anything else.
  • This stuff isn’t unique to SharePoint and Microsoft. We’d deal with a lot of these things with anyone else’s Web-based software, too.

As for the deprecation of the SOAP Web Services, as far as I can tell there hasn’t been *any* work done on them since SharePoint 2010 was released. That was way back on May 12, 2010, making it over 4 1/2 years since anyone has messed with them. Effectively, they have been deprecated for that long based on the definitions above. That’s one of the reasons the SOAP Web Services are so awesome: since no one messes with them, they are incredibly stable. Also, since they were developed a long time ago, they are pretty darn efficient. (We had to work harder to get more out of less hardware back then.)

Just as with InfoPath, deprecated is not a death knell. With InfoPath we *know* that there’s a long, long runway out to 2023 before support stops: “the InfoPath 2013 desktop client and InfoPath Forms Services for SharePoint Server 2013 will continue to be supported through 2023 as part of our Lifecycle support policy.” It will have a healthy life for quite a while after that, too, just like Windows XP still does. We don’t know what the runway looks like for the SOAP Web Services, though. That may mean we have more time, but it may mean we have less.

For quite a while now, I’ve advocated offering FaaS – “Functions as a Service” – to the organization. By adding things like jQuery into the master page and letting citizen developers know about it, you immediately get some accountability and can even start some collaboration around that type of development. I wrote a whole chapter about these ideas last year in the book Black Magic Solutions for White Hat SharePoint. If you’re on premises now, but think you may be moving to Office365 in the future, the more you (an IT person) know about the citizen developer work that drives your business now, the better. The work those citizen developers do is *so* important – they build what the organization truly needs, and they are unsung heroes.

Some things I am not on the same page with Jeremy and Steve on:

“Don’t use our DOM as an API” – This would be fine if there were good ways to alter the DOM via a “real” API. Maybe that’s coming and maybe it isn’t. Until it does, we don’t really have much choice if we want to extend (or fix) things in places like the default list forms, which so many of the SPServices value-added functions provide.

The rules aren’t black and white. Much of the messaging from Microsoft centers around SharePoint Online and Office365, as if on premises installations don’t even exist. If you’re running things on premises, the same things won’t happen to you at such a rapid pace – the pace driven by the regular updates to Office365. You have control, so you can make different decisions. I know that the podcast is “Office365 Developer…” but many people think of messages from Redmond as applying to everything. It’s all nuances.

I’m probably more concerned about all the existing code out there that uses SPServices on top of the SOAP Web Services than about the services themselves. I’ve been promising Scot Hillier (@scothillier) a series of articles on this for ITUnity for months, but I just can’t seem to get it done. Watch for some material from me about it, though. I want to help you move from SOAP to REST in an organized and productive way. You will have to do it sooner or later, and preparing for that eventuality will help to address the concerns that Steve and Jeremy expressed in the podcast.

And thanks again for the kind words and support, guys.

Caching SharePoint Data Locally with SPServices and HTML5’s Web Storage

The SharePoint SOAP Web Services are fast. In fact, I think they are as fast as the newer REST Web Services in many cases. The old, crufty SOAP Web Services even provide batching, something that the REST services don’t yet do. (Andrew Connell (@andrewconnell) has been beating the drum about this with Microsoft for months now, and we all hope they get this OData capability into SharePoint’s REST services sooner rather than later.)

Even though the SOAP services are fast, sometimes they just aren’t fast enough. In some of those cases, it may make sense to store some of your data in the browser’s Web storage so that it’s there on the client during a session or across sessions. Web storage is an HTML5 capability that is available in virtually all browsers these days, even Internet Explorer 8.

The best candidates for this type of storage (IMO) are list contents that are used as references and that don’t have a high number of changes. As an example, you might decide to store a list of countries in Web storage rather than loading them from the Countries list every time a page loads. Even though the read from the list is fast, it has to take *some* time. There’s the retrieval time, and then there is also any processing time on the client side. For instance, if you have dozens of fields per country and you need to load them into a complex JavaScript structure, that takes time, too. If those data chores are making your page loads seem laggy, then consider using local storage.

There are three main ways you can store data locally to improve performance. I’m not going to go into all of their intricacies, but I will give you some rules of thumb. There are a lot of nuances to this, so before you dive in, do some studying about how it all works.

Cookies

For small pieces of data, you should consider using cookies. Contrary to just about every article out there in the press, cookies are not bad. They can store up to 4k of data each for you, which you can read back when the user returns to the page. There’s an excellent little jQuery plugin I use to facilitate this called, aptly, jquery-cookie. You can download it (for free!) from GitHub here. Cookies persist across sessions.
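
If you haven’t used it, jquery-cookie boils down to a couple of calls. Here’s a minimal sketch (the cookie name and values are placeholders of my own):

// Store a small preference for 7 days (jquery-cookie syntax)
$.cookie("selectedCountry", "Iceland", { expires: 7, path: "/" });

// Read it back on a later page load; returns undefined if it isn't set
var selectedCountry = $.cookie("selectedCountry");

// Remove it when it's no longer needed
$.removeCookie("selectedCountry", { path: "/" });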

Session Storage

Session storage is the flavor of Web storage that allows you to store data just for the duration of the session. Think of a session as a browser lifespan: once you close the browser, the session storage is gone. Both session storage and local storage sizes are limited by the browser you are using. If you want to know whether Web storage is available in your browser of choice, take a look at “Can I use”. The amount of storage each browser gives you is a moving target, but it’s per domain.
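
If you’d rather detect availability in code than consult a chart, a common feature-detection pattern looks something like this (a generic sketch, not SPServices-specific):

// Returns true if the given Web storage type is usable; some browsers
// throw on setItem (e.g., in private browsing) even when window[type] exists
function storageAvailable(type) {
  try {
    var storage = window[type];
    storage.setItem("__storage_test__", "1");
    storage.removeItem("__storage_test__");
    return true;
  } catch (e) {
    return false;
  }
}

if (storageAvailable("sessionStorage")) {
  // Safe to cache data for this session
}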

Local Storage

Local storage takes Web storage one step further. The data stored in local storage persists across browser sessions. In fact, it usually won’t go away until you explicitly delete it. (Consider this fact when you are gobbling up local storage in your development process.)
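
Both flavors of Web storage share the same simple API; only the lifetime differs. A quick sketch (the key name here is just a placeholder):

// Web storage only stores strings, so serialize objects with JSON
var countries = [{ ID: 1, Title: "Iceland" }];
localStorage.setItem("Demo.Countries", JSON.stringify(countries));

// Read it back - with localStorage this survives browser restarts;
// with sessionStorage it would last only for the session
var cached = JSON.parse(localStorage.getItem("Demo.Countries"));

// Explicitly delete it - effectively the only way localStorage data goes away
localStorage.removeItem("Demo.Countries");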

So What?

The trick with using these storage mechanisms is managing the data you’ve put in Web storage as a cache. That data can become stale, either because changes were made to the underlying data source or because the cache itself has become corrupted. The latter is more difficult to deal with, so here I’ll focus on the former.

JavaScript – like most other programming languages – lends itself to building wrapper functions that add additional layers of abstraction on top of underlying functionality. Too many levels of abstraction can make things confusing, but with careful thought and smart code writing, you can build abstractions that serve you well.

In a recent client project, I found that as list data volumes were increasing, the pages in my SPServices- and KnockoutJS-driven application were loading more and more slowly. I’m building on top of SharePoint 2007 in this case, so even if I wanted to use REST, I couldn’t, nor do I believe that it would automatically make anything faster. If we had better servers running things, that might make a huge difference, but we have no control over that in the environment.

What I wanted was a reusable wrapper around SPGetListItemsJson (which itself is a wrapper around the SOAP Lists Web Service’s GetListItemChangesSinceToken and SPServices’ SPXmlToJson) that would let me check local storage for a cached data source (list data), read either the entire data source or just the deltas from the SharePoint list, load the data into my application, and then update the cache appropriately.

The getDataSource function below is what I’ve come up with so far. There’s some setup to use it, so let me explain the parameters it takes:

  • ns – This is the namespace into which you want to load the data. In my applications these days, following the lead from the patterns Andrew and Scot Hillier (@scothillier) have published, I usually have a namespace defined that looks something like ProjectName.SubProjectName.DataSources. The “namespace” is simply a complex JavaScript object that contains most of my data and functions.
  • dataSourceName – The name that I want to give the specific data source within ns. In my example above with the Countries list I would use “Countries”.
  • params – This is the big magilla of the parameters. It contains all of the values that will make my call to SPGetListItemsJson work.
  • cacheItemName – This is the name of the item I want to store in Web storage. In the Countries example, I would use “ProjectName.SubProjectName.DataSources.Countries”.
  • storageType – Either “localStorage” or “sessionStorage”. If I expect the data to change regularly, I’d probably use sessionStorage (this gives me a clean data load for each session). If the data is highly static, I’d likely use localStorage.

And here’s the code:

/* Example (note that changeToken is managed inside getDataSource, so we don't pass it in):
getDataSource(ProjectName.SubProjectName.DataSources, "Countries", {
    webURL: "/",
    listName: "Countries",
    CAMLViewFields: "<ViewFields>" +
        "<FieldRef Name='ID'/>" +
        "<FieldRef Name='Title'/>" +
        "<FieldRef Name='Population'/>" +
        "<FieldRef Name='CapitalCity'/>" +
        "<FieldRef Name='Continent'/>" +
      "</ViewFields>",
    CAMLQuery: "<Query>" +
        "<OrderBy><FieldRef Name='ID'/></OrderBy>" +
      "</Query>",
    CAMLRowLimit: 0,
    mapping: {
      ows_ID: {"mappedName": "ID", "objectType": "Counter"},
      ows_Title: {"mappedName": "Title", "objectType": "Text"},
      ows_Population: {"mappedName": "Population", "objectType": "Integer"},
      ows_CapitalCity: {"mappedName": "CapitalCity", "objectType": "Text"},
      ows_Continent: {"mappedName": "Continent", "objectType": "Lookup"}
    }
  },
  "ProjectName.SubProjectName.DataSources.Countries",
  "localStorage"
);
*/

function getDataSource(ns, dataSourceName, params, cacheItemName, storageType) {

  var dataReady = $.Deferred();

  // Get the data from the cache if it's there
  ns[dataSourceName] = JSON.parse(window[storageType].getItem(cacheItemName)) || new DataSource();
  var oldToken = ns[dataSourceName].changeToken;
  params.changeToken = oldToken;

  // Read whatever we need from the dataSource
  var p = $().SPServices.SPGetListItemsJson(params);

  // Process the response
  p.done(function() {
    var updates = this.data;
    var deletedIds = this.deletedIds;
    var changeToken = this.changeToken;

    // Handle updates/new items
    if (oldToken !== "" && updates.length > 0) {
      for (var i = 0; i < updates.length; i++) {
        var thisIndex = ns[dataSourceName].data.binaryIndexOf(updates[i], "ID");
        // If the item is already in the cache, replace it with the new data
        if (thisIndex > -1) {
          ns[dataSourceName].data[thisIndex] = updates[i];
        } else {
          // Otherwise, insert the new item into the cache in sorted position
          // (~thisIndex recovers the insertion index from binaryIndexOf)
          ns[dataSourceName].data.splice(~thisIndex, 0, updates[i]);
        }
      }
    } else if (oldToken === "") {
      ns[dataSourceName] = this;
    }
    // Handle deletes
    for (var i = 0; i < deletedIds.length; i++) {
      var thisIndex = ns[dataSourceName].data.binaryIndexOf({
        ID: deletedIds[i]
      }, "ID");
      // Only splice if the item is actually in the cache; a not-found
      // (negative) index would otherwise remove the wrong item
      if (thisIndex > -1) {
        ns[dataSourceName].data.splice(thisIndex, 1);
      }
    }
    // Save the updated data back to the cache
    if (oldToken === "" || updates.length > 0 || deletedIds.length > 0) {
      // Save the new changeToken
      ns[dataSourceName].changeToken = changeToken;
      window[storageType].setItem(cacheItemName, JSON.stringify(ns[dataSourceName]));
    }
    dataReady.resolve();
  });
  return dataReady.promise();
}

Some of the nice things about this function:

  • It’s generic. I can call it for any list-based data source in SharePoint. (I started out building it for one data source and then generalized it.)
  • I can call it during the page life cycle to refresh the application data anytime I want, or on a schedule, perhaps with setInterval (see the sketch after this list).
  • I can set a lot of parameters to cover a lot of different use cases.
  • Each time I call it, it updates the cache (if it needs to) so that the next time I call it I get a “fresh” copy of the data.
  • It only loads the data that it needs to, by using the GetListItemChangesSinceToken capabilities.
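
For example, a scheduled refresh might look something like this (countriesParams stands in for a params object like the one in the example comment above):

// Refresh the Countries data source every five minutes; since getDataSource
// only requests the deltas, each refresh is a cheap call
setInterval(function() {
  getDataSource(
    ProjectName.SubProjectName.DataSources,
    "Countries",
    countriesParams,
    "ProjectName.SubProjectName.DataSources.Countries",
    "localStorage"
  ).done(function() {
    // Re-render anything bound to the data source here
  });
}, 5 * 60 * 1000);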

And some downsides:

  • Since I know the data I’m working with in my application will fit into Web storage easily, I’m not handling failed saves (for example, exceeding the browser’s storage quota).
  • If the cache does become corrupt (not something I expect, but there’s always Murphy), I’m not handling it at all. A defensive variant of the cache read follows below.
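
If corruption is a worry in your environment, the cache read at the top of getDataSource could be wrapped like this (a hypothetical variant, not part of the function above):

// Defensive version of the cache read: if the stored JSON won't parse,
// discard it and fall back to an empty DataSource instead of throwing
var cached = null;
try {
  cached = JSON.parse(window[storageType].getItem(cacheItemName));
} catch (e) {
  window[storageType].removeItem(cacheItemName); // toss the corrupt entry
}
ns[dataSourceName] = cached || new DataSource();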

If you decide to try this out, you’ll need a few auxiliary functions as well:

/* DataSource constructor */
function DataSource() {
  this.changeToken = "";
  this.mapping = {};
  this.data = [];
  this.deletedIds = [];
}

/** Adapted from http://oli.me.uk/2013/06/08/searching-javascript-arrays-with-a-binary-search/
 *
 * Performs a binary search on the host array.
 * @param {*} searchObj The object to search for within the array.
 * @param {*} searchElement The element in the object to compare. The objects in the array must be sorted by this element.
 * @return {Number} The index of the element. If the item is not found, the function returns the bitwise complement of the index where it should be inserted (always negative), so ~returnValue recovers the insertion point.
 */
Array.prototype.binaryIndexOf = function(searchObj, searchElement) {

  var minIndex = 0;
  var maxIndex = this.length - 1;
  var currentIndex;
  var currentElement;

  var searchValue = searchObj[searchElement];

  while (minIndex <= maxIndex) {
    currentIndex = (minIndex + maxIndex) / 2 | 0;
    currentElement = this[currentIndex];

    if (currentElement[searchElement] < searchValue) {
      minIndex = currentIndex + 1;
    } else if (currentElement[searchElement] > searchValue) {
      maxIndex = currentIndex - 1;
    } else {
      return currentIndex;
    }
  }

  // Not found: return the bitwise complement of the insertion point, which
  // is always negative and so can't be confused with a real index
  return ~minIndex;
};

Uploading Attachments to SharePoint Lists Using SPServices

For years people have been asking me how they could upload files using SPServices. I’ve dodged the questions every time because I simply didn’t know how.

On a current client project, I’m building a slick Single Page Application (SPA) to manage tasks in SharePoint 2010. It’s basically a veneer over the clunky out-of-the-box task list experience. Rather than hopping from page to page, the team using the application can accomplish everything they need to do on a single page.

Every project has specific needs, and for this one I’m using KnockoutJS. But the code snippets I give below are generic enough that you should be able to use them in any context with a little thinking.

It’s been going well, but I had been putting off implementing adding attachments because…well, as I said, I didn’t know how.

One of the benefits of the shift from all server side development to a much more significant focus on client side techniques is that some of the bright minds in the SharePoint development community have turned their eyes toward solving these things, too. Usually it’s on SharePoint 2013 or SharePoint Online at Office 365. However, at least now when I search for “SharePoint JavaScript xxx”, there are likely to be some great hits I can learn from.

In this case, there were two excellent posts, one from James Glading and one from Scot Hillier. (See the Resources section below for links.)

The first step is to enable HTML5 capabilities in IE10. This requires two small changes to the master page. That seems simple, but it can have effects that you don’t expect. In other words, don’t consider this just a “no-brainer”. You need to plan for the change across your Site Collection and any impacts it may have.

The first change is to switch from “XHTML 1.0” to “HTML” as the DOCTYPE. This is what “turns on” HTML5.

<!DOCTYPE html>

Then, because we want to target IE10 in this project, we set the X-UA-Compatible meta tag to IE=10.

<meta http-equiv="X-UA-Compatible" content="IE=10"/>

If your browser base is different, you can, of course, set this differently. In this project, we will require all users of the application to be running IE10 or greater. Firefox and Chrome have supported the bits of HTML5 we need for so long that there’s little concern about people who choose to use those browsers.

Once we have those two small changes in the master page (we are using a copy of v4.master with no other customizations at the moment), we “have HTML5”. It’s most obvious because some of the branding flourishes I’ve put in, like rounded corners, now show up in IE10 rather than just square boxes.

Once we have HTML5 enabled, we can start to use the File API, a.k.a. the FileReader. It’s such a simple little thing that it’s hard to believe it gives us the capability it does.

<input type="file" id="attachment-file-name"/>

That’s it. That simple HTML gives us a file picker on the page. Depending on your browser, it will look something like this:

File Picker

When you make a file selection with this very familiar widget, you can get at the file through the underlying DOM element. (Note the first [0], which steps from the jQuery object to the DOM element itself; the files list lives on the element, not on the jQuery wrapper.)

var file = $("#attachment-file-name")[0].files[0];

Note that we’re after a single file, so we grab the first object in the file list array. We get an object that looks something like this screen shot from Firebug:

File object from the File Picker

Once we have the file object, we can look at what the Lists SOAP Web Service needs to get in order to upload the file. Again, it’s pretty simple. Here is the list of inputs and output from the MSDN documentation page for the Lists.AddAttachment Method.

Parameters

  • listName – A string that contains either the title or the GUID for the list.
  • listItemID – A string that contains the ID of the item to which attachments are added. This value does not correspond to the index of the item within the collection of list items.
  • fileName – A string that contains the name of the file to add as an attachment.
  • attachment – A byte array that contains the file to attach by using base-64 encoding.

Return Value

A string that contains the URL for the attachment, which can subsequently be used to reference the attachment.

The first three inputs are straightforward. We pass in the name of the SharePoint list, the ID of the list item, and the name of the file we want to attach. It’s that last input parameter called “attachment” that’s a bit tricky.

When we upload a file via an HTTP POST operation – which is what all of the SOAP Web Services use – we have to pass text. But the files we want to upload are as likely as not to be binary files. That’s where the Base64 format comes in. Believe me, you don’t need to know exactly what the format actually is, but you should probably understand that it enables us to send binary files as text bytes. Those bytes are then decoded on the server end so that the file can be stored in all of its original, pristine glory.

Here’s the code I ended up with, skinnied down as much as possible to make it a little clearer to follow. I drew heavily from Scot’s and James’ posts for this, adapting things for SPServices. I’ve also stripped out all of the error handling, etc.

  • First, the getFileBuffer function reads the contents of the file into the FileReader buffer. readAsArrayBuffer is an asynchronous method, so we use a jQuery Deferred (promise) to inform the calling function that the processing is done.
  • The contents of the buffer are then converted from an ArrayBuffer – which is what the FileReader gives us – into a Base64EncodedByteArray. This works by passing the buffer through a Uint8Array along the way.
  • Finally, we use the toBase64String method to convert the SP.Base64EncodedByteArray to a Base64String.

Yeah, that’s a lot to swallow, but again, you don’t really need to understand how it works. You just need to know that it does work.

Finally, we call Lists.AddAttachment using SPServices to do the actual upload.

/* From Scot Hillier's post:
http://www.shillier.com/archive/2013/03/26/uploading-files-in-sharepoint-2013-using-csom-and-rest.aspx */
var getFileBuffer = function(file) {

  var deferred = $.Deferred();
  var reader = new FileReader();

  reader.onload = function(e) {
    deferred.resolve(e.target.result);
  };

  reader.onerror = function(e) {
    deferred.reject(e.target.error);
  };

  reader.readAsArrayBuffer(file);

  return deferred.promise();
};

getFileBuffer(file).then(function(buffer) {
  var bytes = new Uint8Array(buffer);
  // SP.Base64EncodedByteArray comes from the JSOM (sp.js), so that script
  // must be available on the page for this base64 encoding approach
  var content = new SP.Base64EncodedByteArray();
  for (var b = 0; b < bytes.length; b++) {
    content.append(bytes[b]);
  }

  $().SPServices({
    operation: "AddAttachment",
    listName: "Tasks",
    listItemID: taskID,
    fileName: file.name,
    attachment: content.toBase64String()
  });

});

Very cool! And as I tweeted yesterday, far easier than I ever would have expected. Yes, it took me a good chunk of a day to figure out, but it definitely works, and pretty fast, too.

If you use this example as a base, you could fairly easily build out some other file uploading functions. Combined with the other attachment-oriented methods in the Lists Web Services, you can also build the other bits of the attachment experience:

  • GetAttachmentCollection – Returns a list of the list item’s attachments, providing the full path to each that you can use in a list of links (see the sketch after this list).
  • DeleteAttachment – Once you’ve uploaded an attachment and realized it was the wrong one, this method will allow you to delete it.
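
For example, fetching the attachment URLs for display might look something like this (a sketch based on the SPServices docs; note that this operation takes the item ID as ID):

$().SPServices({
  operation: "GetAttachmentCollection",
  listName: "Tasks",
  ID: taskID, // the same task we attached the file to above
  completefunc: function(xData) {
    // Each Attachment element in the response contains the full URL to one attachment
    $(xData.responseXML).find("Attachment").each(function() {
      console.log($(this).text());
    });
  }
});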

Moral of the story: Fear not what you do not know. Figure it out.

Resources

What’s the Story for HTML5 with SharePoint 2010? by Joe Li

Uploading Files Using the REST API and Client Side Techniques by James Glading

Uploading Files in SharePoint 2013 using CSOM and REST by Scot Hillier

Lists.AddAttachment Method

This article was also posted at ITUnity on Jun 02, 2014. Visit the post there to read additional comments.

Update 2014-05-28 11:55 GMT-5

Hugh Wood (@HughAJWood) took one look at my code above and gave me some optimizations. Hugh could optimize just about anything; he is *very* good at it. I *think* I can even see the difference with larger files.

getFileBuffer(file).then(function(buffer) {
  // Build a binary string from the buffer, then let the browser's native
  // btoa() handle the Base64 encoding - no JSOM dependency required
  var binary = "";
  var bytes = new Uint8Array(buffer);
  var i = bytes.byteLength;
  while (i--) {
    binary = String.fromCharCode(bytes[i]) + binary;
  }
  $().SPServices({
    operation: "AddAttachment",
    listName: "Tasks",
    listItemID: taskID,
    fileName: file.name,
    attachment: btoa(binary)
  });
});

SPServices: What About Deprecation of the SOAP Web Services?

SPServices user JSdream asked a question in the SPServices Discussions on CodePlex the other day that’s worthy of a more prominent response.

Should we, who use SPServices, be concerned at all with Microsoft recommending not to use either the ASMX web services in SharePoint 2013 (http://msdn.microsoft.com/en-us/library/office/jj164060.aspx#DeprecatedAPIs) or the owssvr.dll (I used this one a lot when doing InfoPath stuff) for development purposes?

The Choose the right API set in SharePoint 2013 article is an important one, for sure. In it, there’s pretty clear direction about which API to choose based on the specific requirements of the solution you are building. As with anything, though, there’s a strong dash of “it depends”. It depends on things like your available skills, the device the code will run on, etc.

Figure 1. Selected SharePoint extension types and SharePoint sets of APIs

I would say that we should all be “cautious” rather than “concerned”. The SOAP Web Services are used by many parts of SharePoint still, with the most prominent being SharePoint Designer. If you sniff around in the traffic between processes, you’ll spot other places as well.

That said, deprecated is deprecated. If you build your solutions with the data access layer separated out from the business logic layer, you should be able to replace the calls if and when it becomes necessary.

I am not aware of any official time frames for the SOAP services going away, and the Product Group would need to give us a good, long lead time because lots of code depends on them, SPServices-based or not. Many people have built managed code solutions which use the SOAP Web Services as well. GetListItems is a workhorse operation no matter how you build your solutions.

There are many things you simply cannot do with REST yet, though in many cases CSOM may provide better coverage. If you are using SPServices for the operations which aren’t available in REST or CSOM, obviously you’ll want to continue doing so until there is a viable replacement. The REST coverage is improving steadily. (See Overview of SharePoint Conference 2014 and new content for developers for some information.) Keep in mind, though, that the majority of improvements will end up in Office365 first and in the SharePoint Server product at some point later or perhaps not at all. So your time horizons and cloud plans are also critical decision factors.
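
To make the comparison concrete, here’s roughly what the same simple read looks like in each style (a sketch; the list and field names are placeholders, and the REST call assumes SharePoint 2013 or later):

// SOAP via SPServices
$().SPServices({
  operation: "GetListItems",
  listName: "Countries",
  CAMLViewFields: "<ViewFields><FieldRef Name='Title'/></ViewFields>",
  completefunc: function(xData) {
    $(xData.responseXML).SPFilterNode("z:row").each(function() {
      console.log($(this).attr("ows_Title"));
    });
  }
});

// The rough REST equivalent
$.ajax({
  url: _spPageContextInfo.webAbsoluteUrl +
    "/_api/web/lists/getbytitle('Countries')/items?$select=Title",
  headers: { Accept: "application/json;odata=verbose" }
}).done(function(data) {
  data.d.results.forEach(function(item) {
    console.log(item.Title);
  });
});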

[If I had to put my money on a horse, it’d be REST over CSOM over the long term.]

All that said, if you are starting a new project on SharePoint 2013 of any significant size, you should be considering using the App Model, CSOM, and REST. SPServices may still fit in there somewhere, so don’t toss it out with the bath water just yet. I’ll continue evolving it as long as it has people wanting to use it. I think I still have plenty of things to do.