Monthly Archives: November 2014

SharePoint 2013 Hosting with ASPHostPortal :: Scenario pages for SharePoint 2013

SharePoint 2013 has many great ways to help you get things done. We want to highlight a few of these, so we have created scenario pages that explain a specific scenario and provide content to help you understand, implement, and use it easily.

Scenario pages allow you to view key resources based on selected stages of evaluation or adoption. These stages are represented by colored tiles. Click a single tile for a specific stage or Ctrl-click multiple tiles for multiple stages. As you click the tiles, the scenario page lists the resources for each selected stage.

Content and resources are drawn from many Microsoft Web properties: IT content from TechNet, developer content from MSDN, and Information Worker content from Office.com are all integrated into the scenario page experience. All of the resources you need are available in one place, whether you want to understand:

  • Which features must be configured to support the scenario and how to manage them (TechNet content)
  • What namespaces and methods to use to develop customizations for the scenario (MSDN content)
  • How to accomplish a specific task in the scenario (Office.com content)

The following scenario pages are now available:

  • eDiscovery in SharePoint Server 2013 and Exchange Server 2013. eDiscovery allows you to place electronic holds on documents and email for a legal case or audit. eDiscovery is a great example of a solution that benefits from a scenario page because it provides links to key resources published for SharePoint 2013, Exchange Server 2013, and Lync Server 2013.
  • Personal sites (My Sites) in SharePoint Server 2013. My Sites technology provides profile data, activity feeds, tagging capabilities, and search results for each SharePoint user in your organization.

When you deploy My Sites, each user gets a starting place in SharePoint that brings together the sites, documents, and other information that they care about and helps them share what they know.

SharePoint 2013 Hosting – ASPHostPortal.com :: Binding SAP UI 5 aka Open UI 5 Table with List data from SharePoint 2013 REST API

Binding SAP UI 5 aka Open UI 5 Table with List data from SharePoint 2013 REST API


SAP UI 5, aka Open UI 5, is a development framework that lets SAP developers expose and consume SAP data as JSON objects via REST API calls. Since it is a JavaScript UI library like jQuery UI, it can be used with any client-side application that can make use of JSON.

To be frank, this is the biggest client-side library I have used so far. When extracted, the runtime files alone come to 55 MB, and there are more than 4,000 files.
It is huge and a bit complex, and I have been using it for quite some time on one of my SharePoint projects, which is heavily dependent on SAP data and UX design.
This article provides details on how to use SAP UI 5 in a SharePoint application.

Pre-requisites

  1. Download Open UI 5 runtime from http://sap.github.io/openui5/
  2. Extract the content and move it to a new folder named sap-ui5 in the LAYOUTS folder of the 15 hive (in a typical installation the folder path would be C:\Program Files\Common Files\microsoft shared\Web Server Extensions\15\TEMPLATE\LAYOUTS)
  3. The jQuery library, to perform REST calls

For this demo, I have directly placed all the files under the layouts folder.
Packaging and deploying them through a WSP solution, or uploading them to the Style Library or any other document library, are alternative approaches.
I have also created a new list named “Employees” with columns Title and Location.

Note: This article on developing SAP UI 5 applications in Visual Studio provides more details on how to create a basic Open UI 5 application with Visual Studio
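The Script Editor code later in this article assumes that jQuery and the Open UI 5 runtime are already referenced on the page. A minimal sketch of the includes, assuming the sap-ui5 folder under LAYOUTS described above (the jQuery path and the theme name here are only examples):

```html
<!-- jQuery; the file name and location are an example -->
<script src="/_layouts/15/jquery.min.js"></script>
<!-- Open UI 5 bootstrap: loads the core plus the commons and table libraries -->
<script src="/_layouts/15/sap-ui5/resources/sap-ui-core.js"
        data-sap-ui-libs="sap.ui.commons,sap.ui.table"
        data-sap-ui-theme="sap_bluecrystal"></script>
```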

Steps

  1. Create a new SharePoint Page
  2. Add a Script Editor Web part to the page
  3. Copy and paste the code below into the Script Editor
<script>
	$.getJSON("/_vti_bin/listdata.svc/Employees", function (data) {

		// Build a table with a fixed width and five visible rows
		var sTable = new sap.ui.table.Table({
			width: "500px",
			visibleRowCount: 5
		});
		sTable.setTitle("Employee Details");

		sTable.addColumn(new sap.ui.table.Column({
			label: new sap.ui.commons.Label({ text: "Employee Name" }),
			template: new sap.ui.commons.TextView().bindProperty("text", "Title")
		}));

		sTable.addColumn(new sap.ui.table.Column({
			label: new sap.ui.commons.Label({ text: "Location" }),
			template: new sap.ui.commons.TextView().bindProperty("text", "Location")
		}));

		// Wrap the REST results in a JSON model and bind the rows to it
		var oModel = new sap.ui.model.json.JSONModel();
		oModel.setData({ modelData: data.d.results });
		sTable.setModel(oModel);
		sTable.bindRows("/modelData");
		sTable.placeAt("Emp");

	}).fail(function () { alert("failed"); });
	// Note: the error handler must be attached with .fail(); a third
	// argument to $.getJSON is ignored by jQuery.
</script>

<div id="Emp"></div>

Note: If you have placed the libraries in a different location, change the URLs accordingly before pasting the code into the Script Editor.

Once everything is in place, you should see a grid similar to the one displayed above.

SharePoint 2013 Hosting – ASPHostPortal.com :: Including a Blog to an Existing Site (Without Creating a New Site)

Usually when you want a blog in SharePoint 2013, you create a brand new site collection or a subsite using the Blog site template.

At times it would just be simpler (or nicer) to have a blog right inside another site, such as a team site or departmental site, without the need for yet another site collection or subsite to maintain. No problem.

You can easily add a blog to an existing site using PowerShell. Fire up the SharePoint 2013 Management Shell and enter the following command:

Enable-SPFeature -Identity "BlogContent" -Url <<the url of the site>>

Next, navigate to the site in the browser and edit the home page.

From the Insert tab on the top ribbon, click on Web Part.

Select Posts from the Apps category and click on the Add button.

The blog posts now appear on your home page.

Click on the sample blog post.

You now have a blog in your existing site without having a separate subsite!


ASPHostPortal.com to Launch New Data Center in Hong Kong

ASPHostPortal.com launched its new data center in Hong Kong in November 2014.

ASPHostPortal is known for reliable and trustworthy hosting solutions. Alongside its 99.9 percent average uptime, ASPHostPortal also operates outstanding data centers that underpin its high-speed, high-performance hosting packages. The new Hong Kong data center has space for more than 10,000 physical servers and allows customers to satisfy their data residency needs.

The new facility will provide customers and their end users with ASPHostPortal.com services that meet in-country data residency needs. It will also complement the existing ASPHostPortal.com Asia (Singapore) data center. The Hong Kong data center will offer the full range of ASPHostPortal.com hosting infrastructure services, including bare metal servers, virtual servers, storage, and networking.

ASPHostPortal offers a mixture of affordability and dependability, with an excellent uptime history: many months this year have seen more than 99.9% average uptime. The new data center will allow clients to replicate or integrate data with the Asia data center at high transfer speeds, with unmetered bandwidth (at no charge) between facilities.

“With ASPHostPortal, picking the data center location is a free feature available to all customers. The customer simply chooses US, Europe, Asia, or Australia. It is simple, intuitive, and convenient. The choice is free, and there will never be any additional cost associated with it,” said Dean Thomas, Manager at ASPHostPortal.com.

Customers who have questions about the feature, or about which option best suits their purposes, should feel free to contact ASPHostPortal via its 24/7/365 customer support team, which can help select the option that best fits their needs.

To find out more about new data center in Hong Kong, please visit http://asphostportal.com/Hosting-Data-Center-HongKong.

About ASPHostPortal.com:

ASPHostPortal.com is a hosting company that specializes in Windows and ASP.NET-based hosting. Services include shared web hosting, reseller hosting, and SharePoint hosting, with specialties in ASP.NET, SQL Server, and architecting highly scalable solutions. As a leading small- to mid-sized business hosting provider, ASPHostPortal.com strives to supply the most technologically advanced hosting solutions available to all customers across the world. Security, reliability, and performance are at the core of its hosting operations, ensuring that every site and application hosted is highly secured and performs at its best.

SharePoint 2013 Hosting with ASPHostPortal.com – How To Migrate Content Database From SharePoint 2010 To SharePoint 2013

Migrate Content Database From SharePoint 2010 To SharePoint 2013

In this article, we will take you through the database migration process from SharePoint 2010 to SharePoint 2013. An overview of the SharePoint database migration process to a new server is available on the ShareGate website.


Step-by-step to migrate content database

  • Step 1: Make two servers available for the process. Both servers should run comparable environments; for instance, Server 1 runs Windows Server 2008, SQL Server 2008, and SharePoint 2010, while Server 2 runs Windows Server 2008, SQL Server 2008, and SharePoint 2013.
  • Step 2: Begin by backing up the data from Server 1. To do this:

a) On the SharePoint 2010 server, pick the content database of the web application (port) you want to back up. Right-click it and, from the options that appear, choose Tasks → Back Up.
b) In the window that opens, click ‘Add’.
c) Copy the location shown under the ‘Destination’ field and save it in a notepad for later use.

  • Step 3: On Server 2, launch SharePoint 2013 and create a new web application under any port. If you are not sure, pick port 88.
  • Step 4: Once the new web application has been created, perform the following steps:

a) Under Central Administration, select Application Management → Manage Content Databases.
b) Under the newly created web application, select the ‘Remove content database’ checkbox. Click OK and save.
c) Under the Content Database section, you should now see a message that reads, “There are no items to show in this view.”

  • Step 5: The next step is to restore the database from SharePoint 2010 to the new server. To accomplish this, copy the WSS_Content.bak file from Server 1 to the desktop or any convenient location on the computer hosting Server 2.
  • Step 6: On Server 2, launch SQL Server Management Studio, right-click the Databases node, and from the options choose ‘Restore Database’.
  • Step 7: A new ‘Restore Database’ window opens. Here, select the ‘From Device’ radio button and browse through your system folders to select the WSS_Content.bak file that we copied in Step 5. Click OK.
  • Step 8: Next, under the ‘Options’ tab of the Restore Database window, check the box that reads, “Overwrite the existing database (WITH REPLACE)”. Press OK to continue. A message box appears that confirms the operation. Press OK to close this box.
  • Step 9: Open SharePoint 2013 and navigate to Central Administration → Application Management → Manage Content Databases. You should now see the WSS_Content database displayed here.
  • Step 10: At the top of the window, you will see a message. Click on the ‘Start now’ link to continue.
  • Step 11: In the subsequent window, click on the ‘Upgrade the site collection’ option. You will be shown a message box. Click ‘I’m ready’ to continue.
  • Step 12: The upgrade process will now begin. This typically takes a few minutes. Once it is done, you will see a message that reads, “Upgrade Completed Successfully”.

This completes the process. Your content database migration from SharePoint 2010 to SharePoint 2013 has been completed successfully.
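The check-and-attach portion of this process can also be scripted. As a sketch using the standard SharePoint cmdlets (the database name, SQL server name, and web application URL below are examples):

```powershell
# Check the restored database for missing features or customizations first
Test-SPContentDatabase -Name "WSS_Content" -WebApplication "http://server2:88"

# Then attach it to the new web application
Mount-SPContentDatabase -Name "WSS_Content" -DatabaseServer "SQLSERVER" -WebApplication "http://server2:88"
```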

SharePoint 2013 Hosting – ASPHostPortal.com :: Event ID 6398 AppFabric Distributed Cache Error

SharePoint 2013 Event ID 6398 AppFabric Distributed Cache Error

Recently, I started seeing repeated errors with Event ID 6398 and a description of:

The Execute method of job definition Microsoft.Office.Server.UserProfiles.LMTRepopulationJob (ID 581fc80e-f7fb-4b3b-99cd-7affa208f57b) threw an exception. More information is included below. Unexpected exception in FeedCacheService.BulkLMTUpdate: Unable to create a DataCache. SPDistributedCache is probably down

This error occurred every 5 minutes as the User Profile Service – Feed Cache Repopulation Job ran, and it also prevented anything from populating the My Sites Newsfeed section. The Newsfeed page would only return “We’re still collecting the latest news. You may see more if you try again a little later.” I tried to follow a multitude of blog posts, forum posts, and articles on repairing the AppFabric Distributed Cache service, but was unable to correct the error.

My next step was to try to get the AppFabric service back to its initial setup.

  • Remove the AppFabric installation from Add/Remove Programs.
  • There is more information on this process in this MSDN article; also follow the link from there to clean up any remaining AppFabric settings, either manually or using the Cleanup Tool they provide.
  • After rebooting, I downloaded the AppFabric 1.1 installer from here.

However, do not install it manually. Instead, use the prerequisite installer on the SharePoint 2013 setup disc to install and configure AppFabric with the following command:

prerequisiteinstaller.exe /appFabric:C:\pathto\WindowsServerAppFabricSetup_x64.exe

Now you can continue with the initial configuration of the AppFabric service. I ran the following commands from the SharePoint 2013 Management Shell as Administrator:

$instanceName = "SPDistributedCacheService Name=AppFabricCachingService"

$serviceInstance = Get-SPServiceInstance | ? {($_.service.tostring()) -eq $instanceName -and ($_.server.name) -eq $env:computername}

$serviceInstance.Provision()

Then run:

Add-SPDistributedCacheServiceInstance

You should see the Distributed Cache service running in Manage Services on Server in Central Administration, and also see the AppFabric Caching Service running in Services. If you don’t, try Remove-SPDistributedCacheServiceInstance and then Add-SPDistributedCacheServiceInstance again. After completing this process, I was able to go back to My Sites and see the Newsfeed as it should be, with no more errors in the Event Log.
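To double-check the cache host itself, the AppFabric administration module can also be used; a brief sketch (run in the Caching Administration Windows PowerShell, output varies by farm):

```powershell
# Connect to the cache cluster configuration for this host
Use-CacheCluster

# List cache hosts and their service status; a healthy host reports UP
Get-CacheHost
```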

NewsFeed Working

I would love to know why this occurred since I was not working on anything with the Caching service prior to the errors; however, I hope this helps someone else caught up in this problem.

SharePoint 2013 Hosting – ASPHostPortal.com :: Integrating WordPress Website Into SharePoint 2013

Within this blog post, I’ll go over how we can easily integrate a WordPress blog with your SharePoint site with the help of a SharePoint 2013 workflow.

Using the SharePoint 2013 REST API and a simple SharePoint Designer workflow, we will fetch the most recent posts from the blog site and add them to a SharePoint list. Continue reading

SharePoint 2013 Hosting – ASPHostPortal.com :: Plan the Deployment of Farm Solutions for SharePoint 2013

How To Plan the Deployment of Farm Solutions for SharePoint 2013?

While everyone is talking about Apps, there are still significant investments in Full Trust Solutions, and I am sure that many on-premises deployments will want to carry these forward when upgrading to SharePoint 2013. The new SharePoint 2013 upgrade model allows sites to continue to run in 2010 mode after upgrading, and each Site Collection explicitly has to be upgraded individually.

This is not the way it worked in 2010 with Visual Upgrade; this time there are actually both a 14 and a 15 Root folder deployed, and all the Features and Layout files from SharePoint 2010 are deployed as part of the 2013 installation.

For those of you new to SharePoint, the root folder is where SharePoint keeps most of its application files and the default location for this is “C:\Program Files\Common Files\Microsoft Shared\Web Server Extensions\[SharePoint Internal Version]”, where the versions for the last releases have been 60 (6.0), 12, 14, and now 15. The location is also known as “The xx hive.”

This is great in an upgrade scenario, where you may want to do a platform upgrade first or only want to share the new features of 2013 with a few users while maintaining an unchanged experience for the rest of the organization.  This also gives us the opportunity to have different functionality and features for sites running in 2010 and 2013 mode.  However, this requires some extra thought in the development and deployment process that I will give an introduction to here. Because you can now have Sites running in both 2010 and 2013 mode, SharePoint 2013 introduces a new concept of a Compatibility Level.  Right now it can only be 14 or 15, but you can imagine that there is room for growth.  This Compatibility Level is available at Site Collection and Site (web) level and can be used in code constructs and PowerShell commands.

I will start by explaining how you use it while building and deploying wsp-files for SharePoint 2013 and then finish off with a few things to watch out for and some code tips.

Deployment Considerations

If you take your wsp-files from SharePoint 2010 and just deploy these with Add-SPSolution -> Install-SPSolution as you did in 2010, then SharePoint will assume it is a 2010 solution or a “14” mode solution. If the level is not specified in the PowerShell command, it determines the level based on the value of the SharePointProductVersion attribute in the Solution manifest file of the wsp-package.  The value can currently be 15.0 or 14.0. If this attribute is missing, it will assume 14.0 (SharePoint 2010) and since this attribute did not exist in 2010, only very well informed people will have this included in existing packages.

For PowerShell cmdlets related to installing solutions and features, there is a new parameter called CompatibilityLevel. This can override the settings of the package itself and can assume the following values: 14, 15, New, Old, All and “14,15” (the latter currently also means All).

The parameter is available for Install-SPSolution, Uninstall-SPSolution, Install-SPFeature and Uninstall-SPFeature.  There is no way to specify “All” versions in the package itself – only the intended target – and therefore these parameters need to be specified if you want to deploy to both targets.
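As a sketch of what a dual-target deployment might look like (the package name is hypothetical):

```powershell
Add-SPSolution -LiteralPath "C:\Deploy\Contoso.Intranet.wsp"

# Deploy to both the 14 and 15 root folders in a single command
Install-SPSolution -Identity "Contoso.Intranet.wsp" -GACDeployment -CompatibilityLevel All
```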

It is important to note that Compatibility Level impacts only files deployed to the Templates folder in the 14/15 Root folder.

That is:  Features, Layouts-files, Images, ControlTemplates, etc.

This means that files outside of this folder (e.g. a WCF Service deployed to the ISAPI folder) will be deployed to the 15/ISAPI no matter what level is set in the manifest or PowerShell.  Files such as Assemblies in GAC/Bin and certain resource files will also be deployed to the same location regardless of the Compatibility Level.

It is possible to install the same solution in both 14 and 15 mode, but only if it is done in the same command – specifying Compatibility Level as either “All” or “14,15”.  If it is first deployed with 14 and then with 15, it will throw an exception.  It can be installed with the -Force parameter, but this is not recommended as it could hide other errors and lead to an unknown state for the system.

The following three diagrams illustrate where files go depending on the parameters and attributes set. Thanks to the Ignite Team for creating these; I made some small changes from the originals to emphasize a few points.

When retracting the solutions, there is also an option to specify Compatibility Level.  If you do not specify this, it will retract all – both 14 and 15 files if installed.  When deployed to both levels, you can retract one, but the really important thing to understand here is that it will not only retract the files from the version folder, but also all version-neutral files – such as Assemblies, ISAPI-deployed files, etc. – leaving only the files from the Root folder you did not retract.

To plan for this, my suggestion would be the following during development/deployment:

  • If you want to only run sites in 2013 mode, then deploy the Solutions with CompatibilityLevel 15 or SharePointProductVersion 15.0.
  • If you want to run with both 2010 and 2013 mode, and want to share features and layout files, then deploy to both (All or “14,15”).
  • If you want to differentiate the files and features that are used in 2010 and 2013 mode, then the solutions should be split into two or three solutions:
  1. One solution (“Xxx – SP2010”), which contains the files and features to be deployed to the 14 folder for 2010 mode, including code-behind (for things like feature activation and Application pages), but excluding shared assemblies and files.
  2. One solution (“Xxx – SP2013”), which contains the files and features to be deployed to the 15 folder for 2013 mode, including code-behind (for things like feature activation and Application pages), but excluding shared assemblies and files.
  3. One solution (“Xxx – Common”), which contains shared files (e.g. common assemblies or web services). This solution would also include all WebApplication scoped features such as bin-deployed assemblies and assemblies with SafeControl entries.
  • If you only want to have two solutions for various reasons, the Common solution can be joined with the SP2013 solution as this is likely to be the one you will keep the longest.

The assemblies being used as code-files for the artifacts in SP2010 and SP2013 need to have different names or at least different versions to differentiate them. Web Parts need to go in the Common package and should be shared across the versions, however the installed Web Part templates can be unique to the version mode.

Things to watch out for…

There are a few issues that are worth being aware of that may be fixed in future updates, but you’ll need to watch out for these currently.  I’ve come across an issue where installing the same solution in both levels can go wrong.  If you install it with level All and then uninstall it with level 14 two times, the deployment logic will think that it completely removed the solution, but the files in the 15/Templates folder will still be there.

To recover from this, you can install it with -Force in the orphan level and then uninstall it.  Again, it is better not to get into this situation.

Another scenario that can get you in trouble is if you install a solution in one Compatibility Level (either through PowerShell Parameter or manifest file attribute) and then uninstall with the other level.  It will then remove the common files but leave the specific 14 or 15 folder files and display the solution as fully retracted.

Unfortunately there is no public API to query which Compatibility Levels a package is deployed to.  So you need to get it right the first time or as quickly as possible move to native 2013 mode and packages (this is where we all want to be anyway).

Code patterns

An additional tip is to look for hard-coded paths in your custom code, such as _layouts and _controltemplates.  The SPUtility class has been updated with static methods to help you resolve the correct location based on the upgrade status of the Site.  For example, SPUtility.ContextLayoutsFolder will give you the path to the correct layouts folder.  See the reference article on SPUtility properties for more examples.

Round up

I hope this gave you an insight into some of the things you need to consider when deploying Farm Solutions for SharePoint 2013. There are lots of scenarios that are not covered here. If you find some, please share them or share your concerns, and I will try to address them in comments or an additional post.

SharePoint 2013 Hosting – ASPHostPortal.com :: Design Manager – Transform HTML to Master Page

The first thing that caught my eye when I logged on to SharePoint 2013 was the Design Manager. I briefly introduced it previously. In the past I have focused on learning to brand SharePoint using CSS, and of course the tool most of us used to integrate our CSS or new Master Pages was SharePoint Designer.


SharePoint Designer is no longer the preferred tool. Continue reading

SharePoint 2013 Hosting – ASPHostPortal.com :: Caching SharePoint Data Locally with SPServices and HTML5’s Web Storage

Caching SharePoint Data Locally with SPServices and HTML5’s Web Storage


Even though the SOAP services are fast, sometimes they just aren’t fast enough. In some of those cases, it may make sense to store some of your data in the browser’s Web storage so that it’s there on the client during a session or across sessions. Web storage is an HTML5 capability that is available in virtually all browsers these days, even Internet Explorer 8.

The best candidates for this type of storage (IMO) are list contents that are used as references and that don’t have a high number of changes. As an example, you might decide to store a list of countries in Web storage rather than loading them from the Countries list every time a page loads. Even though the read from the list is fast, it has to take *some* time. There’s the retrieval time, and then there is also any processing time on the client side. For instance, if you have dozens of fields per country and you need to load them into a complex JavaScript structure, that takes time, too. If those data chores are making your page loads seem laggy, then consider using local storage.

There are three main ways you can store data locally to improve performance. I’m not going to go into all of their intricacies, but I will give you some rules of thumb. So before you dive in, do some studying about how it all works.

Cookies

For small pieces of data, you should consider using cookies. They can store up to 4KB of data each.
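As an illustration of reading a named value back out, here is a small parser written as a pure function over a document.cookie-style string (the function name is mine, not a browser API):

```javascript
// Parse a "document.cookie"-style string such as "a=1; b=2" and return
// the decoded value of the named cookie, or null if it is not present.
function getCookieValue(cookieString, name) {
  var pairs = cookieString.split("; ");
  for (var i = 0; i < pairs.length; i++) {
    var eq = pairs[i].indexOf("=");
    if (pairs[i].substring(0, eq) === name) {
      return decodeURIComponent(pairs[i].substring(eq + 1));
    }
  }
  return null;
}

// In the browser you would call it as getCookieValue(document.cookie, "country")
```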

Session Storage

Session storage is the flavor of Web storage that allows you to store data just for the duration of the session. Think of a session as a browser lifespan: once you close the browser, the session storage is gone. Both session storage and local storage sizes are limited by the browser you are using. If you want to know whether Web storage is available in your browser of choice, take a look at “Can I use”. The amount of storage each browser gives you is a moving target, but it’s allocated per domain.

Local Storage

Local storage takes Web storage one step further. The data stored in local storage persists across browser sessions. In fact, it usually won’t go away until you explicitly delete it. (Consider this fact when you are gobbling up local storage in your development process.)
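Because local storage never expires on its own, a common pattern is to stamp each cached item with the time it was written and treat it as stale after some maximum age. A minimal sketch (the function names are mine; storage can be window.localStorage, window.sessionStorage, or any object with the same getItem/setItem/removeItem API):

```javascript
// Store a value in Web storage along with the time it was written.
function cacheSet(storage, key, value) {
  storage.setItem(key, JSON.stringify({ savedAt: Date.now(), value: value }));
}

// Return the cached value, or null if it is missing or older than maxAgeMs.
function cacheGet(storage, key, maxAgeMs) {
  var raw = storage.getItem(key);
  if (raw === null) {
    return null;
  }
  var entry = JSON.parse(raw);
  if (Date.now() - entry.savedAt > maxAgeMs) {
    storage.removeItem(key); // stale: clean it up so it isn't read again
    return null;
  }
  return entry.value;
}
```

Since session storage and local storage share the same API, the same pair of functions covers both flavors.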

So how?

The trick with using these storage mechanisms is managing the data you’ve put in local storage as a cache. That data can go past its expiration date, either because some changes were made to the underlying data source or the cache itself has become corrupted. The latter is more difficult to deal with, so here I’ll focus on the former.

JavaScript – like most other programming languages – lends itself to building wrapper functions that add additional layers of abstraction on top of underlying functionality. Too many levels of abstraction can make things confusing, but with careful thought and smart code writing, you can build abstractions that serve you well.

In a recent client project, I found that as list data volumes were increasing, the pages in my SPServices- and KnockoutJS-driven application were loading more and more slowly. So even if I wanted to use REST, I couldn’t, nor do I believe that it would automatically make anything faster. If we had better servers running things, that might make a huge difference, but we have no control over that in this environment.

What I wanted was a reusable wrapper around SPGetListItemsJson (which itself is a wrapper around the SOAP List Web Service’s GetListItemChangesSinceToken and SPService’s SPXmlToJson) that would let me check local storage for a cached data source (list data), read either the entire data source or just the deltas from the SharePoint list, load the data into my application, and then update the cache appropriately.

The getDataSource function below is what I’ve come up with so far. There’s some setup to use it, so let me explain the parameters it takes:

  •  ns – This is the namespace into which you want to load the data. In my applications these days, I usually have a namespace defined that looks something like ProjectName.SubProjectName.DataSources. The “namespace” is simply a complex JavaScript object that contains most of my data and functions.
  • dataSourceName – The name that I want to give the specific data source within ns. In my example above with the Countries list I would use “Countries”.
  • params – This is the big magilla of the parameters. It contains all of the values that will make my call to SPGetListItemsJson work.
  • cacheItemName – This is the name of the item I want to store in Web storage. In the Countries example, I would use “ProjectName.SubProjectName.DataSources.Countries”.
  • storageType – Either “localStorage” or “sessionStorage”. If I expect the data to change regularly, I’d probably use sessionStorage (this gives me a clean data load for each session). If the data is highly static, I’d likely use localStorage.

And here’s the code:

/* Example:
getDataSource(ProjectName.SubProjectName.DataSources, "Countries", {
  webURL: "/",
  listName: "Countries",
  CAMLViewFields: "<ViewFields>" +
      "<FieldRef Name='ID'/>" +
      "<FieldRef Name='Title'/>" +
      "<FieldRef Name='Population'/>" +
      "<FieldRef Name='CapitalCity'/>" +
      "<FieldRef Name='Continent'/>" +
    "</ViewFields>",
  CAMLQuery: "<Query>" +
      "<OrderBy><FieldRef Name='ID'/></OrderBy>" +
    "</Query>",
  CAMLRowLimit: 0,
  changeToken: oldToken,
  mapping: {
      ows_ID: {"mappedName": "ID", "objectType": "Counter"},
      ows_Title: {"mappedName": "Title", "objectType": "Text"},
      ows_Population: {"mappedName": "Population", "objectType": "Integer"},
      ows_CapitalCity: {"mappedName": "CapitalCity", "objectType": "Text"},
      ows_Continent: {"mappedName": "Continent", "objectType": "Lookup"}
    }
  }, "ProjectName.SubProjectName.DataSources.Countries", "localStorage"
)
*/

function getDataSource(ns, dataSourceName, params, cacheItemName, storageType) {

  var dataReady = $.Deferred();

  // Get the data from the cache if it's there
  ns[dataSourceName] = JSON.parse(window[storageType].getItem(cacheItemName)) || new DataSource();
  var oldToken = ns[dataSourceName].changeToken;
  params.changeToken = oldToken;

  // Read whatever we need from the dataSource
  var p = $().SPServices.SPGetListItemsJson(params);

  // Process the response
  p.done(function() {
    var updates = this.data;
    var deletedIds = this.deletedIds;
    var changeToken = this.changeToken;

    // Handle updates/new items
    if (oldToken !== "" && updates.length > 0) {
      for (var i = 0; i < updates.length; i++) {
        var thisIndex = ns[dataSourceName].data.binaryIndexOf(updates[i], "ID");
        // If the item is in the cache, replace it with the new data
        if (thisIndex > -1) {
          ns[dataSourceName].data[thisIndex] = updates[i];
          // Otherwise, add the new item to the cache
        } else {
          ns[dataSourceName].data.splice(-thisIndex, 0, updates[i]);
        }
      }
    } else if (oldToken === "") {
      ns[dataSourceName] = this;
    }
    // Handle deletes
    for (var i = 0; i < deletedIds.length; i++) {
      var thisIndex = ns[dataSourceName].data.binaryIndexOf({
        ID: deletedIds[i]
      }, "ID");
      // Only splice if the item is actually present in the cache
      if (thisIndex > -1) {
        ns[dataSourceName].data.splice(thisIndex, 1);
      }
    }
    // Save the updated data back to the cache
    if (oldToken === "" || updates.length > 0 || deletedIds.length > 0) {
      // Save the new changeToken
      ns[dataSourceName].changeToken = changeToken;
      window[storageType].setItem(cacheItemName, JSON.stringify(ns[dataSourceName]));
    }
    dataReady.resolve();
  });
  return dataReady.promise();
}

Some of the nice things about this function:

  • It’s generic. I can call it for any list-based data source in SharePoint. (I started out building it for one data source and then generalized it.)
  • I can call it during a page life cycle to refresh the application data anytime I want, or on a schedule, perhaps with setInterval.
  • I can set a lot of parameters to cover a lot of different use cases.
  • Each time I call it, it updates the cache (if it needs to) so that the next time I call it I get a “fresh” copy of the data.
  • It only loads the data that it needs to, by using the GetListItemChangesSinceToken capabilities.

And some downsides:

  • Since I know what data I’m working with in my application and that it will fit into Web storage easily, I’m not worrying about failed saves.
  • If the cache does become corrupt (not something I expect, but there’s always Murphy), I’m not handling it at all.

If you decide to try this out, you’ll need a few auxiliary functions as well:
/* DataSource constructor */
function DataSource() {
  this.changeToken = "";
  this.mapping = {};
  this.data = [];
  this.deletedIds = [];
}

/** Adapted from http://oli.me.uk/2013/06/08/searching-javascript-arrays-with-a-binary-search/
 *
 * Performs a binary search on the host array.
 * @param {*} searchObj The object to search for within the array.
 * @param {*} searchElement The element in the object to compare. The objects in the array must be sorted by this element.
 * @return {Number} The index of the element. If the item is not found, the function returns a negative index where it should be inserted (if desired).
 */
Array.prototype.binaryIndexOf = function(searchObj, searchElement) {

  var minIndex = 0;
  var maxIndex = this.length - 1;
  var currentIndex;
  var currentElement;

  var searchValue = searchObj[searchElement];

  while (minIndex <= maxIndex) {
    currentIndex = (minIndex + maxIndex) / 2 | 0;
    currentElement = this[currentIndex];

    if (currentElement[searchElement] < searchValue) {
      minIndex = currentIndex + 1;
    } else if (currentElement[searchElement] > searchValue) {
      maxIndex = currentIndex - 1;
    } else {
      return currentIndex;
    }
  }

  return ~maxIndex;
};
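The negative return value deserves a note: because the insertion point equals maxIndex + 1 when the loop ends, ~maxIndex is exactly -(insertion point), so a caller recovers the insertion index by negating the result, which is what getDataSource's splice(-thisIndex, 0, ...) relies on. A small worked example of the arithmetic (the three-element array is hypothetical):

```javascript
// For a sorted array of objects with IDs [2, 5, 9]:
//   searching for ID 5 finds it and returns index 1;
//   searching for ID 7 misses with maxIndex ending at 1, so the function
//   returns ~1 === -2, and negating that gives 2, the slot where ID 7
//   should be inserted (between 5 and 9).
var notFoundResult = ~1;          // what binaryIndexOf returns for ID 7
var insertAt = -notFoundResult;   // recover the insertion point

// Caveat: when the search value sorts before every element, maxIndex ends
// at -1 and ~(-1) === 0, which a caller cannot tell apart from "found at 0".
```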