Tag Archives: SharePoint 2013 Hosting Recommendation

SharePoint 2013 Hosting – ASPHostPortal.com :: Social Features of SharePoint 2013

Social Features of SharePoint 2013

Out of the box, SharePoint 2013 has some pretty neat capabilities to support project management initiatives in your organization. Some of the enhancements will greatly improve communication on a project so that everybody can collaborate effectively!

Items like Newsfeeds enable immediate discussions, features connecting SharePoint to Microsoft Office help people collaborate more quickly and easily, and mobile improvements allow team members to stay tuned in while on the go.

SharePoint’s new features are strongly influenced by, or in some cases lifted from, the top social networks. For instance, the updated My Sites feature has a strong microblogging component, complete with likes, hashtags, @mentions, and other social tools you’ll recognize from Facebook, Twitter, and Google+. Community sites, discussion forums where enterprise users share information and answer colleagues’ questions, bring a similar social feel to collaborative computing.

SharePoint 2013’s People Card contains your contact information, as well as pictures, status updates, and activity feeds from SharePoint, Facebook, and LinkedIn. And SkyDrive Pro, the premium version of Microsoft’s cloud storage and syncing service, allows users to share files across SharePoint.

In earlier versions of SharePoint, each user had a profile and a personal site (e.g., My Site). The 2013 version of SharePoint splits My Site into three sections: Newsfeed, SkyDrive, and Sites. A global navigation bar provides access to each section. These social features are tightly integrated into SharePoint 2013, so you no longer need to launch a Web browser to access them.

Previously, enterprise social networking on SharePoint required either extensive customization or the use of an add-on product such as NewsGator Social Sites. While still leaving room for third-party products to add features on top of the platform, Microsoft has now made SharePoint more of an enterprise social network in its own right.

Cheap and Recommended SharePoint 2013 Hosting

ASPHostPortal.com offers a hosting plan that is a perfect fit for anyone getting started with SharePoint. ASPHostPortal is a leading provider of Windows hosting and affordable SharePoint hosting, and proudly works to help grow the backbone of the Internet: the millions of individuals, families, micro-businesses, small businesses, and fledgling online businesses. ASPHostPortal supports the latest Microsoft and ASP.NET technology, such as WebMatrix, WebDeploy, Visual Studio 2015, .NET 5/ASP.NET 4.5.2, ASP.NET MVC 6.0/5.2, Silverlight 6, and Visual Studio LightSwitch, and guarantees the highest quality product, top security, and unshakeable reliability, with carefully chosen high-quality servers, networking, and infrastructure equipment to ensure the utmost reliability.

SharePoint 2013 Hosting – ASPHostPortal.com : How to Enable BreadCrumb in SharePoint 2013?

How to Enable BreadCrumb in SharePoint 2013?

You probably remember that the breadcrumb option next to the “Site Actions” button in SharePoint 2010 is really handy for navigating up or down with a single click. You may have noticed that in SharePoint 2013 this option appears to be removed; in fact it is just hidden in the master page. Here I will show you how to get that option back in your master page and make your life simpler.

The good news is that Microsoft didn’t remove it from SharePoint 2013; it’s just hidden in the Seattle master page.

Step by Step

  • Open your site with SharePoint Designer
  • Navigate to All Files -> _catalogs -> master page
  • Edit Seattle.master in advanced mode and copy all the code
  • By default it’s not possible to edit the original master. To create a new one, click File -> Blank Master Page
  • Check out the new master, edit it in advanced mode, delete all the existing code and paste in the code copied from the original Seattle.master
  • Search for
<div class="ms-breadcrumb-dropdownBox" style="display:none;">
  • Delete the CSS attribute
style="display:none;"
  • Two lines below, change the Visible attribute of the SharePoint:PopoutMenu control to true
  • After editing your code it should look like this
<div class="ms-breadcrumb-dropdownBox"><SharePoint:AjaxDelta id="DeltaBreadcrumbDropdown" runat="server">
	<SharePoint:PopoutMenu
		Visible="true"
		runat="server"
		ID="GlobalBreadCrumbNavPopout"
		IconUrl="/_layouts/15/images/spcommon.png?rev=27"
		IconAlt="<%$Resources:wss,master_breadcrumbIconAlt%>"
		ThemeKey="v15breadcrumb"
		IconOffsetX="215"
		IconOffsetY="120"
		IconWidth="16"
		IconHeight="16"
		AnchorCss="ms-breadcrumb-anchor"
		AnchorOpenCss="ms-breadcrumb-anchor-open"
		MenuCss="ms-breadcrumb-menu ms-noList">
		<div class="ms-breadcrumb-top">
			<asp:Label runat="server" CssClass="ms-breadcrumb-header" Text="<%$Resources:wss,master_breadcrumbHeader%>" />
		</div>
<asp:ContentPlaceHolder id="PlaceHolderTitleBreadcrumb" runat="server">
<SharePoint:ListSiteMapPath runat="server"
SiteMapProviders="SPSiteMapProvider,SPContentMapProvider"
RenderCurrentNodeAsLink="false"
PathSeparator=""
CssClass="ms-breadcrumb"
NodeStyle-CssClass="ms-breadcrumbNode"
CurrentNodeStyle-CssClass="ms-breadcrumbCurrentNode"
RootNodeStyle-CssClass="ms-breadcrumbRootNode"
NodeImageOffsetX="217"
NodeImageOffsetY="210"
NodeImageWidth="16"
NodeImageHeight="16"
NodeImageUrl="/_layouts/15/images/spcommon.png?rev=27"
RTLNodeImageOffsetX="199"
RTLNodeImageOffsetY="210"
RTLNodeImageWidth="16"
RTLNodeImageHeight="16"
RTLNodeImageUrl="/_layouts/15/images/spcommon.png?rev=27"
HideInteriorRootNodes="true"
SkipLinkText=""/>
</asp:ContentPlaceHolder>
</SharePoint:PopoutMenu>
</SharePoint:AjaxDelta>
</div>
  • If you are using one of the themes from SharePoint, your breadcrumb icon will not appear as it should. To get it back, add the script below before the closing tag
<script type="text/javascript">
document.getElementById("GlobalBreadCrumbNavPopout-anchor").innerHTML='<img alt="Navigate Up" src="/_layouts/15/images/spcommon.png?rev=27">';</script>
  • Save the modified master page, check it in and publish the major version
  • Open your SharePoint site, go to Settings -> Site Settings -> Master Page under Look and Feel, and select the new master page for both the Site Master Page and System Master Page options
  • You will see a new icon on the left side of the menu.

SharePoint 2013 Hosting – ASPHostPortal.com :: SharePoint 2013 Management Tips

SharePoint 2013 boasts a simpler user interface than its predecessors, but several new features make it a powerful tool not only for collaboration, but for data analysis and integration as well. Although most of these features are built-in or enabled by default, some may not be obvious at first glance or will require a few steps of configuration to avoid user confusion.

From improved site creation to increased social media capabilities, these expert tips will help administrators and enterprises get the most out of what SharePoint 2013 has to offer.

The right way to configure co-authoring in SharePoint 2013

With document co-authoring enabled by default in SharePoint 2013, multiple users can edit a document at a time without overwriting previous changes. Administrators will still need to configure SharePoint so users aren’t required to check documents out of the document library, and document versioning — which is disabled by default — will also need to be configured appropriately.
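
If you prefer to script these settings, here is a minimal sketch using the server-side object model; the web URL, library name, and version limit below are placeholders to adjust for your own environment.

Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue

# Placeholder site and library (adjust for your environment)
$web  = Get-SPWeb "http://sharepoint.contoso.local/sites/projects"
$list = $web.Lists["Documents"]

# Co-authoring works only when users are NOT forced to check documents out
$list.ForceCheckout = $false

# Versioning is off by default; enable major versions and keep a sensible limit
$list.EnableVersioning  = $true
$list.MajorVersionLimit = 10

$list.Update()
$web.Dispose()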

SharePoint 2013’s tight social media integration

Unlike previous versions of SharePoint, which had limited social media capabilities, SharePoint 2013 integrates microblogging features found on Twitter and Facebook, including hashtags, follows, mentions and likes. Enterprises can leverage the increased connectivity of the SharePoint community as a way to share knowledge and improve team productivity.

How to synchronize SharePoint 2013 lists with Outlook 2013

Support for various types of SharePoint 2013 lists is built into Outlook 2013, allowing users to access their SharePoint data directly from the email client. However, administrators would do well to train users on accessing SharePoint data through Outlook in order to avoid any accidental crossover between personal and team information.

How the Design Manager in SharePoint 2013 modernizes site creation

Site creation is much more flexible in SharePoint 2013, thanks to the Design Manager. This new publishing feature not only allows users to upload designs created in the HTML or CSS design tools of their choice, but also features improved themes, design packages and device channels that render sites differently for mobile devices.

How SharePoint 2013 analytics enables real-time decision making

Accessing analytics is easier than ever in SharePoint 2013. OData support in Business Connectivity Services provides real-time access to data from multiple sources, and various Web protocols are available for query and update operations. Another improvement over past iterations of SharePoint is the fact that reports can now be generated in Excel.

SharePoint 2013 Hosting – ASPHostPortal.com :: Data Loss Prevention in SharePoint 2013

Data Loss Prevention in SharePoint 2013

A problem that has plagued many organizations over the years is how best to protect sensitive data in their SharePoint environments. This includes things like credit card, driver’s license, and Social Security numbers. For SharePoint on-premises deployments, this has required third-party applications to provide this functionality across all locations in the SharePoint farm.

Data Loss Prevention (DLP) is a feature that was first introduced in Exchange (2013 and Online) and is now in SharePoint Online (but not the on-premises version). I covered the initial news on this topic last year and took the chance at the Microsoft Ignite conference to find out whether the functionality predicted then had been realized in production.

DLP is actually a great example of how engineering teams are now working across multiple products rather than in the narrow silos of the past. It’s obvious that a lot of lessons learned from the DLP implementation in Exchange have influenced the implementation in SharePoint Online, which provides protection against the misuse of sensitive data in documents. The documents can be stored in SharePoint or OneDrive for Business libraries.

Compliance officers, paralegals, or others performing a legal audit often need to assess the degree of risk posed by sensitive and personal data stored on SharePoint sites. Data loss prevention (DLP) in SharePoint Online provides you with a way to identify that data, so you can work with document owners to reduce any risk to your organization.

Assign permissions to the eDiscovery Center

Permissions are a big deal. And to run a query in the eDiscovery Center, you need lots of different types of permissions. Assigning permissions to multiple people for Exchange Online, SharePoint Online, the eDiscovery Center, and each site collection could take a long time.

If you only want to use the eDiscovery Center, you might wonder why you need all those other permissions. The eDiscovery Center is a site collection, and like any other site collection, you have to be given permissions to access it. Access to the eDiscovery Center, however, grants no special, automatic access to other site collections, to documents, or to content. To gain access to data stored on other site collections and in OneDrive, you’ll need to be granted admin permissions for each. Multiply that action times the number of admins in your organization, and you can see how it makes sense to optimize the process. Because the security group that you’ll create in the next set of tasks is powerful, choose its members carefully.

Open the eDiscovery Center

  • Sign in to the Office 365 admin portal.
  • In the Admin menu, choose SharePoint.
  • Click the link to the eDiscovery Center on the site collections page. Your eDiscovery Center URL will look similar to this: http://contoso.sharepoint.com/sites/ediscovery.

Create an eDiscovery case

Cases are where you can run queries and export the results for analysis. Follow these steps to create a case:

  • In the eDiscovery Center, click Create new case.
  • Type a title and description for your case.
  • In the Web Site Address box, type the last part of the URL you want for the case. Each case gets its own URL, so feel free to make this as unique and helpful as you’d like.
  • Under Select a template, select eDiscovery Case.
  • Under User Permissions, select whether to keep the same permissions as the parent site or use unique permissions. If specific people need access to this case but not to others, choose Use unique permissions.
  • You can optionally choose to display the site on the Quick Launch or in the top link bar on the eDiscovery Center.
  • Click Create.

Query for sensitive data within a SharePoint site

Go to your case menu by using the URL you created. The case menu is specific to the case you’re working on and won’t show other cases that are in the eDiscovery Center. (When querying for sensitive data, you only need to pay attention to two sections on this page: Queries and eDiscovery Sets.) Querying takes two steps: creating a query and running a query.
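
For example, a query that targets credit card numbers stored under a particular site can look something like the line below. The site URL is a placeholder, and the exact sensitive information type names depend on what is available in your tenant.

SensitiveType:"Credit Card Number" AND Path:"https://contoso.sharepoint.com/sites/finance"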

View and export the results of a query

You’ve made it. Now you can actually see the results of the query you’ve been building this whole time.

  • Click Search to see the results on the bottom of the page.
  • Click the Export button to view the data in a spreadsheet. For more information about exporting your data, see Export eDiscovery content and create reports.
  • Click Save if you want to keep the query.

SharePoint 2013 Hosting – ASPHostPortal.com :: How to Solve “Cannot Connect to Database at SQL Server”

You are engaged in building a new SharePoint Server 2013 farm on Windows Server 2012 servers. You have successfully installed all of the prerequisites and roles and features on all servers that will host SharePoint.

You have installed and configured a new instance of SQL Server 2012. You have configured a database server alias on all SharePoint servers. You have not yet run the configuration wizard, but are now beginning configuration tasks. Your first task is to create the configuration database manually so as to avoid the lengthy GUID that the SharePoint configuration wizard normally appends to the database name. On the batch server, you run New-SPConfigurationDatabase and get the following response in the management shell:

New-SPConfigurationDatabase : Cannot connect to database master at SQL server at [DatabaseAlias]. The database might not exist, or the current user does not have permission to connect to it. At line:1 char:1 + New-SPConfigurationDatabase -DatabaseName DB_Config -DatabaseServer [alias] – … + ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ + CategoryInfo : InvalidData:(Microsoft.Share… urationDatabase:SPCmdletNewSPConfigurationDatabase) [New-SPConfigurationDatabase], SPException + FullyQualifiedErrorId : Microsoft.SharePoint.PowerShell.SPCmdletNewSPConfigurationDatabase
This is a critical problem, as it prevents further installation efforts. Below are my troubleshooting steps and ultimate resolution.
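
For reference, the command being run was along these lines; the database names, server alias, passphrase, and farm account below are placeholders.

$passphrase = ConvertTo-SecureString "Passphrase-Placeholder-1!" -AsPlainText -Force
New-SPConfigurationDatabase -DatabaseName "DB_Config" `
    -DatabaseServer "SPSQLAlias" `
    -AdministrationContentDatabaseName "DB_AdminContent" `
    -Passphrase $passphrase `
    -FarmCredentials (Get-Credential "CONTOSO\sp_farm")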

Troubleshooting
Action: verified spelling of database server.
Results: verified.
Action: verify database alias (cliconfg).
Results: alias configured and appears to be correct.
Action: verify farm service account name and password by adding to managed accounts of another SharePoint farm.
Results: farm service account name successfully added to Managed Accounts of another farm.
Action: verify that farm service account added to SQL Server instance logins and configured with dbcreator and securityadmin roles. Start SQL Server Management Studio, navigate to [name]\security\.
Results: verified.
Action: verify that all SQL server services are running. Started SQL Server Configuration Manager
Results: verified that SQL Server, SQL Server Agent and SQL Server Browser are all running.
Action: verify that TCP/IP protocol is enabled. started SQL Server Configuration Manager
Results: verified that TCP/IP is enabled for SQL Native Client 11.0 Configuration (32 bit), SQL Server Network Configuration and SQL Native Client 11.0 Configuration.
Action: verify that remote connections are enabled. In SQL Server Management Studio, right-click server name instance in tree, select Properties, select Connections, look for Allow remote connections to this server.
Results: was enabled.
Action: check (target) SharePoint server Application log.
Results: found the following events correlated with attempts to run script:
Log Name: Application
Source: Microsoft-SharePoint Products-SharePoint Foundation
Date: [date/time]
Event ID: 5586
Task Category: Database
Level: Error
Keywords:
User: [Administrator]
Computer: [ServerName]
Description:
Unknown SQL Exception -1 occurred. Additional error information
from SQL Server is included below.

A network-related or instance-specific error occurred while
establishing a connection to SQL Server. The server was not
found or was not accessible. Verify that the instance name
is correct and that SQL Server is configured to allow remote
connections. (provider: SQL Network Interfaces, error: 26 -
Error Locating Server/Instance Specified)
Event Xml:…

and

Log Name: Application
Source: Microsoft-SharePoint Products-SharePoint Foundation
Date: [date/time]
Event ID: 3363
Task Category: Database
Level: Critical
Keywords:
User: [Administrator]
Computer: [ServerName]
Description:
Cannot connect to database master at SQL server at
[DatabaseServerAlias]. The database might not exist, or the
current user does not have permission to connect to it.
Event Xml:

Observation: issue likely due to connectivity with database; possible firewall issue – firewall blocking communication.
Action: disable firewall. In Server Manager, select Local Server, then click the link next to Windows Firewall, then click Turn Windows Firewall on or off.
Result: Settings are managed by GPO – can’t change.
Action: run netsh command: netsh firewall set allprofiles state off, then re-run New-SPConfigurationDatabase.
Result: same connection error.
Action: run netsh command: netsh firewall set opmode state off, then re-run New-SPConfigurationDatabase.
Result: same connection error.
Action: attempt ODBC connection using ODBC Data Source Administrator
Results: connection failed.

Action: in the Services control panel, set the Windows Firewall service startup type to Disabled. Restarted server.
Results: On reboot, Firewall disabled.
Action: attempt ODBC connection using ODBC Data Source Administrator.
Results: connection succeeded.
Observation: this is a firewall issue on the SQL Server instance.
Action: started Firewall service, then ran netsh scripts again, this time setting states to On. Then tested ODBC connectivity again.
Results: connection failed.
Action: on the SQL Server, using Windows Firewall with Advanced Security, configured two port-based and two program-based inbound firewall rules: SQL – sqlbrowser.exe, SQL – sqlservr.exe, SQL – TCP 1433 and SQL – UDP 1434. Then tested ODBC connectivity again.
Results: connection failed.
Action: reviewed Firewall log.
Results: all TCP packets sent to SQL Server ports were being dropped.
Action: discussed results with the sysadmin, who noted the impact of the group policy object and recommended that the rules be created in the local GPO instead. This can be verified by viewing the rules under Windows Firewall with… / Monitoring / Firewall.
Results: none of the new firewall rules were listed.
Observation: New firewall rules were being overridden by GPO.
Action: launched the local Group Policy editor. Configured the four rules noted previously. Then tested ODBC connectivity again.
Results: connection succeeded.

Solution
Implement the firewall rules as noted in reference [3]. These may need to be configured via GPO if firewall access is controlled by GPO.
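
On Windows Server 2012 the same inbound rules can be created from PowerShell instead of the Windows Firewall console. This is only a sketch; the rule names and program paths are examples and should be matched to your SQL Server installation.

# Port-based rules: TCP 1433 for the database engine, UDP 1434 for the SQL Server Browser
New-NetFirewallRule -DisplayName "SQL - TCP 1433" -Direction Inbound -Protocol TCP -LocalPort 1433 -Action Allow
New-NetFirewallRule -DisplayName "SQL - UDP 1434" -Direction Inbound -Protocol UDP -LocalPort 1434 -Action Allow

# Program-based rules: verify the paths against your SQL Server installation
New-NetFirewallRule -DisplayName "SQL - sqlservr.exe" -Direction Inbound -Action Allow `
    -Program "C:\Program Files\Microsoft SQL Server\MSSQL11.MSSQLSERVER\MSSQL\Binn\sqlservr.exe"
New-NetFirewallRule -DisplayName "SQL - sqlbrowser.exe" -Direction Inbound -Action Allow `
    -Program "C:\Program Files (x86)\Microsoft SQL Server\90\Shared\sqlbrowser.exe"

# If firewall settings are enforced through Group Policy, create the rules in the GPO instead
# (New-NetFirewallRule accepts a -PolicyStore parameter for this)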

SharePoint 2013 Hosting – ASPHostPortal.com :: Event ID 6398 AppFabric Distributed Cache Error

SharePoint 2013 Event ID 6398 AppFabric Distributed Cache Error

I started seeing repeated errors with Event ID 6398 and the following description:

The Execute method of job definition Microsoft.Office.Server.UserProfiles.LMTRepopulationJob (ID 581fc80e-f7fb-4b3b-99cd-7affa208f57b) threw an exception. More information is included below. Unexpected exception in FeedCacheService.BulkLMTUpdate: Unable to create a DataCache. SPDistributedCache is probably down

This error occurred every five minutes as the User Profile Service – Feed Cache Repopulation Job ran, and it also prevented anything from populating the My Sites Newsfeed section. The Newsfeed page would only return “We’re still collecting the latest news. You may see more if you try again a little later.” I tried to follow a multitude of blog posts, forum posts, and articles on repairing the AppFabric Distributed Cache service and was unable to correct the error.

My next step was to try to get the AppFabric service back to the initial setup.

  • Remove the AppFabric setup from Add/Remove Programs.
  • There is more information on this process in this MSDN article; also follow the link from there to clean up any remaining AppFabric settings, either manually or by using the Cleanup Tool they provide.
  • After rebooting, I downloaded the AppFabric 1.1 installer.

However, do not install it manually; instead, use the prerequisite installer from the SharePoint 2013 setup disc to install and configure AppFabric with the following command:

prerequisiteinstaller.exe /appFabric:C:\pathto\WindowsServerAppFabricSetup_x64.exe

Now you can continue with the initial configuration of the AppFabric service. I ran the following commands from the SharePoint 2013 Management Shell as Administrator:

$instanceName ="SPDistributedCacheService Name=AppFabricCachingService"

$serviceInstance = Get-SPServiceInstance | ? {($_.service.tostring()) -eq $instanceName -and ($_.server.name) -eq $env:computername}

$serviceInstance.Provision()
  • Then run

Add-SPDistributedCacheServiceInstance

You should see the Distributed Cache service running under Manage Services on Server in Central Administration, and the AppFabric Caching Service running in the Services console. If you don’t, try Remove-SPDistributedCacheServiceInstance and then Add-SPDistributedCacheServiceInstance again. After completing this process, I was able to go back to My Sites and see the Newsfeed as it should be, with no more errors in the event log.
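
As a quick sanity check from the SharePoint 2013 Management Shell, you can also confirm the service instance status directly; this is just a sketch.

# The Distributed Cache service instance should report Status = Online on this server
Get-SPServiceInstance | Where-Object { $_.TypeName -eq "Distributed Cache" } | Select-Object Server, Status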

I would love to know why this occurred since I was not working on anything with the Caching service prior to the errors; however, I hope this helps someone else caught up in this problem.

SharePoint 2013 Hosting – ASPHostPortal.com :: Integrating WordPress Website Into SharePoint 2013

In this blog post, I’ll go over how we can easily integrate a WordPress blog with your SharePoint site with the help of a SharePoint 2013 workflow.

Using the SharePoint 2013 REST API and a simple SharePoint Designer workflow, we are going to fetch the most recent two or more posts from the blog and add them to a SharePoint list.

SharePoint 2013 Hosting – ASPHostPortal.com :: Plan the Deployment of Farm Solutions for SharePoint 2013

How To Plan the Deployment of Farm Solutions for SharePoint 2013 ?

While everyone is talking about apps, there are still significant investments in full-trust solutions, and I am sure that many on-premises deployments will want to carry these forward when upgrading to SharePoint 2013. The new SharePoint 2013 upgrade model allows sites to continue to run in 2010 mode after upgrading, and each site collection has to be upgraded explicitly and individually.

This is not the way it worked in 2010 with Visual Upgrade; this time both a 14 and a 15 root folder are actually deployed, and all the features and layout files from SharePoint 2010 are deployed as part of the 2013 installation.

For those of you new to SharePoint, the root folder is where SharePoint keeps most of its application files and the default location for this is “C:\Program Files\Common Files\Microsoft Shared\Web Server Extensions\[SharePoint Internal Version]”, where the versions for the last releases have been 60 (6.0), 12, 14, and now 15. The location is also known as “The xx hive.”

This is great in an upgrade scenario, where you may want to do a platform upgrade first or only want to share the new features of 2013 with a few users while maintaining an unchanged experience for the rest of the organization.  This also gives us the opportunity to have different functionality and features for sites running in 2010 and 2013 mode.  However, this requires some extra thought in the development and deployment process that I will give an introduction to here. Because you can now have Sites running in both 2010 and 2013 mode, SharePoint 2013 introduces a new concept of a Compatibility Level.  Right now it can only be 14 or 15, but you can imagine that there is room for growth.  This Compatibility Level is available at Site Collection and Site (web) level and can be used in code constructs and PowerShell commands.
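
For example, you can inspect the compatibility level of your site collections from PowerShell, and upgrade an individual site collection to 2013 mode when you are ready; the URL below is a placeholder.

# 14 = site collection running in 2010 mode, 15 = 2013 mode
Get-SPSite -Limit All | Select-Object Url, CompatibilityLevel

# Upgrade a single site collection to 2013 (15) mode
Upgrade-SPSite "http://sharepoint.contoso.local/sites/projects" -VersionUpgrade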

I will start by explaining how you use it while building and deploying wsp-files for SharePoint 2013 and then finish off with a few things to watch out for and some code tips.

Deployment Considerations

If you take your wsp-files from SharePoint 2010 and just deploy these with Add-SPSolution -> Install-SPSolution as you did in 2010, then SharePoint will assume it is a 2010 solution or a “14” mode solution. If the level is not specified in the PowerShell command, it determines the level based on the value of the SharePointProductVersion attribute in the Solution manifest file of the wsp-package.  The value can currently be 15.0 or 14.0. If this attribute is missing, it will assume 14.0 (SharePoint 2010) and since this attribute did not exist in 2010, only very well informed people will have this included in existing packages.

For PowerShell cmdlets related to installing solutions and features, there is a new parameter called CompatibilityLevel. This can override the settings of the package itself and can assume the following values: 14, 15, New, Old, All and “14,15” (the latter currently also means All).

The parameter is available for Install-SPSolution, Uninstall-SPSolution, Install-SPFeature and Uninstall-SPFeature.  There is no way to specify “All” versions in the package itself – only the intended target – and therefore these parameters need to be specified if you want to deploy to both targets.
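
As a minimal sketch, deploying a package to both levels in one operation and later retracting it could look like this; the solution name and path are placeholders.

Add-SPSolution -LiteralPath "C:\Deploy\Contoso.Branding.wsp"

# Deploy the solution's template files to both the 14 and the 15 root folder
Install-SPSolution -Identity "Contoso.Branding.wsp" -GACDeployment -CompatibilityLevel "14,15"

# Retract from both levels in one operation
Uninstall-SPSolution -Identity "Contoso.Branding.wsp" -CompatibilityLevel "14,15"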

It is important to note that Compatibility Level impacts only files deployed to the Templates folder in the 14/15 Root folder.

That is:  Features, Layouts-files, Images, ControlTemplates, etc.

This means that files outside of this folder (e.g. a WCF Service deployed to the ISAPI folder) will be deployed to the 15/ISAPI no matter what level is set in the manifest or PowerShell.  Files such as Assemblies in GAC/Bin and certain resource files will also be deployed to the same location regardless of the Compatibility Level.

It is possible to install the same solution in both 14 and 15 mode, but only if it is done in the same command – specifying Compatibility Level as either “All” or “14,15”.  If it is first deployed with 14 and then with 15, it will throw an exception.  It can be installed with the –Force parameter, but this is not recommended as it could hide other errors and lead to an unknown state for the system.

The following three diagrams illustrate where files go depending on the parameters and attributes set. Thanks to the Ignite team for creating these; I made some small changes from the originals to emphasize a few points.

[Diagrams: Compatibility Level Old, Compatibility Level New, Compatibility Level All]

When retracting the solutions, there is also an option to specify Compatibility Level. If you do not specify this, it will retract everything that is installed, both the 14 and the 15 files. When deployed to both levels, you can retract just one, but the really important thing to understand here is that it will not only retract the files from that version folder, but also all version-neutral files, such as assemblies and ISAPI-deployed files, leaving only the files from the root folder you did not retract.

To plan for this, my suggestion would be the following during development/deployment:

  • If you want to only run sites in 2013 mode, then deploy the Solutions with CompatibilityLevel 15 or SharePointProductVersion 15.0.
  • If you want to run with both 2010 and 2013 mode, and want to share features and layout files, then deploy to both (All or “14,15”).
  • If you want to differentiate the files and features that are used in 2010 and 2013 mode, then the solutions should be split into two or three solutions:
  1. One solution (“Xxx – SP2010”), which contains the files and features to be deployed to the 14 folder for 2010 mode, including code-behind (for things like feature activation and application pages), but excluding shared assemblies and files.
  2. One solution (“Xxx – SP2013”), which contains the files and features to be deployed to the 15 folder for 2013 mode, including code-behind (for things like feature activation and Application pages), but excluding shared assemblies and files.
  3. One solution (“Xxx – Common”), which contains shared files (e.g. common assemblies or web services). This solution would also include all WebApplication scoped features such as bin-deployed assemblies and assemblies with SafeControl entries.
  • If you only want to have two solutions for various reasons, the Common solution can be joined with the SP2013 solution as this is likely to be the one you will keep the longest.

The assemblies being used as code files for the artifacts in SP2010 and SP2013 need to have different names, or at least different versions, to differentiate them. Web Parts need to go in the Common package and should be shared across the versions; however, the installed Web Part templates can be unique to the version mode.

Things to watch out for…

There are a few issues that are worth being aware of that may be fixed in future updates, but you’ll need to watch out for these currently.  I’ve come across an issue where installing the same solution in both levels can go wrong.  If you install it with level All and then uninstall it with level 14 two times, the deployment logic will think that it completely removed the solution, but the files in the 15/Templates folder will still be there.

To recover from this, you can install it with –Force in the orphan level and then uninstall it.  Again, it is better to not get in this situation.

Another scenario that can get you in trouble is if you install a solution in one Compatibility Level (either through PowerShell Parameter or manifest file attribute) and then uninstall with the other level.  It will then remove the common files but leave the specific 14 or 15 folder files and display the solution as fully retracted.

Unfortunately there is no public API to query which Compatibility Levels a package is deployed to, so you need to get it right the first time or move to native 2013 mode and packages as quickly as possible (which is where we all want to be anyway).

Code patterns

An additional tip is to look for hard-coded paths in your custom code, such as _layouts and _controltemplates. The SPUtility class has been updated with static methods to help you work out the correct location based on the upgrade status of the site. For example, SPUtility.ContextLayoutsFolder will give you the path to the correct layouts folder. See the reference article on SPUtility properties for more examples.

Round up

I hope this gave you an insight into some of the things you need to consider when deploying farm solutions for SharePoint 2013. There are lots of scenarios that are not covered here. If you find some, please share them, or share your concerns, and I will try to address them in the comments or an additional post.

SharePoint 2013 Hosting – ASPHostPortal.com :: Design Manager – Transform HTML to Master Page

The first thing that caught my eye once I logged on to SharePoint 2013 was the Design Manager, which I briefly introduced recently. In the past I’ve focused on learning to brand SharePoint using CSS, and of course the tool most of us used to integrate our CSS or new master pages was SharePoint Designer.

SharePoint Designer is no longer the preferred tool for this.

SharePoint 2013 Hosting – ASPHostPortal.com :: Caching SharePoint Data Locally with SPServices and HTML5’s Web Storage

Caching SharePoint Data Locally with SPServices and HTML5’s Web Storage

Even though the SOAP services are fast, sometimes they just aren’t fast enough. In some of those cases, it may make sense to store some of your data in the browser’s Web storage so that it’s there on the client during a session or across sessions. Web storage is an HTML5 capability that is available in virtually all browsers these days, even Internet Explorer 8.

The best candidates for this type of storage (IMO) are list contents that are used as references and that don’t have a high number of changes. As an example, you might decide to store a list of countries in Web storage rather than loading them from the Countries list every time a page loads. Even though the read from the list is fast, it has to take *some* time. There’s the retrieval time, and then there is also any processing time on the client side. For instance, if you have dozens of fields per country and you need to load them into a complex JavaScript structure, that takes time, too. If those data chores are making your page loads seem laggy, then consider using local storage.

There are three main ways you can store data locally to improve performance. I’m not going to go into all of their intricacies, but I will give you some rules of thumb, so before you dive in, do some studying about how it all works.

Cookies

For small pieces of data, you should consider using cookies. They can store up to 4k of data each for you.

Session Storage

Session storage is the flavor of Web storage that allows you to store data just for the duration of the session. Think of a session as a browser lifespan. Once you close the browser, the session storage is gone. Both session storage and local storage sizes are limited by the browser you are using. If you want to know if Web storage is available in your browser of choice, take a look at “Can I use“. The amount of storage each browser gives you is a moving target, but it’s per domain.

Local Storage

Local storage takes Web storage one step further. The data stored in local storage persists across browser sessions. In fact, it usually won’t go away until you explicitly delete it. (Consider this fact when you are gobbling up local storage in your development process.)

So how?

The trick with using these storage mechanisms is managing the data you’ve put in local storage as a cache. That data can go past its expiration date, either because some changes were made to the underlying data source or the cache itself has become corrupted. The latter is more difficult to deal with, so here I’ll focus on the former.

JavaScript – like most other programming languages – lends itself to building wrapper functions that add additional layers of abstraction on top of underlying functionality. Too many levels of abstraction can make things confusing, but with careful thought and smart code writing, you can build abstractions that serve you well.

In a recent client project, I found that as list data volumes were increasing, the pages in my SPServices- and KnockoutJS-driven application were loading more and more slowly. Even if I wanted to use REST, I couldn’t, nor do I believe that it would automatically make anything faster. If we had better servers running things, that might make a huge difference, but we have no control over that in the environment.

What I wanted was a reusable wrapper around SPGetListItemsJson (which itself is a wrapper around the SOAP List Web Service’s GetListItemChangesSinceToken and SPService’s SPXmlToJson) that would let me check local storage for a cached data source (list data), read either the entire data source or just the deltas from the SharePoint list, load the data into my application, and then update the cache appropriately.

The getDataSource function below is what I’ve come up with so far. There’s some setup to use it, so let me explain the parameters it takes:

  •  ns – This is the namespace into which you want to load the data. In my applications these days, I usually have a namespace defined that looks something like ProjectName.SubProjectName.DataSources. The “namespace” is simply a complex JavaScript object that contains most of my data and functions.
  • dataSourceName – The name that I want to give the specific data source within ns. In my example above with the Countries list I would use “Countries”.
  • params – This is the big magilla of the parameters. It contains all of the values that will make my call to SPGetListItemsJson work.
  • cacheItemName – This is the name of the item I want to store in Web storage. In the Countries example, I would use “ProjectName.SubProjectName.DataSources.Countries”.
  • storageType – Either “localStorage” or “sessionStorage”. If I expect the data to change regularly, I’d probably use sessionStorage (this gives me a clean data load for each session). If the data is highly static, I’d likely use localStorage.

And here’s the code:

/* Example:
getDataSource(ProjectName.SubProjectName.DataSources, "Countries", {
  webURL: "/",
  listName: "Countries",
  CAMLViewFields: "<ViewFields>" +
      "<FieldRef Name='ID'/>" +
      "<FieldRef Name='Title'/>" +
      "<FieldRef Name='Population'/>" +
      "<FieldRef Name='CapitalCity'/>" +
      "<FieldRef Name='Continent'/>" +
    "</ViewFields>",
  CAMLQuery: "<Query>" +
      "<OrderBy><FieldRef Name='ID'/></OrderBy>" +
    "</Query>",
  CAMLRowLimit: 0,
  changeToken: oldToken,
  mapping: {
      ows_ID:{"mappedName":"ID","objectType":"Counter"},
      ows_Title:{"mappedName":"Title","objectType":"Text"},
      ows_Population:{"mappedName":"Population","objectType":"Integer"},
      ows_CapitalCity:{"mappedName":"CapitalCity","objectType":"Text"},
      ows_Continent:{"mappedName":"Continent","objectType":"Lookup"},
    }
  }, "ProjectName.SubProjectName.DataSources.Countries"
)
*/

function getDataSource(ns, dataSourceName, params, cacheItemName, storageType) {

  var dataReady = $.Deferred();

  // Get the data from the cache if it's there
  ns[dataSourceName] = JSON.parse(window[storageType].getItem(cacheItemName)) || new DataSource();
  var oldToken = ns[dataSourceName].changeToken;
  params.changeToken = oldToken;

  // Read whatever we need from the dataSource
  var p = $().SPServices.SPGetListItemsJson(params);

  // Process the response
  p.done(function() {
    var updates = this.data;
    var deletedIds = this.deletedIds;
    var changeToken = this.changeToken;

    // Handle updates/new items
    if (oldToken !== "" && updates.length > 0) {
      for (var i = 0; i < updates.length; i++) {
        var thisIndex = ns[dataSourceName].data.binaryIndexOf(updates[i], "ID");
        // If the item is in the cache, replace it with the new data
        if (thisIndex > -1) {
          ns[dataSourceName].data[thisIndex] = updates[i];
          // Otherwise, add the new item to the cache
        } else {
          ns[dataSourceName].data.splice(-thisIndex, 0, updates[i]);
        }
      }
    } else if (oldToken === "") {
      ns[dataSourceName] = this;
    }
    // Handle deletes
    for (var i = 0; i < deletedIds.length; i++) {
      var thisIndex = ns[dataSourceName].data.binaryIndexOf({
        ID: deletedIds[i]
      }, "ID");
      ns[dataSourceName].data.splice(thisIndex, 1);
    }
    // Save the updated data back to the cache
    if (oldToken === "" || updates.length > 0 || deletedIds.length > 0) {
      // Save the new changeToken
      ns[dataSourceName].changeToken = changeToken;
      window[storageType].setItem(cacheItemName, JSON.stringify(ns[dataSourceName]));
    }
    dataReady.resolve();
  });
  return dataReady.promise();
}

Some of the nice things about this function:

  • It’s generic. I can call it for any list-based data source in SharePoint. (I started out building it for one data source and then generalized it.)
  • I can call it during a page life cycle to refresh the application data anytime I want, or on a schedule, perhaps with setInterval.
  • I can set a lot of parameters to cover a lot of different use cases.
  • Each time I call it, it updates the cache (if it needs to) so that the next time I call it I get a “fresh” copy of the data.
  • It only loads the data that it needs to, by using the GetListItemChangesSinceToken capabilities.

And some downsides:

  • Since I know what data I’m working with in my application and that it will fit into the Web storage easily, I’m not worrying about failed saves.
  • If the cache does become corrupt (not something I expect, but there’s always Murphy), I’m not handling it at all.

If you decide to try this out, you’ll need a few auxiliary functions as well:

/* DataSource constructor */
function DataSource() {
  this.changeToken = "";
  this.mapping = {};
  this.data = [];
  this.deletedIds = [];
}

/** Adapted from http://oli.me.uk/2013/06/08/searching-javascript-arrays-with-a-binary-search/
 *
 * Performs a binary search on the host array.
 * @param {*} searchObj The object to search for within the array.
 * @param {*} searchElement The element in the object to compare. The objects in the array must be sorted by this element.
 * @return {Number} The index of the element. If the item is not found, the function returns a negative index where it should be inserted (if desired).
 */
Array.prototype.binaryIndexOf = function(searchObj, searchElement) {

  var minIndex = 0;
  var maxIndex = this.length - 1;
  var currentIndex;
  var currentElement;

  var searchValue = searchObj[searchElement];

  while (minIndex <= maxIndex) {
    currentIndex = (minIndex + maxIndex) / 2 | 0;
    currentElement = this[currentIndex];

    if (currentElement[searchElement] < searchValue) {
      minIndex = currentIndex + 1;
    } else if (currentElement[searchElement] > searchValue) {
      maxIndex = currentIndex - 1;
    } else {
      return currentIndex;
    }
  }

  return ~maxIndex;
};