Branding your public-facing site

SharePoint 2013 has made it much easier for developers to create a responsive website without much effort, and with no customization required. With the introduction of the Design Manager, you can easily brand your SharePoint publishing site from your HTML composite files. This can be done either with a SharePoint on-premises farm or with SharePoint Online.

This post will walk you through how you can create your master page and brand your site in a few hours.

First, you need to create a Publishing Site Collection; Design Manager only works on a publishing site collection.

Navigate to the site collection you have created, click on the Site Settings icon, then select Design Manager.

In the Design Manager page, you have a few options to choose from, but in this post we are going to focus on only the three sections that will allow us to brand a SharePoint site based on existing HTML5/CSS3 composites.

In the Quick Launch navigation, click on Upload Design Files.

This will show you a URL to which you need to map a network drive.

Go ahead and map your network drive to the URL; in my case it is http://sp2013/_catalogs/masterpage.
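If you prefer the command line, the same mapping can be scripted with net use (a minimal sketch; the drive letter is arbitrary and the URL is the one from my environment, so substitute your own, and the WebClient service must be running for the WebDAV mapping to work):

    net use S: http://sp2013/_catalogs/masterpage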

Once the drive is mapped, your SharePoint Site Collection catalogs list shows up in File Explorer like any other folder.

Now, drag the folder that contains your HTML5, CSS3, JS, images, etc. into the _catalogs folder. Note that you need to drag and drop the entire folder, not just the individual files. I suggest keeping them all under the same folder, so you can better manage them within SharePoint.
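If you mapped the drive with net use as above, the same copy can be scripted (a sketch; C:\Designs\MyBrand is a hypothetical local folder name, substitute your own design folder):

    Copy-Item -Path "C:\Designs\MyBrand" -Destination S:\ -Recurse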

As you can see, the HTML composites I received from my designer are now copied into my _catalogs folder.

Next, go back to the Design Manager web page, and click on Edit Master Pages.

Click on the link that says Convert an HTML file to a SharePoint master page, and select the HTML file you just uploaded. This will convert your HTML file to a master page, add all the necessary controls, and fix the reference links to your JS, CSS, and image locations. If you navigate to your _catalogs list, you should see that the master page has been created automatically.

You might have warnings and errors in the status of the conversion; click on the link that says warnings and errors. The page will show you where the errors are. SharePoint 2013 validates your HTML5 structure to make sure it is written properly. For example, in my case, the HTML was missing a space in one of the tags.

To fix it, open the HTML file from the _catalogs list on your site. Go to the File Explorer location where you mapped the catalogs list, and open the HTML file, NOT the master page, in any HTML editor. You do not need to use SharePoint Designer. I personally prefer Notepad++, but you can choose any other HTML editor such as Dreamweaver, Eclipse, etc.

When you have corrected all the errors in the HTML, the master page associated with your HTML file is ready to be used as your default master page. Before doing so, you may want to add some dynamic controls to your master page, such as top navigation, vertical navigation, a search box, and so on, or you can add your own custom controls.

To do so, click on the Snippets link located on the master page preview. Note that this is the only place where you can find the Snippets link.

Clicking on each snippet gives you the code you need. Copy the code and paste it into the appropriate place in the HTML file. For example, you would replace your static HTML navigation with the Top Navigation snippet to make it dynamic.

When you are done, you may want to clean up your converted HTML file to remove all the static content and replace it with either nothing, or placeholders for the Publishing HTML fields used in your page layouts.

When you are satisfied with your master page, you can publish it. Note that you publish the HTML file; this automatically publishes the associated master page.

At this time, you can start using your new master page as your default branding template.
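Setting the default master page can be done from Site Settings > Master Page, or with a few lines of PowerShell on an on-premises farm (a minimal sketch; the site URL and master page file name are examples, not values from this post):

    $web = Get-SPWeb http://sp2013
    $web.CustomMasterUrl = "/_catalogs/masterpage/MyBrand.master"   # master page used by publishing pages
    $web.MasterUrl = "/_catalogs/masterpage/MyBrand.master"         # master page used by system pages (optional)
    $web.Update()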

SharePoint 2013 Search Crawl Timeout Issue

Implementing SharePoint 2013 in a secure zone as an extranet application might be challenging if you are deploying your farm in a zone with many restrictions.

Recently, I deployed a large SharePoint 2013 farm in a DMZ for a regulated portal. Regulated data in my case meant the following restrictive rules on the network and on the servers in the farm:

  • Strict GPO policies;
  • WFE, application, search, and SQL servers are hosted in different subnet zones;
  • Everything is blocked on the firewall unless specific ports are requested to be open; and
  • Outbound internet access is disabled on all servers.

Configuring SharePoint in this environment was not a straightforward exercise. After disabling some GPO policies to allow the creation of the IIS web applications, we had to map out the communication between all the servers so the required firewall ports could be opened, allowing each server in the farm to talk to the others.

To get a better understanding of the ports required in your farm, you can follow this TechNet article. It explains the details of each port and its use.
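Once the rules are in place, a quick way to confirm that a given port is actually reachable from one server to another is Test-NetConnection, available on Windows Server 2012 and later (a sketch; the server name and SQL port are examples, so test the ports listed in the article for your own topology):

    # Run from the server that initiates the traffic, e.g. a WFE checking the SQL server
    Test-NetConnection -ComputerName sql01 -Port 1433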

Configuring SharePoint was successful; everything worked: the portals were up and running, content was being populated, User Profile Synchronization was working, and the Search Service Application was up and running.

However, I was faced with a very challenging issue when crawling content: crawling the SharePoint content source always returned a "timeout" error in the logs. Resolving this issue took a lot of log monitoring, custom code to monitor the traffic, and long nights.

A timeout means that the search crawler is sending an HTTP request to your portal but is not receiving an answer back. Authentication is fine, security is OK, but there is no HTTP round trip back to the crawl server.
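If you want to pull these crawl errors out of the ULS logs without reading the files by hand, Get-SPLogEvent from the SharePoint Management Shell can help (a sketch; the time window and filters are just a starting point, so adjust them to what you see in your logs):

    Get-SPLogEvent -StartTime (Get-Date).AddHours(-1) |
        Where-Object { $_.Category -like "Crawler*" -and $_.Level -eq "High" } |
        Select-Object Timestamp, Category, Message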

Here are my suggestions for a search crawl timeout issue; one of the following might resolve yours:

  1. Make sure you disable the loopback check on the crawl server (see the sketch after this list). In my case, this did not help at all.
  2. CRL check: most DLL assemblies are digitally signed. Each time a signed assembly is loaded, the default system behaviour is to check with the owner of the root certificate that the certificate with which the assembly was signed is still valid. SharePoint 2013 search checks a few certificates, like crl.microsoft.com or *.akamaitechnologies.com. To resolve this, open the outbound internet connection. If this is not doable, install the crl.microsoft.com certificate on the server, or add an entry to the local server hosts file like this: 127.0.0.1 crl.microsoft.com (also shown in the sketch below). This way the certificate check does not need to validate the certificate over the internet;
  3. Add exceptions on the firewall to allow traffic for the certificate checks; or
  4. Open outbound internet access; or
  5. Revisit the firewall rules.
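For reference, the loopback check and hosts file changes from suggestions 1 and 2 can be applied like this (a sketch, run as administrator on the crawl server; keep in mind that disabling the loopback check has security implications, so check your own policies first):

    # Suggestion 1: disable the loopback check
    New-ItemProperty -Path "HKLM:\SYSTEM\CurrentControlSet\Control\Lsa" `
        -Name DisableLoopbackCheck -Value 1 -PropertyType DWord -Force

    # Suggestion 2: short-circuit the CRL lookup in the hosts file
    Add-Content -Path "$env:windir\System32\drivers\etc\hosts" -Value "127.0.0.1 crl.microsoft.com"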

 

I suggest first looking into the firewall rules again. Nine times out of ten, it is the firewall that is doing funny things to block traffic between the servers. In my case, the security team was using a Cisco Smart Care firewall, which is an advanced firewall that does not only look at port rules. You will have to create exceptions for applications, because it detects SharePoint traffic and automatically blocks it if SharePoint as an app is not listed as one of the trusted apps.

SharePoint 2013 Distributed Cache Logon Token issue

Distributed Cache Service (DCS) is a customized version of Windows Server AppFabric caching deployed with SharePoint 2013.

The Distributed Cache service provides caching functionality to features (not to be confused with site features) in SharePoint Server 2013. The Distributed Cache service is either required by or improves performance of the following features:

  1. Authentication;
  2. Newsfeeds;
  3. OneNote client access;
  4. Security Trimming; and
  5. Page load performance.

For more details about managing and deploying DCS, you can visit this TechNet article.

While implementing a SharePoint 2013 farm, I encountered errors loading newsfeeds and a decrease in farm performance.

While examining the ULS logs, I immediately noticed dozens of DCS errors logged every 60 seconds. The error was:

Unexpected error occurred in method 'GetObject' , usage 'Distributed Logon Token Cache' – Exception 'Microsoft.ApplicationServer.Caching.DataCacheException: ErrorCode<ERRCA0018>:SubStatus<ES0001>:The request timed out

 

The Distributed Logon Token Cache stores the security token issued by a Secure Token Service for use by any web server in the server farm. Any web server that receives a request for resources can access the security token from the cache, authenticate the user, and provide access to the resources requested.

Tracing through the logs, I saw that when a user accesses a page, SharePoint attempts to authorize the user to ensure access can be granted. SharePoint stores the user's token in the user's browser session and in the DistributedCacheLogonTokenCache container. When SharePoint tried to retrieve the token from the distributed cache, the connection would time out or be unavailable, and the comparison would fail. Since it couldn't validate the presented token, SharePoint had no choice but to log the user out and redirect them to the sign-in page.
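Before patching anything, it is worth confirming the state of the cache cluster and the client timeout for the logon token cache (a sketch, run on a Distributed Cache host; the AppFabric cmdlets come from the DistributedCacheAdministration module installed with AppFabric):

    Import-Module DistributedCacheAdministration
    Use-CacheCluster
    Get-CacheHost                     # every cache host should report a Service Status of UP

    # Client-side settings for the logon token cache (timeout values are in milliseconds)
    Get-SPDistributedCacheClientSetting -ContainerType DistributedLogonTokenCache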

In general, the problem might cause failures or performance problems with the following:

  • Authentication: Users will be forced to authenticate for each Web front end in a load balanced environment;
  • Search web parts;
  • Social comments;
  • Newsfeeds;
  • OneNote client access;
  • Security Trimming; and
  • Page load performance.

After further research, I found out that, out of the box, AppFabric 1.1 contains a bug with garbage collection, and this impacts SharePoint 2013 farms running the March 2013 CU.

Resolution

  1. Apply AppFabric CU 4, or a later CU, on all of your servers in the farm;
  2. Restart the AppFabric service on all servers (a PowerShell sketch of steps 2 to 4 follows this list);
  3. Restart the DCS service on the servers where the service is running; and
  4. Perform an IIS reset.
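For steps 2 to 4, the restart itself can be scripted on each affected server (a sketch; on a dedicated cache host you may prefer the graceful Stop-SPDistributedCacheServiceInstance / Add-SPDistributedCacheServiceInstance sequence so cached data is moved off the host first):

    Restart-Service -Name AppFabricCachingService   # steps 2 and 3: restart the AppFabric Caching Service
    iisreset /noforce                               # step 4: recycle IIS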

User Profile Synchronization Service – Access is denied

Sometimes, when you first create your User Profile Service Application and configure your first Synchronization Connection (or any new Synchronization Connection), you might notice that the service is not synchronizing from Active Directory, and you will receive an Access is Denied error in the event log.

Reviewing the ULS logs shows the following errors:

UserProfileApplication.SynchronizeMIIS: Error updating users with FIM permissions: Microsoft.ResourceManagement.WebServices.Faults.ServiceFaultException: Unable to process Create message   

 at Microsoft.ResourceManagement.WebServices.Client.ResourceTemplate.CreateResource()   

 at Microsoft.Office.Server.Administration.UserProfileApplication.UpdateFIMUser(SchemaManager schemaManager, String userName, String accountName, String domain, Byte[] userSid)   

 at Microsoft.Office.Server.Administration.UserProfileApplication.SynchronizeMIISAdminsList(Hashtable htPermittedUsers)   

 at Microsoft.Office.Server.Administration.UserProfileApplication.SetupProfileSynchronizationEnginePermissions().

UserProfileApplication.SynchronizeMIIS: Failed to configure ILM, will attempt during next rerun. Exception: Microsoft.ResourceManagement.WebServices.Faults.ServiceFaultException: Unable to process Create message   

 at Microsoft.ResourceManagement.WebServices.Client.ResourceTemplate.CreateResource()   

 at Microsoft.Office.Server.Administration.UserProfileApplication.UpdateFIMUser(SchemaManager schemaManager, String userName, String accountName, String domain, Byte[] userSid)   

 at Microsoft.Office.Server.Administration.UserProfileApplication.SynchronizeMIISAdminsList(Hashtable htPermittedUsers)   

 at Microsoft.Office.Server.Administration.UserProfileApplication.SetupProfileSynchronizationEnginePermissions()   

 at Microsoft.Office.Server.Administration.UserProfileApplication.SetupSynchronizationService(ProfileSynchronizationServiceInstance profileSyncInstance).

The best way to troubleshoot this issue is to look at why the FIM synchronization service is failing to sync. To do so, launch the FIM client, miisclient.exe, located under <install drive>\Program Files\Microsoft Office Servers\15.0\Synchronization Service\UIShell.

When you launch the client, as per picture 1:

  1. Click on Management Agents;
  2. Click on the Synchronization Connection you created in the UPSA; and
  3. Under Actions, click on Configure Run Profiles.


Picture 1

In the Run Profiles window, you should see all the synchronization profiles, and you might notice two steps per profile, as per picture 2.


Picture 2

For example, in the DS_EXPORT profile, you might find a Step 0 with a GUID as the Partition value, and then a Step 1 with the correct forest info as the Partition value.

If this is the case, you need to delete the step that has the GUID as the Partition value and keep the step with the correct forest info.

In some cases, if you run the synchronization job and monitor the status under the FIM client's Operations tab, you will find the specific run profile at which the job is failing with an Access is Denied error. With that, you can go directly to that run profile instead of going through all of them. However, it is a good idea to go through all the profiles to make sure you don't have extra steps that are messing up your synchronization job.

Also, in some scenarios, you might find only one step, with a GUID as the Partition value. In this case, you will need to delete the step and create a new one by clicking on New Step. With this in mind, make sure you select the right values for the run profile: follow the wizard and select the right forest from the dropdown menu.

After completing your manual clean-up, try running the synchronization job from Central Administration. No IISRESET is needed. This should fix the Access is Denied issue, and the job will complete successfully.

Missing Content Search Web Part

In SharePoint 2013, all content can now be surfaced using search. The search-driven web parts have their own query builder user interface, which makes it very easy to select, filter, and display the data that you want. However, the Content Search Web Part is only available in SharePoint 2013 Enterprise edition. If you are using Enterprise CALs, you should see the search-driven web parts in your web part gallery.

But this is not always the case if you have played around with licensing in the farm. SharePoint 2013 provides a new feature called SharePoint User License Enforcement (SPULE) that a lot of people may not be aware of. SPULE means that we can have a mix of different licenses in a single farm: Enterprise features can be made available to those who need them, and Standard features to others. This can save an organization a substantial amount in Client Access License costs.

If for some reason license enforcement was turned on in your farm (with Enable-SPUserLicensing), the search-driven web parts will effectively be disabled for users without an Enterprise license mapping. Note that by default, SPULE is not enabled.

To get an overview of SPULE in your farm, run Get-SPUserLicensing. If True is returned, SPULE has been enabled on your farm.

What you need to do is disable SPULE, and the search-driven web parts will appear again. Run Disable-SPUserLicensing, and voila! Your web parts are back in the gallery!
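From the SharePoint Management Shell, the check and the fix are just the two cmdlets mentioned above:

    Get-SPUserLicensing        # reports whether user-license enforcement is enabled
    Disable-SPUserLicensing    # turn enforcement off so the Enterprise web parts reappear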

Note: You can scope SPULE to different AD groups, and you can set it for different types of licenses. This TechNet article explains how you can manage the different SPULE mappings in your farm.
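If you would rather keep enforcement on and simply make sure the right people get Enterprise features, the license mapping cmdlets described in that article can tie licenses to AD groups. A minimal sketch, assuming a hypothetical CONTOSO\SP Enterprise Users security group:

    # Map an AD security group to an Enterprise license, then turn enforcement on
    $mapping = New-SPUserLicenseMapping -SecurityGroup "CONTOSO\SP Enterprise Users" -License Enterprise
    $mapping | Add-SPUserLicenseMapping
    Enable-SPUserLicensing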