Have you heard about the virtual Collab365 Global Conference 2017 that’s streaming online November 1st – 2nd?
Join me and 120 other speakers from around the world who will be bringing you the very latest content around SharePoint, Office 365, Flow, PowerApps, Azure, OneDrive for Business and of course the increasingly popular Microsoft Teams. The event is produced by the Collab365 Community and is entirely free to attend.
Places are limited to 5,000, so be quick and register now.
During the conference, I'd love you to watch my session, called 'Search Web Parts'.
The Content Search Web Part (CSWP) is one of the great web parts in O365 and on-premises. In this session, I will demo how to configure and use the CSWP, and build a dynamic, branded O365 portal with the CSWP only. In this session, we will review:
1. Creating queries using Keyword Query Language (KQL)
2. Building dynamic queries
3. Creating and customizing HTML display templates
If you join me, you will learn:
Topic(s):
Audience:
Time (in UTC):
How to attend:
Implementing SharePoint 2013 in a secure zone as an extranet application can be challenging if you are deploying your farm in a zone with many restrictions.
Recently, I deployed a large SharePoint 2013 farm in a DMZ zone for a regulated portal. Regulated data in my case meant the following restrictive rules in the network and on the servers in the farm:
Configuring SharePoint in this environment was not a straightforward exercise. After disabling some GPO policies to allow the creation of the IIS web applications, we had to map out the communication between all the servers so that the required firewall ports could be opened, allowing each server in the farm to talk to the others.
To get a better understanding of the ports required in your farm, you can follow this TechNet article. It explains the details of each port and its use.
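Once you have your port list, the inbound rules can be scripted rather than clicked through. A minimal sketch, assuming Windows Server 2012 or later (the `New-NetFirewallRule` cmdlet); port 32844 here stands in for whichever ports your own list requires:

```powershell
# Allow inbound traffic on the SharePoint web services HTTPS port (32844).
# Repeat (or loop) for each port identified from the TechNet port list.
New-NetFirewallRule -DisplayName "SharePoint Web Services (HTTPS)" `
    -Direction Inbound -Protocol TCP -LocalPort 32844 -Action Allow
```

Scripting the rules also gives you a record you can hand to the security team, which is useful in a regulated environment.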
Configuring SharePoint was successful; everything worked, the portals are up and running, content is being populated, User Profile Service synchronization is working, and the Search Service Application is up and running.
However, I was faced with a very challenging issue when crawling content. Crawling the SharePooint content source always returned a "timeout" error in the logs. Resolving this issue took a lot of log monitoring, custom code to monitor the traffic, and long nights.
This means that the search crawl is sending an HTTP request to your portal but not receiving an answer back. Authentication is fine, security is OK, but the HTTP response never makes the trip back to the crawl server.
Here are my suggestions for troubleshooting a search crawl timeout; one of them might resolve your issue:
I suggest first looking into the firewall rules again. Nine times out of ten, it is the firewall doing funny things to block traffic between the servers. In my case, the security team was using a Cisco Smart Care firewall, which is an advanced firewall that does not only look at port rules. You will have to create exceptions for applications, because it detects SharePoint and automatically blocks it if SharePoint is not listed as one of the trusted apps.
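Before going back and forth with the firewall team, a quick sanity check from the crawl server can tell you whether the ports are actually reachable. A sketch using `Test-NetConnection` (Windows Server 2012 and later); the server name `SPWFE` and the port list are assumptions, so substitute your own front end and your own ports:

```powershell
# Check reachability of the web front end from the crawl server.
# 80/443 are the web applications; 32843/32844 are SharePoint web services.
$ports = 80, 443, 32843, 32844
foreach ($port in $ports) {
    Test-NetConnection -ComputerName "SPWFE" -Port $port |
        Select-Object ComputerName, RemotePort, TcpTestSucceeded
}
```

A `TcpTestSucceeded` of `False` on a port your farm needs is strong evidence the firewall, not SharePoint, is the culprit.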
In SharePoint 2013, all the content can now be surfaced using search. The search-driven web parts have their own Query Builder user interface, which makes it very easy to select, filter, and display the data that you want. However, the Content Search Web Part is only available in SharePoint 2013 Enterprise Edition. If you are using Enterprise CALs, then you should see the search-driven web parts in your web part gallery.
But this is not always the case if you have played around with the licensing in the farm. SharePoint 2013 provides a new feature called SharePoint User License Enforcement (SPULE) that a lot of people may not be aware of. SPULE means that we can have a mix of different licenses in a single farm: Enterprise features can be made available to those who need them, and Standard features to others. This can save an organization a substantial amount in Client Access License costs.
If for some reason you ran the Enable-SPUserLicensing cmdlet without mapping Enterprise licenses to your users, it will actually hide all your search-driven web parts. Note that by default, SPULE is not enabled.
To get an overview of SPULE in your farm, run this command: Get-SPUserLicensing. If it returns True, SPULE has been enabled on your farm.
What you need to do is disable SPULE, and the search-driven web parts will appear again. Run Disable-SPUserLicensing, and voila! Your web parts are back in the gallery!
Note: You can set SPULE based on different AD groups, and you can set it for different types of licenses. This TechNet article explains how to manage the different license mappings in your farm.
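If you would rather keep SPULE enabled than disable it, the alternative is to map an Enterprise license to the users who need the search-driven web parts. A minimal sketch, assuming a hypothetical AD security group named `CONTOSO\SP-Enterprise-Users`:

```powershell
# Create a mapping between an AD security group and the Enterprise license
$mapping = New-SPUserLicenseMapping -SecurityGroup "CONTOSO\SP-Enterprise-Users" -License Enterprise

# Register the mapping in the farm, then turn enforcement on
$mapping | Add-SPUserLicenseMapping
Enable-SPUserLicensing

# Verify: lists all license mappings currently in effect
Get-SPUserLicenseMapping
```

With the mapping in place, members of the group see the Enterprise web parts while everyone else stays on Standard features.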
Many customers are excited about the new features that SharePoint 2013 brings to the table. Organizations of all sizes that have already implemented SharePoint hesitate to upgrade for many reasons, but they still want to take advantage of some of the new 2013 features.
In the first section of this article, I am going to show how you can create a SharePoint 2013 Search Service Application using PowerShell. This list of commands will allow you to name your own database, instead of having a GUID-based database name for search.
The architecture and design of search in SharePoint 2013 have changed a bit. There are more components and more flexibility for a highly available search farm, allowing the farm to index more than 100 million items.
There are several steps involved in the creation of a Search Service Application and defining the Search Topology. The steps are:
Instead of using Central Admin, I will be showing the PowerShell commands to create the SSA:
# Define the variables
$SSADB = "SharePoint_Demo_SearchAdmin"
$SSAName = "Search Service Application"
$SVCAcct = "mcm\sp_search"
$SSI = get-spenterprisesearchserviceinstance -local
#1. Start the search services for SSI
Start-SPEnterpriseSearchServiceInstance -Identity $SSI
#2. Create the Application Pool
$AppPool = New-SPServiceApplicationPool -Name "$SSAName-AppPool" -Account $SVCAcct
#3. Create the search application and set it to a variable
$SearchApp = New-SPEnterpriseSearchServiceApplication -Name $SSAName -applicationpool $AppPool -databaseserver SQL2012 -databasename $SSADB
#4. Create search service application proxy
$SSAProxy = New-SPEnterpriseSearchServiceApplicationProxy -Name "$SSAName Application Proxy" -Uri $SearchApp.Uri.AbsoluteUri
#5. Provision Search Admin Component
Set-SPEnterpriseSearchAdministrationComponent -searchapplication $SearchApp -searchserviceinstance $SSI
#6. Create the topology
$Topology = New-SPEnterpriseSearchTopology -SearchApplication $SearchApp
#7. Assign server(s) to the topology
$hostApp1 = Get-SPEnterpriseSearchServiceInstance -Identity "SPWFE"
New-SPEnterpriseSearchAdminComponent -SearchTopology $Topology -SearchServiceInstance $hostApp1
New-SPEnterpriseSearchCrawlComponent -SearchTopology $Topology -SearchServiceInstance $hostApp1
New-SPEnterpriseSearchContentProcessingComponent -SearchTopology $Topology -SearchServiceInstance $hostApp1
New-SPEnterpriseSearchAnalyticsProcessingComponent -SearchTopology $Topology -SearchServiceInstance $hostApp1
New-SPEnterpriseSearchQueryProcessingComponent -SearchTopology $Topology -SearchServiceInstance $hostApp1
New-SPEnterpriseSearchIndexComponent -SearchTopology $Topology -SearchServiceInstance $hostApp1 -IndexPartition 0
#8. Activate the topology
$Topology | Set-SPEnterpriseSearchTopology
Once the SSA is created, we need to clone the topology to extend it to other servers in the farm. In this script, we will replicate all the search components onto two servers in the farm and create two index partitions. Here are the steps:
#1. Extend the Search Topology:
$hostApp1 = Get-SPEnterpriseSearchServiceInstance -Identity "AppSearch1"
$hostApp2 = Get-SPEnterpriseSearchServiceInstance -Identity "AppSearch2"
#2. Start the search service instances:
Start-SPEnterpriseSearchServiceInstance -Identity $hostApp1
Start-SPEnterpriseSearchServiceInstance -Identity $hostApp2
#3. Keep running this command until the Status is Online:
Get-SPEnterpriseSearchServiceInstance -Identity $hostApp1
Get-SPEnterpriseSearchServiceInstance -Identity $hostApp2
#4. Once the status is online, you can proceed with the following commands:
$ssa = Get-SPEnterpriseSearchServiceApplication
$active = Get-SPEnterpriseSearchTopology -SearchApplication $ssa -Active
$newTopology = New-SPEnterpriseSearchTopology -SearchApplication $ssa
#Assign components to the hosts
New-SPEnterpriseSearchAdminComponent -SearchTopology $newTopology -SearchServiceInstance $hostApp1
New-SPEnterpriseSearchCrawlComponent -SearchTopology $newTopology -SearchServiceInstance $hostApp1
New-SPEnterpriseSearchContentProcessingComponent -SearchTopology $newTopology -SearchServiceInstance $hostApp1
New-SPEnterpriseSearchAnalyticsProcessingComponent -SearchTopology $newTopology -SearchServiceInstance $hostApp1
New-SPEnterpriseSearchQueryProcessingComponent -SearchTopology $newTopology -SearchServiceInstance $hostApp1
New-SPEnterpriseSearchIndexComponent -SearchTopology $newTopology -SearchServiceInstance $hostApp1 -IndexPartition 0
New-SPEnterpriseSearchAdminComponent -SearchTopology $newTopology -SearchServiceInstance $hostApp2
New-SPEnterpriseSearchCrawlComponent -SearchTopology $newTopology -SearchServiceInstance $hostApp2
New-SPEnterpriseSearchContentProcessingComponent -SearchTopology $newTopology -SearchServiceInstance $hostApp2
New-SPEnterpriseSearchAnalyticsProcessingComponent -SearchTopology $newTopology -SearchServiceInstance $hostApp2
New-SPEnterpriseSearchQueryProcessingComponent -SearchTopology $newTopology -SearchServiceInstance $hostApp2
#Below creates a second index partition (partition 1) on host 2. If you instead want a redundant replica of the index on the second server, use -IndexPartition 0 here and skip this step.
New-SPEnterpriseSearchIndexComponent -SearchTopology $newTopology -SearchServiceInstance $hostApp2 -IndexPartition 1
#5. Activate the topology:
Set-SPEnterpriseSearchTopology -Identity $newTopology
The above scenario creates a search topology across a two-server farm. For a larger topology, you can simply add more hosts and choose which components run on each of them.
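One housekeeping step worth adding: after the new topology is activated, the old one remains in the farm in an inactive state. The `$active` variable captured earlier still points at it, so it can be removed with a couple more commands:

```powershell
# $active was captured before the clone and now refers to the old, inactive topology
Remove-SPEnterpriseSearchTopology -Identity $active -Confirm:$false

# Confirm only the new topology remains
Get-SPEnterpriseSearchTopology -SearchApplication $ssa
```

Leaving stale topologies around is harmless but clutters the topology list, so cleaning up right after a successful activation is a good habit.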