In this example, I needed to search a farm for every site under a managed path. BUT, the sites I was searching for were built using a 3rd-party tool and would not appear correctly in the search results. The problem was that Trim Duplicates is enabled by default. Easy fix… set your query's TrimDuplicates property to false.
$site = Get-SPSite "https://sharepointed.com"
$keywordQuery = New-Object Microsoft.Office.Server.Search.Query.KeywordQuery($site)
$queryText = "ContentClass:STS_Site AND Path:https://sharepointed.com/TACOS/*"
$keywordQuery.QueryText = $queryText
$keywordQuery.TrimDuplicates = $false
$searchExec = New-Object Microsoft.Office.Server.Search.Query.SearchExecutor
$searchResults = $searchExec.ExecuteQuery($keywordQuery)
$table = $searchResults.Table
Write-Host "$($table.Rows.Count) Results Found" -BackgroundColor "Green" -ForegroundColor "Black"
$table.Rows | select Title, Path, IsDocument
The search results will display all sites under the TACOS managed path. If you are not retrieving the results you expect, try setting TrimDuplicates to $false.
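One caveat: a KeywordQuery only returns the first page of results by default (a small row cap), so a managed path containing many sites can come back truncated even with TrimDuplicates off. A hedged addition, set before calling ExecuteQuery (the value 500 is an arbitrary example, not something from this post):

# Raise the row cap above the small default so large managed paths aren't truncated.
$keywordQuery.RowLimit = 500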
I needed to write a script to validate that the SharePoint crawler was picking up all items in a library. One of my document libraries had over 100,000 documents, and I needed to make sure all of them were being indexed.
A document library can support millions of documents, IF you use a good folder structure.
function GetSearchIndex ($site, $file)
{
    $fOut = $null
    $spSite = Get-SPSite $site
    $kq = New-Object Microsoft.Office.Server.Search.Query.KeywordQuery($spSite)
    $kq.QueryText = $file
    $kq.HiddenConstraints = 'scope:"All Sites"'
    $kq.RowLimit = 10
    $res = $kq.Execute()

    # Load the relevant results into a DataTable so the row count can be checked.
    $table = New-Object System.Data.DataTable
    $table.Load($res[[Microsoft.Office.Server.Search.Query.ResultType]::RelevantResults], "OverwriteChanges")

    if($table.Rows.Count -eq 0)
    {
        $fOut = "Failed"
    }
    else
    {
        $fOut = "Passed"
    }
    return $fOut
}
$file = "c:\indexCheck.txt"
$cfcSite = Get-SPWeb "http://SOMEsite.sharepointed.com/sites/test"
$nv = $cfcSite.Lists["bigLibrary"]

$spQuery = New-Object Microsoft.SharePoint.SPQuery
$spQuery.ViewAttributes = "Scope='Recursive'"
$spQuery.RowLimit = 2000
$caml = '<OrderBy Override="TRUE"><FieldRef Name="ID"/></OrderBy>'
$spQuery.Query = $caml

do
{
    $listItems = $nv.GetItems($spQuery)

    # Capture the paging position so the next pass picks up where this one left off.
    $spQuery.ListItemCollectionPosition = $listItems.ListItemCollectionPosition

    foreach($item in $listItems)
    {
        $sResult = GetSearchIndex "http://test.sharepointed.com" $item.Name

        if($sResult -eq "Failed")
        {
            $item.Name | Out-File $file -Append
        }
    }
}
while ($spQuery.ListItemCollectionPosition -ne $null)
Here is what the script is doing:
Query the list/library in batches of 2,000.
Loop through the returned items.
Call the function to see if each item is in the index (a SharePoint search query).
The function returns a value of Passed or Failed.
If Failed, log the item name to a text file on the C:\ drive.
If your Search Service Application shows a crawl status of "Paused for:External request" or "Paused by system", you can use the following script to get it back online.
$ssa = Get-SPEnterpriseSearchServiceApplication "YOUR Search Service Application name here"
$ssa.ForceResume($ssa.IsPaused())
Your Search App might be named something different than Search Service Application.
To find your Search App name, navigate to your Central Admin site; under Application Management, click Manage service applications. In the Type column, look for an item that says Search Service Application.
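If you'd rather skip Central Admin, running the same cmdlet with no name from the SharePoint Management Shell will list every Search Service Application in the farm along with its name:

# List all Search Service Applications so you can copy the exact name.
Get-SPEnterpriseSearchServiceApplication | Select-Object Name, Status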
Credit for this post goes to a super-smart coworker.
After setting up a new Search Center, I tried to add the Refinement Panel web part, but was unable to locate it.
Navigate to your Site Collection and enable Search Server Web Parts.
To activate the Search Server Web Part feature
To open the Site Settings page for the top-level (root) site of the upgraded site collection, append /_layouts/settings.aspx to the root site's URL.
In the Site Collection Administration section of the Site Settings page, click Site collection features.
For Search Server Web Parts, click Activate.
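If you prefer PowerShell, the feature can also be activated with Enable-SPFeature. Note that the feature name "SearchWebParts" below is my assumption, not something confirmed by this post; verify the internal name first with Get-SPFeature.

# "SearchWebParts" is an assumed feature name; confirm it with:
# Get-SPFeature | Where-Object { $_.DisplayName -like "*Search*" }
Enable-SPFeature -Identity "SearchWebParts" -Url "http://yourserver/sites/yoursite"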
Link for more detail: