Move OneNote Notebooks to New SharePoint Library or Server

Problem:
My company recently completed a SharePoint 2010 to 2016 migration. With the migration came the use of HTTPS, so my OneNote notebooks stored in SharePoint would no longer sync. All of my notebooks displayed a Not syncing error.

Here is how I fixed the issue:
First, make sure all of your notebooks are in SharePoint (a quick way to check this from PowerShell is sketched after these steps).
In OneNote, click on the File tab.
Locate the first notebook you want to update.
Next to the notebook name click Settings, then Properties.
In the Properties window click on Change Location…
Copy the URL of your SharePoint document library.
Paste the URL into the OneNote Choose a sync location… window.
Select the folder you want to sync OneNote with.
Click OK; a message box should appear saying the item is syncing.
Give it a minute and you should be set.
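If you want to confirm from PowerShell that the notebooks really are sitting in the library (step one above), something like this works; a minimal sketch, assuming the SharePoint snap-in is available and using a hypothetical site URL and library title:

Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue

# hypothetical site and library -- swap in your own values
$web = Get-SPWeb "https://sharepoint.sharepointed.com/sites/teamsite"
$library = $web.Lists["Documents"]

# each notebook stored in SharePoint shows up as a folder of .one section files
$library.Folders | ForEach-Object { $_.Url }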

The Web application at X could not be found.

Error: The Web application at https://sharepoint.sharepointed.com could not be found. Verify that you have typed the URL correctly. If the URL should be serving existing content, the system administrator may need to add a new request URL mapping to the intended application.

I created a .NET console app to update some stuff in SharePoint. When executing the .exe file with a new service account, I received the above error.

First, I tried granting Shell (SPShellAdmin) access to the content database I was working with, but that didn't solve the problem.

#powershell
$cDb = Get-SPContentDatabase -site "https://taco.sharepointed.com/"
Add-SPShellAdmin -UserName "domain\userAccount" -database $cDb

Running the same command without the database switch fixed my problem.

#powershell
Add-SPShellAdmin -UserName "domain\userAccount"
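To double-check the result, Get-SPShellAdmin will list who currently has shell access; a quick hedged check, reusing the same hypothetical account and content database from above:

# farm-level shell admins
Get-SPShellAdmin

# shell admins on a specific content database
$cDb = Get-SPContentDatabase -site "https://taco.sharepointed.com/"
Get-SPShellAdmin -database $cDb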

SharePoint CSOM Upload File Access Denied Error

Simple process: using a .NET web app, create a local file and upload it to SharePoint. I thought the service account I was using had ample permissions to the site, but it didn't… For testing, I granted the service account Full Control at the web application level (user policy), made it a site collection admin, and more. Nothing worked.

Sample of the code I was using:

            ClientContext ctxH = new ClientContext(hURL);
            Web siteH = ctxH.Web;
            ctxH.Load(siteH);
            ctxH.ExecuteQuery();

            List _library = siteH.Lists.GetByTitle("Drop Off Library");

            Folder _oFolder = siteH.GetFolderByServerRelativeUrl(siteH.ServerRelativeUrl.TrimEnd('/') + "/" + "DropOffLibrary");
            ctxH.Load(_oFolder);
            ctxH.ExecuteQuery();

            FileStream fileStream = System.IO.File.OpenRead(fileName);
            FileCreationInformation fileInfo = new FileCreationInformation();
            fileInfo.ContentStream = fileStream;
            fileInfo.Overwrite = true;
            fileInfo.Url = destFileName;

            Microsoft.SharePoint.Client.File _oFile = _oFolder.Files.Add(fileInfo);
            ctxH.Load(_oFile);
            ctxH.ExecuteQuery();

Quick and simple fix:
Grant your account / service account Design permissions to the Site/Web where you are uploading files.
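If you would rather grant the permission from PowerShell than through the UI, a hedged sketch along these lines should do it (hypothetical site URL and account name; EnsureUser and Set-SPUser are standard, but confirm the permission level name in your farm):

Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue

# grant the service account Design on the web where the files are uploaded
$web = Get-SPWeb "https://sharepointed.com/sites/foodsite"
$user = $web.EnsureUser("domain\serviceAccount")
Set-SPUser -Identity $user -Web $web -AddPermissionLevel "Design"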

Error at the web level:
C:\Windows\TEMP\tmp28E2.tmp: Access denied. You do not have permission to perform this action or access this resource.

ULS errors:
Permission check failed. Asking for 0x00040002, have 0x1B00C0310EF
Access denied.
System.UnauthorizedAccessException: Access denied., StackTrace:
Exception : System.UnauthorizedAccessException: Access denied.
Exception occured in scope Microsoft.SharePoint.SPFileCollection.Add. Exception=System.UnauthorizedAccessException: Access denied.
Original error: System.UnauthorizedAccessException: Access denied.
SocialRESTExceptionProcessingHandler.DoServerExceptionProcessing – SharePoint Server Exception [System.UnauthorizedAccessException: Access denied.
Throw UnauthorizedAccessException instead of SPUtilityInternal.Send401 for client.svc request.

Use PowerShell to Execute SharePoint Search Queries

In this example, I’m narrowing my search to one library and a search term.

At a high level, the script is searching the FoodSite for the word GoodTaco.

cls

function Query-SPSearch {
	param(
		[Parameter(Mandatory=$true)][String]$WebApplicationPath,
		[Parameter(Mandatory=$true)][String]$KeywordQuery,
		[Parameter()][Int32]$Count = 10
	)

	$QueryXml = @"

<QueryPacket xmlns="urn:Microsoft.Search.Query" >
    <Query>
        <Context>
            <QueryText type="STRING">$KeywordQuery</QueryText>
        </Context>
        <Range>
            <Count>$Count</Count>
        </Range>    
        <IncludeSpecialTermResults>false</IncludeSpecialTermResults>
        <PreQuerySuggestions>false</PreQuerySuggestions>
        <HighlightQuerySuggestions>false</HighlightQuerySuggestions>
        <IncludeRelevantResults>true</IncludeRelevantResults>
        <IncludeHighConfidenceResults>false</IncludeHighConfidenceResults>
    </Query>
</QueryPacket>
"@
	$ServicePath = "/_vti_bin/search.asmx"
	$SearchWS = New-WebServiceProxy -Uri ($WebApplicationPath + $ServicePath) -UseDefaultCredential
	$Results = $SearchWS.QueryEx( $QueryXml )
	# we excluded all other result sets, but just in case get the one we want:
	$Results.Tables["RelevantResults"]
}

Query-SPSearch -WebApplicationPath "https://sharepointed.com/sites/foodsite" -KeywordQuery "GoodTaco AND path:https://sharepointed.com/sites/foodsite/tacos" -Count 20 | Format-Table Title, Author, Path

Searching SharePoint Using PowerShell

In this example, I needed to search a farm for every site under a managed path. But the sites I was searching for were built using a 3rd party tool and would not correctly appear in the search results. The problem was related to having trim duplicates enabled by default. Easy fix… set TrimDuplicates to $false on the query.


$site = Get-SPSite "https://sharepointed.com"

$keywordQuery = New-Object Microsoft.Office.Server.Search.Query.KeywordQuery($site)
$queryText = "ContentClass:STS_Site AND Path:https://sharepointed.com/TACOS/*"
$keywordQuery.QueryText = $queryText
$keywordQuery.TrimDuplicates = $false
$searchExec = New-Object Microsoft.Office.Server.Search.Query.SearchExecutor
$searchResults = $searchExec.ExecuteQuery($keywordQuery)

Write-Host "`r`n"
$table = $searchResults.Table
Write-Host $table.Length" Results Found" -BackgroundColor "Green" -ForegroundColor "Black"
$table | select Title, Path, IsDocument

The search results will display all sites under the TACOS managed path. If you are not retrieving the results you expect, try setting TrimDuplicates to $false.

Remove Upload Button and Drag files here to upload text

There might be a better way to get this done, but for now, this works for me. Keep in mind, this update works on a per-view basis. If I find a way to correctly update the master page, I will update this post.

At a high level, you will need to modify the page, add two content editor web parts, and save.

Normal page displaying the Upload button and Drag files here to upload text.

Start by editing the page. At the top right of the screen, click the gear icon, then select Edit Page. Once in edit mode, add two Content Editor web parts to the page.

Place your cursor in the top web part, then select Edit Source in the Format Text ribbon tab.

In the Source window, enter the following text: link to script

Update the second web part inserting the following text: link to script

Once the web parts have been updated, click on Stop Editing.

All done.

You can also upload the attached web parts to your page:  zip file of web parts

Email address is incorrect for user in SharePoint

In the process of migrating from SharePoint 2010 to 2016, I ran into a small problem.

When trying to get the email property from the SPUser class, it returned a value of domain\userName. Clearly, this is not correct and caused some other issues.

Sample code

$web = Get-SPWeb "https://webapp.taco/toppings/cheese"
$userEnsure = $web.EnsureUser("domain\yourNameHere")
write-host $userEnsure.Email

Running this returned domain\yourNameHere, when it should have returned yourname@domain.com.

Navigate to Central Admin, then cruise over to your User Profile Service. Once there, run a full synchronization.
Profile Service –> Synchronization –> Start Profile Synchronization –> Start Full Synchronization

Run the PowerShell script again and it will return the correct data.

Same idea as above but using the SharePoint ClientContext.

            using (ClientContext clientContext = new ClientContext("https://webapp.taco/toppings/cheese"))
            {
                Web web = clientContext.Web;

                clientContext.Load(web);
                clientContext.Load(web.CurrentUser);
                clientContext.ExecuteQuery();

                var userEmail = web.CurrentUser.Email;
            }
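If running a full profile synchronization isn't an option right away, the email can also be patched directly on the SPUser. This is not part of the original fix, just a hedged stop-gap using the same sample URL and account as above:

$web = Get-SPWeb "https://webapp.taco/toppings/cheese"
$user = $web.EnsureUser("domain\yourNameHere")

# set the email directly; the next full profile sync may overwrite it
Set-SPUser -Identity $user -Web $web -Email "yourname@domain.com"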

SharePoint listdata.svc Returns Error – FIXED

With SharePoint 2016 and 2013:
If you try to access listdata.svc, you receive an error of This page can’t be displayed or Sorry, something went wrong.
In SharePoint Designer, when you try to open Lists and Libraries, you receive a message of There are no items to show in this view.

The root problem is that the farm is missing a feature that one of the lists depends on. In SPD, click All Files, then Lists, then click each list and click the Preview in Browser button in the ribbon; sooner or later you will find the problem list. From there, you can remove the list or find the problem feature and unhook it (a sketch of both options follows the script below).

Basic script to find the problem list in SharePoint 2013 and 2016:

function Get-WebPage([string]$url)
{
	$pageContents = ""
	
	try
	{
		$wc = new-object net.webclient;
		$wc.credentials = [System.Net.CredentialCache]::DefaultCredentials;
		$pageContents = $wc.DownloadString($url);
		$wc.Dispose();
	}
	catch [System.Net.WebException]
	{
		# error pages (e.g. HTTP 500) throw; capture the response body so the caller can still inspect it
		$resp = $_.Exception.Response
		if ($resp -ne $null)
		{
			$reader = New-Object System.IO.StreamReader($resp.GetResponseStream())
			$pageContents = $reader.ReadToEnd()
			$reader.Dispose()
		}
	}
	catch {}
    return $pageContents;
}

$webX = Get-SPWeb "https://yourSpWebUrl"

foreach($list in $webX.Lists)
{
	$listUrl = $list.ParentWeb.Url + "/" + $list.RootFolder.Url
	
	$xo = Get-WebPage -url $listUrl 

	if($xo -like "*Sorry, something went wrong*")
	{
		Write-Host $listUrl
	}	
}
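Once the problem list turns up, you can delete it or deactivate the feature behind it. A hedged sketch with a hypothetical list title and a placeholder feature GUID, so swap in your own values:

$webX = Get-SPWeb "https://yourSpWebUrl"

# option 1: delete the broken list
$badList = $webX.Lists["Broken List Title"]
$badList.Delete()

# option 2: deactivate the feature it depends on (use the real feature id from your farm)
Disable-SPFeature -Identity "00000000-0000-0000-0000-000000000000" -Url "https://yourSpWebUrl" -Confirm:$false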

Uploading Files to SharePoint 2016 Using ListData.svc

Update
Another issue we ran into was related to client machines having an outdated cert. Once the updated cert was published to SharePoint, the client machines downloaded the new cert and were able to upload to SharePoint.

I ran into a small issue when testing some code for a SharePoint 2010 to SharePoint 2016 migration.

With SharePoint 2010, the following code sample would work to upload files to a library.

//ServRef is a Service Reference to _vti_bin/ListData.svc
string sharePointSvc = "https://sp2016.some.url/sites/random/_vti_bin/ListData.svc";

            using (FileStream file = File.Open(@"C:\test1.docx", FileMode.Open))
            {
                ServRef.RandomDataContext ctx = new ServRef.RandomDataContext(new Uri(sharePointSvc));

                ctx.Credentials = System.Net.CredentialCache.DefaultNetworkCredentials;

                string filename = Path.GetFileNameWithoutExtension(file.Name);
                string path = "/sites/random/dropofflibrary/" + Path.GetFileName(file.Name);
                string contentType = "Interest Summary";

                ServRef.DropOffLibraryItem documentItem = new ServRef.DropOffLibraryItem()
                {
                    ContentType = contentType,
                    Name = filename,
                    Title = filename
                };

                ctx.AddToDropOffLibrary(documentItem);
                ctx.SetSaveStream(documentItem, file, false, contentType, path);

                try
                {
                    ctx.SaveChanges();
                }
                catch (Exception ex)
                {
                    var err = ex.Message;
                    throw;
                }
            } 

When trying to use this same code with SharePoint 2016, I was receiving the following errors:
Output error: An error occurred while processing this request.

Errors from Fiddler:
Auth:
No Proxy-Authenticate Header is present.

No WWW-Authenticate Header is present.

Caching:

Under RFC2616, HTTP/500 responses will not be cached regardless of what caching headers may be present. HTTP/1.1 Cache-Control Header is present: no-cache

This response does not specify explicit HTTP Cache Lifetime information and does not specify a Last-Modified date. Heuristic expiration is typically based on Last-Modified date. Lacking Last-Modified, this response may be revalidated on every use or once per browsing session, depending on the browser configuration.

This response contains neither an ETAG nor a Last-Modified time. This will prevent a Conditional Revalidation of this response.

FIX
For SharePoint 2016, the upload process required that the Path property be populated.

//ServRef is a Service Reference to _vti_bin/ListData.svc
string sharePointSvc = "https://sp2016.some.url/sites/random/_vti_bin/ListData.svc";

            using (FileStream file = File.Open(@"C:\test1.docx", FileMode.Open))
            {
                ServRef.RandomDataContext ctx = new ServRef.RandomDataContext(new Uri(sharePointSvc));

                ctx.Credentials = System.Net.CredentialCache.DefaultNetworkCredentials;

                string filename = Path.GetFileNameWithoutExtension(file.Name);
                string path = "/sites/random/dropofflibrary/" + Path.GetFileName(file.Name);
                string contentType = "Interest Summary";

                ServRef.DropOffLibraryItem documentItem = new ServRef.DropOffLibraryItem()
                {
                    Path = path,
                    ContentType = contentType,
                    Name = filename,
                    Title = filename
                };

                ctx.AddToDropOffLibrary(documentItem);
                ctx.SetSaveStream(documentItem, file, false, contentType, path);

                try
                {
                    ctx.SaveChanges();
                }
                catch (Exception ex)
                {
                    var err = ex.Message;
                    throw;
                }
            } 

Make Your PowerShell Script Environment Aware

In place of hard-coding URLs for each environment, I decided to make a single script that is environment aware. Why? It cuts down on the number of scripts that have to be supported for a single development cycle. To make this more dynamic, you could move this to a function script and reference it from all your scripts (a sketch of that is at the end of this post).

if ((Get-PSSnapin -Name Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue) -eq $null)
{
	Add-PsSnapin Microsoft.SharePoint.PowerShell
}

#get config database server
$ConfigDB = Get-SPDatabase | where-Object{$_.Type -eq "Configuration Database"}
$serverName = $ConfigDB.Server.Displayname

#replace this with the web app you want to target.  taco, burrito, nacho...
$webApp = "taco"

#set variable equal to the environment url
$siteURL = switch ($serverName.ToLower())
{
	"dev_db" {"http://$webApp.sharepointed.com/"}
	"test_db" {"http://test$webApp.sharepointed.com/"}
	"build_db" {"http://build$webApp.sharepointed.com/"}
	"prod_db" {"http://$webApp.sharepointed.com/"}
}

Same as above, but using a wildcard in the switch statement.

$siteURL = switch -Wildcard ($serverName.ToLower())
{
	"*dev*" {"http://$webApp.sharepointed.com/"}
	"*test*" {"http://test$webApp.sharepointed.com/"}
	"*build*" {"http://build$webApp.sharepointed.com/"}
	"*prod*" {"http://$webApp.sharepointed.com/"}
}

Make sure to check the $serverName = $ConfigDB.Server.Displayname line.
Depending on your farm, it might need to be replaced with $ConfigDB.Displayname.
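As mentioned at the top of this post, the switch can be moved into a shared function script and dot-sourced from every deployment script. A minimal sketch, assuming a file name and share path of your choosing:

# Get-EnvironmentUrl.ps1 -- hypothetical shared function script
if ((Get-PSSnapin -Name Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue) -eq $null)
{
	Add-PsSnapin Microsoft.SharePoint.PowerShell
}

function Get-EnvironmentUrl {
	param([Parameter(Mandatory=$true)][String]$WebApp)

	# read the config database server name to figure out which farm we are in
	$configDb = Get-SPDatabase | Where-Object { $_.Type -eq "Configuration Database" }
	$serverName = $configDb.Server.Displayname

	switch -Wildcard ($serverName.ToLower())
	{
		"*dev*"   {"http://$WebApp.sharepointed.com/"}
		"*test*"  {"http://test$WebApp.sharepointed.com/"}
		"*build*" {"http://build$WebApp.sharepointed.com/"}
		"*prod*"  {"http://$WebApp.sharepointed.com/"}
	}
}

# in any other script:
# . "\\server\scripts\Get-EnvironmentUrl.ps1"
# $siteURL = Get-EnvironmentUrl -WebApp "taco"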