SharePoint Developer Support Team Blog

SharePoint Online Active Authentication


This post is a contribution from Vitaly Lyamin, an engineer with the SharePoint Developer Support team

We often see issues related to actively authenticating to SharePoint Online for the purpose of consuming APIs and services (WCF and ASMX). There are two flavors of authentication: one with a custom STS (such as ADFS) and one without (using the MSO STS only). The end goal in both cases is to retrieve the authentication cookie (the SPOIDCRL cookie).

Step 1: Getting the Custom STS active endpoint URL
Microsoft Online provides a way to discover the custom STS authentication URL via the “GetUserRealm.srf” endpoint. The “STSAuthURL” node in the XML response contains the value.
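As a quick illustration, realm discovery is a single POST against GetUserRealm.srf. The following is a minimal sketch (the login value is a placeholder) that mirrors the Get-UserRealmUrl function in the full script further down:

$body = "login=user@contoso.com&xml=1"
$response = Invoke-WebRequest -Uri "https://login.microsoftonline.com/GetUserRealm.srf" -Method POST -Body $body
# STSAuthURL is only present for federated accounts; it is empty for cloud-only (managed) accounts
$stsAuthUrl = ([xml]$response.Content).RealmInfo.STSAuthURL
$stsAuthUrl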

Step 2: Authenticating to the STS and Retrieving the BinarySecurityToken
The default MSO endpoint https://login.microsoftonline.com/rst2.srf will either take the *.onmicrosoft.com user credentials or the assertion from the custom STS.

If there’s a custom STS (as discovered in previous step), that endpoint needs to be hit first to retrieve the assertion.

The SAML response from rst2.srf endpoint contains the BinarySecurityToken which is retrieved and used in the next step.

STS Endpoints
https://login.microsoftonline.com/rst2.srf (default MSO endpoint)
https://#ADFSHOST#/adfs/services/trust/2005/usernamemixed (username/password ADFS endpoint)
https://#ADFSHOST#/adfs/services/trust/2005/windowstransport (integrated Windows ADFS endpoint)

Step 3: Get the SPOIDCRL Cookie
Now that we have the BinarySecurityToken, we can pass the value to the https://TENANT.sharepoint.com/_vti_bin/idcrl.svc endpoint in the Authorization header.

Authorization Header with BinarySecurityToken
Authorization: BPOSIDCRL t=*

The response from the idcrl.svc sets the SPOIDCRL cookie which can be programmatically retrieved and used in subsequent API calls.
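For example, once the full script below has run and set the global $spoidcrl variable (a System.Net.Cookie), the cookie can be attached to subsequent REST calls roughly like this (a hedged sketch; the site URL and credentials are placeholders):

.\spoidcrl.ps1 -url https://contoso.sharepoint.com -username user@contoso.com -password ABCDEFG | Out-Null

$session = New-Object Microsoft.PowerShell.Commands.WebRequestSession
$session.Cookies.Add($spoidcrl)   # reuse the SPOIDCRL cookie captured by the script

Invoke-RestMethod -Uri "https://contoso.sharepoint.com/_api/web/title" -WebSession $session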

PowerShell Script

<#
    .Synopsis 
        Retrieve the SPOIDCRL cookie for SharePoint Online.
    .Description
        Authenticates to the STS and retrieves the SPOIDCRL cookie for SharePoint Online.
        Will use the custom IDP if one has been set up.
        Optionally, can use integrated credentials (when -integrated is set) with ADFS using the windowstransport endpoint.
        Results are formattable as XML, JSON, KEYVALUE, and by line.
        
        Makes global variables available at the end of the run.
        $spoidcrl contains the SPOIDCRL cookie

    .Example 
        The following returns the SPOIDCRL cookie value provided a username and password.

        PS> .\spoidcrl.ps1 -url https://contoso.sharepoint.com -username user@contoso.com -password ABCDEFG
    .Example 
        The following returns the SPOIDCRL cookie value using integrated windows credentials. Applies only to ADFS.

        PS> .\spoidcrl.ps1 -url https://contoso.sharepoint.com/sites/site1 -integrated

	.Example 
        The following saves the SPOIDCRL cookie value using integrated windows credentials. Applies only to ADFS.

        PS> .\spoidcrl.ps1 -url https://contoso.sharepoint.com/sites/site1 -integrated -format "XML" | Out-File "c:\temp\spoidcr.txt"

    .PARAMETER url 
        Tenant url (e.g. contoso.sharepoint.com)
    .PARAMETER username
        The username to login with. (e.g. user@contoso.com or user@contoso.onmicrosoft.com)		
    .PARAMETER password
      The password to login with.
    .PARAMETER integrated
      Whether to use integrated credentials (user running PowerShell) instead of explicit credentials.
      Needs to be supported by ADFS.
    .PARAMETER format
      How to format the output. Options include: XML, JSON, KEYVALUE

#>
[CmdletBinding()]
Param(
[Parameter(Mandatory=$true)]
[string]$url,
[Parameter(Mandatory=$false)]
[string]$username,
[Parameter(Mandatory=$false)]
[string]$password,
[Parameter(Mandatory=$false)]
[switch]$integrated = $false,
[Parameter(Mandatory=$false)]
[string]$format
)

$statusText = New-Object System.Text.StringBuilder

function log($info)
{
    if([string]::IsNullOrEmpty($info))
    {
        $info = ""
    }

    [void]$statusText.AppendLine($info)
}

try
{
    if (![uri]::IsWellFormedUriString($url, [UriKind]::Absolute))
    {
        throw "Parameter 'url' is not a valid URI."
    }
    else
    {
        $uri = [uri]::new($url)
        $tenant = $uri.Authority
    }

    if ($tenant.EndsWith("sharepoint.com", [System.StringComparison]::OrdinalIgnoreCase))
    {
        $msoDomain = "sharepoint.com"
    }
    else
    {
        $msoDomain = $tenant
    }

    if ($integrated.ToBool())
    {
        [System.Reflection.Assembly]::LoadWithPartialName("System.DirectoryServices") | out-null
        [System.Reflection.Assembly]::LoadWithPartialName("System.DirectoryServices.AccountManagement") | out-null
        $username = [System.DirectoryServices.AccountManagement.UserPrincipal]::Current.UserPrincipalName
    }
    elseif ([string]::IsNullOrWhiteSpace($username) -or [string]::IsNullOrWhiteSpace($password))
    {
        $credential = Get-Credential -UserName $username -Message "Enter credentials"
        $username = $credential.UserName
        $password = $credential.GetNetworkCredential().Password
    }

    $contextInfoUrl = $url.TrimEnd('/') + "/_api/contextinfo"
    $getRealmUrl = "https://login.microsoftonline.com/GetUserRealm.srf"
    $realm = "urn:federation:MicrosoftOnline"
    $msoStsAuthUrl = "https://login.microsoftonline.com/rst2.srf"
    $idcrlEndpoint = "https://$tenant/_vti_bin/idcrl.svc/"
    $username = [System.Security.SecurityElement]::Escape($username)
    $password = [System.Security.SecurityElement]::Escape($password)

    # Custom STS integrated authentication envelope format index info
    # 0: message id - unique guid
    # 1: custom STS auth url
    # 2: realm
    $customStsSamlIntegratedRequestFormat = "<?xml version=`"1.0`" encoding=`"UTF-8`"?><s:Envelope xmlns:s=`"http://www.w3.org/2003/05/soap-envelope`" xmlns:a=`"http://www.w3.org/2005/08/addressing`"><s:Header><a:Action s:mustUnderstand=`"1`">http://schemas.xmlsoap.org/ws/2005/02/trust/RST/Issue</a:Action><a:MessageID>urn:uuid:{0}</a:MessageID><a:ReplyTo><a:Address>http://www.w3.org/2005/08/addressing/anonymous</a:Address></a:ReplyTo><a:To s:mustUnderstand=`"1`">{1}</a:To></s:Header><s:Body><t:RequestSecurityToken xmlns:t=`"http://schemas.xmlsoap.org/ws/2005/02/trust`"><wsp:AppliesTo xmlns:wsp=`"http://schemas.xmlsoap.org/ws/2004/09/policy`"><wsa:EndpointReference xmlns:wsa=`"http://www.w3.org/2005/08/addressing`"><wsa:Address>{2}</wsa:Address></wsa:EndpointReference></wsp:AppliesTo><t:KeyType>http://schemas.xmlsoap.org/ws/2005/05/identity/NoProofKey</t:KeyType><t:RequestType>http://schemas.xmlsoap.org/ws/2005/02/trust/Issue</t:RequestType></t:RequestSecurityToken></s:Body></s:Envelope>";


    # custom STS envelope format index info
    # {0}: ADFS url, such as https://corp.sts.contoso.com/adfs/services/trust/2005/usernamemixed, its value comes from the response in GetUserRealm request.
    # {1}: MessageId, it could be an arbitrary guid
    # {2}: UserLogin, such as someone@contoso.com
    # {3}: Password
    # {4}: Created datetime in UTC, such as 2012-11-16T23:24:52Z
    # {5}: Expires datetime in UTC, such as 2012-11-16T23:34:52Z
    # {6}: tokenIssuerUri, such as urn:federation:MicrosoftOnline, or urn:federation:MicrosoftOnline-int
    $customStsSamlRequestFormat = "<?xml version=`"1.0`" encoding=`"UTF-8`"?><s:Envelope xmlns:s=`"http://www.w3.org/2003/05/soap-envelope`" xmlns:wsse=`"http://docs.oasis-open.org/wss/2004/01/oasis-200401-wss-wssecurity-secext-1.0.xsd`" xmlns:saml=`"urn:oasis:names:tc:SAML:1.0:assertion`" xmlns:wsp=`"http://schemas.xmlsoap.org/ws/2004/09/policy`" xmlns:wsu=`"http://docs.oasis-open.org/wss/2004/01/oasis-200401-wss-wssecurity-utility-1.0.xsd`" xmlns:wsa=`"http://www.w3.org/2005/08/addressing`" xmlns:wssc=`"http://schemas.xmlsoap.org/ws/2005/02/sc`" xmlns:wst=`"http://schemas.xmlsoap.org/ws/2005/02/trust`"><s:Header><wsa:Action s:mustUnderstand=`"1`">http://schemas.xmlsoap.org/ws/2005/02/trust/RST/Issue</wsa:Action><wsa:To s:mustUnderstand=`"1`">{0}</wsa:To><wsa:MessageID>{1}</wsa:MessageID><ps:AuthInfo xmlns:ps=`"http://schemas.microsoft.com/Passport/SoapServices/PPCRL`" Id=`"PPAuthInfo`"><ps:HostingApp>Managed IDCRL</ps:HostingApp><ps:BinaryVersion>6</ps:BinaryVersion><ps:UIVersion>1</ps:UIVersion><ps:Cookies></ps:Cookies><ps:RequestParams>AQAAAAIAAABsYwQAAAAxMDMz</ps:RequestParams></ps:AuthInfo><wsse:Security><wsse:UsernameToken wsu:Id=`"user`"><wsse:Username>{2}</wsse:Username><wsse:Password>{3}</wsse:Password></wsse:UsernameToken><wsu:Timestamp Id=`"Timestamp`"><wsu:Created>{4}</wsu:Created><wsu:Expires>{5}</wsu:Expires></wsu:Timestamp></wsse:Security></s:Header><s:Body><wst:RequestSecurityToken Id=`"RST0`"><wst:RequestType>http://schemas.xmlsoap.org/ws/2005/02/trust/Issue</wst:RequestType><wsp:AppliesTo><wsa:EndpointReference>  <wsa:Address>{6}</wsa:Address></wsa:EndpointReference></wsp:AppliesTo><wst:KeyType>http://schemas.xmlsoap.org/ws/2005/05/identity/NoProofKey</wst:KeyType></wst:RequestSecurityToken></s:Body></s:Envelope>"

    # mso envelope format index info (Used for custom STS + MSO authentication)
    # 0: custom STS assertion
    # 1: mso endpoint
    $msoSamlRequestFormat = "<?xml version=`"1.0`" encoding=`"UTF-8`"?><S:Envelope xmlns:S=`"http://www.w3.org/2003/05/soap-envelope`" xmlns:wsse=`"http://docs.oasis-open.org/wss/2004/01/oasis-200401-wss-wssecurity-secext-1.0.xsd`" xmlns:wsp=`"http://schemas.xmlsoap.org/ws/2004/09/policy`" xmlns:wsu=`"http://docs.oasis-open.org/wss/2004/01/oasis-200401-wss-wssecurity-utility-1.0.xsd`" xmlns:wsa=`"http://www.w3.org/2005/08/addressing`" xmlns:wst=`"http://schemas.xmlsoap.org/ws/2005/02/trust`"><S:Header><wsa:Action S:mustUnderstand=`"1`">http://schemas.xmlsoap.org/ws/2005/02/trust/RST/Issue</wsa:Action><wsa:To S:mustUnderstand=`"1`">https://login.microsoftonline.com/rst2.srf</wsa:To><ps:AuthInfo xmlns:ps=`"http://schemas.microsoft.com/LiveID/SoapServices/v1`" Id=`"PPAuthInfo`"><ps:BinaryVersion>5</ps:BinaryVersion><ps:HostingApp>Managed IDCRL</ps:HostingApp></ps:AuthInfo><wsse:Security>{0}</wsse:Security></S:Header><S:Body><wst:RequestSecurityToken xmlns:wst=`"http://schemas.xmlsoap.org/ws/2005/02/trust`" Id=`"RST0`"><wst:RequestType>http://schemas.xmlsoap.org/ws/2005/02/trust/Issue</wst:RequestType><wsp:AppliesTo><wsa:EndpointReference><wsa:Address>{1}</wsa:Address></wsa:EndpointReference></wsp:AppliesTo><wsp:PolicyReference URI=`"MBI`"></wsp:PolicyReference></wst:RequestSecurityToken></S:Body></S:Envelope>"

    # mso envelope format index info (Used for MSO-only authentication)
    # 0: mso endpoint
    # 1: username
    # 2: password
    $msoSamlRequestFormat2 = "<?xml version=`"1.0`" encoding=`"UTF-8`"?><S:Envelope xmlns:S=`"http://www.w3.org/2003/05/soap-envelope`" xmlns:wsse=`"http://docs.oasis-open.org/wss/2004/01/oasis-200401-wss-wssecurity-secext-1.0.xsd`" xmlns:wsp=`"http://schemas.xmlsoap.org/ws/2004/09/policy`" xmlns:wsu=`"http://docs.oasis-open.org/wss/2004/01/oasis-200401-wss-wssecurity-utility-1.0.xsd`" xmlns:wsa=`"http://www.w3.org/2005/08/addressing`" xmlns:wst=`"http://schemas.xmlsoap.org/ws/2005/02/trust`"><S:Header><wsa:Action S:mustUnderstand=`"1`">http://schemas.xmlsoap.org/ws/2005/02/trust/RST/Issue</wsa:Action><wsa:To S:mustUnderstand=`"1`">{0}</wsa:To><ps:AuthInfo xmlns:ps=`"http://schemas.microsoft.com/LiveID/SoapServices/v1`" Id=`"PPAuthInfo`"><ps:BinaryVersion>5</ps:BinaryVersion><ps:HostingApp>Managed IDCRL</ps:HostingApp></ps:AuthInfo><wsse:Security><wsse:UsernameToken wsu:Id=`"user`"><wsse:Username>{1}</wsse:Username><wsse:Password>{2}</wsse:Password></wsse:UsernameToken></wsse:Security></S:Header><S:Body><wst:RequestSecurityToken xmlns:wst=`"http://schemas.xmlsoap.org/ws/2005/02/trust`" Id=`"RST0`"><wst:RequestType>http://schemas.xmlsoap.org/ws/2005/02/trust/Issue</wst:RequestType><wsp:AppliesTo><wsa:EndpointReference><wsa:Address>sharepoint.com</wsa:Address></wsa:EndpointReference></wsp:AppliesTo><wsp:PolicyReference URI=`"MBI`"></wsp:PolicyReference></wst:RequestSecurityToken></S:Body></S:Envelope>"


    function Invoke-HttpPost($endpoint, $body, $headers, $session)
    {
        log
        log "Invoke-HttpPost"
        log "url: $endpoint"
        log "post body: $body"

        $params = @{}
        $params.Headers = $headers
        $params.uri = $endpoint
        $params.Body = $body
        $params.Method = "POST"
        $params.WebSession = $session

        $response = Invoke-WebRequest @params -ContentType "application/soap+xml; charset=utf-8" -UseDefaultCredentials -UserAgent ([string]::Empty)
        $content = $response.Content

        return $content
    }

    # Get saml Assertion value from the custom STS
    function Get-AssertionCustomSts($customStsAuthUrl)
    {
        log
        log "Get-AssertionCustomSts"

        $messageId = [guid]::NewGuid()
        $created = [datetime]::UtcNow.ToString("o", [System.Globalization.CultureInfo]::InvariantCulture)
        $expires = [datetime]::UtcNow.AddMinutes(10).ToString("o", [System.Globalization.CultureInfo]::InvariantCulture)

        if ($integrated.ToBool())
        {
            log "integrated"

            $customStsAuthUrl = $customStsAuthUrl.ToLowerInvariant().Replace("/usernamemixed","/windowstransport")
            log $customStsAuthUrl

            $requestSecurityToken = [string]::Format($customStsSamlIntegratedRequestFormat, $messageId, $customStsAuthUrl, $realm)
            log $requestSecurityToken

        }
        else
        {
            log "not integrated"

            $requestSecurityToken = [string]::Format($customStsSamlRequestFormat, $customStsAuthUrl, $messageId, $username, $password, $created, $expires, $realm)
            log $requestSecurityToken

        }

        [xml]$customStsXml = Invoke-HttpPost $customStsAuthUrl $requestSecurityToken

        return $customStsXml.Envelope.Body.RequestSecurityTokenResponse.RequestedSecurityToken.Assertion.OuterXml
    }

    function Get-BinarySecurityToken($customStsAssertion, $msoSamlRequestFormatTemp)
    {
        log
        log "Get-BinarySecurityToken"

        if ([string]::IsNullOrWhiteSpace($customStsAssertion))
        {
            log "using username and password"
            $msoPostEnvelope = [string]::Format($msoSamlRequestFormatTemp, $msoDomain, $username, $password)
        }
        else
        {
            log "using custom sts assertion"
            $msoPostEnvelope = [string]::Format($msoSamlRequestFormatTemp, $customStsAssertion, $msoDomain)
        }

        $msoContent = Invoke-HttpPost $msoStsAuthUrl $msoPostEnvelope

        # Get binary security token using regex instead of [xml]
        # Using regex to workaround PowerShell [xml] bug where hidden characters cause failure
        [regex]$regex = "BinarySecurityToken Id=.*>([^<]+)<"
        $match = $regex.Match($msoContent).Groups[1]

        return $match.Value
    }

    function Get-SPOIDCRLCookie($msoBinarySecurityToken)
    {
        log
        log "Get-SPOIDCRLCookie"
        log
        log "BinarySecurityToken: $msoBinarySecurityToken"

        $binarySecurityTokenHeader = [string]::Format("BPOSIDCRL {0}", $msoBinarySecurityToken)
        $params = @{uri=$idcrlEndpoint
                    Method="GET"
                    Headers = @{}
                   }
        $params.Headers["Authorization"] = $binarySecurityTokenHeader
        $params.Headers["X-IDCRL_ACCEPTED"] = "t"

        $response = Invoke-WebRequest @params -UserAgent ([string]::Empty)
        $cookie = $response.BaseResponse.Cookies["SPOIDCRL"]

        return $cookie
    }

    # Retrieve the configured STS Auth Url (ADFS, PING, etc.)
    function Get-UserRealmUrl($getRealmUrl, $username)
    {
        log
        log "Get-UserRealmUrl"
        log "url: $getRealmUrl"
        log "username: $username"

        $body = "login=$username&xml=1"
        $response = Invoke-WebRequest -Uri $getRealmUrl -Method POST -Body $body -UserAgent ([string]::Empty)

        return ([xml]$response.Content).RealmInfo.STSAuthURL
    }

    [System.Net.ServicePointManager]::Expect100Continue = $true

    #1 Get custom STS auth url
    $customStsAuthUrl = Get-UserRealmUrl $getRealmUrl $username

    if ($customStsAuthUrl -eq $null)
    {
        #2 Get binary security token from the MSO STS by passing the SAML <Assertion> xml
        $customStsAssertion = $null
        $msoBinarySecurityToken = Get-BinarySecurityToken $customStsAssertion $msoSamlRequestFormat2
    }
    else
    {
        #2 Get SAML <Assertion> xml from custom STS
        $customStsAssertion = Get-AssertionCustomSts $customStsAuthUrl

        #3 Get binary security token from the MSO STS by passing the SAML <Assertion> xml
        $msoBinarySecurityToken = Get-BinarySecurityToken $customStsAssertion $msoSamlRequestFormat
    }

    #3/4 Get SPOIDCRL cookie from SharePoint site by passing the binary security token
    #  Save cookie and reuse with multiple requests
    $idcrl = $null
    $idcrl = Get-SPOIDCRLCookie $msoBinarySecurityToken

    if ([string]::IsNullOrEmpty($format))
    {
        $format = [string]::Empty
    }
    else
    {
        $format = $format.Trim().ToUpperInvariant()
    }

    $Global:spoidcrl = $idcrl

    if ($format -eq "XML")
    {
        Write-Output ([string]::Format("<SPOIDCRL>{0}</SPOIDCRL>", $idcrl.Value))
    }
    elseif ($format -eq "JSON")
    {
        Write-Output ([string]::Format("{{`"SPOIDCRL`":`"{0}`"}}", $idcrl.Value))
    }
    elseif ($format.StartsWith("KEYVALUE") -or $format.StartsWith("NAMEVALUE"))
    {
        Write-Output ("SPOIDCRL:" + $idcrl.Value)
    }
    else
    {
        Write-Output $idcrl.Value
    }

}
catch
{
    log $error[0]
    "ERROR:" + $statusText.ToString()
}



SharePoint Online AAD App OAuth


This post is a contribution from Vitaly Lyamin, an engineer with the SharePoint Developer Support team

Accessing SharePoint APIs has never been easier (SPOIDCRL cookie, ACS OAuth, AAD OAuth). Azure AD apps are quickly becoming the standard way of accessing Office 365 APIs in addition to other APIs. Below are some resources on registering apps and using libraries, along with a test script that walks through the entire authorization grant flow. The end goal with all OAuth-based authorization is to retrieve the access token to be used in the HTTP request Authorization header (Authorization: Bearer <access token>).

Native Client App
Native app registrations are primarily for devices and services where browser interaction is not needed. One of the biggest benefits is non-interactive (active) authorization using credentials, a federated IDP assertion, or similar.

Links
https://docs.microsoft.com/en-us/azure/active-directory/develop/active-directory-authentication-scenarios#native-application-to-web-api
https://azure.microsoft.com/en-us/resources/samples/active-directory-dotnet-native-headless
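For a rough idea of what non-interactive (active) authorization can look like with a native (public client) registration, the resource owner password credentials grant against the Azure AD v1.0 token endpoint can be exercised directly. This is a minimal sketch, not the only option; the tenant, client id, and credentials are placeholders, and it assumes the account can sign in with a plain password (no MFA):

$body = @{
    grant_type = "password"
    resource   = "https://TENANT.sharepoint.com"            # resource being accessed
    client_id  = "00000000-0000-0000-0000-000000000000"     # native app client id (placeholder)
    username   = "user@contoso.com"
    password   = "PASSWORD"
}
$tokenResult = Invoke-RestMethod -Uri "https://login.microsoftonline.com/TENANT.onmicrosoft.com/oauth2/token" -Method POST -Body $body
$tokenResult.access_token   # goes into the "Authorization: Bearer <access token>" header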

Web App / API
Web app registrations are just as they sound – apps on the web. These apps typically use the authorization grant and refresh grant flows and are not intended for devices/services. Once authorized (some permission scopes require admin consent), the access token is retrieved from the OAuth token endpoint using the authorization code.

Authorization URL
https://login.microsoftonline.com/common/oauth2/authorize?resource=<RESOURCE>&client_id=<CLIENTID>&scope=<SCOPE>&redirect_uri=<REDIRECTURI>&response_type=code&prompt=admin_consent

Access Token URL
https://login.microsoftonline.com/common/oauth2/token
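Exchanging the authorization code returned to the redirect URI for tokens is a single POST to the token endpoint above, which is essentially what the test script below does; the values in angle brackets are placeholders:

$body = "client_id=<CLIENTID>" +
        "&client_secret=" + [uri]::EscapeDataString("<CLIENTSECRET>") +
        "&redirect_uri="  + [uri]::EscapeDataString("https://localhost:44385") +
        "&grant_type=authorization_code" +
        "&code=<AUTHORIZATIONCODE>"

$tokenResult = Invoke-RestMethod -Uri "https://login.microsoftonline.com/common/oauth2/token" -Method POST -Body $body
$tokenResult.access_token    # send as "Authorization: Bearer <access token>"
$tokenResult.refresh_token   # use with grant_type=refresh_token to renew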

Link
https://docs.microsoft.com/en-us/azure/active-directory/develop/active-directory-authentication-scenarios#web-browser-to-web-application

Libraries
ADAL libraries are available in many different flavors and are quick and easy to implement. Their primary purpose is to authorize the user/service to a resource (e.g., SharePoint REST APIs, Microsoft Graph).

Link
https://docs.microsoft.com/en-us/azure/active-directory/develop/active-directory-authentication-libraries

Other Resources
https://docs.microsoft.com/en-us/azure/active-directory/develop/active-directory-integrating-applications
https://msdn.microsoft.com/en-us/office/office365/howto/getting-started-Office-365-APIs

Test Script (Web App)

<#
    .Synopsis
        Get access token for AAD web app.

    .Description
        Authorizes AAD app and retrieves access token using OAuth 2.0 and endpoints.
        Refreshes the token if within 5 minutes of expiration or, optionally, forces a refresh.
        Sets global variable ($Global:accessTokenResult) that can be used after the script runs.

    .Todo
        Add ability to handle refresh token input and access token retrieval without re-authorization.

    .Example 
        The following returns the access token result from AAD with admin consent authorization and caches the result.

        PS> .\aad_web.ps1 -Clientid "" -Clientsecret "" -Resource "https://TENANT.sharepoint.com" -Redirecturi "https://localhost:44385" -Scope "" -AdminConsent -Cache
    
    .Example 
        The following returns the access token result from AAD with admin consent authorization or refreshes the token.

        PS> .\aad_web.ps1 -Clientid "" -Clientsecret "" -Resource "https://TENANT.sharepoint.com" -Redirecturi "https://localhost:44385" -Scope "" -AdminConsent
    
    .Example 
        The following returns the access token result from AAD or from cache, forces refresh so the token is good for an hour and outputs to a file

        PS> .\aad_web.ps1 -Clientid "" -Clientsecret "" -Resource "https://TENANT.sharepoint.com" -Redirecturi "https://localhost:44385" -Scope "" -Refresh Force | Out-File c:\temp\token.txt

    .PARAMETER ClientId 
        The AAD App client id.
    .PARAMETER ClientSecret
        The AAD App client secret.	
    .PARAMETER RedirectUri
        The redirect uri configured for that app.
    .PARAMETER Resource
        The resource the app is attempting to access (i.e. https://TENANT.sharepoint.com)
    .PARAMETER Scope
        Permission scopes for the app (optional).
    .PARAMETER AdminConsent
        Will perform admin consent (optional).
    .PARAMETER Cache
        Cache the access token in the temp directory for subsequent retrieval (optional).
    .PARAMETER Refresh
        Options (Yes, No, Force). Will automatically enable caching if "Yes" or "Force" is used.
        Yes: Refresh the token if within 5 minutes of expiration and a cached token is found.
        No: Do not refresh; re-authorize instead.
        Force: Force a refresh if a cached token is found.

#>
[CmdletBinding()]
Param(
[Parameter(Mandatory=$true)]
[string]$ClientId,
[Parameter(Mandatory=$true)]
[string]$ClientSecret,
[Parameter(Mandatory=$true)]
[string]$RedirectUri,
[Parameter(Mandatory=$true)]
[string]$Resource,
[Parameter(Mandatory=$false)]
[string]$Scope,
[Parameter(Mandatory=$false)]
[switch]$AdminConsent,
[Parameter(Mandatory=$false)]
[switch]$Cache,
[Parameter(Mandatory=$false)]
[ValidateSet("Yes","No","Force")]
[ValidateNotNullOrEmpty()]
[string]$Refresh = "Yes"
)

Add-Type -AssemblyName System.Windows.Forms
Add-Type -AssemblyName System.Web

$isCache = $Cache.IsPresent
$isRefresh = (($Refresh -eq "Yes") -or ($Refresh -eq "Force"))
$refreshForce = $Refresh -eq "Force"

if ($isRefresh)
{
    $isCache = $true
}

# Don't edit variables below (unless there's a bug)
$clientSecretEncoded = [uri]::EscapeDataString($clientSecret)
$redirectUriEncoded = [uri]::EscapeDataString($redirectUri)
$resourceEncoded = [uri]::EscapeDataString($resource)
$accessTokenUrl = "https://login.microsoftonline.com/common/oauth2/token"
$cacheFilePath = [System.IO.Path]::Combine($env:TEMP, "aad_web_cache_$clientId.json")

$accessTokenResult = $null
$adminConsentText =""
if ($adminConsent)
{
    $adminConsentText = "&prompt=admin_consent"
}

$authorizationUrl = "https://login.microsoftonline.com/common/oauth2/authorize?resource=$resourceEncoded&client_id=$clientId&scope=$scope&redirect_uri=$redirectUriEncoded&response_type=code$adminConsentText"

function Invoke-OAuth()
{
    $Global:authorizationCode = $null

    $form = New-Object Windows.Forms.Form
    $form.FormBorderStyle = [Windows.Forms.FormBorderStyle]::FixedSingle
    $form.Width = 640
    $form.Height = 480
    $form.MaximizeBox = $false
    $form.MinimizeBox = $false

    $web = New-Object Windows.Forms.WebBrowser
    $form.Controls.Add($web)

    $web.Size = $form.ClientSize
    $web.DocumentText = "<html><body style='text-align:center;overflow:hidden;background-image:url(https://secure.aadcdn.microsoftonline-p.com/ests/2.1.6856.20/content/images/backgrounds/0.jpg?x=f5a9a9531b8f4bcc86eabb19472d15d5)'><h3 id='title'>Continue with current user or logout?</h3><div><input id='cancel' type='button' value='Continue' /></div><br /><div><input id='logout' type='button' value='Logout' /></div><h5 id='loading' style='display:none'>Working on it...</h5><script type='text/javascript'>var logout = document.getElementById('logout');var cancel = document.getElementById('cancel');function click(element){document.getElementById('title').style.display='none';document.getElementById('loading').style.display='block';logout.style.display='none';cancel.style.display='none';if (this.id === 'logout'){window.location = 'https://login.microsoftonline.com/common/oauth2/logout?post_logout_redirect_uri=' + encodeURIComponent('$authorizationUrl');}else{window.location = '$authorizationUrl';}}logout.onclick = click;cancel.onclick = click;</script></body></html>"

    $web.add_DocumentCompleted(
    {
        $uri = [uri]$redirectUri
        $queryString = [System.Web.HttpUtility]::ParseQueryString($_.url.Query)

        if($_.url.authority -eq $uri.authority)
        {
            $authorizationCode = $queryString["code"]

            if (![string]::IsNullOrEmpty($authorizationCode))
            {
                $form.DialogResult = "OK"
                $Global:authorizationCode = $authorizationCode
                $Global:authorizationCodeTime = [datetime]::Now
            }

            $form.close()
        }
    })

    $dialogResult = $form.ShowDialog()

    if($dialogResult -eq "OK")
    {
        $authorizationCode = $Global:authorizationCode
        $headers = @{"Accept" = "application/json;odata=verbose"}
        $body = "client_id=$clientId&client_secret=$clientSecretEncoded&redirect_uri=$redirectUriEncoded&grant_type=authorization_code&code=$authorizationCode"

        $accessTokenResult = Invoke-RestMethod -Uri $accessTokenUrl -Method POST -Body $body -Headers $headers
        $Global:accessTokenResult = $accessTokenResult
        $Global:accessTokenResultTime = [datetime]::Now
        $accessTokenResultText = (ConvertTo-Json $accessTokenResult)

        if ($isCache -and ![string]::IsNullOrEmpty($accessTokenResultText))
        {
            [void](Set-Content -Path $cacheFilePath -Value $accessTokenResultText)
        }

        Write-Output $accessTokenResultText
    }

    $web.Dispose()
    $form.Dispose()
}

function Get-CachedAccessTokenResult()
{
    if ($isCache -and [System.IO.File]::Exists($cacheFilePath))
    {
        $accessTokenResultText = Get-Content -Raw $cacheFilePath
        if (![string]::IsNullOrEmpty($accessTokenResultText))
        {
            $accessTokenResult  = (ConvertFrom-Json $accessTokenResultText)
            if (![string]::IsNullOrEmpty($accessTokenResult.access_token))
            {
                $Global:accessTokenResult = $accessTokenResult

                return $accessTokenResult
            }
        }
    }

    return $null
}

function Invoke-Refresh()
{
    $refreshToken = $accessTokenResult.refresh_token
    $headers = @{"Accept" = "application/json;odata=verbose"}
    $body = "client_id=$clientId&client_secret=$clientSecretEncoded&resource=$resourceEncoded&grant_type=refresh_token&refresh_token=$refreshToken"
    $accessTokenResult2 = Invoke-RestMethod -Uri $accessTokenUrl -Method POST -Body $body -Headers $headers

    $accessTokenResult.scope = $accessTokenResult2.scope
    $accessTokenResult.expires_in = $accessTokenResult2.expires_in
    $accessTokenResult.ext_expires_in = $accessTokenResult2.ext_expires_in
    $accessTokenResult.expires_on = $accessTokenResult2.expires_on
    $accessTokenResult.not_before = $accessTokenResult2.not_before
    $accessTokenResult.resource = $accessTokenResult2.resource
    $accessTokenResult.access_token = $accessTokenResult2.access_token
    $accessTokenResult.refresh_token = $accessTokenResult2.refresh_token

    $Global:accessTokenResult = $accessTokenResult
    $Global:accessTokenResultTime = [datetime]::Now
    $accessTokenResultText = (ConvertTo-Json $accessTokenResult)

    if (![string]::IsNullOrEmpty($accessTokenResultText))
    {
        [void](Set-Content -Path $cacheFilePath -Value $accessTokenResultText)
    }

    Write-Output $accessTokenResultText
}

$accessTokenResult = Get-CachedAccessTokenResult
if ($accessTokenResult -eq $null)
{
    Invoke-OAuth
}
elseif ($refreshForce -or (([datetime]::Parse("1/1/1970")).AddSeconds([int]$accessTokenResult.expires_on).ToLocalTime() -lt ([datetime]::Now).AddMinutes(5)))
{
    if ($isRefresh)
    {
        Invoke-Refresh
    }
    else
    {
        Invoke-OAuth
    }
}
else
{
    Write-Output (ConvertTo-Json $Global:accessTokenResult)
}
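Once the script has populated $Global:accessTokenResult, the access token can be sent in the Authorization header of subsequent calls against the resource. A minimal sketch against the SharePoint REST API (the tenant URL is a placeholder, and the app is assumed to have been granted the relevant SharePoint permissions):

$headers = @{
    Authorization = "Bearer $($Global:accessTokenResult.access_token)"
    Accept        = "application/json;odata=verbose"
}
Invoke-RestMethod -Uri "https://TENANT.sharepoint.com/_api/web/title" -Headers $headers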

How to adjust default width for Name Column in Modern UI


This post is a contribution from Jing Wang, an engineer with the SharePoint Developer Support team

One SPO customer had a Document library, which showed Name (filename) column in the default view.

The filenames in the library are generally not short, and the default display looked like this:

A large part of each filename was cut off on the right side by default.

They wanted the "Name" column to have a larger default width so the filenames would display nicely from the start.

We tried to adjust the column width using the new column formatting capability as well as a SharePoint Framework (SPFx) Field Customizer extension, but neither attempt yielded good results.

Although an SPFx Field Customizer does give us a lot of room for customization, the Name field is a special lookup field and does not accept a Field Customizer.

Thanks to Escalation Engineer Westley Hall, we were able to identify a workaround that shows the filename on multiple lines using column formatting:

{
   "debugMode": true,
   "elmType": "span",
   "txtContent": "@currentField",
   "style": {
      "width":"100%",
      "height": "80px",
      "overflow":"visible",
      "vertical-align": "top",
      "padding": "4px",
      "word-wrap": "break-word",
      "overflow-wrap": "break-word",
      "word-break": "break-all",
      "white-space":"normal"
    }
}

 

The result looks like this in the UI. It shows the complete filename without trimming.

SharePoint 2016 Apps from Your Organization – There is nothing here yet


This post is a contribution from Vitaly Lyamin, an engineer with the SharePoint Developer Support team

The “App Catalog” requires the corresponding language patch to be installed to work correctly. We’ve seen a number of cases where deployed apps don’t show up in the “Apps From Your Organization” page (addanapp.aspx). One of the causes we found was that either the language patch or the language-dependent update was not installed.

Symptom

No apps show up when attempting to add an app “From Your Organization”, even though apps are deployed through the app catalog. Sometimes it’s easy to spot the issue because the library view field names are not resolved (like below); other times there’s no indication other than the apps not showing up.

App Catalog Library All Items View Fields

  • Notice the resource strings instead of column names below

 

Solution
Install the language-dependent update or the corresponding language patch.

Missing trust prompt when deploying SharePoint framework app in App Catalog site in SharePoint OnPremise


This post is a contribution from Manish Kumar, an engineer with the SharePoint Developer Support team

We had an issue reported recently where, when adding a SharePoint Framework package to the Apps for SharePoint page in the app catalog site, the trust prompt was not shown. Because of this, the app was not deployed. This was with an on-premises SharePoint 2016 farm running the January 2018 CU with both the language-dependent and language-independent fixes installed.

The trust prompt:

By comparing Fiddler traces from a working and a non-working setup, we figured out that the trust prompt was not being shown because there was no redirect to the TrustClientSideSolution.aspx page. In the working setup, the redirect URL looks something like this: https://<SPWebApp>/sites/<AppCatalog>/_layouts/15/TrustClientSideSolution.aspx?itemId=5&listId={9F37D537-292B-47D5-8C5D-143606A70262}&IsDlg=1

A closer look at the Fiddler trace for the non-working setup showed that the calls to client.svc were being redirected in a loop to the https://<SPSite>/_windows/default.aspx and https://<SPSite>/_login/default.aspx pages, and the call ultimately didn’t succeed. This was corroborated through the ULS logs as well.

This made us verify the authentication provider for the web application (Central Admin -> Application Management -> Manage Web Applications -> Select the web application -> Go to Authentication providers -> Click on the Default). But everything looked fine. The Authentication scheme on the SharePoint Web Services also looked right.

Further digging led us to check the authentication scheme on the SharePoint – 80 IIS site. And bingo! The site had Anonymous authentication disabled. Simply enabling it resolved the issue, i.e., the trust prompt was now shown.

Disabling anonymous authentication on that IIS site can manifest in various other issues, but this one indeed came as a surprise!

Troubleshooting assistance with Microsoft Graph API Development


This post is a contribution from Manish Kumar, an engineer with the SharePoint Developer Support team

This post is an attempt to guide developers in troubleshooting issues that they may come across when developing against the Microsoft Graph API, and to list possible things to check to resolve those issues. Some of the approaches discussed are common regardless of the Office 365 resource (workload) being retrieved, e.g., a SharePoint list or Outlook mail.

  1. Have I registered my app correctly?
    1. The first thing to check is whether your app is designed to run in interactive mode or in unattended mode. This gives you clarity on whether you need to register the app as a native app or a web app, and on the permissions that you need to grant to the app. For example, if you create a native app from the Azure AD app registration page, you will not see an option for adding an Application permission to any API (like Microsoft Graph). The reason is that native apps are expected to be installed on devices, run in interactive mode, and have just Delegated permissions.
    2. Do you intend to develop only for Azure AD organizations/accounts, or for both Microsoft accounts and Azure AD accounts? This determines Azure AD v1.0 vs. v2.0 endpoint usage and how the permission scopes are set. More reading here -
      https://docs.microsoft.com/en-us/azure/active-directory/develop/active-directory-v2-compare.
      Also make sure you’re not running into any limitations associated with the v2.0 endpoint -
      https://docs.microsoft.com/en-us/azure/active-directory/develop/active-directory-v2-limitations.
  2. Have the permissions been set correctly?
    When setting app permissions, there are a couple of options: Delegated vs. Application permissions. You would use a Delegated permission when the app needs permission, on behalf of a user, to certain resources that belong to that user; useful in interactive scenarios. On the other hand, Application permissions have a wider scope and are granted directly to applications without any user interaction; useful for running background services. So, how do you verify your app permissions?
    Most of the time we run into issues where the call to a Graph API resource returns 401 Unauthorized or Access Denied errors. The first thing to check in such a scenario is the permission scope on the access token. Any JWT token decoder tool can be used to look at the claims on the token, or some of the Azure AD authentication libraries may provide built-in functions for decoding the token. Examples of permissions on the access token:
    1. User access token permission scope (decoded using http://jwt.calebb.net):

      The user token obtained with Delegated permissions would also contain the user information in the claims.
      For example,
      name: "Manish Kumar",
      upn: "Manish.Kumar@domain.onmicrosoft.com",
    2. App-only access token permission scope (decoded using http://jwt.calebb.net):
  3. Has the admin granted consent to an app that has requested permissions that explicitly require admin consent?
    All Application permissions require admin consent and there are some Delegated permissions too that require admin consent. So, if any of those permissions are added or changed, it’s important to grant the app admin consent first, before hitting the token endpoint. See ‘Get administrator consent’ section under https://developer.microsoft.com/en-us/graph/docs/concepts/auth_v2_service.
  4. Why am I not getting refresh token for app-only access token (using Client credentials grant flow)?
    We need to note that in the app-only scenario, a refresh token is not needed in AAD (a minimal app-only token sketch is shown after this list). The likely rationale is that app-only token retrieval is a one-step process, and when the access token nears expiry or expires, the app simply hits the token endpoint again to get a new access token. The refresh token is returned when the app runs in delegated mode with user interaction. In that case, retrieving an access token is a two-step process: the first step is to get an authorization code, and the second is to use that code to obtain the access token. In this scenario, having a refresh token helps avoid making two calls, since the app can use the refresh token to get a new access token directly in one call.
  5. I’ve changed the app permissions in the registration portal, but I’m not getting the user consent prompt. Why?
    This is likely because you have not added any new permission to the existing permission set for the application. The user consent prompt will be shown for any new permissions that have been added to the app but not consented to before.
  6. Have you tried Microsoft Graph Explorer for emulating the calls?
    Microsoft Graph Explorer allows you to explore calls to the service with sample data, as well as by signing in with your own account. One thing to note is that Graph Explorer (as of now) gets a user token for the account you signed in with. So, if you intend to test calls made with an app-only token, you’ll have to consider using other REST clients.
  7. What about the calls that can’t be made through Graph Explorer?
    For those calls that can’t be made using Graph Explorer, you can use any REST client, like Postman, to make the calls manually. For the authorization code grant flow, below are the steps:
    1. Get authorization code from authorization endpoint
    2. Get access token from token endpoint
    3. Pass the access token to Authorization header along with the Graph API call.

    Links:
    https://developer.microsoft.com/en-us/graph/docs/concepts/auth_v2_user
    https://developer.microsoft.com/en-us/graph/docs/concepts/auth_v2_service

  8. Have you looked at the known issues documentation for Graph API?
    At times you may feel that you’re following all the steps but still not getting through. For example, you use the client credentials grant flow (with an app-only token) to make a call to retrieve group events, but it fails with a 401 Unauthorized error. This is because retrieving group events is only supported with Delegated permissions as of now. Therefore, reviewing the known issues - https://developer.microsoft.com/en-us/graph/docs/concepts/known_issues - is the right place to start in such scenarios.
  9. Is it a problem at the SDK level or the Graph API level?
    There is a multitude of Graph API SDKs available for different dev platforms. At times it is possible that there is a problem with the SDK: it is not constructing the HTTP requests (GET/POST/other) correctly, and hence the value is not returned as expected. To isolate a problem at the SDK level, take a network capture (e.g., with Fiddler) and review the calls to make sure they look correct per the documentation. Besides that, as noted in point 6, you can leverage Graph Explorer for reference if that call is available there.
  10. Is your app being throttled by the Microsoft Graph service?
    Look at the response codes for the calls made from your app; HTTP status code 429 confirms throttling. The app developer should design the app in a way that avoids throttling in the first place and is able to identify and gracefully handle any throttling scenario if it does occur (see the sketch after this list for a basic example).
  11. Are you running the Beta version of the API?
    The Beta version is only meant for testing purposes and is subject to change; therefore, it is not to be used in production. So, isolate whether the problem exists with just the Beta version or also with the generally available version (v1.0).
  12. How to get help?
    If it is a feature request, post it here -
    https://officespdev.uservoice.com/forums/224641-feature-requests-and-feedback/category/101632-microsoft-graph-o365-rest-apis
    For posting your queries, leverage the Stack Overflow forum for the Microsoft Graph API: https://stackoverflow.com/questions/tagged/microsoft-graph
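To make items 4 and 10 above concrete, here is a minimal, hedged PowerShell sketch that acquires an app-only token with the client credentials grant (v2.0 endpoint) and calls Microsoft Graph, retrying once if the service answers with 429. The tenant, client id, and secret are placeholders, and the app is assumed to have an admin-consented Application permission such as User.Read.All:

$tenant       = "contoso.onmicrosoft.com"                  # placeholder tenant
$clientId     = "00000000-0000-0000-0000-000000000000"     # placeholder app (client) id
$clientSecret = "CLIENTSECRET"                             # placeholder secret

# Step 1: get an app-only access token (note: no refresh token is returned for this grant)
$tokenBody = @{
    grant_type    = "client_credentials"
    client_id     = $clientId
    client_secret = $clientSecret
    scope         = "https://graph.microsoft.com/.default"
}
$token = Invoke-RestMethod -Uri "https://login.microsoftonline.com/$tenant/oauth2/v2.0/token" -Method POST -Body $tokenBody

# Step 2: call Graph with the bearer token and handle a 429 (throttling) response once
$headers = @{ Authorization = "Bearer $($token.access_token)" }
try
{
    $users = Invoke-RestMethod -Uri "https://graph.microsoft.com/v1.0/users" -Headers $headers
}
catch
{
    $webResponse = $_.Exception.Response
    if ($webResponse -ne $null -and [int]$webResponse.StatusCode -eq 429)
    {
        Start-Sleep -Seconds ([int]$webResponse.Headers["Retry-After"])   # wait as instructed by the service
        $users = Invoke-RestMethod -Uri "https://graph.microsoft.com/v1.0/users" -Headers $headers
    }
    else { throw }
}
$users.value | Select-Object displayName, userPrincipalName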

This post is by no means exhaustive. Moreover, changes are continuously made to improve the developer and end-user experience with Office 365, which calls for corresponding adjustments to the troubleshooting approach.

Retrieve granular user actions or usage reports using Search-UnifiedAuditLog cmdlet


This post is a contribution from Manish Joshi, an engineer with the SharePoint Developer Support team

The following blog post demonstrates the steps to retrieve granular user action or usage reports using the Search-UnifiedAuditLog cmdlet.

1. Browse to https://protection.office.com.

   In the left pane, click Search & investigation, and then click Audit log search.

   Note: You have to first turn on audit logging before you can run an audit log search. If the Start recording user and admin activity link is displayed, click it to turn on auditing. If you don't see this link, auditing has already been turned on for your organization. It will take a couple of hours before you are able to see log results in the UI or via code.

2. Browse to https://outlook.office365.com/ecp/

   a. Under permissions, go to admin roles.

   b. Create a new role called AuditReportRole.

   c. Assign the following roles:

      i.  Audit Logs
      ii. View-Only Audit Logs

   d. Add members: add users (e.g., garthf@spo.onmicrosoft.com).

   e. Write scope --> Default

   In the screenshot below, I am creating a new admin role called “AuditReportRole”, assigning the minimum required permissions “Audit Logs” and “View-Only Audit Logs”, and granting the user “Garth Fort” permission to access the usage reports.

3. Use the following PowerShell script (adjust it for your environment); it generates a .csv file for each user with the actions they have undertaken over the last 7 days.

$Username = "garthf@spo.onmicrosoft.com"
$Password = ConvertTo-SecureString 'password' -AsPlainText -Force
$LiveCred = New-Object System.Management.Automation.PSCredential $Username, $Password
 
$session = New-PSSession -ConfigurationName Microsoft.Exchange -ConnectionUri https://outlook.office365.com/powershell-liveid/ -Credential $LiveCred -Authentication Basic -AllowRedirection

Import-PSSession $session

Connect-MsolService -Credential $LiveCred
 
$Users = Get-MsolUser | Where-Object {$_.UserPrincipalName -notlike "*#EXT#*" }
 
$Users | ForEach {
        
$OutputFile = "C:\SomeFolder\Usage-" + $_.DisplayName + ".csv"

$auditEventsForUser = Search-UnifiedAuditLog -EndDate $((Get-Date)) -StartDate $((Get-Date).AddDays(-7)) -UserIds $_.UserPrincipalName -RecordType SharePoint -Operations FileAccessed,PageViewed,PageViewedExtended


Write-Host "Events for" $_.DisplayName "created at" $_.WhenCreated
 
$ConvertedOutput = $auditEventsForUser | Select-Object -ExpandProperty AuditData | ConvertFrom-Json

$ConvertedOutput | Select-Object CreationTime,UserId,Operation,Workload,ObjectID,SiteUrl,SourceFileName,ClientIP,UserAgent | Export-Csv $OutputFile -NoTypeInformation -Append
}
 
Remove-PSSession $session 

 

4. Sample CSV output

5. Please also go through the following articles to better understand the audit log concept and the detailed properties that can be retrieved:

  https://support.office.com/en-us/article/Search-the-audit-log-in-the-Office-365-Security-Compliance-Center-0d4d0f35-390b-4518-800e-0c7ec95e946c?ui=en-US&rs=en-US&ad=US

https://technet.microsoft.com/en-us/library/mt238501(v=exchg.160).aspx

https://support.office.com/en-us/article/Detailed-properties-in-the-Office-365-audit-log-ce004100-9e7f-443e-942b-9b04098fcfc3

Issue with Document Set content type in custom list templates and CSOM


This post is a contribution from Sohail Sayed, an engineer with the SharePoint Developer Support team

Recently one of our customers encountered an interesting issue with Document Set content types used in custom list templates when provisioning the list with CSOM. The customer had created a list with multiple content types inheriting from the Document Set content type. The list was saved as a list template, and list provisioning was being done using the JavaScript object model. In this scenario the welcome page for the Document Set content type was not provisioned correctly and would come up blank. This seems to happen only when provisioning the list from the custom list template via code.

Below is an example

 

In a working scenario you can see 3 web parts on the welcome page for a Document Set content type as below

It looks like there is an issue where the web parts on the welcome page are not provisioned when creating the list from the custom list template through CSOM. The issue does not happen if we create the list from the custom list template in the UI.

Fortunately, there are a couple of simple solutions to fix this issue.

The first solution is to remove and re-add the Document Set content types via CSOM. This will re-provision the welcome page correctly with the required web parts. However, if the custom list template was saved with content and has list items referencing the Document Set content types, we will get an error when trying to remove the content types. In that case we can use an alternate solution: add the web parts through code.
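A minimal sketch of the first solution, using the SharePoint client object model from PowerShell rather than JSOM. The SDK path, site URL, and authentication (default credentials, as for on-premises) are assumptions; the list name comes from the JSOM sample below:

# Load the SharePoint client SDK (path is an assumption; adjust to your environment)
Add-Type -Path "C:\Program Files\Common Files\Microsoft Shared\Web Server Extensions\16\ISAPI\Microsoft.SharePoint.Client.dll"
Add-Type -Path "C:\Program Files\Common Files\Microsoft Shared\Web Server Extensions\16\ISAPI\Microsoft.SharePoint.Client.Runtime.dll"

$ctx  = New-Object Microsoft.SharePoint.Client.ClientContext("https://contoso/sites/site1")   # placeholder URL
$list = $ctx.Web.Lists.GetByTitle("DocSet3")
$cts  = $list.ContentTypes
$ctx.Load($cts)
$ctx.ExecuteQuery()

# List content types derived from Document Set have ids starting with 0x0120D5
$docSetCts = @($cts | Where-Object { $_.Id.StringValue.StartsWith("0x0120D5") })

foreach ($ct in $docSetCts)
{
    $parent = $ct.Parent            # the parent site content type
    $ctx.Load($parent)
    $ctx.ExecuteQuery()

    $ct.DeleteObject()              # remove the broken list content type
    $ctx.ExecuteQuery()

    [void]$cts.AddExistingContentType($parent)   # re-add it; the welcome page gets re-provisioned
    $ctx.ExecuteQuery()
}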

The page contains the following three web parts:

- an Image web part, which references a static image from the /_layouts folder

- a web part to display the Document Set metadata

- a web part to display the Document Set items

All three web parts read the current list item context, so we don’t need to explicitly configure any properties such as the list id or list URL.

Below is the sample code for adding the web parts to the welcome page of the document set content types using JSOM.

function fixWebParts(){
		
	var docSetID = "0x0120D5"; // all document sets content types will start with this id
	var listName = "DocSet3"; // update list name 

	var clientContext = SP.ClientContext.get_current();

	var oWeb =	clientContext.get_web();
	var oList = oWeb.get_lists().getByTitle(listName);
	var contentTypes = oList.get_contentTypes();
	
		
	clientContext.load(oList, 'RootFolder');
	clientContext.load(contentTypes );
	clientContext.executeQueryAsync(onSuccess, onFail);
	
	function onSuccess() {
	  var rootFolder = oList.get_rootFolder();
	  var listURL = rootFolder.get_serverRelativeUrl(); // URL of the list
	  console.log(listURL)
	  
	  var count = contentTypes.get_count();
	  
	  for(var i=0; i < count; i++)
	  {
	  	var id = contentTypes.itemAt(i).get_id().toString();
	  	var name = contentTypes.itemAt(i).get_name();
	  	
	  	if (id.startsWith(docSetID)) // this is a document set type
	  	{
	  		AddWebParts(listURL , name );
	  	}
	  }	  
	}

	function onFail(sender, args) {
	  console.log('Request Failed: ' + args.get_message() + '\n' + args.get_stackTrace());
	}	
}



function AddWebParts(listRootFolderUrl,contentTypeName){

	var pageUrl = listRootFolderUrl + "/forms/" + contentTypeName + "/docsethomepage.aspx";
	
	
	AddImageWebPart(pageUrl);
	AddDocumentSetContentWebPart(pageUrl);
	AddDocumentSetPropertiesWebPart(pageUrl);

}


function AddImageWebPart(pageUrl)
{
	var webPartXml = '<WebPart xmlns:xsd="http://www.w3.org/2001/XMLSchema" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns="http://schemas.microsoft.com/WebPart/v2">' +
  '<Title>Image</Title>' +
  '<FrameType>Default</FrameType>' +
  '<Description>Use to display pictures and photos.</Description>' +
  '<IsIncluded>true</IsIncluded>' +
  '<ZoneID>WebPartZone_TopLeft</ZoneID>' +
  '<PartOrder>0</PartOrder>' +
  '<FrameState>Normal</FrameState>' +
  '<Height />' +
  '<Width />' +
  '<AllowRemove>true</AllowRemove>' +
  '<AllowZoneChange>true</AllowZoneChange>' +
  '<AllowMinimize>true</AllowMinimize>' +
  '<AllowConnect>true</AllowConnect>' +
  '<AllowEdit>true</AllowEdit>' +
  '<AllowHide>true</AllowHide>' +
  '<IsVisible>true</IsVisible>' +
  '<DetailLink />' +
  '<HelpLink />' +
  '<HelpMode>Modeless</HelpMode>' +
  '<Dir>Default</Dir>' +
  '<PartImageSmall />' +
  '<MissingAssembly>Cannot import this Web Part.</MissingAssembly>' +
  '<PartImageLarge />' +
  '<IsIncludedFilter />' +
  '<Assembly>Microsoft.SharePoint, Version=16.0.0.0, Culture=neutral, PublicKeyToken=71e9bce111e9429c</Assembly>' +
  '<TypeName>Microsoft.SharePoint.WebPartPages.ImageWebPart</TypeName>' +
  '<ImageLink xmlns="http://schemas.microsoft.com/WebPart/v2/Image">/_layouts/images/docset_welcomepage_big.png</ImageLink>' +
  '<AlternativeText xmlns="http://schemas.microsoft.com/WebPart/v2/Image" />' +
  '<VerticalAlignment xmlns="http://schemas.microsoft.com/WebPart/v2/Image">Middle</VerticalAlignment>' +
  '<HorizontalAlignment xmlns="http://schemas.microsoft.com/WebPart/v2/Image">Center</HorizontalAlignment>' +
  '<BackgroundColor xmlns="http://schemas.microsoft.com/WebPart/v2/Image">transparent</BackgroundColor>' +
'</WebPart>';


AddWebPart(pageUrl,"WebPartZone_TopLeft","0",webPartXml );

}


function AddDocumentSetPropertiesWebPart(pageUrl)
{
	var webPartXml = '<WebPart xmlns:xsd="http://www.w3.org/2001/XMLSchema" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns="http://schemas.microsoft.com/WebPart/v2">' +
  '<Title>Document Set Properties</Title>' +
  '<FrameType>Default</FrameType>' +
  '<Description>Displays the properties of the Document Set.</Description>' +
  '<IsIncluded>true</IsIncluded>' +
  '<ZoneID>WebPartZone_Top</ZoneID>' +
  '<PartOrder>0</PartOrder>' +
  '<FrameState>Normal</FrameState>' +
  '<Height />' +
  '<Width />' +
  '<AllowRemove>true</AllowRemove>' +
  '<AllowZoneChange>true</AllowZoneChange>' +
  '<AllowMinimize>true</AllowMinimize>' +
  '<AllowConnect>true</AllowConnect>' +
  '<AllowEdit>true</AllowEdit>' +
  '<AllowHide>true</AllowHide>' +
  '<IsVisible>true</IsVisible>' +
  '<DetailLink />' +
  '<HelpLink />' +
  '<HelpMode>Modeless</HelpMode>' +
  '<Dir>Default</Dir>' +
  '<PartImageSmall />' +
  '<MissingAssembly>Cannot import this Web Part.</MissingAssembly>' +
  '<PartImageLarge>/_layouts/15/images/msimagel.gif</PartImageLarge>' +
  '<IsIncludedFilter />' +
  '<DisplayText>' +
  '</DisplayText>' +
  '<Assembly>Microsoft.Office.DocumentManagement, Version=16.0.0.0, Culture=neutral, PublicKeyToken=71e9bce111e9429c</Assembly>' +
  '<TypeName>Microsoft.Office.Server.WebControls.DocumentSetPropertiesWebPart</TypeName>' +
'</WebPart>';


AddWebPart(pageUrl,"WebPartZone_Top","0",webPartXml );

}


function AddDocumentSetContentWebPart(pageUrl)
{
	var webPartXml = '<WebPart xmlns:xsd="http://www.w3.org/2001/XMLSchema" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns="http://schemas.microsoft.com/WebPart/v2">' +
  '<Title>Document Set Contents</Title>' +
  '<FrameType>Default</FrameType>' +
  '<Description>Displays the contents of the Document Set.</Description>' +
  '<IsIncluded>true</IsIncluded>' +
  '<ZoneID>WebPartZone_CenterMain</ZoneID>' +
  '<PartOrder>0</PartOrder>' +
  '<FrameState>Normal</FrameState>' +
  '<Height />' +
  '<Width />' +
  '<AllowRemove>true</AllowRemove>' +
  '<AllowZoneChange>true</AllowZoneChange>' +
  '<AllowMinimize>true</AllowMinimize>' +
  '<AllowConnect>true</AllowConnect>' +
  '<AllowEdit>true</AllowEdit>' +
  '<AllowHide>true</AllowHide>' +
  '<IsVisible>true</IsVisible>' +
  '<DetailLink />' +
  '<HelpLink />' +
  '<HelpMode>Modeless</HelpMode>' +
  '<Dir>Default</Dir>' +
  '<PartImageSmall />' +
  '<MissingAssembly>Cannot import this Web Part.</MissingAssembly>' +
  '<PartImageLarge>/_layouts/15/images/msimagel.gif</PartImageLarge>' +
  '<IsIncludedFilter />' +
  '<DisplayText>' +
  '</DisplayText>' +
  '<Assembly>Microsoft.Office.DocumentManagement, Version=16.0.0.0, Culture=neutral, PublicKeyToken=71e9bce111e9429c</Assembly>' +
  '<TypeName>Microsoft.Office.Server.WebControls.DocumentSetContentsWebPart</TypeName>' +
'</WebPart>';


AddWebPart(pageUrl,"WebPartZone_CenterMai","0",webPartXml );

}



function AddWebPart(pageUrl,zoneId,zoneIndex, webPartXml){
	
	var clientContext = SP.ClientContext.get_current();
    var oFile = clientContext.get_web().getFileByServerRelativeUrl(pageUrl);

    var limitedWebPartManager = oFile.getLimitedWebPartManager(SP.WebParts.PersonalizationScope.shared);
    
   
	var oWebPartDefinition = limitedWebPartManager.importWebPart(webPartXml);
    this.oWebPart = oWebPartDefinition.get_webPart(); 
    
   limitedWebPartManager.addWebPart(oWebPart,zoneId, zoneIndex);

    clientContext.load(oWebPart);

    clientContext.executeQueryAsync(Function.createDelegate(this, this.onQuerySucceeded), Function.createDelegate(this, this.onQueryFailed));
}


function onQuerySucceeded()
{
	console.log("success");
}


function onQueryFailed(e1,e2,e3)
{
	console.log("error");
	console.log(e1);
	console.log(e2);
	console.log(e3);
		
}

Embed video in multiple lines of text field in SharePoint List


This post is a contribution from Jing Wang, an engineer with the SharePoint Developer Support team

Problem Description:

A SharePoint site user created a custom list and added a multiple lines of text field called “note”, which is used to embed a video.

The video is stored in Azure.
Add a new item to the list and insert the content using the “<> Edit Source” button on the ribbon:

 

<link href="//amp.azure.net/libs/amp/2.1.3/skins/amp-default/azuremediaplayer.min.css" rel="stylesheet"> 
<script src="//amp.azure.net/libs/amp/2.1.3/azuremediaplayer.min.js"></script>
<video id="azuremediaplayer" class="azuremediaplayer amp-default-skin amp-big-play-centered" tabindex="0"></video>
<script type="text/javascript">
var myOptions = {
 "nativeControlsForTouch": false,
 controls: true,
 autoplay: true,
 width: "640",
 height: "400",
 }

myPlayer = amp("azuremediaplayer", myOptions);
myPlayer.src([
{
 "src": "https://sccpssvideo.streaming.mediaservices.windows.net/ab838cb0-befe-40bd-b873-f9793e86255d/Capital Improvements Committee M.ism/manifest",
 "type": "application/vnd.ms-sstr+xml"
 }
]);
</script>

 

Clicking OK gives a warning:

Warning: “Some of your markup was stripped out….” The JavaScript is removed and the video does not get inserted.

 

Solution:

Use the following iframe tag to embed the video instead, which eliminates the use of JavaScript.

<iframe width="500" height="280" align="center" src="https://aka.ms/ampembed?url=https://sccpssvideo.streaming.mediaservices.windows.net/ab838cb0-befe-40bd-b873-f9793e86255d/Capital%20Improvements%20Committee%20M.ism/manifest&autoplay=false" frameborder="no"></iframe>

Before you copy and paste the above HTML source into the “note” field, go to Site Settings and modify the setting below first:

Under "Site Collection Administration"

- Go to Html Field Security:

- Select “Permit contributors to insert iframes from the following list of external domains into pages on this site” and Add “aka.ms” for Allow iframes from this domain:

 

 

Then copy the iframe HTML tag into the list field as shown below.

 

Save the list item, browse to the list, and you should be able to see the video.

 

Troubleshoot issues with SharePoint hosted Add-ins


This post is a contribution from Jing Wang, an engineer with the SharePoint Developer Support team

SharePoint-hosted add-ins are hosted inside the SharePoint environment. They use SharePoint JSOM (JavaScript Object Model) code to access list/library items in the add-in web or host web.

They can contain SharePoint lists, web parts, workflows, custom pages, and other components, which are usually installed on a subweb, called the add-in web, of the host SharePoint website where the add-in is installed.

Here are some common issues customers encounter with SharePoint-hosted add-ins:

SharePoint-hosted add-in not working after publishing

The SharePoint-hosted add-in works fine when deployed with F5 from Visual Studio, but after you publish it and add it to the App Catalog, trying to add it to a site gives the error: “The page can't be displayed, make sure web address is correct”.

Sometimes, the error is:

Fiddler trace shows the following.

The cause is that the add-in’s URL cannot be resolved in DNS.
The solution is to set up the app domain in DNS by creating a forward lookup zone and a CNAME alias (because each app web's URL will contain a unique hash). These steps are described in detail in this TechNet article.

 

Problems when host site’s Web Application is Host Header Web Application:

There are various kinds of symptoms of this category, for example,

a. 404 Not Found

Fiddler trace shows the following.

b. Blank page

c. HTTP 403 Forbidden:

In IE:

In Chrome:

d. Multiple Line of Text field in Add-in web (app web) is blank in form pages:

If the add-in works in a non-host header web application, verify the following configurations:

1. A SharePoint web application without a host header on the default port (80 or 443, depending on whether the hosting web application is on SSL or not).
I will call this web application the “PlaceHolder Web Application” in this article.

2. PlaceHolder Web Application needs to have a root site collection.

3. If you have set up a separate app domain for add-ins, for example apps.com, the certificate on your non-PlaceHolder Web Application must also have a wildcard for your app domain (*.apps.com).

4. The PlaceHolder Web Application’s application pool account needs to be the same as the Host Header Web Application’s application pool account.

If they are not the same, change the service account on Central Admin –> Security –> Configure service accounts:

 

Cannot access list/list items in Host Web with JSOM code:

To access list/list items in Host Web:

  • Request permission to the host web in the add-in manifest file of your add-in. The user who installs the add-in is prompted to grant this permission, and the add-in cannot be installed if the user does not.
  • Instead of using an SP.ClientContext object to make JSOM calls to the host web, you use an SP.AppContextSite object. This object enables the add-in to get a context object for websites other than the add-in web, but only for websites within the same site collection.

Sample code:

App.js:
----------

'use strict'; 
ExecuteOrDelayUntilScriptLoaded(initializePage, "sp.js"); 
function initializePage()
{
    var context = SP.ClientContext.get_current();
    var user = context.get_web().get_currentUser();
    var hostWebUrl;
    var appWebUrl;
    var listName = "testList";  
    var ctx;
    var appCtxSite;
    var oList;
    var web;
    var collListItem;	 
    // This code runs when the DOM is ready and creates a context object which is needed to use the SharePoint object model
    $(document).ready(function () {         
        hostWebUrl = decodeURIComponent(getQueryStringParameter('SPHostUrl'));
        appWebUrl = decodeURIComponent(getQueryStringParameter('SPAppWebUrl'));
        getListItemFromHostWeb(); 
    }); 
    //get the list data from host web  
    function getListItemFromHostWeb() {
        debugger;
        ctx = new SP.ClientContext(appWebUrl);
        appCtxSite = new SP.AppContextSite(ctx, hostWebUrl);
        oList = appCtxSite.get_web().get_lists().getByTitle(listName); 
        ctx.load(oList);
        ctx.executeQueryAsync(oListExist, OnGetListItemFailure);
    } 
    //check whether the list exists
    function oListExist(sender, args) {        
        alert('Got List Title - ' + oList.get_title());
        var camlQuery = new SP.CamlQuery();
        camlQuery.set_viewXml('<View><Query><Where><Geq><FieldRef Name=\'ID\'/>' +
            '<Value Type=\'Number\'>1</Value></Geq></Where></Query><RowLimit>10</RowLimit></View>');
        collListItem = oList.getItems(camlQuery);        
        ctx.load(collListItem);
        ctx.executeQueryAsync(onGetListItemSucceeded, OnGetListItemFailure);     
    }
 
    function onGetListItemSucceeded(sender, args) {        
        var listItemInfo = '<p>';
        var listItemEnumerator = collListItem.getEnumerator();
        if (collListItem.get_count() > 0) {
            while (listItemEnumerator.moveNext()) {
                var oListItem = listItemEnumerator.get_current();
                listItemInfo += oListItem.get_id() + ',Title: ' + oListItem.get_item('Title');
            }
            listItemInfo += '</p>';            
            document.getElementById('Label1').innerHTML = listItemInfo.toString();
        }
    }    
 
    function OnGetListItemFailure(sender, args) {
        alert('Failed to get target List. Error:' + args.get_message());
    } 
    //This function is used to get the hostweb url  
    function getQueryStringParameter(paramToRetrieve)   
    {  
        var params =  
        document.URL.split("?")[1].split("&");  
        var strParams = "";  
        for (var i = 0; i < params.length; i = i + 1)  
        {  
            var singleParam = params[i].split("=");  
            if (singleParam[0] == paramToRetrieve)   
               { return singleParam[1]; }  
        }  
    }       
}

Remove SharePoint assets deployed as part of SharePoint Framework solution


This post is a contribution from Aravinda Devagiri, an engineer with the SharePoint Developer Support team

SharePoint assets can be provisioned as part of a SharePoint Framework solution and are deployed to SharePoint sites when the solution is installed on them. You can follow this article on how to provision assets as part of a SharePoint Framework solution:
https://docs.microsoft.com/en-us/sharepoint/dev/spfx/web-parts/get-started/provision-sp-assets-from-package

The following SharePoint assets can be provisioned along with your client-side solution package:

  • Fields
  • Content types
  • List instances
  • List instances with custom schema

I worked on a case where these assets had to be removed when uninstalling the package, but the content types remained in the site. When we tried to delete the content types, we got the error below.

The solution package uses SharePoint Features to package and provision the SharePoint items. When you navigate to the site features, you will find the activated feature that provisioned these assets. To remove the assets, we must deactivate this feature; this should remove the fields and content types it provisioned.

Here is the order in which to remove assets deployed as part of a SharePoint Framework package.

  • Remove the list from the site contents.
  • Delete them from the recycle bin, both from the first stage and second stage recycle bin.
  • Browse to "Site Settings" --> "Manage site features" --> Deactivate the custom feature that is installed through package and provisioned the assets.
  • You should find that the content types are removed for the Site content types.
  • You can remove the app without any issue and without any assets remaining in the site.

 

Error building the client side webparts behind corporate proxy


This post is a contribution from Aravinda Devagiri, an engineer with the SharePoint Developer Support team


Recently I worked on a case where, while building a simple client-side web part, running gulp trust-dev-cert to install the developer certificate (needed to preview the web part) produced the following error.

../gulp trust-dev-cert
 
Error: Cannot find module '@microsoft/sp-build-web'
    at Function.Module._resolveFilename (module.js:527:15)
    at Function.Module._load (module.js:476:23)
    at Module.require (module.js:568:17)
    at require (internal/module.js:11:18)
    at Object.<anonymous> (D:\bin\helloworld-webpart\gulpfile.js:4:15)
    at Module._compile (module.js:624:30)
    at Object.Module._extensions..js (module.js:635:10)
    at Module.load (module.js:545:32)
    at tryModuleLoad (module.js:508:12)
    at Function.Module._load (module.js:500:3)

The most probable reason is that you are working behind a corporate proxy.
You have to set up Node.js and npm to work behind a proxy by setting the proxy and https-proxy settings.

Below are the commands to configure the proxy settings for npm.

npm config set proxy http://proxyaddress.com:8080
npm config set https-proxy http://proxyaddress.com:8080

Create a new project with the Yeoman generator, then run gulp trust-dev-cert and gulp serve to preview the web part in the local workbench.

If you still see the issue, check that you have the latest version of Node.js using node -v; it should return the current LTS version.

Now create a new project and, before installing the developer certificate, run npm install to install the package and any other packages it depends on.

Here is the sequence of commands run to create the web part.

  • md helloworld-webpart
  • cd helloworld-webpart
  • yo @microsoft/sharepoint
  • npm install
  • gulp trust-dev-cert
  • gulp serve

References:

https://docs.microsoft.com/en-us/sharepoint/dev/spfx/web-parts/get-started/build-a-hello-world-web-part
https://jjasonclark.com/how-to-setup-node-behind-web-proxy/

Create an OData data service for use as a BCS external system on SharePoint Online


This post is a contribution from Jing Wang, an engineer with the SharePoint Developer Support team

Recently I worked with two customers who wanted to show external data on their SharePoint Online web sites, one from an on-premises SQL database and one from a secured internal data source.
They did not want to migrate their SQL database to SQL Azure.

We researched a few possible options and chose the most feasible, easy to implement, and secure solution: create a custom OData WCF service accessing the external data source and generate a BDC external content type through a SharePoint-hosted add-in. Note this solution will also work for SharePoint on-premises, but it is especially suitable for SharePoint Online as there are constraints on the kind of customization that can be done in SharePoint Online.

Here is a high-level overview of the steps involved:

 

Step I - Create an Internet-addressable ASP.NET Windows Communication Foundation (WCF) Data Service to expose the SQL database.

https://docs.microsoft.com/en-us/sharepoint/dev/general-development/how-to-create-an-odata-data-service-for-use-as-a-bcs-external-system

Note: This article covers the steps for Visual Studio versions prior to 2017.

Visual Studio 2017 does not come with the WCF Data Service item template by default; you need to install the WCF Data Services Template for Visual Studio 2017: https://marketplace.visualstudio.com/items?itemName=CONWID.WcfDataServiceTemplateExtension#overview

The WCF team also recommends adding a beta version of the WCF Data Services EntityFramework provider to your project: https://www.nuget.org/packages/Microsoft.OData.EntityFrameworkProvider/

Once the two are installed, you can add a WCF Data Service to the project after you create the ADO.NET Entity Data Model following the how-to article above (a minimal sketch of the resulting service class follows the list below):

  • Add New Item and search for WCF
  • WCF Data Service 5.6.4
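
Here is a minimal sketch (not from the original article) of what the generated service class might look like, assuming an ADO.NET Entity Data Model whose entity container is named AdventureWorksEntities; the class name and access rules are illustrative only, and if you use the beta EntityFrameworkProvider package mentioned above, the item template derives from a provider-specific base class instead:

using System.Data.Services;
using System.Data.Services.Common;

public class WcfODataService_AdventureWorks : DataService<AdventureWorksEntities>
{
    public static void InitializeService(DataServiceConfiguration config)
    {
        // Expose read-only access to every entity set; restrict this for production use.
        config.SetEntitySetAccessRule("*", EntitySetRights.AllRead);

        // The BDC model later in this post targets OData version 2.0.
        config.DataServiceBehavior.MaxProtocolVersion = DataServiceProtocolVersion.V2;
    }
}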

For Data Sources other than SQL Database, please refer to this article:

How to: Create a Data Service Using the Reflection Provider (WCF Data Services) -

https://docs.microsoft.com/en-us/dotnet/framework/data/wcf/create-a-data-service-using-rp-wcf-data-services

 

Step II - Generate a Business Connectivity Services (BCS) external content type connecting to the OData WCF service above with SharePoint Hosted Add-in and extract the model file out:

https://docs.microsoft.com/en-us/sharepoint/dev/general-development/how-to-convert-an-add-in-scoped-external-content-type-to-tenant-scoped

If you don’t want to expose your OData data service anonymously, you need to continue with Step III and Step IV; otherwise, skip to Step V directly.

 

Step III - Create Target Application in SharePoint Central Admin site –Secure Store Service.

https://docs.microsoft.com/en-us/sharepoint/create-or-edit-secure-store-target-application?redirectSourcePath=%252fen-us%252farticle%252fcreate-or-edit-a-secure-store-target-application-f724dec2-ce28-4b76-9235-31728dce64b5

 

Step IV- Edit BDC model to use the Secure Store Target Application:

The SsoApplicationId and SsoProviderImplementation properties below are the lines you need to manually put in the model file.
-----------------------

<?xml version="1.0" encoding="utf-16"?>
<Model xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns:xsd="http://www.w3.org/2001/XMLSchema" Name="BDCMetadata_AdventureWorks" xmlns="http://schemas.microsoft.com/windows/2007/BusinessDataCatalog">
<LobSystems>
<LobSystem Name="adventureworks" Type="OData">
<Properties>
<Property Name="ODataServiceMetadataUrl" Type="System.String">http://adventureworks.contoso.com/WcfODataService_AdventureWorks.svc/$metadata</Property>
<Property Name="ODataServiceMetadataAuthenticationMode" Type="System.String">WindowsCredentials</Property>
<Property Name="ODataServicesVersion" Type="System.String">2.0</Property>
<Property Name="SsoApplicationId" Type="System.String">SecureStoreTargentApplicationName</Property>
<Property Name="SsoProviderImplementation" Type="System.String">Microsoft.Office.SecureStoreService.Server.SecureStoreProvider, Microsoft.Office.SecureStoreService, Version=16.0.0.0, Culture=neutral, PublicKeyToken=71e9bce111e9429c</Property>
</Properties>
…
<LobSystemInstances>
<LobSystemInstance Name="adventureworks">
<Properties>
<Property Name="ODataServiceUrl" Type="System.String">http://adventureworks.contoso.com/WcfODataService_AdventureWorks.svc</Property>
<Property Name="ODataServiceAuthenticationMode" Type="System.String">WindowsCredentials</Property>
<Property Name="ODataFormat" Type="System.String">application/atom+xml</Property>
<Property Name="HttpHeaderSetAcceptLanguage" Type="System.Boolean">true</Property>
<Property Name="SsoApplicationId" Type="System.String"> SecureStoreTargentApplicationName</Property>
<Property Name="SsoProviderImplementation" Type="System.String">Microsoft.Office.SecureStoreService.Server.SecureStoreProvider, Microsoft.Office.SecureStoreService, Version=16.0.0.0, Culture=neutral, PublicKeyToken=71e9bce111e9429c</Property>
</Properties>
</LobSystemInstance>
</LobSystemInstances>

 

Step V - Import the External content type’s BDC model file to SPO BDC Service Application

To import the model file using SharePoint Central Administration pages

  1. Open SharePoint Online or SharePoint on-premises Central Administration pages.
  2. Choose Manage service applications.
  3. Choose Business Data Connectivity Service.
  4. Choose the Import link on the server ribbon.
  5. Choose the Browse button to specify the location where you extracted the .bdcm file.
  6. Keep the default settings, and then choose Import.

 

Step VI - Create External List from the External Content Type created above.

Here is a screen shot of the External List, showing the Microsoft sample AdventureWorks database – productInventory list:

 

Problem of using TermStores.getByName()


This post is a contribution from Aaron Miao, an engineer with the SharePoint Developer Support team

We still see customers open support cases because their code for fetching a term store stopped working, especially SharePoint Online customers.

Fetching a term store by name is a bad idea. The term store name is not actually persisted anywhere; it is a calculated value and it changes. From code like the following, you can probably see that the name of the term store passed to getByName() looks like a random value.

var termStore = termStores.getByName("Taxonomy_Dmxzz8tIBzk8wNVKQpJ+xA==");

The term store name can change for various reasons, such as failover, tenant move, or database move events on the SharePoint server side.

However, this is not your fault. There are no published official documents on this. Even worse, the SharePoint client-side APIs provide getByName(). So, of course, customers start fetching their term store using that name in their custom code. Stop using the getByName() method!

The correct way to fetch the term store, as in many good samples out there, is to use the following CSOM APIs:

TaxonomySession.GetDefaultSiteCollectionTermStore
TaxonomySession.GetDefaultKeywordsTermStore

Both methods above return a TermStore object. The DefaultSiteCollectionTermStore is a term store with metadata local to the site collection, while the DefaultKeywordsTermStore is a global term store that contains enterprise keywords, as shown in the TaxonomySession class documentation.

Please be aware that both methods may return a null value, as remarked in each API document, depending on your term store settings.
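
For reference, here is a minimal CSOM sketch in C# (the site URL and credentials are placeholders, and error/null handling is omitted for brevity) that fetches the default site collection term store instead of calling getByName():

using System;
using System.Security;
using Microsoft.SharePoint.Client;
using Microsoft.SharePoint.Client.Taxonomy;

class TermStoreExample
{
    static void Main()
    {
        using (var clientContext = new ClientContext("https://contoso.sharepoint.com"))
        {
            var password = new SecureString();
            foreach (char c in "password") password.AppendChar(c);
            clientContext.Credentials = new SharePointOnlineCredentials("user@contoso.onmicrosoft.com", password);

            TaxonomySession taxonomySession = TaxonomySession.GetTaxonomySession(clientContext);
            TermStore termStore = taxonomySession.GetDefaultSiteCollectionTermStore();

            // Remember: this can be a null server object depending on your term store settings.
            clientContext.Load(termStore, ts => ts.Id, ts => ts.Name);
            clientContext.ExecuteQuery();

            Console.WriteLine("{0} ({1})", termStore.Name, termStore.Id);
        }
    }
}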

Here are the equivalent JSOM APIs:

getDefaultSiteCollectionTermStore
getDefaultKeywordsTermStore

This document discusses the difference between global and local (site-collection-specific) term stores.

Fixing mixed content errors caused by Media Player web part on a https site collection on Internet Explorer (On Premise)


This post is a contribution from Sohail Sayed, an engineer with the SharePoint Developer Support team

If you are using a Media Player web part on a SharePoint site over HTTPS, you may see an error: “The page at 'https://sp/Pages/Default.aspx' was loaded over HTTPS, but requested an insecure image 'http://download.microsoft.com/download/d/2/9/d29e5571-4b68-4d95-b43a-4e81ba178455/2.0/ENU/InstallSilverlight.png'. This content should also be served over HTTPS.”

This error message is displayed when Silverlight is not installed on the system.

We ran into a scenario with a customer where they could not install Silverlight on the user’s machines and also had to avoid the mixed content error.

We tried a couple of options of hiding / removing the Silverlight prompt via CSS / JavaScript. However, the mixed content error seems to be thrown even before the DOM ready event happens so any modifications in CSS / JavaScript are too late.

After further research we found that SharePoint provides a configuration to control this. The web application has a configurable property “AllowSilverlightPrompt” that controls whether the Silverlight download prompt is loaded. By default, this is set to true.

We ran the below PowerShell to change this to false and this fixed the mixed content error message.

$webApp = Get-SPWebApplication "https://sp"
$webApp.AllowSilverlightPrompt = $false
$webApp.Update()

Note that this is a server-side API and hence is applicable to SharePoint On-Premise only.
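
For completeness, the same property can be set from the SharePoint server object model; below is a minimal sketch (the web application URL is a placeholder) that is equivalent to the PowerShell above and, like it, must run on a farm server:

using System;
using Microsoft.SharePoint.Administration;

class DisableSilverlightPrompt
{
    static void Main()
    {
        // Look up the web application and turn off the Silverlight download prompt.
        SPWebApplication webApp = SPWebApplication.Lookup(new Uri("https://sp"));
        webApp.AllowSilverlightPrompt = false;
        webApp.Update();
    }
}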


Use Excel Web Access Web Part to dynamically show a workbook


This post is a contribution from Mustaq Patel, an engineer with the SharePoint Developer Support team

Requirement: On a SharePoint Online page, show an Excel workbook dynamically using JSLink, without the O365 global suite navigation bar.

In SharePoint Online we can use a Script Editor or Content Editor web part to show the Excel workbook in an iframe. The script web part can also contain jQuery code to set the iframe’s src to the Excel workbook dynamically using some business logic. Due to cross-domain restrictions, we cannot use jQuery or CSS to hide the global suite bar that the Excel web app loads in the iframe. So how do you show an Excel workbook without the global suite bar and still have the workbook URL set dynamically? This blog post addresses that requirement.
We will use an Excel Web Access web part and a List View web part, and connect the two so that the Excel Web Access web part receives the workbook URL from the List View web part. We will also use JSLink to filter items in the List View web part, so that only files that meet particular criteria show up in it.
Below are the steps to demonstrate the solution:

Create a document library. Create a view called “MyExcelFileView” and make sure to include the “Created By” field. You can include any other fields in this view. Upload a few Excel files under different users so that the “Created By” field has different users.

Create a web part page or use any existing wiki page. Edit the page and add two web parts in the same order as below:

  1. List View web part showing all the files from the document library where the workbooks are present.
    1. Upload the JavaScript file below to the SiteAssets library. I uploaded it at SiteUrl/SiteAssets/POC_ListItemFiltering.js. I also uploaded jQuery at the same location, SiteUrl/SiteAssets/jquery-1.9.1.js; you may reference jQuery from a CDN as well, and in that case you don’t need to upload the jQuery file.
    2. Edit this List View web part and, under Miscellaneous, set the JS Link property as below. If you are referencing jQuery from a CDN, change the first path accordingly.
      ~site/SiteAssets/jquery-1.9.1.js | ~site/SiteAssets/POC_ListItemFiltering.js
    3. Set appearance, layout, etc. as per your needs; you can also set which view to show so the web part does not take a lot of space on the page. Select the view “MyExcelFileView”. You may select any out-of-box view as well; just make sure the “Created By” field is also part of the view.
    4. Once the JSLink works, it will filter and show workbooks for the currently logged-in user.
  2. With the page still in edit mode, add the Excel Web Access web part from the Business Data category. Edit its properties and make the connection as below.
    1. Click Connections -> Get Workbook Url From -> the List View web part above -> a connection settings popup will load. Select the field name “Document URL”, finish, and save the Excel web part.
    2. I also unchecked both checkboxes under “Title Bar” and set Type of Toolbar to “No Toolbar”.
    3. Save the web part. Save the page and publish, and approve if required.

The page should look as below, and you can see the first Excel workbook is already loaded in the Excel Web Access web part. If you click the small icon in the List View web part to select another workbook, it will show in the Excel web part.

Here is the JavaScript we used in the JSLink.

Type.registerNamespace('SSPXTesting')
SSPXTesting.Disp = SSPXTesting.Disp || {};
SSPXTesting.Disp.Templates = SSPXTesting.Disp.Templates || {};
SSPXTesting.Disp.Functions = SSPXTesting.Disp.Functions || {};
var currentUser = null;    

function getCurrentUser(siteurl)
{	    
    $.ajax({
        async: false,
        headers: { "accept": "application/json; odata=verbose" },
        method: "GET",
        url: siteurl + "/_api/web/CurrentUser",
        success: function (data) { 
            currentUser = data.d;
        },
		error: function (error) {
			console.log('some error');
			//alert("error:" +error);
		}
    });    
}

renderListItemTemplate = function (renderCtx) {		
		var	workbookname = "";			
		//File_x0020_Type	"xlsm" or "xlsx"
		var filetype = renderCtx.CurrentItem["File_x0020_Type"];
		var fileAuthor = renderCtx.CurrentItem.Author;	
        if(filetype == "xlsm" || filetype == "xlsx") 
        {
			if(!currentUser)
			{
				getCurrentUser(renderCtx.HttpRoot)
			}			
			if(fileAuthor[0].email == currentUser.Email)
			{
				return RenderItemTemplate(renderCtx);
			}
			else
			{
				//return empty string
				return '';
			}
        }
		else
		{
			// return empty string
			return '';  
		}        
}	

SSPXTesting.Disp.Templates.Item = renderListItemTemplate

SSPXTesting.Disp.Functions.RegisterField = function () {
    SPClientTemplates.TemplateManager.RegisterTemplateOverrides(SSPXTesting.Disp)
}

SSPXTesting.Disp.Functions.MdsRegisterField = function () {
    var thisUrl = _spPageContextInfo.siteServerRelativeUrl + "/SiteAssets/POC_ListItemFiltering.js";
    SSPXTesting.Disp.Functions.RegisterField();
    RegisterModuleInit(thisUrl, SSPXTesting.Disp.Functions.RegisterField)
};


if (typeof _spPageContextInfo != "undefined" && _spPageContextInfo != null) {
    SSPXTesting.Disp.Functions.MdsRegisterField();
}
else {
    SSPXTesting.Disp.Functions.RegisterField();
}

Get SPO sites with filter using SPOSitePropertiesEnumerableFilter


This post is a contribution from Aaron Miao, an engineer with the SharePoint Developer Support team

To get a list of all OneDrive for Business sites in the tenant with the SharePoint Online Management Shell, run:

Get-SPOSite -Template "SPSPERS"

 

This will return all OneDrive for Business sites.

It can be done with the CSOM APIs as well. In fact, there is a class called SPOSitePropertiesEnumerableFilter. This class makes it especially easy to get all sites of a specific type.

The SPOSitePropertiesEnumerableFilter class is included in
Namespace: Microsoft.Online.SharePoint.TenantAdministration
Assembly: Microsoft.Online.SharePoint.Client.Tenant (in Microsoft.Online.SharePoint.Client.Tenant.dll)
You can get the Microsoft.Online.SharePoint.Client.Tenant.dll assembly from the latest version of the NuGet package Microsoft.SharePointOnline.CSOM.

The code below shows how to use the SPOSitePropertiesEnumerableFilter class to get different types of sites.

using System;
using System.Collections.Generic;
using System.Security;
using Microsoft.SharePoint.Client;
using Microsoft.Online.SharePoint.TenantAdministration;

namespace GetAllSitesWithFilter
{
    class Program
    {
        static string TENANT = "mytenant";
        static string ADMINURL = "https://" + TENANT + "-admin.sharepoint.com";

        static void Main(string[] args)
        {
            //Specify tenant admin and site URL
            string adminUser = "admin@" + TENANT + ".onmicrosoft.com";
            SecureString password = "adminpassword".ConvertToSecureString();
            SharePointOnlineCredentials adminCred = new SharePointOnlineCredentials(adminUser, password);

            List<SiteProperties> list = new List<SiteProperties>();

            SPOSitePropertiesEnumerable ssp = null;

            using (ClientContext cc = new ClientContext(ADMINURL))
            {
                cc.Credentials = adminCred;

                String nextIndex = null;
                Tenant tenant = new Tenant(cc);

                SPOSitePropertiesEnumerableFilter sspFilter = new SPOSitePropertiesEnumerableFilter()
                {
                    // get personal sites
                    IncludePersonalSite = PersonalSiteFilter.Include,  // needed for personal sites
                    IncludeDetail = true,
                    Template = "SPSPERS"

                    // get classic team sites
                    //IncludeDetail = true,
                    //Template = "STS"

                    // get modern sites
                    //IncludeDetail = true,
                    //Template = "GROUP"

                    // get communication sites
                    //IncludeDetail = true,
                    //Template = "SITEPAGEPUBLISHING"
                };

                do
                {
                    sspFilter.StartIndex = nextIndex;
                    ssp = tenant.GetSitePropertiesFromSharePointByFilters(sspFilter);

                    cc.Load(ssp);
                    cc.ExecuteQuery();

                    list.AddRange(ssp);
                    nextIndex = ssp.NextStartIndexFromSharePoint;
                } while (nextIndex != null);
            }

            foreach(SiteProperties siteProp in list)
            {
                Console.WriteLine(string.Format("Site Url: {0} - Template: {1}", siteProp.Url, siteProp.Template));
            }

        }
    }

    static class StringExtension
    {
        public static SecureString ConvertToSecureString(this string item)
        {
            if (!string.IsNullOrEmpty(item))
            {
                var secureString = new SecureString();
                foreach (var c in item)
                {
                    secureString.AppendChar(c);
                }
                return secureString;
            }
            return null;
        }
    }
}

As you can see, for getting all OneDrive sites, the filter can be set like this:

SPOSitePropertiesEnumerableFilter sspFilter = new SPOSitePropertiesEnumerableFilter()
{
  IncludePersonalSite = PersonalSiteFilter.Include,
  IncludeDetail = true,
  Template = "SPSPERS" // template name for personal site
};

Note that IncludePersonalSite = PersonalSiteFilter.Include is needed to work with “SPSPERS” template filter. Without setting IncludePersonalSite, it returns no sites even if Template = "SPSPERS" is specified.
For getting all classic team sites,

SPOSitePropertiesEnumerableFilter sspFilter = new SPOSitePropertiesEnumerableFilter()
{
  IncludeDetail = true,
  Template = "STS"
};

For getting all modern team sites,

SPOSitePropertiesEnumerableFilter sspFilter = new SPOSitePropertiesEnumerableFilter()
{
  IncludeDetail = true,
  Template = "GROUP"
};

For getting all communication sites,

SPOSitePropertiesEnumerableFilter sspFilter = new SPOSitePropertiesEnumerableFilter()
{
  IncludeDetail = true,
  Template = "SITEPAGEPUBLISHING"
};

Notice that the sample code does not set the Template filter like Template = "STS#0"; with or without "#0", it returns the same result.

Using Microsoft Graph API to convert the format of your documents


This post is a contribution from Jing Wang, an engineer with the SharePoint Developer Support team

Many SharePoint Online customers want to convert their Word documents, or other documents in SPO, to PDF files programmatically.
Within the SharePoint Online user interface you can convert them one at a time manually, but users sometimes wish to convert multiple documents automatically, without having to open each document library, locate the documents, and click through the options one at a time.

With the new Graph API endpoints listed below, the above requirement can be automated with custom code, for example in C# or JavaScript.

GET /drive/items/{item-id}/content?format={format}
GET /drive/root:/{path and filename}:/content?format={format}

Format options
The following values are valid for the format parameter:

Format value: pdf
Description: Converts the item into PDF format.
Supported source extensions: csv, doc, docx, odp, ods, odt, pot, potm, potx, pps, ppsx, ppsxm, ppt, pptm, pptx, rtf, xls, xlsx

See details of the endpoint here:
https://developer.microsoft.com/en-us/graph/docs/api-reference/v1.0/api/driveitem_get_content_format

Sample - Complete solution in C#:

Step I: Create a native app in the Azure portal and grant it permissions for the Graph API.

 

Step II: Create a console application and add the required DLLs and their references. These can be added as NuGet packages:
Microsoft.IdentityModel.Clients.ActiveDirectory.dll (NuGet package Microsoft.IdentityModel.Clients.ActiveDirectory)
Newtonsoft.Json.dll (NuGet package Newtonsoft.Json)
The upload step of the sample also uses the SharePoint CSOM assemblies (NuGet package Microsoft.SharePointOnline.CSOM).

Add the below using statements

using Microsoft.IdentityModel.Clients.ActiveDirectory;
using Newtonsoft.Json;

Step III: Implement the code with the Graph API to convert the document, download it locally, and upload it back to the SPO site.
Note: The ADAL library is used for authentication.

Source code:
-------------------------------

using Microsoft.IdentityModel.Clients.ActiveDirectory;
using Newtonsoft.Json;
using System;
using System.Collections.Generic;
using System.Linq;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Security.Cryptography.X509Certificates;
using System.Text;
using System.Threading.Tasks;
using System.Net;
using System.Security.Claims;
using System.IO;
using Microsoft.SharePoint.Client;
using System.Security;


namespace ConsoleApp1
{
    public static class StreamExtensions
    {
        public static byte[] ReadAllBytes(this Stream instream)
        {
            if (instream is MemoryStream)
                return ((MemoryStream)instream).ToArray();

            using (var memoryStream = new MemoryStream())
            {
                instream.CopyTo(memoryStream);
                return memoryStream.ToArray();
            }
        }
    }
    class Program
    {
        private static string TENANT_NAME = "mycompany.onmicrosoft.com";
        private static string resource = "https://graph.microsoft.com";
        private static string loginname = "user@mycompany.onmicrosoft.com";
        private static string loginpassword = "*********";
        private static string AzureTenantID = "********-f247-4d48-a45d-************";
        private static string spositeUrl = "https://mycompany.sharepoint.com/*********";
        private static string destinationDocumentLibrary = "dl1"; 

        static void Main(string[] args)
        {
            // Acquire a user token with ADAL using username/password credentials
            UserPasswordCredential userPasswordCredential = new UserPasswordCredential(loginname, loginpassword);
            var graphauthority = "https://login.microsoftonline.com/" + AzureTenantID;
            AuthenticationContext authContext = new AuthenticationContext(graphauthority);
            var token = authContext.AcquireTokenAsync(resource, "94b1544c-35e8-4d45-a941-c3dbaab283dc", userPasswordCredential).Result.AccessToken;

            // Create a new HttpWebRequest object to the mentioned URL.
            HttpWebRequest myHttpWebRequest = (HttpWebRequest)WebRequest.Create("https://graph.microsoft.com/v1.0/me/drive/root:/orange.docx:/content?format=pdf");
            //HttpWebRequest myHttpWebRequest = (HttpWebRequest)WebRequest.Create("https://graph.microsoft.com/v1.0/drives/b!zMNDej1sNEG0SanRDltXfAVTYcdt1pdIggMBPYZYp9Wgdi3ir9sFQJXof6j8GNUD/root:/Repro.docx:/content?format=pdf");
            myHttpWebRequest.AllowAutoRedirect = false;
            myHttpWebRequest.Headers.Set("Authorization", ("Bearer " + token));
            HttpWebResponse myHttpWebResponse = (HttpWebResponse)myHttpWebRequest.GetResponse();
            string downloadPath = myHttpWebResponse.GetResponseHeader("Location");
            Console.WriteLine("Download PDF file from here:\n " + downloadPath);

            //Get the file Stream with Location Url
            HttpWebRequest HttpWebRequest_download = (HttpWebRequest)WebRequest.Create(downloadPath);
            HttpWebRequest_download.Accept = "blob";

            var response = (HttpWebResponse)HttpWebRequest_download.GetResponse();
            Stream myStream = response.GetResponseStream();
            FileStream targetFile = new FileStream("C:\\temp\\orange_converted_localcopy.pdf", FileMode.Create);
            myStream.CopyTo(targetFile);
            myStream.Close();                   
            response.Close();

            // You can continue to use the Graph API to upload the document back to OneDrive or another SPO site.
            // Since we already used loginname/password above, we will use simple CSOM to upload the file to another SPO site as a quick demo.
            using (var clientContext = new ClientContext(spositeUrl))
            {
                SecureString passWord = new SecureString();
                foreach (char c in loginpassword.ToCharArray()) passWord.AppendChar(c);

                clientContext.Credentials = new SharePointOnlineCredentials(loginname, passWord);
                var web = clientContext.Web;
                clientContext.Load(web);
                clientContext.ExecuteQuery();

                List dl = web.Lists.GetByTitle(destinationDocumentLibrary);
                clientContext.Load(dl);
                clientContext.ExecuteQuery();

                //Upload the converted file to SPO site
                targetFile.Position = 0;               
                    var fci = new FileCreationInformation
                    {
                        Url = "orange_converted_spocopy.pdf",
                        ContentStream = targetFile,
                        Overwrite = true
                    };
                    Folder folder = dl.RootFolder;
                    FileCollection files = folder.Files;
                    Microsoft.SharePoint.Client.File file = files.Add(fci);
                    clientContext.Load(files);
                    clientContext.Load(file);
                    clientContext.ExecuteQuery();

                    targetFile.Close();
                    response.Close();            

                Console.WriteLine("Converted file is uploaded to SPO site - orange_converted_spocopy.pdf");
                Console.ReadKey();
            }
        }
       
    }
    }

Here is the converted file downloaded locally.

The converted pdf is also uploaded to this SPO site:

While generating the URL for the Graph API endpoint, I found it somewhat tricky to identify the drive ID for a specific SPO site, so here is the approach to get it.

First, use the following URL format in Graph Explorer to retrieve the drive ID:
https://graph.microsoft.com/v1.0/sites/[spositehostname]:/[sites/pub]:/drive

For example, if the SPO site url is:
https://mycompany.sharepoint.com/sites/testsite

Url to put in Graph Explorer is:
https://graph.microsoft.com/v1.0/sites/mycompany.sharepoint.com:/sites/testsite:/drive

Output has the drive id:
--

{
    "@odata.context": "https://graph.microsoft.com/beta/$metadata#drives/$entity",
    "createdDateTime": "2017-12-04T19:48:25Z",
    "description": "This system library was created by the Publishing feature to store documents that are used on pages in this site.",
    "id": "b!zMNDej1sNEG0SanRDltXfAVTYcdt1pdIggMBPYZYp9Wgdi3ir9sFQJXof6j8GNUD",
    "lastModifiedDateTime": "2018-10-11T04:08:37Z",
    "name": "Documents",
    "webUrl": "https://mycompany.sharepoint.com/sites/****/Documents",
    "driveType": "documentLibrary",
    "createdBy": {
        "user": {
            "displayName": "System Account"
        }
  }
}

I have a file named “Repro.docx” in the root of above drive:

So the file’s conversion endpoint is:
https://graph.microsoft.com/v1.0/drives/b!zMNDej1sNEG0SanRDltXfAVTYcdt1pdIggMBPYZYp9Wgdi3ir9sFQJXof6j8GNUD/root:/Repro.docx:/content?format=pdf
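
If you prefer to resolve the drive id in code instead of Graph Explorer, here is a hypothetical helper (not part of the original sample) that reuses the bearer token acquired with ADAL earlier in this post; add the extra using directives at the top of the file:

using System.IO;
using System.Net;
using Newtonsoft.Json.Linq;

static string GetDriveId(string token, string hostName, string sitePath)
{
    // e.g. hostName = "mycompany.sharepoint.com", sitePath = "sites/testsite"
    string url = "https://graph.microsoft.com/v1.0/sites/" + hostName + ":/" + sitePath + ":/drive";
    HttpWebRequest request = (HttpWebRequest)WebRequest.Create(url);
    request.Headers.Set("Authorization", "Bearer " + token);

    using (var response = (HttpWebResponse)request.GetResponse())
    using (var reader = new StreamReader(response.GetResponseStream()))
    {
        // The response body is the drive object shown above; "id" is the drive id.
        return (string)JObject.Parse(reader.ReadToEnd())["id"];
    }
}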

SharePoint Online Authentication in Powershell for CSOM when Legacy Authentication is disabled for tenant or Multi Factor Authentication is enabled for user


This post is a contribution from Sohail Sayed, an engineer with the SharePoint Developer Support team

Authentication using the SharePointOnlineCredentials class works only if legacy authentication is enabled. If your tenant administrator has disabled legacy authentication, SharePointOnlineCredentials will not be able to authenticate. It also fails if the user account has multi-factor authentication enabled. In this scenario you are required to use modern authentication, which uses OAuth. In PowerShell, most SharePoint Online cmdlets can handle this scenario because the Connect-SPOService command can handle it. However, even if you authenticate successfully using Connect-SPOService, you cannot make CSOM calls, because Connect-SPOService does not return a credentials object that can be used with the CSOM ClientContext class. You can work around this scenario using the OfficeDevPnP.Core.AuthenticationManager class. More details follow below.

Checking if Legacy Auth is disabled

  • To check if legacy auth is disabled, open the SharePoint Online Management Shell.
  • Run the command “Connect-SPOService -Url https://<tenant>-admin.sharepoint.com”. Replace <tenant> with your tenant name.
  • Enter credentials if prompted to authenticate.
  • Run the command “Get-SPOTenant”.
  • Check the property “LegacyAuthProtocolsEnabled”. If this is set to true, legacy authentication is enabled; otherwise it is disabled.

It is possible that this setting is set to true but legacy authentication is blocked via conditional access policies set by your tenant administrator. If conditional access policies are configured to block legacy authentication, you will see an appropriate message in the network / Fiddler trace for the endpoint https://login.microsoftonline.com/rst2.srf.
Below is an example of the response you will see in Fiddler:

<S:Fault>
<S:Code>
<S:Value>S:Sender</S:Value>
<S:Subcode>
<S:Value>wst:FailedAuthentication</S:Value>
</S:Subcode>
</S:Code>
<S:Reason>
<S:Text xml:lang="en-US">Authentication Failure</S:Text>
</S:Reason>
<S:Detail>
<psf:error xmlns:psf="http://schemas.microsoft.com/Passport/SoapServices/SOAPFault">
<psf:value>0x80048823</psf:value><psf:internalerror>
<psf:code>0x80048823</psf:code>
<psf:text>AADSTS53003: Blocked by conditional access.</psf:text>
</psf:internalerror></psf:error>
</S:Detail>
</S:Fault>

Alternatively, legacy authentication is enabled and there are no conditional access policies blocking the authentication, but SharePointOnlineCredentials still fails. This is most likely due to multi-factor authentication on that user account, which can easily be verified by logging in to the SharePoint site with that user account in the browser, preferably in an In-Private browsing session.

Using the OfficeDevPnP.Core.AuthenticationManager to authenticate
We need to download specific versions of assemblies for using the OfficeDevPnP.Core.AuthenticationManager class. You can get these from the NuGet package site.

Note: once you go to the NuGet URL you will find the “Manual Download” link on the right.

Download the file and rename it to a .zip extension.
Extract the contents of the zip file. We will be referencing this in the PowerShell script.
The DLLs can be found in the /lib/net45 subfolder.

Code Example

Below is the code sample demonstrating using the OfficeDevPnP.Core.AuthenticationManager class for authentication.

import-module microsoft.online.sharepoint.powershell
#download https://www.nuget.org/packages/Microsoft.SharePointOnline.CSOM/16.1.7723.1200
[System.Reflection.Assembly]::LoadFile("C:\TEMP\microsoft.sharepointonline.csom.16.1.7723.1200\lib\net45\Microsoft.SharePoint.Client.dll")
[System.Reflection.Assembly]::LoadFile("C:\TEMP\microsoft.sharepointonline.csom.16.1.7723.1200\lib\net45\Microsoft.SharePoint.Client.Runtime.dll")
[System.Reflection.Assembly]::LoadFile("C:\TEMP\microsoft.sharepointonline.csom.16.1.7723.1200\lib\net45\Microsoft.SharePoint.Client.Taxonomy.dll")
#download https://www.nuget.org/packages/Microsoft.IdentityModel.Clients.ActiveDirectory/2.29.0
[System.Reflection.Assembly]::LoadFile("C:\TEMP\microsoft.identitymodel.clients.activedirectory\lib\net45\Microsoft.IdentityModel.Clients.ActiveDirectory.dll")
#download https://www.nuget.org/packages/SharePointPnPCoreOnline/2.26.1805.1
[System.Reflection.Assembly]::LoadFile("C:\TEMP\sharepointpnpcoreonline.2.26.1805.1\lib\net45\OfficeDevPnP.Core.dll")
$adminUrl = "https://<<tenant>>-admin.sharepoint.com"
$siteUrl = "https://<<tenant>>.sharepoint.com"
Connect-SPOService -Url $adminUrl
$authManager = new-object OfficeDevPnP.Core.AuthenticationManager;
$clientContext = $authManager.GetWebLoginClientContext($siteUrl);
#testing CSOM calls
$clientContext.Load($clientContext.Web)
$clientContext.ExecuteQuery();
Write-Host $clientContext.Web.Title

You can see that we use the Connect-SPOService command first. This causes the authentication prompt and allows the user to authenticate successfully even if legacy authentication is disabled or multi-factor authentication is enabled. We then call $authManager.GetWebLoginClientContext($siteUrl), which returns a ClientContext object that uses the same credentials, allowing the CSOM calls to authenticate successfully.

Note that the user will need to enter credentials every time the PowerShell script executes in a new PowerShell console session. This approach is not feasible if you have a PowerShell script executing in the background without user interaction, like a scheduled task. In that case you need to use the app-only authentication approach with a client ID and client secret, as in the sketch below.
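
Below is a minimal sketch of that app-only approach using the same OfficeDevPnP.Core library, written in C# (the site URL, client ID, and client secret are placeholders, and an add-in registration via AppRegNew.aspx/AppInv.aspx is assumed); the same AuthenticationManager method can be called from PowerShell once the DLLs above are loaded:

using System;
using Microsoft.SharePoint.Client;
using OfficeDevPnP.Core;

class AppOnlyExample
{
    static void Main()
    {
        string siteUrl = "https://<<tenant>>.sharepoint.com";
        string clientId = "<client id>";
        string clientSecret = "<client secret>";

        var authManager = new AuthenticationManager();
        // Returns a ClientContext authenticated with the add-in principal only (no user prompt).
        using (ClientContext clientContext = authManager.GetAppOnlyAuthenticatedContext(siteUrl, clientId, clientSecret))
        {
            clientContext.Load(clientContext.Web);
            clientContext.ExecuteQuery();
            Console.WriteLine(clientContext.Web.Title);
        }
    }
}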

 

Fixing on the fly OAuth issue for Provider Hosted Add-in in GCC High or DOD Tenant


This post is a contribution from Mustaq Patel, an engineer with the SharePoint Developer Support team

Requirement: You have a low-trust provider-hosted add-in that does on-the-fly OAuth (requests permissions at runtime). You try to use this provider-hosted add-in on a GCC High or DoD tenant. The add-in does not work, and you see an error message which seems to indicate that the add-in is looking for the client ID/client secret in the wrong tenant.

Issue: When the add-in does on-the-fly OAuth, the TokenHelper class it uses to start the OAuth process and retrieve the access token (by validating the client ID and client secret in Azure Active Directory) goes to the wrong AAD (the public AAD) and comes back without any record matching that client ID, so the add-in fails. This is due to the hard-coded string below in the TokenHelper.cs file, which is not valid for on-the-fly OAuth on GCC High/DoD tenants.

private static string AcsHostUrl = "accesscontrol.windows.net";

To fix this, modify that line as below.

private static string AcsHostUrl = "login.microsoftonline.us";

Also modify method GetAcsGlobalEndpointUrl as below.

private static string GetAcsGlobalEndpointUrl()
{
    return String.Format(CultureInfo.InvariantCulture, "https://{0}/", AcsHostUrl);
}

These two changes fix the provider-hosted add-in by causing the OAuth requests to go to the correct authentication URL for the GCC High AAD.

If you are using the same provider-hosted add-in for both GCC High and the public cloud, or want to maintain the same code base for both types of cloud, then we have to do some extra work. The logic is to find the top-level domain of the SharePoint site: if it is .us, go to login.microsoftonline.us, and if it is .com, go to accesscontrol.windows.net.

Below is the complete code change.

  1. Declare the static variables as below (GCCHAcsHostUrl and SPTargetUrl are the new ones).
    private static string GlobalEndPointPrefix = "accounts";
    private static string AcsHostUrl = "accesscontrol.windows.net";
    private static string GCCHAcsHostUrl = "login.microsoftonline.us";
    private static string  SPTargetUrl = "";
    
  2.  Modify GetAuthorizationUrl to set SPTargetUrl as below
    public static string GetAuthorizationUrl(string contextUrl, string scope, string redirectUri)
    {
        SPTargetUrl = contextUrl;
        return string.Format(
         "{0}{1}?IsDlg=1&client_id={2}&scope={3}&response_type=code&redirect_uri={4}",
         EnsureTrailingSlash(contextUrl),
         AuthorizationPage,
         ClientId,
         scope,
         redirectUri);
    }
    
  3. Add the method below to get the top-level domain of the SharePoint URL.
    private static string GetTopDomainType(Uri uri)
    {
        if (!uri.HostNameType.Equals(UriHostNameType.Dns) || uri.IsLoopback)
            return string.Empty; // or throw an exception            
        return String.Format(CultureInfo.InvariantCulture, "{0}", uri.Host.Split('.').Last());
    }
    
  4. Modify GetAcsGlobalEndpointUrl as below to determine whether SharePoint is in GCC High by checking the top-level domain (.us).
    private static string GetAcsGlobalEndpointUrl()
    {
        string topDomain = GetTopDomainType(new Uri(SPTargetUrl));
        if (string.Equals(topDomain, "us", StringComparison.OrdinalIgnoreCase))
        {
            AcsHostUrl = GCCHAcsHostUrl;
            return String.Format(CultureInfo.InvariantCulture, "https://{0}/", AcsHostUrl);
        }
        else
        {
            return String.Format(CultureInfo.InvariantCulture, "https://{0}.{1}/", GlobalEndPointPrefix, AcsHostUrl);
        }
    }
    
  5. Deploy the Provider Hosted add-in with this code change and test the add-in.

Please note that from the provider-hosted add-in, you have to call the GetAuthorizationUrl method to construct the redirect URL for add-in consent.

For more details, please refer to the articles below:
On-the-fly OAuth - https://docs.microsoft.com/en-us/sharepoint/dev/sp-add-ins/authorization-code-oauth-flow-for-sharepoint-add-ins
GCC High and DoD - https://docs.microsoft.com/en-us/office365/servicedescriptions/office-365-platform-service-description/office-365-us-government/gcc-high-and-dod
