
Azure AD Connect Staging Mode

Azure AD Connect is the tool used to connect an on-premises directory service with Azure AD. It allows users to use their on-premises IDs and passwords to authenticate to Azure AD, Office 365, or other applications hosted in Azure. Azure AD Connect can be installed on any server that meets the following requirements:

The AD forest functional level must be Windows Server 2003 or later. 

If you plan to use the feature password writeback, then the Domain Controllers must be on Windows Server 2008 (with latest SP) or later. If your DCs are on 2008 (pre-R2), then you must also apply hotfix KB2386717.

The domain controller used by Azure AD must be writable. It is not supported to use a RODC (read-only domain controller) and Azure AD Connect does not follow any write redirects.

It is not supported to use on-premises forests/domains using SLDs (Single Label Domains).

It is not supported to use on-premises forests/domains using "dotted" (name contains a period ".") NetBios names.

Azure AD Connect cannot be installed on Small Business Server or Windows Server Essentials. The server must be using Windows Server standard or better.

The Azure AD Connect server must have a full GUI installed. It is not supported to install on server core.

Azure AD Connect must be installed on Windows Server 2008 or later. This server may be a domain controller or a member server when using express settings. If you use custom settings, then the server can also be stand-alone and does not have to be joined to a domain.

If you install Azure AD Connect on Windows Server 2008 or Windows Server 2008 R2, then make sure to apply the latest hotfixes from Windows Update. The installation is not able to start with an unpatched server.

If you plan to use the feature password synchronization, then the Azure AD Connect server must be on Windows Server 2008 R2 SP1 or later.

If you plan to use a group managed service account, then the Azure AD Connect server must be on Windows Server 2012 or later.

The Azure AD Connect server must have .NET Framework 4.5.1 or later and Microsoft PowerShell 3.0 or later installed.

If Active Directory Federation Services is being deployed, the servers where AD FS or Web Application Proxy are installed must be Windows Server 2012 R2 or later. Windows remote management must be enabled on these servers for remote installation.

If Active Directory Federation Services is being deployed, you need SSL Certificates.

If Active Directory Federation Services is being deployed, then you need to configure name resolution.

If your global administrators have MFA enabled, then the URL https://secure.aadcdn.microsoftonline-p.com must be in the trusted sites list. You are prompted to add this site to the trusted sites list when you are prompted for an MFA challenge and it has not been added before. You can use Internet Explorer to add it to your trusted sites.

Azure AD Connect requires a SQL Server database to store identity data. By default a SQL Server 2012 Express LocalDB (a light version of SQL Server Express) is installed. SQL Server Express has a 10GB size limit that enables you to manage approximately 100,000 objects. If you need to manage a higher volume of directory objects, you need to point the installation wizard to a different installation of SQL Server.

What is staging mode? 
 
At any given time, only one Azure AD Connect instance can be involved in the sync process for a directory. This creates a few challenges:
 
Disaster Recovery – If the server running Azure AD Connect is hit by a disaster, the sync process is affected. The impact is worse if you are using features such as pass-through authentication, single sign-on, or password writeback through Azure AD Connect.
Upgrades – If the system running Azure AD Connect needs an upgrade, or if Azure AD Connect itself needs an upgrade, the sync process is affected. Again, the affordable downtime will depend on the features in use and the organization's dependency on Azure AD Connect and its operations. 
Testing New Features – Microsoft keeps adding new features to Azure AD Connect. Before introducing those to production, it is always good to simulate them and see the impact. With only one instance, that is not possible, and even a demo environment may not simulate the same impact as production. 
 
Microsoft introduced the staging mode of Azure AD Connect to overcome the above challenges. Staging mode allows you to maintain a second Azure AD Connect instance on another server. It can have the same configuration as the primary server. It will connect to Azure AD, receive changes, and keep an up-to-date copy to make the switchover as seamless as possible. However, it will not sync the Azure AD Connect configuration from the primary server: it is the engineer's responsibility to update the staging server's configuration whenever the primary server's configuration is modified. 
 
Installation
 
Let’s see how we can configure Azure AD connect in staging mode.
 
1) Prepare a server according to the guidelines given in the prerequisites section for installing Azure AD Connect. 
2) Review the current configuration of Azure AD Connect running on the primary server. You can check this via Azure AD Connect | View current configuration 

sta1
 
sta2
 
3) Log in to the server as a Domain Administrator and download the latest Azure AD Connect from https://www.microsoft.com/en-us/download/details.aspx?id=47594
4) During the installation, select the Customize option. 
 
sta3
 
5) Then proceed with the configuration according to the settings used on the primary server. 
6) At the last step of the configuration, select Enable staging mode: When selected, synchronization will not export any data to AD or Azure AD, and then click Install
 
sta4
 
7) Once the installation is completed, open the Synchronization Service (Azure AD Connect | Synchronization Service) and confirm there are no sync jobs running. Staging mode can also be confirmed from PowerShell, as shown below. 
 
sta5
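This is a minimal check, assuming the ADSync module that ships with Azure AD Connect is available on the server:

Import-Module ADSync
# StagingModeEnabled returns True while the server is in staging mode
(Get-ADSyncScheduler).StagingModeEnabled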
 
Verify data
 
As I mentioned before, the staging server allows you to simulate an export before it is made primary. This is important if you are implementing new configuration changes. 
 
In order to prepare a staged copy of the export: 
 
1) Go to Start | Azure AD Connect | Synchronization Service | Connectors 
 
sta6
 
2) Select the Active Directory Domain Services connector and click on Run from the right-hand panel. 
 
sta7
 
3) Then, in the next window, select Full Import and click OK.
 
sta8
 
4) Repeat the same for the Windows Azure Active Directory (Microsoft) connector. 
5) Once both jobs are completed, select the Active Directory Domain Services connector and click on Run from the right-hand panel again. But this time select Delta Synchronization, and click OK.
 
sta9
 
6) Repeat the same for the Windows Azure Active Directory (Microsoft) connector.
7) Once both jobs have finished, go to the Operations tab and verify the jobs completed successfully. 
 
sta10
 
Now that we have the staging copy, the next step is to verify the data is presented as expected. To do that, we need the help of a PowerShell script.  

 
Param(
    [Parameter(Mandatory=$true, HelpMessage="Must be a file generated using csexport 'Name of Connector' export.xml /f:x)")]
    [string]$xmltoimport="%temp%\exportedStage1a.xml",
    [Parameter(Mandatory=$false, HelpMessage="Maximum number of users per output file")][int]$batchsize=1000,
    [Parameter(Mandatory=$false, HelpMessage="Show console output")][bool]$showOutput=$false
)

#LINQ isn't loaded automatically, so force it
[Reflection.Assembly]::Load("System.Xml.Linq, Version=3.5.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089") | Out-Null

[int]$count=1
[int]$outputfilecount=1
[array]$objOutputUsers=@()

#XML must be generated using "csexport "Name of Connector" export.xml /f:x"
write-host "Importing XML" -ForegroundColor Yellow

#XmlReader.Create won't properly resolve the file location,
#so expand and then resolve it
$resolvedXMLtoimport=Resolve-Path -Path ([Environment]::ExpandEnvironmentVariables($xmltoimport))

#use an XmlReader to deal with even large files
$result=$reader = [System.Xml.XmlReader]::Create($resolvedXMLtoimport) 
$result=$reader.ReadToDescendant('cs-object')
do 
{
    #create the object placeholder
    #adding them up here means we can enforce consistency
    $objOutputUser=New-Object psobject
    Add-Member -InputObject $objOutputUser -MemberType NoteProperty -Name ID -Value ""
    Add-Member -InputObject $objOutputUser -MemberType NoteProperty -Name Type -Value ""
    Add-Member -inputobject $objOutputUser -MemberType NoteProperty -Name DN -Value ""
    Add-Member -inputobject $objOutputUser -MemberType NoteProperty -Name operation -Value ""
    Add-Member -inputobject $objOutputUser -MemberType NoteProperty -Name UPN -Value ""
    Add-Member -inputobject $objOutputUser -MemberType NoteProperty -Name displayName -Value ""
    Add-Member -inputobject $objOutputUser -MemberType NoteProperty -Name sourceAnchor -Value ""
    Add-Member -inputobject $objOutputUser -MemberType NoteProperty -Name alias -Value ""
    Add-Member -inputobject $objOutputUser -MemberType NoteProperty -Name primarySMTP -Value ""
    Add-Member -inputobject $objOutputUser -MemberType NoteProperty -Name onPremisesSamAccountName -Value ""
    Add-Member -inputobject $objOutputUser -MemberType NoteProperty -Name mail -Value ""

    $user = [System.Xml.Linq.XElement]::ReadFrom($reader)
    if ($showOutput) {Write-Host Found an exported object... -ForegroundColor Green}

    #object id
    $outID=$user.Attribute('id').Value
    if ($showOutput) {Write-Host ID: $outID}
    $objOutputUser.ID=$outID

    #object type
    $outType=$user.Attribute('object-type').Value
    if ($showOutput) {Write-Host Type: $outType}
    $objOutputUser.Type=$outType

    #dn
    $outDN= $user.Element('unapplied-export').Element('delta').Attribute('dn').Value
    if ($showOutput) {Write-Host DN: $outDN}
    $objOutputUser.DN=$outDN

    #operation
    $outOperation= $user.Element('unapplied-export').Element('delta').Attribute('operation').Value
    if ($showOutput) {Write-Host Operation: $outOperation}
    $objOutputUser.operation=$outOperation

    #now that we have the basics, go get the details

    foreach ($attr in $user.Element('unapplied-export-hologram').Element('entry').Elements("attr"))
    {
        $attrvalue=$attr.Attribute('name').Value
        $internalvalue= $attr.Element('value').Value

        switch ($attrvalue)
        {
            "userPrincipalName"
            {
                if ($showOutput) {Write-Host UPN: $internalvalue}
                $objOutputUser.UPN=$internalvalue
            }
            "displayName"
            {
                if ($showOutput) {Write-Host displayName: $internalvalue}
                $objOutputUser.displayName=$internalvalue
            }
            "sourceAnchor"
            {
                if ($showOutput) {Write-Host sourceAnchor: $internalvalue}
                $objOutputUser.sourceAnchor=$internalvalue
            }
            "alias"
            {
                if ($showOutput) {Write-Host alias: $internalvalue}
                $objOutputUser.alias=$internalvalue
            }
            "proxyAddresses"
            {
                if ($showOutput) {Write-Host primarySMTP: ($internalvalue -replace "SMTP:","")}
                $objOutputUser.primarySMTP=$internalvalue -replace "SMTP:",""
            }
        }
    }

    $objOutputUsers += $objOutputUser

    Write-Progress -activity "Processing ${xmltoimport} in batches of ${batchsize}" -status "Batch ${outputfilecount}: " -percentComplete (($objOutputUsers.Count / $batchsize) * 100)

    #every so often, dump the processed users in case we blow up somewhere
    if ($count % $batchsize -eq 0)
    {
        Write-Host Hit the maximum users processed without completion... -ForegroundColor Yellow

        #export the collection of users as a CSV
        Write-Host Writing processedusers${outputfilecount}.csv -ForegroundColor Yellow
        $objOutputUsers | Export-Csv -path processedusers${outputfilecount}.csv -NoTypeInformation

        #increment the output file counter
        $outputfilecount+=1

        #reset the collection and the user counter
        $objOutputUsers = $null
        $count=0
    }

    $count+=1

    #need to bail out of the loop if no more users to process
    if ($reader.NodeType -eq [System.Xml.XmlNodeType]::EndElement)
    {
        break
    }

} while ($reader.Read)

#need to write out any users that didn't get picked up in a batch of 1000
#export the collection of users as a CSV
Write-Host Writing processedusers${outputfilecount}.csv -ForegroundColor Yellow
$objOutputUsers | Export-Csv -path processedusers${outputfilecount}.csv -NoTypeInformation

 
Save this as analyze.ps1 on the C: drive. 
 
1) Open PowerShell and type cd "C:\Program Files\Microsoft Azure AD Sync\Bin" (if your install path is different, use the relevant path)
2) Then run .\csexport "myrebeladmin.onmicrosoft.com – AAD" C:\export.xml /f:x where "myrebeladmin.onmicrosoft.com – AAD" should be replaced with your Azure AD connector name. This will export the configuration to C:\export.xml
3) Then run C:\analyze.ps1 -xmltoimport C:\export.xml where analyze.ps1 is the script we saved at the beginning of this section. 
4) It will then create a CSV file called processedusers1.csv, which contains all the changes that would sync to Azure AD. 
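Once the CSV is generated, a quick way to summarize what the staging server would export is to group the rows by operation type. This is just a convenience sketch, and it assumes processedusers1.csv was written to the folder you ran the script from:

# summarize pending export operations captured in the report
Import-Csv .\processedusers1.csv | Group-Object operation | Select-Object Name, Count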
 
However, this step is not always required. The staging server can be made primary without the import and verify process. 
 
How to make it the primary server?
 
In order to make the staging server the primary server:
 
1) Go to Start | Azure AD Connect | Azure AD Connect
2) Then click on Configure on the next page. 
3) On the next page, select the option Configure staging mode and click Next
 
 
sta11
 
4) On the next page, provide the Azure AD login credentials for the directory sync account. 
5) In the next window, untick Enable staging mode and click Next
 
sta12
 
6) In the next window, select Start the synchronization process… and click Configure
 
sta13
 
This completes the process of promoting the staging server to primary. Hope this was useful, and if you have any questions feel free to contact me on rebeladm@live.com. Also follow me on Twitter @rebeladm to get updates about new blog posts.

What is Content Freshness protection in DFSR?

Healthy replication is a must for an Active Directory environment. The SYSVOL folder on domain controllers contains policies and logon scripts, and it is replicated between domain controllers to maintain an up-to-date, consistent configuration. Before Windows Server 2008, FRS (File Replication Service) was used to replicate SYSVOL content among domain controllers. With Windows Server 2008, FRS was deprecated and DFS Replication (DFSR) was introduced in its place.

Healthy replication requires healthy communication between domain controllers. Sometimes that communication can be interrupted due to a domain controller failure or a link failure. Depending on the impact, it is still possible that communication is re-established after a period of time. The domain controller will then try to resume replication and catch up with SYSVOL changes. In such a scenario, we may see event 4012 in Event Viewer:

The DFS Replication service stopped replication on the replicated folder at local path c:\xxx. It has been disconnected from other partners for 70 days, which is longer than the MaxOfflineTimeInDays parameter. Because of this, DFS Replication considers this data to be stale, and will replace it with data from other members of the replication group during the next replication. DFS Replication will move the stale files to the local Conflict folder. No user action is required.

With Windows Server 2008, Microsoft introduced a setting called content freshness protection to protect DFS shares from stale data. DFSR uses a multi-master database, similar to Active Directory, and it also has a tombstone time limit similar to Active Directory's. The default value for this is 60 days. If there is no replication for longer than that and replication resumes later, stale data can be reintroduced, similar to lingering objects in AD. To protect against this, we can define a value for MaxOfflineTimeInDays: if the number of days since the last successful DFS replication is larger than MaxOfflineTimeInDays, replication is prevented. 

We can review this value by running:

For /f %m IN ('dsquery server -o rdn') do @echo %m && @wmic /node:"%m" /namespace:\\root\microsoftdfs path DfsrMachineConfig get MaxOfflineTimeInDays

cf1
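The same value can also be read with PowerShell instead of wmic. A minimal sketch, assuming the Active Directory PowerShell module is available to enumerate the domain controllers:

# read MaxOfflineTimeInDays from every domain controller in the domain
foreach ($dc in (Get-ADDomain).ReplicaDirectoryServers) {
    $cfg = Get-WmiObject -ComputerName $dc -Namespace "root\MicrosoftDFS" -Class DfsrMachineConfig
    "{0}: MaxOfflineTimeInDays = {1}" -f $dc, $cfg.MaxOfflineTimeInDays
}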

There are two ways to recover from this. The first method is to increase the value of MaxOfflineTimeInDays. This can be done using:

wmic.exe /namespace:\\root\microsoftdfs path DfsrMachineConfig set MaxOfflineTimeInDays=120

cf2

It is recommended to run this on all domain controllers to maintain the same configuration; a scripted way to do that is sketched below. 
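This sketch pushes the new value to every domain controller in one go, using the same WMI class as the wmic commands above:

# set MaxOfflineTimeInDays to 120 on every domain controller
foreach ($dc in (Get-ADDomain).ReplicaDirectoryServers) {
    $cfg = Get-WmiObject -ComputerName $dc -Namespace "root\MicrosoftDFS" -Class DfsrMachineConfig
    $cfg.MaxOfflineTimeInDays = 120
    $cfg.Put()   # commit the change back through WMI
}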

If you are not willing to change this value, you can still recover using a non-authoritative restore. It will remove all conflicting values and take an updated copy. 

I have already written an article about non-authoritative restore of SYSVOL; it can be found at http://www.rebeladmin.com/2017/08/non-authoritative-authoritative-sysvol-restore-dfs-replication/ 

Note that content freshness protection is not only for SYSVOL replication; it is valid for DFS replication in general. 

Hope this was useful, and if you have any questions feel free to contact me on rebeladm@live.com. Also follow me on Twitter @rebeladm to get updates about new blog posts.

Step-by-Step guide to setup Fine-Grained Password Policies

In an AD environment, we can use password policies to define password security requirements. These settings are located under Computer Configuration | Policies | Windows Settings | Security Settings | Account Policies

fine1

Before Windows Server 2008, only one password policy could be applied to users. But in an environment, some user roles may require additional protection. As an example, an 8-character complex password may be adequate for sales users, but it is not enough for a domain admin account. With Windows Server 2008, Microsoft introduced Fine-Grained Password Policies, which allow different password policies to be applied to specific users and groups. In order to use this feature: 

1) The domain functional level must be at least Windows Server 2008.

2) A Domain Admin or Enterprise Admin account is needed to create policies. 

Similar to group policies, an object may sometimes end up with multiple password policies applied to it, but at any given time an object can only have one effective password policy. Each fine-grained password policy has a precedence value; this integer value is defined during policy setup. A lower precedence value means a higher priority. If multiple policies apply to an object, the policy with the lowest precedence value wins. Also, a policy linked directly to a user object always wins. The winning policy for a given user can be confirmed with the cmdlet shown below. 
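For example, to see the resultant policy for the demo user used later in this post:

# returns the effective fine-grained password policy for the user
Get-ADUserResultantPasswordPolicy -Identity R143869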

We can create the policies using Active Directory Administrative Center or PowerShell. In this demo, I am going to use the PowerShell method. 

New-ADFineGrainedPasswordPolicy -Name "Tech Admin Password Policy" -Precedence 1 `
-MinPasswordLength 12 -MaxPasswordAge "30" -MinPasswordAge "7" `
-PasswordHistoryCount 50 -ComplexityEnabled:$true `
-LockoutDuration "8:00" `
-LockoutObservationWindow "8:00" -LockoutThreshold 3 `
-ReversibleEncryptionEnabled:$false

In the above sample, I am creating a new fine-grained password policy called "Tech Admin Password Policy". New-ADFineGrainedPasswordPolicy is the cmdlet to create a new policy, and the Precedence parameter defines its precedence. The LockoutDuration and LockoutObservationWindow values are defined in hours (8:00 means 8 hours). The LockoutThreshold value defines the number of failed login attempts allowed. 

More info about the syntax can be found using:

Get-Help New-ADFineGrainedPasswordPolicy

Examples can also be viewed using: 

Get-Help New-ADFineGrainedPasswordPolicy -Examples

fine2

Once the policy is set up, we can verify its settings using: 

Get-ADFineGrainedPasswordPolicy -Identity "Tech Admin Password Policy" 

fine3

Now we have the policy in place. The next step is to attach it to groups or users. In my demo, I am going to apply it to a group called "IT Admins":

Add-ADFineGrainedPasswordPolicySubject -Identity "Tech Admin Password Policy" -Subjects "IT Admins"

I am also going to attach it to a user account, R143869:

Add-ADFineGrainedPasswordPolicySubject -Identity "Tech Admin Password Policy" -Subjects "R143869"

We can verify the policy assignment using the following:

Get-ADFineGrainedPasswordPolicy -Identity "Tech Admin Password Policy" | Format-Table AppliesTo -AutoSize

fine4

This confirms the configuration. Hope this was useful, and if you have any questions feel free to contact me on rebeladm@live.com. Also follow me on Twitter @rebeladm to get updates about new blog posts.

When AD password will expire?

In an Active Directory environment, users have to update their passwords when they expire. On some occasions, it is important to know when a user's password will expire.

For a user account, the value for the next password change is saved under the attribute msDS-UserPasswordExpiryTimeComputed

We can view this value for a user account using a PowerShell command like following, 

Get-ADuser R564441 -Properties msDS-UserPasswordExpiryTimeComputed | select Name, msDS-UserPasswordExpiryTimeComputed 

In the above command, I am retrieving the msDS-UserPasswordExpiryTimeComputed attribute for the user R564441. In the output, I am listing the value of the Name attribute along with msDS-UserPasswordExpiryTimeComputed

ex1

In my example, it returned 131412469385705537, but that doesn't mean anything by itself. The value is stored as a Windows file time, so we need to convert it to a readable format. 

I can do it using,

Get-ADuser R564441 -Properties msDS-UserPasswordExpiryTimeComputed | select Name, {[datetime]::FromFileTime($_."msDS-UserPasswordExpiryTimeComputed")}

In the above, the value was converted to datetime format, and now it gives a readable value. 

ex2

We can develop this further to produce a report or send automatic reminders to users. I wrote the following PowerShell script to generate a report covering all the users in AD. 

$passwordexpired = $null
$dc = (Get-ADDomain | Select DNSRoot).DNSRoot
$Report= "C:\report.html"
$HTML=@"
<title>Password Validity Period For $dc</title>
<style>
BODY{background-color :LightBlue}
</style>
"@

$passwordexpired = Get-ADUser -Filter * -Properties "SamAccountName","pwdLastSet","msDS-UserPasswordExpiryTimeComputed" | Select-Object -Property "SamAccountName",@{Name="Last Password Change";Expression={[datetime]::FromFileTime($_."pwdLastSet")}},@{Name="Next Password Change";Expression={[datetime]::FromFileTime($_."msDS-UserPasswordExpiryTimeComputed")}}

$passwordexpired | ConvertTo-Html -Property "SamAccountName","Last Password Change","Next Password Change" -Head $HTML -Body "<H2>Password Validity Period For $dc</H2>" | Out-File $Report

#open the finished report with the default browser
Invoke-Expression C:\report.html

This creates an HTML report like the following. It contains the user name, the last password change date and time, and the date and time the password is going to expire. 

ex3

The attributes used here are SamAccountName, pwdLastSet, and msDS-UserPasswordExpiryTimeComputed. The pwdLastSet attribute holds the date and time of the last password reset. The same data could also drive automatic reminders; a minimal sketch follows. 
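Below is a minimal reminder sketch built on the same attributes. The SMTP server and sender address are placeholders you would need to replace, and it assumes an internal mail relay that accepts unauthenticated SMTP:

# email users whose passwords expire within the next 7 days
$threshold = (Get-Date).AddDays(7)
Get-ADUser -Filter {Enabled -eq $true -and PasswordNeverExpires -eq $false} `
    -Properties "msDS-UserPasswordExpiryTimeComputed","mail" |
ForEach-Object {
    $expiry = [datetime]::FromFileTime($_."msDS-UserPasswordExpiryTimeComputed")
    if ($_.mail -and $expiry -gt (Get-Date) -and $expiry -lt $threshold) {
        Send-MailMessage -To $_.mail -From "helpdesk@yourdomain.com" `
            -Subject "Your password expires on $expiry" `
            -Body "Please change your password before $expiry." `
            -SmtpServer "smtp.yourdomain.com"
    }
}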

Hope this was useful, and if you have any questions feel free to contact me on rebeladm@live.com. Also follow me on Twitter @rebeladm to get updates about new blog posts. 

Step-by-Step guide to configure Azure MFA with ADFS 2016

Multifactor authentication (MFA) is commonly used to protect applications and web services that are published to the internet. It helps to verify the authenticity of authentication requests. There are many multifactor service providers; some are cloud based, and some require on-premises installations.  

Azure MFA was first introduced for use with Azure services and was later developed further to support on-premises workload protection too. It is possible to configure Azure MFA with ADFS 2.0 and ADFS 3.0; however, that configuration requires installing an additional MFA server. With ADFS 4.0 (Windows Server 2016) this is made simple, and we can integrate Azure MFA without the need for an additional server. 

In this post, I am going to walk you through the integration of Azure MFA with ADFS 2016. 

Before we start, we need to look into the prerequisites. 

1. Valid Azure subscription.

2. Azure Global Administrator account 

3. An existing federated Azure AD setup. More info about this configuration can be found at https://docs.microsoft.com/en-gb/azure/active-directory/connect/active-directory-aadconnect-get-started-custom#configuring-federation-with-ad-fs 

4. Windows Server 2016 AD FS installed on-premises

5. Enterprise Administrator Account to configure MFA

6. Users with Azure MFA enabled – http://www.rebeladmin.com/2016/01/step-by-step-guide-to-configure-mfa-multi-factor-authentication-for-azure-users/

7. Windows Azure Active Directory module for Windows PowerShell installed on the ADFS server

Create Certificate in each ADFS server to use with Azure MFA 

The first step of the configuration is to generate a certificate for Azure MFA. This needs to be performed on every ADFS server in the farm. In order to generate the certificate, you can use the following in PowerShell: 

$certbase64 = New-AdfsAzureMfaTenantCertificate -TenantID "Your Tenant ID"

Please replace "Your Tenant ID" with your actual Azure tenant ID. You can find the tenant ID by running Login-AzureRmAccount in Azure PowerShell. 

Once it is generated, the certificate will be in the local computer certificate store. 

cert1
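To confirm this from PowerShell, you can search the local computer personal store. The cmdlet names the certificate after the tenant, so matching the subject on the tenant ID is a reasonable check (the exact subject format is an assumption):

# look for the Azure MFA certificate in the local machine store
$tenantId = "Your Tenant ID"   # same value used above
Get-ChildItem Cert:\LocalMachine\My |
    Where-Object { $_.Subject -like "*$tenantId*" } |
    Format-List Subject, NotAfter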

Add new credentials to connect with Auth Client SPN

Now we have the certificate, but we need to tell the Azure Multi-Factor Auth Client to use it as a credential to connect with AD FS.

Before that, we need to connect to Azure AD using Azure PowerShell. We can do that using:

Connect-MsolService

It will then prompt for login; make sure to use an Azure Global Administrator account to connect.

After that, execute the following command:

New-MsolServicePrincipalCredential -AppPrincipalId 981f26a1-7f43-403b-a875-f8b09b8cd720 -Type asymmetric -Usage verify -Value $certbase64

In the above command, AppPrincipalId defines the GUID for Azure Multi-Factor Auth Client.

Configure ADFS farm to use Azure MFA

Now we have the components ready, and the next step is to configure the ADFS farm to use Azure MFA. In order to do that, run the following PowerShell command:

Set-AdfsAzureMfaTenant -TenantId “Your Tenant ID” -ClientId 981f26a1-7f43-403b-a875-f8b09b8cd720

In the above command, replace "Your Tenant ID" with your Azure tenant ID. The ClientId in the command represents the GUID for the Azure Multi-Factor Auth Client.

cert2

Once it is completed, restart the ADFS service; this can be done as shown below. 
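On Windows Server 2016, the AD FS service name is adfssrv, so from PowerShell:

Restart-Service adfssrv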

Enable Azure MFA globally

The last step of the configuration is to enable Azure MFA for authentication. In order to do that, log in to the ADFS server and go to Server Manager > Tools > AD FS Management. Then, in the MMC, go to Service > Authentication Methods, and in the Actions panel click on Edit Primary Authentication Method.

cert3

This opens up the window to configure global authentication methods. It has two tabs, and we can see Azure MFA on both.

cert4

By selecting each box, you can enable MFA for intranet and extranet. 

This completes the configuration; now you can use Azure MFA with your ADFS farm. Hope this was useful, and if you have any questions feel free to contact me on rebeladm@live.com

Azure Active Directory Seamless Single Sign-On (Azure AD Seamless SSO)

I am sure most of you are aware of what single sign-on (SSO) is in an Active Directory infrastructure and how it works. When we extend identity infrastructures to Azure by using Azure AD, we can also extend single sign-on capabilities for authenticating into cloud workloads. This can be done using an on-premises ADFS farm. Password Hash Synchronization or Pass-through Authentication allows users to use the same user name and password to log in to cloud applications, but this is not "seamless" access: even though they are using the same user name and password, they are still prompted for the password when logging in to Azure workloads. 

In my example below, I have an Azure AD instance integrated with on-premises AD using Pass-through Authentication. In there, I have a user, R272845. I logged in to a domain-joined computer with this user and tried to access an application published via Azure. When I typed the URL and pressed Enter, it redirected me to the Azure AD login page.

sso1

sso2

Azure Active Directory Seamless Single Sign-On is a feature that allows users to authenticate to Azure AD without providing a password again when logging in from a domain-joined/corporate device. It can be integrated with Password Hash Synchronization or Pass-through Authentication. It is still in preview, which means it cannot be used in production environments yet. However, if it doesn't work in an environment, users simply get the typical Azure AD authentication page, so it will not prevent access to any application. This feature is not supported if you are already using the ADFS option.

According to Microsoft, the following are the key features of Azure Active Directory Seamless Single Sign-On (Azure AD Seamless SSO):

Users are automatically signed into both on-premises and cloud-based applications.

Users don't have to enter their passwords repeatedly.

No additional components needed on-premises to make this work.

Works with any method of cloud authentication – Password Hash Synchronization or Pass-through Authentication.

Can be rolled out to some or all your users using Group Policy.

Register non-Windows 10 devices with Azure AD without the need for any AD FS infrastructure. This capability needs you to use version 2.1 or later of the workplace-join client.

Seamless SSO is an opportunistic feature. If it fails for any reason, the user sign-in experience goes back to its regular behavior – i.e., the user needs to enter their password on the sign-in page.

It can be enabled via Azure AD Connect.

It is a free feature, and you don't need any paid editions of Azure AD to use it.

It is supported on web browser-based clients and Office clients that support modern authentication on platforms and browsers capable of Kerberos authentication

According to Microsoft, the following environments are supported:

OS\Browser     Internet Explorer   Edge   Google Chrome   Mozilla Firefox                   Safari
Windows 10     Yes                 No     Yes             Yes, additional config required   N/A
Windows 8.1    Yes                 N/A    Yes             Yes, additional config required   N/A
Windows 8      Yes                 N/A    Yes             Yes, additional config required   N/A
Windows 7      Yes                 N/A    Yes             Yes, additional config required   N/A
Mac OS X       N/A                 N/A    Yes             Yes, additional config required   Yes, additional config required

The current release (at the time this blog post was written) does not support the Edge browser. This feature also will not work when users use private browsing mode in Firefox, or when Enhanced Protected Mode is enabled in IE. 

How does it work?

Before we look into the configuration, let's go ahead and see how it really works. In the following example, a user is trying to access a cloud-based application (integrated with Azure) from a domain-joined device in the corporate network, using his on-premises user name and password. 

Also, it is important to know what happens in the corporate infrastructure when Seamless SSO is enabled:

The system creates an AZUREADSSOACCT computer object in on-premises AD to represent Azure AD.

The AZUREADSSOACCT computer account's Kerberos decryption key is shared with Azure AD.

Two Kerberos service principal names (SPNs) are created to represent the two URLs that are used during Azure AD sign-in: https://autologon.microsoftazuread-sso.com and https://aadg.windows.net.nsatc.net 

sso3

1. The user accesses the application URL using his browser, from a domain-joined device in the corporate network.

2. If the user is not signed in already, he is pointed to the Azure AD sign-in page, where he types his user name.

3. Azure AD challenges the user back via the browser, using a 401 response, to provide a Kerberos ticket.

4. The browser requests a Kerberos ticket for the AZUREADSSOACCT computer object from on-premises AD. This account is created in on-premises AD as part of the setup process in order to represent Azure AD. 

5. On-premises AD locates the AZUREADSSOACCT computer object and returns a Kerberos ticket to the browser, encrypted using the computer object's secret. 

6. The browser forwards the Kerberos ticket to Azure AD.

7. Azure AD decrypts the Kerberos ticket using the Kerberos decryption key (this was shared with Azure AD when the SSO feature was enabled).

8. After evaluation, Azure AD passes the response back to the user (or requests additional steps such as MFA if required).

9. The user is allowed to access the application. 

Prerequisites

In order to implement this feature, we need the following,

1. A Domain Admin / Enterprise Admin account to install and configure Azure AD Connect on-premises. 

2. A Global Administrator account for the Azure subscription – in order to create a custom domain, configure AD Connect, etc.

3. The latest Azure AD Connect https://www.microsoft.com/en-us/download/details.aspx?id=47594 – if you have an older Azure AD Connect version installed, you need to upgrade it to the latest before configuring this feature.

4. Azure AD Connect must be able to communicate with the *.msappproxy.net URLs over port 443. If connectivity is controlled via IP addresses, the Azure IP address ranges can be found at https://www.microsoft.com/en-us/download/details.aspx?id=41653 

5. Add https://autologon.microsoftazuread-sso.com and https://aadg.windows.net.nsatc.net to the browser intranet zone. If users are using IE and Chrome, this can be done using group policy. I have written a blog post before about how to create a policy targeting IE. You can find it here

6. Firefox needs the above URLs added to its trusted Kerberos site list to do Kerberos authentication. To do that, open Firefox > type about:config in the address bar > in the list, look for network.negotiate-auth.trusted-uris > right-click and select Modify > type "https://autologon.microsoftazuread-sso.com, https://aadg.windows.net.nsatc.net" and click OK

7. If it is Mac OS X, the device needs to be joined to AD. More details can be found here

Configure Azure AD Seamless SSO
 
Configuration of this feature is straightforward; basically, it's just ticking one box. 
 
If it is a fresh Azure AD Connect installation, select the Customize option under express settings.
 
sso4
 
Then, on the User sign-in page, select the appropriate sign-in option and then select the Enable single sign-on option.
 
sso5
 
If you have an existing Azure AD Connect instance running, double-click on the Azure AD Connect shortcut. In the initial window, click on Configure.
 
sso6
 
On the Additional tasks page, click on Change user sign-in and then click Next.
 
sso7
 
In the next window, type the Azure AD sync account user name and password, and click Next.
 
sso8
 
Then, on the User sign-in page, select the Enable single sign-on option and click Next
 
sso9
 
On the next page, enter the credentials for an on-premises domain admin account and click Next.  
 
sso10
 
At the end, click on Configure to complete the process. 
 
sso11
 
This completes the configuration; the next step is to verify that SSO is configured. The first thing to check is whether the computer object called AZUREADSSOACCT has been created in on-premises AD. You will be able to find it under the default Computers OU. A quick PowerShell check is shown below.
 
sso12
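The same can be confirmed from PowerShell, which also shows the SPNs mentioned in the how-it-works section:

# verify the Seamless SSO computer account and its SPNs
Get-ADComputer -Identity AZUREADSSOACCT -Properties ServicePrincipalName |
    Select-Object Name, ServicePrincipalName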
 
Then log in to the Azure portal and go to Azure Active Directory > Azure AD Connect; under the user sign-in options we can see that Seamless single sign-on is enabled. 
 
sso13
 
This means it's all good. The next step is to check that it is working as expected. In order to do that, I log in to a corporate device with the same user I used earlier, R272845, and try to access the same app URL. 
 
This time, all I needed to type was the user name, and it logged me in. Nice!
 
sso14
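If you want to double-check that the silent sign-in really used Kerberos, you can inspect the client's ticket cache right after the login. Assuming the SPNs described earlier, a ticket for the autologon URL should be present:

klist | findstr /i autologon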
 
Note – before testing, make sure you have added the two Azure AD URLs to the intranet zone as mentioned in the prerequisites section. 
 
Hope this information was useful, and if you have any questions feel free to contact me on rebeladm@live.com

Azure Active Directory Pass-through Authentication

When organizations want to use the same user names and passwords to log in to on-premises and cloud (Azure) workloads, there are two options. One is to sync user names and password hashes from the on-premises Active Directory to Azure AD. The other option is to deploy an ADFS farm on-premises and use it to authenticate cloud-based logins, but that needs additional planning and resources.

On-premises AD stores passwords as hash values generated by a hash algorithm. They are NOT saved as clear text, and it is almost impossible to recover the original password even if someone has the hash value. There is a misunderstanding about this, as some people still think Azure AD password sync uses clear-text passwords. Every two minutes, the Azure AD Connect server retrieves password hashes from the on-premises AD and syncs them with Azure AD on a per-user basis in chronological order. From a technical point of view, I do not see a reason why people should avoid password hash sync to Azure AD. However, there are company policies and compliance requirements which do not accept any form of identity sync to an external system, even in hash format.

Azure Active Directory Pass-through Authentication was introduced by Microsoft to answer these requirements. It allows users to authenticate to cloud workloads using the same passwords they use on-premises, without syncing their password hash values to Azure AD. This feature is currently in preview, which means it is not yet supported for production environments, but it is not too early to try it in development environments.

According to Microsoft, the following are the key features of Pass-through Authentication:

Users use the same passwords to sign into both on-premises and cloud-based applications.

Users spend less time talking to the IT helpdesk resolving password-related issues.

Users can complete self-service password management tasks in the cloud.

No need for complex on-premises deployments or network configuration.

Needs just a lightweight agent to be installed on-premises.

No management overhead. The agent automatically receives improvements and bug fixes.

On-premises passwords are never stored in the cloud in any form.

The agent only makes outbound connections from within your network. Therefore, there is no requirement to install the agent in a perimeter network, also known as a DMZ.

Protects your user accounts by working seamlessly with Azure AD Conditional Access policies, including Multi-Factor Authentication (MFA), and by filtering out brute force password attacks.

Additional agents can be installed on multiple on-premises servers to provide high availability of sign-in requests.

Multi-forest environments are supported if there are forest trusts between your AD forests and if name suffix routing is correctly configured.

It is a free feature, and you don't need any paid editions of Azure AD to use it.

It can be enabled via Azure AD Connect.

It protects your on-premises accounts against brute force password attacks in the cloud.

How does it work?

Let's see how it really works. In the following example, a user is trying to access a cloud-based application (integrated with Azure) using his on-premises user name and password. This organization is using pass-through authentication.

pt1

1. The user accesses the application URL using his browser. 

2. In order to authenticate to the application, the user is directed to the Azure Active Directory sign-in page. The user then types the user name and password, and clicks the sign-in button. 

3. Azure AD receives the data and encrypts the password using a public key, which is used to verify the data's authenticity. Then it places the request in a queue, where it waits until the pass-through agent retrieves it.   

4. The on-premises pass-through agent retrieves the data from the Azure AD queue (using an outbound connection). 

5. The agent decrypts the password using the private key available to it. 

6. The agent validates the user name and password against on-premises Active Directory. It uses the same mechanism as ADFS. 

7. On-premises AD evaluates the request and provides a response. It can be success, failure, password expired, or account locked out. 

8. The pass-through agent passes the response back to Azure AD. 

9. Azure AD evaluates the response and passes it back to the user.

10. If the response is success, the user is allowed to access the application. 

Prerequisites
 
In order to implement this feature, we need the following,
 
1. A Domain Admin / Enterprise Admin account to install and configure Azure AD Connect on-premises. 
2. A Global Administrator account for the Azure subscription – in order to create a custom domain, configure AD Connect, etc. 
3. On-premises servers running Windows Server 2012 R2 or later to install Azure AD Connect and the pass-through agent.
4. The latest Azure AD Connect https://www.microsoft.com/en-us/download/details.aspx?id=47594 – if you have an older Azure AD Connect version installed, you need to upgrade it to the latest before configuring this feature. 
5. Allow outbound communication to Azure via TCP ports 80 and 443 from the servers which will run Azure AD Connect and the authentication agents. You can find the Azure datacenter IP ranges at https://www.microsoft.com/en-us/download/details.aspx?id=41653 A quick connectivity check is sketched after this list.
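A minimal connectivity sketch using Test-NetConnection (login.microsoftonline.com is just one example endpoint; test the URLs relevant to your setup):

# verify outbound 443 from the Azure AD Connect / agent servers
Test-NetConnection -ComputerName login.microsoftonline.com -Port 443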
 

Configure Azure Active Directory Pass-through Authentication
 
Once we have all the prerequisites ready, we can look into the configuration. If you are running Azure AD Connect for the first time, make sure to use the Customize method.
 
pt1-1
 
Then, under the User sign-in options, select Pass-through authentication and continue. 
 
pt1-2
 
If Azure AD Connect is already running on the server, first run Azure AD Connect as administrator, then click on Configure.
 
pt2
 
Then, on the next page, select the Change user sign-in option and click Next.
 
pt3
 
In the next window, type the Azure AD sync account login details and then click Next.
 
pt4
 
In the next window, select Pass-through authentication and click Next
 
pt5
 
Note – If you have the Azure AD App Proxy Connector installed on the same Azure AD Connect server, you will receive an error saying, "Pass-through authentication cannot be configured on this machine because Azure AD Connect agent is already installed." To fix it, uninstall the Azure AD App Proxy Connector and then reconfigure AD Connect. After that, you can reinstall the Azure AD App Proxy Connector. 
 
Once it finishes the configuration steps, click on Configure to complete the process. 
 
pt6
 
Once the process is completed, log in to the Azure portal and go to Azure Active Directory > Azure AD Connect. There we can see that pass-through authentication is enabled. 
 
pt7
 
And if you click on it, it will show the status of the connected agents. 
 
pt8
 
At this stage, on-premises users should be able to sign in to their cloud applications by using pass-through authentication.
 
In order to add high availability, we can install the agent on multiple domain-joined servers. It can be downloaded from the pass-through authentication page.
 
pt9
 
This completes the implementation of pass-through authentication, and I hope this post was useful. If you have any questions, feel free to contact me on rebeladm@live.com

Non-Authoritative and Authoritative SYSVOL Restore (DFS Replication)

Healthy SYSVOL replication is key for every Active Directory infrastructure. When there are SYSVOL replication issues, you may notice:

1. Users and systems are not applying their group policy settings properly. 

2. New group policies not applying to certain users and systems. 

3. Group policy object counts are different between domain controllers (inside the SYSVOL folders)

4. Log on scripts are not processing correctly

Also, if you look into Event Viewer at the same time, you may find events such as the following:

Event 2213 – The DFS Replication service stopped replication on volume C:. This occurs when a DFSR JET database is not shut down cleanly and Auto Recovery is disabled. To resolve this issue, back up the files in the affected replicated folders, and then use the ResumeReplication WMI method to resume replication.

Recovery Steps

1. Back up the files in all replicated folders on the volume. Failure to do so may result in data loss due to unexpected conflict resolution during the recovery of the replicated folders.

2. To resume the replication for this volume, use the WMI method ResumeReplication of the DfsrVolumeConfig class. For example, from an elevated command prompt, type the following command:

wmic /namespace:\\root\microsoftdfs path dfsrVolumeConfig where volumeGuid="xxxxxxxx" call ResumeReplication

Event 5002 – The DFS Replication service encountered an error communicating with partner <FQDN> for replication group Domain System Volume.

Event 5008 – The DFS Replication service failed to communicate with partner <FQDN> for replication group Home-Replication. This error can occur if the host is unreachable, or if the DFS Replication service is not running on the server.

Event 5014 – The DFS Replication service is stopping communication with partner <FQDN> for replication group Domain System Volume due to an error. The service will retry the connection periodically.
Some of these errors can be fixed with a simple server reboot or by running the commands described in the error (e.g., the event 2213 description), but if they keep recurring, we need to do a non-authoritative or authoritative SYSVOL restore.

Non-Authoritative Restore 

If only one or a few domain controllers (less than 50%) have replication issues at a given time, we can perform a non-authoritative restore. In that scenario, the system will replicate SYSVOL from the PDC. 

Authoritative Restore

If more than 50% of domain controllers have SYSVOL replication issues, it is possible that the entire SYSVOL is corrupted. In such a scenario, we need to go for an authoritative restore. In this process, we first restore SYSVOL from backup to the PDC and then force all the other domain controllers to update their SYSVOL copy from the copy on the PDC. 

SYSVOL can be replicated using FRS too. FRS was deprecated after Windows Server 2008, but if you migrated from an older Active Directory environment, you may still be using FRS for SYSVOL replication. It also supports non-authoritative and authoritative restores, but in this demo I am only going to cover SYSVOL with DFS replication. 

Non-Authoritative DFS Replication 

In order to perform a non-authoritative restore:

1) Back up the existing SYSVOL – this can be done by copying the SYSVOL folder from the domain controller which has DFS replication issues to a secure location. 

2) Log in to Domain Controller as Domain Admin/Enterprise Admin

3) Launch ADSIEDIT.MSC tool and connect to Default Naming Context

sys1

4) Browse to DC=domain,DC=local > OU=Domain Controllers > CN=(DC NAME) > CN=DFSR-LocalSettings > Domain System Volume > SYSVOL Subscription

5) Change the value of the attribute msDFSR-Enabled to FALSE

sys2

6) Force the AD replication using,

repadmin /syncall /AdP

7) Install the DFS management tools (unless already installed) by running:

Add-WindowsFeature RSAT-DFS-Mgmt-Con

8) Run the following command to update the DFSR global state:

dfsrdiag PollAD

9) Search for the event 4114 to confirm SYSVOL replication is disabled. 

Get-EventLog -Log "DFS Replication" | where {$_.eventID -eq 4114} | fl

10) Change the attribute value back to msDFSR-Enabled=TRUE (see step 5)

11) Force the AD replication as in step 6

12) Update the DFSR global state by running the command in step 8

13) Search for events 4614 and 4604 to confirm successful non-authoritative synchronization; a sample query is shown after the screenshot below. 

sys3
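The same Get-EventLog pattern used for event 4114 earlier works for the completion events as well:

Get-EventLog -Log "DFS Replication" | where {$_.eventID -eq 4614 -or $_.eventID -eq 4604} | fl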

All these commands should be run on the domain controllers being restored non-authoritatively. 

Authoritative DFS Replication 

In order to initiate an authoritative DFS replication:

1) Log in to PDC FSMO role holder as Domain Administrator or Enterprise Administrator

2) Stop DFS Replication Service (This is recommended to do in all the Domain Controllers)

3) Launch ADSIEDIT.MSC tool and connect to Default Naming Context

4) Browse to DC=domain,DC=local > OU=Domain Controllers > CN=(DC NAME) > CN=DFSR-LocalSettings > Domain System Volume > SYSVOL Subscription

5) Update the given attribute values as follows:

msDFSR-Enabled=FALSE

msDFSR-options=1

sys4

6) Modify the following attribute on ALL other domain controllers:

msDFSR-Enabled=FALSE

7) Force the AD replication using,

repadmin /syncall /AdP

8) Start DFS replication service in PDC

9) Search for the event 4114 to verify SYSVOL replication is disabled.

10) Change the following value, which was set in step 5:

msDFSR-Enabled=TRUE

11) Force the AD replication using,

repadmin /syncall /AdP

12) Run the following command to update the DFSR global state:

dfsrdiag PollAD

13) Search for event 4602 and verify successful SYSVOL replication. 

14) Start DFS service on all other Domain Controllers

15) Search for the event 4114 to verify SYSVOL replication is disabled.

16) Change the following value, which was set in step 6. This needs to be done on ALL other domain controllers: 

msDFSR-Enabled=TRUE

17) Run the following command to update the DFSR global state:

dfsrdiag PollAD

18) Search for events 4614 and 4604 to confirm successful authoritative synchronization. 

Please note that you do not need to run an authoritative DFS replication for every DFS replication issue; it should be the last option.

Hope this was useful, and if you have any questions feel free to contact me on rebeladm@live.com 

Protect cloud Identities with Azure Active Directory Identity Protection

Symantec released its latest Internet Security Threat Report in early June. The report includes data about infrastructure threats for the year 2016. It says that in 2016 nearly 1.1 billion identities were exposed, and over the last 8 years the total number of breached identities is around 7.1 billion, which is almost equal to the world's population.

aip1

In an identity infrastructure breach, most of the time adversaries get into the system using a legitimate user name and password belonging to an identity in that infrastructure. The initial breach can be the result of malware, phishing, or a pass-the-hash attack. If it is a "privileged" account, it makes it easier for adversaries to gain control over the identity infrastructure, but that is not a must: all they need is some sort of a breach. Recent reports show that after an initial breach, it takes less than 48 hours to gain full control over an identity infrastructure.

Looking at it from the identity infrastructure's end, if someone provides a legitimate user name and password, the system allows access. That someone can be the real user or an adversary, but by default the system will not know the difference. In a local AD infrastructure, solutions like Microsoft Advanced Threat Analytics and Microsoft Identity Manager help to identify and prevent inauthentic use of identities. Azure Active Directory Identity Protection is a feature that comes with Azure AD Premium, which can be used to protect your workloads from inauthentic use of cloud identities. It mainly has the following benefits: 

Detect vulnerabilities which affect cloud identities, using adaptive machine learning algorithms and heuristics.

Issue alerts and reports to detect/identify potential identity threats and allow administrators to take action accordingly. 

Based on policies, force automated actions such as blocking access, requiring MFA authentication, or resetting the password when a suspicious login attempt is detected. 

According to Microsoft (https://docs.microsoft.com/en-us/azure/active-directory/active-directory-identityprotection), Azure AD Identity Protection has the following capabilities.

Detecting vulnerabilities and risky accounts:

Providing custom recommendations to improve overall security posture by highlighting vulnerabilities

Calculating sign-in risk levels

Calculating user risk levels

Investigating risk events:

Sending notifications for risk events

Investigating risk events using relevant and contextual information

Providing basic workflows to track investigations

Providing easy access to remediation actions such as password reset

Risk-based conditional access policies:

Policy to mitigate risky sign-ins by blocking sign-ins or requiring multi-factor authentication challenges.

Policy to block or secure risky user accounts

Policy to require users to register for multi-factor authentication

Azure Active Directory Identity Protection detects and reports the following as vulnerabilities:

User logins without Multi-Factor Authentication

Use of unmanaged cloud apps – These are applications which are not managed using Azure Active Directory. 

Risk events detected by Azure Privileged Identity Management – This is an additional service which can be used to manage and monitor privileged accounts associated with Azure Active Directory, Office 365, EMS, etc.

More info about these events can be found here 

Azure Active Directory Identity Protection also reports on risk events. These are Azure Active Directory-based events, and they are also used to create policies in Azure AD Identity Protection. It detects six types of risk events: 
 
Users with leaked credentials
Sign-ins from anonymous IP addresses
Impossible travel to atypical locations
Sign-ins from unfamiliar locations
Sign-ins from infected devices
Sign-ins from IP addresses with suspicious activity
 
More information about Azure risk events can be found here 
 
Let's see how we can start using Azure Active Directory Identity Protection. Before we start, we need the following: 
 
1) An Azure Active Directory Premium P2 subscription
2) A Global Administrator account  
 
Once you have above, log in to Azure as Global Administrator.
 
1) Go to New | then type Azure AD Identity Protection 
 
aip2
 
2) On the Azure AD Identity Protection page, click on Create
 
aip3
 
3) It then loads a new page with the Azure directory details and feature details. Click on Create again to proceed. 
 
aip4
 
4) Once it is done, search for Azure AD Identity Protection under More services. It will then show the dashboard for the feature. 
 
aip5
 
5) In my demo environment, it immediately detected that I have 257 user accounts without MFA. 
 
aip6
 
6) The beauty of this is that once it detects a problem, it guides you to address the issue. If I double-click on the MFA alert, I get the Multi-factor authentication registration policy settings page, where I can enforce users to use MFA. 
 
aip7
 
Under the user assignment, I can enforce it for all users or for a group of users. 
 
aip8
 
Under the controls, the Require Azure MFA registration option is selected by default. 
 
aip9
 
Under the Review option we can evaluate the impact. 
 
aip10
 
Once the configuration is done, we can enforce the policy using the Enforce Policy option. 
 
aip11
 
7) Using the sign-in risk policy, we can define the actions the system needs to take when it detects a sign-in risk for a user. In order to define the policy settings, click on Sign-in risk policy in the Azure AD Identity Protection dashboard
 
aip12
 
8) Then it loads up the policy configuration page. 
 
aip13
 
Under Assignments, we can decide which users this policy applies to. It can be either all users or a group of users, and we can also exclude users from the selection. 
 
aip14
 
Under Conditions, we can select the level of sign-in risk events the policy needs to consider. There are three options to select from: low and above, medium and above, or high.
 
aip15
 
Under the Access option, we can define what the system should do once it detects risk events. It can either block access completely or allow access with additional security conditions, such as requiring multi-factor authentication.
 
aip16
 
Using the Review option, we can see what kind of impact the policy will make (if there are any existing risk events). 
 
Once all this is done, use the Enforce Policy option to enforce it. 
 
aip17
 
9) In a similar way, using the user risk policy, we can define policy settings to handle risky users. 
 
10) We can also configure Azure AD Identity Protection to send alerts when it detects an event. To do that, go to the Alerts option in the Azure AD Identity Protection dashboard
 
aip18
 
On the configuration page, we can define the lowest alert level to consider. Any event at that level or above will be sent out as an alert to the selected recipients. 
 
aip19
 
11) Azure AD Identity Protection can also send an automated weekly email report with the events it has detected. In order to configure it, go to the Weekly Digest option in the Azure AD Identity Protection dashboard
 
aip20
 
On the configuration page, we can select which recipients it should go to. 
 
aip21
 
In this article, I explained what Azure AD Identity Protection is and how we can use it to protect cloud identities. Hope this was useful, and if you have any questions feel free to contact me on rebeladm@live.com

Conditional Access Policies with Azure Active Directory

When it comes to managing access to resources in an infrastructure, there are two main questions we usually ask:

  1. "Who" is the user, and "what" resources?
  2. Should access be allowed or denied? 

Answers to the above questions are enough to define the base rules. But depending on the tools and technologies used to manage access, we will have additional questions which help us define more accurate rules. As an example, a sales manager walks up to the IT department and says, "Peter needs to access the "Sales" folder on the file server." Based on that statement, we know the user is "Peter" and the resource is the "Sales" folder on the "File Server". We also know the user "Peter" needs to be "allowed" access to the folder. However, since we are going to use NTFS permissions, we know we can make the permissions more precise than that. When the sales manager said "allow" Peter to access the "Sales" folder, he didn't define it as "Read & Write" or "Read Only". He also didn't define whether Peter needs the same permission on all the subfolders of the "Sales" folder. Based on the answers to those questions, we can define more granular rules.

Access control to resources in an infrastructure happens at many different levels, with many different tools and technologies. The first level of control happens at the network perimeter. Using firewall rules, we can handle “in” and “out” network traffic to/from the company infrastructure. If the user passes that level, access is then verified based on users and groups. After that it comes down to applications and other resources. The problem we have as engineers is that we manage all of these separately. Let’s go back to our previous example, where we only considered NTFS permissions. If “Peter” is a remote worker who connects to the internal network using Remote Desktop Services, first we need to define firewall rules to allow his connection. Then, if multi-factor authentication is required for remote workers, I need to configure it and define rules there too. Also, when the user logs in, he will not have the same permissions he has on a company workstation, so those session host permissions need to be adjusted as well. As we can see, even though it sounds simple, we have to deal with many different systems and rules which cannot be combined into “one”.

So far, we have looked at on-premises scenarios. When it comes to the cloud, the operational model is different. We cannot apply the same tools and technologies we used to manage access on-premises. Microsoft Azure’s answer for simplifying access management to workloads is “Conditional Access”. This allows us to manage access to applications based on “conditions”. When it comes to the public cloud, we are mostly allowing access to applications from networks we do not trust. Therefore, using “conditions” we can define policies that users need to comply with in order to get access to the applications.

In a conditional access policy, there are two main sections.

Assignments –  This is where we define the conditions that apply to the user environment, such as users and groups, applications, device platforms, sign-in locations etc.

Access Controls –  This controls access for the users and groups once they match the conditions specified in the “Assignments” section. It can either allow access or block access.

cac1
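
To make these two sections more concrete, here is a hedged sketch of how a conditional access policy is represented in the Microsoft Graph API (the identity/conditionalAccess/policies resource). The IDs are hypothetical placeholders; the point is simply that the conditions block maps to the Assignments section and the grantControls block maps to the Access Controls section.

```python
# A minimal sketch of a conditional access policy object as the
# Microsoft Graph API represents it. The IDs are placeholders.
policy = {
    "displayName": "Require MFA for the sales app",
    "state": "enabled",
    # "conditions" corresponds to the Assignments section in the portal
    "conditions": {
        "users": {"includeUsers": ["<user-object-id>"]},
        "applications": {"includeApplications": ["<application-id>"]},
    },
    # "grantControls" corresponds to the Access Controls section
    "grantControls": {
        "operator": "OR",
        "builtInControls": ["mfa"],
    },
}
```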

Let’s see what conditions we can apply using conditional access policies. 

Assignments 

Under the Assignments section there are three main options which can be used to define conditions. 

1) Users and Groups

2) Cloud apps

3) Conditions 

Users and Groups

Under the Users and Groups option, we can define the users and groups targeted by the conditional access policy. 

We can define the target as “All” users or as a selected set of users and groups. 

cac2

We can also explicitly select groups and exclude individuals from them. 

cac3

Cloud Apps

Under the Cloud apps option, we can select the applications targeted by the policy. These applications can be Azure apps or on-premises applications published via Azure Active Directory using the Azure Application Proxy. Similar to users and groups, we can also explicitly include a broad set of apps and exclude specific entities. 

cac4

Conditions 

Using the options under this category, we can specify conditions related to the user’s login environment. This category has four sub-categories. 

1) Sign-in risk

2) Device Platforms

3) Locations

4) Client Apps

It is not required to use all of these sub-categories in each and every policy. By default, all of them are disabled. 

Sign-in risk

Azure Active Directory monitors user sign-in behavior based on six types of risk events. These events are explained in detail at https://docs.microsoft.com/en-us/azure/active-directory/active-directory-reporting-risk-events#risk-level . As an example, I usually log in to Azure from IP addresses belonging to Canada. I log in to Azure at 8 am from Toronto. Five minutes later, it detects a login from Germany. In a typical scenario, that’s not possible unless I am using a remote login. From Azure’s point of view, this will be detected as malicious activity and rated as a “Medium” risk event. In this sub-category, we can define what level of sign-in risk the policy should consider. 

cac5

Note – If you need to enable the policy, you first need to click on “Yes” under the Configure option.

cac6

Device Platforms

Device platforms are categorized based on the operating system. It can be:

Android

iOS

Windows Phone

Windows

cac7

We can also explicitly allow all platforms and then exclude specific ones.

Locations

Locations are defined based on IP addresses. If the policy should apply only to “trusted” IP addresses, make sure to define the trusted IP addresses using the given option.

cac8
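
As a side note, trusted locations can also be created programmatically as “named locations” through the Microsoft Graph API. The Python sketch below assumes the requests library, a placeholder token with the Policy.ReadWrite.ConditionalAccess permission, and an example IP range.

```python
import requests

ACCESS_TOKEN = "<token-with-Policy.ReadWrite.ConditionalAccess>"  # placeholder

# Define an IP-based named location and mark it as trusted
named_location = {
    "@odata.type": "#microsoft.graph.ipNamedLocation",
    "displayName": "Head office",
    "isTrusted": True,
    "ipRanges": [
        {
            "@odata.type": "#microsoft.graph.iPv4CidrRange",
            "cidrAddress": "203.0.113.0/24",  # example range only
        }
    ],
}

response = requests.post(
    "https://graph.microsoft.com/v1.0/identity/conditionalAccess/namedLocations",
    headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
    json=named_location,
)
response.raise_for_status()
print("Created named location:", response.json()["id"])
```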

Client Apps

Client apps describe the form in which users access the apps: it can be via the web, mobile apps, or desktop clients. Exchange ActiveSync is available only when Exchange Online is the only cloud app selected. 

cac9
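
Pulling these four sub-categories together, here is a hedged sketch of how a policy’s conditions block can look in the Microsoft Graph representation; the values shown are illustrative examples, not a complete list of the supported options.

```python
# Example "conditions" block combining the four sub-categories.
# Only the conditions you actually configure need to be present.
conditions = {
    "signInRiskLevels": ["medium", "high"],                  # Sign-in risk
    "platforms": {"includePlatforms": ["android", "iOS"]},   # Device platforms
    "locations": {
        "includeLocations": ["All"],
        "excludeLocations": ["AllTrusted"],                  # Locations
    },
    "clientAppTypes": ["browser",
                       "mobileAppsAndDesktopClients"],       # Client apps
}
```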

Access Controls 

There are two categories which can be used to add access control conditions to the policies. 

1) Grant

2) Session

Grant

In this category, we can specify whether to allow or block access. Under allow access, we can add further requirements such as:

Require multi-factor authentication

Require device to be marked as compliant

Require domain joined device

cac10

MFA

Multi-factor authentication is an additional layer of security used to confirm the authenticity of the login attempt. Even when the policy is set to allow access, using this option we can still force the user to use MFA. This can be Azure MFA or an on-premises MFA solution (via AD FS).

Compliant

Using Microsoft Intune, we can define rules that categorize whether user devices are compliant with company standards. If this option is used, only devices which are compliant will be considered.

Domain Joined

If this option is used, it will only consider connections from Azure Active Directory domain-joined devices.

Once you define the options, the policy can either require all of the selected controls or just “one” of them. 

cac11
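
In the Microsoft Graph representation, this “all or one” choice is expressed by the grantControls operator: “AND” requires every selected control, while “OR” requires only one of them. A small illustrative sketch:

```python
# Require ALL of the selected controls (MFA and a compliant device)
grant_all = {"operator": "AND",
             "builtInControls": ["mfa", "compliantDevice"]}

# Require only ONE of the selected controls
grant_one = {"operator": "OR",
             "builtInControls": ["mfa", "domainJoinedDevice"]}
```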

Session

This is still in preview mode. It is basically used to provide additional information about the session to the cloud app, so the app can confirm the authenticity of the session. Not every cloud app supports this option yet. 

cac12

Demo

By now we know what conditions we can use to define a conditional access policy. Let’s see how we can configure a policy with a real-world example. Before we start, we need to look into the prerequisites for the task. In order to set up conditional access policies, we need the following:

1. A valid Azure Active Directory Premium subscription

2. An Azure administrator account to create the policies

In my demo, I have a user called “Berngard Saller”. He is allowed to access an on-premises application which is published using Azure Application Proxy (http://www.rebeladmin.com/2017/06/azure-active-directory-application-proxy-part-02/). 

cac13
 
I need a conditional access policy to block his access to this app if he logs in from a device running the Android OS. 
So, let’s start,
 
1. Log in to the Azure portal as an administrator.
2. Click on Azure Active Directory | Conditional Access.
 
cac14
 
3. On the new page, click on + New Policy.
 
cac15
 
4. In the next window, first provide a Name for the policy.
 
cac16
 
5. Then click on Users and Groups | Select Users and Groups | Select. From the list of users, select the appropriate user (in my demo, it is the user Berngard Saller) and then click on the Select button. 
 
 
cac17
 
6. Then, in the next window, click on Done.
 
cac18
 
7. The next step is to define the app. To do that, click on Cloud Apps | Selected Apps | Select. From the list, select the relevant app (in my demo, it is webapp1) and then click on the Select button. 
 
cac19
 
8. Then, in the next window, click on Done.
 
9. As the next step, go to Conditions | Device Platforms, click on Yes to enable the condition, then select Device Platforms | Android and click on the Done button. 
 
cac20
 
10. Then, in the next window, click on Done.
 
11. The next step is to define the access controls. To do that, click on Grant | Block Access, then click on the Select button. 
 
cac21
 
12. Now we have the conditional access policy ready. Click On under Enable Policy, then click on the Create button to create the policy. 
 
cac22
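
For completeness, the same policy could also be created programmatically instead of through the portal. Below is a minimal Python sketch of the equivalent Microsoft Graph call, assuming the requests library, a placeholder token with the Policy.ReadWrite.ConditionalAccess permission, and placeholder IDs for the user and the application.

```python
import requests

ACCESS_TOKEN = "<token-with-Policy.ReadWrite.ConditionalAccess>"  # placeholder

# Block access to the app from Android devices for one user,
# mirroring the portal steps above. The IDs are placeholders.
policy = {
    "displayName": "Block Android access to webapp1",
    "state": "enabled",
    "conditions": {
        "users": {"includeUsers": ["<berngard-saller-object-id>"]},
        "applications": {"includeApplications": ["<webapp1-application-id>"]},
        "platforms": {"includePlatforms": ["android"]},
        "clientAppTypes": ["all"],
    },
    "grantControls": {"operator": "OR", "builtInControls": ["block"]},
}

response = requests.post(
    "https://graph.microsoft.com/v1.0/identity/conditionalAccess/policies",
    headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
    json=policy,
)
response.raise_for_status()
print("Created policy:", response.json()["id"])
```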
 
Now we have the policy in place. The next step is to test it. 
 
cac23
 
When I access the app from a Windows system, I am allowed access. 
 
cac24
 
But when I try it from an Android mobile device, access is denied as expected. 
 
cac25
 
As we can see, conditional access simplifies access control to workloads in Azure. 
 
This is the end of this post. If you have any questions, feel free to contact me on rebeladm@live.com