Category Archives: Active Directory

Self-Service password reset on Azure AD joined Windows 10 devices

Password resets are among the most common service desk requests IT engineers deal with. Passwords are a weak authentication method: they can be broken, cracked, or guessed. This is why Microsoft has invested in passwordless authentication such as Windows Hello. However, the majority of systems still use a traditional user name and password to authenticate.

When users forget their password, it prevents them from accessing the systems or services they are trying to access. Until someone with higher privileges resets the password, their time is wasted. This is manageable for a small number of users, but in a large organization it can become costly for both parties. This is why organizations use self-service password reset solutions, which allow users to reset their own passwords in a secure, controlled environment. 

Azure AD can also give users a self-service password reset feature. In one of my previous blog posts I explained how to enable it; it can be accessed via http://www.rebeladmin.com/2016/01/step-by-step-guide-to-configure-self-service-password-reset-in-azure-ad/ . Azure AD now also allows users to reset their password directly from the login screen of an Azure AD joined Windows 10 device. In this post, I am going to demonstrate this feature. 

In order to use this feature, the Azure AD environment should have the following:

1. Enable self-service password reset – By default, Azure AD does not have this feature enabled. It needs to be enabled before users can use it, and it can be enabled for all users or for a group of users. 

password1

In my demo environment, I have it enabled for all users. 

Also, here users can be required to use one or two authentication methods to reset a password. If two methods are required, the user will be verified using both. 

password2

2. Password writeback for hybrid environments – In a hybrid environment (with on-premises AD), the password writeback option should be enabled; otherwise, a password reset from Azure AD will not replicate back. This option is available in Azure AD Connect. If it is not enabled, password reset will not work for users even if self-service password reset is enabled. 

password3
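If you want to double-check writeback from the Azure AD Connect server itself, the ADSync PowerShell module can report the password reset (writeback) configuration per connector. This is a minimal sketch, assuming the module is present on the server; the connector name below is a placeholder:

Import-Module ADSync
# Connector name is a placeholder - use your own Azure AD connector name
Get-ADSyncAADPasswordResetConfiguration -Connector "myrebeladmin.onmicrosoft.com - AAD"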

3. Windows 10 Fall Creators Update – This password reset feature is only available on Windows 10 version 1709, so make sure the device is running the latest update. It can be applied using Windows Update; more details can be found via https://support.microsoft.com/en-gb/help/4028685/windows-10-get-the-fall-creators-update 
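To confirm a device is already on version 1709, you can check the build number from PowerShell; build 16299 corresponds to the Fall Creators Update. For example:

# ReleaseId should show 1709 and CurrentBuild should show 16299
Get-ItemProperty 'HKLM:\SOFTWARE\Microsoft\Windows NT\CurrentVersion' | Select-Object ProductName, ReleaseId, CurrentBuild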

In my demo environment, I have an Azure AD joined Windows 10 PC. 

password4

After enabling self-service password reset, I am going to log in to this PC as user RA722725@therebeladmin.com.

On login, it says I need to provide additional info for password recovery. 

password5

Click on Set it up now to continue. 

Then it provides a list of options I can use for verification. Select the option you need and click Next.

password6

Now that we have recovery options set up, let's see how password reset works from the device. 

On my Azure AD joined device, I type the user name on the login screen, but I do not get any option for password reset. This is because I am also using the PIN option for login; if you use a PIN, you will probably end up at the PIN prompt instead of the password prompt. So, if you are using a PIN and still need to recover your password, click on Sign-in options.

password7

Then click on the number pad icon to select the PIN option.

password8

Then click on the I forgot my PIN option. I know this is confusing, as we are trying to reset a password, but unfortunately the option lives on the PIN reset page. 

password9

It will then open a new window. In there, click on the Forgotten password option. 

password10

Now it opens a new window to reset the password. Click Next to proceed. 

password11

Then it gives options for verification. Select the method you would like to use. You cannot change your registered data here. 

password12

After successful verification, it gives the option to define a new password. After typing the new password, click Next to proceed. 

password13

Then click on Finish to complete the process. 

password14

Then I can log in to the device with the new password. 

password15

Cool, huh? As expected, we were able to reset the password from the device login screen. This marks the end of this blog post. If you have any questions, feel free to contact me at rebeladm@live.com, and follow me on Twitter @rebeladm to get updates about new blog posts.

Azure AD now supports macOS Conditional Access – Let's see it in action!

Azure AD conditional access policies allow us to provide condition-based access to cloud workloads. 

In one of my previous blog posts I explained in detail what a conditional access policy is and how we can configure one. You can find it at http://www.rebeladmin.com/2017/07/conditional-access-policies-azure-active-directory/ . I highly recommend reading it before continuing with this post. 

A conditional access policy has two main sections.

Assignments – This is where we define the conditions applying to the user environment, such as users and groups, applications, device platforms, login locations, etc.

Access controls – This controls access for the users and groups when they match the conditions specified in the Assignments section. It can either allow access or block access. 

Under the Assignments section, we can define which device platforms are involved in the condition. When I wrote my previous post, only the following platforms were supported.

• Android

• iOS

• Windows Phone

• Windows

On November 14th, 2017, Azure AD added macOS to the list. With this update, the following OS versions, applications, and browsers are supported on macOS for conditional access:

Operating Systems

macOS 10.11+

Applications

Microsoft Office 2016 for macOS v15.34 and later

Microsoft Teams

Web applications (via Application Proxy)

Browsers

Safari

Chrome

The original documentation did not say anything about web apps, but in this demo I am going to use conditional access with an on-premises web app published to the internet using Azure Application Proxy. I wrote an article about Application Proxy a while ago, and it can be accessed via http://www.rebeladmin.com/2017/06/azure-active-directory-application-proxy-part-02/ 

Before starting the configuration, let me explain a little bit about my environment. I have an on-premises domain environment, therebeladmin.com. I have integrated it with Azure AD Premium and I have a healthy sync. I have an on-premises web app which I have published to the internet using Azure Application Proxy, so I can use Azure AD authentication with it. The web app can be accessed via https://webapp-myrebeladmin.msappproxy.net/webapp/ 

I have a Mac running Sierra. In this demo, I am going to set up a conditional access policy to block access to the web app if the request comes from a macOS environment. 

mac01

In order to configure this, 

1) Log in to Azure as a global administrator.
2) Click on Azure Active Directory in the left menu.
 
mac2
 
3) Then, in the Azure Active Directory panel, click on Conditional Access under the Security section. 
 
mac3
 
4) It will load the conditional access window. Click on + New Policy to create a new policy. 
 
mac4
 
5) It will open the policy window where we can define the policy settings. First things first: provide a name for the policy. In my case, I will use "Block access from macOS".
 
mac5
 
6) Then click on Users and groups to define the target users for the policy. In this demo, I am going to target All users. Once the selection is done, click on Done.
 
mac6
 
7) Then click on Cloud apps to select the application for the policy. In my policy, I am going to target rebelwebapp. Once the selection is done, click on Select and then Done to complete the process. 
 
mac7
 
8) The next step is to define the conditions. In order to do that, click on the Conditions option. Here I am only worried about device platforms. To select platforms, click on the Device platforms option. Then, to enable the condition, click on Yes under Configure, and under the Include tab select macOS. After that, click on Done in both windows to complete the process. 
 
mac8
 
9) The next step is to define the access control rules. To do that, click on Grant under the Access controls section. In my demo, I am going to block access to the app, so I am selecting the Block access option. Once the selection is done, click on Select to complete the action. 
 
mac9
 
10) Now the policy is ready. To enable it, click the On tab under the Enable policy option. 
 
mac10
 
11) Then, to create the policy, click on the Create button. 
 
mac11
 
12) Now the policy is in place, and the next step is to test it. In order to do that, I am accessing the web app URL from the Mac. As soon as I access the URL, it asks for a login.
 
mac12
 
13) As soon as I type the user name and password, I get the following response saying access is not allowed. 
 
mac13
 
14) If we click on More details, it gives more info about the error. As expected, it was due to the conditional access policy we set up. Nice, huh?
 
mac14
 
So, as expected, conditional access with macOS is working fine. This is another good step forward. Well done, Microsoft! This marks the end of this blog post. If you have any questions, feel free to contact me at rebeladm@live.com, and follow me on Twitter @rebeladm to get updates about new blog posts.

Azure AD Password Synchronization

Azure AD Connect allows engineers to sync on-premises AD data to Azure AD. If you use express settings for the Azure AD Connect setup, it enables password synchronization by default. This allows users to use the same Active Directory password to authenticate to cloud-based workloads, giving them a single set of login details without maintaining different passwords. It simplifies the user's login experience and reduces helpdesk involvement. 

Windows Active Directory does not save passwords as clear text; it stores hash values generated by a hash algorithm, and it is impossible to revert a hash back to the clear text password. There is a misunderstanding about this, as some people think Azure AD password sync uses clear text passwords. At two-minute intervals, the Azure AD Connect server retrieves password hashes from the on-premises AD and syncs them to Azure AD on a per-user basis in chronological order. The process also involves encryption and decryption to add extra security to password sync. In the event of a password change, it will sync to Azure AD at the next password sync interval; in a healthy environment, the maximum delay to update a password is two minutes. 

If the password is changed while a user has an open session, it will only affect the next Azure authentication attempt; it will not log the user out of the existing session. Also, password synchronization does not mean SSO: users still have to use their corporate login details to authenticate to Azure services. You can find more information about SSO at https://docs.microsoft.com/en-us/azure/active-directory/connect/active-directory-aadconnect-sso 

Enable synchronization of NTLM and Kerberos credential hashes to Azure AD

However, Azure AD Connect does not synchronize NTLM and Kerberos credential hashes to Azure AD by default. So, if you have had an Azure AD directory set up for a while and only enabled Azure AD Domain Services recently, make sure you check the following:

pass1
1. If there is an existing Azure AD Connect server, upgrade Azure AD Connect to the latest version.
2. If there is an existing Azure AD Connect server, confirm that password synchronization is enabled in Azure AD Connect. 
 
In order to do that, open Azure AD Connect, select the option "View current configuration", and check whether password synchronization is enabled. 
 
pass2
 
If it is not, go back to the initial page, select the option "Customize synchronization options", and under optional features select Password synchronization.
 
pass3
 
Run the following PowerShell script on the Azure AD Connect server to force a full password synchronization and enable all on-premises users' credential hashes to sync to Azure AD. 

# Connector names are case sensitive - find them under
# Start > Synchronization Service > Connections
$adConnector = "<CASE SENSITIVE AD CONNECTOR NAME>"
$azureadConnector = "<CASE SENSITIVE AZURE AD CONNECTOR NAME>"

# Load the Azure AD Connect sync module
Import-Module "C:\Program Files\Microsoft Azure AD Sync\Bin\ADSync\ADSync.psd1"

# Set the ForceFullPasswordSync flag on the on-premises AD connector
$c = Get-ADSyncConnector -Name $adConnector
$p = New-Object Microsoft.IdentityManagement.PowerShell.ObjectModel.ConfigurationParameter "Microsoft.Synchronize.ForceFullPasswordSync", String, ConnectorGlobal, $null, $null, $null
$p.Value = 1
$c.GlobalParameters.Remove($p.Name)
$c.GlobalParameters.Add($p)
$c = Add-ADSyncConnector -Connector $c

# Toggle password sync off and on to trigger a full password sync
Set-ADSyncAADPasswordSyncConfiguration -SourceConnector $adConnector -TargetConnector $azureadConnector -Enable $false
Set-ADSyncAADPasswordSyncConfiguration -SourceConnector $adConnector -TargetConnector $azureadConnector -Enable $true
 
You can find the AD connector and Azure AD connector names under Start > Synchronization Service > Connections.
 
pass4
 
After that, you can try to log in to Azure as a user from the on-premises AD. If sync is working properly, it should accept your corporate login. 
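You can also keep an eye on password sync activity from the Azure AD Connect server itself. As a hedged sketch (the source name and event IDs 656/657 are what the password hash sync troubleshooting guidance describes), the Application log can be queried like this:

# 656 = password change request detected, 657 = result of the sync attempt
Get-EventLog -LogName Application -Source "Directory Synchronization" -Newest 50 | Where-Object { $_.EventID -in 656, 657 } | Format-Table TimeGenerated, EventID, Message -AutoSize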
 
This marks the end of this blog post. If you have any questions, feel free to contact me at rebeladm@live.com, and follow me on Twitter @rebeladm to get updates about new blog posts.

Step-by-Step guide to create custom Active Directory Attributes

The Active Directory schema allows custom attributes to be added. There are situations in organizations where this option is useful, most of the time related to application integration requirements with the Active Directory infrastructure. In modern infrastructures, applications are decentralizing identity management: an organization's identities can sit in Active Directory as well as in applications, some in in-house infrastructure and some even in the public cloud. If these applications are integrated with Active Directory it still provides central identity management, but that is not always the case. Some applications have their own way of handling user accounts and privileges. Similar to Active Directory attributes, these applications can have their own attributes, defined by their database systems, to store data. These application attributes will, most of the time, not match the attributes in Active Directory. As an example, an HR system uses an employee ID to uniquely identify an employee record, while Active Directory uses the username to identify a unique record. Each system's attributes hold some data about the objects, even if they refer to the same user or device. If another application needs to retrieve data from both systems' attributes, how can we facilitate that without data duplication?

A customer once came to me with a similar requirement. They have an Active Directory infrastructure in place. They also maintain an HR system which is not integrated with Active Directory. They had a new requirement for an employee collaboration application which required data input in a specific way: it had defined its fields in its database, and we needed to match the data to that order. Some of the required user data could be retrieved from Active Directory and some from the HR system. Instead of keeping two data feeds to the system, we decided to treat Active Directory as the authoritative data source for the new system. If Active Directory was to hold all the required data, it somehow needed to store the data coming from the HR system as well. The final solution was to add custom attributes to the Active Directory schema and associate them with the user class. Instead of both systems operating as data feeds, the HR system now passes the filtered values to Active Directory, and Active Directory exports all the required data in CSV format to the application.  

In order to create custom attributes, go to the Active Directory schema snap-in, right-click on the Attributes container, and select Create Attribute.

Tip – In order to open the Active Directory schema snap-in, you first need to run the command regsvr32 schmmgmt.dll on the Domain Controller. After that, you can open MMC and add Active Directory Schema as a snap-in. 

The system will then give a warning about schema object creation; click OK to continue. 

It will open a form, and this is where we define the details of the custom attribute. 

1) Common Name – This is the name of the object. Only letters, numbers, and hyphens are allowed in the CN. 

2) LDAP Display Name – When the object is referred to in a script, a program, or a command-line utility, it needs to be called by its LDAP display name instead of its common name. When you define the CN, the LDAP display name is created automatically. 

3) X500 Object ID – Each and every attribute in the Active Directory schema has a unique OID value. There is a script developed by Microsoft to generate these unique OID values. It can be found at https://gallery.technet.microsoft.com/scriptcenter/Generate-an-Object-4c9be66a#content and can also be run directly using the following PowerShell script. 

 

#---
# Generate a unique X500 OID under Microsoft's reserved prefix,
# derived from a freshly generated GUID
$Prefix = "1.2.840.113556.1.8000.2554"
$GUID = [System.Guid]::NewGuid().ToString()
$Parts = @()
# Break the GUID into its hex groups and convert each to a decimal value
$Parts += [UInt64]::Parse($GUID.SubString(0,4),"AllowHexSpecifier")
$Parts += [UInt64]::Parse($GUID.SubString(4,4),"AllowHexSpecifier")
$Parts += [UInt64]::Parse($GUID.SubString(9,4),"AllowHexSpecifier")
$Parts += [UInt64]::Parse($GUID.SubString(14,4),"AllowHexSpecifier")
$Parts += [UInt64]::Parse($GUID.SubString(19,4),"AllowHexSpecifier")
$Parts += [UInt64]::Parse($GUID.SubString(24,6),"AllowHexSpecifier")
$Parts += [UInt64]::Parse($GUID.SubString(30,6),"AllowHexSpecifier")
# Join the prefix and the seven decimal parts into the final OID
$OID = [String]::Format("{0}.{1}.{2}.{3}.{4}.{5}.{6}.{7}",$Prefix,$Parts[0],$Parts[1],$Parts[2],$Parts[3],$Parts[4],$Parts[5],$Parts[6])
$OID
#---

 

4) Syntax – This defines the storage representation for the object. Only syntaxes defined by Microsoft are allowed, and one attribute can only be associated with one syntax. Below I have listed a few commonly used syntaxes for attributes. 

 

Boolean – True or false
Unicode String – A large string
Numeric String – A string of digits
Integer – A 32-bit numeric value
Large Integer – A 64-bit numeric value
SID – A security identifier value
Distinguished Name – A string value that uniquely identifies an object in AD

Along with the syntax, we can also define minimum and maximum values. If they are not defined, the attribute will take the default values. 

In the following demo, I am going to add a new attribute called NI-Number and add it to the user class.

attri1

As the next step, we need to add it to the user class. In order to do that, go to the Classes container, double-click on the user class, and click on the Attributes tab. There, by clicking the Add button, we can browse and select the newly added attribute from the list. 

attri2

Now, when we open a user account, we can see the new attribute and add data to it. 

attri3
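The value can also be written with PowerShell instead of the GUI. A minimal sketch, assuming the demo user tuser4 and a made-up NI number value; nINumber is the LDAP display name generated for the NI-Number attribute:

# -Add writes a value into a custom attribute, referenced by its LDAP display name
Set-ADUser -Identity "tuser4" -Add @{'nINumber' = 'QQ123456C'}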

Once data has been added, we can filter out the information as required. 

Get-ADUser "tuser4" -Properties nINumber | ft nINumber

attri4

Note – To add attributes to the schema, you need schema administrator or enterprise administrator privileges. 

This marks the end of this blog post. If you have any questions, feel free to contact me at rebeladm@live.com, and follow me on Twitter @rebeladm to get updates about new blog posts.

Review Active Directory Domain Service Events with PowerShell

There are different ways to review Active Directory service-related logs on a domain controller. The most common way is to review events under the Event Viewer MMC. 

event1

We can review events using server manager too. 

event2

We can also use PowerShell commands to review event logs or filter events from local and remote computers without any additional service configuration. Get-EventLog is the primary cmdlet we can use for this task. 

Get-EventLog -List

The above command lists details about the log files on your local system, including the log file name, maximum log file size, and number of entries.

Get-EventLog -LogName 'Directory Service' | fl

The above command lists all the events in the Directory Service log file.

We can also limit the number of events to list. As an example, if we only need the latest 5 events from the Directory Service log file, we can use,

Get-EventLog -Newest 5 -LogName 'Directory Service'

We can filter it down further by listing events according to entry type. 

Get-EventLog -Newest 5 -LogName 'Directory Service' -EntryType Error

The above command lists the latest five "error" entries in the Directory Service log file.

We can also add a time limit to filter events further. 

Get-EventLog -Newest 5 -LogName 'Directory Service' -EntryType Error -After (Get-Date).AddDays(-1)

The above command lists the 'error' type events from the last 24 hours of the Directory Service log.

We can also get events from remote computers. 

Get-EventLog -Newest 5 -LogName 'Directory Service' -ComputerName 'REBEL-SRV01' | fl -Property *

The above command lists the latest five entries of the Directory Service log file from the remote computer REBEL-SRV01. 

event3

We can also extract events from several computers at the same time. 

Get-EventLog -Newest 5 -LogName 'Directory Service' -ComputerName "localhost","REBEL-SRV01"

The above command lists the log entries from the local computer and the remote computer REBEL-SRV01. 

When it comes to filtering, we can further filter events using the event source. 

Get-EventLog -LogName 'Directory Service' -Source "NTDS KCC"

The above command lists the events with the source NTDS KCC.

It also allows searching for specific event IDs. 

Get-EventLog -LogName 'Directory Service' | where {$_.eventID -eq 1000}

The above command lists the events with event ID 1000. 

Note – There is a recommended list of events which should be audited periodically to identify potential issues in an Active Directory environment. The complete list is available for review at https://docs.microsoft.com/en-gb/windows-server/identity/ad-ds/plan/appendix-l–events-to-monitor
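As an example of combining these filters, here is a small sketch that scans the Directory Service log for a few replication-related event IDs worth watching (the ID list is illustrative, not the full list from the link above):

# Look for lingering-object / replication warnings from the last 7 days
$idsToWatch = 1388, 1988, 2042
Get-EventLog -LogName 'Directory Service' -After (Get-Date).AddDays(-7) | Where-Object { $idsToWatch -contains $_.EventID } | Format-Table TimeGenerated, EventID, EntryType, Message -AutoSize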

This marks the end of this blog post. If you have any questions, feel free to contact me at rebeladm@live.com, and follow me on Twitter @rebeladm to get updates about new blog posts.

Active Directory Health Monitoring with OMS (Operation Management Suite)

System Center Operations Manager (SCOM) is the Microsoft solution for monitoring application and system health in detail, and that applies to Active Directory monitoring as well. Using the relevant management packs, it can monitor the health of Active Directory services and their activities. Microsoft introduced Operations Management Suite (OMS) to take monitoring to the next level with advanced analytics technologies. SCOM was more about monitoring applications, services, and devices running on-premises, but OMS works with on-premises, cloud-only, or hybrid cloud environments. 

OMS Benefits 

Minimal configuration and maintenance – If you have worked with SCOM before, you know how many different components need configuring: management servers, SQL servers, gateway servers, a certificate authority, etc. With OMS, all we need is a subscription and the initial configuration of monitoring agents or a gateway. No more complex maintenance routines either. 

Scalable – The latest numbers from Microsoft show OMS is already used by more than 50,000 customers, with more than 20 PB of data collected and more than 188 million queries run in a week. With a cloud-based solution, we no longer need to worry about resources when expanding. The subscription is based on the features and the amount of data you upload; you do not pay for the compute power. I am sure Microsoft is nowhere near running out of resources! 

Integration with SCOM – OMS is fully supported to integrate with SCOM. It allows engineers to specify which systems and data should be analyzed by OMS, and it also allows a smooth, staged migration from SCOM to OMS. In an integrated environment, SCOM works similarly to a gateway and OMS queries through SCOM. OMS and SCOM both use the same monitoring agent (Microsoft Monitoring Agent), so client-side configuration is minimal. 

Note – Some OMS components, such as Network Performance Monitoring, WireData 2.0, and Service Map, require additional agent files, system changes, and a direct connection to OMS. 

Frequent feature updates – Microsoft releases a System Center version roughly every four years, but OMS updates and new services come much more often. That allows Microsoft to address industry requirements quickly. 

OMS in Hybrid Environment 

In a hybrid environment, we can integrate on-premises systems with OMS using three methods. 

Microsoft Monitoring Agent – The monitoring agent needs to be installed on each and every system, and it connects directly to OMS to upload data and run queries. Every system needs a connection to OMS via port 443. 

SCOM – If you already have SCOM installed and configured in your infrastructure, OMS can integrate with it. Data upload to OMS is done from the SCOM management servers, and OMS runs queries against the systems via SCOM. However, some OMS features still need a direct connection to the system to collect specific data. 

OMS gateway – OMS now supports collecting data and running queries via its own gateway, which works similarly to SCOM gateways. The individual systems do not need a direct connection to OMS; the OMS gateway collects and uploads the relevant data from its infrastructure. 

What is in there for AD Monitoring? 

In a SCOM environment, we can monitor Active Directory components and services using the relevant management packs, and they collect a great amount of insight. However, to identify potential issues, engineers need to analyze the collected data themselves. OMS provides two solution packs which collect data from the Active Directory environment and analyze it for you. After analyzing, it visualizes the findings in a user-friendly way. It also provides insight on how to fix detected problems, as well as guidelines to improve the environment's performance, security, and high availability. 

AD Assessment – This solution analyzes the risk and health of your AD environment at regular intervals and provides a list of recommendations to improve your existing AD infrastructure. 

AD Replication Status – This solution analyzes the replication status of your Active Directory environment. 

In this section, I am going to demonstrate how we can monitor an AD environment using OMS. Before we start, we need: 

1) A valid OMS subscription – OMS has different levels of subscription, depending on the OMS services you use and the amount of data uploaded daily. There is a free tier which provides a 500 MB daily upload and 7-day data retention. 

2) Direct connection to OMS – In this demo, I am going to use direct OMS integration via the Microsoft Monitoring Agent. 

3) Domain administrator account – In order to install the agent on the domain controllers, we need Domain Administrator privileges. 

Enable OMS AD Solutions 

1) Log in to OMS at https://login.mms.microsoft.com/signin.aspx?ref=ms_mms as an OMS administrator.

2) Click on Solution Gallery

oms1

3) By default, the AD Assessment solution is enabled. In order to enable the AD Replication Status solution, click on its tile in the solution list and then click on Add.

oms2

Install OMS Agents 
 
The next step of the configuration is to install the monitoring agent on the domain controllers and get them connected to OMS. 
 
1) Log in to the domain controller as a domain administrator.
2) Log in to the OMS portal. 
3) Go to Settings > Connected Sources > Windows Servers > click on Download Windows Agent (64bit). It will download the monitoring agent to the system. 
 
oms3
 
4) Once it is downloaded, double-click on the setup file to start the installation process. 
5) In the first window of the wizard, click Next to begin the installation. 
6) In the next window, read and accept the license terms.
7) In the next window, we can select where it should be installed. If there are no changes, click Next to continue. 
8) In the next window, it asks where the agent will connect to. In our scenario, it will connect to OMS directly. 
 
oms4
 
9) In the next window, it asks for the OMS Workspace ID and Key, which can be found in the OMS portal under Settings > Connected Sources > Windows Servers. If this server is behind a proxy server, we can also specify the proxy settings in this window. Once the relevant info is provided, click Next to continue. 
 
oms5
 
10) In the next window, it asks how the agent should check for updates. It is recommended to use the Windows Update option. Once the selection is made, click Next.
11) On the confirmation page, click Install to begin the installation. 
12) Follow the same steps on the other domain controllers.
13) After a few minutes, we can see the newly added servers connected as data sources under Settings > Connected Sources > Windows Servers.
 
oms6
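If you have many domain controllers to cover, the same agent package also supports a silent command-line install, which saves clicking through the wizard on every server. A hedged sketch, with the workspace ID and key as placeholders; the downloaded MMASetup package is extracted first and then setup.exe is run with the workspace details:

# Extract the downloaded agent package, then install silently
.\MMASetup-AMD64.exe /c /t:C:\MMA
C:\MMA\setup.exe /qn ADD_OPINSIGHTS_WORKSPACE=1 OPINSIGHTS_WORKSPACE_ID="<workspace id>" OPINSIGHTS_WORKSPACE_KEY="<workspace key>" AcceptEndUserLicenseAgreement=1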

View Analyzed Data
 
1) After a few minutes, OMS will start to collect data and visualize the findings. 
2) To view the data, log in to the OMS portal and click on the relevant solution tile on the home page. 
 
oms7
 
3) Clicking on a tile brings you to a page which displays more details about its findings. 
 
oms8
 
4) As I explained before, it does not only display errors; it also gives recommendations on how to fix the existing issues. 
 
oms9
 
Collect Windows Logs for Analysis
 
Using OMS, we can also collect Windows logs and use the OMS analysis capabilities on them. When this is enabled, OMS space usage and bandwidth usage at the organization's end will be higher. In order to collect logs,
 
1) Log in to the OMS portal.
2) Go to Settings > Data > Windows Event Log.
3) In the box, search for the relevant log file name and add it to the list. We can also select which types of events to extract. Once the selection is made, click Save.
 
oms10
 
4) After a few minutes, you will start to see the events under the Log Search option. There we can filter the data using queries, and we can also set up email alerts based on the events. 
 
oms11
 
I believe you now have a basic understanding of how to use OMS to monitor an AD environment. There is a lot more we can do with OMS, and I will cover that in future posts. 
 
This marks the end of this blog post. If you have any questions, feel free to contact me at rebeladm@live.com, and follow me on Twitter @rebeladm to get updates about new blog posts.

Active Directory Lingering objects

If you are maintaining a healthy AD infrastructure, it is very unlikely you will see lingering objects in AD. Let's assume a domain controller has been disconnected from the Active Directory environment and has stayed offline longer than the value specified by the tombstone lifetime attribute. Then it is reconnected to the replication topology. The objects which were deleted from Active Directory while that particular domain controller was offline will remain on it as lingering objects. 

When an object is deleted on one domain controller, the deletion replicates to the other domain controllers as a tombstone object. It contains a few attribute values, but it cannot be used for active operations. It remains on the domain controllers until it reaches the time specified by the tombstone lifetime value, after which the tombstone object is permanently deleted from the directory. The tombstone lifetime value is a forest-wide setting and depends on the operating system; for operating systems after Windows Server 2003, the default value is 180 days.  

The problem happens when a domain controller holding lingering objects takes part in outbound replication. In such a situation, one of the following can happen. 

If the destination domain controller has strict replication consistency enabled, it will halt inbound replication from that particular domain controller. 

If the destination domain controller has strict replication consistency disabled, it will request the full replica, and the lingering objects will be reintroduced to the directory. 

Events 1388, 1988, and 2042 are clues to lingering objects in an Active Directory infrastructure. 

Event 1388 – Another domain controller (DC) has attempted to replicate into this DC an object which is not present in the local Active Directory Domain Services database. The object may have been deleted and already garbage collected (a tombstone lifetime or more has past since the object was deleted) on this DC. The attribute set included in the update request is not sufficient to create the object. The object will be re-requested with a full attribute set and re-created on this DC. Source DC (Transport-specific network address): xxxxxxxxxxxxxxxxx._msdcs.contoso.com Object: CN=xxxx,CN=xxx,DC=xxxx,DC=xxx Object GUID: xxxxxxxxxxxxx Directory partition: DC=xxxx,DC=xx Destination highest property USN: xxxxxx

Event 1988 – Active Directory Domain Services Replication encountered the existence of objects in the following partition that have been deleted from the local domain controllers (DCs) Active Directory Domain Services database. Not all direct or transitive replication partners replicated in the deletion before the tombstone lifetime number of days passed. Objects that have been deleted and garbage collected from an Active Directory Domain Services partition but still exist in the writable partitions of other DCs in the same domain, or read-only partitions of global catalog servers in other domains in the forest are known as "lingering objects". This event is being logged because the source DC contains a lingering object which does not exist on the local DCs Active Directory Domain Services database. This replication attempt has been blocked. The best solution to this problem is to identify and remove all lingering objects in the forest. Source DC (Transport-specific network address): xxxxxxxxxxxxxx._msdcs.contoso.com Object: CN=xxxxxx,CN=xxxxx,DC=xxxxxx,DC=xxx Object GUID: xxxxxxxxxxxx

Event 2042 – It has been too long since this machine last replicated with the named source machine. The time between replications with this source has exceeded the tombstone lifetime. Replication has been stopped with this source. The reason that replication is not allowed to continue is that the two machine's views of deleted objects may now be different. The source machine may still have copies of objects that have been deleted (and garbage collected) on this machine. If they were allowed to replicate, the source machine might return objects which have already been deleted. Time of last successful replication: <date> <time> Invocation ID of source: <Invocation ID> Name of source: <GUID>._msdcs.<domain> Tombstone lifetime (days): <TSL number in days> The replication operation has failed.

Strict replication consistency

This setting is controlled by a registry key, and from Windows Server 2003 onwards it is enabled by default. The key can be found under HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Services\NTDS\Parameters 

lin1

Removing lingering objects

Lingering objects can be removed using:

repadmin /removelingeringobjects <faulty DC name> <reference DC GUID> <directory partition>

In the preceding command:

faulty DC name – the DC which contains the lingering objects.

reference DC GUID – the GUID of a DC which holds an up-to-date database that can be used as a reference.

directory partition – the directory partition where the lingering objects are contained.
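For example (all names and the GUID here are hypothetical), if DC02 holds the lingering objects and DC01 is known to be clean, the command could look like this:

repadmin /removelingeringobjects DC02.contoso.com 85d158d2-a006-4fff-b1e5-f9b6eaabab2b "DC=contoso,DC=com"

Running it with the /advisory_mode switch first logs what would be removed without actually deleting anything, which is a safer first step.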

This marks the end of this blog post. If you have any questions, feel free to contact me at rebeladm@live.com, and follow me on Twitter @rebeladm to get updates about new blog posts.

Azure AD Connect Staging Mode

Azure AD Connect is the tool used to connect an on-premises directory service with Azure AD. It allows users to use the same on-premises ID and password to authenticate to Azure AD, Office 365, or other applications hosted in Azure. Azure AD Connect can be installed on any server that meets the following requirements:

The AD forest functional level must be Windows Server 2003 or later. 

If you plan to use the feature password writeback, then the Domain Controllers must be on Windows Server 2008 (with latest SP) or later. If your DCs are on 2008 (pre-R2), then you must also apply hotfix KB2386717.

The domain controller used by Azure AD must be writable. It is not supported to use a RODC (read-only domain controller) and Azure AD Connect does not follow any write redirects.

It is not supported to use on-premises forests/domains using SLDs (Single Label Domains).

It is not supported to use on-premises forests/domains using "dotted" (name contains a period ".") NetBios names.

Azure AD Connect cannot be installed on Small Business Server or Windows Server Essentials. The server must be using Windows Server standard or better.

The Azure AD Connect server must have a full GUI installed. It is not supported to install on server core.

Azure AD Connect must be installed on Windows Server 2008 or later. This server may be a domain controller or a member server when using express settings. If you use custom settings, then the server can also be stand-alone and does not have to be joined to a domain.

If you install Azure AD Connect on Windows Server 2008 or Windows Server 2008 R2, then make sure to apply the latest hotfixes from Windows Update. The installation is not able to start with an unpatched server.

If you plan to use the feature password synchronization, then the Azure AD Connect server must be on Windows Server 2008 R2 SP1 or later.

If you plan to use a group managed service account, then the Azure AD Connect server must be on Windows Server 2012 or later.

The Azure AD Connect server must have .NET Framework 4.5.1 or later and Microsoft PowerShell 3.0 or later installed.

If Active Directory Federation Services is being deployed, the servers where AD FS or Web Application Proxy are installed must be Windows Server 2012 R2 or later. Windows remote management must be enabled on these servers for remote installation.

If Active Directory Federation Services is being deployed, you need SSL Certificates.

If Active Directory Federation Services is being deployed, then you need to configure name resolution.

If your global administrators have MFA enabled, then the URL https://secure.aadcdn.microsoftonline-p.com must be in the trusted sites list. You are prompted to add this site to the trusted sites list when you are prompted for an MFA challenge and it has not added before. You can use Internet Explorer to add it to your trusted sites.

Azure AD Connect requires a SQL Server database to store identity data. By default a SQL Server 2012 Express LocalDB (a light version of SQL Server Express) is installed. SQL Server Express has a 10GB size limit that enables you to manage approximately 100,000 objects. If you need to manage a higher volume of directory objects, you need to point the installation wizard to a different installation of SQL Server.

What is staging mode? 
 
At any given time, only one Azure AD Connect instance can be involved in the sync process for a directory. But this brings a few challenges. 
 
Disaster recovery – If the server running Azure AD Connect is involved in a disaster, it will impact the sync process. This can be worse if you are using features such as password pass-through, single sign-on, or password writeback through Azure AD Connect.
Upgrades – If the system running Azure AD Connect needs an upgrade, or if Azure AD Connect itself needs an upgrade, it will impact the sync process. Again, the affordable downtime depends on the features in use and the organization's dependency on Azure AD Connect and its operations. 
Testing new features – Microsoft keeps adding new features to Azure AD Connect. Before introducing them to production, it is always good to simulate and see the impact, but with only one instance that is not possible. Even a demo environment may not simulate the same impact as production in some cases. 
 
Microsoft introduced the staging mode of Azure AD Connect to overcome the above challenges. With staging mode, you can maintain another Azure AD Connect instance on another server with the same configuration as the primary server. It connects to Azure AD, receives changes, and keeps an up-to-date copy to make the switchover as seamless as possible. However, it does not sync the Azure AD Connect configuration from the primary server: it is the engineer's responsibility to update the staging server's configuration whenever the primary server's configuration is modified. 
 
Installation
 
Let's see how we can configure Azure AD Connect in staging mode.
 
1) Prepare a server according to the guidelines given in the prerequisites section for installing Azure AD Connect. 
2) Review the current configuration of Azure AD Connect running on the primary server. You can check this via Azure AD Connect | View current configuration 

sta1
 
sta2
 
3) Log in to the server as a domain administrator and download the latest Azure AD Connect from https://www.microsoft.com/en-us/download/details.aspx?id=47594
4) During the installation, select the Customize option. 
 
sta3
 
5) Then proceed with the configuration, matching the settings used on the primary server. 
6) At the last step of the configuration, select Enable staging mode: When selected, synchronization will not export any data to AD or Azure AD, and then click Install.
 
sta4
 
7) Once the installation is completed, we can confirm in the Synchronization Service (Azure AD Connect | Synchronization Service) that no sync jobs are running yet. 
 
sta5
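We can also confirm the server really is in staging mode using the ADSync module; a quick check, assuming the module is available on the staging server:

Import-Module ADSync
# StagingModeEnabled should read True on the staging server
Get-ADSyncScheduler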
 
Verify data
 
As I mentioned before, the staging server allows you to simulate an export before it is made primary. This is important if you are implementing new configuration changes. 
 
In order to prepare a staged copy of the export, 
 
1) Go to Start | Azure AD Connect | Synchronization Service | Connectors 
 
sta6
 
2) Select the Active Directory Domain Services connector and click on Run from the right-hand panel. 
 
sta7
 
3) Then, in the next window, select Full Import and click OK.
 
sta8
 
4) Repeat the same for the Windows Azure Active Directory (Microsoft) connector. 
5) Once both jobs are completed, select the Active Directory Domain Services connector and click on Run from the right-hand panel again. But this time select Delta Synchronization, and click OK.
 
sta9
 
6) Repeat the same for the Windows Azure Active Directory (Microsoft) connector.
7) Once both jobs have finished, go to the Operations tab and verify that the jobs completed successfully. 
 
sta10
 
Now we have the staging copy; the next step is to verify whether the data is presented as expected. To do that, we need the help of a PowerShell script.  

 
Param(
    [Parameter(Mandatory=$true, HelpMessage="Must be a file generated using csexport 'Name of Connector' export.xml /f:x)")]
    [string]$xmltoimport="%temp%\exportedStage1a.xml",
    [Parameter(Mandatory=$false, HelpMessage="Maximum number of users per output file")][int]$batchsize=1000,
    [Parameter(Mandatory=$false, HelpMessage="Show console output")][bool]$showOutput=$false
)

#LINQ isn't loaded automatically, so force it
[Reflection.Assembly]::Load("System.Xml.Linq, Version=3.5.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089") | Out-Null

[int]$count=1
[int]$outputfilecount=1
[array]$objOutputUsers=@()

#XML must be generated using "csexport "Name of Connector" export.xml /f:x"
write-host "Importing XML" -ForegroundColor Yellow

#XmlReader.Create won't properly resolve the file location,
#so expand and then resolve it
$resolvedXMLtoimport=Resolve-Path -Path ([Environment]::ExpandEnvironmentVariables($xmltoimport))

#use an XmlReader to deal with even large files
$result=$reader = [System.Xml.XmlReader]::Create($resolvedXMLtoimport) 
$result=$reader.ReadToDescendant('cs-object')
do 
{
    #create the object placeholder
    #adding them up here means we can enforce consistency
    $objOutputUser=New-Object psobject
    Add-Member -InputObject $objOutputUser -MemberType NoteProperty -Name ID -Value ""
    Add-Member -InputObject $objOutputUser -MemberType NoteProperty -Name Type -Value ""
    Add-Member -inputobject $objOutputUser -MemberType NoteProperty -Name DN -Value ""
    Add-Member -inputobject $objOutputUser -MemberType NoteProperty -Name operation -Value ""
    Add-Member -inputobject $objOutputUser -MemberType NoteProperty -Name UPN -Value ""
    Add-Member -inputobject $objOutputUser -MemberType NoteProperty -Name displayName -Value ""
    Add-Member -inputobject $objOutputUser -MemberType NoteProperty -Name sourceAnchor -Value ""
    Add-Member -inputobject $objOutputUser -MemberType NoteProperty -Name alias -Value ""
    Add-Member -inputobject $objOutputUser -MemberType NoteProperty -Name primarySMTP -Value ""
    Add-Member -inputobject $objOutputUser -MemberType NoteProperty -Name onPremisesSamAccountName -Value ""
    Add-Member -inputobject $objOutputUser -MemberType NoteProperty -Name mail -Value ""

    $user = [System.Xml.Linq.XElement]::ReadFrom($reader)
    if ($showOutput) {Write-Host Found an exported object... -ForegroundColor Green}

    #object id
    $outID=$user.Attribute('id').Value
    if ($showOutput) {Write-Host ID: $outID}
    $objOutputUser.ID=$outID

    #object type
    $outType=$user.Attribute('object-type').Value
    if ($showOutput) {Write-Host Type: $outType}
    $objOutputUser.Type=$outType

    #dn
    $outDN= $user.Element('unapplied-export').Element('delta').Attribute('dn').Value
    if ($showOutput) {Write-Host DN: $outDN}
    $objOutputUser.DN=$outDN

    #operation
    $outOperation= $user.Element('unapplied-export').Element('delta').Attribute('operation').Value
    if ($showOutput) {Write-Host Operation: $outOperation}
    $objOutputUser.operation=$outOperation

    #now that we have the basics, go get the details

    foreach ($attr in $user.Element('unapplied-export-hologram').Element('entry').Elements("attr"))
    {
        $attrvalue=$attr.Attribute('name').Value
        $internalvalue= $attr.Element('value').Value

        switch ($attrvalue)
        {
            "userPrincipalName"
            {
                if ($showOutput) {Write-Host UPN: $internalvalue}
                $objOutputUser.UPN=$internalvalue
            }
            "displayName"
            {
                if ($showOutput) {Write-Host displayName: $internalvalue}
                $objOutputUser.displayName=$internalvalue
            }
            "sourceAnchor"
            {
                if ($showOutput) {Write-Host sourceAnchor: $internalvalue}
                $objOutputUser.sourceAnchor=$internalvalue
            }
            "alias"
            {
                if ($showOutput) {Write-Host alias: $internalvalue}
                $objOutputUser.alias=$internalvalue
            }
            "proxyAddresses"
            {
                if ($showOutput) {Write-Host primarySMTP: ($internalvalue -replace "SMTP:","")}
                $objOutputUser.primarySMTP=$internalvalue -replace "SMTP:",""
            }
        }
    }

    $objOutputUsers += $objOutputUser

    Write-Progress -activity "Processing ${xmltoimport} in batches of ${batchsize}" -status "Batch ${outputfilecount}: " -percentComplete (($objOutputUsers.Count / $batchsize) * 100)

    #every so often, dump the processed users in case we blow up somewhere
    if ($count % $batchsize -eq 0)
    {
        Write-Host Hit the maximum users processed without completion... -ForegroundColor Yellow

        #export the collection of users as a CSV
        Write-Host Writing processedusers${outputfilecount}.csv -ForegroundColor Yellow
        $objOutputUsers | Export-Csv -path processedusers${outputfilecount}.csv -NoTypeInformation

        #increment the output file counter
        $outputfilecount+=1

        #reset the collection and the user counter
        $objOutputUsers = $null
        $count=0
    }

    $count+=1

    #need to bail out of the loop if no more users to process
    if ($reader.NodeType -eq [System.Xml.XmlNodeType]::EndElement)
    {
        break
    }

} while ($reader.Read())

#need to write out any users that didn't get picked up in a batch of 1000
#export the collection of users as a CSV
Write-Host Writing processedusers${outputfilecount}.csv -ForegroundColor Yellow
$objOutputUsers | Export-Csv -path processedusers${outputfilecount}.csv -NoTypeInformation

 
Save this as a .ps1 file on the C: drive. 
 
1) Open PowerShell and type cd "C:\Program Files\Microsoft Azure AD Sync\Bin" (if your install path is different, use the relevant path).
2) Then run .\csexport "myrebeladmin.onmicrosoft.com - AAD" C:\export.xml /f:x where "myrebeladmin.onmicrosoft.com - AAD" should be replaced with your Azure AD connector name. This will export the staged data to C:\export.xml.
3) Then run .\analyze.ps1 -xmltoimport C:\export.xml where analyze.ps1 is the script we saved at the beginning of this section. 
4) It will create a CSV file called processedusers1.csv which contains all the changes that will sync to Azure AD. 
 
However, this step is not always required; the staging server can be made primary without the import and verify process. 
 
How to make it the primary server?
 
In order to make the staging server the primary server,
 
1) Go to Start | Azure AD Connect | Azure AD Connect
2) Then click on Configure on the next page. 
3) On the next page, select the option Configure staging mode and click Next
 
 
sta11
 
4) On the next page, provide the Azure AD login credentials for the directory sync account. 
5) In the next window, untick Enable staging mode and click Next
 
sta12
 
6) In the next window, select Start the synchronization process… and click Configure
 
sta13
 
This completes the process of promoting the staging server to primary. I hope this was useful, and if you have any questions, feel free to contact me at rebeladm@live.com; also follow me on Twitter @rebeladm to get updates about new blog posts.

What is Content Freshness protection in DFSR?

Healthy replication is a must for an Active Directory environment. The SYSVOL folder on domain controllers contains policies and logon scripts, and it is replicated between domain controllers to maintain an up-to-date, consistent configuration. Before Windows Server 2008, FRS (File Replication Service) was used to replicate SYSVOL content among domain controllers. With Windows Server 2008, FRS was deprecated and DFS Replication (DFSR) was introduced.

Healthy replication requires healthy communication between domain controllers. Sometimes communication is interrupted by a domain controller failure or a link failure. Depending on the impact, it is still possible for communication to be re-established after a period of time; DFSR will then try to resume replication and catch up with the SYSVOL changes. In such a scenario, we may see event 4012 in the event viewer. 

The DFS Replication service stopped replication on the replicated folder at local path c:\xxx. It has been disconnected from other partners for 70 days, which is longer than the MaxOfflineTimeInDays parameter. Because of this, DFS Replication considers this data to be stale, and will replace it with data from other members of the replication group during the next replication. DFS Replication will move the stale files to the local Conflict folder. No user action is required.

With Windows Server 2008, Microsoft introduced a setting called content freshness protection to protect DFS shares from stale data. DFS uses a multi-master database similar to Active Directory, and it also has a tombstone time limit, similar to Active Directory, with a default value of 60 days. If there is no replication for longer than that and replication resumes later, stale data can be reintroduced; it is similar to lingering objects in AD. To protect against this, we can define a value for MaxOfflineTimeInDays: if the number of days since the last successful DFS replication is larger than MaxOfflineTimeInDays, replication will be prevented. 

We can review this value by running,

For /f %m IN ('dsquery server -o rdn') do @echo %m && @wmic /node:"%m" /namespace:\\root\microsoftdfs path DfsrMachineConfig get MaxOfflineTimeInDays

cf1
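For a single server, the same value can also be read with PowerShell; a minimal sketch querying the same WMI class locally:

Get-WmiObject -Namespace "root\microsoftdfs" -Class DfsrMachineConfig | Select-Object MaxOfflineTimeInDays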

There are two ways to recover from this. The first method is to increase the value of MaxOfflineTimeInDays. It can be done using,

wmic.exe /namespace:\\root\microsoftdfs path DfsrMachineConfig set MaxOfflineTimeInDays=120

cf2

It is recommended to run this on all domain controllers to maintain the same configuration. 

If you are not willing to change this value, you can still recover using a non-authoritative restore. It will remove all conflicting values and take an updated copy. 

I have already written an article about non-authoritative restore of SYSVOL, and it can be found at http://www.rebeladmin.com/2017/08/non-authoritative-authoritative-sysvol-restore-dfs-replication/ 

This is not only for SYSVOL replication; it is valid for DFS replication in general. 

I hope this was useful, and if you have any questions, feel free to contact me at rebeladm@live.com; also follow me on Twitter @rebeladm to get updates about new blog posts.

Step-by-Step guide to setup Fine-Grained Password Policies

In an AD environment, we can use password policy to define password security requirements. These settings are located under Computer Configuration | Policies | Windows Settings | Security Settings | Account Policies

fine1

Before Windows Server 2008, only one password policy could be applied to users. But in an environment, different user roles can require different levels of protection. As an example, an 8-character complex password may be more than enough for sales users, but not strong enough for a domain admin account. With Windows Server 2008, Microsoft introduced Fine-Grained Password Policies, which allow different password policies to be applied to specific users and groups. In order to use this feature, 

1) Your domain functional level should be at least Windows Server 2008.

2) You need a Domain/Enterprise Admin account to create policies. 

Similar to group policies, objects can sometimes end up with multiple password policies applied to them, but at any given time an object can only have one effective password policy. Each fine-grained password policy has a precedence value, an integer defined during policy setup. A lower precedence value means higher priority: if multiple policies apply to an object, the policy with the lowest precedence value wins. Also, a policy linked directly to a user object always wins. 

We can create the policies using the Active Directory Administrative Center or PowerShell. In this demo, I am going to use the PowerShell method. 

New-ADFineGrainedPasswordPolicy -Name "Tech Admin Password Policy" -Precedence 1 `
-MinPasswordLength 12 -MaxPasswordAge "30" -MinPasswordAge "7" `
-PasswordHistoryCount 50 -ComplexityEnabled:$true `
-LockoutDuration "8:00" `
-LockoutObservationWindow "8:00" -LockoutThreshold 3 `
-ReversibleEncryptionEnabled:$false

In the above sample, I am creating a new fine-grained password policy called "Tech Admin Password Policy". New-ADFineGrainedPasswordPolicy is the cmdlet to create a new policy, and Precedence defines its precedence. The LockoutDuration and LockoutObservationWindow values are defined in hours, and the LockoutThreshold value defines the number of failed login attempts allowed. 

More info about the syntax can be found using,

Get-Help New-ADFineGrainedPasswordPolicy

You can also view examples using,

Get-Help New-ADFineGrainedPasswordPolicy -Examples

fine2

Once the policy is set up, we can verify its settings using, 

Get-ADFineGrainedPasswordPolicy -Identity "Tech Admin Password Policy" 

fine3

Now we have the policy in place. The next step is to attach it to groups or users. In my demo, I am going to apply it to a group called "IT Admins".

Add-ADFineGrainedPasswordPolicySubject -Identity "Tech Admin Password Policy" -Subjects "IT Admins"

I am also going to attach it to a user account, R143869.

Add-ADFineGrainedPasswordPolicySubject -Identity "Tech Admin Password Policy" -Subjects "R143869"

We can verify who the policy applies to using the following,

Get-ADFineGrainedPasswordPolicy -Identity "Tech Admin Password Policy" | Format-Table AppliesTo -AutoSize

fine4
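To confirm which policy actually wins for a given user (useful when several policies could apply), the AD module provides a resultant policy cmdlet:

# Shows the effective fine-grained password policy for the user
Get-ADUserResultantPasswordPolicy -Identity "R143869"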

This confirms the configuration. I hope this was useful, and if you have any questions, feel free to contact me at rebeladm@live.com; also follow me on Twitter @rebeladm to get updates about new blog posts.