Tag Archives: AD

Step-by-Step guide to create custom Active Directory Attributes

The Active Directory schema allows adding custom attributes. In organizations there are situations where this option is useful, most of the time related to application integration requirements with the Active Directory infrastructure. In modern infrastructures, applications are decentralizing identity management. An organization's identities can sit in Active Directory as well as in applications; some may be in in-house infrastructures and some may even be in a public cloud. If these applications are integrated with Active Directory it still provides central identity management, but that is not always the case. Some applications have their own way of handling user accounts and privileges. Similar to Active Directory attributes, these applications can have their own attributes, defined by their database systems, to store the data. These application attributes most of the time will not match the attributes in Active Directory. As an example, an HR system uses an employee ID to identify an employee record uniquely, while Active Directory uses the username to identify a unique record. Each system's attributes hold some data about the objects, even when they refer to the same user or device. If another application needs to retrieve data from both systems' attributes, how can we facilitate that without data duplication?

Once a customer was talking to me about a similar requirement. They had an Active Directory infrastructure in place. They also maintained an HR system which was not integrated with Active Directory. They got a new requirement for an employee collaboration application which required data input in a specific way. It had defined its fields in the database and we needed to match the data in that order. Some of the required user data could be retrieved from Active Directory and some from the HR system. Instead of keeping two data feeds to the system, we decided to treat Active Directory as the authoritative data source for the new system. If Active Directory was to hold all the required data, it somehow needed to store the data coming from the HR system as well. The final solution was to add custom attributes to the Active Directory schema and associate them with the user class. Instead of both systems operating as data feeds, the HR system now passes the filtered values to Active Directory, which exports all the required data in CSV format to the application.

In order to create custom attributes, go to the Active Directory Schema snap-in, right-click the Attributes container and select Create Attribute.

Tip – In order to open the Active Directory Schema snap-in you first need to run the command regsvr32 schmmgmt.dll on the Domain Controller. After that you can open MMC and add Active Directory Schema as a snap-in.

The system will then display a warning about schema object creation; click OK to continue.

It will open a form where we need to define the details of the custom attribute.

1) Common Name – This is the name of the object. Only letters, numbers and hyphens are allowed in the CN.

2) LDAP Display Name – When the object is referred to in a script, program or command-line utility, it needs to be called by the LDAP display name instead of the common name. When you define the CN, the LDAP display name is created automatically.

3) X500 Object ID – Each and every attribute in the Active Directory schema has a unique OID value. There is a script developed by Microsoft to generate these unique OID values. It can be found at https://gallery.technet.microsoft.com/scriptcenter/Generate-an-Object-4c9be66a#content and it can also be run directly using the following PowerShell commands.

 

#---
$Prefix="1.2.840.113556.1.8000.2554"
$GUID=[System.Guid]::NewGuid().ToString()
$Parts=@()
$Parts+=[UInt64]::Parse($GUID.SubString(0,4),"AllowHexSpecifier")
$Parts+=[UInt64]::Parse($GUID.SubString(4,4),"AllowHexSpecifier")
$Parts+=[UInt64]::Parse($GUID.SubString(9,4),"AllowHexSpecifier")
$Parts+=[UInt64]::Parse($GUID.SubString(14,4),"AllowHexSpecifier")
$Parts+=[UInt64]::Parse($GUID.SubString(19,4),"AllowHexSpecifier")
$Parts+=[UInt64]::Parse($GUID.SubString(24,6),"AllowHexSpecifier")
$Parts+=[UInt64]::Parse($GUID.SubString(30,6),"AllowHexSpecifier")
$OID=[String]::Format("{0}.{1}.{2}.{3}.{4}.{5}.{6}.{7}",$Prefix,$Parts[0],$Parts[1],$Parts[2],$Parts[3],$Parts[4],$Parts[5],$Parts[6])
$OID
#---

 

4) Syntax – It defines the storage representation for the attribute. Only the syntaxes defined by Microsoft are allowed, and an attribute can only be associated with one syntax. Below I have listed a few commonly used syntaxes.

 

Syntax – Description
Boolean – True or False
Unicode String – A large string
Numeric String – A string of digits
Integer – 32-bit numeric value
Large Integer – 64-bit numeric value
SID – Security identifier value
Distinguished Name – String value to uniquely identify an object in AD

Along with the syntax we can also define minimum and maximum values. If they are not defined, the defaults will be used.

In the following demo, I am going to add a new attribute called NI-Number and add it to the user class.

attri1

As the next step, we need to add it to the user class. In order to do that, go to the Classes container, double-click the user class and click on the Attributes tab. There, by clicking the Add button, you can browse and select the newly added attribute from the list.

attri2

Now when we open a user account we can see the new attribute, and we can add data to it.

attri3
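The attribute can also be populated from PowerShell. This is a minimal sketch assuming the attribute's LDAP display name is nINumber, as in this demo; the user name and value are just sample data:

```powershell
# Set the custom attribute on a user account
# (nINumber is the LDAP display name created earlier;
# "AB123456C" is a made-up sample value)
Set-ADUser -Identity "tuser4" -Replace @{nINumber="AB123456C"}
```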

Once the data has been added, we can filter out the information as required.

Get-ADUser "tuser4" -Properties nINumber | ft nINumber

attri4

Note – To add attributes to the schema you need schema administrator or enterprise administrator privileges.

This marks the end of this blog post. If you have any questions feel free to contact me on rebeladm@live.com also follow me on twitter @rebeladm to get updates about new blog posts.  

Review Active Directory Domain Service Events with PowerShell

There are different ways to review Active Directory service related logs on a domain controller. The most common way is to review events in the Event Viewer MMC.

event1

We can review events using server manager too. 

event2

We also can use PowerShell commands to review event logs or filter events from local and remote computers without any additional service configurations. Get-EventLog is the primary cmdlet we can use for this task. 

Get-EventLog -List

The above command will list the details of the log files on your local system, including the log file name, maximum log file size and number of entries.

Get-EventLog -LogName 'Directory Service' | fl

The above command will list all the events in the Directory Service log file.

We also can limit the number of events listed. As an example, if we only need the latest 5 events from the Directory Service log file, we can use,

Get-EventLog -Newest 5 -LogName 'Directory Service'

We can filter further by listing events according to entry type.

Get-EventLog -Newest 5 -LogName 'Directory Service' -EntryType Error

The above command will list the latest five errors in the Directory Service log file.

We also can add time limit to filter events more. 

Get-EventLog -Newest 5 -LogName 'Directory Service' -EntryType Error -After (Get-Date).AddDays(-1)

The above command will list the events of entry type 'Error' within the last 24 hours from the Directory Service log.

We also can get the events from the remote computers. 

Get-EventLog -Newest 5 -LogName 'Directory Service' -ComputerName 'REBEL-SRV01' | fl -Property *

The above command will list the latest five log entries in the Directory Service log file from the remote computer REBEL-SRV01.

event3

We also can extract events from several computers at the same time.

Get-EventLog -Newest 5 -LogName 'Directory Service' -ComputerName "localhost","REBEL-SRV01"

The above command will list the log entries from the local computer and the remote computer REBEL-SRV01.

When it comes to filtering, we can further filter events using the event source. 

Get-EventLog -LogName 'Directory Service' -Source "NTDS KCC"

The above command will list the events with the source NTDS KCC.

It also allows searching for specific event IDs.

Get-EventLog -LogName 'Directory Service' | where {$_.eventID -eq 1000}

The above command will list the events with event ID 1000.
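On newer systems the same filtering can be done server-side with Get-WinEvent and a filter hash table, which avoids pulling every event across the wire before filtering; a minimal sketch:

```powershell
# Filter the Directory Service log by event ID at the source,
# instead of piping the whole log through Where-Object
Get-WinEvent -FilterHashtable @{LogName='Directory Service'; Id=1000} -MaxEvents 5
```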

Note – There is a recommended list of events which should be audited periodically to identify potential issues in an Active Directory environment. The complete list is available for review at https://docs.microsoft.com/en-gb/windows-server/identity/ad-ds/plan/appendix-l--events-to-monitor

This marks the end of this blog post. If you have any questions feel free to contact me on rebeladm@live.com also follow me on twitter @rebeladm to get updates about new blog posts.  

Active Directory Health Monitoring with OMS (Operations Management Suite)

System Center Operations Manager (SCOM) is the Microsoft solution to monitor application and system health in detail. This applies to Active Directory monitoring as well. Using the relevant management packs, it can monitor the health of Active Directory services and their activities. Microsoft introduced Operations Management Suite to bring monitoring to the next level with advanced analytics technologies. SCOM focused on monitoring applications, services and devices running on-premises, but OMS works with on-premises, cloud-only and hybrid cloud environments.

OMS Benefits 

Minimal Configuration and Maintenance – If you have worked with SCOM before you may know how many different components need to be configured, such as management servers, SQL servers, gateway servers, a certificate authority etc. With OMS all we need is a subscription and the initial configuration of monitoring agents or a gateway. No more complex maintenance routines either.

Scalable – The latest figures from Microsoft show OMS is already used by more than 50,000 customers, with more than 20 PB of data collected and more than 188 million queries run in a week. With a cloud-based solution we no longer need to worry about resources when expanding. The subscription is based on the features used and the amount of data you upload; you do not need to pay for the compute power. I am sure Microsoft is nowhere near running out of resources!

Integration with SCOM – OMS is fully supported to integrate with SCOM. It allows engineers to specify which systems and data should be analyzed by OMS. It also allows a smooth migration from SCOM to OMS in stages. In an integrated environment SCOM works similarly to a gateway, and OMS runs its queries through SCOM. OMS and SCOM both use the same monitoring agent (Microsoft Monitoring Agent) and therefore client-side configuration is minimal.

Note – Some OMS components such as Network Performance Monitoring, WireData 2.0 and Service Map require additional agent files, system changes and a direct connection to OMS.

Frequent Feature Updates – Microsoft releases a System Center version every four years, but OMS updates and new services come far more often. This allows Microsoft to address industry requirements quickly.

OMS in Hybrid Environment 

In a hybrid environment, we can integrate on-premises systems with OMS using three methods.

Microsoft Monitoring Agent – The monitoring agent needs to be installed on each and every system, and it will connect directly to OMS to upload data and run queries. Every system needs a connection to OMS via port 443.

SCOM – If you already have SCOM installed and configured in your infrastructure, OMS can integrate with it. Data upload to OMS is done from the SCOM management servers, and OMS runs its queries to the systems via SCOM. However, some OMS features still need a direct connection to the system to collect specific data.

OMS Gateway – OMS now supports collecting data and running queries via its own gateway, which works similarly to SCOM gateways. The individual systems do not need a direct connection to OMS; the OMS gateway will collect and upload the relevant data from its infrastructure.

What is in there for AD Monitoring? 

In a SCOM environment we can monitor Active Directory components and services using the relevant management packs. It collects a great amount of insight; however, to identify potential issues, engineers need to analyze the collected data. OMS provides two solution packs which collect data from the Active Directory environment and analyze it for you. After analyzing, it visualizes the findings in a user-friendly way. It also provides insight into how to fix detected problems, as well as guidelines to improve the environment's performance, security and high availability.

AD Assessment – This solution analyzes the risk and health of AD environments at a regular interval. It provides a list of recommendations to improve your existing AD infrastructure.

AD Replication Status – This solution analyzes the replication status of your Active Directory environment.

In this section I am going to demonstrate how we can monitor an AD environment using OMS. Before we start we need,

1) Valid OMS Subscription – OMS has different levels of subscriptions, depending on the OMS services you use and the amount of data uploaded daily. There is a free tier which provides 500 MB of daily upload and 7-day data retention.

2) Direct Connection to OMS – In this demo I am going to use direct OMS integration via the Microsoft Monitoring Agent.

3) Domain Administrator Account – In order to install the agent on the domain controllers we need Domain Administrator privileges.

Enable OMS AD Solutions 

1) Log in to OMS https://login.mms.microsoft.com/signin.aspx?ref=ms_mms as OMS administrator

2) Click on Solution Gallery

oms1

3) By default, the AD Assessment solution is enabled. In order to enable the AD Replication Status solution, click on its tile in the solution list and then click Add.

oms2

Install OMS Agents 
 
The next step of the configuration is to install the monitoring agent on the domain controllers and get them connected to OMS.
 
1) Log in to the domain controller as domain administrator
2) Log in to OMS portal 
3) Go to Settings > Connected Sources > Windows Servers > click on Download Windows Agent (64bit). It will download the monitoring agent to the system.
 
oms3
 
4) Once it is downloaded, double-click on the setup file to start the installation process. 
5) In the first window of the wizard, click Next to begin the installation. 
6) In the next window, read and accept the license terms.
7) In the next window, we can select where it should be installed. If there are no changes, click Next to continue. 
8) In the next window, it asks where the agent will connect to. In our scenario, it will connect to OMS directly.
 
oms4
 
9) In the next window, it asks for the OMS Workspace ID and Key, which can be found in the OMS portal under Settings > Connected Sources > Windows Servers. If this server is behind a proxy server, we can also specify the proxy settings in this window. Once the relevant info is provided, click Next to continue.
 
oms5
 
10) In the next window, it asks how the agent should check for updates. It is recommended to use the Windows Update option. Once the selection is made, click Next.
11) On the confirmation page, click Install to begin the installation. 
12) Follow the same steps for the other domain controllers.
13) After a few minutes, we can see the newly added servers connected as data sources under Settings > Connected Sources > Windows Servers.
 
oms6
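For a larger number of domain controllers, the agent also supports an unattended install. This is a sketch based on the agent's documented command-line properties; the workspace ID and key placeholders must be replaced with the values from your own OMS portal:

```powershell
# Unattended Microsoft Monitoring Agent install, connecting directly
# to an OMS workspace (replace the placeholders with your own values)
.\MMASetup-AMD64.exe /C:"setup.exe /qn ADD_OPINSIGHTS_WORKSPACE=1 OPINSIGHTS_WORKSPACE_ID=<workspace id> OPINSIGHTS_WORKSPACE_KEY=<workspace key> AcceptEndUserLicenseAgreement=1"
```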

View Analyzed Data
 
1) After a few minutes, OMS will start to collect data and visualize the findings. 
2) To view the data, log in to the OMS portal and click on the relevant solution tile on the home page.
 
oms7
 
3) Clicking on the tile brings you to a page which displays more details about the findings.
 
oms8
 
4) As I explained before, it not only displays errors; it also gives recommendations on how to fix the existing issues.
 
oms9
 
Collect Windows Logs for Analysis
 
Using OMS, we also can collect Windows logs and use OMS analysis capabilities on them. When this is enabled, OMS space usage and bandwidth usage on the organization's end will be higher. In order to collect logs,
 
1) Log in to OMS portal
2) Go to Settings > Data > Windows Event Log
3) In the box, you can search for the relevant log file name and add it to the list. We also can select which types of events to extract. Once the selection is made, click Save.
 
oms10
 
4) After a few minutes, you can start to see the events under the Log Search option. In there, using queries, we can filter the data. We can also set up email alerts based on the events.
 
oms11
 
I believe you now have basic knowledge of how to use OMS to monitor an AD environment. There are lots of things we can do with OMS and I will cover those in future posts.
 
This marks the end of this blog post. If you have any questions feel free to contact me on rebeladm@live.com also follow me on twitter @rebeladm to get updates about new blog posts.  

Active Directory Lingering objects

If you are maintaining a healthy AD infrastructure it is very unlikely you will see lingering objects in AD. Let's assume a Domain Controller has been disconnected from the Active Directory environment and stayed offline for longer than the time specified by the tombstone lifetime attribute. Then it was reconnected to the replication topology. The objects which were deleted from Active Directory while that particular domain controller stayed offline will remain on it as lingering objects.

When an object is deleted on one domain controller, it replicates to the other domain controllers as a tombstone object. It contains a few attribute values but it cannot be used for active operations. It remains on the Domain Controllers until it reaches the time specified by the tombstone lifetime value, after which the tombstone object is permanently deleted from the directory. The tombstone lifetime value is a forest-wide setting and depends on the operating system in use. For operating systems after Windows Server 2003, the default tombstone lifetime is 180 days.
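The current forest-wide value can be read from the configuration partition; a minimal sketch (a blank result means the attribute was never set and the old 60-day default applies):

```powershell
# Read the forest-wide tombstone lifetime from the configuration partition
$configNC = (Get-ADRootDSE).configurationNamingContext
Get-ADObject "CN=Directory Service,CN=Windows NT,CN=Services,$configNC" `
    -Properties tombstoneLifetime | Select-Object tombstoneLifetime
```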

The problem happens when the Domain Controller holding lingering objects is involved in outbound replication. In such a situation, one of the following can happen.

If the destination domain controller has strict replication consistency enabled, it will halt inbound replication from that particular Domain Controller.

If the destination domain controller has strict replication consistency disabled, it will request the full replica and the lingering objects will be reintroduced to the directory.

Events 1388, 1988 and 2042 are clues to lingering objects in an Active Directory infrastructure.

Event 1388 – Another domain controller (DC) has attempted to replicate into this DC an object which is not present in the local Active Directory Domain Services database. The object may have been deleted and already garbage collected (a tombstone lifetime or more has past since the object was deleted) on this DC. The attribute set included in the update request is not sufficient to create the object. The object will be re-requested with a full attribute set and re-created on this DC. Source DC (Transport-specific network address): xxxxxxxxxxxxxxxxx._msdcs.contoso.com Object: CN=xxxx,CN=xxx,DC=xxxx,DC=xxx Object GUID: xxxxxxxxxxxxx Directory partition: DC=xxxx,DC=xx Destination highest property USN: xxxxxx

Event 1988 – Active Directory Domain Services Replication encountered the existence of objects in the following partition that have been deleted from the local domain controllers (DCs) Active Directory Domain Services database. Not all direct or transitive replication partners replicated in the deletion before the tombstone lifetime number of days passed. Objects that have been deleted and garbage collected from an Active Directory Domain Services partition but still exist in the writable partitions of other DCs in the same domain, or read-only partitions of global catalog servers in other domains in the forest are known as "lingering objects". This event is being logged because the source DC contains a lingering object which does not exist on the local DCs Active Directory Domain Services database.

This replication attempt has been blocked. The best solution to this problem is to identify and remove all lingering objects in the forest. Source DC (Transport-specific network address): xxxxxxxxxxxxxx._msdcs.contoso.com Object: CN=xxxxxx,CN=xxxxx,DC=xxxxxx,DC=xxx Object GUID: xxxxxxxxxxxx

Event 2042 – It has been too long since this machine last replicated with the named source machine. The time between replications with this source has exceeded the tombstone lifetime. Replication has been stopped with this source. The reason that replication is not allowed to continue is that the two machine's views of deleted objects may now be different. The source machine may still have copies of objects that have been deleted (and garbage collected) on this machine. If they were allowed to replicate, the source machine might return objects which have already been deleted. Time of last successful replication: <date> <time> Invocation ID of source: <Invocation ID> Name of source: <GUID>._msdcs.<domain> Tombstone lifetime (days): <TSL number in days> The replication operation has failed.

Strict replication consistency

This setting is controlled by a registry value. On Windows Server 2003 and later it is enabled by default. The value can be found under HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Services\NTDS\Parameters

lin1
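The setting can also be checked or changed from PowerShell; a minimal sketch using the registry value name shown above (1 = enabled, 0 = disabled):

```powershell
# Enable strict replication consistency on this domain controller
Set-ItemProperty -Path 'HKLM:\SYSTEM\CurrentControlSet\Services\NTDS\Parameters' `
    -Name 'Strict Replication Consistency' -Value 1 -Type DWord
```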

Removing lingering objects

Lingering objects can be removed using:

repadmin /removelingeringobjects <faulty DC name> <reference DC GUID> <directory partition>

In the preceding command:

faulty DC name – the DC which contains the lingering objects

reference DC GUID – the GUID of a DC which holds an up-to-date database that can be used as a reference

directory partition – the directory partition where the lingering objects are contained
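As a hypothetical example (the server names, GUID and partition below are placeholders, not taken from a real environment):

```powershell
# Remove lingering objects from DC02, using DC01's DSA object GUID
# as the reference, for the contoso.com domain partition.
# Run with /advisory_mode first to log what would be removed
# without actually deleting anything.
repadmin /removelingeringobjects DC02.contoso.com 0b457f73-96a4-429b-ba81-1a3e0f51c848 "DC=contoso,DC=com"
```

The reference DC's DSA object GUID can be found in the output of repadmin /showrepl against the healthy DC.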

This marks the end of this blog post. If you have any questions feel free to contact me on rebeladm@live.com also follow me on twitter @rebeladm to get updates about new blog posts.  

Azure AD Connect Staging Mode

Azure AD Connect is the tool used to connect an on-premises directory service with Azure AD. It allows users to use the same on-premises IDs and passwords to authenticate to Azure AD, Office 365 or other applications hosted in Azure. Azure AD Connect can be installed on any server if it meets the following requirements:

The AD forest functional level must be Windows Server 2003 or later. 

If you plan to use the feature password writeback, then the Domain Controllers must be on Windows Server 2008 (with latest SP) or later. If your DCs are on 2008 (pre-R2), then you must also apply hotfix KB2386717.

The domain controller used by Azure AD must be writable. It is not supported to use a RODC (read-only domain controller) and Azure AD Connect does not follow any write redirects.

It is not supported to use on-premises forests/domains using SLDs (Single Label Domains).

It is not supported to use on-premises forests/domains using "dotted" (name contains a period ".") NetBios names.

Azure AD Connect cannot be installed on Small Business Server or Windows Server Essentials. The server must be using Windows Server standard or better.

The Azure AD Connect server must have a full GUI installed. It is not supported to install on server core.

Azure AD Connect must be installed on Windows Server 2008 or later. This server may be a domain controller or a member server when using express settings. If you use custom settings, then the server can also be stand-alone and does not have to be joined to a domain.

If you install Azure AD Connect on Windows Server 2008 or Windows Server 2008 R2, then make sure to apply the latest hotfixes from Windows Update. The installation is not able to start with an unpatched server.

If you plan to use the feature password synchronization, then the Azure AD Connect server must be on Windows Server 2008 R2 SP1 or later.

If you plan to use a group managed service account, then the Azure AD Connect server must be on Windows Server 2012 or later.

The Azure AD Connect server must have .NET Framework 4.5.1 or later and Microsoft PowerShell 3.0 or later installed.

If Active Directory Federation Services is being deployed, the servers where AD FS or Web Application Proxy are installed must be Windows Server 2012 R2 or later. Windows remote management must be enabled on these servers for remote installation.

If Active Directory Federation Services is being deployed, you need SSL Certificates.

If Active Directory Federation Services is being deployed, then you need to configure name resolution.

If your global administrators have MFA enabled, then the URL https://secure.aadcdn.microsoftonline-p.com must be in the trusted sites list. You are prompted to add this site to the trusted sites list when you are prompted for an MFA challenge and it has not been added before. You can use Internet Explorer to add it to your trusted sites.

Azure AD Connect requires a SQL Server database to store identity data. By default a SQL Server 2012 Express LocalDB (a light version of SQL Server Express) is installed. SQL Server Express has a 10GB size limit that enables you to manage approximately 100,000 objects. If you need to manage a higher volume of directory objects, you need to point the installation wizard to a different installation of SQL Server.

What is staging mode? 
 
At any given time, only one Azure AD Connect instance can be involved in the sync process for a directory. But this brings a few challenges.
 
Disaster Recovery – If the server running Azure AD Connect is involved in a disaster, it is going to impact the sync process. This can be worse if you use features such as password pass-through, single sign-on or password writeback through AD Connect.
Upgrades – If the system running Azure AD Connect needs an upgrade, or if Azure AD Connect itself needs an upgrade, it will impact the sync process. Again, the affordable downtime will depend on the features and the organization's dependencies on Azure AD Connect and its operations. 
Testing New Features – Microsoft keeps adding new features to Azure AD Connect. Before introducing those to production, it is always good to simulate them and see their impact. But with only one instance, that is not possible. Even if you have a demo environment, it may not simulate the same impact as production on some occasions. 
 
Microsoft introduced the staging mode of Azure AD Connect to overcome the above challenges. With staging mode, you can maintain another Azure AD Connect instance on another server. It can have the same config as the primary server. It will connect to Azure AD, receive changes and keep a latest copy to make sure the switchover is as seamless as possible. However, it will not sync the Azure AD Connect configuration from the primary server; it is the engineer's responsibility to update the staging server's AD Connect configuration if the primary server's AD Connect config is modified.
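Once both servers are running, the staging flag can be confirmed with the Get-ADSyncScheduler cmdlet that ships with Azure AD Connect; a minimal sketch:

```powershell
# StagingModeEnabled should be True on the staging server
# and False on the active one
Get-ADSyncScheduler | Select-Object StagingModeEnabled, SyncCycleEnabled
```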
 
Installation
 
Let’s see how we can configure Azure AD connect in staging mode.
 
1) Prepare a server according to the guidelines given in the prerequisites section to install Azure AD Connect. 
2) Review the current configuration of Azure AD Connect running on the primary server. You can check this via Azure AD Connect | View current configuration.

sta1
 
sta2
 
3) Log in to the server as Domain Administrator and download the latest Azure AD Connect from https://www.microsoft.com/en-us/download/details.aspx?id=47594
4) During the installation, select the Customize option.
 
sta3
 
5) Then proceed with the configuration according to the settings used on the primary server. 
6) At the last step of the configuration, select Enable staging mode: When selected, synchronization will not export any data to AD or Azure AD, and then click Install.
 
sta4
 
7) Once the installation has completed, in the Synchronization Service (Azure AD Connect | Synchronization Service) we can confirm there are no sync jobs.
 
sta5
 
Verify data
 
As I mentioned before, the staging server allows us to simulate an export before making it the primary. This is important when implementing new configuration changes.
 
In order to prepare a staged copy of export, 
 
1) Go to Start | Azure AD Connect | Synchronization Service | Connectors 
 
sta6
 
2) Select the Active Directory Domain Services connector and click on Run from the right-hand panel. 
 
sta7
 
3) Then in the next window select Full Import and click OK.
 
sta8
 
4) Repeat the same for Windows Azure Active Directory (Microsoft). 
5) Once both jobs have completed, select the Active Directory Domain Services connector and click Run from the right-hand panel again. But this time select Delta Synchronization, and click OK.
 
sta9
 
6) Repeat the same for Windows Azure Active Directory (Microsoft).
7) Once both jobs have finished, go to the Operations tab and verify the jobs completed successfully.
 
sta10
 
Now we have the staging copy; the next step is to verify the data is presented as expected. To do that we need the help of a PowerShell script.

 
Param(
    [Parameter(Mandatory=$true, HelpMessage="Must be a file generated using csexport 'Name of Connector' export.xml /f:x)")]
    [string]$xmltoimport="%temp%\exportedStage1a.xml",
    [Parameter(Mandatory=$false, HelpMessage="Maximum number of users per output file")][int]$batchsize=1000,
    [Parameter(Mandatory=$false, HelpMessage="Show console output")][bool]$showOutput=$false
)

#LINQ isn't loaded automatically, so force it
[Reflection.Assembly]::Load("System.Xml.Linq, Version=3.5.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089") | Out-Null

[int]$count=1
[int]$outputfilecount=1
[array]$objOutputUsers=@()

#XML must be generated using "csexport "Name of Connector" export.xml /f:x"
write-host "Importing XML" -ForegroundColor Yellow

#XmlReader.Create won't properly resolve the file location,
#so expand and then resolve it
$resolvedXMLtoimport=Resolve-Path -Path ([Environment]::ExpandEnvironmentVariables($xmltoimport))

#use an XmlReader to deal with even large files
$result=$reader = [System.Xml.XmlReader]::Create($resolvedXMLtoimport) 
$result=$reader.ReadToDescendant('cs-object')
do 
{
    #create the object placeholder
    #adding them up here means we can enforce consistency
    $objOutputUser=New-Object psobject
    Add-Member -InputObject $objOutputUser -MemberType NoteProperty -Name ID -Value ""
    Add-Member -InputObject $objOutputUser -MemberType NoteProperty -Name Type -Value ""
    Add-Member -inputobject $objOutputUser -MemberType NoteProperty -Name DN -Value ""
    Add-Member -inputobject $objOutputUser -MemberType NoteProperty -Name operation -Value ""
    Add-Member -inputobject $objOutputUser -MemberType NoteProperty -Name UPN -Value ""
    Add-Member -inputobject $objOutputUser -MemberType NoteProperty -Name displayName -Value ""
    Add-Member -inputobject $objOutputUser -MemberType NoteProperty -Name sourceAnchor -Value ""
    Add-Member -inputobject $objOutputUser -MemberType NoteProperty -Name alias -Value ""
    Add-Member -inputobject $objOutputUser -MemberType NoteProperty -Name primarySMTP -Value ""
    Add-Member -inputobject $objOutputUser -MemberType NoteProperty -Name onPremisesSamAccountName -Value ""
    Add-Member -inputobject $objOutputUser -MemberType NoteProperty -Name mail -Value ""

    $user = [System.Xml.Linq.XElement]::ReadFrom($reader)
    if ($showOutput) {Write-Host Found an exported object... -ForegroundColor Green}

    #object id
    $outID=$user.Attribute('id').Value
    if ($showOutput) {Write-Host ID: $outID}
    $objOutputUser.ID=$outID

    #object type
    $outType=$user.Attribute('object-type').Value
    if ($showOutput) {Write-Host Type: $outType}
    $objOutputUser.Type=$outType

    #dn
    $outDN= $user.Element('unapplied-export').Element('delta').Attribute('dn').Value
    if ($showOutput) {Write-Host DN: $outDN}
    $objOutputUser.DN=$outDN

    #operation
    $outOperation= $user.Element('unapplied-export').Element('delta').Attribute('operation').Value
    if ($showOutput) {Write-Host Operation: $outOperation}
    $objOutputUser.operation=$outOperation

    #now that we have the basics, go get the details

    foreach ($attr in $user.Element('unapplied-export-hologram').Element('entry').Elements("attr"))
    {
        $attrvalue=$attr.Attribute('name').Value
        $internalvalue= $attr.Element('value').Value

        switch ($attrvalue)
        {
            "userPrincipalName"
            {
                if ($showOutput) {Write-Host UPN: $internalvalue}
                $objOutputUser.UPN=$internalvalue
            }
            "displayName"
            {
                if ($showOutput) {Write-Host displayName: $internalvalue}
                $objOutputUser.displayName=$internalvalue
            }
            "sourceAnchor"
            {
                if ($showOutput) {Write-Host sourceAnchor: $internalvalue}
                $objOutputUser.sourceAnchor=$internalvalue
            }
            "alias"
            {
                if ($showOutput) {Write-Host alias: $internalvalue}
                $objOutputUser.alias=$internalvalue
            }
            "proxyAddresses"
            {
                if ($showOutput) {Write-Host primarySMTP: ($internalvalue -replace "SMTP:","")}
                $objOutputUser.primarySMTP=$internalvalue -replace "SMTP:",""
            }
        }
    }

    $objOutputUsers += $objOutputUser

    Write-Progress -activity "Processing ${xmltoimport} in batches of ${batchsize}" -status "Batch ${outputfilecount}: " -percentComplete (($objOutputUsers.Count / $batchsize) * 100)

    #every so often, dump the processed users in case we blow up somewhere
    if ($count % $batchsize -eq 0)
    {
        Write-Host Hit the maximum users processed without completion... -ForegroundColor Yellow

        #export the collection of users as a CSV
        Write-Host Writing processedusers${outputfilecount}.csv -ForegroundColor Yellow
        $objOutputUsers | Export-Csv -path processedusers${outputfilecount}.csv -NoTypeInformation

        #increment the output file counter
        $outputfilecount+=1

        #reset the collection and the user counter
        $objOutputUsers = @()
        $count=0
    }

    $count+=1

    #need to bail out of the loop if no more users to process
    if ($reader.NodeType -eq [System.Xml.XmlNodeType]::EndElement)
    {
        break
    }

} while ($reader.Read())

#need to write out any users that didn't get picked up in a batch of 1000
#export the collection of users as a CSV
Write-Host Writing processedusers${outputfilecount}.csv -ForegroundColor Yellow
$objOutputUsers | Export-Csv -path processedusers${outputfilecount}.csv -NoTypeInformation

 
Save this script as a .ps1 file on the C: drive. 
 
1) Open PowerShell and type cd "C:\Program Files\Microsoft Azure AD Sync\Bin" (if your install path is different, use the relevant path)
2) Then run .\csexport "myrebeladmin.onmicrosoft.com – AAD" C:\export.xml /f:x. Replace myrebeladmin.onmicrosoft.com – AAD with your own Azure AD connector name. This will export the configuration to C:\export.xml
3) Then type .\analyze.ps1 -xmltoimport C:\export.xml, where analyze.ps1 is the script we saved at the beginning of this section. 
4) It will then create a CSV file called processedusers1.csv, which contains all the changes that will sync to Azure AD. 
 
However, this step is not always required. The server can be made primary without the import and verify process. 
 
How to make it the primary server?
 
In order to make the staging server the primary server,
 
1) Go to Start | Azure AD Connect | Azure AD Connect
2) Then click on Configure on the next page. 
3) On the next page, select the option Configure staging mode and click Next
 
 
sta11
 
4) On the next page, provide the Azure AD login credentials for the directory sync account. 
5) In the next window, untick Enable staging mode and click Next
 
sta12
 
6) In the next window, select Start the synchronization process… and click Configure
 
sta13
 
This completes the process of promoting the staging server to primary. Hope this was useful and if you have any questions feel free to contact me on rebeladm@live.com; also follow me on Twitter @rebeladm to get updates about new blog posts.

What is Content Freshness protection in DFSR?

Healthy replication is a must for an Active Directory environment. The SYSVOL folder on domain controllers contains policies and logon scripts, and it is replicated between domain controllers to keep the configuration consistent and up to date. Before Windows Server 2008, FRS (File Replication Service) was used to replicate SYSVOL content among domain controllers. With Windows Server 2008, FRS was deprecated and DFS (Distributed File System) Replication was introduced in its place.

Healthy replication requires healthy communication between domain controllers. Sometimes that communication is interrupted by a domain controller failure or a link failure. Depending on the impact, the communication may be re-established after a period of time, and replication will then try to resume and catch up with SYSVOL changes. In such a scenario, we may see event 4012 in Event Viewer. 

The DFS Replication service stopped replication on the replicated folder at local path c:\xxx. It has been disconnected from other partners for 70 days, which is longer than the MaxOfflineTimeInDays parameter. Because of this, DFS Replication considers this data to be stale, and will replace it with data from other members of the replication group during the next replication. DFS Replication will move the stale files to the local Conflict folder. No user action is required.

With Windows Server 2008, Microsoft introduced a setting called content freshness protection to protect DFS shares from stale data. DFSR uses a multi-master database, similar to Active Directory, and it also has a tombstone time limit, with a default value of 60 days. If replication stops for longer than that and then resumes later, stale data can be reintroduced; this is similar to lingering objects in AD. To protect against this, we can define a value for MaxOfflineTimeInDays. If the number of days since the last successful DFS replication is larger than MaxOfflineTimeInDays, replication is blocked. 

We can review this value by running,

For /f %m IN ('dsquery server -o rdn') do @echo %m && @wmic /node:"%m" /namespace:\\root\microsoftdfs path DfsrMachineConfig get MaxOfflineTimeInDays

cf1

There are two ways to recover from this. The first method is to increase the value of MaxOfflineTimeInDays. It can be done using,

wmic.exe /namespace:\\root\microsoftdfs path DfsrMachineConfig set MaxOfflineTimeInDays=120

cf2

It is recommended to run this on all domain controllers to maintain the same configuration. 
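If you prefer PowerShell over wmic, the same check and change can be sketched with the CIM cmdlets. This is a sketch only; it assumes the ActiveDirectory RSAT module and WinRM access to each domain controller, and the value 120 matches the wmic example above.

```powershell
# Sketch: review and set MaxOfflineTimeInDays on every domain controller.
# Assumes the ActiveDirectory module and WinRM connectivity to each DC.
Import-Module ActiveDirectory

$newValue = 120

foreach ($dc in (Get-ADDomainController -Filter *)) {
    # DfsrMachineConfig lives in the root\microsoftdfs WMI namespace
    $config = Get-CimInstance -ComputerName $dc.HostName `
        -Namespace 'root\microsoftdfs' -ClassName 'DfsrMachineConfig'

    Write-Host "$($dc.HostName): MaxOfflineTimeInDays = $($config.MaxOfflineTimeInDays)"

    # Apply the new value so all DCs share the same configuration
    $config | Set-CimInstance -Property @{ MaxOfflineTimeInDays = $newValue }
}
```

Running only the read portion first (without the Set-CimInstance line) is a safe way to confirm the current values before changing anything.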

If you are not willing to change this value, you can still recover using a non-authoritative restore. It will remove all conflicting values and take an updated copy. 

I have already written an article about non-authoritative restore of SYSVOL; it can be found at http://www.rebeladmin.com/2017/08/non-authoritative-authoritative-sysvol-restore-dfs-replication/ 

This is not only for SYSVOL replication; it is valid for DFS replication in general. 

Hope this was useful and if you have any questions feel free to contact me on rebeladm@live.com; also follow me on Twitter @rebeladm to get updates about new blog posts.

Step-by-Step guide to setup Fine-Grained Password Policies

In an AD environment, we can use password policies to define password security requirements. These settings are located under Computer Configuration | Policies | Windows Settings | Security Settings | Account Policies

fine1

Before Windows Server 2008, only one password policy could be applied to users. But in an environment, different user roles can require different levels of protection. As an example, an 8-character complex password may be adequate for sales users, but it is not enough for a domain admin account. With Windows Server 2008, Microsoft introduced Fine-Grained Password Policies, which allow different password policies to be applied to different users and groups. In order to use this feature, 

1) Your domain functional level should be at least Windows Server 2008.

2) You need a Domain/Enterprise Admin account to create policies. 

Similar to group policies, objects may sometimes end up with multiple password policies applied to them, but at any given time an object can only have one effective password policy. Each Fine-Grained Password Policy has a precedence value, an integer that is defined during policy setup. A lower precedence value means a higher priority. If multiple policies are applied to an object, the policy with the lowest precedence value wins. Also, a policy linked directly to a user object always wins. 
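Because of these precedence rules, it is often useful to check which policy is actually in effect for a particular user. A minimal sketch using the AD module (the account name below is an example):

```powershell
# Show the effective fine-grained password policy for a user.
# If nothing is returned, no fine-grained policy applies and the
# Default Domain Policy settings are in effect for that user.
Import-Module ActiveDirectory

Get-ADUserResultantPasswordPolicy -Identity "R143869" |
    Select-Object Name, Precedence, MinPasswordLength, MaxPasswordAge
```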

We can create the policies using Active Directory Administrative Center or PowerShell. In this demo, I am going to use the PowerShell method. 

New-ADFineGrainedPasswordPolicy -Name "Tech Admin Password Policy" -Precedence 1 `
-MinPasswordLength 12 -MaxPasswordAge "30" -MinPasswordAge "7" `
-PasswordHistoryCount 50 -ComplexityEnabled:$true `
-LockoutDuration "8:00" `
-LockoutObservationWindow "8:00" -LockoutThreshold 3 `
-ReversibleEncryptionEnabled:$false

In the above sample, I am creating a new fine-grained password policy called “Tech Admin Password Policy”. New-ADFineGrainedPasswordPolicy is the cmdlet to create a new policy, and Precedence defines its precedence. The LockoutDuration and LockoutObservationWindow values are defined in hours. The LockoutThreshold value defines the number of failed login attempts allowed. 

More info about the syntax can be found using,

Get-Help New-ADFineGrainedPasswordPolicy

You can also view examples using,

Get-Help New-ADFineGrainedPasswordPolicy -Examples

fine2

Once the policy is set up, we can verify its settings using, 

Get-ADFineGrainedPasswordPolicy –Identity “Tech Admin Password Policy” 

fine3

Now we have the policy in place. The next step is to attach it to groups or users. In my demo, I am going to apply it to a group called “IT Admins”.

Add-ADFineGrainedPasswordPolicySubject -Identity "Tech Admin Password Policy" -Subjects "IT Admins"

I am also going to attach it to a user account, R143869.

Add-ADFineGrainedPasswordPolicySubject -Identity "Tech Admin Password Policy" -Subjects "R143869"

We can verify the policy assignment using the following,

Get-ADFineGrainedPasswordPolicy -Identity "Tech Admin Password Policy" | Format-Table AppliesTo –AutoSize

fine4

This confirms the configuration. Hope this was useful and if you have any questions feel free to contact me on rebeladm@live.com; also follow me on Twitter @rebeladm to get updates about new blog posts.

When AD password will expire?

In an Active Directory environment, users have to update their passwords when they expire. On some occasions, it is important to know when a user's password will expire.

For a user account, the value for the next password change is saved under the attribute msDS-UserPasswordExpiryTimeComputed.

We can view this value for a user account using a PowerShell command like the following, 

Get-ADuser R564441 -Properties msDS-UserPasswordExpiryTimeComputed | select Name, msDS-UserPasswordExpiryTimeComputed 

In the above command, I am retrieving the msDS-UserPasswordExpiryTimeComputed attribute for the user R564441. In the output, I am listing the value of the Name attribute and msDS-UserPasswordExpiryTimeComputed.

ex1

In my example, it returned 131412469385705537, which doesn't mean anything by itself. This is a Windows FILETIME value (the number of 100-nanosecond intervals since January 1, 1601), so we need to convert it to a readable format. 

I can do it using,

Get-ADuser R564441 -Properties msDS-UserPasswordExpiryTimeComputed | select Name, {[datetime]::FromFileTime($_."msDS-UserPasswordExpiryTimeComputed")}

In the above, the value was converted to datetime format and now gives a readable value. 

ex2
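Building on this conversion, we can also work out how many days are left before the password expires. A minimal sketch, using the same example account:

```powershell
# Sketch: days remaining until a user's password expires
Import-Module ActiveDirectory

$user = Get-ADUser R564441 -Properties "msDS-UserPasswordExpiryTimeComputed"

# The attribute is a Windows FILETIME value (100-ns intervals since 1601-01-01)
$expiry   = [datetime]::FromFileTime($user."msDS-UserPasswordExpiryTimeComputed")
$daysLeft = (New-TimeSpan -Start (Get-Date) -End $expiry).Days

Write-Host "Password for $($user.SamAccountName) expires on $expiry ($daysLeft days left)"
```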

We can develop this further to provide a report or send automatic reminders to users. I wrote the following PowerShell script to generate a report covering all the users in AD. 

$passwordexpired = $null
$dc = (Get-ADDomain | Select DNSRoot).DNSRoot
$Report = "C:\report.html"

$HTML=@"
<title>Password Validity Period For $dc</title>
<style>
BODY{background-color :LightBlue}
</style>
"@

$passwordexpired = Get-ADUser -filter * -Properties "SamAccountName","pwdLastSet","msDS-UserPasswordExpiryTimeComputed" | Select-Object -Property "SamAccountName",@{Name="Last Password Change";Expression={[datetime]::FromFileTime($_."pwdLastSet")}},@{Name="Next Password Change";Expression={[datetime]::FromFileTime($_."msDS-UserPasswordExpiryTimeComputed")}}

$passwordexpired | ConvertTo-Html -Property "SamAccountName","Last Password Change","Next Password Change" -head $HTML -body "<H2> Password Validity Period For $dc</H2>" | Out-File $Report

Invoke-Item $Report

This creates an HTML report like the following. It contains the user name, the date and time of the last password change, and the date and time the password is going to expire. 

ex3

The attributes I used here are SamAccountName, pwdLastSet and msDS-UserPasswordExpiryTimeComputed. The pwdLastSet attribute holds the date and time of the last password reset. 
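The same data can feed automatic reminders instead of a report. The sketch below mails users whose passwords expire within the next 7 days; the SMTP server name and the From address are placeholder values you would replace with your own.

```powershell
# Sketch: e-mail users whose passwords expire within the next 7 days.
# smtp.yourdomain.com and the From address are placeholders.
Import-Module ActiveDirectory

$threshold = (Get-Date).AddDays(7)

Get-ADUser -Filter {Enabled -eq $true} -Properties mail,"msDS-UserPasswordExpiryTimeComputed" |
    ForEach-Object {
        $raw = $_."msDS-UserPasswordExpiryTimeComputed"

        # Accounts set to "password never expires" report a sentinel value
        # that FromFileTime cannot convert, so skip anything out of range
        if ($raw -and $raw -lt [datetime]::MaxValue.ToFileTime()) {
            $expiry = [datetime]::FromFileTime($raw)
            if ($expiry -gt (Get-Date) -and $expiry -lt $threshold -and $_.mail) {
                Send-MailMessage -To $_.mail -From "helpdesk@yourdomain.com" `
                    -Subject "Your password expires on $expiry" `
                    -Body "Please change your password before it expires." `
                    -SmtpServer "smtp.yourdomain.com"
            }
        }
    }
```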

Hope this was useful and if you have any questions feel free to contact me on rebeladm@live.com; also follow me on Twitter @rebeladm to get updates about new blog posts. 

Non-Authoritative and Authoritative SYSVOL Restore (DFS Replication)

Healthy SYSVOL replication is key for every Active Directory infrastructure. When there are SYSVOL replication issues, you may notice,

1. Users and systems are not applying their group policy settings properly. 

2. New group policies are not applying to certain users and systems. 

3. Group policy object counts differ between domain controllers (inside the SYSVOL folders)

4. Logon scripts are not processing correctly

Also, if you look into Event Viewer at the same time, you may find events such as,

Event ID 2213

The DFS Replication service stopped replication on volume C:. This occurs when a DFSR JET database is not shut down cleanly and Auto Recovery is disabled. To resolve this issue, back up the files in the affected replicated folders, and then use the ResumeReplication WMI method to resume replication.

Recovery Steps

1. Back up the files in all replicated folders on the volume. Failure to do so may result in data loss due to unexpected conflict resolution during the recovery of the replicated folders.

2. To resume the replication for this volume, use the WMI method ResumeReplication of the DfsrVolumeConfig class. For example, from an elevated command prompt, type the following command:

wmic /namespace:\\root\microsoftdfs path dfsrVolumeConfig where volumeGuid="xxxxxxxx" call ResumeReplication

Event ID 5002

The DFS Replication service encountered an error communicating with partner <FQDN> for replication group Domain System Volume.

Event ID 5008

The DFS Replication service failed to communicate with partner <FQDN> for replication group Home-Replication. This error can occur if the host is unreachable, or if the DFS Replication service is not running on the server.

Event ID 5014

The DFS Replication service is stopping communication with partner <FQDN> for replication group Domain System Volume due to an error. The service will retry the connection periodically.

Some of these errors can be fixed with a simple server reboot or by running the commands described in the error (e.g. the event 2213 description), but if they keep recurring we need to do a Non-Authoritative or Authoritative SYSVOL restore.

Non-Authoritative Restore 

If only one or a few domain controllers (less than 50%) have replication issues at a given time, we can do a non-authoritative restore. In that scenario, the system will replicate the SYSVOL from the PDC. 

Authoritative Restore

If more than 50% of domain controllers have SYSVOL replication issues, it is possible that the entire SYSVOL got corrupted. In such a scenario, we need to go for an authoritative restore. In this process, we first restore SYSVOL from backup to the PDC, and then force all the other domain controllers to update their SYSVOL copy from the copy on the PDC. 

SYSVOL can be replicated using FRS too. FRS is deprecated since Windows Server 2008, but if you migrated from an older Active Directory environment you may still have FRS for SYSVOL replication. It also supports non-authoritative and authoritative restores, but in this demo I am only going to cover SYSVOL with DFS Replication. 

Non-Authoritative DFS Replication 

In order to perform a non-authoritative replication,

1) Back up the existing SYSVOL – this can be done by copying the SYSVOL folder from the domain controller which has the DFS replication issue to a secure location. 

2) Log in to Domain Controller as Domain Admin/Enterprise Admin

3) Launch ADSIEDIT.MSC tool and connect to Default Naming Context

sys1

4) Browse to DC=domain,DC=local > OU=Domain Controllers > CN=(DC NAME) > CN=DFSR-LocalSettings > Domain System Volume > SYSVOL Subscription

5) Change the value of the attribute msDFSR-Enabled to FALSE

sys2

6) Force the AD replication using,

repadmin /syncall /AdP

7) Install the DFS management tools (unless they are already installed) using, 

Add-WindowsFeature RSAT-DFS-Mgmt-Con

8) Run the following command to update the DFSR global state,

dfsrdiag PollAD

9) Search for the event 4114 to confirm SYSVOL replication is disabled. 

Get-EventLog -Log "DFS Replication" | where {$_.eventID -eq 4114} | fl

10) Change the attribute value back to msDFSR-Enabled=TRUE (step 5)

11) Force the AD replication as in step 6

12) Update the DFSR global state by running the command in step 8

13) Search for events 4614 and 4604 to confirm successful non-authoritative synchronization. 

sys3

All these commands should be run from the domain controllers being set as non-authoritative. 
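For reference, the ADSIEDIT steps above can also be scripted with Set-ADObject instead of editing the attribute by hand. This is a rough sketch only, to be run on the affected domain controller, and you should still verify events 4114 and then 4614/4604 between the two phases as described above.

```powershell
# Sketch of the non-authoritative restore flow, run on the affected DC.
Import-Module ActiveDirectory

$dcName       = $env:COMPUTERNAME
$domainDN     = (Get-ADDomain).DistinguishedName
$subscription = "CN=SYSVOL Subscription,CN=Domain System Volume," +
                "CN=DFSR-LocalSettings,CN=$dcName,OU=Domain Controllers,$domainDN"

# Phase 1: disable the SYSVOL subscription and publish the change
Set-ADObject -Identity $subscription -Replace @{ 'msDFSR-Enabled' = $false }
repadmin /syncall /AdP
dfsrdiag PollAD

# ...wait for event 4114 in the DFS Replication log before continuing...

# Phase 2: re-enable and trigger the non-authoritative sync
Set-ADObject -Identity $subscription -Replace @{ 'msDFSR-Enabled' = $true }
repadmin /syncall /AdP
dfsrdiag PollAD
```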

Authoritative DFS Replication 

In order to initiate an authoritative DFS replication,

1) Log in to the PDC (the PDC Emulator FSMO role holder) as Domain Administrator or Enterprise Administrator

2) Stop the DFS Replication service (this is recommended on all the domain controllers)

3) Launch ADSIEDIT.MSC tool and connect to Default Naming Context

4) Browse to DC=domain,DC=local > OU=Domain Controllers > CN=(DC NAME) > CN=DFSR-LocalSettings > Domain System Volume > SYSVOL Subscription

5) Update the given attribute values as follows, 

msDFSR-Enabled=FALSE

msDFSR-options=1

sys4

6) Modify the following attribute on ALL other domain controllers,

msDFSR-Enabled=FALSE

7) Force the AD replication using,

repadmin /syncall /AdP

8) Start the DFS Replication service on the PDC

9) Search for the event 4114 to verify SYSVOL replication is disabled.

10) Change the following value, which was set in step 5,

msDFSR-Enabled=TRUE

11) Force the AD replication using,

repadmin /syncall /AdP

12) Run the following command to update the DFSR global state,

dfsrdiag PollAD

13) Search for the event 4602 and verify the successful SYSVOL replication. 

14) Start the DFS Replication service on all other domain controllers

15) Search for the event 4114 to verify SYSVOL replication is disabled.

16) Change the following value, which was set in step 6. This needs to be done on ALL other domain controllers. 

msDFSR-Enabled=TRUE

17) Run the following command to update the DFSR global state,

dfsrdiag PollAD

18) Search for events 4614 and 4604 to confirm successful authoritative synchronization. 

Please note that you do not need to run an authoritative restore for every DFS replication issue. It should be the last resort.

Hope this was useful and if you have any questions feel free to contact me on rebeladm@live.com 

Step-by-Step Guide to enable Azure AD Domain Services

Azure AD, Azure AD Domain Services, on-premises Active Directory, AD sync… all these terms now appear in most infrastructure projects. The questions I get through the blog also show that engineers still struggle with how to implement Azure services for their needs and how to get the best benefits from them. So this article is part of a series I am writing to cover Azure AD related services and how to use them to enhance your current infrastructure operations.

Azure AD Domain Services

Azure AD Domain Services has been in preview for a while now (about 6 months). It is a managed domain service which provides group policy, LDAP, and NTLM/Kerberos authentication without the need for a “domain controller” in your Azure cloud setup.

If you have a “cloud-only” setup with Azure, this service will allow you to manage your Azure identities more effectively. You can deploy Azure AD Domain Services into the same virtual network where your other IaaS workloads run. These VMs can then join the managed domain like typical domain-joined servers and be controlled centrally. You can also apply group policies if you like.

If it is a hybrid setup, you can sync your on-premises identities to the cloud and use them along with the Azure IaaS workloads.

These are the main features of Azure Active Directory Domain Services (From: https://azure.microsoft.com/en-gb/documentation/articles/active-directory-ds-features/)

•    Simple deployment experience: You can enable Azure AD Domain Services for your Azure AD tenant using just a few clicks. Regardless of whether your Azure AD tenant is a cloud-tenant or synchronized with your on-premises directory, your managed domain can be provisioned quickly.
•    Support for domain-join: You can easily domain join computers in the Azure virtual network that Azure AD Domain Services is available in. The domain join experience on Windows client and Server operating systems works seamlessly against domains serviced by Azure AD Domain Services. You can also use automated domain join tooling against such domains.
•    One domain instance per Azure AD directory: You can create a single Active Directory domain for each Azure AD directory.
•    Create domains with custom names: You can create domains with custom names (eg. contoso.local) using Azure AD Domain Services. This includes both verified as well as unverified domain names. Optionally, you can also create a domain with the built-in domain suffix (i.e. *.onmicrosoft.com) that is offered by your Azure AD directory.
•    Integrated with Azure AD: You do not need to configure or manage replication to Azure AD Domain Services. User accounts, group memberships and user credentials (passwords) from your Azure AD directory are automatically available in Azure AD Domain Services. New users, groups or changes to attributes ocurring in your Azure AD tenant or in your on-premises directory are automatically synchronized to Azure AD Domain Services.
•    NTLM and Kerberos authentication: With support for NTLM and Kerberos authentication, you can deploy applications that rely on Windows Integrated Authentication.
•    Use your corporate credentials/passwords: Passwords for users in your Azure AD tenant work with Azure AD Domain Services. This means users in your organization can use their corporate credentials on the domain – for domain joining machines, logging in interactively or over remote desktop, authenticating against the DC etc.
•    LDAP bind & LDAP read support: You can use applications that rely on LDAP binds in order to authenticate users in domains serviced by Azure AD Domain Services. Additionally, applications that use LDAP read operations to query user/computer attributes from the directory can also work against Azure AD Domain Services.
•    Group Policy: You can leverage a single built-in GPO each for the users and computers containers in order to enforce compliance with required security policies for user accounts as well as domain joined computers.
•    Available in multiple Azure regions: See the Azure services by region page to know the Azure regions in which Azure AD Domain Services are available.
•    High availability: Azure AD Domain Services offer high availability for your domain. This offers the guarantee of higher service uptime and resilience to failures. Built-in health monitoring offers automated remediation from failures by spinning up new instances to replace failed instances and to provide continued service for your domain.
•    Use familiar management tools: You can use familiar Windows Server Active Directory management tools such as the Active Directory Administrative Center or Active Directory PowerShell in order to administer domains provided by Azure AD Domain Services.

In my demo today, I am going to show how to enable Azure AD Domain Services and how to configure it properly for a cloud-only IaaS setup.

I have already created an Azure AD instance called REBELADMIN. I will be using it during the demo.

aads1

Setup Azure Virtual Network

I am going to show how to set up a new Azure virtual network. The Azure AD Domain Services instance needs to be assigned to the same virtual network your other services run in, in order to integrate with those resources.

1)    In the Azure classic portal, click on the “Networks” option on the left side.

aads2

2)    Then click on “Create a Virtual Network”

aads3

3)    In the wizard, type the name for the virtual network and select the location, then click on the proceed button to go to the next step

aads4

4)    On the next page, I am not going to define any DNS servers, as I will set them up later in this demo; click on the proceed button

aads5

5)    The next window shows the address space; you can either customize it or proceed with the default. I am going to use the default.

aads6

6)    After proceeding, the new virtual network is created successfully

aads7

Enable Azure AD Domain Service

Now we have the virtual network set up. The next step is to enable the domain service.

1)    Click on the Azure AD directory instance which needs Azure AD Domain Services enabled (if you have not created one yet, you can do it using New > App Services > Active Directory > Directory)

aads8

2)    Then click on “Configure”

aads9

3)    Under “Domain Services”, click on the “Yes” button to enable the domain services.

aads10

4)    DNS domain name of domain services – this option defines the DNS domain name. If you do not have a custom domain set up, you can still use the default Azure name, which ends with onmicrosoft.com.
Connect domain service to this virtual network – here you can define which virtual network the domain service should be assigned to. I have selected the new virtual network created in the previous step.
After the changes, click on “Save”

aads11

5)    Then it will start to activate the service.

aads12

6)    Currently it takes about 30 minutes for the service to be enabled. Once it is set up, we can see the DNS server IP addresses appear. This is important, as we need to add these to the virtual network in order to join servers to the domain.

aads13

Add DNS server details into Virtual Network

1)    Click on the virtual network that the Azure AD domain service is associated with.

aads14

2)    Click on Configure and then add the DNS server info

aads15

3)    Click on Save to submit the changes

Create “AAD DC Administrator” group

Since Azure AD Domain Services is a managed service, you will not get domain admin or enterprise admin privileges on the AD instance. But you are allowed to create this group, and all its members will be granted administrator privileges on the domain-joined servers (this group will be added to the local Administrators group on domain-joined servers).

In order to do that, we need to load the Azure AD instance again,

1)    Click on the relevant Azure AD instance.

aads16

2)    Click on the “Groups” and then Add Group

aads17

3)    In the next window, type the group name as “AAD DC Administrators” and the type as “Security”, then click on the proceed button. Please note you must use exactly this name in order for the group to take effect.

aads18

4)    Then you can add members as you prefer

aads19
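If you prefer PowerShell for this step, the group can also be created with the MSOnline module. This is a sketch only; the member UPN below is an example value, and Connect-MsolService will prompt for your tenant credentials.

```powershell
# Sketch: create the "AAD DC Administrators" group via the MSOnline module
# and add one member. The UPN is an example value.
Connect-MsolService

$group = New-MsolGroup -DisplayName "AAD DC Administrators" `
    -Description "Delegated administrators for Azure AD Domain Services"

$user = Get-MsolUser -UserPrincipalName "admin@rebeladmin.onmicrosoft.com"

Add-MsolGroupMember -GroupObjectId $group.ObjectId `
    -GroupMemberType User -GroupMemberObjectId $user.ObjectId
```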

With this, our initial configuration is done. The next step is to enable password synchronization to allow users to use their corporate logins to log in to the domain. I will explain that in my next post, as another step-by-step guide.

If you have any questions about the post feel free to contact me on rebeladm@live.com