
Step-by-Step guide to add Additional Local Administrators to Azure AD Joined Devices

I am sure every engineer knows how "Local Administrators" work on a device. If the device is in an on-premises Active Directory environment, a domain admin or enterprise admin needs to add users to the local Administrators group. If it is a workgroup environment, another user with local administrator privileges needs to add additional users to the Administrators group.

If it is an Azure AD joined device, Azure Global Administrators and the device owner have local administrator rights by default.

localad1

localad2
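
If you want to double-check who the registered owner of a particular device is (the account that gets local admin rights by default), the AzureAD PowerShell module can list it. This is just a quick sketch; the module must be installed, and the device name REBEL-PC01 is a hypothetical example.

# Requires the AzureAD PowerShell module (Install-Module AzureAD)
Connect-AzureAD
# Assumes the search returns a single device; adjust the name for your environment
$device = Get-AzureADDevice -SearchString "REBEL-PC01"
Get-AzureADDeviceRegisteredOwner -ObjectId $device.ObjectId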

Azure AD allows you to define additional local administrators for devices. However, this is a directory-wide setting. If you need to handle it at the device level, you still need to log in with an account that already has local administrator rights and then add the additional users.

Let’s see how we can do this. 

1) Log in to the Azure portal as a Global Administrator

2) Then click on Azure Active Directory and then Devices

localad3

3) Then click on Device Settings

localad4

4) By default, the Additional local administrators on Azure AD joined devices setting is set to None. Click on the Selected tab to enable it. 

localad5

5) In my demo, I am going to make the user RA886611@therebeladmin.com a local administrator on devices. To do that, click on the Selected option. 

localad6

6) In the new window, click on Add members to add users. 

localad7

7) From the list, find the relevant user and click on it to select it. Then click on Select.

localad8

8) Then click on OK

localad9

9) Finally click on Save to apply the settings. 

localad10

10) To test this, I logged in to an Azure AD joined device as RA886611@therebeladmin.com.

localad11

11) Now, to test it, I am trying to launch a PowerShell console as Administrator. If it works, I shouldn't get a login prompt. 

localad12

12) As expected, it didn't ask for an admin user name and password, as the logged-in user now has local admin privileges. 

localad13

localad14
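
If you prefer to confirm the group membership from the command line, you can also list the local Administrators group from an elevated PowerShell console on the device. This is a minimal check; Azure AD accounts typically show up with an AzureAD\ prefix, and the exact entries will depend on your tenant.

# Lists the members of the local Administrators group on the device
net localgroup Administrators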

13) Also, when needed, we can remove users from the local administrators group using the Remove Members option on the Local administrators on devices page. 

localad15

This marks the end of this blog post. If you have any questions, feel free to contact me at rebeladm@live.com. Also follow me on Twitter @rebeladm to get updates about new blog posts.

Step-by-Step guide to enable Enterprise State Roaming with Azure Active Directory

If you work with Active Directory, you may already know what roaming profiles are. Roaming profiles allow application and user settings to sync to a file share. When the same user logs in from another computer in the same domain, those settings sync back from the file share. It gives users the same experience and data across different corporate devices. Azure Active Directory users may also log in from multiple Azure AD joined devices. Enterprise State Roaming allows user settings and application settings to sync securely across corporate Azure AD joined devices. 

Secured Sync – When this feature is enabled, it activates a free, limited Azure Rights Management subscription. This is used to encrypt and decrypt the data that is synced to the cloud, which ensures the security of the data used by the Enterprise State Roaming feature. 

Data Storage – The data storage location for the Enterprise State Roaming feature is aligned with your Azure Active Directory region. Data is not synced between different regions. 

Better Control – This feature can be enabled for the entire directory or only for selected users. Sync data for each device can be reviewed using the portal. With the help of Azure Support, administrators can also forcefully remove the sync data for a device. 

Data Retention – If a user account is deleted from the directory, its profile data is deleted after 90 days. Administrators can also request (via Azure Support) to delete specific data from a user profile. If data has not been accessed for one year, it is treated as stale and removed automatically. The same happens if the Enterprise State Roaming feature is disabled at a later time. 

Let's see how we can enable this feature. In order to enable it, you must have an Azure AD Premium or Enterprise Mobility + Security (EMS) license. Azure AD joined devices must be running Windows 10 (Version 1511, Build 10586 or greater).
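
Before going to the portal, it is worth quickly confirming that a device actually meets these prerequisites. Here is a minimal check from a PowerShell console on the device; dsregcmd is the built-in tool that reports Azure AD join state, and the OS build number should be 10586 or higher.

# Shows Azure AD join status (look for AzureAdJoined : YES in the output)
dsregcmd /status
# Shows the OS version, including the build number
[System.Environment]::OSVersion.Version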

1) Log in to Azure Portal as a Global Administrator
2) Go to Azure Active Directory | Devices  
 
ent1
 
3) Then click on Device Settings 
 
ent2
 
4) Under device settings there is an option called Users may sync settings and app data across devices. Here you can select All or Selected. If you use the Selected option, you will need to define the users. In my demo, I am going to enable Enterprise State Roaming for the entire directory. Once the selection is made, click on Save.
 
ent3
After the feature is enabled, we can review the sync status using the Azure Active Directory Admin Center. To do this, 
 
1) Log in to Azure Active Directory Admin Center using https://aad.portal.azure.com
2) Go to Azure Active Directory | Users and Groups 
 
ent4
 
3) In the next window, click on All users and then click on the relevant user. In my demo, it is user RA722725@therebeladmin.com.
 
ent5
 
4) Then click on Devices in the new window. 
 
ent6
 
5) Then, in the right-hand window, select the Device sync settings and app data option from the Show drop-down menu.
 
ent7
 
6) The list shows the devices the user has logged in to and the last sync time. 
 
ent8
 
Now we have everything ready for testing. Before we start, there are a few things to keep in mind. This only syncs user and app settings, not user data. Also, the sync does not happen at the login/logoff event; it happens once the user is logged in. So if you do not see synced data right away after login, allow some time and keep an eye on the last sync time value. 

In my demo, I am logging in to a PC called REBEL-PC01 as RA722725@therebeladmin.com. On that PC, I have made certain settings changes. 
 
In IE, I added a few links to favorites. 
 
ent9
 
I also changed settings in the Code Writer app, changing the font and the default text size to 20.  
 
ent10
 
After the initial sync, I logged in to another PC called REBEL-PC02 as RA722725@therebeladmin.com. There I expect to see the changes I made. (The sync cycles can take up to 30 minutes. So far, I haven't found a way to override this setting.) 

As expected, I can see the same IE favorites list. 
 
ent11
 
The Code Writer app settings are also there. 
 
ent12
 
As we can see, it helps to streamline the user experience across corporate devices. This marks the end of this blog post. If you have any questions, feel free to contact me at rebeladm@live.com. Also follow me on Twitter @rebeladm to get updates about new blog posts.  

Microsoft Compliance Manager makes it easy to deal with compliance challenges!

If you live in Europe, you may be aware of how GDPR (General Data Protection Regulation) is storming through the IT world. Service providers, vendors and pretty much every business that deals with digital data are looking at, or making plans for, GDPR, which will be enforced from 25 May 2018. Some are already compliant and some are still struggling to figure it out. People are talking about compliance more than ever. Compliance is always painful to deal with. It involves knowledge, experience, skills, people, time, roles and responsibilities, services and much more. More importantly, organizations need to evaluate how these regulations and laws map to their business model. There is no single button or shortcut that makes an organization comply with the regulations that come along from time to time. 

These regulations also change based on industry trends and needs. Even if your organization is compliant today, it may not be in six months' time. So, continuous awareness and skills are required to maintain compliance status. For an organization, it's not a one-man job either; different roles have different responsibilities to make it possible. Some compliance frameworks are just "good to have", but some are a must for certain businesses to operate, and some are backed by law, so those leave no choice. 

This whole GDPR experience has taught some lessons:

Complexity – When new regulations are enforced, lack of information, complexity, and lack of experience and skills make it difficult for organizations to adopt them in a short period of time. This rush and uncertainty can push organizations into vulnerable moves which can lead to bigger problems. 

Compatibility with other compliances – Sometimes businesses need to comply with multiple frameworks. Things you do to comply with one framework can affect frameworks you already comply with. It is hard to keep track of each individual action and measure its impact. 

Commitment – As I explained before, it is not a one-man job; different parties and different roles need to make the relevant commitment to achieve compliance targets. Organizations always find it difficult to measure commitment or evaluate task progress throughout the implementation process.  

Tools and methods – As everyone agrees, there are no shortcuts to compliance. It is not like installing a piece of software or enabling a service. Organizations need to go through the relevant rules and see how they apply to their infrastructure and business models. But it is not always practical to do all of this manually. As an example, GDPR has more than 100 rules. If we do not use tools or other methods to see how they apply to existing infrastructure, it can be a time-consuming, complex process. There are existing tools which give you reports based on the information you provide, but so far I am not aware of a tool that does real-time analysis of infrastructure and reports back on compliance status. 

At the last Ignite event, Microsoft introduced the Compliance Manager tool, which simplifies the compliance adoption process for organizations. As a service provider, Microsoft also has a role to play in making its cloud products comply with these regulations. So, Microsoft created a service where it explains how it has done its part and gives customers insight into doing their bit in the form of tasks. Each of these tasks includes a detailed explanation, and each task can be assigned to a user and its progress measured in real time.   

This service is available for Azure and Office 365 customers. It does not only cover GDPR; it also covers other standards such as ISO 27001:2013 and ISO 27018:2014. It is currently in preview and will be generally available in 2018. 

In order to access this tool, you need to have a valid Office 365 subscription. Azure and Dynamics 365 support is coming soon. It can also be tested using a trial account. Once you have your login details ready, go to https://servicetrust.microsoft.com/ and click on "Launch Compliance Manager" 

comp1

On the next page, it will ask about the subscription. If you already have a valid subscription, you can use the "Sign In" option. 

comp2

After successful authentication, it will load the Compliance Manager dashboard. 

comp3

Each tile represents a compliance assessment. Using the "Add Assessment" button, we can add new assessments to the list. To do that, first click on the Add Assessment option. 

comp4

Then, in the pop-up, select the relevant product and click on Next.

comp5

In the next window, you can select the relevant assessments and click on Add to Dashboard.

comp6

Each tile has two sections: one lists the controls Microsoft complies with, and the other lists the controls the customer must comply with. 

comp7

In order to see these in detail, click on the assessment name on the tile. 

comp8

It then lists the sections for each control. 

comp9

As an example, if I expand one of the tasks related to Microsoft, it explains what it is, what Microsoft did to implement it, and who assessed it. 

comp10

If I do the same for the customer controls, I can see similar details, but most of it needs to be filled in by the customer. It provides a detailed description of the assessment. If you go to the customer actions, it gives some insight into what the customer needs to do to pass the assessment. 

comp11

comp12

It also has two sections where we can add notes about implementation, test plan and management response. 

comp13

Using the Test Date option, we can define the date for the assessment. 

comp14

Using the Test Result drop-down, we can select the assessment status.

comp15

Using the Manage Documents option, we can upload relevant documents for the task. 

comp16

comp17

More importantly, using the Assign button, a task can be assigned to another user in the organization. 

comp18

In my demo, I am assigning it to user Agnes Schleich with high priority. 

comp19

Email notification for this is not working yet, but in the future, once a task has been assigned, it will send an email notification to the user. 

Now, when I log in to Compliance Manager as the user Agnes Schleich, I can see the assigned task under action items.

comp20

Cool, isn't it? Microsoft has promised to add more and more assessments in the coming months to make life easier with compliance. Once you are done evaluating, do not forget to provide feedback using the Feedback button. 

This marks the end of this blog post. If you have any questions, feel free to contact me at rebeladm@live.com. Also follow me on Twitter @rebeladm to get updates about new blog posts.  

Self-Service password reset on Azure AD joined windows 10 device

Password resets are a common service desk request IT engineers deal with. Passwords are a weak authentication method: they are breakable, crackable and guessable. This is why Microsoft has invested in passwordless authentication such as Windows Hello. However, the majority of systems still use a traditional user name and password to authenticate.

When users forget their password, it prevents them from accessing the systems or services they are trying to reach. Until someone with higher privileges resets the password, their time is wasted. This is manageable for a small number of users, but in a large organization it can cost a lot for both parties. That is why organizations use self-service password reset solutions, which allow users to reset their passwords in a secure, controlled way. 

When it comes to Azure AD, it can also allow users to use the self-service password reset feature. In one of my previous blog posts I explained how to enable it. You can find it at http://www.rebeladmin.com/2016/01/step-by-step-guide-to-configure-self-service-password-reset-in-azure-ad/ . Now Azure AD also allows users to reset their password directly from the login screen of Azure AD joined Windows 10 devices. In this post, I am going to demonstrate this feature. 

In order to use this feature, the Azure AD environment should have the following:

1. Enable self-service password reset – By default, Azure AD does not have this feature enabled. It needs to be enabled before users can use it. It can be enabled for all users or for a group of users. 

password1

In my demo environment, I have it enabled for all users. 

Also, in here, users can be required to use one or two authentication methods to reset their password. If two methods are used, users will be verified using both methods. 

password2

2. Password writeback for hybrid environments – If it is a hybrid environment (with on-premises AD), the password writeback option should be enabled. Otherwise, a password reset from Azure AD will not replicate back. This option is available in Azure AD Connect. If you do not enable this option, even with self-service password reset enabled, users will not be allowed to reset passwords. 

password3
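
If you want to double-check from the Azure AD Connect server which tenant features are currently enabled, the ADSync module that ships with Azure AD Connect can list them. This is a quick sketch, assuming a reasonably recent Azure AD Connect build; look for the password writeback entry in the output.

# Run on the Azure AD Connect server
Import-Module ADSync
# Lists the company-level features Azure AD Connect knows about, including password writeback
Get-ADSyncAADCompanyFeature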

3. Windows 10 Fall Creators Update – This password reset feature is only available for Windows 10 Version 1709, so make sure the device is running the latest update. It can be applied using Windows Update. More details can be found at https://support.microsoft.com/en-gb/help/4028685/windows-10-get-the-fall-creators-update 
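
A quick way to confirm the version on a device is to read the ReleaseId value from the registry; for the Fall Creators Update it should report 1709. This is just a convenience check from PowerShell.

# Returns the Windows 10 release ID, e.g. 1709 for the Fall Creators Update
(Get-ItemProperty 'HKLM:\SOFTWARE\Microsoft\Windows NT\CurrentVersion').ReleaseId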

In my demo environment, I have an Azure AD joined Windows 10 PC. 

password4

After enabling self-service password reset, I am going to log in to this PC as user RA722725@therebeladmin.com.

On my login, it says I need to provide additional info for password recovery. 

password5

Click on Set it up now to continue. 

Then it provides a list of options I can use for verification. Select the option you need and click Next.

password6

Now that we have the recovery options set up, let's see how password reset works from the device. 

On my Azure AD joined device, I type the user name on the login screen, but I do not get any option for password reset. This is because I am also using the PIN option for login. If you are using a PIN, you will probably end up using the PIN instead of the password. So, if you are using a PIN and still need to recover your password, click on Sign-in options.

password7

Then click on the number pad icon to select the PIN option.

password8

Then click on the I forgot my PIN option. I know this is confusing, as we are trying to reset the password, but unfortunately the option lives on the PIN reset page. 

password9

Then it will open a new window. In there, click on the Forgotten password option. 

password10

Now it opens a new window to reset the password. Click Next to proceed. 

password11

Then it gives the options for verification. Select the method you would like to use. You can't change your registered data here. 

password12

After successful verification, it gives the option to define a new password. After typing the new password, click on Next to proceed. 

password13

Then click on Finish to complete the process. 

password14

Then I can log in to the device with the new password. 

password15

Cool, huh? As expected, we were able to reset the password from the device login screen. This marks the end of this blog post. If you have any questions, feel free to contact me at rebeladm@live.com. Also follow me on Twitter @rebeladm to get updates about new blog posts.  

Azure AD now support macOS Conditional Access – Let’s see it in action!

Azure AD conditional access policies allow you to provide condition-based access to cloud workloads. 

In one of my previous blog posts I explained in detail what a conditional access policy is and how we can configure it. You can find it at http://www.rebeladmin.com/2017/07/conditional-access-policies-azure-active-directory/ . I highly recommend reading it before you continue with this post. 

A conditional access policy has two main sections.

Assignments –  This is where we can define the conditions applying to the user environment, such as users and groups, applications, device platform, login locations, etc.

Access Control –  This is to control access for the users and groups when they comply with the conditions specified in the "assignments" section. It can either allow access or deny access. 

Under the Assignments section, we can define the device platforms involved in the condition. When I wrote my previous post, it only supported the following platforms.

• Android

• iOS

• Windows Phone

• Windows

From November 14th 2017, Azure AD added macOS to the list. With this update, the following OS versions, applications, and browsers are supported on macOS for conditional access:

Operating Systems

macOS 10.11+

Applications

Microsoft Office 2016 for macOS v15.34 and later

Microsoft Teams

Web applications (via Application Proxy)

Browsers

Safari

Chrome

The original documentation doesn't say anything about web apps, but in this demo I am going to use conditional access with an on-premises web app which is published to the internet using Azure Application Proxy. I wrote an article about Application Proxy a while ago and it can be accessed via http://www.rebeladmin.com/2017/06/azure-active-directory-application-proxy-part-02/ 

Before starting the configuration, let me explain a little bit about my environment. I have an on-premises domain environment with therebeladmin.com. I have integrated it with Azure AD Premium and I have a healthy sync. I have an on-premises web app and I have published it to the internet using Azure Application Proxy, so I can use Azure AD authentication with it. The web app can be accessed via https://webapp-myrebeladmin.msappproxy.net/webapp/ 

I have a Mac running Sierra. In this demo, I am going to set up a conditional access policy to block access to the web app if the request comes from a macOS environment. 

mac01

In order to configure this, 

1) Log on to Azure as a global admin
2) Click on Azure Active Directory from the left menu.
 
mac2
 
3) Then, in the Azure Active Directory panel, click on Conditional Access under the security section. 
 
mac3
 
4) It will load the conditional access window. Click on + New Policy to create a new policy. 
 
mac4
 
5) It will open the policy window where we can define the policy settings. First things first, provide a name for the policy. In my case, I will use "Block access from macOS".
 
mac5
 
6) Then click on Users and groups to define the target users for the policy. In this demo, I am going to target All users. Once the selection is done, click on Done.
 
mac6
 
7) Then click on Cloud apps to select the application for the policy. In my policy, I am going to target rebelwebapp. Once the selection is done, click on Select and then Done to complete the process. 
 
mac7
 
8) The next step is to define the conditions. In order to do that, click on the Conditions option. Here I am only concerned with device platforms. To select platforms, click on the Device platforms option. Then, to enable the condition, click on Yes under Configure, and under the Include tab select macOS. After that, click on Done in both windows to complete the process. 
 
mac8
 
9) The next step is to define the access control rules. To do that, click on Grant under the access controls section. In my demo, I am going to block access to the app, so I am selecting the Block access option. Once the selection is done, click on Select to complete the action. 
 
mac9
 
10) Now the policy is ready. To enable it, click the On tab under the Enable policy option. 
 
mac10
 
11) Then, to create the policy, click on the Create button. 
 
mac11
 
12) Now the policy is ready, and the next step is to test it. In order to do that, I am browsing to the web app URL from the Mac. As soon as I access the URL, it asks for a login.
 
mac12
 
13) As soon as I type the user name and password, I get the following response saying access is not allowed. 
 
mac13
 
14) If we click on More details, it gives more info about the error. As expected, it was due to the conditional access policy we set up. Nice, huh!
 
mac14
 
So, as expected, conditional access with macOS is working fine. This is another good step forward. Well done, Microsoft! This marks the end of this blog post. If you have any questions, feel free to contact me at rebeladm@live.com. Also follow me on Twitter @rebeladm to get updates about new blog posts.  

Azure AD Password Synchronization

Azure AD Connect allows engineers to sync on-premises AD data to Azure AD. If you use express settings for the Azure AD Connect setup, it enables password synchronization by default. This allows users to use the same Active Directory password to authenticate to cloud-based workloads, so they have a single set of login details without maintaining different passwords. It simplifies the users' login experience and reduces helpdesk involvement. 

Windows Active Directory stores passwords as hash values generated by a hash algorithm. Passwords are not saved as clear text, and it is impossible to revert a hash back to a clear text password. There is a misunderstanding about this, as some people think Azure AD password sync uses clear text passwords. At two-minute intervals, the Azure AD Connect server retrieves password hashes from on-premises AD and syncs them to Azure AD on a per-user basis in chronological order. This also involves an encryption and decryption process to add extra security to the password sync process. When a password changes, it is synced to Azure AD in the next password sync interval. In a healthy environment, the maximum delay for a password update is about two minutes. 

If the password is changed while a user has an open session, it will affect the next Azure authentication attempt; it will not log the user out of the existing session. Also, password synchronization doesn't mean SSO. Users always have to use their corporate login details to authenticate to Azure services. You can find more information about SSO at https://docs.microsoft.com/en-us/azure/active-directory/connect/active-directory-aadconnect-sso 
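
If you are testing password hash sync and do not want to wait for the next scheduled cycle, you can trigger a sync cycle manually on the Azure AD Connect server with the ADSync module. A small example; a delta cycle is usually enough for a single password change.

# Run on the Azure AD Connect server
Import-Module ADSync
# Triggers a delta synchronization cycle (use -PolicyType Initial for a full cycle)
Start-ADSyncSyncCycle -PolicyType Delta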

Enable synchronization of NTLM and Kerberos credential hashes to Azure AD

However, Azure AD Connect does not synchronize NTLM and Kerberos credential hashes to Azure AD by default. So, if you have an existing Azure AD setup and have only recently enabled Azure AD Domain Services, make sure you check the following:

pass1
1. If there is an existing Azure AD Connect server, upgrade Azure AD Connect to the latest version.
2. If there is an existing Azure AD Connect server, confirm password synchronization is enabled in Azure AD Connect.
 
In order to do that, open Azure AD Connect, select the "View current configuration" option, and check whether password synchronization is enabled. 
 
pass2
 
If it's not, we need to go back to the initial page, select the "Customize synchronization options" option, and under optional features select password synchronization.
 
pass3
 
Run the following PowerShell script on the Azure AD Connect server to force a full password synchronization and enable all on-premises users' credential hashes to sync to Azure AD. 

# Names of the connectors as shown in Synchronization Service Manager (case sensitive)
$adConnector = "<CASE SENSITIVE AD CONNECTOR NAME>"
$azureadConnector = "<CASE SENSITIVE AZURE AD CONNECTOR NAME>"
# Load the ADSync module that ships with Azure AD Connect
Import-Module "C:\Program Files\Microsoft Azure AD Sync\Bin\ADSync\ADSync.psd1"
$c = Get-ADSyncConnector -Name $adConnector
# Set the ForceFullPasswordSync flag on the AD connector
$p = New-Object Microsoft.IdentityManagement.PowerShell.ObjectModel.ConfigurationParameter "Microsoft.Synchronize.ForceFullPasswordSync", String, ConnectorGlobal, $null, $null, $null
$p.Value = 1
$c.GlobalParameters.Remove($p.Name)
$c.GlobalParameters.Add($p)
$c = Add-ADSyncConnector -Connector $c
# Disable and re-enable password sync to trigger a full password synchronization
Set-ADSyncAADPasswordSyncConfiguration -SourceConnector $adConnector -TargetConnector $azureadConnector -Enable $false
Set-ADSyncAADPasswordSyncConfiguration -SourceConnector $adConnector -TargetConnector $azureadConnector -Enable $true
 
You can find the AD connector and Azure AD connector names under Start > Synchronization Service > Connections.
 
pass4
 
After that, you can try to log in to Azure as a user from the on-premises AD. If sync is working properly, it should accept your corporate login. 
 
This marks the end of this blog post. If you have any questions, feel free to contact me at rebeladm@live.com. Also follow me on Twitter @rebeladm to get updates about new blog posts.  

Active Directory Health Monitoring with OMS (Operation Management Suite)

System Center Operations Manager (SCOM) is the Microsoft solution for monitoring application and system health in detail. This applies to Active Directory monitoring as well. Using the relevant management packs, it can monitor the health of Active Directory services and their activities. Microsoft introduced Operations Management Suite (OMS) to bring monitoring to the next level with advanced analytics technologies. SCOM was mostly about monitoring applications, services and devices running on-premises, but OMS works with on-premises, cloud-only or hybrid cloud environments. 

OMS Benefits 

Minimal Configuration and Maintenance – If you have worked with SCOM before, you may know how many different components need to be configured, such as management servers, SQL servers, gateway servers, a certificate authority, etc. With OMS, all we need is a subscription and the initial configuration of the monitoring agents or gateway. There are no complex maintenance routines either. 

Scalable – The latest figures from Microsoft show OMS is already used by more than 50,000 customers; more than 20 PB of data has been collected and more than 188 million queries have been run in a week. With a cloud-based solution, we no longer need to worry about resources when we expand. The subscription is based on the features and the amount of data you upload; you do not need to pay for the compute power. I am sure Microsoft is nowhere near running out of resources! 

Integration with SCOM – OMS is fully supported to integrate with SCOM. It allows engineers to specify which systems and data should be analyzed by OMS. It also allows a smooth migration from SCOM to OMS in stages. In an integrated environment, SCOM works similar to a gateway and OMS runs queries through SCOM. OMS and SCOM both use the same monitoring agent (Microsoft Monitoring Agent), so client-side configuration is minimal. 

Note – Some OMS components, such as Network Performance Monitoring, WireData 2.0 and Service Map, require additional agent files, system changes and a direct connection to OMS. 

Frequent Feature Updates – Microsoft releases a System Center version roughly every four years, but OMS updates and new services come much more often. This allows Microsoft to address industry requirements quickly. 

OMS in Hybrid Environment 

In a hybrid environment, we can integrate on-premises systems with OMS using three methods. 

Microsoft Monitoring Agent – The monitoring agent needs to be installed on each system, and it connects directly to OMS to upload data and run queries. Every system needs a connection to OMS over port 443. 

SCOM – If you already have SCOM installed and configured in your infrastructure, OMS can integrate with it. Data upload to OMS is done from the SCOM management servers, and OMS runs queries to the systems via SCOM. However, some OMS features still need a direct connection to the system to collect specific data. 

OMS gateway – OMS now supports collecting data and running queries via its own gateway, which works similar to SCOM gateways. The individual systems do not need a direct connection to OMS; the OMS gateway collects and uploads the relevant data from the infrastructure. 

What is in there for AD Monitoring? 

In a SCOM environment, we can monitor Active Directory components and services using the relevant management packs. It collects a great amount of insight; however, to identify potential issues, engineers need to analyze the collected data themselves. OMS provides two solution packs which collect data from the Active Directory environment and analyze it for you. After analyzing, it visualizes the results in a user-friendly way. It also provides insight into how to fix detected problems, as well as guidelines to improve the environment's performance, security and high availability. 

AD Assessment – This solution analyzes the risk and health of AD environments at a regular interval. It provides a list of recommendations to improve your existing AD infrastructure. 

AD Replication Status – This solution analyzes replication status of your Active Directory environment. 

In this section, I am going to demonstrate how we can monitor an AD environment using OMS. Before we start, we need: 

1) A valid OMS subscription – OMS has different levels of subscription, depending on the OMS services you use and the amount of data uploaded daily. It does have a free tier which provides 500 MB of daily upload and 7-day data retention. 

2) A direct connection to OMS – In this demo I am going to use direct OMS integration via the Microsoft Monitoring Agent. 

3) A Domain Administrator account – In order to install the agent on the domain controllers, we need Domain Administrator privileges. 

Enable OMS AD Solutions 

1) Log in to OMS at https://login.mms.microsoft.com/signin.aspx?ref=ms_mms as an OMS administrator

2) Click on Solution Gallery

oms1

3) By default, the AD Assessment solution is enabled. In order to enable the AD Replication Status solution, click on its tile in the solution list and then click on Add.

oms2

Install OMS Agents 
 
The next step of the configuration is to install the monitoring agent on the domain controllers and get them connected to OMS. 
 
1) Log in to the domain controller as a domain administrator
2) Log in to the OMS portal 
3) Go to Settings > Connected Sources > Windows Servers > click on Download Windows Agent (64bit). It will download the monitoring agent to the system. 
 
oms3
 
4) Once it has downloaded, double-click on the setup file to start the installation process. 
5) In the first window of the wizard, click Next to begin the installation. 
6) In the next window, read and accept the license terms.
7) In the next window, we can select where it should be installed. If there are no changes, click Next to continue. 
8) In the next window, it asks where the agent will connect to. In our scenario, it will connect to OMS directly. 
 
oms4
 
9) In the next window, it asks for the OMS Workspace ID and Key. These can be found in the OMS portal under Settings > Connected Sources > Windows Servers. If this server is behind a proxy server, we can also specify the proxy settings in this window. Once the relevant info is provided, click on Next to continue. 
 
oms5
 
10) In the next window, it asks how the agent should check for updates. It is recommended to use the Windows Update option. Once the selection has been made, click Next.
11) On the confirmation page, click Install to begin the installation. 
12) Follow the same steps for the other domain controllers.
13) After a few minutes, we can see the newly added servers connected as data sources under Settings > Connected Sources > Windows Servers.
 
oms6
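
If a server does not show up, a quick thing to check on the server itself is whether the agent service is running. The Microsoft Monitoring Agent runs as the HealthService service, so a simple check from PowerShell is enough:

# The Microsoft Monitoring Agent service; Status should be Running
Get-Service HealthService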

View Analyzed Data
 
1) After a few minutes, OMS will start to collect data and visualize the findings. 
2) To view the data, log in to the OMS portal and click on the relevant solution tile on the home page. 
 
oms7
 
3) Once you click on the tile, it brings you to a page which displays more details about its findings. 
 
oms8
 
4) As I explained before, it doesn't only display errors; it also gives recommendations on how to fix the existing issues. 
 
oms9
 
Collect Windows Logs for Analysis
 
Using OMS, we can also collect Windows logs and use OMS's analytics capabilities to analyze them. When this is enabled, OMS space usage and bandwidth usage on the organization's end will be higher. In order to collect logs:
 
1) Log in to the OMS portal
2) Go to Settings > Data > Windows Event Log
3) In the box, you can search for the relevant log name and add it to the list. We can also select which types of events to extract. Once the selection is made, click Save. (If you are not sure of the exact log names, see the tip after the screenshot below.)
 
oms10
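
As a side note, if you are unsure which event log names exist on a domain controller, you can list them locally first and pick the ones worth collecting. This is a small helper, nothing OMS-specific:

# Lists event logs on the local server that actually contain records
Get-WinEvent -ListLog * -ErrorAction SilentlyContinue | Where-Object { $_.RecordCount -gt 0 } | Select-Object LogName, RecordCount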
 
4) After a few minutes, you can start to see the events under the log search option. There, we can filter the data using queries. We can also set up email alerts based on the events. 
 
oms11
 
I believe you now have a basic knowledge of how to use OMS to monitor an AD environment. There is a lot more we can do with OMS, and I will cover that in future posts. 
 
This marks the end of this blog post. If you have any questions, feel free to contact me at rebeladm@live.com. Also follow me on Twitter @rebeladm to get updates about new blog posts.  

Introducing change tracking and inventory features for Azure VM

In any infrastructure, users use different applications, services and file shares in order to get their work done. Each of these components may change periodically. In order to maintain integrity, provide faster IT support and identify risks, it is important to track all the changes made to these components. Maintaining a (software-level) inventory helps engineers verify that systems are running as expected with the relevant software and services. The change tracking and inventory process can be handled using software or workflows. 

Operations Management Suite (OMS) provides a solution called "Change Tracking" to capture changes in cloud-only, hybrid or on-premises-only environments. It can track changes to files, registry entries, software, Windows services and Linux daemons. 

ch1

Microsoft recently released the "Change tracking and Inventory" solution, which can be implemented at the Azure VM level. However, this is not a replacement for the OMS solution: OMS can track changes in any environment, while this new solution is just for cloud-only servers. 

There are a few things I like about this solution:

1. Easy to implement – with a few clicks this feature can be enabled at the VM level.

2. No agents – It doesn't need agents to track changes or maintain the inventory. 

3. No need to log in to the VM – No user interaction or system credentials are required in order to use these features. Engineers can view the visualized data without logging in to the VM. 

4. No scheduled scans – It tracks changes automatically. You do not need to manually create scan jobs or update schedules. (In the background, the jobs are controlled by Azure Automation.) 

Please note this is still in preview mode and therefore not recommended for use in a production environment. But it is not too early to try its capabilities.  

Let’s go ahead and see how we can enable and use this feature. 

1. Log in to Azure portal (https://portal.azure.com) as Global Administrator.

2. Go to Virtual Machines and click on the VM for which you would like to enable this feature. 

3. In the VM panel, under the OPERATIONS section, click on Inventory (preview) 

ch2

4. It will then load the detail window. Click on the purple bar, as in the image below, to enable the feature. 

ch3

5. It will then load a new window with information such as the Log Analytics workspace ID and Automation account ID. Click on Enable to proceed. 

ch4

6. Once the feature is enabled, we can see the following window. It will take a while to populate the data.

ch5

7. It also enables the Change tracking (preview) feature. 

ch6

8. After some time, you will be able to see data under the Inventory (preview) and Change tracking (preview) windows. Let's start with Inventory (preview). When I load the window, it first lists the software installed on the system. 

ch7

9. This includes information about Windows updates and all other third-party applications. If it is a piece of software, it displays the version number and the publisher's info. As an example, I installed Acrobat Reader and I can see its info as follows. 

ch8

10. Under the Files tab, we can see the file details. By default, it doesn't scan for any files. A system holds thousands of files, and there is no point in adding all of them to the inventory. Instead, users define which folders and files to monitor and add to the inventory. If it is a folder, all the files under that particular folder are listed automatically. To set this up, click on the Edit Settings option.

ch9

11. In the next window, click on the Windows Files tab. 

ch10

12. Then click on Add in the next window. 

ch11

13. In the next window, type a unique name for the file or folder under Item Name, then type the folder or file path under Enter Path. Once everything is done, click on Save.

ch12

14. Once it is added, it will show under the Windows Files tab. If you need to disable inventory and change tracking for a file or folder, all you need to do is click on it and click on the False button under Enabled.

ch14

15. If it is a Linux system, file and folder paths can be added using the Linux Files tab. 

ch15

16. When we add files here, change tracking is automatically enabled for those files and folders, so you do not need to add them again under the change tracking feature. 

17. Under the Registry Files tab, we can enable registry tracking and inventory as well. It has a set of pre-defined registry paths, but at the moment I cannot see a way to add a custom path. To enable the feature, click on a registry entry, then click on True under Enabled. At the end, click on Save.

ch16

18. This also enables tracking for the Windows registry entries under the change tracking feature. 

19. Under the Windows services tab, it lists all the services on the system, along with their current status and startup type.

ch17
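
For comparison, the same information can be pulled from inside the guest with a one-liner; the inventory blade is essentially surfacing this data without anyone having to log in to the VM:

# Local equivalent of the Windows services view in the inventory blade
Get-Service | Select-Object Name, Status, StartType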

20. This is just for one VM. If you need to view inventory data for multiple VMs, you need to click on Manage multiple computers.

ch18

21. In the new window, it lists the machines which have this feature enabled. 

ch19

22. If you need to add new VMs to the list, it can be done using the Add Azure VM option. It will then allow you to enable the inventory feature. 

ch20

23. There is an option to add non-Azure virtual machines too, but that will lead you to OMS.

24. All the other windows are familiar; the only change here is that you can see whether the same event is repeated on different computers. 

ch21

25. All the events in here can also be viewed using Log Analytics. In order to access that, click on the Log Analytics option in the main window. 

ch22

26. Then it will load all the events and we can find relevant info using queries or just browsing through filters. 

ch23

27. Now we are done with the inventory feature, so let's move on to the Change tracking (preview) feature. In order to access it, just click on the Change tracking (preview) option under operations. 

ch24

28. As soon as it loads, it shows the changes for the last 24 hours, as a graph as well as a list. 

ch25

29. Using the Time Range drop-down, we can define the time range for the data. 

ch26

30. Using the Change Types drop-down, we can select which types of data to view. 

ch27

31. The graph itself is really useful for narrowing down a change quickly. All you need to do is move the mouse over the timeline and select the area you would like to dig into by dragging the cursor. It then simply lists the results for that particular time. 

ch28

32. Using the Manage multiple computers option, we can view changes for multiple computers in the same window. It works the same way it does in the inventory feature.

ch29

33. The Edit Settings option is also the same as in the inventory feature, so I am not going to cover it here. 

34. In the main window, there is an option to manage the connection with the Azure activity log. There you can enable integration with the Azure activity log. You can find more info about the activity log at https://docs.microsoft.com/en-us/azure/monitoring-and-diagnostics/monitoring-overview-activity-logs

ch30

This marks the end of this blog post. If you have any questions, feel free to contact me at rebeladm@live.com. Also follow me on Twitter @rebeladm to get updates about new blog posts.  

Update Management for Azure VM

Keeping your operating systems up to date is critical, as it is the first step towards protecting your systems from emerging threats. It also helps to improve efficiency and the user experience. The simplest way to update your Windows operating systems is to use the "Windows Update" feature that comes with every operating system. But this is not enough for corporates, as it is important to manage Windows updates in a controlled manner. Microsoft has tools such as WSUS and SCCM to manage Windows updates in an infrastructure. When it comes to a hybrid or cloud-only environment, it is important to update your virtual machines running in the cloud as well. Microsoft Operations Management Suite (OMS)'s "Update Management" is a great way to manage updates in any environment (on-premises, cloud-only or hybrid). It detects and reports missing updates in your environment and also allows you to deploy them using Azure Automation. 

update1

However, if you are running an Azure environment, Microsoft now has another solution which helps to manage updates for Azure VMs. This is NOT a replacement for OMS Update Management, even though it works similarly. It helps to manage updates at the individual VM level or as a group. This "Update Management" feature is still in preview mode, but it is not too early to try its capabilities. 

There are a few things I like about this feature.

1. No agents or additional configuration – This feature can be enabled on a VM with a few clicks, and it doesn't require any additional configuration inside the VM. It doesn't need an agent installation or any other configuration such as firewall changes. It's simple and efficient. 

2. No need to log in to the VM – This is ideal for MSPs as well. In order to manage updates, you do not need to log in to the VM at all. There is no need to provide passwords to install updates either. 

3. Reporting – It lists missing updates and categorizes them based on type. It lists info about failed deployments. So, everything is logged and visualized in a way that is easy to understand. 

Let's see how we can get this set up.

1. In order to enable this feature, you need to log in to Azure as a global administrator. 

2. Then click on Virtual Machines to list the VMs.

update2

3. Then click on the VM of your choice. 

4. From the left-hand panel, click on Update Management (Preview)

update3

5. In the next window, click on the purple bar (as in the following image) to enable the feature. 

update4

6. It will then load the page to enable the feature. As we can see, it also creates a Log Analytics workspace as well as an Automation account. Click on Enable to proceed. 

update5

7. Once it is enabled, it will take 15-20 minutes to gather information about updates. Once it is finished, we can see new data under the Update Management (Preview) panel. 

update6

8. In the Missing updates section, it shows the update name, classification, published date and a link to more details about each update. 

update7

9. If we click on one of the missing updates, it brings us to the log search window, where we can see more details about the update. 

update8

10. In the Update Management (Preview) panel, let's click on the Manage Multiple Computers option. 

update9

11. In that window, we can see all the computers which have this feature enabled and their compliance status. 

update10

12. By clicking on each computer in the list, we can see more details about it using the log search window.

update11

13. We can also add Azure VMs to Update Management. To do that, click on the Add Azure VM option in the Manage Multiple Computers panel. 

update12

14. It will list the VMs in the account; click on the relevant VM you would like to add. Then we can enable the feature on it. 

update13

15. Now we have the list of missing updates. The next step is to schedule an update deployment. In order to do that, go back to the Update Management (Preview) panel and click on the Schedule update deployments option. 

update14

16. In the new window, the first thing to do is define a name for the job. Under Update classification, we can select which updates to consider for the schedule. 

update15

17. If we need to exclude any updates, we can do that using the Updates to exclude option. There we need to define the relevant KB numbers. 

update16

18. Under the schedule settings, we can define the time to apply the updates. It can be either a one-time or a recurring job. 

update17

19. Using the Maintenance window option, we can set how long the system should be in maintenance mode. 

update18

20. Once that is done, click on Create to create the schedule. 

update19

21. If you use the same Schedule update deployments option under the Manage Multiple Computers window, we can create a schedule for multiple computers. 

update20

22. Once the schedule is created, we can see it under the Scheduled update deployments tab. 

update21

23. This completes the configuration, and once the schedule has run, we can verify it using the Update Management (Preview) panel. 
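
You can also spot-check from inside the guest OS that the expected updates actually landed after the maintenance window. This is not part of the Update Management feature itself, just a quick sanity check from PowerShell:

# Shows the ten most recently installed hotfixes on the VM
Get-HotFix | Sort-Object InstalledOn -Descending | Select-Object -First 10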

This marks the end of the blog post, and I hope it was useful. If you have any questions, feel free to contact me at rebeladm@live.com. Also follow me on Twitter @rebeladm to get updates about new blog posts.

Step-by-Step guide to manage Azure Storage using Azure CLI 2.0 – Part 02

This is the last part of my blog post series covering Azure CLI 2.0 functions. If you haven't read part 01 yet, please read it before starting on this one. You can find it at http://www.rebeladmin.com/2017/10/step-step-guide-manage-azure-storage-using-azure-cli-2-0-part-01/ 

In my demo setup, I have two VMs running. One was created using Azure managed disks. In part 01 I explained how to add an additional disk; it currently has a 100 GB additional disk attached. 

Expand Disks

Let's see how we can expand disks using Azure CLI. Before doing this, make sure you are logged in to Azure CLI using az login.

Let's start with expanding Azure managed disks. First, we can verify the VM's storage configuration using: 

az vm show --resource-group rebeladminrg01 --name REBLEVM101

clistore1

In there, I have two disks. One is the OS disk, called osdisk_6469626e28, and the other is a data disk called DataDisk01.

We can't increase a disk on a running VM, not even a data disk. So first we need to deallocate the VM. We can do that using: 

az vm deallocate --resource-group rebeladminrg01 --name REBLEVM101

In the above command, --resource-group defines the resource group the VM belongs to, and --name defines the VM name. 

clistore2

Once that is completed, we can increase the disk sizes. 

I need to expand the OS disk size to 150 GB. I can do that using:

az disk update --resource-group rebeladminrg01 --name osdisk_6469626e28 --size-gb 150

clistore3

I would also like to expand the data disk to 150 GB. I can do that using:

az disk update --resource-group rebeladminrg01 --name DataDisk01 --size-gb 150

clistore4

In the above commands, --resource-group defines the resource group the disks belong to, and --name defines the disk name. 

After finishing, we can start the VM using:

az vm start --resource-group rebeladminrg01 --name REBLEVM101

Once the VM is up, we can go in and expand the disk at the OS level. 

clistore5
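
For a Windows guest, expanding at the OS level means extending the partition into the newly unallocated space. Here is a minimal sketch using the built-in Storage cmdlets; it assumes the volume to grow is drive F:, so adjust the drive letter for your VM.

# Run inside the VM; grows the F: partition to use all available space on the disk
$drive = "F"
$maxSize = (Get-PartitionSupportedSize -DriveLetter $drive).SizeMax
Resize-Partition -DriveLetter $drive -Size $maxSize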

If you are looking to expand unmanaged disks, it can be done via the portal interface or Azure CLI 1.0. More info can be found at https://docs.microsoft.com/en-us/azure/virtual-machines/linux/expand-disks-nodejs

The document itself is for Linux VMs, but the expand part works the same way. 

Snapshots

We can also take snapshots of a disk as a quick recovery option. A snapshot is a full copy of a disk at the time it is taken. It can be kept as a backup or attached to another machine for troubleshooting. 

In my demo, I am going to take a snapshot of an Azure managed OS disk. Before doing that, I need to find the disk ID. It can be done using:

az vm show --resource-group rebeladminrg01 --name REBLEVM101

clistore6

Then we can take the snapshot using:

az snapshot create -g rebeladminrg01 --source "/subscriptions/xxxxx/resourceGroups/REBELADMINRG01/providers/Microsoft.Compute/disks/osdisk_6469626e28" --name vm101osDisk-backup

clistore7

In the above, --source defines the disk ID and --name defines the snapshot name.

If it is an unmanaged disk, snapshots work in a different way. You can read more about it at https://docs.microsoft.com/en-us/azure/virtual-machines/linux/incremental-snapshots 

Convert to Managed Disks

If required, we can convert a VM with unmanaged disks to managed disks. To do that, first we need to deallocate the VM.

az vm deallocate --resource-group rebeladminrg01 --name REBELVM102

Then we can start the conversion process using:

az vm convert --resource-group rebeladminrg01 --name REBELVM102

Once the process is finished, it will start the VM. 

clistore8

clistore9

Manage blobs

We can also create, manage and delete blobs using Azure CLI. 

To create a container we can simply use,

az storage container create --name datastorage01 --connection-string "DefaultEndpointsProtocol=https;EndpointSuffix=core.windows.net;AccountName=rebelstorage01;AccountKey=xxxxxxx/ixi+FKRr3YUS9CgEhCciGVIyI9+6CtqjTIiPvbXkmpFDK9sINE28jdbIwLLOUZyiAtQ3Edzx2y89RPQ=="

In the above, --name defines the container name, AccountName specifies the storage account name and AccountKey specifies the auth key for the storage account. 

By default, the container data is set to private. If needed, it can be set to public read access for blobs only (blob) or public read and list access to the whole container (container). This can be defined using --public-access.
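
For example, creating a container that allows anonymous read access to its blobs would look something like the following. The container name publicdata01 is just an example, and the connection string placeholder needs to be replaced with your own.

az storage container create --name publicdata01 --public-access blob --connection-string "<storage account connection string>"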

Once the container is created, we can upload a blob using:

az storage blob upload --file C:\myzip1.zip --container-name datastorage01 --name myzip1.zip --connection-string "DefaultEndpointsProtocol=https;EndpointSuffix=core.windows.net;AccountName=rebelstorage01;AccountKey=xxxxxx/ixi+FKRr3YUS9CgEhCciGVIyI9+6CtqjTIiPvbXkmpFDK9sINE28jdbIwLLOUZyiAtQ3Edzx2y89RPQ=="

In the above, --file defines the local file path, --container-name defines the container it is uploading to, and --name defines the blob name once it is uploaded. 

clistore10

To verify, we can list the blobs in the container using:

az storage blob list --container-name datastorage01 --output table --connection-string "DefaultEndpointsProtocol=https;EndpointSuffix=core.windows.net;AccountName=rebelstorage01;AccountKey=xxxxxx/ixi+FKRr3YUS9CgEhCciGVIyI9+6CtqjTIiPvbXkmpFDK9sINE28jdbIwLLOUZyiAtQ3Edzx2y89RPQ=="

clistore11

We can download a blob to local storage using:

az storage blob download --container-name datastorage01 --name myzip1.zip --file C:\myzip2.zip --connection-string "DefaultEndpointsProtocol=https;EndpointSuffix=core.windows.net;AccountName=rebelstorage01;AccountKey=xxxx/ixi+FKRr3YUS9CgEhCciGVIyI9+6CtqjTIiPvbXkmpFDK9sINE28jdbIwLLOUZyiAtQ3Edzx2y89RPQ=="

In the above, --file defines the path and name the file will have when downloaded to local storage.

clistore12

We can delete a blob using a command similar to:

az storage blob delete --container-name datastorage01 --name myzip1.zip --connection-string "DefaultEndpointsProtocol=https;EndpointSuffix=core.windows.net;AccountName=rebelstorage01;AccountKey=1WzgTd/ixi+FKRr3YUS9CgEhCciGVIyI9+6CtqjTIiPvbXkmpFDK9sINE28jdbIwLLOUZyiAtQ3Edzx2y89RPQ=="

clistore13

This marks the end of the blog post, and I hope it was useful. If you have any questions, feel free to contact me at rebeladm@live.com. Also follow me on Twitter @rebeladm to get updates about new blog posts.