Archive

Posts Tagged ‘Azure’

Key features of the new Microsoft Azure Site Recovery Deployment Planner

December 19, 2017

Azure Site Recovery Deployment Planner is now GA with support for both Hyper-V and VMware.

The cost of Disaster Recovery to Azure is now included in the report. It gives the compute, storage, network and Azure Site Recovery license cost per VM.

ASR Deployment Planner does a deep, ASR-specific assessment of your on-premises environment. It provides the recommendations required by Azure Site Recovery for successful DR operations such as replication, failover, and DR drill of your VMware or Hyper-V virtual machines.

Also, if you intend to migrate your on-premises workloads to Azure, use Azure Migrate for migration planning. Azure Migrate assesses on-premises workloads and provides guidance for migration.

Key features of the tool are:

  1. Estimated network bandwidth required for initial replication (IR) and delta replication.
  2. Storage type (standard or premium) required for each VM.
  3. Total number of standard and premium storage accounts to be provisioned.
  4. For VMware, the required number of Configuration Servers and Process Servers to be deployed on-premises.
  5. For Hyper-V, the additional storage required on-premises.
  6. For Hyper-V, the number of VMs that can be protected in parallel (in a batch) and the protection order of each batch for successful initial replication.
  7. For VMware, the number of VMs that can be protected in parallel to complete initial replication in a given time.
  8. Throughput that ASR can achieve from on-premises to Azure.
  9. VM eligibility assessment based on number of disks, disk size, IOPS and OS type.
  10. Estimated DR cost for the target Azure region in the specified currency.
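
The planner itself is driven from the command line on a Windows machine. As a rough, non-authoritative illustration of a VMware profiling run followed by report generation (the operation and parameter names below are assumptions based on the tool's documentation and may differ in the version you download, so verify them against the linked docs):

# Hypothetical illustration only; flags are assumptions, check the official documentation.
# Profile the VMs listed in VMList.txt against a vCenter server for 15 days
.\ASRDeploymentPlanner.exe -Operation StartProfiling `
    -Virtualization VMware `
    -Directory "E:\ProfiledData" `
    -Server vcenter01.contoso.local -User vcenterReadOnlyUser `
    -VMListFile "E:\ProfiledData\VMList.txt" `
    -NoOfDaysToProfile 15

# Generate the report (including the DR cost estimate) from the collected data
.\ASRDeploymentPlanner.exe -Operation GenerateReport `
    -Virtualization VMware `
    -Directory "E:\ProfiledData" `
    -Server vcenter01.contoso.local `
    -VMListFile "E:\ProfiledData\VMList.txt"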


When to use ASR Deployment Planner and Azure Migrate?

  • DR from VMware/Hyper-V to Azure: use ASR Deployment Planner
  • Migration from VMware to Azure: use Azure Migrate

 

Download the tool and learn more about VMware to Azure Deployment Planner and Hyper-V to Azure Deployment planner.


Extending Microsoft OMS to monitor Squid Proxy running in Linux with a plugin – part 1/3 #MSOMS

November 24, 2016

Since Microsoft released OMS, I have been an early adopter and evangelist for the solution. Not only is it simple to deploy, but it gives you visibility across a full spectrum of the workloads you have, whether on-premises or in the cloud, and it does not matter which cloud: Azure, AWS, Google and many others.

So, as I was advising on OMS for a customer, I found that they were running Squid proxy servers. Squid is one of the best-known proxy servers in the world and it has been used for years in many organisations. For that reason, I decided to look at how OMS could be extended to monitor Squid.


As you can see here: https://github.com/Microsoft/OMS-Agent-for-Linux/tree/master/installer/conf/omsagent.d there are already many plugins for OMS to monitor Windows and many Linux OSes, as well as DNS, Network, SQL, MySQL, Postgres, VMware, MongoDB, Security, Audit, Change Tracking and so on.

But there was no Squid plugin, and that's where I brought back my past years of experience as a developer. Although that was a long, long time ago, I was able to develop a Squid plugin for Microsoft OMS in Ruby.

How did I develop it?

PART 1 : LOG Files

  1. I started by investigating the Squid log at /var/log/squid/access.log and then researched regular expressions to extract information out of it. Below is an extract of the log:

1479696836.902    134 10.1.1.4 TCP_MISS/301 488 open http://cnn.com/ – HIER_DIRECT/151.101.0.73 –
1479696848.110    242 10.1.1.4 TCP_MISS/400 486 open http://www.sydney.com/ – HIER_DIRECT/54.253.253.77 text/html
1479696860.004    407 10.1.1.4 TCP_MISS/301 636 open http://www.7news.com.au/ – HIER_DIRECT/203.84.217.229 text/html

The initially difficult part for me was decoding the date/time into a human-readable format. So, after long hours of research and experimentation, I settled on the following regex:

 REGEX =/(?<eventtime>(\d+))\.\d+\s+(?<duration>(\d+))\s+(?<sourceip>(\d+\.\d+\.\d+\.\d+))\s+(?<cache>(\w+))\/(?<status>(\d+))\s+(?<bytes>(\d+)\s+)(?<response>(\w+)\s+)(?<url>([^\s]+))\s+(?<user>(\w+|\-))\s+(?<method>(\S+.\S+))/
(If you have a better one, please feel free to let me know.)

 

  2. I then wrote squidparserlog.rb in Ruby to parse the Squid access.log file and turn it into an OMS format:
class SquidLogParserLib
  require 'date'
  require 'etc'
  require_relative 'oms_common'
  require 'fluent/parser'

  def initialize(error_handler)
    @error_handler = error_handler
  end

  # Named-capture regex for the Squid access.log format shown above
  REGEX = /(?<eventtime>(\d+))\.\d+\s+(?<duration>(\d+))\s+(?<sourceip>(\d+\.\d+\.\d+\.\d+))\s+(?<cache>(\w+))\/(?<status>(\d+))\s+(?<bytes>(\d+)\s+)(?<response>(\w+)\s+)(?<url>([^\s]+))\s+(?<user>(\w+|\-))\s+(?<method>(\S+.\S+))/

  def parse(line)
    data = {}
    time = Time.now.to_f
    begin
      REGEX.match(line) { |match|
        data['Host'] = OMS::Common.get_hostname
        # The Squid timestamp is in UNIX epoch seconds; convert it to a readable format
        timestamp = Time.at(match['eventtime'].to_i())
        data['EventTime'] = OMS::Common.format_time(timestamp)
        data['EventDate'] = timestamp.strftime('%Y-%m-%d')
        data['Duration'] = match['duration'].to_i()
        data['SourceIP'] = match['sourceip']
        data['cache'] = match['cache']
        data['status'] = match['status']
        data['bytes'] = match['bytes'].to_i()
        data['httpresponse'] = match['response']
        data['url'] = match['url']
        data['user'] = match['user']
        data['method'] = match['method']
      }
    rescue => e
      @error_handler.logerror("Unable to parse the line #{e}")
    end
    return time, data
  end   # def
end   # class
  3. Finally, I wrote the squid.conf for OMS:
# Enhanced log parsing with date format. It passes the path of the log file to the SquidLogParser and tags the records as oms.api.Squid. By doing this, you end up with 11 custom fields in OMS for the log type Squid_CL.
<source>
type tail
format SquidLogParser
path /var/log/squid/access.log
pos_file /var/opt/microsoft/omsagent/state/var_log_squid_access.pos
tag oms.api.Squid
log_level error
</source>

 

In my next article I will go through the next part, which is getting Squid proxy statistics into OMS, along with the full code.


 

Windows 2016 released and with it Hyper-V and System Center

September 27, 2016

Today at the Microsoft Ignite conference in Atlanta, Microsoft released Windows Server 2016!

Windows Server 2016 is jam-packed with innovation and customer response has been overwhelming, with more than half a million devices running the final Technical Preview. These customers range from large global enterprises to private cloud hosters to organizations of every size from every corner of the globe – Erin Chapple, General Manager, Windows Server

 

Windows Server 2016 delivers powerful innovation across three areas:

  • Advanced Multi-layer Security: Use Shielded Virtual Machines to help protect your virtual machines from a compromised fabric and improve your compliance. Shielded Virtual Machines are encrypted using BitLocker and will only run on healthy hosts. New features help prevent attacks and detect suspicious activity by controlling privileged access, protecting virtual machines and hardening the platform against emerging threats. Watch an introduction to Shielded Virtual Machines.
  • Software-defined Datacenter with Hyper-V: Run your datacenter with the utmost confidence with an automated, resilient server operating system. Azure utilises Windows Server and Hyper-V at massive scale. Windows Server delivers a more flexible and cost-efficient operating system for any datacenter, using software-defined compute, storage and network features inspired by Azure. Explore server virtualization with Hyper-V.
  • Cloud-ready Application Platform: Run your existing apps on Windows Server 2016 without modifying them, and take advantage of enhanced security and efficiency features in the fabric. Applications are at the heart of every organization and its ability to serve customers and compete effectively for their loyalty. Windows Server 2016 delivers new ways to deploy and run both existing and cloud-native applications, whether on-premises or in Microsoft Azure, using new capabilities such as Windows Server Containers and the lightweight Nano Server deployment option. Learn more about containers and learn more about Azure Service Fabric on Windows Server 2016.

 

Availability: Windows Server 2016 is available for evaluation starting today

Note: Volume licensing customers will be able to download fully licensed software at General Availability in mid-October.

Azure Automation: Calling a PowerShell from a WebApp

April 15, 2016

I am working on a project that requires an Azure PowerShell script to be called from a WebApp. Without going into the details of the app, I faced a problem when writing the PowerShell script when it came to authentication, and running the script from the Azure Automation portal is not my scenario.


Automation: The figure shows an external app calling a Microsoft Azure webhook to start a runbook.

Before I start, let's have a look at the authentication methods. The following table summarises the different authentication methods for each environment supported by Azure Automation and the article describing how to set up authentication for your runbooks.

Method | Environment | Article
Azure AD User Account | Azure Resource Manager and Azure Service Management | Authenticate Runbooks with Azure AD User account
Azure AD Service Principal object | Azure Resource Manager | Authenticate Runbooks with Azure Run As account
Windows Authentication | On-Premises Datacenter | Authenticate Runbooks for Hybrid Runbook Workers
AWS Credentials | Amazon Web Services | Authenticate Runbooks with Amazon Web Services (AWS)
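
For reference, the service principal option from the table above is normally consumed inside a runbook through a connection asset. A minimal sketch, assuming the account was created with the default "AzureRunAsConnection" asset:

# Minimal sketch of runbook authentication with the Run As (service principal) account.
# Assumes the default connection asset name "AzureRunAsConnection" exists in the account.
$conn = Get-AutomationConnection -Name "AzureRunAsConnection"

Add-AzureRmAccount -ServicePrincipal `
    -TenantId $conn.TenantId `
    -ApplicationId $conn.ApplicationId `
    -CertificateThumbprint $conn.CertificateThumbprint

# From here the runbook can call any AzureRM cmdlet, e.g. list the VMs
Get-AzureRmVM | Select-Object Name, ResourceGroupName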

So, what methods did I find to start the PowerShell script from my WebApp?

  • Option 1: Webapp calling a PowerShell Azure RM Automation Runbook.
  • Option 2: Webapp calling an Azure Automation webhook. Great way of doing it. A webhook allows you to start a particular runbook in Azure Automation through a single HTTP request, so external services such as my custom application can start runbooks (see the sketch after this list).
  • Option 3: Webapp calling a PowerShell script. The issue here becomes the authentication.
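
As a quick preview of Option 2: invoking a webhook needs nothing more than an HTTP POST from the caller. A minimal sketch, assuming a webhook has already been created for the runbook (the URI below is a placeholder; the real one is shown only once, when the webhook is created):

# Minimal sketch: starting an Azure Automation runbook via its webhook.
# $webhookUri is a placeholder; use the URI displayed when you create the webhook.
$webhookUri = "https://s2events.azure-automation.net/webhooks?token=REPLACE_WITH_TOKEN"

# Optional payload; the runbook receives it through its $WebhookData parameter
$body = @{ VMName = "MyVM"; Action = "GetStatus" } | ConvertTo-Json

$response = Invoke-RestMethod -Method Post -Uri $webhookUri -Body $body
# The response contains the id of the runbook job that was queued
$response.JobIds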

 

Let's start with Option 1. I will discuss the other options in the next posts.

 

Option 1: Webapp calling a PowerShell Azure RM Automation Runbook

You can use a PowerShell Workflow (recommended, as you can perform multiple actions in parallel) or a PowerShell Script. More info here.

Note: You can’t convert runbooks from one type to another.

Create an Azure automation account

1.1. Log in to the Azure portal.

1.2. Click New > Management > Automation Account.

1.3. In the Add Automation Account blade, configure your Automation Account details (e.g. Name).

1.4. From your automation account, click the Assets part to open the Assets blade to create a new credential.

1.5. Click the Credentials part to open the Credentials blade.

1.6. Click Add a credential at the top of the blade.

1.7. Complete the form and click Create to save the new credential. For more info see Credential assets in Azure Automation.
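
If you prefer to script the steps above, a minimal sketch using the AzureRM.Automation module could look like this (the resource group, account, user and credential names are placeholders):

# Minimal sketch of steps 1.1-1.7 with the AzureRM.Automation module.
# Names and location are placeholders - adjust to your environment.
Add-AzureRmAccount

New-AzureRmAutomationAccount -ResourceGroupName "rg-automation" `
    -Name "MyAutomationAccount" -Location "Australia East"

# Create the credential asset the runbook will later retrieve with Get-AutomationPSCredential
$user = "runbookuser@contoso.com"
$pass = Read-Host -AsSecureString -Prompt "Password for $user"
$cred = New-Object System.Management.Automation.PSCredential($user, $pass)

New-AzureRmAutomationCredential -ResourceGroupName "rg-automation" `
    -AutomationAccountName "MyAutomationAccount" `
    -Name "MyRunbookCredential" -Value $cred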

 

Create a PowerShell script/workflow with the commands required for your solution (for example, get a list of VMs):

$cred = Get-AutomationPSCredential -Name "Replace with the Credential NAME"

Add-AzureRmAccount -Credential $cred

Select-AzureRmSubscription -SubscriptionName "Replace with your Subscription NAME"

Get-AzureRmVM

 

Create an Azure Automation Runbook

1.8. In the Azure Portal, click on Automation Accounts and select the Automation account you created previously.

1.9. Click on the Runbooks tile to open the list of runbooks.

1.10. Click on the Add a runbook button and then Import.

1.11. Click Runbook file to select the file to import.

1.12. If the Name field is enabled, you have the option to change it. The runbook name must start with a letter and can have letters, numbers, underscores, and dashes.

1.13. Select a runbook type, taking into account the restrictions listed above.

1.14. The new runbook will appear in the list of runbooks for the Automation Account.

1.15. You must publish the runbook before you can run it.

Alternatively, to import a runbook from a script file with Windows PowerShell:

$AutomationAcct = "Your Automation Account Name"

$runbookName = "TestRunbook"

$scriptPath = "c:\MyRunbooks\TestRunbook.ps1"

Set-AzureAutomationRunbookDefinition -AutomationAccountName $AutomationAcct -Name $runbookName -Path $scriptPath -Overwrite

Publish-AzureAutomationRunbook -AutomationAccountName $AutomationAcct -Name $runbookName
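
The two cmdlets above come from the older Azure Service Management module. If your automation account lives in Azure Resource Manager, a sketch of the equivalent (the resource group name is a placeholder) would be:

# Resource Manager equivalent of the import/publish above (AzureRM.Automation module).
# "rg-automation" is a placeholder resource group name.
Import-AzureRmAutomationRunbook -ResourceGroupName "rg-automation" `
    -AutomationAccountName $AutomationAcct `
    -Name $runbookName -Path $scriptPath -Type PowerShell

Publish-AzureRmAutomationRunbook -ResourceGroupName "rg-automation" `
    -AutomationAccountName $AutomationAcct -Name $runbookName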

 

Create an ASP.NET website which will call a PowerShell command.

The Webapp should call the following PowerShell:

Start-AzureAutomationRunbook -AutomationAccountName "replace with your Automation Account NAME created in step 1.3" -Name "replace with your runbook name, for example: MyGetVMRunbook"

For more info, click here

Next Post: Option 2 and 3….


Azure ASR’s SLA-backed enhanced VMware to Azure solution is now ready to replicate your on-premises workloads to Azure

January 14, 2016

You heard right. Microsoft has launched an enhanced version of its Azure Site Recovery (ASR) targeted especially for VMware customers.


The concept of ASR is very simple: organisations will be able to replicate their VMware virtual machines (VMs) to Azure, keep them updated through ongoing replication, and then run them in Azure as a disaster recovery option. They will be charged a small amount per VM but won't have to pay for compute or storage until the VM is up and running in Azure.

To note, Azure Site Recovery, as part of Microsoft Operations Management Suite (OMS), enables your organisation to gain control of and manage your workloads no matter the source: Azure, AWS, Windows Server, Linux, VMware or OpenStack.

 

Some of the key ASR characteristics:

  • With non-disruptive recovery testing, you can easily test the failover of your VMware virtual machines to Azure within minutes, and validate your workload's performance in Azure, without impacting ongoing replication or the production workload.
  • With ASR-integrated failback, start replicating your Azure virtual machines back to your on-premises ESXi environment, and failback to the original or an alternate location when your on-premises site is once again available for use.
  • Heterogeneous workload support, automated VMware vCenter Server discovery
  • Continuous data protection (CDP), one-click failovers with ASR Recovery Plan
  • Rich health monitoring and e-mail notifications.

I’ve been working with ASR for a while and I definitely recommend it.

Ready to start using ASR? Check out additional product information to start replicating your workloads to Microsoft Azure using Azure Site Recovery today. You can use the powerful replication capabilities of Site Recovery for 31 days at no charge for every new physical server or virtual machine that you replicate.

You can read the announcement at https://azure.microsoft.com/en-us/blog/ga-enhanced-migration-and-disaster-recovery-for-vmware-virtual-machines-and-physical-servers-to-azure-using-asr/

Is Security a cloud benefit or a shared responsibility?

November 9, 2015

Cloud adoption is skyrocketing and there is no doubt about it, with more and more customers realising its benefits: costs, flexibility, availability, etc.

But how about security? Is security a cloud benefit? Well, sort of. By migrating your systems to a public cloud you can certainly be assured that the providers invest substantially in security measures, policies and certifications to guarantee that the underlying infrastructure is a safe place for you to store your data and run your applications. But it stops there.

The conversation you should be having with your cloud provider is not whether they are secure. They are! They have all the industry standards and certifications to guarantee that. What you should be asking is whether they have the real-time data, metrics and resources to enable and help you protect your company data.

The security boundaries are limited to the infrastructure of the public cloud. It is your business's responsibility to make sure that your application runs safely and your data is protected, and some businesses don't get it.


Last week, when attending a session at the MVP Summit with Brad Anderson about identity and cloud, I realised how fragile the conversation happening between organisations and cloud providers is. Customers are adopting cloud with security in mind (in a recent study of IT decision makers by BT, more than three quarters of respondents (76%) said that security is their main concern when it comes to cloud-based services). But many of those customers are putting the responsibility to protect their data solely on the public cloud provider, and that is a mistake that needs to be addressed.

Let's take the example of a customer that migrated their email and documents to the cloud: among other benefits, data availability (anywhere, anytime, any device) is in my opinion one of the great cloud realisations. But data availability also brings a security risk to organisations if they don't invest in securing and protecting their data from unauthorised access.

Employees who access privileged company data from public Wi-Fi, for example, are exposed to all sorts of attackers and run a high risk of having their device compromised. Have you thought about that? Does your company have a VPN or other security measures for external access to company data?

Also, a password alone to stop someone from logging on to your computer is not sufficient to protect the data you have on it. Is your company making use of solutions to encrypt the local disk? Does your company have policies in place to prevent company data from being stored locally on your computer?

And how about your mobile? Ransomware is on the rise, with hackers taking over an entire system and holding it hostage until a fee is paid. Take the WhatsApp example: in August 2015, hackers discovered a bug that enabled them to infect the devices of those using the web version of the app. In another example, you may recall that Lenovo faced trouble earlier this year, when it found that some of its mobiles and notebooks were sold with pre-installed spyware (according to G DATA researchers, it happened somewhere along the supply chain, by an outside party). The same problem happened with Huawei, Xiaomi and others.

By not having security measures on your mobile, you could let a thief access your personal and company data if it gets stolen or lost:

  • Do you have a pin to protect your mobile?
  • Is your PIN strong enough or something like 1234 or 0000 or your birthday?
  • If you search for yourself on the internet, could any of the information lead to your password or PIN?
  • Is your company using a device management solution?

A couple of months ago, when running a workshop to architect a solution for a customer migrating their email to the cloud, I heard an incredible request from their IT manager: "whereas cloud concerns, the solution we want should encompass that some groups of employees should only have access to company email if they are physically connected to our network and data access should be protected from unauthorized people and devices."

At first you might think that, in these cloud times, a request to not allow data to be accessed outside the company network makes no sense and is a weird request, as one of the benefits of having email in the cloud is precisely being able to access it anywhere from any device, right? But the reason is simple: they realised that migrating their email to the cloud did not mean that the security measures and policies protecting their most precious asset, their customers' data, should no longer be in place. Their request was legitimate and valid, and it took me by surprise, as very few customers really understand that security in the cloud is a shared responsibility.

Security is one of the key concerns when a business decides to migrate to a public cloud, and although most businesses understand that the level of risk relates mostly to the behaviour and culture of their employees, some still don't have strict policies in place and lack data access controls, which poses a high risk to their main asset: their data.

I have extensive experience in security, cloud and datacenter management. Reach out to me and we can organise a workshop for your business at ac@cloudtidings.com.

More info on the main public cloud providers' security compliance:

Cloud domain controller as a service with @Azure AD Domain Services @microsoftenterprise

October 19, 2015

That's right, cloud AD as a service: a fully managed domain from Microsoft. Azure AD Domain Services lets you manage your Azure IaaS workloads.


Azure AD Domain Services is a cloud-based service which gives you a fully Windows Server Active Directory-compatible set of APIs and protocols, delivered as a managed Azure service.

You no longer need to provision virtual machines running domain controllers in Azure IaaS and have those domain controllers synchronise with your on-premises Active Directory servers over a VPN/ExpressRoute connection.

You can now turn on support for all the critical directory capabilities your applications and server VMs need, including Kerberos, NTLM, Group Policy and LDAP.
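
Because the managed domain speaks standard Kerberos/NTLM and LDAP, an Azure VM on the right virtual network can be joined to it just like a regular domain. A minimal sketch, run inside the VM (the domain name and user below are placeholders for your own tenant values):

# Minimal sketch: join an Azure IaaS VM to the managed domain.
# "contoso.com" and the admin account are placeholders; the account must be a member
# of the 'AAD DC Administrators' group described later in this post.
$cred = Get-Credential -UserName "admin@contoso.com" -Message "Domain join credentials"
Add-Computer -DomainName "contoso.com" -Credential $cred -Restart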

For scenarios like disaster recovery and hybrid cloud deployments, it is just perfect. It means the full value of Windows Server AD in the cloud, without having to deploy, manage, monitor and patch domain controllers.

There are many scenarios that can be explored with this new feature.

You can enable Azure AD Domain Services for any existing Azure AD tenant – the same tenant you use with Office 365 or other SaaS applications. Azure AD Domain Services are available now.

For pricing, please check : http://azure.microsoft.com/pricing/details/active-directory-ds/

To start:

  1. Make sure you have already deployed Azure AD Connect (to sync identity information from the on-premises Active Directory to your Azure AD tenant. This includes user accounts, their credential hashes for authentication (password sync) and group memberships).
  2. Create the 'AAD DC Administrators' group and then add all users who need to be administrators on the managed domain to it (see the PowerShell sketch after these steps). These administrators will be able to join machines to the domain and to configure group policy for the domain.
  3. Configure the Network. Select or create the Azure virtual network you’d like to make domain services available in. Ensure the following:
    • The virtual network belongs to a region supported by Azure AD Domain Services. See the region page for details.
    • Ensure the virtual network is a regional virtual network and doesn’t use the legacy affinity groups mechanism.
    • Ensure your workloads deployed in Azure Infrastructure services are connected to this virtual network


  4. Enable Azure AD Domain Services for your Azure AD tenant, by going to the Configure tab of your Directory, selecting Yes on ‘Enable Domain Services for This Domain’, specifying the domain name and selecting the Azure Virtual Network. Click on Save to confirm.
  5. Update DNS settings for the Azure virtual network to point to the new IP address of the Azure AD Domain Services you just enabled.
  6. Enable synchronization of legacy credential hashes to Azure AD Domain Services. This is a required step. By default, Azure AD does not store the credential hashes required for NTLM/Kerberos authentication. You need to populate these credential hashes in Azure AD so users can use them to authenticate against the domain.
Done. In a few simple steps you have set up AD as a service in Azure.
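
Step 2 above can also be scripted with the classic Azure AD (MSOnline) PowerShell module. A minimal sketch, with a placeholder user principal name; the group name must be exactly 'AAD DC Administrators':

# Sketch of step 2: create the 'AAD DC Administrators' group and add an administrator.
# Assumes the MSOnline module; the user principal name is a placeholder.
Connect-MsolService

$group = New-MsolGroup -DisplayName "AAD DC Administrators" `
    -Description "Delegated group to administer Azure AD Domain Services"

$user = Get-MsolUser -UserPrincipalName "admin@contoso.onmicrosoft.com"

Add-MsolGroupMember -GroupObjectId $group.ObjectId `
    -GroupMemberType User `
    -GroupMemberObjectId $user.ObjectId
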
A few salient aspects of the managed domain provisioned by Azure AD Domain Services:

  • This is a stand-alone managed domain. It is NOT an extension of your on-premises domain.
  • You won't need to manage, patch or monitor this managed domain.
  • There is no need to manage AD replication to this domain. User accounts, group memberships and credentials from your on-premises directory are already synchronized to Azure AD via Azure AD Connect.
  • Since the domain is managed by Azure AD Domain Services, there are no Domain Administrator or Enterprise Administrator privileges on this domain.