Integrating Active Directory into Windows Azure Virtual Machines

Active Directory Deployment Considerations and Implementation Steps in Windows Azure

Migrating traditional client/server applications to Windows Azure Virtual Machines is what Don Noonan does every day. The majority of these workloads use Active Directory Domain Services as their authentication provider (in other words, classic Windows authentication). In this post, Don walks us through the best-practice, high-level architecture and the basic building blocks of creating a private forest within Windows Azure.

If Active Directory is not available, you'd better be
As we all know, if AD is down, so is your app. Imagine setting up a single domain controller responsible for both name resolution (DNS) and authentication. You have just created another synonym for single point of failure. At a minimum, you should deploy two (2) domain controllers, created as part of an Availability Set. This ensures that at least one (1) domain controller is always available for authentication and name resolution requests. If you're considering saving a few bucks by deploying a single domain controller in non-production environments, let me save you a few more. The first call you get from development or QA will cost you at least six months of compute. Telling a dozen upset people on a conference call that you wanted to save the company $50 a month will sound pretty bad…

A private forest for me? Oh, you shouldn't have
There are currently two major scenarios for providing Windows authentication in Windows Azure Virtual Machines:

  • Deploy a new private forest
  • Extend an existing on-premises forest

In this blog we'll cover deploying a new private forest. Here is a quick Visio diagram of a classic three-tier application (using Windows Azure features) to get us started:

As you can see, we have a management subnet that contains our domain controllers, as well as separate database and application “tiers”.

Stop Talking and Start Deploying
As with any new deployment to Windows Azure Virtual Machines, you will perform the following high-level steps:

  1. Create an affinity group (See Bob Hunt’s Article in the Series)
  2. Create a virtual network (See Bob Hunt’s Article in the Series)
  3. Create a storage account (See Kevin Remde’s Article in the Series)
  4. Create virtual machines (See Tommy Patterson’s Article in the Series)

While creating the virtual network, you will need to specify that the domain controllers will also provide name resolution for all of the servers in your deployment. You can do this in the Windows Azure management portal or through the management web service. Here is how to do it via PowerShell:

Specifying custom DNS servers using PowerShell

Example command line:

Set-AzureVNetConfig -ConfigurationPath "C:\networkConfiguration.xml"

Contents of C:\networkConfiguration.xml:

<NetworkConfiguration>
  <VirtualNetworkConfiguration>
    <Dns>
      <DnsServers>
        <DnsServer name="skydc01" IPAddress="10.1.1.4" />
        <DnsServer name="skydc02" IPAddress="10.1.1.5" />
      </DnsServers>
    </Dns>
    <VirtualNetworkSites>
      <VirtualNetworkSite name="skyvn" AffinityGroup="skyag">
        <AddressSpace>
          <AddressPrefix>10.1.0.0/16</AddressPrefix>
        </AddressSpace>
        <Subnets>
          <Subnet name="Management">
            <AddressPrefix>10.1.1.0/24</AddressPrefix>
          </Subnet>
          <Subnet name="Database">
            <AddressPrefix>10.1.2.0/24</AddressPrefix>
          </Subnet>
          <Subnet name="Middleware">
            <AddressPrefix>10.1.3.0/24</AddressPrefix>
          </Subnet>
          <Subnet name="Application">
            <AddressPrefix>10.1.4.0/24</AddressPrefix>
          </Subnet>
        </Subnets>
        <DnsServersRef>
          <DnsServerRef name="skydc01" />
          <DnsServerRef name="skydc02" />
        </DnsServersRef>
      </VirtualNetworkSite>
    </VirtualNetworkSites>
  </VirtualNetworkConfiguration>
</NetworkConfiguration>
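After applying the configuration, it's worth a quick sanity check that the DNS servers and subnets registered as expected. A minimal sketch, assuming the Azure PowerShell module is loaded and your subscription is selected (the export path here is just an example):

```powershell
# Export the live network configuration so you can diff it against your XML.
Get-AzureVNetConfig -ExportToFile "C:\currentNetworkConfiguration.xml"

# Or inspect the virtual network site directly, matching the names used above.
Get-AzureVNetSite -VNetName "skyvn" |
    Select-Object Name, AffinityGroup, AddressSpacePrefixes, DnsServers
```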

In the example above, the IP addresses used assume the domain controllers are the first virtual machines created on the Management subnet. Let’s make sure that’s true by creating them now:

Creating Highly Available Domain Controllers using PowerShell

Relevant excerpts from createService.ps1:

# Instance size and gallery image shared by both domain controllers
$instanceSize = 'Small'
$imageName = 'MSFT__Win2K8R2SP1-Datacenter-201210.01-en.us-30GB.vhd'
$subnetName = 'Management'
$availabilitySetName = 'skydc'

# First domain controller (temporary local administrator password)
$password = '@skyDc01'
$vmName = 'skydc01'
$skydc01 = New-AzureVMConfig -Name $vmName -AvailabilitySetName $availabilitySetName -ImageName $imageName -InstanceSize $instanceSize |
    Add-AzureProvisioningConfig -Windows -Password $password |
    Set-AzureSubnet $subnetName

# Second domain controller, placed in the same availability set
$password = '@skyDc02'
$vmName = 'skydc02'
$skydc02 = New-AzureVMConfig -Name $vmName -AvailabilitySetName $availabilitySetName -ImageName $imageName -InstanceSize $instanceSize |
    Add-AzureProvisioningConfig -Windows -Password $password |
    Set-AzureSubnet $subnetName
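The excerpt above only builds the two VM configurations; they still have to be submitted to Windows Azure. A minimal sketch of that final step, assuming a new cloud service (the name skysvc is hypothetical) in the affinity group and virtual network created earlier:

```powershell
# Create both domain controllers in one call; the affinity group and
# vNet names match the network configuration shown earlier.
New-AzureVM -ServiceName 'skysvc' `
    -AffinityGroup 'skyag' `
    -VNetName 'skyvn' `
    -VMs $skydc01, $skydc02
```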

Once you've created the servers, you will need to make them domain controllers, a process known as promotion.

Promoting a Server to a Domain Controller using DCPROMO or PowerShell

Depending on which operating system you have chosen, you can automate forest creation from the command line. In the following examples, be sure to replace DOMAIN_HERE with the desired domain name, and replace the passwords with the temporary password you assigned to the local administrator account on the first (primary) server.

Windows Server 2008 R2 – Create a new forest using DCPROMO

dcpromo.exe /unattend:C:\primaryDomainController.txt

Contents of C:\primaryDomainController.txt:

[DCInstall]
; New forest promotion
ReplicaOrNewDomain=Domain
NewDomain=Forest
NewDomainDNSName=[DOMAIN_HERE].com
ForestLevel=4
DomainNetbiosName=DOMAIN_HERE
DomainLevel=4
InstallDNS=Yes
ConfirmGc=Yes
CreateDNSDelegation=No
DatabasePath="C:\Windows\NTDS"
LogPath="C:\Windows\NTDS"
SYSVOLPath="C:\Windows\SYSVOL"
[email protected]
RebootOnCompletion=Yes

Windows Server 2012 – Create a new forest using PowerShell

C:\primaryDomainController.ps1

Contents of C:\primaryDomainController.ps1:

Import-Module ADDSDeployment
Install-ADDSForest `
-CreateDnsDelegation:$false `
-DatabasePath "C:\Windows\NTDS" `
-DomainMode "Win2012" `
-DomainName "[DOMAIN_HERE].com" `
-DomainNetbiosName "DOMAIN_HERE" `
-ForestMode "Win2012" `
-InstallDns:$true `
-LogPath "C:\Windows\NTDS" `
-NoRebootOnCompletion:$false `
-SysvolPath "C:\Windows\SYSVOL" `
-Force:$true

Part of your homework will be to create the second domain controller in the new forest. You will need to make slight changes to the answer files above.
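As a starting point for that homework: on Windows Server 2012 the second domain controller joins the existing domain rather than creating a forest, so the cmdlet changes. A rough sketch using the ADDSDeployment module (the credential value is an assumption to adapt to the accounts in your forest):

```powershell
Import-Module ADDSDeployment

# Promote skydc02 as an additional domain controller in the existing domain.
# -Credential prompts for a domain administrator created during forest setup.
Install-ADDSDomainController `
    -DomainName "[DOMAIN_HERE].com" `
    -Credential (Get-Credential "[DOMAIN_HERE]\Administrator") `
    -InstallDns:$true `
    -DatabasePath "C:\Windows\NTDS" `
    -LogPath "C:\Windows\NTDS" `
    -SysvolPath "C:\Windows\SYSVOL" `
    -Force:$true
```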

What's Next?

Creating the rest of the servers required by your application seems like the logical next step. However, there are a handful of important tasks I like to do prior to creating ANY additional virtual machines:

  1. Create domain user accounts that will be used for future system administration.
  2. Create containers for major objects such as server computer accounts.
  3. Create core group policies for significant items such as:

  • Remote Desktop Services – Enable Keep-Alives (article posted previously at Skylera)
  • User Account Control
  • Windows Firewall
  • Windows Update
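The first two tasks are easy to script once the forest is up. A brief sketch using the ActiveDirectory module on a domain controller (the OU and account names here are hypothetical placeholders; adjust the distinguished-name paths to your own domain):

```powershell
Import-Module ActiveDirectory

# Containers for server computer accounts and administrative users.
New-ADOrganizationalUnit -Name "Servers" -Path "DC=DOMAIN_HERE,DC=com"
New-ADOrganizationalUnit -Name "Admins"  -Path "DC=DOMAIN_HERE,DC=com"

# A dedicated administration account, enabled after its password is set.
New-ADUser -Name "svcadmin" `
    -Path "OU=Admins,DC=DOMAIN_HERE,DC=com" `
    -AccountPassword (Read-Host -AsSecureString "Password") `
    -Enabled $true
```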

Important Considerations

When creating a private forest, weigh the administrative overhead involved against the level of isolation required. For example, you may want a single forest for all pre-production environments so that you only need to perform user account tasks in one place. This is easy to do in Windows Azure.

Written by Don Noonan (Don's Blog at Skylera)

Edited by Tommy Patterson (Tommy's Blog on Virtuallycloud9.com)

Be Sure to Read Up on the Rest of the Series for 31 Days of Servers in the Cloud!

http://virtuallycloud9.com/index.php/31-days-of-servers-in-the-cloud-series/

More Stories By Tommy Patterson

Tommy Patterson began his virtualization adventure during the launch of VMware's ESX Server's initial release. At a time when most admins were only adopting virtualization as a lab-only solution, he pushed through the performance hurdles to quickly bring production applications into virtualization. Since the early 2000s, Tommy has spent most of his career in a consulting role providing assessments, engineering, planning, and implementation assistance to many members of the Fortune 500. Troubleshooting complicated scenarios and incorporating best practices into customers' production virtualization systems has been his passion for many years. Now he shares his knowledge of virtualization and cloud computing as a Technology Evangelist on the Microsoft US Developer and Platform Evangelism team.
