Using DSC with Visual Studio Release Management

With Microsoft Release Management 2013 Update 3 CTP 1 (RM), you can now use Windows PowerShell or the Windows PowerShell Desired State Configuration (DSC) feature to deploy and manage configuration data. RM now supports deploying to Standard and Azure environments without having to set up the Microsoft Deployment Agent.

Using Standard Environments

A standard server is any on-premises machine that can be reached using a fully qualified domain name (FQDN). For this release, support is restricted to machines in the same domain or in two-way trusted domains. The prerequisites are that the standard server must have PowerShell remoting enabled and a WinRM port configured for HTTP communication.

To enable PowerShell remoting, run this command in an elevated PowerShell session:

Enable-PSRemoting -Force

To configure WinRM for HTTP communication, run this command in an elevated PowerShell session:

winrm quickconfig -transport:http

See Installation and Configuration for Windows Remote Management for details. Standard environments can be created in RM by going to the Environments tab, clicking the arrow on the New button, and selecting New Standard.
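The whole server preparation can be sketched end-to-end as follows, including a verification step; the target machine name is an illustrative assumption, not part of the product:

```powershell
# Run in an elevated PowerShell session ON the standard server.

# Enable PowerShell remoting (creates the WinRM listener and firewall rules).
Enable-PSRemoting -Force

# Ensure WinRM is configured for HTTP transport.
winrm quickconfig -transport:http

# Verify which listeners (and ports) are configured; default HTTP port is 5985.
winrm enumerate winrm/config/listener

# From the RM server, confirm the standard server is reachable over WinRM.
# 'target.contoso.com' is an illustrative machine name.
Test-WSMan -ComputerName target.contoso.com
```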
Now you can see this environment in the list of environments. Note that the Type of the environment is Standard, as opposed to 2013 for a deployer-based environment. To add a standard server, open the previously created Standard environment and click the Create button under the Servers tab. In the new server pop-up, specify the DNS name of the on-premises machine you want to add as a standard server in RM. The DNS name must also include a port number. This is the WinRM port with the HTTP transport protocol, which should already be open on the standard server (see the commands at the beginning of this section). By default this port is 5985. To see which ports already have listeners, run this command:

winrm e winrm/config/listener

Note that HTTPS transport is not currently supported, so make sure the HTTP port is enabled. After you save, the standard server appears in the list of servers. Note that the Deployer Status is always Ready, irrespective of whether the machine is actually up, so make sure all required machines are started before doing a release to standard servers. Now a release path with stages, e.g. QA and Production, can be created using two Standard environments.
Using Azure Environments

An Azure environment refers to any cloud service in Azure that has one or more IaaS VMs, so the first step is to create an Azure environment. Consider two different environments in Azure, QA and Prod. QA contains one VM hosting both the web tier and the data tier; Prod contains two VMs for the web tier and one VM for the data tier. Make sure your VM exposes the PowerShell endpoint when you create it; this endpoint can be enabled in the Azure Management Portal while creating the IaaS VM.
To use these Azure environments, subscription information must be provided to RM via the new Manage Azure tab on the Administration page. Here you can add all the subscriptions you want to use for the different stages of your application deployment. To import an Azure subscription into the Release Management client, you need the following information:

1. Azure Subscription ID
2. Management Certificate Key

You can get these for your subscription by downloading the publish settings file from https://windows.azure.com/download/publishprofile.aspx. Download the publish settings file for your subscription and then enter the details from it in the Manage Azure tab.
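The publish settings file is plain XML, so the two values can be pulled out with a few lines of PowerShell. This is a sketch under assumptions: the file path is illustrative, and the ManagementCertificate attribute lives on PublishProfile in older schema versions and on Subscription in newer ones, so both locations are checked:

```powershell
# Read the downloaded .publishsettings file (path is an illustrative assumption).
[xml]$publishSettings = Get-Content 'C:\Downloads\mysubscription.publishsettings'

$subscription = $publishSettings.PublishData.PublishProfile.Subscription |
    Select-Object -First 1

$subscriptionId = $subscription.Id

# The base64-encoded management certificate: attribute location varies by schema version.
$certBase64 = if ($subscription.ManagementCertificate) {
    $subscription.ManagementCertificate
} else {
    $publishSettings.PublishData.PublishProfile.ManagementCertificate
}

"Subscription ID: $subscriptionId"
"Management certificate key (base64): $certBase64"
```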
Once the Azure subscription is added, it takes a couple of minutes (depending on the number of cloud services and VMs) to sync all Azure environments and servers. In the Environments tab, every Azure environment has Azure as its Type; Microsoft Deployment Agent based environments have Type 2013.

Servers are listed in the Servers tab. An Azure server name is qualified with <env name>::. As in the Environments tab, the Servers tab also calls out the Type, Azure or deployer based. For an Azure VM, an error message is shown in the Deployer Error column when the server is not ready for deployment, either because the server is not in the Ready state or because the PowerShell port is not open on the server.
Release Management PS/DSC Based Deployment

PowerShell (PS)

You can use PS scripts to deploy application components to standard or Azure servers. These scripts can be the same ones you may already be using to deploy to Microsoft Deployment Agent based servers. RM requires the PS version on target servers to be higher than 2.0; if a standard server has PS 2.0, upgrade it to PS 3.0.

Desired State Configuration (DSC)

Given that Microsoft supports Desired State Configuration (DSC) as a first-class experience in Windows, RM leverages the Windows DSC agent for deployment and configuration. DSC ships in the box with Windows 8.1 and Windows Server 2012 R2. DSC is also part of Windows Management Framework 4.0, which ships as an optional update and can be installed on Windows Server 2012, Windows Server 2008 R2, Windows 7, and Windows 8. DSC-based deployment works only when the RM server machine has PowerShell 3.0 or later and the target environment has DSC support. Go through the following links to get started with writing DSC scripts:

1. High level concepts
2. Basic introduction
3. An in-depth walkthrough
4. Documentation homepage

Tool for PS/DSC Deployment on Standard environment

There is a new tool which works in the context of deployment to Standard environments.

Run PowerShell on Standard Environment Tool

This tool is used to run PS and DSC scripts on a remote target machine. Using this tool, you can define a component which deploys and configures your application through a PS and/or DSC script. Note that your standard server must be able to access and read the application package from the build drop. Since RM creates a remote PowerShell connection to the standard server, accessing the remote application package becomes a second hop. To enable this second hop, RM uses CredSSP. You will need to enable CredSSP manually on the RM server machine before doing a Standard environment based release.
To enable CredSSP, run the following PowerShell command on the Release Management server with administrator privileges:

Enable-WSManCredSSP -Role Client -DelegateComputer <Target Machine DNS Name> -Force

You can use * or *.domain to select delegate computers. Another point to note: if the standard server is a Windows Server OS machine, you will have to add the build drop location as a trusted site. Otherwise, the standard server might not load imported scripts (your master deployment script can load other scripts) from the package location.
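To make the DSC side concrete, here is a minimal sketch of the kind of configuration script such a component could run. The node, feature, and path names are illustrative assumptions, not values from the product:

```powershell
# A minimal DSC configuration sketch: ensures IIS is installed and the web-tier
# content is present. All names and paths here are illustrative assumptions.
Configuration WebTierConfig
{
    Node "localhost"
    {
        WindowsFeature IIS
        {
            Ensure = "Present"
            Name   = "Web-Server"
        }

        File WebContent
        {
            Ensure          = "Present"
            Type            = "Directory"
            Recurse         = $true
            SourcePath      = "C:\Packages\WebTier"          # where the package landed
            DestinationPath = "C:\inetpub\wwwroot\StockTrader"
            DependsOn       = "[WindowsFeature]IIS"
        }
    }
}

# Compiling the configuration produces a MOF file, which the DSC agent applies:
WebTierConfig -OutputPath C:\DscMofs
Start-DscConfiguration -Path C:\DscMofs -Wait -Verbose
```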
Currently, only an on-premises TFS server is supported as the source of components. The Run tool accepts six parameters; see the Description column for an idea of each parameter. Here are more details about each parameter:
UserName/Password: The account and password used to connect to the target standard server. This user needs to be a domain user and also a member of the local Administrators group on the standard server.

ScriptPath/ConfigurationPath: Suppose that, while defining the component for the web tier deployment, the Path to Package specified is [BuildDropLocation]\StockTraderApplication\StockTraderUpload\WebTier. ScriptPath and ConfigurationPath are relative to this package path. For example, if both the script and the configuration file are under the \WebTier folder, then the value for ScriptPath and ConfigurationPath will be <script_file_name>. If both are under \WebTier\Scripts, then the value will be Scripts\<script_file_name>.

UseCredSSP: Specify true if your application package path is a UNC path. If you do not want to enable CredSSP on the standard server, you can set this to false, but in that case you will have to make sure the package is available locally on the standard server.

Deployment sequence

Below is an example of a two-tier application being deployed. In the QA stage, the application is deployed on one box only. In the Production stage, the same application is deployed to two web servers and one DB server. The scripts remain the same across stages; only the config files for these scripts vary. Also, in the Production stage, the web tier is deployed using a PS script, while the DB tier is deployed using a DSC script in parallel. The sequence looks like:

1. Deploy the web and data tiers on the targeted standard server(s).
2. Manual intervention to check whether the deployment happened properly.
3. Roll back if the deployment was found to be wrong.

QA stage
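As a hedged illustration, a web-tier deployment script of the kind this tool invokes might look like the sketch below. The parameter, site, and path names are all assumptions for illustration; they are not the product's actual contract:

```powershell
# Sketch of a web-tier deployment script the Run PowerShell on Standard
# Environment tool could invoke. $applicationPath (the unpacked package
# location) is assumed to be supplied by the tooling; defaults to the
# script's own folder. Site and path names are illustrative.
param(
    [string]$applicationPath = $PSScriptRoot
)

Import-Module WebAdministration   # IIS management cmdlets

$siteRoot = 'C:\inetpub\wwwroot\StockTrader'

# Copy the web-tier package content to the site root.
New-Item -ItemType Directory -Path $siteRoot -Force | Out-Null
Copy-Item -Path (Join-Path $applicationPath '*') -Destination $siteRoot -Recurse -Force

# Create the site only if it does not exist yet (idempotent re-deployment).
if (-not (Get-Website -Name 'StockTrader' -ErrorAction SilentlyContinue)) {
    New-Website -Name 'StockTrader' -Port 8080 -PhysicalPath $siteRoot | Out-Null
}
```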
Note that in the toolbox on the left-hand side, only actions relevant to this scenario are displayed; actions meant for the deployer-based scenario are filtered out.

Production Stage
A few interesting points:

1. Parallel, Rollback, and RollbackAlways are supported in this PS/DSC scenario.
2. The same tool is used to deploy both PS and DSC scripts. The web tier is deployed using a PS script; the data tier is deployed using a DSC script.
3. The PS/DSC script is the same as for the QA stage; only the configuration file changes for this stage.

Tool for PS/DSC Deployment on Azure environment

There are two new tools which work in the context of deployment to Azure environments.

Upload Tool

For Azure-specific deployment where the application bits are behind a firewall, the best option is to upload the bits to Azure storage so that the deployment process can download them onto the target machine from Azure storage. Using the Upload tool, you can define a component which uploads your package to Azure storage. Like other components, you can specify how and from where to get the package in the Source tab.
Let's see how this tool can upload the StockTrader application bits to Azure storage. Consider the directory structure of the application build drop and the Package Source location specified while defining the component. Since \ has been specified as the Path to Package, the whole build will be uploaded. Currently, only an on-premises TFS server is supported as the source of components.
The Upload tool needs just one parameter, Storage Account, naming the account to which the application bits will be uploaded. Internally, the tool creates a private container on blob storage with a GUID as its name and uploads the bits there. Note that the GUID is unique to each deployment sequence instance. As an example of how this tool works, say your storage account name is X and the Path to Package specified for the Upload component is [BuildDropLocation]\StockTraderApplication\StockTraderUpload; the tool will then upload all files under the StockTraderApplication\StockTraderUpload folder, with blob names reflecting the folder structure under the build drop.
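The behavior described above can be sketched with the Azure PowerShell storage cmdlets. This illustrates what the tool does, not its actual implementation; the account key and build drop path are assumptions:

```powershell
# Sketch: create a private container named with a GUID and upload the package,
# preserving the folder structure in the blob names.
$context   = New-AzureStorageContext -StorageAccountName 'X' -StorageAccountKey '<key>'
$container = [guid]::NewGuid().ToString()
New-AzureStorageContainer -Name $container -Permission Off -Context $context | Out-Null

$packageRoot = '\\builddrop\StockTraderApplication\StockTraderUpload'
Get-ChildItem -Path $packageRoot -Recurse -File | ForEach-Object {
    # Blob name = path relative to the package root, with forward slashes.
    $relative = $_.FullName.Substring($packageRoot.Length + 1).Replace('\', '/')
    Set-AzureStorageBlobContent -File $_.FullName -Container $container `
        -Blob "StockTraderApplication/StockTraderUpload/$relative" `
        -Context $context | Out-Null
}
```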
We will see how to use this component in a release workflow in a later section.

Run PowerShell on Azure Environment Tool

This tool is used to run PS and DSC scripts on a remote target machine. Using this tool, you can define a component which deploys and configures your application through a PS and/or DSC script. Note that, as of now, there is a direct coupling between the Run tool and the Upload tool: the Upload component must have executed before the Deploy component runs. More on this later in the doc.
The Run tool accepts six parameters; see the Description column for an idea of each parameter. Here are more details about each parameter:

UserName/Password: The account and password used to connect to the target Azure VM. This user needs to be a member of the local Administrators group on the Azure VM.

ScriptPath/ConfigurationPath: Suppose that, while defining the component for the web tier deployment, the Path to Package specified is [BuildDropLocation]\StockTraderApplication\StockTraderUpload\WebTier. ScriptPath and ConfigurationPath are relative to this package path. For example, if both the script and the configuration file are under the \WebTier folder, then the value for ScriptPath and ConfigurationPath will be <script_file_name>. If both are under \WebTier\Scripts, then the value will be Scripts\<script_file_name>.

StorageAccount: The storage account name from which the required bits are downloaded onto the target machine. A natural question arises: given just the storage account name, how does this tool download the right bits onto the target machine? As mentioned earlier, the Run DSC component assumes the Upload component has already executed in the same deployment sequence instance, and the storage account name is the same one used in the Upload component. The Upload component creates a container whose GUID name is unique to each deployment sequence instance, so the tool now has both the storage account and the container name. Does that mean it downloads everything inside the container? No; you can control which bits are downloaded onto the target machine. If the Run DSC component is for the web tier deployment, you will obviously want to download only the web tier bits, so you can specify the Path to Package accordingly.
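This resolution step can be sketched with the Azure PowerShell storage cmdlets; again, this illustrates the described behavior rather than the tool's actual code, and the container GUID, key, and paths are placeholders:

```powershell
# Sketch: download only the web-tier bits from the per-deployment container.
# The container GUID comes from the Upload component of the same deployment
# sequence instance; key and paths are illustrative.
$context   = New-AzureStorageContext -StorageAccountName 'X' -StorageAccountKey '<key>'
$container = '<container GUID from the Upload component>'
$prefix    = 'StockTraderApplication/StockTraderUpload/WebTier/'

Get-AzureStorageBlob -Container $container -Context $context |
    Where-Object { $_.Name.StartsWith($prefix) } |
    ForEach-Object {
        Get-AzureStorageBlobContent -Blob $_.Name -Container $container `
            -Destination 'C:\Packages' -Context $context -Force | Out-Null
    }
```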
For the StockTrader application example, the Path to Package for the DeployStockTraderWebTier component will be [BuildDropLocation]\StockTraderApplication\StockTraderUpload\WebTier.

SkipCaCheck: Communication between the Release Management server and the deployment machine happens over an SSL connection. Establishing the connection involves certificate verification by a Certificate Authority. To skip the certificate check before establishing the connection, set the SkipCaCheck attribute to true.
To enable the certificate check before establishing the connection, set the SkipCaCheck attribute to false; in addition, a certificate corresponding to the cloud service to which the deployment machine belongs must be installed on the Release Management server.

Steps for certificate installation:

1. Install the Windows Azure SDK [link].
2. Install Windows Azure PowerShell [link].
3. Run Windows Azure PowerShell in administrator mode.
4. Run the following script, substituting your cloud service name and subscription name:

$cloudservicename = 'YOURCLOUDSERVICENAME'
$subscriptionname = 'YOURSUBSCRIPTIONNAME'
Add-AzureAccount
Select-AzureSubscription -SubscriptionName $subscriptionname
$AzureX509cert = Get-AzureCertificate -ServiceName $cloudservicename
$certtempfile = [IO.Path]::GetTempFileName()
$AzureX509cert.Data | Out-File $certtempfile
$CertToImport = New-Object System.Security.Cryptography.X509Certificates.X509Certificate2 $certtempfile
$store = New-Object System.Security.Cryptography.X509Certificates.X509Store 'Root', 'LocalMachine'
$store.Open([System.Security.Cryptography.X509Certificates.OpenFlags]::ReadWrite)
$store.Add($CertToImport)
$store.Close()
Remove-Item $certtempfile

5. Provide credentials to access the corresponding Azure subscription when prompted.
Deployment sequence

Below is an example of a two-tier application being deployed. In the QA stage, the application is deployed on one box only. In the Production stage, the same application is deployed to two web servers and one DB server. The scripts remain the same across stages; only the config files for these scripts vary. Also, in the Production stage, the web tier is deployed using a PS script, while the DB tier is deployed using a DSC script in parallel. The sequence looks like:

1. First, upload the complete application bits to Azure.
2. Start the Azure environment.
3. Deploy the web and data tiers on the targeted Azure VMs.
4. Manual intervention to check whether the deployment happened properly; roll back if the deployment was found to be wrong.
5. Stop the Azure environment to avoid incurring Azure VM cost.

QA stage

A few points of interest:

1. There are two new actions:
   a. Start Environment. This starts the Azure cloud service linked to this stage.
   b. Stop Environment. This stops the Azure cloud service linked to this stage.
2. Start Environment, Stop Environment, and the Upload component are headless activities, i.e. they do not need to reside inside any container; their effect is at the environment level.
3. In the toolbox on the left-hand side, only Azure-related actions are displayed. Other actions meant for the deployer-based scenario are filtered out.

Production Stage
A few interesting points:

1. Parallel, Rollback, and RollbackAlways are supported in this PS/DSC scenario.
2. Inside the Rollback and RollbackAlways blocks, you will first need to re-run the Upload components. This is a limitation right now that will be removed going forward.
3. The same tool is used to deploy both PS and DSC scripts. The web tier is deployed using a PS script; the data tier is deployed using a DSC script.
4. The PS/DSC script is the same as for the QA stage; only the configuration file changes for this stage.