Cisco Unified Communications Manager 10.x New Features Lab
Lab Guide Version 1.3




Table of Contents

- Lab Guide Key
- Cisco Unified CM 10.x Overview
  - UCM 10.0 Differentiation Highlights
  - Audience
  - Cisco UCM 10.x Overview
  - Pod Sites
- Lab Agenda
- CUCM 10 Topology
- CUCM 10.0 Addressing Tables
- Lab Workstation Address Table
- CUCM Version Table
- Lab Pre-configuration
- Accessing the Lab Equipment
  - Logging Into Remote Workstations
- Prime Collaboration Deployment
- IM and Presence Node Integration
  - Section 1 - Configure IM&P Node Settings
- Appendix
- Self-Provisioning
  - Section 1 - Setup LDAP Sync to Default Users to Default Templates
  - Section 2 - Demonstration of the New Self Care Portal in CUCM 10
  - Section 3 - Quick User/Phone Add for Day 2 Support
- MediaSense
  - Section 1 - Perform Initial Setup of the MediaSense Server
  - Section 2 - Configure VoH Resource and SIP Trunk in CUCM
  - Section 3 - Test Video on Hold
- Global Dial Plan Replication (GDPR)
  - Section 1 - Verify CIPC Registered to the Migrated CUCM v10 Cluster
  - Section 2 - Set Up the ILS Network Between Clusters

  - Section 3 - Implementing DN Replication with GDPR and Class of Control
  - Section 4 - Verify Proper Setup and Test
- End of Lab

Lab Guide Key

The following is a description of the conventions, colors, and notation used throughout this document:

- Sections with this background color and this icon touch on the business benefits of the step or task, with items and talking points highlighting the value proposition of a solution.
- Sections with this background color and this icon cover the technical description of the step or task, with items and talking points of interest to technical audiences.
- Sections with this background color and this icon provide a lab tip for the step or task.
- Sections with this background color and this icon are for scenario description: they provide background information for performing a step or task.
- Sections with this background color and this icon represent a warning: read these sections for special instructions and considerations.

Cisco Unified CM 10.x Overview

Cisco Unified Communications Manager (CUCM) 10.x provides a compelling collaboration platform highlighting video ubiquity and highly differentiated hosted and mid-market offers, positioning Cisco Unified Communications Manager as the core platform for all things collaboration: Voice, Video, IM, Presence, and Applications.

UCM 10.0 Differentiation Highlights

- Simplified upgrade, migration, new install, and hostname/IP address change using Prime Collaboration Deployment
- Integrated admin, serviceability, and user options for the IM & Presence Service
- New video conferencing resources, deployment models, use cases, and technologies
- Global Dial Plan Replication by Intercluster Lookup Service

Popular CUCM 10.x topics that will NOT be covered include:

- SAML SSO (10.0 only includes SAML SSO for user/admin pages, not applications or device logins such as Jabber)
- VPN-less firewall traversal (covered in the Collaboration Edge Lab)
- Many of the IM&P features, which are covered in detail in the Jabber Lab and have therefore not been covered here

Audience

This document is intended to assist solution architects, system engineers, field engineers, and consultants in deploying and learning many of the new features of the Cisco Unified Communications Manager (CUCM) 10.x system. This document assumes the reader has an architectural and administrative understanding of CUCM and has reviewed the latest CUCM SRND.

Cisco UCM 10.x Overview

This lab consists of 20 pods of lab equipment, with each pod containing the following. All servers are virtual machines running on VMware ESXi 5.0, with hardware platforms consisting of Unified Computing System (UCS) B-Series blade systems.

Pod Sites

- CUCM 10.0.1 Server (preinstalled) - Providing local device registration and call control
- CUCM IM & Presence Server 10.0 (2 nodes) - Providing Presence and IM
- CUCM 8.6.5 Server (preinstalled) - Used for migrating to v10
- CUP 8.6.5 - Used for migrating to v10

- CUCM v10 Server - Target for migration of the 8.6 CUCM server
- IM&P v10 Server - Target for migration of the 8.6.5 IM&P server
- MediaSense Server - Used to demonstrate new video capabilities in CUCM 10.0
- ESXi VM - Hosts the CUCM and IM&P 10.0 VMs for migration
- Prime Collaboration Deployment - Used to migrate CUCM and CUP 8.6 to v10.0
- Three Windows 7 Workstations - Student pod access and Jabber/CIPC

Lab Agenda

This lab will focus on the following new features in CUCM 10.x:

- Prime Collaboration Deployment
- CUCM IM&P Node Integration
- User Self-Provisioning and Self Care
- New Video Conferencing Capabilities
- Global Dial Plan Replication with ILS

CUCM 10 Topology

In this lab topology each device is a virtual machine (VM). The lab is running on Unified Computing System (UCS) B-Series blades; VMware ESXi 5.0 is the operating system and hypervisor on each lab host computer. The lab UCS host computers are oversubscribed and do not follow Cisco's best practices for UC on UCS. Please follow the best practices outlined on the UC Virtualized web site at http://cisco.com/go/ucvirtualized.

Topology hosts (all in domain uclab.com):

  ad-dns.uclab.com (AD / DNS)     10.10.10.5
  pcd.uclab.com                   10.10.10.10
  cucm-pub-v10.uclab.com          10.10.10.15
  imp-pub-v10.uclab.com           10.10.10.20
  imp-sub-v10.uclab.com           10.10.10.21
  cucm-pub-v86.uclab.com          10.10.10.25
  cup-pub-v86.uclab.com           10.10.10.30
  cucm-pub-mig-v10.uclab.com      10.10.10.35
  imp-pub-mig-v10.uclab.com       10.10.10.40
  esxi.uclab.com                  10.10.10.50
  mediasense.uclab.com            10.10.10.55
  Student Workstation #1          10.10.10.101
  Student Workstation #2          10.10.10.102
  Student Workstation #3          10.10.10.103
  Lab User Laptop                 Corporate Network

CUCM 10.0 Addressing Tables

Domain: uclab.com. Gateways: .254. Subnet masks: /24. 20 total pods.

  Description          IP Address     Domain\User      Password   Hostname
  AD/DNS/NTP           10.10.10.5     Administrator    C1sc0123   ad-dns
  PCD                  10.10.10.10    admin            C1sc0123   pcd
  CUCM 10.0            10.10.10.15    admin            C1sc0123   cucm-pub-v10
  IM&P 10.0 Node 1     10.10.10.20    admin            C1sc0123   imp-pub-v10
  IM&P 10.0 Node 2     10.10.10.21    admin            C1sc0123   imp-sub-v10
  CUCM 8.6             10.10.10.25    admin            C1sc0123   cucm-pub-v86
  CUP 8.6              10.10.10.30    admin            C1sc0123   cup-pub-v86
  CUCM 10.0 Migrated   10.10.10.35    admin            C1sc0123   cucm-pub-mig-v10
  IM&P 10.0 Migrated   10.10.10.40    admin            C1sc0123   imp-pub-mig-v10
  ESXi                 10.10.10.50    root             C1sc0123   esxi
  MediaSense           10.10.10.55    admin            C1sc0123   mediasense
  Workstation 1        10.10.10.101   uclab\student1   C1sc0123   ws01
  Workstation 2        10.10.10.102   uclab\student2   C1sc0123   ws02
  Workstation 3        10.10.10.103   uclab\student3   C1sc0123   ws03
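For attendees who want to script checks against their pod, the addressing table above can be captured in a small Python structure. This is purely a convenience sketch, not part of the official lab setup; the hostnames, addresses, and users are taken from the table.

```python
# Pod addressing table as a Python dictionary: hostname -> (IP address, login user).
# All values come from the CUCM 10.0 Addressing Tables above; this helper is a
# convenience for scripting and is not part of the lab itself.
POD_HOSTS = {
    "ad-dns":           ("10.10.10.5",   "Administrator"),
    "pcd":              ("10.10.10.10",  "admin"),
    "cucm-pub-v10":     ("10.10.10.15",  "admin"),
    "imp-pub-v10":      ("10.10.10.20",  "admin"),
    "imp-sub-v10":      ("10.10.10.21",  "admin"),
    "cucm-pub-v86":     ("10.10.10.25",  "admin"),
    "cup-pub-v86":      ("10.10.10.30",  "admin"),
    "cucm-pub-mig-v10": ("10.10.10.35",  "admin"),
    "imp-pub-mig-v10":  ("10.10.10.40",  "admin"),
    "esxi":             ("10.10.10.50",  "root"),
    "mediasense":       ("10.10.10.55",  "admin"),
    "ws01":             ("10.10.10.101", "uclab\\student1"),
    "ws02":             ("10.10.10.102", "uclab\\student2"),
    "ws03":             ("10.10.10.103", "uclab\\student3"),
}

def fqdn(hostname: str, domain: str = "uclab.com") -> str:
    """Return the fully qualified domain name for a pod host."""
    return f"{hostname}.{domain}"
```

For example, `fqdn("cucm-pub-v10")` yields the name used to reach the CUCM 10.0 publisher.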

Lab Workstation Address Table

In every pod, the workstations use internal addresses 10.10.10.101 (WS01), 10.10.10.102 (WS02), and 10.10.10.103 (WS03). External access for all pods goes to 128.107.92.34 on a per-pod port:

  Pod   WS01 Port   WS02 Port   WS03 Port
  1     2061        2062        2063
  2     2064        2065        2066
  3     2067        2068        2069
  4     2070        2071        2072
  5     2073        2074        2075
  6     2076        2077        2078
  7     2079        2080        2081
  8     2082        2083        2084
  9     2085        2086        2087
  10    2088        2089        2090
  11    2091        2092        2093
  12    2094        2095        2096
  13    2097        2098        2099
  14    2100        2101        2102
  15    2103        2104        2105
  16    2106        2107        2108
  17    2109        2110        2111
  18    2112        2113        2114
  19    2115        2116        2117
  20    2118        2119        2120
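The external ports in the table are assigned sequentially, three per pod, starting at 2061 for pod 1 / WS01. Remote attendees who script their RDP connections can express the mapping as follows (a convenience sketch encoding the table above, not part of the lab materials):

```python
def external_port(pod: int, workstation: int) -> int:
    """External RDP port for a given pod (1-20) and workstation (1-3).

    Encodes the Lab Workstation Address Table: ports run sequentially,
    three per pod, starting at 2061 for pod 1 / WS01. A convenience
    helper only; the authoritative mapping is the table itself.
    """
    if not (1 <= pod <= 20 and 1 <= workstation <= 3):
        raise ValueError("pod must be 1-20 and workstation must be 1-3")
    return 2060 + (pod - 1) * 3 + workstation
```

For example, pod 11 / WS02 is reached at 128.107.92.34 port `external_port(11, 2)`, i.e. 2092, matching the table.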

CUCM Version Table

  Description                            Version
  Cisco Unified Communications Manager   10.0.0.97024-8
  Cisco Unified CM IM & Presence         10.0.0.97013-16 & 10.0.0.97071-6 UNRST
  Prime Collaboration Deployment         10.0.0.97017-12
  Student Remote Workstations            Windows 7
  MS Active Directory Server             Windows 2008 R2 64-bit
  MediaSense                             9.1.1.10000-25

Lab Pre-configuration

Many parts of the lab have been built and preconfigured before the start of class:

- Application installations and basic configuration
- Basic dial plan
- LDAP integration
- Users, passwords, and PINs
- CIPC devices added to the CUCM database
- ESXi VMs for migration

Accessing the Lab Equipment

Complete this lab exercise to get connectivity to the lab.

Activity Objective

In this activity, you will learn the methods to access lab equipment remotely.

Required Resources

These are the resources and equipment that are required to complete this activity.

Logging Into Remote Workstations

There are three remote student workstations, each running Windows 7. Each workstation has CIPC installed, running, and registered to the student's pod CUCM. This section shows how to connect to the workstations.

Step 1  On the student computer, click Start > All Programs > Accessories > Remote Desktop Connection.

Step 2  If you are directly connected to the lab environment via Ethernet cable (SEVT physical lab attendees only), proceed to Step 3. If you are participating in the lab remotely or are not directly cabled, go to Step 4.

Step 3  Direct connect: Enter 10.10.10.101 in the Computer field of the Remote Desktop Connection login screen. You can also connect to workstations 2 and 3 directly using the settings below.
        Workstation01 = 10.10.10.101   User = uclab\student1
        Workstation02 = 10.10.10.102   User = uclab\student2
        Workstation03 = 10.10.10.103   User = uclab\student3
        Passwords are all: C1sc0123

Step 4  Remote connect: Enter the external IP address and port of your pod's workstation from the address table in the Computer field of the Remote Desktop Connection login screen. Your workstations will serve as your proxy into the pod network, so you will access all other applications from there.
        Workstation01 = 128.107.92.34:<port>   User = uclab\student1
        Workstation02 = 128.107.92.34:<port>   User = uclab\student2
        Workstation03 = 128.107.92.34:<port>   User = uclab\student3
        Password is: C1sc0123

Step 5  Make sure that the connection is set up to play sounds on your local computer, so that you can hear audio from the softphones running on the workstations, by setting the following connection options:

        a. Click Options.
        b. Click the Local Resources tab.
        c. Click Settings under Remote Audio.
        d. Click the Play on this computer radio button.
        e. Click OK.

Step 6   Click Connect.

Step 7   Click Use Another Account.

Step 8   Enter uclab\student1.

Step 9   Enter C1sc0123 in the password field.

Step 10  Click OK.

Step 11  Your Remote Desktop should look something like this.

Please DO NOT close Cisco IP Communicator (CIPC) on any of the three workstations during the lab. If you accidentally close CIPC, please click the reboot.bat shortcut on the desktop. CIPC cannot be opened while in an RDP session due to virtual audio driver issues; CIPC can only be opened from the console at boot.

Prime Collaboration Deployment

This section of the lab focuses on the new Prime Collaboration Deployment (PCD) application. PCD is an application that can be used to manage the migration of legacy CUCM clusters (6.1.5, 7.1.5, 8.x, 9.x) to new VM-based servers on 10.0, allowing Cisco customers to easily migrate to virtual servers. It can also be used to perform administrative operations on existing UC clusters (version 8.6.1 or higher), such as:

- Upgrade UC application software
- Install COP files (locales or device packs) on a cluster
- Switch versions
- Reboot

Two additional administrative operations can be performed on 10.0 clusters:

- Change IP addresses or hostnames on an existing 10.0 cluster
- Fresh install a new 10.x CUCM, or CUCM/IM&P cluster

Activity Objective

In this activity, you will learn the methods to:

- Access the PCD application
- Add an ESXi host and UC cluster to the inventory of the PCD application
- Perform a migration from CUCM and CUP 8.6 to CUCM and IM&P 10.0 using PCD
- Change the IP address and hostname of the CUCM and IM&P 10.0 servers

Required Resources

To complete this section of the lab you will need a computer that can either connect directly to the PCD application via web browser and to Workstation #1 via RDP (SEVT attendees), or RDP access to the Workstation 01 public IP address, from which you will access the PCD application via web browser and SSH (remote GOLD Lab participants).

The Chrome browser does not work with the version of PCD used in the lab.

Section 1

Step 1  Navigate to 10.10.10.10 in a supported browser to open your pod's PCD Administration page. Click to continue through any security certificate warnings. You may periodically get warnings about a script not responding as you click through the PCD pages. We are working with pre-release beta software, so please be patient and just click to allow the script to continue processing if/when you receive any warnings.

Step 2  Click Cisco Prime Collaboration Deployment.

Step 3  Log in to PCD with the following credentials:
        User Name = admin
        Password = C1sc0123 (capital C, One, s, c, Zero, One, Two, Three)

Step 4  On the top center menu select Inventory > ESXi Hosts.

Step 5  Click Add ESXi Host.

Step 6  Enter the following info:
        Hostname/IP Address: 10.10.10.50
        Username: root
        Password: C1sc0123
        Description: Pod ESXi Host

When you add an ESXi host to the PCD inventory, PCD logs into the ESXi host with root privileges and creates an NFS datastore on the ESXi host pointing at the SFTP datastore on PCD. This is what PCD will use when performing installation, migration, and upgrade tasks.

Step 7   We need to upload the ISOs for CUCM and IM&P 10.0 into the PCD datastore so that PCD can utilize them for upgrade, migration, and installation tasks. If you haven't already, open an RDP session to Workstation 1 at 10.10.10.101. Log in as uclab\student1 with password C1sc0123.

Step 8   Open the FileZilla client on Workstation 1.

Step 9   Click the Site Manager button and note in Site Manager that the PCD connection is defined to use SFTP and log in with the adminsftp account. The password used is the GUI admin password, in this case C1sc0123. Click Connect.

Step 10  Notice that there are several folders on the PCD SFTP datastore. The two you will be interested in are the fresh_install folder and the upgrade folder. ISOs to be used for new installs and migrations go into the fresh_install folder; these need to be bootable ISOs. If you are performing an in-place upgrade, the ISOs for that task need to go into the upgrade folder. Select the Z: drive on the local machine (left pane in FileZilla) and select the fresh_install folder on the PCD SFTP datastore (right-hand pane), since we will be performing a migration. (If the Z: drive isn't available in the local machine pane of FileZilla, you may have to connect to it by opening Windows Explorer and clicking on the Z: drive.)

Step 11  Double-click the UCSInstall_CUP_UNRST_10.0... ISO file in the local (left-hand) pane to transfer it to the SFTP datastore on PCD. It should begin transferring and take a few minutes to complete.

Step 12  Double-click the UCSInstall_UCOS_UNRST_10.0 ISO file in the local (left-hand) pane to transfer it to the SFTP datastore on PCD. It will take several minutes to complete the transfer. (You can transfer both of the above files at the same time.)

Note that we are using an Unrestricted version of CUCM and CUP for our migration to 10.0. This is because we installed an Unrestricted version of CUCM 8.6 as the source node for our migration. It is not currently possible to migrate from a Restricted to an Unrestricted (or vice versa) version of CUCM.

Step 13  Once both the CUCM and CUP 10.0 ISO files have been successfully transferred to the SFTP datastore on PCD, you should see the files in the PCD GUI if you click Administration > SFTP Datastore.

Step 14  Now that we have our ESXi host added and the ISO files uploaded to the PCD datastore, we need to add our existing cluster to the PCD inventory. The existing cluster consists of the CUCM 8.6 and CUP 8.6 nodes that we will be migrating from. On the top menu in the PCD GUI, click Inventory > Clusters.

Step 15  Click Discover Cluster.

Step 16  Enter the following information in the Discover Cluster window (password is C1sc0123):

Step 17  Click Next. PCD should contact the CUCM Publisher and discover all of the nodes in the cluster. Note that although you only put in the CUCM node IP address, you will see both the CUCM and CUP nodes in the Cluster Nodes table that is presented. It may take several minutes for all of the information to populate.

Step 18  Verify the information and click Next.

Step 19  The Cluster Role Assignment window allows you to optionally assign detailed roles such as Music On Hold to the different nodes in a cluster, to help PCD determine the order of operations that should be completed in an upgrade or migration task. Click Assign Functions.

Step 20  Check Music On Hold, Primary Call Processing, and Primary TFTP for the CUCM node and click Next Node.

Step 21  Check Primary Presence for the CUP node and click OK.

Step 22  Click Finish.

Step 23  You should now have one cluster defined in the Cluster Inventory table. If you click the expansion arrow to see the details of the cluster, you can see the two nodes that make up the cluster.

Step 24  Next we need to add the cluster(s) that we will migrate to. To do this, click Define Migration Destination Cluster in the Cluster Inventory window.

Step 25  In the Specify Clusters section of the window that pops up, we need to define our Source UC Cluster, Destination Cluster Nickname, and Destination Network Settings. Define these settings for the CUCM migration according to the following image:

Step 26  Click Next.

Step 27  In the Assign Destination Cluster Nodes window, click Assign Destination Cluster Nodes.

Step 28  The Configure Destination Cluster window that pops up allows you to pick destination virtual machines for the migration from the pre-defined VMs that PCD finds on the ESXi host we added to the PCD inventory earlier. Note that the VMs have to be created using the vSphere client or vCenter, as PCD cannot create VMs.

Step 29  The first node we are selecting a destination for is the CUCM Publisher node. Select the CUCM Migration VM from the Virtual Machines table. Under Network, select Enter New Network Settings in the drop-down and enter the following settings:
         Hostname: cucm-pub-mig-v10
         IP Address: 10.10.10.35
         Subnet Mask: 255.255.255.0
         Gateway: 10.10.10.254

Step 30  Click Next Node to pick a destination VM for the CUP node.

Step 31  Select the CUCM IM&P Migration VM.

Step 32  Under Network, select Enter New Network Settings in the drop-down and enter the following settings:
         Hostname: imp-pub-mig-v10
         IP Address: 10.10.10.40
         Subnet Mask: 255.255.255.0
         Gateway: 10.10.10.254

Step 33  Click OK.

Step 34  Verify that all of the information is correct in the Assign Destination Cluster Nodes section and then click Next.

Step 35  In the Configure NTP/SMTP Settings window, make sure NTP Server 1 is set to 10.10.10.5 and all other NTP server settings are empty. (If 10.10.10.254 is pre-populated, please delete it and make NTP Server 1 10.10.10.5.)
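PCD does not validate the destination network settings up front, and a migration fails late if a destination hostname does not resolve to its planned address. A quick pre-flight resolution check can therefore save a long wait. The sketch below is a hypothetical convenience helper, not part of PCD; the resolver is injectable so the logic can be tested without live DNS.

```python
import socket

def check_resolution(hostname, expected_ip, resolver=socket.gethostbyname):
    """Return True if `hostname` resolves to `expected_ip`.

    Hypothetical pre-flight helper (not a PCD feature): the migration task
    fails if the destination hostnames do not resolve, so it is worth
    confirming DNS before starting. `resolver` defaults to a real DNS
    lookup but can be replaced with any callable for testing.
    """
    try:
        return resolver(hostname) == expected_ip
    except OSError:  # socket.gaierror (unknown name) is an OSError subclass
        return False

# Destination nodes as configured in the steps above:
DESTINATIONS = {
    "cucm-pub-mig-v10.uclab.com": "10.10.10.35",
    "imp-pub-mig-v10.uclab.com": "10.10.10.40",
}
```

Running `check_resolution(host, ip)` for each entry in `DESTINATIONS` from a machine inside the pod would confirm the DNS records are in place before the migration task is started.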

Step 36  Click Next.

Step 37  In the Configure DNS Settings window, verify that the Primary DNS for both nodes shows as 10.10.10.5 in the table. If it does not, select the node, enter 10.10.10.5 in the Primary DNS field at the top, and click Apply to Selected Nodes.

Step 38  Click Finish.

When doing a migration, note that the destination cluster nodes must pass all of the network checks that CUCM performs during install/upgrade. That means the new cluster must be able to reach an NTP server defined for it, resolve the new hostname to the new IP address, and so on. Make sure the ancillary network settings and DNS hosts are in place before the migration is performed. PCD does not validate this information; it simply creates an answer file (AFG) with the information to be used to configure the destination node(s) during the migration task. If the network checks fail, the migration/install task will fail. We have ensured that all necessary hostnames are present in DNS for you for the lab.

Step 39  You should now have the following clusters in your PCD inventory:

Step 40  Before performing a migration, you need to ensure that the proper service is activated on your source node(s) to allow PCD to extract the configuration databases. In CUCM/CUP 8.6.x, the service we need is Platform SOAP Services. In 9.x and later releases the needed service is the Platform Administrative Web Service.

Step 41  Access CUCM 8.6 by pointing a browser at 10.10.10.25 and logging in with admin/C1sc0123.

Step 42  Click Cisco Unified Serviceability in the drop-down menu at the top right of the CUCM administration interface. Click Tools > Service Activation.

Step 43  Make sure there is a check in the box next to Platform SOAP Services and that the Activation Status is Activated. If there is no check and the Activation Status is Deactivated, activate the service by placing a check in the box and clicking Save.
Then click OK in the pop-up notifying you that service activation may take several minutes. Be sure the service status says Activated before moving on.

Step 44  Perform Steps 40-43 for CUP 8.6, accessible via browser at 10.10.10.30 with login admin/C1sc0123.

Prior to 9.1, the Platform Administrative Web (SOAP) services are not activated by default, but starting in release 9.1 the service is enabled by default.

Step 45  We now need to create the migration task in PCD. In the PCD browser interface, click Task > Migrate.

Step 46  Click Add Migration Task.

Step 47  In the Choose Source and Destination Clusters section of the Add Migration Task window, select UC 8.6 Cluster as your Source UC Cluster and UC 10.0 Cluster as your Destination Cluster.

Step 48  Click the expansion arrows next to each source node on the left to see the details of the source and destination nodes and verify the information.

Step 49  Click Next.

Step 50  In the Choose Migration Files section, click the Browse button next to the CUCM Migration File field.

Step 51  In the Choose Migration File window select the UCSInstall_UCOS_UNRST_10.0 ISO file (this should be the only file available).

Step 52  Now click the Browse button next to the IM&P Migration File field.

Step 53  In the Choose Migration File window select the UCSInstall_CUP_UNRST_10.0 ISO file (this should be the only file available).

Step 54  Click OK.

Step 55  Click Next.

Step 56  In the Set Start Time section select Start task manually. Note that you could instead schedule the task for a particular time or set it to start immediately upon completion of the wizard.

Step 57  Note the steps in the Specify Migration Procedure section. These are the steps that PCD has automatically added to our task script. Click through the edit actions links to note how the steps can be modified, but don't actually make any changes to these steps.

Step 58  When you are finished clicking through the options and the expansion arrows to familiarize yourself with the options (but again, don't make any changes), click Next.

Step 59  Review the settings for the Cluster Migration Task in the Review section. Click Finish.

Step 60  You should now have one task listed in the Migrate window of PCD.

Step 61  We are now ready to start the migration task. Click Start in the UC 8.6 Cluster migration task row.

Step 62  Click Yes in the confirmation ("are you sure") window.

Step 63  Click View Details in the UC 8.6 Cluster migration task row.

Step 64  You should now see the Monitoring pane of PCD showing the Migrate UC 8.6 Cluster task status.

Step 65 Look over the details of the Migrate UC 8.6 Cluster task and note which step is running. You can also click View Log to see a log of the task progress.

Step 66 Refresh the monitoring panes and you can watch the task progress through its steps.

Step 67 Continue to monitor the progress as closely as you'd like. The migration tasks will proceed and should take some time to complete.

Step 68 Open vSphere to monitor the install tasks as well. On Workstation 1 (10.10.10.101) launch the vSphere client and log in to your Pod ESXi host at 10.10.10.50 using username root and password C1sc0123.

Step 69 Once in vSphere you can look at the console of your version 10 VMs and monitor the progress of the installations.

Step 70 When PCD reaches the step where it powers on the destination VMs, ESXi may throw an error asking if the VM was moved or copied. Select copied and continue.

Step 71 If the migration task fails at any point, you will need to troubleshoot the cause of the failure; you may be able to retry the failed task after correcting the issue.

Step 72 While the migration/installation continues, you may want to go ahead and work on other sections of the CUCM 10.0 lab, since it will likely take 45 minutes or more for the installation to complete. Just continue to monitor the progress to be sure that it is proceeding.

Step 73 Once the migration has completed, you will have your CUCM 10.0 and IM&P 10.0 nodes up and running at 10.10.10.35 and 10.10.10.40, respectively. Also, the CUCM and CUP 8.6 nodes should be shut down and inaccessible. PCD will show the migration task as successful with all steps successful.

-- This completes this section --

IM and Presence Node Integration

This section of the lab will focus on integrating an IM & Presence node with CUCM in release 10.0.

Activity Objective

In this activity, you will learn the methods to:

Configure and verify IM&P node integration

Required Resources

To complete this section of the lab you will need a computer that can either connect directly to the CUCM administration web pages via a Web Browser and Workstation 1, 2, and 3 via RDP (SEVT Attendees), or RDP access to the Workstation 1, 2, and 3 external IP addresses and ports from the table on page 8, from which you will access the PCD application via Web browser and SSH (remote GOLD Lab participants).

All lab scenarios for this section were tested using the Chrome 29.0.1547.65 browser on OSX 10.8.4.

Section 1 Configure IM&P Node Settings

Step 1 Open the CUCM Administration page at 10.10.10.15 or cucm-pubv10.uclab.com in a supported browser. Click to continue through any security certificate warnings.

Step 2 Login to ccmadmin with the following credentials: User Name: admin, Password: C1sc0123 (capital C, One, s, c, Zero, One, Two, Three)

Step 3 Create a unique Non Secure SIP Trunk Security Profile for the SIP Presence publish trunk between CUCM and IMP.

Step 4 Configure a SIP Trunk: Device -> Trunk.

Step 5 Configure the IM and Presence Publish Trunk service parameter (Service Parameters, Device - SIP section). Select the SIP trunk created in the previous step.

Step 6 Verify IMP node status. From CUCM administration, choose System -> Server. The screen should appear as below. The IMP nodes are subscribers in the cluster and have to be added prior to installation; the IMP node install has changed because of that fact. Appendix A has screenshots of the IMP node installation.

Step 7 Select either one of the servers and click the Presence status to view the services running on each of the nodes.

Step 8 Configure Presence Redundancy: System -> Presence Redundancy Groups. Add the second server to the DefaultCUPSSubcluster and Save.

Step 9 Enable HA. Select the HA check box.

Appendix

Install screen captures for IMP. Note the First Node Configuration screen. Because IMP nodes are now CUCM subscribers, you must add the nodes to the CUCM publisher prior to installing IMP nodes. IMP nodes utilize CUCM as the NTP source; IMP nodes will not install if the CUCM is not synced to a valid NTP source.



-- This completes this section --

Self-Provisioning

This section of the lab will focus on the new self-provisioning features available in CUCM 10.0. Self-provisioning allows universal templates to be applied to LDAP-imported users to define most aspects of common user attributes. The users will then be able to provision their own devices using the new universal templates, either via a voice-prompted IVR application or a self-provisioning web page.

Activity Objective

In this activity, you will learn the methods to:

Create universal templates to allow day-one, zero-touch administration for new users

Define an IVR application which will allow the user to configure auto-registered phones as their own

Access quick add pages to quickly insert users and phones based on existing users

Access the re-written self-care portal (ccmuser pages) to allow for zero-touch day-two configuration for user BYOD and other initiatives

Required Resources

To complete this section of the lab you will need a computer that can either connect directly to the CUCM administration web pages via a Web Browser and Workstation 1, 2, and 3 via RDP (SEVT Attendees), or RDP access to the Workstation 1, 2, and 3 external IP addresses and port numbers from the table on page 8, from which you will access the PCD application via Web browser and SSH (remote GOLD Lab participants).

All lab scenarios for this section were tested using the Chrome 29.0.1547.65 browser on OSX 10.8.4.

Section 1 Setup LDAP Sync to Default Users to Default Templates

Step 1 Navigate to 10.10.10.15 in a supported browser to open your pod's CUCM Administration Page.

Step 2 Click to continue through any security certificate warnings. Click Cisco Unified Communications Manager.

Step 3 Login to ccmadmin with the following credentials: User Name: admin, Password: C1sc0123 (capital C, One, s, c, Zero, One, Two, Three)

First we will create the CTI Route Point, directory number, and CTI app user so we can enable the self-provisioning application.

Step 4 From CUCM administration, choose Device -> CTI Route Point. Click the Add New button. Add a new device according to the following parameters. Be sure to choose the appropriate calling search space.

Device Name: SelfProvIVR
CSS: CSS_All_Access

Step 5 Add a directory number which users will dial when using the self-provisioning IVR. Click the Add a new DN link at the bottom of the CTI Route Point page.

Name: SelfProvIVR
Directory Number: 3001
Route Partition: PT_Local
CSS: CSS_All_Access

Step 6 Create an app user for the IVR application: User Management -> Application User. Click Add New.

User ID: CTI-APP-USER
Password: C1sc0123
Controlled Device: SelfProvIVR
Groups: Standard CTI Allow Control of All Devices and Standard CTI Enabled

Step 7 Associate the CTI Route Point and Application User with the self-provisioning application at User Management -> Self Provisioning. Notice the administration page lets you know that auto-registration is turned off; we will be changing this later in the lab.

Check No Authentication Required.

The authentication options are of particular interest and will likely be utilized in most production environments. The "user only" option allows only end users to authenticate and provision new devices; the "users and administrators" option allows both users and administrators to authenticate. We will be using No Authentication Required for lab simplicity, but you can test either option.

CTI Route Point: SelfProvIVR
Application User: CTI-APP-USER

Click the Save and Restart Now button to restart the self-provisioning service.

Step 8 Create a new phone button template for auto-registered devices. This new phone button template will be applied and will add a speed dial for the self-provisioning application. Device -> Device Settings -> Phone Button Template. Click Add New. Choose the Universal Device Template Button Layout at the bottom of the drop-down list. Click Copy.

Button Template Name: AutoReg Universal Device Template Button Layout

Set the second button to a speed dial.

Step 9 Create a new Universal Device Template to assign to all new devices. Click User Management -> User/Phone Add -> Universal Device Template. Click Add New, then Expand All at the top of the page to expand all configuration areas.

Name: AutoReg Universal Device Template
Device Description: #FirstName# #LastName# (#Product# #Protocol#)

Note the ability to add tags to this field to populate it based upon information synced from LDAP. Click the pencil icon to the right to see some of the available options (not all options are available in our lab CUCM build). You can build these by clicking the pencil icon. The #Product# and #Protocol# tags will not appear in the pencil build page but are available for use if typed manually.

Device Pool: Default
Device Security Profile: Universal Device Template - Model Independent Security Profile
SIP Profile: Standard SIP Profile
Phone Button Template: AutoReg Universal Device Template

Under the Device Settings:
Owner User ID: Current Device Owner's User ID

Under the Phone Buttons Configuration expanded screen, click Edit:
Line Label: #FirstName# #LastName#
Display (Caller ID): #FirstName# #LastName#
External Phone Number: 408555XXXX

Under the Speed Dial expanded screen, click Edit:
Number: 3001
Label: Self Provisioning
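The #TAG# placeholders in the template fields are expanded per user at provisioning time. The behavior can be sketched as a simple placeholder substitution; the function name and user dictionary below are illustrative, not CUCM code:

```python
import re

def expand_tags(template, user):
    """Replace #Tag# placeholders with per-user values; unknown tags are left as-is."""
    def repl(match):
        return user.get(match.group(1), match.group(0))
    return re.sub(r"#(\w+)#", repl, template)

user = {
    "FirstName": "Student",
    "LastName": "One",
    "Product": "Cisco IP Communicator",
    "Protocol": "SCCP",
}
desc = expand_tags("#FirstName# #LastName# (#Product# #Protocol#)", user)
print(desc)  # Student One (Cisco IP Communicator SCCP)
```

This is why the Device Description above ends up reading like "Student One (Cisco IP Communicator SCCP)" for each auto-registered device.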

Step 10 Now create a new Universal Line Template to assign to all new devices. Click User Management -> User/Phone Add -> Universal Line Template. Click Add New, then Expand All at the top of the page to expand all configuration areas.

Name: AutoReg Universal Line Template
Line Description: #FirstName# #LastName# (#UserID#)
Route Partition: PT_Local
Calling Search Space: CSS_All_Access
Alerting Name: #FirstName# #LastName#

+E.164 Alternate Number:
Number Mask: +1408555XXXX
Add to Local Route Partition: checked
Select PT_Local as the Route Partition.
Advertise Globally via ILS: checked

Step 11 Now we need to enable auto-registration on the CUCM. This is done the same way as in previous versions, but it now has a few new fields. Start by clicking System -> Cisco Unified CM. Click Find and choose the name of the server (10.10.10.15). After filling out the page, click Save, then Apply Config.

Universal Device Template: AutoReg Universal Device Template Configuration
Universal Line Template: AutoReg Universal Line Template Configuration
Starting Directory Number: 2001
Ending Directory Number: 2009
Auto-Registration Disabled...: Unchecked

Step 12 Now we need to enable LDAP sync; all new users synced from LDAP will inherit the universal attributes we just assigned. First, we need to create a user profile for auto-registration. Click User Management -> User Settings -> User Profile. For this step, we will just modify the existing factory default user profile. Click Find, then Standard (Factory Default) User Profile.

Desk Phones: AutoReg Universal Device Template Configuration
Mobile Devices: AutoReg Universal Device Template Configuration

Remote Destination/Device Profiles: AutoReg Universal Device Template Configuration
Universal Line Template: AutoReg Universal Line Template Configuration
Allow End User to Provision their own phones: checked
Limit Provisioning once End User has this many phones: 10

Click Save.

Step 13 Next we need to create a Feature Group Template. Click User Management -> User/Phone Add -> Feature Group Template. For this step, we will just modify the existing default. Click Find, then Default Factory Feature Group.

Home Cluster: checked
Enable User for Unified IM and Presence: checked
Include Meeting information in Presence: checked

Allow Control of Device from CTI: checked

Click Save.

Step 14 Now we can finally enable LDAP sync. Choose System -> LDAP -> LDAP System. Check Enable Synchronizing from LDAP Server. Click Save.

Step 15 Choose System -> LDAP -> LDAP Directory. Click Add New.

LDAP Configuration Name: AD-DNS
LDAP Manager Distinguished Name: administrator@uclab.com
LDAP Password and Confirm Password: C1sc0123
LDAP User Search Base: ou=allusers,dc=uclab,dc=com
Directory URI in the User Fields needs to be changed to: mail
Set the Feature Group Template: Default Feature Group Template
Add Standard CCM End Users and Standard CTI Enabled to the Access Control Groups to apply these two security groups to new users

Check "Apply mask to synced telephone numbers to create a new line for inserted users" and use the mask XXXX. This field dictates how the DN will be created from the phone number field in AD. For example, in our AD, the phone number field is 4085551001, but the directory number the self-provisioning code will use is just 1001.

LDAP Server Information: 10.10.10.5

Click Save. After clicking Save, but before clicking Perform Full Sync Now, verify that the Access Control Groups took effect, as well as the mask. In this CUCM 10 version, those configuration settings do not always take on the first save.

Click Perform Full Sync Now.
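The mask behaves like a standard CUCM number mask: it is applied right to left, X positions keep digits from the source number, and literal characters are inserted as-is. A minimal sketch of that behavior (illustrative code, not CUCM's implementation):

```python
def apply_mask(number, mask):
    """Apply a CUCM-style mask right-to-left: X keeps the source digit, literals override."""
    result = []
    digits = list(number)
    for ch in reversed(mask):
        if ch == "X":
            result.append(digits.pop() if digits else "")
        else:
            result.append(ch)
    return "".join(reversed(result))

print(apply_mask("4085551001", "XXXX"))    # 1001
print(apply_mask("1001", "+1408555XXXX"))  # +14085551001
```

The same right-to-left rule explains the +1408555XXXX number mask used on the Universal Line Template earlier in this section.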

Step 16 Open a remote desktop connection to Student1's Windows desktop (10.10.10.101, using student1 / c1sc0123 with a domain of uclab.com as your credentials). You should see an already booted and logged-in desktop with IP Communicator already loaded. CIPC is in use here in the lab to simulate any supported hardphone.

CIPC needs to be closed and re-opened, but due to the nuances of running CIPC in a hosted desktop, it is not possible to just shut down CIPC and relaunch; the desktop needs to be rebooted. Click the reboot.bat batch file link on the desktop. Windows will reload. Please wait about 60 seconds and log back into the desktop via RDP.

CIPC should auto-launch on startup and auto-register to CUCM 10.0. This can take anywhere from 3 to 5 minutes in this lab, and the CIPC screen will likely flash several times as it downloads the new config. Please be patient in this process. You can verify the process is continuing successfully by checking whether the device appears in CUCM 10 under Device -> Phone and clicking Find.

Step 17 Verify you have your RDP sound configured correctly. Press the Speaker button on CIPC and verify you can hear dialtone. The dialtone will likely be stuttered even when it is not supposed to be, but it is still working correctly.

Click the Self Provisioning IVR speed-dial. You should hear something along the lines of (behind stuttered speech) "To provision this device, enter your self-provisioning identification number followed by the pound key."

Enter 1001. This is the self-provisioning ID for student1. This can be verified by checking User Management -> End User; click Find and choose student1. After pressing #, you should hear a confirmation message played, ending with "Press the pound key to confirm." Press the # key to confirm and CIPC will restart. Verify CIPC has restarted with student1's attributes.

You can verify self-provisioning by checking that student1 has been assigned the CIPC device: check User Management -> End User and Device -> Phone. Click Find and choose student1's new CIPC. Verify that the device name was rewritten and that the calling search spaces, etc. were all set. Verify the same on the directory number.


Section 2 Demonstration of the New Self Care Portal in CUCM 10

Step 18 Open a browser to http://10.10.10.15/ucmuser.

Username: student1
Password: C1sc0123

Step 19 Click Add an additional phone under the My Additional Phones section. Choose Enable Single Number Reach.

Step 20 Navigate around the new interface to familiarize yourself with the new layout. The functions from earlier CUCM releases are still there, but the interface is newer and easier to navigate. You can do things like delete the self-provisioning IVR speed-dial that was created during auto-registration.

Step 21 Log out by selecting student1 in the upper right corner, then Sign Out.

Step 22 Navigate back to the CUCM administration pages and login. Check that the remote destination profile was created when student1 visited the self-care portal in the previous steps.

Step 23 Do the same for the other two desktops if desired.

Student2 PC: 10.10.10.102, student2, C1sc0123, uclab.com
Student3 PC: 10.10.10.103, student3, C1sc0123, uclab.com

Section 3 Quick User/Phone Add For Day 2 Support

Step 24 In CUCM administration, click User Management -> User/Phone Add -> Quick User/Phone Add. Click Find. Select student1. Click the Manage Devices button, then the Add a new phone button. Add a new phone with a random MAC address. In this lab version of CUCM 10, you may get an error stating the entry already exists in the database; this error is spurious. Simply click OK. Verify the new phone exists, associated to the student1 user.

-- This completes this section --

MediaSense
Cisco Unified Communications Manager 10.x

This section of the lab will focus on provisioning a MediaSense server to provide the Video on Hold (VoH) feature available in CUCM 10.0.

The Video on Hold feature can be leveraged in video contact centres where customers calling into the video contact centre are able to watch a specific video after initial consultation with the agent at the contact centre. In this case, the agent selects the video stream that gets played to the customer while the customer is on hold. In addition to the video contact centre, Video on Hold can be deployed within any enterprise if the deployment requires a generic video-on-hold capability.

Cisco Unified Communications Manager (Unified Communications Manager) now has a new "Video on Hold Server" configuration that allows a media content server to be provisioned under the existing "Media Resources" menu. The media content server can stream audio and video content when directed by Unified Communications Manager. The media content server is an external device that can store and stream audio and video content under Unified Communications Manager control, using SIP as the signaling protocol. The media content server is capable of providing high-definition video content at 1080p, 720p, or lower resolutions such as 360p. Media content server configuration and allocation for a particular session follows the "Media Resource Group" and "Media Resource Group List" constructs in Unified Communications Manager. Cisco MediaSense is used as the media content server.

Activity Objective

In this activity, you will learn the methods to:

Configure the initial setup of a MediaSense server.

Create a VoH resource within CUCM and a supporting SIP trunk.

Verify VoH resource registration to CUCM.
Required Resources

To complete this section of the lab you will need a computer that can either connect directly to the CUCM and MediaSense administration web pages via a Web Browser and Workstation 1, 2, and 3 via RDP (SEVT Attendees), or RDP access to the Workstation 1 public IP address, from which you will access the PCD application via Web browser and SSH (remote GOLD Lab participants).

All lab scenarios for this section were tested using the Chrome 29.0.1547.65 browser on OSX 10.8.4.

Section 1 Perform Initial Setup of the MediaSense Server

Step 1 Navigate to 10.10.10.55 in a supported browser to open your pod's MediaSense Administration Page. Click to continue through any security certificate warnings. Click Cisco MediaSense.

Step 2 Login to MediaSense with the following credentials: User Name: admin, Password: C1sc0123 (capital C, One, s, c, Zero, One, Two, Three)

Step 3 Once logged in, follow the Setup Wizard in the MediaSense administration as seen below.

Step 4 Click the Next button.

Step 5 Wait until all services show Enabled in the Media Service Activation screen. Click the Next button once all services show an Enabled status.

Step 6 Configure the UCM AXL Service Provider for MediaSense by entering the AXL Service Provider Configuration information. Click the Next button.

AXL Service Provider: 10.10.10.15
AXL Username: admin
AXL User Password: C1sc0123
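Behind this setting, MediaSense talks to CUCM's AXL SOAP API (served at https://<cucm>:8443/axl/) using the credentials above. As a hedged sketch of what such a request looks like on the wire, the helper below builds a minimal AXL envelope by hand; the getPhone operation and the /10.0 namespace follow the AXL schema, but verify them against your AXL version before relying on them:

```python
def build_axl_request(operation, body_xml, version="10.0"):
    """Build a minimal AXL SOAP envelope for the given operation (illustrative only)."""
    ns = "http://www.cisco.com/AXL/API/%s" % version
    return (
        '<soapenv:Envelope '
        'xmlns:soapenv="http://schemas.xmlsoap.org/soap/envelope/" '
        'xmlns:ns="%s">'
        "<soapenv:Body><ns:%s>%s</ns:%s></soapenv:Body>"
        "</soapenv:Envelope>" % (ns, operation, body_xml, operation)
    )

# Example: the body an AXL client might POST to https://10.10.10.15:8443/axl/
envelope = build_axl_request("getPhone", "<name>SEP001122334455</name>")
print(envelope)
```

In the lab you never build this yourself; MediaSense issues equivalent AXL queries internally once the service provider is configured.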

Step 7 Configure the MediaSense Call Control Service Provider. Select the UCM server in the Available Call Control Service Providers list and move it to the Selected Call Control Service Providers list. Click the Next button.

Step 8 MediaSense Setup Summary. Click Done and continue on to configuring the VoH resource and SIP trunk in UCM.

Section 2 Configure VoH Resource and SIP Trunk in CUCM

Step 9 Navigate to 10.10.10.15 in a supported browser to open your pod's Cisco Unified Communications Manager Administration Page.

Step 10 Click to continue through any security certificate warnings. Click Cisco Unified Communications Manager.

Step 11 Login to Unified Communications Manager Administration with the following credentials: User Name: admin, Password: C1sc0123 (capital C, One, s, c, Zero, One, Two, Three)

Step 12 Navigate to Media Resources -> Video On Hold Server.

Step 13 Click Add New.

Step 14 Configure the VoH server with the following information.

Name: 10.10.10.55
Description: voh_ms1
Default Video Content Identifier: SampleVideo
SIP Trunk: select Add New

Step 15 Enter the following information pertaining to the SIP trunk that connects to the MediaSense server.

Device Name: 10.10.10.55
Description: voh_ms1-siptrunk
Location: Hub_None
SIP Trunk Security Profile: Non Secure SIP Trunk Profile
SIP Profile: Standard SIP Profile For TelePresence Conferencing
Destination Address: 10.10.10.55
Destination Port: 5060

Step 16 Create a Media Resource Group that contains the VoH server that was configured in Step 14.

Step 17 Create a Media Resource Group List that contains the previously configured MRG.

Step 18 Ensure that the video devices that have been configured are assigned the VoH MRGL.

Section 3 Test Video on Hold

Step 19 Provision the Pod EX90. Go to Settings -> Administrator Settings -> Provisioning.

Step 20 Start the Provisioning Wizard. Select Infrastructure -> Cisco UCM.

Step 21 Enter the CUCM address in the External Manager field: 10.10.10.15

Step 22 Click Register.

Step 23 Your EX90 should auto-register in your auto-registration range (2001-2009).

Step 24 Optional: dial 3001 from your EX90 and auto-provision using 1002 or 1003.

Step 25 Go back to CUCM Device pages and find your EX90 and DX650 on the registered devices list.

Step 26 Confirm that the Media Resource Group List that you configured for Video on Hold is selected for both devices.

Step 27 Initiate a call from the EX90 to the DX650.

Step 28 From the DX650, press the Hold key. **Make sure you initiate the hold from the DX650; the DX650 cannot play the specified VoH file format that is loaded on the MediaSense server.**

-- This completes this section --

Global Dial Plan Replication (GDPR)

This section of the lab will focus on a new feature added to the Intercluster Lookup Service (ILS) in CUCM v10 called Global Dial Plan Replication (GDPR). ILS was introduced in CUCM v9 as a way to replicate and route directory URIs across clusters to enable simple SIP URI routing within large organizations having multiple CUCM clusters. GDPR extends that capability to replication of the enterprise dial plan across multiple clusters, allowing the same simplified routing across all the clusters in an organization. Further, E.164 numbers can also be automatically shared and used for direct dialing or PSTN failover if the link between clusters is disrupted.

Activity Objective

In this activity, you will learn the methods to:

Create ILS hub and spoke configurations to enable replication between clusters

View the dial plan elements learned from other clusters

Set up routing between clusters for both SIP URI and enterprise dialing

Select which portions of the dial plan to replicate and which to exclude

Use partitions and calling search spaces, as well as new pattern blocking and advertising features, to provide class of control between clusters

Use command line tools to view the status of peers in the ILS network

Required Resources

To complete this section of the lab you will need a computer that can either connect directly to the CUCM administration web pages via a Web Browser and Workstation #1, 2, and 3 via RDP (SEVT Attendees).

All lab scenarios for this section were tested using the Firefox 23.0.1 browser on OSX 10.8.4.

You will also need to have completed:

The PCD section of the lab, with the migrated CUCM available as the 2nd cluster

The Self-Provisioning portion of the lab, with the CIPC phones operational on Workstation #1 and #2

Section 1 Verify CIPC Registered to the Migrated CUCM v10 Cluster

This section will help you set up the CIPC on Workstation #3 to register to the migrated CUCM v10 system at 10.10.10.35. The configuration has already been completed on CUCM, so you only need to reconfigure the application on the workstation.

Step 1 Connect to Workstation #3 via RDP. Verify the CIPC phone is registered with the migrated CUCM and has an extension of 1005. If it does not, verify all settings on the migrated CUCM 10 server. In addition, check CIPC, as you may need to delete the security ITL file. To do this, click on the Settings button and then enter **# to unlock. Once unlocked, press the Erase key. The phone should now register.

Section 2 Set Up the ILS Network Between Clusters

The diagram above represents the way the lab clusters should be right now from a call control perspective. The phones on Cluster 15 can call each other, and phones on Cluster 35 can call each other, but you cannot call between clusters even though the SIP trunk exists. We set up a standard SIP trunk in advance to facilitate the lab, but there is no routing (URI or route patterns) associated with it.

Planning the dial plan is the most important phase of implementing an ILS/GDPR network. As you can see from the example, with a simple 4-digit dial plan there is a large potential for overlapping DNs. In truth, most enterprises large enough to have multiple clusters would probably not be using 4 digits. You have many options in planning: you can use site codes to reach the remote sites, you can use 7- or 10-digit dial plans, or you can use a full E.164 implementation. Whatever is chosen is referred to as the Enterprise dial plan in the context of ILS/GDPR.

In addition to the Enterprise dial plan, ILS/GDPR allows you to assign a full E.164 number to each DN (individually or as a pattern for the site/location) that can be used automatically to route calls across the PSTN in the event the SIP trunk fails. Once configured, the full dial plan, including any changes, is replicated and maintained automatically by the ILS/GDPR network. The lab below runs through an example to show you the functionality.

Step 2 Open a browser to http://10.10.10.15 and login as admin.
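The overlapping-DN risk mentioned in the planning discussion above is easy to check before enabling replication: compare the DN ranges each cluster already uses. A small illustrative check (the cluster names and ranges are made up for this lab topology, not pulled from CUCM):

```python
def find_overlaps(clusters):
    """Return pairs of cluster names whose DN ranges intersect."""
    overlaps = []
    names = sorted(clusters)
    for i, a in enumerate(names):
        for b in names[i + 1:]:
            lo_a, hi_a = clusters[a]
            lo_b, hi_b = clusters[b]
            # Two inclusive ranges intersect when each starts before the other ends
            if lo_a <= hi_b and lo_b <= hi_a:
                overlaps.append((a, b))
    return overlaps

clusters = {"Cluster15": (1001, 1009), "Cluster35": (1005, 1020)}
print(find_overlaps(clusters))  # [('Cluster15', 'Cluster35')]
```

Any pair this reports would need a site code, longer digit strings, or E.164 numbering before GDPR could route between them unambiguously.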

Step 3 Navigate to System -> Enterprise Parameters. Look at the Cluster ID at the top of the page. It should be set to the default StandAloneCluster.

Step 4 To set up an ILS network, the Cluster ID must be changed from the default. It can be made anything but must be unique for each cluster participating. Change the Cluster ID to Cluster15 (chosen because this is the .15 publisher) and click Save.

Step 5 Navigate to Advanced Features -> ILS Configuration. You can see that this cluster is standalone and not part of any ILS network.

ILS networks are designed using a hub and spoke topology. There must be at least one hub, which can support multiple spokes. You can also have more than one hub, and they will share data between them. For our example, we will add the first cluster as a hub and the second as a spoke.

Step 6 Change the option on the Role drop-down box to Hub Cluster. Check the box to Enable Global Dial Plan Replication; this adds GDPR as a feature of the ILS network. Select Use Password as the ILS Authentication method and enter the password C1sc0123. If using password authentication, the password must match on all clusters in the ILS network. Leave the other fields at their defaults and click Save.

Step 7 You will get a popup box asking for the address of the Registration Server for the ILS cluster. Since this is the first cluster on the network, leave that field blank and click OK.

Step 8 Click OK on the warning box about restarting the service on the publisher.

Step 9 Hit the Refresh button. Check that the local cluster is listed with an "Up to date" status in the status bar at the bottom of the page.

Step 10 Open a browser to http://10.10.10.35 and login as admin.

Step 11 From Enterprise Parameters, change the Cluster ID to Cluster35.

Step 12 Go to Advanced Features -> ILS Configuration. From the Role drop-down box, select Spoke Cluster (not Hub Cluster as we did before). Click the Enable Global Dial Plan Replication checkbox. Change the ILS Authentication to Use Password and enter C1sc0123 as the password. Click Save.

Step 13 This time, when you are prompted for a Registration Server, you need to enter the address of the publisher on Cluster15. This is only required one time, during initial registration.

Step 14 Hit the Refresh button periodically until you see Registered Successfully. It may take 1-2 minutes.

Step 15 At this point you should see both the hub and spoke clusters in the status panel at the bottom of the page. If all is working correctly, the USN Data Synchronization Status for both will show "Up to date." Notice the Advertised Routing Strings for both clusters. These fields were automatically populated for you based on the FQDN of each publisher, but if desired they can be changed during setup. These fields are important, as they are the means ILS uses to route between clusters in the network (over the SIP trunk).

Step 16 ILS/GDPR is now active for the two clusters. Congrats!
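Conceptually, the route strings turn intercluster routing into two table lookups: ILS replicates a mapping from each learned number or URI to its home cluster's advertised route string, and a SIP route pattern maps that route string to a trunk. A hedged sketch of that lookup, using this lab's values as illustrative data:

```python
# Entries replicated by ILS: learned number/URI -> advertised route string
learned = {
    "1005": "cucm-pubmig-v10.uclab.com",
    "student3@uclab.com": "cucm-pubmig-v10.uclab.com",
}

# SIP route patterns configured locally: route string -> SIP trunk
sip_route_patterns = {"cucm-pubmig-v10.uclab.com": "Trunk_To_Cluster35"}

def resolve(dialed):
    """Return the trunk a call to a learned number/URI would use, or None if local/unknown."""
    route_string = learned.get(dialed)
    if route_string is None:
        return None
    return sip_route_patterns.get(route_string)

print(resolve("1005"))  # Trunk_To_Cluster35
```

This is why Section 3 only needs a single SIP route pattern per remote cluster rather than route patterns for individual DNs.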

Section 3 Implementing DN Replication with GDPR and Class of Control

Class of control is implemented in an ILS environment using partitions and calling search spaces, just like in normal CUCM implementations. When ILS is configured, it automatically generates a set of partitions to use for learned numbers and patterns. You can choose to use these in calling search spaces, use existing partitions, or create your own. The method that's appropriate for the deployment should be determined during the up-front design phase. For our lab, we will use the generated partitions and add them to a calling search space.

Step 17 On the CUCM 10.10.10.15 administration page, click Call Routing -> Class of Control -> Partition. Click Find. Note that the automatically generated partitions are present.

Step 18 Navigate to Call Routing -> Global Dial Plan Replication -> Partitions for Learned Numbers and Patterns. Note that the generated partitions are automatically included. If you wanted to use different partitions, you would select them here.

Step 19 Navigate to Call Routing -> Class of Control -> Calling Search Space. Click Find. You should see a CSS named CSS_All_Access. Select it and verify that the ILS partitions are included.

Step 20 Navigate to Call Routing -> Directory Numbers. Click Find. Click each DN and verify that the Calling Search Space is set to CSS_All_Access.

Note: For purposes of this lab, we will use a prefix of 8 for the Enterprise Dial Plan. For the E.164 numbers, we will use +1408555XXXX for Cluster 15 and +1919209XXXX for Cluster 35.

Step 21 While you are in the DN configuration, you can also set up the GDPR replication. Click the Add Enterprise Alternate Number button and configure it with the 8 prefix as shown. Click the Add +E164 Alternate Number button and add the appropriate pattern for the cluster you are configuring as shown (+1408555XXXX for Cluster 15, +1919209XXXX for Cluster 35). Also check that the Directory URIs are present and that Advertise Globally via ILS is checked (these may already be populated).
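The alternate numbers in Step 21 are formed by applying a number mask to the DN: each X in the mask takes a digit from the DN, filled right to left, and literal characters are kept. The sketch below is an illustration of that mask behavior using this lab's values, not CUCM's actual implementation.

```python
# Illustrative sketch (not CUCM code): applying a CUCM-style number
# mask to a directory number to form Enterprise and +E.164 alternate
# numbers. X positions are filled from the DN, right to left.

def apply_mask(dn, mask):
    """Apply a number mask to a DN. 'X' consumes a DN digit
    (right-aligned); any other character is copied literally."""
    dn_digits = list(dn)
    out = []
    for ch in reversed(mask):
        if ch == 'X':
            out.append(dn_digits.pop() if dn_digits else '')
        else:
            out.append(ch)
    return ''.join(reversed(out))

# DN 1001 on Cluster 15, using this lab's masks:
print(apply_mask("1001", "8XXXX"))          # -> 81001
print(apply_mask("1001", "+1408555XXXX"))   # -> +14085551001
```

So for DN 1005 on Cluster 35, the same masks would yield 81005 and +19192091005, which matches the numbers you will dial and see learned later in this lab.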

An alternate way to do this for the entire cluster, instead of on individual DNs, is to create Advertised Patterns for Enterprise Number Patterns and +E.164 Number Patterns under Call Routing -> Global Dial Plan Replication -> Advertised Patterns. If done there, the patterns apply to the entire cluster.

Step 22 Navigate to Device -> Trunk and click Find. Verify that the CSS on the trunk is set to CSS_All_Access and that Significant Digits is set to All under Inbound Calls. To enable routing between the clusters, the ILS route string is used. There are no route patterns or entries for individual patterns or DNs. ILS tracks learned numbers and routes them based on the route strings that were advertised when ILS was set up.

Step 23 Set up the SIP route to allow ILS to route calls between clusters. Navigate to Call Routing -> SIP Route Pattern. Click Add New. Enter cucm-pubmig-v10.uclab.com as the IPv4 Pattern (the route string for Cluster 35). Place it in the PT_Local partition and use the SIP trunk Trunk_To_Cluster35. Click Save.

Step 24 Open a browser window to http://10.10.10.35 and log in as admin. Repeat Steps 3 through 7 above using the appropriate data for Cluster 35.
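The two-stage lookup behind Steps 22 and 23 can be sketched as follows: each learned number carries the route string of the cluster that advertised it, and a SIP route pattern maps that route string to a SIP trunk. This is a simplified illustration using the lab's values, not CUCM's routing engine.

```python
# Illustrative sketch (not CUCM code): routing a call to an ILS-learned
# number. The learned entry carries the advertising cluster's route
# string; a SIP route pattern maps that string to a SIP trunk.

# Numbers learned via ILS: dialed number -> advertiser's route string
learned_numbers = {
    "81005": "cucm-pubmig-v10.uclab.com",   # advertised by Cluster 35
}

# SIP route patterns: route string -> SIP trunk
sip_route_patterns = {
    "cucm-pubmig-v10.uclab.com": "Trunk_To_Cluster35",
}

def route_learned_call(dialed):
    """Resolve a dialed learned number to the SIP trunk that reaches
    the cluster which advertised it, or None if unroutable."""
    route_string = learned_numbers.get(dialed)
    if route_string is None:
        return None
    return sip_route_patterns.get(route_string)

print(route_learned_call("81005"))   # -> Trunk_To_Cluster35
```

Note that the dialed number itself never appears in a route pattern; only the single SIP route pattern for the route string is needed, no matter how many DNs Cluster 35 advertises.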

Section 4 Verify Proper Setup and Test

At this point, all the DNs and URIs should be replicating between the clusters. You should be able to place calls from one cluster to the other, and any change to the dial plan should be reflected across the entire enterprise.

Step 25 Open a browser window to http://10.10.10.15 and log in as admin. Navigate to Advanced Features -> ILS Configuration. Verify that the USN Data Synchronization Status shows Up to Date for both clusters.

Step 26 Navigate to Call Routing -> Global Dial Plan Replication -> Learned Numbers. Click Find. You should see the numbers learned from Cluster 35. Note that in a real environment, there would be a route pattern matching the E.164 number pointing to a PSTN gateway. With that in place, if the SIP connection to Cluster 35 fails, CUCM can automatically reroute the call to the PSTN using the learned E.164 number. If you look on Cluster 35, you should see all the numbers learned from this cluster as shown below.

Step 27 Navigate to Call Routing -> Global Dial Plan Replication -> Learned Directory URIs. Click Find. You should see all of the SIP URIs learned from Cluster 35.
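The PSTN fallback described in Step 26 can be sketched as a simple preference order: try the SIP trunk to the advertising cluster first, and if it is unavailable, send the call to a PSTN gateway using the learned +E.164 number. This is an illustrative sketch only; the trunk-state flag and the PSTN_Gateway name are hypothetical stand-ins, not lab objects.

```python
# Illustrative sketch (not CUCM code): PSTN fallback for a learned
# number. If the SIP trunk to the advertising cluster is down, the
# call is rerouted using the learned +E.164 alternate number through
# a PSTN route pattern. PSTN_Gateway is a hypothetical device name.

learned = {
    # enterprise number -> (route string, learned +E.164 number)
    "81005": ("cucm-pubmig-v10.uclab.com", "+19192091005"),
}
trunks = {"cucm-pubmig-v10.uclab.com": "Trunk_To_Cluster35"}
trunk_up = {"Trunk_To_Cluster35": False}   # simulate a trunk outage

def route_with_fallback(dialed):
    """Prefer the SIP trunk to the advertising cluster; fall back to
    the PSTN using the learned +E.164 number when the trunk is down."""
    route_string, e164 = learned[dialed]
    trunk = trunks[route_string]
    if trunk_up.get(trunk):
        return ("sip", trunk, dialed)
    return ("pstn", "PSTN_Gateway", e164)

print(route_with_fallback("81005"))
# -> ('pstn', 'PSTN_Gateway', '+19192091005')
```

This is why Step 21 advertised a +E.164 alternate number for each DN: without it, there is no globally routable number to hand to the PSTN when the intercluster SIP path fails.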

Like before, if you look on Cluster 35, you should see all of the SIP URI entries learned from this cluster as shown below.

Step 28 Open an RDC session to Workstation #1. CIPC should be running and registered to Cluster 15 with a DN of 1001 (from the Provisioning section of the lab).

Step 29 Open an RDC session to Workstation #3. CIPC should be running and registered to Cluster 35 with a DN of 1005 (from Section 1 of this lab).

Step 30 From CIPC on Workstation 1, dial 81005. The CIPC phone on Workstation 3 should ring.

Step 31 From CIPC on Workstation 3, dial 81001. The CIPC phone on Workstation 1 should ring.

Note that no route patterns were created between the clusters; the routing is via ILS using the SIP trunk. Experiment with changing numbers, adding numbers, etc., and watch the effects on replication. If you have an endpoint capable of URI dialing (such as a DX650), you can also dial between clusters using the learned URIs. I suggest you seek out the full update training on ILS / GDPR to understand all the design options and troubleshooting tools available. In a large enterprise, this feature can make maintenance of the enterprise dial plan simpler and much less prone to error.

-- This completes this section --

End Of Lab

This concludes the lab. We thank you for taking the time to complete it. We hope that this lab surpassed your goals and expectations and was a useful and positive learning experience for increasing your knowledge of Cisco Unified Communications products. Please don't forget to complete the survey!