HP ALM and Lab Management Review of the Key Features November 13, 2013
Brought to you by Vivit Testing, Quality and Application Lifecycle Management Special Interest Group (TQA-SIG) Leaders: Damian Versaci, Christopher J. Scharer, Robert Linton, Bernard P. Szymczak Jr., Andreas Birk www.vivit-worldwide.org
Hosted by Dr. Andreas Birk Vivit TQA - SIG Leader Founder & Principal Consultant at Software.Process.Management
Today's Presenter: Dr. Eric Schmieder, Lead Test Architect, Haufe-Lexware
Housekeeping
This LIVE session is being recorded. The recording will be available on BrightTALK immediately after this session.
Q&A: Please type questions in the Questions Box below the presentation screen.
Additional information is available behind the Attachment button and later on the Vivit website.
Company Profile: Haufe-Lexware
Haufe Gruppe: approx. 1,300 employees in Freiburg (HQ), Timisoara and Munich; approx. 250 million turnover.
We make our customers more successful with: general business accounting software, digital workplace solutions, web-based services, and educational and training programs.
Software Development at Haufe-Lexware
Software Development & Research (SWD) Division: approx. 250 employees; product brand: Haufe-Lexware.
SWD quality control (QC): approx. 50 engineers and 5 managers, plus external testing companies and individuals. Task: professional testing services in more than 30 parallel development projects.
SWD quality control relies on standard tools from Hewlett-Packard: ALM, Sprinter, UFT, QAInspect, Performance Center, and (since Q3/2013) Lab Management.
Challenge
Coordination, scheduling and execution of testing, including preparation, setup and cleanup.
Maintaining test equipment: installing, diagnosing and servicing pre-installed products on virtual machines.
Installation via the VMware vCloud API; nightly build on the TFS build server; vCloud Director.
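The vCloud-based installation mentioned above starts with a session against the vCloud Director REST API. The sketch below builds that login request; the host name, organization and credentials are placeholders, and the API version shown (5.1) matches what was current around the time of this talk.

```python
import base64

# Sketch: authenticating against the VMware vCloud Director REST API.
# vCloud Director uses HTTP Basic auth with the user name in 'user@org'
# form; a successful POST to /api/sessions returns an
# x-vcloud-authorization token that is sent on all subsequent requests
# (e.g. when instantiating a vApp from a template for a test machine).
VCLOUD_API_VERSION = "5.1"  # assumption: version in use at the time

def vcloud_session_request(host, user, org, password):
    """Build the URL and headers for the vCloud 'create session' call."""
    credentials = "%s@%s:%s" % (user, org, password)
    token = base64.b64encode(credentials.encode("utf-8")).decode("ascii")
    url = "https://%s/api/sessions" % host
    headers = {
        "Authorization": "Basic " + token,
        "Accept": "application/*+xml;version=" + VCLOUD_API_VERSION,
    }
    return url, headers

# Placeholder host and credentials for illustration only.
url, headers = vcloud_session_request("vcloud.example.com", "tester", "QCOrg", "secret")
print(url)
```

A nightly TFS build step could call this before deploying the freshly built product onto the provisioned virtual machine.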
Overview
Source: http://www.hpsoftwaregovsummit.com/2013/files/apps-s6-melindacloverjakeburman-16x9.pdf
Log In: http://#####:8080/qcbin/
Projects | Lab Management
Manage testing resources, pool resources, and schedule timeslots for tests and resources.
Lab Management Administration
Lab Settings: monitoring, maintenance, and distribution of resources to the projects.
Lab Resources: hosts, locations and schedules.
Performance Center: reports, health and diagnostics.
Servers: Lab Service (remotely trigger tests on a host), PC Servers (manage Performance Center functionality), CDA Servers (dynamic provisioning and deployment of resources).
Lab Settings
The Manage Project Settings tab lists the ALM projects together with their configuration information.
The Maintenance Tasks Overview tab runs server-side tests that continuously monitor the system's key components.
Lab Resources
Server-side check with a one-time registration process: it validates the identity of the agent and establishes a secure communication channel between ALM and the host.
Performance Center Manage and maintain PC lab resources
Servers
PC Servers module (manage Performance Center functionality)
CDA Servers module (dynamic provisioning and deployment of resources)
Log In: http://#####:8080/qcbin/
Manage testing resources, pool resources, and schedule timeslots for tests and resources.
Lab Management Overview
ALM tests can run using different modes of execution: functional test sets and performance test sets.
Test sets can be run immediately or scheduled on remote testing hosts, or run as part of a build verification suite.
Test Lab
Create test sets.
Run tests in a functional test set.
Run tests in a default test set (client-side execution).
Run performance tests (schedule runs with ALM).
View and analyze test results.
Switch Test Set Type
Toggles the test set type between Default and Functional.
Define hosts: assigned automatically or by explicit request.
Build Verification Suites
Add functional test sets and a performance test; review the requested hosts.
Timeslots
Test Runs
In the Test Runs tab, you can view the results of test executions: individual runs, test set state, and build verification suite runs.
Be careful: don't confuse the test run ID with the run ID.
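The two IDs can be kept apart through ALM's REST "runs" entity: every execution gets a globally unique run "id", while the per-test view in the UI is tied to the owning test ("test-id"). A hedged sketch of the corresponding query URL follows; server, domain, project and the test ID are placeholders, and the query syntax matches ALM 11.x.

```python
# Sketch: querying the ALM REST 'runs' entity to list all runs of one
# test, newest first by the global run id. The curly-brace filter
# syntax (query={field[value]}) is ALM's own query format.
ALM_BASE = "http://alm.example.com:8080/qcbin"  # placeholder server

def runs_query_url(domain, project, test_id):
    """URL listing all runs of one test, ordered by global run id."""
    return ("%s/rest/domains/%s/projects/%s/runs"
            "?query={test-id[%d]}&order-by={id[DESC]}"
            % (ALM_BASE, domain, project, test_id))

# Placeholder domain/project/test id for illustration only.
u = runs_query_url("DEFAULT", "LabDemo", 1041)
print(u)
```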
Summary
The GUI simplifies the management and scheduling of testing assets and resources across projects.
Projects can create timeslots dedicated to running functional test cases, performance test cases, or both combined.
Better visibility of resource usage and optimized utilization of hardware.
Drawbacks: slow performance (e.g. report generation); issues converting BPT into performance test cases; the UFT plug-in modules must match the BPT test case on the client, and there is no check for required add-ins; the Timeslot Usage report only displays Vusers from reservations that were attached to a test.
Thank you