Micro Focus Enterprise View COBOL Process Guide




Copyright 2007 Micro Focus (IP) Ltd. All rights reserved. Micro Focus (IP) Ltd. has made every effort to ensure that this book is correct and accurate, but reserves the right to make changes without notice at its sole discretion at any time. The software described in this document is supplied under a license and may be used or copied only in accordance with the terms of such license, and in particular any warranty of fitness of Micro Focus software products for any particular purpose is expressly excluded and in no event will Micro Focus be liable for any consequential loss. Animator, COBOL Workbench, EnterpriseLink, Mainframe Express, Micro Focus, Net Express, REQL and Revolve are registered trademarks, and AAI, Analyzer, Application Server, Application to Application Interface, AddPack, AppTrack, AssetMiner, CCI, DataConnect, Dialog System, Enterprise Server, Enterprise Server with MTO, EuroSmart, FixPack, LEVEL II COBOL, License Server, Mainframe Access, Mainframe Manager, Micro Focus COBOL, Micro Focus Studio, Micro Focus Server, Object COBOL, OpenESQL, Personal COBOL, Professional COBOL, Server Express, SmartFind, SmartFind Plus, SmartFix, SourceConnect, SupportLine, Toolbox, VS COBOL, WebSync, and Xilerator are trademarks of Micro Focus (IP) Ltd. IBM and CICS are registered trademarks, and IMS, DB2, DB2/390, DB2 Connect Enterprise Edition, DB2 Connect Personal Edition, COBOL for OS/390, Enterprise Systems Architecture/390, ESA/390, MVS, MVS/ESA, OS/390, S/390, System/390, VSE/ESA and MFS are trademarks, of International Business Machines Corporation. Netscape Enterprise Server is a trademark of Netscape Communications Corporation. Internet Information Server, Windows 95, Windows 98, Windows NT 4.0, Windows ME and Windows 2000 are registered trademarks of Microsoft Corporation. Pentium is a registered trademark of INTEL Corporation. Borland Database Engine and Midas are trademarks of Inprise Corporation. Oracle is a trademark of Oracle Corporation Inc. 
UNIX is a registered trademark of The Open Group. Solaris, Ultra Sparc and Java are trademarks of Sun Microsystems, Inc. Linux, GCC compiler, GNU CTAGS, GNU DIFF, Samba and KDE/GNOME refer to Open Source tools. This product includes modified versions of software developed by the Apache Software Foundation (as described on the Apache web site).

The YGrep Search Engine is Copyright (c) 1992-2004 Yves Roumazeilles. All other trademarks are the property of their respective owners. No part of this publication, with the exception of the software product user documentation contained on a CD-ROM, may be copied, photocopied, reproduced, transmitted, transcribed, or reduced to any electronic medium or machine-readable form without prior written consent of Micro Focus (IP) Ltd. Contact your Micro Focus representative if you require access to the modified Apache Software Foundation source files. Licensees may duplicate the software product user documentation contained on a CD-ROM, but only to the extent necessary to support the user's authorized access to the software under the license agreement. Any reproduction of the documentation, regardless of whether the documentation is reproduced in whole or in part, must be accompanied by this copyright statement in its entirety, without modification. U.S. GOVERNMENT RESTRICTED RIGHTS. It is acknowledged that the Software and the Documentation were developed at private expense, that no part is in the public domain, and that the Software and Documentation are Commercial Computer Software provided with RESTRICTED RIGHTS under Federal Acquisition Regulations and agency supplements to them. Use, duplication or disclosure by the U.S. Government is subject to restrictions as set forth in subparagraph (c)(1)(ii) of The Rights in Technical Data and Computer Software clause at DFAR 252.227-7013 et seq. or subparagraphs (c)(1) and (2) of the Commercial Computer Software Restricted Rights at FAR 52.227-19, as applicable. Contractor is Micro Focus (IP) Ltd, 9420 Key West Avenue, Rockville, Maryland 20850. Rights are reserved under copyright laws of the United States with respect to unpublished portions of the Software.


Table of Contents

1. About This Document
2. Preparation
   2.1. Prerequisites
   2.2. Understanding the Analysis Process Phases
3. Enterprise View COBOL Process Overview
4. Providing Sources to the Enterprise View Server Machine
   4.1. COBOL Batch Environment
   4.2. COBOL CICS Environment
   4.3. COBOL IMS Environment
   4.4. Sources with COBOL Pre-Processor Statements
   4.5. Missing Sources
      4.5.1. Missing Programs or Transactions
      4.5.2. Missing Copy Books
      4.5.3. Missing JCLs
      4.5.4. Missing JCL Catalogued Procedures
      4.5.5. Missing CSD
5. Defining an Organization View Tree
6. Starting Host Jobs
7. Starting the Inventory Probe Job
   7.1. Introduction
   7.2. How to Run the Inventory Probe Job
   7.3. Understanding Inventory Probe Job Results
   7.4. When to Rerun the Inventory Probe Job
      7.4.1. Rerun after Adding New Components
      7.4.2. Rerun after Adding New Extensions
8. Starting the Inventory Job
   8.1. Introduction
   8.2. How to Run the Inventory Job
   8.3. Inventory Job Results
   8.4. When to Rerun the Inventory Job
9. Starting the Analysis Job
   9.1. Introduction
   9.2. How to Run the Analysis Job
   9.3. Analysis Job Results
      9.3.1. Unresolved Objects
      9.3.2. Analysis Status Done (with warnings)
      9.3.3. Analysis Status Done (with errors)
      9.3.4. Analysis Status Done (time out)
      9.3.5. Analysis Status Done (not parsed)
      9.3.6. Sources with Status To Do (Extended) after Analysis
   9.4. When to Rerun the Analysis Job
10. Starting the Life Cycle Job
   10.1. Introduction
   10.2. How to Run the Life Cycle Job
   10.3. Understanding Life Cycle Status Results
11. Starting the Unsupported Syntax Job
   11.1. Introduction
   11.2. How to Run the Unsupported Syntax Job
   11.3. Understanding Unsupported Syntax Results
12. Starting the Inventory Deletion Job
   12.1. Introduction
   12.2. How to Run the Inventory Deletion Job
13. Starting the HTML Export Job
14. Starting the End Daemon Job
15. Analyzing COBOL Sources
   15.1. Introduction
   15.2. Sources Containing File and Database Definitions
      15.2.1. Introduction
      15.2.2. DB2 Database
      15.2.4. VSAM Delete/Define File Statements
   15.3. COBOL Copy Books
   15.4. COBOL Sources with SQL Statements
   15.5. Dynamic Link in COBOL Sources
16. Batch Environment
   16.1. Introduction
   16.2. Batch Jobs (JCL) and Catalogued Procedures
      16.2.1. Introduction
      16.2.2. Physical Sequential Files
      16.2.3. GDG (Generation Data Group) File Definitions
      16.2.4. Catalogued Procedures
      16.2.5. Cards
      16.2.6. JCLs
      16.2.7. COBOL Batch Programs
17. CICS Environment
   17.1. Introduction
   17.2. CICS Online Resource Definition File
   17.3. CICS Map Definitions
   17.4. CICS COBOL Transactions and Programs
18. IMS Environment
   18.1. Introduction
   18.2. IMS Resource Definitions
   18.3. IMS MFS Definitions
   18.4. IMS COBOL Transactions and Programs
19. COBOL Statements Analysis
   19.1. COBOL Language General Considerations
   19.2. COBOL Detailed Analysis


1. About This Document

This guide describes how to use Enterprise View on a COBOL application and takes you step by step through the processing phases that must be performed in order to populate an environment. When you are familiar with the subject matter of this guide, you will have a basic understanding of how to process COBOL sources in different environments.


2. Preparation

2.1. Prerequisites

In order to configure and analyze COBOL applications, the following software must be installed:

1. Enterprise View Host Part (Server Modules) on the Server machine.
2. Enterprise View Client Part (Client Solutions) on the Client machine.
3. Oracle Database on the Server machine.
4. Enterprise View MVS Agent (optional).

Enter a User Identifier and Password to ensure:

1. access to the Enterprise View Host Server machine;
2. that the Enterprise View product will start on the client PC;
3. access to the Oracle Database Schema on the Server machine.

Before processing the environment's sources, perform the following steps:

1. Install Enterprise View. For details, see the Installation Guide.
2. Configure the Enterprise View environment (administrator). For details, see the Administration Guide.
3. Locate the sources that will be processed. For details on the types of sources for a COBOL environment, see How to Provide Sources to Enterprise View.
4. Define the Technical View. For details, see the Organization View Process Guide. You must define the technical view in order to begin processing the sources.

2.2. Understanding the Analysis Process Phases

To better understand the analysis of COBOL applications, a process overview is given in the next chapter. It shows the sequence that you must follow to successfully execute the entire process. Depending on the messages and the different cases of an analysis, some of the steps may be repeated. The analysis process has two parts:

1. Primary analysis process of the application.
2. Post-analysis (life cycle) process of the analyzed applications. During this phase the applications are checked for modifications, for example new sources, deleted sources, or changed program code. The Enterprise View Repository and Database tables are updated with these modifications.

The Inventory and Analysis Jobs for a given defined organization view must be executed until successful completion of the primary analysis process. The client can repeat the execution of the Inventory Job only after executing the Inventory Deletion Job. The Life Cycle Job is executed when modifications have been made to the applications after successful completion of the primary analysis process.

3. Enterprise View COBOL Process Overview

Part I. Primary analysis process overview

[The original flowchart is summarized below as a sequence of steps with decision points.]

1. Transfer the COBOL sources to the Enterprise View Server.
2. Identify the source code and check the file extensions.
3. Define the Technical View, and optionally a Custom View.
4. Run the Inventory Probe Job.
5. Check whether all the necessary components are in the package. If components are missing, add them. If there are unsupported files or wrong extensions, use Work with File Extensions, run the Inventory Deletion Job with Delete Structure, and repeat from step 4.
6. Run the Inventory Job and check for error messages. If there are errors, correct them, run the Inventory Deletion Job without Delete Structure, and rerun the Inventory Job.
7. Optionally, create Proposed Dynamic Links.
8. Run the Analysis Job and check the results for unresolved objects and error messages. Add any missing sources and rerun if necessary.
9. Check for Proposed Dynamic Links and confirm the Dynamic Links.
10. View the results in the client modules: Enterprise View Tech and Enterprise View Builder.

Part II. Post-analysis (Life Cycle) process overview

1. Transfer the modified application sources to the Enterprise View Server machine, into the corresponding components of the Organization View.
2. Select the (already successfully analyzed) Organization View.
3. Run the Life Cycle Job.
4. Check the results for unresolved objects and error messages. Load any missing sources and correct the errors.
5. Check for Proposed Dynamic Links and confirm the Dynamic Links.
6. View the results in the client modules: Enterprise View Tech and Enterprise View Builder.


4. Providing Sources to the Enterprise View Server Machine

Regardless of the source system from which the source code is provided, if the source machine is different from the Enterprise View Server, file transfer is the standard channel used by the Enterprise View Agent Modules. If your company does not allow its use, another transfer method or protocol should be discussed.

The product usually stores sources in several folders, one for each kind of source. Each folder should contain the transferred sources, and each kind of source must be transferred with the extensions listed in the corresponding column of the tables below (COBOL Batch, COBOL CICS, COBOL IMS). This is mandatory; otherwise the source will not be recognized.

A specific module called Enterprise View MVS Agent can be installed to allow quick configuration of the sources to be transferred to the Enterprise View Server. This module allows partitioned selection and automatic extension assignment.

To better represent the structure of the code libraries, as close as possible to the developers' view, we strongly recommend that you download the sources using the same organization as on the source system.

The client can create and delete his own extensions using the Work with File Extensions menu in the Enterprise View Configuration Manager. This operation can be performed only by an administrator of the product. The client can use the Enterprise View pre-defined extensions as well as custom-defined ones. The extensions in the tables below are the Enterprise View pre-defined extensions. They are grouped by the environment in which the COBOL applications run:

- Batch environment: COBOL Batch;
- Online environment: CICS (COBOL CICS) and IMS (COBOL IMS).

You must assign an extension, or the sources will not be recognized.

NOTE: A COBOL program can have the general extension .cbl (or .CBL) or an extension that specifies the environment where the program runs (batch, CICS, IMS). The sources are handled in the same way by the COBOL parser, but we recommend you use extensions linked to the environment, because it is then easier to display the data in the client modules (Enterprise View Tech and Enterprise View Builder): the Type column will show different values related to the environment (batch, CICS, IMS), the Overview tab sheet will show the program distribution by type, and it will be easier to filter the program list on the Type column.

4.1. COBOL Batch Environment

The extensions described in the next table are pre-defined in the product. The user can use them for applications containing COBOL programs and other sources running in a batch environment.

Component              Description                Extensions
Programs               Batch COBOL program        .cbl, .CBL, .cblbat, .CBLBAT
JCL                    Job Control Language       .jcl, .JCL
Catalogued Procedure   JCL catalogued procedure   .proc, .PROC
Skeleton               JCL skeleton               .skel, .SKEL
Card                   JCL card                   .card, .CARD
Copy books             COBOL copy books           .cpy, .CPY
Files                  DL/I DBD                   .dbd, .DBD
                       VSAM Delete Define         .ddef, .DDEF
                       DL/I PSB                   .dbpsb, .DBPSB
                       DB2 DDL                    .ddldb2, .DDLDB2

4.2. COBOL CICS Environment

The extensions described in the next table are pre-defined in the product. The user can use them for applications containing CICS COBOL programs and other sources running in an online CICS environment.

Component                  Description                            Extensions
Programs                   CICS COBOL transactions and programs   .cblcic, .CBLCIC
Copy books                 COBOL copy books                       .cpy, .CPY
CICS Resource Definitions  Online CSD file definition             .csd, .CSD
Map Definitions            Map definition                         .map, .MAP
                           BMS map set definition                 .mapbms, .MAPBMS

4.3. COBOL IMS Environment

The extensions described in the next table are pre-defined in the product. The user can use them for applications containing IMS COBOL programs and other sources running in an online IMS environment.

Component        Description                           Extensions
Programs         IMS COBOL transactions and programs   .cblims, .CBLIMS
                 IMS Stage One                         .ims, .IMS
Copy books       COBOL copy books                      .cpy, .CPY
Files            VSAM Delete Define                    .ddef, .DDEF
                 DL/I PSB                              .dbpsb, .DBPSB
                 DL/I DBD                              .dbd, .DBD
Map Definitions  MFS map definition                    .mapmfs, .MAPMFS

4.4. Sources with COBOL Pre-Processor Statements

If COBOL sources contain pre-processor statements other than EXEC DLI, EXEC SQL, or EXEC CICS, Enterprise View does not support them directly. In that case you must provide not the original sources but the sources already handled by the pre-processor, i.e. sources that can be compiled by a standard COBOL compiler.

4.5. Missing Sources

If software sources are not provided, they are not analyzed. This leads to potential problems and/or incompleteness of the analysis. When a source is missing but is referenced by another source, an unresolved object is created. Many other issues can also be generated; the most important ones are listed below.

4.5.1. Missing Programs or Transactions

Issues:
- An unresolved object of a specific type will be generated.
- Any recognized call to missing objects will link to the unresolved object.
- The Technical Function Point will be corrupted.
- Impact Analysis will not be possible.

4.5.2. Missing Copy Books

Issues:
- If referenced, an unresolved object of a specific type will be generated.
- Metrics computed without the included copy books will be corrupted.

4.5.3. Missing JCLs

Issues:
- As JCLs are starting points, the chains may appear isolated if no other object references them.
- The used files will not be referenced.
- The relationships among the files and programs they declare will be lost.
- Information about the link between a file and the data areas describing it will be missing.
- The link between a physical file and the programs using it will be missing.

4.5.4. Missing JCL Catalogued Procedures

Issues:
- The programs called by the procedures will not be referenced.
- The used files will not be referenced.
- The relationships among the files and programs declared in the missing procedures will be lost.
- Information about the link between a file and the data areas describing it will be missing.
- The link between a physical file and the programs using it will be missing.

4.5.5. Missing CSD

Issues:
- The transactions declared in it will be lost.
- The programs declared in it will be lost.
- The relationships between the declared transactions and programs will be lost.
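As an illustration of how a missing program turns into an unresolved object, the sketch below (not product code; the CALL pattern and the program names are hypothetical simplifications) collects the static CALL targets found in a set of COBOL sources and reports every called program for which no source is present:

```python
import re

# Matches simple static calls such as: CALL 'PROG01' or CALL "PROG01".
# Dynamic calls (CALL identifier) are deliberately ignored here; the
# product handles those through Proposed Dynamic Links.
CALL_PATTERN = re.compile(r"CALL\s+['\"]([A-Z0-9$#@-]+)['\"]", re.IGNORECASE)

def unresolved_calls(sources):
    """sources maps program name -> COBOL source text. Returns the set of
    called program names with no matching source: the candidates that an
    analysis would report as unresolved objects."""
    provided = {name.upper() for name in sources}
    called = {
        target.upper()
        for text in sources.values()
        for target in CALL_PATTERN.findall(text)
    }
    return called - provided
```

For example, if the package contains MAIN and SUB1 but MAIN also calls SUB2, the function returns {'SUB2'}, mirroring the unresolved object the Analysis Job would create for the missing program.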

5. Defining an Organization View Tree

The Organization View is an Enterprise View structure. It gives a better representation of the structure of the sources. The client can organize his applications in Technical or Custom Organization Views.

The Technical organization view is used for:
- starting the jobs: Inventory, Analysis, Life Cycle, Inventory Deletion, etc.;
- better presenting and understanding the results and the charts after execution of the Analysis Job.

You can also create a Custom organization view. This enables you to see the analysis results from a different point of view, reordering and connecting the components from one or more technical organization views in one custom view. You can combine components by criteria different from the developer's point of view and closer to his business environment.

You should bear the following in mind when defining an organization view for COBOL sources running on a mainframe platform:
- A technical organization view is mandatory for executing the jobs.
- System OS/390 has to be selected when a technical organization view is defined.
- A custom organization view can be created at any time during the analysis process.

For further details about the goals of an organization view and how to define one, see the Organization View Process Guide.
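The distinction between technical and custom views can be pictured as two trees sharing the same leaf components. The following minimal sketch is purely illustrative; all component names are invented:

```python
class Component:
    """A node in an organization view tree."""
    def __init__(self, name, children=None):
        self.name = name
        self.children = list(children or [])

# A technical view mirrors the folder structure of the sources.
payroll_jcl = Component("payroll-jcl")
orders_online = Component("orders-online")
technical = Component("OS/390 package", [
    Component("batch", [payroll_jcl]),
    Component("cics", [orders_online]),
])

# A custom view reorders the same components by business criteria,
# referencing them rather than duplicating them.
custom = Component("Payroll business view", [payroll_jcl])
```

The point of the sketch is that the custom view does not copy the sources: it reconnects components that were already inventoried under a technical view.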


6. Starting Host Jobs

There are two ways to submit the host jobs for execution: using the Work with Jobs icon or the Organization View Tree. They are described individually in the chapters for each host job. In newer versions of the Enterprise View product, a Job Creation Wizard creates the starting parameters of the host jobs. The Job Creation Wizard can also be started from the Work with Jobs icon or from the Organization View Tree. The sequence of the Job Creation Wizard menus is presented below:

1. The Job Wizard Welcome menu for submitting a host job. Press Next to continue or Cancel to exit the wizard.

2. The Job Wizard Selection menu, which contains all the different types of jobs. You can select the name of the host job from the drop-down list. You can also set the schedule date and time. Click Next to continue or Back to review your previous choices. If you choose the Cancel button, the Job Wizard will terminate. The host jobs to be examined are:
- Inventory Probe Job
- Inventory Job
- Analysis Job
- Life Cycle Job
- Inventory Deletion Job
- HTML Export Job
- End Daemon Job

3. The Job Wizard menu for selecting parameters. From the drop-down lists you can select the package, organization view, operational mode, and current period. Update Trend is used if you are interested in the previous status and the movement of the metrics of the objects. You can click Next to continue or Back to review your previous choices. If you choose the Cancel button, the Job Wizard will terminate.

4. The Job Wizard menu for selecting a Multidaemon to execute the submitted job and entering the parameters of the submitted job. There are two Daemon jobs: MAIN and SECONDARY. You can click Generate to continue or Back to review your previous choices. If you choose the Cancel button, the Job Wizard will terminate.

5. The Job Wizard Complete menu. Press Finish to exit the wizard. After clicking the Finish button, the Work with Jobs menu will appear. Depending on the Daemon name selected in the wizard menu, the corresponding Multidaemon job starts. Press the OK button to submit the job.

7. Starting the Inventory Probe Job

7.1. Introduction

The Inventory Probe Job verifies whether all sources provided by means of the Define Technical View function have extensions recognized by the Enterprise View Knowledge Base. As described in the chapter Providing Sources to the Enterprise View Server Machine, the product requires the sources to be processed to be placed in different folders, depending on the kind of source, and they must have the proper extension for the type of source. This job answers the following questions:
- How many sources are contained in a package?
- How many sources of each type are contained in the package?
- What types of sources are contained in the package?
- Are there extensions (i.e. types of sources) that are not supported?

The Inventory Probe Job produces a report that contains the count of sources in the package, sorted by extension, highlighting which are catalogued and which are uncatalogued.

NOTE: Although running this job is not mandatory, we strongly recommend it, because it checks at an early stage whether all the source types necessary for the process (Inventory and Analysis) have been included in the package. This job can be repeated if necessary. For more information see the chapter When to Rerun the Inventory Probe Job.

7.2. How to Run the Inventory Probe Job

There are two ways of running the Inventory Probe Job:

1. Using the Work with Jobs icon. Click the Work with Jobs icon and select the New button. If the Job Creation Wizard menu appears, go to the chapter Starting Host Jobs. Otherwise select Inventory Probe from the popup window and fill in the fields as described in the chapter Work with Jobs in the Common Features reference guide.

2. Using the Organization View Tree. Select the package in the organization view tree, right-click, and select the Job Submission item and then Submit Inventory Probe Job. If the Job Creation Wizard menu appears, go to the chapter Starting Host Jobs. The Work with Jobs window will appear with all the fields populated with the information about the selected package.

Start the Multidaemon session. Press the OK or Apply button on the Work with Jobs window to start the job. After execution of the job, select Work with Jobs again. The Show Result button appears. Press the Refresh button, and then the Show Result button. A window with the results of the execution will be displayed.

7.3. Understanding Inventory Probe Job Results

The results from the execution of the Inventory Probe job are written to a .txt file on the server machine. The sources (or their extensions) are classified into two main groups:

Catalogued

All objects in this category will be inventoried. Depending on the Analysis and Save mode, different actions will be taken:

a) Extended Analysis - Sources corresponding to the extensions described in this group are fully analyzed by parsers with extended analysis.
1. Object Reference Analysis for all objects; Detailed Analysis only for the supported languages (for COBOL sources, detailed analysis can be performed)
2. Sizing Measure
3. Quality Measure
4. Objects Change Traffic Measure
5. LOC Change Traffic Measure

b) Basic Analysis - Sources corresponding to the extensions described in this group are analyzed by parsers with basic analysis.
1. No Object Reference or Detailed Analysis
2. Sizing Measure
3. No Quality Measure
4. Objects Change Traffic Measure
5. LOC Change Traffic Measure

c) Disabled Analysis - Sources corresponding to the extensions described in this group are recognized in the Enterprise View Tables, but only for inventory purposes. The product does not perform any type of analysis after the Inventory phase.
1. No Object Reference or Detailed Analysis
2. No Sizing Measure
3. No Quality Measure
4. Objects Change Traffic Measure (only New/Deleted)
5. No LOC Change Traffic Measure

d) Unsaved Files - Sources corresponding to the extensions described in this group are recognized in the Enterprise View Tables, but only for inventory purposes. The product does not perform any type of analysis after the Inventory phase, and it is not possible to display their source code by means of the Show Source function or to trace their source history, because they have not been saved in the Enterprise View Repository.
1. No Object Reference or Detailed Analysis
2. No Sizing Measure
3. No Quality Measure
4. No Objects Change Traffic Measure
5. No LOC Change Traffic Measure

Uncatalogued

All objects belonging to this category will be ignored for the inventory.

a) Skipped Files - Sources corresponding to the extensions described in this group exist in the Enterprise View configuration tables, but objects with these extensions are ignored. This group contains sources recognized in the Enterprise View Tables but not included in the Inventory or Repository.

b) Undefined Files - Sources corresponding to the extensions described in this group do not exist in the Enterprise View configuration tables; they are therefore not included in the inventory and are not analyzed.

This file gives a broad view of what is present in the package and helps you to identify immediately whether all the components necessary to process your environment (COBOL Batch, COBOL CICS, COBOL IMS) have been included in the package.

NOTE: We advise you to rerun the Inventory Probe Job after adding new components or new extensions in the Work with File Extensions table, to check that the product has recognized them.

7.4. When to Rerun the Inventory Probe Job

You should rerun the Inventory Probe Job until there are no more incorrect or undefined extensions and no missing sources.

7.4.1. Rerun after Adding New Components

If any sources have been omitted, it is important that you add them using Define Technical View before proceeding with the other jobs. There are two ways of adding missing sources to the package and rerunning the Inventory Probe Job:

I. By executing the Inventory Deletion Job

The following steps have to be performed:
1. Run the Inventory Deletion Job and check the Delete Structure checkbox to delete the structure of the Technical Organization View for the defined application.
2. Create new folders for the missing sources in the main folder of the application on the server machine.
3. Transfer the missing sources to the new folders and check the file extensions. Some sources can also be transferred to the corresponding existing folders.
4. Define the Technical Organization View, including the new folders as components.

5. Run the Inventory Probe Job again and check the results.

II. Without executing the Inventory Deletion Job (for advanced users)

The following steps have to be performed:
1. Create a new folder for the missing sources of the application on the server machine. If there are sources with different types of file extensions, the folder can contain subfolders.
2. Transfer the missing sources to the corresponding new folders and check the file extensions.
3. Define a new technical organization view for these new folders only.
4. Attach the newly defined package and its components at the right place in the existing organization view (package).
5. Run the Inventory Probe Job again and check the results.

7.4.2. Rerun after Adding New Extensions

You must think carefully about the sources catalogued in Undefined Files, because their extensions are not recognized in the Enterprise View Knowledge Base. If these sources are necessary for the environment, you can add their extensions in the Enterprise View File Extension Table (for details, refer to Work with File Extensions in the Enterprise View Configuration Manager or reference guide) and include them in the Enterprise View process; otherwise, you can ignore them or delete them from the source folders.

If the extensions shown in Undefined Files are due to extensions having been incorrectly assigned to sources, you must correct them and rerun the Inventory Probe Job.

NOTE: We recommend you rerun the Inventory Probe Job after adding new extensions in the Work with File Extensions table to check that the product has recognized them.
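The classification that the Inventory Probe report applies can be illustrated with a short Python sketch. This is illustrative only, not the product's implementation; the extension-to-group mapping below is a hypothetical stand-in for the Work with File Extensions table.

```python
# Illustrative sketch: classify source files into the catalogued/uncatalogued
# groups used by the Inventory Probe report. The mapping is a hypothetical
# stand-in for the Work with File Extensions table, not the real configuration.
import os

EXTENSION_GROUPS = {
    ".cbl": "extended",   # full analysis: references, sizing, quality measures
    ".cpy": "extended",
    ".jcl": "basic",      # sizing and change-traffic measures only
    ".txt": "disabled",   # inventoried for reference purposes only
    ".lst": "skipped",    # known to the configuration tables but ignored
}

def classify(filename):
    """Return the probe group for one file: extended, basic, disabled,
    skipped, or undefined (extension absent from the configuration tables)."""
    ext = os.path.splitext(filename)[1].lower()
    return EXTENSION_GROUPS.get(ext, "undefined")

def probe_report(filenames):
    """Count files per group, mimicking the broad view the probe report gives."""
    report = {}
    for name in filenames:
        group = classify(name)
        report[group] = report.get(group, 0) + 1
    return report
```

A file whose extension falls into the "undefined" bucket is exactly the case section 7.4.2 describes: either add the extension to the configuration or correct the misassigned extension and rerun the probe.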

8. Starting the Inventory Job

8.1. Introduction

The Inventory Job loads information about all the sources in the folders provided by the users, by creating organization views in the Enterprise View Tables and Repository. The Inventory Job can be run only at the highest level of the organization view - the package.

There are three types of inventory scanners:

1. Code Inventory
The Code Inventory is a directory scanner that performs the following operations:
- Scans the list of directories included in the organization view.
- Generates a Repository File (a physical copy of all useful objects) in a separate directory:
  o Generates a history version of all the useful objects (including the different output for the different versions).
  o Generates a last version file and a history of scans.
- Writes all the retrieved information to the Enterprise View repository and database.

2. Data Inventory
Data Inventory is a database scanner that performs the following operations:
- Scans the database attached to the organization view and extracts the definitions of the tables, indexes, views, and primary and foreign keys.
- Writes all the retrieved information to the Enterprise View Repository and database.

3. Full Inventory
The Full Inventory scanner performs both Code and Data Inventory functions.

Performing an inventory of a package is a prerequisite to running the subsequent phases. The Inventory Job can be rerun as required. For more information, see the chapter When to Rerun the Inventory Job.

8.2. How to Run the Inventory Job

There are two ways of running the Inventory Job:

1. Using the Work with Jobs icon
Click the Work with Jobs icon and select the New button. If the Job Creation Wizard menu appears, go to the chapter Starting Host Jobs. Otherwise select Inventory Job from the popup window and fill in the fields as described in the chapter Work with Jobs in the Common Features reference guide.

2. Using the Organization View Tree
Select the package in the organization view tree, right-click, and select the Job Submission item and Submit Inventory Job. If the Job Creation Wizard menu appears, go to the chapter Starting Host Jobs.

In both cases the Work with Jobs window will appear with the information about the selected package.

Start a Multidaemon session on the server machine. Press the OK or Apply button on the Work with Jobs window to submit the job.
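The Code Inventory scan described in the Introduction can be sketched as follows. This is an illustrative sketch, not the product's implementation; the repository layout (a "last" directory plus a "history" directory) and the content hashing are assumptions made for the example.

```python
# Illustrative sketch of a Code Inventory-style directory scan: copy each
# useful object into a repository directory, and when a file's content has
# changed since the last scan, keep the previous copy as a history version.
# The layout and hashing are assumptions, not Enterprise View's actual format.
import hashlib
import shutil
from pathlib import Path

def scan_into_repository(source_dirs, repo_dir):
    repo = Path(repo_dir)
    (repo / "last").mkdir(parents=True, exist_ok=True)
    (repo / "history").mkdir(parents=True, exist_ok=True)
    scanned = []
    for src_dir in source_dirs:
        for src in Path(src_dir).rglob("*"):
            if not src.is_file():
                continue
            last = repo / "last" / src.name
            if last.exists():
                old = hashlib.sha256(last.read_bytes()).hexdigest()
                new = hashlib.sha256(src.read_bytes()).hexdigest()
                if old != new:
                    # keep the previous version before overwriting it
                    shutil.copy2(last, repo / "history" / (src.name + "." + old[:8]))
            shutil.copy2(src, last)
            scanned.append(src.name)
    return scanned
```

Rerunning the scan after a source changes leaves the newest copy under "last" and the superseded copy under "history", which is the idea behind the "last version file and history of scans" described above.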

8.3. Inventory Job Results

The results of completing the Inventory Job can be viewed in Enterprise View Builder, Enterprise View Tech, and Enterprise View Dynamic Inventory, from different points of view. You can see:
- General and summary information about all loaded sources, by looking at the Overview and Summary tabsheets.
- Detailed information about all loaded sources, by looking at the specific tabsheet for each source class: Programs, Files, Copy/Includes, JCLs, Database Definitions and others.
- Program source code and object properties, by left-clicking every row of the specific tabsheet for each source class and using the dropdown menu.

The next picture shows the Overview tabsheet of the Enterprise View Builder solution. The number of objects and the lines of source are distributed by class.

In the next picture, detailed information about all loaded objects is displayed in the All Objects tabsheet of Enterprise View Builder.
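The kind of per-class roll-up shown on the Overview tabsheet (object count and lines of source distributed by class) can be sketched as below. The object records are hypothetical; the product reads the equivalent data from its own tables.

```python
# Illustrative sketch of the Overview-style roll-up: aggregate object count
# and source lines of code (SLOC) by source class. The record layout is a
# hypothetical stand-in for the Enterprise View tables.
def overview(objects):
    """Return {class: (object_count, total_sloc)} for a list of object records."""
    totals = {}
    for obj in objects:
        cls = obj["class"]
        count, sloc = totals.get(cls, (0, 0))
        totals[cls] = (count + 1, sloc + obj["sloc"])
    return totals
```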

Information is written for every object in the Enterprise View Tables about the:
- qualification and identification of the object:
  - name, alias and description
  - Enterprise View system, class, type, and language
- metrics:
  - number of objects
  - source lines of code (SLOC)
- unknown extensions:
  - objects with unknown extensions are marked as skipped. These objects are written to a file named skipped.txt in the JOB directory. The types of extensions and the number of ignored sources are recorded in this file. In newer versions of the Enterprise View product, the file skipped.txt is not created. Sources with unknown extensions belong to the uncatalogued category in the Inventory Probe Job results. These sources exist in the Enterprise View configuration tables, but they are ignored for the inventory.

The Analysis Status field in the Enterprise View Builder and Enterprise View Tech client solutions shows the results of the execution of the Inventory Job for every object in the selected organization view.

When the Analysis Status field contains:
- To Do (Basic) or To Do (Extended) - the execution of the Inventory Job has been successfully completed.
- Disabled - an inventory analysis has not been executed. The object has been recorded in the Enterprise View Repository as an object for references only.

The Overview tabsheet in the Enterprise View Builder shows the Object Qualifications Distribution by Analysis Status after execution of the Inventory Job.

8.4. When to Rerun the Inventory Job

You must rerun the Inventory Job in the following circumstances:

I. There are errors during the execution of this job. Such errors can be:
- Wrong directory paths
- No write access to the directories
- Not enough space on the disk for writing

The following steps have to be performed:
1. Run the Inventory Deletion Job without checking the Delete Structure checkbox, to delete all the information from the Enterprise View Repository and Database tables without deleting the structure of the selected organization view (mandatory).
2. Correct the errors.
3. Run the Inventory Job again.

II. There are missing sources and incorrect or missing extension assignments.

Usually this happens when the Inventory Probe Job has not been executed before the Inventory Job. In this case, all problems regarding wrong or undefined extensions and missing sources will appear after executing the Inventory Job.

In the case of missing sources, you can follow the instructions described in the chapter Rerun after Adding New Components and rerun the Inventory Job after the Inventory Probe Job.

In the case of incorrect assignment of extensions to sources, or missing extensions, you must follow these steps:
1. Run the Inventory Deletion Job without checking the Delete Structure checkbox, to delete all the information from the Enterprise View Repository and Database tables without deleting the structure of the selected organization view (advisable).
2. Correct the wrong extensions or create the missing extensions in the Enterprise View Knowledge database.
3. Run the Inventory Probe Job (advisable).
4. Run the Inventory Job again and check the results.


9. Starting the Analysis Job

9.1. Introduction

The analysis phase is the most important phase of the product. During this phase, Enterprise View analyzes the sources in the indicated organization view, based on their extensions, system, class, type and language definitions, and on the type of analysis. The type of analysis (basic, extended, or disabled) can be shown using the Work with Codes menu. For more information, see the Enterprise View Configuration Manager or reference guide.

During the analysis, Enterprise View:
- Identifies the definitions of the areas with the corresponding fields, when possible.
- Identifies the relationships between objects.
- Identifies the dynamic links between the objects if there are dynamic calls in the source code.
- Calculates metrics.

All the information that is found is stored in the Enterprise View repository and database tables, and can be viewed by using client modules such as Enterprise View Builder or Enterprise View Tech.

The analysis can be run at any level of the technical view: package, application, application version, and component. The analysis process works within the package, so it is important that all the components necessary for processing your environment (COBOL Batch, COBOL CICS, COBOL IMS) have been included in the same package.

If the analysis is run at the package level, Enterprise View automatically processes the sources in order. It first stores the information about files and the relationships between files and sources, and then handles include books and programs, which need the previously loaded information in order to calculate metrics properly and provide a complete picture of their relationships. Otherwise, you must run the analysis in the order expected by the Enterprise View product:
1. Database definitions (DDL, DBD)
2. Resource definitions (CICS CDS, IMS Stage 1)
3. Maps
4. JCLs
5. Include books
6. Programs

Following this order, Enterprise View will generate unresolved objects only if some software parts have not been included in the environment.

At the end of the analysis phase, all the information collected in the Enterprise View database will be available. Using the client solutions:
- Enterprise View Tech
- Enterprise View Dynamic Inventory
- Enterprise View Impact Analysis

you can see the data from different points of view. For further details, refer to the reference guide of each Enterprise View client solution.

This job can be repeated as necessary. For more information, see the chapter When to Rerun the Analysis Job.

9.2. How to Run the Analysis Job

There are two ways of running the Analysis Job:

1. Using the Work with Jobs window
Click on the Work with Jobs icon and select the New button. If the Job Creation Wizard menu appears, go to the chapter Starting Host Jobs. Otherwise, select Analysis from the popup window and press the OK button. The window showing the incomplete fields will appear on the screen. Check Update Trend if you are interested in the previous status and the movement of the metrics of the objects. The Organization View box allows you to select a package, application, application version or component for analysis. Fill in the fields as described in the chapter Work with Jobs in the Common Features reference guide. The Analysis Job with the populated fields is ready for execution.

2. Using the Organization View Tree
Select a package, application, application version or component in the organization view tree. Right-click the selected item, choose the Job Submission item, and select Submit Analysis Job from the popup menu as shown in the following picture. If the Job Creation Wizard menu appears, go to the chapter Starting Host Jobs.

In both cases, the Work with Jobs window will appear with the information about the selected item from the organization view tree. Start a Multidaemon session on the server machine. Press the OK or Apply button on the Work with Jobs window to submit the job for execution.

9.3. Analysis Job Results

After the Analysis Job finishes, you must verify that everything has been processed properly, and see whether errors have been found or there are problems that require the user's intervention. The Analysis Status field in Enterprise View Builder and Enterprise View Tech shows the results after execution of the Analysis Job for every object in the selected organization view.

The analysis can be:
- Basic - information about the object (name, description, system, class, type, language, number of objects, number of lines) is recorded in the Enterprise View Knowledge Base and in the Repository.
- Extended - the information from the basic analysis, plus information about attributes, relations, dynamic links and metrics, is recorded in the Enterprise View Knowledge Base and in the Repository.
- Disabled - no analysis is performed. The objects are recorded in the Enterprise View Knowledge Base and in the Repository as objects for relations only.

Once the analysis job has been run, all the sources should have one of the following analysis statuses:
- Disabled
- Done (Basic)
- Done (Extended)
- Done (with warnings, errors, timeout, or not parsed)

When the analysis status is:
- Done (Basic) - the basic analysis has been successfully completed.

- Done (Extended) - the extended analysis has been successfully completed.

The checks to be performed concern:
- Unresolved Objects
- Sources with analysis status Done (with warnings)
- Sources with analysis status Done (with errors)
- Sources with analysis status Done (timeout)
- Sources with analysis status Done (not parsed)

When you check the results after finishing the Analysis Job, take the following considerations into account:
- The level at which the analysis has been run: package, application, application version or component.
- The right order of processing the analysis of the sources when the analysis is not performed at package level.
- The objects with analysis status Done (with warnings, errors, timeout and not parsed).

A quick way to verify the status of the analysis is to open Enterprise View Builder and select the Overview tabsheet, the Objects Qualification Distribution by Analysis Status section. The objects classified by analysis status are displayed below. The All Objects tabsheet shows the Analysis Status of all objects in the selected organization view. You can see the source code of every object using the All Objects tabsheet, as shown in the next picture.
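When the analysis is not submitted at package level, keeping to the processing order listed in the Introduction avoids spurious unresolved objects. That ordering can be sketched as a simple sort key; the class names below are illustrative assumptions, not the product's internal identifiers.

```python
# Illustrative sketch: sort a mixed set of sources into the fixed processing
# order the Introduction prescribes for non-package-level analysis.
# Class names are assumptions made for this example.
ANALYSIS_ORDER = [
    "database definition",   # DDL, DBD
    "resource definition",   # CICS, IMS Stage 1
    "map",
    "jcl",
    "include book",
    "program",
]

def analysis_rank(source_class):
    """Position of a source class in the expected analysis order."""
    return ANALYSIS_ORDER.index(source_class)

def order_for_analysis(sources):
    """sources: list of (name, class) pairs; returns them in analysis order."""
    return sorted(sources, key=lambda s: analysis_rank(s[1]))
```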

9.3.1. Unresolved Objects

One of the main features of the Analysis Job is the tracking of relationships between objects. If, during the analysis of a source, Enterprise View does not find in the repository an object used or called by the source itself, it stores the missing object as an unresolved object. The unresolved object can be a program, copy book, file, CICS resource, SQL or DL/I database table. Some cases of missing objects are described in the examples below.

Example 1
Program A calls Program B; the source of Program A is in the Enterprise View Repository (it belongs to the package), but the source of Program B has not been included in the package. When Enterprise View analyzes the source of Program A and finds that there is a call to Program B but does not find Program B in the repository, it stores Program B as an unresolved object.

Example 2
Program A uses File C; the source of Program A is in the Enterprise View Repository (it belongs to the package), but no definition for File C is included in the package, for example a JCL file definition or a DB2 DDL table definition. When Enterprise View analyzes the source of Program A and finds that this source uses File C, but does not find File C in the repository, it stores File C as an unresolved object.

Example 3
Program A uses Include Book D; the source of Program A is in the Enterprise View Repository (it belongs to the package), but the source of Include Book D has not been included in the package. When Enterprise View analyzes the source of Program A and finds that there is a reference to Include Book D but does not find Include Book D in the repository, it stores Include Book D as an unresolved object.

The presence of an unresolved object usually means that:

- Not all objects have been included in your application.
- Not all objects have been transferred correctly and included in the organization view.
- If the analysis has been performed at application, application version, or component level, you have not run the analysis in the order expected by Enterprise View.
- Enterprise View cannot calculate some metrics properly. For example, if the missing object is an include book, this may affect the metrics calculated for the program that uses this include book. The metrics can be Halstead, MI, etc.
- Enterprise View will not create relationships between these objects.

NOTE: The list of unresolved objects is important as a means of locating the missing objects. Usually, your applications contain thousands of objects, and sometimes it is not easy to immediately identify all the libraries and folders that contain the sources of the application, or to have all the sources available at the time of the inventory phase. Using the list of unresolved objects, it is possible to:
- Check the number of missing objects.
- Simplify the search for sources in the development environment.

9.3.1.1. How to Manage Unresolved Objects

The most common causes of unresolved objects and the ways of managing them are described in this section.

Missing application programs
These are software parts that have not been included in the package because their sources were not available or because you forgot to put them in the organization view when it was generated.
How to solve the problem: Add the missing software part to the package and rerun the Analysis Job, following the necessary steps as explained in the chapter When to Rerun the Analysis Job.

Sources analyzed in the wrong order
As explained in the chapter Introduction, the analysis can be run at any level of the organization view: package, application, application version, or component. If the analysis is not run at the package level and the sources are analyzed in the wrong order, unresolved objects might be created.
How to solve the problem: Rerun the analysis of the components in the right order, following the steps described in the chapter When to Rerun the Analysis Job.

System utility sources not available
There are system utility programs or database interface programs used by the application programs whose source is never available, for example:
- System utilities - IEHMOVE, ICEGENER, IEBGENER, IEFBR14, IEUPDATE, IEBCOPY, IEHLIST, IEHPROGM, EZCOPY, SORT, IDCAMS
- Database loaders - CBLTDLI, IKJEFT01, IKJEFT1A, DSNMTV01
- IMS loaders - DFSRRC00, DFSURGU0
Usually they appear as unresolved objects.
How to solve the problem: Enter in the LOADERS section of the configuration file JclConfig.cfg the names of all loaders that the application uses, as shown in the next example. The JclConfig.cfg file is found in the Job directory on the Enterprise View server machine.

The LOADERS section of the JclConfig.cfg file:

    [LOADERS]
    [LOADER]
    NAME = 'IKJEFT01','IKJEFT1A'
    TYPE = DB2
    [LOADER]
    TYPE = IMS
    NAME = 'DFSRRC00'
    [LOADER]
    TYPE = IMSDB2
    IMS_NAME = 'DFSRRC00'
    DB2_NAME = 'DSNMTV01'
    [LOADER]
    TYPE = IDC
    NAME = 'IDCAMS'

9.3.1.2. How to Identify Unresolved Objects

If there are unresolved objects after finishing the Analysis Job for the selected organization view, they can be viewed in the Enterprise View Tech client solution.

Open Enterprise View Tech. Select the Unresolved tabsheet. It contains the list of all unresolved objects for the selected organization view. Select one object from the list and right-click. The dropdown menu will appear.
- Using the Show Source function from the dropdown menu, it is possible to find out whether the source has been included in the organization view or not.
- Using the Browse Object function from the dropdown menu, it is possible to follow the reference and to display the line where the item was involved.
- Using the References section, it is possible to see all the characteristics of this reference and the part from which the unresolved object originates.

9.3.2. Analysis Status Done (with warnings)

During the analysis of the sources, Enterprise View finds warnings (usually unsupported or unrecognized syntax) that might affect the reliability of metric calculations or relationship retrieval.

How to solve the problem:
1. Submit the Unsupported Syntax Job. This job reports the names of all objects that contain unsupported or unrecognized syntax. For more information, see the chapter Unsupported Syntax Job.
2. Select the Unsupported Syntax tabsheet in the Enterprise View Builder and verify, for each object, the type of syntax Enterprise View does not understand.
3. Correct the problem and rerun the necessary steps as explained in the chapter When to Rerun the Analysis Job.

9.3.3. Analysis Status Done (with errors)

When the analysis finishes with Done (with errors), this means that the analysis has not been completed because of a serious problem found in the job. This may be a Segmentation fault error, a wrong type of source (for example, the source is in binary format or contains unsupported syntax), a wrong extension, or some other serious error. The metrics are not calculated at all or are unreliable. Created relationships are unreliable too.

How to solve the problem: Verify that the sources marked with this error contain the correct type of code, rerun the analysis, and if the problem persists, contact Micro Focus SupportLine.

9.3.4. Analysis Status Done (timeout)

This error appears after the execution of the Analysis Job and means that the analysis of the object has not been completed because no answer was provided within the maximum analysis time specified at the time of the installation. Metric calculations and/or relationship retrieval may be unreliable.

How to solve the problem: Verify that the machine meets the minimum requirements described in the document Hardware and Software Requirements. Ask the system administrator to check that the message was not due to:
- Unavailable resources on the machine
- Competing jobs on the same database
If the problem persists, contact Micro Focus SupportLine.

9.3.5. Analysis Status Done (not parsed)

This error appears after submission of an Analysis Job and means that the analysis of the objects has not been completed due to a parser fault.

How to solve the problem: Contact Micro Focus SupportLine.

9.3.6. Sources with Status To Do (Extended) after Analysis

Some sources still have a status of To Do (Extended) after execution of an analysis job. Possible reasons for this are:

I. Messages about:
- Wrong extensions for the type of source.
- Unsupported syntax lines in the source.

In this case, the following steps have to be performed:
1. Correct the extension for the type of source.
2. Correct the syntax in the sources.
3. Run the Inventory Deletion Job without checking the Delete Structure checkbox, to delete all objects, with the detailed information about them, from the Enterprise View Repository and Database tables for the selected package.
4. Run the Inventory Probe Job or the Unsupported Syntax Job (advisable).
5. Run the Inventory Job for the package.
6. Run the Analysis Job for the package.

II. The Analysis Job does not run for some components
If the analysis is not executed at package level, some components are skipped by the analysis. In this case, run the Analysis Job for the components which were never analyzed, taking into account the order of the analysis process as described in the chapter Introduction. If the analysis has not been performed in the right order, rerun the analysis following the instructions in the chapter When to Rerun the Analysis Job.

III. The Analysis Job has stopped for some reason
Rerun the analysis following the instructions in the chapter When to Rerun the Analysis Job.

9.4. When to Rerun the Analysis Job

The Analysis Job must be rerun in the following situations:

I. When there are errors during the execution of this job
The reasons for these errors are:
- Sources with analysis status Done (with warnings)
- Sources with analysis status Done (with errors)
- Sources with analysis status Done (timeout)
- Sources with analysis status Done (not parsed)
- Sources analyzed in the wrong order
- Sources with unsupported syntax
- Other errors (not enough space for writing, database problems, or the Analysis Job has stopped for some reason)

The following steps have to be performed:
1. Run the Inventory Deletion Job without checking the Delete Structure checkbox, to delete all objects, with the detailed information about them, from the Enterprise View Repository and Database tables for the selected package, without deleting the structure of the organization view.

2. Correct the errors.
3. Run the Inventory Job for the same package again.
4. Run the Analysis Job for the same package. If the analysis is not executed at package level, you must maintain the right order for the analysis of the components.

II. When there are unresolved objects (missing sources)
If, for some reason, unresolved objects other than the loaders or system utility programs still exist, you need to add them to the package. There are two ways to add missing sources to the package and rerun the Analysis Job:

A) By executing the Inventory Deletion Job and checking the Delete Structure checkbox
The following steps have to be performed:
1. Run the Inventory Deletion Job and check the Delete Structure checkbox to delete all objects, with the detailed information about them, from the Enterprise View Repository and Database tables for the selected package, together with the structure of the organization view.
2. Transfer the missing sources to the new or existing folders and check the file extensions.
3. Define the Technical Organization View, including the new components.
4. Run the Inventory Probe Job and check the results (advisable).
5. Run the Inventory Job and check the results.
6. Run the Analysis Job and check the results.

B) By executing the Inventory Deletion Job without checking the Delete Structure checkbox
The following steps have to be performed:
1. Run the Inventory Deletion Job without checking the Delete Structure checkbox to delete all objects, with the detailed information about them, from the Enterprise View Repository and Database tables for the selected package, except the structure of the organization view.
2. Transfer the missing sources to the existing folders only and check the file extensions.
3. Run the Inventory Probe Job and check the results (advisable).
4. Run the Inventory Job and check the results.
5. Run the Analysis Job and check the results.
Note: After executing the Analysis Job, the Inventory Job can be rerun only after executing the Inventory Deletion Job for the selected organization view.

10. Starting the Life Cycle Job

10.1. Introduction

The Life Cycle Job is performed after successfully finishing the first part of the analysis: the inventory and analysis phases for the selected organization view. It can be repeated in cases of errors, unresolved objects (missing sources), or wrong extensions. This job can be executed at package level only.

The Life Cycle Job tracks whether there are changes in the application (for example, new sources, deleted sources or sources with changed program code) and updates the information in the Enterprise View Repository and Database tables. This job performs inventory and analysis functions:
- The files are compared file by file.
- The program code in the sources is compared and the changes are updated.
- All new files are written to the Repository and Database tables.
- The information about the files with changed program code and about deleted files is updated.
- Analysis of all the (new, changed and unchanged) sources is executed; the metrics and relations are updated.

In order to keep the analysis results (attributes, relations, metrics) up to date when the sources in the applications change frequently, this job can be scheduled for daily execution using the Enterprise View Agent.

10.2. How to Run the Life Cycle Job

There are two ways of running the Life Cycle Job:

1. Using the Work with Jobs menu
Click on the Work with Jobs icon and select the New button. Select Life Cycle from the dropdown window and press the OK button. If the Job Creation Wizard menu appears, go to the chapter Starting Host Jobs.

Select OK from the dropdown menu.

2. Using the Organization View Tree
Select the package in the organization view tree, right-click, and select the Job Submission item and Submit Life Cycle Job from the dropdown menu. If the Job Creation Wizard menu appears, go to the chapter Starting Host Jobs.

In both cases, the Work with Jobs window will appear with information about the selected package.

Start a Multidaemon session on the server machine. Press the OK or Apply button on the Work with Jobs window to submit the job.

10.3. Understanding Life Cycle Status Results

The results of the Life Cycle Job can be viewed in Enterprise View Builder and Enterprise View Tech. The Status field in the tabsheets for every class of object (programs, copy/includes, JCLs, transactions, maps, etc.) shows the changes after execution of the Life Cycle Job. The status can be:
- New - a new object has been added in the Enterprise View Repository and Database tables.
- Changed - the program code in the existing object has been changed.
- Unchanged - the program code in an existing object has not been changed.

In the next picture the user can see how the status code is changed after execution of the Life Cycle Job.

There is one changed source and three new sources in the application.
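The file-by-file comparison behind the New/Changed/Unchanged statuses can be sketched as follows. This is illustrative only; the real job also updates the Repository and Database tables, while here the two snapshots are simple name-to-hash maps.

```python
# Illustrative sketch of Life Cycle-style change detection: compare a previous
# and a current snapshot (file name -> content hash) and assign each file a
# status of New, Changed, Unchanged, or Deleted.
import hashlib

def content_hash(text):
    return hashlib.sha256(text.encode("utf-8")).hexdigest()

def life_cycle_status(previous, current):
    """Return {name: status} for every file in either snapshot."""
    statuses = {}
    for name, digest in current.items():
        if name not in previous:
            statuses[name] = "New"
        elif previous[name] != digest:
            statuses[name] = "Changed"
        else:
            statuses[name] = "Unchanged"
    for name in previous:
        if name not in current:
            statuses[name] = "Deleted"
    return statuses
```

Scheduling such a comparison daily, as the Introduction suggests for the Enterprise View Agent, keeps the change-traffic picture current without a full manual rerun.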

11. Starting the Unsupported Syntax Job

11.1. Introduction

The Unsupported Syntax Job must be started after execution of the Analysis Job. It can be started at every level of the technical organization view: package, application, application version or component. All objects in the selected organization view with unsupported syntax are recorded in the UnknownSyntax.txt file in the Job directory.

Unsupported syntax is syntax that is wrong or not supported by Enterprise View. You can see the programming languages that Enterprise View supports in the Environment Settings of the Enterprise View Configuration Manager. If there are objects with unsupported syntax, the information about these objects is written to the Enterprise View Database.

11.2. How to Run the Unsupported Syntax Job

Run this job by following the same steps as for running the Life Cycle Job, but select Unsupported Syntax instead.

11.3. Understanding Unsupported Syntax Results

A list of all objects that contain unsupported syntax lines can be viewed using the Unsupported Syntax tabsheet of the Enterprise View Builder module. The following information is reported:
- Name - the name of the object that contains unsupported syntax.
- Location - the number of the line with unsupported syntax.
- Unsupported Syntax Line - the text identified as unsupported syntax.

The number of the line with unsupported syntax for every object appears when you click the object containing the syntax in the Enterprise View Builder module. The possible reasons can be:

Wrong extensions for the type of source
How to solve the problem: Correct the extensions and rerun the job.

Wrong syntax in source code
How to solve the problem: Correct the syntax and rerun the job.

Binary, non-COBOL characters or other unspecified text in the source
How to solve the problem: Check whether the source code contains characters that are not allowed in COBOL source names, data set names, PATH names, COPY statements, arithmetic operations and relational operations. For more information about the allowed characters, see the table below.

COBOL allowed characters with SBCS (single-byte character set)

COBOL Program Element    Allowed Characters
Source Name              When the name is a user-defined WORD: A-Z, a-z, 0-9 and hyphen.
                         When the name is a LITERAL: A-Z, a-z, 0-9, @, #, and $.
Data Set Name (DSN)      Unqualified DSNames: A-Z, a-z, 0-9 or $, #, @, -, +0.
                         Qualified DSNames: multiple names joined by periods (.).
                         Temporary DSNames: must start with an ampersand (&).
PATH Name                A-Z, a-z, 0-9, /, or all printable characters from X'40' through X'FE'.
COPY statement           A-Z, a-z, 0-9, and hyphen.
Arithmetic Operations    + - / * **
Relational Operations    < > = <= >=
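The user-defined-WORD rule in the table above can be checked mechanically. The following is a minimal illustrative sketch, not part of Enterprise View; the no-leading/trailing-hyphen restriction is taken from standard COBOL rather than from the table, and the standard's further rules (at least one letter, at most 30 characters) are omitted for brevity:

```python
import re

# User-defined COBOL words: letters, digits and hyphens, where a hyphen
# may not start or end the word (standard COBOL rule; the table above
# lists only the allowed character set).
WORD_RE = re.compile(r'^[A-Za-z0-9]([A-Za-z0-9-]*[A-Za-z0-9])?$')

def is_valid_cobol_word(name: str) -> bool:
    """Return True if name uses only characters allowed for a user-defined word."""
    return bool(WORD_RE.match(name))
```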

12. Starting the Inventory Deletion Job

12.1. Introduction

This job deletes objects assigned to organization views, their relations, and their metrics. Depending on the Delete Structure flag, it can also delete the organization view structure. The Inventory Deletion Job can be started at every level of the technical organization view: package, application, application version or component. This job performs the following operations:

If the Delete Structure box is not checked, it deletes all objects and the detailed information about them from the Enterprise View Database tables and Repository, but not the structure of the organization view.
If the Delete Structure box is checked, it deletes all objects and the detailed information about them from the Enterprise View Database tables and Repository, and also deletes the structure of the selected level of the organization view: package, application, application version or component.

12.2. How to run the Inventory Deletion Job

This job can be run using the Organization View Tree. In newer Enterprise View versions it can also be started using the Work with Jobs menu. Select the desired level in the Organization View Tree (package, application, application version or component), right-click, and select Job Submission, then Submit Inventory Deletion Job. If the Job Creation Wizard menu appears, go to the chapter Starting Host Jobs. Otherwise the following window appears:

Check the Delete Structure box, if required.
Start a Multidaemon session on the server machine.
Click the OK or Apply button to submit the job.

WARNING: We recommend that you do not execute an Inventory Deletion Job during the second phase of the analysis process (the Life Cycle process) when the Trend box was checked during the analysis; otherwise the trend information (the previous status and the movement of the metrics of the objects) will be lost.

13. Starting the HTML Export Job

This job can be used for exporting organization view structures, with some statistics, in HTML format. HTML export can be performed at every level of the technical organization view: package, application, application version or component. The job can be started using the Work with Jobs menu or the Documentation Export of Solutions module. By checking the Refresh Existing HTML Export box, you can see for which organization views, or for which of their levels in the package, HTML documentation export already exists. If you start a job for a package that already has exported HTML documentation, only the information for updated objects is added to the existing export. Three general types of information are reported:

Overview information
Object list information
Detailed object information

If you open the index.html file, you will see detailed information about all linked objects, their attributes, and their relationships. Depending on certain options, it is also possible to export object source code.


14. Starting the End Daemon Job

The End Daemon Job is used to halt execution of a started Multidaemon. This job performs the following operations:

Checks whether a job is running in a UNIX session: for example an Inventory Job, Analysis Job, Inventory Deletion Job, Life Cycle Job, and so on.
Sends a notification to the Multidaemon to stop when the current job has finished its work. If no job is running, the End Daemon Job stops the Multidaemon immediately.

This job can be started only by using the Job Creation Wizard menus.


15. Analyzing COBOL Sources

15.1. Introduction

COBOL applications can run on different platforms (Windows, mainframe) and in different environments. Sources from z/OS or MVS mainframe systems can run in different environments: BATCH, CICS, or IMS. They can contain statements for reading, writing, or updating database tables or different types of data sets. In the next paragraphs, we examine in detail the source types necessary to analyze applications running in different environments (batch, CICS, and IMS) with programs written in COBOL.

15.2. Sources Containing File and Database Definitions

15.2.1. Introduction

The following languages are used to describe the definitions of system files and databases:

1. Data Definition Language (DDL) for the DB2 database.
2. Database Description Language (DBD) for the DL/I database.
3. VSAM Delete/Define language for VSAM files.

During analysis of the sources containing DB2, DL/I, or VSAM Delete/Define definitions, the parsers scan the sources and load into the Enterprise View Repository information about the file and database definitions as well as the following measures:

Lines of Code - LOC
Source Lines of Code - SLOC
Comment Lines of Code - CLOC
Change Traffic

Note: The analysis of the database definitions must be carried out as the first step of the analysis (when the analysis is not at package level); otherwise Enterprise View creates numerous unresolved objects in the subsequent phases of the analysis (i.e. during the analysis of the programs and include books).

15.2.2. DB2 Database

DB2 is an enhanced relational database server solution running on mainframe and Windows platforms. A database is an organized collection of data. The Data Definition Language is used to describe the components of the DB2 database (tables, views, indexes, aliases, etc.).
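The LOC/SLOC/CLOC measures introduced in section 15.2.1 can be sketched with a small counter. This is an assumption-laden illustration, not Enterprise View's algorithm: it assumes fixed-format COBOL-style sources, and the counting definitions in the comments are my own, since the guide does not define them:

```python
def count_lines(source: str):
    """Return (loc, sloc, cloc) for a fixed-format COBOL-style source.

    Assumed definitions (not taken from the guide):
      LOC  - every physical line
      CLOC - lines with '*' or '/' in the indicator area (column 7)
      SLOC - remaining lines that are not blank
    """
    loc = sloc = cloc = 0
    for line in source.splitlines():
        loc += 1
        if len(line) >= 7 and line[6] in "*/":
            cloc += 1       # comment line
        elif line.strip():
            sloc += 1       # non-blank source line
    return loc, sloc, cloc
```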

Data Definition Language (DDL)

You can use the extensions .ddldb2 or .DDLDB2 for sources with DDL statements, predefined in the Enterprise View Knowledge database, in order for the sources to be properly analyzed and handled by the parser. The analysis type of the DDL files is Extended. The Enterprise View parser extracts and loads into the Repository the database definitions defined in DDL sources: tables, indexes, views, synonyms. Where possible, a list of the fields in each file is loaded. The following database components described with the Data Definition Language (DDL) are analyzed by the parser:

Create Table
Create Alias
Create View
Create Index
Create Synonym

Note: If a view is defined in the DDL source, but the source describing the table related to the view is not, the product stores in the Enterprise View Repository not only the view but also the table, as a physical file. In this case, the list of the fields of the table will not be available.

15.2.3. DL/I Database

DL/I is a hierarchical database running on the OS/390 platform. The Database Description Language (DBD) is used to describe the components of the DL/I database: physical files, program specification blocks (PSB), and program communication blocks (PCB).

Database Description Language (DBD)

You can use the extensions .dbd or .DBD for sources with DBD statements, predefined in the Enterprise View Knowledge database, in order for the sources to be properly analyzed and handled by the parser. The analysis type of the DBD files is Extended. DBD analysis creates:

The DBD list
The SEGMENT list
The SEGMENT hierarchy

Enterprise View classifies all DBD definitions as physical files. It does not take into account the ACCESS clause: LOGICAL, PHYSICAL, INDEX, GSAM (Generalized Sequential Access Method). The table below presents the CBLTDLI base configuration for function codes that generate a Program-Segment relationship.
Enterprise View retrieves the SEGMENT name from the last CALL parameter only when the parameter's length is greater than 8 bytes and the number of parameters is greater than 3.

Function Code   Number of Parameters   Length of the Last Parameter   Relation Type
DLET            >= 3                   > 8 bytes                      Delete
FLD             >= 3                   > 8 bytes                      Read
GHN             >= 3                   > 8 bytes                      Read
GHNP            >= 3                   > 8 bytes                      Read
GHU             >= 3                   > 8 bytes                      Read
GN              >= 3                   > 8 bytes                      Read
GNP             >= 3                   > 8 bytes                      Read
GU              >= 3                   > 8 bytes                      Read
ISRT            >= 3                   > 8 bytes                      Insert
POS             >= 3                   > 8 bytes                      Read
REPL            >= 3                   > 8 bytes                      Update

Limitation: Multiple Segment Search Arguments (SSAs)
When multiple segment search arguments (SSAs) are used for a call, Enterprise View creates the Program-Segment relationship only for the last SSA used in the call.

Homonymous Segment Names
Program analysis does not take into account segments having the same name and defined in more than one DBD. This limitation is due to the fact that Enterprise View does not handle PSB definitions.

Counter Parameter
Enterprise View sets as the DL/I function code the first parameter of the CBLTDLI call.

The table below shows the CBLTDLI base configuration for function codes that generate a relationship between a program and a generic object. The configuration, as for DL/I calls for database management, takes into account the number of parameters and the length of the last one.

Function Code   Number of Parameters   Length of the Last Parameter   Relation Type
AUTH            = 3                    Not relevant                   Read
CHNG            = 3                    Not relevant                   Read
CMD             = 3                    Not relevant                   Read
DLET            >= 4                   8 bytes                        Read
FLD             >= 4                   8 bytes                        Read
GCMD            = 3                    Not relevant                   Read
GHN             >= 4                   8 bytes                        Read
GHNP            >= 4                   8 bytes                        Read
GHU             >= 4                   8 bytes                        Read
GN              >= 4                   8 bytes                        Read

GNP             >= 4                   8 bytes                        Read
GU              >= 4                   8 bytes                        Read
ISRT            = 3                    Not relevant                   Insert
ISRT            >= 4                   8 bytes                        Insert
POS             >= 4                   8 bytes                        Read
PURG            = 3                    Not relevant                   Read
PURG            >= 4                   8 bytes                        Read
REPL            >= 4                   8 bytes                        Read
SETO            = 3                    Not relevant                   Read

For additional information about the analysis of DL/I sources, see the DL/I Process Guide.

15.2.4. VSAM Delete/Define File Statements

A VSAM file is known as a cluster. VSAM clusters are managed by a utility called IDCAMS. VSAM (Virtual Storage Access Method) allows:

sequential access to files, either by a key or in sequence
direct access by key, relative byte address, or relative record number

VSAM Delete/Define statements

IDCAMS utility control statements are used to describe VSAM clusters. You can use the extensions .ddef or .DDEF for sources with VSAM Delete/Define file statements, predefined in the Enterprise View Knowledge database, in order for the sources to be properly analyzed and handled by the parser. The analysis type of the VSAM Delete/Define files is Extended. The Enterprise View parser extracts and loads into the Enterprise View Repository VSAM cluster and path definitions as physical files, and VSAM alternate index definitions as logical files.

15.3. COBOL Copy Books

Copy books are external sources containing program code. COPY statements are used in COBOL programs to incorporate external source text into the source programs at specified points. Copy books can contain:

COBOL program code
SQL declare sections
SQL statements

You can use the extensions .cpy or .CPY for copy book sources, predefined in the Enterprise View Knowledge database, in order for the sources to be properly analyzed and handled by the parser. You do not have to analyze copy books in order to analyze the programs, because the program analysis includes the pre-processing of all copy books called by the

programs. Therefore, if an unresolved object is a copy book, it means that this copy book is missing or that it has been loaded with the wrong extension. When the program analysis starts, all includes are expanded in the programs at the place where the COPY statement is met. Therefore there is no difference between the statements belonging to the program and the statements belonging to the copy books. In the client modules (i.e. Enterprise View Tech), all relationships retrieved in the program analysis are displayed as belonging to the program. The statement number indicated will be that of the original statement where the copy book is included. The analysis type of the COBOL copy books is Basic. The Enterprise View parser scans copy books and loads into the Repository the following measures:

Lines of Code - LOC
Source Lines of Code - SLOC
Comment Lines of Code - CLOC
Change Traffic

15.4. COBOL Sources with SQL Statements

COBOL programs and copy books can contain embedded SQL statements. Every embedded SQL statement begins with the phrase EXEC SQL, followed by the SQL statement with its specific parameters, and ends with the END-EXEC statement. For example:

EXEC SQL
    SELECT * FROM SYSADM.EMP
    WHERE EMPNO = :TEMP-EMPNO
END-EXEC.

During the analysis, when the Enterprise View parser meets SQL statements, it starts the SQL analyzer, which retrieves the type of the relations between the programs and the objects found in the SQL statements. The SQL statements that are analyzed, and the corresponding relation types, are presented in the following table.

SQL Statement                  Relation Type   Related Object
Allocate cursor                Other           *UNKNOWN
Call                           Call            Procedure name
Close (of a static cursor)     Close           Table/View of cursor
Close (of a dynamic cursor)    Close           *UNKNOWN
Declare cursor                 Read            Table/View of select
Declare statement              Other           *UNKNOWN
Declare table                  Uses            Table/View
Delete                         Delete          Table/View to be deleted
Execute                        Other           *UNKNOWN
Execute immediate              Other           *UNKNOWN

Fetch (of a static cursor)     Read            Table/View of cursor
Fetch (of a dynamic cursor)    Read            *UNKNOWN
Include                        Uses            SQL source to be included
Insert                         Insert          Table/View to be inserted
Lock table                     Other           Table
Open (of a static cursor)      Open            Table/View of cursor
Open (of a dynamic cursor)     Open            *UNKNOWN
Prepare                        Other           *UNKNOWN
Select into                    Read            Table/View of select
Set current SQLID              Other           Special Register
Update                         Update          Table/View to be updated

Dynamic statements have *UNKNOWN as the related object. For these statements, the Enterprise View parser manages the string specified in the dynamic statement in order to identify the actual objects. The statement number that indicates the line of source that generated a relationship (i.e. a program-file relationship) is always the statement number at which the EXEC SQL statement begins in the original source. For SQL statements present in copy books, the reported statement number is the number of the include in the original source copy book.

15.5. Dynamic Link in COBOL Sources

The Enterprise View parser analyzes COBOL CALL statements. The object that calls another object is the calling object (source or parent object), while the object identified in the CALL statement is the called object (destination or child object). A static call is a typical call to an internal function, procedure or macro procedure. A dynamic call is usually a call to an external program. Programs often contain statements that cannot be interpreted easily by the standard parser; they require the customer who knows the application to define the rules to be followed for correct interpretation of the instructions. Examples of these scenarios are:

Use of a program or routine, generically called a dispatcher, to recall other programs. In general it is not meaningful to record the dispatcher as the recalled program; what matters is the name of the effective program.
Programs that handle I/O.
Dynamic calls, that is, calls to programs by means of a variable. In the simpler cases the values are assigned to the variables in working storage or by means of a simple MOVE, but there are situations in which the name of the called program is constructed on the basis of rules that depend on the logical flow of the program.

For a clearer presentation of the relationships between dynamically called objects, you can create dynamic link definitions before the execution of an analysis job. There are two types of custom dynamic link definitions: basic and extended.

The Enterprise View Analyzer uses these definitions and creates proposed or confirmed dynamic links. If there are no custom dynamic link definitions, the Enterprise View Analyzer creates (where possible) proposed dynamic links using its own logic. You can manage and change the status of proposed dynamic links to Confirmed or Not Handled in the Enterprise View Tech client solution, using the Dynamic Links function. When CALL CBLTDLI statements are analyzed, the Enterprise View parser creates dynamic links and relationships between called and calling objects based on the parameters passed in the statement. In the next example, four parameters are used. The first one (GU) contains the name of the DL/I function; the other parameters are specific DL/I control blocks:

CALL 'CBLTDLI' USING GU, SIS-PCB, STID-SEGMENT, SSA-STID.

There are some considerations regarding DL/I calls:

Enterprise View classifies the DL/I call type according to the total number of parameters of the CALL and the value of the first of these parameters, which represents the DL/I function code.
Enterprise View ignores calls that do not clearly specify the SEGMENT name, in order to avoid a manual intervention to confirm all the proposed dynamic calls.

Detailed information about all dynamic references retrieved by the Enterprise View parser can be seen in the Enterprise View Tech client solution, as shown in the next picture. The call between objects (in particular, program-program or transaction-program) is:

Standard - when the link is established through fixed names.
Dynamic Link - when the link is established by means of variables.

For further details about how to use the Dynamic Link function, see the Dynamic Link Process Guide.
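The CBLTDLI base configuration described in section 15.2.3 is naturally table-driven. The following is a hypothetical sketch of the Program-Segment case only, with the rules transcribed from the first base-configuration table; it uses the textual length of the last parameter as a stand-in for the data item's byte length, which the real parser would take from the data definitions:

```python
# Relation generated when a CBLTDLI call has at least 3 parameters and the
# last parameter (the SSA carrying the SEGMENT name) is longer than 8 bytes.
SEGMENT_RELATION = {
    "DLET": "Delete", "FLD": "Read", "GHN": "Read", "GHNP": "Read",
    "GHU": "Read", "GN": "Read", "GNP": "Read", "GU": "Read",
    "ISRT": "Insert", "POS": "Read", "REPL": "Update",
}

def classify_cbltdli(params):
    """Return the Program-Segment relation type for a CBLTDLI call, or None.

    params: list of parameter names/values; params[0] is the DL/I function code.
    """
    if len(params) >= 3 and len(params[-1]) > 8:
        return SEGMENT_RELATION.get(params[0])
    return None
```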


16. Batch Environment

16.1. Introduction

In this chapter, we examine the sources that are analyzed in the batch environment. Batch processing is the execution of a series of programs (jobs) on a computer without human intervention. Job Control Language (JCL) is fundamentally important for batch processing; it is used as the starting point for the calls to the programs and procedures. Enterprise View determines not only the relationships between a batch job and the programs executed in its steps, but also the link between the physical name (DSNAME) and the logical name (DDNAME) of the files used by the programs in the JCL chain. Analysis of the JCL is therefore a prerequisite for analysis of the programs.

16.2. Batch Jobs (JCL) and Catalogued Procedures

16.2.1. Introduction

During analysis of the sources containing JCL statements, pre-processing of the JCL is performed to gather all the information necessary for the Enterprise View parser. The information collected in this phase covers:

Catalogued procedures
Cards
Path list of the catalogued procedures
Path list of cards
Loaders

Catalogued procedures contain JCL statements. They should be available in the package, because they are expanded in the JCL sources; otherwise Enterprise View loads them as unresolved objects. Card files are names of partitioned data set (PDS or extended PDSE) members. They should be available in the package, because they are found in JCL DD control statements; otherwise Enterprise View loads them as unresolved objects. The path lists for catalogued procedures and cards, as well as the loader names, can be described in a special configuration file named JclConfig.cfg. This file is located in the Job directory on the server machine. You must provide these last three pieces of information. The configuration file is divided into three main sections:

INCLUDE PATHLIST
This section specifies the path list to be used as a search list for inclusions.
It contains the name of the directory where catalogued procedures are found on the Enterprise View server machine.

CARD PATHLIST
This section specifies the path list (the name of the directory on the Enterprise View server machine) to be used as a search list for cards.

LOADER LIST
This section contains a list of all loaders used in the organization view.
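The three bracketed sections can be read with a very small parser. This is an illustrative sketch only (the function name is mine, and it assumes the convention, stated later in this chapter, that the beginning of the next section marks the end of the previous one); parsing of the [LOADER LIST] entries is omitted:

```python
def read_pathlists(text: str):
    """Collect the path entries under [INCLUDE PATHLIST] and [CARD PATHLIST].

    Each section runs until the next bracketed heading; '#' lines are comments.
    """
    sections = {"INCLUDE PATHLIST": [], "CARD PATHLIST": []}
    current = None
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue                       # skip blanks and comments
        if line.startswith("[") and line.endswith("]"):
            current = line[1:-1]           # a new section begins here
        elif current in sections:
            sections[current].append(line)
    return sections
```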

The loaders are system utilities or database interface programs whose source is never available. For this reason they cannot be available in the package; Enterprise View would normally load them as unresolved objects, and the link between the batch jobs and the programs called by the loaders would be lost. To avoid this problem, you can describe the loader names in the configuration file as shown below. The beginning of the next section marks the end of the previous section. The order of the sections is not important.

# this section specifies the pathlist to be used as a search list for inclusions
[INCLUDE PATHLIST]
/home/mv/proc/kbb.assist.proc

# this section specifies the pathlist to be used as a search list for cards
[CARD PATHLIST]
/home/mv/proc/kbb.assist.proc
/home/mv/proc/kbb.ra3.proc

# loader list: it contains a list of all interesting loaders
[LOADER LIST]
[LOADER]
NAME = "IKJEFT01"
TYPE = DB2
[LOADER]
# all possible names are separated by comma
NAME = "DFSRRC00","DSNMTV01"
TYPE = IMSDB2
[LOADER]
NAME = "IDCAMS"
TYPE = IDC

There are different classes of loaders:

DB2 - DB2 loaders
IMS - IMS loaders
IDC - IDCAMS utility loaders
IMSDB2 - IMS loaders for programs performing DB2 activities
DL/I - loaders for programs performing DL/I activities
SYSTEM UTILITIES - z/OS or MVS system utilities

The configuration file JclConfig.cfg (preloaded with the product) contains the names of some common loaders, such as IKJEFT01 or IDCAMS. Before proceeding with the analysis phase, you should verify with someone who knows the application whether the environment uses synonyms of the standard loaders. If there are such programs, you must add them to the configuration file. The Enterprise View parser scans the batch job sources and the catalogued procedures and loads into the Repository the following information:

Sequential file names
GDG file names
Temporary file names
VSAM file names
JCL - catalogued procedures (call procedures) relationships
JCL - cards (include cards) relationships
JCL - programs (call programs) relationships
Called programs - files relationships

16.2.2. Physical Sequential Files

A physical sequential file is a data set with sequential text organization. In the JCL DD statement these files have the parameter DSORG=PS. If, during the analysis of the JCL sources, Enterprise View encounters descriptions of physical sequential (PS) files, these files are recorded as physical files in the Enterprise View Repository. Relationships between these files and the COBOL programs that use them are created based on the DSNAME/DDNAME parameters in the JCL DD statement. The DDNAME in the DD statement must be the same as the name of the file described in the FILE-CONTROL paragraph of the Input-Output file definition section (Environment Division) of the COBOL programs. The analysis type of the physical sequential files is Disabled. The status of the sequential files after the analysis phase can be viewed on the Files tabsheet of the Enterprise View Builder or Enterprise View Tech solutions.

16.2.3. GDG (Generation Data Group) File Definitions

A generation data set is one of a collection of successive, historically related, cataloged data sets, known as a generation data group (GDG). The system keeps track of each data set in a generation data group as it is created, so that new data sets can be chronologically ordered and old ones easily retrieved. When creating a generation data set, the relative generation number (version number) tells the system whether this is the first data set being added during the job, or the second, or third, and so on. The following example shows a JCL DD statement describing a GDG data set with relative generation number +1.
//DFSUDUMP DD DSN=B072.BACK01.DIARSUBQ(+1),DISP=OLD

If, during the JCL analysis, Enterprise View finds a file for which the version is specified, this file is stored as a GDG.

Limitation: If two different batch jobs refer to the same GDG file, but the first one specifies the version and the second one does not, the first will be loaded as a GDG file, while the second will be loaded as a sequential file.

The analysis type of the GDG files is Disabled. The status of the GDG files after the inventory or analysis phase can be viewed on the Files tabsheet of the Enterprise View Builder or Enterprise View Tech solutions.
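The distinction drawn above, GDG versus plain sequential file, comes down to whether the DSN carries a relative generation number in parentheses. A hypothetical sketch of that check (Enterprise View's actual parser is not documented at this level of detail; a member-name reference such as LIB(MEMBER) is deliberately not matched, since it is not numeric):

```python
import re

# DSN=NAME(+1), NAME(0), NAME(-3): a signed or zero relative generation
# number in parentheses marks the data set as a GDG reference.
GDG_RE = re.compile(r"DSN=\S+?\(([+-]?\d+)\)", re.IGNORECASE)

def classify_dd(statement: str) -> str:
    """Classify a JCL DD statement as 'GDG' or 'sequential' (sketch)."""
    return "GDG" if GDG_RE.search(statement) else "sequential"
```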

16.2.4. Catalogued Procedures

Procedures are JCL skeletons and provide a macro capability for JCL. They are used to execute the same set of job control statements repeatedly, with little or no change. A catalogued procedure is a named set of job control statements. You can use the extensions .proc or .PROC for such sources, predefined in the Enterprise View Knowledge database, in order for the sources to be properly analyzed and handled by the parser. Analysis of the catalogued procedures is not a prerequisite for analysis of the JCL sources, because the Enterprise View analysis includes the pre-processing of all procedures included by the JCL. If catalogued procedures appear as unresolved objects, it means that these procedures are missing or that they have been loaded with the wrong extensions. The analysis type of the catalogued procedures is Extended. The Enterprise View parser scans catalogued procedures and loads into the Enterprise View Repository the following information:

Lines of Code - LOC
Source Lines of Code - SLOC
Comment Lines of Code - CLOC
Change Traffic
Procedures - called programs relationships
JCLs - catalogued procedures relationships

The status of the catalogued procedures after the inventory or analysis phase can be viewed on the JCLs tabsheet of the Enterprise View Builder or Enterprise View Tech solution.

16.2.5. Cards

A card, according to the Enterprise View Knowledge Base extension classification, is the name of a member of a partitioned data set (PDS or PDSE). The member name is recorded in parentheses after the name of the partitioned data set in the DSNAME parameter of the JCL DD statement. The member can contain program code, text, or input statements for SYSIN DD statements. For example:

//SYSIN DD DSN=Z005.SYSINLIB(B072RE2Q),DISP=SHR

A file named B072RE2Q has to exist in the sources.
You can use the extensions .card or .CARD for such sources, predefined in the Enterprise View Knowledge database, in order for the sources to be properly analyzed and handled by the parser. Analysis of card files is not a prerequisite for analysis of the JCL sources, because the Enterprise View analysis includes the pre-processing of all the cards included by the JCL. Therefore, if unresolved objects are card files, it means that they are missing or that they have been loaded with the wrong extensions. The analysis type of the JCL cards is Basic. The Enterprise View parser scans DD statements with member names included in the DSNAME parameter and loads into the Enterprise View Repository the following measures:

Lines of Code - LOC

Source Lines of Code - SLOC
Comment Lines of Code - CLOC
Change Traffic

The status of the JCL sources containing cards after the inventory or analysis phase can be viewed on the JCLs tabsheet of the Enterprise View Builder or Enterprise View Tech solutions.

16.2.6. JCLs

JCL is a low-level language whose instructions are known as job control statements. The job control statements are interpreted rather than compiled. MVS users use JCL to define work requests (called jobs) and to submit these jobs to the system. You can use the extensions .jcl or .JCL for JCL sources, predefined in the Enterprise View Knowledge database, in order for the sources to be properly analyzed and handled by the parser. The analysis type of the JCLs is Extended. When the JCL analysis starts, all the cards and procedures are expanded in the JCL source. Therefore there is no difference between a statement belonging to the JCL and a statement belonging to the procedures. In the client modules (i.e. Enterprise View Tech), all the relationships retrieved in the JCL analysis are displayed as related to the JCL source, even if the statement that gives the relationship originally belongs to a catalogued procedure or a card. If a relationship originates from a statement that belongs to a procedure or card, the related statement number shown in the Enterprise View client solutions will be that of the JCL statement where the card or procedure has been included. The status of the JCL sources after the inventory or analysis phase can be viewed on the JCLs tabsheet of the Enterprise View Builder or Enterprise View Tech solutions.

16.2.7. COBOL Batch Programs

A COBOL program becomes active when it is called by another program. This calling program is usually the operating system, although it could be another program.
You can use the extensions .cbl, .cblbat, .CBL or .CBLBAT for COBOL sources, predefined in the Enterprise View Knowledge database, in order for the sources to be properly analyzed and handled by the parser. The analysis type of the COBOL programs is Extended. At the end of the analysis, the following information is gathered:

Lines of Code - LOC
Source Lines of Code - SLOC
Comment Lines of Code - CLOC
Change Traffic
JCLs - programs relationships
Programs - copy books relationships
Programs - files relationships
Programs - called/caller programs relationships

This information is available in the References and Browse Object functions of the Enterprise View Tech module. It is important to point out that you can use these functions to identify the line in the source code where the relationship is generated. The relationship types and related objects in COBOL batch programs are shown in the following table:

Relation Type   Related Object
Callproc        Procedure name
Read            File name
Declare         File name
Delete          File name
Include         Copy book
Insert          File name
Open            File name
Update          File name
Lock            File name
Close           File name
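As an illustration of the Include relation in the table above, a sketch that records Program-Copybook relationships together with the line that generates them. This is a simplified, hypothetical reading of COPY (REPLACING clauses, quoted literals, and continuation lines are ignored), not Enterprise View's parser:

```python
import re

# A COPY statement names the copy book to incorporate at this point.
COPY_RE = re.compile(r"\bCOPY\s+([A-Za-z0-9][A-Za-z0-9-]*)", re.IGNORECASE)

def copy_relationships(source: str):
    """Return (line_number, copybook_name) pairs for each COPY statement."""
    rels = []
    for lineno, line in enumerate(source.splitlines(), start=1):
        # Skip fixed-format comment lines (indicator '*' in column 7).
        if len(line) >= 7 and line[6] == "*":
            continue
        for name in COPY_RE.findall(line):
            rels.append((lineno, name.upper()))
    return rels
```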

17. CICS Environment

17.1. Introduction

Customer Information Control System (CICS) is a system for online processing of information. The applications are organized in menus. All CICS online resources (transactions, programs, files, maps, terminals, connections) are defined and configured in advance in the CICS environment using special CICS transactions. Information about these resources can be retrieved using the CICS system utility DFHCSDUP. This utility produces a CSD file in text format. The CSD file contains all the defined CICS online resources: files, transactions, programs, maps, connections, etc. Enterprise View performs two different processes on the CSD file. The first is performed by the Enterprise View MVS Agent on a z/OS or MVS system: the agent processes the sequential file obtained from the CICS utility DFHCSDUP and transforms it into an XML file. The second is performed on the server machine by Enterprise View: the product processes the XML file to retrieve all the information about the defined CICS resources. Enterprise View analyzes the CICS System Definition file and describes the relationships between the objects listed in the CSD file and the application.

17.2. CICS Online Resource Definition File

CICS System Definition (CSD) File

You can use the extensions .csd or .CSD for CSD sources, predefined in the Enterprise View Knowledge database, in order for the sources to be properly analyzed and handled by the parser.

Note: It is important to point out that Enterprise View does not directly handle the output from the CICS system utility DFHCSDUP. The Enterprise View Agent transforms the output from the DFHCSDUP utility to XML format in the mainframe environment.

The analysis type of the CICS CSD files is Extended. After the CICS CSD file has been transferred to the Enterprise View server machine and analyzed, the following information is extracted:

Transaction names included in the CSD file
Program names included in the CSD file
Transaction - program relationships

Possible problems:

Transactions not included in the package are loaded as unresolved objects.
Programs not included in the package are loaded as unresolved objects.

17.3. CICS Map Definitions

A map is a definition of the size, shape, position, potential content, and properties of BMS map sets, maps, and fields within maps, by means of macros. A map set is a definition of a set of BMS maps.

Basic Mapping Support (BMS)

You can use the extensions .map, .mapbms, .MAP, or .MAPBMS for map sources, predefined in the Enterprise View Knowledge database, so that the sources are properly analyzed and handled by the parser. There is no specific parser for these objects.

The analysis type of the Map definitions is Disabled. No metrics are calculated for these objects; they are included in the Enterprise View Repository for inventory purposes only. The analysis type of the BMS Map definitions is Extended, and metrics are calculated for these objects.

If, during the analysis of a CICS program, Enterprise View does not find the source of a map definition called by the program, the missing map definition is stored as an unresolved object.

17.4. CICS COBOL Transactions and Programs

The COBOL transactions and programs running in the CICS environment contain CICS statements. You can use the extensions .cblcic or .CBLCIC for CICS COBOL sources, predefined in the Enterprise View Knowledge database, so that the sources are properly analyzed and handled by the parser. The analysis type of the CICS COBOL transactions and programs is Extended.

Every embedded CICS statement begins with the phrase EXEC CICS, followed by the CICS statement with its specific keywords, and ends with the END-EXEC statement. For example:

    EXEC CICS START
        TRANSID('AC01') FROM(ACCTDTLO)
        LENGTH(DTL-LNG) TERMID(PRTRC) RESP(RESPONSE)
    END-EXEC.

During the analysis, when the parser meets CICS statements, it starts the CICS analyzer, which retrieves the type of relationship between the programs and the objects found in the CICS statements.
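Enterprise View's CICS analyzer is proprietary, but the scanning step it performs can be illustrated with a small sketch: locate each EXEC CICS ... END-EXEC block and pull out the command verb and the operands that name related objects. The operand keywords (TRANSID, PROGRAM, FILE, MAP, QUEUE) are standard CICS syntax; everything else below (function names, the regular expressions themselves) is an illustrative assumption, not the product's implementation.

```python
import re

# Match EXEC CICS ... END-EXEC blocks, which may span several source lines.
EXEC_CICS = re.compile(r"EXEC\s+CICS\s+(.*?)\s*END-EXEC",
                       re.DOTALL | re.IGNORECASE)
# Operands whose values name a related object, e.g. TRANSID('AC01').
OPERAND = re.compile(r"(TRANSID|PROGRAM|FILE|MAP|QUEUE)\s*\(\s*'?([A-Z0-9-]+)'?\s*\)",
                     re.IGNORECASE)

def scan_cics(source: str):
    """Yield (command, {operand: value}) for each embedded CICS statement."""
    results = []
    for m in EXEC_CICS.finditer(source):
        body = m.group(1)
        command = body.split()[0].upper()      # first word, e.g. START, LINK, READ
        operands = {k.upper(): v for k, v in OPERAND.findall(body)}
        results.append((command, operands))
    return results

src = """
    EXEC CICS START
        TRANSID('AC01') FROM(ACCTDTLO)
        LENGTH(DTL-LNG) TERMID(PRTRC) RESP(RESPONSE)
    END-EXEC.
"""
print(scan_cics(src))   # [('START', {'TRANSID': 'AC01'})]
```

A real COBOL parser would track statement numbers and copy book expansion as well; this sketch only shows how the command verb and object names can be recovered from the statement text.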
At the end of the analysis phase, the gathered information provides the following relationships:

Transactions - programs
Programs - copy books
Programs - called/caller programs
Programs - maps
Programs - transactions
Programs - queues

This information is available through the References and Browse Object functions of the Enterprise View Tech solution. You can use these functions to identify the line in the source program that generates the relationship. The statement number that indicates the line of source generating a relationship (for example, a program-file relationship) is always the statement number at which the EXEC CICS statement begins in the original source. For CICS statements in copy books, the reported statement number is the statement number of the copy book inclusion in the original source.

The relationship types and related objects in CICS COBOL programs are shown in the following table:

Relationship Type    Related Object
Call                 Program
Read                 File Name
Insert               File Name
Update               File Name
Send                 Map Name
Receive              Map Name
Call Trans           Transaction
DeleteQ              Queue
ReadQ                Queue
WriteQ               Queue
CancTrans            Transaction
Unlock               File Name
CICdelete            File Name
CIChandle            Program
CIClink              Program
CICread              File Name
CICreadNx            File Name
CICreadPr            File Name
CICreceive           Map Name
CICresetbr           File Name
CICreturn            Transaction
CICrewrite           File Name
CICsend              Map Name
CICstart             Transaction
CICstartbr           File Name
CICwrite             File Name
CICxctl              Program
CICendbr             File Name
CICunlock            File Name
CICcancel            Transaction
CICdeleteQTD         Queue
CICdeleteQTS         Queue
CICIssueAdd          File Name
CICload              Program
CICreadQTD           Queue
CICreadQTS           Queue
CICrelease           Program
CICwriteQTD          Queue
CICwriteQTS          Queue
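The relationship-type table above is essentially a lookup from relationship type to the kind of object it points at. A minimal sketch of that lookup, reproducing only a few rows of the table (the dictionary and function names are illustrative, not part of the product):

```python
# Partial lookup taken from the relationship-type table above: each
# relationship type maps to the kind of related object it records.
RELATED_OBJECT = {
    "Call":       "Program",
    "Read":       "File Name",
    "Send":       "Map Name",
    "Call Trans": "Transaction",
    "WriteQ":     "Queue",
    "CIClink":    "Program",
    "CICxctl":    "Program",
    "CICstart":   "Transaction",
    "CICsend":    "Map Name",
    "CICreadQTS": "Queue",
}

def related_kind(rel_type: str) -> str:
    """Return the kind of object a relationship type refers to."""
    return RELATED_OBJECT.get(rel_type, "Unknown")

print(related_kind("CICxctl"))  # Program
```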

18. IMS Environment

18.1. Introduction

Information Management System (IMS) is any of several system environments that provide a database manager and transaction processing; it is capable of managing complex databases and terminal networks. The Stage 1 file contains the resources of the application (transactions and programs) that can be used in this environment, and is fundamentally important for the IMS environment.

18.2. IMS Resource Definitions

IMS Stage 1

You can use the extensions .ims or .IMS for IMS Stage 1 sources, predefined in the Enterprise View Knowledge database, so that the sources are properly analyzed and handled by the parser. The analysis type of the IMS Stage 1 files is Extended.

The parser extracts the following information from the Stage 1 file:

Transaction names
Program names

If the analysis finds transactions and programs not included in the package, they are loaded as unresolved objects.

18.3. IMS MFS Definitions

In IMS, messages or screen formats are stored in the IMS library and called into the MFS buffer pool when needed for online execution. MFS definitions are formatted for presentation to an application program or terminal.

Message Format Service (MFS)

You can use the extensions .mapmfs or .MAPMFS for MFS sources, predefined in the Enterprise View Knowledge database, so that the sources are properly analyzed and handled by the parser. There is no specific parser for these objects.

The analysis type of the MFS definitions is Disabled. They are included in the Enterprise View Repository for inventory purposes only. If, during the analysis of IMS programs, Enterprise View does not find the MFS definitions called by the programs, these map definitions are stored as unresolved objects.
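The name extraction from the Stage 1 file described in section 18.2 can be sketched as follows. In an IMS Stage 1 system definition, programs are declared with APPLCTN macros (the PSB= operand names the program/PSB) and transactions with TRANSACT macros (the CODE= operand names the transaction); those macro names are standard IMS syntax, but the regular expressions and function below are an illustrative simplification that ignores continuation lines and most operands.

```python
import re

# Extract program names (APPLCTN PSB=...) and transaction codes
# (TRANSACT CODE=...) from an IMS Stage 1 system definition deck.
APPLCTN = re.compile(r"^\s*APPLCTN\s+.*?PSB=([A-Z0-9$#@]+)", re.MULTILINE)
TRANSACT = re.compile(r"^\s*TRANSACT\s+.*?CODE=([A-Z0-9$#@]+)", re.MULTILINE)

def stage1_resources(text: str) -> dict:
    """Return the program and transaction names found in a Stage 1 file."""
    return {"programs": APPLCTN.findall(text),
            "transactions": TRANSACT.findall(text)}

stage1 = """\
         APPLCTN  PSB=ACCTPGM,PGMTYPE=TP
         TRANSACT CODE=ACCT,PRTY=(7,10,2)
"""
print(stage1_resources(stage1))
# {'programs': ['ACCTPGM'], 'transactions': ['ACCT']}
```

Names found here but not matched by any source in the package would then be loaded as unresolved objects, as described above.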

18.4. IMS COBOL Transactions and Programs

When DL/I calls in a COBOL program are supported by IMS, the DL/I language interface must be called to initiate the functions specified by the DL/I calls. IMS offers several interfaces for DL/I calls; the one analyzed by Enterprise View is the language-specific interface for the COBOL language, CBLTDLI. It has the following format:

    CALL 'CBLTDLI' USING function, db pcb, i/o area, ssa.

You can use the extensions .cblims or .CBLIMS for IMS COBOL sources, predefined in the Enterprise View Knowledge database, so that the sources are properly analyzed and handled by the parser. The analysis type of the IMS COBOL transactions and programs is Extended.

At the end of the analysis, the following relationships can be viewed:

Programs - include books
Programs - called/caller programs
Programs - maps

This information is available in the Enterprise View Tech solution; you can use the References and Browse Object functions to identify the line in the source code that generates the relationship.

The relationship types and corresponding related objects in COBOL DL/I programs are shown in the following table:

Relationship Type    Related Object
Ddelete              File Name
Dgnext               File Name
DgnextP              File Name
Dgunique             File Name
Dwrite               File Name
Drewrite             File Name
Dposition            File Name
Dschedule            File Name
Dload                File Name

For additional information about the analysis of IMS sources, see the IMS Process Guide.
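The classification of CBLTDLI calls can be illustrated with a sketch. The DL/I function codes (GU, GN, GNP, ISRT, REPL, DLET) are standard IMS; the correspondence shown between those codes and the relationship types in the table above is a plausible assumption for illustration, not taken from the product documentation, and the names below are hypothetical.

```python
# Assumed correspondence between DL/I function codes passed to CBLTDLI
# and relationship types from the table above (illustrative only).
DLI_RELATIONSHIP = {
    "GU":   "Dgunique",   # Get Unique
    "GN":   "Dgnext",     # Get Next
    "GNP":  "DgnextP",    # Get Next within Parent
    "ISRT": "Dwrite",     # Insert
    "REPL": "Drewrite",   # Replace
    "DLET": "Ddelete",    # Delete
}

def classify_dli_call(function_code: str) -> str:
    """Return the relationship type recorded for a CBLTDLI call.

    DL/I function codes are 4-character fields, so trailing blanks
    are stripped before the lookup.
    """
    return DLI_RELATIONSHIP.get(function_code.strip().upper(), "Unknown")

print(classify_dli_call("GU  "))  # Dgunique
```

A real analyzer must also resolve the `function` argument of the CALL statement back to its VALUE clause in the DATA DIVISION, which is where the detailed analysis levels described in the next chapter come into play.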

19. COBOL Statements Analysis

19.1. COBOL Language General Considerations

A COBOL source program is a set of COBOL statements grouped in paragraphs and sections. All the statements, entries, paragraphs, and sections in a COBOL source program are grouped into four divisions:

Identification Division
Environment Division
Data Division
Procedure Division

Enterprise View provides two additional levels of analysis for the COBOL language, which can be activated from the Work with Environment Properties functionality in the Configuration Manager module. These parameters can be set only by ADMIN users, at product installation or in the early stages of product configuration. The additional analysis levels handle field attributes in a more detailed way (Detailed Analysis) and allow the tracking of the statements in the PROCEDURE DIVISION (Statement Analysis).

The choice of analysis level must be made when the product is configured and cannot be changed without carefully examining the phases already performed, because mixed analysis levels can compromise the consistency of the data in the repository. Also take into account that these more detailed levels of analysis take more expansion time than the standard analysis and increase the disk space required to store the additional information. We therefore recommend that you activate detailed analysis only when its additional functionalities are necessary.

19.2. COBOL Detailed Analysis

The statement analysis tracks all the information about the statements in the PROCEDURE DIVISION of COBOL sources. The loaded information covers:

Verbs
Keywords
Variables and constants

The detailed analysis gathers additional information about:

Field memory mapping
Attribute-to-attribute and attribute-to-object relationships

Detailed analysis is mandatory for using the Enterprise View Impact Analysis solution, because that solution reports attributes in a structured way, manages them as a starting point, and manages attribute-to-attribute and attribute-to-object relationships as browsable steps for impact analysis. Detailed analysis also enables additional functionalities in the Enterprise View Tech solution:

Display of the Memory Map tab sheet.
Display of the attributes related to a relationship.

You can use the Enterprise View Tech solution to see the memory map of all the structures declared in the source: the structure of the program storage area is shown in detail, with its elements and their attributes, together with the attributes related to object relationships.
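The memory-map idea behind this view can be sketched in a few lines: compute the byte offset and length of each elementary item in a flat COBOL record. The sketch below assumes DISPLAY usage only, where PIC X(n) and PIC 9(n) each occupy n bytes; a real layout engine, like the one in the product, must also handle COMP fields, OCCURS, REDEFINES, SYNC alignment, and nested group levels. All names here are illustrative.

```python
import re

# Match elementary items such as "05 CUST-NAME PIC X(30)." and capture
# the level number, field name, and declared size.
PIC = re.compile(r"(\d+)\s+(\S+)\s+PIC\s+[X9]\((\d+)\)", re.IGNORECASE)

def memory_map(record: str):
    """Return (name, byte offset, byte length) for each elementary item,
    assuming DISPLAY usage so every declared character occupies one byte."""
    offset = 0
    layout = []
    for _level, name, size in PIC.findall(record):
        layout.append((name, offset, int(size)))
        offset += int(size)
    return layout

record = """
    05 CUST-ID    PIC 9(6).
    05 CUST-NAME  PIC X(30).
    05 CUST-CITY  PIC X(20).
"""
for name, off, length in memory_map(record):
    print(f"{name:10} offset={off:3} length={length}")
```

Run on the sample record, this places CUST-ID at offset 0, CUST-NAME at offset 6, and CUST-CITY at offset 36, which is the kind of information the Memory Map tab sheet presents for every declared structure.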