Control-M as an Application Management Platform, 10/2015. Vedran Vesel, Imaves, Control-M expert.
Control-M 9: unmatched application workflow automation, lowering TCO, and increasing application deployment speed. Gartner Magic Quadrant for Workload Automation.
BMC Control-M Workload Automation, recent releases: Single Point of Control, Service Level Management, Self Service & Mobility, Audit & Compliance, Application Integrator, Application Hub, Archiving, Workload Change Manager, JCL Verify, Mobile App, Big Data / Hadoop, Databases (Hana, Teradata, more), Oracle Retail, SAP, IBM Cognos, Backup, Workload Conversion & Discovery, Java, Messaging & Web Services. (BMC Confidential, subject to change.)
The Only Constant is Change
The Legacy Application Lifecycle: an application delivers zero value through DESIGN, DEVELOP, TEST, and DEPLOY, and begins delivering value only once it reaches RUN.
The Workload Change Request Lifecycle: frustration at every manual handoff, addressed by BMC Control-M Workload Change Manager.
Workload Change Manager: Functionality
- Easy, intuitive web interface for requesting or modifying batch processes
- Available items and features (job types, actions, etc.) can be configured per end user
- Enforcement of site standards and naming conventions
- Status follow-up for each request, with optional email notification
- Tracks and audits all changes
- Integration with external workflow and change management systems
Workload Change Manager: Benefits
- Standardize the workload change request process: avoid mistakes that can cause problems in business services, enforce predefined standards and naming conventions, and keep the entire process audited
- Accelerate the go-live of new business processes: replace a manual, slow, error-prone process; let developers and other users build batch workflows easily; notify requesters immediately of status changes
- Save time for both the users requesting the change and the administrators, thanks to an intuitive user interface, full integration into the administrator console, and automatic updates to the change management system, avoiding the extra time needed to update it manually
Customer case: Itaú Unibanco accelerates its change process by 80%
- 400K daily jobs; over a million defined in total
- About 60,000 change requests monthly
- 15 schedulers dedicated to making changes
- Current process: help desk ticket, email, phone
- Expects an 80% reduction in tickets and turnaround time
- Addresses compliance, audit, and security gaps
BIG DATA
Big data in aviation:
- A Boeing 787 generates an average of 500 GB of system data per flight; an Airbus A380 is fitted with as many as 25,000 sensors capturing 8,000+ data points per second.
- Airlines are looking to emerging technologies, such as Hadoop and sophisticated data-mining algorithms, to capture unstructured data.
- Operations: bag tracking, ticketing, and scheduling data; captured sensor data to optimize maintenance and minimize downtime; weather forecasting to optimize fuel loads; flight-path optimization and trajectory corrections; flap and wingtip data for improved fuel economy.
- Customer loyalty: a 360-degree view of the customer, with 150+ variables about each customer, to enhance customer value and cultivate high-value customers.
- Engine data: GE jet engines collect information at 5,000 data points per second. At 20 TB per engine per hour, 2 engines per aircraft, 6 hours for one cross-country flight, 28,537 commercial flights per day in the U.S., and 365 days per year: 20 x 2 x 6 x 28,537 x 365 = 2,499,841,200 TB.
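The engine-data estimate can be verified with plain shell arithmetic, using the figures from the slide (20 TB per engine per hour, 2 engines, 6-hour flight, 28,537 daily U.S. commercial flights, 365 days):

```shell
#!/usr/bin/env bash
# Rough yearly data volume from jet engines on U.S. commercial flights,
# using the per-engine figures quoted on the slide.
TB_PER_ENGINE_HOUR=20      # info per engine per hour
ENGINES_PER_FLIGHT=2
HOURS_PER_FLIGHT=6         # one cross-country flight
FLIGHTS_PER_DAY=28537      # commercial flights per day in the U.S.
DAYS_PER_YEAR=365

TOTAL_TB=$((TB_PER_ENGINE_HOUR * ENGINES_PER_FLIGHT * HOURS_PER_FLIGHT * FLIGHTS_PER_DAY * DAYS_PER_YEAR))
echo "${TOTAL_TB} TB per year"
```

This prints 2499841200 TB per year, matching the slide's total (roughly 2.5 zettabytes).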
Hadoop vs. traditional processing.
Big Data: The Batch Challenge
- Nearly 100% of processing in Hadoop is batch
- Scheduling tools available for Hadoop are open-source tools such as Oozie, Spring, and Azkaban
- These batch scheduling tools are version 1 at best
- Batch jobs are hand-written scripts
- The tools are used for scheduling jobs, not for creating and managing workflows
Programmers program. Each integration point gets its own hand-written script.

SQL query (a SQL*Plus report script; the string-concatenation operators, lost in the slide, are restored):

set pagesize 0 linesize 80 feedback off
SELECT 'The database ' || instance_name || ' has been running since ' ||
       to_char(startup_time, 'HH24:MI MM/DD/YYYY')
  FROM v$instance;
SELECT 'There are ' || count(status) || ' data files with a status of ' || status
  FROM dba_data_files
 GROUP BY status
 ORDER BY status;
SELECT 'The total storage used by the data files is ' || sum(bytes)/1024/1024 || ' MB'
  FROM dba_data_files;

Informatica (pmcmd script):

#!/usr/bin/bash
# Sample pmcmd script
# Check if the service is alive
pmcmd pingservice -sv testservice -d testdomain
if [ "$?" != "0" ]; then
    echo "Could not ping service"    # handle error
    exit 1
fi
# Get service properties
pmcmd getserviceproperties -sv testservice -d testdomain
if [ "$?" != "0" ]; then
    echo "Could not get service properties"
    exit 1
fi
# Get task details for session task "s_testsessiontask" of workflow
# "wf_test_workflow" in folder "testfolder"
pmcmd gettaskdetails -sv testservice -d testdomain -u Administrator -p adminpass \
     -folder testfolder -workflow wf_test_workflow s_testsessiontask
if [ "$?" != "0" ]; then
    echo "Could not get details for task s_testsessiontask"
    exit 1
fi

File transfer (ksh sftp script; the original embedded shell logic inside the sftp batch input, where it is not valid, so the existence check is moved before the transfer):

#!/bin/ksh
cd /home/bmcu1ser/ftp_race_source
# do not overwrite a file that has already been delivered
if [[ -f /home/bmcu1ser/ftp_race_target/daily_shipment_log ]]; then
    exit 1
fi
sftp -b /dev/stdin -o Cipher=blowfish -o Compression=yes -o BatchMode=yes \
     -o IdentityFile=/export/home/user/.ssh/id_rsa -o Port=22 \
     bmcus1ser@hou-hadoopmstr 1>sftp.log 2>&1 <<ENDSFTP
put daily_shipment_log /home/bmcu1ser/ftp_race_target
quit
ENDSFTP
rc=$?
if [[ $rc != 0 ]]; then
    print "***Error occurred...$rc" `date "+%Y-%m-%d-%H.%M.%S"`
    if [[ -f /home/bmcu1ser/ftp_race_target/daily_shipment_log ]]; then
        rm /home/bmcu1ser/ftp_race_target/daily_shipment_log
    fi
else
    mv /home/bmcu1ser/ftp_race_source/daily_shipment_log /home/bmcu1ser/ftp_race_source/old/daily_shipment_log
    print "***Successful transfer...$rc" `date "+%Y-%m-%d-%H.%M.%S"`
fi

Hadoop (cluster-validation script):

#!/usr/bin/env bash
bin=`dirname "$0"`
bin=`cd "$bin"; pwd`
. "$bin"/../libexec/hadoop-config.sh
# set the hadoop command and the path to the hadoop jar
HADOOP_CMD="${HADOOP_PREFIX}/bin/hadoop --config $HADOOP_CONF_DIR"
# find the hadoop jar
HADOOP_JAR=''
# find under HADOOP_PREFIX (tarball install)
HADOOP_JAR=`find ${HADOOP_PREFIX} -name 'hadoop-*.jar' | head -n1`
# if it is not found, look under /usr/share/hadoop (rpm/deb installs)
if [ "$HADOOP_JAR" == '' ]; then
    HADOOP_JAR=`find /usr/share/hadoop -name 'hadoop-*.jar' | head -n1`
fi
# if it is still empty, do not run the tests
if [ "$HADOOP_JAR" == '' ]; then
    echo "Did not find hadoop-*.jar under '${HADOOP_PREFIX}' or '/usr/share/hadoop'"
    exit 1
fi
# dir where to store the data on hdfs, relative to the user's home dir on hdfs
PARENT_DIR="validate_deploy_`date +%s`"
TERA_GEN_OUTPUT_DIR="${PARENT_DIR}/tera_gen_data"
TERA_SORT_OUTPUT_DIR="${PARENT_DIR}/tera_sort_data"
What happens when this runs? What is related to what? Are we on time or late? What if something fails? Which program was running? Where is the output? How do I fix it? Can I just rerun it, and if so, from the beginning? Does any cleanup have to be done? How do I track this problem and the steps taken to resolve it? (The slide repeats the Hadoop validation script from the previous page.)
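None of those questions have good answers when a workflow is a bare script. A minimal hardening pattern, which any scheduler or operator can at least act on, is to make every step log its output and report an explicit exit code. This is a hypothetical sketch, not Control-M functionality; the step commands and `LOG_DIR` location are placeholders:

```shell
#!/usr/bin/env bash
# Hypothetical wrapper: run one workflow step, capture its output to a log,
# and surface an unambiguous exit code for whatever runs the next step.
set -u

LOG_DIR="${LOG_DIR:-/tmp/wf_logs}"
mkdir -p "$LOG_DIR"

run_step() {
    step_name=$1; shift
    log_file="$LOG_DIR/${step_name}.$(date +%Y%m%d%H%M%S).log"
    echo "[$(date)] starting $step_name" >> "$log_file"
    "$@" >> "$log_file" 2>&1
    rc=$?
    echo "[$(date)] $step_name finished rc=$rc" >> "$log_file"
    if [ $rc -ne 0 ]; then
        # Fail loudly and point at the output, so "where is it?" has an answer.
        echo "step $step_name failed (rc=$rc), output in $log_file" >&2
        return $rc
    fi
    return 0
}

# Example chain: each step runs only if the previous one succeeded.
# "true" stands in for the real extract/transform/load commands.
run_step "extract" true &&
run_step "transform" true &&
run_step "load" true
```

Even this sketch leaves the questions of dependencies across scripts, SLAs, rerun-from-point-of-failure, and audit trails unanswered, which is the gap a workload automation tool fills.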
A better way: SQL Query, Informatica, File Transfer, and Hadoop, all orchestrated by BMC Control-M.
Defining Control-M for Hadoop jobs: Hadoop program parameters and HDFS commands.
A better way
And the fun is just beginning
Integration for All Applications
Applications. And More Applications. (2015 Workload Automation Survey): Teradata, Microsoft SQL Server Reporting Services (SSRS), Microsoft SQL Server Analysis Services (SSAS), SAS, MicroStrategy, Tibco, Splunk Enterprise, QlikView, Tableau, Pentaho, SPSS, Microsoft Access, Informix, SQLite, HP Vertica, Cassandra, MariaDB, Neo4j, Redis, Salesforce.com, Oracle JD Edwards EnterpriseOne, Workday, Concur, Infor Lawson, Oracle Flexcube (i-flex), Accenture Alnova, the TCS BaNCS (Tata Consultancy Services) core banking solution, Infosys Finacle, and others.
Control-M Application Integrator: build applications faster, reduce scripting effort, rapidly deploy new applications or application changes, and minimize risk and downtime in case of job failures.
Applications & Platforms
Native application plug-ins take advantage of all Control-M workload automation offerings:
- Predefined connection profiles holding secured credentials
- No copy/paste or manual typing of job attributes
- Automatic loading of input parameters, allowing the use of variable inputs
- Control over the level of detail included in job output for auto-recovery
- Advanced controls, including restart from the point of failure
Automation steps in a job:
- Execute: start the job in the application
- Progress: monitor job progress and determine completion
- Abort: terminate the application job interactively
- Output: retrieve the job output from the application
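These four steps are exactly what any hand-rolled integration has to script itself. A hypothetical shell skeleton of the pattern; `app_ctl` and every command shown are placeholders, not a real application CLI or Control-M API:

```shell
#!/usr/bin/env bash
# Hypothetical skeleton of the four automation steps for one application job.
# "app_ctl" stands in for whatever CLI the target application exposes.

execute() {  # Execute: start the job in the application and remember its id
    JOB_ID="job-$$"                 # placeholder for: app_ctl submit ...
}

progress() {  # Progress: poll until the job reaches a final state
    for _ in 1 2 3; do
        status="ENDED_OK"           # placeholder for: app_ctl status "$JOB_ID"
        [ "$status" = "RUNNING" ] || break
        sleep 1
    done
    echo "$status"
}

abort() {  # Abort: terminate the application job interactively
    echo "aborting $JOB_ID"         # placeholder for: app_ctl kill "$JOB_ID"
}

output() {  # Output: retrieve the job output from the application
    echo "output of $JOB_ID"        # placeholder for: app_ctl log "$JOB_ID"
}

execute
[ "$(progress)" = "ENDED_OK" ] && output
```

A plug-in built with Application Integrator fills in these four hooks once, so every job of that type gets monitoring, abort, and output collection without per-job scripting.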
New Type of Job in Control-M Job available in the Templates Manager
Application Integrator: a workload automation design tool that connects applications and processes so business services are delivered quickly and reliably to customers. Innovation made easy. Increase value. The power of many.
Advanced File Transfer
Control-M Control Module for Advanced File Transfer integrates file transfers into automated workflows:
- Manage FTP transfers securely
- Eliminate lag time between file transfers and subsequent processing steps
- Gain instant visibility into the status of your transfers
- Use automated recovery capabilities to improve reliability
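The "eliminate lag time" point is concrete: without transfer-aware scheduling, the downstream step either polls for the file or runs on a timer. A minimal sketch of the hand-rolled pattern this module replaces; a local `cp` into a temp directory stands in for the real FTP/SFTP transfer, and the file names are illustrative:

```shell
#!/usr/bin/env bash
# Hand-rolled transfer-then-process pattern: move a file, verify it arrived,
# and trigger the next processing step immediately on success.
set -u

SRC=$(mktemp)
DEST_DIR=$(mktemp -d)
echo "payload" > "$SRC"

# "Transfer" the file; in real life this would be an sftp/ftp command.
if cp "$SRC" "$DEST_DIR/"; then
    # No lag: the downstream step starts the moment the transfer completes.
    NEXT_INPUT="$DEST_DIR/$(basename "$SRC")"
    echo "transfer ok, processing $NEXT_INPUT"
else
    echo "transfer failed, downstream step not started" >&2
    exit 1
fi
```

An integrated transfer job gives the same guarantee (successor runs exactly when the transfer ends) plus the status visibility and automated recovery listed above.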
Control-M for SAP 8: optimize your SAP environment. XBP 3.0 certification; conversion. Coming next: SAP Process Integration (PI), Solution Manager, Financial Closing Cockpit (FCC).
Oracle Retail: quickly integrate Oracle Retail jobs into your workload automation environment. Supports Oracle Retail Merchandising System (RMS) and Oracle Retail Price Management (RPM), release 13.x and higher.
Executive Summary: Control-M 9
Why Control-M 9? The leading workload automation solution in the market, with a history of innovation and investment, ongoing customer engagement, and a record-breaking CXI.
Key elements: unmatched application workflow automation, lowering TCO, and increasing application deployment speed.
Release highlights: automated agent and client deployment, runtime analytics, high availability, and automated application workflow promotion between environments.
Our commitment: remaining focused on customers.
Thank you for your attention! Bring IT to Life.