DataFlux Data Management Server 2.6
DataFlux Data Management Server 2.6: Administrator's Guide
SAS Documentation
The correct bibliographic citation for this manual is as follows: SAS Institute Inc. 2014. DataFlux Data Management Server 2.6: Administrator's Guide. Cary, NC: SAS Institute Inc.

DataFlux Data Management Server 2.6: Administrator's Guide

Copyright 2014, SAS Institute Inc., Cary, NC, USA

All rights reserved. Produced in the United States of America.

For a hard-copy book: No part of this publication may be reproduced, stored in a retrieval system, or transmitted, in any form or by any means, electronic, mechanical, photocopying, or otherwise, without the prior written permission of the publisher, SAS Institute Inc.

For a web download or e-book: Your use of this publication shall be governed by the terms established by the vendor at the time you acquire this publication. The scanning, uploading, and distribution of this book via the Internet or any other means without the permission of the publisher is illegal and punishable by law. Please purchase only authorized electronic editions and do not participate in or encourage electronic piracy of copyrighted materials. Your support of others' rights is appreciated.

U.S. Government License Rights; Restricted Rights: The Software and its documentation is commercial computer software developed at private expense and is provided with RESTRICTED RIGHTS to the United States Government. Use, duplication or disclosure of the Software by the United States Government is subject to the license terms of this Agreement pursuant to, as applicable, FAR , DFAR (a), DFAR (a) and DFAR and, to the extent required under U.S. federal law, the minimum restricted rights as set out in FAR (DEC 2007). If FAR is applicable, this provision serves as notice under clause (c) thereof and no other notice is required to be affixed to the Software or documentation. The Government's rights in Software and documentation shall be only those set forth in this Agreement.
SAS Institute Inc., SAS Campus Drive, Cary, North Carolina. November 2014.

SAS provides a complete selection of books and electronic products to help customers use SAS software to its fullest potential. For more information about our offerings, visit support.sas.com/bookstore or call SAS.

SAS and all other SAS Institute Inc. product or service names are registered trademarks or trademarks of SAS Institute Inc. in the USA and other countries. ® indicates USA registration. Other brand and product names are trademarks of their respective companies.
Contents

What's New in DataFlux Data Management Server vii
Accessibility ix

Chapter 1 Introducing the DataFlux Data Management Server
    Introduction to the DataFlux Data Management Server
    How It Works
    Introducing the User Interface
    Introducing the Directories

Chapter 2 Configuring the DataFlux Data Management Server
    Configure and Migrate after Software Upgrades
    Configure Additional Software
    Set Directory Permissions
    Register a DataFlux Data Management Server
    Configure DataFlux Data Management Server to Run Studio Jobs and Services

Chapter 3 Managing Security
    Security Overview
    About Authentication
    About Authorization
    Configure a SAS Metadata Server for Security
    Configure Mid-Tier Options for Security
    Configure a DataFlux Authentication Server for Security
    Manage Permissions
    Control Access by IP Address
    Configure SSL and AES
    Encrypt Passwords for DSNs and SSL
    Troubleshoot Security Errors

Chapter 4 Administering the DataFlux Data Management Server
    Start or Stop the Server on Windows
    Start or Stop the Server on UNIX or Linux
    Troubleshoot Server Start or Restart
    Administer DataFlux Data Management Server Log Files
    Administer Data Service Log Files
    Administer Log Files for Batch and Profile Jobs
    Enable the SOAP Log
    Change Log Events and Thresholds
    Troubleshoot Server Start
    Troubleshoot ActiveX Error to Display Help

Chapter 5 Managing Data Connections
    Overview of Data Connections
    Configure the Data Access Component (DAC)
    Display Data Connections
    Create a Server Job That Uses a Driver and a Data Source
    Use the Windows ODBC Data Source Administrator
    Use dfdbconf and dfdbview for UNIX and Linux ODBC Connections
    Create a Domain-Enabled ODBC Connection
    Create a Custom Data Connection
    Create a SAS Connection
    Edit a Data Connection
    Delete a Data Connection
    Manage ODBC Credentials
    Troubleshoot ODBC Data Connections

Chapter 6 Managing Jobs, Services, and the Repository
    Overview of Jobs and Services
    Configure the SOAP and WLP Servers
    Configure the Server to Pre-load Services
    Browse Available Services and WSDLs
    Apply SOAP Commands and WSDL Options
    Define Macros
    Terminate Real-Time Services
    Manage the DFWSVC Process
    Manage the DFWFPROC Process
    Run Jobs with the dmpexec Command
    Configure Jobs and Services
    Collect Job Status Information with the SAS Job Monitor
    About the Repository
    Troubleshoot Jobs and Services
    Load Balance Service Requests
    Customize the Server's WSDL File
    Customize the WSDL File for Java
    Customize the WSDL File for C#

Chapter 7 Configuration Option Reference
    Configuration Options Overview
    Configuration Options Reference for dmserver.cfg

Recommended Reading
Glossary
Index
What's New in DataFlux Data Management Server

Overview

The primary enhancements for DataFlux Data Management Server include the following:

- Retain permissions when updating objects
- Refine the default access control list
- Specify a separate directory for batch and profile job logs
- Configure a log file for jobs run with the dmpexec command
- Enable the SOAP log for debugging
- Validate XML in output data from real-time services

Retain Permissions When Updating Objects

When you import objects that replace existing objects, you can now apply the permissions from the existing object to the new object. Simply select Retain permissions for replaced objects in the Import Location dialog box.

Refine the Default Access Control List

Objects, jobs, and services receive a default access control list (ACL) when they are added to the DataFlux Data Management Server. Prior to the 2.6 release, the default ACL received ALLOW or DENY permissions for the default groups PUBLIC and USERS. In the 2.6 release, the default ACL contains ALLOW or DENY permissions for lists of users and groups. To implement this change, the configuration options DMSERVER/SECURE/DEFAULT_ACE_PUBLIC and DMSERVER/SECURE/DEFAULT_ACE_USERS have been superseded by the following new configuration options:

DMSERVER/SECURE/DEFAULT_ACE_USERS_ALLOW
DMSERVER/SECURE/DEFAULT_ACE_USERS_DENY
DMSERVER/SECURE/DEFAULT_ACE_GROUPS_ALLOW
DMSERVER/SECURE/DEFAULT_ACE_GROUPS_DENY

The superseded options are deprecated but still recognized in the 2.6 release.

Specify a Separate Directory for Batch and Profile Job Logs

To make batch and profile job logs more accessible to administrators and to the SAS Job Monitor, you can now specify a separate storage location for those log files. Specify the path to the job log directory as the value of the configuration option DMSERVER/JOB_LOGS_DIR.

Configure a Log File for Jobs Run with the dmpexec Command

When you run jobs with the dmpexec command, you can now configure the log files that are generated by those job runs.

Enable the SOAP Log for Debugging

You can now enable the capture of log data for the SOAP packets that are received and transmitted by the DataFlux Data Management Server. This change accompanies the removal of support for the configuration option DMSERVER/SOAP/LOG_PACKETS. The removed configuration option is no longer recognized by the server.

Validate XML in Output Data from Real-Time Services

You can now set DMSERVER/SOAP/VALIDATE_XML_CHARS to YES to ensure that all output text strings from real-time data and process services contain only valid XML. Any control characters in the output text strings are replaced with question marks.
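Taken together, the new 2.6 options might appear in dmserver.cfg as in the following sketch. The directory path and the user and group names are hypothetical, and the value syntax shown for the DEFAULT_ACE options is an assumption for illustration; see the dmserver.cfg configuration option reference for the definitive syntax.

```
# Separate directory for batch and profile job logs (path is hypothetical)
DMSERVER/JOB_LOGS_DIR = C:\DMServer\logs\jobs

# Replace control characters in real-time service output with '?'
DMSERVER/SOAP/VALIDATE_XML_CHARS = YES

# Default ACL entries for newly added objects; the names and the
# list syntax here are illustrative assumptions
DMSERVER/SECURE/DEFAULT_ACE_USERS_ALLOW = dfuser1
DMSERVER/SECURE/DEFAULT_ACE_GROUPS_DENY = PUBLIC
```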
Accessibility

The DataFlux Data Management Server software includes features that improve usability for users with disabilities. The usability features are related to accessibility standards for electronic information technology that were adopted by the United States (U.S.) Government under Section 508 of the U.S. Rehabilitation Act of 1973, as amended. If you have questions or concerns about the accessibility of DataFlux products, please send an email to
Chapter 1
Introducing the DataFlux Data Management Server

Introduction to the DataFlux Data Management Server
    Overview
How It Works
Introducing the User Interface
    Overview
    About the Navigation Pane
    About the Information Pane
Introducing the Directories

Introduction to the DataFlux Data Management Server

Overview

The DataFlux Data Management Server provides consistent, accurate, and reliable access to data across a network by integrating real-time data quality, data integration, and data governance routines. With DataFlux Data Management Server, you can replicate your business rules for acceptable data across applications and systems, enabling you to build a single, unified view of your enterprise. The server implements business rules that you create in DataFlux Data Management Studio, in both batch and real-time environments. DataFlux Data Management Server enables pervasive data quality, data integration, process orchestration, and master data management (MDM) throughout your enterprise.

The Data Management Server provides a service-oriented architecture (SOA) application server that enables you to execute batch or profile jobs on a server-based platform, in Windows, Linux, or UNIX. By processing batch and profile jobs where the data resides, you avoid network bottlenecks and take advantage of performance features available with higher-performance computers.

In addition, the Data Management Server executes real-time data services and real-time process services. These services can be invoked by any web service application, such as SAP, Siebel, Tibco, or Oracle. You can convert your existing batch jobs to real-time services, to reuse the business logic that you developed for data migration or to load a data warehouse. You can apply your real-time services at the point of data entry to ensure consistent, accurate, and reliable data across your enterprise.
The following diagram shows how DataFlux Data Management Server connects to other servers and clients:

[Diagram: the Data Management Server, with its local Quality Knowledge Base, access controls, and reference files (postal, geocode, custom), connects to RDBMS and fixed/delimited files through ODBC, wire drivers, and native text importers; to customer applications and message queues over TCP/IP and SOAP over HTTP/HTTPS; and to DataFlux Data Management Studio, the Visual Process Orchestration web client, and SAS servers (network QKBs, Metadata Server, business rules, Authentication Server, Federation Server, Web Studio Server, Visual Process Orchestration Server) over SOAP, HTTPS, and IOM.]

The following diagram shows how the DataFlux Data Management Server connects into enterprise software solutions:
Also included with Data Management Server is the ability to make Application Programming Interface (API) calls to the same core data quality engine. Discrete API calls are available through native programmatic interfaces for data parsing, standardization, match key generation, address verification, geocoding, and other processes.

DataFlux Data Management Studio is the development, test, and administration client for the DataFlux Data Management Server. DataFlux Data Management Studio enables you to create, test, and upload batch jobs, profile jobs, real-time data services, and real-time process services. Production jobs can be run by individuals, clients, or by your scheduling application.

Your clients and web applications can access DataFlux Data Management Server through a SOAP interface. The Data Management Server also supports a WSDL (Web Services Description Language) client interface.

Security on the Data Management Server is implemented through external authentication and internal authorization. External security services are provided either by a SAS Metadata Server or a DataFlux Authentication Server. Both of the security servers authenticate users using network authentication providers. The security servers also maintain a database of users and groups. The DataFlux Data Management Server applies group membership information to its internal access control lists for data, jobs, commands, and services. Additional broad-brush security features grant or deny server access based on IP address and on membership in ALLOW, DENY, or administrative groups. Encryption is applied to all TCP/IP network communication and all stored
passwords. You can configure encryption to use private keys up to 256 bits in length. You can also configure SSL to protect client connections using HTTPS addresses.

The Data Management Server is provided as part of an increasing number of enterprise solutions, including SAS MDM and SAS Visual Process Orchestration. The following diagram shows how the DataFlux Data Management Server provides a job execution environment and an external client interface for SAS Visual Process Orchestration.

[Diagram: the SAS Data Management Console (SAS Visual Process Orchestration, SAS Job Monitor), enterprise applications, DataFlux Data Management Studio, and SAS Management Console connect over HTTPS and IOM to the SAS Web Application Server and the Visual Process Orchestration design and runtime servers; the DataFlux Data Management Server operates alongside the SAS Federation Server, DataFlux Authentication Server, SAS Workspace Server, SAS Metadata Server, and SAS Content Server to reach enterprise data.]

How It Works

The DataFlux Data Management Server is responsible not only for sending and receiving SOAP requests, but also for monitoring the progress of all registered data management services. Job status information is available in DataFlux Data Management Studio and, when configured, in the Job Monitor add-in to the SAS Environment Manager software.
On the Data Management Server, SOAP and WLP (web application logic) servers listen on separate ports. When the server receives a job run request, the server authenticates, authorizes, and sends the request to a threaded process. The process runs and executes the real-time data service, real-time process service, batch job, or profile job. When the job is complete, the server sends data to the client and the process is assigned to the next job run request. You can preload processes, spawn new processes, and enqueue job run requests as needed to customize server performance for the dynamics of your enterprise. The use of separate processes for job run requests enables robust error recovery and effective distribution of processing across multiple CPUs.

The Data Management Server handles the following processes:

- Client queries the server to return the names of available services. If the server receives a list services request, the server simply queries the services directory and returns the name of each found file.
- Return requested input/output fields for a specified service.
- Pass data and macros to a service, run the service, and receive output data and macros in return. When the server receives a service request, it identifies an idle service, sends data to the idle service, and listens for additional requests. If an idle service is not available, the server loads a new service into memory and passes the data or macros to the new service. The server monitors the service progress; as soon as the service returns output, the server sends the output back to the client application. If the service fails for any reason, the server terminates the service process and returns an error message to the calling application. After a service completes a request, both changed and unchanged data and macros are reset to their default values.
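As a rough illustration of the SOAP exchange described above, the sketch below assembles a minimal SOAP 1.1 request envelope for a real-time data service. The operation name, namespace, service name, and field names are all hypothetical; the real contract comes from the server's published WSDL, and the host and port in the commented curl line must match your deployment.

```shell
# Hedged sketch: write a minimal SOAP 1.1 request envelope for a
# hypothetical real-time data service. Every element name below is
# illustrative only; consult the server's WSDL for the real schema.
cat > request.xml <<'EOF'
<?xml version="1.0" encoding="UTF-8"?>
<soap:Envelope xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/">
  <soap:Body>
    <RunService xmlns="http://example.com/dmserver/dataservice">
      <ServiceName>verify_address.ddf</ServiceName>
      <Record><ADDRESS>100 Main St</ADDRESS></Record>
    </RunService>
  </soap:Body>
</soap:Envelope>
EOF
# A client would then POST the envelope to the server's SOAP port:
# curl -s -H "Content-Type: text/xml" --data @request.xml http://dmserver:port/
grep -c "ADDRESS" request.xml   # quick sanity check on the payload
```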
Introducing the User Interface

Overview

The user interface for DataFlux Data Management Server is provided by DataFlux Data Management Studio. To display the interface, open the Administration riser bar and click the DataFlux Data Management Servers riser bar. DataFlux Data Management Studio then displays a tree view of your Data Management Servers in the left-hand navigation pane. The right-hand information pane displays a list of server names.

About the Navigation Pane

The left-hand navigation pane provides a toolbar that contains the following icons:

Action Menu
    Used to create, edit, and delete a DataFlux Data Management Server. Here you can change the server's credentials and unload idle processes (that are not real-time data services processes).
New
    Used to register a new DataFlux Data Management Server, so that you can connect to it.
Import
    Enables you to import items from a repository.
Export
    Enables you to export the selected object to a repository.
Edit
    Enables you to edit the selected object.
Expand
    Enables you to expand all folders for the selected server.

In the tree view, you can expand a server to display information about the jobs and services that are available on that server. Right-click to connect.

About the Information Pane

The right-hand information pane provides a toolbar that contains the following icons:

New
    Enables you to register a new DataFlux Data Management Server.
Edit
    Enables you to edit the selected object. If this option is not available, you cannot edit the object.
Delete
    Enables you to delete the selected object.
Find
    Enables you to find objects in the information pane.

Introducing the Directories

The following table lists the directories that are created when you install Data Management Server.

Directory          Description
\bin               Contains the executable files for this platform.
\data              Contains files that include data information that is specific to this installation.
\data\install      Contains a collection of files pertinent to installation, such as templates and scripts.
\doc               Contains the documentation that is installed with the server.
\etc               Contains the configuration and license files.
\lib               Contains the library files for this platform.
\etc\dftkdsn       Contains the non-ODBC data connection configurations.
\etc\dsn           Contains the saved credential files for each data source name (DSN).
Directory          Description
\etc\license       By default, the location where the license files reside. The path to the license file is located in the etc\app.cfg file.
\etc\macros        Contains the .cfg files, which specify the macro key and value pairs. All files in this directory are loaded in alphabetical order.
\etc\repositories  Contains the sample repository configuration file, server.rcf. The repository configuration file defines the location of the repository file that is used by the server to run profile jobs.
\etc\security      Contains files that specify server commands and permissions for specific users and groups.
\share             Contains message files that are needed by the software. If the files are removed, the software will fail to run. The directory also contains a sample copy of the WSDL file, which is used by the DataFlux Data Management Server.
\var               Contains the log files from the running of the DataFlux Data Management Server, as well as job-specific logs.
\var\repositories  Contains the sample repository file, server.rps.
Chapter 2
Configuring the DataFlux Data Management Server

Configure and Migrate after Software Upgrades
Configure Additional Software
    Overview
    Address Update
    Configure the Quality Knowledge Base
    Configure DataPacks
Set Directory Permissions
Register a DataFlux Data Management Server
Configure DataFlux Data Management Server to Run Studio Jobs and Services

Configure and Migrate after Software Upgrades

Follow these steps to configure your DataFlux Data Management Server after software updates and to migrate security information from the previous version to the new version.

CAUTION: Do not remove or uninstall the previous version until after you migrate your security settings and objects. Make sure that all changes to users and groups are applied to the access control lists in the new version of the DataFlux Data Management Server.

For an overview of migration issues, see "Upgrading Existing Installations of the Software" in the DataFlux Data Management Studio: Installation and Configuration Guide. See also the SAS Guide to Software Updates.

1. Migrate your SAS Metadata Server (see the SAS Intelligence Platform: Migration Guide) or DataFlux Authentication Server (see the DataFlux Authentication Server: Administrator's Guide).
2. Run the SAS Deployment Wizard to install the software update for the DataFlux Data Management Server.
3. Run the SAS Deployment Wizard a second time to configure the new instance of the DataFlux Data Management Server.
4. Copy your jobs, services, ACLs, and WSDLs from the previous version of the server to the new version. Copy entire directories and subdirectories, such as C:\Program Files\DataFlux\DMServer\2.6\var\data_services.
5. Transfer the security configuration from the old server to the new server. Transfer options in dmserver.cfg and copy user and group permissions from C:\Program Files\DataFlux\DMServer\2.6\etc\security\.
6. Import profile jobs into the new repository.
7. Create user and group command permissions to match those of the previous version.
8. Create ACL permissions for jobs to match those of the previous version.
9. Test the new server.
10. Uninstall the old server if it is no longer in use.

Configure Additional Software

Overview

You can add data cleansing, data quality, and address verification applications to your Data Management Server so that job nodes can access the applications on the local host. These applications are available on the SAS support site, in the downloads and hot fixes section.

You can customize applications such as dfIntelliServer, Quality Knowledge Bases (QKB), Accelerators, and DataPacks to meet the needs of your service-oriented architecture. For information about installing dfIntelliServer, QKBs, and Accelerators, see the relevant software installation documentation. For information about installing and configuring the DataPacks for address verification, including USPS, Canada Post, and Geocode, see the DataFlux Data Management Studio Installation and Configuration Guide.

Address Update

About Address Update

The DataFlux Address Update add-on enables you to use the United States Postal Service (USPS) NCOALink system to identify and update address information in customer records.
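On UNIX, migration step 4 above might look like the following sketch. The version paths and directory names are hypothetical stand-ins (the demo creates temporary directories so it can run anywhere); on Windows an equivalent robocopy or xcopy command would be used.

```shell
# Hedged sketch of migration step 4 on UNIX: copy a service directory
# from the old server tree to the new one. The OLD and NEW paths are
# temporary stand-ins; substitute your real install-path directories.
OLD="${OLD:-$(mktemp -d)}"; NEW="${NEW:-$(mktemp -d)}"
mkdir -p "$OLD/var/data_services"        # stand-in for the old tree
touch "$OLD/var/data_services/svc.ddf"   # stand-in uploaded service
mkdir -p "$NEW/var"
# -R copies the whole directory tree, preserving subdirectories
cp -R "$OLD/var/data_services" "$NEW/var/"
ls "$NEW/var/data_services"
```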
For businesses and organizations with very large North America-based customer information databases, this essential feature maintains accurate and up-to-date address information for location-based marketing and direct mail marketing.

Address update jobs can be imported from DataFlux Data Management Studio to a DataFlux Data Management Server, where the jobs are executed. One approach is to run test jobs and small jobs on the DataFlux Data Management Studio client workstation, and to upload larger jobs to the DataFlux Data Management Server. Using this approach, both DataFlux Data Management Studio and DataFlux Data Management Server must be identically configured to execute address update jobs and reports.

The following information outlines the necessary tasks associated with deployment of Address Update on the DataFlux Data Management Server. For detailed information about installing and configuring Address Update, see the Address Update Add-On to
DataFlux Data Management Studio Quick Start Guide and the DataFlux Data Management Studio Help topic Using the Address Update Add-On with DataFlux Data Management Server.

Install Address Update

Follow these steps to install Address Update:

1. Before you install Address Update on your DataFlux Data Management Server, install, configure, and test Address Update on an instance of DataFlux Data Management Studio. When your configuration is established on DataFlux Data Management Studio, you can replicate that configuration on your DataFlux Data Management Server.
2. Install on the Data Management Server the same Quality Knowledge Base that is used on DataFlux Data Management Studio. Replicate all customizations.
3. Download and run the Address Update installer on the host of the DataFlux Data Management Server.
4. Install NCOALink data from the United States Postal Service (USPS). Follow the instructions in the Address Update Add-On to DataFlux Data Management Studio Quick Start Guide that is provided with the Address Update installer.
5. Install USPS test data (CASS, DPV, and LACS), as described in the Address Update Add-On to DataFlux Data Management Studio Quick Start Guide.

Configure DataFlux Data Management Server for Address Update

Follow these steps to configure Address Update:

1. If necessary, stop the DataFlux Data Management Server.
2. Open the configuration file install-path/etc/macros/ncoa.cfg.
3. Set the value of the option NCOA/DVDPATH to the installation path of the USPS NCOALink data.
4. Set the value of the option NCOA/QKBPATH to the installation path of the Quality Knowledge Base.
5. Set the value of the option NCOA/USPSPATH to the installation path of the USPS address verification data.
6. Review the default value of the option NCOA/DFAV_CACHE_SIZE. This option specifies the size of the cache that is used for address verification. The range of valid values is 0 to 100, and 0 is the default.
Increasing the value increases the amount of memory that is used and improves the performance of address verification.
7. Review the default value of the option NCOA/DFAV_PRELOAD. This option specifies the states and categories of addresses that you preload, to enhance the performance of address verification. Valid values for this option are defined as follows:
   "." No preload. This is the default value.
   "ALL" Preload all states.
   "MIL" Preload military addresses only.
   "CONUSA" Preload the 48 contiguous states.
   "TX FL" Preload Texas and Florida, or any list of two-letter state abbreviations.
8. Save and close the configuration file ncoa.cfg.
9. Open the configuration file install-path/etc/app.cfg.
10. Update the values of the following options according to the app.cfg file in Data Management Studio:
   NCOA/REPOSDSN specifies the DSN connection for the address update repository.
   NCOA/REPOSPREFIX specifies the table prefix for the tables in this repository, if a prefix has been specified.
   NCOA/REPOSTYPE specifies the type of repository. Valid values for this option are defined as follows:
      No type specified. The DataFlux Data Access Component attempts to determine the repository type from the connection string.
      Specifies the repository type ODBC DSN.
      Specifies the repository type Custom DSN.

Configure Jobs to Use Address Update

After you configure DataFlux Data Management Server to use Address Update, configure Data Management Server jobs to use Address Update. The following steps are described in detail in the DataFlux Data Management Studio Help topic Using the Address Update Add-On with DataFlux Data Management Server:

1. Create a separate Processing Acknowledgment Form (PAF) for the DataFlux Data Management Server if Data Management Studio and DataFlux Data Management Server are running on different operating systems.
2. Enable jobs on Data Management Server to access an address update repository.
3. Configure a DSN on the DataFlux Data Management Server that is identical to the DSN defined in the NCOA/REPOSDSN option in the app.cfg file. Users need to save credentials for this DSN.
4. Import your Address Update Lookup jobs from DataFlux Data Management Studio to the Batch Jobs folder on the DataFlux Data Management Server.

At this point, you are ready to run your Address Update Lookup jobs.
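The ncoa.cfg edits from steps 2 through 8 might end up looking like the following sketch. Every path shown is hypothetical, and the inline annotations are for this illustration only; they may not be valid comment syntax in an actual .cfg file.

```
# install-path/etc/macros/ncoa.cfg (all paths are hypothetical)
NCOA/DVDPATH = C:\NCOALink\data
NCOA/QKBPATH = C:\QKB\CI\2014A
NCOA/USPSPATH = C:\USPSData
NCOA/DFAV_CACHE_SIZE = 0
NCOA/DFAV_PRELOAD = .
```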
Configure the Quality Knowledge Base

If you add a QKB to your DataFlux Data Management Server, make sure that it is the same QKB that you installed on DataFlux Data Management Studio. To specify the location of the QKB, open the configuration file install-path/etc/app.cfg. For the qkb/path variable, remove the comment character and replace PATH with the full path to the QKB:

# qkb/path = PATH
# Location of the active Quality Knowledge Base.
#
# example: qkb/path = C:\QKB

Configure DataPacks

If you download DataPacks, open install-path/etc/app.cfg, remove comment characters, and update variable values as follows.

CASS (US Data, USPS)

# verify/usps = PATH
# Location of US address verification data.
#
# example: verify/usps = C:\USPSData

Geocode

# verify/geo = PATH
# Location of Geocode/Phone data.
#
# example: verify/geo = C:\GeoPhoneData

SERP (Canadian Data)

# verify/canada = PATH
# Location of Canadian address verification data.
#
# example: verify/canada = C:\CanadaPostData

World

World Address Verification requires you to enter an unlock code in addition to the path. The unlock code is supplied with the DataPack.

# verifyworld/db = PATH
# Location of World address verification data.
#
# example: verifyworld/db = C:\Platon
#
# verifyworld/unlk = UNLOCK_CODE
# Unlock code provided by DataFlux for unlocking the World address
# verification functionality.
#
# example: verifyworld/unlk = ABCDEFGHIJKLMNOPQRSTUVWXYZ

Set Directory Permissions

The following tables outline the recommended permissions for users of DataFlux Data Management Server. The default installation path under Windows is SASHome\product-instance-name. The default installation path under UNIX is SASHome/product-instance-name. In this document, the default installation path is indicated by the term install-path.
Table 2.1 Directory Permissions for Windows

Directory                   Users                                  Default Permissions
install-path\dmserver       Administrator, Installer               Full Control
                            Process user                           Read and Execute, List Folder Contents
install-path\dmserver\var   Installer                              Full Control
                            Process user                           Read, Write, List Folder Contents
                            The user who backs up the server,      Read, List Folder Contents
                            Backup Administrator

Table 2.2 Directory Permissions for UNIX and Linux

Directory                   Users                                  Default Permissions
install-path/dmserver       Installer                              Read, Write, Execute
                            Process user                           Read, Execute
install-path/dmserver/var   Installer                              Read, Write, Execute
                            Process user                           Read, Write, Execute
                            The user who backs up the server;      Read, Execute
                            Backup Administrator

Note: TMPDIR might have to be set in the event that the system's default temp directory (/TMP) runs out of space while running jobs or services. If this occurs, set the TMPDIR environment variable to a directory that is readable and writable by the run-time user.

Register a DataFlux Data Management Server

Follow these steps to register a DataFlux Data Management Server in DataFlux Data Management Studio:

1. In DataFlux Data Management Studio, click the DataFlux Data Management Servers riser bar.
2. Click New DataFlux Data Management Server on the toolbar. The Management Server dialog box appears.
3. In the Management Server dialog box:
   a. Specify a name for the server in the Name field.
   b. Enter a description for the server.
   c. Enter the server host name in the Server field.
   d. Keep the default port number in the Port field unless that port is already in use on that host.
   e. Enter the Domain Name for the associated Authentication Server.
4. Click Test Connection to verify that you can connect to the server. Click OK to close the Test dialog box.
5. Click OK to close the DataFlux Data Management Server dialog box. The new Data Management Server appears in the left navigation pane.

Configure DataFlux Data Management Server to Run Studio Jobs and Services

You create and test jobs and real-time services in DataFlux Data Management Studio. You then upload those jobs and services to a DataFlux Data Management Server. To run a new job or service, the configuration of the DataFlux Data Management Server needs to replicate the configuration of the Studio client. Certain job nodes require specific option settings. Other nodes require additional configuration on the server, such as the creation of a repository.

Because jobs are created and configured in Studio, the documentation for how to configure both the client and server is provided in the DataFlux Data Management Studio User's Guide and in the DataFlux Data Management Studio Installation and Configuration Guide. The configuration information refers to the Studio client, but the configuration process also needs to be applied to the DataFlux Data Management Server.

To run a particular job on the Data Management Server, you might need to complete the following tasks:

- Configure a repository.
- Configure the Java plug-in.
- Set configuration options in the Data Management Server app.cfg file. A listing of app.cfg options is provided in the DataFlux Data Management Studio User's Guide.
- Set configuration options in the DataFlux Data Management Server configuration file dmserver.cfg. For information about these configuration options, see Configuration Options Reference for dmserver.cfg.

In addition to transferring the Studio configuration to the DataFlux Data Management Server, you must also apply access controls to specify the users who are permitted to run jobs and real-time services.
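As an illustration of that split between the two configuration files, a minimal server-side sketch follows. The option names come from this guide; the host name and server name values are placeholders for site-specific settings:

```
# install-path/etc/app.cfg -- options that mirror the Studio client,
# for example the location of the authenticating server
# (8561 is the default SAS Metadata Server port):
base/auth_server_loc = iom://meta.example.com:8561

# install-path/etc/dmserver.cfg -- options specific to this server
# (see Configuration Options Reference for dmserver.cfg):
dmserver/name = DMServer-example
```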
Chapter 3 Managing Security

Security Overview
About Authentication
About Authorization
  Overview
  Group and User Authorization Checks
  Group Permissions
  ACL Authorization Checks
Configure a SAS Metadata Server for Security
  Overview
  Basic Configuration Occurs during Installation
  Manage Server Configuration Options That Are Set from Metadata
  Configure Server Restart
  Additional Configuration after Installation
  Configure Mid-Tier Options for Security
Configure a DataFlux Authentication Server for Security
  Overview
  Prepare a DataFlux Authentication Server
  Configure the DataFlux Data Management Server to Use the DataFlux Authentication Server
Manage Permissions
  Overview
  Configure Default Access Control Entries
  Set Permissions Using a Job List
  Remove Users and Groups
  Reference for Permissions
Control Access by IP Address
Configure SSL and AES
  Overview
  Enable SOAP with SSL
  OpenSSL
Encrypt Passwords for DSNs and SSL
  Overview
  Encrypt in Windows
  Encrypt in UNIX and Linux
Troubleshoot Security Errors
  Overview
  401 Unauthorized
  403 Forbidden

Security Overview

The Data Management Server is a network resource that is used to access and modify your data. A well-planned security model is based on usage policy, risk assessment, and response. Determining user and group usage policies before implementation helps you minimize risk, maximize utilization of the technology, and expedite deployment.

The Data Management Server supports the following levels of security:

Unsecured: the default security mode after installation. Grants all users access to all of the DataFlux Data Management Server's jobs, services, and data sources.

Secured by IP Address: grants access to server resources based on the IP addresses of the computers that connect to the server. This security level can be used in combination with the other levels of security.

Secured by Local Authorization: uses internal user and group definitions to authorize access to server resources, without using an authentication provider.

Secured by SAS Metadata Server or DataFlux Authentication Server: uses either a SAS Metadata Server or a DataFlux Authentication Server to authenticate user credentials and provide group membership information. This mode enables you to use a single set of user and group definitions for your entire enterprise.

Secured by SSL and AES: upgrades SOAP communication from HTTP to the Secure Sockets Layer (HTTPS). Encryption on disk and over the network is upgraded from the SASPROPRIETARY algorithm to the 256-bit AES algorithm. These features are provided by the DataFlux Secure software, which is installed by default in a disabled state. For more information about DataFlux Secure, see Configure SSL and AES and the DataFlux Secure Administrator's Guide.

When security is not enabled, you cannot run jobs that request authentication, and you cannot run jobs that access a SAS Federation Server.
All data sources (DSNs) needed by jobs and services must be defined on the Data Management Server.

When you upgrade to a new release of the DataFlux Data Management Server, you must manually migrate your security settings from the previous release to the new release. To migrate your security settings, see Configure and Migrate after Software Upgrades on page 9.

About Authentication

Authentication is the process of confirming the identity of users. When authentication is enabled on a DataFlux Data Management Server, the server requires a connection to a DataFlux Authentication Server or a SAS Metadata Server.

The authentication process begins when the DataFlux Data Management Server receives a connection request from a client. The DataFlux Data Management Server passes the user's credentials to the DataFlux Authentication Server or SAS Metadata Server. That server then attempts to authenticate the user by submitting the user's credentials to an authentication provider in the network domain that is specified in the credentials.
After it receives a response from the authentication provider, the Authentication Server or SAS Metadata Server notifies the DataFlux Data Management Server of the result of the authentication attempt. Successful authentication enables the DataFlux Data Management Server to begin the authorization process.

Authentication is enabled on the DataFlux Data Management Server with the configuration option DMSERVER/SECURE, in the configuration file dmserver.cfg.

For information about configuring authentication providers and defining users and groups, see the following documents:
- SAS Intelligence Platform: System Administration Guide
- SAS Intelligence Platform: Security Administration Guide
- DataFlux Data Management Server: Administrator's Guide

About Authorization

Overview

The authorization process applies access controls to authenticated users. After a user successfully authenticates, the DataFlux Data Management Server queries the SAS Metadata Server or DataFlux Authentication Server for the group membership information of the authenticated user. The DataFlux Data Management Server applies the group membership information to its locally defined authorizations to allow or deny the user access to the requested object or command.

Note: Authorizations that might be defined on the SAS Metadata Server are not applied to the Data Management Server's authorization process.

Authorizations in the form of access control lists are defined on the DataFlux Data Management Server using the Administration riser in DataFlux Data Management Studio. You can define additional enterprise-level authorizations by setting configuration options. You can allow or deny access to the Data Management Server by IP address. You can also enable ALLOW and DENY groups. If the requesting user is a member, direct or indirect, of the ALLOW or DENY group, then full access is granted or denied, and no further authorization takes place.
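For illustration only, the relevant configuration entries might look like the following sketch. The host names are placeholders; the port numbers are the defaults cited elsewhere in this guide:

```
# install-path/etc/dmserver.cfg -- turn authentication on or off:
dmserver/secure = yes

# install-path/etc/app.cfg -- the server that authenticates credentials,
# either a DataFlux Authentication Server (default port 21030):
base/auth_server_loc = iom://authserv.example.com:21030
# or a SAS Metadata Server (default port 8561):
# base/auth_server_loc = iom://meta.example.com:8561
```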
Group and User Authorization Checks

The Data Management Server checks a user's authorizations in the following sequence:
1. membership in the administrators group
2. membership in a default DENY group, if it is configured for use
3. membership in a default ALLOW group, if it is configured for use
4. command permissions and access control lists

Note: All groups must be created on the SAS Metadata Server or DataFlux Authentication Server before they can be authorized access on the DataFlux Data Management Server.

If the requesting user is a member of the administrators group, the DENY group, or the ALLOW group, then access is granted or denied and no further authorization takes place. Members of the administrators group and the ALLOW group are granted access to all DataFlux Data Management Server commands and objects.

After group memberships are compared against the access control lists, the Data Management Server determines whether the following command permissions are set for the user:
- If, for a given command or object, the user has deny set, then the user is denied access. If an ACL exists, it is not checked.
- If the user has inherit set, authorization checks proceed to group permissions.
- If the user has allow set, and the request does not apply to a specific object, then the user is granted access. If the request does apply to a specific object, then the server checks the object's ACL.

Group Permissions

Group permissions are handled in accordance with the group's membership hierarchy. For example, a user can be a member of groups G1 and G2, and group G1 can be a member of group G3. In that case, G1 and G2 are one step away from the user, and G3 is two steps away from the user. The authorization process examines permissions on sets of groups in increasing order of steps from the user. If a command permission can be determined from the groups that are one step from the user, then the DataFlux Data Management Server looks no further.

When the server looks at a set of groups that are the same distance from the user, if any group has the DENY permission, then the user is denied access. Otherwise, if any group has the ALLOW permission and there is an ACL to check, the authorization process moves to the ACL; if there is no ACL at this point, the user is granted access. If permissions are not set for any group, or the permission is set to INHERIT, then the authorization checks move to the set of groups one step farther from the user.

If access rights cannot be determined after going through the groups to which the user belongs, then the next group whose permissions are checked is the USERS group.
All users that have definitions on the SAS Metadata Server or the DataFlux Authentication Server belong to the USERS group. Administrators can set command permissions for the USERS group and use that group in ACLs in the same manner as any other group.

If access rights have not been determined based on command permissions, the last step in the authorization process is to check whether permissions are set for the PUBLIC group. The PUBLIC group includes all users who are not registered on the SAS Metadata Server or the DataFlux Authentication Server. If the permission is ALLOW and there is an ACL to check, then the authorization check moves to the ACL. Otherwise, the user is granted access. If the permission is DENY, INHERIT, or not set, then the user is denied access.

If no permission is set for the user, the user's groups, the USERS group, or the PUBLIC group, then the DataFlux Data Management Server denies access without checking the ACL. This means that the DataFlux Data Management Server requires a specific command permission before it will look at the ACL of an individual object.

ACL Authorization Checks

Authorization checks of ACLs begin by determining whether the user is the owner of the object. If the user is the owner, then the user is granted access to the object. If the object is owned by a group, the user must be a direct or indirect member of that group to be granted access to the object.
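The distance-based group check described above can be sketched in a few lines. This is an illustration of the documented rules, not DataFlux code; the group names and the ALLOW/DENY/INHERIT values are assumptions for the example:

```python
def resolve_group_permission(user_groups, parents, perms):
    """user_groups: groups one step from the user.
    parents: group -> groups that the group is a member of.
    perms: group -> "ALLOW" | "DENY" | "INHERIT" (unset counts as INHERIT)."""
    level, seen = list(user_groups), set(user_groups)
    while level:
        values = [perms.get(g, "INHERIT") for g in level]
        if "DENY" in values:       # DENY at this distance wins
            return "DENY"
        if "ALLOW" in values:      # then the object's ACL, if any, is checked
            return "ALLOW"
        # No decision at this distance: move one step farther from the user.
        level = [p for g in level for p in parents.get(g, []) if p not in seen]
        seen.update(level)
    return "INHERIT"               # fall through to USERS, then PUBLIC

# G1 and G2 are one step from the user; G3, a parent of G1, is two steps.
print(resolve_group_permission(["G1", "G2"], {"G1": ["G3"]}, {"G3": "DENY"}))
# prints DENY
```

A DENY at a nearer distance is never overridden by an ALLOW farther out, which matches the sequence the text describes.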
Next, the authorization check searches the access control entries (ACEs). If ALLOW or DENY permissions are not found for the requesting user, then the ACEs are checked for the groups of which the user is a member. If the ACL does not grant the user access to the corresponding job or service, the user is denied access.

Configure a SAS Metadata Server for Security

Overview

When you install a Data Management Server, it is configured by default to use a SAS Metadata Server for authentication and authorization. If your site uses a DataFlux Authentication Server instead of a SAS Metadata Server, see Configure a DataFlux Authentication Server for Security.

Basic Configuration Occurs during Installation

When you install a Data Management Server, the SAS Deployment Wizard sets the value of the configuration option BASE/AUTH_SERVER_LOC to specify the network name and port of the SAS Metadata Server. After installation, the file install-path/etc/app.cfg contains an entry that is similar to this example:

base/auth_server_loc=iom://orion.us.southeast.omr.com:8561

Note: 8561 is the default port number for the SAS Metadata Server. Always use this port number unless it is already in use on the host of the SAS Metadata Server.

The SAS Deployment Wizard also creates a metadata definition for the DataFlux Data Management Server on the SAS Metadata Server. After installation, you can view and control the DataFlux Data Management Server in SAS Management Console or in a newer administrative client.

Manage Server Configuration Options That Are Set from Metadata

When you use a SAS Metadata Server for security, the following configuration options are downloaded when the DataFlux Data Management Server starts: DMSERVER/SOAP/SSL, DMSERVER/SOAP/LISTEN/PORT, and DMSERVER/SECURE. The Data Management Server uses the value of DMSERVER/NAME to query its own metadata definition on the SAS Metadata Server.
If the name is valid and the metadata definition can be accessed, then the DataFlux Data Management Server sets the local values from the supplied metadata. To access the metadata definition, the process owner of the DataFlux Data Management Server must have a user definition on the SAS Metadata Server. Another method of enabling access is to specify Read access to the metadata definition for the PUBLIC group. If the metadata definition cannot be accessed by the specified name, or if the name is valid but access is denied, then the DataFlux Data Management Server does not start.

If the server starts, and if the preceding options are specified in the Data Management Server's dmserver.cfg file, then the local values supersede the metadata values. For this reason, the preceding options should be commented out in dmserver.cfg. This happens
by default when you install the DataFlux Data Management Server with the SAS Metadata Server.

To change the metadata definition of the DataFlux Data Management Server, open SAS Management Console, enter administrative credentials, right-click the Data Management Server instance, and select Properties. After you save your changes, restart the DataFlux Data Management Server to download the latest configuration option values.

Configure Server Restart

Because the Data Management Server cannot start unless the SAS Metadata Server is fully operational, you might want to configure a server dependency to prevent failures at invocation. To configure a server dependency, see Troubleshoot Server Start or Restart.

Additional Configuration after Installation

After you install a DataFlux Data Management Server for use with a SAS Metadata Server, you create new user and group definitions (as needed) on the SAS Metadata Server. To create users and groups on the SAS Metadata Server, see the SAS Intelligence Platform: Security Administration Guide.

You can also implement other access controls on the DataFlux Data Management Server. You can restrict server access by IP address, and you can create default access control lists with ALLOW and DENY permissions for users and groups, as described in Manage Permissions. When no default access control lists are defined, the members of the PUBLIC and USERS groups receive the DENY permission.

Configure Mid-Tier Options for Security

The Data Management Server uses the following mid-tier configuration options when it connects with a SAS Metadata Server and with the Visual Process Orchestration software. The default values of these options, which are set at install time, are sufficient in most cases. You can set these options in the file install-path/etc/app.cfg.
BASE/APP_CONTAINER_DOMAIN
  Specifies the authentication domain that is expected by mid-tier application container services. If this option is not specified, then the value of DefaultAuth is used by default. DefaultAuth specifies a default authentication domain at install time.

BASE/APP_CONTAINER_LOC
  Specifies the path to the mid-tier application container services. In most cases this option is not required. If it is required, then the value is typically an HTTP address.
BASE/AUTH_DEFAULT_DOMAIN
  Specifies the name of the resolved identity domain. When a supplied user ID is associated with multiple logins in metadata, the authentication domain for the submitted credentials is determined in the following sequence:
  1. Use the value of BASE/AUTH_DEFAULT_DOMAIN, if that value matches the authentication domain of one of the user's logins.
  2. Otherwise, use the value of DefaultAuth, if that value matches the authentication domain of one of the user's logins.
  3. Otherwise, use the domain of the first login that matches the submitted login.

Configure a DataFlux Authentication Server for Security

Overview

Use this section to configure your DataFlux Data Management Server to use a DataFlux Authentication Server to support authorization. To use a SAS Metadata Server for this same purpose, see Configure a SAS Metadata Server for Security.

To use a DataFlux Authentication Server, you first install, configure, and register that server, and then create users and groups on it. When the DataFlux Authentication Server is configured, you then install and configure the DataFlux Data Management Server, develop authorizations, and connect to the DataFlux Authentication Server.

Note: All communication between the DataFlux Data Management Server and the Authentication Server is encrypted.

Prepare a DataFlux Authentication Server

Follow these steps to prepare a DataFlux Authentication Server to work with a DataFlux Data Management Server:
1. Register your new Data Management Server in DataFlux Data Management Studio, as described in Register a DataFlux Data Management Server.
2. Install and configure your DataFlux Authentication Server, and then create domains, users, groups, and shared logins, as described in the DataFlux Authentication Server Administrator's Guide.
3. To register your DataFlux Authentication Server, open DataFlux Data Management Studio and click the Administration riser bar.
4. Select Authentication Server, and then select New.
5. In the Add Authentication Server Definition dialog box, enter server connection information. If you want to connect to this particular Authentication Server by default each time you start Studio, click Set as default.
6. To create a group of administrators for the DataFlux Data Management Server, open the Group riser, click All Groups, and then click New Group.
7. To add one or more users to your new group of administrators, click the new group, and then click Add Members.
8. If you need to create a new administrative user definition, click the Users riser, and then click Add New.
   CAUTION: All of the members of your administrative group will have unrestricted access to all of the objects and commands on this particular DataFlux Data Management Server.
9. Click Test Connection, and then click OK twice to close the dialog box.

Configure the DataFlux Data Management Server to Use the DataFlux Authentication Server

After you prepare a DataFlux Authentication Server, follow these steps to configure the DataFlux Data Management Server:
1. Open the file install-path/etc/dmserver.cfg.
2. Enable security by setting dmserver/secure = yes.
3. Identify the name of your administrative group by setting dmserver/secure/grp_admin = your-group-name.
4. To grant or deny full access without further authorization checks, consider adding users and groups to the options dmserver/secure/grp_allow or dmserver/secure/grp_deny. For information about these options, see Manage Permissions.
5. Save and close dmserver.cfg.
6. Specify the DataFlux Authentication Server by setting base/auth_server_loc = iom://server-network-name:port. 21030 is the default port number, as shown in this example:

   # base/auth_server_loc = URL
   # Location and connection parameters for the Authentication Server.
   # example: base/auth_server_loc = iom://authserv.mycompany.com:21030
   base/auth_server_loc = iom://orion.us.mktng.com:21030

7. Save and close the configuration file, and then restart the DataFlux Data Management Server.
8.
Open Data Management Studio, click the Administration riser bar, and then connect to the DataFlux Data Management Server that you just restarted. 9. In the Details pane, notice that the security status has changed to Secured. When the server is secured, the Data Connections and Security tabs are enabled. Now that security is enabled, you grant permissions on the DataFlux Data Management Server. Note that all users and groups must be created on the DataFlux Authentication Server before they can be accessed on the DataFlux Data Management Server. To learn more, see Manage Permissions.
Manage Permissions

Overview

When security is enabled, each DataFlux Data Management Server maintains permissions that determine, in part, a user's access to the jobs, services, data sets, and commands on that server. The permissions are maintained for each object in an access control list (ACL). Authorization can also be determined by IP address and by default access control entries.

Configure Default Access Control Entries

When a new object is added to the repository of the DataFlux Data Management Server, the server associates default access control entries (ACEs) with that new object. The default ACEs allow or deny access to the object by named users and groups. The default ACEs are determined by the following configuration options:

DMSERVER/SECURE/DEFAULT_ACE_GROUPS_ALLOW
DMSERVER/SECURE/DEFAULT_ACE_GROUPS_DENY
DMSERVER/SECURE/DEFAULT_ACE_USERS_ALLOW
DMSERVER/SECURE/DEFAULT_ACE_USERS_DENY

Consider these implementation details before you develop lists of users and groups for the four configuration options:
- When you add or change the default ACE configuration, the changes apply only to subsequent additions to the repository.
- The group allow and deny options can include the default groups PUBLIC and USERS.
- Command permissions that are not object-based, such as List/Post, are not affected by the default ACE configuration.
- Any conflict between ALLOW and DENY permissions generates error messages and prevents all users from connecting to the DataFlux Data Management Server.
- Any user or group name in the four configuration options that is not recognized by your authentication provider (the SAS Metadata Server, by default) generates an error message and prevents all users from connecting to the server.

Follow these steps to configure your default access control entries:
1. Develop a plan for your default ACE configuration that includes exact syntax for the users and groups that you plan to assign ALLOW or DENY access.
2.
Stop the DataFlux Data Management Server (see page 31).
3. Open the configuration file install-path/etc/dmserver.cfg.
4. For each of the configuration options in your plan, apply the planned list of users or groups as the value of the option. The lists are delimited by two consecutive spaces, which allows individual names to contain single spaces, as shown in the following example:

   DMSERVER/SECURE/DEFAULT_ACE_USERS_ALLOW = Jones, Susan  Jim Albrecht  darusso
5. Save and close the configuration file, and then restart the DataFlux Data Management Server.

Set Permissions Using a Job List

When a user posts a job or service to the server, that user becomes the owner of that object. The owner of an object can always execute and delete that object, regardless of user or group authorizations. When a user creates an object by copying the file, ownership is set to the administrators group. An administrator can change ownership to another user or group at any time.

Follow these steps to grant permissions directly from a list of the server's batch jobs and real-time services:

Note: Profile jobs do not have associated object-level access control, so you cannot set permissions for profile jobs.

1. Open Data Management Studio and click the DataFlux Data Management Servers riser bar.
2. In the left navigation pane, select the DataFlux Data Management Server that you want to work with and connect to that server.
3. Click the + sign next to your server to expand the list of job folders.
4. Click the + sign to expand the category of jobs or services that you want to work with: Batch Jobs, Real-Time Data Services, or Real-Time Process Services.
5. Select a job or service from the list in the left navigation pane, and then click the Permissions tab in the right information pane.
6. Under Participants, click Add to open the Add Users and Groups dialog box.
   Note: If the Permissions tab does not appear, you might be viewing a profile job, which does not have object-level access control.
7. Select one or more users and click Add. The users are added to the participant list for the job and granted permissions.

Note: On the Permissions tab, you can also change ownership of a job or service by clicking to the right of the Owner field.
Remove Users and Groups

Follow these steps to remove a user or group object from the Users and Groups list on the DataFlux Data Management Server:

Note: The definition of the user or group is retained on the SAS Metadata Server or the Authentication Server.

1. Connect to the Data Management Server and open the Security tab.
2. Select the user or group that you want to remove and click Delete.
3. Click Yes in the confirmation dialog box. When the object is removed, its associated permissions are deleted.
Reference for Permissions

Permissions on the Data Management Server are defined as follows.

Execute data service
  The user can view and execute real-time data services, including running, preloading, and unloading a data service.
Execute process service
  The user can view and execute real-time process services, including running, preloading, and unloading a process service.
Execute batch job
  The user can run a batch job, get a batch job file, and get the status of a batch job's nodes.
Execute profile job
  The user can get and run a profile job.
Post data service
  The user can upload real-time data services to the server.
Post process service
  The user can upload real-time process services to the server.
Post batch job
  The user can upload a batch job to the server.
Post profile job
  The user can upload a profile job to the server.
Delete data service
  The user can delete a real-time data service.*
Delete process service
  The user can delete a real-time process service.*
Delete batch job
  The user can delete a batch job.*
Delete profile job
  The user can delete a profile job.*
List data service
  The user can list real-time data services.
List process service
  The user can list real-time process services.
List batch job
  The user can list batch jobs.
List profile job
  The user can list profile jobs.

* In addition to having this permission, the user must also be the owner of the object, or an administrator, to perform these delete functions.
Control Access by IP Address

Specify the following configuration options to control access to the DataFlux Data Management Server by IP address. The options are specified in install-path/etc/dmserver.cfg. Each option accepts the values allow IP-list-or-range, deny IP-list-or-range, allow all, allow none, deny all, and deny none.

DMSERVER/IPACC/ALL_REQUESTS
  Allows or denies the ability to connect to the DataFlux Data Management Server. If this option is not specified, then the default value is allow all.
DMSERVER/IPACC/POST_DELETE
  Allows or denies the ability to post and delete jobs. If this option is not specified, then the default value is allow all.
DMSERVER/IPACC/NOSECURITY
  Allows or denies the ability to bypass all security checks on the DataFlux Data Management Server. If this option is not specified, then the default value is allow none (no IP address bypasses security checks).

A list of IP addresses is delimited with blank spaces. A range of IP addresses is formatted with a hyphen character ('-') between the low and high ends of the range. If an option value contains all or none, then any IP addresses specified for that option are ignored.

Configure SSL and AES

Overview

Beginning in the 2.5 release, the DataFlux Secure software is installed by default when you install your DataFlux Data Management Server. The DataFlux Secure software provides increased security through the Advanced Encryption Standard and through the use of the Secure Sockets Layer to protect HTTP client connections.
These security enhancements, and their configuration on the DataFlux Data Management Server, are addressed in detail in the DataFlux Secure Administrator's Guide.
All of the clients and servers that connect to the DataFlux Data Management Server need to be configured for the same level of encryption and the same SSL implementation.

Enable SOAP with SSL

Edit the following settings as they apply to your environment. Configure these settings in install-path/etc/dmserver.cfg.

CAUTION: Stop the DataFlux Data Management Server before you make any changes to the configuration file.

DMSERVER/SOAP/SSL
  If you use a DataFlux Authentication Server for security, then set the value to YES. Later, if you need to disable SSL, set the value to NO. If you use a SAS Metadata Server for security, then this option should remain disabled by comment characters, as it is by default. In that case, the option should not be set in dmserver.cfg, because the value is set at server start based on the server's metadata definition. If you set the option locally, then the local value overrides the value in metadata.
DMSERVER/SOAP/SSL/KEY_FILE
  Specifies the path to the key file that is required when the SOAP server must authenticate to clients.
DMSERVER/SOAP/SSL/KEY_PASSWD
  Specifies the password for DMSERVER/SOAP/SSL/KEY_FILE. If the key file is not password protected, then comment out this option. The value of this option must be encrypted. To encrypt passwords, see Encrypt Passwords for DSNs and SSL.
DMSERVER/SOAP/SSL/CA_CERT_FILE
  Specifies the file that stores your trusted certificates.
DMSERVER/SOAP/SSL/CA_CERT_PATH
  Specifies the path to the directory where you store your trusted certificates.

OpenSSL

The DataFlux Secure software requires the OpenSSL libraries to communicate by means of the Secure Sockets Layer. OpenSSL is already installed and configured on most UNIX and Linux distributions. Windows systems require you to download and install the OpenSSL libraries. The OpenSSL libraries must be available in the execution path for the DataFlux Secure software.
The OpenSSL for Windows installation defaults to copying these libraries to the appropriate Windows system directory. DataFlux Data Management Studio is a 32-bit Windows application. Therefore, it requires the 32-bit OpenSSL for Windows libraries. DataFlux Data Management Server can be installed on either 32-bit Windows or 64-bit Windows. The OpenSSL libraries must match the bitness of the DataFlux Data Management Server executables.
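Put together, a dmserver.cfg fragment for SSL in a DataFlux Authentication Server deployment might look like the following sketch. The file paths are placeholders, and the key password must be an encrypted value (see Encrypt Passwords for DSNs and SSL):

```
# install-path/etc/dmserver.cfg -- illustrative SSL settings
dmserver/soap/ssl = yes
dmserver/soap/ssl/key_file = install-path/etc/ssl/server-key.pem
dmserver/soap/ssl/key_passwd = encrypted-password-value
dmserver/soap/ssl/ca_cert_file = install-path/etc/ssl/trusted-certs.pem
```

With a SAS Metadata Server, leave DMSERVER/SOAP/SSL commented out so that the value from the server's metadata definition is used.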
Encrypt Passwords for DSNs and SSL

Overview

To improve security, encrypt the passwords of your DSNs and your SSL key file.

Encrypt in Windows

To encrypt passwords in the Windows operating environment, run install-path\bin\encryptpassword.exe. Enter the password, confirm your initial entry, and receive the encrypted password.

Encrypt in UNIX and Linux

To encrypt passwords in the UNIX and Linux operating environments, enter the command dmsadmin crypt.

Troubleshoot Security Errors

Overview

Interpret and resolve the following security errors.

401 Unauthorized

This HTTP error can indicate that the user entered incorrect credentials. The error can also indicate that a user account has not been created on the authorizing server (SAS Metadata Server or DataFlux Authentication Server).

403 Forbidden

This HTTP error indicates that the user is not authorized to use a particular Data Management Server command. For more information, see Manage Permissions.
Chapter 4: Administering the DataFlux Data Management Server

Start or Stop the Server on Windows
Start or Stop the Server on UNIX or Linux
Troubleshoot Server Start or Restart
Administer DataFlux Data Management Server Log Files
Administer Data Service Log Files
Administer Log Files for Batch and Profile Jobs
Enable the SOAP Log
Change Log Events and Thresholds
Troubleshoot Server Start
Troubleshoot ActiveX Error to Display Help

Start or Stop the Server on Windows

In the Windows operating environment, to start and stop the DataFlux Data Management Server, use the Microsoft Management Console or follow these steps:
1. Select Start ð Control Panel.
2. Double-click Administrative Tools, and then double-click Computer Management.
3. Expand the Services and Applications folder.
4. Click Services.
5. Click DataFlux Data Management Server.
6. Click either Stop the service or Restart the service.
Note: You can also access the service by selecting Start ð All Programs ð DataFlux.
If the Data Management Server fails to start or restart, see Troubleshoot Server Start or Restart.
Start or Stop the Server on UNIX or Linux

In the UNIX and Linux operating environments, use the following command to stop or start the DataFlux Data Management Server:

install-path/bin/dmsadmin your-command

Here is a typical example:

<SASHome>/DataFluxDataManagementServer/2.6/dmserver03/bin/dmsadmin start

The dmsadmin utility accepts the following commands:

start: Starts the server. For example: ./bin/dmsadmin start
stop: Stops the server.
status: Checks whether the server is running.
crypt: Encrypts a password for use in configuration files.
help: Displays Help information.
version: Displays version information.

If the server fails to start or restart, see Troubleshoot Server Start or Restart.

Troubleshoot Server Start or Restart

If your Data Management Server fails to start or restart, you might need to resolve a server dependency. This dependency applies when the DataFlux Data Management Server is configured to use a SAS Metadata Server for authorization and authentication. The SAS Metadata Server needs to be fully operational before the Data Management Server can start. This dependency exists because the DataFlux Data Management Server retrieves several configuration option values from the SAS Metadata Server at start-up. The dependency problem occurs predominantly in single-machine installs, when all services start at one time. You can resolve the dependency as you see fit, or you can run the following command on the host of the DataFlux Data Management Server:

sc config "DMServer-service-name" depends= "SASMetadata-service-name"
The service names are specified in the properties of each service. Do not use the displayed server names. Use quotation marks as shown, use no blank space after depends, and use a blank space after the equal sign, as in the following example:

sc config "dfx-dmserver-server1" depends= "SAS [Config-Lev1] SASMeta - Metadata Server"

Administer DataFlux Data Management Server Log Files

All of the service requests that are received by the DataFlux Data Management Server are assigned a unique request identification (RID). As the DataFlux Data Management Server processes a request, all log entries for that request begin with the associated RID. Log events that relate to security use a different format.

By default, a new server log subdirectory is generated for each server request. The default path to the log files is:

install-path\var\server_logs\log-subdirectory

The default subdirectory name combines a date, a time, a process ID, and a request ID, as in this example name: date-time_pid5072_034C24. In the example, pid5072 is the process ID and 034C24 is a unique Data Management Server request ID.

Use the following configuration options in dmserver.cfg to change the default logging behavior:

DMSERVER/WORK_ROOT_PATH = path
The path to the directory where the server creates its working files and subdirectories. To change the destination directory, enter a new path. The default installation path is shown above.

DMSERVER/NO_WORK_SUBDIRS = Yes | No
Controls whether each server run creates a new log subdirectory. The default is No, which specifies that all log and work files are created in subdirectories. To disable creation of the subdirectories, change this value to Yes.

To change the storage location or logging level for dmserver.log, open the file install-path\etc\dmserver.log.xml. To change the location of the log, change the option BASE/LOGCONFIG_PATH. To change the logging level, see Change Log Events and Thresholds.
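For example, to collect all server work and log files in one flat directory on a larger volume, the two options might be set as follows in dmserver.cfg (the path is a placeholder, not a default):

```
# Write server work and log files to a dedicated volume
DMSERVER/WORK_ROOT_PATH = /data/dmserver/work

# Do not create a per-request subdirectory for each server run
DMSERVER/NO_WORK_SUBDIRS = Yes
```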
CAUTION: Setting the DataFlux Data Management Server log to the TRACE level creates a memory leak. Collecting server log entries at the TRACE level removes from use approximately 4 megabytes of memory for each 1,000 HTTP service requests. To prevent memory errors, limit the duration of your TRACE sessions accordingly. Memory leaks do not occur at other levels of logging.

To change the encoding of your server log, set the configuration option BASE/JOB_LOG_ENCODING in the file install-path/etc/app.cfg. By default, the log is written in the encoding of the locale of the process that executes the job. For English-speaking organizations, the encoding can be LATIN-1 or UTF-8. If a log line contains characters that cannot be represented in the encoding, then the log line is not written to the log file.

Administer Data Service Log Files

When enabled, a unique data service log file records log events for each server request that runs a real-time data service (as processed by DFWSVC). The name of each data service log file is added to the DataFlux Data Management Server log file dmserver.log. The server log also contains debugging information for the job run of the real-time data service. If a new service process is used, then the server log includes the PID of the corresponding process.

Note: To maximize performance, data service logging is disabled by default. To enable data service logging for testing purposes, follow the steps provided later in this section.

Each new data service log file is stored by default in the following directory:

install-path/var/server_logs/log-subdirectory

The name of the log subdirectory specifies the date, time, process ID, and server request ID. The name of the data service log file is illustrated in the following example:

_2778_datasvc_Verify-Address-Job.ddf.log

In the preceding example, the first segment of the name (the number that precedes the first underscore) is a time stamp, 2778 is the server request ID, and datasvc is the log file type. The remainder of the name specifies the name of the real-time service.

Data service logging is configured by default by the following file:

install-path/etc/service.log.xml

Follow these steps to enable logging for real-time services:
1.
Open service.log.xml and locate the root tag. 2. Change the OFF value in <level value="off"/> to DEBUG or TRACE, depending on the level of information that you want to gather. TRACE provides the greatest level of information. 3. Restart the Data Management Server. 4. At the conclusion of testing, repeat this procedure to disable logging. If you require additional information to conclude your testing process, contact your SAS technical support representative. To change the name and location of the data service log configuration file service.log.xml, open the following file: install-path/etc/service.cfg
In service.cfg, change the value of the option BASE/LOGCONFIG_PATH, and then restart the Data Management Server.

To change the encoding of the job logs for your real-time data services, set the configuration option BASE/JOB_LOG_ENCODING in the file install-path/etc/app.cfg. By default, the log is written in the encoding of the locale of the process that executes the job. For English-speaking organizations, the encoding can be LATIN-1 or UTF-8. If a log line contains characters that cannot be represented in the encoding, then the log line is not written to the log file.

Administer Log Files for Batch and Profile Jobs

A unique log file is generated by default each time you run a batch job or a profile job. Each such job is processed by a unique instance of the DFWFPROC process. By default, the name of each batch job log file is added as an entry in the Data Management Server log file dmserver.log. Job status, based on the content of the job logs, is displayed in the Monitor folder in DataFlux Data Management Studio. You can use the SAS Job Monitor in SAS Environment Manager to collect statistics during and after job runs, as described in Collect Job Status Information with the SAS Job Monitor.

Each new job log file for a batch or profile job is stored by default in the following directory:

install-path/var/server_logs/log-subdirectory

The default directory is specified by the configuration option DMSERVER/WORK_ROOT_PATH. To separate the batch and profile job logs from the other files that are generated during job runs, you can specify a non-default storage location. Specify that path as the value of the configuration option DMSERVER/JOB_LOGS_DIR in the configuration file dmserver.cfg.

To locate batch and profile job logs, locate the name of the log subdirectory that specifies the date, time, process ID, and server request ID.
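For instance, to separate batch and profile job logs from the other generated files, dmserver.cfg might contain an entry like the following (the path shown is a placeholder):

```
# Store batch and profile job logs apart from other work files
DMSERVER/JOB_LOGS_DIR = /data/dmserver/job-logs
```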
The name of the batch or profile log file is illustrated in the following example:

_3727_wfjob_Marketing_Profile_1.log

In the preceding example, the first segment of the name (the number that precedes the first underscore) is a time stamp, 3727 is the server request ID, and wfjob is the log file type. The remainder of the name specifies the name of the job.

Batch and profile job logs are configured by default by the following file:

install-path/etc/batch.log.xml

To change the default name and default location of the batch and profile job log configuration file, edit the value of the option BASE/LOGCONFIG_PATH. To change log events and thresholds, see Change Log Events and Thresholds. Restart the server to apply your changes.

To change the encoding of your batch and profile logs, set the configuration option BASE/JOB_LOG_ENCODING in the file install-path/etc/app.cfg. By default, the log is written in the encoding of the locale of the process that executes the job. For English-speaking organizations, the encoding can be LATIN-1 or UTF-8. If a
log line contains characters that cannot be represented in the encoding, then the log line is not written to the log file.

Enable the SOAP Log

Follow these steps to enable the logging of the SOAP packets that are transmitted and received by the DataFlux Data Management Server.

Note: Enable the SOAP log only temporarily, and only for the purpose of debugging. When debugging is complete, be sure to disable the SOAP log.

1. Stop the DataFlux Data Management Server.
2. Create a backup copy of the file install-path\etc\dmserver.log.xml.
CAUTION: An improperly edited dmserver.log.xml file can prevent the start of the DataFlux Data Management Server.
3. Open the file install-path\etc\dmserver.log.xml.
4. To remove the comment tags from the configuration information that enables the SOAP log, see the comments in the dmserver.log.xml file. The configuration information also indicates the structure of the SOAP log filename.
5. Save and close the file dmserver.log.xml.
6. Restart the DataFlux Data Management Server.

Contact your SAS Technical Support representative if you need to route SOAP log entries to the main server log file.

Change Log Events and Thresholds

Loggers and appenders determine the content of DataFlux Data Management Server log files. The loggers and appenders are defined in each log configuration file, such as install-path\etc\dmserver.log.xml. Appenders specify the log output destination. Loggers specify log event types and thresholds. If a logger lists a given log event type, then events of that type are recorded in the log file. The threshold value determines the amount of information that is captured in the log file for each event. The available threshold levels are ranked by the amount of detail that they capture; for example, TRACE captures more detail than DEBUG, which in turn captures more than INFO.
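The relationship between an appender and a logger can be sketched as follows. This is a generic illustration of the logging XML structure, not the exact contents of dmserver.log.xml; the appender and logger names here are invented, so match them to the names already present in your file.

```
<!-- Appender: the log output destination -->
<appender class="FileAppender" name="ServerLog">
  <param name="File" value="dmserver.log"/>
</appender>

<!-- Logger: the event types to record, and the capture threshold -->
<logger name="DF.DMServer">
  <level value="Info"/>
  <appender-ref ref="ServerLog"/>
</logger>
```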
The default threshold levels capture most of the events that you need to diagnose server problems. However, if you need to increase logging events or threshold levels, contact your SAS technical support representative for assistance. Raising threshold levels above INFO while the server is operational in a production environment is discouraged, because doing so can reduce server performance. When you change a log configuration file, you must restart the Data Management Server. To learn more about logging, see the SAS Logging: Configuration and Programming Reference and the SAS Interface to Application Response Measurement (ARM): Reference.

Troubleshoot Server Start

If the Data Management Server does not start, and if the server log file lists the failure dfwlplistenattr_connattr(wlp), then a port might be in use by another application. SOAP requests and WLP requests are each handled on a dedicated default port. If either of the default ports is being used by another application, then assign that application to an unused port.

If the server log does not indicate the source of the problem, then follow these steps if your server is installed on Windows:
1. Open the Windows Event Viewer.
2. Select the Application event type.
3. Click the Source column to sort the events based on the type of source.
4. Search the Source column for DataFluxDMS. Typically, two such events are logged for each time period. One message specifies the name of the log file, as shown in the following example:

WARNING: Messages have been logged to the file named 'C:\Documents and Settings\LocalService\Application Data\SAS\LOGS\DFINTG~1.EXE CBDA9C.log'

If your server is installed on UNIX or Linux, then errors are written to the stdout location of the shell from which the DataFlux Data Management Server was started.
Troubleshoot ActiveX Error to Display Help Internet Explorer versions 6 and later can be configured to block the download and execution of ActiveX controls. ActiveX is required to display the online Help for DataFlux Data Management Studio. Follow these steps to enable the download and execution of ActiveX controls: 1. In Internet Explorer, select Tools ð Internet Options. 2. In Internet Options, click the Security tab. 3. Select Allow active content from CDs to run on My Computer, and Allow active content to run in files on My Computer.
Chapter 5: Managing Data Connections

Overview of Data Connections
Configure the Data Access Component (DAC)
Display Data Connections
Create a Server Job That Uses a Driver and a Data Source
Use the Windows ODBC Data Source Administrator
Use dfdbconf and dfdbview for UNIX and Linux ODBC Connections
Create a Domain-Enabled ODBC Connection
Create a Custom Data Connection
Create a SAS Connection
Edit a Data Connection
Delete a Data Connection
Manage ODBC Credentials
Troubleshoot ODBC Data Connections
  Overview
  SQL Server ODBC Driver Error in Windows
  Teradata and Informix ODBC Drivers Fail to Load

Overview of Data Connections

The Data Management Server connects to data sources on databases through ODBC or through jobs that connect to the Federation Server. For more information about accessing data on or through a Federation Server, refer to the DataFlux Data Management Studio User's Guide and to the DataFlux Federation Server Administrator's Guide.

To add a data source using ODBC, use the ODBC Data Source Administrator provided with Windows, or use the dfdbconf command in UNIX and Linux. You can configure the following data connections (also known as DSNs or data sources) for the jobs that you run on the DataFlux Data Management Server:

Domain-Enabled ODBC Connection: Enables you to create a connection that links a DataFlux Authentication Server domain to an ODBC Data Source Name (DSN). User credentials from the Authentication Server are automatically applied
when the user accesses the domain-enabled connection. This approach ensures that the appropriate credentials for that domain are applied to the access request.

Custom Connection: Enables you to create a custom connection string for non-ODBC connection types. These custom strings enable you to establish native connections from a SAS Federation Server to third-party databases or to draw data from more than one type of data input.

SAS Data Set Connection: Enables you to create SAS data set connections.

In Windows, DSNs are stored in install-path\etc\dftkdsn. In UNIX and Linux, DSNs are stored in install-path/etc/odbc.ini.

You can store ODBC credentials for data sources that require login credentials with the ODBC Credential Manager. With stored ODBC credentials, you can make connections to data sources without being prompted for login credentials. When a job is run, the saved user credentials are retrieved and used. The credentials are not stored within the job. The job references the connection by DSN only. In UNIX and Linux, credentials are stored in the directory $HOME/.dfpower/dsn.

When you develop jobs and services in DataFlux Data Management Studio, use the Data Connections riser to set up and store login credentials for any Open Database Connectivity (ODBC) data source. The DataFlux Data Management Server can use these data sources directly if Studio is installed on the same host as the DataFlux Data Management Server. Stored credentials do not have to be entered each time the job is run, and that information can be used by any DataFlux application. If you do not use stored credentials, then your job must authenticate through a Metadata Server or an Authentication Server.

Use global variables within jobs and services to accept or retrieve data. Using global variables increases the flexibility and portability of Studio jobs and services between data sources.
If you want to use ODBC drivers other than those that are supplied, note that the Data Management Server is compatible with most ODBC-compliant data sources. Also note that SAS provides limited support for drivers that are not supplied by SAS. If you develop jobs that access a SAS Federation Server, then you can use JDBC drivers and other drivers that are written for native access to popular databases. To learn more about developing jobs that use Federation Server drivers, refer to the DataFlux Data Management Studio User's Guide and to the SAS Federation Server Administrator's Guide.

Configure the Data Access Component (DAC)

The Data Access Component (DAC) allows the DataFlux Data Management Server to communicate with databases and manipulate data. The DAC uses Open Database Connectivity (ODBC) and Threaded Kernel Table Services (TKTS). ODBC data source names (DSNs) are not managed by the DAC. In the Windows operating environment, ODBC DSNs are managed by the Microsoft ODBC Administrator. In the UNIX and Linux operating environments, ODBC DSNs are created with the dfdbconf tool and tested with the dfdbview tool. TKTS DSNs are managed by the DAC. TKTS DSNs are stored in the DSN directory.
The DAC is configured with the following two options in the DataFlux Data Management Server's app.cfg file. If necessary, the options can be moved to the macro.cfg file.

DAC/DSN (default value: install-path\etc\dftkdsn\)
DAC/SAVEDCONNSYSTEM (default value: install-path\etc\dsn\)

For more information about the app.cfg file, see the DataFlux Data Management Studio Installation and Configuration Guide. For a complete list of Data Access Component options, see the DataFlux Data Management Studio Online Help.

Display Data Connections

Follow these steps to display the data connections that have been created for the Data Management Server:
1. Open Data Management Studio and click the Administration riser.
2. Connect to the Data Management Server.
3. Click the Data Connections tab to display data connections.

Create a Server Job That Uses a Driver and a Data Source

Follow these steps to develop a job that runs on a DataFlux Data Management Server and accesses a database:
1. Install the DataFlux ODBC drivers (unless they were installed initially) on both the Data Management Studio client and the DataFlux Data Management Server. Use the regular installers as needed. DataFlux provides a number of wire-protocol drivers that enable you to connect to specific databases without using a client-side library.
2. Create a DSN on the database server.
3. Configure an ODBC client-side connection on the DataFlux Data Management Studio host using the ODBC Connections window in Studio, as described in the Data Management Studio User's Guide.
4. Create and test the job in DataFlux Data Management Studio. The nodes in the job will access the ODBC connection.
5. Upload the job from Studio to the Data Management Server.
6. Either copy the ODBC connection file onto the server or create the connection on the server. To create a server connection in Windows, use the ODBC Data Source Administrator.
If your server is running in UNIX or Linux, use the dfdbconf tool that is provided with the DataFlux Data Management Server.
If DataFlux Data Management Studio and DataFlux Data Management Server are installed and running on the same (Windows) host, then you need to set up the ODBC DSN twice. Set up one DSN through ODBC Connections in Studio. For the DataFlux Data Management Server, set up the DSN again using the ODBC Data Source Administrator.

Use the Windows ODBC Data Source Administrator

In the Windows operating environment, use the Microsoft ODBC Data Source Administrator to manage database drivers and data sources on a DataFlux Data Management Server. To add required stored credentials to Windows DSNs, use the DataFlux Data Management Server's administrative interface in SAS Data Management Studio, as described in the following steps.

Note: To learn about DSN options, open the Help for the ODBC Data Source Administrator.

Follow these steps to configure a driver to connect to a database:
1. Click Start and point to Settings.
2. Click Control Panel.
3. Double-click Data Sources (ODBC) to open the ODBC Data Source Administrator dialog box.
4. Select the System DSN tab and click Add.
5. Select the appropriate driver from the list and click Finish.
6. Enter your information in the Driver Setup dialog box and click OK when finished.
7. To add required stored credentials to the DSN, open SAS Data Management Studio, select the Administration riser, and expand the DataFlux Data Management Server.

Use dfdbconf and dfdbview for UNIX and Linux ODBC Connections

Follow these steps to create ODBC connections with required stored credentials in the UNIX and Linux operating environments:
1. Execute the following command: install-path/bin/dfdbconf
2. Select a driver from the list of available drivers.
3. Set the appropriate parameters for the driver. The new data source is added to the odbc.ini file, which is stored at install-path/etc.
Note: You can also use dfdbconf to delete data sources.
4.
Add stored credentials to the data source by executing the following command: install-path/bin/dfdbview -s data-source-name
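After dfdbconf completes, the new entry in install-path/etc/odbc.ini looks something like the following. The DSN name, driver library, and connection parameters shown here are invented placeholders for illustration; the actual parameter names depend on the driver that you selected.

```
[my_oracle_dsn]
Driver=install-path/lib/oracle-wire-protocol-driver.so
Description=Oracle wire-protocol connection for server jobs
HostName=dbhost.example.com
PortNumber=1521
```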
5. Test the new data source by executing the following command: install-path/bin/dfdbview data-source-name
If the connection succeeds, use the prompt to enter SQL commands to test the connection. If the connection fails, resolve the errors described in the error messages.

Create a Domain-Enabled ODBC Connection

Domain-enabled ODBC connections use a domain and credentials that are supplied by a SAS Metadata Server or a DataFlux Authentication Server. You can create domain-enabled ODBC connections for use with jobs that run on the Data Management Studio host, or that run on a DataFlux Data Management Server host.

If you create a domain-enabled ODBC connection for use with Studio, and if you want to move that connection to a DataFlux Data Management Server, then copy the .dftk file from the client to the DataFlux Data Management Server, into the directory install-path\etc\dftkdsn. To execute jobs, the DSN in the domain-enabled ODBC connection must be defined in a location that is accessible to the DataFlux Data Management Server. Otherwise, the reference to the DSN in the domain-enabled ODBC connection will not resolve.

To create a new domain-enabled ODBC connection for use on the DataFlux Data Management Studio host, see the Data Management Studio User's Guide. To create a new domain-enabled ODBC connection for use on the DataFlux Data Management Server host, follow these steps:
1. Open a registered Data Management Server using the riser bar in Studio and log on if prompted.
2. Click the Data Connections tab and click New.
3. Select Domain Enabled ODBC Connection from the drop-down menu.
4. In the Name field, enter a name for the connection, preferably one that refers to the domain that was created on the Authentication Server.
5. Enter some information about the domain and connection in the Description field.
6. Select the DSN.
7.
In the Domain name field, enter the actual domain name as it was created on the Metadata Server or Authentication Server.
8. Click OK. The new connection places a .dftk file in the directory install-path\etc\dftkdsn.

Create a Custom Data Connection

Custom connections enable you to access data sources that are not otherwise supported in the Data Connections interface of DataFlux Data Management Server. Examples of custom connections are those that connect to SAP or SQLite. Follow these steps to create a custom connection:
1. In DataFlux Data Management Studio, click the Data riser, and then expand Data Connections.
2. Click New Data Connection, and then select Custom Connection.
3. Enter the name, description, and the full connection string.
4. Click Test Connection.
5. Click OK to save the new connection.

Create a SAS Connection

Follow these steps to create a connection to a SAS data set:
1. Click the Data Connections tab and click New.
2. Select SAS Data Set Connection from the drop-down menu.
3. Enter a name and description for the new connection in the appropriate fields.
4. Enter the path to the directory that contains the SAS data set that you want to connect to.
5. Verify that the appropriate option is specified in the Access field. The Default value assigns read-write access. The Read-Only value assigns read-only access. The Temp value specifies that the data set is to be treated as a scratch file, which is stored only in memory, not on disk.
6. Verify that the appropriate option is specified in the Compression field. If you compress data, you might experience a slowdown in performance. The No value specifies that the observations are uncompressed. Use the Yes or Character value to compress character data. Use the Binary value to compress binary data.
7. Verify that the appropriate option is specified in the Table Locking field. The Share value specifies that other users or processes can read data from the table, but it prevents other users from updating the table. The Exclusive value locks tables exclusively, which prevents other users from accessing a table until you close it.
8. Verify that the appropriate option is specified in the Encoding field. The default value is the SAS System encoding. You might select an encoding that is different from the default if you are processing a file with a different encoding.
Note: You can select an encoding from the drop-down menu.
If you enter the first few letters of the desired encoding, the closest match is added to this field.
9. The Connection String field displays the connection string for the SAS data set. Check the connection string to verify that the appropriate encoding option has been selected for this connection. You can test the connection by clicking Test Connection.
10. Click OK to save the new connection.

Edit a Data Connection

Follow these steps to edit a data connection:
1. In DataFlux Data Management Studio, click the DataFlux Data Management Servers riser to access a list of DataFlux Data Management Servers.
2. Select the name of the server for which you want to manage connections. If you are prompted to do so, enter your user ID and password, and then click Log On.
3. In the information pane, click the Data Connections tab, which presents a list of current connections.
4. Select a data connection and click Edit. The connection type determines which fields and options are available for you to edit.
5. In the dialog box that opens, make the desired changes and click OK.

Delete a Data Connection

Follow these steps to delete a data connection:
1. From Data Management Studio, click the DataFlux Data Management Servers riser to access a list of your DataFlux Data Management Servers.
2. Click the name of the server for which you want to manage connections. If you are prompted to do so, enter your user ID and password, and then click Log On.
3. In the information pane, under the Data Connections tab, select the name of the data connection that you want to delete and click Delete.
4. When prompted, confirm that you want to delete the data connection by clicking Yes.

Manage ODBC Credentials

ODBC credentials enable connections to data sources. The credentials contain connection information, so they do not need to be stored inside a job or entered when a job is run. This allows for better protection and management of security, increased confidentiality, and a more versatile way to handle access to data sources that require authentication. To manage ODBC credentials, complete the following steps:
1. In DataFlux Data Management Studio, click the DataFlux Data Management Servers riser bar.
2. Select the name of the DataFlux Data Management Server for which you want to manage connections. If you are prompted to do so, enter your user ID and password, and then click Log On.
3.
In the information pane, select the Data Connections tab and click the Manage ODBC Credentials icon. 4. To create ODBC credentials in the Manage ODBC Credentials dialog box, click New ODBC Credentials. Enter the ODBC DSN, user name, and password. Review your entries, and then click OK. 5. To edit ODBC credentials, select a name from the list and click the Edit ODBC Credentials icon. In the ODBC Credentials dialog box, change the user name or password that will be used to access the ODBC DSN. Click OK to close the dialog
box. Note that the Edit ODBC Credentials icon is available only when credentials have been saved for an ODBC DSN.
6. To delete ODBC credentials, select a name and click Delete ODBC Credentials. You can use Ctrl + left-click to select more than one name. Click OK to close the Manage ODBC Credentials dialog box when you are finished. Use caution when deleting ODBC credentials: after a name is deleted from the list, clicking Cancel does not reverse the deletion.

Troubleshoot ODBC Data Connections

Overview

If your ODBC connections show any of the following symptoms, refer to the following resolutions.

SQL Server ODBC Driver Error in Windows

If you have an ODBC DSN that uses the Windows SQL Server driver, replace that DSN with one that uses the DataFlux 32-bit or 64-bit SQL Server Wire Protocol driver. Using the Windows SQL Server driver can cause problems with your repository.

Note: Use the 32-bit or 64-bit driver depending on the operating environment of the DataFlux Data Management Server. Access the drivers by selecting Control Panel ð ODBC Data Sources. The DataFlux drivers are listed in the Drivers tab.

Teradata and Informix ODBC Drivers Fail to Load

In the Solaris x86 operating environment, DataDirect does not currently provide Teradata or Informix drivers. In Linux operating environments, the directory that contains the Teradata client libraries needs to be in your LD_LIBRARY_PATH. The exact path varies depending on the version of your Teradata client.
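For the Teradata case, the fix is an environment change like the following, made before starting the server. The client library directory shown is hypothetical; substitute the directory that matches your Teradata client version.

```shell
# Prepend the (hypothetical) Teradata client library directory so that
# the ODBC driver manager can locate the Teradata client libraries.
TERADATA_LIB=/opt/teradata/client/lib64
if [ -n "${LD_LIBRARY_PATH:-}" ]; then
  LD_LIBRARY_PATH="$TERADATA_LIB:$LD_LIBRARY_PATH"
else
  LD_LIBRARY_PATH="$TERADATA_LIB"
fi
export LD_LIBRARY_PATH
```

Add the export to the profile of the account that starts the DataFlux Data Management Server so that it takes effect on every start.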
Chapter 6 Managing Jobs, Services, and the Repository

Overview of Jobs and Services
Configure the SOAP and WLP Servers
Configure the Server to Pre-load Services
  Overview
  Pre-load All Services
  Pre-load One or More Specific Services
  Configure Complex Pre-loads
Browse Available Services and WSDLs
Apply SOAP Commands and WSDL Options
  Overview
  SOAP Commands Reference
  Response from the GenerateWSDL Command
  Reference for WSDL Configuration Options
Debug Real-Time Services Using SOAP Fault Elements and Log Files
Define Macros
  Overview
  Declare Input and Output Variables for Data Services
  Update Macros
Terminate Real-Time Services
  Overview
  Set a Job ID
  Terminate the Real-Time Service
Manage the DFWSVC Process
  Overview
  Unload DFWSVC Processes
  Reference for DFWSVC Configuration Options in dmserver.cfg
  Reference for DFWSVC Configuration Options in service.cfg
Manage the DFWFPROC Process
  Overview
  Limit the Number of Jobs and Queue Job Run Requests
  Unload Idle DFWFPROC Processes
  Reference for DFWFPROC Configuration Options
Run Jobs with the dmpexec Command
  Overview
  dmpexec Options
  Configure Authentication for dmpexec
  Return Codes for the dmpexec Command
  Configure a Log File for dmpexec
Configure Jobs and Services
  Overview
  Grant Job Permissions
  QKB Memory Usage for Jobs and Services
  Configure Bulk Loading
  Configure Storage for Temporary Jobs
  Support for Remote-Access Clients
  Resolve Out-of-Memory Errors When Using Sun JVM
  Collect Job Status Information with the SAS Job Monitor
About the Repository
Troubleshoot Jobs and Services
  Overview
  Troubleshoot SOAP Packets
  Server Processes (DFWSVC or DFWFPROC) Fail to Start, or Out of Memory Error in Windows When Launching Server Processes
  Required OpenSSL Libraries Were Not Found
  The Repository Is Newer Than This Client
  Creating a New WSDL in DataFlux Web Studio Generates a Not-Licensed Error
  SQL Lookup Job Fails on a UNIX or Linux System Using the Driver for BASE
  When Opening a Job Log: SOAP-ENV:Client:UNKNOWN Error (or Time-out)
  Error Occurs in an Address Verification Job on Linux
  DQ Engine Cannot Process New Quality Knowledge Base Definitions
  Job with Custom Scheme Fails to Run
Load Balance Service Requests
Customize the Server's WSDL File
  Customize the WSDL File for Java
  Customize the WSDL File for C#

Overview of Jobs and Services

The types of jobs and services that you can store and run on a DataFlux Data Management Server include real-time data services, real-time process services, batch jobs, and profile jobs.

Real-Time Services

Data Services
Real-time data services are designed to quickly respond to a request from a client application. A real-time data service can process a small amount of data input from a client, or it can retrieve and deliver a small amount of data from a database. Real-time data services are executed by the DFWSVC process, as described in Manage the DFWSVC Process.

Process Services
Real-time process services accept input parameters only from clients, to trigger events or change a display.
Real-time process services are executed by the DFWFPROC process, which runs a WorkFlow Engine (WFE), as described in Manage the DFWFPROC Process.
About Real-Time Data Services and Real-Time Process Services
If a real-time data service or real-time process service fails to terminate normally, then the service is terminated when the client connection times out. To maximize performance, logging is not enabled for real-time services. To activate logging for debugging purposes, see Administer Data Service Log Files. Real-time data services and real-time process services are stored in install-path\var\data_services and install-path\var\process_services, respectively.

Batch Jobs
Batch jobs are designed to be run at specified times to collect data and generate reports. Batch jobs are not intended to provide real-time responses to client requests. All batch jobs are logged in dmserver.log. For more information, see Administer Log Files for Batch and Profile Jobs. Batch jobs are stored in install-path\var\batch_jobs. Batch jobs, like real-time process services, are run by the DFWFPROC process. You can pass input parameters into batch jobs, but not actual data.

Profile Jobs
Profile jobs are designed to analyze the quality of specified data sets. Profile jobs are handled as repository objects and are required to reside in the Data Management Repository. When you run a profile job, the server finds the job in the repository and then starts a new instance of the DFWFPROC process. The requested profile is then run by ProfileExec.djf, which resides in the same directory as the repository. For more information about the Data Management Repository, see About the Repository on page 73. Unlike batch jobs, you cannot grant unique user permissions for profile jobs, because profile jobs do not have associated object-level access control. To learn more about permissions, see Manage Permissions. When you install a new version of the DataFlux Data Management Server, you are required to import all of your profile jobs into a new Data Management Repository. For more information about importing profile jobs, see Configure and Migrate after Software Upgrades.
Note that in Windows, it is possible to save a DataFlux Data Management Server job to a directory, and then not be able to see that job in that directory. To resolve this issue, save your jobs in a location that does not use mapped drives. A Windows service cannot access mapped drives, even if the service is started under a user account that maps those drives. A DataFlux Data Management Server cannot run a remote job or service whose location is specified with a mapped drive letter, such as Z:\path\remote_job.ddf. The server runs as a service under a SYSTEM account, and no mapped drives are available to such an account. Use a UNC path to specify the location of a remote job or service, such as: \\ServerHostName.MyDomain.com\path\remote_job.ddf.

The following restrictions apply to the name of a job that will be deployed to a DataFlux Data Management Server. You should follow these restrictions for all jobs.
A job name can contain any alphanumeric characters, white space, and any characters from the following list: ,.'[]{}()+=_-^%$@!~/\
The maximum length of a job name is 8,192 bytes.
The DataFlux Data Management Server will not upload, list, or run a job whose name contains characters other than those cited above.
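As an illustration of these naming rules, the following sketch validates a candidate job name before deployment. The function name and the ASCII-only reading of "alphanumeric" are assumptions for this example; the character list and the 8,192-byte limit come from the restrictions above.

```python
import re

# Allowed characters per the naming rules: alphanumerics, white space,
# and the punctuation list ,.'[]{}()+=_-^%$@!~/\  (ASCII interpretation assumed).
_ALLOWED = re.compile(r"^[A-Za-z0-9\s,.'\[\]{}()+=_\-^%$@!~/\\]+$")

def is_valid_job_name(name: str) -> bool:
    """Hypothetical pre-deployment check for a job name."""
    if not name or len(name.encode("utf-8")) > 8192:
        return False
    return bool(_ALLOWED.match(name))
```

For example, a name such as "subdir1/address cleanse (v2).ddf" passes, while a name containing an asterisk, or one longer than 8,192 bytes, is rejected.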
In UNIX or Linux, to run a shell command in a job, use the execute() function, as shown in the following examples.
To run the command directly: execute("/bin/chmod", "777", "file.txt")
To run a command through a shell: execute("/bin/sh", "-c", "chmod 777 file.txt")
The preceding examples set the access permissions of a text file.

Configure the SOAP and WLP Servers

The Data Management Server manages client connections using multiple threads. By default, a SOAP server communicates with clients. To enhance performance, you can enable a second server that listens for client connections on a separate port. The two servers run as separate threads in a single process. Both servers spawn new threads to connect to clients, using a shared thread pool. The two servers are defined as follows.
The SOAP server uses a SOAP interface, as defined in the DataFlux Web Service Definition Language (WSDL) file. For more information about the WSDL file, see Customize the Server's WSDL File.
The Wire-Level Protocol (WLP) server uses a proprietary WLP client library. WLP offers a significant performance increase over SOAP, especially for real-time data services that require significant amounts of data transfer. The WLP server is disabled by default. When it is enabled, the WLP server is used with SAS programs that include the procedure DMSRVDATASVC, and all functions. In SAS programs, WLP is enabled by setting the system option DQOPTIONS=(DQSRVPROTOCOL=WIRELINE). WLP is used primarily in the z/OS operating environment, where it is a required replacement for SOAP.
The Data Management Server processes a single SOAP request per client connection. After the server returns a response, the connection is closed. This is true even if the client attempts to make a persistent connection by including Connection: Keep-Alive (for HTTP 1.0) or by omitting Connection: close (for HTTP 1.1).
The connection parameters are specified in the HTTP header of the SOAP request. To manage the configuration of the SOAP and the WLP servers, set the following configuration options.
DMSERVER/SOAP/LISTEN_HOST
Specifies the host name or IP address to which the SOAP server must bind. A machine running Data Management Server might be available on the network under different host names or IP addresses. Different machines on the network might have access to this machine via one host name or IP address, but not via others. By binding to a particular host name or IP address, the server hears only requests addressed specifically to that host name or IP address. A paired host name and IP address can be used interchangeably. For example, if this option is set to localhost, local clients sending requests to the loopback address will still be heard. However, requests to the public IP address or the host name of the machine (or requests coming from external clients to the machine) will not be heard. By default, this option is left blank, which means that the Data Management Server is not bound to any specific host name or IP address of the machine and receives all requests that arrive at the machine on the correct port.
Note: If the value for this option originates from the Metadata Server (when DMSERVER/NAME is configured), and you want to change that value, then add this option to the dmserver.cfg file. This is necessary when you want the server to start and listen on all available interfaces (as if no value were set for this option). In that case, leave the value empty after the = symbol.

DMSERVER/SOAP/LISTEN_PORT
Specifies the port on which the SOAP server listens for connections. A default value applies when you use a DataFlux Authentication Server for security. When you use a SAS Metadata Server for security, the DataFlux Data Management Server uses the DMSERVER/NAME option to retrieve from metadata the values of three configuration options, including LISTEN_PORT.
If the SAS Metadata Server does not return a value for LISTEN_PORT, then the DataFlux Data Management Server does not start. If a value is returned, and if dmserver.cfg also contains a value for LISTEN_PORT, then the local value overrides the metadata value. For this reason, it is recommended that you not set LISTEN_PORT in dmserver.cfg when using a SAS Metadata Server. For further information, see DMSERVER/NAME and DMSERVER/SOAP/SSL.
DMSERVER/WLP
Enables or disables the WLP server. When the value is YES, the WLP server is started and uses its own listen port. When the value is NO, the WLP server is bypassed during start-up of the Data Management Server. This means that WLP clients cannot connect to the DataFlux Data Management Server, but SOAP clients can. The DataFlux Data Management Server log receives entries for the status of the WLP server.

DMSERVER/WLP/LISTEN_PORT
Specifies the port on which the WLP server listens for connections from WLP clients. If you are running multiple instances of the server on the same machine, each instance must have a unique port configured for it.

DMSERVER/WLP/LISTEN_HOST
Specifies the host name or IP address to which the WLP server must bind. By default, this option is left blank. For more information, see DMSERVER/SOAP/LISTEN_HOST.

Configure the Server to Pre-load Services

Overview
The following sections describe how to use pre-load configuration settings when you start your DataFlux Data Management Server. This is helpful if you typically use the same services each time you run DataFlux Data Management Server. Use the following options to configure pre-load:

DMSERVER/SOAP/DATA_SVC/PRELOAD_ALL = count

DMSERVER/SOAP/DATA_SVC/PRELOAD = count:name-of-service count:name-of-service ...
The value count specifies the number of pre-load instances. The value name-of-service indicates the name of the service element. This can include the directory where the service is located.

DMSERVER/SOAP/DATA_SVC/PRELOAD_DURING_RUN = yes|no
By default, the Data Management Server pre-loads all configured services before accepting SOAP requests. When the value is yes, the DataFlux Data Management Server starts a separate thread to pre-load all configured services at run time, while accepting SOAP requests at the same time.
If DataFlux Data Management Server is stopped while the pre-load thread is still running, that thread will be terminated.
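Putting these options together, a dmserver.cfg fragment might look like the following. The service names and counts here are hypothetical examples:

```
# Pre-load two instances of every available service...
DMSERVER/SOAP/DATA_SVC/PRELOAD_ALL = 2
# ...and additionally pre-load specific services by name.
DMSERVER/SOAP/DATA_SVC/PRELOAD = 2:svcA.ddf 1:subdir1\svcB.ddf
# Pre-load in a background thread while accepting SOAP requests.
DMSERVER/SOAP/DATA_SVC/PRELOAD_DURING_RUN = yes
```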
Pre-load All Services
The configuration option DMSERVER/SOAP/DATA_SVC/PRELOAD_ALL = count causes the DataFlux Data Management Server to find and pre-load a specified number of instances of all services. This includes services found in subdirectories. The number of instances of each service (count) must be an integer greater than 0, or the directive is ignored. For example, DMSERVER/SOAP/DATA_SVC/PRELOAD_ALL = 2 causes the DataFlux Data Management Server to pre-load two instances of each service that is available, including those found in subdirectories.

Pre-load One or More Specific Services
The configuration option DMSERVER/SOAP/DATA_SVC/PRELOAD = count:name-of-service designates the specific services, as well as the count for each service, that the DataFlux Data Management Server is to pre-load at start-up. Use additional count and service elements for each service. Separate each count and service element by one or more blank spaces. The service element itself cannot include blank spaces. Also, all elements must be listed on a single line. Using this format, you can configure a directive that starts a number of services, each with a different count. The following example loads two counts of the abc service and one count of the xyz service. The xyz service is located in the subdir1 subdirectory:
DMSERVER/SOAP/DATA_SVC/PRELOAD = 2:abc.ddf 1:subdir1\xyz.ddf

Configure Complex Pre-loads
By combining options, you can configure more complex pre-loads. The two options add their counts arithmetically to determine how many instances of each service are actually loaded. Internally, the DataFlux Data Management Server builds a list of all of the services that it needs to pre-load and, for each service, sets the total count.
The following two example options illustrate the logic of how this works:
DMSERVER/SOAP/DATA_SVC/PRELOAD_ALL = 2
DMSERVER/SOAP/DATA_SVC/PRELOAD = 2:svc1.ddf -1:subdir1\svc2.ddf -2:svc3.ddf
The first option instructs the DataFlux Data Management Server to pre-load two instances of every existing service. The second option modifies the first as follows:
Two additional counts of svc1.ddf are added, for a total of four instances. The counts are added together, and the total is the number of instances that the DataFlux Data Management Server tries to pre-load.
The svc2.ddf file, which is found in the subdir1 subdirectory, has a -1 count. This produces a total count of one for svc2.ddf.
For the svc3.ddf file, there is a combined total count of zero, so this service is not loaded at all. The count value must be greater than zero for a service to be pre-loaded.
Some important points to remember:
The DataFlux Data Management Server attempts to pre-load a single instance of all requested services before trying to pre-load more instances, if more than one instance is specified.
The service element can include the path to the service, relative to the root of the services directory. For example, 1:subdir1\svc2.ddf specifies one instance of service svc2.ddf, which is located in the subdir1 subdirectory.
The count value can be a negative value. This is meaningful only when both configuration options are used together.
Pre-loading stops when the Data Management Server has attempted to pre-load all required instances (successfully or not), or if the limit on the number of services has been reached. Depending on whether a SOAP or WLP server is used, the limit can be specified by using one of the following configuration options: DMSERVER/SOAP/DATA_SVC/MAX_NUM or DMSERVER/WLP/DATA_SVC/MAX_NUM. These options default to 10 if a number is not specified.

Browse Available Services and WSDLs

You can use a web browser to display lists of available data services and process services. You can also display definitions of available data services and process services. The definition files are formatted in the Web Service Definition Language (WSDL). You can configure your DataFlux Data Management Server to generate WSDL service definitions dynamically, in response to GET WSDL requests.
To use a web browser to display a list of available data services, enter an address in the following format: As shown in this example:
To use a web browser to display a list of available process services, enter an address in the following format: As shown in this example:
To use a web browser to display the WSDL of a data service, enter an address in the following format: The path is the directory path in install-path/share/web/data-services. The following example displays the WSDL of the data service named RAM.DDF:
To use a web browser to display the WSDL of a process service, enter an address in the following format: The path is the directory path in install-path/share/web/data-services.
The following example displays the WSDL of a process service named RAM.DDF:
If a WSDL does not already exist for a data service or a process service, then one of two things will happen. If the DataFlux Data Management Server is configured to generate a WSDL in response to GET WSDL requests, then the server generates a WSDL for display in the browser. Otherwise, the browser displays an error. To generate WSDLs in response to GET WSDL requests, set the following option in dmserver.cfg: DMSERVER/SOAP/WSDL/GEN_ON_GET = yes.

Apply SOAP Commands and WSDL Options

Overview
The Data Management Server supports a number of SOAP commands that enable clients to run jobs and services and administer the server. These Simple Object Access Protocol (SOAP) commands cause the server to return simple types (integers and strings) or complex types (structures built from simple types and other structures). Definitions of all requests, responses, and complex types are found in the Web Service Definition Language (WSDL) file, in install-path/share.
Note: WSDL 2.0 is not supported.

SOAP Commands Reference
The following list describes the SOAP commands that appear in the WSDL file, with the command type in parentheses:

GenerateWSDL
Generates a WSDL for a real-time data service or a real-time process service, or for an entire directory including all associated subdirectories. You can pass service names or a single directory name, but not both together in the command; using a service name and a directory name together results in an error. The DataFlux Data Management Server can generate multiple WSDLs within a request and will not stop if an error occurs. To review the responses that result from the use of this command, see Response from the GenerateWSDL Command.

GetServerVersion (single version command)
Returns the server version and the versions of the installed reference data, the repository, and some of the libraries. Also returns date and time elements.
ArchitectServiceParam (data services command)
Returns the input and output fields of a real-time data service.
ArchitectService (data services command)
Runs a real-time data service. The timeout integer element specifies the number of seconds to allow a real-time data service to run before it is stopped by the server. If the timeout element is omitted from the request, or if the value is 0, then the real-time data service will not be stopped by the server.
Note: The actual duration of a real-time data service can vary depending on the rounding-up of fractional timeout values. For example, a timeout value of 1.5 is rounded up to an actual duration of 2 seconds.

ArchitectServicePreload (data services command)
Starts the requested number of processes, and loads the specified real-time data services into those processes. For information about pre-loading services, see Configure the Server to Pre-load Services.

ArchitectServiceUnload (data services command)
Terminates the process that is running the specified real-time data or process service.

LoadedObjectList (data services command)
Returns a list of running real-time data service processes, along with the name of the loaded service job for each process.

MaxNumJobs (data services command)
Sets the maximum number of concurrent processes that can run real-time data services. This is a run-time setting only and has no effect on the value of the configuration option in the dmserver.cfg file. To learn about configuration options, see Configuration Options Reference for dmserver.cfg.

WorkFlowJobParams (process services command)
Returns the inputs and outputs of either a real-time process service or a batch job.

WorkFlowService (process services command)
Runs a real-time process service.

UnloadProcesses (process services command)
Kills all idle DFWFPROC processes, and kills busy DFWFPROC processes once they become idle. The DFWFPROC processes run real-time process services and batch jobs.
RunArchitectJob (batch and profile jobs command)
Runs a batch job.

RunProfileJob (batch and profile jobs command)
Runs a profile job.

TerminateJob (batch and profile jobs command)
Terminates a running batch job or profile job. The client can still retrieve the status, log, and statistics file (if one exists) after the job has been terminated.

JobStatus (batch and profile jobs command)
Returns status information for one or more batch jobs or profile jobs. Applies to jobs that are running or that have already finished.

JobNodesStatus (batch and profile jobs command)
Returns status information for every node in a batch job. Applies only to jobs that are currently running.

JobLog (batch and profile jobs command)
Returns the log file and statistics file (if one exists) for a batch job or profile job. Applies only to jobs that have already finished.

DeleteJobLog (batch and profile jobs command)
Deletes the job log, statistics file (if one exists), and all history for a given job run. Applies only to jobs that have already finished.

ObjectList (object files command)
Retrieves a list of available objects of a specified type (data services, process services, batch jobs, or profile jobs).

PostObject (object files command)
Uploads an object of a specified type. If an object of that type with the same name and path already exists, an error is returned.

ObjFile (object files command)
Downloads an object of a specified type.

DeleteObject (object files command)
Deletes an existing object of a specified type.

ListAccounts (security command)
Returns a list of user and group IDs that have explicitly configured server command permissions; the permissions are included in the response.

SetPermissions (security command)
Sets server command permissions for a user or group ID.
DeleteAccount (security command)
Deletes server command permissions for a user or group ID.

SetACL (security command)
Sets an access control list (ACL) for an object (data service, process service, or batch job).

GetACL (security command)
Retrieves an ACL for an object.

Response from the GenerateWSDL Command
When a client requests the GenerateWSDL command, the response contains two lists:
1. A list of the job names for which WSDLs were generated successfully.
2. A list of the job names for which WSDLs could not be generated. Each entry in this list includes a detailed error message describing the cause of the problem.
Generating a WSDL is similar to obtaining properties for a service, so the error messages in the second list are similar. Typical messages include the following: job file not found, access denied, job format is invalid, and failed to connect to DB.
The two job lists are sent only when the request is completed, with or without errors. The amount of time that is required to generate a WSDL is determined by the nature of the job and its dependencies.

Reference for WSDL Configuration Options
The following list describes the configuration options in dmserver.cfg that relate to WSDLs.

DMSERVER/SOAP/WSDL
Specify a value of YES to load existing WSDLs when you start the DataFlux Data Management Server. Specifying YES also enables the server to recognize other WSDL configuration options and respond to client requests for WSDL-based jobs and services. The default value NO specifies that no existing WSDLs are loaded at start-up, no new WSDLs are generated, client requests to run WSDL-based jobs and services are ignored, and other WSDL configuration options are ignored.
DMSERVER/SOAP/WSDL/GEN
Allows or denies the server the ability to generate run-time WSDLs. Valid values are NO, SINGLE, or MULTIPLE. The default value NO specifies that the only WSDLs that can be used to respond to client requests are those that are loaded when you start the Data Management Server; errors are generated in response to requests that use the GenerateWSDL command or the PostObject command. The value SINGLE enables the generation of a single WSDL for each instance of GenerateWSDL or PostObject. The value MULTIPLE enables the generation of multiple WSDLs, for SOAP commands that apply to multiple job files or directories of jobs. Note that WSDL generation can be a time-consuming and resource-intensive process. Also note that erroneous requests to generate multiple WSDLs can cause a severe degradation in server performance.

DMSERVER/SOAP/WSDL/GEN_ON_GET
Allows or denies WSDL generation as part of an HTTP getwsdl request. The default value NO indicates that an error is returned if the WSDL does not exist, or if the mod.time stamp in the WSDL differs from the mod.time stamp in the service job file. Differing mod.time values indicate that the WSDL has been updated on the server, but not on the client. The value YES indicates that the server does not return an error message if the WSDL does not exist or if the mod.time stamps differ. Instead, the server attempts to generate the latest WSDL. If successful, the server returns the new WSDL to the client. This option is valid only when the value of DMSERVER/SOAP/WSDL/GEN is SINGLE or MULTIPLE. If the value of DMSERVER/SOAP/WSDL/GEN is NO, then the GEN_ON_GET option is ignored and WSDLs are not generated on HTTP getwsdl requests.
DMSERVER/SOAP/WSDL/RUN_IGNORE_MTIME
When the server receives an HTTP RunSVC request, this option determines whether the client needs to be rebuilt in order to match an updated WSDL. A rebuild is indicated when the mod.time stamp in the WSDL differs from the mod.time stamp in the service job file. The default value NO specifies not to ignore differences in mod.time values. When the server detects a difference in mod.time values, the server returns to the client the error message Job Has Changed. The client responds by requesting that the server regenerate the WSDL and send the WSDL to the client. The client then rebuilds as directed by the new WSDL. A value of YES indicates that the server will not compare mod.time values. The server passes the RunSVC request directly to the service process for execution. This behavior applies to service requests that are based on generic WSDLs.

Debug Real-Time Services Using SOAP Fault Elements and Log Files

When the server encounters an error while processing a request, the SOAP response from the server contains a faultcode tag and a faultstring tag. The faultcode tag categorizes the error. The faultstring tag provides a human-readable description of the error. The Data Management Server generates the following fault codes:

VersionMismatch
Pertains to the SOAP Envelope element, which is the root element for all SOAP messages. Indicates that the recipient of a message did not recognize the namespace name of the Envelope element.

MustUnderstand
Indicates that an element child of the Header element had a soap:mustUnderstand attribute that was not understood by the recipient.

SOAP-ENV:Client
Indicates that the SOAP message did not contain all of the information that is required by the recipient. This could mean that something was missing from inside the Body element. Equally, an expected extension inside the Header element could have been missing. In either case, the sender should not resend the message without correcting the problem.

SOAP-ENV:Server
Indicates that the recipient of the message was unable to process the message because of a server problem. The message contents are not at fault. Instead, a resource was unavailable or process logic failed.
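A client can extract these two tags from a fault response with a few lines of standard-library code. This is an illustrative sketch, not a DataFlux API; the envelope below is a fabricated example of a SOAP 1.1 fault:

```python
import xml.etree.ElementTree as ET

# Fabricated fault response for illustration; real responses come from the server.
FAULT = """<SOAP-ENV:Envelope xmlns:SOAP-ENV="http://schemas.xmlsoap.org/soap/envelope/">
  <SOAP-ENV:Body>
    <SOAP-ENV:Fault>
      <faultcode>SOAP-ENV:Server</faultcode>
      <faultstring>process logic failed</faultstring>
    </SOAP-ENV:Fault>
  </SOAP-ENV:Body>
</SOAP-ENV:Envelope>"""

def read_fault(xml_text):
    """Return (faultcode, faultstring) from a SOAP 1.1 fault response, or None."""
    root = ET.fromstring(xml_text)
    fault = root.find(".//{http://schemas.xmlsoap.org/soap/envelope/}Fault")
    if fault is None:
        return None
    # In SOAP 1.1, faultcode and faultstring are unqualified child elements.
    return fault.findtext("faultcode"), fault.findtext("faultstring")
```

Because faultcode only categorizes the error, treat the pair as a starting point and consult the server or service logs, as described below, for details.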
Other fault elements, such as the faultactor and detail tags, are delivered in the SOAP response. These elements do not specify a reason for the fault or indicate the data element that triggered the error. These tags are generic in nature and are usually returned for any and all SOAP requests. Because the DataFlux Data Management Server logs information related to processing errors, these optional SOAP elements are not used for error messaging. It is necessary to look at a log file to obtain details for problem determination. Depending on the nature of the problem, the error might be exposed in a server log or a specific service process log. To learn more about logs, see Administer DataFlux Data Management Server Log Files and Administer Data Service Log Files. Also note that the nil tags are unused for SOAP fault messaging. It is best not to refer to these elements for problem determination.

Define Macros

Overview
The macros.cfg configuration file defines macro values for substitution into batch jobs, and overrides predefined values. This file is located in install-path/etc. Each line in the file represents a macro value in the form key = value, where key is the macro name and value is its value. The following example of a macro defines a Windows path:
INPUT_FILE_PATH = C:\files\inputfile.txt
On a UNIX system:
INPUT_FILE_PATH = /home/dfuser/files/inputfile.txt
This macro is useful when you are porting jobs from one machine to another, because the paths to an input file often differ between operating environments. By using a macro to define the input filename, you do not need to change the path to the file after you port the job to UNIX. You add the macro to install-path/etc/macros.cfg in both the Windows and UNIX environments, and set the path appropriately in each.
The etc directory contains the macros.cfg file and a macros subdirectory. The macros subdirectory can contain multiple .cfg files.
If one or more .cfg files exist in that subdirectory, then they are read in alphabetical order before the macros.cfg file is read. The last value read becomes the value that is applied. If your jobs use system and user-created macros, you must create a combined macro file to be able to use the macros in DataFlux Data Management Server. For more information about macros, see the online Help for DataFlux Data Management Studio.

Declare Input and Output Variables for Data Services
Prior to Release 2.2, the macros that were passed into real-time data services were the only ones that could be returned. Also, any macro variable was allowed to be passed to the service, even if it was not being used by the service. In Release 2.2 and later, input and output variables for real-time data services behave similarly to variables of real-time process services and batch jobs. Specifically, only input variables that are declared in a real-time data service job can be passed in, and only final values for declared output variables are returned. If a variable that was not declared as input is passed into a data service, then an error is returned. To revert to
behavior prior to Release 2.2, set the following configuration option in the service.cfg file:

DATASVC/IGNORE_DECLARED_VARS = yes

Update Macros

About Updates

For each job process, the DataFlux Data Management Server reads configured macros at the beginning of execution. When a macro changes, you can update the macro on the server, without restarting the server, using one of the following procedures.

Update Macros for Process Services, Batch Jobs, and Profile Jobs

For real-time process services, batch jobs, and profile jobs, all of which are executed by separate instances of the DFWFPROC process:

1. In the Data Management Servers tree in Studio, select a server by name.
2. Right-click the server name and select Unload idle processes from the drop-down menu.

Unloading idle processes also updates macros for all subsequent instances of the DFWFPROC process.

Update Macros for Real-Time Data Services

For real-time data services, all of which are executed by separate instances of the DFWSVC process:

1. In the Data Management Servers tree in Studio, expand a server by name.
2. Select the Real-Time Data Services folder under the server.
3. In the right pane, click the Loaded Processes tab.
4. Select all of the processes under Process ID, and then click one of two buttons, depending on the status of the job: Unload Process When Idle or Unload Process.

Terminate Real-Time Services

Overview

You (or your client) can submit macros or input variables in SOAP commands to terminate real-time data services and real-time process services. You submit a job run identifier when you request the service. Later, to terminate the service, you include the identifier in the unload command.

Set a Job ID

When you submit a job run request with ArchitectService or WorkFlowService, you set a job ID, which you can then use to terminate the real-time service.
Submit the following key/value pair in a macro (in the varvalue element) or as an input variable (in the inputs element):
JOB_METADATA/USER_JOBRUN_ID = your-id-string

Ensure that the value is unique, as necessary, and not NULL. If the value is not unique, then the DataFlux Data Management Server searches the active real-time services and terminates the first real-time service with a matching identifier.

Setting a job run identifier provides the job run request with the following two new elements:

svctype: values can be data or process.
usrjobid: the value is a job run identifier.

Terminate the Real-Time Service

To terminate a real-time data or process service that has a job run identifier, include usrjobid and svctype (as needed) in the SOAP command ArchitectServiceUnload.

Manage the DFWSVC Process

Overview

One instance of the DFWSVC process runs one real-time data service. The Data Management Server tracks both idle and active processes. The server also knows whether any service jobs are loaded and waiting to run. When a request for a real-time data service is received from a client, the server first tries to find an idle DFWSVC process. If one does not exist, then the server looks for a DFWSVC process that does not have any jobs loaded. Finally, if the server does not find a process to reuse, a new process is started, if the configuration allows.

When an active DFWSVC process is terminated, the DataFlux Data Management Server records the event in the server log. If an idle DFWSVC process terminates, the server logs the event and starts a new process when another request is received.

The maximum run time for data services is set by the configuration option DMSERVER/SOAP/DATA_SVC/MAX_RUNTIME.

Note: When processes are reused too often, performance can degrade. You can specify the amount of reuse in the configuration option POOLING/MAXIMUM_USE, in the server's app.cfg file. After a process has been used the specified number of times, it is terminated.
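For example, to terminate each DFWSVC process after a fixed number of uses, you might set the option in the server's app.cfg file as follows. The value 500 is purely illustrative, not a recommended setting; tune it for your workload.

```
# app.cfg -- terminate each DFWSVC process after 500 uses (illustrative value)
POOLING/MAXIMUM_USE = 500
```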
Unload DFWSVC Processes

Follow these steps to unload one or more real-time data service processes:

1. Open Data Management Studio and connect to the DataFlux Data Management Server.
2. Select the Real-Time Data Services folder in the navigation pane.
3. Click the Loaded Processes tab.
4. Select one or more real-time data services.
5. Click either Unload Process When Idle or Unload Process. Unload Process unloads the process immediately.

Reference for DFWSVC Configuration Options in dmserver.cfg

Use the following configuration options to manage your DFWSVC processes. These options are specified in install-path/etc/dmserver.cfg.

DMSERVER/SOAP/DATA_SVC/IDLE_TIMEOUT
    Specifies the number of seconds to allow a DFWSVC process to remain idle before it is terminated. The default setting is 0, indicating that there is no time-out. Negative values are ignored.

DMSERVER/SOAP/DATA_SVC/QUEUE
    Specifies whether to queue real-time data service requests. The value YES indicates that if all DFWSVC processes are busy and no new processes are allowed to start, then service requests are placed in a queue. The requests are submitted to DFWSVC processes as processes become available. The default value NO returns an error to the requesting client when the preceding conditions apply.

DMSERVER/SOAP/DATA_SVC/MAX_NUM
    Specifies the maximum number of real-time data services that the SOAP server is allowed to run simultaneously. The default value is 10. If a new service request would exceed the set limit, and if the request queue is not enabled, then an error message is sent to the requesting client. This option applies to the SOAP server, meaning that the service requests are coming from a SOAP client. It does not apply to the WLP server or to requests coming from a WLP client.

DMSERVER/WLP/DATA_SVC/MAX_NUM
    Specifies the maximum number of real-time data services that the WLP server is allowed to run simultaneously. The default value is 10. If a new service request exceeds this limit, an error message is returned to the requesting client. This option applies to the WLP server, which processes service requests from WLP clients. It does not apply to the SOAP server or to requests from SOAP clients.
DMSERVER/SOAP/DATA_SVC/MAX_ERRS
    Specifies the maximum number of service errors that can occur in a DFWSVC process before the process is terminated. The default value is -1, which indicates that the process is not terminated due to service errors.

DMSERVER/SOAP/DATA_SVC/MAX_REQUESTS
    Specifies the maximum number of service requests that a DFWSVC process is allowed to handle before it is terminated. The default value is -1, which indicates that processes are not terminated based on the number of service requests received.

DMSERVER/SOAP/DATA_SVC/MAX_RUNTIME
    Specifies the number of seconds to allow a real-time data service to produce output data or an error. If a data service does not produce a response within the specified number of seconds, the server terminates the corresponding DFWSVC process and sends a SOAP fault message to the client. The default value is 0 (zero), which indicates that a real-time data service can run indefinitely. Negative values are ignored. Note that the actual time to terminate the process can differ from the value of this option because fractional values are rounded up. An option value of 1.5 can result in a physical duration of 2 seconds before the service is terminated.

Reference for DFWSVC Configuration Options in service.cfg

Use the following configuration options to manage your DFWSVC processes. These options are provided in the file install-path/etc/service.cfg. When you set these options, consider that similar configuration options in the DataFlux Data Management Server's app.cfg configuration file are also applied. The options in app.cfg are applied first, before the configuration options are applied from service.cfg.
DATASVC/IGNORE_DECLARED_VARS
    The value YES causes DFWSVC to ignore all input and output variables that are declared in the real-time data service. Use this value to run real-time data services that were created prior to DataFlux Data Management Server 2.2. All input macros are passed into the data service, and all final macro values are returned to the client along with the output data. The default value NO causes DFWSVC to accept only the input variables that are declared on the job. If any other input variables are passed in, the real-time data service terminates and returns an error to the client. Only the output variables declared on the job are returned along with the output data from the service.

DATASVC/THREAD_STACK_SIZE
    Sets the stack size, in bytes, for each thread of DFWSVC in the UNIX and Linux operating environments. The default value is 1MB. This option is ignored in the Windows operating environment.

DATASVC/MAX_OUTPUT_BYTES
    Controls the maximum amount of data, in bytes, that the DFWSVC process is allowed to return to the SOAP client. The default value is 128MB. If the output data exceeds the limit, the real-time data service is terminated. The error message Output data size limit exceeded is logged in the log files of DFWSVC and DataFlux Data Management Server. The SOAP client also receives the error message. A value of 0 or less disables the examination of the output data size. Note: This option applies only when DFWSVC is invoked by a SOAP client request. The option is ignored when DFWSVC is invoked by a WLP client request.

Manage the DFWFPROC Process

Overview

The DFWFPROC process runs real-time process services, batch jobs, and profile jobs. Process services are handled independently of batch and profile jobs, by a pooler. The DataFlux Data Management Server requests a new process from the pooler.
The server then sends the process the service name and input parameters so that the process can load and run the service. For batch and profile jobs, the DataFlux Data Management Server starts a new DFWFPROC process and assigns the job to that process. Log entries record job status:
start, running, finished successfully, or terminated due to error. This information is displayed in the Monitor folder in DataFlux Data Management Studio.

You can also configure the DataFlux Data Management Server to collect job run statistics for batch and profile jobs. The statistics are parsed from job run log files by the SAS Job Monitor (in SAS Environment Manager). The Job Monitor can examine log entries that are collected during job runs. To learn more about using the SAS Job Monitor, see Collect Job Status Information with the SAS Job Monitor.

The default directories for process services, batch jobs, and profile jobs are located in install-path/var.

Limit the Number of Jobs and Queue Job Run Requests

To manage server performance, you can use DMSERVER/JOBS_MAX_NUM to control the number of jobs (batch and profile) that can run simultaneously. When the maximum number of active jobs is reached, you can choose to refuse additional job run requests, or to queue job run requests using DMSERVER/JOBS_QUEUE. Using the queue, the server honors all job run requests in the order in which they were received, as the number of active jobs drops below the specified limit. By default, the server accepts all job requests and does not use a queue.

In DataFlux Data Management Studio, queued job run requests are displayed in the job's Run History tab. To display the Run History tab, expand the DataFlux Data Management Servers riser, expand the server's entry in the tree list, and click the job. In the Run History tab, the Status column displays Queued. When you right-click a queued job in the Run History tab, the only action that does not generate an error message is Stop. Stopping the job removes the job run request from the queue. Similarly, when your SOAP clients submit SOAP commands for queued jobs, SOAP fault messages are returned for all commands except those that stop the job.
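A dmserver.cfg fragment that caps concurrent jobs and enables the queue might look like the following, using the option names as they appear in the DFWFPROC configuration reference later in this chapter. The limit of 5 is illustrative only.

```
# dmserver.cfg -- run at most 5 batch/profile jobs at once (illustrative value)
DMSERVER/JOBS_MAX_NUM = 5
# queue further job run requests instead of refusing them
DMSERVER/JOBS_QUEUE = yes
```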
Unload Idle DFWFPROC Processes

You can unload idle DFWFPROC processes for real-time process services, batch jobs, and profile jobs using the SOAP command UnloadProcesses, as described in Apply SOAP Commands and WSDL Options. To unload processes manually, follow these steps:

1. Open Data Management Studio and open the DataFlux Data Management Servers window.
2. In the Data Management Servers window, locate the action menu.
3. Select Unload Idle Loaded Processes.

Reference for DFWFPROC Configuration Options

Use the following configuration options to manage your DFWFPROC processes. These options are specified in install-path/etc/dmserver.cfg.
DMSERVER/SOAP/PROC_SVC/MAX_NUM
    Specifies the maximum number of real-time process services that are allowed to run simultaneously. The default value is 10. If a new service request exceeds the limit, then an error message is displayed and the new service is not executed.

DMSERVER/JOBS_MAX_NUM
    Specifies the maximum number of batch jobs and profile jobs that are allowed to run simultaneously. The default value is 10. If a new job request exceeds the limit, and if DMSERVER/JOBS_QUEUE=YES, then the job run request is placed in a queue. If DMSERVER/JOBS_QUEUE=NO, then an error message is displayed and the new job is not executed.

DMSERVER/JOBS_QUEUE
    Allows or denies the server the ability to queue batch job run requests when the number of running jobs exceeds the value of the DMSERVER/JOBS_MAX_NUM option. A value of YES queues job run requests and sends clients a success response, as if the job had begun to run. As running jobs finish or are canceled, queued jobs run and the queue is cleared. This option does not apply to profile jobs. The default value is NO.

DMSERVER/SOAP/DATA_SVC/JOB_COUNT_MIN
    Specifies the minimum number of instances of a given real-time service that remain loaded. When the number of jobs reaches the minimum limit, the Data Management Server halts the unloading of instances of the specified job. The value of this option is of the form min-count:service-file-name.

DMSERVER/SOAP/DATA_SVC/JOB_COUNT_MAX
    Specifies the maximum number of instances of a given service job that can be loaded at the same time. Once this limit is reached, the DataFlux Data Management Server does not load any more instances of that service job. The value of this option is of the form max-count:service-file-name.

DMSERVER/JOBS_ROOT_PATH
    Specifies the location of the root directory for the job and service subdirectories. The default root directory is install-path/var.
The subdirectories for jobs and services are: data services, process services, and batch jobs.
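The two JOB_COUNT options take a value of the form count:service-file-name. For example, to keep between 2 and 6 loaded instances of a hypothetical service job named verify_address.ddf, you might set the following in dmserver.cfg (the counts and the job file name are illustrative):

```
# dmserver.cfg -- bound the number of loaded instances of one service job
DMSERVER/SOAP/DATA_SVC/JOB_COUNT_MIN = 2:verify_address.ddf
DMSERVER/SOAP/DATA_SVC/JOB_COUNT_MAX = 6:verify_address.ddf
```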
Run Jobs with the dmpexec Command

Overview

You can execute jobs on the DataFlux Data Management Server with the command install-path/bin/dmpexec.

dmpexec Options

The dmpexec command accepts the following options:

-c filename
    Reads a configuration file to set option values that are specific to the job or command, including the authentication options. (See the -a option.)

-j file
    Executes the job in the specified file.

-l path-filename
    Writes job run log messages to a file. Specify a different log file for each job run. The path value is absolute; it is not affected by the value of any configuration option.

-i key=value
    Sets the input variable key to a value before running the job.

-o key=value
    Sets a server option to a value.

-b key=value
    Sets a job option to a value.

-a
    Authenticates the user who is executing dmpexec using a Metadata Server or an Authentication Server. This option is required for domain-enabled connections. To authenticate successfully, you need to specify options that identify the authenticating server, user name, and password.

Note: You can use the -i, -b, and -o options multiple times to set multiple values.

Configure Authentication for dmpexec

When you specify the -a option in the dmpexec command, the DataFlux Data Management Server requires three server configuration options. The following configuration options specify the location of the authenticating server, a user name that is registered on the authenticating server, and the password that validates the user name:

BASE/AUTH_SERVER_LOC=network-path:port
    Defines how to connect to the authenticating server.

BASE/AUTH_SERVER_USER=user-name
    Specifies the user name that is authenticated by the specified server.
BASE/AUTH_SERVER_PASS=password
    Specifies the password that is associated with the user name.

The authenticating server can be a Metadata Server or an Authentication Server. You can set default values for these options in the configuration file dmserver.cfg. You can also specify these options in a configuration file that you create specifically for a given job or command, using the -c option.

Return Codes for the dmpexec Command

The dmpexec command returns the following status codes:

0: Job is still running
1: Job has finished successfully
2: Job has finished with errors: general error (see the job log file)
3: Job has finished with errors: process terminated
4: Job has finished with errors: job was canceled

Configure a Log File for dmpexec

Follow these steps to generate a log file that applies specifically to your use of dmpexec:

1. In the directory install-path/etc, copy the file batch.cfg to create the new file dfwfproc.cfg.
2. In the same directory, copy the file batch.log.xml to create the new file dfwfproc.log.xml.
3. Edit dfwfproc.cfg to specify the location of dfwfproc.log.xml.
4. Open the new file dfwfproc.log.xml. In the file, replace this text:

{OSENV.DIS_LOG_DIR}/batch

with this text:

{OSENV.DFEXEC_HOME}/var/server_logs/dfwfproc

5. Save and close the new files.

Configure Jobs and Services

Overview

Jobs and services are configured using the following configuration files, all of which are stored in install-path/etc:
app.cfg
    Specifies options that determine how job nodes interface with the resources on the Data Management Server. Options in app.cfg specify how job nodes send email, use a Quality Knowledge Base, and access address verification software. Most of these options are commented out by default. They are enabled only when your jobs need to use a particular resource.
    Real-time data services, real-time process services, batch jobs, and profile jobs are all developed and tested in DataFlux Data Management Studio. When you upload those jobs to DataFlux Data Management Server, the job execution environment has to enable the same configuration options that were used to develop and test those jobs. For this reason, the options that are enabled on the Data Management Server should be similar to the options that are enabled in DataFlux Data Management Studio. Option values differ primarily when they reference storage locations. For more information about the app.cfg file, see the DataFlux Data Management Studio Installation and Configuration Guide.

service.cfg
    Specifies options that apply to real-time data services and real-time process services. This file currently supports one option, BASE/LOGCONFIG_PATH, which specifies the path to the log file directory that is used by service jobs.

batch.cfg
    Specifies options that apply to batch jobs. This file provides an alternate value for the BASE/LOGCONFIG_PATH option.

macros.cfg
    Specifies options (none by default) and macros that apply to all jobs and real-time services. For information about using macros, see Define Macros.

Options are set in order of precedence, starting with the job's advanced properties. If an option is not specified in the job, then the server checks for a value in macros.cfg, followed by either service.cfg or batch.cfg. If no value is specified, then the default value is retained.
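Combining the dmpexec options described earlier in this chapter, an illustrative invocation might look like the following. The job name, log path, input variable, and configuration file name are all hypothetical; substitute values from your own installation.

```
install-path/bin/dmpexec -j "batch_jobs/my_job.ddf" \
    -l /tmp/my_job_run.log \
    -i INPUT_FILE_PATH=/home/dfuser/files/inputfile.txt \
    -a -c etc/dmpexec_auth.cfg
```

Here, the hypothetical file etc/dmpexec_auth.cfg would contain the BASE/AUTH_SERVER_LOC, BASE/AUTH_SERVER_USER, and BASE/AUTH_SERVER_PASS options that the -a option requires.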
Grant Job Permissions

You can configure user permissions for batch jobs using the administrative interface in DataFlux Data Management Studio. Profile jobs do not have security functions like batch jobs do, so they cannot be secured at the job level. You can still grant user permissions for profile jobs at the user or group level.

QKB Memory Usage for Jobs and Services

The DataFlux Quality Knowledge Base (QKB) is loaded into memory in different ways to support batch jobs, profile jobs, real-time data services, and real-time process services.

For batch and profile jobs, the definitions in the QKB are loaded into memory individually, as they are needed by the job. The definitions remain in memory until the end of the job. This default behavior can be changed by setting the configuration option QKB/ON_DEMAND=NO in the configuration file app.cfg. Specifying NO loads the entire QKB into memory each time you run a batch job or profile job.

Real-time data services and real-time process services always load the entire QKB into memory for each new service. Similarly, existing services load the entire QKB into memory for each new thread. The QKB remains in memory until the termination of the service or the thread. The loading of the QKB into memory can affect the performance
of real-time services. The memory used by the QKB can be a factor in the preloading of real-time services.

Configure Bulk Loading

Bulk loading enhances the performance of jobs that monitor business rules, when those jobs include row-logging events. You can optimize performance for your implementation by changing the number of rows in each bulk load. To change the default number of rows per load, set the app.cfg option MONITOR/BULK_ROW_SIZE.

Configure Storage for Temporary Jobs

The business rules monitor creates and executes temporary jobs. Those jobs are normally kept in memory and are not stored on disk. When a directory is specified for temporary jobs, the monitor stores temporary jobs in that location and leaves them in place after the job is complete. To specify a directory for temporary jobs, create the directory and set its path as the value of the app.cfg option MONITOR/DUMP_JOB_DIR. By default, this option is not set and the monitor does not store temporary jobs on disk.

Support for Remote-Access Clients

Remote-access clients are published applications (rather than streamed applications) that are run from software such as Citrix or Microsoft RemoteApps. Your client applications, and DataFlux Data Management Studio, can be run as remote-access clients. Remote-access clients require additional support to ensure that the cancellation of jobs results in the termination of all remote child processes. To effectively cancel remote processes, set the following option in install-path/etc/app.cfg:

BASE/MAINTAIN_GROUP=YES

If you do not set the MAINTAIN_GROUP option, then the cancellation of jobs can allow child processes to persist on remote-access clients. These rogue processes can become associated with a new group or job.
If you set the MAINTAIN_GROUP option and remote child processes nevertheless persist, then you might have to restart the remote-access client to terminate the processes.

Resolve Out-of-Memory Errors When Using the Sun JVM

You can encounter out-of-memory errors in jobs with a SOAP Request node or an HTTP Request node when you are using a Sun Java Virtual Machine (JVM). To resolve this error, add the following options to the Java start command that is specified in the app.cfg file:

-XX:MaxPermSize=256m -XX:+CMSClassUnloadingEnabled
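Taken together, the app.cfg settings discussed in the preceding sections might look like the following fragment. The bulk-load row count and the directory path are illustrative placeholders, not defaults or recommendations.

```
# app.cfg -- illustrative values for the options discussed above
MONITOR/BULK_ROW_SIZE = 500
MONITOR/DUMP_JOB_DIR = /home/dfuser/monitor_jobs
BASE/MAINTAIN_GROUP = YES
```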
Collect Job Status Information with the SAS Job Monitor

You can collect status information about your job runs using the SAS Job Monitor. The SAS Job Monitor is supplied as part of the SAS Environment Manager software. It collects statistics from job run log files, which are generated by the Monitor logger on the Data Management Server. The Monitor logger is active and configured on the server by default.

By default, the Job Monitor collects statistics at the end of the job run. To collect statistics at specified intervals during job runs, add the configuration option BASE/MONITOR_FREQUENCY to the file install-path/etc/app.cfg. The default value of this option is -1. To collect statistics during the job run, set an option value that determines the time interval, in milliseconds, between the generation of log entries. Experiment with different option values to achieve the desired results.

To store job run logs in a separate directory, specify a directory path as the value of the option DMSERVER/JOB_LOGS_DIR, in the file install-path/etc/dmserver.cfg. By default, job run logs are stored in the same directory as all other job logs and job files, as specified by the value of DMSERVER/WORK_ROOT_PATH. The default directory can become large and difficult to navigate. If you specify DMSERVER/JOB_LOGS_DIR, the value of that option is set when you start the server. At start time, the value of the option is recorded in the server log file.

If you need to change the encoding of your job logs, set the option BASE/JOB_LOG_ENCODING in the DataFlux Data Management Server's app.cfg file. This option specifies the encoding that is used in the job log. By default, the log is written in the encoding associated with the locale of the process that executes the job. For English-speaking organizations, this might be LATIN-1 or UTF-8.
If a log entry contains characters that cannot be represented in the specified encoding, then the log entry is not written to the log file.

For additional information about the collection of job statistics, see the DataFlux Data Management Studio Installation and Configuration Guide, the DataFlux Data Management Studio User's Guide, and the Help for the SAS Job Monitor.

About the Repository

A repository is a directory that stores a database file or database connection information. You run jobs on the DataFlux Data Management Server to access the data in the repository. The DataFlux Data Management Server enables access to one and only one repository. If a job attempts to access a repository other than the one that is currently enabled, then the server sends an error message to the server log file.

Because of the way that jobs are created, tested, and exported, the repository that is used by the server needs to be configured to match the repository that is used by DataFlux Data Management Studio. In fact, both the client and the server can share the same repository.

Each repository is defined by a repository configuration file. The .rcf file provides metadata definitions for Unified Database connection parameters. The Unified Database
connection consists of a connection string and a table prefix. The connection string can identify a file in the repository, or a connection to a DBMS.

The .rcf file is located in the directory of the repository. The location of the directory is specified by default or by the configuration option BASE/REPOS_SYS_PATH, which is not set by default. The default repository directory is install-path\etc\repositories. The default repository configuration file in that directory is server.rcf. To enable a non-default repository, set the option BASE/REPOS_SYS_PATH in install-path\etc\dmserver.cfg, and then restart the DataFlux Data Management Server.

To learn more about creating and migrating a repository, see the topic Maintaining Repositories in the DataFlux Data Management Studio User's Guide.

Troubleshoot Jobs and Services

Overview

If your job or service experiences any of the following symptoms, refer to the corresponding resolutions.

Troubleshoot SOAP Packets

To debug jobs and services, you can enable the logging of the SOAP packets that are transmitted and received by the DataFlux Data Management Server. See Enable the SOAP Log on page 36.

Server Processes (DFWSVC or DFWFPROC) Fail to Start, or Out-of-Memory Error in Windows When Launching Server Processes

Windows displays the following error message:

The application failed to initialize properly (0xc ). Click OK to terminate the application.

The Data Management Server log file might also display one of the following messages:

Data Service error: failed to start service process: 1 - Child failed to contact server process.
Failed to start base services, rc=1 (Error loading dependency library).
Process Service error: Failed to getprocess, errorcode=2 (Process 'HOST:ADDR' exited unexpectedly.)
Batch Job error: failed to get process; err: 0 - Process 'HOST:ADDR' exited unexpectedly.
It is possible for the Windows event log to not contain entries for DFWSVC and DFWFPROC, even when the DataFlux Data Management Server logs contain one or more entries. This symptom often indicates that the failure to start processes is caused by
Windows running too many internal processes, so that the Data Management Server cannot start new processes. The log discrepancy occurs when Windows runs out of desktop heap. Specifically, the desktop heap in the WIN32 subsystem becomes depleted.

To free system resources, stop as many non-essential applications and processes as permissible, and then try to run the jobs again on the DataFlux Data Management Server. If the errors persist, you might need to make a minor change in the Windows registry to increase the SharedSection parameter of the SubSystems key in HKEY_LOCAL_MACHINE. For additional information, see the following Microsoft Support articles:

"Out of Memory" error message appears when you have a large number of programs running
User32.dll or Kernel32.dll fails to initialize
Unexpected behavior occurs when you run many processes on a computer running SQL Server

Required OpenSSL Libraries Were Not Found

This error message indicates that the libraries for OpenSSL were placed in a directory other than /bin. Make sure that the libraries match the host operating environment (32-bit or 64-bit). Copy the libraries to the /bin directory and restart the server.

The Repository Is Newer Than This Client

This error message indicates that someone at your site has upgraded the repository on the Data Management Server. Install a new version of DataFlux Data Management Studio to use the new repository.

Creating a New WSDL in DataFlux Web Studio Generates a Not-Licensed Error

If you create a new WSDL for a service using DataFlux Web Studio, you can receive a not-licensed error in the Web Studio log file. To resolve this error, open DataFlux Data Management Studio, and then select Tools > Data Management Studio Options > Data Management Server. Select the options Connect to Data Management Server for SAS and Connect to Data Management Server Web Edition.
SQL Lookup Job Fails on a UNIX or Linux System Using the Driver for BASE

The Driver for BASE does not allow the creation of data sets that cannot be read by SAS. If you have data set files whose names contain letters that prevent access in the UNIX or Linux operating environments, then you need to rename the files using all-lowercase names. Other files whose names contain mixed-case or uppercase letters might also need to be renamed using lowercase letters. Once the files are renamed, they can be accessed in jobs using any case. For example, a file might be named lookupsource. In jobs, you can reference LOOKUPSOURCE, lookupsource, or LookUPSoUrCe, to name a few.
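The rename step can be scripted. The following POSIX shell sketch assumes SAS data set files with the .sas7bdat extension; adjust the extension and directory for your installation.

```shell
#!/bin/sh
# lowercase_datasets DIR -- rename any .sas7bdat files in DIR whose names
# contain uppercase letters, so that the Driver for BASE can read them
# on UNIX/Linux.
lowercase_datasets() {
  for f in "$1"/*.sas7bdat; do
    [ -e "$f" ] || continue          # no matches: skip the literal pattern
    base=$(basename "$f")
    lower=$(printf '%s' "$base" | tr '[:upper:]' '[:lower:]')
    if [ "$base" != "$lower" ]; then
      mv -- "$f" "$1/$lower"
    fi
  done
}
```

For example, lowercase_datasets /data/lookup would rename LOOKUPSOURCE.sas7bdat in that directory to lookupsource.sas7bdat, leaving already-lowercase files untouched.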
When Opening a Job Log: SOAP-ENV:Client:UNKNOWN Error (or Time-out)

This error occurs on some configurations of Microsoft Windows Server 2003, when the log file exceeds 32KB. A workaround for this problem is to set the following configuration value in the dmserver.cfg file:

DMSERVER/LOG_CHUNK_SIZE = 32KB

This error and this resolution apply only when the host of your Data Management Server is running Windows Server 2003.

Error Occurs in an Address Verification Job on Linux

The following error message content indicates that the job is attempting to use an unsupported version of Address Doctor:

T10:52:11,303 INFO [ ] - Node DATAFLOW_0 started
t10:52:11,390 ERROR [ ] - Unknown locale name

To resolve this error, edit the job to use the latest Address Verification node, which uses the latest version of the Address Doctor software.

DQ Engine Cannot Process New Quality Knowledge Base Definitions

This error occurs when the currently loaded QKB uses definitions that are newer than those that are supported by the current version of the DQ Engine. By default, the engine attempts to load the definitions and issues warnings before loading them. If the definitions include instructions that the engine cannot process, the instructions are ignored and an error is displayed. The QKB/ALLOW_INCOMPAT option can be used to specify whether incompatible QKB definitions are processed. The option is defined in the app.cfg file; it enables you to choose either to stop processing or to allow the incompatibility and continue processing the definitions.

Job with a Custom Scheme Fails to Run

A job with a custom scheme that fails to run produces an error similar to the following:

0817_11:17: ERROR Node DATAFLOW_0 error: 3: DQ Engine - DQ load scheme 'frfra001.sch.bfd' failed: DQ Engine - DQ error -400: DQ Engine - Cannot open file "frfra001.sch"
_11:17: INFO Job terminated due to error in one or more nodes.
To resolve this error, ensure that the name of the scheme is entered correctly; scheme names are case-sensitive. Also ensure that the Quality Knowledge Base (QKB) that you are using is an exact copy of the QKB that was used when the job was created in DataFlux Data Management Studio.
To copy the QKB from Windows to UNIX or Linux, use FTP or Samba mappings. After you copy the QKB, restart the DataFlux Data Management Server and run the job again. In UNIX and Linux, change the scheme name (in the scheme directory of the QKB) as needed to use all-lowercase letters.

Load Balance Service Requests

It is possible for the performance of your real-time services to decline due to a high volume of service requests. The decline in performance can often be resolved by creating a larger pool of available services. If pooling changes do not resolve the issue, you can balance the incoming service requests across multiple instances of the DataFlux Data Management Server. To balance service requests across multiple servers, use a third-party load balancing service. The load balancing service intercepts HTTP requests (SOAP over HTTP) and distributes those requests across the pool of DataFlux Data Management Servers, normally using a round-robin algorithm.

Customize the Server's WSDL File

The DataFlux Data Management Server's client library is available in Java and C#. You can customize the server interface to suit your environment using the Web Service Definition Language (WSDL) file, which contains the descriptions of the available web services. You can access the WSDL file directly at the following path, or request it from the server by entering the server's WSDL URL into a web browser:

install-path/share/arch.wsdl

In the WSDL file, the value of the SOAP:address location reflects the local server's host name and port number. Using an XML editor, you can update the SOAP:address location to reflect the host name and port number of any DataFlux Data Management Server. One note of caution: do not edit any other values in the arch.wsdl file.
For example:

<service name="dfx-dmserver-instance-name">
  <documentation>DataFlux Data Management Server</documentation>
  <port name="DQISService" binding="tns:ArchitectService">
    <SOAP:address location="http://host:port"/>
  </port>
</service>

Customize the WSDL File for Java

To customize arch.wsdl for Java, use a mapping tool, such as wscompile, to generate stubs and build classes that wrap the DataFlux Data Management Server interface. In the examples below, using wscompile, the WSDL file is imported and the stubs are
generated using information from the WSDL file. To use a stub, it must be configured with the service endpoint, or server address.

////////////////////////////////////////////////////////
// Imports
////////////////////////////////////////////////////////
import arch.*;

////////////////////////////////////////////////////////
// INITIALIZATION
////////////////////////////////////////////////////////
ArchitectServicePortType_Stub stub;

// get the stub
stub = (ArchitectServicePortType_Stub)(new DQISService_Impl()).getDQISService();

// optionally set to point to a different end point
stub._setProperty(javax.xml.rpc.Stub.ENDPOINT_ADDRESS_PROPERTY,
                  "http://host:port/");

////////////////////////////////////////////////////////
// 1) Get Object List example
////////////////////////////////////////////////////////
String[] res;
res = stub.getObjectList(ObjectType.ARCHSERVICE);

////////////////////////////////////////////////////////
// 2) Post Object example
////////////////////////////////////////////////////////
byte[] myData;
ObjectDefinition obj = new ObjectDefinition();
obj.setObjectName("name");
obj.setObjectType(ObjectType.fromString("ARCHSERVICE"));

// read the job file in from the h/d
myData = getBytesFromFile(new File(filename));

// post the job to the server
String res = stub.postObject(obj, myData);

////////////////////////////////////////////////////////
// 3) Delete Object
////////////////////////////////////////////////////////
ObjectDefinition obj = new ObjectDefinition();
obj.setObjectName("myjob.ddf");
obj.setObjectType(ObjectType.fromString("ARCHSERVICE"));
String res = stub.deleteObject(obj);

////////////////////////////////////////////////////////
// 4) Get Data Service Params
////////////////////////////////////////////////////////
GetArchitectServiceParamResponse resp;
FieldDefinition[] defs;
resp = stub.getArchitectServiceParams("myjob.ddf", "");

// Get Definitions for Either Input or Output
defs = resp.getInFldDefs();
defs = resp.getOutFldDefs();

// Loop through Defs
defs[i].getFieldName();
defs[i].getFieldType();
defs[i].getFieldLength();

////////////////////////////////////////////////////////
// 5) Execute Data Service
////////////////////////////////////////////////////////
FieldDefinition[] defs;
DataRow[] rows;
String[] row;
GetArchitectServiceResponse resp;

// Fill up the Field Definitions
defs = new FieldDefinition[1];
defs[0] = new FieldDefinition();
defs[0].setFieldName("name");
defs[0].setFieldType(FieldType.STRING);
defs[0].setFieldLength(15);

// Fill up Data matching the definition
rows = new DataRow[3];
row = new String[1];
row[0] = "Test Data";
rows[i] = new DataRow();
rows[i].setValue(row[0]);

resp = stub.executeArchitectService("myjob.ddf", defs, rows, "");

// Get the Status, Output Fields and Data returned from the Execute Call
String res = resp.getStatus();
defs = resp.getFieldDefinitions();
rows = resp.getDataRows();

// Output Field Definitions
defs[i].getFieldName();
defs[i].getFieldType();
defs[i].getFieldLength();

// Output Data
row = rows[i].getValue();
res = row[j];

////////////////////////////////////////////////////////
// 6) Run Batch Job
////////////////////////////////////////////////////////
ArchitectVarValueType[] vals;
vals = new ArchitectVarValueType[1];
vals[0] = new ArchitectVarValueType();
vals[0].setVarName("testvar");
vals[0].setVarValue("testval");

// Returns JOBID
String res = stub.runArchitectJob("myjob.ddf", vals, "");

////////////////////////////////////////////////////////
// 7) Get Job Status
////////////////////////////////////////////////////////
JobStatusDefinition[] defs;

// if you wanted the status for a single job, you would
// pass the jobid returned from runArchitectJob or runProfileJob
defs = stub.getJobStatus("");

ObjectDefinition obj;
obj = defs[i].getJob();
defs[i].getJobId();
defs[i].getStatus();
obj.getObjectName();
obj.getObjectType();

////////////////////////////////////////////////////////
// 8) Get Job Log
////////////////////////////////////////////////////////
GetJobLogResponseType resp;
FileOutputStream fo;
resp = stub.getJobLog(jobId, 0);

// write it to a file
fo = new FileOutputStream(resp.getFileName());
fo.write(resp.getData());
fo.close();

////////////////////////////////////////////////////////
// 9) Terminate Job
////////////////////////////////////////////////////////
String res = stub.terminateJob(jobId);

////////////////////////////////////////////////////////
// 10) Clear Log
////////////////////////////////////////////////////////
String res = stub.deleteJobLog(jobId);

Customize the WSDL File for C#

To customize your arch.wsdl file for a C# environment, import a web reference into your project. This builds the object that is required to interface with the Data Management Server.

////////////////////////////////////////////////////////
// Imports
////////////////////////////////////////////////////////
// Add Web reference using the DataFlux supplied WSDL

////////////////////////////////////////////////////////
// INITIALIZATION
////////////////////////////////////////////////////////
DQISServer.DQISService mService = new DQISServer.DQISService();
mService.Url = "http://hostname" + ":" + "PORT";

////////////////////////////////////////////////////////
// 1) Get Object List example
////////////////////////////////////////////////////////
string[] jobs;
jobs = mService.GetObjectList(DQISServer.ObjectType.ARCHSERVICE);
////////////////////////////////////////////////////////
// 2) Post Object example
////////////////////////////////////////////////////////
DQISServer.ObjectDefinition def = new DQISServer.ObjectDefinition();
def.objectName = "VerifyAddress.ddf";
def.objectType = DQISServer.ObjectType.ARCHSERVICE;

// Grab Bytes from a job file
byte[] data = new byte[short.MaxValue];
FileStream fs = File.Open(@"c:\Develop\SoapUser\VerifyAddress.ddf",
    FileMode.Open, FileAccess.Read, FileShare.None);
fs.Read(data, 0, data.Length);

DQISServer.SendPostObjectRequestType req = new DQISServer.SendPostObjectRequestType();
req.@object = def;
req.data = data;
mService.PostObject(req);

////////////////////////////////////////////////////////
// 3) Delete Object
////////////////////////////////////////////////////////
DQISServer.SendDeleteObjectRequestType req = new DQISServer.SendDeleteObjectRequestType();
DQISServer.ObjectDefinition def = new DQISServer.ObjectDefinition();
def.objectName = "VerifyAddress.ddf";
def.objectType = DQISServer.ObjectType.ARCHSERVICE;
req.job = def;
mService.DeleteObject(req);

////////////////////////////////////////////////////////
// 4) Get Data Service Params
////////////////////////////////////////////////////////
DQISServer.GetArchitectServiceParamResponseType resp;
DQISServer.SendArchitectServiceParamRequestType req;
req = new DQISServer.SendArchitectServiceParamRequestType();
req.serviceName = "myjob";
resp = mService.GetArchitectServiceParams(req);

string val;
int i;
DQISServer.FieldType field;

// loop through this data
val = resp.inFldDefs[0].fieldName;
i = resp.inFldDefs[0].fieldLength;
field = resp.inFldDefs[0].fieldType;

val = resp.outFldDefs[0].fieldName;
i = resp.outFldDefs[0].fieldLength;
field = resp.outFldDefs[0].fieldType;

////////////////////////////////////////////////////////
// 5) Execute Data Service
////////////////////////////////////////////////////////
DQISServer.SendArchitectServiceRequestType req = new DQISServer.SendArchitectServiceRequestType();
DQISServer.GetArchitectServiceResponseType resp;

////////////////////////////////////////////////////////
DQISServer.GetArchitectServiceParamResponseType respParam;
DQISServer.SendArchitectServiceParamRequestType reqParam;
reqParam = new DQISServer.SendArchitectServiceParamRequestType();
reqParam.serviceName = "ServiceName";
respParam = mService.GetArchitectServiceParams(reqParam);
////////////////////////////////////////////////////////

DQISServer.FieldDefinition[] defs;
DQISServer.DataRow[] data_rows;
string[] row;

defs = new DQISServer.FieldDefinition[respParam.inFldDefs.Length];
for (int i = 0; i < respParam.inFldDefs.Length; i++)
{
    // Fill up the Field Definitions
    defs[i] = new DQISServer.FieldDefinition();
    defs[i].fieldName = respParam.inFldDefs[i].fieldName;
    defs[i].fieldType = respParam.inFldDefs[i].fieldType;
    defs[i].fieldLength = respParam.inFldDefs[i].fieldLength;
}

DataTable table = m_inputDataSet.Tables["data"]; // externally provided data

// Fill up Data matching the definition
data_rows = new DQISServer.DataRow[table.Rows.Count];
for (int i = 0; i < table.Rows.Count; i++)
{
    System.Data.DataRow myRow = table.Rows[i];
    row = new String[table.Columns.Count];
    for (int c = 0; c < table.Columns.Count; c++)
    {
        row[c] = myRow[c].ToString();
    }
    // Loop and create rows of data to send to the service
    data_rows[i] = new DQISServer.DataRow();
    data_rows[i].value = new string[table.Columns.Count];
    data_rows[i].value = row;
}

req.serviceName = "ServiceName";
req.fieldDefinitions = defs;
req.dataRows = data_rows;
resp = mService.ExecuteArchitectService(req);

////////////////////////////////////////////////////////
// 6) Run Batch Job
////////////////////////////////////////////////////////
DQISServer.SendRunArchitectJobRequest req = new DQISServer.SendRunArchitectJobRequest();
DQISServer.GetRunArchitectJobResponse resp;

DQISServer.ArchitectVarValueType[] varVal = new DQISServer.ArchitectVarValueType[1];
varVal[0] = new DQISServer.ArchitectVarValueType();
varVal[0].varName = "TESTVAR";
varVal[0].varValue = "TESTVAL";

req.job = "JOB_NAME";
req.varValue = varVal;
resp = mService.RunArchitectJob(req);
string jobid = resp.jobid;

////////////////////////////////////////////////////////
// 7) Get Job Status
////////////////////////////////////////////////////////
DQISServer.SendJobStatusRequestType req = new DQISServer.SendJobStatusRequestType();
DQISServer.JobStatusDefinition[] resp;

req.jobid = "";
resp = mService.GetJobStatus(req);

DQISServer.ObjectDefinition def = resp[0].job;
string jobid = resp[0].jobid;
string jobstatus = resp[0].status;

////////////////////////////////////////////////////////
// 8) Get Job Log
////////////////////////////////////////////////////////
DQISServer.SendJobLogRequestType req = new DQISServer.SendJobLogRequestType();
DQISServer.GetJobLogResponseType resp;

req.jobid = "SOMEJOBID";
resp = mService.GetJobLog(req);
string filename = resp.fileName;
byte[] data = resp.data;

////////////////////////////////////////////////////////
// 9) Terminate Job
////////////////////////////////////////////////////////
DQISServer.SendTerminateJobRequestType req = new DQISServer.SendTerminateJobRequestType();
DQISServer.GetTerminateJobResponseType resp;

req.jobid = "SOMEJOBID";
resp = mService.TerminateJob(req);
string status = resp.status;

////////////////////////////////////////////////////////
// 10) Clear Log
////////////////////////////////////////////////////////
DQISServer.SendDeleteJobLogRequestType req = new DQISServer.SendDeleteJobLogRequestType();
DQISServer.GetDeleteJobLogResponseType resp;

req.jobid = "SOMEJOBID";
resp = mService.DeleteJobLog(req);
string status = resp.status;
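The Java Post Object example earlier in this chapter calls a getBytesFromFile helper that the listing does not define. A minimal sketch of such a helper, using only standard Java NIO (the method name and signature are taken from the example; the implementation and class name JobBytes are illustrative, not part of the generated stubs):

```java
import java.io.File;
import java.io.IOException;
import java.nio.file.Files;

public class JobBytes {
    // Reads the entire job file into a byte array, suitable for
    // passing to stub.postObject() as in the Post Object example.
    public static byte[] getBytesFromFile(File file) throws IOException {
        return Files.readAllBytes(file.toPath());
    }
}
```

Unlike a loop over FileStream.Read (as in the C# example), Files.readAllBytes reads the whole file in one call and sizes the array exactly.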
Chapter 7 Configuration Option Reference

Configuration Options Overview
Configuration Options Reference for dmserver.cfg

Configuration Options Overview

This chapter provides detailed information for the configuration options that are maintained in the file install-path\etc\dmserver.cfg. Edit the configuration file to change option values, and then restart the server to apply your changes. This chapter documents only the subset of the options in dmserver.cfg that pertain directly to the DataFlux Data Management Server. For information about options in dmserver.cfg that are not documented in this chapter, refer to the Configuration Options Reference in the DataFlux Data Management Studio Installation and Configuration Guide. The Data Management Server also uses configuration options that are provided in the file install-path\etc\app.cfg. Reference material for the configuration options in the app.cfg file is also provided in the DataFlux Data Management Studio Installation and Configuration Guide.

Configuration options in dmserver.cfg and app.cfg are generally maintained in parallel between a given DataFlux Data Management Server and the instances of Data Management Studio that upload jobs to that server. Maintaining parallel configurations helps ensure that jobs that are tested in DataFlux Data Management Studio run correctly after they are uploaded to DataFlux Data Management Server.
Configuration Options Reference for dmserver.cfg

BASE/AUTH_SERVER_PASS
Specifies a default password that is submitted to the authenticating server (SAS Metadata Server or DataFlux Authentication Server) when you issue the command dmpexec -a. This option is overridden when the dmpexec command provides a job-specific value. For more information, see Run Jobs with the dmpexec Command.

BASE/AUTH_SERVER_USER
Specifies a default user name that is submitted to the authenticating server (SAS Metadata Server or DataFlux Authentication Server) when you issue the command dmpexec -a.

DMSERVER/CHILD/LISTEN_HOST
Specifies the host name or IP address to which the DataFlux Data Management Server must bind for DFWSVC child process connections. The default value is localhost. For more information about binding to a host name or IP address, see the option DMSERVER/SOAP/LISTEN_HOST.

DMSERVER/CHILD/LISTEN_PORT
Specifies the port on which the DataFlux Data Management Server listens for connections from DFWSVC child processes. This option defaults to a dynamically assigned available port. If this option is specified and you are running multiple instances of the server on the same machine, this port must be unique across all of the ports, both specified and default. For more information about the default ports, see DMSERVER/SOAP/LISTEN_PORT and DMSERVER/WLP/LISTEN_PORT.

DMSERVER/IPACC/ALL_REQUESTS
Controls access to all SOAP requests based on the client's IP address. By default, access is enabled for all IP addresses. As shown in Control Access by IP Address, a non-default value allows or denies access to specified IP addresses.

DMSERVER/IPACC/NOSECURITY
Allows or denies to specified IP addresses the ability to bypass all security checks on the DataFlux Data Management Server. This option is disabled by default.

DMSERVER/IPACC/POST_DELETE
Controls by IP address a client's ability to submit SOAP post and delete requests. These controls restrict uploads and deletions of objects, such as jobs and services. This option is disabled by default.
DMSERVER/JOBS_HISTORY_MAXAGE
Defines a retention period, in seconds, for the history items that are generated by batch and profile job run instances. After the completion of a job, and after the expiration of the specified time period, the DataFlux Data Management Server purges the job's history from memory. The server also deletes any corresponding log files and statistics files. If DMSERVER/JOBS_KEEP_HISTORY=NO, then the history record is also deleted from the history database. The default value of the MAXAGE option is -1, which specifies that history items are never purged. Note that jobs can delete their own log and statistics files by submitting the SOAP command DeleteJobLog.

DMSERVER/JOBS_KEEP_HISTORY
A value of YES specifies that the histories of job run instances are retained across server restarts. The default value is NO.

DMSERVER/JOB_LOGS_DIR
Specifies the path and directory that is used to store the log files that are generated for each run of a batch job or a profile job. This option enables you to separate your job run logs from other logs and files that are generated by job runs. Separating the job run logs makes it easier to see the files that are used to collect job run statistics. Statistics can be collected by the Job Monitor plug-in for SAS Environment Manager. The default value of this option is specified by DMSERVER/WORK_ROOT_PATH, which specifies the storage location for all job-related logs and files.

DMSERVER/JOBS_MAX_NUM
Specifies the maximum number of batch and profile jobs that the DataFlux Data Management Server runs simultaneously. (Batch and profile jobs are counted against the same pool.) The default value is 10. If a new job request exceeds the limit, and if DMSERVER/JOBS_QUEUE=YES, then the job request is placed in a queue. If DMSERVER/JOBS_QUEUE=NO, then an error message is displayed and the new job is not executed.

DMSERVER/JOBS_NO_STATE
Allows or denies batch and profile jobs the ability to generate state files for their runs. The default value is NO, which means that jobs are allowed to generate state files. A value of YES prevents the generation of state files. If your job is denied the ability to generate a state file, the server returns the SOAP Fault message State Generation Not Allowed.
DMSERVER/JOBS_QUEUE
Allows or denies the server the ability to queue batch job run requests when the number of running jobs exceeds the value of the DMSERVER/JOBS_MAX_NUM option. A value of YES queues job run requests and sends clients a success response, as if the job had begun to run. As running jobs finish or are canceled, queued job runs are started and the queue is cleared. This option does not apply to profile jobs. The default value is NO.

DMSERVER/JOBS_ROOT_PATH
Specifies the location of the root directory for the jobs and services subdirectories. The default root directory is install-path\var. The subdirectories for jobs and services are: data services, process services, and batch jobs.

DMSERVER/LOG_CHUNK_SIZE
Controls the size of each log file or statistics file chunk that is sent to the client in response to the getJobLog request. For a log file, this option controls the number of characters per chunk. For statistics files, this option controls the number of bytes per chunk. The default value is 512K.
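For illustration, the job-management options above might be combined in dmserver.cfg as follows. The values are examples only, not recommendations; the MAXAGE value shown corresponds to one week (604800 seconds):

```
DMSERVER/JOBS_MAX_NUM = 20
DMSERVER/JOBS_QUEUE = YES
DMSERVER/JOBS_KEEP_HISTORY = YES
DMSERVER/JOBS_HISTORY_MAXAGE = 604800
DMSERVER/LOG_CHUNK_SIZE = 512K
```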
DMSERVER/NAME
Specifies the name of the metadata definition of the DataFlux Data Management Server that is stored on the SAS Metadata Server. When the DataFlux Data Management Server is started, it uses the name to query the SAS Metadata Server for configuration information. When the option BASE/AUTH_SERVER_LOC in app.cfg identifies a SAS Metadata Server, the DataFlux Data Management Server retrieves and sets the following values:

DMSERVER/SOAP/LISTEN_HOST
DMSERVER/SOAP/LISTEN_PORT
DMSERVER/SOAP/SSL
DMSERVER/SECURE

If the SAS Metadata Server cannot locate a metadata definition based on the name, then the DataFlux Data Management Server does not start. If any of the preceding options have values in the DataFlux Data Management Server's dmserver.cfg file, then the local values override the values that are supplied in metadata. For this reason, it is recommended that you comment out these options in dmserver.cfg. To access the named metadata definition on the SAS Metadata Server, one of two conditions must be met: either the process owner of the DataFlux Data Management Server has a user definition on the SAS Metadata Server, or the named metadata definition is available to the PUBLIC group. This option is ignored when BASE/AUTH_SERVER_LOC identifies a DataFlux Authentication Server rather than a SAS Metadata Server. This option is specified by default when the DataFlux Data Management Server is installed as part of SAS Visual Process Orchestration.

DMSERVER/NO_WORK_SUBDIRS
Specifies whether to create separate log subdirectories. The default value is NO, which means that all log files are created in subdirectories under the default directory, server_logs, or under an alternate directory specified in the DMSERVER/WORK_ROOT_PATH option. The value YES should be applied only in special cases, as it creates numerous log files in a single directory. A single directory makes it difficult to determine which job and service log files belong to which server run instance and its corresponding log file. Each run instance of each process (server, DFWFPROC, and DFWSVC) gets its own unique log file. Therefore, each new DataFlux Data Management Server run instance has to have its own log file, while pre-existing log files, if any, are renamed.
DMSERVER/SECURE
Enables or disables authorization and authentication on the DataFlux Data Management Server. This option also enables the use of extended encryption algorithms for stored passwords and IOM (TCP/IP) communication. The default value NO specifies that SECURE configuration options are ignored. The value YES specifies that other configuration options are required to properly secure the server, including BASE/AUTH_SERVER_LOC. When using a SAS Metadata Server for security, the value of this option is retrieved from metadata when you start the DataFlux Data Management Server. At start time, if the dmserver.cfg file contains a value for this option, then the local value overrides the metadata value. For this reason, it is recommended that you not enable this option in dmserver.cfg when using a SAS Metadata Server. For more information, see DMSERVER/NAME, DMSERVER/SOAP/SSL, and Configure SSL and AES.

DMSERVER/SECURE/DEFAULT_ACE_GROUPS_ALLOW
For the groups that are listed in the value of this option, this option allows access by default to the server's batch jobs and real-time services. This default is overridden when a batch job or real-time service has an access control list. The groups in the list must be recognized by the authentication provider (SAS Metadata Server by default). The group names in the list are case-sensitive. The users in the ALLOW groups cannot have a conflicting permission in any of the other three DEFAULT_ACE options. Name errors or permission conflicts generate error messages and prevent all users from connecting to the server. In the list of groups, the delimiter is a comma or a space, as shown in this example:

DMSERVER/SECURE/DEFAULT_ACE_GROUPS_ALLOW = SalesAllRegion Directors Finance

For more information, see Manage Permissions on page 25.

DMSERVER/SECURE/DEFAULT_ACE_GROUPS_DENY
For the groups that are listed in the value of this option, this option denies access by default to the server's batch jobs and real-time services. This default is overridden when a batch job or real-time service has an access control list. The groups in the list must be recognized by the authentication provider (SAS Metadata Server by default). The group names in the list are case-sensitive. The users in the DENY groups cannot have a conflicting permission in any of the other three DEFAULT_ACE options. Name errors or permission conflicts generate error messages and prevent all users from connecting to the server. In the list of groups, the delimiter is a comma or a space, as shown in this example:

DMSERVER/SECURE/DEFAULT_ACE_GROUPS_DENY = Administrators Manufacturing QualityControl

For more information, see Manage Permissions on page 25.

DMSERVER/SECURE/DEFAULT_ACE_USERS_ALLOW
For the users that are listed in the value of this option, this option allows access by default to the server's batch jobs and real-time services. This default is overridden when a batch job or real-time service has an access control list. The users in the list must be recognized by the authentication provider (SAS Metadata Server by default). The user names in the list are case-sensitive. The users in the list cannot have a conflicting permission in any of the other three DEFAULT_ACE options. In the list of users, the delimiter is a comma or a space, as shown in this example:

DMSERVER/SECURE/DEFAULT_ACE_USERS_ALLOW = JohnC box, don RAM

For more information, see Manage Permissions on page 25.
DMSERVER/SECURE/DEFAULT_ACE_USERS_DENY
For the users that are listed in the value of this option, this option denies access by default to the server's batch jobs and real-time services. This default is overridden when a batch job or real-time service has an access control list. The users in the DENY list must be recognized by the authentication provider (SAS Metadata Server by default). The user names in the list are case-sensitive. The users in the list cannot have a conflicting permission in any of the other three DEFAULT_ACE options. In the list of users, the delimiter is a comma or a space, as shown in this example:

DMSERVER/SECURE/DEFAULT_ACE_USERS_DENY = SMITH, John Data Admin01 JJames

For more information, see Manage Permissions on page 25.

DMSERVER/SECURE/GRP_ADMIN
Specifies the name of the DataFlux Data Management Server administrator group. If this option is defined, the group must exist on the authenticating server (SAS Metadata Server or DataFlux Authentication Server). An error message is generated at server invocation when the following conditions exist:

DMSERVER/SECURE/GRP_ADMIN is not defined
the group is not defined
DMSERVER/SECURE=YES

DMSERVER/SECURE/GRP_ALLOW
Specifies the name of a non-administrative group, to provide access to the DataFlux Data Management Server for a few users and to exclude the rest. If this option is not set, all users are allowed access in accordance with permissions and other configured options. The group must be defined by the same name on the authenticating server (SAS Metadata Server or DataFlux Authentication Server) before the group name can be applied as the value of this option.

DMSERVER/SECURE/GRP_DENY
Specifies the name of a non-administrative group, to provide access to the DataFlux Data Management Server for most users and to exclude a few. If this option is not set, all users are allowed access in accordance with permissions and other configured options. The group must be defined by the same name on the authenticating server (SAS Metadata Server or DataFlux Authentication Server) before the name can be applied as the value of this option.
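As a sketch, a minimal security configuration combining the options above might look like the following in dmserver.cfg. The group names are hypothetical and must already exist on the authenticating server; if DMSERVER/SECURE is supplied from a SAS Metadata Server, omit that line as described under DMSERVER/SECURE:

```
DMSERVER/SECURE = YES
DMSERVER/SECURE/GRP_ADMIN = DMSAdmins
DMSERVER/SECURE/GRP_ALLOW = DMSUsers
DMSERVER/SECURE/DEFAULT_ACE_GROUPS_ALLOW = SalesAllRegion Directors Finance
```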
DMSERVER/SOAP/CONNS_BACKLOG
Specifies the maximum size of the connection request queue. The queue size limit enables the SOAP server to refuse connection requests when it can no longer process them within an acceptable period of time. The default is 100.

DMSERVER/SOAP/DATA_SVC/IDLE_TIMEOUT
Specifies the number of seconds that a DFWSVC process is allowed to remain idle before it is terminated. The default setting is 0, indicating that there is no time-out. Negative values are ignored.

DMSERVER/SOAP/DATA_SVC/JOB_COUNT_MAX
Specifies the maximum number of instances of a given service job that can be loaded at any given time. When this limit is reached, the DataFlux Data Management Server does not load any more instances of that service job. The option value syntax is count:job_file_name. Note: A whitespace character (' ') is used as a separator between potentially multiple entries. Therefore, the path names of the services used in this option cannot contain whitespace characters.

DMSERVER/SOAP/DATA_SVC/JOB_COUNT_MIN
Specifies the minimum number of instances of a given service job that must remain loaded. When this limit is reached, the DataFlux Data Management Server does not unload any more instances of that service job. The option value syntax is count:job_file_name. Note: A whitespace character (' ') is used as a separator between potentially multiple entries. Therefore, the path names of the services used in this option cannot contain whitespace characters.

DMSERVER/SOAP/DATA_SVC/MAX_ERRS
Specifies the maximum number of service errors that can occur in a DFWSVC process before it is forced to terminate. The default is -1, meaning there is no limit.

DMSERVER/SOAP/DATA_SVC/MAX_NUM
Specifies the maximum number of real-time data services that the SOAP server is allowed to run simultaneously. The default is 10. If a new service request would exceed the limit, and if a queue is not enabled, then an error message is displayed. This option applies to the SOAP server, meaning that the service requests are coming from SOAP clients. This option does not apply to the WLP server or to requests from WLP clients.

DMSERVER/SOAP/DATA_SVC/MAX_REQUESTS
Specifies the maximum number of service requests that a given instance of the DFWSVC process is allowed to handle before that instance is forced to terminate. The default is -1, meaning that no limit is enforced.
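To illustrate, the DFWSVC process limits above might be combined as follows; the counts and the service file name pool.ddf are hypothetical:

```
DMSERVER/SOAP/DATA_SVC/MAX_NUM = 20
DMSERVER/SOAP/DATA_SVC/IDLE_TIMEOUT = 300
DMSERVER/SOAP/DATA_SVC/MAX_ERRS = 50
DMSERVER/SOAP/DATA_SVC/JOB_COUNT_MAX = 5:pool.ddf
```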
Chapter 7 Configuration Option Reference

DMSERVER/SOAP/DATA_SVC/MAX_RUNTIME
Specifies the maximum number of seconds that a data service is allowed to run a job and produce a response (output data or an error). If a data service does not produce a response within the time limit, then the corresponding instance of the DFWSVC process is terminated and the client receives a SOAP Fault message. The default value is zero, which means that no time-out occurs. Negative values are ignored. Note that the time-out can vary by one or two seconds due to the rounding up of counts of less than a second.

DMSERVER/SOAP/DATA_SVC/PRELOAD
Specifies the number of instances of specified services that the DataFlux Data Management Server preloads during start-up. The option value syntax consists of one or more blank-separated instances of count:name-of-service. This option can be used with the option DMSERVER/SOAP/DATA_SVC/PRELOAD_ALL. For more information, see Configure the Server to Pre-load Services.
Note: A whitespace character (' ') is used as a separator between multiple entries. Therefore, the pathnames of the services used in this option cannot contain whitespace characters.
Note: Dynamically generated WSDLs can execute preloaded data services.

DMSERVER/SOAP/DATA_SVC/PRELOAD_ALL
Specifies that the DataFlux Data Management Server preload at server initialization a specified number of instances of all real-time data services, including those found in subdirectories. The value of the option must be an integer greater than zero; otherwise, the option is ignored. This option can be used with DMSERVER/SOAP/DATA_SVC/PRELOAD.

DMSERVER/SOAP/DATA_SVC/PRELOAD_DURING_RUN
Allows or denies the DataFlux Data Management Server the ability to start a separate thread to preload services while accepting SOAP requests during preload. The default value NO specifies that the server preloads services before it accepts SOAP requests. The value YES specifies that the server starts a separate thread to preload services and accepts SOAP requests simultaneously. If the server stops while the preload thread is running, that thread is terminated.

DMSERVER/SOAP/DATA_SVC/QUEUE
Allows or denies the DataFlux Data Management Server the ability to queue requests for real-time data services. The default value NO indicates that service requests generate a SOAP Fault error when all instances of the DFWSVC process are busy. The value YES indicates that the DataFlux Data Management Server queues service requests and processes them when instances of the DFWSVC process become available.
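As a sketch of how the preload and queue options above might be combined in dmserver.cfg, assuming the key = value form used by the file's other options (the service pathnames and counts below are invented for illustration):

```
# Preload two instances of one data service and one of another.
# Pathnames are examples only; they cannot contain whitespace.
DMSERVER/SOAP/DATA_SVC/PRELOAD = 2:svc/verify_address.ddf 1:svc/standardize_name.ddf

# Alternatively, preload one instance of every real-time data service,
# including services found in subdirectories.
DMSERVER/SOAP/DATA_SVC/PRELOAD_ALL = 1

# Queue service requests when all DFWSVC instances are busy,
# instead of returning a SOAP Fault to the client.
DMSERVER/SOAP/DATA_SVC/QUEUE = YES
```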
Configuration Options Reference for dmserver.cfg

DMSERVER/SOAP/IGNORE_NS
Allows or denies the DataFlux Data Management Server the ability to ignore namespaces in SOAP requests. The default value NO specifies that NULL values in input data fields are passed to jobs as empty strings. The value YES causes the server to ignore namespaces in SOAP requests, which allows the SOAP server to preserve NULL values when receiving input data instead of converting those values to empty strings.

DMSERVER/SOAP/LISTEN_HOST
Specifies the host name or IP address to which the SOAP server must bind. A machine running DataFlux Data Management Server might be available on the network under different host names or IP addresses, and different machines on the network might have access to this machine via one host name or IP address, but not via others. By binding to a particular host name or IP address, the server hears only requests addressed specifically to that host name or IP address. A host name and the IP address that it maps to can be used interchangeably. For example, if this option is set to localhost, local clients that send requests to the loopback IP address will still be heard. However, requests to the public IP address or host name of the machine (or requests coming from external clients to the machine) will not be heard. By default, this option is left blank, which means that the DataFlux Data Management Server is not bound to any specific host name or IP address and receives all requests that come to the machine on the correct port.
Note: If the value for this option originates from the SAS Metadata Server (when DMSERVER/NAME is configured) and you want to unset that value, add this option to the dmserver.cfg file and leave the value after the = symbol empty. This is necessary when you want the server to start and listen on all available interfaces (as if no value were set for this option).

DMSERVER/SOAP/LISTEN_PORT
Specifies the port on which the SOAP server listens for connections. The default value is when you use a DataFlux Authentication Server for security. When you use a SAS Metadata Server for security, the DataFlux Data Management Server uses the DMSERVER/NAME option to retrieve from metadata the values of three configuration options, including LISTEN_PORT. If the SAS Metadata Server does not return a value for LISTEN_PORT, then the DataFlux Data Management Server does not start. If a value is returned, and if dmserver.cfg also contains a value for LISTEN_PORT, then the local value overrides the metadata value. For this reason, it is recommended that you not set LISTEN_PORT in dmserver.cfg when using a SAS Metadata Server. For more information, see DMSERVER/NAME and DMSERVER/SOAP/SSL.
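For example, the SOAP binding options might be set as follows; the host name and port number are placeholders invented for this sketch, not documented defaults:

```
# Bind the SOAP server to one specific interface and port.
DMSERVER/SOAP/LISTEN_HOST = dmserver.example.com
DMSERVER/SOAP/LISTEN_PORT = 21036

# To unset a LISTEN_HOST value that originates from the SAS Metadata Server,
# include the option with an empty value so the server listens on all interfaces:
# DMSERVER/SOAP/LISTEN_HOST =
```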
DMSERVER/SOAP/PROC_SVC/MAX_NUM
Specifies the maximum number of real-time process services that the DataFlux Data Management Server runs simultaneously. The default is ten. If a new service request would exceed this limit, an error message is displayed.

DMSERVER/SOAP/RDWR_TIMEOUT
Specifies the time-out value for Read and Write operations on a socket. A positive value specifies seconds, a negative value specifies microseconds, and 0 specifies no time-out. If a nonzero value is set, a time-out occurs if no data can be sent or received within the specified time limit. The time-out count begins after the server initiates a send or receive operation over the socket. When a time-out occurs, a SOAP_EOF error is returned. The default value is zero seconds.

DMSERVER/SOAP/RETURN_NULLS
Specifies how real-time data services return null values in output fields. The default value NO specifies that empty strings are returned. The value YES specifies that NULLs are returned.

DMSERVER/SOAP/SSL
Enables or disables the use of SSL to protect SOAP communication. Encryption for SSL is also enabled, using algorithms that exceed the default SASProprietary. The default value of this option is NO. Specify a value of YES to enable the following SSL configuration options:
DMSERVER/SOAP/SSL/CA_CERT_FILE
DMSERVER/SOAP/SSL/CA_CERT_PATH
DMSERVER/SOAP/SSL/KEY_FILE
DMSERVER/SOAP/SSL/KEY_PASSWD
When you use a SAS Metadata Server for security, the value of the DMSERVER/SOAP/SSL option is retrieved from metadata when you start the DataFlux Data Management Server. At start time, if the dmserver.cfg file contains a value for this option, then the local value overrides the metadata value. For this reason, it is recommended that you not specify a value for this option in dmserver.cfg. When you use a DataFlux Authentication Server for security, specify a value of YES in dmserver.cfg to enable SSL.
For more information, see DMSERVER/NAME, DMSERVER/SECURE, and Configure SSL and AES.

DMSERVER/SOAP/SSL/CA_CERT_FILE
Specifies the file in which trusted certificate authority (CA) certificates are stored. If this option is not needed, then comment it out.
DMSERVER/SOAP/SSL/CA_CERT_PATH
Specifies the path to the directory where trusted certificates are stored. If this option is not needed, then comment it out.

DMSERVER/SOAP/SSL/KEY_FILE
Specifies the path to the key file that is required when the SOAP server authenticates clients. If this option is not used, then comment it out.

DMSERVER/SOAP/SSL/KEY_PASSWD
Specifies the password for DMSERVER/SOAP/SSL/KEY_FILE. If the key file is not password protected, then comment out this option. The value of this option must be encrypted in order for the option to be recognized. To encrypt passwords, see Encrypt Passwords for DSNs and SSL.

DMSERVER/SOAP/VALIDATE_XML_CHARS
Specify a value of YES to check every output string from all real-time services to make sure that only valid XML characters are present. All control characters are replaced with a question mark (?). Control characters include integer values less than 32 (space), except 9 (tab), 10 (line feed), and 13 (carriage return). The default value is NO, for no XML validation. Setting this option has a negative impact on the performance of real-time services. In most installations, this option is not necessary.

DMSERVER/SOAP/WSDL
Allows or denies the DataFlux Data Management Server the ability to load WSDLs when the server is initialized. Specify a value of YES to load existing WSDLs at start-up, to recognize jobs that are started by runsvc requests (if matching WSDLs exist), and to recognize other WSDL configuration options. The default value NO specifies that WSDLs are not used, runsvc requests are ignored, and other WSDL configuration options are ignored.
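The SSL options above might be set together as in the following sketch; the file paths are hypothetical, and the key password is shown only as a placeholder because it must be stored in its encrypted form:

```
DMSERVER/SOAP/SSL = YES
# File of trusted CA certificates (example path).
DMSERVER/SOAP/SSL/CA_CERT_FILE = /opt/dmserver/etc/cacert.pem
# Key file used when the SOAP server authenticates clients, plus its
# encrypted password (placeholder shown; see Encrypt Passwords for DSNs and SSL).
DMSERVER/SOAP/SSL/KEY_FILE = /opt/dmserver/etc/server-key.pem
DMSERVER/SOAP/SSL/KEY_PASSWD = <encrypted-value>
# CA_CERT_PATH is commented out because it is not needed in this example.
# DMSERVER/SOAP/SSL/CA_CERT_PATH = /opt/dmserver/etc/certs
```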
DMSERVER/SOAP/WSDL/GEN
Enables or disables the generation of run-time WSDLs. The default value NO specifies that runsvc requests for run-time WSDL jobs generate a SOAP Fault. Requests to upload jobs with run-time WSDLs are also ignored. The only WSDL jobs that run are those whose WSDLs are preloaded at the invocation of the DataFlux Data Management Server. The value SINGLE enables the generation of a single WSDL for each postjob request or for each genwsdl request for a single file. The value MULTIPLE enables the generation of multiple WSDLs for genwsdl requests that apply to multiple job files or to entire directories of jobs. Note that generating a WSDL can be a time-consuming and resource-intensive process, depending on the parameters of the originating job. Also note that a request to generate WSDLs for all jobs under the root directory can cause a severe degradation in server performance. Specify the value MULTIPLE with caution.

DMSERVER/SOAP/WSDL/GEN_ON_GET
Specifies how the DataFlux Data Management Server generates WSDLs for jobs that include run-time WSDLs. The value NO specifies that an error message is generated in response to HTTP getwsdl requests when:
- a WSDL does not exist on the server, or
- the WSDL does exist on the server, but the mod.time timestamp in the server WSDL differs from the same value in the client WSDL. A difference in mod.time values indicates that the client WSDL differs from the corresponding server WSDL.
The value YES indicates that the server responds to the preceding conditions by generating a new WSDL and sending the new WSDL to the client. This option is valid only when the value of DMSERVER/SOAP/WSDL/GEN is SINGLE or MULTIPLE.
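To illustrate, a server that loads WSDLs at start-up and generates them one job at a time might use the following values (a sketch based on the descriptions above):

```
# Load existing WSDLs at start-up and honor runsvc requests.
DMSERVER/SOAP/WSDL = YES
# Generate one WSDL per postjob or single-file genwsdl request;
# MULTIPLE would also allow directory-wide generation, at a performance cost.
DMSERVER/SOAP/WSDL/GEN = SINGLE
# Regenerate and return a WSDL when a getwsdl request finds none,
# or finds a mod.time mismatch with the client's copy.
DMSERVER/SOAP/WSDL/GEN_ON_GET = YES
```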
DMSERVER/SOAP/WSDL/RUN_IGNORE_MTIME
Allows or denies the server the ability to detect differences between the mod.time timestamps in the DataFlux Data Management Server WSDL and in the WSDL of the client that submits a service request. The default value NO indicates that the server responds to mod.time differences by sending the client the SOAP Fault message Job Has Changed. This message indicates that the client needs to request the server to generate a new WSDL and send it to the client, so that the client can update its version of the WSDL. The value YES specifies that the server does not compare mod.time values, and service requests are passed to a WLP process for execution. Note that this is the behavior of service requests based on generic WSDL.

DMSERVER/THREADS/COUNT_MAX
Specifies the maximum number of threads that can be started by the DataFlux Data Management Server. The default value is 1026 threads. If the setting is too low, it is adjusted automatically. There is no setting for an unlimited number of threads. For optimal performance, configure the number of threads based on the expected number of parallel clients and requests.

DMSERVER/THREADS/IDLE_MAX
Specifies the number of idle threads that can be kept open by the DataFlux Data Management Server. The default value of zero indicates that threads are terminated when they become idle. If a terminated thread is needed again, it is restarted.

DMSERVER/THREADS/IDLE_TIMEOUT
Specifies the number of microseconds before a thread is flagged as idle after the thread stops doing work. The default of zero indicates that threads are immediately flagged as idle.

DMSERVER/WLP
Enables or disables the execution of the WLP server. The default value NO specifies that the WLP server is bypassed at the invocation of the DataFlux Data Management Server. WLP clients then cannot connect to the DataFlux Data Management Server. (SOAP clients can still connect to the server.) The value YES specifies that the WLP server starts and listens at its assigned port. The DataFlux Data Management Server log file contains status entries for the WLP server.

DMSERVER/WLP/DATA_SVC/MAX_NUM
Specifies the maximum number of real-time data services that the WLP server is allowed to run simultaneously. The default is 10. If a new service request would exceed this limit, an error message is sent to the requesting WLP client. This option does not apply to the SOAP server or to requests coming from SOAP clients.
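The thread options above might be tuned as follows for a server expecting a moderate number of parallel clients; the values are illustrative, not recommendations:

```
# Allow up to 256 threads; a setting that is too low is adjusted automatically.
DMSERVER/THREADS/COUNT_MAX = 256
# Keep up to 8 idle threads open rather than terminating them immediately.
DMSERVER/THREADS/IDLE_MAX = 8
```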
DMSERVER/WLP/LISTEN_HOST
Specifies a host name or IP address to which the WLP server must bind. This specific binding prevents the WLP server from responding to requests that are sent to its port with a different host name or IP address. A host name and the IP address that it maps to can be used interchangeably. For example, if the option value is localhost, then local clients that send requests to the loopback IP address will still be heard. However, requests that are sent to a public IP address or to the host name of the machine, or that come from external clients, will not be heard. By default, this option is left blank, so that the WLP server is not bound to a specific host name or IP address. The WLP server then responds to all requests that apply to the DataFlux Data Management Server host and to the port of the WLP server.

DMSERVER/WLP/LISTEN_PORT
Specifies the port on which the WLP server listens for requests from WLP clients. If you are running multiple instances of the server on the same machine, then a unique port must be configured for each instance.

DMSERVER/WORK_ROOT_PATH
Specifies the root directory under which the DataFlux Data Management Server work and log subdirectories are created. Each time the server starts, a new work directory is created for that instance of the server. The name of this directory contains the server start-up date and time, as well as the corresponding process ID. The default directory is install-path\var\server_logs.
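A combined sketch of the WLP and work-directory options; the port number and Windows-style path are hypothetical examples, not documented defaults:

```
# Start the WLP server and let it run up to 20 simultaneous data services.
DMSERVER/WLP = YES
DMSERVER/WLP/DATA_SVC/MAX_NUM = 20
# Give each server instance on this machine its own WLP port (example value).
DMSERVER/WLP/LISTEN_PORT = 21037
# Root directory for per-instance work and log subdirectories (example path).
DMSERVER/WORK_ROOT_PATH = C:\dmserver\var
```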
Recommended Reading

DataFlux Data Management Server Help
DataFlux Data Management Studio User's Guide
DataFlux Data Management Studio Installation and Configuration Guide
SAS Visual Process Orchestration Server Administrator's Guide
SAS Intelligence Platform: System Administration Guide
SAS Intelligence Platform: Security Administration Guide
SAS Federation Server Administrator's Guide
DataFlux Authentication Server Administrator's Guide
DataFlux Web Studio User's Guide
DataFlux Web Studio Installation and Configuration Guide

For a complete list of SAS books, go to support.sas.com/bookstore. If you have questions about which titles you need, please contact a SAS Book Sales Representative: SAS Books, SAS Campus Drive, Cary, NC. Phone: Fax: [email protected] Web address: support.sas.com/bookstore
Glossary

ACE
An access control entry (ACE) is an item in an access control list that is used to administer object and user privileges such as read, write, and execute.

ACL
Access control lists (ACLs) are used to secure access to individual DataFlux Data Management Server objects.

API
An application programming interface (API) is a set of routines, data structures, object classes, and/or protocols provided by libraries and/or operating system services to support the building of applications.

DAC
A data access component (DAC) allows software to communicate with databases and manipulate data.

dfwfproc
A process handled by DataFlux Data Management Server that runs process services, batch jobs, and profile jobs.

dfwsvc
A DataFlux Data Management Server process that runs real-time services.

DPV
Delivery Point Validation (DPV) is a USPS database that checks the validity of residential and commercial addresses.

DSN
A data source name (DSN) contains connection information, such as user name and password, that is used to connect to a database through an ODBC driver.

LACS
Locatable Address Conversion System (LACS) is used to update mailing addresses when a street is renamed or an address is updated for 911 service, usually by changing a rural route format to an urban/city format.

MMC
The Microsoft Management Console (MMC) is an interface, introduced with the Microsoft Windows 2000 platform, that combines several administrative tools into one configurable interface.

ODBC
Open Database Connectivity (ODBC) is an open standard application programming interface (API) for accessing databases.

OpenSSL
The open-source implementation of SSL. See SSL.

PID
Process ID; a number used to uniquely identify a process.

QAS
Quick Address Software (QAS) is used to verify and standardize US addresses at the point of entry. Verification is based on the latest USPS address data file.

QKB
The Quality Knowledge Base (QKB) is a collection of files and configuration settings that contain all DataFlux data management algorithms. The QKB is directly editable using DataFlux Data Management Studio.

RDI
Residential Delivery Indicator (RDI) identifies addresses as residential or commercial.

SERP
The Software Evaluation and Recognition Program (SERP) is a program that Canada Post administers to certify address verification software.

SOA
Service-oriented architecture (SOA) enables systems to communicate with the master customer reference database to request or update information.

SOAP
Simple Object Access Protocol (SOAP) is a web service protocol used to encode requests and responses to be sent over a network. This XML-based protocol is platform independent and can be used with a variety of Internet protocols.

SSL
Secure Sockets Layer (SSL) is a security protocol that enables websites to pass sensitive information securely in an encrypted format.

USPS
The United States Postal Service (USPS) provides postal services in the United States. The USPS offers address verification and standardization tools.

WSDL
Web Services Description Language (WSDL) is an XML-based language that provides a model for describing web services.
SAS University Edition: Installation Guide for Amazon Web Services
SAS University Edition: Installation Guide for Amazon Web Services i 17 June 2014 The correct bibliographic citation for this manual is as follows: SAS Institute Inc. 2015. SAS University Edition: Installation
SAS MDM 4.1. User s Guide Second Edition. SAS Documentation
SAS MDM 4.1 User s Guide Second Edition SAS Documentation The correct bibliographic citation for this manual is as follows: SAS Institute Inc. 2014. SAS MDM 4.1: User's Guide, Second Edition. Cary, NC:
Installing and Configuring vcenter Multi-Hypervisor Manager
Installing and Configuring vcenter Multi-Hypervisor Manager vcenter Server 5.1 vcenter Multi-Hypervisor Manager 1.1 This document supports the version of each product listed and supports all subsequent
Configuring Apache HTTP Server as a Reverse Proxy Server for SAS 9.2 Web Applications Deployed on BEA WebLogic Server 9.2
Configuration Guide Configuring Apache HTTP Server as a Reverse Proxy Server for SAS 9.2 Web Applications Deployed on BEA WebLogic Server 9.2 This document describes how to configure Apache HTTP Server
How To Install An Aneka Cloud On A Windows 7 Computer (For Free)
MANJRASOFT PTY LTD Aneka 3.0 Manjrasoft 5/13/2013 This document describes in detail the steps involved in installing and configuring an Aneka Cloud. It covers the prerequisites for the installation, the
User Guide. Informatica Smart Plug-in for HP Operations Manager. (Version 8.5.1)
User Guide Informatica Smart Plug-in for HP Operations Manager (Version 8.5.1) Informatica Smart Plug-in for HP Operations Manager User Guide Version 8.5.1 December 2008 Copyright 2008 Informatica Corporation.
Guide to SAS/AF Applications Development
Guide to SAS/AF Applications Development SAS Documentation The correct bibliographic citation for this manual is as follows: SAS Institute Inc. 2012. Guide to SAS/AF Applications Development. Cary, NC:
ODBC Driver User s Guide. Objectivity/SQL++ ODBC Driver User s Guide. Release 10.2
ODBC Driver User s Guide Objectivity/SQL++ ODBC Driver User s Guide Release 10.2 Objectivity/SQL++ ODBC Driver User s Guide Part Number: 10.2-ODBC-0 Release 10.2, October 13, 2011 The information in this
Plug-In for Informatica Guide
HP Vertica Analytic Database Software Version: 7.0.x Document Release Date: 2/20/2015 Legal Notices Warranty The only warranties for HP products and services are set forth in the express warranty statements
SAS Add-In 2.1 for Microsoft Office: Getting Started with Data Analysis
SAS Add-In 2.1 for Microsoft Office: Getting Started with Data Analysis The correct bibliographic citation for this manual is as follows: SAS Institute Inc. 2007. SAS Add-In 2.1 for Microsoft Office: Getting
MadCap Software. Upgrading Guide. Pulse
MadCap Software Upgrading Guide Pulse Copyright 2014 MadCap Software. All rights reserved. Information in this document is subject to change without notice. The software described in this document is furnished
Management Center. Installation and Upgrade Guide. Version 8 FR4
Management Center Installation and Upgrade Guide Version 8 FR4 APPSENSE MANAGEMENT CENTER INSTALLATION AND UPGRADE GUIDE ii AppSense Limited, 2012 All rights reserved. part of this document may be produced
SAS 9.3 Open Metadata Interface: Reference and Usage
SAS 9.3 Open Metadata Interface: Reference and Usage SAS Documentation The correct bibliographic citation for this manual is as follows: SAS Institute Inc. 2011. SAS 9.3 Open Metadata Interface: Reference
Migrating to vcloud Automation Center 6.1
Migrating to vcloud Automation Center 6.1 vcloud Automation Center 6.1 This document supports the version of each product listed and supports all subsequent versions until the document is replaced by a
Out n About! for Outlook Electronic In/Out Status Board. Administrators Guide. Version 3.x
Out n About! for Outlook Electronic In/Out Status Board Administrators Guide Version 3.x Contents Introduction... 1 Welcome... 1 Administration... 1 System Design... 1 Installation... 3 System Requirements...
Managing Multi-Hypervisor Environments with vcenter Server
Managing Multi-Hypervisor Environments with vcenter Server vcenter Server 5.1 vcenter Multi-Hypervisor Manager 1.0 This document supports the version of each product listed and supports all subsequent
TIBCO Spotfire Automation Services 6.5. Installation and Deployment Manual
TIBCO Spotfire Automation Services 6.5 Installation and Deployment Manual Revision date: 17 April 2014 Important Information SOME TIBCO SOFTWARE EMBEDS OR BUNDLES OTHER TIBCO SOFTWARE. USE OF SUCH EMBEDDED
IBM Endpoint Manager Version 9.2. Patch Management for SUSE Linux Enterprise User's Guide
IBM Endpoint Manager Version 9.2 Patch Management for SUSE Linux Enterprise User's Guide IBM Endpoint Manager Version 9.2 Patch Management for SUSE Linux Enterprise User's Guide Note Before using this
Communications Access Methods for SAS/CONNECT 9.4 and SAS/SHARE 9.4 Second Edition
Communications Access Methods for SAS/CONNECT 9.4 and SAS/SHARE 9.4 Second Edition SAS Documentation The correct bibliographic citation for this manual is as follows: SAS Institute Inc. 2015. Communications
Integrating with BarTender Integration Builder
Integrating with BarTender Integration Builder WHITE PAPER Contents Overview 3 Understanding BarTender's Native Integration Platform 4 Integration Builder 4 Administration Console 5 BarTender Integration
Configuring IBM HTTP Server as a Reverse Proxy Server for SAS 9.3 Web Applications Deployed on IBM WebSphere Application Server
Configuration Guide Configuring IBM HTTP Server as a Reverse Proxy Server for SAS 9.3 Web Applications Deployed on IBM WebSphere Application Server This document is revised for SAS 9.3. In previous versions
PATROL Console Server and RTserver Getting Started
PATROL Console Server and RTserver Getting Started Supporting PATROL Console Server 7.5.00 RTserver 6.6.00 February 14, 2005 Contacting BMC Software You can access the BMC Software website at http://www.bmc.com.
User Installation Guide for SAS 9.1 Foundation for 64-bit Microsoft Windows
User Installation Guide for SAS 9.1 Foundation for 64-bit Microsoft Windows Installation Instructions Where to Begin SAS Setup Wizard Repair or Remove SAS Software Glossary Where to Begin Most people who
PUBLIC Management Console Guide
SAP Data Services Document Version: 4.2 Support Package 7 (14.2.7.0) 2016-05-06 PUBLIC Content 1 Introduction.... 7 1.1 Welcome to SAP Data Services....7 Welcome....7 Documentation set for SAP Data Services....7
enicq 5 System Administrator s Guide
Vermont Oxford Network enicq 5 Documentation enicq 5 System Administrator s Guide Release 2.0 Published November 2014 2014 Vermont Oxford Network. All Rights Reserved. enicq 5 System Administrator s Guide
FileMaker Server 13. Getting Started Guide
FileMaker Server 13 Getting Started Guide 2007 2013 FileMaker, Inc. All Rights Reserved. FileMaker, Inc. 5201 Patrick Henry Drive Santa Clara, California 95054 FileMaker and Bento are trademarks of FileMaker,
Sophos for Microsoft SharePoint startup guide
Sophos for Microsoft SharePoint startup guide Product version: 2.0 Document date: March 2011 Contents 1 About this guide...3 2 About Sophos for Microsoft SharePoint...3 3 System requirements...3 4 Planning
TIBCO Spotfire Web Player 6.0. Installation and Configuration Manual
TIBCO Spotfire Web Player 6.0 Installation and Configuration Manual Revision date: 12 November 2013 Important Information SOME TIBCO SOFTWARE EMBEDS OR BUNDLES OTHER TIBCO SOFTWARE. USE OF SUCH EMBEDDED
WhatsUp Gold v16.3 Installation and Configuration Guide
WhatsUp Gold v16.3 Installation and Configuration Guide Contents Installing and Configuring WhatsUp Gold using WhatsUp Setup Installation Overview... 1 Overview... 1 Security considerations... 2 Standard
Configuring Secure Socket Layer and Client-Certificate Authentication on SAS 9.3 Enterprise BI Server Systems That Use Oracle WebLogic 10.
Configuring Secure Socket Layer and Client-Certificate Authentication on SAS 9.3 Enterprise BI Server Systems That Use Oracle WebLogic 10.3 Table of Contents Overview... 1 Configuring One-Way Secure Socket
Copyright 2012 Trend Micro Incorporated. All rights reserved.
Trend Micro Incorporated reserves the right to make changes to this document and to the products described herein without notice. Before installing and using the software, please review the readme files,
Getting Started with. Ascent Capture Internet Server 5. 10300260-000 Revision A
Ascent Capture Internet Server 5 Getting Started with Ascent Capture Internet Server 5 10300260-000 Revision A Copyright Copyright 2001 Kofax Image Products. All Rights Reserved. Printed in USA. The information
WhatsUp Gold v16.2 Installation and Configuration Guide
WhatsUp Gold v16.2 Installation and Configuration Guide Contents Installing and Configuring Ipswitch WhatsUp Gold v16.2 using WhatsUp Setup Installing WhatsUp Gold using WhatsUp Setup... 1 Security guidelines
CA Performance Center
CA Performance Center Single Sign-On User Guide 2.4 This Documentation, which includes embedded help systems and electronically distributed materials, (hereinafter referred to as the Documentation ) is
RSA Security Analytics
RSA Security Analytics Event Source Log Configuration Guide Microsoft SQL Server Last Modified: Thursday, July 30, 2015 Event Source Product Information: Vendor: Microsoft Event Source: SQL Server Versions:
Active Directory Adapter with 64-bit Support Installation and Configuration Guide
IBM Security Identity Manager Version 6.0 Active Directory Adapter with 64-bit Support Installation and Configuration Guide SC27-4384-02 IBM Security Identity Manager Version 6.0 Active Directory Adapter
Novell ZENworks 10 Configuration Management SP3
AUTHORIZED DOCUMENTATION Software Distribution Reference Novell ZENworks 10 Configuration Management SP3 10.3 November 17, 2011 www.novell.com Legal Notices Novell, Inc., makes no representations or warranties
HP IMC Firewall Manager
HP IMC Firewall Manager Configuration Guide Part number: 5998-2267 Document version: 6PW102-20120420 Legal and notice information Copyright 2012 Hewlett-Packard Development Company, L.P. No part of this
Technical Paper. Defining an ODBC Library in SAS 9.2 Management Console Using Microsoft Windows NT Authentication
Technical Paper Defining an ODBC Library in SAS 9.2 Management Console Using Microsoft Windows NT Authentication Release Information Content Version: 1.0 October 2015. Trademarks and Patents SAS Institute
TREENO ELECTRONIC DOCUMENT MANAGEMENT. Administration Guide
TREENO ELECTRONIC DOCUMENT MANAGEMENT Administration Guide October 2012 Contents Introduction... 8 About This Guide... 9 About Treeno... 9 Managing Security... 10 Treeno Security Overview... 10 Administrator
BEAWebLogic. Portal. WebLogic Portlets for SAP Installation Guide
BEAWebLogic Portal WebLogic Portlets for SAP Installation Guide Version 8.1 with Service Pack 4 (SAP Portlets Version 1.1) Document Revised: September 2004 Copyright Copyright 2004-2005 BEA Systems, Inc.
CA arcserve Unified Data Protection Agent for Linux
CA arcserve Unified Data Protection Agent for Linux User Guide Version 5.0 This Documentation, which includes embedded help systems and electronically distributed materials, (hereinafter referred to as
Getting Started Guide
BlackBerry Web Services For Microsoft.NET developers Version: 10.2 Getting Started Guide Published: 2013-12-02 SWD-20131202165812789 Contents 1 Overview: BlackBerry Enterprise Service 10... 5 2 Overview:
Oracle WebCenter Content Service for Microsoft Exchange
Oracle WebCenter Content Service for Microsoft Exchange Installation and Upgrade Guide 10g Release 3 (10.3) November 2008 Oracle WebCenter Content Service for Microsoft Exchange Installation and Upgrade
HYPERION SYSTEM 9 N-TIER INSTALLATION GUIDE MASTER DATA MANAGEMENT RELEASE 9.2
HYPERION SYSTEM 9 MASTER DATA MANAGEMENT RELEASE 9.2 N-TIER INSTALLATION GUIDE P/N: DM90192000 Copyright 2005-2006 Hyperion Solutions Corporation. All rights reserved. Hyperion, the Hyperion logo, and
Legal Notes. Regarding Trademarks. 2012 KYOCERA Document Solutions Inc.
Legal Notes Unauthorized reproduction of all or part of this guide is prohibited. The information in this guide is subject to change without notice. We cannot be held liable for any problems arising from
