docs.rackspace.com/api


Rackspace Cloud Big Data
Getting Started
API v2.0 ( )
2015 Rackspace US, Inc.

This guide is intended for software developers interested in developing applications using the Rackspace Cloud Big Data Application Programming Interface (API).

The document is for informational purposes only and is provided AS IS. RACKSPACE MAKES NO REPRESENTATIONS OR WARRANTIES OF ANY KIND, EXPRESS OR IMPLIED, AS TO THE ACCURACY OR COMPLETENESS OF THE CONTENTS OF THIS DOCUMENT AND RESERVES THE RIGHT TO MAKE CHANGES TO SPECIFICATIONS AND PRODUCT/SERVICES DESCRIPTION AT ANY TIME WITHOUT NOTICE. RACKSPACE SERVICES OFFERINGS ARE SUBJECT TO CHANGE WITHOUT NOTICE. USERS MUST TAKE FULL RESPONSIBILITY FOR APPLICATION OF ANY SERVICES MENTIONED HEREIN. EXCEPT AS SET FORTH IN RACKSPACE GENERAL TERMS AND CONDITIONS AND/OR CLOUD TERMS OF SERVICE, RACKSPACE ASSUMES NO LIABILITY WHATSOEVER, AND DISCLAIMS ANY EXPRESS OR IMPLIED WARRANTY, RELATING TO ITS SERVICES INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTY OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE, AND NONINFRINGEMENT.

Except as expressly provided in any written license agreement from Rackspace, the furnishing of this document does not give you any license to patents, trademarks, copyrights, or other intellectual property. Rackspace, the Rackspace logo, and Fanatical Support are registered service marks of Rackspace US, Inc. All other product names and trademarks used in this document are for identification purposes only and are the property of their respective owners.

Table of Contents

1. Overview
   1.1. Cloud Big Data concepts
   1.2. Use cases
   1.3. Prerequisites for running examples
   1.4. Pricing and service level
2. Service access endpoints
3. Sending requests to Cloud Big Data
   3.1. Using curl
      3.1.1. Sending API requests by using curl
      3.1.2. Copying and pasting curl request examples into a terminal window
   3.2. Setting up the python-lavaclient CLI
      3.2.1. Prerequisites
      3.2.2. Installing the CLI
4. Generating an authentication token using curl
5. Generating an authentication token using the lavaclient
6. Creating and managing credentials
   6.1. Creating a credential (curl example, client example)
   6.2. Listing all credentials (curl example, client example)
   6.3. Updating credentials (curl example, client example)
   6.4. Deleting credentials (curl example, client example)
7. Viewing resource limits (curl example, client example)
8. Creating and managing Hadoop clusters
   8.1. Listing flavors (curl example, client example)
   8.2. Listing available distros (curl example, client example)
   8.3. Listing available stacks (curl example, client example)
   8.4. Creating a cluster (curl example, client example)
   8.5. Listing clusters (curl example, client example)
   8.6. Viewing node details (curl example, client example)
   8.7. Resizing clusters (curl example, client example)
   8.8. Creating a script (curl example, client example)
   8.9. Listing all scripts (curl example, client example)
   8.10. Deleting clusters (curl example, client example)
Additional resources
Document change history
Glossary

List of Tables

2.1. Regionalized service endpoints
3.1. curl command-line options

List of Examples

3.1. curl authenticate request: JSON
4.1. curl authenticate request: JSON
4.2. Authenticate response: JSON
5.1. Authentication response using CLI utility
5.2. Export environment variables
6.1. curl create a credential - ssh_keys request
6.2. Create a credential - ssh_keys request: JSON body
6.3. Create a credential - ssh_keys response: JSON
6.4. curl create a credential - cloud_files request
6.5. Create a credential - cloud_files request: JSON body
6.6. Create a credential - cloud_files response: JSON
6.7. Create an SSH credential using the CLI
6.8. Create a Cloud Files credential using the CLI
6.9. curl list all credentials request
6.10. List all credentials using the CLI
6.11. curl update a credential request
6.12. Update a credential request: JSON body
6.13. Update a credential response: JSON
6.14. Update a credential using the CLI
6.15. curl delete a credential request
6.16. Delete a credential using the CLI
7.1. curl view resource limits request: JSON
7.2. View resource limits response: JSON
7.3. View resource limits
8.1. curl list flavors request: JSON
8.2. List flavors response: JSON
8.3. List flavors and associated resources by using the flavors list command with the CLI
8.4. curl list available distros request: JSON
8.5. List available distros response: JSON
8.6. View available distros with the CLI
8.7. curl list all stacks request: JSON
8.8. List all stacks response: JSON
8.9. View available stacks with the CLI
8.10. curl create cluster request
8.11. Create cluster request: JSON body
8.12. Create cluster response: JSON
8.13. Create a cluster with the CLI
8.14. curl list clusters request: JSON
8.15. List clusters response: JSON
8.16. List clusters with the CLI
8.17. curl list cluster nodes request: JSON
8.18. List cluster nodes response: JSON
8.19. Query the details of a cluster by using the show and nodes commands with the CLI
8.20. curl resize cluster request: JSON
8.21. Resize cluster request: JSON body
8.22. Resize cluster response: JSON
8.23. Increase cluster size by using the resize command with the CLI
8.24. curl create a script
8.25. Create a script request: JSON body
8.26. Create a script response: JSON
8.27. Create a script with the CLI
8.28. curl list all scripts request
8.29. List all scripts response: JSON
8.30. List available scripts with the CLI
8.31. curl delete cluster request: JSON
8.32. Remove clusters by using the delete command

1. Overview

Rackspace Cloud Big Data is an on-demand Apache Hadoop service for the Rackspace open cloud. The service supports a RESTful API and alleviates the pain associated with deploying, managing, and scaling Hadoop clusters.

Cloud Big Data is just as flexible and feature-rich as Hadoop. With Cloud Big Data, you benefit from on-demand servers, utility-based pricing, and access to the full set of Hadoop features and APIs. However, you do not have to worry about provisioning, growing, or maintaining your Hadoop infrastructure. The Cloud Big Data service uses an environment that is specifically optimized for Hadoop, which ensures that your jobs run efficiently and reliably. Note that you are still responsible for developing, troubleshooting, and deploying your applications.

The primary use cases for Cloud Big Data are as follows:

- Create on-demand infrastructure for applications in production where physical servers would be too costly and time-consuming to configure and maintain.
- Develop, test, and pilot data analysis applications.

Cloud Big Data provides the following benefits:

- Create or resize Hadoop clusters in minutes and pay only for what you use.
- Access the Hortonworks Data Platform (HDP), an enterprise-ready distribution that is 100 percent Apache open source.
- Provision and manage Hadoop through an easy-to-use Control Panel and a RESTful API.
- Seamlessly access data in Cloud Files containers.
- Gain interoperability with any third-party software tool that supports HDP.
- Access Fanatical Support on a 24x7x365 basis via chat, phone, or ticket.

This guide provides examples for the following ways to use the Cloud Big Data API:

- Using the API directly with curl
- Using the python-lavaclient command-line client (CLI)

Examples for both ways to make requests to Cloud Big Data are provided for authentication (Chapter 4, Generating an authentication token using curl, and Chapter 5, Generating an authentication token using the lavaclient) and for creating and managing clusters (Chapter 8, Creating and managing Hadoop clusters).

1.1. Cloud Big Data concepts

To use the Cloud Big Data API effectively, you should understand the following terminology:

- Credentials: Credentials allow you to set up SSH keys and other connector credentials, such as Cloud Files credentials, for use with clusters.
- Distros: Distros provide a list of supported distributions and their corresponding versions, as well as a list of supported services and components per distribution.
- Stacks: Stacks are high-level building blocks of software that compose a Big Data architecture. Stacks are composed of services, which in turn are composed of components. A stack is specific to a distribution because of the differences in services that are supported across distributions.
- Clusters: A cluster is a group of servers (nodes). Cloud Big Data supports both virtual and OnMetal servers.
- Nodes: A node is either a virtual or an OnMetal server that serves a particular role in the cluster. A node runs one or more components in the Hadoop ecosystem.
- Scripts: You can create a custom script that runs during various phases of the cluster's life cycle. The script is invoked on all nodes of the cluster. The only script type currently supported is POST_INIT, which runs after the cluster is completely set up. The script must be executable. Preferably, the script should be a bash script, but it can also be a Python script or a self-contained executable that works with the base libraries of the installed OS.
- Flavors: A flavor is an available configuration for each node in a Cloud Big Data cluster. Each flavor has a unique combination of memory capacity, priority for CPU time, and storage space.
- Resource limits: Resource limits include items such as remaining node count, available RAM, and remaining disk space for the user.

For the definitions of additional terminology related to Cloud Big Data, see the Glossary.

1.2. Use cases

Use cases for Cloud Big Data include, but are not limited to, the following examples:

- Clickstream analysis: Analyze clickstream data to segment users and understand user preferences. Advertisers can also analyze clickstreams and advertising impression logs to deliver more effective ads.
- Log analysis: Process logs generated by web and mobile applications. Cloud Big Data helps customers turn petabytes of unstructured or semi-structured data into useful insights about their applications or users.
- Sentiment analysis: Examine a corpus of text to determine the attitude of a speaker or writer with respect to some topic, or the overall contextual polarity of a document.

1.3. Prerequisites for running examples

To run the examples in this guide, you must have the following prerequisites:

- A Rackspace Cloud account
- A Rackspace Cloud username and password, as specified during registration
- Prior knowledge of HTTP/1.1 conventions
- Basic familiarity with cloud and RESTful APIs
- Prior knowledge of Hadoop or a third-party tool that works with Hadoop
- Ability to work with the Hortonworks Data Platform (HDP)

By using the Cloud Big Data API, you understand and agree to the following limitations and conditions:

- Cloud Big Data includes a Swift integration feature so that Hadoop, MapReduce, Pig, Hive, and Spark jobs can directly reference Cloud Files containers.

1.4. Pricing and service level

Cloud Big Data is part of the Rackspace Cloud, and your use through the API is billed according to the published pricing schedule. A Service Level Agreement (SLA) for Cloud Big Data is also available.

2. Service access endpoints

The Cloud Big Data service is a regionalized service. The user of the service is therefore responsible for appropriate replication, caching, and overall maintenance of Cloud Big Data data across regional boundaries to other Cloud Servers.

The endpoints to use for your Cloud Big Data API calls are summarized in the following table. To help you decide which regionalized endpoint to use, read the Knowledge Center article about special considerations for choosing a data center, "About Regions."

Table 2.1. Regionalized service endpoints

Region                    Endpoint
Chicago (ORD)             https://ord.bigdata.api.rackspacecloud.com/v2/youraccountid
Dallas/Ft. Worth (DFW)    https://dfw.bigdata.api.rackspacecloud.com/v2/youraccountid
London (LON)              https://lon.bigdata.api.rackspacecloud.com/v2/youraccountid
Northern Virginia (IAD)   https://iad.bigdata.api.rackspacecloud.com/v2/youraccountid

Replace the youraccountid placeholder with your actual account number, which is returned as part of the authentication service response, after the final / in the publicURL field.

Note: All examples in this guide assume that you are operating against the DFW data center. If you are using a different data center, be sure to use the associated endpoint from the table above.

When you perform a Cloud Big Data API operation, place the endpoint at the beginning of the request URL. For example: https://dfw.bigdata.api.rackspacecloud.com/v2/youraccountid/.
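The regionalized endpoints all follow one pattern, so a client can build them rather than hard-code each one. The following sketch assumes the endpoint pattern shown for DFW above; the helper name is illustrative, not part of any Rackspace SDK.

```python
# Build a regionalized Cloud Big Data endpoint from a region code and an
# account ID, following the https://{region}.bigdata.api.rackspacecloud.com
# pattern documented in Table 2.1.
BASE = "https://{region}.bigdata.api.rackspacecloud.com/v2/{account_id}"

def endpoint(region, account_id):
    """Return the Cloud Big Data endpoint for a region such as 'dfw' or 'ord'."""
    return BASE.format(region=region.lower(), account_id=account_id)

url = endpoint("DFW", "123456")
```

For example, `endpoint("DFW", "123456")` yields the DFW endpoint with the account number already in place, ready to be prefixed to an operation path such as `/flavors`.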

3. Sending requests to Cloud Big Data

You have several options for sending requests to Cloud Big Data:

- You can use curl, a command-line tool. With curl, you can send HTTP requests and receive responses back from the command line.
- You can use the python-lavaclient CLI.
- If you prefer a more graphical interface, you can use the Rackspace Cloud Control Panel.

3.1. Using curl

You can use curl, a command-line tool. With curl, you can send HTTP requests and receive responses back from the command line.

3.1.1. Sending API requests by using curl

curl is a command-line tool that is available in most UNIX-based environments and Apple Mac OS X systems, and that can be downloaded for Microsoft Windows, to interact with REST interfaces. For more information about curl, visit the curl website.

curl enables you to transmit and receive HTTP requests and responses from the command line or from within a shell script. As a result, you can work with the REST API directly without using one of the client APIs.

The following curl command-line options are used in this guide to run the examples:

Table 3.1. curl command-line options

-d  Sends the specified data in a POST request to the HTTP server. Use this option to send a JSON request body to the server.

-H  Specifies an extra HTTP header in the request. You can specify any number of extra headers. Precede each header with the -H option. Common headers in Rackspace API requests are as follows:

    Content-Type. Required for operations with a request body. Specifies the format of the request body. The syntax for the Content-Type header is Content-Type: application/format, where format is json.

    X-Tenant-Id. Optional. Specifies the tenant ID, which is your account number.

    Accept. Optional. Specifies the format of the response body. The syntax for the Accept header is Accept: application/format, where format is json. The default is json.

    X-Auth-Token. Required. Specifies the authentication token.

-i  Includes the HTTP header in the output.

-s  Silent or quiet mode; does not show progress or error messages. Note: If your curl command is not generating any output, try replacing the -s option with -i.

-T  Transfers the specified local file to the remote URL.

-X  Specifies the request method to use when communicating with the HTTP server. The specified request is used instead of the default method, which is GET.

About json.tool

For commands that return a response, you can pretty-print the output by appending the following code to the command:

| python -m json.tool

json.tool is provided by the standard-library json module, so no additional installation is required. For information about json.tool, see "json JSON encoder and decoder" in the Python documentation. If you do not want to pretty-print JSON output, omit this code.

3.1.2. Copying and pasting curl request examples into a terminal window

To run the curl request examples shown in this guide on Linux or Mac systems, perform the following steps:

1. Copy and paste each example from the HTML version of this guide into an ASCII text editor (for example, vi or TextEdit). You can click the small document icon to the right of each request example to select it.
2. Modify each example with your required account information and so on, as detailed in this guide.
3. After you finish modifying the text for the curl request example with your information (for example, your username and your API key), paste the command into your terminal window.
4. Press Enter to run the curl command.
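The same pretty-printing that `python -m json.tool` performs on piped curl output can also be done inside a script with the standard-library json module, which is sometimes convenient when post-processing responses. The sample JSON here is illustrative.

```python
import json

# Equivalent of piping a response through `python -m json.tool`:
# parse the raw JSON text, then re-serialize it with indentation.
raw = '{"credentials": {"ssh_keys": {"key_name": "cbdkey"}}}'
pretty = json.dumps(json.loads(raw), indent=4, sort_keys=True)
print(pretty)
```

Both approaches use the same json module; the pipe form is handy at the shell, the `json.dumps` form when the response is already in a Python program.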

Note

The carriage returns in the curl request examples that are part of the curl syntax are escaped with a backslash (\) to avoid prematurely terminating the command. However, you should not escape carriage returns inside the JSON message within the command.

Consider the following curl authenticate request: JSON example, which is described in detail in Chapter 4, Generating an authentication token using curl.

Example 3.1. curl authenticate request: JSON

curl -i -d \
'{
    "auth": {
        "RAX-KSKEY:apiKeyCredentials": {
            "username": "yourusername",
            "apiKey": "yourapikey"
        }
    }
}' \
-H 'Content-Type: application/json' \
'https://identity.api.rackspacecloud.com/v2.0/tokens'

Notice that the lines that are part of the curl command syntax are escaped with a backslash (\) to indicate that the command continues on the next line:

curl -i -d \
(... lines within the JSON portion of the message are not shown here ...)
-H 'Content-Type: application/json' \
'https://identity.api.rackspacecloud.com/v2.0/tokens'

However, the lines within the JSON portion of the message are not escaped with a backslash, to avoid issues with the JSON processing:

'{
    "auth": {
        "RAX-KSKEY:apiKeyCredentials": {
            "username": "yourusername",
            "apiKey": "yourapikey"
        }
    }
}' \

The final line of the JSON message is escaped because the backslash lies outside the JSON message and continues the curl command to the next line.

Tip

If you have trouble copying and pasting the examples as described, try typing the entire example on one long line, removing all the backslash line-continuation characters.

3.2. Setting up the python-lavaclient CLI

Another way to send requests to Cloud Big Data is to use the python-lavaclient CLI. This section provides the prerequisites for using the client and installation instructions.

3.2.1. Prerequisites

Following are the requirements for using the python-lavaclient CLI:

- Linux or Mac OS X
- A supported version of Python
- A Rackspace Cloud account with access to Rackspace Cloud Big Data

3.2.2. Installing the CLI

Perform the following steps to install the CLI:

1. Install python-lavaclient from PyPI by using pip:

   $ pip install lavaclient

2. Run the help command to ensure that the client has been installed correctly, and note the usage information:

   $ lava help

4. Generating an authentication token using curl

Whether you use curl or a REST client to interact with the Cloud Big Data API, you must generate an authentication token. You provide this token in the X-Auth-Token header in each Cloud Big Data API request.

Example 4.1, curl authenticate request: JSON, demonstrates how to use curl to obtain the authentication token as well as your account number. You must provide both when making subsequent Cloud Big Data API requests.

Remember to replace the placeholders in the following authentication request examples with your information:

yourusername: Your common Cloud Big Data user name, as supplied during registration.
yourapikey: Your API access key. You can obtain the key from the Rackspace Cloud Control Panel in the Your Account / API Keys section.

Note: This guide uses yourusername and yourapikey for authentication. For information about other supported authentication methods, see "Authentication tokens" in the Cloud Identity Client Developer Guide.

You authenticate against the global Cloud Identity service by using the URL path v2.0/tokens. The v2.0 component in the URL indicates that you are using version 2.0 of the Cloud Identity API.

Example 4.1. curl authenticate request: JSON

curl -s -d \
'{
    "auth": {
        "RAX-KSKEY:apiKeyCredentials": {
            "username": "yourusername",
            "apiKey": "yourapikey"
        }
    }
}' \
-H 'Content-Type: application/json' \
'https://identity.api.rackspacecloud.com/v2.0/tokens'

In the authentication response (example follows), the authentication token id is returned with an expires attribute that specifies when the token expires. Remember to supply your authentication token wherever you see the placeholder yourauthtoken in the examples in this guide.
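The JSON body in Example 4.1 can also be constructed programmatically before being sent with any HTTP client. This is a minimal sketch of building that payload; the helper name is hypothetical, and the placeholder values match the curl example.

```python
import json

# Build the RAX-KSKEY:apiKeyCredentials request body shown in Example 4.1.
# The caller substitutes real credentials for the placeholders.
def auth_request_body(username, api_key):
    return json.dumps({
        "auth": {
            "RAX-KSKEY:apiKeyCredentials": {
                "username": username,
                "apiKey": api_key,
            }
        }
    })

body = auth_request_body("yourusername", "yourapikey")
```

The resulting string is what curl's -d option sends; the Content-Type: application/json header still has to accompany it.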

Notes

- The values that you receive in your responses vary from the examples shown in this document because they are specific to your account.
- The expires attribute denotes the time after which the token automatically becomes invalid. A token might be manually revoked before the time identified by the expires attribute. The attribute predicts a token's maximum possible lifespan but does not guarantee that it will reach that lifespan. Clients are encouraged to cache a token until it expires. Applications should be designed to re-authenticate after receiving a 401 (Unauthorized) response from a service endpoint.
- The publicURL endpoints for Cloud Big Data (for example, https://dfw.bigdata.api.rackspacecloud.com/v2/youraccountid) are also returned in the response. Your actual account number appears after the final slash (/) in the publicURL field. You must specify your account number on most of the Cloud Big Data API operations, wherever you see the placeholder youraccountid in the examples in this guide.

After authentication, you can use curl to perform GET, DELETE, and POST requests for the Cloud Big Data API.

Example 4.2. Authenticate response: JSON

HTTP/1.1 200 OK
Content-Type: application/json; charset=utf-8
Content-Length: 477
Date: Sat, 07 Dec :45:13 GMT

{
    "access": {
        "token": {
            "expires": " T22:51: :00",
            "id": "xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx"
        },
        "user": {
            "id": "123456",
            "name": "jsmith",
            "RAX-AUTH:defaultRegion": "DFW",
            "roles": [
                {
                    "description": "Admin Role.",
                    "id": "identity:admin",
                    "name": "identity:admin"
                },
                {
                    "description": "Default Role.",
                    "id": "identity:default",
                    "name": "identity:default"
                }
            ]
        },
        "serviceCatalog": [
            {
                "name": "cloudbigdata",
                "type": "rax:bigdata",
                "endpoints": [
                    {
                        "publicURL": "https://dfw.bigdata.api.rackspacecloud.com/v2/123456",
                        "region": "DFW",
                        "tenantId": "123456"
                    }
                ]
            },
            {
                "name": "cloudloadbalancers",
                "type": "rax:load-balancer",
                "endpoints": [
                    {
                        "publicURL": "https://dfw.loadbalancers.api.rackspacecloud.com/v1.0/123456",
                        "region": "DFW",
                        "tenantId": "123456"
                    },
                    {
                        "publicURL": "https://ord.loadbalancers.api.rackspacecloud.com/v1.0/123456",
                        "region": "ORD",
                        "tenantId": "123456"
                    }
                ]
            },
            {
                "name": "cloudserversopenstack",
                "type": "compute",
                "endpoints": [
                    {
                        "tenantId": "123456",
                        "region": "DFW",
                        "publicURL": "https://dfw.servers.api.rackspacecloud.com/v2/123456",
                        "versionId": "2",
                        "versionInfo": "https://dfw.servers.api.rackspacecloud.com/v2/",
                        "versionList": "https://dfw.servers.api.rackspacecloud.com/"
                    },
                    {
                        "tenantId": "123456",
                        "region": "ORD",
                        "publicURL": "https://ord.servers.api.rackspacecloud.com/v2/123456",
                        "versionId": "2",
                        "versionInfo": "https://ord.servers.api.rackspacecloud.com/v2/",
                        "versionList": "https://ord.servers.api.rackspacecloud.com/"
                    }
                ]
            },
            {
                "name": "cloudservers",
                "type": "compute",
                "endpoints": [
                    {
                        "tenantId": "123456",
                        "publicURL": "https://servers.api.rackspacecloud.com/v1.0/123456",
                        "versionId": "1.0",
                        "versionInfo": "https://servers.api.rackspacecloud.com/v1.0/",
                        "versionList": "https://servers.api.rackspacecloud.com/"
                    }
                ]
            },
            {
                "name": "cloudfiles",
                "type": "object-store",
                "endpoints": [
                    {
                        "tenantId": "MossoCloudFS_aaaaaaaa-bbbb-cccc-dddd-eeeeeeee",
                        "publicURL": "https://storage101.dfw1.clouddrive.com/v1/MossoCloudFS_aaaaaaaa-bbbb-cccc-dddd-eeeeeeee",
                        "internalURL": "https://snet-storage101.dfw1.clouddrive.com/v1/MossoCloudFS_aaaaaaaa-bbbb-cccc-dddd-eeeeeeee",
                        "region": "DFW"
                    },
                    {
                        "tenantId": "MossoCloudFS_aaaaaaaa-bbbb-cccc-dddd-eeeeeeee",
                        "publicURL": "https://storage101.ord1.clouddrive.com/v1/MossoCloudFS_aaaaaaaa-bbbb-cccc-dddd-eeeeeeee",
                        "internalURL": "https://snet-storage101.ord1.clouddrive.com/v1/MossoCloudFS_aaaaaaaa-bbbb-cccc-dddd-eeeeeeee",
                        "region": "ORD"
                    }
                ]
            },
            {
                "name": "cloudfilescdn",
                "type": "rax:object-cdn",
                "endpoints": [
                    {
                        "tenantId": "MossoCloudFS_aaaaaaaa-bbbb-cccc-dddd-eeeeeeee",
                        "publicURL": "https://cdn1.clouddrive.com/v1/MossoCloudFS_aaaaaaaa-bbbb-cccc-dddd-eeeeeeee",
                        "region": "DFW"
                    },
                    {
                        "tenantId": "MossoCloudFS_aaaaaaaa-bbbb-cccc-dddd-eeeeeeee",
                        "publicURL": "https://cdn2.clouddrive.com/v1/MossoCloudFS_aaaaaaaa-bbbb-cccc-dddd-eeeeeeee",
                        "region": "ORD"
                    }
                ]
            },
            {
                "name": "clouddns",
                "type": "rax:dns",
                "endpoints": [
                    {
                        "tenantId": "123456",
                        "publicURL": "https://dns.api.rackspacecloud.com/v1.0/123456"
                    }
                ]
            }
        ]
    }
}
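As the notes above explain, the Cloud Big Data endpoint and your account number both come out of the service catalog in the authentication response. A sketch of extracting them, assuming the serviceCatalog shape shown in Example 4.2 (the helper name is hypothetical):

```python
# Pull the Cloud Big Data publicURL and the account number (the segment after
# the final slash) out of a parsed authentication response's service catalog.
def bigdata_endpoint(catalog, region="DFW"):
    for service in catalog:
        if service.get("type") == "rax:bigdata":
            for ep in service["endpoints"]:
                if ep["region"] == region:
                    url = ep["publicURL"].rstrip("/")
                    account_id = url.rsplit("/", 1)[-1]
                    return url, account_id
    raise LookupError("no rax:bigdata endpoint for region " + region)

# Minimal catalog mirroring the cloudbigdata entry in Example 4.2.
catalog = [{
    "name": "cloudbigdata",
    "type": "rax:bigdata",
    "endpoints": [{
        "publicURL": "https://dfw.bigdata.api.rackspacecloud.com/v2/123456",
        "region": "DFW",
    }],
}]
url, account = bigdata_endpoint(catalog)
```

The returned account number is what replaces the youraccountid placeholder in the rest of the examples.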

5. Generating an authentication token using the lavaclient

To authenticate your session by using the lavaclient, use the following steps. You need your Cloud username, API key, and tenant ID.

1. Run the authenticate command with the parameters shown below:

   $ lava --user [username] --tenant [tenant_id] --api-key [api_key] --region DFW authenticate

   If the command runs successfully, your authentication token is displayed, as shown in the following example.

   Example 5.1. Authentication response using CLI utility

   AUTH_TOKEN=692c2a14-39ad-4ee0-991d-06cd7331f3ca

2. Export the AUTH_TOKEN and LAVA2_API_URL environment variables as shown in the following example. Replace yourtenantid with your actual tenant ID.

   Example 5.2. Export environment variables

   $ export AUTH_TOKEN=692c2a14-39ad-4ee0-991d-06cd7331f3ca
   $ export LAVA2_API_URL=https://dfw.bigdata.api.rackspacecloud.com/v2/yourtenantid

   Note: The export commands are valid only for the current session. You need to rerun the export commands if, for example, you create a new console window.

3. To confirm that the client is running, run the distros list command:

   $ lava distros list
   ID      Name                       Version
   HDP2.2  HortonWorks Data Platform  2.2
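The exported AUTH_TOKEN and LAVA2_API_URL variables can be consumed by your own scripts as well as by the lava CLI. This sketch builds the headers and URL that a follow-up API call would use; the variable names match the export example above, and the fallback values here are only illustrative defaults.

```python
import os

# Read the session variables set by the export commands; the setdefault calls
# supply placeholder values so the sketch runs even without prior exports.
os.environ.setdefault("AUTH_TOKEN", "692c2a14-39ad-4ee0-991d-06cd7331f3ca")
os.environ.setdefault(
    "LAVA2_API_URL",
    "https://dfw.bigdata.api.rackspacecloud.com/v2/yourtenantid")

headers = {
    "X-Auth-Token": os.environ["AUTH_TOKEN"],  # required on every request
    "Accept": "application/json",
}
distros_url = os.environ["LAVA2_API_URL"].rstrip("/") + "/distros"
```

These are the same header and URL that the curl examples in later chapters spell out by hand.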

6. Creating and managing credentials

Before you can create Hadoop clusters, you must create credentials. Credentials allow you to set up SSH keys and other connector credentials for use with clusters.

Note: Your Cloud Big Data credentials are different from your cloud account.

Your credentials have the following characteristics and requirements:

- A credential is the configuration for the administration and login account for the cluster.
- You can create any number of SSH credentials and attach them to a cluster.
- Each cluster can contain only one Cloud Files credential connector.

After you create a credential, you can attach that credential to clusters that you provision by using the API. This allows you to remotely SSH into a server to transfer data, run or troubleshoot jobs, and so on.

6.1. Creating a credential

Verb: POST
URI: /v2/tenant_id/credentials/type
Description: Creates a credential.

This operation adds new credentials of a specific type. The request body varies based on the chosen type, ssh_keys or cloud_files, but follows a general pattern: a dictionary keyed by the type that contains one or more credential-related fields.

6.1.1. curl example

The following examples show the curl request and corresponding response for creating a credential.

Example 6.1. curl create a credential - ssh_keys request

curl -i -X POST https://dfw.bigdata.api.rackspacecloud.com/v2/youraccountid/credentials/ssh_keys -d \
-H "X-Auth-Token: yourauthtoken" \
-H "Accept: application/json" \
-H "Content-type: application/json"

Example 6.2. Create a credential - ssh_keys request: JSON body

{
    "ssh_keys": {
        "key_name": "cbdkey",
        "public_key": "ssh-rsa AAkphQZaDNi2Ij3DX...5twE62lerq7Xhaff foo@bar"
    }
}

Example 6.3. Create a credential - ssh_keys response: JSON

{
    "credentials": {
        "ssh_keys": {
            "key_name": "cbdkey"
        }
    }
}

Example 6.4. curl create a credential - cloud_files request

curl -i -X POST https://dfw.bigdata.api.rackspacecloud.com/v2/youraccountid/credentials/cloud_files -d \
-H "X-Auth-Token: yourauthtoken" \
-H "Accept: application/json" \
-H "Content-type: application/json"

Example 6.5. Create a credential - cloud_files request: JSON body

{
    "cloud_files": {
        "username": "cfuser",
        "api_key": "samplekey"
    }
}

Example 6.6. Create a credential - cloud_files response: JSON

{
    "credentials": {
        "cloud_files": {
            "username": "cfuser"
        }
    }
}

6.1.2. Client example

Using the client, create credentials as shown in the following examples.

Example 6.7. Create an SSH credential using the CLI

$ lava credentials create_ssh_key cbdkey "ssh-rsa AAkphQZaDNi2Ij3DX...5twE62lerq7Xhaff foo@bar"
Type     Name
SSH Key  cbdkey

Example 6.8. Create a Cloud Files credential using the CLI

$ lava credentials create_cloud_files cfuser samplekey
Type         Username
Cloud Files  cfuser
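The request bodies in Examples 6.2 and 6.5 both follow the dictionary-keyed-by-type pattern described at the start of this section. A small sketch of that pattern (the helper name is hypothetical, not part of the API):

```python
# Build a credential request body: a dict keyed by the credential type
# (ssh_keys or cloud_files) whose value holds the credential fields.
def credential_body(cred_type, **fields):
    if cred_type not in ("ssh_keys", "cloud_files"):
        raise ValueError("unsupported credential type: " + cred_type)
    return {cred_type: dict(fields)}

ssh_body = credential_body(
    "ssh_keys",
    key_name="cbdkey",
    public_key="ssh-rsa AAkphQZaDNi2Ij3DX...5twE62lerq7Xhaff foo@bar")
cf_body = credential_body("cloud_files", username="cfuser", api_key="samplekey")
```

Serialized to JSON, ssh_body and cf_body reproduce the bodies of Examples 6.2 and 6.5 respectively.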

6.2. Listing all credentials

Verb: GET
URI: /v2/tenant_id/credentials
Description: Lists all user credentials.

6.2.1. curl example

This operation does not accept a request body. The following examples show the curl request and corresponding response for listing all user credentials.

Example 6.9. curl list all credentials request

curl -i -X GET https://dfw.bigdata.api.rackspacecloud.com/v2/youraccountid/credentials \
-H "X-Auth-Token: yourauthtoken" \
-H "Accept: application/json" \
-H "Content-type: application/json"

Response:

{
    "credentials": {
        "cloud_files": [
            {
                "username": "cfuser"
            }
        ],
        "ssh_keys": [
            {
                "key_name": "cbdkey"
            }
        ]
    }
}

6.2.2. Client example

Using the client, list all credentials as shown in the following example.

Example 6.10. List all credentials using the CLI

$ lava credentials list
Type         Name
SSH Key      cbdkey
Cloud Files  cfuser
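The CLI table in Example 6.10 is essentially a flattened view of the JSON response above: each credential type's entries become (type, name) rows. A sketch of that transformation; the label mapping is illustrative, not an API contract.

```python
# Flatten the list-all-credentials response into (type label, name) rows,
# the shape the CLI table displays.
def credential_rows(response):
    labels = {"ssh_keys": ("SSH Key", "key_name"),
              "cloud_files": ("Cloud Files", "username")}
    rows = []
    for cred_type, entries in response["credentials"].items():
        label, name_field = labels[cred_type]
        rows.extend((label, entry[name_field]) for entry in entries)
    return sorted(rows)

response = {"credentials": {"cloud_files": [{"username": "cfuser"}],
                            "ssh_keys": [{"key_name": "cbdkey"}]}}
rows = credential_rows(response)
```

Each row corresponds to one line of the CLI output.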

6.3. Updating credentials

Verb: PUT
URI: /v2/tenant_id/credentials/type/name
Description: Updates the specified user credential. The update marks clusters that already use the credential as out of sync.

6.3.1. curl example

The following examples show the curl request and corresponding response for updating a credential.

Example 6.11. curl update a credential request

curl -i -X PUT https://dfw.bigdata.api.rackspacecloud.com/v2/youraccountid/credentials/ssh_keys/cbdkey -d \
-H "X-Auth-Token: yourauthtoken" \
-H "Accept: application/json" \
-H "Content-type: application/json"

Example 6.12. Update a credential request: JSON body

{
    "ssh_keys": {
        "key_name": "cbdkey",
        "public_key": "ssh-rsa AAkddddddddd3DX...5twE62lerq7Xhaff foo@bar"
    }
}

Example 6.13. Update a credential response: JSON

{
    "credentials": {
        "ssh_keys": {
            "key_name": "cbdkey"
        }
    }
}

6.3.2. Client example

Using the client, update a credential as shown in the following example.

Example 6.14. Update a credential using the CLI

$ lava credentials update_ssh_key cbdkey "ssh-rsa AAkphQZaDNi2Ij3DX...5twE62lerq7Xhaff foo@bar"
Type     Name
SSH Key  cbdkey

6.4. Deleting credentials

Verb: DELETE
URI: /v2/tenant_id/credentials/type/name
Description: Deletes the specified user credential. You can delete only credentials that are not used by any active clusters.

6.4.1. curl example

The following example shows the curl request for deleting a credential. This operation does not accept a request body and does not return a response body.

Example 6.15. curl delete a credential request

curl -i -X DELETE https://dfw.bigdata.api.rackspacecloud.com/v2/youraccountid/credentials/ssh_keys/cbdkey \
-H "X-Auth-Token: yourauthtoken" \
-H "Accept: application/json" \
-H "Content-type: application/json"

6.4.2. Client example

Using the client, delete a credential as shown in the following example.

Example 6.16. Delete a credential using the CLI

$ lava credentials delete_ssh_key cbdkey

7. Viewing resource limits

The use of the Rackspace Cloud Big Data API is subject to resource limits. You can view the limits associated with your account by using the view resource limits operation, which displays limits such as remaining node count, available RAM, and remaining disk space for the user.

Verb: GET
URI: /v2/tenant_id/limits
Description: Displays the resource limits for the user.

7.1. curl example

This operation does not accept a request body. The following examples show the curl request and corresponding response for viewing resource limits.

Example 7.1. curl view resource limits request: JSON

curl -i -X GET https://dfw.bigdata.api.rackspacecloud.com/v2/youraccountid/limits \
-H "X-Auth-Token: yourauthtoken" \
-H "Accept: application/json" \
-H "Content-type: application/json"

Example 7.2. View resource limits response: JSON

{
    "limits": {
        "absolute": {
            "disk": {
                "limit": 100000,
                "remaining": 28882
            },
            "node_count": {
                "limit": 100,
                "remaining": 50
            },
            "ram": {
                "limit": 50000,
                "remaining": 34567
            },
            "vcpus": {
                "limit": 50,
                "remaining": 25
            }
        }
    }
}

7.2. Client example

Using the client, view the limits associated with your account by using the limits command.

Example 7.3. View resource limits

$ lava limits get
Quotas
Property  Limit   Remaining
Nodes     100     50
RAM       50000   34567
Disk      100000  28882
VCPUs     50      25
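A client can use the limits response to sanity-check a planned cluster before submitting a create request. This is a sketch under the response shape shown in Example 7.2; the helper name is hypothetical, and the service enforces these limits server-side regardless.

```python
# Check a planned cluster (node count and RAM per node) against the
# remaining quota in a GET /v2/{tenant_id}/limits response.
def fits_within_limits(limits, nodes, ram_per_node):
    absolute = limits["limits"]["absolute"]
    return (nodes <= absolute["node_count"]["remaining"]
            and nodes * ram_per_node <= absolute["ram"]["remaining"])

limits = {"limits": {"absolute": {
    "node_count": {"limit": 100, "remaining": 50},
    "ram": {"limit": 50000, "remaining": 34567},
}}}
ok = fits_within_limits(limits, nodes=3, ram_per_node=8192)       # 24576 <= 34567
too_big = fits_within_limits(limits, nodes=5, ram_per_node=8192)  # 40960 > 34567
```

Such a pre-check only avoids a round trip; a request that exceeds the quota is rejected by the API either way.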

8. Creating and managing Hadoop clusters

Now you are ready to create and manage Hadoop clusters by using the Rackspace Cloud Big Data API. This chapter provides examples, using curl and the client, for some of the common operations against Cloud Big Data. For information about all of the operations available in Cloud Big Data, see the Cloud Big Data Developer Guide.

8.1. Listing flavors

A flavor is an available hardware configuration for a cluster. Each flavor has a unique combination of memory capacity and priority for CPU time. The larger the flavor size you use, the more RAM and the higher the priority for CPU time your cluster receives.

You use the list flavors operation to find the available configurations for your cluster, and then you decide which size you need for your cluster. You perform this operation when you create a cluster.

Verb: GET
URI: /v2/tenant_id/flavors
Description: Lists all available flavors.

8.1.1. curl example

This operation does not require a request body. The following examples show the curl request and the corresponding response for listing flavors.

Example 8.1. curl list flavors request: JSON

curl -i -X GET https://dfw.bigdata.api.rackspacecloud.com/v2/youraccountid/flavors \
-H "X-Auth-Token: yourauthtoken" \
-H "Accept: application/json" \
-H "Content-type: application/json"

Example 8.2. List flavors response: JSON

{
    "flavors": [
        {
            "disk": 10000,
            "id": "hadoop1-60",
            "name": "XLarge Hadoop Instance",
            "ram": 61440,
            "vcpus": 16,
            "class": "hadoop1"
        },
        {
            "disk": 1250,
            "id": "hadoop1-7",
            "name": "Small Hadoop Instance",
            "ram": 8192,
            "vcpus": 2,
            "class": "hadoop1"
        },
        {
            "disk": 3200,
            "id": "onmetal-io1",
            "name": "OnMetal IO 1",
            "ram": 131072,
            "vcpus": 40,
            "class": "onmetal"
        }
    ],
    "links": [
        {
            "href": "https://dfw.bigdata.api.rackspacecloud.com/v2/youraccountid/flavors?limit=2&marker=hadoop1-7",
            "rel": "next"
        }
    ]
}

8.1.2. Client example

You can enumerate the flavors and associated resources by using the flavors list command, as shown in the following example.

Example 8.3. List flavors and associated resources by using the flavors list command with the CLI

$ lava flavors list
ID          Name                     RAM    VCPUs  Disk
hadoop1-15  Medium Hadoop Instance   15360  4      2500
hadoop1-30  Large Hadoop Instance    30720  8      5000
hadoop1-60  XLarge Hadoop Instance   61440  16     10000
hadoop1-7   Small Hadoop Instance    8192   2      1250

8.2. Listing available distros

Distros provide a list of supported distributions and their corresponding versions, as well as a list of supported services and components per distribution. Use the list distros operation to see the distros that are available.

Verb: GET
URI: /v2/tenant_id/distros
Description: Lists available distros.
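When deciding which size you need, a common approach is to pick the smallest flavor that still meets the cluster's RAM and vCPU requirements. A sketch using the flavor fields from Example 8.2; the helper name is illustrative.

```python
# Pick the smallest flavor (by RAM) that satisfies minimum RAM and vCPU
# requirements, given a flavors list shaped like Example 8.2.
def smallest_suitable_flavor(flavors, min_ram, min_vcpus):
    candidates = [f for f in flavors
                  if f["ram"] >= min_ram and f["vcpus"] >= min_vcpus]
    if not candidates:
        return None
    return min(candidates, key=lambda f: f["ram"])["id"]

flavors = [
    {"id": "hadoop1-60", "ram": 61440, "vcpus": 16, "disk": 10000},
    {"id": "hadoop1-7", "ram": 8192, "vcpus": 2, "disk": 1250},
]
choice = smallest_suitable_flavor(flavors, min_ram=8000, min_vcpus=2)
```

Here both flavors qualify, so the helper returns the smaller one.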

curl example

This operation does not accept a request body. The following examples show the curl request and the corresponding response for listing distros.

Example 8.4. curl list available distros request: JSON

curl -i -X GET https://dfw.bigdata.api.rackspacecloud.com/v2/1234/distros \
     -H "X-Auth-Token: yourAuthToken" \
     -H "Accept: application/json" \
     -H "Content-type: application/json"

Example 8.5. List available distros response: JSON

{
    "distros": [
        {
            "id": "HDP1.3",
            "name": "Hortonworks Data Platform",
            "version": "1.3",
            "links": [
                {
                    "href": "https://dfw.bigdata.api.rackspacecloud.com/v2/1234/distros/HDP1.3",
                    "rel": "self"
                },
                {
                    "href": "https://dfw.bigdata.api.rackspacecloud.com/distros/HDP1.3",
                    "rel": "bookmark"
                }
            ]
        },
        {
            "id": "HDP2.2",
            "name": "Hortonworks Data Platform",
            "version": "2.2",
            "links": [
                {
                    "href": "https://dfw.bigdata.api.rackspacecloud.com/v2/1234/distros/HDP2.2",
                    "rel": "self"
                },
                {
                    "href": "https://dfw.bigdata.api.rackspacecloud.com/distros/HDP2.2",
                    "rel": "bookmark"
                }
            ]
        },
        {
            "id": "CDH5",
            "name": "Cloudera Hadoop",
            "version": "5",
            "links": [
                {
                    "href": "https://dfw.bigdata.api.rackspacecloud.com/v2/1234/distros/CDH5",
                    "rel": "self"
                },
                {
                    "href": "https://dfw.bigdata.api.rackspacecloud.com/distros/CDH5",
                    "rel": "bookmark"
                }
            ]
        }
    ]
}

Client example

Using the client, view the available distros by using the distros list command, as shown in the following example.

Example 8.6. View available distros with the CLI

$ lava distros list
ID       Name                        Version
HDP2.2   HortonWorks Data Platform   2.2

8.3. Listing available stacks

Stacks are high-level building blocks of software that compose a Big Data architecture. Stacks are composed of services, which in turn are composed of components. A stack is specific to a distribution because the supported services differ across distributions. You can create a stack or use one of the preconfigured stacks.

Verb   URI                       Description
GET    /v2/{tenant_id}/stacks    Lists available stacks.

curl example

The following examples show the curl request and corresponding response for listing all stacks. This operation does not accept a request body.

Example 8.7. curl list all stacks request: JSON

curl -i -X GET https://dfw.bigdata.api.rackspacecloud.com/v2/1234/stacks \
     -H "X-Auth-Token: yourAuthToken" \
     -H "Accept: application/json" \
     -H "Content-type: application/json"

Example 8.8. List all stacks response: JSON

{
    "stacks": [
        {
            "distro": "HDP2.2",
            "id": "HDP2.2_Hadoop",
            "name": "Core Hadoop",
            "description": "Core Hadoop Stack with Hive",
            "services": [
                {
                    "modes": ["HA"],
                    "name": "HDFS"
                },
                {"name": "Yarn"},
                {"name": "MapReduce"},
                {"name": "Hive"},
                {"name": "Pig"}
            ],
            "links": [
                {
                    "href": "https://dfw.bigdata.api.rackspacecloud.com/v2/1234/stacks/HDP2.2_Hadoop",
                    "rel": "self"
                },
                {
                    "href": "https://dfw.bigdata.api.rackspacecloud.com/stacks/HDP2.2_Hadoop",
                    "rel": "bookmark"
                }
            ]
        },
        {
            "distro": "HDP2.2",
            "id": "HDP2.2_HBase",
            "name": "HBase",
            "description": "Core Hadoop Stack with HBase",
            "services": [
                {
                    "modes": ["HA"],
                    "name": "HDFS"
                },
                {"name": "Yarn"},
                {"name": "HBase"},
                {"name": "MapReduce"},
                {"name": "Hive"},
                {"name": "Pig"}
            ],
            "links": [
                {
                    "href": "https://dfw.bigdata.api.rackspacecloud.com/v2/1234/stacks/HDP2.2_HBase",
                    "rel": "self"
                },
                {
                    "href": "https://dfw.bigdata.api.rackspacecloud.com/stacks/HDP2.2_HBase",
                    "rel": "bookmark"
                }
            ]
        }
    ],
    "links": [
        {
            "href": "https://dfw.bigdata.api.rackspacecloud.com/v2/1234/stacks?limit=2&marker=HDP2.2_HBase",
            "rel": "next"
        }
    ]
}

Client example

Using the client, view the available stacks by using the stacks list command, as shown in the following example.

Example 8.9. View available stacks with the CLI

$ lava stacks list
ID             Name            Distro  Description                         Services
HADOOP_HDP2_2  Hadoop HDP 2.2  HDP2.2  Core batch processing systems and   [name=hdfs, modes=[secondary],
                                       interactive querying with Hive.     name=yarn, modes=[], name=mapreduce,
                                                                           modes=[], name=hive, modes=[],
                                                                           name=pig, modes=[], name=sqoop,
                                                                           modes=[], name=oozie, modes=[],
                                                                           name=flume, modes=[],
                                                                           name=zookeeper, modes=[]]
KAFKA_HDP2_2   Kafka HDP 2.2   HDP2.2  An individual Kafka stack serving   [name=hdfs, modes=[secondary],
                                       as the backbone of a distributed    name=kafka, modes=[],
                                       message queuing system.             name=zookeeper, modes=[]]
SPARK_HDP2_2   Spark HDP 2.2   HDP2.2  Spark on Yarn supporting both       [name=hdfs, modes=[secondary],
                                       batch and real-time processing.     name=yarn, modes=[], name=mapreduce,
                                                                           modes=[], name=hive, modes=[],
                                                                           name=pig, modes=[], name=zookeeper,
                                                                           modes=[], name=spark, modes=[]]

8.4. Creating a cluster

This operation creates a cluster for your account.

Verb   URI                         Description
POST   /v2/{tenant_id}/clusters    Creates a cluster.

curl example

The following examples show the curl request, the JSON request body, and the corresponding response for creating a cluster.

Example 8.10. curl create cluster request

curl -i -X POST https://dfw.bigdata.api.rackspacecloud.com/v2/1234/clusters \
     -H "X-Auth-Token: yourAuthToken" \
     -H "Accept: application/json" \
     -H "Content-type: application/json" \
     -d @cluster.json

Here the request body from the next example is assumed to be saved in the file cluster.json.

Example 8.11. Create cluster request: JSON body

{
    "cluster": {
        "name": "test",
        "username": "cbduser",
        "ssh_keys": ["cbdkey"],
        "stack_id": "HDP2.1_Hadoop",
        "node_groups": [
            {
                "count": 10,
                "flavor_id": "hadoop1-7",
                "id": "slave"
            }
        ],
        "connectors": [
            {
                "type": "cloud_files",
                "credential": {
                    "name": "cfuser"
                }
            }
        ],
        "scripts": [
            {
                "id": "c ff d425e1f9dd"
            }
        ]
    }
}

Example 8.12. Create cluster response: JSON

{
    "cluster": {
        "created": " T10:10:10Z",
        "id": "aaa-bbbb-cccc",
        "name": "test",
        "username": "cbduser",
        "ssh_keys": ["cbdkey"],
        "status": "BUILDING",
        "progress": "5",
        "links": [
            {
                "href": "https://dfw.bigdata.api.rackspacecloud.com/v2/1234/clusters/aaa-bbbb-cccc",
                "rel": "self"
            },
            {
                "href": "https://dfw.bigdata.api.rackspacecloud.com/clusters/aaa-bbbb-cccc",
                "rel": "bookmark"
            }
        ],
        "stack_id": "HDP2.1_Hadoop",
        "node_groups": [
            {
                "components": [
                    {"name": "Namenode"},
                    {"name": "ResourceManager"},
                    {"name": "YarnTimelineServer"},
                    {"name": "JobHistoryServer"}
                ],
                "count": 1,
                "flavor_id": "hadoop1-7",
                "id": "master"
            },
            {
                "components": [
                    {"name": "Namenode"}
                ],
                "count": 1,
                "flavor_id": "hadoop1-7",
                "id": "standby-namenode"
            },
            {
                "components": [
                    {"name": "JournalNode"}
                ],
                "count": 3,
                "flavor_id": "hadoop1-1",
                "id": "journalnodes"
            },
            {
                "components": [
                    {"name": "Datanode"},
                    {"name": "NodeManager"}
                ],
                "count": 10,
                "flavor_id": "hadoop1-7",
                "id": "slave"
            },
            {
                "components": [
                    {"name": "HiveServer2"},
                    {"name": "HiveMetastore"},
                    {"name": "HiveClient"},
                    {"name": "HiveAPI"},
                    {"name": "PigClient"}
                ],
                "count": 1,
                "flavor_id": "hadoop1-2",
                "id": "gateway"
            }
        ],
        "updated": "",
        "connectors": [
            {
                "type": "cloud_files",
                "credential": {
                    "name": "cfuser"
                }
            }
        ],
        "scripts": [
            {
                "id": "c ff d425e1f9dd",
                "name": "Mongo Connector",
                "status": "PENDING"
            }
        ]
    }
}

Client example

Using the client, create a cluster by using the clusters create command, as shown in the following example.

Example 8.13. Create a cluster with the CLI

$ lava clusters create test KAFKA_HDP2_2 \
      --node-groups='slave(flavor_id=hadoop1-7, count=3)' \
      --ssh-key cbdkey --username cbduser

Cluster
ID            c5444b98-f4b4-aaaa-bbbb-b6e9d3313da1
Name          test
Status        BUILDING
Stack         KAFKA_HDP2_2
Created       :10:37+00:00
CBD Version   2
Username      cbduser
Progress

Node Groups
ID         Flavor      Count   Components
master     hadoop1-4   1       [name=namenode]
secondary  hadoop1-4   1       [name=secondarynamenode]
slave      hadoop1-7   3       [name=datanode, name=kafkabroker, name=zookeeperclient]
zookeeper  hadoop1-2   3       [name=zookeeperserver, name=zookeeperclient]

8.5. Listing clusters

Use the list clusters operation to find the clusters available for your account.

Verb   URI                         Description
GET    /v2/{tenant_id}/clusters    Lists all clusters for your account.

curl example

This operation does not require a request body. The following examples show the curl request and the corresponding response for listing clusters.

Example 8.14. curl list clusters request: JSON

curl -i -X GET https://dfw.bigdata.api.rackspacecloud.com/v2/1234/clusters \
     -H "X-Auth-Token: yourAuthToken" \
     -H "Accept: application/json" \
     -H "Content-type: application/json"

Example 8.15. List clusters response: JSON

{
    "clusters": [
        {
            "created": " T10:10:10Z",
            "id": "aaa-bbbb-cccc",
            "name": "test",
            "status": "ACTIVE",
            "stack_id": "HDP2.1_Hadoop",
            "updated": "",
            "links": [
                {
                    "href": "https://dfw.bigdata.api.rackspacecloud.com/v2/1234/clusters/aaa-bbbb-cccc",
                    "rel": "self"
                },
                {
                    "href": "https://dfw.bigdata.api.rackspacecloud.com/clusters/aaa-bbbb-cccc",
                    "rel": "bookmark"
                }
            ]
        }
    ],
    "links": [
        {
            "href": "https://dfw.bigdata.api.rackspacecloud.com/v2/1234/clusters?limit=1&marker=aaa-bbbb-cccc",
            "rel": "next"
        }
    ]
}
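The list responses in this chapter (flavors, stacks, clusters) are paginated: when more results exist, the top-level links array includes an entry with "rel": "next" whose href carries limit and marker query parameters, and a client keeps requesting that href until no next link remains. The following Python sketch shows only the link extraction, on sample data; the helper name and the hostname in the sample are illustrative.

```python
# Illustrative pagination helper: find the follow-up URL in a paginated
# list response. A real client would GET this URL and repeat until the
# helper returns None.

def next_link(response_body):
    """Return the href of the link with rel 'next', or None."""
    for link in response_body.get("links", []):
        if link.get("rel") == "next":
            return link.get("href")
    return None

# Shape taken from the list-clusters response; the hostname is made up.
page = {
    "clusters": [{"id": "aaa-bbbb-cccc", "name": "test", "status": "ACTIVE"}],
    "links": [
        {"href": "https://example.test/v2/1234/clusters?limit=1&marker=aaa-bbbb-cccc",
         "rel": "next"},
    ],
}

print(next_link(page))
```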

Client example

Using the client, list clusters by using the clusters list command, as shown in the following example.

Example 8.16. List clusters with the CLI

$ lava clusters list
ID                                     Name   Status   Stack          Created
c5444b98-f4b4-aaaa-bbbb-b6e9d3313da1   test   ACTIVE   KAFKA_HDP2_2   :10:37+00:00

8.6. Viewing node details

The get node details operation lists all server nodes for the specified cluster.

Verb   URI                                           Description
GET    /v2/{tenant_id}/clusters/{clusterId}/nodes    Lists all nodes for a cluster.

curl example

In the following example, the cluster has a master node and two slave nodes. Each node has a private IP address, which is used for back-end (Hadoop) data transfer, and a public IP address, which enables you to reach it over the public Internet. You can remotely SSH to the master or slave nodes at their public IP addresses by using the username and ssh_key that you added during cluster creation.

The following examples show the curl request and corresponding response for listing all nodes for a cluster.

Example 8.17. curl list cluster nodes request: JSON

curl -i -X GET https://dfw.bigdata.api.rackspacecloud.com/v2/1234/clusters/{clusterId}/nodes \
     -H "X-Auth-Token: yourAuthToken" \
     -H "Accept: application/json" \
     -H "Content-Type: application/json"

Example 8.18. List cluster nodes response: JSON

{
    "nodes": [
        {
            "created": " T10:10:10Z",
            "id": "",
            "name": "master-1",
            "node_group": "master",
            "status": "ACTIVE",
            "updated": "",
            "addresses": {
                "public": [
                    {
                        "addr": "168.x.x.3",
                        "version": 4
                    }
                ],
                "private": [
                    {
                        "addr": "10.x.x.3",
                        "version": 4
                    }
                ]
            },
            "flavor_id": "hadoop1-4",
            "components": [
                {
                    "name": "Namenode",
                    "nice_name": "HDFS Namenode",
                    "uri": ""
                },
                {
                    "name": "ResourceManager",
                    "nice_name": "YARN Resource Manager",
                    "uri": ""
                },
                {
                    "name": "YarnTimelineServer",
                    "nice_name": "YARN Timeline History Server",
                    "uri": ""
                },
                {
                    "name": "JobHistoryServer",
                    "nice_name": "MapReduce History Server",
                    "uri": ""
                }
            ],
            "links": [
                {
                    "rel": "self",
                    "href": "https://dfw.bigdata.api.rackspacecloud.com/v2/1234/clusters/aaa-bbbb-cccc/nodes/"
                },
                {
                    "rel": "bookmark",
                    "href": "https://dfw.bigdata.api.rackspacecloud.com/clusters/aaa-bbbb-cccc/nodes/"
                }
            ]
        },
        {
            "created": " T10:10:10Z",
            "id": "",
            "name": "slave-1",
            "node_group": "slave",
            "status": "ACTIVE",
            "updated": "",
            "addresses": {
                "public": [
                    {
                        "addr": "168.x.x.4",
                        "version": 4
                    }
                ],
                "private": [
                    {
                        "addr": "10.x.x.4",
                        "version": 4
                    }
                ]
            },
            "flavor_id": "hadoop1-7",
            "components": [
                {
                    "name": "Datanode",
                    "nice_name": "HDFS Datanode",
                    "uri": ""
                },
                {
                    "name": "NodeManager",
                    "nice_name": "YARN Node Manager",
                    "uri": ""
                }
            ],
            "links": [
                {
                    "rel": "self",
                    "href": "https://dfw.bigdata.api.rackspacecloud.com/v2/1234/clusters/aaa-bbbb-cccc/nodes/"
                },
                {
                    "rel": "bookmark",
                    "href": "https://dfw.bigdata.api.rackspacecloud.com/clusters/aaa-bbbb-cccc/nodes/"
                }
            ]
        },
        {
            "created": " T10:10:10Z",
            "id": "",
            "name": "slave-2",
            "node_group": "slave",
            "status": "ACTIVE",
            "updated": "",
            "addresses": {
                "public": [
                    {
                        "addr": "168.x.x.5",
                        "version": 4
                    }
                ],
                "private": [
                    {
                        "addr": "10.x.x.5",
                        "version": 4
                    }
                ]
            },
            "flavor_id": "hadoop1-7",
            "components": [
                {
                    "name": "Datanode",
                    "nice_name": "HDFS Datanode",
                    "uri": ""
                },
                {
                    "name": "NodeManager",
                    "nice_name": "YARN Node Manager",
                    "uri": ""
                }
            ]
        }
    ]
}


More information

Communicating with the Elephant in the Data Center

Communicating with the Elephant in the Data Center Communicating with the Elephant in the Data Center Who am I? Instructor Consultant Opensource Advocate http://www.laubersoltions.com sml@laubersolutions.com Twitter: @laubersm Freenode: laubersm Outline

More information

API Reference Guide. API Version 1. Copyright Platfora 2016

API Reference Guide. API Version 1. Copyright Platfora 2016 API Reference Guide API Version 1 Copyright Platfora 2016 Last Updated: 10:05 a.m. April 21, 2016 Contents Document Conventions... 5 Contact Platfora Support...6 Copyright Notices... 6 Chapter 1: Using

More information

Administration Quick Start

Administration Quick Start www.novell.com/documentation Administration Quick Start ZENworks 11 Support Pack 3 February 2014 Legal Notices Novell, Inc., makes no representations or warranties with respect to the contents or use of

More information

Programming Hadoop 5-day, instructor-led BD-106. MapReduce Overview. Hadoop Overview

Programming Hadoop 5-day, instructor-led BD-106. MapReduce Overview. Hadoop Overview Programming Hadoop 5-day, instructor-led BD-106 MapReduce Overview The Client Server Processing Pattern Distributed Computing Challenges MapReduce Defined Google's MapReduce The Map Phase of MapReduce

More information

EMC Data Protection Search

EMC Data Protection Search EMC Data Protection Search Version 1.0 Security Configuration Guide 302-001-611 REV 01 Copyright 2014-2015 EMC Corporation. All rights reserved. Published in USA. Published April 20, 2015 EMC believes

More information

NetIQ Directory and Resource Administrator NetIQ Exchange Administrator. Installation Guide

NetIQ Directory and Resource Administrator NetIQ Exchange Administrator. Installation Guide NetIQ Directory and Resource Administrator NetIQ Exchange Administrator Installation Guide August 2013 Legal Notice NetIQ Directory and Resource Administrator is protected by United States Patent No(s):

More information

A Study of Data Management Technology for Handling Big Data

A Study of Data Management Technology for Handling Big Data Available Online at www.ijcsmc.com International Journal of Computer Science and Mobile Computing A Monthly Journal of Computer Science and Information Technology IJCSMC, Vol. 3, Issue. 9, September 2014,

More information

Cisco TelePresence Authenticating Cisco VCS Accounts Using LDAP

Cisco TelePresence Authenticating Cisco VCS Accounts Using LDAP Cisco TelePresence Authenticating Cisco VCS Accounts Using LDAP Deployment Guide Cisco VCS X8.1 D14465.06 December 2013 Contents Introduction 3 Process summary 3 LDAP accessible authentication server configuration

More information

Dell One Identity Cloud Access Manager 8.0.1 - How to Configure for SSO to SAP NetWeaver using SAML 2.0

Dell One Identity Cloud Access Manager 8.0.1 - How to Configure for SSO to SAP NetWeaver using SAML 2.0 Dell One Identity Cloud Access Manager 8.0.1 - How to Configure for SSO to SAP NetWeaver using SAML 2.0 May 2015 About this guide Prerequisites and requirements NetWeaver configuration Legal notices About

More information

RSA Authentication Manager 8.1 Virtual Appliance Getting Started

RSA Authentication Manager 8.1 Virtual Appliance Getting Started RSA Authentication Manager 8.1 Virtual Appliance Getting Started Thank you for purchasing RSA Authentication Manager 8.1, the world s leading two-factor authentication solution. This document provides

More information

User and Programmer Guide for the FI- STAR Monitoring Service SE

User and Programmer Guide for the FI- STAR Monitoring Service SE User and Programmer Guide for the FI- STAR Monitoring Service SE FI-STAR Beta Release Copyright 2014 - Yahya Al-Hazmi, Technische Universität Berlin This document gives a short guide on how to use the

More information

Integrating SAP BusinessObjects with Hadoop. Using a multi-node Hadoop Cluster

Integrating SAP BusinessObjects with Hadoop. Using a multi-node Hadoop Cluster Integrating SAP BusinessObjects with Hadoop Using a multi-node Hadoop Cluster May 17, 2013 SAP BO HADOOP INTEGRATION Contents 1. Installing a Single Node Hadoop Server... 2 2. Configuring a Multi-Node

More information

Peers Techno log ies Pv t. L td. HADOOP

Peers Techno log ies Pv t. L td. HADOOP Page 1 Peers Techno log ies Pv t. L td. Course Brochure Overview Hadoop is a Open Source from Apache, which provides reliable storage and faster process by using the Hadoop distibution file system and

More information

Architecting the Future of Big Data

Architecting the Future of Big Data Hive ODBC Driver User Guide Revised: July 22, 2014 2012-2014 Hortonworks Inc. All Rights Reserved. Parts of this Program and Documentation include proprietary software and content that is copyrighted and

More information

Rackspace Cloud Big Data Platform On-demand Big Data processing platform

Rackspace Cloud Big Data Platform On-demand Big Data processing platform Rackspace Cloud Big Data Platform On-demand Big Data processing platform Rackspace Cloud Big Data Platform: On-demand Big Data Processing Platform Cover Table of Contents Introduction 1 Challenges of Managing

More information

IBM WebSphere Portal Reference Guide Release 9.2

IBM WebSphere Portal Reference Guide Release 9.2 [1]JD Edwards EnterpriseOne IBM WebSphere Portal Reference Guide Release 9.2 E53620-03 March 2016 Describes how to use this guide to supplement the use of the IBM WebSphere Portal with Oracle JD Edwards

More information

INTEGRATION GUIDE. DIGIPASS Authentication for Citrix NetScaler (with AGEE)

INTEGRATION GUIDE. DIGIPASS Authentication for Citrix NetScaler (with AGEE) INTEGRATION GUIDE DIGIPASS Authentication for Citrix NetScaler (with AGEE) Disclaimer Disclaimer of Warranties and Limitation of Liabilities All information contained in this document is provided 'as is';

More information

HADOOP MOCK TEST HADOOP MOCK TEST I

HADOOP MOCK TEST HADOOP MOCK TEST I http://www.tutorialspoint.com HADOOP MOCK TEST Copyright tutorialspoint.com This section presents you various set of Mock Tests related to Hadoop Framework. You can download these sample mock tests at

More information

CA Unified Infrastructure Management

CA Unified Infrastructure Management CA Unified Infrastructure Management Probe Guide for IIS Server Monitoring iis v1.7 series Copyright Notice This online help system (the "System") is for your informational purposes only and is subject

More information

Setup Guide Access Manager 3.2 SP3

Setup Guide Access Manager 3.2 SP3 Setup Guide Access Manager 3.2 SP3 August 2014 www.netiq.com/documentation Legal Notice THIS DOCUMENT AND THE SOFTWARE DESCRIBED IN THIS DOCUMENT ARE FURNISHED UNDER AND ARE SUBJECT TO THE TERMS OF A LICENSE

More information

Oracle Virtual Desktop Infrastructure. VDI Demo (Microsoft Remote Desktop Services) for Version 3.2

Oracle Virtual Desktop Infrastructure. VDI Demo (Microsoft Remote Desktop Services) for Version 3.2 Oracle Virtual Desktop Infrastructure VDI Demo (Microsoft Remote Desktop Services) for Version 2 April 2011 Copyright 2011, Oracle and/or its affiliates. All rights reserved. This software and related

More information

NIST/ITL CSD Biometric Conformance Test Software on Apache Hadoop. September 2014. National Institute of Standards and Technology (NIST)

NIST/ITL CSD Biometric Conformance Test Software on Apache Hadoop. September 2014. National Institute of Standards and Technology (NIST) NIST/ITL CSD Biometric Conformance Test Software on Apache Hadoop September 2014 Dylan Yaga NIST/ITL CSD Lead Software Designer Fernando Podio NIST/ITL CSD Project Manager National Institute of Standards

More information

Application Discovery Manager User s Guide vcenter Application Discovery Manager 6.2.1

Application Discovery Manager User s Guide vcenter Application Discovery Manager 6.2.1 Application Discovery Manager User s Guide vcenter Application Discovery Manager 6.2.1 This document supports the version of each product listed and supports all subsequent versions until the document

More information

Version 3.8. Installation Guide

Version 3.8. Installation Guide Version 3.8 Installation Guide Copyright 2007 Jetro Platforms, Ltd. All rights reserved. This document is being furnished by Jetro Platforms for information purposes only to licensed users of the Jetro

More information

Hadoop & Spark Using Amazon EMR

Hadoop & Spark Using Amazon EMR Hadoop & Spark Using Amazon EMR Michael Hanisch, AWS Solutions Architecture 2015, Amazon Web Services, Inc. or its Affiliates. All rights reserved. Agenda Why did we build Amazon EMR? What is Amazon EMR?

More information