Introduction to analyzing big data using Amazon Web Services


Introduction to analyzing big data using Amazon Web Services

This tutorial accompanies the BARC seminar given at Whitehead on January 31. It contains instructions for:

1. Getting started with Amazon Web Services
2. Navigating S3 buckets from the command line
3. Creating and logging into an Amazon EC2 instance
4. Running the case study MapReduce job

This tutorial assumes you are working in a UNIX environment and are reasonably comfortable with using command line tools. Any commands that should be entered at the terminal will be denoted by Courier New font in a text box. Prompts are preceded by a $, whereas command line output is not. For example:

$ echo Hello World
Hello World

Getting started with Amazon Web Services

Sign up for an AWS account

1. Go to aws.amazon.com.
2. Click Sign up and follow the steps. You will need to enter credit card information to use AWS, even if you only use free services. All steps presented in this tutorial besides the case study use free resources; the case study will cost a few dollars.
3. After you have signed up, log in to your account.

Go to the AWS Console

The AWS console is where you can access the user interface for all the Amazon cloud services. To go to the console, either click My Account/Console -> AWS Management Console, or go to console.aws.amazon.com. This will bring you to a page looking something like this:

We will return to this page to start using each of the services covered here.

Navigating S3 buckets from the command line

S3cmd is a useful tool for interfacing with Amazon S3 from the command line. Here we will go over how to install and set it up.

Setting up s3cmd

You can download s3cmd using apt-get:

$ sudo apt-get install s3cmd

Before you can use s3cmd, you will need to configure it using your AWS credentials. To do this, run:

$ s3cmd --configure

Enter new values or accept defaults in brackets with Enter.
Refer to user manual for detailed description of all options.

Access key and Secret key are your identifiers for Amazon S3
Access Key:

This will prompt you to enter your access key. You can find this at the AWS console. From the console homepage, you will see your username in the top right corner. Click it to bring up a drop-down box, and go to the Security Credentials option. Scroll down until you see the heading Access Credentials. Here you will find your access key (I have blanked out my personal key below, but you should see yours there):

Copy the key under Access Key ID and paste it into the command line prompt. It will now ask for your Secret key:

Access Key: XXXXXXXXX
Secret Key:

To get your secret key, click Show under Secret Access Key. Copy this and paste it into the command line prompt. You will then be given a series of prompts. Press Enter at all of these to leave the default settings. Continue pressing Enter until the prompt:

Test access with supplied credentials? [Y/n]

Press Y, then Enter to verify that everything worked. When asked if you want to save your settings, press Y again. This will save your credentials file at ~/.s3cfg. This file is required to run s3cmd.

S3cmd examples

Here we give several examples using s3cmd. To see the full range of options, type:

$ s3cmd --help

Example 1: Make a new S3 bucket, upload a file to the bucket, and view the bucket contents. Replace mgymrek with your own identifier.

$ s3cmd mb s3://mgymrek-test-s3
Bucket 's3://mgymrek-test-s3/' created
$ echo "Hello world" > hello-world.txt
$ s3cmd put hello-world.txt s3://mgymrek-test-s3/
$ s3cmd ls s3://mgymrek-test-s3
...          12   s3://mgymrek-test-s3/hello-world.txt

If you navigate to the S3 console, you will see your new bucket and can view its contents.

CAVEAT: avoid using anything besides dashes (-) and lower-case letters in S3 paths. In particular, never use _ in a filename; for some reason this breaks downstream steps. Many hours of unhappy debugging can be saved by following this simple rule of thumb.
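If you need to upload more than a single file, s3cmd can also copy a whole directory. This is not used in the rest of the tutorial, but a typical invocation looks roughly like the following (the local directory name here is just an illustration):

$ s3cmd sync ./my-data/ s3://mgymrek-test-s3/my-data/

sync only transfers files that have changed since the last upload, which is convenient when repeatedly pushing results to the same bucket.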

Example 2: View and download data from a public repository. The 1000 Genomes data is available in the S3 bucket s3://1000genomes. Below we view the contents of this bucket and download a small file from it:

$ s3cmd ls s3://1000genomes/
DIR   s3://1000genomes/alignment_indices/
DIR   s3://1000genomes/changelog_details/
DIR   s3://1000genomes/data/
DIR   s3://1000genomes/phase1/
DIR   s3://1000genomes/pilot_data/
DIR   s3://1000genomes/release/
DIR   s3://1000genomes/sequence_indices/
DIR   s3://1000genomes/technical/
...

$ s3cmd get s3://1000genomes/README.alignment_data
s3://1000genomes/README.alignment_data -> ./README.alignment_data  [1 of 1]
 ... done

This downloaded the README file with documentation on how samples were aligned to your local computer (the actual file is not important here; we just chose a small file to download as an example).
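The case study later in this tutorial pulls alignments from s3://1000genomes/phase1/data/<sample>/alignment/. You can browse that part of the bucket the same way; for example, for one of the samples used below (the exact file names you see may differ as the bucket is updated):

$ s3cmd ls s3://1000genomes/phase1/data/NA18499/alignment/

Each sample's alignment directory typically contains BAM files along with their index and statistics files.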

Creating and logging into an Amazon EC2 instance

Here we will see how to start EC2 compute nodes from the AWS console, and how to connect to a virtual instance from the command line. Before beginning this step, go to the EC2 console (select the us-east-1 region).

Generate a key pair

In order to log into any of your EC2 instances, you will need a key pair that will be used to verify your identity. From the EC2 console, in the menu on the left hand side under Network & Security, click the Key Pairs option. Then click the Create Key Pair button at the top of the page. First you will be asked to provide a name for your key pair.

Clicking Create will generate a key pair and automatically download a file <keypair-name>.pem. Store this file in a location you will remember; mine is stored in ~/keys/mgymrek_key.pem. You will need to change the permissions of this file so it is readable only by you, or else you will run into problems when we try to use the key to SSH into an instance:

$ chmod 400 ~/keys/mgymrek_key.pem

Set up your default security group

To make sure you'll be able to log into your EC2 instances over SSH, we just need to make sure the security settings allow this. On the left hand menu of the EC2 console, go to Security Groups under the heading Network & Security. Select Default from the list at the top, and then go to the Inbound tab in the bottom panel. For the option Create a new rule, select SSH. Then click Add Rule followed by Apply Rule Changes. Then you should be all set for the next step, launching the actual instance.
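As a quick sanity check before launching anything, you can confirm that the key file is now readable only by you (the owner, size, and date shown will differ on your machine):

$ ls -l ~/keys/mgymrek_key.pem
-r-------- 1 mgymrek mgymrek ... mgymrek_key.pem

The permissions string should start with -r--------; anything more permissive will cause SSH to refuse the key later.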

Launch an EC2 instance

Back at the EC2 console, at the menu on the left side click EC2 Dashboard. Click the blue button Launch Instance. This will bring up a pop-up screen. Select Classic Wizard and click Continue. Options marked with a star are eligible for free-tier pricing, which we'll use here. We will use the fourth option down, Ubuntu Server LTS. Click Select next to that option.

This will bring you to the next set of options, Instance Details. Here leave everything the same, except change the Availability Zone option to us-east-1a. It is important to always use the same zone, as data transfer from one zone to another is charged, whereas transfer within the same zone is free. Click Continue. We don't need to set any advanced configuration, so hit Continue three more times until you get to the Create Key Pair step. Select Choose from your existing Key Pairs and make sure the key pair you just created is selected. Click Continue to go to Configure Firewall. Select Choose one or more of your existing Security Groups and select the default group.

Hit Continue once more to review the settings for this instance. Everything should be all set, so click Launch. This will bring you to a page listing all your existing EC2 instances. The instance can take a minute or two to start up, and may say Pending. When Status Checks shows a green checkmark, you are ready to continue to the next step.

Log in and explore the EC2 instance

Selecting your new instance at the console will bring up information about that instance in the bottom panel. At the top of the bottom panel you will find the public DNS address, which you can use to SSH into your instance (shown highlighted below). We can now SSH into this instance, using the key pair that we generated earlier.

$ ssh -i ~/keys/mgymrek_key.pem ubuntu@ec2-XXX-XXX-XXX-XXX.compute-1.amazonaws.com

Here, substitute the public DNS address of your own instance. The -i argument takes the location where you stored your key file (the .pem file), and we log in with the user name ubuntu. Answer yes at the prompt "Are you sure you want to continue connecting". This will log you into your micro EC2 instance. From here you can do just about anything you can do from the command line on any Ubuntu machine. Below are a couple of simple examples. Note that these commands are entered on the EC2 instance, rather than on your own machine.

Example 1: Look at the default storage.

$ df -h
Filesystem      Size  Used Avail Use% Mounted on
/dev/xvda1      7.9G  773M  6.8G  11% /
udev            288M  8.0K  288M   1% /dev
tmpfs           119M  160K  118M   1% /run

So we have about 7 GB we can use on this machine. It is pretty small; that's why it's called a micro instance, and that's why it's free to use these!

Example 2: Install software.

Using sudo apt-get install requires first running sudo apt-get update to update the repository information. In this example, we update the package lists, then install R:

$ sudo apt-get update
$ sudo apt-get install r-base-core

Example 3: Transfer data (for free) from an S3 bucket.

Data transfer between AWS services is free, as long as it stays in the same geographic region; this tutorial assumes we are working in us-east-1a. Using s3cmd on an instance requires installing s3cmd and getting your credentials file onto the instance. This will come up again in the next section, when we run a MapReduce job that needs to call s3cmd. First, upload your credentials file from your computer to the instance:

$ scp -i ~/keys/mgymrek_key.pem /home/mgymrek/.s3cfg ubuntu@ec2-XXX-XXX-XXX-XXX.compute-1.amazonaws.com:/home/ubuntu/.s3cfg

Then install s3cmd on the instance, and transfer the test file we made earlier:

$ sudo apt-get install s3cmd
$ s3cmd get s3://mgymrek-test-s3/hello-world.txt
$ ls -l
-rw-rw-r-- 1 ubuntu ubuntu 12 Jan 30 05:41 hello-world.txt

Terminate the EC2 instance

When you are done running the instance, you should terminate it. At the console, right click on your instance and select Terminate.

Running the case study MapReduce job

In this simple Elastic MapReduce (EMR) example we will run SNP calling on the genomes of ten individuals from the 1000 Genomes Project (restricted to chromosome 20 to keep the example small). All examples here use the S3 bucket s3://mgymrek-barc-example.

Install and configure the elastic-mapreduce command line tool

It is possible to run EMR jobs from the AWS console, but there are great command line tools that give you more flexibility and that are quite easy to use. First download the elastic-mapreduce tools, which are based on Ruby (so you will need Ruby installed):

$ sudo apt-get install ruby-full
$ wget <URL of elastic-mapreduce-ruby.zip> -P emr/
$ unzip emr/elastic-mapreduce-ruby.zip -d emr/

To configure elastic-mapreduce, you will need to create a file with your credentials in the same directory where you unzipped the software. Create a file credentials.json there with the following contents:

{
  "access-id": "XXX",
  "private-key": "XXX",
  "key-pair": "mgymrek_key",
  "key-pair-file": "/home/mgymrek/keys/mgymrek_key.pem",
  "log-uri": "s3://mgymrek-test-s3/log"
}

Here access-id and private-key are the access key and secret key we went over earlier. key-pair is the name of the key created above, and key-pair-file is the path to the .pem file containing that key. log-uri is a location in one of your S3 buckets where EMR can store log files. To check that you have configured everything correctly, try:

$ cd emr
$ ./elastic-mapreduce

If everything is configured correctly, you should see a long help message with all the options for this tool. If something is wrong, you will get an error message complaining about your access credentials. Make sure credentials.json is in the emr directory.

Prepare inputs: list of sample IDs

The input to an EMR job consists of one map task per line. In this example, each map task will run SNP calling on a single genome, so our input file has one line with the accession of each genome to process. We create a file called genomeids.txt:

NA18499
NA18501
NA18502
NA18504
NA18505
NA18507
NA18508
NA18510
NA18511
NA18516

Bootstrapping: install software on each mapper

Each map instance in an EMR job is basically a fresh node with nothing you will need already installed on it. Any data, software, or general configuration each mapper needs in order to complete its map tasks can be set up using a bootstrap script. This script runs when a map instance starts, before it processes any map tasks. In our case, we will want to do the following:

Download our .s3cfg file so we can use s3cmd. In this example, the .s3cfg file is transferred from an S3 bucket. This is probably not very secure, and there may be a better way to do this.
Install all the software we'll need for SNP calling. We will perform SNP calling using VarScan, so we'll need to install VarScan, Java, and samtools.
Create directories where we'll store different data files. The storage space on the mapper nodes can be accessed in the /mnt directory, so we will create all data directories there.
Download and unzip the human reference genome, which we'll need for SNP calling.

We put all of these steps into a bash script named download-snptools.sh that will run on startup:

#!/bin/bash
set -e

# Download the s3cfg file
wget -S -T 10 -t 5 <URL of your .s3cfg file>
sudo mv .s3cfg /mnt/

# Install Java, s3cmd, and samtools
sudo apt-get update
sudo apt-get install -y default-jre s3cmd samtools

# Transfer VarScan from S3
sudo s3cmd -c /mnt/.s3cfg get s3://mgymrek-barc-example/tools/varscan.v2.3.3.jar /mnt/varscan.v2.3.3.jar

# Make directories to store data
sudo mkdir /mnt/alignments; sudo mkdir /mnt/genome; sudo mkdir /mnt/varscan
sudo chmod -R 777 /mnt/alignments/
sudo chmod -R 777 /mnt/genome/
sudo chmod -R 777 /mnt/varscan/

# Download and unzip the reference genome
sudo s3cmd -c /mnt/.s3cfg get s3://mgymrek-barc-example/human_g1k_v37.fasta.gz
mv human_g1k_v37.fasta.gz /mnt/genome/
gunzip /mnt/genome/human_g1k_v37.fasta.gz

If you run this example yourself, note that you will need to change the .s3cfg URL to wherever you have the file stored; anywhere you can access using wget is fine. I have removed my file from this location for security reasons. The VarScan jar file and reference genome are still in the S3 buckets referenced above, so you are welcome to download them from there.
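Before launching a full EMR job, it can save time to dry-run the bootstrap script on the Ubuntu EC2 instance from the previous section, since it is a similar environment to an EMR node. This is an optional sanity check, not part of the original workflow; remember to point the script at your own .s3cfg URL first, and note that it downloads the full reference genome, so it takes a while:

$ bash download-snptools.sh
$ which samtools java s3cmd
$ ls /mnt
alignments  genome  varscan

If the tools resolve and the directories exist, the same script should work on the EMR mappers.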

Create the mapper

Mappers follow a basic structure: a mapper reads from standard input, each line of input describes a separate task, and when no lines of input are left, it terminates. We will write the mapper in Python, but it can be in any language (as long as it is either supported by default on the map instances, or you install it during the bootstrap stage).

Here our mapper will take a genome accession from 1000 Genomes as each line of input, and a single task will consist of calling SNPs in that genome using VarScan. For each task, the mapper will:

Download the BAM file for that genome from the 1000 Genomes bucket.
Call samtools and VarScan for SNP calling.
Upload the results to S3.

The code for this mapper, which is in the file snpcall-mapper.py, is below:

#!/usr/bin/python
import sys
import os

S3_ONEKGDATAPATH = "s3://1000genomes/phase1/data"
S3_VARSCANPATH = "s3://mgymrek-barc-example/varscan"
ALIGNPATH = "/mnt/alignments"
PILEUPPATH = "/mnt/pileups"  # not used; the pileup is piped directly into VarScan
VARSCANPATH = "/mnt/varscan"
GENOMEPATH = "/mnt/genome/human_g1k_v37.fasta"
S3CONFIG = "/mnt/.s3cfg"

for line in sys.stdin:
    sample = line.strip()
    # Progress messages go to standard error so they are not mistaken for reducer input
    sys.stderr.write("Processing sample %s\n" % sample)
    # Download the BAM alignment (file names in the 1000 Genomes bucket also carry a date stamp)
    bamfile = "%s.chrom20.ILLUMINA.bwa.YRI.low_coverage.bam" % sample
    cmd = "s3cmd -c %s get %s/%s/alignment/%s %s/%s" % (S3CONFIG, S3_ONEKGDATAPATH, sample, bamfile, ALIGNPATH, bamfile)
    os.system(cmd)
    # Create a pileup and run VarScan
    resultsfile = "%s/%s.varscan" % (VARSCANPATH, sample)
    cmd = "samtools mpileup -f %s %s/%s | java -jar /mnt/varscan.v2.3.3.jar mpileup2snp > %s" % (GENOMEPATH, ALIGNPATH, bamfile, resultsfile)
    os.system(cmd)
    # Upload results to S3
    cmd = "s3cmd -c %s put %s %s/%s.varscan" % (S3CONFIG, resultsfile, S3_VARSCANPATH, sample)
    os.system(cmd)

One caveat about the mapper is that you should avoid printing any output to standard out: anything printed there is assumed to be input to the reducer. If you want to print debugging messages, be sure to write them to standard error instead; you can then view them in the logs later to debug.
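Because the mapper is just a program that reads accessions from standard input, you can smoke-test it outside of EMR by piping the input file into it on a machine where the bootstrap dependencies (s3cmd, samtools, Java, VarScan, and the reference genome under /mnt) are already set up. This is an optional check, not part of the original workflow:

$ head -n 1 genomeids.txt | python snpcall-mapper.py
$ s3cmd ls s3://mgymrek-barc-example/varscan/

Testing on a single accession like this catches path and permission problems much faster than waiting for a full EMR run.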

Upload data and scripts to S3

The last step before actually running the EMR job is to put the input file, the mapper, and the bootstrap script into S3. We will also need to make sure that the permissions and metadata for each are set correctly, which we can do from the S3 console.

Inputs:

$ s3cmd put genomeids.txt s3://mgymrek-barc-example/inputs/genomeids.txt

Mapper:

$ s3cmd put snpcall-mapper.py s3://mgymrek-barc-example/scripts/snpcall-mapper.py

S3 will automatically recognize that this is a Python script because of the header line. To check this, navigate to your mapper script in the S3 console. Select the mapper, right click and select Properties, then select the Metadata tab on the right; you should see the content type listed there.

Bootstrap script:

$ s3cmd put download-snptools.sh s3://mgymrek-barc-example/scripts/download-snptools.sh

Again, S3 will recognize the file type automatically. But we will have to set the permissions for the bootstrap script so that the mappers are allowed to open it. In the S3 console, navigate to the bootstrap script. Right click, select Properties, and then select the Permissions tab. Click Add more permissions, set Grantee: Everyone, select Open/download, and then click Save.
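If you prefer to stay on the command line, s3cmd can make the object publicly readable at upload time with the --acl-public flag, which accomplishes the same thing as the console permission change above (shown here as an alternative, not part of the original workflow):

$ s3cmd put --acl-public download-snptools.sh s3://mgymrek-barc-example/scripts/download-snptools.sh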

Run the EMR job!

We will run the job from the command line. There are quite a few options we will need to set. First we'll show the command, which is a bit intimidating; then we'll go through what each of those options means.

$ ./emr/elastic-mapreduce --create --stream --alive \
    --name snpcalls \
    --num-instances 3 \
    --slave-instance-type m1.medium \
    --master-instance-type m1.small \
    --availability-zone us-east-1a \
    --input s3n://mgymrek-barc-example/inputs/ \
    --mapper s3://mgymrek-barc-example/scripts/snpcall-mapper.py \
    --output s3n://mgymrek-barc-example/logs \
    --bootstrap-action s3://mgymrek-barc-example/scripts/download-snptools.sh \
    --bootstrap-action s3://elasticmapreduce/bootstrap-actions/configure-hadoop \
    --args "-s,mapred.map.tasks.speculative.execution=false,-s,mapred.tasktracker.map.tasks.maximum=1,-s,mapred.map.tasks=1,-s,mapred.reduce.tasks=0,-s,mapred.tasktracker.reduce.tasks.maximum=0"

Name: a unique identifier to reference this specific EMR job.

Num-instances: the total number of compute nodes used in the job. This includes the master and the slaves, so it needs to be at least two to allow for one master and one slave. Here we are doing a small example, so we set it to three. On big jobs, you can set this to tens or even hundreds of mappers to make your jobs extremely parallelized.

Slave-instance-type: what type of EC2 node to use for the slaves. This depends on the memory and space requirements of your map job. Here we don't need that much RAM, and we only need enough space to process a single genome at a time, so m1.medium is enough (it has 3+ GB of RAM, enough to handle loading the human genome).

Master-instance-type: what type of EC2 node to use for the master. Usually the master is not doing anything that intense; it just distributes jobs. The smallest type of instance you are allowed to use is m1.small, so that's what we use here.

Availability-zone: as mentioned above, to keep S3 data transfer free, always specify the same zone. A good default is to always use us-east-1a.

Input: the location in S3 from which EMR should read the input. This is always a folder, and EMR will read from any file that is in this folder. The s3n:// prefix (instead of s3://) is used to specify something that will be read as streaming input; always use the s3n prefix for input to EMR.

Mapper: the location in S3 of the mapper script. Again, the mapper script is an executable file that processes one line of standard input at a time; beyond that requirement, you can basically write whatever you want into the mapper script.

Output: a location in S3 where EMR can write any output from this job. It includes any log messages, and any output from the reducer if you use a reducer. Caveat: this location must be unique for each EMR job. You must specify a folder that does not already exist, or else EMR will complain.

Bootstrap-action: bootstrap scripts to run upon startup of each map node. You can specify as many bootstrap actions as you want. Here we specified two of them:
1. The first is our custom bootstrap script we defined earlier to download all the necessary software and data.
2. The second is one of the bootstrap scripts already provided by Amazon. It allows configuring Hadoop, which is the framework behind all the MapReduce jobs. The arguments we set look kind of cryptic, but they are all there to ensure that we don't run any reducers, and that each mapper runs only one task at a time so we don't run out of space. The full set of Hadoop parameters you can set this way is documented in the Hadoop configuration defaults (mapred-default.html); that page is very useful for any kind of advanced configuration of Hadoop for the specific needs of your jobs.

OK, now we are finally ready to run the EMR job! Simply run the above command at the command line. This command is in the shell script file run-emr-example.sh.

$ sh run-emr-example.sh
Created job flow j-2XFP2BJELRIL0

After you ran the command above, you should see a message that a job flow was created. Cross your fingers before moving on.
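You can also check on the job flow from the command line with the same tool used to create it. A rough sketch (the exact columns in the output depend on the tool version):

$ ./emr/elastic-mapreduce --list
j-2XFP2BJELRIL0   STARTING   snpcalls

This lists your job flows along with their current state, which mirrors what you will see at the console in the next step.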

Now, go to the AWS console. Navigate to the Elastic MapReduce console and you should see a job listed. At first its state will say Starting. You can select the job and view its properties in the tabs below; for instance, the Steps tab will tell you which steps have been completed so far. Eventually, if all is well, the state will change to Bootstrapping while the bootstrap steps are running. Finally, the status will change to Running, indicating that your EMR job is off and running!

View EMR progress

You can view the progress of your EMR job at the console. In the bottom panel, select the Monitoring tab. This will show you graphs of all kinds of helpful and fun information, such as how many jobs are running, how many are remaining, how many mappers are working at the moment, etc.

Examine outputs

Output of our map tasks will be in the directory we specified, which in this example was s3://mgymrek-barc-example/varscan/. We can look at what files are there using s3cmd (or on the S3 console):

$ s3cmd ls s3://mgymrek-barc-example/varscan/
...   s3://mgymrek-barc-example/varscan/NA18499.varscan

We can see that only one sample has finished so far. We can download this file and view the SNP calls:

$ s3cmd get s3://mgymrek-barc-example/varscan/NA18499.varscan .
$ head NA18499.varscan | cut -f 1,2,3,4
Chrom   Position   Ref   Var
20      ...        T     C
20      ...        A     C
20      ...        C     T
20      ...        C     T
20      ...        T     C
20      ...        T     C
20      ...        A     G
20      ...        G     C
20      ...        C     T

You can either transfer all the files to your local computer for downstream analysis, or do more computing with them on the cloud!

The sequel: debugging EMR jobs

A lot of things can go wrong during MapReduce jobs, and debugging these is a whole other tutorial. Here are some general tips:

You can log onto individual slave instances and look at the logs to get an idea of what went wrong. To do so, go to your EC2 console, find the slaves that are running, and SSH into them. You will need to use the same key pair you've been using, and this time log in with the user name hadoop instead of ubuntu.

$ ssh -i ~/keys/mgymrek_key.pem hadoop@ec2-XXX-XXX-XXX-XXX.compute-1.amazonaws.com

Navigate to the logs, which are stored in /mnt/var/log/. From there you can view the standard output and standard error of the bootstrap and mapper scripts. Logs for the bootstrap action are in the bootstrap-actions folder, and logs for map tasks are in the hadoop folder.

$ ls /mnt/var/log/
bootstrap-actions  hadoop  instance-controller  instance-state  service-nanny

At the EMR console, after selecting your job, the bottom tab will usually give informative status messages, such as the last state change or why something didn't work. For example, an error message may tell us that the bootstrap action failed on the master. To figure out the problem, we can then SSH into the master node, go to the logs for the bootstrap action, and likely find a helpful error message there.

Make sure your slave instance type is big enough for the job. If you find you start an EMR job but can't SSH into a slave even though the job says it's running, it may be running out of memory and stalling. Try bumping up the RAM.

If all else fails, there is a ton of documentation scattered around the internet. Googling problems tends to work.
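One last tip, related to cost rather than debugging: because the job was created with the --alive flag, the cluster stays up after the map tasks finish until you shut it down yourself. You can terminate it from the EMR console, or with the same command line tool, roughly like this (substitute the job flow ID reported when you created the job):

$ ./emr/elastic-mapreduce --terminate -j j-2XFP2BJELRIL0

Just as with the EC2 instance earlier, terminating the cluster when you are done is what stops the hourly charges.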
