HDFS - Java API. 2012 coreservlets.com and Dima May




2012 coreservlets.com and Dima May

HDFS - Java API

Originals of slides and source code for examples: http://www.coreservlets.com/hadoop-tutorial/
Also see the customized Hadoop training courses (onsite or at public venues): http://courses.coreservlets.com/hadoop-training.html

For live customized Hadoop training (including prep for the Cloudera certification exam), please email info@coreservlets.com. Taught by a recognized Hadoop expert who spoke on Hadoop several times at JavaOne, and who uses Hadoop daily in real-world apps. Available at public venues, or customized versions can be held on-site at your organization.

Courses developed and taught by Marty Hall (JSF 2.2, PrimeFaces, servlets/JSP, Ajax, jQuery, Android development, Java 7 or 8 programming, custom mix of topics) and by coreservlets.com experts (Spring, Hibernate/JPA, GWT, Hadoop, HTML5, RESTful Web Services; edited by Marty). Courses available in any state or country; Maryland/DC-area companies can also choose afternoon/evening courses. Contact info@coreservlets.com for details.

Agenda: Java API introduction, Configuration, reading data, writing data, browsing the file system.

File System Java API. org.apache.hadoop.fs.FileSystem is an abstract class that serves as a generic file system representation. Note it's a class and not an interface. It is implemented in several flavors, e.g. distributed or local.

FileSystem Implementations. Hadoop ships with multiple concrete implementations:

org.apache.hadoop.fs.LocalFileSystem - good old native file system using local disk(s)
org.apache.hadoop.hdfs.DistributedFileSystem - Hadoop Distributed File System (HDFS); we will mostly focus on this implementation
org.apache.hadoop.hdfs.HftpFileSystem - access HDFS in read-only mode over HTTP
org.apache.hadoop.fs.ftp.FTPFileSystem - file system on an FTP server

Two options are backed by the Amazon S3 cloud: org.apache.hadoop.fs.s3.S3FileSystem (http://wiki.apache.org/hadoop/AmazonS3) and org.apache.hadoop.fs.kfs.KosmosFileSystem, backed by CloudStore (http://code.google.com/p/kosmosfs).

FileSystem Implementations. Different use cases call for different concrete implementations; HDFS (org.apache.hadoop.hdfs.DistributedFileSystem) is the most common choice.

SimpleLocalLs.java Example:

public class SimpleLocalLs {
    public static void main(String[] args) throws Exception {
        Path path = new Path("/");
        if (args.length == 1) {
            path = new Path(args[0]);    // read location from the command line arguments
        }
        Configuration conf = new Configuration();
        FileSystem fs = FileSystem.get(conf);         // acquire FileSystem instance
        FileStatus[] files = fs.listStatus(path);     // list the files and directories under the provided path
        for (FileStatus file : files) {
            System.out.println(file.getPath().getName());    // print each sub-directory or file to the screen
        }
    }
}

FileSystem API: Path. Hadoop's Path object represents a file or a directory - not java.io.File, which is tightly coupled to the local filesystem. A Path is really a URI on the FileSystem. HDFS: hdfs://localhost/user/file1; local: file:///user/file1. Examples:

new Path("/test/file1.txt");
new Path("hdfs://localhost:9000/test/file1.txt");

Hadoop's Configuration Object. The Configuration object stores client and server configuration. It is very heavily used in Hadoop: HDFS, MapReduce, HBase, etc. It follows a simple key-value paradigm: a wrapper for the java.util.Properties class, which itself is just a wrapper for java.util.Hashtable. Several construction options:

Configuration conf1 = new Configuration();
Configuration conf2 = new Configuration(conf1);    // conf2 is seeded with the configuration of the conf1 object

Hadoop's Configuration Object. Getting properties is simple! Get the property:

String nnName = conf.get("fs.default.name");    // returns null if the property doesn't exist

Get the property, and if it doesn't exist return the provided default:

String nnName = conf.get("fs.default.name", "hdfs://localhost:9000");

There are also typed versions of these methods: getBoolean, getInt, getFloat, etc. Example:

int prop = conf.getInt("file.size", 0);    // typed getters take a default value

The Configuration is usually seeded via configuration files read from the CLASSPATH (files like conf/core-site.xml and conf/hdfs-site.xml):

Configuration conf = new Configuration();
conf.addResource(new Path(HADOOP_HOME + "/conf/core-site.xml"));

Resources must comply with the Configuration XML schema, e.g.:

<?xml version="1.0"?>
<?xml-stylesheet type="text/xsl" href="configuration.xsl"?>
<configuration>
  <property>
    <name>fs.default.name</name>
    <value>hdfs://localhost:9000</value>
  </property>
</configuration>
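Since Configuration wraps java.util.Properties, its get-with-default behavior can be explored with plain Properties - a local analogy using only the standard library, not the Hadoop API itself:

```java
import java.util.Properties;

public class ConfigDemo {
    public static void main(String[] args) {
        Properties conf = new Properties();

        // Property not set yet: a plain get returns null
        System.out.println(conf.getProperty("fs.default.name"));

        // Get with a fallback, mirroring conf.get(name, defaultValue)
        System.out.println(conf.getProperty("fs.default.name",
                "hdfs://localhost:9000"));

        // Once set, the stored value wins over the default
        conf.setProperty("fs.default.name", "hdfs://localhost:8111");
        System.out.println(conf.getProperty("fs.default.name",
                "hdfs://localhost:9000"));
    }
}
```

The same lookup order (explicit value, then supplied default, then null) is what the Hadoop examples below rely on.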

Hadoop Configuration Object. conf.addResource() - the parameter is either a String or a Path:

conf.addResource("hdfs-site.xml");    // the CLASSPATH is searched when a String parameter is provided
conf.addResource(new Path("/exact/location/file.xml"));    // a Path points to the exact location on the local file system

By default Hadoop loads core-default.xml (located at hadoop-common-x.x.x.jar/core-default.xml) and core-site.xml; it looks for these configuration files on the CLASSPATH.

LoadConfigurations.java Example:

public class LoadConfigurations {
    private final static String PROP_NAME = "fs.default.name";
    public static void main(String[] args) {
        Configuration conf = new Configuration();
        // 1. Print the property with an empty Configuration
        System.out.println("After construction: " + conf.get(PROP_NAME));
        // 2. Add properties from core-site.xml
        conf.addResource(new Path(Vars.HADOOP_HOME + "/conf/core-site.xml"));
        System.out.println("After addResource: " + conf.get(PROP_NAME));
        // 3. Manually set the property
        conf.set(PROP_NAME, "hdfs://localhost:8111");
        System.out.println("After set: " + conf.get(PROP_NAME));
    }
}

Run LoadConfigurations:

$ java -cp $PLAY_AREA/HadoopSamples.jar:$HADOOP_HOME/share/hadoop/common/hadoop-common-2.0.0-cdh4.0.0.jar:$HADOOP_HOME/share/hadoop/common/lib/* hdfs.LoadConfigurations
After construction: file:///
After addResource: hdfs://localhost:9000
After set: hdfs://localhost:8111

FileSystem API. Recall that FileSystem is a generic abstract class used to interface with a file system. The FileSystem class also serves as a factory for concrete implementations, with the following method:

public static FileSystem get(Configuration conf)

It uses information from the Configuration such as scheme and authority. Recall that Hadoop loads conf/core-site.xml by default, and core-site.xml typically sets the fs.default.name property to something like hdfs://localhost:8020, so org.apache.hadoop.hdfs.DistributedFileSystem - otherwise known as HDFS - will be used by default.

Simple List Example:

public class SimpleLocalLs {
    public static void main(String[] args) throws Exception {
        Path path = new Path("/");
        if (args.length == 1) {
            path = new Path(args[0]);
        }
        Configuration conf = new Configuration();
        FileSystem fs = FileSystem.get(conf);
        FileStatus[] files = fs.listStatus(path);
        for (FileStatus file : files) {
            System.out.println(file.getPath().getName());
        }
    }
}

What happens when you run this code?

Execute Simple List Example. The answer is... it depends.

$ java -cp $PLAY_AREA/HadoopSamples.jar:$HADOOP_HOME/share/hadoop/common/hadoop-common-2.0.0-cdh4.0.0.jar:$HADOOP_HOME/share/hadoop/common/lib/* hdfs.SimpleLs
lib
...
var
sbin
etc

This uses the java command, not yarn: core-site.xml and core-default.xml are not on the CLASSPATH, so their properties are NOT added to the Configuration object, the default FileSystem is loaded, and you get the local file system.

$ yarn jar $PLAY_AREA/HadoopSamples.jar hdfs.SimpleLs
hbase
lost+found
test1
tmp
training
user

The yarn script places core-default.xml and core-site.xml on the CLASSPATH; the properties within those files are added to the Configuration object, and HDFS is utilized, since it was specified in core-site.xml.

Reading Data from HDFS: 1. Create a FileSystem. 2. Open an InputStream to a Path. 3. Copy bytes using IOUtils. 4. Close the stream.

1: Create FileSystem

FileSystem fs = FileSystem.get(new Configuration());

If you run with the yarn command, DistributedFileSystem (HDFS) will be created. It utilizes the fs.default.name property from the configuration; recall that the Hadoop framework loads core-site.xml, which sets that property to HDFS (hdfs://localhost:8020).

2: Open Input Stream to a Path

InputStream input = null;
try {
    input = fs.open(fileToRead);
    ...

fs.open returns org.apache.hadoop.fs.FSDataInputStream; another FileSystem implementation would return its own custom implementation of InputStream. The stream is opened with a default buffer of 4K; if you want to provide your own buffer size, use fs.open(Path f, int bufferSize).

3: Copy Bytes Using IOUtils

IOUtils.copyBytes(inputStream, outputStream, buffer);

This copies bytes from an InputStream to an OutputStream; Hadoop's IOUtils makes the task simple. The buffer parameter specifies the number of bytes to buffer at a time.
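What copyBytes does is essentially a fixed-size buffered copy loop; a plain-Java sketch of that pattern (this is an illustration in standard Java, not Hadoop's actual IOUtils source):

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;
import java.nio.charset.StandardCharsets;

public class CopyBytesDemo {
    // Copy from input to output, bufferSize bytes at a time,
    // the buffered-copy pattern that IOUtils.copyBytes performs
    static void copyBytes(InputStream in, OutputStream out, int bufferSize) throws IOException {
        byte[] buf = new byte[bufferSize];
        int n;
        while ((n = in.read(buf)) != -1) {
            out.write(buf, 0, n);
        }
    }

    public static void main(String[] args) throws IOException {
        InputStream in = new ByteArrayInputStream(
                "Hello from readMe.txt".getBytes(StandardCharsets.UTF_8));
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        copyBytes(in, out, 4096);    // 4096 mirrors the default 4K buffer
        System.out.println(out.toString(StandardCharsets.UTF_8));
    }
}
```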

4: Close Stream

} finally {
    IOUtils.closeStream(input);
}

Utilize IOUtils to avoid the boilerplate code that catches IOException.

ReadFile.java Example:

public class ReadFile {
    public static void main(String[] args) throws IOException {
        Path fileToRead = new Path("/training/data/readMe.txt");
        FileSystem fs = FileSystem.get(new Configuration());    // 1: Create FileSystem
        InputStream input = null;
        try {
            input = fs.open(fileToRead);                        // 2: Open InputStream
            IOUtils.copyBytes(input, System.out, 4096);         // 3: Copy from input to output stream
        } finally {
            IOUtils.closeStream(input);                         // 4: Close stream
        }
    }
}

$ yarn jar $PLAY_AREA/HadoopSamples.jar hdfs.ReadFile
Hello from readMe.txt

Reading Data - Seek. FileSystem.open returns FSDataInputStream, an extension of java.io.DataInputStream. It supports random access and reading via two interfaces: PositionedReadable (read chunks of the stream at a given offset) and Seekable (seek to a particular position in the stream).

Seeking to a Position. FSDataInputStream implements the Seekable interface:

void seek(long pos) throws IOException

Seek to a particular position in the file; the next read will begin at that position. If you attempt to seek past the file boundary, an IOException is emitted. Seeking is a somewhat expensive operation - strive for streaming rather than seeking.

long getPos() throws IOException

Returns the current position/offset from the beginning of the stream/file.
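The seek/getPos semantics are analogous to java.io.RandomAccessFile on the local file system, so they can be tried out without a cluster. A sketch under that analogy, using a temporary local file in place of the HDFS path from the slides:

```java
import java.io.IOException;
import java.io.RandomAccessFile;
import java.nio.charset.StandardCharsets;
import java.nio.file.Files;
import java.nio.file.Path;

public class SeekDemo {
    public static void main(String[] args) throws IOException {
        // Local stand-in for /training/data/readMe.txt
        Path tmp = Files.createTempFile("readMe", ".txt");
        Files.write(tmp, "Hello from readMe.txt".getBytes(StandardCharsets.UTF_8));

        try (RandomAccessFile in = new RandomAccessFile(tmp.toFile(), "r")) {
            byte[] buf = new byte[(int) in.length()];
            in.readFully(buf);
            System.out.println("start position=0: " + new String(buf, StandardCharsets.UTF_8));

            in.seek(11);    // next read begins at offset 11, just like FSDataInputStream.seek
            byte[] rest = new byte[(int) (in.length() - in.getFilePointer())];
            in.readFully(rest);
            System.out.println("start position=11: " + new String(rest, StandardCharsets.UTF_8));
        } finally {
            Files.deleteIfExists(tmp);
        }
    }
}
```

Offset 11 skips "Hello from " and leaves "readMe.txt", matching the SeekReadFile output shown below.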

SeekReadFile.java Example:

public class SeekReadFile {
    public static void main(String[] args) throws IOException {
        Path fileToRead = new Path("/training/data/readMe.txt");
        FileSystem fs = FileSystem.get(new Configuration());
        FSDataInputStream input = null;
        try {
            input = fs.open(fileToRead);    // start at position 0
            System.out.print("start position=" + input.getPos() + ": ");
            IOUtils.copyBytes(input, System.out, 4096, false);
            input.seek(11);                 // seek to position 11
            System.out.print("start position=" + input.getPos() + ": ");
            IOUtils.copyBytes(input, System.out, 4096, false);
            input.seek(0);                  // seek back to 0
            System.out.print("start position=" + input.getPos() + ": ");
            IOUtils.copyBytes(input, System.out, 4096, false);
        } finally {
            IOUtils.closeStream(input);
        }
    }
}

Run SeekReadFile Example:

$ yarn jar $PLAY_AREA/HadoopSamples.jar hdfs.SeekReadFile
start position=0: Hello from readMe.txt
start position=11: readMe.txt
start position=0: Hello from readMe.txt

Write Data: 1. Create a FileSystem instance. 2. Open an OutputStream - FSDataOutputStream in this case - directly to a Path from the FileSystem; this creates all needed directories on the provided path. 3. Copy data using IOUtils.

WriteToFile.java Example:

public class WriteToFile {
    public static void main(String[] args) throws IOException {
        String textToWrite = "Hello HDFS! Elephants are awesome!\n";
        InputStream in = new BufferedInputStream(
                new ByteArrayInputStream(textToWrite.getBytes()));
        Path toHdfs = new Path("/training/playArea/writeMe.txt");
        Configuration conf = new Configuration();
        FileSystem fs = FileSystem.get(conf);           // 1: Create FileSystem instance
        FSDataOutputStream out = fs.create(toHdfs);     // 2: Open OutputStream
        IOUtils.copyBytes(in, out, conf);               // 3: Copy data
    }
}

Run WriteToFile:

$ yarn jar $PLAY_AREA/HadoopSamples.jar hdfs.WriteToFile
$ hdfs dfs -cat /training/playArea/writeMe.txt
Hello HDFS! Elephants are awesome!

FileSystem: Writing Data. Append to the end of an existing file with fs.append(path). Append support is optional for a concrete FileSystem; HDFS supports it. There is no support for writing in the middle of a file.

FileSystem: Writing Data. FileSystem's create and append methods have overloaded versions that take a callback interface to notify the client of progress:

FileSystem fs = FileSystem.get(conf);
FSDataOutputStream out = fs.create(toHdfs, new Progressable() {
    @Override
    public void progress() {
        System.out.print("..");    // report progress to the screen
    }
});

Overwrite Flag. Recall that FileSystem's create(Path) creates all the directories on the provided path, so create(new Path("/doesnt_exist/doesnt_exist/file/txt")) can be dangerous. If you want to protect yourself, utilize the following overloaded method:

public FSDataOutputStream create(Path f, boolean overwrite)

Set overwrite to false to make sure you do not overwrite important data.

Overwrite Flag Example:

Path toHdfs = new Path("/training/playArea/writeMe.txt");
FileSystem fs = FileSystem.get(conf);
FSDataOutputStream out = fs.create(toHdfs, false);    // false: do not overwrite important data

$ yarn jar $PLAY_AREA/HadoopSamples.jar hdfs.BadWriteToFile
Exception in thread "main" org.apache.hadoop.ipc.RemoteException: java.io.IOException: failed to create file /training/playArea/anotherSubDir/writeMe.txt on client 127.0.0.1 either because the filename is invalid or the file exists
...

The error indicates that the file already exists.

Copy/Move from and to Local FileSystem. Higher-level abstractions allow you to copy and move from and to HDFS: copyFromLocalFile, moveFromLocalFile, copyToLocalFile, moveToLocalFile.

Copy from Local to HDFS:

FileSystem fs = FileSystem.get(new Configuration());
Path fromLocal = new Path("/home/hadoop/Training/exercises/sample_data/hamlet.txt");
Path toHdfs = new Path("/training/playArea/hamlet.txt");
fs.copyFromLocalFile(fromLocal, toHdfs);    // copy file from local file system to HDFS

$ hdfs dfs -ls /training/playArea/
(empty directory before the copy)
$ yarn jar $PLAY_AREA/HadoopSamples.jar hdfs.CopyToHdfs
$ hdfs dfs -ls /training/playArea/
Found 1 items
-rw-r--r--   hadoop supergroup   /training/playArea/hamlet.txt

The file was copied.

Delete Data. FileSystem.delete(Path path, boolean recursive): if recursive == true then a non-empty directory will be deleted, otherwise an IOException is emitted.

Path toDelete = new Path("/training/playArea/writeMe.txt");
boolean isDeleted = fs.delete(toDelete, false);
System.out.println("Deleted: " + isDeleted);

$ yarn jar $PLAY_AREA/HadoopSamples.jar hdfs.DeleteFile
Deleted: true
$ yarn jar $PLAY_AREA/HadoopSamples.jar hdfs.DeleteFile
Deleted: false

The second run prints false because the file was already deleted by the previous run.

FileSystem: mkdirs. Create a directory - this will create all the parent directories:

Configuration conf = new Configuration();
Path newDir = new Path("/training/playArea/newDir");
FileSystem fs = FileSystem.get(conf);
boolean created = fs.mkdirs(newDir);
System.out.println(created);

$ yarn jar $PLAY_AREA/HadoopSamples.jar hdfs.MkDir
true

FileSystem: listStatus. Browse the FileSystem with the listStatus() methods:

Path path = new Path("/");
Configuration conf = new Configuration();
FileSystem fs = FileSystem.get(conf);
FileStatus[] files = fs.listStatus(path);
for (FileStatus file : files) {
    System.out.println(file.getPath().getName());
}

$ yarn jar $PLAY_AREA/HadoopSamples.jar hdfs.SimpleLs
training
user

This lists the files under / for HDFS.

LsWithPathFilter.java Example. Restrict the result of listStatus() by supplying a PathFilter object:

FileSystem fs = FileSystem.get(conf);
FileStatus[] files = fs.listStatus(path, new PathFilter() {
    @Override
    public boolean accept(Path path) {
        // do not show a path whose name equals "user"
        if (path.getName().equals("user")) {
            return false;
        }
        return true;
    }
});
for (FileStatus file : files) {
    System.out.println(file.getPath().getName());
}

Run LsWithPathFilter Example:

$ yarn jar $PLAY_AREA/HadoopSamples.jar hdfs.SimpleLs
training
user
$ yarn jar $PLAY_AREA/HadoopSamples.jar hdfs.LsWithPathFilter
training

FileSystem: Globbing. FileSystem supports file-name pattern matching via the globStatus() methods - good for traversing a sub-set of files by using a pattern. Support is similar to bash globs: *, ?, etc.

SimpleGlobbing.java:

public class SimpleGlobbing {
    public static void main(String[] args) throws IOException {
        Path glob = new Path(args[0]);    // read glob from the command line
        FileSystem fs = FileSystem.get(new Configuration());
        FileStatus[] files = fs.globStatus(glob);    // similar usage to the listStatus method
        for (FileStatus file : files) {
            System.out.println(file.getPath().getName());
        }
    }
}

Run SimpleGlobbing:

$ hdfs dfs -ls /training/data/glob/
Found 4 items
drwxr-xr-x - hadoop supergroup 0 2011-12-24 11:20 /training/data/glob/2007
drwxr-xr-x - hadoop supergroup 0 2011-12-24 11:20 /training/data/glob/2008
drwxr-xr-x - hadoop supergroup 0 2011-12-24 11:21 /training/data/glob/2010
drwxr-xr-x - hadoop supergroup 0 2011-12-24 11:21 /training/data/glob/2011
$ yarn jar $PLAY_AREA/HadoopSamples.jar hdfs.SimpleGlobbing /training/data/glob/201*
2010
2011

This shows the usage of a glob with *.

FileSystem: Globbing. Glob syntax:

?            Matches any single character.
*            Matches zero or more characters.
[abc]        Matches a single character from character set {a,b,c}.
[a-b]        Matches a single character from the character range {a...b}. Note that character a must be lexicographically less than or equal to character b.
[^a]         Matches a single character that is not from character set or range {a}. Note that the ^ character must occur immediately to the right of the opening bracket.
\c           Removes (escapes) any special meaning of character c.
{ab,cd}      Matches a string from the string set {ab, cd}.
{ab,c{de,fh}}  Matches a string from the string set {ab, cde, cfh}.

Source: FileSystem.globStatus API documentation
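Most of this syntax can be experimented with locally via java.nio.file's glob PathMatcher, which supports a similar (not identical - e.g. NIO globs negate a character set with [!a] rather than [^a]) pattern language. A sketch using only the standard library:

```java
import java.nio.file.FileSystems;
import java.nio.file.Path;
import java.nio.file.PathMatcher;

public class GlobDemo {
    // Returns true if the NIO glob pattern matches the given name
    static boolean matches(String pattern, String name) {
        PathMatcher m = FileSystems.getDefault().getPathMatcher("glob:" + pattern);
        return m.matches(Path.of(name));
    }

    public static void main(String[] args) {
        System.out.println(matches("201*", "2010"));        // * matches zero or more characters
        System.out.println(matches("201?", "2011"));        // ? matches a single character
        System.out.println(matches("20[01]1", "2011"));     // character set
        System.out.println(matches("{2010,2011}", "2008")); // string set: no match
    }
}
```

The first pattern, 201*, is the same one used in the SimpleGlobbing run above, matching 2010 and 2011 but not 2007 or 2008.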

FileSystem. Several methods return true for success and false for failure: delete, rename, mkdirs. What to do if a method returns false? Check the Namenode's log, located at $HADOOP_LOG_DIR/.

BadRename.java:

FileSystem fs = FileSystem.get(new Configuration());
Path source = new Path("/does/not/exist/file.txt");
Path nonExistentPath = new Path("/does/not/exist/file1.txt");
boolean result = fs.rename(source, nonExistentPath);
System.out.println("Rename: " + result);

$ yarn jar $PLAY_AREA/HadoopSamples.jar hdfs.BadRename
Rename: false

Namenode's log at $HADOOP_HOME/logs/hadoop-hadoop-namenode-hadoop-laptop.log:

2011-12-25 01:18:54,684 WARN org.apache.hadoop.hdfs.StateChange: DIR* FSDirectory.unprotectedRenameTo: failed to rename /does/not/exist/file.txt to /does/not/exist/file1.txt because source does not exist

2012 coreservlets.com and Dima May - Wrap-Up

Summary. We learned about the HDFS API: how to use the Configuration class, how to read from HDFS, how to write to HDFS, and how to browse HDFS.

2012 coreservlets.com and Dima May - Questions? More info:

http://www.coreservlets.com/hadoop-tutorial/ - Hadoop programming tutorial
http://courses.coreservlets.com/hadoop-training.html - customized Hadoop training courses, at public venues or onsite at your organization
http://courses.coreservlets.com/course-materials/java.html - general Java programming tutorial
http://www.coreservlets.com/java-8-tutorial/ - Java 8 tutorial
http://www.coreservlets.com/jsf-tutorial/jsf2/ - JSF 2.2 tutorial
http://www.coreservlets.com/jsf-tutorial/primefaces/ - PrimeFaces tutorial
http://coreservlets.com/ - JSF 2, PrimeFaces, Java 7 or 8, Ajax, jQuery, Hadoop, RESTful Web Services, Android, HTML5, Spring, Hibernate, Servlets, JSP, GWT, and other Java EE training