This four-day course provides Java programmers with the training needed to create enterprise solutions using Apache Hadoop. It consists of an effective mix of interactive lecture and extensive hands-on lab exercises.
At the completion of the course, students will be able to:
- Execute an Apache Hadoop project from conception to completion.
- Write a MapReduce program using the Hadoop API.
- Utilize HDFS for effective loading and processing of data with CLI and API.
- Understand best practices for building, debugging, monitoring and optimizing Hadoop solutions.
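To preview the second objective, the MapReduce programming model the course teaches can be sketched in plain Java. This is a standalone illustration of the map/shuffle/reduce phases, not Hadoop code; a real job expresses the same word-count logic with `Mapper` and `Reducer` classes from the `org.apache.hadoop.mapreduce` API.

```java
import java.util.*;
import java.util.stream.*;

// Plain-Java sketch of the MapReduce model: a map phase emits
// (word, 1) pairs, a shuffle groups them by key, and a reduce
// phase sums the counts per key.
public class WordCountModel {

    // Map phase: split each input line into (word, 1) pairs.
    static List<Map.Entry<String, Integer>> map(String line) {
        return Arrays.stream(line.toLowerCase().split("\\s+"))
                .filter(w -> !w.isEmpty())
                .map(w -> Map.entry(w, 1))
                .collect(Collectors.toList());
    }

    // Shuffle + reduce phase: group pairs by key and sum the values.
    static Map<String, Integer> reduce(List<Map.Entry<String, Integer>> pairs) {
        return pairs.stream().collect(
                Collectors.groupingBy(Map.Entry::getKey,
                        Collectors.summingInt(Map.Entry::getValue)));
    }

    static Map<String, Integer> wordCount(List<String> lines) {
        List<Map.Entry<String, Integer>> pairs = lines.stream()
                .flatMap(l -> map(l).stream())
                .collect(Collectors.toList());
        return reduce(pairs);
    }

    public static void main(String[] args) {
        Map<String, Integer> counts =
                wordCount(List.of("hadoop is fast", "hadoop is scalable"));
        System.out.println(counts); // hadoop=2, is=2, fast=1, scalable=1 (order may vary)
    }
}
```

In a real Hadoop job the framework performs the shuffle between the two phases across the cluster; the course labs cover writing and running the equivalent `Mapper` and `Reducer` classes.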
Course agenda:
- MapReduce Code
- HDFS Intro
- HDFS Admin
- MapReduce – JobTracker, TaskTracker, and Running Jobs
- MapReduce Combiner
- MapReduce Partitioner
- MapReduce Distributed Cache
- MapReduce Streaming
- MapReduce Data Handling
- Pig Intro & Pig Latin Basics
- More Pig Latin
- Advanced Pig Latin
- Embedding Pig with Python
- Hive – Part 1
- Hive – Part 2
- Enterprise Integration
- Future of Hadoop
Extensive hands-on lab experience
Students will work through the following exercises using the Hortonworks Data Platform:
- Running a Hadoop Solution
- MapReduce Programming
- MapReduce in Operation
- MapReduce with Combiner
- MapReduce with Partitioner
- MapReduce with a Secondary Sort and a Custom Comparator
- MapReduce with Distributed Cache
- MapReduce with Data Handling
- MapReduce with Streaming
- Using Pig for a Join
- Using Pig for a Clustering Algorithm
- Using Pig with User-Defined Functions
- Basic Hive
- Using Hive for a Join
- HBase Basics
- HCatalog Basics
- MapReduce, Pig, and Hive in a Combined Solution
- Solving a Problem from Conception to Completion
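One of the labs above exercises a secondary sort with a custom comparator. The core idea can be sketched in standalone Java: sort on a composite key (natural key first, secondary field second) so that each key's values arrive at the reducer already ordered. In a real Hadoop job a custom sort comparator does this during the shuffle, with a grouping comparator grouping on the natural key alone; the record type and field names below are illustrative, not from the course materials.

```java
import java.util.*;

public class SecondarySortSketch {

    // A record with a natural key (station) and a secondary field (hour).
    static class Reading {
        final String station;
        final int hour;
        final double temp;
        Reading(String station, int hour, double temp) {
            this.station = station; this.hour = hour; this.temp = temp;
        }
        @Override public String toString() {
            return station + "@" + hour + "h=" + temp;
        }
    }

    // Composite-key sort: compare the natural key first, then the
    // secondary field -- the job a custom comparator performs during
    // Hadoop's shuffle/sort phase.
    static List<Reading> secondarySort(List<Reading> input) {
        List<Reading> sorted = new ArrayList<>(input);
        sorted.sort(Comparator.comparing((Reading r) -> r.station)
                .thenComparingInt(r -> r.hour));
        return sorted;
    }

    public static void main(String[] args) {
        List<Reading> out = secondarySort(List.of(
                new Reading("B", 9, 14.1),
                new Reading("A", 17, 21.5),
                new Reading("A", 3, 12.0)));
        out.forEach(System.out::println);
        // Station A's readings come first, ordered by hour (3, then 17).
    }
}
```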
This course is designed for Java programmers who are interested in learning how to build Hadoop solutions.
Please contact us at HWUniversity if you have any questions or comments.
Hortonworks, Inc. & Guident
Hortonworks is focused on accelerating the development and adoption of Apache Hadoop. Together with the Apache community, we are making Hadoop more robust and easier to use for enterprises, and more open and extensible for solution providers. We also provide expert support and training.