The Introduction to Administering Hadoop Clusters training course demonstrates the key aspects of installing and maintaining Hadoop clusters in a variety of configurations.
The course begins by examining how to operate the Hadoop Distributed File System (HDFS) and the MapReduce I/O framework as complementary technologies. Next, it explores how to configure and monitor processes to manage storage and job tasks. The course concludes with an analysis of supplementing clusters with enhanced storage features and client tools.
Purpose
| Learn how to set up, configure, and administer Hadoop. |
Audience
| System administrators, developers, and DevOps engineers creating Big Data solutions using Hadoop. |
Role
| Software Developer - System Administrator |
Skill Level
| Intermediate |
Style
| Hack-a-thon - Learning Spikes - Workshops |
Duration
| 4 Days |
Related Technologies
| Java | Big Data Training | Hadoop | Apache |
Productivity Objectives
- Describe the HDFS file system and the MapReduce I/O framework
- Configure and monitor storage management processes and tasks
- Add network topology awareness to a cluster
- Configure a highly available storage system
- Supplement clusters with enhanced storage features and client tools
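As a taste of the high-availability objective above, here is a minimal sketch of the `hdfs-site.xml` properties involved in NameNode HA. The nameservice name `mycluster`, the hostnames, and the JournalNode addresses are placeholders, not values from the course:

```xml
<!-- Hypothetical hdfs-site.xml fragment: NameNode HA with a quorum of JournalNodes -->
<configuration>
  <!-- Logical name for this HA-enabled nameservice (placeholder) -->
  <property>
    <name>dfs.nameservices</name>
    <value>mycluster</value>
  </property>
  <!-- The two NameNode IDs participating in HA -->
  <property>
    <name>dfs.ha.namenodes.mycluster</name>
    <value>nn1,nn2</value>
  </property>
  <!-- RPC address of each NameNode (hostnames are placeholders) -->
  <property>
    <name>dfs.namenode.rpc-address.mycluster.nn1</name>
    <value>namenode1.example.com:8020</value>
  </property>
  <property>
    <name>dfs.namenode.rpc-address.mycluster.nn2</name>
    <value>namenode2.example.com:8020</value>
  </property>
  <!-- Shared edits directory backed by a JournalNode quorum -->
  <property>
    <name>dfs.namenode.shared.edits.dir</name>
    <value>qjournal://jn1.example.com:8485;jn2.example.com:8485;jn3.example.com:8485/mycluster</value>
  </property>
  <!-- Client-side failover between the active and standby NameNodes -->
  <property>
    <name>dfs.client.failover.proxy.provider.mycluster</name>
    <value>org.apache.hadoop.hdfs.server.namenode.ha.ConfiguredFailoverProxyProvider</value>
  </property>
</configuration>
```

A real deployment also needs fencing and automatic-failover settings; this fragment only illustrates the shape of the configuration the course covers.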