Hadoop, Spark and Scala Training

3 Days
90 Lessons

Course Overview

Hadoop is a framework that allows distributed processing of large data sets across clusters of computers using simple programming models.

Spark is a fast, general-purpose cluster-computing framework, and Scala is the programming language in which Spark is written.
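To give a feel for how the pieces fit together, here is a minimal word-count sketch in Scala using Spark's RDD API. The application name, the `local[*]` master setting, and the input path are illustrative placeholders rather than course material; on a real cluster the same code would typically read from HDFS and run under YARN.

```scala
import org.apache.spark.sql.SparkSession

object WordCountSketch {
  def main(args: Array[String]): Unit = {
    // Local session for illustration; a cluster deployment would set a YARN master instead.
    val spark = SparkSession.builder()
      .appName("WordCountSketch")
      .master("local[*]")
      .getOrCreate()
    val sc = spark.sparkContext

    // Hypothetical input path -- replace with a real HDFS or local file.
    val lines = sc.textFile("hdfs:///data/sample.txt")

    val counts = lines
      .flatMap(_.split("\\s+"))   // split each line into words
      .map(word => (word, 1))     // pair every word with a count of 1
      .reduceByKey(_ + _)         // sum the counts per word across partitions

    counts.take(10).foreach(println)
    spark.stop()
  }
}
```

The same counting job written as a classic Hadoop MapReduce program needs separate mapper and reducer classes in Java; Spark expresses it as a few chained transformations in Scala.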

  • During this training, participants will:
    • Learn how Hadoop compares with traditional data-processing models
    • Understand the HDFS architecture
    • Understand MapReduce
    • Learn about Impala and Hive
    • Understand RDD lineage (see the sketch after this list)
    • Understand Pig
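For the RDD lineage topic above, the following sketch (values and names are illustrative, not taken from the course) shows that Spark transformations are lazy and only record steps in a lineage graph, which `toDebugString` prints and which Spark can replay to recompute a lost partition.

```scala
import org.apache.spark.sql.SparkSession

object RddLineageSketch {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("RddLineageSketch")
      .master("local[*]")
      .getOrCreate()
    val sc = spark.sparkContext

    // Each transformation is lazy: it only records a step in the RDD's lineage.
    val numbers = sc.parallelize(1 to 1000)
    val evens   = numbers.filter(_ % 2 == 0)
    val squares = evens.map(n => n * n)

    // toDebugString prints the lineage graph Spark would replay to rebuild
    // a lost partition -- the basis of Spark's fault tolerance.
    println(squares.toDebugString)

    // Nothing executes until an action such as count() is called.
    println(s"count = ${squares.count()}")
    spark.stop()
  }
}
```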

Requirements

  • Familiarity with Java
  • Intermediate-level exposure to data analytics

Curriculum
