BIG DATA AND HADOOP



What is Big Data?
As the name suggests, Big Data refers to data sets so large and complex that they cannot be processed or integrated using traditional data-processing applications. The size of such data is enormous to the extent that it cannot be captured, curated, stored or analyzed by commonly used software. Big Data therefore requires technologies with new forms of integration, which can store, analyze and segregate data that is complex and on a massive scale.

Why is Big Data Important?
The importance of Big Data lies in how efficiently it is utilized. Such data can be extracted from virtually any source and analyzed to find answers that enable time and cost reductions. Accurate analysis of such data also leads businesses to new product development and smarter decision making. Big Data combined with powerful analytics is therefore essential for calculating risk portfolios and finding solutions to them.

Why Hadoop?
Hadoop is open-source software from the Apache Software Foundation that is highly proficient at handling and segregating enormous amounts of data. It is one of the fastest growing technologies for data architecture, and it distributes and processes huge volumes of data efficiently. At present, Hadoop is the most extensively used software for processing Big Data.

About the Course:
The certificate course in Big Data and Hadoop is primarily designed to give students a comprehensive knowledge of both the basic and the advanced concepts of the Hadoop ecosystem. The functional benefits of MapReduce, HBase, Zookeeper and Sqoop are highlighted and practically integrated in this course. By the end of the course, candidates will have gained in-depth knowledge of all the core concepts and techniques associated with Big Data and Hadoop.

Program Details:

Program Benefits:

  • Proficiency in advanced Big Data scenarios
  • Knowledge of installing Hadoop on the Cloud
  • Understanding of the MapReduce architecture and algorithm (a minimal code sketch follows this list)
  • Proficiency in data analysis using Pig and Hive
  • Understanding of the concepts of HBase, Zookeeper and Sqoop
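
To make the MapReduce architecture mentioned above concrete, the sketch below is the classic word-count job written against the standard Hadoop Java API (org.apache.hadoop.mapreduce): the mapper emits (word, 1) pairs and the reducer sums the counts for each word. The class names (WordCount, TokenizerMapper, IntSumReducer) and the use of command-line arguments for the input and output paths are illustrative assumptions, not material taken from the course itself.

```java
import java.io.IOException;
import java.util.StringTokenizer;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class WordCount {

  // Map phase: split each input line into tokens and emit (word, 1)
  public static class TokenizerMapper extends Mapper<Object, Text, Text, IntWritable> {
    private static final IntWritable ONE = new IntWritable(1);
    private final Text word = new Text();

    @Override
    public void map(Object key, Text value, Context context)
        throws IOException, InterruptedException {
      StringTokenizer itr = new StringTokenizer(value.toString());
      while (itr.hasMoreTokens()) {
        word.set(itr.nextToken());
        context.write(word, ONE);
      }
    }
  }

  // Reduce phase: sum all counts received for the same word
  public static class IntSumReducer extends Reducer<Text, IntWritable, Text, IntWritable> {
    private final IntWritable result = new IntWritable();

    @Override
    public void reduce(Text key, Iterable<IntWritable> values, Context context)
        throws IOException, InterruptedException {
      int sum = 0;
      for (IntWritable val : values) {
        sum += val.get();
      }
      result.set(sum);
      context.write(key, result);
    }
  }

  public static void main(String[] args) throws Exception {
    Configuration conf = new Configuration();
    Job job = Job.getInstance(conf, "word count");
    job.setJarByClass(WordCount.class);
    job.setMapperClass(TokenizerMapper.class);
    job.setCombinerClass(IntSumReducer.class);   // combine locally to cut shuffle traffic
    job.setReducerClass(IntSumReducer.class);
    job.setOutputKeyClass(Text.class);
    job.setOutputValueClass(IntWritable.class);
    FileInputFormat.addInputPath(job, new Path(args[0]));     // input directory
    FileOutputFormat.setOutputPath(job, new Path(args[1]));   // output directory (must not already exist)
    System.exit(job.waitForCompletion(true) ? 0 : 1);
  }
}
```

Packaged into a jar (say wordcount.jar, a hypothetical name), the job can be submitted to a cluster with: hadoop jar wordcount.jar WordCount <input dir> <output dir>.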


Who Should Go for this Course?
  • Analytics professionals
  • IT Project Managers
  • Testing professionals
  • Software developers
  • Software architects

Learning Outcomes:
The program focuses on equipping students with a variety of skills and techniques involved in the efficient analysis, segregation and processing of Big Data using the Hadoop software. These include:

  • Data loading techniques using Sqoop and Flume
  • Writing complex MapReduce programs (the word-count sketch above shows the basic structure)
  • Performing data analysis using tools such as Pig and Hive
  • Concepts of the Hadoop Distributed File System (HDFS) and the MapReduce framework (a minimal HDFS sketch follows this list)
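
As a minimal taste of the HDFS concepts listed above, the sketch below writes a small file into HDFS and reads it back through Hadoop's FileSystem Java API. The path /tmp/hdfs-demo.txt and the class name HdfsRoundTrip are hypothetical, and the program assumes a reachable HDFS cluster configured through the usual core-site.xml/hdfs-site.xml files on the classpath.

```java
import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.nio.charset.StandardCharsets;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FSDataOutputStream;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class HdfsRoundTrip {
  public static void main(String[] args) throws Exception {
    Configuration conf = new Configuration();   // picks up core-site.xml / hdfs-site.xml
    FileSystem fs = FileSystem.get(conf);       // connect to the default file system (HDFS)

    Path file = new Path("/tmp/hdfs-demo.txt"); // hypothetical demo path

    // Write a small file into HDFS (overwrite if it already exists)
    try (FSDataOutputStream out = fs.create(file, true)) {
      out.write("hello hdfs\n".getBytes(StandardCharsets.UTF_8));
    }

    // Read the file back line by line and print it
    try (BufferedReader in = new BufferedReader(
            new InputStreamReader(fs.open(file), StandardCharsets.UTF_8))) {
      String line;
      while ((line = in.readLine()) != null) {
        System.out.println(line);
      }
    }

    fs.delete(file, false);                     // clean up the demo file
  }
}
```
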
Assessment Mechanism:
  • Internal assessment: Participants complete a project, which carries 25 marks.
  • External assessment: Participants take an objective-type MCQ exam of 50 questions carrying 100 marks, which is then scaled down to 75 marks.
  • Participants can attempt the exam after completing the 6-month program duration.

Fees:

 #   Program               OTP (Rs.)   Initial (Rs.)   PDC (Rs.)   Tenure (months)   Total Lecture Duration (hrs)
 1   Big Data and Hadoop   24,800      15,800          10,000      6                 50

 #   Program (Combined)                       OTP (Rs.)   Initial (Rs.)   PDC 1 (Rs.)   PDC 2 (Rs.)
 1   Big Data and Hadoop with R Programming   34,600      15,600          10,000        10,000