
National Institute of Electronics & Information Technology

Gorakhpur – Extension Centre Lucknow


(Under Ministry of Electronics and Information Technology, Govt. of India)
MMMUT Campus, Deoria Road, Gorakhpur-273010
http://www.nielit.gov.in/gorakhpur/

Certificate Course in Big Data Analytics Using Hadoop

Duration: 4 Weeks, Online Course (2 Hrs. per day)
Timing: 03:00 PM to 05:00 PM


Objective: The Big Data Hadoop training course is designed to give a well-rounded understanding of the Big Data framework using Hadoop, HBase, Hive, Sqoop and Pig. After completing the course, students may work as Big Data Analysts, Big Data Consultants, Managers, etc.

Eligibility: B.E./B.Tech., M.Sc. (IT/Computer Science), MCA, BCA, PGDCA, DOEACC 'A'/'B' level, or a three-year Diploma in Computer Science.

Prerequisites:
 Candidate must have a recent computer/laptop, preferably with 16 GB RAM and a 64-bit operating system.
 Software: Notepad, JDK, Eclipse (64-bit), Cloudera Hadoop (can be downloaded from the respective websites).
 Internet connection with good speed (preferably 2 Mbps or higher).

Course Fees: Rs. 1800/- incl. GST & all other charges.

Certificate: A certificate will be provided to participants based on a minimum of 75% attendance and performance (minimum 50% marks) in the online test conducted at the end of the course.

Methodology:
 Instructor-led live classes.
 Instructor-led hands-on lab sessions.
 Content access through the e-Learning portal.
 Assessment and certification.

How to Apply?
Step-1: Read the course structure & course requirements carefully.
Step-2: Visit the registration portal and click on the Apply button.
Step-3: Create your login credentials, fill in all the details, check the preview and submit the form.
Step-4: Log in with your credentials to verify your mobile number and email ID, then upload the documents, lock the profile and pay the fees online using an ATM-Debit Card / Credit Card / Internet Banking / UPI, etc.

Course Content

Day #01: Big Data Overview, What is Big Data?, Benefits of Big Data, Big Data Challenges, What is Hadoop?, Hadoop Big Data Solution, prerequisites
Day #02: What is Java, What is JVM, What is JRE & JDK, Java Keywords & Operators, Data Types & Variables
Day #03: Conditional Statements in Java, Java Strings, Arrays, Java Loops, OOPS Concepts, What is a Class
Day #04: Methods in Java, new Keyword in Java, Method Overloading, Constructor, static Keyword, this Keyword, Inheritance
Day #05: Polymorphism in Java, Abstraction in Java, Interface in Java, Overriding, Encapsulation
Day #06: Packages in Java, Access Modifiers in Java, Abstract Method, super Keyword, Interfaces, Exceptions
Day #07: Hadoop Framework, Modules of Hadoop, MapReduce, MapReduce Job, Mapper, Reducer
Day #08: MapReduce Data Flow, HDFS
Day #09: Hadoop Installation through Virtual Machine and Cloudera, Installation of Bitvise SSH Client
Day #10: Hadoop Commands, Namenode, Datanode, YARN
Day #11: What is HBase, Limitations of Hadoop, Difference between HBase and HDFS, Storage Mechanism in HBase
Day #12: HBase Architecture, HBase Shell, HBase Commands, Create Table, HBase Describe and Alter
Day #13: HBase Drop Table, Shutting Down, Create Data, Update Data, Read Data, Delete Data, Scan, Count and Truncate
Day #14: What is Hive?, Characteristics of Hive, Hive vs RDBMS, Hive Architecture, Hive Clients, Services, Storage and Computing, Job Execution Flow
Day #15: Data Types in Hive, Create Database, Drop Database, Create Table, Alter Table, Table Types (Internal and External)
Day #16: What is Sqoop?, Creating a MySQL Table, Adding Information to MySQL, Import Table from MySQL to Hadoop using Sqoop
Day #17: Sqoop import command with target directory, Sqoop import command with where clause, Sqoop import all tables, list databases, Sqoop Export
Day #18: What is Apache Pig, Need for Apache Pig, Pig vs MapReduce, Pig vs Hive, Pig Architecture, Pig Components
Day #19: Pig Latin Data Model, Pig Execution Mechanisms, Grunt Shell, Pig Latin Statements, Data Types, Reading Data
Day #20: Loading into Apache Pig using the LOAD Operator, Diagnostic Operators, Group Operators, Join Operator, Cross Operator, Union Operator, Split Operator, Filter Operator, Text Loader
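The MapReduce topics of Days #07 and #08 revolve around the mapper/reducer data flow. As a minimal sketch of that idea in plain Java — no Hadoop dependency; the class and method names here are illustrative, not Hadoop's actual Mapper/Reducer API — a word count can be modeled as a map phase emitting (word, 1) pairs, followed by a shuffle-and-reduce phase that sums the counts per key:

```java
import java.util.*;

// Minimal word-count sketch of the MapReduce data flow (Days #07-#08).
// Plain Java for illustration only; a real Hadoop job extends the
// Mapper and Reducer classes from org.apache.hadoop.mapreduce.
public class WordCountSketch {

    // Map phase: split one input line into (word, 1) pairs.
    static List<Map.Entry<String, Integer>> map(String line) {
        List<Map.Entry<String, Integer>> pairs = new ArrayList<>();
        for (String word : line.toLowerCase().split("\\s+")) {
            if (!word.isEmpty()) {
                pairs.add(new AbstractMap.SimpleEntry<>(word, 1));
            }
        }
        return pairs;
    }

    // Shuffle + reduce phase: group pairs by key and sum the counts.
    static Map<String, Integer> reduce(List<Map.Entry<String, Integer>> pairs) {
        Map<String, Integer> counts = new TreeMap<>();
        for (Map.Entry<String, Integer> pair : pairs) {
            counts.merge(pair.getKey(), pair.getValue(), Integer::sum);
        }
        return counts;
    }

    public static void main(String[] args) {
        String[] input = { "big data big hadoop", "hadoop big" };
        List<Map.Entry<String, Integer>> intermediate = new ArrayList<>();
        for (String line : input) {
            intermediate.addAll(map(line));   // map each input line
        }
        Map<String, Integer> result = reduce(intermediate);
        System.out.println(result);           // prints {big=3, data=1, hadoop=2}
    }
}
```

In Hadoop proper, the framework handles the shuffle between the two phases and runs mappers and reducers in parallel across HDFS blocks; this sketch only shows the logical flow the course covers.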

Course Coordinators

Sh. Vishal Maurya, Scientist 'C', NIELIT Lucknow
Contact: 7706009305
Email: vishal@nielit.gov.in

Sh. Akhilesh Shukla, STO, NIELIT Gorakhpur
Contact: 8840086673
Email: akhilesh.shukla@nielit.gov.in

CLICK HERE TO REGISTER
