
MapR (HP) Hadoop Developer Certification Questions and Answers (Dumps and Practice Questions)



Question : What is data locality?
1. The data is brought to the local node before it is processed.
2. Hadoop will start the Map task on the node where the data block is stored in HDFS
3. Access Mostly Uused Products by 50000+ Subscribers
4. Neither 1 nor 2 is correct


Correct Answer : Get Latest Questions and Answers :


Explanation:

MapReduce Data Locality: Whenever possible, Hadoop attempts to ensure that a MapTask runs on the node where the block of data it processes is stored locally in HDFS.

There is no concept of data locality for the Reducers. All mappers in general have to communicate with all reducers.

Refer HadoopExam.com Recorded Training Module : 3
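
The scheduler can make this placement decision because the NameNode tracks which hosts hold a replica of each block. A minimal sketch (not from the source; it assumes only the standard org.apache.hadoop.fs API) that prints the same block-to-host mapping the framework consults when placing MapTasks:

// Print, for each block of an HDFS file, the hosts that store a replica.
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.BlockLocation;
import org.apache.hadoop.fs.FileStatus;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class BlockLocality {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        FileSystem fs = FileSystem.get(conf);
        Path input = new Path(args[0]);                  // an HDFS file path
        FileStatus status = fs.getFileStatus(input);
        // One entry per block: offset within the file plus the replica hosts.
        BlockLocation[] blocks =
                fs.getFileBlockLocations(status, 0, status.getLen());
        for (BlockLocation block : blocks) {
            System.out.println("offset " + block.getOffset()
                    + " hosts " + String.join(",", block.getHosts()));
        }
        fs.close();
    }
}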





Question : All the mappers have to communicate with all the reducers.
1. True
2. False



Correct Answer : Get Latest Questions and Answers :


Explanation: There is no concept of data locality for the Reducers. All mappers in general have to communicate with all reducers.

Refer HadoopExam.com Recorded Training Module : 3
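
To see why, consider how a mapper's output records are assigned to reducers. A minimal sketch (not from the source) mirroring the default hash-partitioning logic: every mapper evaluates this for each output key, so any mapper's records can end up in any reducer's partition during the shuffle.

// Route each output key to one of the reduce partitions by hashing the key.
import org.apache.hadoop.mapreduce.Partitioner;

public class ModuloPartitioner<K, V> extends Partitioner<K, V> {
    @Override
    public int getPartition(K key, V value, int numReduceTasks) {
        // Mask the sign bit so the result is a non-negative partition index.
        return (key.hashCode() & Integer.MAX_VALUE) % numReduceTasks;
    }
}

Hadoop's default HashPartitioner behaves the same way; a custom class like this would be registered with job.setPartitionerClass(ModuloPartitioner.class).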





Question : If the Mapper and Reducer run on the same machine, then the output of the Mapper will not be transferred over the network to the Reducer
1. True
2. False


Correct Answer : Get Latest Questions and Answers :


Explanation: If the Mapper and Reducer run on the same node, then there is no need to transfer the data over the network, which reduces network overhead considerably.

Refer HadoopExam.com Recorded Training Module : 3 and 4



Related Questions


Question : Select the correct statements regarding logging in a Hadoop program
1. You can configure log levels in the commons-logging.properties file, which needs to be placed on the CLASSPATH

2. Supported log levels are trace, debug, info, warn, error and fatal. You can also add System.out.println calls in the Mapper and Reducer code

3. Access Mostly Uused Products by 50000+ Subscribers

4. 1,2

5. 1,2,3
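
Options 1 and 2 above describe standard practice. A minimal sketch (assuming the new mapreduce API and commons-logging on the classpath) showing both styles inside a Mapper; the Log output goes to the task's syslog log and System.out.println to the task's stdout log:

// A Mapper that logs via commons-logging and via plain standard output.
import java.io.IOException;
import org.apache.commons.logging.Log;
import org.apache.commons.logging.LogFactory;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Mapper;

public class LoggingMapper
        extends Mapper<LongWritable, Text, Text, IntWritable> {

    private static final Log LOG = LogFactory.getLog(LoggingMapper.class);

    @Override
    protected void map(LongWritable key, Text value, Context context)
            throws IOException, InterruptedException {
        LOG.debug("processing record at offset " + key);   // task syslog
        System.out.println("saw line: " + value);           // task stdout
        context.write(new Text(value.toString()), new IntWritable(1));
    }
}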



Question : What is the purpose of a Driver class?
1. It creates the output directory for the entire Job

2. It is responsible for reading data from the input file and for shuffling and sorting the intermediate data.

3. Access Mostly Uused Products by 50000+ Subscribers

4. It kills the job as soon as the last reducer finishes
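
For context, a typical Driver only configures and submits the Job; it does not read input records or perform the shuffle and sort itself. A minimal sketch (not from the source; it uses the identity Mapper and Reducer so it compiles on its own, whereas a real job would set its own classes):

// Configure and submit a MapReduce job from the Driver's main() method.
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class PassThroughDriver {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        Job job = Job.getInstance(conf, "pass-through job");
        job.setJarByClass(PassThroughDriver.class);
        job.setMapperClass(Mapper.class);      // identity Mapper; real jobs set their own
        job.setReducerClass(Reducer.class);    // identity Reducer; real jobs set their own
        job.setOutputKeyClass(LongWritable.class);
        job.setOutputValueClass(Text.class);
        FileInputFormat.addInputPath(job, new Path(args[0]));
        FileOutputFormat.setOutputPath(job, new Path(args[1]));  // must not already exist
        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}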


Question : In a MapReduce job, the map() method of the Mapper is called for each key-value pair. The output key-value pairs are then collected and sent to which of the following?
1. RecordReader

2. OutputFormat

3. Access Mostly Uused Products by 50000+ Subscribers

4. Partitioner



Question : Which of these methods are available if you use the Mapper class from the mapreduce package?
1. map(), setup(), cleanup(), run()

2. map(), reduce(), setup(), cleanup(), run()

3. Access Mostly Uused Products by 50000+ Subscribers

4. map(), reduce(), setup(), run()

5. map(), reduce(), setup()
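
For reference, the new-API Mapper (org.apache.hadoop.mapreduce.Mapper) exposes setup(), map(), cleanup() and run() as overridable hooks. A minimal sketch (not from the source) overriding the first three; run(), which drives them, is sketched after the last question below:

// A Mapper overriding the per-task setup/cleanup hooks and the per-record map().
import java.io.IOException;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Mapper;

public class LifecycleMapper
        extends Mapper<LongWritable, Text, Text, IntWritable> {

    private final IntWritable one = new IntWritable(1);

    @Override
    protected void setup(Context context)
            throws IOException, InterruptedException {
        // Called once per map task, before the first map() call.
    }

    @Override
    protected void map(LongWritable key, Text value, Context context)
            throws IOException, InterruptedException {
        // Called once for every key-value pair produced by the RecordReader.
        context.write(new Text(value.toString()), one);
    }

    @Override
    protected void cleanup(Context context)
            throws IOException, InterruptedException {
        // Called once per map task, after the last map() call.
    }
}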



Question : Which of the following classes is responsible for creating key-value pairs from an InputSplit and submitting them to the Mapper for further processing?
1. RecordReader

2. TaskTracker

3. Access Mostly Uused Products by 50000+ Subscribers

4. JobTracker


Question : Which methods can be called by the run() method of a Mapper/Reducer class?
1. setup(), map(), cleanup() or setup(), reduce(), cleanup()
2. setup(), map(), cleanup() or setup(), reduce()
3. Access Mostly Uused Products by 50000+ Subscribers
4. setup(), map(), reduce(), cleanup()
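
For reference, a minimal sketch (not from the source) mirroring the default run() of the new-API Mapper: setup() once, then map() for every record the context pulls from the RecordReader, then cleanup() once. The Reducer's run() is analogous, calling reduce() once per key.

// Explicitly spelling out the record loop that run() drives by default.
import java.io.IOException;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Mapper;

public class RunLoopMapper
        extends Mapper<LongWritable, Text, Text, IntWritable> {

    @Override
    public void run(Context context) throws IOException, InterruptedException {
        setup(context);
        while (context.nextKeyValue()) {
            map(context.getCurrentKey(), context.getCurrentValue(), context);
        }
        cleanup(context);
    }
}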