
Cloudera Hadoop Developer Certification Questions and Answers (Dumps and Practice Questions)



Question :

Which class is used for preprocessing and postprocessing of a MapReduce job?
 :
1. ChainMapper
2. ChainReducer
3. [option text unavailable]
4. 1 and 2 Both

Correct Answer : 4

The ChainMapper class allows multiple Mapper classes to be used within a single map task. The Mapper classes are invoked in a chained (or piped) fashion: the output of the first becomes the input of the second, and so on until the last Mapper, whose output is written to the task's output.

The ChainReducer class allows multiple Mapper classes to be chained after a Reducer within the reduce task. For each record output by the Reducer, the Mapper classes are invoked in a chained (or piped) fashion: the output of the first becomes the input of the second, and so on, and the output of the last Mapper is written to the task's output.

Note: Running all the pre- and post-processing in a single job leaves no intermediate files and dramatically reduces I/O.
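The chaining above can be sketched as a job driver using the newer `org.apache.hadoop.mapreduce.lib.chain` API. This is a configuration sketch only; the mapper and reducer class names (PreProcessMapper, TokenizeMapper, SumReducer, PostProcessMapper) are hypothetical placeholders, not part of Hadoop.

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.lib.chain.ChainMapper;
import org.apache.hadoop.mapreduce.lib.chain.ChainReducer;

public class ChainJobDriver {
    public static void main(String[] args) throws Exception {
        Job job = Job.getInstance(new Configuration(), "chain example");

        // Map task: PreProcessMapper pipes into TokenizeMapper in memory,
        // so no intermediate files are written between them.
        ChainMapper.addMapper(job, PreProcessMapper.class,
                LongWritable.class, Text.class, Text.class, Text.class,
                new Configuration(false));
        ChainMapper.addMapper(job, TokenizeMapper.class,
                Text.class, Text.class, Text.class, IntWritable.class,
                new Configuration(false));

        // Reduce task: SumReducer runs first, then PostProcessMapper
        // post-processes each record the reducer emits.
        ChainReducer.setReducer(job, SumReducer.class,
                Text.class, IntWritable.class, Text.class, IntWritable.class,
                new Configuration(false));
        ChainReducer.addMapper(job, PostProcessMapper.class,
                Text.class, IntWritable.class, Text.class, IntWritable.class,
                new Configuration(false));

        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}
```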




Question :

Is data joining (like an RDBMS join) possible in Hadoop MapReduce?
 :
1. Yes
2. NO

Correct Answer : 1

There is a contrib package called datajoin that works as a generic framework for data joining in the Hadoop framework.
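The core idea behind such a reduce-side join can be illustrated without any Hadoop classes. In this plain-Java sketch (an illustration of the technique, not the datajoin package's actual API), records from two sources that share a join key have already been grouped together, as the shuffle phase would do, and the reducer-side logic emits their cross product:

```java
import java.util.ArrayList;
import java.util.List;

// Plain-Java illustration of reduce-side join logic: for one join key,
// the shuffle has grouped all "customer" records and all "order" records;
// the reducer emits every pairing as a joined record.
public class ReduceSideJoinSketch {
    public static List<String> joinPerKey(List<String> customers, List<String> orders) {
        List<String> out = new ArrayList<>();
        for (String c : customers) {
            for (String o : orders) {
                out.add(c + "," + o); // one joined record per pairing
            }
        }
        return out;
    }

    public static void main(String[] args) {
        // All values sharing a single join key, already grouped by the shuffle.
        List<String> joined = joinPerKey(
                List.of("Alice"), List.of("order#1", "order#2"));
        System.out.println(joined);
    }
}
```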




Question :

Which method of the FileSystem object is used for reading a file in HDFS?
 :
1. open()
2. access()
3. [option text unavailable]
4. None of the above

Correct Answer : 1

The open() method opens an FSDataInputStream at the indicated Path.
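A minimal sketch of reading an HDFS file this way is shown below. It assumes a reachable HDFS configured in the default Configuration, and the path "/user/data/input.txt" is a hypothetical placeholder:

```java
import java.io.BufferedReader;
import java.io.InputStreamReader;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FSDataInputStream;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class HdfsRead {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        FileSystem fs = FileSystem.get(conf);

        Path file = new Path("/user/data/input.txt"); // hypothetical path
        // open() returns an FSDataInputStream positioned at the start of the file
        try (FSDataInputStream in = fs.open(file);
             BufferedReader reader = new BufferedReader(new InputStreamReader(in))) {
            String line;
            while ((line = reader.readLine()) != null) {
                System.out.println(line);
            }
        }
    }
}
```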




Question :

What is HIVE?

 :
1. Hive is a part of the Apache Hadoop project that provides a SQL-like interface for data processing
2. Hive is one component of the Hadoop framework that allows for collecting data together into an external repository
3. [option text unavailable]
4. HIVE is part of the Apache Hadoop project that enables in-memory analysis of real-time streams of data




Question :

What is PIG?
 :
1. Pig is a subset of the Hadoop API for data processing
2. Pig is a part of the Apache Hadoop project that provides a C-like scripting language interface for data processing
3. [option text unavailable]
4. None of Above



Question :

How can you disable the reduce step?

 :
1. The Hadoop administrator has to set the number of the reducer slot to zero on all slave nodes. This will disable the reduce step.
2. It is impossible to disable the reduce step since it is a critical part of the Map-Reduce abstraction.
3. [option text unavailable]
4. While you cannot completely disable reducers, you can set the output to one.
There needs to be at least one reduce step in the Map-Reduce abstraction.
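In practice, the reduce step is disabled with Job.setNumReduceTasks(0), which makes the job map-only: the shuffle/sort phase is skipped and mapper output is written directly to HDFS. A minimal job-configuration sketch:

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.mapreduce.Job;

public class MapOnlyDriver {
    public static void main(String[] args) throws Exception {
        Job job = Job.getInstance(new Configuration(), "map-only job");

        // Zero reduce tasks: no shuffle/sort, mapper output
        // goes straight to the job's output path.
        job.setNumReduceTasks(0);
    }
}
```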




Question :

Why would one create a map-reduce job without the reduce step?
 :
1. Developers should design Map-Reduce jobs without reducers only if no reduce slots are available on the cluster
2. Developers should never design Map-Reduce jobs without reducers. An error will occur upon compile
3. [option text unavailable]
4. It is not possible to create a map-reduce job without at least one reduce step.
A developer may decide to limit to one reducer for debugging purposes


Question :

What is the default input format?

 :
1. The default input format is xml. Developer can specify other input formats as appropriate if xml is not the correct input
2. There is no default input format. The input format always should be specified.
3. [option text unavailable]
4. The default input format is TextInputFormat, with the byte offset as the key and the entire line as the value




Question :

How can you overwrite the default input format?
 :
1. In order to overwrite the default input format, the Hadoop administrator has to change the default settings in the config file
2. In order to overwrite the default input format, a developer has to set a new input format
on the job config before submitting the job to a cluster
3. [option text unavailable]
4. None of these answers are correct
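Setting the input format on the job configuration can be sketched as follows. KeyValueTextInputFormat is used here only as an example of a non-default format; any InputFormat class could be substituted:

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.lib.input.KeyValueTextInputFormat;

public class CustomInputFormatDriver {
    public static void main(String[] args) throws Exception {
        Job job = Job.getInstance(new Configuration(), "custom input format");

        // Replace the default TextInputFormat before submitting the job.
        job.setInputFormatClass(KeyValueTextInputFormat.class);
    }
}
```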