
Cloudera Hadoop Developer Certification Questions and Answers (Dumps and Practice Questions)



Question : Why would one create a map-reduce job without the reduce step?
1. Developers should design Map-Reduce jobs without reducers only if no reduce slots are available on the cluster
2. Developers should never design Map-Reduce jobs without reducers. An error will occur upon compile
3. Developers can create map-only jobs when the processing requires no sorting or aggregation
4. It is not possible to create a map-reduce job without at least one reduce step. A developer may decide to limit the job to one reducer for debugging purposes

Correct Answer : 3


Explanation: This is a map-only job. Map-only MapReduce jobs are very common; they are normally used to perform transformations on data that do not require sorting or aggregation.
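To make the idea concrete, here is a minimal sketch of a map-only job (the UpperCaseMapper class, job name and paths are illustrative, not from the exam material): setting the number of reduce tasks to zero tells Hadoop to skip the sort/shuffle and reduce phases, and each mapper's output is written straight to the output directory.

import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.NullWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class MapOnlyJob {

    // Transforms each input line to upper case; no aggregation or sorting
    // is needed, so no reducer is required.
    public static class UpperCaseMapper
            extends Mapper<LongWritable, Text, NullWritable, Text> {
        @Override
        protected void map(LongWritable key, Text value, Context context)
                throws IOException, InterruptedException {
            context.write(NullWritable.get(),
                          new Text(value.toString().toUpperCase()));
        }
    }

    public static void main(String[] args) throws Exception {
        Job job = Job.getInstance(new Configuration(), "map-only example");
        job.setJarByClass(MapOnlyJob.class);
        job.setMapperClass(UpperCaseMapper.class);
        job.setOutputKeyClass(NullWritable.class);
        job.setOutputValueClass(Text.class);

        // Zero reducers turns this into a map-only job: no shuffle, no sort.
        job.setNumReduceTasks(0);

        FileInputFormat.addInputPath(job, new Path(args[0]));
        FileOutputFormat.setOutputPath(job, new Path(args[1]));
        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}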




Question : What is the default input format?
1. The default input format is XML. Developers can specify other input formats as appropriate if XML is not the correct input
2. There is no default input format. The input format should always be specified.
4. The default input format is TextInputFormat, with the byte offset as the key and the entire line as the value



Correct Answer : 4


Explanation: Hadoop supports a large range of input formats. The default is TextInputFormat, which is the simplest way to access data as lines of text.
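As an illustration, the sketch below shows a mapper written against the default TextInputFormat; the class name LineLengthMapper and the emitted key/value choice are illustrative only. The point is the input signature: the key is the LongWritable byte offset of the line within the file, and the value is the Text content of the line.

import java.io.IOException;

import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Mapper;

public class LineLengthMapper
        extends Mapper<LongWritable, Text, Text, IntWritable> {

    private final Text outKey = new Text();
    private final IntWritable outValue = new IntWritable();

    @Override
    protected void map(LongWritable byteOffset, Text line, Context context)
            throws IOException, InterruptedException {
        // byteOffset is where this line starts in the file; line is the
        // full line of text supplied by the default TextInputFormat.
        outKey.set(line);
        outValue.set(line.toString().length());
        context.write(outKey, outValue);
    }
}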





Question : How can you override the default input format?
1. In order to override the default input format, the Hadoop administrator has to change the default settings in the config file
2. In order to override the default input format, a developer has to set the new input format on the job configuration before submitting the job to the cluster
4. None of these answers are correct


Correct Answer : 2

Explanation: A developer can always set a different input format on the job configuration (e.g. sequence files, binary files, compressed formats).
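For example, a sketch along the following lines (job name and paths are illustrative, and mapper/reducer setup is omitted) overrides the default TextInputFormat by calling setInputFormatClass on the job before it is submitted:

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.input.SequenceFileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class SequenceInputJob {
    public static void main(String[] args) throws Exception {
        Job job = Job.getInstance(new Configuration(), "sequence-file input");
        job.setJarByClass(SequenceInputJob.class);

        // Override the default input format (TextInputFormat) on the job
        // configuration before the job is submitted to the cluster.
        job.setInputFormatClass(SequenceFileInputFormat.class);

        FileInputFormat.addInputPath(job, new Path(args[0]));
        FileOutputFormat.setOutputPath(job, new Path(args[1]));

        // Mapper, reducer and output types would be configured here as usual.
        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}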



Related Questions


Question : Does a Combiner reduce the amount of data sent to the Reducer?
1. True
2. False


Question : A Combiner reduces network traffic but increases the amount of work that needs to be done by the Reducer?
1. True
2. False
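Both Combiner questions become clearer with a standard word-count sketch (class names here are illustrative): the Combiner runs the reduce logic on each mapper's local output, so far less intermediate data crosses the network to the reducers, and the combining work happens on the map side rather than adding work for the Reducer.

import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class WordCountWithCombiner {

    public static class TokenMapper
            extends Mapper<LongWritable, Text, Text, IntWritable> {
        private static final IntWritable ONE = new IntWritable(1);
        private final Text word = new Text();

        @Override
        protected void map(LongWritable offset, Text line, Context context)
                throws IOException, InterruptedException {
            for (String token : line.toString().split("\\s+")) {
                if (!token.isEmpty()) {
                    word.set(token);
                    context.write(word, ONE);
                }
            }
        }
    }

    // Summing is associative and commutative, so the same class can be
    // used as both the Combiner and the Reducer.
    public static class SumReducer
            extends Reducer<Text, IntWritable, Text, IntWritable> {
        @Override
        protected void reduce(Text key, Iterable<IntWritable> values, Context context)
                throws IOException, InterruptedException {
            int sum = 0;
            for (IntWritable value : values) {
                sum += value.get();
            }
            context.write(key, new IntWritable(sum));
        }
    }

    public static void main(String[] args) throws Exception {
        Job job = Job.getInstance(new Configuration(), "word count with combiner");
        job.setJarByClass(WordCountWithCombiner.class);
        job.setMapperClass(TokenMapper.class);
        job.setCombinerClass(SumReducer.class);   // pre-aggregate on the map side
        job.setReducerClass(SumReducer.class);
        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(IntWritable.class);
        FileInputFormat.addInputPath(job, new Path(args[0]));
        FileOutputFormat.setOutputPath(job, new Path(args[1]));
        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}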


Question : Which of the following is correct for the Pseudo-Distributed mode of Hadoop?
1. It is a single-machine cluster
2. All daemons run on the same machine
3. It is not required to run all the daemons in this mode
4. All of 1, 2 and 3 are correct
5. Only 1 and 2 are correct





Question : Which daemon is responsible for the housekeeping of the NameNode?
1. JobTracker
2. TaskTracker
3. NameNode
4. Secondary NameNode




Question : Which daemon is responsible for instantiating and monitoring individual Map and Reduce tasks?
1. JobTracker
2. TaskTracker
3. Secondary NameNode
4. DataNode



Question : Which daemon distributes individual tasks to machines?
1. TaskTracker
2. JobTracker
3. MasterTracker
4. NameNode