
MapR (HP) Hadoop Developer Certification Questions and Answers (Practice Questions)



Question : Which of the following properties helps to limit the number of reduce tasks across all jobs?

1. mapred.reduce.tasks

2. mapred.max.reduce.tasks

3. (option not shown)

4. mapred.min.reduce.tasks


Correct Answer : (not shown)
Explanation : In classic MapReduce, mapred.reduce.tasks sets the number of reduce tasks for a single job, while mapred.tasktracker.reduce.tasks.maximum caps the number of reduce tasks that can run simultaneously on each node, which is what limits reduce tasks cluster-wide across all jobs.




Question : What is the ideal range for the mapred.reduce.tasks parameter?

1. between 0.95 and 1.75

2. between 1 to 2

3. (option not shown)

4. You can choose any range, and it will be derived as a relative value.


Correct Answer : 1
Explanation : mapred.reduce.tasks sets the default number of reduce tasks per job. The Hadoop documentation recommends 0.95 or 1.75 multiplied by (number of nodes × mapred.tasktracker.reduce.tasks.maximum): with 0.95 all reduces can launch immediately as the maps finish, while 1.75 lets faster nodes run a second wave of reduces for better load balancing. It is typically set slightly below the cluster's full reduce capacity (for example 99%), so that if a node fails the reduces can still execute in a single wave. The property is ignored when mapred.job.tracker is "local".
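The 0.95 / 1.75 sizing rule above is plain arithmetic on the cluster's reduce capacity. A toy helper (not part of the Hadoop API; the node and slot counts below are invented for illustration):

```java
// Toy helper illustrating the 0.95 / 1.75 reducer-count rule of thumb.
// Not Hadoop API code; the cluster numbers are made up.
public class ReducerSizing {
    // nodes * maxReduceSlotsPerNode is the cluster's total reduce capacity.
    static int recommendedReducers(int nodes, int maxReduceSlotsPerNode, double factor) {
        return (int) Math.floor(nodes * maxReduceSlotsPerNode * factor);
    }

    public static void main(String[] args) {
        // 10 nodes with 2 reduce slots each: capacity = 20 slots.
        System.out.println(recommendedReducers(10, 2, 0.95)); // single wave: 19
        System.out.println(recommendedReducers(10, 2, 1.75)); // two waves:   35
    }
}
```

With 0.95 the job stays one reduce wave even if a node is lost; with 1.75 the fastest nodes pick up a second wave.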




Question : You have a MapReduce job that creates unique data sets and finally inserts each record into a JDBC database table. The reducer is responsible for writing the data to the database. Your cluster may be heavily loaded, so a few map and reduce tasks can fail midway and be re-launched on a different node. Which statement is correct for this scenario?

1. to avoid slowness, we should enable speculative execution

2. we should not enable speculative execution

3. (option not shown)

4. We should reduce the number of reducers


Correct Answer : 2
Explanation : Enabling speculative execution can launch a duplicate attempt of the same reduce task on another node, and both attempts would write their records to the JDBC table, producing duplicate rows. Since the output must contain unique records, speculative execution should not be enabled for jobs with non-idempotent side effects such as database writes.
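The hazard can be shown with a toy sink (plain Java collections standing in for the JDBC table; no Hadoop involved): an append-only sink receives duplicate rows when the same attempt runs twice, while a keyed, idempotent sink does not.

```java
import java.util.*;

// Toy model of speculative execution: the same reduce attempt runs twice.
// The collections below are stand-ins for a database table, not real JDBC.
public class SpeculativeDuplicates {
    // Non-idempotent sink, like a plain JDBC INSERT: every write adds a row.
    static List<String> appendOnlySink = new ArrayList<>();
    // Idempotent sink, like an upsert keyed on the record id.
    static Map<String, String> keyedSink = new HashMap<>();

    static void reduceAttempt(List<String> records) {
        for (String r : records) {
            appendOnlySink.add(r);   // a second attempt duplicates rows
            keyedSink.put(r, r);     // a second attempt just overwrites
        }
    }

    public static void main(String[] args) {
        List<String> records = Arrays.asList("rec-1", "rec-2");
        reduceAttempt(records);  // original attempt
        reduceAttempt(records);  // speculative duplicate attempt
        System.out.println(appendOnlySink.size()); // 4 rows: duplicated
        System.out.println(keyedSink.size());      // 2 rows: still unique
    }
}
```

Disabling speculative execution avoids the duplicate attempt entirely; making the write idempotent is the other common defense.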


Related Questions


Question : Which of the following classes is responsible for committing the output of the job?

1. OutputFormat

2. Job

3. (option not shown)

4. Context
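In Hadoop, the OutputFormat supplies an OutputCommitter, which promotes a task attempt's temporary output to the final location only when the attempt is committed. A simplified in-memory sketch of that protocol (not the real Hadoop API; names and data are invented):

```java
import java.util.*;

// Simplified sketch of the commit protocol behind OutputFormat/OutputCommitter:
// each attempt writes to a temporary location, and only the committed attempt's
// output is promoted to the final location. Maps stand in for the file system.
public class MiniCommitter {
    static Map<String, String> tempOutputs = new HashMap<>();
    static Map<String, String> finalOutputs = new HashMap<>();

    static void writeAttempt(String attemptId, String data) {
        // Like writing to _temporary/attempt_.../part-r-00000
        tempOutputs.put(attemptId, data);
    }

    static void commitTask(String attemptId, String finalName) {
        // Promote exactly one attempt's output; losing attempts stay discarded.
        finalOutputs.put(finalName, tempOutputs.remove(attemptId));
    }

    public static void main(String[] args) {
        writeAttempt("attempt_0", "hello\t1");
        writeAttempt("attempt_1", "hello\t1");   // speculative duplicate
        commitTask("attempt_0", "part-r-00000"); // only one attempt wins
        System.out.println(finalOutputs.size()); // 1
    }
}
```

This two-phase write-then-commit is also why speculative attempts to the file system are safe, unlike the direct JDBC writes in the earlier question.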


Question : You are running a word count MapReduce job, but the job does not complete successfully and fails partway through the reduce phase. Which statement is correct in this case?

1. It will generate 90% output only

2. It will only generate _logs directory as output

3. (option not shown)

4. 1,2

5. 2,3



Question : Select the correct statements

1. RecordWriter writes the key-value pairs to the output files

2. The TextOutputFormat.LineRecordWriter implementation requires a java.io.DataOutputStream
object to write the key-value pairs to the HDFS/MapR-FS file system

3. (option not shown)

4. 1,2
5. 1,2,3


Question : The default separator between the key and the value is a tab character.
1. True
2. False
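The two questions above fit together: a minimal sketch in the spirit of TextOutputFormat.LineRecordWriter, writing the key and value through a java.io.DataOutputStream separated by the default tab. It writes to an in-memory buffer here instead of HDFS/MapR-FS, so it is an illustration, not the real class:

```java
import java.io.*;

// Minimal sketch in the spirit of TextOutputFormat.LineRecordWriter:
// key and value go through a java.io.DataOutputStream, separated by a tab
// (the default separator) and terminated with a newline. An in-memory
// buffer stands in for the HDFS/MapR-FS output stream.
public class MiniLineRecordWriter {
    static String write(String key, String value) throws IOException {
        ByteArrayOutputStream buffer = new ByteArrayOutputStream();
        DataOutputStream out = new DataOutputStream(buffer);
        out.writeBytes(key);
        out.writeBytes("\t");   // default key-value separator
        out.writeBytes(value);
        out.writeBytes("\n");
        out.close();
        return buffer.toString("UTF-8");
    }

    public static void main(String[] args) throws IOException {
        System.out.print(write("hello", "3")); // hello<TAB>3
    }
}
```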


Question : Select the correct statement regarding the Reducer

1. Each reducer takes one partition, generated and assigned by the Hadoop framework, as its input, and processes one iterable list of key-value pairs at a time.

2. The Reducer generates its output as a partitioned file in the format part-r-0000x

3. (option not shown)

4. 1,2
5. 1,2,3
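A toy reduce phase makes the statements concrete: the framework groups map output by key, each reducer consumes one iterable list of values per key, and each reducer writes one partition named part-r-0000x. Here the grouping and the file name are simulated in plain Java (no Hadoop classes):

```java
import java.util.*;

// Toy reduce phase: group map output by key, then sum each key's values,
// as a word count reducer would. The part-r-00000 name is just a string
// here, standing in for the reducer's single output partition file.
public class MiniReduce {
    static Map<String, Integer> reduce(List<String[]> mapOutput) {
        // Shuffle/sort stand-in: group values by key.
        Map<String, List<Integer>> grouped = new TreeMap<>();
        for (String[] kv : mapOutput)
            grouped.computeIfAbsent(kv[0], k -> new ArrayList<>())
                   .add(Integer.parseInt(kv[1]));
        // Reduce: one iterable list of values per key.
        Map<String, Integer> result = new TreeMap<>();
        for (Map.Entry<String, List<Integer>> e : grouped.entrySet()) {
            int sum = 0;
            for (int v : e.getValue()) sum += v;
            result.put(e.getKey(), sum);
        }
        return result;
    }

    public static void main(String[] args) {
        List<String[]> mapOutput = Arrays.asList(
            new String[]{"hello", "1"}, new String[]{"world", "1"},
            new String[]{"hello", "1"});
        // One reducer -> one output partition, conventionally part-r-00000.
        System.out.println("part-r-00000 -> " + reduce(mapOutput));
    }
}
```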


Question : Select the correct statement regarding the input key-values of a Mapper class

1. Whatever you have configured as the input key and value types must match in the Mapper class

2. The input key and value types defined at the Mapper class level must match the map() method arguments

3. (option not shown)

4. 1,2
5. 1,2,3
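The type-matching requirement can be sketched with generics: the type parameters declared at the class level fix the map() argument types, mirroring org.apache.hadoop.mapreduce.Mapper<KEYIN, VALUEIN, KEYOUT, VALUEOUT>. This is a stand-in class, not the real Hadoop Mapper:

```java
// Sketch of how the Mapper's class-level type parameters constrain map():
// the generics declared on the class must match the map() method arguments,
// mirroring org.apache.hadoop.mapreduce.Mapper<KI, VI, KO, VO>.
// This is a simplified stand-in, not the real Hadoop class.
public class MiniMapperTypes {
    abstract static class Mapper<KI, VI, KO, VO> {
        // Argument types are fixed by the class-level type parameters.
        abstract void map(KI key, VI value);
    }

    // Declaring Mapper<Long, String, String, Integer> forces map(Long, String);
    // the compiler rejects any other signature.
    static class WordCountMapper extends Mapper<Long, String, String, Integer> {
        String lastKey;
        @Override
        void map(Long byteOffset, String line) {
            lastKey = line.split(" ")[0];
        }
    }

    public static void main(String[] args) {
        WordCountMapper m = new WordCountMapper();
        m.map(0L, "hello world"); // matches the declared input types
        System.out.println(m.lastKey); // hello
    }
}
```

In real jobs the same rule ties the job configuration (e.g. the InputFormat's key/value types) to the Mapper's declared input types.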