MapR (HP) Hadoop Developer Certification Questions and Answers (Dumps and Practice Questions)
Question : The Mapper's output key type must be identical to the Reducer's input key type.
1. True
2. False
Correct Answer : 1 (True)
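In the Java API, the Reducer's input key and value types are the Mapper's output key and value types, so the two signatures have to line up. A minimal sketch of that type contract (class names are illustrative, not from the question):

    import org.apache.hadoop.io.IntWritable;
    import org.apache.hadoop.io.LongWritable;
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.mapreduce.Mapper;
    import org.apache.hadoop.mapreduce.Reducer;

    // Mapper<KEYIN, VALUEIN, KEYOUT, VALUEOUT>: this Mapper emits (Text, IntWritable) pairs.
    class TokenMapper extends Mapper<LongWritable, Text, Text, IntWritable> { }

    // Reducer<KEYIN, VALUEIN, KEYOUT, VALUEOUT>: its input types (Text, IntWritable)
    // must be identical to the Mapper's output types above.
    class TokenReducer extends Reducer<Text, IntWritable, Text, IntWritable> { }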
Question : Each key is processed by exactly one Reducer.
1. True
2. False
Correct Answer : 1 (True)
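All values for a given key are routed to a single reduce task by the partitioner, and the default HashPartitioner computes the partition from the key alone. A minimal sketch of an equivalent partitioner (the class name is illustrative):

    import org.apache.hadoop.mapreduce.Partitioner;

    // Same logic as Hadoop's default HashPartitioner: the partition number is a pure
    // function of the key, so every value sharing a key reaches the same Reducer.
    class KeyHashPartitioner<K, V> extends Partitioner<K, V> {
        @Override
        public int getPartition(K key, V value, int numReduceTasks) {
            return (key.hashCode() & Integer.MAX_VALUE) % numReduceTasks;
        }
    }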
Question : The number of Mappers is defined in the JobConf object.
1. True
2. False
Correct Answer : 2 (False)
The number of Mappers is decided by the Hadoop framework from the number of input splits; the map-task setting in JobConf is only a hint.
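For reference, JobConf.setNumMapTasks() is only a hint, while the reduce-task count is honored as configured; the actual number of map tasks equals the number of input splits produced by the InputFormat. A minimal old-API driver sketch (class name and path arguments are illustrative):

    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.mapred.FileInputFormat;
    import org.apache.hadoop.mapred.FileOutputFormat;
    import org.apache.hadoop.mapred.JobClient;
    import org.apache.hadoop.mapred.JobConf;

    public class DriverSketch {
        public static void main(String[] args) throws Exception {
            JobConf conf = new JobConf(DriverSketch.class);
            FileInputFormat.setInputPaths(conf, new Path(args[0]));
            FileOutputFormat.setOutputPath(conf, new Path(args[1]));

            conf.setNumMapTasks(10);    // only a hint; the real count is the number of input splits
            conf.setNumReduceTasks(2);  // the reducer count, by contrast, is used as given

            JobClient.runJob(conf);     // runs with the default identity Mapper and Reducer
        }
    }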
Related Questions
Question : Which one of the following is NOT a valid Oozie action?
1. mapreduce
2. pig
3. hive
4. mrunit
Question : You want to count the number of occurrences of each unique word in the supplied input data. You have decided to implement this by having your Mapper tokenize each word and emit a literal value 1, and then having your Reducer increment a counter for each literal 1 it receives. After successfully implementing this, it occurs to you that you could optimize it by specifying a Combiner. Will you be able to reuse your existing Reducer as your Combiner in this case, and why or why not? (A sketch follows the answer choices.)
1. Yes, because the sum operation is both associative and commutative and the input and output types to the reduce method match.
2. No, because the sum operation in the reducer is incompatible with the operation of a Combiner.
3. No, because the Reducer and Combiner are separate interfaces.
4. No, because the Combiner is incompatible with a mapper which doesn't use the same data type for both the key and value.
5. Yes, because Java is a polymorphic object-oriented language and thus reducer code can be reused as a combiner.
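For the word-count question above, a sum Reducer can double as the Combiner because addition is associative and commutative and its input and output types are both (Text, IntWritable). A minimal sketch, with the class name assumed for illustration:

    import java.io.IOException;
    import org.apache.hadoop.io.IntWritable;
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.mapreduce.Reducer;

    // Sums the literal 1s emitted by the mapper. Because the input and output types match
    // and summation can be applied in any order and grouping, the same class is safe to
    // register with both job.setCombinerClass(...) and job.setReducerClass(...).
    class WordSumReducer extends Reducer<Text, IntWritable, Text, IntWritable> {
        private final IntWritable total = new IntWritable();

        @Override
        protected void reduce(Text word, Iterable<IntWritable> counts, Context context)
                throws IOException, InterruptedException {
            int sum = 0;
            for (IntWritable c : counts) {
                sum += c.get();
            }
            total.set(sum);
            context.write(word, total);
        }
    }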
Question : Workflows expressed in Oozie can contain:
1. Sequences of MapReduce and Pig jobs. These sequences can be combined with other actions including forks, decision points, and path joins.
2. Sequences of MapReduce jobs only; no Pig or Hive tasks or jobs. These MapReduce sequences can be combined with forks and path joins.
3. Sequences of MapReduce and Pig jobs. These are limited to linear sequences of actions with exception handlers but no forks.
4. Iterative repetition of MapReduce jobs until a desired answer or state is reached.
Question : Your cluster's HDFS block size is ___ MB. You have a directory containing plain text files, each of which is ___ MB in size. The InputFormat for your job is TextInputFormat. How many Mappers will run? (A worked sketch follows the answer choices.)
1. 64
2. 100
3. 200
4. 640
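The mapper count here is the number of input splits: TextInputFormat splits each file independently, so each file contributes roughly ceil(fileSize / blockSize) splits and one Mapper runs per split. A worked sketch using assumed illustrative values (a 64 MB block and 100 files of 100 MB each; the actual figures are missing from the question above):

    public class SplitCountSketch {
        public static void main(String[] args) {
            long blockSizeMb = 64;  // assumed block size, for illustration only
            long fileSizeMb = 100;  // assumed size of each plain text file
            long fileCount = 100;   // assumed number of files in the directory

            // Each file is split on block boundaries, and one Mapper runs per split.
            long splitsPerFile = (fileSizeMb + blockSizeMb - 1) / blockSizeMb;  // ceil(100/64) = 2
            long totalMappers = splitsPerFile * fileCount;                      // 2 * 100 = 200

            System.out.println("Mappers: " + totalMappers);
        }
    }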
Question : Which process describes the lifecycle of a Mapper?
1. The JobTracker calls the TaskTracker's configure() method, then its map() method and finally its close() method.
2. The TaskTracker spawns a new Mapper to process all records in a single input split.
3. The TaskTracker spawns a new Mapper to process each key-value pair.
4. The JobTracker spawns a new Mapper to process all records in a single file.
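In the old (org.apache.hadoop.mapred) API, one Mapper instance handles a single input split: the framework calls configure() once, then map() for each record in the split, then close(). A minimal sketch of that lifecycle (class name and types are illustrative):

    import java.io.IOException;
    import org.apache.hadoop.io.LongWritable;
    import org.apache.hadoop.io.NullWritable;
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.mapred.JobConf;
    import org.apache.hadoop.mapred.MapReduceBase;
    import org.apache.hadoop.mapred.Mapper;
    import org.apache.hadoop.mapred.OutputCollector;
    import org.apache.hadoop.mapred.Reporter;

    // One instance of this class processes all records of one input split.
    class LifecycleMapper extends MapReduceBase
            implements Mapper<LongWritable, Text, Text, NullWritable> {

        @Override
        public void configure(JobConf job) {
            // called once, before any records from the split are processed
        }

        @Override
        public void map(LongWritable offset, Text line,
                        OutputCollector<Text, NullWritable> output, Reporter reporter)
                throws IOException {
            // called once per record in the split
            output.collect(line, NullWritable.get());
        }

        @Override
        public void close() throws IOException {
            // called once, after the last record of the split has been processed
        }
    }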
Question : What is a SequenceFile?
1. A SequenceFile contains a binary encoding of an arbitrary number of homogeneous writable objects.
2. A SequenceFile contains a binary encoding of an arbitrary number of heterogeneous writable objects.
3. A SequenceFile contains a binary encoding of an arbitrary number of WritableComparable objects, in sorted order.
4. A SequenceFile contains a binary encoding of an arbitrary number of key-value pairs. Each key must be the same type. Each value must be the same type.
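A SequenceFile stores binary key-value pairs with one key type and one value type fixed for the whole file, declared when the file is created. A minimal writer sketch (the path and records are illustrative assumptions):

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.io.IntWritable;
    import org.apache.hadoop.io.SequenceFile;
    import org.apache.hadoop.io.Text;

    public class SequenceFileSketch {
        public static void main(String[] args) throws Exception {
            Configuration conf = new Configuration();
            FileSystem fs = FileSystem.get(conf);
            Path path = new Path("/tmp/example.seq");  // illustrative output path

            // The key class and value class are fixed for the entire file.
            SequenceFile.Writer writer =
                    SequenceFile.createWriter(fs, conf, path, Text.class, IntWritable.class);
            try {
                writer.append(new Text("hadoop"), new IntWritable(1));
                writer.append(new Text("mapr"), new IntWritable(2));
            } finally {
                writer.close();
            }
        }
    }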