MapR (HP) Hadoop Developer Certification Questions and Answers (Dumps and Practice Questions)
Question : Which of the following options can be used to pass different types of files to a MapReduce job? (see the driver sketch below)
1. hadoop jar --files
2. hadoop jar --libjars
3. (option available to subscribers only)
4. 1,3
5. 1,2,3
Correct Answer:
Explanation:
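For reference, the generic hadoop jar options for shipping files are handled by GenericOptionsParser, which only takes effect when the driver is run through ToolRunner. Below is a minimal driver sketch of that common pattern; the class name WordCountDriver and the job name are illustrative, not part of the exam material.

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.conf.Configured;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.util.Tool;
import org.apache.hadoop.util.ToolRunner;

public class WordCountDriver extends Configured implements Tool {

    @Override
    public int run(String[] args) throws Exception {
        // By the time run() is called, ToolRunner/GenericOptionsParser has
        // already consumed generic options such as -files, -libjars and
        // -archives and recorded them in the job configuration.
        Job job = new Job(getConf(), "generic options example");
        job.setJarByClass(WordCountDriver.class);
        // ... set mapper, reducer, input and output paths here ...
        return job.waitForCompletion(true) ? 0 : 1;
    }

    public static void main(String[] args) throws Exception {
        System.exit(ToolRunner.run(new Configuration(), new WordCountDriver(), args));
    }
}

A typical launch would then look something like: hadoop jar myjob.jar WordCountDriver -files lookup.txt -libjars mylib.jar -archives dict.zip /input /output (the jar, file and path names are placeholders).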
Question : Using the Java API, how can you add a Distributed Cache file to a job in the driver class? (see the driver sketch below)
1. DistributedCache.addCacheFile()
2. DistributedCache.addCacheArchive()
3. (option available to subscribers only)
4. 1,2
5. 1,2,3
Correct Answer:
Explanation:
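For reference, in the classic API the methods named in the options are static methods on org.apache.hadoop.filecache.DistributedCache and are called against the job's Configuration in the driver, before the Job object is created. A minimal sketch follows; the class name CacheDriver and the HDFS paths are illustrative.

import java.net.URI;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.filecache.DistributedCache;
import org.apache.hadoop.mapreduce.Job;

public class CacheDriver {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();

        // Ship a single HDFS file to every task node; the fragment after '#'
        // becomes the local symlink name in the task's working directory.
        DistributedCache.addCacheFile(new URI("/user/hadoop/lookup.txt#lookup"), conf);

        // Ship an archive (zip/tar/jar); it is unpacked on the task nodes.
        DistributedCache.addCacheArchive(new URI("/user/hadoop/dict.zip#dict"), conf);

        // Ask the framework to create the '#' symlinks in the task working dir.
        DistributedCache.createSymlink(conf);

        // Create the Job only after the cache entries are added, because the
        // Job takes a copy of the Configuration at construction time.
        Job job = new Job(conf, "distributed cache example");
        job.setJarByClass(CacheDriver.class);
        // ... set mapper, reducer, input and output paths here ...
        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}

Inside a mapper or reducer, the local copies can then be located with DistributedCache.getLocalCacheFiles(context.getConfiguration()).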
Question : When you use HBase as both the source and the sink for your MapReduce job, which of the following statements is true? (see the driver sketch below)
1. Data is split based on regions, and a map task is launched for each region's data.
2. After the map tasks, partitions are created and all records with the same key go to the same partition; however, a partition can contain multiple keys.
3. (option available to subscribers only)
4. 1,2
5. 1,2,3
Correct Answer:
Explanation:
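For reference, HBase provides TableMapReduceUtil to wire a table in as both the source and the sink of a job; the input splits are derived from the table's regions, so one map task is launched per region. Below is a minimal copy-table style sketch against the older (0.9x-era) HBase API; the class names and the table names "input_table" / "output_table" are illustrative.

import java.io.IOException;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.KeyValue;
import org.apache.hadoop.hbase.client.Put;
import org.apache.hadoop.hbase.client.Result;
import org.apache.hadoop.hbase.client.Scan;
import org.apache.hadoop.hbase.io.ImmutableBytesWritable;
import org.apache.hadoop.hbase.mapreduce.TableMapReduceUtil;
import org.apache.hadoop.hbase.mapreduce.TableMapper;
import org.apache.hadoop.hbase.mapreduce.TableReducer;
import org.apache.hadoop.io.NullWritable;
import org.apache.hadoop.mapreduce.Job;

public class HBaseSourceSinkDriver {

    // Mapper input is always (row key, Result) when HBase is the source;
    // here each scanned row is turned into a Put for the sink table.
    static class CopyMapper extends TableMapper<ImmutableBytesWritable, Put> {
        @Override
        protected void map(ImmutableBytesWritable row, Result value, Context context)
                throws IOException, InterruptedException {
            Put put = new Put(row.get());
            for (KeyValue kv : value.raw()) {
                put.add(kv);
            }
            context.write(row, put);
        }
    }

    // Reducer writes every Put it receives into the configured sink table.
    static class CopyReducer extends TableReducer<ImmutableBytesWritable, Put, NullWritable> {
        @Override
        protected void reduce(ImmutableBytesWritable key, Iterable<Put> puts, Context context)
                throws IOException, InterruptedException {
            for (Put p : puts) {
                context.write(NullWritable.get(), p);
            }
        }
    }

    public static void main(String[] args) throws Exception {
        Configuration conf = HBaseConfiguration.create();
        Job job = new Job(conf, "hbase as source and sink");
        job.setJarByClass(HBaseSourceSinkDriver.class);

        Scan scan = new Scan();
        scan.setCaching(500);        // read more rows per RPC during the scan
        scan.setCacheBlocks(false);  // don't pollute the region server block cache

        // Source: one input split (and therefore one map task) per region of "input_table".
        TableMapReduceUtil.initTableMapperJob("input_table", scan,
                CopyMapper.class, ImmutableBytesWritable.class, Put.class, job);

        // Sink: the reducer's output is written into "output_table".
        TableMapReduceUtil.initTableReducerJob("output_table", CopyReducer.class, job);

        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}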
Related Questions
Question : Select the correct statements from the list below (see the configuration sketch after the options)
1. The reduce phase does not start until all map tasks have completed
2. In general, it is recommended that we should not enable speculative execution
3. (option available to subscribers only)
4. 1,2
5. 1,2,3
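For reference, both behaviours touched on in the options are controlled by job configuration in classic (MRv1) MapReduce. A minimal sketch using the MRv1 property names follows; the values shown are illustrative, not recommendations.

import org.apache.hadoop.conf.Configuration;

public class ShuffleAndSpeculationConfig {
    public static void main(String[] args) {
        Configuration conf = new Configuration();

        // Reducers may be scheduled (and start shuffling map output) once this
        // fraction of maps has finished, but the reduce() calls themselves do
        // not begin until every map task has completed.
        conf.setFloat("mapred.reduce.slowstart.completed.maps", 0.80f);

        // Speculative execution is switched on or off per task type.
        conf.setBoolean("mapred.map.tasks.speculative.execution", false);
        conf.setBoolean("mapred.reduce.tasks.speculative.execution", false);

        System.out.println("slowstart = " + conf.get("mapred.reduce.slowstart.completed.maps"));
    }
}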
Question : In which of the following scenarios should we enable JVM re-use?
1. When the TaskTracker and JobTracker are long running
2. If we have a small number of map tasks and reduce tasks
3. (option available to subscribers only)
4. 1,2
5. None of 1,2,3
Question : Select the correct statements regarding JVM re-use (see the configuration sketch after the options)
1. There is a parameter named mapred.job.reuse.jvm.num.tasks to configure JVM re-use
2. If we set mapred.job.reuse.jvm.num.tasks to -1, an unlimited number of tasks can be executed in the same JVM
3. (option available to subscribers only)
4. 1,2
5. 1,2,3
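For reference, JVM re-use in classic (MRv1) MapReduce is governed by the single property named in option 1; it mainly pays off for jobs with many short-lived tasks, because each task otherwise pays the JVM start-up cost. A minimal sketch follows; the values are illustrative.

import org.apache.hadoop.conf.Configuration;

public class JvmReuseConfig {
    public static void main(String[] args) {
        Configuration conf = new Configuration();

        // Default is 1 (no re-use): every task gets a fresh JVM.
        // Here, up to 10 tasks of the same job run sequentially in one JVM.
        conf.setInt("mapred.job.reuse.jvm.num.tasks", 10);

        // A value of -1 removes the limit entirely: the JVM is re-used for as
        // many tasks of the job as the TaskTracker assigns to it.
        // conf.setInt("mapred.job.reuse.jvm.num.tasks", -1);

        System.out.println(conf.get("mapred.job.reuse.jvm.num.tasks"));
    }
}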
Question : In general, which of the following will help us improve MapReduce job performance with regard to the circular buffer? (see the configuration sketch after the options)
1. Increasing the size of the circular buffer
2. Reducing the number of spills from the circular buffer
3. (option available to subscribers only)
4. 1,2
5. 1,2,3
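For reference, in classic (MRv1) MapReduce the map-side circular buffer is sized by io.sort.mb and spills to disk when it fills past io.sort.spill.percent, so a larger buffer (that still fits in the map task heap) generally means fewer spills. A minimal sketch follows; the values are illustrative.

import org.apache.hadoop.conf.Configuration;

public class SortBufferConfig {
    public static void main(String[] args) {
        Configuration conf = new Configuration();

        // Size of the in-memory map output (circular) buffer, in MB; default is 100.
        conf.setInt("io.sort.mb", 256);

        // Fraction of the buffer at which a background spill to disk begins;
        // default is 0.80, i.e. the 80% mentioned in the next question.
        conf.setFloat("io.sort.spill.percent", 0.80f);

        // The buffer lives inside the map task heap, so it must stay well below
        // the -Xmx set via mapred.map.child.java.opts / mapred.child.java.opts.
    }
}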
Question : Select the correct statements regarding the circular buffer and the spilling of these buffers (see the configuration sketch after the options)
1. When the circular buffer reaches 80% full (or whatever threshold is configured), its contents are first sorted by key; if a combiner is configured, it is also executed before the spill.
2. By default, 10 spill files can be merged together again after spilling
3. (option available to subscribers only)
4. 1,2
5. 1,2,3
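For reference, the spill-and-merge behaviour described in the options maps onto a handful of MRv1 properties; the combiner (when one is set on the job) runs during each spill and again during the merge once enough spill files have accumulated. A minimal sketch follows; the values shown are the MRv1 defaults.

import org.apache.hadoop.conf.Configuration;

public class SpillMergeConfig {
    public static void main(String[] args) {
        Configuration conf = new Configuration();

        // Start spilling when the circular buffer is 80% full.
        conf.setFloat("io.sort.spill.percent", 0.80f);

        // Merge factor: how many spill files (streams) are merged at once.
        conf.setInt("io.sort.factor", 10);

        // The combiner is run again during the merge phase only if at least
        // this many spill files were produced.
        conf.setInt("min.num.spills.for.combine", 3);
    }
}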
Question : Match each of the following properties with its description (see the configuration sketch after the options)
A. mapred.map.child.java.opts
B. mapred.reduce.child.java.opts
C. mapred.child.java.opts
D. mapred.child.ulimit
1. Maximum size of virtual memory consumed by a task and its children.
2. Applies to Map Tasks
3. Applies to Reduce Tasks
4. Applies to both Map and Reduce tasks
1. A-1, B-2 , C-3, D-4
2. A-2, B-3 , C-1, D-4
3. (option available to subscribers only)
4. A-2, B-3 , C-4, D-1
5. A-3, B-2 , C-1, D-4
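For reference, the four MRv1 properties in this question fit together as follows: mapred.child.java.opts applies to both task types unless the map- or reduce-specific variant overrides it, while mapred.child.ulimit caps the virtual memory (in KB) of the launched task process and any children it spawns. A minimal sketch follows; the values are illustrative.

import org.apache.hadoop.conf.Configuration;

public class TaskMemoryConfig {
    public static void main(String[] args) {
        Configuration conf = new Configuration();

        // Applies to both map and reduce tasks unless overridden below.
        conf.set("mapred.child.java.opts", "-Xmx512m");

        // Task-type specific overrides.
        conf.set("mapred.map.child.java.opts", "-Xmx512m");
        conf.set("mapred.reduce.child.java.opts", "-Xmx1024m");

        // Maximum virtual memory, in KB, for a task and its child processes.
        conf.set("mapred.child.ulimit", "2097152");  // 2 GB
    }
}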