
MapR (HP) Hadoop Developer Certification Questions and Answers (Dumps and Practice Questions)



Question : In a MapReduce job with 500 map tasks, how many map task attempts will there be?
1. Between 500 and 1000.
2. At most 500.
3. At least 500.
4. Exactly 500.
5. It depends on the number of reduce tasks in the job.

Correct Answer : 3 (At least 500)
Explanation: A task attempt is a particular instance of an attempt to execute a task.
- There will be at least as many task attempts as there are tasks.
- If a task attempt fails, another will be started by the JobTracker.
- Speculative execution can also result in more task attempts than completed tasks.
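As a sketch, the retry and speculative-execution behaviour described above is controlled through job configuration. Assuming Hadoop 1.x property names (the values shown are the usual defaults, not a tuning recommendation):

```xml
<!-- mapred-site.xml fragment (Hadoop 1.x names; illustrative sketch) -->
<configuration>
  <!-- Maximum attempts per map task before the job is declared failed -->
  <property>
    <name>mapred.map.max.attempts</name>
    <value>4</value>
  </property>
  <!-- Allow the JobTracker to launch speculative duplicates of slow map tasks -->
  <property>
    <name>mapred.map.tasks.speculative.execution</name>
    <value>true</value>
  </property>
</configuration>
```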




Question : Which HDFS command uploads a local file X into an existing HDFS directory Y?
1. hadoop scp X Y
2. hadoop fs -localPut X Y
3. hadoop fs -put X Y
4. hadoop fs -get X Y

Correct Answer : 3 (hadoop fs -put X Y)
Explanation: Usage: hadoop fs -put <localsrc> ... <dst>

Copy single src, or multiple srcs from local file system to the destination file system. Also reads input from stdin and writes to destination file system.

hadoop fs -put localfile /user/hadoop/hadoopfile
hadoop fs -put localfile1 localfile2 /user/hadoop/hadoopdir
hadoop fs -put localfile hdfs://nn.example.com/hadoop/hadoopfile
hadoop fs -put - hdfs://nn.example.com/hadoop/hadoopfile (reads the input from stdin)
Exit Code:

Returns 0 on success and -1 on error.




Question : Which one of the following files is required in every Oozie Workflow application?
1. job.properties
2. config-default.xml
3. workflow.xml
4. oozie.xml

Correct Answer : 3 (workflow.xml)
Explanation: Every Oozie Workflow application must contain a workflow.xml file, which defines the workflow's actions and the transitions between them. job.properties supplies parameters when a job is submitted and config-default.xml holds optional default configuration, but only workflow.xml is mandatory.
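A minimal workflow.xml skeleton, sketched here with a single MapReduce action (the names demo-wf and mr-node are illustrative, not part of the Oozie specification):

```xml
<workflow-app xmlns="uri:oozie:workflow:0.1" name="demo-wf">
  <start to="mr-node"/>
  <action name="mr-node">
    <map-reduce>
      <job-tracker>${jobTracker}</job-tracker>
      <name-node>${nameNode}</name-node>
    </map-reduce>
    <ok to="end"/>
    <error to="fail"/>
  </action>
  <kill name="fail">
    <message>MapReduce action failed</message>
  </kill>
  <end name="end"/>
</workflow-app>
```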


Related Questions


Question : Which best describes when the reduce method is first called in a MapReduce job?
1. Reducers start copying intermediate key-value pairs from each Mapper as soon as it
has completed. The programmer can configure in the job what percentage of the
intermediate data should arrive before the reduce method begins.
2. Reducers start copying intermediate key-value pairs from each Mapper as soon as it
has completed. The reduce method is called only after all intermediate data has been
copied and sorted.
3. … optimal performance for map-only or reduce-only jobs.
4. Reducers start copying intermediate key-value pairs from each Mapper as soon as it
has completed. The reduce method is called as soon as the intermediate key-value pairs
start to arrive.
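The configurable copy-start threshold referred to in option 1 corresponds, assuming Hadoop 1.x property names, to a single job setting (shown with its usual default; a sketch, not a recommendation):

```xml
<!-- Fraction of map tasks that must complete before reducers are scheduled -->
<property>
  <name>mapred.reduce.slowstart.completed.maps</name>
  <value>0.05</value>
</property>
```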


Question : Which of the following is a possible block size in HDFS?
1. 512 Bytes
2. 64 MB
3. …
4. None of the above




Question : To create a sequence of multiple MapReduce jobs (chaining), the same JobConf object is used.
1. True
2. False


Question : Which class/object represents a MapReduce job?
1. Job
2. JobControl
3. …




Question : If X and Y are two MapReduce jobs and their dependency is set as below

x.addDependingJob(y)

What does it mean?
1. X will not start until Y has finished
2. Y will not start until X has finished
3. …
4. All of the above
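The dependency call can be sketched with the old org.apache.hadoop.mapred.jobcontrol API (the class name ChainSketch is illustrative, and the snippet assumes the Hadoop 1.x libraries are on the classpath):

```java
import org.apache.hadoop.mapred.JobConf;
import org.apache.hadoop.mapred.jobcontrol.Job;
import org.apache.hadoop.mapred.jobcontrol.JobControl;

public class ChainSketch {
    public static void main(String[] args) throws Exception {
        // Each job in the chain needs its own JobConf object
        Job x = new Job(new JobConf());
        Job y = new Job(new JobConf());

        // x depends on y: x will not start until y has finished
        x.addDependingJob(y);

        JobControl control = new JobControl("chain");
        control.addJob(x);
        control.addJob(y);

        // run() blocks until all jobs in the group finish or fail
        control.run();
    }
}
```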




Question : Which option of the "hadoop fs" command prints detailed help?
1. '-show'
2. '-help'
3. …
4. Any of the above