
Cloudera Hadoop Developer Certification Questions and Answers (Dumps and Practice Questions)



Question :

Which language is used in Oozie to define a MapReduce workflow?
1. Java
2. XML
3. Pig Latin
4. None of the above



Correct Answer : 2







Question :

Select the correct statement:
1. In an Oozie workflow, all MapReduce jobs can run in sequence only
2. Jobs can run in parallel as well as in sequence
3. One job can wait for another job to finish
4. All of the above
5. 2 and 3


Correct Answer : 5

Oozie is a system for describing the workflow of a job, where that job may contain a set of MapReduce jobs, Pig scripts, filesystem operations, and so on, and it supports forking and joining of the data flow.

It does not, however, allow you to stream the output of one MapReduce job directly as the input to another: the map-reduce action in Oozie still requires an output format of some type, typically a file-based one, so the output of job 1 is still serialized via HDFS before being processed by job 2.

Oozie can run jobs sequentially (one after the other) and in parallel (multiple at a time).
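As a sketch of the fork/join behavior described above, an Oozie workflow is defined in plain XML. The fragment below (action, workflow, and property names are hypothetical) forks into two map-reduce actions that run in parallel, then joins: the join node waits for both jobs to finish before the workflow continues.

```xml
<workflow-app name="demo-wf" xmlns="uri:oozie:workflow:0.5">
    <start to="split"/>
    <!-- fork: both paths below start in parallel -->
    <fork name="split">
        <path start="job1"/>
        <path start="job2"/>
    </fork>
    <action name="job1">
        <map-reduce>
            <job-tracker>${jobTracker}</job-tracker>
            <name-node>${nameNode}</name-node>
        </map-reduce>
        <ok to="merge"/>
        <error to="fail"/>
    </action>
    <action name="job2">
        <map-reduce>
            <job-tracker>${jobTracker}</job-tracker>
            <name-node>${nameNode}</name-node>
        </map-reduce>
        <ok to="merge"/>
        <error to="fail"/>
    </action>
    <!-- join: waits for every forked path before moving on -->
    <join name="merge" to="end"/>
    <kill name="fail"><message>Workflow failed</message></kill>
    <end name="end"/>
</workflow-app>
```

Note that each map-reduce action still writes its output to HDFS; the join only synchronizes control flow, it does not stream data between jobs.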





Question :

What is the benefit of the Hadoop framework?
1. Can process petabytes of data
2. Can store petabytes of structured and unstructured data
3. Data can be imported from and exported to an RDBMS
4. All of the above



Correct Answer : 4
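As a hedged illustration of the RDBMS import/export point, Sqoop is the usual tool for moving data between a relational database and HDFS. The commands below are a sketch only — the JDBC URL, table names, and directories are hypothetical — and they require a configured Hadoop/Sqoop installation to run.

```
# Import a table from MySQL into HDFS (hypothetical connection details)
sqoop import \
  --connect jdbc:mysql://dbhost/sales \
  --table orders \
  --target-dir /user/demo/orders

# Export HDFS data back into an RDBMS table
sqoop export \
  --connect jdbc:mysql://dbhost/sales \
  --table orders_summary \
  --export-dir /user/demo/orders_summary
```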




Related Questions


Question : If a file which is … MB is stored in HDFS, how much block space will it use?
1. 33 MB
2. 64 MB
3. …
4. None of the Above
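The option list suggests a file size in the tens of megabytes against HDFS's classic 64 MB default block size. As a hedged sketch of the arithmetic (the 33 MB file size is assumed for illustration): a file smaller than one block occupies a single block, and HDFS stores only the actual bytes on disk rather than padding out to the full block size.

```shell
FILE_MB=33    # assumed file size, for illustration only
BLOCK_MB=64   # classic HDFS default block size
# Number of blocks: ceiling division of file size by block size
BLOCKS=$(( (FILE_MB + BLOCK_MB - 1) / BLOCK_MB ))
echo "${BLOCKS} block(s); ${FILE_MB} MB of actual disk usage"
# → 1 block(s); 33 MB of actual disk usage
```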




Question : How are blocks stored in HDFS?
1. As a binary file
2. As a decoded file
3. …
4. Stored as an archive



Question : Without the metadata on the NameNode, can a file be recovered?
1. True
2. False


Question : Select the correct option:
1. NameNode is the bottleneck for reading a file in HDFS
2. NameNode is used to determine all the blocks of a file
3. …
4. All of the above


Question :

Which is the correct option for accessing a file stored in HDFS?
1. Applications can read and write files in HDFS using the Java API
2. There is a command-line option to access the files
3. …
4. 1,2 and 3 are correct
5. 1 and 2 are correct


Question : Which is the correct command to copy files from the local file system to HDFS?
1. hadoop fs -copy pappu.txt pappu.txt
2. hadoop fs -copyFromPath pappu.txt pappu.txt
3. …
4. None of the above
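As a hedged sketch (the file and directory names are illustrative), the standard `hadoop fs` commands for copying a local file into HDFS are `-put` and `-copyFromLocal`; both require a configured Hadoop client to run.

```
# Copy a local file into HDFS
hadoop fs -put pappu.txt /user/demo/pappu.txt

# -copyFromLocal is the equivalent for local sources
hadoop fs -copyFromLocal pappu.txt /user/demo/pappu.txt
```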