
AWS Certified Developer - Associate Questions and Answers (Dumps and Practice Questions)



Question : What item operation allows the retrieval of multiple items from a DynamoDB table in a single API call?
1. GetItem
2. BatchGetItem
3. Access Mostly Uused Products by 50000+ Subscribers
4. GetItemRange

Correct Answer : 2 (BatchGetItem)

The BatchGetItem operation returns the attributes of one or more items from one or more tables. You identify requested items by primary key.

A single operation can retrieve up to 16 MB of data, which can contain as many as 100 items. BatchGetItem will return a partial result if the response size limit is exceeded, the table's provisioned throughput is exceeded, or an internal processing failure occurs. If a partial result is returned, the operation returns a value for UnprocessedKeys. You can use this value to retry the operation starting with the next item to get.

Important
If you request more than 100 items, BatchGetItem returns a ValidationException with the message "Too many items requested for the BatchGetItem call".
For example, if you ask to retrieve 100 items, but each individual item is 300 KB in size, the system returns 52 items (so as not to exceed the 16 MB limit). It also returns an appropriate UnprocessedKeys value so you can get the next page of results. If desired, your application can include its own logic to assemble the pages of results into one data set.

If none of the items can be processed due to insufficient provisioned throughput on all of the tables in the request, then BatchGetItem will return a ProvisionedThroughputExceededException. If at least one of the items is successfully processed, then BatchGetItem completes successfully, while returning the keys of the unread items in UnprocessedKeys.

Important
If DynamoDB returns any unprocessed items, you should retry the batch operation on those items. However, we strongly recommend that you use an exponential backoff algorithm. If you retry the batch operation immediately, the underlying read or write requests can still fail due to throttling on the individual tables. If you delay the batch operation using exponential backoff, the individual requests in the batch are much more likely to succeed.
For more information, see Batch Operations and Error Handling in the Amazon DynamoDB Developer Guide.
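The retry guidance above can be sketched as a small helper. To keep it runnable without AWS credentials, the DynamoDB call is passed in as `client_fn`, a hypothetical stand-in for boto3's `batch_get_item`; the retry-until-`UnprocessedKeys`-is-empty loop and the exponential backoff are the parts taken from the text.

```python
import time

def batch_get_with_backoff(client_fn, request_items, max_retries=5):
    """Call a BatchGetItem-style function until UnprocessedKeys is empty,
    sleeping with capped exponential backoff between attempts.

    client_fn is a stand-in for boto3's dynamodb client batch_get_item
    (an assumption made so the sketch can run against a stub)."""
    items = {}
    pending = request_items
    for attempt in range(max_retries + 1):
        resp = client_fn(RequestItems=pending)
        # Merge whatever was returned this round, table by table.
        for table, rows in resp.get("Responses", {}).items():
            items.setdefault(table, []).extend(rows)
        pending = resp.get("UnprocessedKeys", {})
        if not pending:
            return items
        # Exponential backoff before retrying the unprocessed keys.
        time.sleep(min(2 ** attempt * 0.05, 1.0))
    raise RuntimeError("items still unprocessed after retries")
```

With real AWS access you would pass `boto3.client("dynamodb").batch_get_item` as `client_fn`; keeping the call injectable also makes the retry logic easy to unit-test.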
By default, BatchGetItem performs eventually consistent reads on every table in the request. If you want strongly consistent reads instead, you can set ConsistentRead to true for any or all tables.

In order to minimize response latency, BatchGetItem retrieves items in parallel.

When designing your application, keep in mind that DynamoDB does not return attributes in any particular order. To help parse the response by item, include the primary key values for the items in your request in the AttributesToGet parameter.







Question : Which of the following services are included at no additional cost with the use of the AWS platform? Choose 2 answers
A. Simple Storage Service
B. Elastic Compute Cloud
C. Auto Scaling
D. Elastic Load Balancing
E. CloudFormation
F. Simple Workflow Service

1. A,B
2. C,E
3. Access Mostly Uused Products by 50000+ Subscribers
4. D,E
5. E,F



Correct Answer : 2 (C,E)

Explanation: Auto Scaling is enabled by Amazon CloudWatch and carries no additional fees. Amazon EC2 and Amazon CloudWatch service fees apply and are billed separately. Partial hours are billed as full hours.
There is no additional charge for AWS CloudFormation. You pay for AWS resources (such as Amazon EC2 instances, Elastic Load Balancing load balancers, etc.) created using AWS CloudFormation in the same manner as if you created them manually. You only pay for what you use, as you use it; there are no minimum fees and no required upfront commitments.







Question : How is provisioned throughput affected by the chosen consistency model when reading data from a DynamoDB table?
1. Strongly consistent reads use the same amount of throughput as eventually consistent reads
2. Strongly consistent reads use more throughput than eventually consistent reads.
3. Access Mostly Uused Products by 50000+ Subscribers
4. Strongly consistent reads use variable throughput depending on read activity


Correct Answer : 2

Explanation: Q: What is the consistency model of Amazon DynamoDB?

When reading data from Amazon DynamoDB, users can specify whether they want the read to be eventually consistent or strongly consistent:

Eventually Consistent Reads (Default) – the eventual consistency option maximizes your read throughput. However, an eventually consistent read might not reflect the results of a recently completed write. Consistency across all copies of data is usually reached within a second. Repeating a read after a short time should return the updated data.

Strongly Consistent Reads — in addition to eventual consistency, Amazon DynamoDB also gives you the flexibility and control to request a strongly consistent read if your application, or an element of your application, requires it. A strongly consistent read returns a result that reflects all writes that received a successful response prior to the read.
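The throughput impact described in the question follows from the published capacity rule: one read capacity unit covers one strongly consistent read per second of an item up to 4 KB, and an eventually consistent read costs half as much. A minimal sketch of that arithmetic (the function name is ours, not an AWS API):

```python
import math

def read_capacity_units(item_size_kb, reads_per_second, strongly_consistent):
    """Provisioned RCUs needed, per the DynamoDB rule:
    1 RCU = one strongly consistent read/sec of an item up to 4 KB;
    eventually consistent reads cost half."""
    units_per_read = math.ceil(item_size_kb / 4)  # round up to 4 KB blocks
    if not strongly_consistent:
        units_per_read /= 2
    return units_per_read * reads_per_second
```

For example, 100 strongly consistent reads per second of 4 KB items need 100 RCUs, while the same workload with eventually consistent reads needs only 50.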





Related Questions


Question :

What is the primary difference between a global secondary index and a local secondary index?

1. The global secondary index is not region specific
2. There are no differences
3. A global secondary index has the same hash key as the primary key and the local secondary index has a different hash and range key
4. A local secondary index has the same hash key as the primary key and the global secondary index has a different hash and range key


Question :

Company B has a DynamoDB table where the average item size is 10KB.
Company B anticipates the application will read 100 items from the table per second using eventually consistent reads.
How much read capacity throughput should they provision?


1. 100
2. 300
3. 150
4. 200
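The Company B scenario above can be worked through with the standard RCU arithmetic (item size rounded up to 4 KB blocks, eventually consistent reads at half cost); this is a sketch of that calculation, not AWS output:

```python
import math

item_size_kb = 10
reads_per_sec = 100

# Each strongly consistent read of a 10 KB item consumes ceil(10/4) = 3 RCUs.
units_per_strong_read = math.ceil(item_size_kb / 4)

# Eventually consistent reads cost half, so:
provisioned = reads_per_sec * units_per_strong_read / 2  # 150.0
```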



Question :

How many tables can an AWS account have per region?

1. 255
2. 126
3. 282
4. 256


Question :

How many secondary indexes are allowed per table?

1. 10
2. 5
3. 1
4. There is no limit


Question : How can you increase your DynamoDB table limit in a region?
1. By calling the UpdateLimit API call
2. By contacting AWS and requesting a limit increase
3. DynamoDB is one of the services that does not allow table increases
4. DynamoDB can't increase table limit so you increase it by writing code that uses multiple regions



Question : With AMIs backed by Amazon EBS, you're charged for
1. volume storage
2. usage in addition to the AMI
3. instance usage charges
4. 1 and 3
5. All 1,2 and 3