
AWS Certified Solutions Architect – Associate Questions and Answers (Dumps and Practice Questions)



Question : What is the minimum and maximum size of a single S3 object?
1. 1MB and 5GB
2. 1B and 1TB
3. Access Mostly Uused Products by 50000+ Subscribers
4. 1Byte and 5TB



Correct Answer : 4


Explanation: Depending on the size of the data you are uploading, Amazon S3 offers the following options:

Upload objects in a single operation: with a single PUT operation you can upload objects up to 5 GB in size.

Upload objects in parts: using the Multipart Upload API you can upload large objects, up to 5 TB.

The Multipart Upload API is designed to improve the upload experience for larger objects. You can upload an object in parts; these parts can be uploaded independently, in any order, and in parallel. You can use a multipart upload for objects from 5 MB to 5 TB in size. Amazon encourages S3 customers to use multipart upload for objects greater than 100 MB.

The total volume of data and number of objects you can store are unlimited. Individual Amazon S3 objects can range in size from 1 byte to 5 terabytes.
The largest object that can be uploaded in a single PUT is 5 gigabytes.
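As a rough illustration only, here is a minimal boto3 sketch of a multipart upload; the bucket name, file path, and threshold values are assumptions made for the example, not part of the question:

    import boto3
    from boto3.s3.transfer import TransferConfig

    s3 = boto3.client("s3")

    # Switch to multipart upload above 100 MB, in line with AWS's recommendation,
    # and upload 64 MB parts in parallel.
    config = TransferConfig(
        multipart_threshold=100 * 1024 * 1024,
        multipart_chunksize=64 * 1024 * 1024,
    )

    # Hypothetical bucket and object names, used only for illustration.
    s3.upload_file(
        Filename="backup/large-video.mp4",
        Bucket="example-media-bucket",
        Key="uploads/large-video.mp4",
        Config=config,
    )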





Question : You are an AWS architect, and in your organization there are multiple processes that run asynchronously.
However, they have some dependencies on each other, and you are required to coordinate the execution of multiple distributed components
and deal with the increased latencies and unreliability inherent in remote communication. Which of the following solutions best fits
this scenario?

1. You will implement this with the help of message queues and databases, along with the logic to synchronize them.
2. You will use Amazon Simple Workflow (SWF)
3. Access Mostly Uused Products by 50000+ Subscribers
4. You will solve this problem using Amazon Simple Notification Service (Amazon SNS)


Correct Answer : 2


Explanation: Amazon Simple Workflow (Amazon SWF) is a task coordination and state management service for cloud applications.
With Amazon SWF, you can stop writing complex glue code and state machinery and invest more in the business logic that makes your applications unique.
Amazon SWF makes it easier to develop asynchronous and distributed applications by providing a programming model
and infrastructure for coordinating distributed components and maintaining their execution state in a reliable way.
By relying on Amazon SWF, you are free to focus on building the aspects of your application that differentiate it.
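As an illustration only, starting a workflow execution with boto3 might look like the sketch below; the domain, workflow type, task list, and timeouts are hypothetical values, not anything stated in the question:

    import boto3

    swf = boto3.client("swf")

    # Kick off one execution of a hypothetical order-processing workflow.
    # SWF keeps track of execution state and dispatches tasks to deciders and workers.
    response = swf.start_workflow_execution(
        domain="example-domain",
        workflowId="order-12345",
        workflowType={"name": "ProcessOrder", "version": "1.0"},
        taskList={"name": "order-task-list"},
        input='{"orderId": "12345"}',
        executionStartToCloseTimeout="3600",
        taskStartToCloseTimeout="300",
        childPolicy="TERMINATE",
    )
    print(response["runId"])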






Question : How do you define an Activity Task in the context of Amazon Simple Workflow?

1. It is a definition of the Activity
2. One invocation of an activity
3. Access Mostly Uused Products by 50000+ Subscribers
4. A collection of activities


Correct Answer : 2


Explanation: Consider an example in SWF: in a customer-order workflow, you might have an activity that handles purchased items.
If the customer purchases multiple items, then this activity would have to run multiple times.
Amazon SWF has the concept of an activity task, which represents one invocation of an activity.
In this example, the processing of each item would be represented by a single activity task.
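A minimal worker sketch, assuming a hypothetical domain and task-list name, shows how each polled activity task corresponds to exactly one invocation of the activity (for example, processing a single purchased item):

    import boto3

    swf = boto3.client("swf")

    # Each successful poll returns ONE activity task, i.e. one invocation
    # of the activity.
    task = swf.poll_for_activity_task(
        domain="example-domain",
        taskList={"name": "order-task-list"},
        identity="item-worker-1",
    )

    if task.get("taskToken"):
        # ... process the single item described in task["input"] ...
        swf.respond_activity_task_completed(
            taskToken=task["taskToken"],
            result='{"status": "item-processed"}',
        )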



Related Questions


Question : The administrator of your company has uploaded a big file, assuming it is very infrequently accessed data, and while accessing the data you find
that it takes several hours to retrieve it. Which one of the following storage options has the administrator used?
1. Standard S3 storage
2. RRS (Reduced Redundancy Storage)
3. Access Mostly Uused Products by 50000+ Subscribers
4. None of the above
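Multi-hour retrievals are characteristic of data archived to Amazon Glacier. Purely as an illustration, and with hypothetical bucket and key names, initiating a restore of an archived S3 object with boto3 might look like this:

    import boto3

    s3 = boto3.client("s3")

    # Ask S3 to restore an object archived to Glacier; standard retrievals
    # typically take several hours before the data becomes readable again.
    s3.restore_object(
        Bucket="example-archive-bucket",
        Key="backups/big-file.tar.gz",
        RestoreRequest={
            "Days": 2,  # keep the restored copy available for 2 days
            "GlacierJobParameters": {"Tier": "Standard"},
        },
    )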



Question : You have created a VPC in a region that has three AZs. You will now create a public subnet in each AZ and launch one instance in each
AZ, with each instance hosting a different website. These websites need to communicate with the internet. So you will be …
1. Creating three IGWs and attaching each one to a different subnet, so that they are accessible from the internet.

2. Creating three IGWs and attaching them to the VPC, so that each EC2 server uses an independent IGW for accessing the internet.

3. Access Mostly Uused Products by 50000+ Subscribers

4. Creating only one IGW and attaching it to the VPC. You then create a route in the route table attached to each subnet that sends traffic via the
IGW.
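Since a VPC can have only one internet gateway attached at a time, the single-IGW pattern described in option 4 is the workable one. A rough boto3 sketch follows; the VPC and route-table IDs are placeholders for illustration:

    import boto3

    ec2 = boto3.client("ec2")

    # Create one IGW and attach it to the VPC (placeholder IDs below).
    igw = ec2.create_internet_gateway()
    igw_id = igw["InternetGateway"]["InternetGatewayId"]
    ec2.attach_internet_gateway(InternetGatewayId=igw_id, VpcId="vpc-0123456789abcdef0")

    # Route all outbound traffic from each public subnet's route table to the IGW.
    ec2.create_route(
        RouteTableId="rtb-0123456789abcdef0",
        DestinationCidrBlock="0.0.0.0/0",
        GatewayId=igw_id,
    )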



Question : You are working with a social media company that stores users' photos, videos and audio files in S3, with all the related metadata
stored in DynamoDB. Your website presents a slide show or an individual media item, and underneath each slide you want to show the related metadata
stored in DynamoDB. Which of the following is the fastest and correct way to extract the metadata from DynamoDB?
1. You will scan the entire table in which the metadata is stored and get the related metadata from that.

2. You will fire a search operation on the DynamoDB table and get the related result.

3. Access Mostly Uused Products by 50000+ Subscribers

4. You will use a find operation so that the related metadata can be retrieved from the table.
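For context, single-key lookups (GetItem, or Query on the primary key) are the fast path in DynamoDB, as opposed to full-table scans. A hedged boto3 sketch, where the table name and key attribute are assumptions made only for the example:

    import boto3

    dynamodb = boto3.resource("dynamodb")
    table = dynamodb.Table("MediaMetadata")  # hypothetical table name

    # Fetch the metadata for one media item directly by its primary key,
    # instead of scanning or searching the whole table.
    response = table.get_item(Key={"MediaId": "photo-8675309"})
    metadata = response.get("Item")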



Question : You are at the initial stage of creating a web-based discussion application, and for that you have created a simple UI where you have a
UserID, a Comment and its PostingTimeStamp. You decided to use DynamoDB, assuming there will be a heavy load on it. You want faster retrieval of
the comments, so you decided to partition the data and also make sure the comments are always sorted by timestamp. Which of the following will you
choose to create the partition key?
1. UserID, PostingTimeStamp

2. PostingTimeStamp and Comment

3. Access Mostly Uused Products by 50000+ Subscribers

4. Comment and UserID
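As background, combining a partition (hash) key with a sort (range) key in DynamoDB keeps each partition's items ordered by the sort key. A sketch of what a UserID/PostingTimeStamp composite key could look like; the table name and capacity values are assumptions for illustration:

    import boto3

    dynamodb = boto3.client("dynamodb")

    # Items sharing a UserID land in the same partition and are stored
    # sorted by PostingTimeStamp, so comments come back in time order.
    dynamodb.create_table(
        TableName="DiscussionComments",  # hypothetical name
        AttributeDefinitions=[
            {"AttributeName": "UserID", "AttributeType": "S"},
            {"AttributeName": "PostingTimeStamp", "AttributeType": "N"},
        ],
        KeySchema=[
            {"AttributeName": "UserID", "KeyType": "HASH"},             # partition key
            {"AttributeName": "PostingTimeStamp", "KeyType": "RANGE"},  # sort key
        ],
        ProvisionedThroughput={"ReadCapacityUnits": 10, "WriteCapacityUnits": 10},
    )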



Question : Select the correct scenarios where you can use Amazon Redshift.
A. To store equity market data arriving as a continuous per-second stream
B. To store user information and use it for applying analytics or creating reports on it
C. To use it as a data warehouse replacement, storing data that comes from various data sources
D. To use it as an e-commerce database to store live purchase orders

1. A,B
2. B,C
3. Access Mostly Uused Products by 50000+ Subscribers
4. A,D
5. B,D


Question : You have recently joined a company that has its website hosted on AWS EC2 instances. They have also configured very thorough monitoring
through CloudWatch, which keeps sending very granular monitoring detail that is always ignored and that nobody is interested in. The website is not
heavily loaded, there is no plan for high load in the future, and even if it is down for a few hours in a week that is acceptable. Your chief architect
has asked you to reduce the cost of this setup; which one of the following can help you implement that? Note that this website supports logged-in user access only.
1. You will ask all the website users when they prefer to use the website once a week, and then shut the website down for the other days.

2. You will make your website static and host it on S3.

3. Access Mostly Uused Products by 50000+ Subscribers

4. You will disable the paid detailed monitoring.
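If the cost being targeted is CloudWatch detailed (paid) monitoring, it can be switched back to basic monitoring. A minimal sketch, with a placeholder instance ID:

    import boto3

    ec2 = boto3.client("ec2")

    # Turn off detailed (1-minute, paid) monitoring and fall back to the
    # free 5-minute basic monitoring for the given instance.
    ec2.unmonitor_instances(InstanceIds=["i-0123456789abcdef0"])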