
AWS Certified Solutions Architect - Professional Questions and Answers (Dumps and Practice Questions)



Question : Your company policies require encryption of sensitive data at rest. You are considering the possible options for protecting data while storing it at rest on an EBS data volume attached to an EC2 instance. Which of these options would allow you to encrypt your data at rest?
(Choose 3 answers)

A. Implement third party volume encryption tools
B. Do nothing as EBS volumes are encrypted by default
C. Encrypt data inside your applications before storing it on EBS
D. Encrypt data using native data encryption drivers at the file system level
E. Implement SSL/TLS for all services running on the server

1. A,B,C
2. B,C,D
3. A,C,D
4. A,E,D
5. B,D,E

Answer: 3

Explanation: Refer to https://media.amazonwebservices.com/AWS_Securing_Data_at_Rest_with_Encryption.pdf
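Each of the valid approaches here (third-party volume encryption, application-level encryption, and file-system-level encryption) performs the encryption above the EBS layer. As a minimal sketch of option C, encrypting inside the application before the data ever reaches the volume, assuming the third-party `cryptography` package is installed (file path and key handling are simplified for illustration):

```python
# Sketch of application-level encryption (option C): the application encrypts
# data before it is written to the EBS-backed filesystem, so bytes at rest on
# the volume are always ciphertext. Assumes "pip install cryptography".
import os
import tempfile
from cryptography.fernet import Fernet

# In practice the key would come from a key-management service or HSM,
# never be generated and held inline like this.
key = Fernet.generate_key()
f = Fernet(key)

plaintext = b"sensitive customer record"
ciphertext = f.encrypt(plaintext)  # this is what actually lands on the volume

path = os.path.join(tempfile.gettempdir(), "record.enc")  # stand-in for an EBS mount
with open(path, "wb") as fh:
    fh.write(ciphertext)

# Reading the data back requires the key, so the data at rest stays protected.
with open(path, "rb") as fh:
    restored = f.decrypt(fh.read())
assert restored == plaintext
```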






Question : You currently operate a web application in the AWS US-East region. The application runs on an auto-scaled layer of EC2 instances and an RDS Multi-AZ database. Your IT security compliance officer has tasked you to develop a reliable and durable logging solution to track changes made to your EC2, IAM, and RDS resources. The solution must ensure the integrity and confidentiality of your log data. Which of these solutions would you recommend?
1. Create a new CloudTrail trail with one new S3 bucket to store the logs and with the global services option selected. Use IAM roles, S3 bucket policies, and Multi-Factor Authentication (MFA) Delete on the S3 bucket that stores your logs.
2. Create a new CloudTrail trail with one new S3 bucket to store the logs. Configure SNS to send log file delivery notifications to your management system. Use IAM roles and S3 bucket policies on the S3 bucket that stores your logs.
3. …
4. Create three new CloudTrail trails with three new S3 buckets to store the logs one for the AWS Management console, one for AWS SDKs and one for command line tools Use IAM roles and S3 bucket policies on the S3 buckets that store your logs.

Answer: 1

Explanation: CloudTrail logs are stored in S3, and to protect them against deletion we need IAM roles, S3 bucket policies, and MFA Delete on the bucket; hence options 2 and 4 are out. Between 1 and 3, option 3 relies on S3 ACLs rather than bucket policies. If you decide to use an existing bucket when you turn on CloudTrail for a new region, you might receive the error "There is a problem with the bucket policy." If so, it is possible that your bucket policy does not enable access for the new region. For example, you might receive this error if your bucket policy supports only the us-east-1 (US East (N. Virginia)) and us-west-2 (US West (Oregon)) regions and you try to turn on your trail in ap-southeast-2 (Asia Pacific (Sydney)).

You turn on CloudTrail on a per-region basis. If you use multiple AWS regions, you can choose where log files are delivered for each region. For example, you can have a separate
Amazon S3 bucket for each region, or you can aggregate log files from all regions in a single S3 bucket. API calls for global AWS services such as AWS IAM and AWS STS are recorded
and delivered by CloudTrail along with regional events. By default, CloudTrail delivers API calls for global services in every region.
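The bucket-policy error described above comes down to the resource ARNs in the policy CloudTrail requires on its target bucket. A sketch of that documented two-statement policy, built as a plain dict (the bucket name and account ID below are placeholders):

```python
import json

# Hypothetical values for illustration.
bucket = "my-cloudtrail-logs"
account_id = "123456789012"

# The two statements CloudTrail requires on its target bucket: an ACL check
# on the bucket itself, and a write grant on the AWSLogs/<account> prefix
# with bucket-owner-full-control enforced via a condition.
policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "AWSCloudTrailAclCheck",
            "Effect": "Allow",
            "Principal": {"Service": "cloudtrail.amazonaws.com"},
            "Action": "s3:GetBucketAcl",
            "Resource": f"arn:aws:s3:::{bucket}",
        },
        {
            "Sid": "AWSCloudTrailWrite",
            "Effect": "Allow",
            "Principal": {"Service": "cloudtrail.amazonaws.com"},
            "Action": "s3:PutObject",
            "Resource": f"arn:aws:s3:::{bucket}/AWSLogs/{account_id}/*",
            "Condition": {
                "StringEquals": {"s3:x-amz-acl": "bucket-owner-full-control"}
            },
        },
    ],
}

print(json.dumps(policy, indent=2))
```

If the trail aggregates logs from several regions or accounts into one bucket, the `AWSLogs/...` resource list simply gains one entry per account; the region itself does not appear in the ARN.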





Question : To serve web traffic for a popular product, your chief financial officer and IT director have purchased 10 m1.large heavy-utilization Reserved Instances (RIs), evenly spread across two Availability Zones. Route 53 is used to deliver the traffic to an Elastic Load Balancer (ELB). After several months, the product grows even more popular and you need additional capacity. As a result, your company purchases two c3.2xlarge medium-utilization Reserved Instances. You register the two c3.2xlarge instances with your ELB and quickly find that the m1.large instances are at 100% of capacity while the c3.2xlarge instances have significant capacity that's unused. Which option is the most cost effective and uses EC2 capacity most effectively?


1. Use a separate ELB for each instance type and distribute load to the ELBs with Route 53 weighted round robin.
2. Configure an Auto Scaling group and Launch Configuration with the ELB to add up to 10 more on-demand m1.large instances when triggered by CloudWatch; shut off the c3.2xlarge instances.
3. … based routing and health checks; shut off the ELB.
4. Configure the ELB with the two c3.2xlarge instances and use an on-demand Auto Scaling group for up to two additional c3.2xlarge instances; shut off the m1.large instances.


Answer: 1

Explanation: As we can see, not all of the capacity is being used: a single ELB distributes requests without regard to instance size, so the larger instances sit idle. Route 53 offers several routing policies; configuring the right one lets load be distributed across all the nodes in proportion to their capacity.

Weighted Routing Policy
Use the weighted routing policy when you have multiple resources that perform the same function (for example, web servers that serve the same website) and you want Amazon Route 53 to
route traffic to those resources in proportions that you specify (for example, 40% to one server and 60% to the other).
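The weighted policy in option 1 can be expressed as a Route 53 change batch, here built as the dict that boto3's `route53` `change_resource_record_sets` call would accept. The hosted zone, DNS names, and weights below are made up for illustration; the weights roughly reflect the aggregate capacity of each instance pool.

```python
# Sketch of option 1: two weighted record sets with the same name, one per
# instance-type ELB. Route 53 sends each record a share of queries equal to
# its weight divided by the sum of all weights.
def weighted_record(identifier, elb_dns, weight):
    return {
        "Action": "UPSERT",
        "ResourceRecordSet": {
            "Name": "www.example.com.",
            "Type": "CNAME",
            "SetIdentifier": identifier,  # distinguishes the weighted records
            "Weight": weight,
            "TTL": 60,
            "ResourceRecords": [{"Value": elb_dns}],
        },
    }

change_batch = {
    "Comment": "Split traffic between the m1.large and c3.2xlarge ELBs",
    "Changes": [
        # 10 m1.large vs 2 c3.2xlarge: weight by each pool's rough capacity.
        weighted_record("m1-large-pool", "m1-elb.us-east-1.elb.amazonaws.com", 60),
        weighted_record("c3-2xlarge-pool", "c3-elb.us-east-1.elb.amazonaws.com", 40),
    ],
}

# With boto3 this dict would be submitted as:
# boto3.client("route53").change_resource_record_sets(
#     HostedZoneId="Z123EXAMPLE", ChangeBatch=change_batch)
```

Tuning the two weights as the pools grow or shrink is how this design keeps both instance types near full utilization.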






Related Questions


Question : You are implementing AWS Direct Connect. You intend to use AWS public service endpoints such as Amazon S3 across the AWS Direct Connect link. You want other Internet traffic to use your existing link to an Internet Service Provider. What is the correct way to configure AWS Direct Connect for access to services such as Amazon S3?
1. Configure a public Interface on your AWS Direct Connect link. Configure a static route via your AWS Direct Connect link that points to Amazon S3. Advertise a default
route to AWS using BGP.
2. Create a private interface on your AWS Direct Connect link. Configure a static route via your AWS Direct Connect link that points to Amazon S3. Configure specific routes to your network in your VPC.
3. … to AWS.
4. Create a private interface on your AWS Direct Connect link. Redistribute BGP routes into your existing routing infrastructure and advertise a default route to AWS.



Question : An administrator is using Amazon CloudFormation to deploy a three-tier web application that consists of a web tier and an application tier that will utilize Amazon DynamoDB for storage. When creating the CloudFormation template, which of the following would allow the application instance access to the DynamoDB tables without exposing API credentials?
1. Create an Identity and Access Management Role that has the required permissions to read and write from the required DynamoDB table and associate the Role to the
application instances by referencing an instance profile.
2. Use the Parameters section in the CloudFormation template to have the user input the access and secret keys from an already created IAM user that has the permissions required to read and write from the required DynamoDB table.
3. … instance profile property of the application instance.
4. Create an Identity and Access Management user in the CloudFormation template that has permissions to read and write from the required DynamoDB table, use the GetAtt function to retrieve the access and secret keys, and pass them to the application instance through user-data.
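The role-plus-instance-profile pattern in option 1 can be sketched as a CloudFormation fragment, built here as a Python dict. The resource names, table name, and AMI ID are hypothetical; the point is that the instance references a profile, so credentials are delivered by the instance metadata service and never appear in the template.

```python
import json

# Sketch of option 1 as CloudFormation resources: a role scoped to one
# DynamoDB table, an instance profile wrapping it, and an EC2 instance
# that references the profile instead of embedding any keys.
template = {
    "AWSTemplateFormatVersion": "2010-09-09",
    "Resources": {
        "AppRole": {
            "Type": "AWS::IAM::Role",
            "Properties": {
                "AssumeRolePolicyDocument": {
                    "Version": "2012-10-17",
                    "Statement": [{
                        "Effect": "Allow",
                        "Principal": {"Service": "ec2.amazonaws.com"},
                        "Action": "sts:AssumeRole",
                    }],
                },
                "Policies": [{
                    "PolicyName": "DynamoReadWrite",
                    "PolicyDocument": {
                        "Version": "2012-10-17",
                        "Statement": [{
                            "Effect": "Allow",
                            "Action": ["dynamodb:GetItem", "dynamodb:PutItem",
                                       "dynamodb:Query", "dynamodb:UpdateItem"],
                            # Hypothetical table name "AppTable".
                            "Resource": {"Fn::Sub": "arn:aws:dynamodb:${AWS::Region}:${AWS::AccountId}:table/AppTable"},
                        }],
                    },
                }],
            },
        },
        "AppInstanceProfile": {
            "Type": "AWS::IAM::InstanceProfile",
            "Properties": {"Roles": [{"Ref": "AppRole"}]},
        },
        "AppInstance": {
            "Type": "AWS::EC2::Instance",
            "Properties": {
                "ImageId": "ami-12345678",  # placeholder AMI
                "InstanceType": "t2.micro",
                # The instance gets temporary, auto-rotated credentials via
                # the metadata service; no access keys exist in the template.
                "IamInstanceProfile": {"Ref": "AppInstanceProfile"},
            },
        },
    },
}

print(json.dumps(template)[:60])
```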


Question : Your company has an on-premises multi-tier PHP web application, which recently experienced downtime due to a large burst in web traffic following a company announcement. Over the coming days you are expecting similar announcements to drive similar unpredictable bursts, and you are looking for ways to quickly improve your infrastructure's ability to handle unexpected increases in traffic. The application currently consists of two tiers: a web tier, which consists of a load balancer and several Linux Apache web servers, and a database tier, which consists of a Linux server hosting a MySQL database. Which scenario below will provide full site functionality while helping to improve the ability of your application in the short timeframe required?
1. Offload traffic from the on-premises environment: set up a CloudFront distribution and configure CloudFront to cache objects from a custom origin. Choose to customize your object cache behavior, and select a TTL for which objects should exist in the cache.
2. Migrate to AWS: use VM Import/Export to quickly convert an on-premises web server to an AMI, and create an Auto Scaling group, which uses the imported AMI to scale the web tier based on incoming traffic. Create an RDS read replica and set up replication between the RDS instance and the on-premises MySQL server to migrate the database.
3. … failover to the S3-hosted website.
4. Hybrid environment: create an AMI which can be used to launch web servers in EC2, and create an Auto Scaling group which uses the AMI to scale the web tier based on incoming traffic. Leverage Elastic Load Balancing to balance traffic between on-premises web servers and those hosted in AWS.



Question : You're running an application on-premises due to its dependency on non-x86 hardware and want to use AWS for data backup. Your backup application is only able to write to POSIX-compatible block-based storage. You have 140TB of data and would like to mount it as a single folder on your file server. Users must be able to access portions of this data while the backups are taking place. What backup solution would be most appropriate for this use case?
1. Use Storage Gateway and configure it to use Gateway Cached volumes.
2. Configure your backup software to use S3 as the target for your data backups.
3. …
4. Use Storage Gateway and configure it to use Gateway Stored volumes.


Question : An enterprise wants to use a third-party SaaS application. The SaaS application needs to have access to issue several API commands to discover Amazon EC2 resources running within the enterprise's account. The enterprise has internal security policies requiring that any outside access to their environment conform to the principle of least privilege, and there must be controls in place to ensure that the credentials used by the SaaS vendor cannot be used by any other third party. Which of the following would meet all of these conditions?

1. From the AWS Management Console, navigate to the Security Credentials page and retrieve the access and secret key for your account.
2. Create an IAM user within the enterprise account; assign a user policy to the IAM user that allows only the actions required by the SaaS application; create a new access and secret key for the user and provide these credentials to the SaaS provider.
3. … SaaS application.
4. Create an IAM role for EC2 instances, assign it a policy that allows only the actions required for the SaaS application to work, and provide the role ARN to the SaaS provider to use when launching their application instances.
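The least-privilege pattern AWS documents for this scenario is a cross-account role whose trust policy pins both the vendor's account and an ExternalId, so the role cannot be assumed on behalf of any other customer (the confused-deputy protection). A sketch of the two policy documents involved; the account ID and ExternalId below are placeholders:

```python
import json

# Hypothetical SaaS vendor account and the ExternalId agreed with them.
vendor_account = "999988887777"
external_id = "example-external-id"

# Trust policy: only the vendor's account, presenting this ExternalId,
# may assume the role. No long-lived keys are ever handed out.
trust_policy = {
    "Version": "2012-10-17",
    "Statement": [{
        "Effect": "Allow",
        "Principal": {"AWS": f"arn:aws:iam::{vendor_account}:root"},
        "Action": "sts:AssumeRole",
        "Condition": {"StringEquals": {"sts:ExternalId": external_id}},
    }],
}

# Permissions policy: least privilege, allowing only the read-only
# discovery calls the SaaS application needs (an illustrative subset).
permissions_policy = {
    "Version": "2012-10-17",
    "Statement": [{
        "Effect": "Allow",
        "Action": ["ec2:DescribeInstances", "ec2:DescribeTags"],
        "Resource": "*",
    }],
}

print(json.dumps(trust_policy))
```

The vendor then calls `sts:AssumeRole` with the role ARN and ExternalId to obtain short-lived credentials scoped to exactly these actions.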



Question : Your company is getting ready to do a major public announcement of a social media site on AWS. The website is running on EC2 instances deployed across multiple Availability Zones with a Multi-AZ RDS MySQL Extra Large DB Instance. The site performs a high number of small reads and writes per second and relies on an eventual consistency model. After comprehensive tests you discover that there is read contention on RDS MySQL. Which are the best approaches to meet these requirements? (Choose 2 answers)

1. Deploy an ElastiCache in-memory cache running in each Availability Zone
2. Implement sharding to distribute load to multiple RDS MySQL instances
3. …
4. Add an RDS MySQL read replica in each availability zone
1. 1,2
2. 2,3
3. …
4. 1,4