T15may2024 Exam


AWS Real Exam Questions

- By JD
Note : "Real exam questions" does not mean these exact questions will appear in the exam, but it means
there is a high probability that you will get the SAME or SIMILAR questions.

1. Which AWS services should be used for read/write of constantly changing data? Choose two.
Answer : Amazon RDS & Amazon EFS
2. What is one of the advantages of the Amazon Relational Database Service (Amazon RDS)?
Answer : It simplifies relational database administration tasks
3. A customer needs to run a MySQL database that easily scales. Which AWS service should they use?
Answer : Amazon Aurora
4. Which of the following components of the AWS Global infrastructure consist of one or more
discrete data centers interconnected through low latency links?
Answer : Availability Zone.
5. Which of the following is a shared control between the customer and AWS?
Answer : Awareness and training.
6. How many Availability zones should compute resources be provisioned across to achieve high
availability?
Answer : A minimum of Two
7. What is one of the advantages of moving infrastructure from an on-premises data center to the AWS Cloud?
Answer : It allows the business to focus on business activities.
8. What is the lowest cost, durable storage option for retaining database backups for immediate
retrieval?
Answer : Amazon S3
9. Which of the following is a fast and reliable NoSQL Database service?
Answer : Amazon DynamoDB
10. What is an example of agility in the AWS Cloud?
Answer : Decreased acquisition time for new compute resources.
11. Which service should a customer use to consolidate and centrally manage multiple AWS accounts?
Answer : AWS Organizations
12. What approach to transcoding a large number of individual video files adheres to AWS architecture
principles?
Answer : Using many instances in parallel
13. For which auditing process does AWS have sole responsibility?
Answer : Physical Security
14. Which feature of the AWS cloud will support an international company’s requirement for low
latency to all of its customers?
Answer : Global reach
15. Which of the following is the customer’s responsibility under the AWS Shared Responsibility
Model?
Answer : Patching Amazon EC2 instances
16. A customer is using multiple AWS accounts with separate billing. How can the customer take
advantage of volume discounts with minimal impact to the AWS resources?
Answer : Use the Consolidated billing feature from AWS Organizations.
17. Which of the following are features of Amazon CloudWatch Logs? (Choose Two)
Answer : Real-time monitoring & adjustable retention
18. Which of the following is an AWS managed Domain Name System (DNS) web service?
Answer : Amazon Route 53
19. A customer is deploying a new application and needs to choose an AWS region. Which of the
following factors could influence the customer's decision? (Choose Two)
Answer : Reduced latency & Data sovereignty compliance
20. Which storage service can be used as a low cost option for hosting static websites?
Answer : Amazon Simple Storage Service (Amazon S3)
21. Which Amazon EC2 instance pricing model can provide discounts of up to 90%?
Answer : Spot instances
22. Web servers running on Amazon EC2 access a legacy application running in a corporate data center.
What term would describe this model?
Answer : Hybrid Architecture
23. What is the benefit of using AWS managed services, such as Amazon ElastiCache and Amazon
Relational Database Service (Amazon RDS)?
Answer : They have better performance than customer-managed services.
24. Which service provides a virtually unlimited amount of online highly durable object storage?
Answer : Amazon S3
25. Which of the following Identity and Access Management (IAM) entities is associated with an access key ID
and secret access key when using AWS command line interface (AWS CLI)?
Answer : IAM user
26. Which of the following security related services does AWS offer? (Choose Two)
Answer : AWS Trusted Advisor security checks & Data encryption
27. Which AWS managed service is used to host databases?
Answer : Amazon RDS
28. Which AWS service provides a simple and scalable shared file storage solution for use with Linux-
based AWS and on-premises servers?
Answer : Amazon EFS
29. When architecting cloud applications, which of the following is a key design principle?
Answer : Implementing Elasticity
30. Which AWS service should be used for long term, low cost storage of data backups?
Answer : Amazon Glacier
31. Under the shared responsibility model, which of the following is a shared control between a
customer and AWS?
Answer : Patch Management
32. Which AWS service allows companies to connect an Amazon VPC to an on premises data center?
Answer : AWS Direct Connect
33. A company wants to reduce the physical compute footprint that developers use to run code. Which
service would meet that need by enabling serverless architecture?
Answer : AWS Lambda
34. Which task is AWS responsible for in the shared responsibility model for security and compliance?
Answer : Updating Amazon EC2 host firmware
35. Where should a company go to search software listings from independent software vendors to find,
test, buy and deploy software that runs on AWS?
Answer : AWS Marketplace
36. Which of the following is a benefit of using the AWS cloud?
Answer : Ability to focus on revenue-generating activities
37. When performing a cost analysis that supports physical isolation of a customer workload, which
compute hosting model should be accounted for in the Total Cost of Ownership (TCO)?
Answer : Dedicated Hosts
38. Which AWS service provides the ability to manage infrastructure as code?
Answer : AWS CloudFormation
39. If a customer needs to audit the change management of AWS resources, which of the following
AWS services should the customer use?
Answer : AWS Config
40. Which service allows a company with multiple AWS accounts to combine its usage to obtain volume
discounts?
Answer : AWS Organizations
41. Which of the following services could be used to deploy an application to servers running on
premises? (Choose two)
Answer : AWS OpsWorks & AWS CodeDeploy
42. Which Amazon EC2 pricing model adjusts based on supply and demand of EC2 instances?
Answer : Spot Instances
43. Which design principles for cloud architecture are recommended when re-architecting a large
monolithic application? (Choose two)
Answer : Implement Loose Coupling & Design for scalability
44. Which is the minimum AWS support plan that allows for one hour target response time for support
cases?
Answer : Business
45. Where can AWS compliance and certification reports be downloaded?
Answer : AWS Artifact
46. Which of the following is an advantage of consolidated billing on AWS?
Answer : Volume pricing qualification
47. Which of the following AWS features enables a user to launch a pre-configured Amazon Elastic
Compute Cloud (Amazon EC2) instance?
Answer: Amazon Machine Image (AMI)
48. How would an AWS customer easily apply common access controls to a large set of users?
Answer : Apply an IAM policy to an IAM group
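
Note: for reference, a rough sketch of this approach using the AWS SDK for Python (boto3); the group name, policy ARN,
and user names are placeholders for illustration, not values from the question.

import boto3

iam = boto3.client("iam")

# Create a group and attach one managed policy to it; every user added to the
# group inherits that policy, so common access controls are applied in one place.
iam.create_group(GroupName="developers")
iam.attach_group_policy(
    GroupName="developers",
    PolicyArn="arn:aws:iam::aws:policy/ReadOnlyAccess",
)

# Add users to the group instead of attaching policies to each user individually.
for user in ["alice", "bob"]:
    iam.add_user_to_group(GroupName="developers", UserName=user)
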
49. What technology enables compute capacity to adjust as loads change?
Answer : Auto Scaling
50. Which AWS services are defined as global instead of regional? (Choose Two)
Answer : Amazon Route 53 & Amazon CloudFront
51. Which AWS service would you use to obtain compliance reports and certificates?
Answer : AWS Artifact
52. Under the shared responsibility model, which of the following tasks are the responsibility of the AWS
customer? (Choose two)
Answer : Ensuring that application data is encrypted at rest & Ensuring that users have received
security training in the use of AWS services.
53. Which AWS service can be used to manually launch instances based on resource requirements?
Answer : Amazon EC2
54. A company is migrating an application that runs non-interruptible workloads for a three-year
time frame. Which pricing construct would provide the most cost-effective solution?
Answer : Amazon EC2 Reserved instances
55. The financial benefits of using AWS are: (Choose Two)
Answer : Reduced Total Cost of Ownership (TCO) & Reduced Operational Expenditure (OpEx)
56. Which AWS Cost Management Tool allows you to view the most granular data about your AWS bill?
Answer : AWS Cost and Usage Report
57. Which of the following can an AWS customer use to launch a new Amazon Relational Database
Service (Amazon RDS) cluster? (Choose Two)
Answer : AWS CloudFormation & AWS Management Console
58. Which of the following is an AWS Cloud architecture design principle?
Answer : Implement Loose coupling
59. Which of the following security measures protect access to an AWS account? (Choose Two)
Answer : Grant least privilege access to IAM users & Activate multi-factor authentication (MFA) for
privileged users.
60. Which service provides a hybrid storage service that enables on-premises applications to seamlessly
use cloud storage?
Answer : AWS Storage Gateway
61. Which of the following services falls under the responsibility of the customer to maintain operating
system configuration, security patching and networking?
Answer : Amazon EC2
62. Which of the following is an important architectural design principle when designing cloud
applications?
Answer : Use multiple Availability Zones.
63. Which AWS support plan includes a dedicated Technical Account Manager?
Answer : Enterprise
64. Amazon Relational Database Service (Amazon RDS) offers which of the following benefits over
traditional database management?
Answer : AWS manages the maintenance of the operating system
65. Which service is best for storing common database query results, which helps to alleviate database
access load?
Answer : Amazon ElastiCache
66. Which of the following is a component of the shared responsibility model managed entirely by
AWS?
Answer : Auditing physical data center assets
67. If each department within a company has its own AWS account, what is one way to enable
consolidated billing?
Answer : Create an AWS Organization from the payer account and invite the other accounts to join.
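
Note: a minimal sketch with boto3, assuming it is run from the intended payer (management) account; the account IDs
are placeholders.

import boto3

org = boto3.client("organizations")

# Create the organization from the payer account, then invite each department
# account so their usage rolls up into a single consolidated bill.
org.create_organization(FeatureSet="ALL")

for account_id in ["111111111111", "222222222222"]:  # placeholder account IDs
    org.invite_account_to_organization(
        Target={"Id": account_id, "Type": "ACCOUNT"}
    )
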
68. Which AWS services can be used to gather information about AWS account activity? (Choose Two.)
Answer : AWS CloudTrail & Amazon CloudWatch
69. In which scenario should Amazon EC2 Spot Instances be used?
Answer : A company has a number of infrequent, interruptible jobs that are currently using On-
Demand instances.
70. Which AWS feature should a customer leverage to achieve high availability of an application?
Answer : Availability Zones
71. Which is the minimum AWS support plan that includes Infrastructure Event Management without
additional costs?
Answer : Business
72. Which AWS service can serve a static website?
Answer : Amazon S3
73. How does AWS shorten the time to provision IT resources?
Answer : It provides the ability to programmatically provision existing resources.
74. What can AWS edge locations be used for? (Choose Two)
Answer : Delivering content closer to users & Reducing traffic on the server by caching resources
75. Which of the following can limit Amazon Simple Storage Service (Amazon S3) bucket access to
specific users?
Answer : AWS Identity and Access Management (IAM) Policies
76. A solution that is able to support growth in users, traffic, or data size with no drop in performance
aligns with which cloud architecture principle?
Answer : Implement elasticity
77. A company will be moving from an on-premises data center to the AWS Cloud. What would be one
financial difference after the move?
Answer : Moving from upfront capital expense (CapEx) to variable operational expense (OpEx)
78. How should a customer forecast the future costs for running a new web application?
Answer : AWS Simple Monthly Calculator
79. Which is the minimum AWS support plan that provides technical support through phone calls?
Answer : Business
80. Which of the following tasks is the responsibility of AWS?
Answer : Securing the EC2 hypervisor
81. One benefit of on-demand Amazon Elastic Compute Cloud (Amazon EC2) pricing is :
Answer : paying only for time used.
82. An administrator needs to rapidly deploy a popular IT solution and start using it immediately.
Where can the administrator find assistance?
Answer : AWS Quick Start reference deployments.
83. A start-up organization is using the Cost Explorer tool to view and analyze its costs and usage.
Which of the following statements are correct with regard to the Cost Explorer tool? (Select TWO)
Answer : Provides usage-based forecasting & Provides trends that you can use to understand your
costs
84. The project team requires an AWS service that provides a file system that can be mounted simultaneously from
different EC2 instances. Which AWS service will satisfy this requirement?
Answer : Amazon EFS
85. Which of the following statements is incorrect with regard to the advantages of moving to the cloud?
Answer : Trade variable expense for capital expense
86. A project team enhancing the security features of a banking application needs to implement a
threat detection service that continuously monitors for malicious activity and unauthorized behavior
to protect AWS accounts, workloads, and data stored in Amazon S3. Which AWS service should
the project team select?
Answer : Amazon GuardDuty
87. Which of the following support plans offer 24/7 technical support via phone, email, and chat access
to Cloud Support Engineers? (Select TWO.)
Answer : Business & Enterprise
88. Which AWS product provides a unified user interface, enabling easy management of software
development activities in one place, along with quick development, build, and deployment of
applications on AWS?
Answer : AWS CodeStar
89. __________________ automates the discovery of sensitive data at scale and lowers the cost of
protecting your data using machine learning and pattern matching techniques.
Answer : Amazon Macie.
90. Security and Compliance is a shared responsibility between AWS and the customer. Which
of the options below are AWS responsibilities? (Select TWO.)
Answer : Security of the cloud & Patch management within the infrastructure.
91. Based on the AWS Well-Architected Framework, how should a start-up company with a dynamic
AWS environment manage its users and resources securely without affecting the cost? (Select
TWO)
Answer : Create multiple unique IAM users with administrator access for each functional group of
the company & use AWS CloudFormation template versions and revision controls to keep track of the
dynamic configuration changes.
92. Which pillar of the AWS Well-Architected Framework places emphasis on making informed
decisions on the backdrop of processed data?
Answer : Operational excellence pillar
93. In the AWS environment, using an EC2 instance, what is the difference between metadata and user
data?
Answer : Instance metadata is the set of parameters and attributes defined in the instance
configuration, whilst user data is the information passed to the instance’s operating system to be
executed automatically while the instance launches.
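
Note: a small sketch that shows the difference in practice, assuming it runs on an EC2 instance and that the third-party
requests package is installed; both values come from the instance metadata service (IMDSv2).

import requests

IMDS = "http://169.254.169.254/latest"

# IMDSv2: fetch a short-lived session token first, then use it for the calls below.
token = requests.put(
    f"{IMDS}/api/token",
    headers={"X-aws-ec2-metadata-token-ttl-seconds": "21600"},
).text
headers = {"X-aws-ec2-metadata-token": token}

# Instance metadata: attributes of the instance itself (ID, type, AZ, and so on).
instance_id = requests.get(f"{IMDS}/meta-data/instance-id", headers=headers).text

# User data: the script or configuration supplied at launch (returns 404 if none was set).
user_data = requests.get(f"{IMDS}/user-data", headers=headers).text

print(instance_id)
print(user_data)
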
94. An administrator would like to install and run the same CloudWatch Agent configuration on ten
Amazon EC2 instances to collect custom metrics from them. What is the most efficient method to
achieve this objective?
Answer : Install and configure the CloudWatch Agent on one of the EC2 instances, then write the
CloudWatch Agent configuration to the parameter store of AWS Systems Manager (SSM). Install the
CloudWatch Agent configuration from SSM onto the other nine EC2 instances.
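
Note: a rough boto3 sketch of that answer, assuming the AmazonCloudWatch-ManageAgent SSM document is used to apply
the configuration; the parameter name, instance IDs, and the agent configuration itself are placeholders.

import json
import boto3

ssm = boto3.client("ssm")

# Store the agent configuration (exported from the first instance) centrally in Parameter Store.
agent_config = {"metrics": {"metrics_collected": {"mem": {"measurement": ["used_percent"]}}}}
ssm.put_parameter(
    Name="AmazonCloudWatch-linux",           # placeholder parameter name
    Type="String",
    Value=json.dumps(agent_config),
    Overwrite=True,
)

# Tell the agent on the remaining nine instances to load that configuration from SSM.
ssm.send_command(
    InstanceIds=["i-0123456789abcdef0"],      # placeholder instance IDs
    DocumentName="AmazonCloudWatch-ManageAgent",
    Parameters={
        "action": ["configure"],
        "mode": ["ec2"],
        "optionalConfigurationSource": ["ssm"],
        "optionalConfigurationLocation": ["AmazonCloudWatch-linux"],
        "optionalRestart": ["yes"],
    },
)
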
95. A group of non-tech-savvy friends are looking to set up a website for an upcoming event at a cost-
effective price, with a novice-friendly interface. Which AWS service is the most appropriate to use?
Answer : Use AWS Marketplace to install a ready-made WordPress AMI.

96. Which of the following accurately describes a typical use case in which the AWS CodePipeline
service can be utilized?
Answer : To orchestrate and automate the various phases involved in the release of application
updates in line with a predefined release model.
NEW QUESTION 1
A company wants to run a gaming application on Amazon EC2 instances that are part of an Auto Scaling group in the AWS
Cloud. The application will transmit data by using UDP packets. The company wants to ensure that the application can
scale out and in as traffic increases and decreases.
What should a solutions architect do to meet these requirements?

A. Attach a Network Load Balancer to the Auto Scaling group


B. Attach an Application Load Balancer to the Auto Scaling group.
C. Deploy an Amazon Route 53 record set with a weighted policy to route traffic appropriately
D. Deploy a NAT instance that is configured with port forwarding to the EC2 instances in the Auto Scaling group.

Answer: A

NEW QUESTION 2
A company is developing a file-sharing application that will use an Amazon S3 bucket for storage. The company wants to
serve all the files through an Amazon CloudFront distribution. The company does not want the files to be accessible through
direct navigation to the S3 URL.
What should a solutions architect do to meet these requirements?

A. Write individual policies for each S3 bucket to grant read permission for only CloudFront access.
B. Create an IAM user.
C. Grant the user read permission to objects in the S3 bucket.
D. Assign the user to CloudFront.
E. Write an S3 bucket policy that assigns the CloudFront distribution ID as the Principal and assigns the target S3 bucket as
the Amazon Resource Name (ARN).
F. Create an origin access identity (OAI). Assign the OAI to the CloudFront distribution.
G. Configure the S3 bucket permissions so that only the OAI has read permission.

Answer: D

Explanation:
https://aws.amazon.com/premiumsupport/knowledge-center/cloudfront-access-to-amazon-s3/
https://docs.aws.amazon.com/AmazonCloudFront/latest/DeveloperGuide/private-content-restricting-access-to-s3
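
For illustration, a minimal boto3 sketch of the bucket policy that restricts reads to a CloudFront origin access identity; the
bucket name and OAI ID are placeholders.

import json
import boto3

s3 = boto3.client("s3")

BUCKET = "example-file-sharing-bucket"        # placeholder bucket name
OAI_ID = "E2EXAMPLE1OAIID"                    # placeholder origin access identity ID

# Grant s3:GetObject only to the CloudFront OAI, so objects are reachable through
# the distribution but not through direct navigation to the S3 URL.
policy = {
    "Version": "2012-10-17",
    "Statement": [{
        "Effect": "Allow",
        "Principal": {
            "AWS": f"arn:aws:iam::cloudfront:user/CloudFront Origin Access Identity {OAI_ID}"
        },
        "Action": "s3:GetObject",
        "Resource": f"arn:aws:s3:::{BUCKET}/*",
    }],
}

s3.put_bucket_policy(Bucket=BUCKET, Policy=json.dumps(policy))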

NEW QUESTION 3
A company has two applications: a sender application that sends messages with payloads to be processed and a
processing application intended to receive the messages with payloads. The company wants to implement an AWS service
to handle messages between the two applications. The sender application can send about 1,000 messages each hour. The
messages may take up to 2 days to be processed. If the messages fail to process, they must be retained so that they do
not impact the processing of any remaining messages.
Which solution meets these requirements and is the MOST operationally efficient?

A. Set up an Amazon EC2 instance running a Redis database.


B. Configure both applications to use the instance.
C. Store, process, and delete the messages, respectively.
D. Use an Amazon Kinesis data stream to receive the messages from the sender application.
E. Integrate the processing application with the Kinesis Client Library (KCL).
F. Integrate the sender and processor applications with an Amazon Simple Queue Service (Amazon SQS) queue.
G. Configure a dead-letter queue to collect the messages that failed to process.
H. Subscribe the processing application to an Amazon Simple Notification Service (Amazon SNS) topic to receive
notifications to process.
I. Integrate the sender application to write to the SNS topic.

Answer: C

Explanation:
https://aws.amazon.com/blogs/compute/building-loosely-coupled-scalable-c-applications-with-amazon-sqs-and-
https://docs.aws.amazon.com/AWSSimpleQueueService/latest/SQSDeveloperGuide/sqs-dead-letter-queues.html
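
For illustration, a minimal boto3 sketch of an SQS queue paired with a dead-letter queue via a redrive policy; the queue
names, retention period, and maxReceiveCount are placeholder choices.

import json
import boto3

sqs = boto3.client("sqs")

# Dead-letter queue that collects messages that repeatedly fail to process.
dlq_url = sqs.create_queue(QueueName="payload-dlq")["QueueUrl"]
dlq_arn = sqs.get_queue_attributes(
    QueueUrl=dlq_url, AttributeNames=["QueueArn"]
)["Attributes"]["QueueArn"]

# Main queue: retain messages for 4 days (covering the 2-day processing window)
# and move a message to the DLQ after 5 failed receives.
sqs.create_queue(
    QueueName="payload-queue",
    Attributes={
        "MessageRetentionPeriod": str(4 * 24 * 3600),
        "RedrivePolicy": json.dumps(
            {"deadLetterTargetArn": dlq_arn, "maxReceiveCount": "5"}
        ),
    },
)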

NEW QUESTION 4
A company has created an image analysis application in which users can upload photos and add photo frames to their
images. The users upload images and metadata to indicate which photo frames they want to add to their images. The
application uses a single Amazon EC2 instance and Amazon DynamoDB to store the metadata.
The application is becoming more popular, and the number of users is increasing. The company expects the number of
concurrent users to vary significantly depending on the time of day and day of week. The company must ensure that the
application can scale to meet the needs of the growing user base.
Which solution meets these requirements?

A. Use AWS Lambda to process the photos.


B. Store the photos and metadata in DynamoDB.
C. Use Amazon Kinesis Data Firehose to process the photos and to store the photos and metadata.
D. Use AWS Lambda to process the photos.
E. Store the photos in Amazon S3. Retain DynamoDB to store the metadata.
F. Increase the number of EC2 instances to three.
G. Use Provisioned IOPS SSD (io2) Amazon Elastic Block Store (Amazon EBS) volumes to store the photos and metadata.

Answer: A

NEW QUESTION 5
A company runs its two-tier ecommerce website on AWS. The web tier consists of a load balancer that sends traffic to
Amazon EC2 instances. The database tier uses an Amazon RDS DB instance. The EC2 instances and the RDS DB instance
should not be exposed to the public internet. The EC2 instances require internet
access to complete payment processing of orders through a third-party web service.
The application must be highly available. Which combination of configuration
options will meet these requirements? (Choose two.)

A. Use an Auto Scaling group to launch the EC2 instances in private subnets.
B. Deploy an RDS Multi-AZ DB instance in private subnets.
C. Configure a VPC with two private subnets and two NAT gateways across two Availability Zones. Deploy an Application
Load Balancer in the private subnets.
D. Use an Auto Scaling group to launch the EC2 instances in public subnets across two Availability Zones. Deploy an RDS
Multi-AZ DB instance in private subnets.
E. Configure a VPC with one public subnet, one private subnet, and two NAT gateways across two Availability Zones.
F. Deploy an Application Load Balancer in the public subnet.
G. Configure a VPC with two public subnets, two private subnets, and two NAT gateways across two Availability Zones.
H. Deploy an Application Load Balancer in the public subnets.

Answer: AE

Explanation:
Before you begin: Decide which two Availability Zones you will use for your EC2 instances. Configure your
virtual private cloud (VPC) with at least one public subnet in each of these Availability Zones. These public subnets are used
to configure the load balancer. You can launch your EC2 instances in other subnets of these Availability Zones instead.

NEW QUESTION 6
A company wants to migrate its on-premises data center to AWS. According to the company's compliance
requirements, the company can use only the ap- northeast-3 Region. Company administrators are not permitted
to connect VPCs to the internet.
Which solutions will meet these requirements? (Choose two.)

A. Use AWS Control Tower to implement data residency guardrails to deny internet access and deny access to all AWS
Regions except ap-northeast-3.
B. Use rules in AWS WAF to prevent internet access.
C. Deny access to all AWS Regions except ap-northeast-3 in the AWS account settings.
D. Use AWS Organizations to configure service control policies (SCPs) that prevent VPCs from gaining internet access.
E. Deny access to all AWS Regions except ap-northeast-3.
F. Create an outbound rule for the network ACL in each VPC to deny all traffic from 0.0.0.0/0. Create an IAM policy for
each user to prevent the use of any AWS Region other than ap-northeast-3.
G. Use AWS Config to activate managed rules to detect and alert for internet gateways and to detect and alert
for new resources deployed outside of ap- northeast-3.

Answer: AC

NEW QUESTION 7
A company wants to manage Amazon Machine Images (AMIs). The company currently copies AMIs to the same AWS Region
where the AMIs were created. The company needs to design an application that captures AWS API calls and sends alerts
whenever the Amazon EC2 CreateImage API operation is called within the company's account.
Which solution will meet these requirements with the LEAST operational overhead?

A. Create an AWS Lambda function to query AWS CloudTrail logs and to send an alert when a CreateImage API call is
detected.
B. Configure AWS CloudTrail with an Amazon Simple Notification Service (Amazon SNS) notification that occurs when
updated logs are sent to Amazon S3. Use Amazon Athena to create a new table and to query on CreateImage when an
API call is detected.
C. Create an Amazon EventBridge (Amazon CloudWatch Events) rule for the CreateImage API call. Configure the target as
an Amazon Simple Notification Service (Amazon SNS) topic to send an alert when a CreateImage API call is detected.
D. Configure an Amazon Simple Queue Service (Amazon SQS) FIFO queue as a target for AWS CloudTrail logs.
E. Create an AWS Lambda function to send an alert to an Amazon Simple Notification Service (Amazon SNS) topic when a
CreateImage API call is detected.

Answer: B

NEW QUESTION 8
A company hosts an application on multiple Amazon EC2 instances. The application processes messages from an Amazon
SQS queue, writes to an Amazon RDS table, and deletes the message from the queue. Occasional duplicate records are
found in the RDS table. The SQS queue does not contain any duplicate messages.
What should a solutions architect do to ensure messages are processed only once?

A. Use the CreateQueue API call to create a new queue


B. Use the Add Permission API call to add appropriate permissions
C. Use the ReceiveMessage API call to set an appropriate wait time
D. Use the ChangeMessageVisibility API call to increase the visibility timeout

Answer: D

Explanation:
The visibility timeout begins when Amazon SQS returns a message. During this time, the consumer processes and deletes
the message. However, if the consumer fails before deleting the message and your system doesn't call the DeleteMessage
action for that message before the visibility timeout expires, the message becomes visible to other consumers and the
message is received again. If a message must be received only once, your consumer should delete it within the duration of
the visibility timeout.
https://docs.aws.amazon.com/AWSSimpleQueueService/latest/SQSDeveloperGuide/sqs-visibility-timeout.html
Keyword: the SQS queue writes to an Amazon RDS table. From this, Option D is the best fit and the other options are ruled
out [Option A - you can't just introduce one more queue alongside the existing one; Option B - only adds permissions;
Option C - only retrieves messages]. FIFO queues are designed to never introduce duplicate messages.
However, your message producer might introduce duplicates in certain scenarios: for example, if the producer sends a
message, does not receive a response, and then resends the same message. Amazon SQS APIs provide deduplication
functionality that prevents your message producer from sending duplicates. Any duplicates introduced by the message
producer are removed within a 5-minute deduplication interval. For standard queues, you might occasionally receive a
duplicate copy of a message (at-least-once delivery). If you use a standard queue, you must design your applications to be
idempotent (that is, they must not be affected adversely when processing the same message more than once).
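
For illustration, a sketch of a consumer that extends the visibility timeout while it works and deletes the message only after
a successful write; the queue URL is a placeholder and process() stands in for the real RDS insert.

import boto3

sqs = boto3.client("sqs")
QUEUE_URL = "https://sqs.us-east-1.amazonaws.com/123456789012/example-queue"  # placeholder

def process(body):
    """Placeholder for the real work that writes the record to the RDS table."""

while True:
    resp = sqs.receive_message(QueueUrl=QUEUE_URL, MaxNumberOfMessages=1, WaitTimeSeconds=20)
    for msg in resp.get("Messages", []):
        # Extend the visibility timeout so the message does not become visible to
        # (and get reprocessed by) another consumer while this one is still working.
        sqs.change_message_visibility(
            QueueUrl=QUEUE_URL,
            ReceiptHandle=msg["ReceiptHandle"],
            VisibilityTimeout=300,
        )
        process(msg["Body"])
        # Delete only after successful processing so each message is handled once.
        sqs.delete_message(QueueUrl=QUEUE_URL, ReceiptHandle=msg["ReceiptHandle"])
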
NEW QUESTION 9
A company collects temperature, humidity, and atmospheric pressure data in cities across multiple continents. The average
volume of data collected per site each day is 500 GB. Each site has a high-speed internet connection. The company's
weather forecasting applications are based in a single Region and analyze the data daily. What is the FASTEST way to
aggregate data from all of these global sites?

A. Enable Amazon S3 Transfer Acceleration on the destination bucket.


B. Use multipart uploads to directly upload site data to the destination bucket.
C. Upload site data to an Amazon S3 bucket in the closest AWS Region.
D. Use S3 cross-Region replication to copy objects to the destination bucket.
E. Schedule AWS Snowball jobs daily to transfer data to the closest AWS Region.
F. Use S3 cross-Region replication to copy objects to the destination bucket.
G. Upload the data to an Amazon EC2 instance in the closest Region.
H. Store the data in an Amazon Elastic Block Store (Amazon EBS) volume.
I. Once a day take an EBS snapshot and copy it to the centralized Region.
J. Restore the EBS volume in the centralized Region and run an analysis on the data daily.

Answer: A

Explanation:
You might want to use Transfer Acceleration on a bucket for various reasons, including the following:
You have customers that upload to a centralized bucket from all over the world. You transfer gigabytes to terabytes of data
on a regular basis across continents. You are unable to utilize all of your available bandwidth over the Internet when
uploading to Amazon S3.
https://docs.aws.amazon.com/AmazonS3/latest/dev/transfer-acceleration.html
https://aws.amazon.com/s3/transferacceleration/#:~:text=S3%20Transfer%20Acceleration%20(S3TA)%20reduces,to%20S3%20for%20remote%20applications: "Amazon S3 Transfer Acceleration can speed up content transfers to and from Amazon S3 by as much as 50-500% for long-distance transfer of larger objects. Customers who have either web or mobile applications with widespread users or applications hosted far away from their S3 bucket can experience long and variable upload and download speeds over the Internet"
https://docs.aws.amazon.com/AmazonS3/latest/userguide/mpuoverview.html "Improved throughput - You can upload parts in parallel to improve throughput."
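
For illustration, a boto3 sketch that enables Transfer Acceleration on the destination bucket and uploads through the
accelerate endpoint (multipart is handled automatically above the threshold); bucket and file names are placeholders.

import boto3
from boto3.s3.transfer import TransferConfig
from botocore.config import Config

BUCKET = "central-weather-data"               # placeholder destination bucket

# One-time setup: enable Transfer Acceleration on the destination bucket.
boto3.client("s3").put_bucket_accelerate_configuration(
    Bucket=BUCKET,
    AccelerateConfiguration={"Status": "Enabled"},
)

# At each site: upload through the accelerate endpoint; parts above the threshold
# are uploaded in parallel, which improves throughput over long distances.
s3 = boto3.client("s3", config=Config(s3={"use_accelerate_endpoint": True}))
s3.upload_file(
    "site_readings.parquet",                  # placeholder local file
    BUCKET,
    "sites/site-001/site_readings.parquet",
    Config=TransferConfig(multipart_threshold=64 * 1024 * 1024, max_concurrency=8),
)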

NEW QUESTION 10
A company is designing an application. The application uses an AWS Lambda function to receive information through
Amazon API Gateway and to store the information in an Amazon Aurora PostgreSQL database.
During the proof-of-concept stage, the company has to increase the Lambda quotas significantly to handle the high
volumes of data that the company needs to load into the database. A solutions architect must recommend a new design
to improve scalability and minimize the configuration effort.
Which solution will meet these requirements?

A. Refactor the Lambda function code to Apache Tomcat code that runs on Amazon EC2 instances. Connect the
database by using native Java Database Connectivity (JDBC) drivers.
B. Change the platform from Aurora to Amazon DynamoDB.
C. Provision a DynamoDB Accelerator (DAX) cluster.
D. Use the DAX client SDK to point the existing DynamoDB API calls at the DAX cluster.
E. Set up two Lambda functions.
F. Configure one function to receive the information.
G. Configure the other function to load the information into the database.
H. Integrate the Lambda functions by using Amazon Simple Notification Service (Amazon SNS).
I. Set up two Lambda functions.
J. Configure one function to receive the information.
K. Configure the other function to load the information into the database.
L. Integrate the Lambda functions by using an Amazon Simple Queue Service (Amazon SQS) queue.

Answer: D

Explanation:
Bottlenecks can be avoided with queues (SQS).
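
For illustration, a sketch of the loosely coupled pattern the explanation refers to: one Lambda function enqueues the payload
and a second function, configured with an SQS trigger, drains the queue and writes to the database. The QUEUE_URL
environment variable and the database insert are placeholders.

import json
import os
import boto3

sqs = boto3.client("sqs")
QUEUE_URL = os.environ["QUEUE_URL"]  # placeholder: URL of the buffering SQS queue

def receiver(event, context):
    """Lambda behind API Gateway: enqueue the payload instead of writing to Aurora
    directly, so traffic spikes are buffered in the queue."""
    sqs.send_message(QueueUrl=QUEUE_URL, MessageBody=json.dumps(event.get("body")))
    return {"statusCode": 202, "body": "queued"}

def loader(event, context):
    """Second Lambda with an SQS trigger: each record is one queued payload to
    insert into the Aurora PostgreSQL database (insert code omitted)."""
    for record in event["Records"]:
        payload = json.loads(record["body"])
        # ... write payload to the database here ...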

NEW QUESTION 10
A company needs to review its AWS Cloud deployment to ensure that its Amazon S3 buckets do not have
unauthorized configuration changes. What should a solutions architect do to accomplish this goal?

A. Turn on AWS Config with the appropriate rules.


B. Turn on AWS Trusted Advisor with the appropriate checks.
C. Turn on Amazon Inspector with the appropriate assessment template.
D. Turn on Amazon S3 server access logging.
E. Configure Amazon EventBridge (Amazon CloudWatch Events).

Answer: A

NEW QUESTION 14
A company is launching a new application and will display application metrics on an Amazon CloudWatch dashboard. The
company’s product manager needs to access this dashboard periodically. The product manager does not have an AWS
account. A solutions architect must provide access to the product manager by following the principle of least privilege.
Which solution will meet these requirements?
A. Share the dashboard from the CloudWatch console.
B. Enter the product manager’s email address, and complete the sharing steps.
C. Provide a shareable link for the dashboard to the product manager.
D. Create an IAM user specifically for the product manager.
E. Attach the CloudWatch Read Only Access managed policy to the user.
F. Share the new login credentials with the product manager.
G. Share the browser URL of the correct dashboard with the product manager.
H. Create an IAM user for the company’s employees. Attach the View Only Access AWS managed policy to the IAM user.
I. Share the new login credentials with the product manager.
J. Ask the product manager to navigate to the CloudWatch console and locate the dashboard by name in the Dashboards
section.
K. Deploy a bastion server in a public subnet.
L. When the product manager requires access to the dashboard, start the server and share the RDP credentials.
M. On the bastion server, ensure that the browser is configured to open the dashboard URL with cached AWS
credentials that have appropriate permissions to view the dashboard.

Answer: A

NEW QUESTION 16
A company that hosts its web application on AWS wants to ensure all Amazon EC2 instances, Amazon RDS DB instances, and
Amazon Redshift clusters are configured with tags. The company wants to minimize the effort of configuring and operating
this check.
What should a solutions architect do to accomplish this?

A. Use AWS Config rules to define and detect resources that are not properly tagged.
B. Use Cost Explorer to display resources that are not properly tagged.
C. Tag those resources manually.
D. Write API calls to check all resources for proper tag allocation.
E. Periodically run the code on an EC2 instance.
F. Write API calls to check all resources for proper tag allocation.
G. Schedule an AWS Lambda function through Amazon CloudWatch to periodically run the code.

Answer: A

NEW QUESTION 21
A development team needs to host a website that will be accessed by other teams. The website contents consist of HTML,
CSS, client-side JavaScript, and images. Which method is the MOST cost-effective for hosting the website?

A. Containerize the website and host it in AWS Fargate.


B. Create an Amazon S3 bucket and host the website there
C. Deploy a web server on an Amazon EC2 instance to host the website.
D. Configure an Application Load Balancer with an AWS Lambda target that uses the Express.js framework.

Answer: B

Explanation:
In Static Websites, web pages returned by the server are prebuilt. They use simple languages such as HTML, CSS, or
JavaScript.
There is no processing of content on the server (according to the user) in Static Websites. Web pages are returned by the
server with no change therefore, static Websites are fast.
There is no interaction with databases.
Also, they are less costly as the host does not need to support server-side processing with different languages.
============
In Dynamic Websites, web pages returned by the server are processed during runtime: they are not prebuilt but are built at
runtime according to the user’s demand. These use server-side scripting languages such as PHP, Node.js, ASP.NET, and
many more supported by the server. So they are slower than static websites, but updates and interaction with databases
are possible.
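
For illustration, a boto3 sketch of hosting those prebuilt files on S3 with static website hosting; the bucket name and file list
are placeholders, the bucket would still need a public-read bucket policy (or a CloudFront distribution) in front of it, and in
Regions other than us-east-1 create_bucket also needs a LocationConstraint.

import boto3

s3 = boto3.client("s3")
BUCKET = "team-docs-site"                     # placeholder bucket name

s3.create_bucket(Bucket=BUCKET)

# Turn on static website hosting with an index and an error document.
s3.put_bucket_website(
    Bucket=BUCKET,
    WebsiteConfiguration={
        "IndexDocument": {"Suffix": "index.html"},
        "ErrorDocument": {"Key": "error.html"},
    },
)

# Upload the prebuilt HTML/CSS/JS/image files with the right content types.
for name, content_type in [("index.html", "text/html"), ("styles.css", "text/css")]:
    s3.upload_file(name, BUCKET, name, ExtraArgs={"ContentType": content_type})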

NEW QUESTION 26
A company hosts its multi-tier applications on AWS. For compliance, governance, auditing, and security, the company
must track configuration changes on its AWS resources and record a history of API calls made to these resources.
What should a solutions architect do to meet these requirements?

A. Use AWS CloudTrail to track configuration changes and AWS Config to record API calls
B. Use AWS Config to track configuration changes and AWS CloudTrail to record API calls
C. Use AWS Config to track configuration changes and Amazon CloudWatch to record API calls
D. Use AWS CloudTrail to track configuration changes and Amazon CloudWatch to record API calls

Answer: B

NEW QUESTION 31
A company is preparing to launch a public-facing web application in the AWS Cloud. The architecture consists of Amazon
EC2 instances within a VPC behind an Elastic Load Balancer (ELB). A third-party service is used for the DNS. The company's
solutions architect must recommend a solution to detect and protect against large-scale DDoS attacks.
Which solution meets these requirements?

A. Enable Amazon GuardDuty on the account.


B. Enable Amazon Inspector on the EC2 instances.
C. Enable AWS Shield and assign Amazon Route 53 to it.
D. Enable AWS Shield Advanced and assign the ELB to it.
Answer: D

NEW QUESTION 33
A company is hosting a static website on Amazon S3 and is using Amazon Route 53 for DNS. The website is experiencing
increased demand from around the world. The company must decrease latency for users who access the website.
Which solution meets these requirements MOST cost-effectively?

A. Replicate the S3 bucket that contains the website to all AWS Regions.
B. Add Route 53 geolocation routing entries.
C. Provision accelerators in AWS Global Accelerator.
D. Associate the supplied IP addresses with the S3 bucket.
E. Edit the Route 53 entries to point to the IP addresses of the accelerators.
F. Add an Amazon CloudFront distribution in front of the S3 bucket.
G. Edit the Route 53 entries to point to the CloudFront distribution.
H. Enable S3 Transfer Acceleration on the bucket.
I. Edit the Route 53 entries to point to the new endpoint.

Answer: C

NEW QUESTION 35
A company has thousands of edge devices that collectively generate 1 TB of status alerts each day.
Each alert is approximately 2 KB in size. A solutions architect needs to implement a solution to ingest and store the alerts for
future analysis.
The company wants a highly available solution. However, the company needs to minimize costs and does not want to
manage additional infrastructure. Additionally, the company wants to keep 14 days of data available for immediate
analysis and archive any data older than 14 days.
What is the MOST operationally efficient solution that meets these requirements?

A. Create an Amazon Kinesis Data Firehose delivery stream to ingest the alerts. Configure the Kinesis Data Firehose stream
to deliver the alerts to an Amazon S3 bucket. Set up an S3 Lifecycle configuration to transition data to Amazon S3 Glacier
after 14 days.
B. Launch Amazon EC2 instances across two Availability Zones and place them behind an Elastic Load Balancer to ingest
the alerts. Create a script on the EC2 instances that will store the alerts in an Amazon S3 bucket. Set up an S3 Lifecycle
configuration to transition data to Amazon S3 Glacier after 14 days.
C. Create an Amazon Kinesis Data Firehose delivery stream to ingest the alerts. Configure the Kinesis Data Firehose stream
to deliver the alerts to an Amazon Elasticsearch Service (Amazon ES) cluster. Set up the Amazon ES cluster to take manual
snapshots every day and delete data from the cluster that is older than 14 days.
D. Create an Amazon Simple Queue Service (Amazon SQS) standard queue to ingest the alerts and set the message
retention period to 14 days. Configure consumers to poll the SQS queue, check the age of the message, and analyze the
message data as needed. If the message is 14 days old, the consumer should copy the message to an Amazon S3 bucket
and delete the message from the SQS queue.

Answer: A

Explanation:
https://aws.amazon.com/kinesis/datafirehose/features/?nc=sn&loc=2#:~:text=into%20Amazon%20S3%2C%20Amazon%20Redshift%2C%20Amazon%20OpenSearch%20Service%2C%20Kinesis,Delivery%20streams
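
For illustration, a boto3 sketch of the S3 Lifecycle rule from the keyed option, which keeps 14 days of alerts available and
then archives them to S3 Glacier; the bucket name and key prefix are placeholders, and the Firehose delivery stream setup
itself is omitted.

import boto3

s3 = boto3.client("s3")
BUCKET = "edge-device-alerts"                 # placeholder bucket that Firehose delivers to

# Keep 14 days of alerts in S3 for immediate analysis, then transition to Glacier.
s3.put_bucket_lifecycle_configuration(
    Bucket=BUCKET,
    LifecycleConfiguration={
        "Rules": [{
            "ID": "archive-alerts-after-14-days",
            "Status": "Enabled",
            "Filter": {"Prefix": "alerts/"},  # placeholder key prefix used by Firehose
            "Transitions": [{"Days": 14, "StorageClass": "GLACIER"}],
        }]
    },
)
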
NEW QUESTION 39
A company's application integrates with multiple software-as-a-service (SaaS) sources for data collection. The company
runs Amazon EC2 instances to receive the data and to upload the data to an Amazon S3 bucket for analysis. The same EC2
instance that receives and uploads the data also sends a notification to the user when an upload is complete. The
company has noticed slow application performance and wants to improve the performance as much as possible.
Which solution will meet these requirements with the LEAST operational overhead?

A. Create an Auto Scaling group so that EC2 instances can scale out.


B. Configure an S3 event notification to send events to an Amazon Simple Notification Service (Amazon SNS) topic when the
upload to the S3 bucket is complete.
C. Create an Amazon AppFlow flow to transfer data between each SaaS source and the S3 bucket. Configure an S3
event notification to send events to an Amazon Simple Notification Service (Amazon SNS) topic when the upload
to the S3 bucket is complete.
D. Create an Amazon EventBridge (Amazon CloudWatch Events) rule for each SaaS source to send output data.
E. Configure the S3 bucket as the rule's target.
F. Create a second EventBridge (CloudWatch Events) rule to send events when the upload to the S3 bucket is complete.
G. Configure an Amazon Simple Notification Service (Amazon SNS) topic as the second rule's target.
H. Create a Docker container to use instead of an EC2 instance.
I. Host the containerized application on Amazon Elastic Container Service (Amazon ECS). Configure Amazon CloudWatch
Container Insights to send events to an Amazon Simple Notification Service (Amazon SNS) topic when the upload to the S3
bucket is complete.

Answer: B

NEW QUESTION 43
A company runs a highly available image-processing application on Amazon EC2 instances in a single VPC. The EC2
instances run inside several subnets across multiple Availability Zones. The EC2 instances do not communicate with each
other. However, the EC2 instances download images from Amazon S3 and upload images to Amazon S3 through a single
NAT gateway. The company is concerned about data transfer charges. What is the MOST cost-effective way for the
company to avoid Regional data transfer charges?

A. Launch the NAT gateway in each Availability Zone


B. Replace the NAT gateway with a NAT instance
C. Deploy a gateway VPC endpoint for Amazon S3
D. Provision an EC2 Dedicated Host to run the EC2 instances

Answer: C

NEW QUESTION 47
A company has an on-premises application that generates a large amount of time-sensitive data that is backed up to Amazon
S3. The application has grown, and there are user complaints about internet bandwidth limitations. A solutions architect
needs to design a long-term solution that allows for timely backups to Amazon S3 with minimal impact on internet
connectivity for internal users.
Which solution meets these requirements?

A. Establish AWS VPN connections and proxy all traffic through a VPC gateway endpoint
B. Establish a new AWS Direct Connect connection and direct backup traffic through this new connection.
C. Order daily AWS Snowball devices. Load the data onto the Snowball devices and return the devices to AWS each day.
D. Submit a support ticket through the AWS Management Console. Request the removal of S3 service limits from the
account.

Answer: B

NEW QUESTION 49
A company has an application that provides marketing services to stores. The services are based on previous purchases by
store customers. The stores upload transaction data to the company through SFTP, and the data is processed and analyzed
to generate new marketing offers. Some of the files can exceed 200 GB in size.
Recently, the company discovered that some of the stores have uploaded files that contain personally identifiable
information (PII) that should not have been included. The company wants administrators to be alerted if PII is shared again.
The company also wants to automate remediation.
What should a solutions architect do to meet these requirements with the LEAST development effort?

A. Use an Amazon S3 bucket as a secure transfer point.


B. Use Amazon Inspector to scan the objects in the bucket.
C. If objects contain PII,
D. trigger an S3 Lifecycle policy to remove the objects that contain PII.
E. Use an Amazon S3 bucket as a secure transfer point.
F. Use Amazon Macie to scan the objects in the bucket.
G. If objects contain PII,
H. use Amazon Simple Notification Service (Amazon SNS) to trigger a notification to the administrators to remove the
objects that contain PII.
I. Implement custom scanning algorithms in an AWS Lambda function.
J. Trigger the function when objects are loaded into the bucket.
K. If objects contain PII,
L. use Amazon Simple Notification Service (Amazon SNS) to trigger a notification to the administrators to remove the
objects that contain PII.
M. Implement custom scanning algorithms in an AWS Lambda function.
N. Trigger the function when objects are loaded into the bucket.
O. If objects contain PII,
P. use Amazon Simple Email Service (Amazon SES) to trigger a notification to the administrators and trigger an S3
Lifecycle policy to remove the objects that contain PII.

Answer: B

NEW QUESTION 54
A company wants to migrate its on-premises application to AWS. The application produces output files that vary in size from
tens of gigabytes to hundreds of terabytes The application data must be stored in a standard file system structure
The company wants a solution that scales automatically, is highly available, and
requires minimum operational overhead. Which solution will meet these
requirements?

A. Migrate the application to run as containers on Amazon Elastic Container Service (Amazon ECS). Use Amazon S3 for
storage.
B. Migrate the application to run as containers on Amazon Elastic Kubernetes Service (Amazon EKS). Use Amazon Elastic
Block Store (Amazon EBS) for storage.
C. Migrate the application to Amazon EC2 instances in a Multi-AZ Auto Scaling group.
D. Use Amazon Elastic File System (Amazon EFS) for storage.
E. Migrate the application to Amazon EC2 instances in a Multi-AZ Auto Scaling group.
F. Use Amazon Elastic Block Store (Amazon EBS) for storage.

Answer: C

NEW QUESTION 55
A company needs to keep user transaction data in an Amazon DynamoDB table. The
company must retain the data for 7 years. What is the MOST operationally efficient
solution that meets these requirements?

A. Use DynamoDB point-in-time recovery to back up the table continuously.


B. Use AWS Backup to create backup schedules and retention policies for the table.
C. Create an on-demand backup of the table by using the DynamoDB console.
D. Store the backup in an Amazon S3 bucket.
E. Set an S3 Lifecycle configuration for the S3 bucket.
F. Create an Amazon EventBridge (Amazon CloudWatch Events) rule to invoke an AWS Lambda function.
G. Configure the Lambda function to back up the table and to store the backup in an Amazon S3 bucket.
H. Set an S3 Lifecycle configuration for the S3 bucket.

Answer: C

NEW QUESTION 58
A company has more than 5 TB of file data on Windows file servers that run on premises. Users and applications interact with
the data each day.
The company is moving its Windows workloads to AWS. As the company continues this process, the company requires
access to AWS and on-premises file storage with minimum latency. The company needs a solution that minimizes
operational overhead and requires no significant changes to the existing file access patterns. The company uses an AWS
Site-to-Site VPN connection for connectivity to AWS.
What should a solutions architect do to meet these requirements?

A. Deploy and configure Amazon FSx for Windows File Server on AWS.


B. Move the on-premises file data to FSx for Windows File Server.
C. Reconfigure the workloads to use FSx for Windows File Server on AWS.
D. Deploy and configure an Amazon S3 File Gateway on premises. Move the on-premises file data to the S3 File Gateway.
Reconfigure the on-premises workloads and the cloud workloads to use the S3 File Gateway.
E. Deploy and configure an Amazon S3 File Gateway on premises. Move the on-premises file data to Amazon S3.
Reconfigure the workloads to use either Amazon S3 directly or the S3 File Gateway, depending on each workload's
location.
F. Deploy and configure Amazon FSx for Windows File Server on AWS. Deploy and configure an Amazon FSx File Gateway
on premises. Move the on-premises file data to the FSx File Gateway. Configure the cloud workloads to use FSx for
Windows File Server on AWS. Configure the on-premises workloads to use the FSx File Gateway.

Answer: D

NEW QUESTION 60
A company is running a high performance computing (HPC) workload on AWS across many Linux-based Amazon EC2
instances. The company needs a shared storage system that is capable of sub-millisecond latencies, hundreds of Gbps of
throughput, and millions of IOPS. Users will store millions of small files.
Which solution meets these requirements?

A. Create an Amazon Elastic File System (Amazon EFS) file system. Mount the file system on each of the EC2 instances.
B. Create an Amazon S3 bucket. Mount the S3 bucket on each of the EC2 instances.
C. Ensure that the EC2 instances are Amazon Elastic Block Store (Amazon EBS) optimized. Mount Provisioned IOPS SSD
(io2) EBS volumes with Multi-Attach on each instance.
D. Create an Amazon FSx for Lustre file system.
E. Mount the file system on each of the EC2 instances

Answer: D

NEW QUESTION 62
A company collects data from thousands of remote devices by using a RESTful web services application that runs on an
Amazon EC2 instance. The EC2 instance receives the raw data, transforms the raw data, and stores all the data in an Amazon
S3 bucket. The number of remote devices will increase into the millions soon. The company needs a highly scalable solution
that minimizes operational overhead.
Which combination of steps should a solutions architect take to meet these requirements? (Select TWO.)

A. Use AWS Glue to process the raw data in Amazon S3.


B. Use Amazon Route 53 to route traffic to different EC2 instances.
C. Add more EC2 instances to accommodate the increasing amount of incoming data.
D. Send the raw data to Amazon Simple Queue Service (Amazon SQS). Use EC2 instances to process the data.
E. Use Amazon API Gateway to send the raw data to an Amazon Kinesis data stream.
F. Configure Amazon Kinesis Data Firehose to use the data stream as a source to deliver the data to Amazon S3.

Answer: BE

NEW QUESTION 65
A company is expecting rapid growth in the near future. A solutions architect needs to configure existing users and grant
permissions to new users on AWS. The solutions architect has decided to create IAM groups. The solutions architect will add
the new users to IAM groups based on department.
Which additional action is the MOST secure way to grant permissions to the new users?

A. Apply service control policies (SCPs) to manage access permissions


B. Create IAM roles that have least privilege permissions. Attach the roles to the IAM groups.
C. Create an IAM policy that grants least privilege permissions. Attach the policy to the IAM groups.
D. Create IAM roles. Associate the roles with a permissions boundary that defines the maximum permissions.

Answer: C

NEW QUESTION 67
A company hosts a serverless application on AWS. The application uses Amazon API Gateway, AWS Lambda, and an
Amazon RDS for PostgreSQL database. The company notices an increase in application errors that result from database
connection timeouts during times of peak traffic or unpredictable traffic. The company needs a solution that reduces the
application failures with the least amount of change to the code.
What should a solutions architect do to meet these requirements?

A. Reduce the Lambda concurrency rate.


B. Enable RDS Proxy on the RDS DB instance.
C. Resize the RDS DB instance class to accept more connections.
D. Migrate the database to Amazon DynamoDB with on-demand scaling

Answer: B

NEW QUESTION 72
A company hosts its product information webpages on AWS. The existing solution uses multiple Amazon EC2 instances behind
an Application Load Balancer in an Auto Scaling group. The website also uses a custom DNS name and communicates with
HTTPS only using a dedicated SSL certificate. The company is planning a new product launch and wants to be sure that users
from around the world have the best possible experience on the new website.
What should a solutions architect do to meet these requirements?

A. Redesign the application to use Amazon CloudFront


B. Redesign the application to use AWS Elastic Beanstalk
C. Redesign the application to use a Network Load Balancer.
D. Redesign the application to use Amazon S3 static website hosting

Answer: A
Explanation:
CloudFront can help provide the best experience for global users. CloudFront integrates seamlessly with ALB and provides
an option to use custom DNS names and SSL certificates.

NEW QUESTION 77
A company is planning to build a high performance computing (HPC) workload as a service solution that is hosted on AWS. A
group of 16 Amazon EC2 Linux instances requires the lowest possible latency for
node-to-node communication. The instances also need a shared
block device volume for high-performing storage.
Which solution will meet these requirements?

A. Use a cluster placement group.


B. Attach a single Provisioned IOPS SSD Amazon Elastic Block Store (Amazon EBS) volume to all the instances by using
Amazon EBS Multi-Attach.
C. Use a cluster placement group.
D. Create shared file systems across the instances by using Amazon Elastic File System (Amazon EFS).
E. Use a partition placement group.
F. Create shared file systems across the instances by using Amazon Elastic File System (Amazon EFS).
G. Use a spread placement group.
H. Attach a single Provisioned IOPS SSD Amazon Elastic Block Store (Amazon EBS) volume to all the instances by using
Amazon EBS Multi-Attach

Answer: A
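
For illustration, a boto3 sketch of the keyed approach: a cluster placement group plus a Provisioned IOPS (io2) volume with
Multi-Attach; the group name, Availability Zone, size, IOPS, and instance ID are placeholders, and the instances themselves
would be launched into the placement group separately.

import boto3

ec2 = boto3.client("ec2")

# Cluster placement group: packs the 16 instances close together in one AZ for
# the lowest node-to-node latency.
ec2.create_placement_group(GroupName="hpc-cluster-pg", Strategy="cluster")

# Provisioned IOPS SSD (io2) volume with Multi-Attach, so the same block device
# can be attached to all instances in the group.
volume = ec2.create_volume(
    AvailabilityZone="us-east-1a",            # placeholder AZ
    Size=500,                                 # GiB, placeholder
    VolumeType="io2",
    Iops=32000,                               # placeholder
    MultiAttachEnabled=True,
)
ec2.get_waiter("volume_available").wait(VolumeIds=[volume["VolumeId"]])

# Each instance (launched with Placement={"GroupName": "hpc-cluster-pg"}) then
# attaches the shared volume.
ec2.attach_volume(
    VolumeId=volume["VolumeId"],
    InstanceId="i-0123456789abcdef0",         # placeholder instance ID
    Device="/dev/sdf",
)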

NEW QUESTION 78
A company wants to use the AWS Cloud to make an existing application highly available and resilient. The current version
of the application resides in the company's data center. The application recently experienced data loss after a database
server crashed because of an unexpected power outage.
The company needs a solution that avoids any single points of failure. The solution must give the application the
ability to scale to meet user demand. Which solution will meet these requirements?

A. Deploy the application servers by using Amazon EC2 instances in an Auto Scaling group across multiple Availability Zones.
B. Use an Amazon RDS DB instance in a Multi-AZ configuration.
C. Deploy the application servers by using Amazon EC2 instances in an Auto Scaling group in a single Availability Zone.
D. Deploy the database on an EC2 instance.
E. Enable EC2 Auto Recovery.
F. Deploy the application servers by using Amazon EC2 instances in an Auto Scaling group across multiple Availability Zones.
G. Use an Amazon RDS DB instance with a read replica in a single Availability Zone.
H. Promote the read replica to replace the primary DB instance if the primary DB instance fails.
I. Deploy the application servers by using Amazon EC2 instances in an Auto Scaling group across multiple Availability
Zones Deploy the primary and secondary database servers on EC2 instances across multiple Availability Zones Use
Amazon Elastic Block Store (Amazon EBS) Multi-Attach to create shared storage between the instances.

Answer: A

NEW QUESTION 82
A company is implementing a new business application. The application runs on two Amazon EC2 instances and uses an
Amazon S3 bucket for document storage. A solutions architect needs to ensure that the EC2 instances can access the S3
bucket.
What should the solutions architect do to meet this requirement?

A. Create an IAM role that grants access to the S3 bucket.


B. Attach the role to the EC2 instances.
C. Create an IAM policy that grants access to the S3 bucket. Attach the policy to the EC2 instances.
D. Create an IAM group that grants access to the S3 bucket. Attach the group to the EC2 instances.
E. Create an IAM user that grants access to the S3 bucket. Attach the user account to the EC2 instances.

Answer: C

NEW QUESTION 83
A solutions architect is creating a new Amazon CloudFront distribution for an application. Some of the information
submitted by users is sensitive. The application uses HTTPS but needs another layer of security. The sensitive information
should be protected throughout the entire application stack, and access to the information should be restricted to certain
applications.
Which action should the solutions architect take?

A. Configure a CloudFront signed URL


B. Configure a CloudFront signed cookie.
C. Configure a CloudFront field-level encryption profile
D. Configure CloudFront and set the Origin Protocol Policy setting to HTTPS Only for the Viewer Protocol Policy

Answer: C

NEW QUESTION 88
A company wants to build a scalable key management Infrastructure to support developers who
need to encrypt data in their applications. What should a solutions architect do to reduce the
operational burden?

A. Use multifactor authentication (MFA) to protect the encryption keys.


B. Use AWS Key Management Service (AWS KMS) to protect the encryption keys
C. Use AWS Certificate Manager (ACM) to create, store, and assign the encryption keys
D. Use an IAM policy to limit the scope of users who have access permissions to protect the encryption keys

Answer: B
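
For illustration, a boto3 sketch of the keyed answer: create a KMS key once and let developers call it from their applications;
the alias and plaintext are placeholders, and in practice many workloads use KMS-generated data keys or the built-in KMS
integration of other services instead of calling Encrypt directly.

import boto3

kms = boto3.client("kms")

# Create a customer managed key and a friendly alias; KMS stores, protects, and
# can rotate the key material, so there is no key infrastructure to operate.
key_id = kms.create_key(Description="App data encryption key")["KeyMetadata"]["KeyId"]
kms.create_alias(AliasName="alias/app-data", TargetKeyId=key_id)  # placeholder alias

# Encrypt and decrypt small payloads directly with the key.
ciphertext = kms.encrypt(KeyId="alias/app-data", Plaintext=b"secret value")["CiphertextBlob"]
plaintext = kms.decrypt(CiphertextBlob=ciphertext)["Plaintext"]
assert plaintext == b"secret value"
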
NEW QUESTION 92
A company has migrated a two-tier application from its on-premises data center to the AWS Cloud. The data tier is a Multi-AZ deployment of Amazon RDS for Oracle with 12 TB of General Purpose SSD Amazon Elastic Block Store (Amazon EBS) storage. The application is designed to process and store documents in the database as binary large objects (blobs) with an average document size of 6 MB.
The database size has grown over time, reducing the performance and increasing the cost of storage. The company must improve the database performance and needs a solution that is highly available and resilient.
Which solution will meet these requirements MOST cost-effectively?

A. Reduce the RDS DB instance size. Increase the storage capacity to 24 TiB. Change the storage type to Magnetic.
B. Increase the RDS DB instance size. Increase the storage capacity to 24 TiB. Change the storage type to Provisioned IOPS.
C. Create an Amazon S3 bucket. Update the application to store documents in the S3 bucket. Store the object metadata in the existing database.
D. Create an Amazon DynamoDB table. Update the application to use DynamoDB. Use AWS Database Migration Service (AWS DMS) to migrate data from the Oracle database to DynamoDB.

Answer: C

NEW QUESTION 93
A company's website handles millions of requests each day and the number of requests continues to increase. A solutions
architect needs to improve the response time of the web application. The solutions architect determines that the application
needs to decrease latency when retrieving product details from the Amazon DynamoDB table
Which solution will meet these requirements with the LEAST amount of operational overhead?

A. Set up a DynamoDB Accelerator (DAX) cluster. Route all read requests through DAX.
B. Set up Amazon ElastiCache for Redis between the DynamoDB table and the web application. Route all read requests through Redis.
C. Set up Amazon ElastiCache for Memcached between the DynamoDB table and the web application. Route all read requests through Memcached.
D. Set up Amazon DynamoDB Streams on the table and have AWS Lambda read from the table and populate Amazon ElastiCache. Route all read requests through ElastiCache.

Answer: A
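
A rough sketch of the chosen answer using boto3; the cluster name, node type, IAM role ARN, and subnet group are placeholder assumptions.

    import boto3

    dax = boto3.client("dax")

    # Hypothetical example: create a DAX cluster in front of the product table.
    dax.create_cluster(
        ClusterName="product-cache",
        NodeType="dax.r5.large",
        ReplicationFactor=3,                      # one primary node plus two read replicas
        IamRoleArn="arn:aws:iam::123456789012:role/DaxToDynamoDBRole",
        SubnetGroupName="dax-subnet-group",
    )
    # The application then reads through the DAX cluster endpoint (for example
    # with the amazon-dax-client library) instead of the DynamoDB endpoint.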

NEW QUESTION 96
A company is building a containerized application on premises and decides to move the application to AWS. The application will have thousands of users soon after it is deployed. The company is unsure how to manage the deployment of containers at scale. The company needs to deploy the containerized application in a highly available architecture that minimizes operational overhead.
Which solution will meet these requirements?

A. Store container images in an Amazon Elastic Container Registry (Amazon ECR) repository. Use an Amazon Elastic Container Service (Amazon ECS) cluster with the AWS Fargate launch type to run the containers. Use target tracking to scale automatically based on demand.
B. Store container images in an Amazon Elastic Container Registry (Amazon ECR) repository. Use an Amazon Elastic Container Service (Amazon ECS) cluster with the Amazon EC2 launch type to run the containers. Use target tracking to scale automatically based on demand.
C. Store container images in a repository that runs on an Amazon EC2 instance. Run the containers on EC2 instances that are spread across multiple Availability Zones. Monitor the average CPU utilization in Amazon CloudWatch. Launch new EC2 instances as needed.
D. Create an Amazon EC2 Amazon Machine Image (AMI) that contains the container image. Launch EC2 instances in an Auto Scaling group across multiple Availability Zones. Use an Amazon CloudWatch alarm to scale out EC2 instances when the average CPU utilization threshold is breached.

Answer: A
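
A minimal sketch of the target tracking part of the answer, assuming a Fargate service named app-service in cluster app-cluster (placeholder names) and the boto3 Application Auto Scaling client.

    import boto3

    aas = boto3.client("application-autoscaling")
    service = "service/app-cluster/app-service"   # placeholder cluster/service names

    # Register the service's desired count as a scalable target ...
    aas.register_scalable_target(
        ServiceNamespace="ecs",
        ResourceId=service,
        ScalableDimension="ecs:service:DesiredCount",
        MinCapacity=2,
        MaxCapacity=50,
    )
    # ... then add a target tracking policy that keeps average CPU near 70%.
    aas.put_scaling_policy(
        PolicyName="cpu-target-tracking",
        ServiceNamespace="ecs",
        ResourceId=service,
        ScalableDimension="ecs:service:DesiredCount",
        PolicyType="TargetTrackingScaling",
        TargetTrackingScalingPolicyConfiguration={
            "TargetValue": 70.0,
            "PredefinedMetricSpecification": {
                "PredefinedMetricType": "ECSServiceAverageCPUUtilization"
            },
        },
    )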

NEW QUESTION 100


A company has five organizational units (OUs) as part of its organization in AWS Organizations. Each OU corresponds to one of the five businesses that the company owns. The company's research and development (R&D) business is separating from the company and will need its own organization. A solutions architect creates a separate new management account for this purpose.
What should the solutions architect do to move the R&D AWS account to the new organization?

A. Have the R&D AWS account be part of both organizations during the transition.
B. Invite the R&D AWS account to be part of the new organization after the R&D AWS account has left the prior organization.
C. Create a new R&D AWS account in the new organization. Migrate resources from the prior R&D AWS account to the new R&D AWS account.
D. Move the R&D AWS account into the new organization. Make the new management account a member of the prior organization.

Answer: B
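
As a rough illustration of the leave-then-invite sequence in the answer, the boto3 calls below sketch the flow. The account ID is a placeholder, and the calls assume they run with the appropriate member-account and management-account credentials.

    import boto3

    # Run with the R&D (member) account's credentials: leave the prior organization.
    boto3.client("organizations").leave_organization()

    # Run with the new management account's credentials: invite the R&D account.
    new_org = boto3.client("organizations")
    new_org.invite_account_to_organization(
        Target={"Id": "111122223333", "Type": "ACCOUNT"}   # placeholder account ID
    )
    # The R&D account then accepts the resulting handshake (accept_handshake) to join.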

NEW QUESTION 101


A company needs to store data in Amazon S3 and must prevent the data from being changed. The company wants new
objects that are uploaded to Amazon S3 to remain unchangeable for a nonspecific amount of time until the company
decides to modify the objects. Only specific users in the company’s AWS account can have the ability to delete the objects.
What should a solutions architect do to meet these requirements?

A. Create an S3 Glacier vault. Apply a write-once, read-many (WORM) vault lock policy to the objects.
B. Create an S3 bucket with S3 Object Lock enabled. Enable versioning. Set a retention period of 100 years. Use governance mode as the S3 bucket's default retention mode for new objects.
C. Create an S3 bucket. Use AWS CloudTrail to track any S3 API events that modify the objects. Upon notification, restore the modified objects from any backup versions that the company has.
D. Create an S3 bucket with S3 Object Lock enabled. Enable versioning. Add a legal hold to the objects. Add the s3:PutObjectLegalHold permission to the IAM policies of users who need to delete the objects.

Answer: D
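
A minimal sketch of the legal-hold approach in answer D, assuming placeholder bucket and key names. Note that Object Lock has to be enabled when the bucket is created.

    import boto3

    s3 = boto3.client("s3")

    # Hypothetical bucket with Object Lock enabled at creation time.
    s3.create_bucket(Bucket="compliance-docs", ObjectLockEnabledForBucket=True)

    # Place an indefinite legal hold on a new object; it stays immutable until a
    # user who has s3:PutObjectLegalHold turns the hold OFF.
    s3.put_object(Bucket="compliance-docs", Key="report.pdf", Body=b"...")
    s3.put_object_legal_hold(
        Bucket="compliance-docs",
        Key="report.pdf",
        LegalHold={"Status": "ON"},
    )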

NEW QUESTION 104


A company has a stateless asynchronous application that runs in an Apache Hadoop cluster. The application is invoked on demand to run extract, transform, and load (ETL) jobs several times a day.
A solutions architect needs to migrate this application to the AWS Cloud by designing an Amazon EMR cluster for the workload. The cluster must be available immediately to process jobs.
Which implementation meets these requirements MOST cost-effectively?

A. Use zonal Reserved Instances for the master nodes and the core nodes. Use a Spot Fleet for the task nodes.
B. Use zonal Reserved Instances for the master nodes. Use Spot Instances for the core nodes and the task nodes.
C. Use regional Reserved Instances for the master nodes. Use a Spot Fleet for the core nodes and the task nodes.
D. Use regional Reserved Instances for the master nodes. Use On-Demand Capacity Reservations for the core nodes and the task nodes.

Answer: A

NEW QUESTION 107


A company has on-premises servers that run a relational database. The database serves high-read traffic for users in different locations. The company wants to migrate the database to AWS with the least amount of effort. The database solution must support high availability and must not affect the company's current traffic flow.
Which solution meets these requirements?

A. Use a database in Amazon RDS with Multi-AZ and at least one read replica.
B. Use a database in Amazon RDS with Multi-AZ and at least one standby replica.
C. Use databases that are hosted on multiple Amazon EC2 instances in different AWS Regions.
D. Use databases that are hosted on Amazon EC2 instances behind an Application Load Balancer in different Availability
Zones

Answer: A

Explanation:
https://aws.amazon.com/blogs/database/implementing-a-disaster-recovery-strategy-with-amazon-rds/

NEW QUESTION 108


A company runs a high performance computing (HPC) workload on AWS. The workload requires low-latency network performance and high network throughput with tightly coupled node-to-node communication. The Amazon EC2 instances are properly sized for compute and storage capacity, and are launched using default options.
What should a solutions architect propose to improve the performance of the workload?

A. Choose a cluster placement group while launching Amazon EC2 instances.


B. Choose dedicated instance tenancy while launching Amazon EC2 instances.
C. Choose an Elastic Inference accelerator while launching Amazon EC2 instances.
D. Choose the required capacity reservation while launching Amazon EC2 instances.
Answer: A

Explanation:
https://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-ec2-placementgroup.html "A cluster
placement group is a logical grouping of instances within a single Availability Zone that benefit from low network latency,
high network throughput"
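
A minimal boto3 sketch of answer A; the AMI ID, instance type, and group name are placeholder assumptions.

    import boto3

    ec2 = boto3.client("ec2")

    # Hypothetical example: create a cluster placement group and launch the HPC
    # nodes into it so they are packed close together in one Availability Zone.
    ec2.create_placement_group(GroupName="hpc-cluster", Strategy="cluster")

    ec2.run_instances(
        ImageId="ami-0123456789abcdef0",          # placeholder AMI
        InstanceType="c5n.18xlarge",              # placeholder instance type
        MinCount=4,
        MaxCount=4,
        Placement={"GroupName": "hpc-cluster"},
    )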

NEW QUESTION 112


A company has an application that runs on Amazon EC2 instances and uses an Amazon Aurora database. The EC2 instances
connect to the database by using user names and passwords that are stored locally in a file. The company wants to
minimize the operational overhead of credential management.
What should a solutions architect do to accomplish this goal?

A. Use AWS Secrets Manager. Turn on automatic rotation.
B. Use AWS Systems Manager Parameter Store. Turn on automatic rotation.
C. Create an Amazon S3 bucket to store objects that are encrypted with an AWS Key Management Service (AWS KMS) encryption key. Migrate the credential file to the S3 bucket. Point the application to the S3 bucket.
D. Create an encrypted Amazon Elastic Block Store (Amazon EBS) volume for each EC2 instance. Attach the new EBS volume to each EC2 instance. Migrate the credential file to the new EBS volume. Point the application to the new EBS volume.

Answer: C

NEW QUESTION 117


A company uses an Amazon Aurora PostgreSQL DB cluster to store its critical data in the us-east-1 Region. The company wants to develop a disaster recovery plan to recover the database in the us-west-1 Region. The company has a recovery time objective (RTO) of 5 minutes and a recovery point objective (RPO) of 1 minute.
What should a solutions architect do to meet these requirements?

A. Create a read replica in us-west-1. Set the DB cluster to automatically fail over to the read replica if the primary instance is not responding.
B. Create an Aurora global database. Set us-west-1 as the secondary Region. Update connections to use the writer and reader endpoints as appropriate.
C. Set up a second Aurora DB cluster in us-west-1. Use logical replication to keep the databases synchronized. Create an Amazon EventBridge (Amazon CloudWatch Events) rule to change the database endpoint if the primary DB cluster does not respond.
D. Use Aurora automated snapshots to store data in an Amazon S3 bucket. Enable S3 Versioning. Configure S3 Cross-Region Replication to us-west-1. Create a second Aurora DB cluster in us-west-1. Create an Amazon EventBridge (Amazon CloudWatch Events) rule to restore the snapshot if the primary DB cluster does not respond.

Answer: B
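
A simplified boto3 sketch of the global-database answer; the cluster identifiers and ARN are placeholders, and DB instances would still need to be added to the secondary cluster afterward.

    import boto3

    # Promote the existing cluster into a global database in the primary Region ...
    rds_east = boto3.client("rds", region_name="us-east-1")
    rds_east.create_global_cluster(
        GlobalClusterIdentifier="critical-global",
        SourceDBClusterIdentifier="arn:aws:rds:us-east-1:123456789012:cluster:critical-db",
    )

    # ... then add a secondary cluster in us-west-1 for low-RPO replication.
    rds_west = boto3.client("rds", region_name="us-west-1")
    rds_west.create_db_cluster(
        DBClusterIdentifier="critical-db-secondary",
        Engine="aurora-postgresql",
        GlobalClusterIdentifier="critical-global",
    )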

NEW QUESTION 122


A company has two AWS accounts in the same AWS Region. One account is a publisher account, and the other account is a subscriber account. Each account has its own Amazon S3 bucket.
An application puts media objects into the publisher account's S3 bucket. The objects are encrypted with server-side encryption with customer-provided encryption keys (SSE-C). The company needs a solution that will automatically copy the objects to the subscriber account's S3 bucket.
Which solution will meet these requirements with the LEAST operational overhead?

A. Enable S3 Versioning on the publisher account's S3 bucket. Configure S3 Same-Region Replication of the objects to the subscriber account's S3 bucket.
B. Create an AWS Lambda function that is invoked when objects are published in the publisher account's S3 bucket. Configure the Lambda function to copy the objects to the subscriber account's S3 bucket.
C. Configure Amazon EventBridge (Amazon CloudWatch Events) to invoke an AWS Lambda function when objects are published in the publisher account's S3 bucket. Configure the Lambda function to copy the objects to the subscriber account's S3 bucket.
D. Configure Amazon EventBridge (Amazon CloudWatch Events) to publish Amazon Simple Notification Service (Amazon SNS) notifications when objects are published in the publisher account's S3 bucket. When notifications are received, use the S3 console to copy the objects to the subscriber account's S3 bucket.

Answer: B

NEW QUESTION 123


A company is running an ASP.NET MVC application on a single Amazon EC2 instance. A recent increase in application traffic is
causing slow response times for users during lunch hours. The company needs to resolve this concern with the least amount
of configuration.
What should a solutions architect recommend to meet these requirements?

A. Move the application to AWS Elastic Beanstalk. Configure load-based auto scaling and time-based scaling to handle scaling during lunch hours.
B. Move the application to Amazon Elastic Container Service (Amazon ECS). Create an AWS Lambda function to handle scaling during lunch hours.
C. Move the application to Amazon Elastic Container Service (Amazon ECS). Configure scheduled scaling for AWS Application Auto Scaling during lunch hours.
D. Move the application to AWS Elastic Beanstalk. Configure load-based auto scaling, and create an AWS Lambda function to handle scaling during lunch hours.

Answer: A

Explanation:
- Scheduled scaling is the solution here, while "using the least amount of settings possible" - Beanstalk vs moving to ECS -
ECS requires MORE CONFIGURATION / SETTINGS (task and service definitions, configuring ECS container agent) than
Beanstalk (upload application code) https://docs.aws.amazon.com/elasticbeanstalk/latest/dg/environments-cfg-
autoscaling-scheduledactions.html Elastic Beanstalk supports time based scaling, since we are aware that the application
performance slows down during the lunch hours.
https://aws.amazon.com/about-aws/whats-new/2015/05/aws-elastic-beanstalk-supports-time-based-scaling/

NEW QUESTION 125


A hospital wants to create digital copies for its large collection of historical written records. The hospital will continue to add
hundreds of new documents each day. The hospital's data team will scan the documents and will upload the documents to
the AWS Cloud.
A solutions architect must implement a solution to analyze the documents, extract the medical information, and store the documents so that an application can run SQL queries on the data. The solution must maximize scalability and operational efficiency.
Which combination of steps should the solutions architect take to meet these requirements? (Select TWO.)

A. Write the document information to an Amazon EC2 instance that runs a MySQL database
B. Write the document information to an Amazon S3 bucket Use Amazon Athena to query the data
C. Create an Auto Scaling group of Amazon EC2 instances to run a custom application that processes the scanned files and
extracts the medical information.
D. Create an AWS Lambda function that runs when new documents are uploaded Use Amazon Rekognition to convert
the documents to raw text Use Amazon Transcribe Medical to detect and extract relevant medical Information from
the text.
E. Create an AWS Lambda function that runs when new documents are uploaded Use Amazon Textract to convert
the documents to raw text Use Amazon Comprehend Medical to detect and extract relevant medical information
from the text

Answer: AE

NEW QUESTION 128


A company has a web application that runs on Amazon EC2 instances. The company wants end users to authenticate
themselves before they use the web application. The web application accesses AWS resources, such as Amazon S3 buckets,
on behalf of users who are logged on.
Which combination of actions must a solutions architect take to meet these requirements? (Select TWO).

A. Configure AWS App Mesh to log on users.


B. Enable and configure AWS Single Sign-On in AWS Identity and Access Management (IAM).
C. Define a default IAM role for authenticated users.
D. Use AWS Identity and Access Management (IAM) for user authentication.
E. Use Amazon Cognito for user authentication.

Answer: BE
NEW QUESTION 132
An ecommerce company has an order-processing application that uses Amazon API Gateway and an AWS Lambda function. The application stores data in an Amazon Aurora PostgreSQL database. During a recent sales event, a sudden surge in customer orders occurred. Some customers experienced timeouts, and the application did not process the orders of those customers. A solutions architect determined that the CPU utilization and memory utilization were high on the database because of a large number of open connections. The solutions architect needs to prevent the timeout errors while making the least possible changes to the application.
Which solution will meet these requirements?

A. Configure provisioned concurrency for the Lambda function. Modify the database to be a global database in multiple AWS Regions.
B. Use Amazon RDS Proxy to create a proxy for the database. Modify the Lambda function to use the RDS Proxy endpoint instead of the database endpoint.
C. Create a read replica for the database in a different AWS Region. Use query string parameters in API Gateway to route traffic to the read replica.
D. Migrate the data from Aurora PostgreSQL to Amazon DynamoDB by using AWS Database Migration Service (AWS DMS). Modify the Lambda function to use the DynamoDB table.

Answer: C

NEW QUESTION 137


A company is designing a new web application that the company will deploy into a single AWS Region. The application requires a two-tier architecture that will include Amazon EC2 instances and an Amazon RDS DB instance. A solutions architect needs to design the application so that all components are highly available.
Which solution will meet these requirements?

A. Deploy EC2 instances in an additional Region. Create a DB instance with the Multi-AZ option activated.
B. Deploy all EC2 instances in the same Region and the same Availability Zone. Create a DB instance with the Multi-AZ option activated.
C. Deploy the EC2 instances across at least two Availability Zones within the same Region. Create a DB instance in a single Availability Zone.
D. Deploy the EC2 instances across at least two Availability Zones within the same Region. Create a DB instance with the Multi-AZ option activated.

Answer: D

NEW QUESTION 141


A company is hosting a website from an Amazon S3 bucket that is configured for public hosting. The company's security team mandates the usage of secure connections for access to the website. However, HTTP-based URLs and HTTPS-based URLs must be functional.
What should a solutions architect recommend to meet these requirements?

A. Create an S3 bucket policy to explicitly deny non-HTTPS traffic.
B. Enable S3 Transfer Acceleration. Select the HTTPS Only bucket property.
C. Place the website behind an Elastic Load Balancer that is configured to redirect HTTP traffic to HTTPS.
D. Serve the website through an Amazon CloudFront distribution that is configured to redirect HTTP traffic to HTTPS.

Answer: D
NEW QUESTION 142
A company has a business system that generates hundreds of reports each day. The business system saves the reports to a network share in CSV format. The company needs to store this data in the AWS Cloud in near-real time for analysis.
Which solution will meet these requirements with the LEAST administrative overhead?

A. Use AWS DataSync to transfer the files to Amazon S3 Create a scheduled task that runs at the end of each day.
B. Create an Amazon S3 File Gateway Update the business system to use a new network share from the S3 File Gateway.
C. Use AWS DataSync to transfer the files to Amazon S3 Create an application that uses the DataSync API in the automation
workflow.
D. Deploy an AWS Transfer for SFTP endpoint Create a script that checks for new files on the network share and uploads the
new files by using SFTP.

Answer: B

NEW QUESTION 143


A company needs to ingest and handle large amounts of streaming data that its application generates. The application runs on Amazon EC2 instances and sends data to Amazon Kinesis Data Streams, which is configured with default settings.
Every other day, the application consumes the data and writes the data to an Amazon S3 bucket for business intelligence (BI) processing. The company observes that Amazon S3 is not receiving all the data that the application sends to Kinesis Data Streams.
What should a solutions architect do to resolve this issue?

A. Update the Kinesis Data Streams default settings by modifying the data retention period.
B. Update the application to use the Kinesis Producer Library (KPL) to send the data to Kinesis Data Streams.
C. Update the number of Kinesis shards to handle the throughput of the data that is sent to Kinesis Data Streams.
D. Turn on S3 Versioning within the S3 bucket to preserve every version of every object that is ingested in the S3 bucket.

Answer: A
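
A minimal sketch of the chosen answer, assuming a placeholder stream name: the default retention is 24 hours, so records older than a day are gone before the every-other-day consumer runs, and raising retention keeps them available.

    import boto3

    kinesis = boto3.client("kinesis")

    # Hypothetical example: extend retention beyond the 24-hour default so the
    # consumer that runs every other day still sees all records.
    kinesis.increase_stream_retention_period(
        StreamName="app-events",
        RetentionPeriodHours=72,
    )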

NEW QUESTION 147


A company is developing a new machine learning (ML) model solution on AWS. The models are developed as independent microservices that fetch approximately 1 GB of model data from Amazon S3 at startup and load the data into memory. Users access the models through an asynchronous API. Users can send a request or a batch of requests and specify where the results should be sent.
The company provides models to hundreds of users. The usage patterns for the models are irregular. Some models could be unused for days or weeks. Other models could receive batches of thousands of requests at a time.
Which design should a solutions architect recommend to meet these requirements?
A. Direct the requests from the API to a Network Load Balancer (NLB) Deploy the models as AWS Lambda functions that are
invoked by the NLB.
B. Direct the requests from the API to an Application Load Balancer (ALB). Deploy the models as Amazon Elastic
Container Service (Amazon ECS) services that read from an Amazon Simple Queue Service (Amazon SQS) queue Use AWS
App Mesh to scale the instances of the ECS cluster based on the SQS queue size
C. Direct the requests from the API into an Amazon Simple Queue Service (Amazon SQS) queue Deploy the models as
AWS Lambda functions that are invoked by SQS events Use AWS Auto Scaling to increase the number of vCPUs for the
Lambda functions based on the SQS queue size
D. Direct the requests from the API into an Amazon Simple Queue Service (Amazon SQS) queue Deploy the models as
Amazon Elastic Container Service (Amazon ECS) services that read from the queue Enable AWS Auto Scaling on
Amazon ECS for both the cluster and copies of the service based on the queue size

Answer: C

NEW QUESTION 149


A company is deploying a web portal. The company wants to ensure that only the web portion of the application is publicly
accessible. To accomplish this, the VPC was designed with two public subnets and two private subnets. The application will
run on several Amazon EC2 instances in an Auto Scaling group. SSL termination must be offloaded from the EC2 instances.
What should a solutions architect do to ensure these requirements are met?

A. Configure a Network Load Balancer in the public subnets. Configure the Auto Scaling group in the private subnets and associate it with an Application Load Balancer.
B. Configure a Network Load Balancer in the public subnets. Configure the Auto Scaling group in the public subnets and associate it with an Application Load Balancer.
C. Configure an Application Load Balancer in the public subnets. Configure the Auto Scaling group in the private subnets and associate it with the Application Load Balancer.
D. Configure an Application Load Balancer in the private subnets. Configure the Auto Scaling group in the private subnets and associate it with the Application Load Balancer.

Answer: C

NEW QUESTION 150


A company has developed a new content-sharing application that runs on Amazon Elastic Container Service (Amazon ECS).
The application runs on Amazon Linux Docker tasks that use the Amazon EC2 launch type. The application requires a
storage solution that has the following characteristics:
• Accessibility for multiple ECS tasks through bind mounts
• Resiliency across Availability Zones
• Burstable throughput of up to 3 Gbps
• Ability to be scaled up over time
Which storage solution meets these requirements?

A. Launch an Amazon FSx for Windows File Server Multi-AZ instance. Configure the ECS task definitions to mount the Amazon FSx instance volume at launch.
B. Launch an Amazon Elastic File System (Amazon EFS) instance. Configure the ECS task definitions to mount the EFS instance volume at launch.
C. Create a Provisioned IOPS SSD (io2) Amazon Elastic Block Store (Amazon EBS) volume with Multi-Attach set to enabled. Attach the EBS volume to the ECS EC2 instances. Configure ECS task definitions to mount the EBS instance volume at launch.
D. Launch an EC2 instance with several Provisioned IOPS SSD (io2) Amazon Elastic Block Store (Amazon EBS) volumes attached in a RAID 0 configuration. Configure the EC2 instance as an NFS storage server. Configure ECS task definitions to mount the volumes at launch.

Answer: B
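
A minimal sketch of mounting EFS in an ECS task definition, as in the answer; the family name, image, and file system ID are placeholder assumptions.

    import boto3

    ecs = boto3.client("ecs")

    # Hypothetical example: containers bind-mount an EFS file system, so every
    # task in every Availability Zone sees the same data.
    ecs.register_task_definition(
        family="content-sharing",
        requiresCompatibilities=["EC2"],
        containerDefinitions=[{
            "name": "app",
            "image": "123456789012.dkr.ecr.us-east-1.amazonaws.com/app:latest",
            "memory": 512,
            "mountPoints": [{"sourceVolume": "shared", "containerPath": "/data"}],
        }],
        volumes=[{
            "name": "shared",
            "efsVolumeConfiguration": {"fileSystemId": "fs-0123456789abcdef0"},
        }],
    )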

NEW QUESTION 151


An image-processing company has a web application that users use to upload images. The application uploads the images into an Amazon S3 bucket. The company has set up S3 event notifications to publish the object creation events to an Amazon Simple Queue Service (Amazon SQS) queue. The SQS queue serves as the event source for an AWS Lambda function that processes the images and sends the results to users through email.
Users report that they are receiving multiple email messages for every uploaded image. A solutions architect determines that
SQS messages are invoking the Lambda function more than once, resulting in multiple email messages.
What should the solutions architect do to resolve this issue with the LEAST operational overhead?

A. Set up long polling in the SQS queue by increasing the ReceiveMessage wait time to 30 seconds.
B. Change the SQS standard queue to an SQS FIFO queue. Use the message deduplication ID to discard duplicate messages.
C. Increase the visibility timeout in the SQS queue to a value that is greater than the total of the function timeout and the batch window timeout.
D. Modify the Lambda function to delete each message from the SQS queue immediately after the message is read before processing.

Answer: B

NEW QUESTION 156


A company needs to move data from an Amazon EC2 instance to an Amazon S3 bucket. The company must ensure that no API calls and no data are routed through public internet routes. Only the EC2 instance can have access to upload data to the S3 bucket.
Which solution will meet these requirements?

A. Create an interface VPC endpoint for Amazon S3 in the subnet where the EC2 instance is located. Attach a resource policy to the S3 bucket to only allow the EC2 instance's IAM role for access.
B. Create a gateway VPC endpoint for Amazon S3 in the Availability Zone where the EC2 instance is located. Attach appropriate security groups to the endpoint. Attach a resource policy to the S3 bucket to only allow the EC2 instance's IAM role for access.
C. Run the nslookup tool from inside the EC2 instance to obtain the private IP address of the S3 bucket's service API endpoint. Create a route in the VPC route table to provide the EC2 instance with access to the S3 bucket. Attach a resource policy to the S3 bucket to only allow the EC2 instance's IAM role for access.
D. Use the AWS provided publicly available ip-ranges.json file to obtain the private IP address of the S3 bucket's service API endpoint. Create a route in the VPC route table to provide the EC2 instance with access to the S3 bucket. Attach a resource policy to the S3 bucket to only allow the EC2 instance's IAM role for access.
Answer: B
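
A rough sketch of the gateway-endpoint idea behind the answer (gateway endpoints are associated with VPC route tables). The VPC, route table, bucket name, and endpoint condition values are placeholder assumptions.

    import boto3, json

    ec2 = boto3.client("ec2")
    s3 = boto3.client("s3")

    # Hypothetical example: route the VPC's S3 traffic through a gateway endpoint.
    endpoint_id = ec2.create_vpc_endpoint(
        VpcEndpointType="Gateway",
        VpcId="vpc-0123456789abcdef0",
        ServiceName="com.amazonaws.us-east-1.s3",
        RouteTableIds=["rtb-0123456789abcdef0"],
    )["VpcEndpoint"]["VpcEndpointId"]

    # Bucket policy that denies any access not coming through that endpoint.
    policy = {
        "Version": "2012-10-17",
        "Statement": [{
            "Effect": "Deny",
            "Principal": "*",
            "Action": "s3:*",
            "Resource": ["arn:aws:s3:::upload-bucket", "arn:aws:s3:::upload-bucket/*"],
            "Condition": {"StringNotEquals": {"aws:sourceVpce": endpoint_id}},
        }],
    }
    s3.put_bucket_policy(Bucket="upload-bucket", Policy=json.dumps(policy))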

NEW QUESTION 159


A company hosts its web application on AWS using seven Amazon EC2 instances. The company requires that the IP addresses
of all healthy EC2 instances be returned in response to DNS queries.
Which policy should be used to meet this requirement?

A. Simple routing policy


B. Latency routing policy
C. Multivalue routing policy
D. Geolocation routing policy

Answer: C

Explanation:
https://aws.amazon.com/premiumsupport/knowledge-center/multivalue-versus-simple-policies/
"Use a multivalue answer routing policy to help distribute DNS responses across multiple resources. For example, use
multivalue answer routing when you want to associate your routing records with a Route 53 health check."
https://docs.aws.amazon.com/Route53/latest/DeveloperGuide/routing-policy.html#routing-policy-multivalue
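
A minimal sketch of one multivalue answer record tied to a health check; the hosted zone ID, health check ID, and IP address are placeholders, and the same call would be repeated for each of the seven instances.

    import boto3

    route53 = boto3.client("route53")

    # Hypothetical example: one record per EC2 instance so only healthy IPs are returned.
    route53.change_resource_record_sets(
        HostedZoneId="Z0123456789ABCDEFGHIJ",
        ChangeBatch={"Changes": [{
            "Action": "UPSERT",
            "ResourceRecordSet": {
                "Name": "www.example.com",
                "Type": "A",
                "SetIdentifier": "web-1",
                "MultiValueAnswer": True,
                "TTL": 60,
                "HealthCheckId": "11111111-2222-3333-4444-555555555555",
                "ResourceRecords": [{"Value": "203.0.113.10"}],
            },
        }]},
    )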

NEW QUESTION 161


A company is planning on deploying a newly built application on AWS in a default VPC. The application will consist of a web layer and a database layer. The web server was created in public subnets, and the MySQL database was created in a private subnet. All subnets are created with the default network ACL settings, and the default security group in the VPC will be replaced with new custom security groups.
Which combination of configurations should a solutions architect recommend? (Select TWO.)

A. Create a database server security group with inbound and outbound rules for MySQL port 3306 traffic to and from anywhere (0.0.0.0/0).
B. Create a database server security group with an inbound rule for MySQL port 3306 and specify the source as the web server security group.
C. Create a web server security group with an inbound allow rule for HTTPS port 443 traffic from anywhere (0.0.0.0/0) and an inbound deny rule for IP range 182.20.0.0/16.
D. Create a web server security group with an inbound rule for HTTPS port 443 traffic from anywhere (0.0.0.0/0). Create network ACL inbound and outbound deny rules for IP range 182.20.0.0/16.
E. Create a web server security group with inbound and outbound rules for HTTPS port 443 traffic to and from anywhere (0.0.0.0/0). Create a network ACL inbound deny rule for IP range 182.20.0.0/16.

Answer: BD
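
A minimal sketch of option B, assuming placeholder security group IDs: the database security group only accepts MySQL traffic whose source is the web server security group.

    import boto3

    ec2 = boto3.client("ec2")

    # Hypothetical example: reference the web server SG as the allowed source.
    ec2.authorize_security_group_ingress(
        GroupId="sg-0a1b2c3d4e5f67890",            # database server security group (placeholder)
        IpPermissions=[{
            "IpProtocol": "tcp",
            "FromPort": 3306,
            "ToPort": 3306,
            "UserIdGroupPairs": [{"GroupId": "sg-0f9e8d7c6b5a43210"}],  # web server SG (placeholder)
        }],
    )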

NEW QUESTION 164


A company is launching a new application and will display application metrics on an Amazon CloudWatch dashboard. The
company’s product manager needs to access this dashboard periodically. The product manager does not have an AWS
account. A solutions architect must provide access to the product manager by following the principle of least privilege.
Which solution will meet these requirements?

A. Share the dashboard from the CloudWatch console. Enter the product manager's email address, and complete the sharing steps. Provide a shareable link for the dashboard to the product manager.
B. Create an IAM user specifically for the product manager. Attach the CloudWatch Read Only Access managed policy to the user. Share the new login credentials with the product manager. Share the browser URL of the correct dashboard with the product manager.
C. Create an IAM user for the company's employees. Attach the View Only Access AWS managed policy to the IAM user. Share the new login credentials with the product manager. Ask the product manager to navigate to the CloudWatch console and locate the dashboard by name in the Dashboards section.
D. Deploy a bastion server in a public subnet. When the product manager requires access to the dashboard, start the server and share the RDP credentials. On the bastion server, ensure that the browser is configured to open the dashboard URL with cached AWS credentials that have appropriate permissions to view the dashboard.

Answer: A

NEW QUESTION 166


A company wants to build a data lake on AWS from data that is stored in an on-premises Oracle relational database. The data lake must receive ongoing updates from the on-premises database.
Which solution will meet these requirements with the LEAST operational overhead?

A. Use AWS DataSync to transfer the data to Amazon S3. Use AWS Glue to transform the data and integrate the data into a
data lake.
B. Use AWS Snowball to transfer the data to Amazon S3. Use AWS Batch to transform the data and integrate the data into a
data lake.
C. Use AWS Database Migration Service (AWS DMS) to transfer the data to Amazon S3 Use AWS Glue to transform the
data and integrate the data into a data lake.
D. Use an Amazon EC2 instance to transfer the data to Amazon S3. Configure the EC2 instance to transform the data and
integrate the data into a data lake.

Answer: C

NEW QUESTION 167


A solutions architect is designing a customer-facing application for a company. The application's database will have a clearly
defined access pattern throughout the year and will have a variable number of reads and writes that depend on the time of
year. The company must retain audit records for the database for 7 days. The recovery point objective (RPO) must be less
than 5 hours. Which solution meets these requirements?

A. Use Amazon DynamoDB with auto scaling. Use on-demand backups and Amazon DynamoDB Streams.
B. Use Amazon Redshift. Configure concurrency scaling. Activate audit logging. Perform database snapshots every 4 hours.
C. Use Amazon RDS with Provisioned IOPS. Activate the database auditing parameter. Perform database snapshots every 5 hours.
D. Use Amazon Aurora MySQL with auto scaling. Activate the database auditing parameter.

Answer: B

NEW QUESTION 171


A company wants to migrate its existing on-premises monolithic application to AWS.
The company wants to keep as much of the front-end code and the backend code as possible. However, the company wants
to break the application into smaller applications. A different team will manage each application. The company needs a
highly scalable solution that minimizes operational overhead.
Which solution will meet these requirements?

A. Host the application on AWS Lambda. Integrate the application with Amazon API Gateway.
B. Host the application with AWS Amplify. Connect the application to an Amazon API Gateway API that is integrated with AWS Lambda.
C. Host the application on Amazon EC2 instances. Set up an Application Load Balancer with EC2 instances in an Auto Scaling group as targets.
D. Host the application on Amazon Elastic Container Service (Amazon ECS). Set up an Application Load Balancer with Amazon ECS as the target.

Answer: C

NEW QUESTION 176


A company's application integrates with multiple software-as-a-service (SaaS) sources for data collection. The company
runs Amazon EC2 instances to receive the data and to upload the data to an Amazon S3 bucket for analysis. The same EC2
instance that receives and uploads the data also sends a notification to the user when an upload is complete. The
company has noticed slow application performance and wants to improve the performance as much as possible.
Which solution will meet these requirements with the LEAST operational overhead?

A. Create an Auto Scaling group so that EC2 instances can scale out. Configure an S3 event notification to send events to an Amazon Simple Notification Service (Amazon SNS) topic when the upload to the S3 bucket is complete.
B. Create an Amazon AppFlow flow to transfer data between each SaaS source and the S3 bucket. Configure an S3 event notification to send events to an Amazon Simple Notification Service (Amazon SNS) topic when the upload to the S3 bucket is complete.
C. Create an Amazon EventBridge (Amazon CloudWatch Events) rule for each SaaS source to send output data. Configure the S3 bucket as the rule's target. Create a second EventBridge (CloudWatch Events) rule to send events when the upload to the S3 bucket is complete. Configure an Amazon Simple Notification Service (Amazon SNS) topic as the second rule's target.
D. Create a Docker container to use instead of an EC2 instance. Host the containerized application on Amazon Elastic Container Service (Amazon ECS). Configure Amazon CloudWatch Container Insights to send events to an Amazon Simple Notification Service (Amazon SNS) topic when the upload to the S3 bucket is complete.
Answer: D

NEW QUESTION 179


A gaming company wants to launch a new internet-facing application in multiple AWS Regions. The application will use the
TCP and UDP protocols for communication. The company needs to provide high availability and minimum latency for global
users.
Which combination of actions should a solutions architect take to meet these requirements? (Select TWO.)

A. Create internal Network Load Balancers in front of the application in each Region
B. Create external Application Load Balancers in front of the application in each Region
C. Create an AWS Global Accelerator accelerator to route traffic to the load balancers in each Region
D. Configure Amazon Route 53 to use a geolocation routing policy to distribute the traffic
E. Configure Amazon CloudFront to handle the traffic and route requests to the application in each Region

Answer: AC

NEW QUESTION 182


An online photo application lets users upload photos and perform image editing operations. The application offers two classes of service: free and paid. Photos submitted by paid users are processed before those submitted by free users. Photos are uploaded to Amazon S3, and the job information is sent to Amazon SQS.
Which configuration should a solutions architect recommend?

A. Use one SQS FIFO queue. Assign a higher priority to the paid photos so they are processed first.
B. Use two SQS FIFO queues: one for paid and one for free. Set the free queue to use short polling and the paid queue to use long polling.
C. Use two SQS standard queues: one for paid and one for free. Configure Amazon EC2 instances to prioritize polling for the paid queue over the free queue.
D. Use one SQS standard queue. Set the visibility timeout of the paid photos to zero. Configure Amazon EC2 instances to prioritize visibility settings so paid photos are processed first.

Answer: C

Explanation:
https://acloud.guru/forums/guru-of-the-week/discussion/-
L7Be8rOao3InQxdQcXj/ https://aws.amazon.com/sqs/features/ Priority:
Use separate queues to provide prioritization of work.
https://aws.amazon.com/sqs/features/
https://aws.amazon.com/sqs/features/#:~:text=Priority%3A%20Use%20sepa
rate%20queues%20to%20provide%
https://docs.aws.amazon.com/AWSSimpleQueueService/latest/SQSDevelope
rGuide/sqs-short-and-long-polling.
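
A minimal sketch of the two-queue priority polling described above; the queue URLs are placeholder assumptions.

    import boto3

    sqs = boto3.client("sqs")
    PAID_QUEUE = "https://sqs.us-east-1.amazonaws.com/123456789012/paid-jobs"   # placeholder URL
    FREE_QUEUE = "https://sqs.us-east-1.amazonaws.com/123456789012/free-jobs"   # placeholder URL

    def next_job():
        # Poll the paid queue first; only look at the free queue when it is empty.
        for queue_url in (PAID_QUEUE, FREE_QUEUE):
            resp = sqs.receive_message(QueueUrl=queue_url, MaxNumberOfMessages=1, WaitTimeSeconds=2)
            messages = resp.get("Messages", [])
            if messages:
                return queue_url, messages[0]
        return None, None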

NEW QUESTION 184


A company wants to use Amazon S3 for the secondary copy of its dataset. The company would rarely need to access this copy. The storage solution's cost should be minimal.
Which storage solution meets these requirements?

A. S3 Standard
B. S3 Intelligent-Tiering
C. S3 Standard-Infrequent Access (S3 Standard-IA)
D. S3 One Zone-Infrequent Access (S3 One Zone-IA)

Answer: C

NEW QUESTION 186


A company runs its two-tier ecommerce website on AWS. The web tier consists of a load balancer that sends traffic to Amazon EC2 instances. The database tier uses an Amazon RDS DB instance. The EC2 instances and the RDS DB instance should not be exposed to the public internet. The EC2 instances require internet access to complete payment processing of orders through a third-party web service. The application must be highly available.
Which combination of configuration options will meet these requirements? (Select TWO.)

A. Use an Auto Scaling group to launch the EC2 instances in private subnets. Deploy an RDS Multi-AZ DB instance in private subnets.
B. Configure a VPC with two private subnets and two NAT gateways across two Availability Zones Deploy an Application
Load Balancer in the private subnets
C. Use an Auto Scaling group to launch the EC2 instances in public subnets across two Availability Zones Deploy an RDS
Multi-AZ DB instance in private subnets
D. Configure a VPC with one public subnet, one private subnet, and two NAT gateways across two Availability Zones
Deploy an Application Load Balancer in the public subnet
E. Configure a VPC with two public subnets, two private subnets, and two NAT gateways across two Availability Zones
Deploy an Application Load Balancer in the public subnets

Answer: AE

NEW QUESTION 188


A company has an AWS Glue extract, transform, and load (ETL) job that runs every day at the same time. The job processes XML data that is in an Amazon S3 bucket.
New data is added to the S3 bucket every day. A solutions architect notices that AWS Glue is processing all the data during each run.
What should the solutions architect do to prevent AWS Glue from reprocessing old data?

A. Edit the job to use job bookmarks.


B. Edit the job to delete data after the data is processed
C. Edit the job by setting the NumberOfWorkers field to 1.
D. Use a FindMatches machine learning (ML) transform.

Answer: B

NEW QUESTION 191


A company uses 50 TB of data for reporting. The company wants to move this data from on premises to AWS. A custom application in the company's data center runs a weekly data transformation job. The company plans to pause the application until the data transfer is complete and needs to begin the transfer process as soon as possible.
The data center does not have any available network bandwidth for additional workloads. A solutions architect must transfer the data and must configure the transformation job to continue to run in the AWS Cloud.
Which solution will meet these requirements with the LEAST operational overhead?

A. Use AWS DataSync to move the data. Create a custom transformation job by using AWS Glue.
B. Order an AWS Snowcone device to move the data. Deploy the transformation application to the device.
C. Order an AWS Snowball Edge Storage Optimized device. Copy the data to the device. Create a custom transformation job by using AWS Glue.
D. Order an AWS Snowball Edge Storage Optimized device that includes Amazon EC2 compute. Copy the data to the device. Create a new EC2 instance on AWS to run the transformation application.

Answer: D

NEW QUESTION 192


A company that primarily runs its application servers on premises has decided to migrate to AWS. The company wants to
minimize its need to scale its Internet Small Computer Systems Interface (iSCSI) storage on premises. The company wants
only its recently accessed data to remain stored locally.
Which AWS solution should the company use to meet these requirements?

A. Amazon S3 File Gateway


B. AWS Storage Gateway Tape Gateway
C. AWS Storage Gateway Volume Gateway stored volumes
D. AWS Storage Gateway Volume Gateway cached volumes

Answer: D

NEW QUESTION 196


A company uses a legacy application to produce data in CSV format. The legacy application stores the output data in Amazon S3. The company is deploying a new commercial off-the-shelf (COTS) application that can perform complex SQL queries to analyze data that is stored in Amazon Redshift and Amazon S3 only. However, the COTS application cannot process the .csv files that the legacy application produces. The company cannot update the legacy application to produce data in another format. The company needs to implement a solution so that the COTS application can use the data that the legacy application produces.
Which solution will meet these requirements with the LEAST operational overhead?
A. Create an AWS Glue extract, transform, and load (ETL) job that runs on a schedule. Configure the ETL job to process the .csv files and store the processed data in Amazon Redshift.
B. Develop a Python script that runs on Amazon EC2 instances to convert the .csv files to .sql files. Invoke the Python script on a cron schedule to store the output files in Amazon S3.
C. Create an AWS Lambda function and an Amazon DynamoDB table. Use an S3 event to invoke the Lambda function. Configure the Lambda function to perform an extract, transform, and load (ETL) job to process the .csv files and store the processed data in the DynamoDB table.
D. Use Amazon EventBridge (Amazon CloudWatch Events) to launch an Amazon EMR cluster on a weekly schedule. Configure the EMR cluster to perform an extract, transform, and load (ETL) job to process the .csv files and store the processed data in an Amazon Redshift table.

Answer: C

NEW QUESTION 197


A company hosts a website on Amazon EC2 instances behind an Application Load Balancer (ALB). The website serves static content. Website traffic is increasing, and the company is concerned about a potential increase in cost.
What should a solutions architect do to reduce the cost of the website?

A. Create an Amazon CloudFront distribution to cache static files at edge locations.


B. Create an Amazon ElastiCache cluster Connect the ALB to the ElastiCache cluster to serve cached files.
C. Create an AWS WAF web ACL, and associate it with the ALB Add a rule to the web ACL to cache static files.
D. Create a second ALB in an alternative AWS Region Route user traffic to the closest Region to minimize data transfer costs.

Answer: A

NEW QUESTION 200


An online retail company has more than 50 million active customers and receives more than 25,000 orders each day. The
company collects purchase data for customers and stores this data in Amazon S3. Additional customer data is stored in
Amazon RDS.
The company wants to make all the data available to various teams so that the teams can perform analytics. The solution
must provide the ability to manage fine- grained permissions for the data and must minimize operational overhead.
Which solution will meet these requirements?

A. Migrate the purchase data to write directly to Amazon RDS. Use RDS access controls to limit access.
B. Schedule an AWS Lambda function to periodically copy data from Amazon RDS to Amazon S3. Create an AWS Glue crawler. Use Amazon Athena to query the data. Use S3 policies to limit access.
C. Create a data lake by using AWS Lake Formation. Create an AWS Glue JDBC connection to Amazon RDS. Register the S3 bucket in Lake Formation. Use Lake Formation access controls to limit access.
D. Create an Amazon Redshift cluster. Schedule an AWS Lambda function to periodically copy data from Amazon S3 and Amazon RDS to Amazon Redshift. Use Amazon Redshift access controls to limit access.

Answer: C
NEW QUESTION 204
A company has an application that processes customer orders. The company hosts the application on an Amazon EC2 instance that saves the orders to an Amazon Aurora database. Occasionally when traffic is high, the workload does not process orders fast enough.

A. Increase the instance size of the EC2 instance when traffic is high.
B. Write orders to Amazon Simple Notification Service (Amazon SNS). Subscribe the database endpoint to the SNS topic.
C. Write orders to an Amazon Simple Queue Service (Amazon SQS) queue. Use EC2 instances in an Auto Scaling group behind an Application Load Balancer to read from the SQS queue and process orders into the database.
D. Write orders to Amazon Simple Notification Service (Amazon SNS). Subscribe the database endpoint to the SNS topic. Use EC2 instances in an Auto Scaling group behind an Application Load Balancer to read from the SNS topic.
E. Write orders to an Amazon Simple Queue Service (Amazon SQS) queue when the EC2 instance reaches the CPU threshold limit. Use scheduled scaling of EC2 instances in an Auto Scaling group behind an Application Load Balancer to read from the SQS queue and process orders into the database.

Answer: B

NEW QUESTION 205


To meet security requirements, a company needs to encrypt all of its application data in transit while communicating with an Amazon RDS MySQL DB instance. A recent security audit revealed that encryption at rest is enabled using AWS Key Management Service (AWS KMS), but encryption in transit is not enabled.
What should a solutions architect do to satisfy the security requirements?

A. Enable IAM database authentication on the database.


B. Provide self-signed certificates, Use the certificates in all connections to the RDS instance
C. Take a snapshot of the RDS instance Restore the snapshot to a new instance with encryption enabled
D. Download AWS-provided root certificates Provide the certificates in all connections to the RDS instance

Answer: C

Explanation:
https://docs.aws.amazon.com/AmazonRDS/latest/UserGuide/Overview.Encryption.html#Overview.Encryption.
NEW QUESTION 209
A company maintains a searchable repository of items on its website. The data is stored in an Amazon RDS for MySQL database table that contains more than 10 million rows. The database has 2 TB of General Purpose SSD storage. There are millions of updates against this data every day through the company's website. The company has noticed that some insert operations are taking 10 seconds or longer. The company has determined that the database storage performance is the problem.
Which solution addresses this performance issue?

A. Change the storage type to Provisioned IOPS SSD


B. Change the DB instance to a memory optimized instance class
C. Change the DB instance to a burstable performance instance class
D. Enable Multi-AZ RDS read replicas with MySQL native asynchronous replication.

Answer: A

Explanation:
https://aws.amazon.com/ebs/features/
"Provisioned IOPS volumes are backed by solid-state drives (SSDs) and are the highest performance EBS volumes designed
for your critical, I/O intensive database applications. These volumes are ideal for both IOPS-intensive and throughput-
intensive workloads that require extremely low latency."
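
A minimal sketch of the chosen answer, assuming a placeholder instance identifier and IOPS value.

    import boto3

    rds = boto3.client("rds")

    # Hypothetical example: switch the instance's storage to Provisioned IOPS SSD
    # so insert-heavy traffic is no longer throttled by General Purpose SSD limits.
    rds.modify_db_instance(
        DBInstanceIdentifier="catalog-db",
        StorageType="io1",
        Iops=12000,
        ApplyImmediately=True,
    )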

NEW QUESTION 210


A company runs its ecommerce application on AWS. Every new order is published as a message in a RabbitMQ queue that
runs on an Amazon EC2 instance in a single Availability Zone. These messages are processed by a different application that
runs on a separate EC2 instance. This application stores the details in a PostgreSQL database on another EC2 instance. All
the EC2 instances are in the same Availability Zone.
The company needs to redesign its architecture to provide the highest availability with the least operational overhead.
What should a solutions architect do to meet these requirements?

A. Migrate the queue to a redundant pair (active/standby) of RabbitMQ instances on Amazon MQ. Create a Multi-AZ Auto Scaling group for EC2 instances that host the application. Create another Multi-AZ Auto Scaling group for EC2 instances that host the PostgreSQL database.
B. Migrate the queue to a redundant pair (active/standby) of RabbitMQ instances on Amazon MQ. Create a Multi-AZ Auto Scaling group for EC2 instances that host the application. Migrate the database to run on a Multi-AZ deployment of Amazon RDS for PostgreSQL.
C. Create a Multi-AZ Auto Scaling group for EC2 instances that host the RabbitMQ queue. Create another Multi-AZ Auto Scaling group for EC2 instances that host the application. Migrate the database to run on a Multi-AZ deployment of Amazon RDS for PostgreSQL.
D. Create a Multi-AZ Auto Scaling group for EC2 instances that host the RabbitMQ queue. Create another Multi-AZ Auto Scaling group for EC2 instances that host the application. Create a third Multi-AZ Auto Scaling group for EC2 instances that host the PostgreSQL database.

Answer: C

NEW QUESTION 212


A solutions architect is using an AWS CloudFormation template to deploy a three-tier web application. The web application consists of a web tier and an application tier that stores and retrieves user data in Amazon DynamoDB tables. The web and application tiers are hosted on Amazon EC2 instances, and the database tier is not publicly accessible. The application EC2 instances need to access the DynamoDB tables without exposing API credentials in the template.
What should the solutions architect do to meet the requirements?

A. Create an IAM role to read the DynamoDB tables. Associate the role with the application instances by referencing an instance profile.
B. Create an IAM role that has the required permissions to read and write from the DynamoDB tables. Add the role to the EC2 instance profile, and associate the instance profile with the application instances.
C. Use the parameter section in the AWS CloudFormation template to have the user input access and secret keys from an already-created IAM user that has the required permissions to read and write from the DynamoDB tables.
D. Create an IAM user in the AWS CloudFormation template that has the required permissions to read and write from the DynamoDB tables. Use the GetAtt function to retrieve the access and secret keys, and pass them to the application instances through the user data.

Answer: B

NEW QUESTION 216


A company's application is running on Amazon EC2 instances within an Auto Scaling group behind an Elastic Load Balancer. Based on the application's history, the company anticipates a spike in traffic during a holiday each year. A solutions architect must design a strategy to ensure that the Auto Scaling group proactively increases capacity to minimize any performance impact on application users.
Which solution will meet these requirements?

A. Create an Amazon CloudWatch alarm to scale up the EC2 instances when CPU utilization exceeds 90%.
B. Create a recurring scheduled action to scale up the Auto Scaling group before the expected period of peak demand.
C. increase the minimum and maximum number of EC2 instances in the Auto Scaling group during the peak demand period
D. Configure an Amazon Simple Notification Service (Amazon SNS) notification to send alerts when there are autoscaling
EC2_INSTANCE_LAUNCH events

Answer: B
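
A minimal sketch of the recurring scheduled action in the answer; the group name, capacity values, and cron expression (UTC) are placeholder assumptions.

    import boto3

    autoscaling = boto3.client("autoscaling")

    # Hypothetical example: raise capacity before the expected holiday peak.
    autoscaling.put_scheduled_update_group_action(
        AutoScalingGroupName="web-asg",
        ScheduledActionName="holiday-scale-up",
        Recurrence="0 6 20 12 *",        # 06:00 UTC on December 20, every year
        MinSize=10,
        MaxSize=40,
        DesiredCapacity=20,
    )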

NEW QUESTION 219


A company wants to run its critical applications in containers to meet requirements for scalability and availability. The company prefers to focus on maintenance of the critical applications. The company does not want to be responsible for provisioning and managing the underlying infrastructure that runs the containerized workload.
What should a solutions architect do to meet those requirements?

A. Use Amazon EC2 Instances, and Install Docker on the Instances


B. Use Amazon Elastic Container Service (Amazon ECS) on Amazon EC2 worker nodes
C. Use Amazon Elastic Container Service (Amazon ECS) on AWS Fargate
D. Use Amazon EC2 instances from an Amazon Elastic Container Service (Amazon ECS)-optimized Amazon Machine Image (AMI).

Answer: C

Explanation:
Use Amazon ECS on AWS Fargate, since the requirements call for scalability and availability without having to provision and manage the underlying infrastructure that runs the containerized workload.
https://docs.aws.amazon.com/AmazonECS/latest/userguide/what-is-fargate.html

NEW QUESTION 224


A company needs to retain application log files for a critical application for 10 years. The application team regularly accesses
logs from the past month for troubleshooting, but logs older than 1 month are rarely accessed. The application generates
more than 10 TB of logs per month.
Which storage option meets these requirements MOST cost-effectively?

A. Store the logs in Amazon S3. Use AWS Backup to move logs more than 1 month old to S3 Glacier Deep Archive.
B. Store the logs in Amazon S3. Use S3 Lifecycle policies to move logs more than 1 month old to S3 Glacier Deep Archive.
C. Store the logs in Amazon CloudWatch Logs. Use AWS Backup to move logs more than 1 month old to S3 Glacier Deep Archive.
D. Store the logs in Amazon CloudWatch Logs. Use Amazon S3 Lifecycle policies to move logs more than 1 month old to S3 Glacier Deep Archive.

Answer: B
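
A minimal sketch of the lifecycle rule in the answer, assuming a placeholder bucket name: transition logs to S3 Glacier Deep Archive after 30 days and expire them after roughly 10 years.

    import boto3

    s3 = boto3.client("s3")

    # Hypothetical example: archive month-old logs and expire them after 10 years.
    s3.put_bucket_lifecycle_configuration(
        Bucket="app-logs",
        LifecycleConfiguration={"Rules": [{
            "ID": "archive-old-logs",
            "Status": "Enabled",
            "Filter": {"Prefix": ""},                 # apply to all log objects
            "Transitions": [{"Days": 30, "StorageClass": "DEEP_ARCHIVE"}],
            "Expiration": {"Days": 3650},             # approximately 10 years
        }]},
    )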

NEW QUESTION 227


A company wants to manage Amazon Machine Images (AMIs). The company currently copies AMIs to the same AWS Region where the AMIs were created. The company needs to design an application that captures AWS API calls and sends alerts whenever the Amazon EC2 CreateImage API operation is called within the company's account.
Which solution will meet these requirements with the LEAST operational overhead?

A. Create an AWS Lambda function to query AWS CloudTrail logs and to send an alert when a CreateImage API call is detected.
B. Configure AWS CloudTrail with an Amazon Simple Notification Service (Amazon SNS) notification that occurs when updated logs are sent to Amazon S3. Use Amazon Athena to create a new table and to query on CreateImage when an API call is detected.
C. Create an Amazon EventBridge (Amazon CloudWatch Events) rule for the CreateImage API call. Configure the target as an Amazon Simple Notification Service (Amazon SNS) topic to send an alert when a CreateImage API call is detected.
D. Configure an Amazon Simple Queue Service (Amazon SQS) FIFO queue as a target for AWS CloudTrail logs. Create an AWS Lambda function to send an alert to an Amazon Simple Notification Service (Amazon SNS) topic when a CreateImage API call is detected.

Answer: B

NEW QUESTION 231


A company has deployed a serverless application that invokes an AWS Lambda function when new documents are uploaded to an Amazon S3 bucket. The application uses the Lambda function to process the documents. After a recent marketing campaign, the company noticed that the application did not process many of the documents.
What should a solutions architect do to improve the architecture of this application?

A. Set the Lambda function's runtime timeout value to 15 minutes.
B. Configure an S3 bucket replication policy. Stage the documents in the S3 bucket for later processing.
C. Deploy an additional Lambda function. Load balance the processing of the documents across the two Lambda functions.
D. Create an Amazon Simple Queue Service (Amazon SQS) queue. Send the requests to the queue. Configure the queue as an event source for Lambda.

Answer: D
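A brief sketch of option D with boto3: route the S3 upload notifications into an SQS queue and register the queue as the Lambda event source, so bursts of uploads are buffered instead of dropped. The bucket, queue, and function names are hypothetical, and the queue's access policy must allow S3 to send messages to it.

import boto3

s3 = boto3.client("s3")
lam = boto3.client("lambda")

# Point the bucket's object-created notifications at the queue.
s3.put_bucket_notification_configuration(
    Bucket="example-upload-bucket",
    NotificationConfiguration={
        "QueueConfigurations": [{
            "QueueArn": "arn:aws:sqs:us-east-1:123456789012:document-uploads",
            "Events": ["s3:ObjectCreated:*"],
        }]
    },
)

# Register the queue as the event source for the existing function.
lam.create_event_source_mapping(
    EventSourceArn="arn:aws:sqs:us-east-1:123456789012:document-uploads",
    FunctionName="process-documents",
    BatchSize=10,
)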

NEW QUESTION 235


A company wants to reduce the cost of its existing three-tier web architecture. The web, application, and database servers are
running on Amazon EC2 instances for the development, test, and production environments. The EC2 instances average
30% CPU utilization during peak hours and 10% CPU utilization during non-peak hours.
Which EC2 instance purchasing solution will meet the company's requirements MOST cost-effectively?

A. Use Spot Instances for the production EC2 instances. Use Reserved Instances for the development and test EC2 instances.
B. Use Reserved Instances for the production EC2 instances. Use On-Demand Instances for the development and test EC2 instances.
C. Use Spot blocks for the production EC2 instances. Use Reserved Instances for the development and test EC2 instances.
D. Use On-Demand Instances for the production EC2 instances. Use Spot blocks for the development and test EC2 instances.

Answer: B

NEW QUESTION 239


A global company hosts its web application on Amazon EC2 instances behind an Application Load Balancer (ALB). The web
application has static data and dynamic data. The company stores its static data in an Amazon S3 bucket. The company
wants to improve performance and reduce latency for the static data and dynamic data. The company is using its own
domain name registered with Amazon Route 53.
What should a solutions architect do to meet these requirements?

A. Create an Amazon CloudFront distribution that has the S3 bucket and the ALB as origins. Configure Route 53 to route traffic to the CloudFront distribution.
B. Create an Amazon CloudFront distribution that has the ALB as an origin. Create an AWS Global Accelerator standard accelerator that has the S3 bucket as an endpoint. Configure Route 53 to route traffic to the CloudFront distribution.
C. Create an Amazon CloudFront distribution that has the S3 bucket as an origin. Create an AWS Global Accelerator standard accelerator that has the ALB and the CloudFront distribution as endpoints. Create a custom domain name that points to the accelerator DNS name. Use the custom domain name as an endpoint for the web application.
D. Create an Amazon CloudFront distribution that has the ALB as an origin. Create an AWS Global Accelerator standard accelerator that has the S3 bucket as an endpoint. Create two domain names. Point one domain name to the CloudFront DNS name for dynamic content. Point the other domain name to the accelerator DNS name for static content. Use the domain names as endpoints for the web application.

Answer: A
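To complete option A, Route 53 needs an alias record that points the company's domain at the CloudFront distribution. A minimal boto3 sketch follows; the company's hosted zone ID, domain, and distribution domain name are placeholders, while Z2FDTNDATAQYW2 is the fixed hosted zone ID used for CloudFront alias targets.

import boto3

route53 = boto3.client("route53")

route53.change_resource_record_sets(
    HostedZoneId="Z0123456789EXAMPLE",  # placeholder: the company's hosted zone
    ChangeBatch={
        "Changes": [{
            "Action": "UPSERT",
            "ResourceRecordSet": {
                "Name": "www.example.com",  # placeholder domain
                "Type": "A",
                "AliasTarget": {
                    "HostedZoneId": "Z2FDTNDATAQYW2",  # CloudFront alias zone
                    "DNSName": "d111111abcdef8.cloudfront.net",  # placeholder distribution
                    "EvaluateTargetHealth": False,
                },
            },
        }]
    },
)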

NEW QUESTION 243


A company hosts its web applications in the AWS Cloud. The company configures Elastic Load Balancers to use certificates
that are imported into AWS Certificate Manager (ACM). The company's security team must be notified 30 days before the
expiration of each certificate.
What should a solutions architect recommend to meet the requirement?

A. Add a rule in ACM to publish a custom message to an Amazon Simple Notification Service (Amazon SNS) topic every day, beginning 30 days before any certificate will expire.
B. Create an AWS Config rule that checks for certificates that will expire within 30 days. Configure Amazon EventBridge (Amazon CloudWatch Events) to invoke a custom alert by way of Amazon Simple Notification Service (Amazon SNS) when AWS Config reports a noncompliant resource.
C. Use AWS Trusted Advisor to check for certificates that will expire within 30 days. Create an Amazon CloudWatch alarm that is based on Trusted Advisor metrics for check status changes. Configure the alarm to send a custom alert by way of Amazon Simple Notification Service (Amazon SNS).
D. Create an Amazon EventBridge (Amazon CloudWatch Events) rule to detect any certificates that will expire within 30 days. Configure the rule to invoke an AWS Lambda function. Configure the Lambda function to send a custom alert by way of Amazon Simple Notification Service (Amazon SNS).

Answer: B
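A sketch of the AWS Config side of option B, assuming the managed rule identifier ACM_CERTIFICATE_EXPIRATION_CHECK and its daysToExpiration parameter; verify both against the current AWS Config documentation before relying on them. EventBridge can then match the rule's compliance-change events and notify the security team through SNS.

import json
import boto3

config = boto3.client("config")

config.put_config_rule(
    ConfigRule={
        "ConfigRuleName": "acm-certificate-expiration-check",  # hypothetical name
        "Source": {
            "Owner": "AWS",
            # Assumed identifier of the AWS managed rule for ACM expiration checks.
            "SourceIdentifier": "ACM_CERTIFICATE_EXPIRATION_CHECK",
        },
        # Flag certificates that expire within 30 days (assumed parameter name).
        "InputParameters": json.dumps({"daysToExpiration": "30"}),
    },
)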

NEW QUESTION 244


A solutions architect is using Amazon S3 to design the storage architecture of a new digital media application. The media
files must be resilient to the loss of an Availability Zone. Some files are accessed frequently, while other files are rarely
accessed in an unpredictable pattern. The solutions architect must minimize the costs of storing and retrieving the media
files.
Which storage option meets these requirements?

A. S3 Standard
B. S3 Intelligent-Tiering
C. S3 Standard-Infrequent Access (S3 Standard-IA)
D. S3 One Zone-Infrequent Access (S3 One Zone-IA)

Answer: B
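As a small illustration of option B, objects can be written straight into S3 Intelligent-Tiering at upload time; S3 then moves each object between access tiers based on how often it is read. The bucket and key names are made up.

import boto3

s3 = boto3.client("s3")

with open("launch-teaser.mp4", "rb") as body:
    s3.put_object(
        Bucket="example-media-bucket",   # hypothetical bucket name
        Key="videos/launch-teaser.mp4",  # hypothetical object key
        Body=body,
        StorageClass="INTELLIGENT_TIERING",
    )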

NEW QUESTION 247


A solutions architect is tasked with transferring 750 TB of data from a network-attached file system located at a branch
office to Amazon S3 Glacier. The solution must avoid saturating the branch office's low-bandwidth internet connection.
What is the MOST cost-effective solution?
A. Create a site-to-site VPN tunnel to an Amazon S3 bucket and transfer the files directly. Create a bucket policy to enforce a VPC endpoint.
B. Order 10 AWS Snowball appliances and select an S3 Glacier vault as the destination. Create a bucket policy to enforce a VPC endpoint.
C. Mount the network-attached file system to Amazon S3 and copy the files directly. Create a lifecycle policy to transition the S3 objects to Amazon S3 Glacier.
D. Order 10 AWS Snowball appliances and select an Amazon S3 bucket as the destination. Create a lifecycle policy to transition the S3 objects to Amazon S3 Glacier.

Answer: D

NEW QUESTION 251


A company's web application resizes uploaded images for users. The application stores the original images and the resized
images in Amazon S3. The company needs to minimize the storage costs for all the images. Original images are viewed
frequently, and resized images are viewed infrequently after they are created. Both types of images need to be
immediately available.
Which combination of actions should a solutions architect take to meet these requirements? (Select TWO.)

A. Store the original images in S3 Standard.
B. Store the resized images in S3 Standard.
C. Store the original images in S3 Glacier.
D. Store the resized images in S3 Glacier.
E. Store the resized images in S3 One Zone-Infrequent Access (S3 One Zone-IA).

Answer: AE
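A tiny sketch of how the application could apply the two storage classes at upload time: S3 Standard for the frequently viewed originals and S3 One Zone-IA for the infrequently viewed resized copies. The bucket, keys, and helper function are illustrative only.

import boto3

s3 = boto3.client("s3")

def store_image(bucket: str, key: str, data: bytes, is_original: bool) -> None:
    """Upload an image, picking the storage class by access pattern."""
    s3.put_object(
        Bucket=bucket,
        Key=key,
        Body=data,
        StorageClass="STANDARD" if is_original else "ONEZONE_IA",
    )

# Example usage with placeholder names:
# store_image("example-images", "original/cat.jpg", img_bytes, is_original=True)
# store_image("example-images", "resized/cat-200x200.jpg", thumb_bytes, is_original=False)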

NEW QUESTION 256


A company is deploying a new application to Amazon Elastic Kubernetes Service (Amazon EKS) with an AWS Fargate cluster.
The application needs a storage solution for data persistence. The solution must be highly available and fault tolerant. The
solution also must be shared between multiple application containers. Which solution will meet these requirements with
the LEAST operational overhead?

A. Create Amazon Elastic Block Store (Amazon EBS) volumes in the same Availability Zones where the EKS worker nodes are placed. Register the volumes in a StorageClass object on the EKS cluster. Use EBS Multi-Attach to share the data between containers.
B. Create an Amazon Elastic File System (Amazon EFS) file system. Register the file system in a StorageClass object on the EKS cluster. Use the same file system for all containers.
C. Create an Amazon Elastic Block Store (Amazon EBS) volume. Register the volume in a StorageClass object on the EKS cluster. Use the same volume for all containers.
D. Create Amazon Elastic File System (Amazon EFS) file systems in the same Availability Zones where the EKS worker nodes are placed. Register the file systems in a StorageClass object on the EKS cluster. Create an AWS Lambda function to synchronize the data between the file systems.

Answer: B
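On the AWS side of option B, the shared file system and its mount targets can be created as sketched below; the subnet and security group IDs are placeholders. The resulting file system ID is then referenced from a StorageClass/PersistentVolume for the EFS CSI driver on the EKS cluster.

import boto3

efs = boto3.client("efs")

fs = efs.create_file_system(
    CreationToken="eks-shared-storage",  # idempotency token, hypothetical
    PerformanceMode="generalPurpose",
    Encrypted=True,
)

# One mount target per Availability Zone used by the cluster (placeholder IDs).
for subnet_id in ["subnet-aaaa1111", "subnet-bbbb2222"]:
    efs.create_mount_target(
        FileSystemId=fs["FileSystemId"],
        SubnetId=subnet_id,
        SecurityGroups=["sg-0123456789abcdef0"],
    )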

NEW QUESTION 257


A company has a document management application that contains PDF documents. The company hosts the application on
Amazon EC2 instances. According to regulations, the instances must not have access to the internet. The application must be
able to read and write to a persistent storage system that provides native versioning capabilities.
A solutions architect needs to design secure storage that maximizes resiliency and facilitates data sharing across instances.
Which solution meets these requirements?

A. Place the instances in a public subnet. Use Amazon S3 for storage. Access S3 objects by using URLs.
B. Place the instances in a private subnet. Use Amazon S3 for storage. Use a VPC endpoint to access S3 objects.
C. Use the instances with a Provisioned IOPS SSD (io2) Amazon Elastic Block Store (Amazon EBS) volume.
D. Use Amazon Elastic File System (Amazon EFS) Standard-Infrequent Access (Standard-IA) to store data and provide shared access to the instances.

Answer: B
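A short sketch of the Gateway VPC endpoint from option B, which lets instances in private subnets reach Amazon S3 without any internet access. The VPC ID, route table ID, and Region are placeholders.

import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")  # region is an assumption

ec2.create_vpc_endpoint(
    VpcEndpointType="Gateway",
    VpcId="vpc-0123456789abcdef0",           # placeholder VPC ID
    ServiceName="com.amazonaws.us-east-1.s3",
    RouteTableIds=["rtb-0123456789abcdef0"],  # route tables of the private subnets
)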

NEW QUESTION 261


......
Thank You for Trying Our Product

We offer two products:

1st - Practice Tests Software with Actual Exam Questions
2nd - Questions and Answers in PDF Format
