Amazon AWS Certified Solutions Architect - Associate SAA-C03
#181 (Accuracy: 100% / 1 votes)
A company is launching a new application that requires a structured database to store user profiles, application settings, and transactional data. The database must be scalable with application traffic and must offer backups.

Which solution will meet these requirements MOST cost-effectively?
  • A. Deploy a self-managed database on Amazon EC2 instances by using open source software. Use Spot Instances for cost optimization. Configure automated backups to Amazon S3.
  • B. Use Amazon RDS. Use on-demand capacity mode for the database with General Purpose SSD storage. Configure automatic backups with a retention period of 7 days.
  • C. Use Amazon Aurora Serverless for the database. Use serverless capacity scaling. Configure automated backups to Amazon S3.
  • D. Deploy a self-managed NoSQL database on Amazon EC2 instances. Use Reserved Instances for cost optimization. Configure automated backups directly to Amazon S3 Glacier Flexible Retrieval.
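For reference, the configuration described in option B (an RDS instance on General Purpose SSD storage with 7-day automated backups) can be sketched with boto3 roughly as follows. The identifier, instance class, and sizing below are hypothetical placeholders, not values from the question.

import boto3

rds = boto3.client("rds")

# Hypothetical example: MySQL instance on gp3 storage with 7-day automated backups.
rds.create_db_instance(
    DBInstanceIdentifier="app-profiles-db",   # placeholder name
    Engine="mysql",
    DBInstanceClass="db.t3.medium",           # placeholder size
    AllocatedStorage=100,
    StorageType="gp3",                        # General Purpose SSD
    MasterUsername="admin",
    ManageMasterUserPassword=True,            # let RDS store the password in Secrets Manager
    BackupRetentionPeriod=7,                  # automatic backups kept for 7 days
)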
#182 (Accuracy: 100% / 9 votes)
A company that primarily runs its application servers on premises has decided to migrate to AWS. The company wants to minimize its need to scale its Internet Small Computer Systems Interface (iSCSI) storage on premises. The company wants only its recently accessed data to remain stored locally.

Which AWS solution should the company use to meet these requirements?
  • A. Amazon S3 File Gateway
  • B. AWS Storage Gateway Tape Gateway
  • C. AWS Storage Gateway Volume Gateway stored volumes
  • D. AWS Storage Gateway Volume Gateway cached volumes
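Option D (cached volumes) keeps the full volume data in Amazon S3 and caches only recently accessed blocks on premises, which matches the requirement to minimize local iSCSI storage. A rough boto3 sketch of adding a cached iSCSI volume to an existing Volume Gateway; every ARN, name, and address below is a hypothetical placeholder.

import uuid
import boto3

sgw = boto3.client("storagegateway")

# Hypothetical example: create a 500 GiB cached volume on an existing Volume Gateway.
sgw.create_cached_iscsi_volume(
    GatewayARN="arn:aws:storagegateway:us-east-1:111122223333:gateway/sgw-EXAMPLE",
    VolumeSizeInBytes=500 * 1024**3,
    TargetName="app-volume",          # iSCSI target name (placeholder)
    NetworkInterfaceId="10.0.0.10",   # gateway VM network interface (placeholder)
    ClientToken=str(uuid.uuid4()),    # idempotency token
)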
#183 (Accuracy: 100% / 1 votes)
A company stores user data in AWS. The data is used continuously with peak usage during business hours. Access patterns vary, with some data not being used for months at a time. A solutions architect must choose a cost-effective solution that maintains the highest level of durability while maintaining high availability.

Which storage solution meets these requirements?
  • A. Amazon S3 Standard
  • B. Amazon S3 Intelligent-Tiering
  • C. Amazon S3 Glacier Deep Archive
  • D. Amazon S3 One Zone-Infrequent Access (S3 One Zone-IA)
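Option B (S3 Intelligent-Tiering) moves objects between access tiers automatically as access patterns change while retaining S3's multi-AZ durability. A minimal sketch of writing an object directly into that storage class; the bucket and key names are hypothetical.

import boto3

s3 = boto3.client("s3")

# Hypothetical example: store user data in the Intelligent-Tiering storage class.
s3.put_object(
    Bucket="example-user-data-bucket",     # placeholder bucket
    Key="users/12345/profile.json",        # placeholder key
    Body=b'{"name": "example"}',
    StorageClass="INTELLIGENT_TIERING",    # S3 manages tiering as access patterns change
)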
#184 (Accuracy: 100% / 1 votes)
A company is creating a prototype of an ecommerce website on AWS. The website consists of an Application Load Balancer, an Auto Scaling group of Amazon EC2 instances for web servers, and an Amazon RDS for MySQL DB instance that runs with the Single-AZ configuration.

The website is slow to respond during searches of the product catalog. The product catalog is a group of tables in the MySQL database that the company does not update frequently. A solutions architect has determined that the CPU utilization on the DB instance is high when product catalog searches occur.

What should the solutions architect recommend to improve the performance of the website during searches of the product catalog?
  • A. Migrate the product catalog to an Amazon Redshift database. Use the COPY command to load the product catalog tables.
  • B. Implement an Amazon ElastiCache for Redis cluster to cache the product catalog. Use lazy loading to populate the cache.
  • C. Add an additional scaling policy to the Auto Scaling group to launch additional EC2 instances when database response is slow.
  • D. Turn on the Multi-AZ configuration for the DB instance. Configure the EC2 instances to throttle the product catalog queries that are sent to the database.
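Option B describes the cache-aside (lazy loading) pattern: check ElastiCache for Redis first and query MySQL only on a cache miss. A minimal sketch, assuming the redis-py client, a hypothetical cluster endpoint, and a placeholder lookup function standing in for the real RDS for MySQL query.

import json
import redis

# Hypothetical ElastiCache for Redis endpoint.
cache = redis.Redis(host="my-cache.example.use1.cache.amazonaws.com", port=6379)

def query_catalog_from_mysql(product_id: str) -> dict:
    # Placeholder for the real RDS for MySQL query (e.g., via PyMySQL).
    return {"id": product_id, "name": "example product"}

def get_product(product_id: str) -> dict:
    """Lazy loading: return the product from cache, falling back to MySQL on a miss."""
    key = f"product:{product_id}"
    cached = cache.get(key)
    if cached is not None:
        return json.loads(cached)                     # cache hit: no database query

    product = query_catalog_from_mysql(product_id)    # cache miss: hit the database once
    cache.setex(key, 3600, json.dumps(product))       # populate the cache with a 1-hour TTL
    return product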
#185 (Accuracy: 100% / 1 votes)
A company hosts its core network services, including directory services and DNS, in its on-premises data center. The data center is connected to the AWS Cloud using AWS Direct Connect (DX). Additional AWS accounts are planned that will require quick, cost-effective, and consistent access to these network services.

What should a solutions architect implement to meet these requirements with the LEAST amount of operational overhead?
  • A. Create a DX connection in each new account. Route the network traffic to the on-premises servers.
  • B. Configure VPC endpoints in the DX VPC for all required services. Route the network traffic to the on-premises servers.
  • C. Create a VPN connection between each new account and the DX VPC. Route the network traffic to the on-premises servers.
  • D. Configure AWS Transit Gateway between the accounts. Assign DX to the transit gateway and route network traffic to the on-premises servers.
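Option D centralizes connectivity: each new account's VPC attaches to a shared transit gateway, and the Direct Connect gateway is associated with that transit gateway. A rough boto3 sketch of the transit gateway side; the VPC and subnet IDs are hypothetical placeholders.

import boto3

ec2 = boto3.client("ec2")

# Hypothetical example: create a transit gateway and attach a new account's VPC to it.
tgw = ec2.create_transit_gateway(
    Description="Shared hub for on-premises network services",
)["TransitGateway"]

ec2.create_transit_gateway_vpc_attachment(
    TransitGatewayId=tgw["TransitGatewayId"],
    VpcId="vpc-0123456789abcdef0",             # placeholder VPC in a new account
    SubnetIds=["subnet-0123456789abcdef0"],    # placeholder subnet
)
# The Direct Connect gateway is then associated with the transit gateway
# (directconnect:CreateDirectConnectGatewayAssociation) so traffic can reach on premises.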
#186 (Accuracy: 100% / 2 votes)
A company runs an on-premises application on a Kubernetes cluster. The company recently added millions of new customers. The company's existing on-premises infrastructure is unable to handle the large number of new customers. The company needs to migrate the on-premises application to the AWS Cloud.

The company will migrate to an Amazon Elastic Kubernetes Service (Amazon EKS) cluster. The company does not want to manage the underlying compute infrastructure for the new architecture on AWS.

Which solution will meet these requirements with the LEAST operational overhead?
  • A. Use a self-managed node to supply compute capacity. Deploy the application to the new EKS cluster.
  • B. Use managed node groups to supply compute capacity. Deploy the application to the new EKS cluster.
  • C. Use AWS Fargate to supply compute capacity. Create a Fargate profile. Use the Fargate profile to deploy the application.
  • D. Use managed node groups with Karpenter to supply compute capacity. Deploy the application to the new EKS cluster.
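Option C removes node management entirely: pods that match a Fargate profile run on AWS-managed capacity. A minimal boto3 sketch of creating such a profile on an existing EKS cluster; the cluster name, role ARN, subnets, and namespace are hypothetical.

import boto3

eks = boto3.client("eks")

# Hypothetical example: run every pod in the "app" namespace on Fargate.
eks.create_fargate_profile(
    fargateProfileName="app-profile",
    clusterName="migrated-cluster",        # placeholder cluster
    podExecutionRoleArn="arn:aws:iam::111122223333:role/eks-fargate-pod-role",
    subnets=["subnet-0123456789abcdef0"],  # private subnets only
    selectors=[{"namespace": "app"}],      # pods in this namespace are scheduled on Fargate
)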
#187 (Accuracy: 100% / 4 votes)
A company is migrating its on-premises Oracle database to an Amazon RDS for Oracle database. The company needs to retain data for 90 days to meet regulatory requirements. The company must also be able to restore the database to a specific point in time for up to 14 days.

Which solution will meet these requirements with the LEAST operational overhead?
  • A. Create Amazon RDS automated backups. Set the retention period to 90 days.
  • B. Create an Amazon RDS manual snapshot every day. Delete manual snapshots that are older than 90 days.
  • C. Use the Amazon Aurora Clone feature for Oracle to create a point-in-time restore. Delete clones that are older than 90 days.
  • D. Create a backup plan that has a retention period of 90 days by using AWS Backup for Amazon RDS.
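The options contrast native RDS automated backups, which provide point-in-time restore but cap retention at 35 days, with AWS Backup, which can retain recovery points for 90 days. A rough boto3 sketch combining a 14-day point-in-time restore window with a 90-day backup plan; all identifiers and the schedule are hypothetical.

import boto3

rds = boto3.client("rds")
backup = boto3.client("backup")

# Hypothetical example: 14-day point-in-time restore window via RDS automated backups.
rds.modify_db_instance(
    DBInstanceIdentifier="oracle-prod",    # placeholder instance
    BackupRetentionPeriod=14,
    ApplyImmediately=True,
)

# Hypothetical example: AWS Backup plan that retains daily snapshots for 90 days.
backup.create_backup_plan(
    BackupPlan={
        "BackupPlanName": "rds-90-day-retention",
        "Rules": [
            {
                "RuleName": "daily",
                "TargetBackupVaultName": "Default",
                "ScheduleExpression": "cron(0 5 * * ? *)",   # 05:00 UTC daily
                "Lifecycle": {"DeleteAfterDays": 90},
            }
        ],
    }
)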
#188 (Accuracy: 100% / 2 votes)
A company is designing a new internal web application in the AWS Cloud. The new application must securely retrieve and store multiple employee usernames and passwords from an AWS managed service.

Which solution will meet these requirements with the LEAST operational overhead?
  • A. Store the employee credentials in AWS Systems Manager Parameter Store. Use AWS CloudFormation and the BatchGetSecretValue API to retrieve usernames and passwords from Parameter Store.
  • B. Store the employee credentials in AWS Secrets Manager. Use AWS CloudFormation and AWS Batch with the BatchGetSecretValue API to retrieve the usernames and passwords from Secrets Manager.
  • C. Store the employee credentials in AWS Systems Manager Parameter Store. Use AWS CloudFormation and AWS Batch with the BatchGetSecretValue API to retrieve the usernames and passwords from Parameter Store.
  • D. Store the employee credentials in AWS Secrets Manager. Use AWS CloudFormation and the BatchGetSecretValue API to retrieve the usernames and passwords from Secrets Manager.
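BatchGetSecretValue is a Secrets Manager API that returns multiple secrets in a single call. A minimal sketch of retrieving several stored credentials, assuming a recent boto3 release that includes batch_get_secret_value; the secret names are hypothetical.

import json
import boto3

secrets = boto3.client("secretsmanager")

# Hypothetical example: fetch several employee credential secrets in one call.
response = secrets.batch_get_secret_value(
    SecretIdList=["employee/alice", "employee/bob"]   # placeholder secret names
)

for entry in response["SecretValues"]:
    creds = json.loads(entry["SecretString"])   # e.g. {"username": "...", "password": "..."}
    print(entry["Name"], creds["username"])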
#189 (Accuracy: 100% / 4 votes)
A company uses an Amazon DynamoDB table to store data that the company receives from devices. The DynamoDB table supports a customer-facing website to display recent activity on customer devices. The company configured the table with provisioned throughput for writes and reads.

The company wants to calculate performance metrics for customer device data on a daily basis. The solution must have minimal effect on the table's provisioned read and write capacity.

Which solution will meet these requirements?
  • A. Use an Amazon Athena SQL query with the Amazon Athena DynamoDB connector to calculate performance metrics on a recurring schedule.
  • B. Use an AWS Glue job with the AWS Glue DynamoDB export connector to calculate performance metrics on a recurring schedule.
  • C. Use an Amazon Redshift COPY command to calculate performance metrics on a recurring schedule.
  • D. Use an Amazon EMR job with an Apache Hive external table to calculate performance metrics on a recurring schedule.
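Option B relies on the AWS Glue DynamoDB export connector, which uses DynamoDB's export-to-S3 feature and reads from the table's continuous backups instead of consuming provisioned read capacity. A rough boto3 sketch of the underlying export call; the ARN and bucket are hypothetical, and point-in-time recovery must already be enabled on the table.

import boto3

dynamodb = boto3.client("dynamodb")

# Hypothetical example: export the table to S3 without touching provisioned read capacity.
dynamodb.export_table_to_point_in_time(
    TableArn="arn:aws:dynamodb:us-east-1:111122223333:table/DeviceActivity",  # placeholder
    S3Bucket="example-device-metrics-exports",                                # placeholder bucket
    ExportFormat="DYNAMODB_JSON",
)
# A scheduled AWS Glue or Athena job can then compute the daily metrics from the export.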
#190 (Accuracy: 100% / 3 votes)
A company has a multi-tier web application. The application's internal service components are deployed on Amazon EC2 instances. The internal service components need to access third-party software as a service (SaaS) APIs that are hosted on AWS.

The company needs to provide secure and private connectivity from the application's internal services to the third-party SaaS application. The company needs to ensure that there is minimal public internet exposure.

Which solution will meet these requirements?
  • A. Implement an AWS Site-to-Site VPN to establish a secure connection with the third-party SaaS provider.
  • B. Deploy AWS Transit Gateway to manage and route traffic between the application's VPC and the third-party SaaS provider.
  • C. Configure AWS PrivateLink to allow only outbound traffic from the VPC without enabling the third-party SaaS provider to establish a connection.
  • D. Use AWS PrivateLink to create a private connection between the application's VPC and the third-party SaaS provider.
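Option D's PrivateLink connection is realized as an interface VPC endpoint that points at the SaaS provider's endpoint service, so traffic never traverses the public internet. A rough boto3 sketch with hypothetical IDs and a hypothetical endpoint service name.

import boto3

ec2 = boto3.client("ec2")

# Hypothetical example: interface endpoint to a third-party endpoint service.
ec2.create_vpc_endpoint(
    VpcEndpointType="Interface",
    VpcId="vpc-0123456789abcdef0",                 # application VPC (placeholder)
    ServiceName="com.amazonaws.vpce.us-east-1.vpce-svc-0123456789abcdef0",  # provider's service (placeholder)
    SubnetIds=["subnet-0123456789abcdef0"],
    SecurityGroupIds=["sg-0123456789abcdef0"],
    PrivateDnsEnabled=False,   # depends on whether the provider supports private DNS
)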