Amazon AWS Certified Solutions Architect - Associate SAA-C03
#301 (Accuracy: 93% / 5 votes)
A marketing company receives a large amount of new clickstream data in Amazon S3 from a marketing campaign. The company needs to analyze the clickstream data in Amazon S3 quickly. Then the company needs to determine whether to process the data further in the data pipeline.

Which solution will meet these requirements with the LEAST operational overhead?
  • A. Create external tables in a Spark catalog. Configure jobs in AWS Glue to query the data.
  • B. Configure an AWS Glue crawler to crawl the data. Configure Amazon Athena to query the data.
  • C. Create external tables in a Hive metastore. Configure Spark jobs in Amazon EMR to query the data.
  • D. Configure an AWS Glue crawler to crawl the data. Configure Amazon Kinesis Data Analytics to use SQL to query the data.
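The serverless pattern described in option B can be sketched as two API calls: a Glue crawler catalogs the S3 data, then Athena queries the resulting table with SQL. A minimal sketch, assuming hypothetical bucket, database, and role names (no cluster or job code to manage):

```python
# Option B sketch: Glue crawler catalogs S3 clickstream data, Athena queries it.
# All names (bucket, database, crawler, role) are hypothetical placeholders.

crawler_params = {
    "Name": "clickstream-crawler",                        # hypothetical name
    "Role": "arn:aws:iam::123456789012:role/GlueCrawlerRole",  # hypothetical role
    "DatabaseName": "marketing",
    "Targets": {"S3Targets": [{"Path": "s3://example-clickstream-bucket/"}]},
}

# Once the crawler has populated the Data Catalog, Athena queries the table
# directly over the data in S3 -- nothing to provision.
query_params = {
    "QueryString": "SELECT page, COUNT(*) AS hits "
                   "FROM clickstream GROUP BY page ORDER BY hits DESC",
    "QueryExecutionContext": {"Database": "marketing"},
    "ResultConfiguration": {"OutputLocation": "s3://example-query-results/"},
}

# With boto3 these would be passed as, e.g.:
#   glue.create_crawler(**crawler_params); glue.start_crawler(Name="clickstream-crawler")
#   athena.start_query_execution(**query_params)
```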
#302 (Accuracy: 100% / 3 votes)
A company is migrating its databases to Amazon RDS for PostgreSQL. The company is migrating its applications to Amazon EC2 instances. The company wants to optimize costs for long-running workloads.

Which solution will meet this requirement MOST cost-effectively?
  • A. Use On-Demand Instances for the Amazon RDS for PostgreSQL workloads. Purchase a 1 year Compute Savings Plan with the No Upfront option for the EC2 instances.
  • B. Purchase Reserved Instances for a 1 year term with the No Upfront option for the Amazon RDS for PostgreSQL workloads. Purchase a 1 year EC2 Instance Savings Plan with the No Upfront option for the EC2 instances.
  • C. Purchase Reserved Instances for a 1 year term with the Partial Upfront option for the Amazon RDS for PostgreSQL workloads. Purchase a 1 year EC2 Instance Savings Plan with the Partial Upfront option for the EC2 instances.
  • D. Purchase Reserved Instances for a 3 year term with the All Upfront option for the Amazon RDS for PostgreSQL workloads. Purchase a 3 year EC2 Instance Savings Plan with the All Upfront option for the EC2 instances.
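The trade-off among these options is arithmetic: longer terms and more money upfront generally earn deeper discounts. A worked comparison with hypothetical round-number discounts (not actual AWS pricing) shows why the deepest commitment wins on pure cost for a long-running workload:

```python
# Illustrative only: discount percentages below are hypothetical placeholders,
# NOT actual AWS pricing. The point is the ordering -- deeper commitments
# (longer term, more upfront) yield a lower effective monthly rate.

on_demand_monthly = 1000.0  # hypothetical on-demand baseline, USD/month

# option label -> (term in months, hypothetical discount vs. on-demand)
options = {
    "1yr No Upfront":      (12, 0.30),
    "1yr Partial Upfront": (12, 0.35),
    "3yr All Upfront":     (36, 0.55),
}

effective = {}
for name, (months, discount) in options.items():
    effective[name] = on_demand_monthly * (1 - discount)
    print(f"{name}: effective ~${effective[name]:,.0f}/month "
          f"({discount:.0%} off on-demand)")
```

The ordering holds regardless of the baseline chosen, which is why the 3-year All Upfront combination is the most cost-effective for workloads known to run long term.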
#303 (Accuracy: 100% / 2 votes)
A company is running a media store across multiple Amazon EC2 instances distributed across multiple Availability Zones in a single VPC. The company wants a high-performing solution to share data between all the EC2 instances, and prefers to keep the data within the VPC only.

What should a solutions architect recommend?
  • A. Create an Amazon S3 bucket and call the service APIs from each instance's application
  • B. Create an Amazon S3 bucket and configure all instances to access it as a mounted volume
  • C. Configure an Amazon Elastic Block Store (Amazon EBS) volume and mount it across all instances
  • D. Configure an Amazon Elastic File System (Amazon EFS) file system and mount it across all instances
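Option D's architecture can be sketched as one EFS file system plus a mount target per Availability Zone, so every instance mounts the same shared file system over NFS without traffic leaving the VPC. Subnet and security-group IDs below are hypothetical placeholders:

```python
# Option D sketch: one shared EFS file system, one mount target per AZ.
# Subnet and security-group IDs are hypothetical placeholders.

fs_params = {
    "PerformanceMode": "generalPurpose",
    "ThroughputMode": "elastic",
    "Encrypted": True,
}

# A mount target in each AZ keeps NFS traffic inside the VPC and gives
# instances in every AZ a local endpoint.
mount_targets = [
    {"SubnetId": "subnet-aaa111", "SecurityGroups": ["sg-efs-nfs"]},  # AZ a
    {"SubnetId": "subnet-bbb222", "SecurityGroups": ["sg-efs-nfs"]},  # AZ b
]

# Each instance then mounts the same file system, e.g.:
#   sudo mount -t efs fs-0123456789abcdef0:/ /mnt/shared
for mt in mount_targets:
    print(f"mount target in {mt['SubnetId']}")
```

This is what EBS (option C) cannot do in this scenario: an EBS volume attaches within a single AZ, so it cannot be shared across instances in multiple AZs.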
#304 (Accuracy: 100% / 2 votes)
A company runs all its business applications in the AWS Cloud. The company uses AWS Organizations to manage multiple AWS accounts.

A solutions architect needs to review all permissions that are granted to IAM users to determine which IAM users have more permissions than required.

Which solution will meet these requirements with the LEAST administrative overhead?
  • A. Use Network Access Analyzer to review all access permissions in the company's AWS accounts.
  • B. Create an AWS CloudWatch alarm that activates when an IAM user creates or modifies resources in an AWS account.
  • C. Use AWS Identity and Access Management (IAM) Access Analyzer to review all the company’s resources and accounts.
  • D. Use Amazon Inspector to find vulnerabilities in existing IAM policies.
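Option C's approach can be sketched as creating an IAM Access Analyzer unused-access analyzer at the organization level, which surfaces principals holding permissions they never exercise. The analyzer name and the 90-day threshold below are hypothetical choices:

```python
# Option C sketch: an IAM Access Analyzer unused-access analyzer scoped to
# the organization flags IAM users/roles with permissions they do not use.
# The analyzer name and the 90-day window are hypothetical choices.

analyzer_params = {
    "analyzerName": "org-unused-access",        # hypothetical name
    "type": "ORGANIZATION_UNUSED_ACCESS",       # review every account in the org
    "configuration": {
        "unusedAccess": {"unusedAccessAge": 90} # flag access unused for 90+ days
    },
}

# With boto3: accessanalyzer.create_analyzer(**analyzer_params), after which
# findings can be listed per principal -- no per-account manual review.
```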
#305 (Accuracy: 100% / 2 votes)
A company wants to improve the availability and performance of its hybrid application. The application consists of a stateful TCP-based workload hosted on Amazon EC2 instances in different AWS Regions and a stateless UDP-based workload hosted on premises.

Which combination of actions should a solutions architect take to improve availability and performance? (Choose two.)
  • A. Create an accelerator using AWS Global Accelerator. Add the load balancers as endpoints.
  • B. Create an Amazon CloudFront distribution with an origin that uses Amazon Route 53 latency-based routing to route requests to the load balancers.
  • C. Configure two Application Load Balancers in each Region. The first will route to the EC2 endpoints, and the second will route to the on-premises endpoints.
  • D. Configure a Network Load Balancer in each Region to address the EC2 endpoints. Configure a Network Load Balancer in each Region that routes to the on-premises endpoints.
  • E. Configure a Network Load Balancer in each Region to address the EC2 endpoints. Configure an Application Load Balancer in each Region that routes to the on-premises endpoints.
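Options A and D together describe the commonly cited architecture here: Network Load Balancers (which handle both TCP and UDP) front the workloads in each Region, and a Global Accelerator routes users over the AWS backbone to the closest healthy endpoint. A minimal parameter sketch, with hypothetical names and ARNs:

```python
# Sketch of options A + D: NLBs per Region for the TCP/UDP workloads,
# fronted by AWS Global Accelerator. Names and ARNs are hypothetical.

accelerator_params = {
    "Name": "hybrid-app",
    "IpAddressType": "IPV4",
    "Enabled": True,
}

# A listener per protocol; an ALB (options C/E) would not work for the
# UDP workload, which is why NLBs are used in both Regions.
listener_params = {
    "Protocol": "TCP",  # a second listener would carry the UDP workload
    "PortRanges": [{"FromPort": 443, "ToPort": 443}],
}

# One endpoint group per Region, each pointing at that Region's NLB.
endpoint_group_params = {
    "EndpointGroupRegion": "us-east-1",
    "EndpointConfigurations": [
        {
            "EndpointId": "arn:aws:elasticloadbalancing:us-east-1:"
                          "123456789012:loadbalancer/net/app-nlb/0123456789abcdef",
            "Weight": 100,
        },
    ],
}
```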
#306 (Accuracy: 100% / 3 votes)
A company runs a critical data analysis job each week before the first day of the work week. The job requires at least 1 hour to complete the analysis. The job is stateful and cannot tolerate interruptions. The company needs a solution to run the job on AWS.

Which solution will meet these requirements?
  • A. Create a container for the job. Schedule the job to run as an AWS Fargate task on an Amazon Elastic Container Service (Amazon ECS) cluster by using Amazon EventBridge Scheduler.
  • B. Configure the job to run in an AWS Lambda function. Create a scheduled rule in Amazon EventBridge to invoke the Lambda function.
  • C. Configure an Auto Scaling group of Amazon EC2 Spot Instances that run Amazon Linux. Configure a crontab entry on the instances to run the analysis.
  • D. Configure an AWS DataSync task to run the job. Configure a cron expression to run the task on a schedule.
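Option A's scheduling step can be sketched as a single EventBridge Scheduler schedule targeting a Fargate task: Fargate has no 15-minute runtime cap (unlike Lambda) and, unlike Spot Instances, is not interrupted mid-run. Cluster, role, and task-definition ARNs below are hypothetical, as is the weekly cron choice:

```python
# Option A sketch: EventBridge Scheduler launches the stateful job as a
# Fargate task on ECS once a week. ARNs and the exact cron time are
# hypothetical placeholders.

schedule_params = {
    "Name": "weekly-analysis",
    # cron(minute hour day-of-month month day-of-week year): Sundays 03:00 UTC
    "ScheduleExpression": "cron(0 3 ? * SUN *)",
    "FlexibleTimeWindow": {"Mode": "OFF"},
    "Target": {
        "Arn": "arn:aws:ecs:us-east-1:123456789012:cluster/analysis",
        "RoleArn": "arn:aws:iam::123456789012:role/SchedulerEcsRole",
        "EcsParameters": {
            "TaskDefinitionArn": "arn:aws:ecs:us-east-1:123456789012:"
                                 "task-definition/analysis:1",
            "LaunchType": "FARGATE",  # no duration limit, no Spot interruption
        },
    },
}
```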
#307 (Accuracy: 100% / 1 vote)
A company has applications that run in an organization in AWS Organizations. The company outsources operational support of the applications. The company needs to provide access for the external support engineers without compromising security.

The external support engineers need access to the AWS Management Console.
The external support engineers also need operating system access to the company's fleet of Amazon EC2 instances that run Amazon Linux in private subnets.

Which solution will meet these requirements MOST securely?
  • A. Confirm that AWS Systems Manager Agent (SSM Agent) is installed on all instances. Assign an instance profile with the necessary policy to connect to Systems Manager. Use AWS IAM Identity Center to provide the external support engineers console access. Use Systems Manager Session Manager to assign the required permissions.
  • B. Confirm that AWS Systems Manager Agent (SSM Agent) is installed on all instances. Assign an instance profile with the necessary policy to connect to Systems Manager. Use Systems Manager Session Manager to provide local IAM user credentials in each AWS account to the external support engineers for console access.
  • C. Confirm that all instances have a security group that allows SSH access only from the external support engineers’ source IP address ranges. Provide local IAM user credentials in each AWS account to the external support engineers for console access. Provide each external support engineer an SSH key pair to log in to the application instances.
  • D. Create a bastion host in a public subnet. Set up the bastion host security group to allow access from only the external engineers’ IP address ranges. Ensure that all instances have a security group that allows SSH access from the bastion host. Provide each external support engineer an SSH key pair to log in to the application instances. Provide local account IAM user credentials to the engineers for console access.
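Option A's setup can be sketched in two pieces: the instances get an instance profile carrying the AWS managed policy `AmazonSSMManagedInstanceCore` so the SSM Agent can register, and engineers then open shells through Session Manager with no SSH keys, no bastion, and no inbound ports. Role and instance IDs below are hypothetical:

```python
# Option A sketch: instance profile for SSM Agent registration, then shell
# access via Session Manager. Role name and instance ID are hypothetical.

# AmazonSSMManagedInstanceCore is the AWS managed policy that grants the
# agent the permissions it needs to talk to Systems Manager.
instance_role = {
    "RoleName": "Ec2SsmRole",  # hypothetical role name
    "AttachedPolicyArn": "arn:aws:iam::aws:policy/AmazonSSMManagedInstanceCore",
}

# Engineers signed in through IAM Identity Center then connect with:
#   aws ssm start-session --target i-0123456789abcdef0
# No SSH key pairs, no bastion host, no security-group holes for SSH.
session_cli = "aws ssm start-session --target i-0123456789abcdef0"
```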
#308 (Accuracy: 100% / 4 votes)
A company needs to use its on-premises LDAP directory service to authenticate its users to the AWS Management Console. The directory service is not compatible with Security Assertion Markup Language (SAML).

Which solution meets these requirements?
  • A. Enable AWS IAM Identity Center (AWS Single Sign-On) between AWS and the on-premises LDAP.
  • B. Create an IAM policy that uses AWS credentials, and integrate the policy into LDAP.
  • C. Set up a process that rotates the IAM credentials whenever LDAP credentials are updated.
  • D. Develop an on-premises custom identity broker application or process that uses AWS Security Token Service (AWS STS) to get short-lived credentials.
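The broker pattern in option D works by authenticating the user against LDAP, fetching short-lived credentials from AWS STS, and trading them for a console sign-in token at the AWS federation endpoint. A minimal sketch of the URL-construction step, using fake placeholder credentials (a real broker would obtain them from `sts.get_federation_token()` after a successful LDAP check):

```python
import json
import urllib.parse

def signin_token_url(creds: dict) -> str:
    """Build the federation-endpoint request that trades temporary STS
    credentials for a console sign-in token (option D's broker step)."""
    session = json.dumps({
        "sessionId": creds["AccessKeyId"],
        "sessionKey": creds["SecretAccessKey"],
        "sessionToken": creds["SessionToken"],
    })
    return ("https://signin.aws.amazon.com/federation"
            "?Action=getSigninToken&Session=" + urllib.parse.quote(session))

# Fake placeholder credentials -- never hard-code real ones.
fake_creds = {"AccessKeyId": "ASIAFAKEFAKEFAKE", "SecretAccessKey": "fake",
              "SessionToken": "fake-token"}
print(signin_token_url(fake_creds))
```

The broker then redirects the user's browser to the console login URL built from the returned sign-in token, so no IAM users or long-lived credentials are created for LDAP users.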
#309 (Accuracy: 100% / 1 vote)
A company has an AWS Direct Connect connection from its corporate data center to its VPC in the us-east-1 Region. The company recently acquired a corporation that has several VPCs and a Direct Connect connection between its on-premises data center and the eu-west-2 Region. The CIDR blocks for the VPCs of the company and the corporation do not overlap. The company requires connectivity between the two Regions and the data centers. The company needs a scalable solution that reduces operational overhead.

What should a solutions architect do to meet these requirements?
  • A. Set up inter-Region VPC peering between the VPC in us-east-1 and the VPCs in eu-west-2.
  • B. Create private virtual interfaces from the Direct Connect connection in us-east-1 to the VPCs in eu-west-2.
  • C. Establish VPN appliances in a fully meshed VPN network hosted by Amazon EC2. Use AWS VPN CloudHub to send and receive data between the data centers and each VPC.
  • D. Connect the existing Direct Connect connection to a Direct Connect gateway. Route traffic from the virtual private gateways of the VPCs in each Region to the Direct Connect gateway.
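Option D relies on the Direct Connect gateway being a global resource: both Direct Connect connections and the virtual private gateways of VPCs in both Regions attach to the same gateway, avoiding a VPN mesh or per-VPC virtual interfaces. A parameter sketch with hypothetical names and IDs:

```python
# Option D sketch: one global Direct Connect gateway, with each Region's
# virtual private gateways associated to it. Names/IDs are hypothetical.

dx_gateway_params = {
    "directConnectGatewayName": "global-dxgw",  # hypothetical name
    "amazonSideAsn": 64512,                     # private ASN for the AWS side
}

# The same gateway accepts associations from VPCs in different Regions,
# which is what makes this scale without extra peering or VPN appliances.
associations = [
    {"virtualGatewayId": "vgw-0useast1aaa", "region": "us-east-1"},
    {"virtualGatewayId": "vgw-0euwest2bbb", "region": "eu-west-2"},
]
```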
#310 (Accuracy: 100% / 3 votes)
A company runs a three-tier application in a VPC. The database tier uses an Amazon RDS for MySQL DB instance.

The company plans to migrate the RDS for MySQL DB instance to an Amazon Aurora PostgreSQL DB cluster.
The company needs a solution that replicates the data changes that happen during the migration to the new database.

Which combination of steps will meet these requirements? (Choose two.)
  • A. Use AWS Database Migration Service (AWS DMS) Schema Conversion to transform the database objects.
  • B. Use AWS Database Migration Service (AWS DMS) Schema Conversion to create an Aurora PostgreSQL read replica on the RDS for MySQL DB instance.
  • C. Configure an Aurora MySQL read replica for the RDS for MySQL DB instance.
  • D. Define an AWS Database Migration Service (AWS DMS) task with change data capture (CDC) to migrate the data.
  • E. Promote the Aurora PostgreSQL read replica to a standalone Aurora PostgreSQL DB cluster when the replica lag is zero.