
SAA-C03 Exam Dumps - AWS Certified Solutions Architect - Associate (SAA-C03)

Question # 17

An application runs on Amazon EC2 instances in private subnets. The application needs to access an Amazon DynamoDB table. What is the MOST secure way to access the table while ensuring that the traffic does not leave the AWS network?

A.

Use a VPC endpoint for DynamoDB.

B.

Use a NAT gateway in a public subnet.

C.

Use a NAT instance in a private subnet.

D.

Use the internet gateway attached to the VPC.
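
For context, the gateway endpoint described in option A can be created with a few lines of boto3. This is a minimal sketch; the VPC ID and route table ID are placeholders, and the route table should be the one associated with the private subnets:

```python
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")

# Create a gateway VPC endpoint for DynamoDB; traffic from the private
# subnets' route table reaches DynamoDB over the AWS network only.
response = ec2.create_vpc_endpoint(
    VpcEndpointType="Gateway",
    VpcId="vpc-0123456789abcdef0",            # placeholder VPC ID
    ServiceName="com.amazonaws.us-east-1.dynamodb",
    RouteTableIds=["rtb-0123456789abcdef0"],  # private subnets' route table
)
print(response["VpcEndpoint"]["VpcEndpointId"])
```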

Question # 18

A company hosts more than 300 global websites and applications. The company requires a platform to analyze more than 30 TB of clickstream data each day.

What should a solutions architect do to transmit and process the clickstream data?

A.

Design an AWS Data Pipeline to archive the data to an Amazon S3 bucket and run an Amazon EMR cluster with the data to generate analytics.

B.

Create an Auto Scaling group of Amazon EC2 instances to process the data and send it to an Amazon S3 data lake for Amazon Redshift to use for analysis.

C.

Cache the data to Amazon CloudFront. Store the data in an Amazon S3 bucket. When an object is added to the S3 bucket, run an AWS Lambda function to process the data for analysis.

D.

Collect the data from Amazon Kinesis Data Streams. Use Amazon Kinesis Data Firehose to transmit the data to an Amazon S3 data lake. Load the data into Amazon Redshift for analysis.
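
As a point of reference, the ingestion side of option D amounts to producers writing records into a Kinesis data stream. A minimal boto3 sketch follows; the stream name and event fields are hypothetical:

```python
import json
import boto3

kinesis = boto3.client("kinesis", region_name="us-east-1")

# Send one clickstream event to a Kinesis data stream; a Firehose delivery
# stream reading from it would then batch records into S3 and load them
# into Redshift for analysis.
event = {"site": "example.com", "path": "/home", "ts": 1700000000}
kinesis.put_record(
    StreamName="clickstream",                     # hypothetical stream name
    Data=json.dumps(event).encode("utf-8"),
    PartitionKey=event["site"],                   # spreads load across shards
)
```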

Question # 19

A company is running a popular social media website. The website gives users the ability to upload images to share with other users. The company wants to make sure that the images do not contain inappropriate content. The company needs a solution that minimizes development effort.

What should a solutions architect do to meet these requirements?

A.

Use Amazon Comprehend to detect inappropriate content. Use human review for low-confidence predictions.

B.

Use Amazon Rekognition to detect inappropriate content. Use human review for low-confidence predictions.

C.

Use Amazon SageMaker to detect inappropriate content. Use Ground Truth to label low-confidence predictions.

D.

Use AWS Fargate to deploy a custom machine learning model to detect inappropriate content. Use Ground Truth to label low-confidence predictions.
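
For illustration, the moderation check in option B maps to a single Rekognition API call. The bucket and object names below are placeholders:

```python
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

# Check an uploaded image for unsafe content; labels below MinConfidence
# are omitted from the response.
resp = rekognition.detect_moderation_labels(
    Image={"S3Object": {"Bucket": "uploads-bucket", "Name": "image.jpg"}},
    MinConfidence=60,
)
for label in resp["ModerationLabels"]:
    print(label["Name"], label["Confidence"])
# Low-confidence results could be routed to human review, for example
# with Amazon Augmented AI (A2I).
```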

Question # 20

A company that primarily runs its application servers on premises has decided to migrate to AWS. The company wants to minimize its need to scale its Internet Small Computer Systems Interface (iSCSI) storage on premises. The company wants only its recently accessed data to remain stored locally.

Which AWS solution should the company use to meet these requirements?

A.

Amazon S3 File Gateway

B.

AWS Storage Gateway Tape Gateway

C.

AWS Storage Gateway Volume Gateway stored volumes

D.

AWS Storage Gateway Volume Gateway cached volumes
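
A minimal sketch of how a cached volume (option D) is provisioned through the Storage Gateway API; the gateway ARN, target name, and network interface IP are placeholders:

```python
import boto3

sgw = boto3.client("storagegateway", region_name="us-east-1")

# Create a cached iSCSI volume: the full volume lives in Amazon S3, and
# only recently accessed blocks are cached on the local gateway appliance.
resp = sgw.create_cached_iscsi_volume(
    GatewayARN="arn:aws:storagegateway:us-east-1:111122223333:gateway/sgw-12345678",
    VolumeSizeInBytes=500 * 1024**3,    # 500 GiB
    TargetName="app-volume",            # iSCSI target name (hypothetical)
    NetworkInterfaceId="10.0.1.10",     # gateway's local IP (placeholder)
    ClientToken="app-volume-create-1",  # idempotency token
)
print(resp["VolumeARN"])
```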

Question # 21

A company is using a fleet of Amazon EC2 instances to ingest data from on-premises data sources. The data is in JSON format, and ingestion rates can be as high as 1 MB/s. When an EC2 instance is rebooted, the in-flight data is lost. The company's data science team wants to query ingested data in near-real time.

Which solution provides near-real-time data querying that is scalable with minimal data loss?

A.

Publish data to Amazon Kinesis Data Streams. Use Kinesis Data Analytics to query the data.

B.

Publish data to Amazon Kinesis Data Firehose with Amazon Redshift as the destination. Use Amazon Redshift to query the data.

C.

Store ingested data in an EC2 instance store. Publish data to Amazon Kinesis Data Firehose with Amazon S3 as the destination. Use Amazon Athena to query the data.

D.

Store ingested data in an Amazon Elastic Block Store (Amazon EBS) volume. Publish data to Amazon ElastiCache for Redis. Subscribe to the Redis channel to query the data.
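
For reference, publishing to a Firehose delivery stream as in option B is a single API call. The delivery stream name and record fields below are hypothetical:

```python
import json
import boto3

firehose = boto3.client("firehose", region_name="us-east-1")

# Publish an ingested JSON record to a Firehose delivery stream; Firehose
# buffers records durably, so a producer reboot does not lose records that
# were already submitted.
record = {"source": "sensor-42", "value": 17.3}
firehose.put_record(
    DeliveryStreamName="ingest-to-redshift",  # hypothetical stream name
    Record={"Data": json.dumps(record).encode("utf-8") + b"\n"},
)
```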

Question # 22

A company provides a Voice over Internet Protocol (VoIP) service that uses UDP connections. The service consists of Amazon EC2 instances that run in an Auto Scaling group. The company has deployments across multiple AWS Regions.

The company needs to route users to the Region with the lowest latency. The company also needs automated failover between Regions.

Which solution will meet these requirements?

A.

Deploy a Network Load Balancer (NLB) and an associated target group. Associate the target group with the Auto Scaling group. Use the NLB as an AWS Global Accelerator endpoint in each Region.

B.

Deploy an Application Load Balancer (ALB) and an associated target group. Associate the target group with the Auto Scaling group. Use the ALB as an AWS Global Accelerator endpoint in each Region.

C.

Deploy a Network Load Balancer (NLB) and an associated target group. Associate the target group with the Auto Scaling group. Create an Amazon Route 53 latency record that points to aliases for each NLB. Create an Amazon CloudFront distribution that uses the latency record as an origin.

D.

Deploy an Application Load Balancer (ALB) and an associated target group. Associate the target group with the Auto Scaling group. Create an Amazon Route 53 weighted record that points to aliases for each ALB. Deploy an Amazon CloudFront distribution that uses the weighted record as an origin.
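
For context, attaching a Regional NLB to a Global Accelerator listener (option A) might look like the following boto3 sketch; both ARNs are placeholders:

```python
import boto3

# The Global Accelerator control-plane API is served from us-west-2.
ga = boto3.client("globalaccelerator", region_name="us-west-2")

# Register a Regional NLB as an endpoint; Global Accelerator then routes
# UDP users to the closest healthy Region and fails over automatically.
ga.create_endpoint_group(
    ListenerArn="arn:aws:globalaccelerator::111122223333:accelerator/abcd1234/listener/ef567890",
    EndpointGroupRegion="eu-west-1",
    EndpointConfigurations=[
        {
            "EndpointId": "arn:aws:elasticloadbalancing:eu-west-1:111122223333:loadbalancer/net/voip/1234567890abcdef",
            "Weight": 128,
        },
    ],
)
```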

Question # 23

A company stores customer data in a multitenant Amazon S3 bucket. Each customer's data is stored in a prefix that is unique to the customer. The company needs to migrate data for specific customers to a new, dedicated S3 bucket that is in the same AWS Region as the source bucket. The company must preserve object metadata such as creation date and version IDs.

After the migration is finished, the company must delete the source data for the migrated customers from the original multitenant S3 bucket.

Which combination of solutions will meet these requirements with the LEAST overhead? (Select THREE.)

A.

Create a new S3 bucket as a destination bucket. Enable versioning on the new bucket.

B.

Use S3 batch operations to copy objects from the specified prefixes to the destination bucket.

C.

Use the S3 CopyObject API, and create a script to copy data to the destination S3 bucket.

D.

Configure S3 Same-Region Replication (SRR) to replicate existing data from the specified prefixes in the source bucket to the destination bucket.

E.

Configure AWS DataSync to migrate data from the specified prefixes in the source bucket to the destination bucket.

F.

Use an S3 Lifecycle policy to delete objects from the source bucket after the data is migrated to the destination bucket.
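
As an illustration of option D, a prefix-scoped Same-Region Replication rule can be applied with put_bucket_replication. The bucket names and role ARN are placeholders; note that replication preserves version IDs and object metadata, and that replicating objects that already exist additionally requires S3 Batch Replication:

```python
import boto3

s3 = boto3.client("s3")

# Replicate one customer's prefix to the dedicated destination bucket.
s3.put_bucket_replication(
    Bucket="multitenant-bucket",
    ReplicationConfiguration={
        "Role": "arn:aws:iam::111122223333:role/s3-replication-role",
        "Rules": [
            {
                "ID": "customer-a-migration",
                "Status": "Enabled",
                "Priority": 1,
                "Filter": {"Prefix": "customer-a/"},
                "DeleteMarkerReplication": {"Status": "Disabled"},
                "Destination": {"Bucket": "arn:aws:s3:::customer-a-bucket"},
            }
        ],
    },
)
```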

Question # 24

A company is migrating applications to AWS. The applications are deployed in different accounts. The company manages the accounts centrally by using AWS Organizations. The company's security team needs a single sign-on (SSO) solution across all the company's accounts. The company must continue managing the users and groups in its on-premises self-managed Microsoft Active Directory.

Which solution will meet these requirements?

A.

Enable AWS Single Sign-On (AWS SSO) from the AWS SSO console. Create a one-way forest trust or a one-way domain trust to connect the company's self-managed Microsoft Active Directory with AWS SSO by using AWS Directory Service for Microsoft Active Directory.

B.

Enable AWS Single Sign-On (AWS SSO) from the AWS SSO console. Create a two-way forest trust to connect the company's self-managed Microsoft Active Directory with AWS SSO by using AWS Directory Service for Microsoft Active Directory.

C.

Use AWS Directory Service. Create a two-way trust relationship with the company's self-managed Microsoft Active Directory.

D.

Deploy an identity provider (IdP) on premises. Enable AWS Single Sign-On (AWS SSO) from the AWS SSO console.
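
For reference, a two-way forest trust as described in option B can be requested through AWS Directory Service. The directory ID, domain name, and trust password below are placeholders:

```python
import boto3

ds = boto3.client("ds", region_name="us-east-1")

# Create a two-way forest trust between an AWS Managed Microsoft AD
# directory and the on-premises domain, so AWS SSO can authenticate the
# users and groups that remain managed on premises.
resp = ds.create_trust(
    DirectoryId="d-1234567890",            # placeholder directory ID
    RemoteDomainName="corp.example.com",   # on-premises AD domain
    TrustPassword="example-trust-secret",  # must match the on-prem side
    TrustDirection="Two-Way",
    TrustType="Forest",
)
print(resp["TrustId"])
```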
