ARA-C01 Exam Dumps - SnowPro Advanced: Architect Certification Exam

Question # 33

An Architect at a multi-national transportation company maintains a system that checks the weather conditions along vehicle routes and provides the data to drivers.

The weather information is delivered regularly by a third-party company as a JSON structure, and the data is loaded into Snowflake into a column with a VARIANT data type. This table is queried directly to deliver the statistics to the drivers with minimal latency.

A single entry includes (but is not limited to):

- Weather condition (cloudy, sunny, rainy, etc.)

- Degrees (temperature)

- Longitude and latitude

- Timeframe

- Location address

- Wind

The table holds more than 10 years' worth of data so that statistics from different years and locations can be delivered, and the volume of data in the table grows every day.

The drivers report that they are not receiving the weather statistics for their locations in time.

What can the Architect do to deliver the statistics to the drivers faster?

A.

Create an additional table in the schema for longitude and latitude, and define a scheduled task that populates it by extracting this information from the JSON dataset.

B.

Enable the search optimization service on the VARIANT column for longitude and latitude so that the information can be queried using specific metadata.

C.

Divide the table into several tables for each year by using the timeframe information from the JSON dataset in order to process the queries in parallel.

D.

Divide the table into several tables for each location by using the location address information from the JSON dataset in order to process the queries in parallel.
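
For context on option B, here is a minimal sketch of enabling the search optimization service on specific paths inside a VARIANT column. The table name (weather_data), column name (payload), and JSON field names are hypothetical, not taken from the question:

-- Register equality search optimization for specific VARIANT paths
-- (all names below are illustrative).
ALTER TABLE weather_data
  ADD SEARCH OPTIMIZATION ON EQUALITY(payload:longitude, payload:latitude);

-- Point lookups on those paths can then use the search access path
-- instead of scanning every micro-partition of the 10+ year table:
SELECT payload:condition, payload:degree, payload:wind
FROM weather_data
WHERE payload:longitude = -0.1278
  AND payload:latitude = 51.5074;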

Question # 34

When activating Tri-Secret Secure in a hierarchical encryption model in a Snowflake account, at what level is the customer-managed key used?

A.

At the root level (HSM)

B.

At the account level (AMK)

C.

At the table level (TMK)

D.

At the micro-partition level

Question # 35

A retail company has over 3000 stores all using the same Point of Sale (POS) system. The company wants to deliver near real-time sales results to category managers. The stores operate in a variety of time zones and exhibit a dynamic range of transactions each minute, with some stores having higher sales volumes than others.

Sales results are provided in a uniform fashion using data-engineered fields calculated in a complex data pipeline. Calculations include exceptions, aggregations, and scoring using external functions that interface with scoring algorithms. The source data for the aggregations has over 100 million rows.

Every minute, the POS sends all sales transactions files to a cloud storage location with a naming convention that includes store numbers and timestamps to identify the set of transactions contained in the files. The files are typically less than 10MB in size.

How can the near real-time results be provided to the category managers? (Select TWO).

A.

All files should be concatenated before ingestion into Snowflake to avoid micro-ingestion.

B.

A Snowpipe should be created and configured with AUTO_INGEST = true. A stream should be created to process INSERTs into a single target table, using the stream metadata to identify the store numbers and timestamps.

C.

A stream should be created to accumulate the near real-time data and a task should be created that runs at a frequency that matches the real-time analytics needs.

D.

An external scheduler should examine the contents of the cloud storage location and issue SnowSQL commands to process the data at a frequency that matches the real-time analytics needs.

E.

The COPY INTO command, with a task scheduled to run every second, should be used to achieve the near real-time requirement.
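
For options B and C, a minimal sketch of the auto-ingest pipe, stream, and task pattern follows. The stage, file format, object names, and the aggregation logic are all assumptions for illustration:

-- Pipe loads each POS file as cloud storage event notifications arrive
-- (stage, table, and file format are illustrative).
CREATE PIPE pos_pipe AUTO_INGEST = TRUE AS
  COPY INTO raw_sales
  FROM @pos_stage
  FILE_FORMAT = (TYPE = 'CSV');

-- Stream tracks rows inserted into the target table since the last consumption.
CREATE STREAM raw_sales_stream ON TABLE raw_sales;

-- Task runs at a frequency matching the analytics needs and only does work
-- when the stream has accumulated new rows.
CREATE TASK compute_sales_results
  WAREHOUSE = transform_wh
  SCHEDULE = '1 MINUTE'
WHEN SYSTEM$STREAM_HAS_DATA('RAW_SALES_STREAM')
AS
  INSERT INTO sales_results
  SELECT store_number, MAX(sale_ts), SUM(amount)  -- placeholder aggregation
  FROM raw_sales_stream
  GROUP BY store_number;

ALTER TASK compute_sales_results RESUME;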

Question # 36

What is a characteristic of Role-Based Access Control (RBAC) as used in Snowflake?

A.

Privileges can be granted at the database level and can be inherited by all underlying objects.

B.

A user can use "super-user" access along with the SECURITYADMIN role to bypass authorization checks and access all databases, schemas, and underlying objects.

C.

A user can create managed access schemas to support future grants and ensure only schema owners can grant privileges to other roles.

D.

A user can create managed access schemas to support current and future grants and ensure only object owners can grant privileges to other roles.
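
Options C and D both turn on managed access schemas; a minimal sketch of that feature, using hypothetical database, schema, and role names:

-- In a managed access schema, object owners cannot grant privileges on
-- their own objects; only the schema owner (or a role with MANAGE GRANTS) can.
CREATE SCHEMA analytics.reporting WITH MANAGED ACCESS;

-- Grants, including future grants, are then issued centrally:
GRANT USAGE ON SCHEMA analytics.reporting TO ROLE analyst_role;
GRANT SELECT ON FUTURE TABLES IN SCHEMA analytics.reporting TO ROLE analyst_role;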

Question # 37

A group of Data Analysts has been granted the ANALYST_ROLE role. They need a Snowflake database where they can create and modify tables, views, and other objects to load with their own data. The Analysts should not be able to give Snowflake users outside of their role access to this data.

How should these requirements be met?

A.

Grant ANALYST_ROLE OWNERSHIP on the database, but make sure that ANALYST_ROLE does not have the MANAGE GRANTS privilege on the account.

B.

Grant SYSADMIN ownership of the database, but grant the CREATE SCHEMA privilege on the database to the ANALYST_ROLE.

C.

Make every schema in the database a managed access schema, owned by SYSADMIN, and grant CREATE privileges on each schema to the ANALYST_ROLE for each type of object that needs to be created.

D.

Grant ANALYST_ROLE ownership on the database, but grant the OWNERSHIP ON FUTURE [object type]S IN DATABASE privilege to SYSADMIN.
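
As a reference for option A, a minimal sketch with a hypothetical database name:

-- Transfer ownership of the database to the analysts' role.
GRANT OWNERSHIP ON DATABASE analyst_db TO ROLE analyst_role COPY CURRENT GRANTS;

-- Confirm the role does not also hold the account-level MANAGE GRANTS privilege.
SHOW GRANTS TO ROLE analyst_role;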

Question # 38

A global company needs to securely share its sales and inventory data with a vendor using a Snowflake account.

The company has its Snowflake account in the AWS eu-west-2 (Europe/London) region. The vendor's Snowflake account is on the Azure platform in the West Europe region.

How should the company's Architect configure the data share?

A.

1. Create a share.

2. Add objects to the share.

3. Add a consumer account to the share for the vendor to access.

B.

1. Create a share.

2. Create a reader account for the vendor to use.

3. Add the reader account to the share.

C.

1. Create a new role called db_share.

2. Grant the db_share role privileges to read data from the company database and schema.

3. Create a user for the vendor.

4. Grant the db_share role to the vendor's user.

D.

1. Promote an existing database in the company's local account to primary.

2. Replicate the database to Snowflake on Azure in the West-Europe region.

3. Create a share and add objects to the share.

4. Add a consumer account to the share for the vendor to access.
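
A minimal sketch that mirrors the steps in option D. The organization, account, database, schema, and share names are all hypothetical, and the comments indicate which account each statement runs in:

-- In the primary AWS eu-west-2 account: enable replication of the database
-- to the company's own account on Azure West Europe.
ALTER DATABASE sales_db ENABLE REPLICATION TO ACCOUNTS myorg.azure_we_account;

-- In the Azure West Europe account: create and refresh the secondary database.
CREATE DATABASE sales_db AS REPLICA OF myorg.aws_london_account.sales_db;
ALTER DATABASE sales_db REFRESH;

-- Still in the Azure account: create the share and add the vendor's account.
CREATE SHARE vendor_share;
GRANT USAGE ON DATABASE sales_db TO SHARE vendor_share;
GRANT USAGE ON SCHEMA sales_db.sales TO SHARE vendor_share;
GRANT SELECT ON ALL TABLES IN SCHEMA sales_db.sales TO SHARE vendor_share;
ALTER SHARE vendor_share ADD ACCOUNTS = vendororg.vendor_account;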

Question # 39

What does a Snowflake Architect need to consider when implementing a Snowflake Connector for Kafka?

A.

Every Kafka message is in JSON or Avro format.

B.

The default retention time for Kafka topics is 14 days.

C.

The Kafka connector supports key pair authentication, OAuth, and basic authentication (for example, username and password).

D.

The Kafka connector will create one table and one pipe to ingest data for each topic. If the connector cannot create the table or the pipe, this will result in an exception.
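
As background for option D: tables created by the Kafka connector contain two VARIANT columns, RECORD_METADATA and RECORD_CONTENT. A minimal sketch of querying such a table (the table name is hypothetical):

-- RECORD_METADATA holds the topic, partition, offset, and message key;
-- RECORD_CONTENT holds the message payload.
SELECT record_metadata:topic::STRING,
       record_metadata:offset::NUMBER,
       record_content
FROM kafka_orders_topic;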

Question # 40

What Snowflake system functions are used to view and/or monitor the clustering metadata for a table? (Select TWO).

A.

SYSTEM$CLUSTERING

B.

SYSTEM$TABLE_CLUSTERING

C.

SYSTEM$CLUSTERING_DEPTH

D.

SYSTEM$CLUSTERING_RATIO

E.

SYSTEM$CLUSTERING_INFORMATION
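
Of the names listed, SYSTEM$CLUSTERING_DEPTH and SYSTEM$CLUSTERING_INFORMATION are the two current Snowflake system functions for viewing clustering metadata. A minimal sketch of invoking them, with a hypothetical table and clustering key:

-- Returns a JSON document with clustering statistics for the given columns.
SELECT SYSTEM$CLUSTERING_INFORMATION('sales', '(store_id, sale_date)');

-- Returns the average clustering depth for the given columns.
SELECT SYSTEM$CLUSTERING_DEPTH('sales', '(store_id, sale_date)');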
