
Databricks-Certified-Data-Analyst-Associate Exam Dumps - Databricks Certified Data Analyst Associate Exam

Question # 9

A data analyst has created a user-defined function using the following line of code:

CREATE FUNCTION price(spend DOUBLE, units DOUBLE)
RETURNS DOUBLE
RETURN spend / units;

Which of the following code blocks can be used to apply this function to the customer_spend and customer_units columns of the table customer_summary to create column customer_price?

A. SELECT PRICE customer_spend, customer_units AS customer_price FROM customer_summary
B. SELECT price FROM customer_summary
C. SELECT function(price(customer_spend, customer_units)) AS customer_price FROM customer_summary
D. SELECT double(price(customer_spend, customer_units)) AS customer_price FROM customer_summary
E. SELECT price(customer_spend, customer_units) AS customer_price FROM customer_summary
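
For reference, a minimal sketch of how a scalar SQL UDF of this shape is defined and then applied row by row in Databricks SQL. The sample_orders table and its total_spend and total_units columns are illustrative names, not part of the question:

-- Define a scalar SQL UDF that divides one DOUBLE by another
CREATE FUNCTION price(spend DOUBLE, units DOUBLE)
RETURNS DOUBLE
RETURN spend / units;

-- Apply the UDF to two columns and alias the result
-- (sample_orders, total_spend, and total_units are illustrative names)
SELECT price(total_spend, total_units) AS unit_price
FROM sample_orders;

A scalar SQL UDF is invoked in the SELECT list just like a built-in function: each input column is passed as an argument and the result is aliased with AS.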

Question # 10

A data analyst has been asked to provide a list of options on how to share a dashboard with a client. It is a security requirement that the client does not gain access to any other information, resources, or artifacts in the database.

Which of the following approaches cannot be used to share the dashboard and meet the security requirement?

A. Download the dashboard as a PDF and share it with the client.
B. Set a refresh schedule for the dashboard and enter the client's email address in the "Subscribers" box.
C. Take a screenshot of the dashboard and share it with the client.
D. Generate a Personal Access Token that is good for 1 day and share it with the client.
E. Download a PNG file of the visualizations in the dashboard and share them with the client.

Question # 11

A data engineering team has created a Structured Streaming pipeline that processes data in micro-batches and populates gold-level tables. The micro-batches are triggered every 10 minutes.

A data analyst has created a dashboard based on this gold-level data. The project stakeholders want to see the results in the dashboard updated within 10 minutes or less of new data becoming available in the gold-level tables.

Which of the following can be used to ensure the streamed data is included in the dashboard at the standard requested by the project stakeholders?

A. A refresh schedule with an interval of 10 minutes or less
B. A refresh schedule with an always-on SQL Warehouse (formerly known as SQL Endpoint)
C. A refresh schedule with stakeholders included as subscribers
D. A refresh schedule with a Structured Streaming cluster

Question # 12

A data analyst is working with gold-layer tables to complete an ad-hoc project. A stakeholder has provided the analyst with an additional dataset that can be used to augment the gold-layer tables already in use.

Which of the following terms is used to describe this data augmentation?

A. Data testing
B. Ad-hoc improvements
C. Last-mile
D. Last-mile ETL
E. Data enhancement

Question # 13

Query History provides Databricks SQL users with a lot of benefits. A data analyst has been asked to share all of these benefits with their team as part of a training exercise. One of the benefit statements the analyst provided to their team is incorrect.

Which statement about Query History is incorrect?

A. It can be used to view the query plan of queries that have run.
B. It can be used to debug queries.
C. It can be used to automate query execution on multiple warehouses (formerly endpoints).
D. It can be used to troubleshoot slow-running queries.

Question # 14

A data analyst has recently joined a new team that uses Databricks SQL, but the analyst has never used Databricks before. The analyst wants to know where in Databricks SQL they can write and execute SQL queries.

On which of the following pages can the analyst write and execute SQL queries?

A. Data page
B. Dashboards page
C. Queries page
D. Alerts page
E. SQL Editor page

Question # 15

A data analyst needs to use the Databricks Lakehouse Platform to quickly create SQL queries and data visualizations. It is a requirement that the compute resources in the platform can be made serverless, and it is expected that data visualizations can be placed within a dashboard.

Which of the following Databricks Lakehouse Platform services/capabilities meets all of these requirements?

A. Delta Lake
B. Databricks Notebooks
C. Tableau
D. Databricks Machine Learning
E. Databricks SQL

Question # 16

Data professionals with varying titles use the Databricks SQL service as the primary touchpoint with the Databricks Lakehouse Platform. However, some users will use other services like Databricks Machine Learning or Databricks Data Science and Engineering.

Which of the following roles uses Databricks SQL as a secondary service while primarily using one of the other services?

A. Business analyst
B. SQL analyst
C. Data engineer
D. Business intelligence analyst
E. Data analyst
