The enterprise analytics team needs to resolve the DAX measure performance issues.
What should the team do first?
Use Performance analyzer in Power BI Desktop to get the DAX durations.
Use DAX Studio to get detailed statistics on the server timings.
Use DAX Studio to review the VertiPaq Analyzer metrics.
Use Tabular Editor to create calculation groups.
You need to create Power BI reports that will display data based on the customers’ subscription level.
Which three actions should you perform in sequence? To answer, move the appropriate actions from the list of actions to the answer area and arrange them in the correct order.
Step 1: Create row-level security (RLS) roles
Create roles
Note: Provide all the customers with their own Power BI workspace to create their own reports. Each workspace will use the new dataset in the FinData workspace.
Implement subscription levels for the customers. Each subscription level will provide access to specific rows of financial data.
Deploy prebuilt datasets to Power BI to simplify the query experience of the customers.
Step 2: Create a DAX expression
Consider a model with two roles: The first role, named Workers, restricts access to all Payroll table rows by using the following rule expression:
FALSE()
Note: A rule will return no table rows when its expression evaluates to false.
A second role, named Managers, allows access to all Payroll table rows by using the following rule expression:
TRUE()
Note: Should a report user map to both roles, they will see all Payroll table rows. For the subscription-level scenario in this question, the rule expression instead filters rows dynamically per user; a sketch follows Step 3 below.
Step 3: Add members to row-level security (RLS) roles
Configure role mappings
Once [the model is] published to Power BI, you must map members to dataset roles.
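As an illustration of Step 2, a subscription-level rule expression might look like the following sketch. The table and column names (a FinancialData table with a SubscriptionLevel column, and a Customers table with Email and SubscriptionLevel columns) are assumptions for illustration only; the case study model may use different names.

// Hypothetical RLS rule defined on the FinancialData table:
// keep only the rows whose subscription level matches the signed-in customer.
FinancialData[SubscriptionLevel]
    = LOOKUPVALUE (
        Customers[SubscriptionLevel],
        Customers[Email], USERPRINCIPALNAME ()
    )

With a dynamic rule like this, each customer account mapped to the role in Step 3 sees only the rows for their own subscription level.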
You need to recommend a solution to resolve the query issue of the serverless SQL pool. The solution must minimize impact on the users.
What should you include in the recommendation?
Update the statistics for the serverless SQL pool.
Move the data from the serverless SQL pool to a dedicated Apache Spark pool.
Execute the sp_set_data_processed_limit stored procedure for the serverless SQL pool.
Move the data from the serverless SQL pool to a dedicated SQL pool.
Users indicate that queries against the serverless SQL pool fail occasionally because the size of tempdb has been exceeded.
In the dedicated SQL pool resource, temporary tables offer a performance benefit because their results are written to local rather than remote storage.
Temporary tables in serverless SQL pool.
Temporary tables in serverless SQL pool are supported but their usage is limited. They can't be used in queries which target files.
For example, you can't join a temporary table with data from files in storage. The number of temporary tables is limited to 100, and their total size is limited to 100 MB.
You need to recommend a solution for the analysts in the Finance and Accounting business unit to mitigate the increase in maintenance of their assets in the Power BI tenant.
Which two actions should you recommend? Each correct answer presents part of the solution.
NOTE: Each correct selection is worth one point.
Use Microsoft Purview to search for datasets that contain the relevant data.
Perform impact analysis on the relevant data source.
Create a live connection to a Power BI dataset.
Create a Power BI template app.
You need to configure the Sales Analytics workspace to meet the ad hoc reporting requirements.
What should you do?
Grant the sales managers the Build permission for the existing Power BI datasets.
Grant the sales managers admin access to the existing Power BI workspace.
Create a deployment pipeline and grant the sales managers access to the pipeline.
Create a PBIT file and distribute the file to the sales managers.
Allow sales managers to perform ad hoc sales reporting with minimal effort
Power BI report templates contain the following information from the report from which they were generated:
Report pages, visuals, and other visual elements
The data model definition, including the schema, relationships, measures, and other model definition items
All query definitions, such as queries, Query Parameters, and other query elements
What is not included in templates is the report's data.
Report templates use the file extension .PBIT (compare to Power BI Desktop reports, which use the .PBIX extension).
Note: With Power BI Desktop, you can create compelling reports that share insights across your entire organization. With Power BI Desktop templates, you can streamline your work by creating a report template, based on an existing template, which you or other users in your organization can use as a starting point for a new report's layout, data model, and queries. Templates in Power BI Desktop help you jump-start and standardize report creation.
The group registers the Power BI tenant as a data source.
You need to ensure that all the analysts can view the assets in the Power BI tenant. The solution must meet the technical requirements for Microsoft Purview and Power BI.
What should you do?
Create a scan.
Deploy a Power BI gateway.
Search the data catalog.
Create a linked service.
How should you configure the Power BI dataset refresh for the dbo.SalesTransactions table?
an incremental refresh of Product where the ModifiedDate value is during the last three days.
an incremental refresh of dbo.SalesTransactions where the SalesDate value is during the last three days.
a full refresh of all the tables
an incremental refresh of dbo.SalesTransactions where the SalesDate value is during the last hour.
The sales data in SQLDW is updated every 30 minutes. Records in dbo.SalesTransactions are updated in SQLDW up to three days after being created. The records do NOT change after three days.
You need to recommend a solution to ensure that sensitivity labels are applied. The solution must minimize administrative effort.
Which three actions should you include in the recommendation? Each correct answer presents part of the solution.
NOTE: Each correct selection is worth one point.
From the Power BI Admin portal, set Allow users to apply sensitivity labels for Power BI content to Enabled.
From the Power BI Admin portal, set Apply sensitivity labels from data sources to their data in Power BI to Enabled.
In SQLDW, apply sensitivity labels to the columns in the Customer and CustomersWithProductScore tables.
In the Power BI datasets, apply sensitivity labels to the columns in the Customer and CustomersWithProductScore tables.
From the Power BI Admin portal, set Make certified content discoverable to Enabled.
A Synapse Analytics dedicated SQL pool is named SQLDW.
Customer contact data in SQLDW and the Power BI dataset must be labeled as Sensitive. Records must be kept of any users that use the sensitive data.
A (not B): Enable sensitivity labels
Sensitivity labels must be enabled on the tenant before they can be used in both the service and in Desktop.
To enable sensitivity labels on the tenant, go to the Power BI Admin portal, open the Tenant settings pane, and find the Information protection section. There, set Allow users to apply sensitivity labels for Power BI content to Enabled.
D (not C): When data protection is enabled on your tenant, sensitivity labels appear in the sensitivity column in the list view of dashboards, reports, datasets, and dataflows.
E: The Power BI tenant discovery settings include Make certified content discoverable.
You need to populate the CustomersWithProductScore table.
How should you complete the stored procedure? To answer, select the appropriate options in the answer area.
NOTE: Each correct selection is worth one point.
Box 1: FLOAT
Identify which customers should receive promotional emails based on their likelihood of purchasing promoted products.
FLOAT is used in the last statement of the code: WITH (score FLOAT) as p;
From the syntax:
MODEL
The MODEL parameter is used to specify the model used for scoring or prediction. The model is specified as a variable or a literal or a scalar expression.
Box 2: dbo.CustomersWithProductScore
Identify which customers should receive promotional emails based on their likelihood of purchasing promoted products.
Only the dbo.CustomersWithProductScore table has the required field, score.
From the syntax:
DATA
The DATA parameter is used to specify the data used for scoring or prediction. Data is specified in the form of a table source in the query. Table source can be a table, table alias, CTE alias, view, or table-valued function.
You need to implement object-level security (OLS) in the Power BI dataset for the sales associates.
Which three actions should you perform in sequence? To answer, move the appropriate actions from the list of actions to the answer area and arrange them in the correct order.
What should you configure in the deployment pipeline?
a backward deployment
a selective deployment
auto-binding
a data source rule
Development Process Requirements
Litware identifies the following development process requirements:
SQLDW and datalake1 will act as the development environment. Once feature development is complete, all entities in synapseworkspace1 will be promoted to a test workspace, and then to a production workspace.
Power BI content must be deployed to test and production by using deployment pipelines.
Create deployment rules
When working in a deployment pipeline, different stages may have different configurations. For example, each stage can have different databases or different query parameters. The development stage might query sample data from the database, while the test and production stages query the entire database.
When you deploy content between pipeline stages, configuring deployment rules enables you to allow changes to content while keeping some settings intact. For example, if you want a dataset in a production stage to point to a production database, you can define a rule for this. The rule is defined in the production stage, under the appropriate dataset. Once the rule is defined, content deployed from test to production will inherit the value defined in the deployment rule, which will always apply as long as the rule is unchanged and valid.
You can configure data source rules and parameter rules.
Incorrect:
Not B: If you already have a steady production environment, you can deploy it backward (to Test or Dev, based on your need) and set up the pipeline. The feature is not limited to any sequential order.
You need to create the customized Power BI usage reporting. The Usage Metrics Report dataset has already been created. The solution must minimize development and administrative effort.
Which four actions should you perform in sequence? To answer, move the appropriate actions from the list of actions to the answer area and arrange them in the correct order.
Step 1: From powerbi.com, create a new report.
The company wants custom Power BI usage reporting that includes the percent change of users that view reports in the Sales Analytics workspace each month.
Step 2: Add a report measure (see the sketch after Step 4 below)
Measures are used in some of the most common data analyses. Simple summarizations such as sums, averages, minimum, maximum and counts can be set through the Fields well. The calculated results of measures are always changing in response to your interaction with your reports, allowing for fast and dynamic ad-hoc data exploration.
Step 3: Add visuals to the report
Step 4: Publish the report to the Sales Analytics workspace
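As an illustration of Step 2, a measure for the month-over-month percent change in report viewers might look like the following sketch. The 'Report views', [UserId], and 'Dates'[Date] names are assumptions about the Usage Metrics Report model; the actual table and column names may differ.

// Hypothetical measure: percent change in distinct report viewers versus the previous month
Viewers MoM % change =
VAR CurrentViewers =
    DISTINCTCOUNT ( 'Report views'[UserId] )
VAR PreviousViewers =
    CALCULATE (
        DISTINCTCOUNT ( 'Report views'[UserId] ),
        DATEADD ( 'Dates'[Date], -1, MONTH )
    )
RETURN
    DIVIDE ( CurrentViewers - PreviousViewers, PreviousViewers )

DIVIDE returns blank when the previous month has no viewers, which keeps the visual clean for the first month of data.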
You are building a Power BI dataset that contains a table named Calendar. Calendar contains the following calculated column.
pfflag = IF('Calendar'[Date] < TODAY(), "Past", "Future")
You need to create a measure that will perform a fiscal prior year-to-date calculation that meets the following requirements:
• Returns the fiscal prior year-to-date value for [Sales Amount]
• Uses a fiscal year end of June 30
• Produces no result for dates in the future
How should you complete the DAX expression? To answer, select the appropriate options in the answer area.
NOTE: Each correct selection is worth one point.
Box 1: CALCULATETABLE
CALCULATETABLE evaluates a table expression in a modified filter context.
Syntax: CALCULATETABLE(<expression>[, <filter1>[, <filter2>[, …]]])
Incorrect:
* SUMMARIZECOLUMNS
SUMMARIZECOLUMNS returns a summary table over a set of groups.
Syntax: SUMMARIZECOLUMNS(<groupBy_columnName>[, <groupBy_columnName>]…, [<filterTable>]…, [<name>, <expression>]…)
* CROSSJOIN returns a table that contains the Cartesian product of all rows from all tables in the arguments. The columns in the new table are all the columns in all the argument tables.
Syntax: CROSSJOIN(<table>, <table>[, <table>]…)
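Putting the boxes together, one measure consistent with Box 1 being CALCULATETABLE is sketched below. This is an illustration of the pattern, not the official answer key; the measure name is arbitrary, and "6-30" supplies the June 30 fiscal year end.

// Sketch: fiscal prior year-to-date sales, blank for future dates
Sales Amount PYTD =
CALCULATE (
    [Sales Amount],
    CALCULATETABLE (
        DATESYTD ( SAMEPERIODLASTYEAR ( 'Calendar'[Date] ), "6-30" ),
        'Calendar'[pfflag] = "Past"
    )
)

SAMEPERIODLASTYEAR shifts the dates in the current filter context back one year, and DATESYTD expands them to a fiscal year-to-date range that ends June 30. Because the pfflag filter is applied to the context in which those functions are evaluated, the date set is empty whenever the current context contains only future dates, so the measure produces no result for them.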