Do you have a Power BI report that takes ages to refresh and load? Do you often get frustrated by the time you spend waiting before you can even start development work? Does your report fail to refresh in time for reporting?

Working with slow data sources or complex queries with long refresh times can be frustrating. Consider the points below to improve load times and refresh performance in Power BI, for a faster and more responsive experience.

A Power BI file can be connected to many different types of data sources, and that data can be shaped and transformed to meet the reporting requirements. The connections and transformations are stored in queries, which are by default refreshed either by a manual or automatic refresh of the report in the Power BI service.

It’s important to understand the various report optimisation techniques while developing a Power BI report, so the report loads faster and refreshes more quickly.

1 Use Query Folding

Query folding is Power Query’s ability to translate your transformation steps into a single query statement (for example, a SQL statement) that is executed by the data source itself. Because the source does the heavy lifting and only the transformed result travels to Power BI, refresh operations complete faster.
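As a minimal sketch (server, database, table and column names here are placeholders), filtering and column selection against a SQL Server source are steps that typically fold:

```m
let
    // Connect to a SQL Server source (names are illustrative)
    Source = Sql.Database("myserver", "SalesDb"),
    Sales = Source{[Schema = "dbo", Item = "Sales"]}[Data],
    // These steps fold: Power Query pushes them back to the source
    // as part of a single native SQL statement
    Filtered = Table.SelectRows(Sales, each [OrderDate] >= #date(2023, 1, 1)),
    Kept = Table.SelectColumns(Filtered, {"OrderDate", "CustomerID", "Amount"})
in
    Kept
```

To check whether a step folds, right-click it in the Applied Steps pane; if "View Native Query" is available, folding is still happening. Steps such as custom M functions can break folding, so keep foldable steps early in the query.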

2 Managing the loading of queries

In many situations, we can manage the loading of queries by breaking the data transformations down into multiple queries. Some of these queries are intermediate steps that do not need to be loaded into the data model, so their load can be disabled by unchecking ‘Enable load’ in Power BI Desktop.
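The pattern can be sketched as follows (query and column names are illustrative): a staging query holds the raw data with its load disabled, and only the final query that references it is loaded into the model.

```m
// Final query "SalesClean", the only one loaded into the model.
// It references a staging query "RawSales" whose load has been
// disabled via right-click > Enable load in the Queries pane.
let
    Source = RawSales,  // reference to the staging query
    Cleaned = Table.SelectRows(Source, each [Amount] <> null)
in
    Cleaned
```

This keeps intermediate results out of the model without duplicating transformation logic across queries.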

3 Excluding queries from refresh 

For queries whose source data is updated rarely or not at all, you can exclude them from the report refresh by unchecking ‘Include in report refresh’ in the query’s context menu in Power BI Desktop, or in the Properties screen.

4 Use fewer images in the report

Using many images in a report reduces performance, because each image takes time to render. Use as few images as possible, and consider embedding smaller images as base64-encoded text to avoid fetching them from external sources at render time.
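As a sketch of the base64 approach in Power Query (the file path is a placeholder), an image file can be read and encoded as a text value, which can then be stored in the model and displayed by a visual that accepts image URLs using a data URI prefix such as `data:image/png;base64,`:

```m
let
    // Read the image file from disk (path is illustrative)
    Source = File.Contents("C:\Reports\logo.png"),
    // Encode the binary content as base64 text
    Base64 = Binary.ToText(Source, BinaryEncoding.Base64)
in
    Base64
```

Note that base64 encoding embeds the image in the model rather than shrinking it, so this suits small images like logos and icons.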

5 Custom Visuals

We often have to use custom visuals in our reports to meet reporting requirements and expectations. However, the custom visuals available in the marketplace are not always optimised for performance, so we should be cautious about the effect of adding them to our reports.

6 Limit Data Model

Removing unnecessary columns, either at the database level or in Power Query, reduces the model size and helps improve the speed and performance of the report. We can also use row-level security to restrict rows based on the filter conditions defined in the security roles.
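A row-level security rule is a DAX filter expression defined under Modeling > Manage roles in Power BI Desktop. As a minimal sketch (the table and column names are assumptions, and the pattern presumes the table stores each user's sign-in email):

```dax
-- RLS rule on the Sales table: each user sees only the rows
-- whose UserEmail column matches their own login.
[UserEmail] = USERPRINCIPALNAME()
```

In larger models, a common variation is to apply the rule to a small mapping table of users and regions and let relationships filter the fact tables.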

7 Use Measure instead of Calculated Column

It is best practice to create a measure instead of a calculated column where possible. Measures are evaluated at query time, while calculated columns are computed during refresh and consume memory in the model. It is also important to use appropriate DAX formulas and functions when creating measures and calculated columns. If an intermediate result is not needed by any other measure, use a variable inside the measure instead of a separate measure, to keep the code clean and structured.
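The difference can be sketched in DAX (the Sales table and its Amount and Cost columns are illustrative):

```dax
-- Calculated column: computed row by row at refresh time
-- and stored in memory for every row of the table.
Margin = Sales[Amount] - Sales[Cost]

-- Measure: evaluated only at query time. Variables hold the
-- intermediate results instead of separate helper measures.
Margin % =
VAR TotalAmount = SUM ( Sales[Amount] )
VAR TotalCost = SUM ( Sales[Cost] )
RETURN
    DIVIDE ( TotalAmount - TotalCost, TotalAmount )
```

Besides keeping the code readable, variables are evaluated at most once per measure evaluation, which can also avoid repeated computation.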

8 Using Power BI Dataflow  

While the options above help make Power BI processes faster, using a dataflow separates the ETL process from the Power BI dataset, isolating the slow data source and therefore making the dataset itself faster to refresh.

A Power BI dataflow is a set of Power Query transformations running in the Power BI service, independent of any Power BI dataset. It runs the Power Query transformations and loads the output into Azure Data Lake Storage for future use, so the transformations happen in a separate process against the data lake.

By pushing all the heavy transformations into a dataflow, the refresh of the Power BI dataset itself becomes very fast. We still need to schedule the refresh of the dataflow in the service, but because it can run on a different schedule than the dataset, we do not have to wait for it to finish before getting on with development work.

Although the dataset refresh gets faster with a dataflow, the Power Query transformations themselves still take a significant amount of time to run; they are simply scheduled separately. In other words, a dataflow separates the heavy-lifting transformations of the ETL (Extract, Transform, Load) process from the refresh of the Power BI dataset, and the ETL work happens in the dataflow rather than in the .pbix report.

To find out more, contact 365 Solutions Group today so we can help you enhance your reporting through the implementation of these options.