Have you ever run into the error where your visual exceeded available resources? I wrote this article to help you solve it.

But before we start, it is important to know what causes this error.
Root cause
Behind all the relationships, DAX measures, filters and shiny visualizations in Power BI there is code (queries). That code makes sure that everything you have created is connected and works together.
For example:
A single matrix visual with one measure and a normal five-level hierarchy can generate more than 100 hidden background queries. And every background query comes with a certain cost (measured in KB or CPU time).
Even Power BI has a limit to what it can handle, and if you cross that limit Power BI will tell you with the error above.
There are many ways to cross the limit and I have not discovered all of them, but in my experience the following points are the most common:
1. Inefficient data model (for example: too much usage of many-to-many relationships).
2. Badly written DAX measures.
3. Too much lineage (dependencies) in your DAX measures.
4. Too many data columns in your visual.
5. Too many page or visual filters.
All of the points above generate queries in the background. If you run into a Power BI error that states exceeded resources or exceeded CPU limit, you probably have too many of them, or they are too heavy to handle.
How to start?
To figure out how many queries, how much CPU time and how many KB we are actually using, we can use the Performance Analyzer in Power BI Desktop. By refreshing the failing visual with Performance Analyzer running, we can capture the query we need to analyze the background work.
To analyze the background queries we use DAX Studio and its Server Timings feature.
Server Timings gives you insight into the number of queries, the CPU time, the number of KB consumed, and the split between the formula engine and the storage engine. That is exactly what we need to analyze our visual and see where we can improve.
Once we copy the query that Performance Analyzer gives us and run it in DAX Studio with Server Timings enabled, it will look something like this:

How to analyze server timings?
The first thing you need to know is the difference between the formula engine (FE) and the storage engine (SE). I am not going too much into detail because I would like to keep this post as simple as possible, but if you want, you can check out this blog post from SQLBI, which explains the difference in detail:
In a nutshell, the storage engine is responsible for retrieving the data and the formula engine is responsible for the calculations that come afterwards. The storage engine is known to be the faster of the two in CPU time. In my experience, a good rule of thumb is roughly 30% formula engine and 70% storage engine.
This also means you can draw your first conclusion here. If you have 99% formula engine and 1% storage engine, you are going to burn a lot of CPU time because the formula engine is doing all the work. This refers back to point 2 (badly written DAX measures) or point 3 (too much lineage (dependencies) in your DAX measures).
The first question you can ask yourself is: can I adjust my measures to be less formula-engine heavy?
Often you can, by using more variables and by minimizing the usage of iterator functions such as SUMX, of IF, and of FILTER.
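A quick illustration (the table, column and measure names here are invented): a measure that forces the formula engine to iterate can often be rewritten so the storage engine does more of the work, and variables avoid evaluating the same expression twice.

```dax
-- Before: FILTER over the whole Sales table is evaluated row by row
West Sales =
CALCULATE ( SUM ( Sales[Amount] ), FILTER ( Sales, Sales[Region] = "West" ) )

-- After: a simple boolean column filter, which the engine can
-- translate into an efficient storage engine scan
West Sales =
CALCULATE ( SUM ( Sales[Amount] ), Sales[Region] = "West" )

-- Variables: compute the total once and reuse it,
-- instead of evaluating the same expression twice inside IF
West Share =
VAR Total = SUM ( Sales[Amount] )
RETURN IF ( Total > 0, DIVIDE ( [West Sales], Total ) )
```

Note that the two filter forms are not always semantically identical (the table filter keeps all columns of Sales in the filter context), so verify the results and re-check Server Timings after a rewrite like this.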
The second question you can ask yourself is: can I adjust the references (lineage) of my measures so that they sit closer to the data source?
By measure lineage I mean the entire chain of measures: you start with one measure, reference it in another measure, and before you know it you have a chain of 10 measures which all need to be evaluated for the visual.
Is this entire lineage (chain) really necessary, or can it be shortened? And do you need the entire chain for this specific visual?
You can answer this by using more variables, or simply by checking whether you can delete some of the in-between measures. Also check whether your visual really needs the end of the lineage, or whether a measure a few levels earlier in the chain would do.
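To make the lineage idea concrete, here is a sketch with invented measure names. Each measure in the chain references the previous one; for a visual that only needs the final number, the chain can be flattened into a single measure with variables.

```dax
-- A chain of three measures; every visual using [Final Sales]
-- drags the whole chain along
Base Sales  = SUM ( Sales[Amount] )
Net Sales   = [Base Sales] - SUM ( Sales[Discount] )
Final Sales = [Net Sales] - SUM ( Sales[ReturnAmount] )

-- Flattened version for that specific visual: the in-between
-- steps become variables instead of separate measures
Final Sales (flat) =
VAR BaseSales = SUM ( Sales[Amount] )
VAR NetSales  = BaseSales - SUM ( Sales[Discount] )
RETURN NetSales - SUM ( Sales[ReturnAmount] )
```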
Now you have balanced the formula engine and the storage engine by optimizing your measures and making sure you use the right lineage. If you still receive the error, or would like to improve some more, the next task is to analyze what Server Timings actually gives you.
Let’s have a look at the output of the server timings.

After running the Performance Analyzer query in DAX Studio with Server Timings, you will see the following columns:
Subclass – The operation used. I always focus on the individual Scan events.
Duration – The total duration of the storage engine query in milliseconds.
CPU Time – The CPU time of the query.
Rows – The number of rows it has to load.
KB – The amount of RAM consumed.
Query – The code of the executed query.
By simply sorting on the column that matters for your analysis, you create a starting point for yourself. For example, if you want to reduce CPU time, sort on the CPU Time column and check which query has the highest CPU time. Examine that query and try to see whether there are any improvements you could make based on what it does and what it stands for.
My advice here is to sort on the Query column. This sorts and groups identical queries together, so you can see which queries are repeated the most. If you have queries that are repeated many times, ask yourself whether that is really necessary. This can refer back to point 1 (inefficient data model, for example the usage of many-to-many relationships), because with a many-to-many relationship the number of background queries is sometimes multiplied. By changing relationships from many-to-many to one-to-many you can greatly reduce the number of background queries.
If you cannot simply change a relationship from many-to-many into one-to-many, try using an in-between (bridge) table with a one-to-many (single direction) relationship on one side and a one-to-many (both directions) relationship on the other. This also helps reduce the number of queries, because every relationship is still one-to-many.
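As a sketch of that bridge-table pattern (table and column names are invented): the in-between table holds one row per distinct key, built for example as a calculated table, and each fact table then relates to it one-to-many.

```dax
-- Hypothetical bridge table: one row per distinct customer key,
-- so Sales and Returns can each relate to it one-to-many
-- instead of many-to-many to each other
CustomerBridge =
DISTINCT (
    UNION (
        DISTINCT ( Sales[CustomerKey] ),
        DISTINCT ( Returns[CustomerKey] )
    )
)
```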
After step 2 you have optimized your DAX queries and your data model. If you still have performance issues, that can refer to point 4 (too many data columns in your visual) or point 5 (too many page or visual filters). Both of these generate queries, CPU time and KB, and you cannot always change them because you need them.
Here I advise looking at the layout of your page and/or visual. Do you really need all this data in the same visual? Or can you spread it over multiple visuals or, preferably, drill-through pages? Try to spread the background queries over multiple pages so they are not all loaded at once.
Also, did you know that a table visual costs far less performance than a matrix visual? This is because a matrix visual is technically an iteration.
Conclusion
With the guide above, I am confident you will have your performance issues solved in no time.

