Power BI Performance Audit - What to Check Before Scaling

April 17, 2026 · 13 min read · Michael Ridland

Your Power BI environment works fine today. But you're about to add more users, connect more data sources, or roll out to additional departments - and you're wondering whether it will hold up.

This is the right question to ask at the right time. We've been called in too many times to fix Power BI environments that worked fine at 50 users but fell apart at 500, or handled six months of data smoothly but ground to a halt at two years. A performance audit before scaling saves you from the embarrassment of rolling out analytics that nobody can actually use because the reports take 45 seconds to load.

Here's the performance audit we run for Australian clients before any significant Power BI scale-up. You can use it as a self-assessment or as a framework for engaging a consultant.

When You Need a Performance Audit

Scale these up and things break:

  • Users: Going from 50 to 500 report consumers
  • Data volume: Historical data growing beyond 1-2 GB
  • Reports: Adding 20+ new reports to an existing environment
  • Data sources: Connecting additional systems to existing models
  • Refresh frequency: Moving from daily to hourly or near-real-time

Warning signs that indicate performance issues already exist:

  • Reports take more than 10 seconds to load
  • Data refreshes are taking longer each month
  • Users complain about "spinning wheel" wait times
  • Refresh failures are becoming more frequent
  • You're getting capacity throttling warnings
  • Different reports show different numbers for the same metric

If any of these apply, don't wait until you scale - audit now.

The Audit Framework - Six Areas to Check

1. Data Model Assessment

The data model is the foundation of Power BI performance. A poorly designed model can't be fixed by throwing more capacity at it - it needs to be rebuilt.

What to Check

Star schema compliance. The gold standard for Power BI data models is a star schema - fact tables (transactions, events) connected to dimension tables (products, customers, dates) via single-direction relationships.

| Pattern | Performance Impact | What to Do |
| --- | --- | --- |
| Star schema | Best | Keep it |
| Snowflake schema | Acceptable | Consider flattening dimensions |
| Flat/wide tables | Poor at scale | Redesign into star schema |
| Many-to-many relationships | Variable | Review each; consider bridge tables |
| Circular dependencies | Bad | Eliminate immediately |

Column count. Every column in your data model consumes memory, even if no report uses it. We routinely find data models with 200+ columns where reports use fewer than 80.

Action: Remove every column that isn't used in a visual, filter, relationship, or calculation. This alone can reduce model size by 30-50%.

Cardinality. Columns with very high cardinality (many unique values) - like transaction IDs, timestamps with seconds precision, or free-text fields - consume disproportionate memory.

Action: Remove high-cardinality columns that aren't needed for analysis. If you need a timestamp, consider reducing precision to the hour or day level unless granular timing is required.

Calculated columns vs measures. Calculated columns are computed during data refresh and stored in the model. Measures are computed at query time. Calculated columns that could be measures waste memory.

Action: Convert calculated columns to measures wherever possible. Calculated columns should only be used when you need to filter or sort by the result, or when the calculation needs to be evaluated row by row during refresh.
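As an illustration of the conversion, here is a minimal sketch in DAX. The `Sales` table and its `Revenue` and `Cost` columns are hypothetical names, not from the original article:

```dax
-- Calculated column (computed row by row at refresh and stored in the model):
-- Sales[Margin] = Sales[Revenue] - Sales[Cost]

-- Equivalent measure (computed at query time, consumes no model memory):
Total Margin = SUM ( Sales[Revenue] ) - SUM ( Sales[Cost] )
```

The measure produces the same aggregate result in visuals while removing a stored column from every row of the fact table.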

How to Measure

Power BI Performance Analyser (built into Power BI Desktop) shows query execution times for each visual. Open it, refresh your visuals, and look for any query taking more than 2 seconds.

DAX Studio is a free tool that provides detailed analysis of your data model - table sizes, column cardinality, memory usage, and query performance. Every serious Power BI audit should include a DAX Studio analysis.

Key metrics from DAX Studio:

| Metric | Good | Concerning | Action Needed |
| --- | --- | --- | --- |
| Total model size | Under 500 MB | 500 MB - 2 GB | Over 2 GB |
| Largest table | Under 100 MB | 100-500 MB | Over 500 MB |
| Unused columns | 0 | 1-20 | Over 20 |
| High-cardinality columns (>1M unique) | Only where needed | A few unnecessary | Many unnecessary |

2. DAX Query Performance

Slow reports are often caused by inefficient DAX calculations rather than data model issues. A single poorly written measure can make an entire report page unusable.

What to Check

CALCULATE with complex filters. Measures that use CALCULATE with multiple filter arguments or context transitions can be expensive. Review your most complex measures.

Iterating functions on large tables. Functions like SUMX, AVERAGEX, and FILTER that iterate row by row over large fact tables are performance-intensive. If they're iterating over millions of rows, look for alternatives.

Nested iterations. A SUMX inside a SUMX, or any pattern where an iterating function calls another iterating function, multiplies the row counts being scanned and can make performance degrade dramatically as tables grow.

Time intelligence calculations. Year-over-year, month-to-date, and rolling calculations are common in business reporting but can be expensive if not written efficiently. Use standard DAX time intelligence patterns (SAMEPERIODLASTYEAR, DATESYTD) rather than custom date logic.
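The standard patterns look like this in DAX. This is a hedged sketch: it assumes a hypothetical `Sales[Amount]` column and a dedicated `'Date'` dimension marked as a date table, neither of which comes from the original article:

```dax
-- Base measure over a hypothetical fact table
Total Sales = SUM ( Sales[Amount] )

-- Year-to-date using the built-in time intelligence function
Sales YTD = CALCULATE ( [Total Sales], DATESYTD ( 'Date'[Date] ) )

-- Same period last year, for year-over-year comparison
Sales LY = CALCULATE ( [Total Sales], SAMEPERIODLASTYEAR ( 'Date'[Date] ) )

-- Year-over-year growth; DIVIDE handles the divide-by-zero case
Sales YoY % = DIVIDE ( [Total Sales] - [Sales LY], [Sales LY] )
```

These functions require a proper date dimension; that's one more reason to maintain one instead of relying on auto-generated date tables.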

Optimisation Techniques

| Issue | Slow Pattern | Better Approach |
| --- | --- | --- |
| Counting filtered rows | `COUNTROWS(FILTER(Table, condition))` | `CALCULATE(COUNTROWS(Table), condition)` |
| Conditional aggregation | `SUMX(Table, IF(condition, [Amount], 0))` | `CALCULATE(SUM(Table[Amount]), condition)` |
| Complex time calculations | Custom date filtering logic | Standard DAX time intelligence functions |
| Multiple measures per visual | 10+ measures in a single table visual | Reduce measures or split into multiple visuals |
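Written out as full measures, the first two patterns look like this. The `Sales` table, `Amount` and `Channel` columns, and the threshold values are illustrative assumptions, not from the original article:

```dax
-- Slow: FILTER materialises a row-by-row scan of the whole fact table
Large Orders Slow =
COUNTROWS ( FILTER ( Sales, Sales[Amount] > 1000 ) )

-- Faster: a simple column predicate that the storage engine can evaluate directly
Large Orders Fast =
CALCULATE ( COUNTROWS ( Sales ), Sales[Amount] > 1000 )

-- Slow: IF evaluated once per row inside an iterator
Online Revenue Slow =
SUMX ( Sales, IF ( Sales[Channel] = "Online", Sales[Amount], 0 ) )

-- Faster: filter first, then aggregate
Online Revenue Fast =
CALCULATE ( SUM ( Sales[Amount] ), Sales[Channel] = "Online" )
```

The pattern in both cases is the same: express the condition as a column filter that CALCULATE can push down, rather than forcing the formula engine to visit every row.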

How to identify slow measures: Use Power BI Performance Analyser to identify which visuals are slow, then use DAX Studio's Server Timings feature to see exactly which measures are taking the most time.

3. Report Design and Visual Optimisation

Each visual on a report page generates one or more DAX queries. More visuals means more queries means slower page loads.

What to Check

Visuals per page. Every visual is a separate query against the data model.

| Visuals Per Page | Expected Impact |
| --- | --- |
| 1-6 | Fast - under 3 seconds |
| 7-12 | Moderate - 3-8 seconds |
| 13-20 | Slow - 8-15+ seconds |
| Over 20 | Unacceptable - redesign the page |

Action: Reduce visuals per page. Use drill-through pages for detail instead of cramming everything onto one page. Use bookmarks to show/hide sections rather than displaying everything at once.

Slicers. Slicers set to "single select" or "dropdown" mode perform better than list slicers that render every value at once. Any slicer showing hundreds of values should use dropdown mode.

Cross-filtering. By default, clicking on one visual filters all other visuals on the page. This is useful but means every interaction triggers multiple queries. Disable cross-filtering for visuals where it's not needed.

Custom visuals. Third-party custom visuals vary wildly in performance. Some are well-optimised; others generate multiple queries and process data inefficiently. If you're using custom visuals, test their performance specifically.

Auto date/time tables. Power BI automatically creates hidden date tables for each date column. If you have multiple date columns and are using a dedicated date dimension table (which you should be), disable auto date/time in options. This reduces model size and prevents confusion.
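A dedicated date dimension can be built in a single DAX calculated table. This is a minimal sketch; the date range and the derived columns are illustrative choices, and you would mark the result as a date table and add whatever attributes your reports need:

```dax
-- Minimal date dimension; mark as a date table on the [Date] column
Date =
ADDCOLUMNS (
    CALENDAR ( DATE ( 2020, 1, 1 ), DATE ( 2026, 12, 31 ) ),
    "Year", YEAR ( [Date] ),
    "Month", FORMAT ( [Date], "MMM YYYY" ),
    "MonthNumber", MONTH ( [Date] )
)
```

One shared date table relates to every date column in the model, replacing the per-column hidden tables that auto date/time would otherwise generate.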

4. Data Refresh Performance

As your data grows, refresh times extend. Eventually, refreshes start failing because they exceed the timeout window.

What to Check

Current refresh duration. Check the refresh history for each dataset in the Power BI Service. If any refresh is taking more than 30 minutes, it needs attention before you scale.

| Refresh Duration | Status |
| --- | --- |
| Under 10 minutes | Healthy |
| 10-30 minutes | Monitor closely |
| 30-60 minutes | Optimise before scaling |
| Over 60 minutes | Immediate action needed |

Incremental refresh configuration. If you're refreshing full datasets of over 500,000 rows, you should be using incremental refresh. This tells Power BI to only refresh new and changed data rather than reloading everything.

Query folding. When Power Query transformations are "folded" back to the data source (executed as SQL rather than in-memory), performance is dramatically better. Check that your Power Query steps support query folding, especially the filter and column selection steps.

How to check query folding: In Power Query Editor, right-click on each step. If "View Native Query" is available, that step folds. If it's greyed out, folding has broken at or before that step.

Gateway performance. If you're connecting to on-premises data sources through a Power BI Gateway, the gateway itself can be a bottleneck. Check gateway server CPU, memory, and network metrics during refresh windows.

| Gateway Metric | Healthy | Investigate |
| --- | --- | --- |
| CPU during refresh | Under 60% | Over 80% |
| Memory usage | Under 70% | Over 85% |
| Network throughput | Consistent | Highly variable or saturated |

5. Capacity and Licensing Health

If you're on Premium or Fabric capacity, the capacity itself can become a bottleneck.

What to Check

Capacity utilisation. The Power BI Premium Capacity Metrics app (available from AppSource) shows how much of your capacity is being consumed.

| Utilisation | Status | Action |
| --- | --- | --- |
| Under 50% | Healthy headroom | Good position to scale |
| 50-75% | Moderate | Plan for capacity increase if scaling significantly |
| 75-90% | High | Optimise before scaling; consider capacity upgrade |
| Over 90% | At risk | Immediate attention; scaling will cause failures |

Throttling events. Check whether your capacity is experiencing throttling (where Power BI delays operations because the capacity is overloaded). Any throttling events indicate you're at or beyond your capacity limit.

Refresh scheduling conflicts. If multiple large datasets refresh simultaneously, they compete for capacity resources. Stagger refresh schedules to spread the load.

Autoscale configuration. Premium Gen2 and Fabric capacities support autoscale - automatically adding temporary capacity during peak usage. If you haven't configured autoscale and your usage is peaky, this is worth enabling.

6. Governance and Environment Health

Performance isn't just about speed - it's also about maintainability and reliability as you scale.

What to Check

Dataset proliferation. How many datasets exist in your environment? Are there duplicate datasets (different people connecting to the same data source independently)?

| Datasets | Situation |
| --- | --- |
| Under 20 | Manageable |
| 20-50 | Review for duplicates |
| 50-100 | Likely significant duplication |
| 100+ | Needs consolidation |

Action: Identify shared datasets that multiple reports should use rather than each report connecting to data sources independently. Power BI's shared dataset feature allows multiple reports to build on a single, well-governed dataset.

Report sprawl. Similar to dataset proliferation - how many reports exist, and how many are actually used?

Action: Use the Power BI usage metrics to identify reports that haven't been viewed in 90+ days. Archive or decommission them.

Broken refreshes. Check for datasets with failed refreshes. Broken refreshes mean stale data, which means users see old numbers and lose trust.

The Audit Report - What It Should Include

When we complete a performance audit for a client, the report includes:

  1. Executive summary - Overall health rating and top 3-5 priorities
  2. Data model analysis - Size, structure, cardinality, unused columns, relationship issues
  3. Query performance analysis - Slowest reports, slowest measures, root causes
  4. Refresh performance analysis - Duration trends, query folding status, incremental refresh opportunities
  5. Capacity analysis - Utilisation, throttling, headroom for growth
  6. Prioritised recommendations - Ranked by impact and effort, with estimated performance improvement

Quick Wins - What to Fix First

If you can't do a full audit, these five actions deliver the most performance improvement for the least effort:

1. Remove Unused Columns (30 Minutes - High Impact)

Open your data model in DAX Studio, identify columns that aren't used in any visual, measure, or relationship, and remove them from the Power Query import. This alone can cut model size by 30-50%.

2. Disable Auto Date/Time (5 Minutes - Moderate Impact)

In Power BI Desktop, go to File > Options > Data Load and uncheck "Auto date/time." Then remove the hidden date tables that were already created. This reduces model size and query overhead, especially if you have multiple date columns.

3. Enable Incremental Refresh (1-2 Hours - High Impact for Large Datasets)

For any dataset over 500,000 rows that refreshes daily, set up incremental refresh. This can reduce refresh time from hours to minutes.

4. Reduce Visuals Per Page (1-2 Hours - Immediate Impact)

Review your busiest report pages. If any page has more than 12 visuals, redesign it. Move detail into drill-through pages and use bookmarks for alternate views.

5. Convert Calculated Columns to Measures (2-4 Hours - Moderate Impact)

Review all calculated columns in your data model. Any calculated column that's only used in visuals (not in filters or sort orders) should be converted to a DAX measure.

Scaling Readiness Checklist

Before scaling your Power BI environment, confirm:

  • Data model follows star schema patterns
  • No unused columns in the data model
  • All measures are optimised (no unnecessary iterations)
  • Report pages have 12 or fewer visuals
  • Incremental refresh is configured for large datasets
  • Query folding is maintained in Power Query
  • Refresh schedules are staggered (no simultaneous large refreshes)
  • Capacity utilisation is below 60%
  • No throttling events in the past 30 days
  • Shared datasets are used instead of duplicate connections
  • Usage metrics confirm active reports are performing well
  • Gateway infrastructure has headroom (if applicable)

If you can tick all of these, you're in good shape to scale. If several are unchecked, address them first.

When to Call in a Specialist

A self-assessment covers a lot of ground, but there are situations where an external audit adds significant value:

  • Complex DAX optimisation. If your measures are deeply nested and you're not confident refactoring them, an experienced DAX consultant will find improvements faster.
  • Data model redesign. Restructuring a data model while keeping existing reports working requires experience.
  • Capacity planning. Predicting how much capacity you'll need after scaling requires understanding the relationship between data volume, concurrent users, query complexity, and capacity SKUs.
  • Pre-migration assessment. If you're moving to Microsoft Fabric or upgrading from Pro to Premium, an audit ensures you're not carrying performance problems into the new environment.

How Team 400 Runs Performance Audits

We're Power BI consultants who've optimised environments for organisations across Australia - from mid-sized businesses with a handful of reports to enterprises with hundreds of datasets and thousands of users.

Our performance audit process:

  1. Environment scan (1-2 days) - Automated analysis of your data models, query performance, refresh history, and capacity metrics using DAX Studio, Performance Analyser, and the Capacity Metrics app
  2. Deep analysis (2-3 days) - Manual review of your most critical reports, measures, and data models by senior consultants
  3. Recommendations report - Prioritised list of improvements with estimated effort and performance impact
  4. Implementation support - We can implement the recommendations or guide your team through the work

As Microsoft AI and BI specialists, we also help clients evaluate whether Microsoft Fabric is the right next step for organisations outgrowing standalone Power BI.

Is your Power BI environment ready to scale? Talk to our team about a performance audit. We'll give you an honest assessment and a clear plan before you invest in scaling something that isn't ready.

Explore our services or learn about our broader AI consulting work with Australian businesses.