Why Your Power BI Reports Are Slow (And How to Fix Them)

March 9, 2026 · 9 min read · Team 400

Every few weeks, we get the same call. Someone at an Australian organisation has a Power BI report that used to load in a couple of seconds. Now it takes thirty. Or a minute. Or it times out entirely. The data has grown, more users are hitting it, and nobody's thought about performance since the initial build.

The thing is, most Power BI performance problems aren't mysterious. They follow predictable patterns. Microsoft recently updated their official optimisation guide for Power BI, and it covers the technical layers well. But knowing the theory and knowing what to do in practice are different things. Here's what we've actually seen work.

Start With the Data Model, Not the Visuals

When a report is slow, most people look at the report itself - too many visuals, complex filters, that kind of thing. Sometimes that's the issue. But more often, the real problem is underneath: the data model.

Power BI gives you three storage modes for your semantic model tables: Import, DirectQuery, and Composite. The choice matters more than most people realise.

Import pulls a copy of your data into Power BI's in-memory engine. It's fast. Queries run against compressed, columnar data that's optimised for aggregation. For most reporting scenarios, this is what you want. The trade-off is that your data is only as fresh as your last refresh, and large datasets consume memory on your capacity.

DirectQuery sends every query back to the source database in real time. No data is stored in Power BI. This sounds appealing - always up-to-date data - but the reality is that performance depends entirely on how fast your source database can respond. We've seen DirectQuery reports that are perfectly fine against a well-tuned Azure SQL database, and we've seen ones that are unusable because they're hitting an overloaded on-premises server through a gateway.

Composite lets you mix Import and DirectQuery tables in the same model. This is useful when you have a large fact table that's too big to import but dimension tables that rarely change. Import the dimensions, DirectQuery the facts, and you get a reasonable middle ground.

Our take: Start with Import unless you have a specific reason not to. If your data needs to be real-time, benchmark DirectQuery against your actual source before committing to it. And if you go Composite, document which tables use which mode - we've inherited models where nobody knew what was imported and what was live, which made troubleshooting a nightmare.

Reducing Data Before It Hits Power BI

The single most effective optimisation is bringing less data into your model. This sounds obvious, but we audit models all the time that contain years of transactional data at the line-item level, when the reports only ever show monthly summaries.

What actually moves the needle:

  • Filter at the source: If your report only shows the last two years, don't import five years of data. Apply date filters in Power Query or in your SQL query before the data reaches the model.
  • Remove unused columns: Every column you import consumes memory and increases refresh time. If a column isn't used in any visual, measure, or relationship, drop it. We regularly find models with 30-40 columns per table where only 10 are actually referenced.
  • Pre-aggregate where appropriate: If you're always summing daily transactions to show monthly totals, consider creating a summary table at the monthly level. Your source database is better at aggregation than Power BI's query engine for very large volumes.
  • Watch your cardinality: Columns with millions of unique values (like transaction IDs or timestamps with seconds precision) compress poorly and bloat the model. If you don't need that granularity in your reports, reduce it.
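The filtering, column-trimming, and cardinality points above can all be done in a single Power Query step before the data reaches the model. A minimal sketch in Power Query (M) - the server, database, table, and column names here are illustrative assumptions, not a real schema:

```m
let
    // Hypothetical source: a SQL Server table of sales transactions
    Source = Sql.Database("server-name", "SalesDb"),
    Sales = Source{[Schema = "dbo", Item = "Sales"]}[Data],

    // Filter at the source: keep only the last two years.
    // Against a foldable source this becomes a WHERE clause in SQL.
    Recent = Table.SelectRows(
        Sales,
        each [OrderDate] >= Date.AddYears(DateTime.LocalNow(), -2)
    ),

    // Remove unused columns: keep only what the reports reference
    Slim = Table.SelectColumns(
        Recent,
        {"OrderDate", "StoreKey", "ProductKey", "Quantity", "NetAmount"}
    ),

    // Reduce cardinality: truncate the datetime to a date so the
    // column compresses well in the in-memory engine
    Typed = Table.TransformColumnTypes(Slim, {{"OrderDate", type date}})
in
    Typed
```

Check the query folds (right-click the step and look for "View Native Query") - if the filter folds back to the source, the database does the work instead of the gateway.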

This is where having a proper data pipeline matters. If you're using Microsoft Fabric, you can do a lot of this preparation in the lakehouse or warehouse before it ever reaches your semantic model. Prepare the data properly upstream, then model it for reporting. That pattern scales far better than trying to wrangle everything inside Power BI.

Making Your Semantic Model Useful (Not Just Functional)

Microsoft's guide talks about optimising the model for "report authors and model consumers." We'd put it more bluntly: a semantic model is the interface between your data and everyone who needs to use it. If it's confusing, people will work around it - and those workarounds always cause problems.

Here's what separates a model that people can actually work with from one that requires a data engineer sitting next to you:

Name things properly. "Table1" and "Column4" are not acceptable in a production model. Use descriptive names that a business user would recognise. "Store Sales," "Customer Region," "Monthly Revenue." This takes minutes and saves hours of confusion.

Hide what people don't need. Relationship key columns, staging columns, intermediate calculations - hide them. Report builders should see a clean list of fields that are relevant to reporting. They can always unhide columns if they need to, but the default view should be curated.

Write measures, don't rely on implicit aggregation. If "Revenue" should always be a SUM, create a measure for it. If "Average Order Value" is Revenue divided by Order Count, define that as a measure. When metrics are defined once in the model, every report that connects to it gets the same numbers. When metrics are defined ad-hoc in individual visuals, you get five reports showing five different versions of "revenue."
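As a sketch of that principle in DAX - the table and column names are illustrative assumptions:

```dax
-- Explicit measures, defined once in the model
Revenue := SUM ( Sales[NetAmount] )

Order Count := DISTINCTCOUNT ( Sales[OrderID] )

-- DIVIDE returns BLANK on a zero denominator instead of an error
Average Order Value := DIVIDE ( [Revenue], [Order Count] )
```

Every report that connects to the model now computes "Average Order Value" the same way, and a change to the definition happens in exactly one place.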

Add descriptions. Power BI lets you add descriptions to tables, columns, and measures. Use them. "Revenue: Net sales after returns, excluding tax. Source: Finance system." This kind of inline documentation does more for self-service than any data dictionary sitting in a SharePoint folder.

Report Design That Doesn't Kill Performance

We covered report design in detail in our post on building Power BI reports that people actually use. From a performance perspective specifically, here's what matters:

Fewer visuals per page means fewer queries. Every visual on a page generates at least one query against the model when the page loads. A page with 20 visuals fires 20+ queries simultaneously. Reducing to 7-8 well-chosen visuals makes a measurable difference.

Use the most restrictive filters you can. If a page defaults to showing all data across all time periods, it's doing maximum work on every load. Default your slicers to the current month or quarter. Let users expand if they need to, rather than starting with everything.

Be deliberate about visual interactions. When you click a bar in a chart, Power BI cross-filters every other visual on the page by default. That's more queries. For pages where cross-filtering isn't needed, turn it off. For pages where it is needed, configure which visuals respond and which don't.

Paginated reports for operational detail. If someone needs a 500-row table with precise formatting for printing or export, that's a paginated report, not a standard Power BI report. Trying to force that into a standard report just makes everyone miserable.

Dashboards: The Caching Layer People Forget About

Power BI dashboards (the ones with pinned tiles) have a built-in caching layer. When you pin a visual to a dashboard, Power BI caches the result and serves it from cache on subsequent loads. Dashboards load faster, and your data sources get hit less often.

This is worth using deliberately. If you have visuals that are queried frequently but don't need real-time data, pin them to a dashboard. The cache refreshes periodically (hourly by default, but configurable), so the data stays reasonably current without hammering your source system every time someone opens the page.

One thing to watch: If you're using row-level security (RLS), the cache stores separate copies per user security context. With many RLS roles and many users, this can actually increase load during cache refreshes rather than reduce it. Test your specific scenario.

Infrastructure: The Layer Nobody Thinks About Until It's Too Late

Performance isn't just about the model and the report. The environment matters too.

Capacity sizing: If you're on Power BI Premium or Fabric capacity, your performance ceiling is determined by the v-cores you've provisioned. We've seen organisations with dozens of reports, hundreds of users, and a single P1 capacity wondering why things are slow. Monitor your capacity metrics and scale appropriately.

Data gateways: If your data sources are on-premises, every query routes through a gateway. A single gateway machine handling ten datasets, five of which refresh simultaneously during business hours, is a bottleneck. Spread the load across gateway clusters, and schedule refreshes outside peak usage windows.

Network latency: For DirectQuery, every single user interaction involves a round trip to the data source. If your source database sits in a different Azure region from your Power BI tenant, that latency stacks up fast. We've seen 200ms round trips turn a snappy report into something that feels broken. Same region, always.

Where to Start If Your Reports Are Already Slow

If you've got an existing report that's underperforming, here's the order we tackle things:

  1. Open Performance Analyzer in Power BI Desktop (View > Performance Analyzer). Record a page load. See which visuals take the longest and whether the time is spent in the DAX query, the visual rendering, or waiting on the data source.
  2. Check the model size. How many rows? How many columns? Are there tables or columns that aren't being used? Trim the fat.
  3. Review DAX measures. Complex, nested CALCULATE statements with multiple filter contexts are common performance killers. Sometimes restructuring a measure or pre-calculating values in Power Query is faster than optimising the DAX.
  4. Look at the page design. Count the visuals. Check the default filter state. Reduce where you can.
  5. Check the infrastructure. Is the capacity appropriately sized? Is the gateway healthy? Are refreshes overlapping with peak usage?
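On the DAX point in step 3, the most common pattern we fix is a FILTER over an entire fact table where a simple column predicate would do. A hedged before-and-after sketch, using hypothetical names:

```dax
-- Slower: FILTER iterates the whole Sales table row by row
-- in the formula engine
High Value Sales (slow) :=
CALCULATE (
    [Revenue],
    FILTER ( Sales, Sales[NetAmount] > 1000 )
)

-- Usually faster: a column predicate the storage engine can
-- evaluate directly; KEEPFILTERS preserves existing filters
-- on the column rather than overwriting them
High Value Sales :=
CALCULATE (
    [Revenue],
    KEEPFILTERS ( Sales[NetAmount] > 1000 )
)
```

Performance Analyzer (step 1) will show the difference as reduced DAX query time for the visuals that use the measure.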

That order matters. Model first, because that's where root causes live. Report second, because that's treating symptoms. Infrastructure last, because that just raises the ceiling on whatever you've already got.

Wrapping Up

Power BI performance problems are frustrating because they creep in gradually. Things work fine with a small dataset and a few users. Then they don't. The fixes aren't mysterious, though - and the tooling keeps getting better.

If your environment is slowing down as the organisation grows, do a structured review rather than patching individual reports. As Power BI consultants, we run performance audits for Australian businesses all the time - model, reports, infrastructure, governance. A few targeted changes usually have an outsized impact.

Worth mentioning: the same well-structured data that makes Power BI fast is exactly what you need for AI and machine learning workloads. So if you're also investing in advanced analytics, fixing your BI data foundation does double duty.

Get in touch if your Power BI reports need a performance tune-up.