
Power BI DirectQuery - A Practical Guide for Australian Organisations

March 19, 2026 · 7 min read · Michael Ridland

Every Power BI conversation eventually gets to DirectQuery. A client mentions they want "real-time dashboards" or their dataset is pushing past what Import mode can handle, and suddenly the team is debating whether to flip the switch. We've been through this conversation hundreds of times with Australian organisations, and the answer is almost always more nuanced than "just use DirectQuery."

Microsoft's documentation on DirectQuery is thorough but dense. Here's what actually matters when you're making this decision for a real project.

The Core Trade-Off

Import mode pulls your data into Power BI's in-memory engine. It compresses it, indexes it, and queries run fast. Blindingly fast, usually. The downside is your data is only as fresh as your last scheduled refresh - eight times a day on Pro, up to 48 on Premium.

DirectQuery does the opposite. No data is stored in Power BI at all. Every time someone opens a page, clicks a slicer, or changes a filter, Power BI generates a query and sends it to your source database. The visual waits for the response. Then it renders.

That fundamental difference drives everything. Performance, design, infrastructure planning, licensing - all of it flows from whether your data sits in Power BI's engine or stays at the source.

The Decision Framework We Actually Use

After years of these conversations, we've settled on a simple framework. Import mode is the default. You move to DirectQuery only when a specific constraint forces you there.

Those constraints typically fall into four buckets.

Data freshness that Import can't satisfy. If your business genuinely needs data that's minutes old rather than hours old, and even Premium's 48 daily refreshes don't cut it, DirectQuery is on the table. We've built this for logistics companies tracking fleet movements and for financial services teams monitoring transaction volumes in near real-time. But be honest about the actual requirement. About half the time when a client says "real-time," they actually mean "a couple of hours old is fine."

Dataset size that exceeds Import capacity. Pro has a 1GB model limit. Premium goes higher, but if you're looking at billions of rows of transactional data, you're going to hit walls. DirectQuery lets you report against massive datasets without trying to cram them into memory.

Data sovereignty and compliance. Government and healthcare organisations often have rules about where data can be stored and duplicated. DirectQuery keeps data in the source system - it never persists in the Power BI service. For some compliance frameworks, particularly in Australian government contexts, this matters.

Source-enforced security requirements. If your source database has fine-grained row-level security that must remain authoritative, DirectQuery with SSO passthrough lets Power BI defer to those rules rather than trying to replicate them in the model.

If none of these apply to you, stick with Import. Seriously.

The Newer Alternatives That Change the Calculation

Here's what's shifted in the last year or two. DirectQuery used to be the only real option when Import didn't work. Now there are several alternatives that sit between the two extremes.

Hybrid tables let you mix Import and DirectQuery partitions in a single table. Historical data gets imported for fast queries. Recent data stays in DirectQuery for freshness. This is ideal when you need current-day data to be real-time but don't care if last month's numbers are a few hours stale.

Direct Lake in Microsoft Fabric bypasses the traditional refresh cycle entirely. If your data lives in a Fabric lakehouse or warehouse, Direct Lake gives you near Import-level performance without the refresh overhead. We're seeing more Australian enterprises move toward Fabric, and Direct Lake is one of the most compelling features.

Incremental refresh with real-time lets you set the most recent time window to use DirectQuery while keeping historical data imported. You get both performance and freshness without committing to full DirectQuery.

Automatic aggregations cache summarised data in Import mode while allowing detail-level queries to fall through to DirectQuery. If most of your users are looking at summaries and only occasionally drilling into detail, this can give you the best of both worlds.

The point is that pure DirectQuery is no longer the automatic answer when Import doesn't work. Evaluate these options first.

Performance Realities

This is where projects succeed or fail. A DirectQuery report's performance is entirely dependent on the source database. We cannot stress this enough.

We've seen DirectQuery reports that feel snappy against a well-tuned Azure SQL database - sub-second response on most visuals. We've also seen reports that take 30 to 45 seconds per visual against an on-premises SQL Server sitting behind a data gateway on a congested network. Same Power BI features, wildly different user experience.

Every visual generates queries. A report page with 12 visuals means 12 separate queries hitting your database every time someone opens that page or changes a filter. If a user changes a slicer that affects all 12 visuals, that's 12 queries fired in rapid succession. Your database needs to handle that concurrency without falling over.

Query folding is non-negotiable. If your Power Query transformations can't fold down into a single native query, you're in trouble. Unfoldable steps mean Power BI pulls data back and processes it locally, which defeats the purpose of DirectQuery. Always check "View Native Query" in Power Query Editor to confirm your steps are folding.
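As a rough illustration of what folding means in practice: a filter step followed by a group-by in Power Query, when it folds, becomes one native query at the source. The table and column names below are hypothetical, and the exact SQL Power BI generates will differ, but the shape is representative:

```sql
-- What a folded filter + group-by might look like when Power BI
-- pushes the whole transformation to the source database.
-- (dbo.FactSales, OrderDate, Region, Quantity, UnitPrice are
-- illustrative names, not from any specific model.)
SELECT
    Region,
    SUM(Quantity * UnitPrice) AS Revenue
FROM dbo.FactSales
WHERE OrderDate >= '2026-01-01'
GROUP BY Region;
```

If a step breaks folding, the WHERE and GROUP BY disappear from the native query and Power BI pulls the raw rows back instead — which is exactly what "View Native Query" being greyed out is warning you about.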

The 4-minute timeout is real. The Power BI service enforces a 4-minute timeout per query. If your source can't respond within that window, visuals fail. This isn't theoretical - we've hit it on poorly indexed warehouse tables.

Source Database Optimisation

Most of our DirectQuery performance work happens at the database level, not in Power BI.

Index the columns your reports use. Look at the columns in your slicers, filters, and group-by clauses. Those need proper indexes. For SQL Server and Azure SQL, columnstore indexes work particularly well for aggregation-heavy analytical queries.
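A sketch of what that indexing might look like on SQL Server or Azure SQL. The table and column names are hypothetical; the point is covering the slicer/filter columns with a rowstore index and adding a nonclustered columnstore for aggregation-heavy scans:

```sql
-- Hypothetical fact table: cover the columns your slicers and
-- filters hit, and INCLUDE the measures the visuals aggregate.
CREATE NONCLUSTERED INDEX IX_FactSales_OrderDate_Region
    ON dbo.FactSales (OrderDate, Region)
    INCLUDE (Quantity, UnitPrice);

-- For aggregation-heavy analytical queries, a nonclustered
-- columnstore index often outperforms rowstore indexes.
CREATE NONCLUSTERED COLUMNSTORE INDEX NCCI_FactSales
    ON dbo.FactSales (OrderDate, Region, Quantity, UnitPrice);
```

Which of the two earns its keep depends on your query mix; capture the actual queries Power BI generates (via Performance Analyzer or Extended Events) before committing to an indexing strategy.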

Materialise computed values. If your reports show calculated fields like revenue (quantity times price), add computed columns to your source tables. DAX calculations in DirectQuery translate to inline SQL expressions, and your database's query optimizer may not handle them efficiently. Pre-computed columns can be indexed.
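On SQL Server, a persisted computed column is a straightforward way to do this. The names below are illustrative:

```sql
-- Store the calculation once at the source instead of having
-- DAX inline it into every generated query.
ALTER TABLE dbo.FactSales
    ADD Revenue AS (Quantity * UnitPrice) PERSISTED;

-- A persisted computed column can be indexed like any other column.
CREATE NONCLUSTERED INDEX IX_FactSales_Revenue
    ON dbo.FactSales (Revenue);
```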

Consider indexed views for common patterns. If users consistently look at monthly summaries of detail-level data, create an indexed view at that grain. The query optimizer can use it as a shortcut instead of scanning millions of rows each time.
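A minimal T-SQL sketch of an indexed view at a monthly grain, assuming a hypothetical `dbo.FactSales` table with non-nullable measure columns (indexed views have strict requirements — SCHEMABINDING, `COUNT_BIG(*)` alongside any `GROUP BY`, and deterministic expressions):

```sql
-- Materialised monthly summary at the grain users actually query.
CREATE VIEW dbo.vMonthlySales
WITH SCHEMABINDING
AS
SELECT
    YEAR(OrderDate)           AS OrderYear,
    MONTH(OrderDate)          AS OrderMonth,
    Region,
    SUM(Quantity * UnitPrice) AS Revenue,
    COUNT_BIG(*)              AS RowCnt  -- required when the view has GROUP BY
FROM dbo.FactSales
GROUP BY YEAR(OrderDate), MONTH(OrderDate), Region;
GO

-- The unique clustered index is what materialises the view;
-- until it exists, this is just an ordinary view.
CREATE UNIQUE CLUSTERED INDEX IX_vMonthlySales
    ON dbo.vMonthlySales (OrderYear, OrderMonth, Region);
```

With the right settings, the optimizer can match summary queries to this view automatically, so Power BI's generated SQL doesn't need to reference it by name.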

Watch for case sensitivity mismatches. Power BI's engine is case-insensitive, but some databases aren't. PostgreSQL, for instance, treats "Melbourne" and "melbourne" as different values. This mismatch can cause subtle bugs where records appear to be duplicated or filtered incorrectly.
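A quick PostgreSQL illustration of the mismatch, plus one common mitigation — normalising case in the view Power BI reads from (the schema and table names are hypothetical):

```sql
-- PostgreSQL comparisons are case-sensitive by default,
-- so these are two different values at the source.
SELECT 'Melbourne' = 'melbourne' AS same_value;  -- false

-- One mitigation: normalise case in a reporting view so the
-- source matches Power BI's case-insensitive semantics.
CREATE VIEW reporting.dim_city AS
SELECT DISTINCT initcap(city_name) AS city_name
FROM staging.cities;
```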

Report Design for DirectQuery

You also need to think about your report design differently with DirectQuery.

Fewer visuals per page. Each visual is a separate query. We aim for 6 to 8 visuals per page maximum in DirectQuery reports, compared to the 12 to 15 that Import mode handles easily.

Use the Apply button for slicers. By default, changing a slicer immediately triggers queries for all affected visuals. Enabling the Apply button under Power BI Desktop's query reduction options lets users make multiple slicer selections before firing a single round of queries.

Disable cross-highlighting where it's not essential. Cross-highlighting between visuals triggers additional queries on every click. If the interactive behaviour isn't adding analytical value, turn it off.

Push filters early. Apply key filters at the top of the page so they reduce the dataset before any visual queries run. This is particularly important when your underlying data is large.

What We Recommend for Most Clients

For most Australian organisations we work with, the practical advice is straightforward.

Start with Import mode. If you need more frequent refreshes, look at Premium's incremental refresh capabilities. If you need true near real-time on recent data, try hybrid tables before going full DirectQuery. If you're on Microsoft Fabric, evaluate Direct Lake first.

When you do use DirectQuery, invest in database performance. Get your indexing right. Simplify your Power Query steps to ensure folding. Design your reports with fewer, more purposeful visuals.

And be realistic about what "real-time" actually means for your business. If the answer is "data from the last few hours," you probably don't need DirectQuery at all.

At Team 400, we help Australian organisations design Power BI solutions that match actual business requirements rather than technical assumptions. Whether that's an Import model, a DirectQuery setup, or a hybrid approach using some of the newer Fabric capabilities, we focus on what works for your specific situation.

If you're trying to figure out the right data strategy for your Power BI deployment, our business intelligence team can help you make that call. And if you're considering Microsoft Fabric alongside Power BI, our Microsoft Fabric consultants have been helping clients through that transition.

The right answer depends on your data, your users, and your infrastructure. Not on what sounds most impressive in a vendor presentation.