Power BI Semantic Model Modes - Import vs DirectQuery vs Composite Explained
If you're building Power BI reports for an Australian business and you haven't thought carefully about your semantic model mode, you're probably going to hit a wall. Maybe not today, maybe not next month - but eventually.
I've seen it happen repeatedly. A team builds out a beautiful set of dashboards using the default Import mode, everything works great with their test data, and then six months later the model is 4GB, refreshes take 45 minutes, and finance is complaining that the numbers are stale by the time they see them.
The choice between Import, DirectQuery, and Composite modes is one of those decisions that seems technical but has real business impact. Get it right and your reports are fast, current, and manageable. Get it wrong and you're stuck in a cycle of workarounds.
Import Mode - The Default for Good Reason
Import mode is what Power BI gives you by default, and honestly, it's the right choice for most scenarios. Your data gets pulled from the source, compressed by the VertiPaq engine, and stored in memory. Queries run against that in-memory copy, which means they're fast. Really fast.
The compression is impressive too. We regularly see 10x compression ratios - so 10GB of source data might only take up about 1GB in the model. On disk, the numbers are even better since you get roughly another 20% reduction on top of that compressed size.
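To make that arithmetic concrete, here's a tiny sizing sketch. The 10x compression ratio and the extra 20% on-disk reduction are the rough rules of thumb from above, not guarantees - real ratios depend heavily on cardinality and data types:

```python
def estimate_model_size(source_gb: float,
                        compression_ratio: float = 10.0,
                        disk_reduction: float = 0.20) -> tuple[float, float]:
    """Rough VertiPaq sizing: in-memory size, then on-disk size.

    compression_ratio (~10x) and disk_reduction (~20%) are
    illustrative assumptions, not guaranteed figures.
    """
    in_memory_gb = source_gb / compression_ratio
    on_disk_gb = in_memory_gb * (1 - disk_reduction)
    return in_memory_gb, on_disk_gb

# 10 GB of source data -> roughly 1 GB in memory, 0.8 GB on disk
mem_gb, disk_gb = estimate_model_size(10.0)
```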
What makes Import mode so popular isn't just performance though. It's flexibility. You get the full Power Query M language for data preparation, the complete DAX function library for calculations, and features like Q&A and Quick Insights that only work with imported data. If you're building calculated columns, calculated tables, or complex measures, Import mode gives you the full toolkit.
But here's where teams get caught out:
The entire model has to be in memory when it's queried. As your models grow - more tables, more history, more granular data - you're consuming more and more capacity resources. And every refresh dumps the old data and reloads everything from scratch (unless you've set up incremental refresh, which you absolutely should for larger models).
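To illustrate what incremental refresh buys you, here's a conceptual sketch of a refresh policy. Power BI manages the real partitions itself; the policy numbers here (a 10-day refresh window over 5 years of archive) are made-up examples:

```python
from datetime import date, timedelta

def incremental_refresh_plan(today: date,
                             refresh_days: int = 10,
                             archive_years: int = 5) -> dict:
    """Conceptual sketch of an incremental refresh policy: keep
    `archive_years` of history loaded as-is, and re-query only the
    most recent `refresh_days` of daily partitions on each refresh.
    Illustrative only - Power BI manages actual partitions itself.
    """
    archive_start = date(today.year - archive_years, today.month, today.day)
    refresh = sorted(today - timedelta(days=i) for i in range(refresh_days))
    return {"archive_start": archive_start, "refresh": refresh}
```

The point of the sketch: instead of reloading five years of data, each refresh only re-queries the last ten daily partitions, which is why refresh times stop scaling with total history.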
Data staleness is real. On shared capacity, you get eight scheduled refreshes per day. On Premium, that goes up to 48 - roughly every 30 minutes if you push it. For a lot of reporting, that's fine. For anything resembling real-time visibility, it's not.
We tell clients: if your data is under a few gigabytes, your users can tolerate data that's an hour old, and you want maximum report flexibility, Import mode is your best bet. Don't overthink it.
DirectQuery - When Freshness Beats Speed
DirectQuery flips the model on its head. Instead of importing data, Power BI stores only the metadata - the table definitions, relationships, and measures. When a user opens a report or clicks a slicer, Power BI generates a native query and sends it to the underlying data source in real time.
No stale data. No scheduled refreshes. What's in the database right now is what the report shows.
There are two scenarios where this genuinely shines. The first is data volume. If you've got a fact table with hundreds of millions of rows and data reduction techniques still leave you with a model that's too big for Import, DirectQuery sidesteps the problem entirely since there's nothing to import.
The second is near real-time reporting. If your operations team needs to see what happened in the last five minutes - not what the data looked like at the last refresh - DirectQuery gives you that. Combined with automatic page refresh, you can build dashboards that re-query on a schedule: every 30 minutes at minimum on shared capacity, and down to intervals of seconds on Premium capacity if your source can keep up.
The trade-offs are significant though, and I want to be honest about them.
Performance takes a hit. Every interaction with the report - every filter change, every slicer click - fires a query back to the data source. If that source is a well-optimised Azure SQL database with proper indexing, the experience can be decent. If it's a Dataverse instance or a shared on-premises SQL Server, your users are going to notice the lag.
DAX limitations will bite you. Not every DAX function works in DirectQuery mode because the function has to be translatable into a native query the source understands. Calculated tables aren't supported at all. If your model relies heavily on complex calculated columns, you'll need to rethink your approach.
Quick Insights and Q&A don't work. These are Import-only features.
We've had projects where a client insisted on DirectQuery for "real-time" reporting, only to discover that the user experience was poor because their source database wasn't optimised for analytical queries. The lesson: DirectQuery pushes the performance burden to your data source. If that source isn't ready for it, your users pay the price.
Composite Mode - The Best of Both (Sometimes)
Composite mode lets you mix Import and DirectQuery storage within the same model. Storage mode is configured at the table level - some tables import their data, others use DirectQuery, and some can be set to Dual mode, where Power BI decides the most efficient approach per query.
This is where things get interesting for larger deployments.
The classic pattern we use with clients follows star schema principles. Your dimension tables - products, customers, regions, dates - get set to Import or Dual mode. They're relatively small, change infrequently, and benefit massively from in-memory performance. Your fact tables - sales transactions, event logs, sensor readings - stay in DirectQuery mode because they're large and you want fresh data.
When Power BI runs a report query, the dimension tables respond from memory while the fact table query goes to the source. The engine can even generate more efficient native queries by sending the imported dimension values as filters. A user filters by three product categories using a slicer, and the SQL query that hits the database already includes those three values in a WHERE clause. Much faster than having the database figure it out.
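Conceptually, that pushdown looks something like the sketch below. The table and column names (FactSales, CategoryKey, SalesAmount) are hypothetical, and in reality the engine generates the native query itself - this just shows how slicer selections against an in-memory dimension can become literal values in the fact query:

```python
def build_fact_query(selected_category_keys: list[int]) -> str:
    """Mimic how a composite model folds imported-dimension slicer
    selections into the DirectQuery fact query as a literal filter.
    Schema (FactSales, CategoryKey, SalesAmount) is hypothetical.
    """
    keys = ", ".join(str(k) for k in sorted(selected_category_keys))
    return (
        "SELECT CategoryKey, SUM(SalesAmount) AS Sales "
        "FROM FactSales "
        f"WHERE CategoryKey IN ({keys}) "
        "GROUP BY CategoryKey"
    )

# Three slicer selections become three literal values in the WHERE clause
sql = build_fact_query([3, 1, 7])
```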
We set up a model like this for a retail client last year. Their sales fact table had tens of millions of rows and grew daily. Importing it meant slow refreshes and a huge model. Pure DirectQuery meant sluggish slicer performance because every dimension lookup hit the database. The Composite approach gave them sub-second slicer responses on dimensions with fresh sales data flowing through on every report interaction.
Hybrid Tables and Real-Time Data
There's a feature worth knowing about called hybrid tables. With Power BI Premium or Fabric capacity, you can set up an Import table with an incremental refresh policy that also has a real-time DirectQuery partition.
In practice, this means historical data lives in the fast in-memory store while recent data (say, the current day) comes through DirectQuery. You get the query performance of Import for 99% of your data and the freshness of DirectQuery for the latest records.
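The routing logic amounts to something like this sketch - illustrative only, since Power BI handles the partitioning for you once the policy is configured:

```python
from datetime import date

def partition_for(row_date: date, today: date) -> str:
    """Sketch of hybrid-table routing: rows for the current day are
    served by a real-time DirectQuery partition, while older rows come
    from the imported (in-memory) partitions. Illustrative only.
    """
    return "directquery" if row_date >= today else "import"
```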
It's a genuinely good pattern for operational dashboards where users need historical context alongside current-day activity. The catch is it requires Premium or Fabric capacity - it's not available on shared capacity.
How We Help Clients Choose
After working on dozens of Power BI implementations, here's the decision framework we walk through with clients:
Start with Import. Unless you have a specific reason not to, Import mode is the right default. It's the fastest, most flexible, and easiest to maintain.
Move to DirectQuery when your data source is too large to import, your users need data fresher than your refresh schedule allows, or you need to enforce row-level security from the source system (particularly relevant for Dataverse).
Use Composite when you have a mix of large fact tables and smaller dimensions, and you want to balance performance against freshness. This is often the sweet spot for enterprise deployments.
Consider hybrid tables if you're on Premium or Fabric and need historical performance with real-time recent data.
The decision isn't purely technical either. We always factor in what the data source can handle (will DirectQuery queries overwhelm your database?), what users actually need (do they genuinely need five-minute-old data, or is hourly fine?), and what your capacity licensing supports.
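Boiled down to code, the framework looks roughly like this. The thresholds (a few gigabytes, an hour of staleness tolerance) are our illustrative rules of thumb, not Microsoft guidance:

```python
def recommend_mode(model_size_gb: float,
                   freshness_minutes_needed: int,
                   has_premium_or_fabric: bool = False,
                   large_facts_small_dims: bool = False) -> str:
    """Rough encoding of the decision framework above.
    Thresholds are illustrative assumptions, not hard limits."""
    fits_in_import = model_size_gb <= 3       # "under a few gigabytes"
    hourly_is_fine = freshness_minutes_needed >= 60
    if fits_in_import and hourly_is_fine:
        return "import"                       # the right default
    if large_facts_small_dims:
        if has_premium_or_fabric and not hourly_is_fine:
            return "composite (consider hybrid tables)"
        return "composite"
    return "directquery"
```

Treat the output as a starting point for the conversation, not a verdict - source readiness and licensing still need to be checked, as noted above.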
One thing I'd push back on is the idea that you need to agonise over a perfect choice up front - but do be aware that storage mode changes in Power BI Desktop are one-directional. You can move a table from DirectQuery to Dual or Import, but once a table is in Import mode you can't switch it back to DirectQuery without re-adding it. So if there's any chance a large fact table will need DirectQuery later, bring it in as DirectQuery or Dual from the start; dimensions are lower risk, since promoting them to Import later is straightforward. Plan for it, but don't treat the initial choice as irreversible in every direction.
Getting the Most Out of Your Power BI Investment
Power BI is a genuinely powerful platform, but it rewards good architectural decisions at the model layer. The difference between a well-designed semantic model and a poorly designed one isn't subtle - it shows up in query times, refresh durations, capacity costs, and user satisfaction.
If you're working through these decisions for your organisation, our Power BI consulting team has done this across industries from financial services to manufacturing. We can help you design models that scale properly and perform well from day one.
For broader data platform strategy, especially if you're also looking at Microsoft Fabric or building out a data analytics practice, we work with clients on the full stack from source systems through to Power BI reports.
You can read the full technical documentation on semantic model modes from Microsoft for the complete reference.
The short version: think about your model mode early, test with realistic data volumes, and don't be afraid to use Composite mode when a single approach doesn't cover all your needs. It's one of those decisions that pays dividends if you get it right up front.