
Power BI Modeling for Dataverse and Power Platform - A Practical Guide

March 29, 2026 · 8 min read · Michael Ridland

Here's something that comes up in almost every Dynamics 365 or Power Apps project we work on: at some point, someone wants to report on the Dataverse data in Power BI. And at that point, the question of how to connect them becomes surprisingly important.

Dataverse is everywhere now. If your organisation runs Dynamics 365 Customer Engagement, Power Apps, Power Automate approvals, or any number of Microsoft business applications, your data lives in Dataverse. The built-in charts and views are fine for basic operational use, but the moment someone wants to combine that data with external sources, build time-series analysis, or create dashboards that actually look good - they reach for Power BI.

The problem is that Dataverse wasn't designed as an analytical database. It's an operational data store optimised for transactional workloads. How you bridge the gap between Dataverse and Power BI matters a lot for report performance, data freshness, and your users' sanity.

Three Ways to Connect - Pick Carefully

Microsoft gives you three paths from Dataverse to Power BI. Each has genuine strengths and real limitations.

Option 1 - Import with the Dataverse Connector

This is the default approach and the one most teams start with. You use the Dataverse connector in Power BI Desktop, select your tables, and import the data into the model. Data gets cached in memory, queries run fast, and you get full DAX and Power Query flexibility.

For small to medium Dataverse environments - say, under a hundred thousand records across your key tables - this works well. You can integrate Dataverse data with data from other sources (SQL databases, SharePoint, Excel files, APIs), apply complex transformations, and build rich models.

The limitations show up at scale. As your Dataverse data grows, refresh times increase. You're pulling data over the Dataverse API, which has throttling limits, so large refreshes can take a surprisingly long time. We've seen environments where refreshing a handful of Dataverse tables with a few hundred thousand records each takes 20-30 minutes. That's a lot of capacity time for data that might change throughout the day.

Incremental refresh helps here if you're on Premium or Fabric capacity. Instead of reloading entire tables, you refresh only recent changes. But setting this up for Dataverse tables requires some careful configuration since the modified date fields need to be reliable.
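
In Power BI, incremental refresh is configured through the reserved `RangeStart`/`RangeEnd` parameters rather than written by hand, but the underlying logic is simple to illustrate. Here's a minimal sketch of the windowing involved, assuming a reliable `modifiedon` timestamp on each record (the field names and policy values are illustrative, not a specific configuration):

```python
from datetime import datetime, timedelta, timezone

def incremental_window(now, refresh_days=7, archive_years=2):
    """Compute the rolling windows an incremental refresh policy works with:
    keep `archive_years` of history, but reload only the last `refresh_days`."""
    range_end = now
    refresh_start = range_end - timedelta(days=refresh_days)
    archive_start = range_end - timedelta(days=365 * archive_years)
    return archive_start, refresh_start, range_end

def rows_to_reload(rows, refresh_start, range_end):
    """Select only rows whose modifiedon falls inside the refresh window --
    the rows that get re-queried instead of reloading the whole table.
    This only works if modifiedon is trustworthy, hence the caveat above."""
    return [r for r in rows if refresh_start <= r["modifiedon"] < range_end]

now = datetime(2026, 3, 29, tzinfo=timezone.utc)
archive_start, refresh_start, range_end = incremental_window(now)
rows = [
    {"id": 1, "modifiedon": now - timedelta(days=1)},   # changed this week -> reloaded
    {"id": 2, "modifiedon": now - timedelta(days=30)},  # older -> kept from cache
]
print([r["id"] for r in rows_to_reload(rows, refresh_start, range_end)])  # [1]
```

The point of the sketch: instead of pulling a few hundred thousand rows over the throttled Dataverse API every refresh, you pull only the rows modified in the last week.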

Option 2 - DirectQuery with the Dataverse Connector

DirectQuery skips the import step entirely. When a user opens a report, Power BI sends queries directly to Dataverse and renders the results. No stale data, no refresh schedules.

This sounds appealing, and there are two scenarios where it genuinely works well.

Near real-time reporting. If your sales team needs to see pipeline data that's current as of right now, not as of the last refresh, DirectQuery gives you that. Combine it with automatic page refresh and you've got a dashboard that updates continuously.

Dataverse security enforcement. This is the one that catches people by surprise. If your Dataverse environment has complex role-based security - salespeople can only see their own opportunities, managers can see their team's records, admins see everything - a DirectQuery connection respects that security. The queries run in the context of the connected user, and Dataverse enforces the same record-level permissions it applies everywhere else.

With Import mode, you'd need to replicate that security logic using Row Level Security (RLS) in Power BI, which is doable but tedious and error-prone, especially for complex security models.
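
To make the difference concrete, here's a sketch of the kind of record-level rules Dataverse enforces natively under DirectQuery, and which you would have to rebuild as RLS filters for an Import model. The role names and fields (`role`, `manager`, `owner_id`) are illustrative, not real Dataverse schema:

```python
def visible_records(user, all_users, opportunities):
    """Sketch of Dataverse-style role security: admins see everything,
    managers see their team's records, salespeople only their own."""
    if user["role"] == "admin":
        return opportunities
    if user["role"] == "manager":
        # A manager's scope is their direct reports plus themselves.
        team = {u["id"] for u in all_users if u.get("manager") == user["id"]}
        team.add(user["id"])
        return [o for o in opportunities if o["owner_id"] in team]
    # Default: salespeople see only records they own.
    return [o for o in opportunities if o["owner_id"] == user["id"]]

users = [
    {"id": "alice", "role": "admin"},
    {"id": "bob",   "role": "manager"},
    {"id": "carol", "role": "sales", "manager": "bob"},
]
opps = [
    {"id": 1, "owner_id": "alice"},
    {"id": 2, "owner_id": "bob"},
    {"id": 3, "owner_id": "carol"},
]
print([o["id"] for o in visible_records(users[1], users, opps)])  # [2, 3]
```

Even this toy version has three branches and a team lookup; a real security model with business units, sharing, and hierarchy is far worse, which is why replicating it as Power BI RLS is tedious and error-prone.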

The catch with DirectQuery against Dataverse is performance. I'll be blunt: it can be slow. Dataverse is not optimised for the kind of analytical queries Power BI generates. Filter a report by date range, group by product category, and sum revenue, and the resulting query might take several seconds to return. In a report with multiple visuals, each sending its own query, the page load can stretch to 10 or 15 seconds. That's a poor user experience.

Microsoft's guidance suggests keeping DirectQuery result sets under 20,000 rows, with each query returning within 10 minutes. Those aren't hard limits, but they give you a sense of where things start to struggle.

Option 3 - Azure Synapse Link

For larger Dataverse environments, Azure Synapse Link changes the equation entirely. It sets up a continuous, managed pipeline that replicates Dataverse tables to Azure Data Lake Storage Gen2 as CSV or Parquet files. From there, Power BI can query an Azure Synapse serverless SQL pool to build an Import model.

Why bother with the extra infrastructure? Because Synapse is built for analytical queries in a way that Dataverse isn't. You get fast refresh times even with millions of rows, you can apply complex transformations in Synapse before the data reaches Power BI, and the ongoing sync means your data lake stays current without manual intervention.
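
Once Synapse Link has landed the parquet files in the lake, Power BI typically reads them through a serverless SQL pool query over `OPENROWSET`. A sketch of building such a query (the storage account, container, and column names are placeholders for your own environment):

```python
def serverless_parquet_query(storage_account, container, table, columns=None):
    """Build a Synapse serverless SQL query over the parquet files that
    Synapse Link writes for a Dataverse table. Account and container
    names are placeholders, not a real environment."""
    select = ", ".join(columns) if columns else "*"
    path = (f"https://{storage_account}.dfs.core.windows.net/"
            f"{container}/{table}/*.parquet")
    return (f"SELECT {select}\n"
            f"FROM OPENROWSET(\n"
            f"    BULK '{path}',\n"
            f"    FORMAT = 'PARQUET'\n"
            f") AS [{table}]")

print(serverless_parquet_query("contosolake", "dataverse-prod", "opportunity",
                               ["opportunityid", "estimatedvalue"]))
```

Because the serverless pool scans columnar parquet rather than hammering the Dataverse API, this is where the dramatic refresh-time improvements described below come from.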

We've used this approach for clients with large Dynamics 365 environments - organisations with millions of transaction records across multiple entities. The difference in refresh performance compared to the direct Dataverse connector is dramatic. What took 30 minutes over the Dataverse API might take 2 minutes through Synapse.

The trade-off is setup complexity. You need an Azure subscription, a Data Lake Storage account, Synapse workspace configuration, and ongoing management of the pipeline. For a small Power Apps deployment with a few thousand records, that's overkill. For a large enterprise Dynamics 365 environment, it's the right approach.

Making the Decision

Here's how we walk clients through this choice:

Data volume matters most. Under 100K records in your reporting tables? Import with the Dataverse connector is fine. Between 100K and a million? Import still works but look at incremental refresh and be ready for longer refresh times. Over a million? Seriously consider Azure Synapse Link.

Freshness requirements are the second filter. If hourly data is acceptable, Import mode with scheduled refreshes covers you. If users need current-minute data, you're looking at DirectQuery (with its performance trade-offs) or Synapse Link with a hybrid table approach on Premium capacity.

Security requirements can force your hand. If you need Dataverse's role-based security reflected in Power BI and the security model is complex, DirectQuery is the most straightforward path. Yes, you can build equivalent RLS in Power BI for Import models, but maintaining that as your security model evolves is a headache.

Budget and infrastructure also play a role. DirectQuery has no additional infrastructure cost but may need Premium capacity for acceptable performance. Synapse Link requires Azure resources but delivers better performance at scale. Import mode on shared capacity is the cheapest option but has the most limitations on refresh frequency.
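
The four filters above can be condensed into a rough triage function. The thresholds are this article's rules of thumb, not hard limits, and a real decision weighs all the factors together rather than checking them in strict order:

```python
def recommend_connection(row_count, needs_realtime, complex_security):
    """Rough triage of Dataverse-to-Power BI connection options.
    Thresholds are rules of thumb from the discussion above."""
    if complex_security:
        # Dataverse enforces record-level security natively under DirectQuery.
        return "DirectQuery"
    if row_count > 1_000_000:
        return "Azure Synapse Link"
    if needs_realtime:
        return "DirectQuery"  # accept the query-performance trade-off
    if row_count > 100_000:
        return "Import with incremental refresh"
    return "Import with the Dataverse connector"

print(recommend_connection(50_000, False, False))   # Import with the Dataverse connector
print(recommend_connection(2_500_000, False, False))  # Azure Synapse Link
```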

Dataverse Schema vs Power BI Schema

One thing we always flag early in Dataverse-to-Power BI projects: don't just import Dataverse tables as-is and expect a good model.

Dataverse schemas are designed for application use, not analytics. You'll find:

  • Lookup fields that need to be expanded into proper relationships
  • Option sets stored as integer codes that need mapping to display values
  • System columns (created by, modified by, state code, status code) that clutter the model
  • Polymorphic lookups that don't translate cleanly into Power BI relationships
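
The option-set problem is the easiest to show. A sketch of turning integer codes into a proper star-schema dimension, assuming a hypothetical "opportunity status" option set (real environments expose these mappings through Dataverse metadata, and the numeric values vary per solution):

```python
# Hypothetical option-set codes; in a real environment these come from
# Dataverse metadata, not a hand-written dict.
STATUS_LABELS = {1: "Open", 2: "Won", 3: "Lost"}

def build_status_dimension(option_labels):
    """Turn an option-set mapping into dimension-table rows that Power BI
    can relate to the fact table on the integer code."""
    return [{"status_code": code, "status_name": name}
            for code, name in sorted(option_labels.items())]

def label_facts(facts, option_labels):
    """Alternative: denormalise the display value onto each fact row."""
    return [{**f, "status": option_labels.get(f["statuscode"], "Unknown")}
            for f in facts]

facts = [{"opportunityid": "A", "statuscode": 2}]
print(label_facts(facts, STATUS_LABELS)[0]["status"])  # Won
```

In practice the dimension-table route is usually better for the model: users slice on `status_name`, and the fact table keeps only the compact integer key.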

Take the time to build a proper star schema in Power BI. Create clean dimension tables from Dataverse entities, establish proper relationships, and hide fields that users don't need. The modelling effort pays off in report performance and usability.

We've worked on projects where teams imported 20+ Dataverse tables directly, built reports on top of them, and wondered why everything was slow and confusing. Spending a day designing a proper semantic model would have saved weeks of frustration.

Where Composite Models Fit

For many Dataverse reporting scenarios, a Composite model is the practical sweet spot. Import your dimension-type entities (accounts, contacts, products, users) for fast slicer performance, and set your large transactional entities (opportunities, cases, activities) to DirectQuery, or source them from Synapse Link.

This gives users snappy interaction on filters and slicers while keeping large datasets fresh and manageable. It's more work to set up than a pure Import or pure DirectQuery approach, but for medium to large deployments it tends to produce the best user experience.
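
In Power BI Desktop this split is a per-table storage mode setting, but the planning step can be sketched as a simple rule. Table names and row counts here are examples, not a specific customer environment:

```python
def plan_storage_modes(tables, import_threshold=100_000):
    """Composite-model plan: dimensions and small tables go Import for
    snappy slicers; large transactional tables stay DirectQuery
    (optionally backed by Synapse Link). Threshold is a rule of thumb."""
    return {name: ("Import" if kind == "dimension" or rows <= import_threshold
                   else "DirectQuery")
            for name, (kind, rows) in tables.items()}

plan = plan_storage_modes({
    "account":     ("dimension", 40_000),
    "contact":     ("dimension", 90_000),
    "opportunity": ("fact", 2_500_000),
})
print(plan["opportunity"])  # DirectQuery
```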

Beyond the Basics

The real value of connecting Power BI to Dataverse isn't just replicating the views you already have in Dynamics or Power Apps. It's integrating that data with everything else.

Combine your CRM pipeline data with financial actuals from your ERP. Overlay customer interaction data with support ticket trends. Bring in external market data alongside your sales performance. Power BI makes this kind of cross-source analysis practical in a way that Dataverse's built-in reporting never will.

If you're working with Dynamics 365 or Power Apps and want to get more value from your Dataverse data, our Power BI consultants can help you design the right connection approach and build models that actually perform well. We also work across the broader Microsoft data stack including Data Factory for more complex ETL scenarios and Power Apps for the application side.

For the full technical reference, see Microsoft's Power BI modeling guidance for Power Platform.

The key takeaway: don't treat the connection between Dataverse and Power BI as an afterthought. The approach you choose affects everything downstream - performance, freshness, security, and ultimately whether people actually use the reports you build.