Row-Level Security in Analysis Services Tabular Models for Power BI

April 2, 2026 · 7 min read · Michael Ridland

Row-level security is one of those features that sounds simple until you try to implement it properly. The idea is straightforward - different users see different rows of data based on who they are. A sales manager in Sydney sees Sydney data. A regional director sees all of Australia. The CEO sees everything.

But actually building this against an on-premises Analysis Services tabular model and connecting it through to Power BI? That's where the detail matters. And the detail is where most implementations go wrong.

We've set this up for a number of clients running hybrid environments - some data in Azure, some still on-prem in SQL Server with Analysis Services sitting on top. It's a common pattern across Australian enterprises, particularly in financial services and government, where not everything has migrated to the cloud yet and probably won't for a while.

Here's how to do it right, based on what we've seen work and what we've seen fail.

The Architecture at a Glance

The flow is: SQL Server database with a security mapping table, an Analysis Services tabular model with DAX row filters, an on-premises data gateway, and Power BI reports using a live connection. Each piece has to work for the whole chain to hold.

The Microsoft tutorial uses AdventureWorksDW2012 as the sample database, and it's a decent starting point. But let me walk through the parts that matter most and flag the things that trip people up in real deployments.

Step 1 - Build the Security Table

You need a table that maps users to the data they're allowed to see. In the AdventureWorks example, this is a DimUserSecurity table that maps employee IDs and Windows usernames to sales territory IDs.

The key columns are:

  • UserName - the Windows domain username (e.g., DOMAIN\jsmith)
  • SalesTerritoryID - the foreign key to whatever dimension controls access

This table lives in your relational database and gets a foreign key relationship to the territory dimension. In practice, you'll probably map to whatever your security boundary is - cost centre, department, region, business unit, or some combination.

A few real-world considerations:

Keep the security table simple. We've seen attempts to build elaborate hierarchical security models with cascading permissions inside SQL tables. It gets unmanageable fast. If you need hierarchy-based security (e.g., a state manager sees all branches in their state), handle that in DAX, not in the security table.
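As a sketch of what "handle it in DAX" means: suppose the security table stores each manager's state, and the territory dimension carries a matching state column. Neither column exists in the AdventureWorks sample - DimUserSecurity[StateProvince] and DimSalesTerritory[StateProvince] here are illustrative names - but the shape of the row filter would be:

```dax
// Sketch only - assumes hypothetical StateProvince columns on both
// the security table and the territory dimension. Shows every
// territory row whose state matches the current user's state.
=DimSalesTerritory[StateProvince] =
    LOOKUPVALUE(
        DimUserSecurity[StateProvince],
        DimUserSecurity[UserName], USERNAME()
    )
```

One row per user in the security table, and the "all branches in their state" logic lives entirely in the filter expression rather than in dozens of mapping rows.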

Use Windows usernames consistently. The DAX USERNAME() function returns the identity of the person viewing the report - and on a live connection through the gateway, that's the effective user name the Power BI service passes along, which is based on the user's sign-in UPN. If your Active Directory has inconsistencies between UPNs and the usernames stored in your security table - and after 15 years of employee turnover, it probably does - sort that out first. A security model that doesn't match your AD is a security model that doesn't work.

Think about what happens when someone isn't in the table. By default, if a user doesn't appear in the security table, the DAX filter will return no rows. That's usually the right behaviour - fail closed, not open. But make sure your stakeholders understand this, because the first thing that'll happen after deployment is someone calling to say their report is empty.

Step 2 - Build the Tabular Model with Role-Based Filters

In SQL Server Data Tools (SSDT), you import the tables you need and define a role with DAX row filters.

The role, typically called something like SalesTerritoryUsers, gets Read permission and a DAX filter on the territory dimension:

=DimSalesTerritory[SalesTerritoryKey]=LOOKUPVALUE(
    DimUserSecurity[SalesTerritoryID],
    DimUserSecurity[UserName], USERNAME(),
    DimUserSecurity[SalesTerritoryID], DimSalesTerritory[SalesTerritoryKey]
)

This filter says: for the current user, look up their territory IDs from the security table, and only show rows in the territory dimension where the key matches. Because the fact tables relate to the territory dimension, the filter propagates automatically. The user only sees sales data for their territories.
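When a user can hold more than one territory (multiple rows in the security table), an equivalent filter that some teams find easier to read uses CONTAINS, which returns TRUE if any security-table row matches both the current user and the current territory row:

```dax
// Equivalent membership-style filter: keep a territory row if the
// security table has a (current user, this territory) pair.
=CONTAINS(
    DimUserSecurity,
    DimUserSecurity[UserName], USERNAME(),
    DimUserSecurity[SalesTerritoryID], DimSalesTerritory[SalesTerritoryKey]
)
```

Both patterns fail closed: a user with no rows in the security table matches nothing and sees no data.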

You also add a filter on the security table itself:

=FALSE()

This prevents anyone from querying the security table directly. Nobody should be able to see the full user-to-territory mapping through a report.

Important limitation: the USERELATIONSHIP function doesn't work with row-level security DAX filters. If your model uses inactive relationships that get activated via USERELATIONSHIP in measures, those measures won't respect the RLS filter. We hit this on a project where a client had both a ship-to and bill-to territory relationship on their sales fact table. The workaround was to restructure the model to use active relationships for the RLS-relevant path.
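To make the limitation concrete, here's the shape of a measure that breaks under RLS. The column names are hypothetical - AdventureWorks doesn't have a bill-to territory key - but they mirror the ship-to/bill-to scenario described above:

```dax
// Hypothetical BillToTerritoryKey column. This measure activates an
// inactive relationship at query time - which SSAS rejects once an
// RLS filter is defined on DimSalesTerritory.
Bill-To Sales :=
    CALCULATE(
        SUM(FactResellerSales[SalesAmount]),
        USERELATIONSHIP(
            FactResellerSales[BillToTerritoryKey],
            DimSalesTerritory[SalesTerritoryKey]
        )
    )
```

The restructure in our case meant making the RLS-relevant relationship the active one, so the filter propagates without any USERELATIONSHIP call.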

Step 3 - Deploy and Configure the Gateway

Once the tabular model is deployed to your Analysis Services instance, you need the on-premises data gateway installed and configured so the Power BI service can reach it.

The gateway setup itself is well documented, but a couple of things worth noting:

The gateway service account matters for security. The gateway doesn't pass the viewer's credentials straight through. It connects to Analysis Services as its own service account and impersonates the report viewer via the EffectiveUserName connection property - and SSAS only accepts EffectiveUserName from server administrators. In practice, that means the gateway's service account must be an administrator on the Analysis Services instance for RLS to work.

Test with actual users, not just your admin account. We've seen implementations that work perfectly in testing because the developer has admin rights on everything, and then break in production because the gateway service account can't impersonate regular domain users against Analysis Services. Test with a non-admin account early.

Keep the gateway updated. Microsoft releases updates monthly, and fixes occasionally target SSAS live connections and user mapping specifically. An outdated gateway can cause subtle authentication failures that are painful to debug.

Step 4 - Connect Power BI with a Live Connection

This is the part that's non-negotiable. For dynamic row-level security against Analysis Services to work in Power BI, you must use a live connection. Import mode won't work because the data gets loaded at refresh time under a single identity - there's no per-user filtering at query time.

In Power BI Desktop:

  1. Get Data > SQL Server Analysis Services Database
  2. Enter your server details
  3. Select Connect live
  4. Pick your model

When users view the report in the Power BI service, their identity flows through the gateway to Analysis Services, the DAX filter evaluates against their username, and they only see their data. It's a clean architecture when it works.

The trade-off is performance. Every visual interaction in the report sends a DAX query back to your on-prem SSAS instance via the gateway. If your Analysis Services server is undersized or your gateway is a bottleneck, users will feel it. We've seen reports that work fine for 10 concurrent users and fall over at 50. Capacity planning for the SSAS instance is worth doing upfront.

Common Mistakes We See

Testing only in Desktop. RLS in Power BI Desktop uses the "View as Role" feature, which simulates a role but doesn't test the full authentication chain. You need to publish to the Power BI service and test with actual user accounts to validate the end-to-end flow.
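You can also impersonate a specific user directly against SSAS before publishing anything. In SQL Server Management Studio, connect to the instance and add connection properties under Additional Connection Parameters - Roles and EffectiveUserName are standard SSAS connection string properties (you need admin rights on the instance to use EffectiveUserName). Using the role and user names from the example above:

```
Roles=SalesTerritoryUsers; EffectiveUserName=DOMAIN\jsmith
```

Then browse the model or run a query: you should see only that user's territories. It's a quick way to separate "the DAX filter is wrong" from "the gateway chain is wrong".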

Not handling the "all data" scenario. Executives usually need to see everything. You can handle this with a separate role that has no DAX filters, or by adding a wildcard entry in the security table. Don't make the CEO call IT because they can't see the full picture.
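A separate unfiltered role is usually the cleaner option, but if you'd rather keep everything in one role, the filter can check a flag first. The SeesAllData column here is an assumption, not part of the AdventureWorks sample:

```dax
// Sketch assuming a hypothetical DimUserSecurity[SeesAllData] flag.
// Flagged users pass every territory row; everyone else falls
// through to the normal territory lookup.
=IF(
    LOOKUPVALUE(
        DimUserSecurity[SeesAllData],
        DimUserSecurity[UserName], USERNAME()
    ) = TRUE(),
    TRUE(),
    DimSalesTerritory[SalesTerritoryKey] = LOOKUPVALUE(
        DimUserSecurity[SalesTerritoryID],
        DimUserSecurity[UserName], USERNAME(),
        DimUserSecurity[SalesTerritoryID], DimSalesTerritory[SalesTerritoryKey]
    )
)
```

Note that SSAS applies the union of a user's roles, so simply adding executives to a second, unfiltered role gives them everything without touching this expression.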

Forgetting about shared dashboards. When you pin visuals from a report to a dashboard, the dashboard tiles respect RLS - each user sees their own data. But if you share a report link, the viewer sees data based on their own identity, not the sharer's. This is the correct behaviour, but it confuses people who expect "share" to mean "show them what I see."

Mixing up model-level and report-level security. If you're using both Analysis Services RLS and Power BI service RLS, things get confusing. For on-prem SSAS models, the security should live in the model. Don't try to layer Power BI RLS on top of SSAS RLS - pick one source of truth.

Is This Still Relevant with Everything Moving to the Cloud?

Yes. We talk to Australian organisations every week that have on-premises Analysis Services instances they're not ready to migrate. Some have regulatory requirements around data residency. Some have complex models that would need significant rework to move to Azure Analysis Services or Power BI Premium semantic models. Some just have higher priorities for their migration budget.

If that's you, this pattern works and it's well-supported. Microsoft hasn't deprecated it and the gateway keeps getting updates.

But if you're starting fresh, consider whether you need on-premises SSAS at all. Power BI Premium and Azure Analysis Services both support RLS natively without the gateway complexity. The architecture is simpler and there's less infrastructure to maintain.

For help implementing row-level security in your Power BI environment - whether on-premises or cloud - our Power BI consulting team has done this across industries including financial services, government, and manufacturing. If your security model is complex or you're planning a migration from on-prem SSAS, reach out and we can talk through the options.