Power BI Desktop - How to Connect to Your Data Sources Properly

April 11, 2026 · 8 min read · Michael Ridland

Most Power BI projects don't fail because of bad visuals or poor DAX. They fail because the data connection layer was set up carelessly in the first week, and nobody went back to fix it. Getting your data connections right in Power BI Desktop is the foundation everything else sits on, and it's worth spending time on.

We've worked with dozens of Australian organisations on their Power BI implementations at Team 400, and the pattern repeats itself. Someone downloads Power BI Desktop, connects to a couple of Excel files or a SQL database, builds a few reports, and everything looks fine. Then six months later they're dealing with refresh failures, performance issues, and a data model that's grown into something nobody fully understands.

Here's how to avoid that.

Getting Started with Power BI Desktop

If you haven't installed it yet, Power BI Desktop is a free application from Microsoft. You can grab it from the Microsoft Store or download it directly from the Power BI website. The Microsoft Store version updates automatically, which I'd recommend - you don't want to be two versions behind when a connector you need gets a fix.

Once you've launched the application, the connection process starts from the Home ribbon. Click Get Data and you'll see a window listing every data source Power BI can connect to. And there are a lot of them. Excel, SQL Server, Azure SQL, Salesforce, Dynamics 365, Azure Blob Storage, web APIs, SharePoint lists, OData feeds - the list keeps growing with each monthly release.

For the official walkthrough of the basic connection flow, Microsoft's Connect to data in Power BI Desktop documentation covers the step-by-step process.

The Excel Connection - Where Everyone Starts

Almost every Power BI journey starts with Excel. Someone has a spreadsheet they've been maintaining for months or years, and they want to build something more interactive on top of it.

The process is straightforward: select Excel from the Get Data window, point it at your file, and Power BI shows you the available tables and named ranges through the Navigator window. You tick the ones you want and hit Load.

Here's where the first mistake happens. People load raw worksheets without any thought about data shape. Power BI works best when your data looks like a database table - headers in the first row, consistent data types down each column, no merged cells, no summary rows mixed in with detail rows.

If your Excel file doesn't look like that (and most don't), use Power Query to clean it up before loading. Remove those top rows of metadata. Unpivot columns that should be rows. Split combined fields into separate columns. This cleanup work feels tedious, but it pays for itself many times over.
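As a rough sketch of what that cleanup looks like in Power Query's M language (file path, sheet name, and column names here are all illustrative):

```m
let
    // Point this at your own workbook - the path and sheet name are placeholders
    Source = Excel.Workbook(File.Contents("C:\Data\sales.xlsx"), null, true),
    Sheet = Source{[Item = "Sales", Kind = "Sheet"]}[Data],
    // Drop the metadata rows sitting above the real header row
    RemovedTop = Table.Skip(Sheet, 2),
    Promoted = Table.PromoteHeaders(RemovedTop, [PromoteAllScalars = true]),
    // Turn month columns back into rows so the data is shaped like a table
    Unpivoted = Table.UnpivotOtherColumns(Promoted, {"Region", "Product"}, "Month", "Amount")
in
    Unpivoted
```

Each of these steps appears in the Applied Steps pane, so colleagues can see exactly how the raw sheet was reshaped.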

Choosing the Right Connection Mode

Power BI gives you two main connection modes, and picking the wrong one causes real problems down the track.

Import mode pulls a copy of the data into the Power BI file. This is the default, and it's the right choice for most datasets under a few hundred megabytes. Your reports will be fast because everything's cached locally. The downside is that you need to refresh the data to see changes, and your .pbix file can get large.

DirectQuery leaves the data in the source and sends queries on the fly whenever someone interacts with a report. This is better for very large datasets or situations where you need real-time data. But report performance depends entirely on your source system's query speed, and not all DAX functions work in DirectQuery mode.

My honest opinion: start with Import unless you have a specific reason not to. DirectQuery adds complexity and performance variability that most teams aren't prepared to manage. If your dataset grows beyond what Import handles comfortably, that's when you start looking at DirectQuery, composite models, or aggregation tables.

Beyond Excel - Connecting to Real Data Sources

Excel connections are fine for prototyping, but production Power BI reports should connect to proper data sources. Here's what we see working well across our Power BI consulting engagements.

SQL Server and Azure SQL are the most common database connections for Australian enterprises. The connection is reliable, performance is predictable, and you can use native SQL queries in Power Query if you need to push transformations to the server side. If your data lives in SQL Server, this is the path of least resistance.
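Pushing a transformation server-side with a native query looks something like this in Power Query - the server, database, and SQL are placeholders for your own:

```m
let
    // Server name, database, and query are illustrative - substitute your own
    Source = Sql.Database(
        "myserver.database.windows.net",
        "SalesDB",
        [Query = "SELECT OrderID, OrderDate, Amount
                  FROM dbo.Orders
                  WHERE OrderDate >= '2025-01-01'"]
    )
in
    Source
```

One trade-off to be aware of: supplying a native query like this generally prevents Power Query from folding any later steps, so do as much filtering as you can inside the SQL itself.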

SharePoint Lists come up constantly because so many Australian businesses run on Microsoft 365. The connector works, but it can be slow for large lists and the data typing is sometimes inconsistent. If you're pulling from SharePoint, consider whether the data should really be in a proper database instead.

Azure Blob Storage and Data Lake connections matter when you're working with larger datasets or data engineering pipelines. These connectors pair well with Microsoft Fabric workflows where data lands in a lakehouse and Power BI connects to the semantic layer.

Web APIs and REST services let you pull data from SaaS platforms, government data portals, and internal APIs. The Web connector with a URL is the simplest version, but for anything in production you'll want to use a custom connector or at least parameterise the connection.
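A parameterised Web connection might look like the sketch below. The URL and query parameters are hypothetical; the important pattern is keeping the base URL static and using RelativePath and Query options, which plays much better with scheduled refresh in the Power BI Service than concatenating a full URL string:

```m
let
    // Base URL, path, and parameters are illustrative
    Response = Web.Contents(
        "https://api.example.com",
        [RelativePath = "v1/orders", Query = [region = "AU", status = "open"]]
    ),
    Parsed = Json.Document(Response),
    // Assumes the API returns a JSON array of flat records
    AsTable = Table.FromRecords(Parsed)
in
    AsTable
```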

Common Connection Mistakes We See

After years of fixing broken Power BI implementations, here are the patterns that cause the most grief.

Hardcoded file paths. Someone builds a report connected to C:\Users\jsmith\Desktop\data.xlsx and then wonders why it breaks when deployed to the Power BI Service. Use parameters for file paths, or better yet, move the data to a shared location before you build anything.
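The parameterised version is a one-line change. Assuming you've created a text parameter called DataFolder (Manage Parameters > New Parameter) pointing at a shared location:

```m
let
    // DataFolder is a Power Query parameter, e.g. "\\fileserver\reports\"
    // so the path changes in one place, not in every query
    Source = Excel.Workbook(File.Contents(DataFolder & "data.xlsx"), null, true)
in
    Source
```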

Too many data sources in one report. We've seen reports connecting to six different Excel files, two databases, a SharePoint list, and a web API - all in one .pbix file. Each connection is a potential failure point. Consolidate your data upstream if possible. A single clean database connection beats five fragile ones.

Ignoring data types in Power Query. Power BI guesses data types when you first connect, and it guesses wrong often enough to matter. Date columns get treated as text, numeric IDs get treated as numbers (which strips leading zeros), and currency amounts lose their precision. Always explicitly set data types in Power Query.
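Setting types explicitly is one Table.TransformColumnTypes step. Column names here are illustrative, but the pattern is the point - text for IDs, a fixed decimal for money:

```m
let
    Source = Csv.Document(File.Contents("C:\Data\orders.csv"), [Delimiter = ",", Encoding = 65001]),
    Promoted = Table.PromoteHeaders(Source),
    // Declare every column's type rather than trusting automatic detection
    Typed = Table.TransformColumnTypes(Promoted, {
        {"OrderDate", type date},
        {"CustomerID", type text},   // text preserves leading zeros
        {"Amount", Currency.Type}    // fixed decimal avoids floating-point drift
    })
in
    Typed
```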

Not using query folding. When you apply transformations in Power Query to a SQL-based source, Power BI can sometimes push those transformations back to the server. This is called query folding, and it's dramatically faster than pulling all the data and transforming locally. But certain steps break query folding silently. Learn which ones do, and structure your queries to maintain folding as long as possible.
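A folding-friendly query keeps the foldable steps early. In this sketch (server and table names are placeholders), the filter and column selection both translate to SQL on the server:

```m
let
    Source = Sql.Database("myserver", "SalesDB"),
    Orders = Source{[Schema = "dbo", Item = "Orders"]}[Data],
    // These steps fold: the server does the filtering and column selection
    Filtered = Table.SelectRows(Orders, each [OrderDate] >= #date(2025, 1, 1)),
    Selected = Table.SelectColumns(Filtered, {"OrderID", "OrderDate", "Amount"})
    // A step like Table.AddIndexColumn here would break folding, and every
    // step after it would run locally on the full dataset
in
    Selected
```

To check whether folding is still intact, right-click a step in the Applied Steps pane - if "View Native Query" is available, that step is folding.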

The Data Refresh Problem

Connecting to data is only half the job. Keeping it fresh is the other half, and it's where a lot of projects struggle.

In Power BI Desktop, you refresh manually. But once you publish to the Power BI Service, you need to set up scheduled refreshes. This requires a gateway for on-premises data sources, which means installing the On-premises Data Gateway on a server that can reach your data.

The gateway setup trips up a lot of teams. It needs to run on a Windows machine that's always on and has network access to your data sources. It needs to be kept updated. And the credentials stored in the gateway need to be maintained when passwords change.

For cloud data sources like Azure SQL, you can skip the gateway entirely - the Power BI Service connects directly. This is one of many reasons to move data sources to the cloud when you can.

Setting Up Connections for a Team

If you're building Power BI for a team, think about connection management from day one.

Use parameters for server names, database names, and file paths. This lets you switch between development and production environments without editing queries. Store connection details in a configuration table rather than scattering them across individual queries.
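One way to sketch that configuration-table pattern in M - the environments, server names, and the Environment parameter are all hypothetical:

```m
// Assumes a text parameter named Environment with values "Dev" and "Prod"
let
    // All connection details live in this one table
    Config = #table(
        {"Environment", "Server", "Database"},
        {
            {"Dev",  "dev-sql.internal",  "SalesDB_Dev"},
            {"Prod", "prod-sql.internal", "SalesDB"}
        }
    ),
    // Pick the row matching the current environment parameter
    Row = Table.SelectRows(Config, each [Environment] = Environment){0},
    Source = Sql.Database(Row[Server], Row[Database])
in
    Source
```

Switching a whole report from development to production then means changing one parameter value, not hunting through every query.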

Consider whether your team needs a data integration strategy that sits upstream of Power BI. For organisations with complex data landscapes, having a data warehouse or lakehouse that Power BI connects to is far better than having Power BI connect to a dozen raw sources.

Document which connections each report uses. When a data source changes - a server migration, a database rename, a credential rotation - you need to know which reports are affected. This documentation doesn't need to be fancy. A spreadsheet listing report names and their data sources works fine.

What I'd Actually Recommend

For most Australian businesses getting started with Power BI, here's the practical path:

Start with Power BI Desktop connecting to whatever data you have, even if it's Excel. Build a few reports. Learn the tool. But before you go into production, move your data into a proper source - Azure SQL Database is the easiest option if you're already in the Microsoft ecosystem.

Set up the Power BI Service with a Pro or Premium Per User licence so you can share reports and schedule refreshes. If you're connecting to on-premises data, install the gateway early and get it working before you have twenty reports depending on it.

And if your data needs are growing faster than your team can manage, that's when you bring in people who've done this before. Our Power BI consultants have set up data connections for organisations ranging from small finance teams to large government departments, and the principles are the same even if the scale differs.

The unsexy truth about Power BI is that the connecting-to-data part matters more than the dashboard-building part. Get the foundations right and everything else follows. Get them wrong and you'll spend more time troubleshooting refresh failures than actually analysing data.