Using the Fabric Data Factory Pipeline Assessment Tool Before Migrating
If you're running Azure Data Factory and you've been thinking about moving to Fabric Data Factory, there's a step that a lot of teams skip - and it costs them time later. Before you start migrating pipelines, you should assess what you actually have.
Microsoft ships a pipeline assessment tool specifically for this purpose. It scans your existing ADF estate and tells you what will migrate cleanly, what needs manual work, and what might not be compatible at all. It's the kind of thing that takes 30 minutes to run and can save you weeks of discovery work.
I've been through enough data platform migrations to know that the assessment phase is where projects either get set up for success or start accumulating technical debt that haunts the team for years. Here's how to use this tool properly.
Why Assessment Before Migration Matters
The pitch for Fabric Data Factory is compelling. You get everything ADF does, plus tighter integration with the rest of the Fabric ecosystem - lakehouses, notebooks, Spark, Power BI. The idea is that instead of managing a standalone ADF instance that connects to separate services, everything lives under one roof.
But here's the thing: most ADF estates have grown organically over years. Pipelines were built by different people, at different times, with different conventions. Some use linked services that have been deprecated. Some rely on self-hosted integration runtimes with specific network configurations. Some have custom activities that shell out to Azure Batch or container instances.
Not all of that maps neatly to Fabric Data Factory. And the pipeline assessment tool exists to tell you exactly where the gaps are before you commit to a migration timeline.
I've seen teams skip the assessment, start migrating the "easy" pipelines, and then discover halfway through that 30% of their estate uses activities or configurations that need significant rework. By that point, they've already told leadership the migration would take six weeks. It ends up taking four months.
What the Assessment Tool Actually Does
The pipeline assessment tool connects to your existing Azure Data Factory instance and analyses every pipeline, dataset, linked service, trigger, and integration runtime. It produces a report that categorises your components into roughly three buckets:
Ready to migrate. These pipelines use activities and configurations that have direct equivalents in Fabric Data Factory. Copy activities, data flows, lookup activities, and standard connectors generally fall into this category. You can migrate these with the built-in migration experience and expect them to work with minimal adjustment.
Partially compatible. These pipelines use features that exist in Fabric but work differently. For example, some trigger types have different scheduling models, or certain connector authentication methods need reconfiguring. You can migrate these, but you'll need to do some manual work after the migration to get them running correctly.
Not currently supported. These are the pipelines that use ADF-specific features without Fabric equivalents. Custom activities that run on Azure Batch, SSIS integration runtime packages, and certain specialised connectors may fall into this bucket. For these, you need a plan - either refactor the pipeline to use supported activities, keep it running in ADF alongside Fabric, or find an alternative approach entirely.
The report gives you specifics, not just categories. It tells you which exact activities, linked services, or configurations are causing compatibility issues. That level of detail is what makes it useful.
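To make the three buckets concrete, here's a minimal Python sketch of how activity types roll up into a pipeline-level verdict. The compatibility map below is a hypothetical, illustrative ruleset, not the assessment tool's actual logic — the real report uses Microsoft's own, far more detailed compatibility data.

```python
# Hypothetical compatibility map -- illustrative only, NOT the
# assessment tool's actual ruleset.
COMPATIBILITY = {
    "Copy": "ready",
    "Lookup": "ready",
    "ExecuteDataFlow": "ready",
    "WebActivity": "partial",        # auth model may differ in Fabric
    "Custom": "unsupported",         # Azure Batch custom activities
    "ExecuteSSISPackage": "unsupported",
}

def classify_pipeline(activity_types):
    """Return the worst-case bucket across a pipeline's activities."""
    order = {"ready": 0, "partial": 1, "unsupported": 2}
    worst = "ready"
    for act in activity_types:
        # Treat anything we don't recognise as needing manual review.
        bucket = COMPATIBILITY.get(act, "partial")
        if order[bucket] > order[worst]:
            worst = bucket
    return worst

print(classify_pipeline(["Copy", "Lookup"]))   # ready
print(classify_pipeline(["Copy", "Custom"]))   # unsupported
```

The key design point: one incompatible activity drags the whole pipeline into the worst bucket, which matches how the report behaves — a pipeline is only as migratable as its least compatible component.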
Running the Assessment
The assessment tool is available through the Azure portal and can also be accessed from within the Fabric migration experience. The basic process looks like this:
- Open your Azure Data Factory instance in the Azure portal
- Look for the migration section (Microsoft has been adding migration tooling progressively, so the exact entry point may shift between portal updates)
- Select the pipeline assessment option
- Choose which pipelines to assess (you can do all of them or a subset)
- Wait for the analysis to complete
- Download or review the assessment report
For larger ADF instances with hundreds of pipelines, the assessment can take a few minutes to complete. It's read-only - it doesn't modify anything in your existing ADF instance.
One practical tip: run the assessment against your production ADF instance, not your development instance. Production is where the real complexity lives. Your dev instance might have half the pipelines and none of the edge cases.
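If you want a rough inventory of your own before (or alongside) the official assessment, you can tally activity types from an ARM template export of your factory. This is a sketch assuming the standard export layout, where pipelines appear as resources of type `Microsoft.DataFactory/factories/pipelines` with an `activities` array under `properties`:

```python
import json
from collections import Counter

def count_activity_types(arm_template: dict) -> Counter:
    """Tally activity types across all pipelines in an exported ARM template."""
    counts = Counter()
    for resource in arm_template.get("resources", []):
        if not resource.get("type", "").endswith("factories/pipelines"):
            continue
        for activity in resource.get("properties", {}).get("activities", []):
            counts[activity.get("type", "Unknown")] += 1
    return counts

# Tiny inline example standing in for a real export file,
# which you'd load with json.load(open("arm_template.json")).
template = {
    "resources": [
        {
            "type": "Microsoft.DataFactory/factories/pipelines",
            "name": "[concat(parameters('factoryName'), '/CopyOrders')]",
            "properties": {"activities": [{"name": "CopyOrders", "type": "Copy"}]},
        }
    ]
}
print(count_activity_types(template))
```

Note this flat walk ignores activities nested inside ForEach or If Condition containers; a thorough inventory would recurse into those. It's a complement to the assessment, not a replacement for it.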
What to Do With the Results
Getting the assessment report is step one. What you do with it determines whether your migration goes smoothly.
For the "ready to migrate" pipelines: These are your quick wins. Start with these when you begin the actual migration. They build confidence, they give you practice with the migration tooling, and they let you validate that the Fabric Data Factory environment is configured correctly before you tackle the harder stuff.
For the "partially compatible" pipelines: Document what needs to change. Create a runbook for each type of compatibility issue. If five pipelines all need their authentication method updated, write the fix once and apply it across all five. This is where having a structured approach saves real time.
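The write-the-fix-once approach is easy to operationalise: group the findings by issue type rather than by pipeline. The record shape below is hypothetical — the real report's schema may differ, so adjust the field names to match whatever you export:

```python
from collections import defaultdict

# Hypothetical finding records -- the actual assessment report's
# schema may differ; map your export into this shape.
findings = [
    {"pipeline": "IngestSales", "issue": "auth-method-needs-update"},
    {"pipeline": "IngestHR",    "issue": "auth-method-needs-update"},
    {"pipeline": "NightlyLoad", "issue": "tumbling-window-trigger"},
]

# One runbook per issue type, listing every pipeline it applies to.
runbooks = defaultdict(list)
for finding in findings:
    runbooks[finding["issue"]].append(finding["pipeline"])

for issue, pipelines in sorted(runbooks.items()):
    print(f"{issue}: {len(pipelines)} pipeline(s) -> {', '.join(sorted(pipelines))}")
```

Each key in `runbooks` becomes one documented fix; each value is the checklist of pipelines to apply it to.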
For the "not currently supported" pipelines: This is where you need to make strategic decisions. You have three options:
First, you can keep those pipelines in ADF. Microsoft supports running ADF and Fabric Data Factory side by side, and there's no announced deprecation date for ADF. If a pipeline works fine in ADF and the cost of refactoring isn't justified, leave it where it is.
Second, you can refactor the pipeline to use Fabric-native approaches. A custom activity that runs a Python script on Azure Batch might be replaceable with a Fabric notebook. An SSIS package might be convertible to a data pipeline with equivalent logic. This takes effort but reduces your ongoing operational footprint.
Third, you can wait. Microsoft is actively adding features to Fabric Data Factory, and some of today's gaps may be closed by the time you get to that phase of your migration. The assessment tool is something you can run periodically to check whether compatibility has improved.
Assessment as a Planning Tool
Beyond the technical compatibility check, the assessment report is genuinely useful for project planning.
It gives you a realistic scope for the migration. Instead of guessing how long it'll take to move 200 pipelines, you know that 150 are straightforward, 30 need manual work, and 20 need significant rework or a decision to leave them in ADF.
It helps you estimate costs. The pipelines that need rework require developer time. The pipelines that stay in ADF have ongoing operational costs. The assessment lets you build a business case with actual numbers instead of rough guesses.
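Turning bucket counts into a first-cut effort number is simple arithmetic. Using the illustrative 150/30/20 split above — and per-pipeline hours that are pure placeholders you'd replace with your team's own estimates:

```python
# Illustrative effort model -- the per-pipeline hours are placeholders,
# not benchmarks; calibrate them against your own team's velocity.
buckets = {"ready": 150, "partial": 30, "unsupported": 20}
hours_per_pipeline = {"ready": 0.5, "partial": 4.0, "unsupported": 16.0}

total_hours = sum(buckets[b] * hours_per_pipeline[b] for b in buckets)
print(f"Estimated migration effort: {total_hours:.0f} hours")
# 150*0.5 + 30*4 + 20*16 = 75 + 120 + 320 = 515
```

Even a crude model like this beats a single gut-feel number, because it makes the assumptions visible and arguable line by line.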
It also surfaces technical debt. I've seen assessment reports reveal pipelines that haven't run in months, linked services pointing to decommissioned databases, and triggers that fire daily but whose output nobody consumes. Migration is a good time to clean house, and the assessment helps you identify what to clean.
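Spotting the dead weight is straightforward once you have last-run timestamps, which in practice you'd pull from ADF's monitoring views or the pipeline-runs API. A minimal sketch with sample data:

```python
from datetime import datetime, timedelta, timezone

def stale_pipelines(last_runs, as_of, max_age_days=90):
    """Return pipelines whose last run is older than max_age_days, or that never ran."""
    cutoff = as_of - timedelta(days=max_age_days)
    return sorted(
        name for name, last_run in last_runs.items()
        if last_run is None or last_run < cutoff
    )

# Example data -- in practice, populate this from ADF's run history.
as_of = datetime(2024, 6, 1, tzinfo=timezone.utc)
last_runs = {
    "DailySalesLoad": datetime(2024, 5, 31, tzinfo=timezone.utc),
    "LegacyHRSync":   datetime(2023, 11, 2, tzinfo=timezone.utc),
    "OneOffBackfill": None,  # no runs in the retained history
}
print(stale_pipelines(last_runs, as_of))
```

Anything this flags is a candidate for decommissioning rather than migration — the cheapest pipeline to migrate is the one you delete.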
Common Findings from Assessments We've Done
From the ADF-to-Fabric assessments we've run with clients across Australia, there are a few patterns that come up repeatedly.
Self-hosted integration runtimes are the most common source of "not supported" findings. Many organisations use SHIRs to connect to on-premises databases or resources behind firewalls. Fabric has its own gateway mechanism, but it's not a one-for-one replacement. Plan for this early.
Web activities and webhook activities sometimes need adjustment. The authentication models can differ, and if you're calling internal APIs with specific network requirements, you'll need to verify that Fabric can reach those endpoints.
Custom activities using Azure Batch are almost always flagged. If you have significant logic running in custom activities, those typically need the most rework.
Triggers are often partially compatible. ADF's tumbling window triggers and event-based triggers have Fabric equivalents, but the configuration isn't identical. Storage event triggers in particular may need reconfiguring.
Our Recommendation
Run the assessment tool before you do anything else. Before you write a migration plan, before you estimate timelines, before you commit to a deadline. The 30 minutes it takes to run the assessment will save you from making promises you can't keep.
If you're managing a significant ADF estate and planning a move to Fabric, our Microsoft Fabric consulting team can help you interpret the assessment results and build a realistic migration plan. We also work with Azure Data Factory extensively, so we understand both sides of the migration.
For organisations that aren't ready to migrate but want to understand what the path looks like, we run data platform strategy sessions that include an assessment phase. Better to know what you're dealing with now than to be surprised later.
Reference
This post covers the pipeline assessment tool documented in Microsoft's Fabric Data Factory migration guide.