Data Factory vs SSIS - Migration Guide for Australian Businesses
SQL Server Integration Services (SSIS) has been the backbone of data integration for Australian businesses for over 15 years. If your organisation runs on SQL Server, there's a good chance you have SSIS packages running somewhere - maybe dozens, maybe hundreds.
But SSIS is showing its age. It's tied to on-premises infrastructure, it doesn't scale elastically, and Microsoft's investment has clearly shifted to cloud-native services. The question isn't whether to migrate from SSIS - it's when, and to what.
We've migrated SSIS workloads to Azure Data Factory for organisations ranging from 20-package shops to enterprises with 500+ packages. Here's what we've learned about making this transition successfully.
Why Migrate from SSIS Now
Let's be direct about why this matters for Australian businesses in 2026:
SSIS isn't going away tomorrow, but it's in maintenance mode. Microsoft hasn't added significant features since SQL Server 2019. All active development is happening in Azure Data Factory and Fabric Data Factory.
Infrastructure costs are rising. The on-premises servers running your SSIS packages need hardware refreshes, OS patching, SQL Server licensing, and someone to manage them. We've seen organisations paying $50,000-$150,000 AUD annually to maintain SSIS infrastructure that could run on Data Factory for a fraction of the cost.
Talent is getting harder to find. New data engineers learn Python, Spark, and cloud-native tools - not SSIS. The pool of experienced SSIS developers in Australia is shrinking, and the ones who remain are expensive. We've seen SSIS contractor rates in Sydney and Melbourne hit $180-$220/hour because of scarcity.
Cloud data sources don't work well with SSIS. If your business is adopting SaaS applications (Salesforce, Dynamics 365, HubSpot, Xero), connecting SSIS to these sources is awkward at best. Data Factory has native connectors for all of them.
SSIS vs Data Factory - Honest Comparison
Before you commit to migration, understand what you're gaining and what you're giving up.
| Aspect | SSIS | Azure Data Factory |
|---|---|---|
| Deployment model | On-premises (or Azure VM) | Cloud-native PaaS |
| Scaling | Manual (add more servers) | Automatic (scales with workload) |
| Pricing | SQL Server licence + hardware | Pay-per-use or Fabric capacity |
| Connectors | Strong for Microsoft, limited for SaaS | 100+ native connectors |
| Transformations | Powerful (Script Task, Data Flow Task) | Different model (Mapping Data Flows, Notebooks) |
| Debugging | Excellent (step-through debugging) | Good but different (no step-through) |
| Version control | DTSX files in source control | Git integration (native) |
| Scheduling | SQL Agent jobs | Built-in triggers |
| Custom code | C#/VB.NET Script Tasks | Azure Functions, Custom Activities |
| Learning curve | Familiar to SQL Server teams | New paradigm, but similar concepts |
Where Data Factory Wins
- Cloud connectivity. Native connectors to hundreds of cloud services.
- Scaling. No capacity planning - Data Factory scales to meet demand.
- Operational overhead. No servers to patch, no SQL Agent to manage.
- Monitoring. Built-in monitoring with Azure Monitor integration.
- Cost at scale. For large data volumes, pay-per-use often beats infrastructure costs.
Where SSIS Still Has an Edge
- Complex row-level transformations. SSIS Data Flow Task with script components gives you fine-grained control over row-by-row processing. Data Factory's mapping data flows are powerful but work differently.
- Debugging experience. SSIS's step-through debugging in Visual Studio is genuinely better than anything Data Factory offers. You can set breakpoints and inspect data at any point in the flow.
- Existing investment. If you have hundreds of well-functioning SSIS packages and purely on-premises data sources, the cost of migration may not be justified yet.
Migration Assessment - What Migrates Easily and What Doesn't
Easy to Migrate (1-2 hours per package)
- Simple data movement packages. SSIS packages that move data from one source to another with minimal transformation map directly to Data Factory Copy Activities.
- Bulk insert packages. Straightforward loads from flat files or databases.
- Truncate-and-reload patterns. These translate directly to Data Factory pipelines.
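To make the mapping concrete, here's a rough sketch of what a truncate-and-reload package becomes in Data Factory: a single Copy Activity with a pre-copy script. The pipeline is expressed as a Python dict mirroring Data Factory's pipeline JSON so the structure is easy to inspect; the pipeline, dataset, and table names are hypothetical placeholders, not real resources.

```python
import json

# Hypothetical sketch: an SSIS truncate-and-reload package expressed as a
# Data Factory pipeline. Dataset and table names are illustrative only.
pipeline = {
    "name": "PL_Load_DimCustomer",
    "properties": {
        "activities": [
            {
                "name": "CopyDimCustomer",
                "type": "Copy",
                "typeProperties": {
                    "source": {"type": "SqlServerSource"},
                    "sink": {
                        "type": "AzureSqlSink",
                        # Truncate-and-reload: clear the target before copying
                        "preCopyScript": "TRUNCATE TABLE dbo.DimCustomer",
                    },
                },
                "inputs": [
                    {"referenceName": "DS_OnPrem_DimCustomer", "type": "DatasetReference"}
                ],
                "outputs": [
                    {"referenceName": "DS_AzureSql_DimCustomer", "type": "DatasetReference"}
                ],
            }
        ]
    },
}

print(json.dumps(pipeline, indent=2))
```

A package like this is why the "1-2 hours" estimate holds: the entire SSIS control flow collapses into one activity plus two dataset definitions.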
Moderate Effort (4-8 hours per package)
- Packages with SQL-based transformations. If your SSIS packages use Execute SQL Tasks for most transformation logic, these convert well - you keep the SQL and just change the orchestration layer.
- Parameterised packages. SSIS package parameters map to Data Factory pipeline parameters, but the wiring needs manual work.
- Packages with simple loops. ForEach loops in SSIS map to ForEach activities in Data Factory, but iteration patterns often need adjustment.
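The parameter and loop patterns above can be sketched together. In this hypothetical example (again a Python dict mirroring the pipeline JSON, with made-up names), an SSIS package parameter becomes a Data Factory pipeline parameter, and a ForEach Loop over tables becomes a ForEach activity driven by that parameter:

```python
import json

# Hypothetical sketch: SSIS package parameter + ForEach Loop rebuilt as a
# parameterised Data Factory pipeline. Names and defaults are illustrative.
pipeline = {
    "name": "PL_Load_StagingTables",
    "properties": {
        "parameters": {
            # SSIS package parameter -> Data Factory pipeline parameter
            "tableList": {"type": "Array", "defaultValue": ["Customers", "Orders"]}
        },
        "activities": [
            {
                "name": "ForEachTable",
                "type": "ForEach",
                "typeProperties": {
                    # The loop iterates over the parameter via an expression
                    "items": {
                        "value": "@pipeline().parameters.tableList",
                        "type": "Expression",
                    },
                    "activities": [
                        {
                            # Inside the loop, @item() refers to the current table
                            "name": "CopyTable",
                            "type": "Copy",
                            "typeProperties": {
                                "source": {"type": "SqlServerSource"},
                                "sink": {"type": "AzureSqlSink"},
                            },
                        }
                    ],
                },
            }
        ],
    },
}

print(json.dumps(pipeline, indent=2))
```

The "manual wiring" mentioned above is visible here: the expression syntax, the parameter defaults, and the inner-activity references all have to be rebuilt by hand rather than translated automatically.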
Difficult to Migrate (8-20+ hours per package)
- Packages with Script Tasks. Custom C#/VB.NET code in Script Tasks needs to be rewritten as Azure Functions or custom activities.
- Complex Data Flow Tasks. SSIS Data Flow Tasks with multiple transformations, lookups, conditional splits, and derived columns require significant rearchitecting. Some can be converted to mapping data flows; complex ones may need Spark notebooks.
- Packages with custom components. Third-party SSIS components (CozyRoc, Task Factory, etc.) have no direct equivalent. Functionality needs to be rebuilt.
- Packages with complex error handling. SSIS's event handler model doesn't map directly to Data Factory's pipeline error handling. Error flows need to be redesigned.
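To show what that redesign looks like in practice, here's a minimal sketch of Data Factory's dependency-condition model, which replaces SSIS event handlers. Activity names are hypothetical; the point is the structure: instead of an OnError event handler attached to a task, a follow-up activity declares that it runs only when its predecessor fails.

```python
# Hypothetical sketch: SSIS OnError event handler redesigned as a Data
# Factory failure path. The alert activity depends on the copy activity
# with a "Failed" dependency condition, so it only runs on failure.
pipeline_activities = [
    {"name": "CopySales", "type": "Copy"},
    {
        "name": "AlertOnFailure",
        "type": "WebActivity",  # e.g. call a webhook or Logic App (illustrative)
        "dependsOn": [
            {"activity": "CopySales", "dependencyConditions": ["Failed"]}
        ],
    },
]

for activity in pipeline_activities:
    print(activity["name"], "->", activity.get("dependsOn", "no dependencies"))
```

Because conditions like Succeeded, Failed, and Completed are declared per dependency rather than per event, nested SSIS event-handler hierarchies usually flatten into explicit success/failure branches, which is where the redesign effort goes.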
The Migration Approach We Recommend
After running many SSIS migrations, we've settled on an approach that minimises risk while delivering value quickly.
Phase 1 - Assessment and Planning (1-2 weeks)
Inventory your SSIS packages. Document every package, including:
- What it does (source, destination, transformation)
- How often it runs
- Dependencies (other packages, jobs, applications)
- Complexity level (simple, moderate, complex)
- Business criticality (high, medium, low)
Classify each package into one of four categories:
- Migrate as-is - simple packages that translate directly
- Migrate with refactoring - moderate complexity, needs rework
- Rewrite - complex packages that need a new design
- Retire - packages that are no longer needed (you'll be surprised how many)
In our experience, a typical SSIS estate breaks down roughly as:
- 30-40% migrate easily
- 30-40% require moderate refactoring
- 10-20% need significant rework
- 10-15% can be retired
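Applied to a hypothetical 100-package estate, those ranges give a quick planning estimate (using the midpoint of each range; real estates will vary):

```python
# Rough planning sketch: apply the typical breakdown above to a
# hypothetical 100-package SSIS estate, using range midpoints.
breakdown = {
    "migrate easily": (0.30, 0.40),
    "moderate refactoring": (0.30, 0.40),
    "significant rework": (0.10, 0.20),
    "retire": (0.10, 0.15),
}

total_packages = 100
estimates = {
    category: round(total_packages * (low + high) / 2)
    for category, (low, high) in breakdown.items()
}

for category, count in estimates.items():
    print(f"{category}: ~{count} packages")
```

Note the ranges are rough and overlapping, so the midpoint estimates won't sum to exactly 100; the point is to size the batches, not to be precise.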
Define your target architecture. Will you use Azure Data Factory, Fabric Data Factory, or both? Where will data land (Azure SQL, Data Lake, Fabric Lakehouse)? What's your monitoring strategy?
Phase 2 - Foundation Build (2-3 weeks)
Before migrating any packages, set up the target environment:
- Data Factory instance with Git integration
- Linked services/connections for all data sources
- Self-hosted integration runtime (if you have on-premises sources)
- CI/CD pipelines for deployment
- Monitoring and alerting framework
- Naming conventions and folder structure
This foundation work pays for itself many times over. Without it, each pipeline becomes a one-off, and you end up with the same maintenance headaches you had with SSIS - just in a different tool.
Phase 3 - Iterative Migration (4-16 weeks depending on volume)
Migrate in batches, starting with the simplest, lowest-risk packages:
Batch 1: 5-10 simple packages. This validates your approach and gives the team confidence.
Batch 2: 10-20 moderate packages. Apply lessons from batch 1. Refine patterns and templates.
Batch 3+: Tackle the remaining packages in priority order, with complex/critical packages getting the most attention.
For each batch:
- Build the Data Factory pipeline
- Test with production-like data
- Run both SSIS and Data Factory in parallel for 1-2 weeks
- Switch over once results match
- Decommission the SSIS package
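The parallel-run step above needs an objective "results match" check. One simple approach, sketched below with in-memory rows standing in for the two query result sets, is an order-independent fingerprint of each output table (row count plus a hash of sorted rows):

```python
import hashlib

def table_fingerprint(rows):
    """Order-independent fingerprint: row count plus a hash of sorted rows."""
    digest = hashlib.sha256()
    for row in sorted(repr(r) for r in rows):
        digest.update(row.encode())
    return len(rows), digest.hexdigest()

# Hypothetical parallel-run check: rows produced by the legacy SSIS package
# vs rows produced by the new Data Factory pipeline. In practice these would
# come from two database queries against the two target tables.
ssis_rows = [(1, "Alice"), (2, "Bob")]
adf_rows = [(2, "Bob"), (1, "Alice")]  # same data, different order

match = table_fingerprint(ssis_rows) == table_fingerprint(adf_rows)
print("Outputs match:", match)
```

Fingerprints are cheap to compare daily during the parallel-run window; a mismatch tells you to diff the rows before cutting over.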
Phase 4 - Decommission SSIS (2-4 weeks)
Once all packages are migrated and running successfully:
- Disable remaining SQL Agent jobs
- Document the completed migration
- Plan infrastructure decommission (cancel SQL Server licences, repurpose or decommission servers)
- Archive SSIS packages for reference
What About the SSIS Integration Runtime?
Microsoft offers an Azure-SSIS Integration Runtime in Data Factory, which lets you run existing SSIS packages in the cloud without rewriting them. Should you use it?
Our honest opinion: It's a useful stepping stone, but it's not a destination.
The Azure-SSIS IR costs $400-$2,000+ AUD per month (depending on node size and count), and you still need to manage SSIS packages, configurations, and connections. You're paying cloud prices for an on-premises technology pattern.
We recommend the Azure-SSIS IR only when:
- You have a large number of complex SSIS packages that would take months to rewrite
- You need to move to the cloud quickly (e.g., data centre exit deadline)
- You plan to migrate packages to native Data Factory over time
For most Australian mid-market organisations, native Data Factory migration is faster and cheaper than running the Azure-SSIS IR long-term.
Cost of Migration
Typical costs for SSIS-to-Data-Factory migration in Australia:
| SSIS Estate Size | Estimated Migration Cost (AUD) | Timeline |
|---|---|---|
| Small (10-30 packages) | $25,000 - $60,000 | 4-8 weeks |
| Medium (30-100 packages) | $60,000 - $180,000 | 8-16 weeks |
| Large (100-300 packages) | $150,000 - $400,000 | 4-8 months |
| Enterprise (300+ packages) | $350,000 - $800,000+ | 6-12+ months |
These figures include assessment, planning, development, testing, parallel running, and handover. They don't include ongoing Data Factory running costs (see our Data Factory cost guide for those details).
The ROI calculation: Compare migration cost against annual SSIS infrastructure and maintenance costs. For most organisations we work with, the migration pays for itself within 18-24 months through reduced infrastructure costs, lower maintenance overhead, and improved developer productivity.
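As a worked example of that calculation, using illustrative figures consistent with the ranges above (the Data Factory running cost is an assumption, not a quote):

```python
# Illustrative payback sketch. Figures are assumptions drawn from the
# ranges in this article, not quotes for any specific estate.
migration_cost = 120_000    # AUD one-off, mid-range for a 30-100 package estate
annual_ssis_cost = 100_000  # AUD/year, on-prem infrastructure + maintenance
annual_adf_cost = 30_000    # AUD/year, assumed Data Factory running cost

annual_saving = annual_ssis_cost - annual_adf_cost
payback_months = migration_cost / annual_saving * 12

print(f"Annual saving: ${annual_saving:,} AUD")
print(f"Payback period: {payback_months:.0f} months")
```

With these inputs the payback lands at around 21 months, inside the 18-24 month window we typically see; plug in your own estate's numbers to sanity-check the business case.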
Common Migration Mistakes
1. Trying to replicate SSIS exactly. Data Factory is a different tool with different strengths. Don't try to rebuild SSIS patterns exactly - redesign for cloud-native patterns. A pipeline that took 15 SSIS components might need only 3 Data Factory activities with a different approach.
2. Migrating everything at once. Big-bang migrations fail more often than they succeed. Migrate in batches. Run parallel. Cut over incrementally.
3. Ignoring the opportunity to clean up. Migration is the perfect time to retire unused packages, consolidate duplicates, and fix long-standing issues. Don't carry technical debt from SSIS into Data Factory.
4. Underestimating networking complexity. If your SSIS packages connect to on-premises databases, the self-hosted integration runtime setup and network configuration can be the most time-consuming part of the project. Start early.
5. Not involving the business. SSIS packages often support business-critical reporting and operations. Keep business stakeholders informed and involved in testing.
How Team 400 Handles SSIS Migration
We're Data Factory consultants who've personally migrated hundreds of SSIS packages for Australian organisations. Our approach is practical:
- We start with a thorough assessment of your SSIS estate - usually completed in 1-2 weeks.
- We provide a detailed migration plan with package-by-package estimates.
- We migrate in batches, starting with quick wins to build momentum.
- We run SSIS and Data Factory in parallel during transition - no big-bang cutover.
- We hand over a well-documented, production-ready Data Factory platform.
We also work across Microsoft Fabric and Power BI, so if your SSIS migration is part of a broader data platform modernisation, we can handle the full picture.
Contact us for a free assessment of your SSIS migration scope, or learn more about our consulting services.