
ERP Data Migration: Step-by-Step Guide 2026

A hands-on guide to ERP data migration covering extraction, cleansing, mapping, validation, and cutover. Includes timelines, common pitfalls, and recovery strategies from real projects.

By Softabase Editorial Team
March 21, 2026 · 11 min read

Data migration is where ERP projects go to die. Not with a dramatic explosion, but with a slow bleed of missed deadlines, corrupted records, and growing distrust in the new system. Panorama Consulting found that data issues cause or contribute to 40% of ERP implementation failures.

Here's the thing: data migration isn't technically difficult. The tools are mature. The process is well-documented. What makes it brutal is the sheer volume of decisions. Which records do you migrate? How do you handle duplicates? What do you do with ten years of transaction history that nobody will ever look at again?

I've led migrations from spreadsheets to Odoo, from legacy AS/400 systems to SAP Business One, and from one cloud ERP to another. The technical steps are similar every time. The human decisions are where projects stumble.

This guide walks through each phase with specific timelines, tools, and the gotchas that only show up when you're knee-deep in 500,000 records at midnight before a Monday go-live.

Phase 1: Data Assessment and Scoping (Weeks 1-3)

Before you move anything, you need to know what you have. Inventory every data source: your old ERP, those spreadsheets the purchasing team swears aren't critical (they are), the Access database someone built in 2014, and the shared drive full of customer specifications. Map every source to the new ERP's data model.

Count records by category. A typical mid-market migration involves: 5,000-50,000 customer records, 2,000-20,000 vendor records, 10,000-200,000 item master records, open orders and transactions, and historical data. The volume determines your approach. Under 10,000 total records? Manual cleanup is feasible. Over 100,000? You need automated tools.

Decide your migration scope early. Master data (customers, vendors, items, BOMs) always migrates. Open transactions (orders, invoices, POs) usually migrate. Historical transactions are the debate. My recommendation: migrate 24 months of history maximum. Keep the old system accessible read-only for anything older. Migrating ten years of transaction history triples your timeline and introduces exponentially more data quality issues.

Document data quality issues now, not during migration. Run duplicate detection, identify incomplete records, flag formatting inconsistencies. One manufacturing client discovered that 30% of their 45,000 item records were duplicates created over 15 years of acquisitions. Cleaning that up took six weeks. If they'd assessed it upfront, it would have been a project phase instead of a crisis.

Phase 2: Data Cleansing (Weeks 3-8)

This is the phase everyone underestimates. Plan for data cleansing to consume 30-40% of your total migration effort. Dirty data migrated into a clean system is still dirty data, except now it's harder to identify because it looks legitimate in the new platform.

Start with deduplication. Merge duplicate customer and vendor records using fuzzy matching on names and addresses. Tools like OpenRefine handle this well for mid-size datasets. For larger volumes, consider paid tools like Informatica or Talend. After automated matching, a human needs to review the flagged pairs. Automated merges without human review produce errors that haunt you for years.
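As a minimal illustration of the fuzzy-matching-then-review workflow, here is a sketch using Python's standard-library difflib as a stand-in for a dedicated tool like OpenRefine. The record shape, threshold, and normalization rules are assumptions; tune them to your data. Note that the function only flags candidate pairs, it never merges.

```python
from difflib import SequenceMatcher

def normalize(name: str) -> str:
    """Lowercase and strip punctuation so 'ACME Corp.' matches 'Acme Corp'."""
    return "".join(ch for ch in name.lower() if ch.isalnum() or ch.isspace()).strip()

def flag_duplicates(records, threshold=0.7):
    """Return candidate pairs with similarity above the threshold.
    Pairs go to a human reviewer; nothing is auto-merged."""
    candidates = []
    for i in range(len(records)):
        for j in range(i + 1, len(records)):
            ratio = SequenceMatcher(
                None, normalize(records[i]["name"]), normalize(records[j]["name"])
            ).ratio()
            if ratio >= threshold:
                candidates.append((records[i]["id"], records[j]["id"], round(ratio, 2)))
    return candidates

customers = [
    {"id": "C001", "name": "ACME Corp."},
    {"id": "C002", "name": "Acme Corporation"},
    {"id": "C003", "name": "Globex Industries"},
]
print(flag_duplicates(customers))  # → [('C001', 'C002', 0.72)]
```

The pairwise loop is O(n²), which is fine for a few thousand records; at the 100,000-plus scale you would block on a coarse key (first letter, postal code) before comparing.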

Standardize formatting. Phone numbers, addresses, part numbers, and unit of measure fields accumulate inconsistencies over time. "United States," "US," "U.S.A.," and "usa" should all become one value before migration. Build transformation rules and apply them systematically. This is tedious work but it's the difference between trustworthy data and a system nobody believes.
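A transformation rule in practice is just a deterministic function applied to every record. This sketch shows two assumed rules, a country-name lookup and a US phone reformatter; the variant lists here are illustrative and would grow as cleansing uncovers new ones.

```python
import re

# Hypothetical variant table; extend it as cleansing uncovers new spellings.
COUNTRY_MAP = {
    "united states": "US", "us": "US", "u.s.": "US", "u.s.a.": "US", "usa": "US",
}

def clean_country(value: str) -> str:
    """Collapse country-name variants to one canonical value."""
    key = value.strip().lower()
    return COUNTRY_MAP.get(key, value.strip())

def clean_phone(value: str) -> str:
    """Strip everything but digits, then reformat 10-digit US numbers."""
    digits = re.sub(r"\D", "", value)
    if len(digits) == 10:
        return f"({digits[:3]}) {digits[3:6]}-{digits[6:]}"
    return digits  # non-standard lengths go to manual review

print(clean_country("U.S.A."))      # → US
print(clean_phone("312.555.0142"))  # → (312) 555-0142
```

Keeping each rule in its own small function makes the rules testable and lets you re-run the whole pipeline idempotently after every cleansing pass.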

Validate against business rules. Does every item have a valid unit of measure? Do all BOMs reference items that exist? Are all customer terms codes populated? Run validation checks against the new system's required fields. Export the exceptions, assign them to business owners for resolution, and track completion daily. Budget one week of cleansing for every 20,000 complex records.
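The validation checks above can be sketched as rule functions that return violations rather than raising, so every exception can be exported and assigned to an owner. The record shape and the rules themselves are assumptions; the real required fields come from the target ERP's data model.

```python
# Assumed valid units of measure; in practice, pull this from the target system.
VALID_UOMS = {"EA", "KG", "M", "BOX"}

def validate_item(item: dict) -> list:
    """Return the list of business-rule violations for one item master record."""
    errors = []
    if not item.get("item_id"):
        errors.append("missing item_id")
    if item.get("uom") not in VALID_UOMS:
        errors.append(f"invalid uom: {item.get('uom')!r}")
    if item.get("std_cost") is None or item["std_cost"] < 0:
        errors.append("std_cost missing or negative")
    return errors

items = [
    {"item_id": "A-100", "uom": "EA", "std_cost": 12.50},
    {"item_id": "A-101", "uom": "each", "std_cost": None},
]
# Collect exceptions keyed by record, ready to export and assign to business owners.
exceptions = {i["item_id"]: errs for i in items if (errs := validate_item(i))}
print(exceptions)  # → {'A-101': ["invalid uom: 'each'", 'std_cost missing or negative']}
```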

Phase 3: Data Mapping and Transformation (Weeks 5-8)

Data mapping connects fields in your old system to fields in your new ERP. Some mappings are obvious: old customer name goes to new customer name. Others require transformation: your legacy system stores phone numbers as text strings while NetSuite expects formatted numbers. Some fields have no direct equivalent and need creative solutions.

Build a mapping document that covers every field in every data entity. Yes, this is painstaking. A typical mapping document has 200-500 field mappings. Include the source field, target field, transformation rule, and a sample of actual data. Review this document with both technical and business stakeholders. IT catches the transformation logic issues. Business users catch the "we don't use that field anymore" situations.

Handle code translations carefully. Your old system uses "NY" for New York while the new system expects "New York" or a numeric state code. Customer payment terms might be "Net 30" in the old system and "N30" in the new one. Build a complete lookup table for every coded field. Missing translations cause import failures that stop the entire migration.
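A lookup table that fails loudly is the safest way to implement these translations: an unmapped code should stop the load with a clear message, not slip through as bad data. The table contents below are hypothetical examples.

```python
# Hypothetical lookup tables; every legacy code must appear exactly once.
TERMS_MAP = {"Net 30": "N30", "Net 60": "N60", "COD": "COD"}
STATE_MAP = {"NY": "New York", "CA": "California"}

class MissingTranslation(Exception):
    pass

def translate(table: dict, field: str, value: str) -> str:
    """Fail loudly on an unmapped code instead of loading bad data."""
    try:
        return table[value]
    except KeyError:
        raise MissingTranslation(f"{field}: no mapping for {value!r}") from None

print(translate(TERMS_MAP, "payment_terms", "Net 30"))  # → N30

try:
    translate(STATE_MAP, "state", "TX")
except MissingTranslation as e:
    print(e)  # → state: no mapping for 'TX'
```

Running all records through this during test migrations surfaces the missing entries early, when they are a spreadsheet fix rather than a go-live blocker.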

Plan for data that doesn't fit. Every migration has orphan data - records that exist in the source but have no logical home in the target. Customer-specific pricing stored in free-text fields. Notes attached to vendor records that describe undocumented business rules. Don't ignore these. Catalog them, decide with business owners what to do, and either create custom fields in the new system or archive the information externally.

Phase 4: Migration Testing (Weeks 7-11)

Run at least three full test migrations before the real cutover. Each test should use current production data, execute the complete migration script, and validate results against acceptance criteria. If you skip this, you deserve the 3 AM phone call on go-live weekend.

Test migration one identifies the obvious failures: missing mappings, broken transformations, and import errors. Expect 60-70% of records to load successfully on the first attempt. Don't panic. Fix the issues and document the resolutions. This round typically takes 2-3 days to execute and a week to analyze.

Test migration two should achieve 90-95% success rates. Focus on edge cases: records with special characters, maximum-length fields, and data that spans multiple entities (a customer who is also a vendor, for instance). Validate calculated fields like inventory valuation, open AR balances, and BOM costs against the source system. Discrepancies here indicate transformation errors.

Test migration three is the dress rehearsal. Execute it over a weekend using the exact cutover timeline you plan for go-live. Record the elapsed time for each step. If extraction takes three hours, transformation takes five hours, and loading takes four hours, you need a cutover window of at least 16 hours including validation. If your business can't tolerate 16 hours of downtime, you need a different cutover strategy.

Phase 5: Cutover and Validation (Go-Live Weekend)

Freeze the source system. No new orders, no payments, no inventory movements from the moment extraction begins. Everyone needs to understand this clearly. A single transaction entered during migration creates a reconciliation nightmare. Schedule the freeze to start Friday at 6 PM and plan to reopen Monday morning in the new system.

Execute the migration scripts in the predetermined sequence: master data first (GL accounts, customers, vendors, items), then configuration data (price lists, BOMs, routings), then open transactions (open orders, open invoices, open POs), and finally balances (inventory on hand, AR/AP balances, bank balances). Each load must complete and validate before the next begins.

Validation checkpoints are mandatory. Compare record counts between source and target for every entity. Verify financial balances match to the penny. Spot-check 50 records per entity for data accuracy. Run a P&L and balance sheet in the new system and compare to the last report from the old system. If anything doesn't balance, stop and investigate before proceeding.
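The count and balance checks can be scripted so they run identically after every load step. A minimal sketch, with made-up entity names and counts; note the use of Decimal for financial comparisons, since floats cannot represent "to the penny" reliably.

```python
from decimal import Decimal

def check_counts(source_counts: dict, target_counts: dict) -> list:
    """Report every entity whose record count differs between source and target."""
    return [
        f"{entity}: source={src} target={target_counts.get(entity, 0)}"
        for entity, src in source_counts.items()
        if src != target_counts.get(entity, 0)
    ]

def balances_match(source_total: Decimal, target_total: Decimal) -> bool:
    """Financial balances must match to the penny; use Decimal, never float."""
    return source_total == target_total

src = {"customers": 12408, "vendors": 3102, "items": 88541}
tgt = {"customers": 12408, "vendors": 3099, "items": 88541}
print(check_counts(src, tgt))  # → ['vendors: source=3102 target=3099']
print(balances_match(Decimal("104233.17"), Decimal("104233.17")))  # → True
```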

Have a rollback plan. Define clear go/no-go criteria before cutover weekend. If customer count differs by more than 1%, or if financial balances are off by more than $100, you roll back. Publish these criteria to the project team in advance. The worst migrations are the ones where the team pushes through known data issues because they don't want to delay go-live.
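Those go/no-go thresholds are easy to encode so the decision is mechanical rather than a midnight judgment call. A sketch using the criteria above (more than 1% count variance or more than $100 balance variance triggers rollback); the sample figures are invented.

```python
from decimal import Decimal

def go_no_go(src_count: int, tgt_count: int,
             src_balance: Decimal, tgt_balance: Decimal) -> str:
    """Apply the published cutover criteria: >1% count variance or >$100
    balance variance means roll back."""
    count_variance = abs(src_count - tgt_count) / src_count
    balance_variance = abs(src_balance - tgt_balance)
    if count_variance > 0.01 or balance_variance > Decimal("100"):
        return "ROLL BACK"
    return "GO"

print(go_no_go(12408, 12395, Decimal("104233.17"), Decimal("104233.17")))  # → GO
```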

Common Pitfalls and How to Avoid Them

Pitfall one: migrating everything. Companies with 20 years of history try to bring all of it into the new system. Don't. Migrate active records and 24 months of history. Archive the rest. A distribution client insisted on migrating 15 years of transaction history. It added eight weeks to the project and produced reports that took 30 seconds to load because of 12 million historical records nobody ever queried.

Pitfall two: letting IT own data quality decisions. IT can execute the migration. Business users must own the data quality. When IT decides that duplicate customers should merge based on matching tax IDs, they might merge two separate legal entities that share a parent company. Every merge, archive, and transformation rule needs business sign-off.

Pitfall three: skipping the parallel run. Entering data in both systems for 30 days is painful and expensive. It's also the only way to catch data issues that testing missed. On one project, the client pushed to cut the parallel run from 30 days to 10 to save time. We held to the full period and discovered a costing error on day 22 that a shortened run would have missed. Run the full parallel period.

Pitfall four: no post-migration data stewardship. Migration ends but data management doesn't. Assign data stewards for each entity. Define rules for new record creation. Implement validation at the point of entry. Without ongoing governance, you'll recreate the same data quality problems within 18 months.

About the Author

Softabase Editorial Team

Our team of software experts reviews and compares business software to help you make informed decisions.
