Best Practices for Migrating and Integrating Your Data with Salesforce

Developing Data Migrations and Integrations with Salesforce

Abstract

Now that we understand what the attributes of good data migrations and integrations are, we can learn the best practices used to achieve them. I’m sure you have heard the axiom “every rule is made to be broken.” The software development equivalent is “every best practice has its use case.” We don’t violate (or ignore) rules or advice just because we don’t feel like following them; we violate them because they don’t apply to our current situation or use case. And even when they do apply, it’s perfectly valid to decide that the benefit of doing something does not justify the effort involved. You don’t want to overwork yourself for very limited gain.


Notes

  1.

    Lack of budget (money or time) is not a good reason for not using best practices. If your budget does not allow you to do good work, you need to have a serious conversation with whoever put together the estimate. We have all heard of the project management triangle of scope, budget, and quality. When it comes to data migrations (or integrations), you should refuse to budge on quality. You can say you need a greater budget (for money and/or time). You can say you need to cut scope. But never, ever agree to cut quality—not when it comes to data.

  2.

    It damn well better. If not, start working with your sales team on writing proper data scope details into your SOWs.

  3.

    Not production. You don’t even want to encounter the possibility of affecting users negatively while doing your analysis. Use a test system or a backup of production.

  4.

    The larger the client, the more likely it is to have a mature software development life cycle (SDLC) process and good documentation. In general, the reason an experienced client is replacing a system is that it has become too large, complex, and unmanageable, which is often a symptom (or outcome) of a bad SDLC process. So, the documents you get may be somewhat outdated. Often there is a mix: the database administration team may have good documents on the database structures and integrations, whereas the front-end team may have nothing. Treat any documentation you get as cause to celebrate.

  5.

    Both Oracle and MS SQL Server have such a feature, but it only works if you have FKs defined in the database.

  6.

    Every major RDBMS supports this. If the data are not in an RDBMS, it often pays to import the data into an RDBMS so you can query the metadata easily.
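Once the data are in an RDBMS, the schema can be inspected with ordinary catalog queries. A minimal sketch using SQLite (every major RDBMS exposes equivalent catalog views, such as INFORMATION_SCHEMA; the table and columns here are invented for illustration):

```python
import sqlite3

# Load a small sample into an in-memory SQLite database so the
# schema can be inspected with ordinary queries.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE contact (id INTEGER PRIMARY KEY, email TEXT, account_id INTEGER)"
)

# List all user tables from the catalog.
tables = [r[0] for r in conn.execute(
    "SELECT name FROM sqlite_master WHERE type = 'table'")]

# List the columns of a table via PRAGMA table_info.
columns = [r[1] for r in conn.execute("PRAGMA table_info(contact)")]

print(tables)   # ['contact']
print(columns)  # ['id', 'email', 'account_id']
```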

  7.

    Often, development teams are very good at creating the initial documentation but bad at keeping it up to date throughout the development cycles. This is exacerbated by Salesforce being designed for rapid change.

  8.

    No joke. Do it immediately. It’s immensely important that your documentation remain the source of truth for all transformation rules. If your data transformations are not documented properly, how do you know whether your code is wrong? Or whether you are forgetting the agreed-to transformation rules? How can your QA team members test the data if they don’t know what the data are supposed to look like?

  9.

    The exception to this is a full sandbox; no other sandbox type maintains the production Ids when being refreshed or created. (A partial copy sandbox does, but only for the small segment of records transferred to it.)

  10.

    For more information, see https://developer.salesforce.com/forums/?id=906F00000008ztIIAQ.

  11.

    In this way, you can use the excuse that you didn’t want to wait until the morning and lose hours of work just to get the answer you were sure you already knew.

  12.

    For more information, see https://trailhead.salesforce.com/en/modules/sales_admin_duplicate_management.

  13.

    We cover some of the more commonly used data tools in Chapter 3.

  14.

    I’m not saying you can’t code a migration or integration using the Apex Data Loader that meets all the attributes I listed in Chapters 4 and 5. You absolutely can, and I have. You can use the command line to automate it fully, and some scripting language for all transformation code. I’m just saying it’s not the right tool for the job.

  15.

    On the same note, many ETL tools have “lookup” functionality that is also very slow and, if used with Salesforce, abuses your API calls. You are much better off downloading the related data and doing a proper join in code.
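The alternative the note suggests can be sketched as a plain hash join in memory: download the related records once, index them by the match key, and resolve each source row locally instead of issuing one lookup call per row. The object and field names (AccountNumber, AccountId, and so on) are illustrative assumptions, not taken from the book:

```python
# Pretend these Accounts were bulk-downloaded once, up front.
accounts = [
    {"Id": "001A", "AccountNumber": "ACME-1"},
    {"Id": "001B", "AccountNumber": "ACME-2"},
]
# Index by the external match key for O(1) lookups.
account_by_number = {a["AccountNumber"]: a["Id"] for a in accounts}

# Source rows to be transformed into Contacts.
source_rows = [
    {"email": "a@example.com", "account_number": "ACME-1"},
    {"email": "b@example.com", "account_number": "ACME-2"},
]

# One in-memory hash join replaces thousands of per-row lookup API calls.
contacts = [
    {"Email": r["email"], "AccountId": account_by_number[r["account_number"]]}
    for r in source_rows
]
print(contacts[0]["AccountId"])  # 001A
```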

  16.

    Again, see https://trailhead.salesforce.com/en/modules/sales_admin_duplicate_management.

  17.

    If you have the push-to-staging automated and the data repair automated as well, this is a perfectly good transformation layer/pattern.

  18.

    Based on my past experience, any plan to clean up data after go-live rarely comes to fruition.

  19.

    If you did a good job of fixing the data in the source system during the build and test cycles, and you have written bulletproof code, this situation should be a rarity and should affect only a record or two. If it turns out you need to perform another migration, go back and have the data fixed in the source or in your transformation code.

  20.

    When concatenating Ids, always use a delimiter. Suppose you are concatenating two Id fields. On one record, the Ids are 351 and 25; on another, the Ids are 35 and 125. When you concatenate these Ids, they both result in 35125. They are no longer unique! Make it a practice to use delimiters. I usually use a colon or a hyphen. So, in this case, the Ids would be 351:25 and 35:125 or 351-25 and 35-125.
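The collision and its fix can be demonstrated directly (the helper `composite_key` is hypothetical, not from the book):

```python
def composite_key(*ids, sep=":"):
    """Concatenate Ids with a delimiter so distinct Id pairs stay distinct."""
    return sep.join(str(i) for i in ids)

# Without a delimiter, the two different pairs collide:
assert str(351) + str(25) == str(35) + str(125) == "35125"

# With a delimiter, they remain unique:
assert composite_key(351, 25) == "351:25"
assert composite_key(35, 125) == "35:125"
assert composite_key(351, 25) != composite_key(35, 125)
```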

  21.

    You can argue about some sorting benefits, but this is resolved easily by padding the numbers with zeroes when converting them to text.
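A quick illustration of the sorting point (`zfill` and the sample Ids are my own choices for the sketch, not from the book):

```python
# Text sorting puts "10" before "9"; zero-padding restores numeric order.
ids = [9, 10, 2, 100]

as_text = sorted(str(i) for i in ids)
# → ['10', '100', '2', '9']  (lexicographic, not numeric)

padded = sorted(str(i).zfill(4) for i in ids)
# → ['0002', '0009', '0010', '0100']  (numeric order preserved)
```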

  22.

    This issue is debated heavily online. There are lots of great arguments for each side. So, if you have some time, read up on it and you can impress your friends and colleagues at your next social gathering!

  23.

    Other users could have updated records during or after the job run.

  24.

    This response implies you did something wrong and have now agreed to fix your “mistake.”

  25.

    The Apress editors complained about my use of “potty words” in previous chapters, so you will have to Google this one if you can’t figure it out yourself. Sorry.

  26.

    For more information, see https://www.cos.gatech.edu/facultyres/Diversity_Studies/Fiske_StereotypeContent.pdf.

  27.

    This happens to me repeatedly and I still have trouble imagining it!


Copyright information

© 2019 David Masri

About this chapter


Cite this chapter

Masri, D. (2019). Best Practices for Migrating and Integrating Your Data with Salesforce. In: Developing Data Migrations and Integrations with Salesforce. Apress, Berkeley, CA. https://doi.org/10.1007/978-1-4842-4209-4_6
