Data migration can be difficult, particularly at scale, when millions of records are in motion, and even more so when sensitive PII is handled in transit or at rest. Some companies and government agencies may wonder why it matters to them. This week’s incident with the Dallas Police Department demonstrates the risks involved in large-scale data migration, the impact a single human error can have, and the benefits of using an enterprise-level solution for migrating data.

The Situation

On Monday, August 16, the Dallas, TX, police department revealed that a user error in April 2021 resulted in the deletion of 22 TB of department data. Although the majority of this data has since been recovered, approximately 8 TB is presumed missing and unlikely to be recovered. The authorities in Dallas are still wrestling with the impact of this data loss on previously decided and ongoing cases. This costly error, whether caused by a user misstep or by poor data migration planning, could easily have been prevented with a solid data migration strategy.

“On August 6, 2021, the Dallas Police Department (DPD) and City of Dallas Information and Technology Services Department (ITS) informed the administration of this Office that in April 2021, the City discovered that multiple terabytes of DPD data had been deleted during a data migration of a DPD network drive,” said a statement [PDF] from the Dallas County prosecutor’s office.

Upgrades and data migrations are routine for any IT department, and it is during these routine actions that data loss most often happens. As the Dallas Police Department has learned, data loss can be costly for any business, agency, or organization. While the DPD must deal with lost evidence and, perhaps, overturned convictions, businesses may face large fines, poor customer experiences, or other hits to their bottom line.

Best Practices in Data Migration

So how does an organization safeguard data during a migration? Here are some best practices to keep in mind so your data doesn’t go missing during a routine migration:

  1. Invest in a backup solution and set a backup schedule. Flosum is an end-to-end DevOps solution, so it can both migrate your data and back it up.
  2. Ensure your backups run on a regular basis and check the reports for any failures or errors (see the sketch after this list).
  3. Test backups and the restore process regularly. Verify that data has been captured accurately and that files are intact.
  4. Double-check that your backup jobs actually run at their scheduled times.
  5. Create a disaster recovery plan for your organization so a recovery process can be set in motion as quickly as possible in the event of serious data loss.
  6. Set up a test environment and run your migration process through it first. If something goes wrong, the damage will not affect production data. Deploy every update and patch through the test environment before pushing it out to production.
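As a concrete illustration of items 2 through 4, the sketch below shows one way to automate backup verification. It is a minimal example, not part of Flosum or any specific backup product: the report manifest format, file locations, and field names are assumptions made for illustration, so adapt them to whatever your backup tooling actually emits.

```python
# Minimal sketch: confirm that recent backup jobs succeeded and that the
# files they produced are intact. The manifest layout here is hypothetical.

import hashlib
import json
from pathlib import Path

BACKUP_DIR = Path("/backups")            # assumed location of backup files
MANIFEST = BACKUP_DIR / "manifest.json"  # assumed: [{"file": ..., "sha256": ..., "status": ...}, ...]


def sha256_of(path: Path) -> str:
    """Stream the file through SHA-256 so large backups don't exhaust memory."""
    digest = hashlib.sha256()
    with path.open("rb") as fh:
        for chunk in iter(lambda: fh.read(1024 * 1024), b""):
            digest.update(chunk)
    return digest.hexdigest()


def verify_backups() -> bool:
    """Return True only if every job succeeded and every file matches its checksum."""
    entries = json.loads(MANIFEST.read_text())
    all_ok = True
    for entry in entries:
        path = BACKUP_DIR / entry["file"]
        if entry.get("status") != "success":
            print(f"FAILED JOB: {entry['file']} reported status {entry.get('status')!r}")
            all_ok = False
        elif not path.exists():
            print(f"MISSING FILE: {path}")
            all_ok = False
        elif sha256_of(path) != entry["sha256"]:
            print(f"CHECKSUM MISMATCH: {path}")
            all_ok = False
        else:
            print(f"OK: {path}")
    return all_ok


if __name__ == "__main__":
    # Run this from your scheduler right after the backup window closes
    # so a silent failure surfaces before the next migration, not after it.
    raise SystemExit(0 if verify_backups() else 1)
```

A check like this turns “we think the backups ran” into a report you can act on: a non-zero exit code can page the on-call engineer or block the next migration step until the backup is verified.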

Data loss doesn’t have to destroy your business or its reputation, or cost you millions. Investing in a tool like Flosum gives you a native Salesforce data migration and backup solution, so your data never has to leave the Salesforce platform. You can be assured that Flosum has put the resources in place to safeguard your data.

Want to learn more? During our Dev TechTalk live series, we talk about data backup and migration. Check it out here.

