Experts advise that Salesforce users maintain an ongoing data backup plan as part of their Salesforce management and security practices. It is also crucial to understand the differences between the various types of data and what each requires in a backup model. Data refers to all the records: contacts, opportunities, leads, cases, contracts, and so on. Data may also include object records, content, files, and Chatter posts.
Metadata, on the other hand, refers to all the configuration settings, such as custom fields, reports, page layouts, dashboards, and custom Visualforce or Apex code. You need to understand both data and metadata to devise a sustainable Salesforce backup and recovery strategy.
A single mistake in a source file or an error in field mapping can spell disaster for your Salesforce data. Even with the best intentions, admins and users can find themselves in such situations: they may accidentally delete a huge volume of records or modify them in bulk, only realizing the mistake later. With tools like Data Loader, anyone can delete or update records in bulk. Experts therefore recommend maintaining a continuous data backup and running a manual backup before attempting any major project that leverages the organization's data.
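Before any bulk delete or update, it is worth snapshotting the affected records to a file you control. The sketch below is plain Python with hypothetical record data standing in for an actual Data Loader or API export; it writes a timestamped CSV that can be re-imported if the bulk operation goes wrong:

```python
import csv
import os
import tempfile
from datetime import datetime, timezone

def backup_before_bulk_change(records, object_name, out_dir=None):
    """Write a timestamped CSV snapshot of records before a bulk update/delete."""
    out_dir = out_dir or tempfile.gettempdir()
    stamp = datetime.now(timezone.utc).strftime("%Y%m%dT%H%M%SZ")
    path = os.path.join(out_dir, f"backup_{object_name}_{stamp}.csv")
    # Collect every field name that appears in any record
    fieldnames = sorted({field for rec in records for field in rec})
    with open(path, "w", newline="") as fh:
        writer = csv.DictWriter(fh, fieldnames=fieldnames)
        writer.writeheader()
        writer.writerows(records)
    return path

# Hypothetical records, standing in for a real Data Loader or API export
accounts = [
    {"Id": "0015g00000AAAAA", "Name": "Acme", "Industry": "Manufacturing"},
    {"Id": "0015g00000BBBBB", "Name": "Globex", "Industry": "Energy"},
]
snapshot_path = backup_before_bulk_change(accounts, "Account")
```

The resulting CSV keeps the record IDs, so the snapshot can later feed a Data Loader re-import if the bulk change needs to be reversed.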
Possibility of data deletion
As noted above, it is easy for anyone to delete or update Salesforce records, intentionally or unintentionally. Admins, privileged users, or developers may change configuration settings, add custom fields, delete existing fields, modify page layouts, alter dashboards and reports, or even modify custom code. Most of these changes are irreversible, so it is important to keep a backup copy of your Salesforce metadata so you can restore the prior state after an adverse event.
Data backup best practices
Salesforce also offers tools like Data Loader and its APIs for customers to restore data manually. It is essential to note the order of restoration so that the connections and relationships between data records are preserved. Customers may also engage Salesforce Services or partners to assist with restoration during Salesforce data backup and recovery, and to ensure a backup copy of their data is always available.
To help customers back up their data routinely, there are several Salesforce-native options as well as third-party backup tools available on the AppExchange. The native data backup options come at no additional cost. Let us explore some of the most common options available for backing up data.
- Data Export Service: scheduled or manual exports of data through the UI.
- Data Loader: manual, on-demand exports of data through the APIs.
- Report Export: on-demand exports of data through reports.
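Whichever export option you use, it helps to monitor that backups are actually happening. As a minimal sketch in plain Python (assuming export files carry a YYYY-MM-DD date in their names, which is an arbitrary convention for this example, not Salesforce's), the following checks how stale the newest backup is:

```python
import re
from datetime import date

def latest_backup_age_days(filenames, today):
    """Return days since the newest dated backup file, or None if none found."""
    dates = []
    for name in filenames:
        match = re.search(r"(\d{4})-(\d{2})-(\d{2})", name)
        if match:
            dates.append(date(*map(int, match.groups())))
    return (today - max(dates)).days if dates else None

exports = ["export_2024-05-01.zip", "export_2024-05-08.zip"]
age = latest_backup_age_days(exports, today=date(2024, 5, 10))
```

A check like this can run on a schedule and alert when the age exceeds your backup cadence, for example seven days for a weekly export.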
Here are some of the common options available for backing up metadata.
- Change Sets: Here, you can copy the metadata from the production org to a sandbox or a developer org.
- Sandbox Refresh: refreshing a sandbox copies the configuration metadata automatically. You can find the steps involved in a sandbox refresh in related articles.
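Beyond Change Sets and sandbox refreshes, metadata can also be retrieved through the Metadata API using a package.xml manifest, for example with the Salesforce CLI or an IDE. A minimal sketch of such a manifest, listing a few common metadata types, might look like this:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<Package xmlns="http://soap.sforce.com/2006/04/metadata">
    <types>
        <members>*</members>
        <name>CustomObject</name>
    </types>
    <types>
        <members>*</members>
        <name>ApexClass</name>
    </types>
    <types>
        <members>*</members>
        <name>Layout</name>
    </types>
    <version>58.0</version>
</Package>
```

Retrieving against a manifest like this on a schedule gives you versionable copies of your configuration, which pairs well with storing the output in source control.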
There are also many third-party data backup solutions from Salesforce partners available on the AppExchange marketplace. Some of these are comprehensive, letting you automate backups of both data and metadata and offering mechanisms to restore data easily. You can explore them by going to the AppExchange and searching with keywords related to Salesforce backup. Review as many offerings as possible to understand their features, and read the user reviews, before choosing one. Note that Salesforce does not endorse these third-party solutions.
Data recovery best practices
Data can be deleted or corrupted for many reasons. In any such event, the recycle bin is your first resort for recovery. Remember, though, that the Salesforce recycle bin has its limitations: it holds deleted records for only up to 15 days, and it also caps the number of records it can store.
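As a quick illustration of the retention window, the snippet below (plain Python) tells you whether a deletion still falls inside the 15-day window. Note it deliberately ignores the record-count cap, which can purge items earlier:

```python
from datetime import datetime, timedelta

RECYCLE_BIN_RETENTION_DAYS = 15

def still_in_recycle_bin(deleted_at, now):
    """True if a deleted record is still within the 15-day retention window.
    Ignores the record-count cap, which can purge items earlier."""
    return now - deleted_at <= timedelta(days=RECYCLE_BIN_RETENTION_DAYS)

# Deleted 9 days ago: still inside the window
recoverable = still_in_recycle_bin(datetime(2024, 5, 1), datetime(2024, 5, 10))
```

The practical takeaway: if a deletion is discovered more than two weeks later, the recycle bin will not help, and you will need an actual backup.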
Many admins also recover overwritten data using field history tracking. For this, field history tracking must already have been enabled; it is not available for every object, and it is limited to 20 fields per object. In case of data loss, Salesforce recommends seeking the assistance of native backup solutions, and there are also many third-party offerings for data recovery.
Because Salesforce data recovery involves manual intervention, there is a cost involved, proportionate to the recovery time and effort. As of now, the Salesforce recovery service costs $10,000. The process is as follows.
- Assessing the extent of damage and defining the goals.
- Determining what and where the data loss is and comparing metadata and data from the backup file.
- Creating restore files: the missing records and objects are identified, and separate CSV files are created for all related objects.
- Minimizing the transformations during the restoration process.
- Excluding the audit fields, which are auto-updated by Salesforce.
- Deactivating the workflow or triggers.
Finally, the data is restored. When restoring manually, the rule of thumb is to insert parent records first and then upsert the children. You can also use an external ID to decide whether an insert or an upsert operation is needed for a given record.
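The parent-first ordering can be sketched in plain Python. The in-memory "org" below is purely illustrative: Account, Contact, AccountId, and External_Id__c are standard Salesforce names, but the ID generation and storage here are made up. Parents are inserted first, old backup IDs are mapped to the new IDs, and children are then upserted by external ID with their lookups remapped:

```python
def restore(org, parents, children, ext_id="External_Id__c"):
    """Insert parent Accounts first, then upsert child Contacts by external ID."""
    id_map = {}
    for rec in parents:
        new_id = f"ACC{len(org['Account']):04d}"   # fake ID generation
        org["Account"][new_id] = {**rec, "Id": new_id}
        id_map[rec["Id"]] = new_id                 # old backup ID -> new org ID
    for rec in children:
        # Remap the lookup field from the old parent ID to the new one
        rec = {**rec, "AccountId": id_map[rec["AccountId"]]}
        # Upsert: update the record matching the external ID, else insert
        existing = next((k for k, v in org["Contact"].items()
                         if v.get(ext_id) == rec.get(ext_id)), None)
        key = existing or f"CON{len(org['Contact']):04d}"
        org["Contact"][key] = {**org["Contact"].get(key, {}), **rec, "Id": key}
    return org

org = {"Account": {}, "Contact": {}}
backup_parents = [{"Id": "OLD_A1", "Name": "Acme"}]
backup_children = [{"Id": "OLD_C1", "External_Id__c": "C-001",
                    "LastName": "Smith", "AccountId": "OLD_A1"}]
restore(org, backup_parents, backup_children)
```

The same logic applies to a real restore with Data Loader: load parent CSVs first, capture the new IDs, rewrite the child CSVs' lookup columns, then upsert the children on their external ID field.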
After recovery, you also need to evaluate the restoration by reviewing the integrity of the restored data. Reactivate the validation rules, workflows, and Apex triggers, and switch auto-number fields back from text. Then manually update any fields that did not exist or were deleted during the backup process. You may also adjust the transformed data to bring it as close as possible to its original state.
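Reviewing integrity can be partly automated by diffing the backup against the restored records. A minimal sketch in plain Python, assuming both sides share an external-ID field (hypothetically named External_Id__c here):

```python
def verify_restore(backup_rows, restored_rows, key="External_Id__c"):
    """Report records missing after restore and records whose fields differ."""
    restored = {row[key]: row for row in restored_rows}
    missing, mismatched = [], []
    for row in backup_rows:
        got = restored.get(row[key])
        if got is None:
            missing.append(row[key])                 # never restored
        elif any(got.get(field) != value for field, value in row.items()):
            mismatched.append(row[key])              # restored but altered
    return missing, mismatched

backup = [{"External_Id__c": "C-1", "LastName": "Smith"},
          {"External_Id__c": "C-2", "LastName": "Lee"}]
restored = [{"External_Id__c": "C-1", "LastName": "Smyth"}]
missing, mismatched = verify_restore(backup, restored)
```

Running a report like this per object gives you a concrete worklist of records to re-restore or correct manually before reactivating automation.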