Recovery and Rollback
You can detect problems and recover from errors during import and export processes.
Detecting Job Problems
You can use Business Manager to configure a job to send an email to the administrator if it runs for more than a specified period.
Sometimes jobs experience problems that are written to the log but don't stop the import. If these errors are significant, you can add processing to your import pipelines to detect them and stop the import. It's generally good practice to check import logs regularly to identify any problems.
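For example, if you implement the import as a script-based job flow, a custom step that runs before the import can detect an obvious feed problem and stop the job. The following is a minimal sketch, assuming a step parameter named FeedPath that points to a feed file under the IMPEX directory; the parameter and file location are illustrative, not standard.

```javascript
'use strict';

var File = require('dw/io/File');
var Status = require('dw/system/Status');
var Logger = require('dw/system/Logger');

/**
 * Checks that the expected feed file exists and is not empty before the
 * import step runs. FeedPath (for example, "src/catalog/catalog-feed.xml")
 * is a hypothetical step parameter configured in Business Manager.
 */
exports.validateFeed = function (parameters) {
    var feed = new File(File.IMPEX + File.SEPARATOR + parameters.FeedPath);

    if (!feed.exists() || feed.length() === 0) {
        Logger.error('Feed validation failed: {0} is missing or empty', parameters.FeedPath);
        return new Status(Status.ERROR, 'ERROR', 'Feed missing or empty: ' + parameters.FeedPath);
    }

    return new Status(Status.OK);
};
```

In a typical flow configuration, a step that returns an error status ends the job in error, so the import step that follows never runs.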
Recovering from Job Problems
If a job fails or hangs, it can leave its job lock in place. You must release the lock before you can run the job again. To release the lock, stop and restart the instance. For more information, see the Control Center documentation.
Recovering from Network Issues
If a job dies because the instance went down, import the feed again. Because the standard import feature commits object by object, you never have a corrupted database.
If a job dies part way through an import, the database contains whatever data was successfully imported and any old data that was not changed or removed. An incomplete catalog import causes the search indexes to be out of sync with the catalog, so the search feature doesn't work as expected.
Recovering from Data Errors
If there appears to be something wrong with the feed data, use one of the following methods:
| Recovery Method | Description |
| --- | --- |
| Replicate data from another instance | This method is most useful if there is a problem with importing onto production and staging has correct data that you can roll back to. |
| Import a new feed produced by the backend system | This is the most common recovery method. Usually the data must be fixed in the backend system and a new feed generated. |
| Use data from import feed archives | This method is most useful if there is a problem with the backend system producing the feed. For this data to be available, you must have a system for archiving feeds and cleaning up old archived feeds (see the sketch after this table). |
| Use data from regular exports | This method is most useful for data determined on the production system, such as availability, or data that is imported directly onto production, such as price books. It's also useful for data that exists only in Business Manager, not in the backend system, such as web-specific attributes or URL attributes. For this data to be available, you must create a job that exports the required data. |
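If you archive feeds so they're available for recovery, a small cleanup step keeps the archive from growing without bound. The following is a minimal sketch, assuming an archive directory of IMPEX/src/feeds/archive and a step parameter named RetentionDays; both names are illustrative, not standard locations or parameters.

```javascript
'use strict';

var File = require('dw/io/File');
var Status = require('dw/system/Status');
var Logger = require('dw/system/Logger');

/**
 * Removes archived feed files older than RetentionDays days. The archive
 * path and the RetentionDays parameter are assumptions for this sketch.
 */
exports.cleanupArchive = function (parameters) {
    var archiveDir = new File(File.IMPEX + '/src/feeds/archive');
    if (!archiveDir.exists() || !archiveDir.isDirectory()) {
        return new Status(Status.OK, 'OK', 'No archive directory to clean.');
    }

    var cutoff = new Date().getTime() - (parameters.RetentionDays * 24 * 60 * 60 * 1000);
    var removed = 0;

    archiveDir.listFiles().toArray().forEach(function (file) {
        if (file.isFile() && file.lastModified() < cutoff) {
            if (file.remove()) {
                removed++;
            }
        }
    });

    Logger.info('Archive cleanup removed {0} feed file(s)', removed);
    return new Status(Status.OK);
};
```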
Customizing Production Instance Feed Rollback
On a production instance, most data is transferred to the instance from staging via data replication. However, data that must be imported frequently, such as inventory data or price data, is sometimes imported directly into production. As with staging, if a job is interrupted for any reason, the catalog can end up containing a mixture of incomplete import data and previously existing data. For example, an incomplete inventory import can let items be sold that are no longer available.
If you want to make sure you import a full feed or nothing, you must create a custom pipeline that encapsulates all the objects for import in a single transaction and commits them at once. However, this approach isn't needed or recommended, because it can be difficult to code and debug correctly.
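For illustration only, an all-or-nothing import built this way wraps every record in one explicit transaction, so either everything commits or nothing does. In the sketch below, parseFeed and applyRecord are hypothetical helpers, and the pattern is subject to the transaction limits described next.

```javascript
'use strict';

var Transaction = require('dw/system/Transaction');
var Status = require('dw/system/Status');

/**
 * All-or-nothing import: every record is applied inside one explicit
 * transaction, so any failure rolls everything back. parseFeed and
 * applyRecord are hypothetical helpers for this sketch.
 */
exports.importAllOrNothing = function (parameters) {
    var records = parseFeed(parameters.FeedPath); // hypothetical feed parser

    Transaction.begin();
    try {
        records.forEach(function (record) {
            applyRecord(record); // hypothetical: modifies one persistent object
        });
        Transaction.commit(); // all changes become visible together
    } catch (e) {
        Transaction.rollback(); // nothing is persisted if any record fails
        return new Status(Status.ERROR, 'ERROR', 'Import rolled back: ' + e);
    }

    return new Status(Status.OK);
};
```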
Any custom pipeline you create is limited by the size constraints for the commit: a transaction in Salesforce B2C Commerce can't modify more than 1,000 objects at once. Also, if you are modifying all of these objects and someone modifies one of them manually, the commit can fail. The more objects you include in a single commit, the more likely the commit is to fail.
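A more practical sketch commits in batches that stay well under the 1,000-object limit, accepting that a failure partway through leaves earlier batches committed. The applyRecord helper is again hypothetical.

```javascript
'use strict';

var Transaction = require('dw/system/Transaction');

// Comfortably below the 1,000-object-per-transaction limit.
var BATCH_SIZE = 500;

/**
 * Commits records in batches rather than one large transaction. applyRecord
 * is a hypothetical helper that modifies one persistent object per record.
 */
function importInBatches(records) {
    for (var i = 0; i < records.length; i += BATCH_SIZE) {
        var batch = records.slice(i, i + BATCH_SIZE);
        Transaction.wrap(function () {
            batch.forEach(function (record) {
                applyRecord(record);
            });
        });
    }
}

module.exports = { importInBatches: importInBatches };
```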