Everything in a Salesforce org revolves around data, and managing it requires patience as well as attention to detail. Salesforce’s Data Loader is a powerful tool for mass importing, exporting, updating, or deleting records, but like any tool, it comes with its own set of challenges. As convenient as it is, users often encounter errors that can derail their data management processes.
In this article, we’ll go over six of the most common Data Loader errors and how to troubleshoot them. A recurring theme you might see is that these errors often boil down to one thing: ensuring your data is clean and prepped.
1. Missing Required Fields
One of the most common errors when using Data Loader is the absence of required fields in your dataset. Salesforce objects often have required fields (such as Account Name, Opportunity Stage, or Email), and failing to include these in your import file will result in error messages.
I have a custom field on the Account object in my org named “Number of Employees”. It requires a value before saving a record, and so I tried to import some records with the following data set (intentionally leaving “Number of Employees” blank in one of the rows).
The rest of the records were imported successfully, except for TechSmart (as I expected!). Here is the error:
To avoid these errors, ensure that before running Data Loader you’ve identified the required fields for the object you’re working with. An easy way to do this is by viewing the object in Schema Builder. Just go to Setup > Schema Builder and select the object you’re trying to import.
The fields that have a red vertical line are the required fields. Make sure that your data set includes them, and that fields like Owner ID or Record Type (which are often required but overlooked) are correctly populated.
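Once you know the required fields, you could also run a quick pre-flight check on your .csv before touching Data Loader. Here's a minimal sketch in Python; `Number_of_Employees__c` is my assumed API name for the custom field described above, so adjust the list to match your own object.

```python
# Hypothetical required fields for this example; replace with the
# required fields you identified in Schema Builder for your object.
REQUIRED_FIELDS = ["Name", "Number_of_Employees__c"]

def find_missing_required(rows, required=REQUIRED_FIELDS):
    """Return (row_number, field) pairs where a required field is blank."""
    problems = []
    for i, row in enumerate(rows, start=2):  # row 1 of the .csv is the header
        for field in required:
            if not str(row.get(field, "")).strip():
                problems.append((i, field))
    return problems

rows = [
    {"Name": "Acme", "Number_of_Employees__c": "250"},
    {"Name": "TechSmart", "Number_of_Employees__c": ""},  # blank on purpose
]
print(find_missing_required(rows))  # [(3, 'Number_of_Employees__c')]
```

Running a check like this before the import surfaces the exact row and field that would have failed, instead of finding out from the error .csv afterwards.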
2. Invalid Field Values
Salesforce validates input based on the field type (for example: picklists, dates, numbers, etc.). Uploading data with invalid values or formats can lead to errors, or data not being imported at all. Examples of this are:
- Text in a number field
- Date with an incorrect format
- Number field with inputs that use commas as thousand separators
The following error appeared when I tried to enter the value “twenty” in a custom number field named “Years In Operation”, which accepts only numeric values of up to three digits.
You can also get the error Invalid Number (“<Field Name>: invalid number: <Number Value>”) when your import file contains number values with commas as thousand separators. For example, 10000 may have been typed as 10,000 in your .csv file. This can be tricky because sometimes no error appears at all: the operation reports success after the import, but when you check the record in Salesforce, the value wasn’t actually updated.
I tried to use the following in my .csv file to update some Account records, intentionally using “10,000” as a value in my Number of Employees custom field.
The operation turned out to be a “success”, but when I went to check the Account record in Salesforce the Number of Employees field was still blank:
Avoid these by always checking your data against the field type. Make sure your dates follow the correct format and that your text/number fields don’t contain any unallowed characters.
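Both of those checks are easy to script before an import. Here's a hedged sketch, assuming your number columns should contain plain digits and your dates follow an ISO yyyy-mm-dd format (adjust `fmt` to whatever your org expects):

```python
from datetime import datetime

def clean_number(value):
    """Strip thousands separators so '10,000' becomes '10000'."""
    cleaned = value.strip().replace(",", "")
    float(cleaned)  # raises ValueError on text like "twenty"
    return cleaned

def check_date(value, fmt="%Y-%m-%d"):
    """Confirm a date string matches the expected format before import."""
    try:
        datetime.strptime(value, fmt)
        return True
    except ValueError:
        return False

print(clean_number("10,000"))    # "10000"
print(check_date("2024-03-01"))  # True
print(check_date("2024-13-01"))  # False: month 13 doesn't exist
```

Running your number columns through something like `clean_number` catches both the “twenty” case (a hard error) and the silent “10,000” case before Data Loader ever sees the file.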
Another error that’s especially common for picklists is “bad value for restricted picklist field”. This occurs when the value you’re trying to import to a picklist field is not one of the field’s picklist values. Here’s a simple example:
In my org, I have a custom picklist field called Account Status, which has four values: Customer, Equipment Only, Permanently Closed, and Inactive.
I tried to use the following dataset to update the Account Status on three records, intentionally using just “Equipment” on one of the rows instead of “Equipment Only”.
Here’s the error that resulted from this operation:
So, lesson learned: for picklists, ensure that your values exactly match the options defined for the field in Salesforce. Some people take the easy way out and uncheck the “Restrict picklist to the values defined in the value set” setting on the picklist field just so the update operation pushes through. I do not recommend removing the restriction, as it allows any input and can cause messy data in the long run.
This defeats the purpose of using a picklist field in the first place! Unless you genuinely want your picklist to be editable and your users to be able to enter their own values, keep the restriction enabled and add any new values to the picklist’s value set in Salesforce first before attempting your data import.
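Restricted picklist values can also be pre-checked in your dataset. A small sketch, using the Account Status value set from the example above (the record IDs here are hypothetical placeholders):

```python
# Picklist values copied from the example Account Status field above.
ALLOWED_STATUSES = {"Customer", "Equipment Only", "Permanently Closed", "Inactive"}

def bad_picklist_rows(rows, field, allowed):
    """Return (row_number, value) pairs whose value isn't in the value set."""
    return [
        (i, row[field])
        for i, row in enumerate(rows, start=2)  # row 1 of the .csv is the header
        if row.get(field) and row[field] not in allowed
    ]

rows = [  # hypothetical record IDs
    {"Id": "001xx0000001AAA", "Account_Status__c": "Equipment Only"},
    {"Id": "001xx0000001AAB", "Account_Status__c": "Equipment"},  # would be rejected
]
print(bad_picklist_rows(rows, "Account_Status__c", ALLOWED_STATUSES))
# [(3, 'Equipment')]
```

Note that the comparison is case- and whitespace-sensitive on purpose, since Salesforce requires an exact match against the value set.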
Also, note that record types and owner fields must be identified by their ID, not their name; otherwise, you’ll get an error.
My Account object has two record types: Business and Customer. I tried to use the following dataset to update records and intentionally used the record type name “Business” instead of the ID 0123t0000013sSyAAI on one of the rows.
Updates on Acme and salesforce.com were a success, but I got an error on Global Media as expected:
You can find your record type or owner’s ID in the URL bar when accessing it within Setup.
These errors can also appear when you’re entering a value in a lookup field and the ID you provide belongs to a record of the wrong object – so be cautious of that too!
3. Duplicate Record Errors
Duplicate data can cause significant problems during import, especially if you haven’t set up deduplication strategies in your org. Trying to load records that already exist (based on unique identifiers like Email or custom unique fields) can result in errors or duplicate records in your system.
I know you can always do deduplication in your org after import, but wouldn’t it be great to prevent duplicates in the first place? Here’s an example:
I know the Account object has the standard Account Number field, but in my org, I have a custom Account Number field that is set as Unique. This means that every Account record in my org must have a unique value in this field.
Now I have an existing Account record with Account Number set as 11:
Here’s me trying to do an update using the following dataset. Notice how I’m trying to update ABKD Inc. to have an Account Number of 11, the same as with Kwasson Bakery above.
As expected, the results returned two successes and one error:
Helpfully, the error shows the ID of the duplicate record in the org, which leads us to Kwasson Bakery. To fix this, either access the existing record and change its number, or update the value in your dataset to one that doesn’t exist in your org yet.
To avoid duplicate records or Data Loader errors relating to this, deduplicate your data beforehand using tools like Excel’s “Remove Duplicates” feature (this is great for large datasets that may have duplicate values) or by running a Salesforce report to identify already existing records.
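For datasets too large to eyeball, a scripted pass over the unique column works just as well as Excel’s “Remove Duplicates”. A minimal sketch, using the custom unique Account Number field from the example (`Account_Number__c` is my assumed API name):

```python
def find_duplicate_values(rows, field):
    """Return values that appear more than once in a unique field's column."""
    seen, dupes = set(), set()
    for row in rows:
        value = row.get(field, "").strip()
        if not value:
            continue  # blanks aren't duplicates of each other
        if value in seen:
            dupes.add(value)
        seen.add(value)
    return dupes

rows = [
    {"Name": "Kwasson Bakery", "Account_Number__c": "11"},
    {"Name": "ABKD Inc.", "Account_Number__c": "11"},  # duplicate
    {"Name": "Acme", "Account_Number__c": "12"},
]
print(find_duplicate_values(rows, "Account_Number__c"))  # {'11'}
```

This only catches duplicates within your file; comparing against values already in the org still needs an export or report, as mentioned above.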
4. Field Mapping Issues
Incorrect field mapping during the Data Loader process is a common source of headaches. For instance, you might accidentally map the wrong columns to Salesforce fields, which can lead to invalid or missing data in your records. This can get really messy, especially when you’ve imported a large number of records with incorrect values!
Avoid this by double-checking your field mappings before starting the upload. It’s helpful to keep a mapping document handy, especially if you’re frequently working with similar data sets. This ensures consistency and avoids human error in manual mapping.
You can make a mapping document by clicking “Save Mapping” in Data Loader during the mapping phase.
This saves the mapping as a .sdl file, which you can keep and load for the next time you’re dealing with similar records. You can then just click “Choose an Existing Map”.
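For reference, the saved .sdl file is just a plain-text properties file mapping your CSV column headers to Salesforce field API names, so you can review or version-control it. A hypothetical example (your column headers and field names will differ):

```
# CSV column header = Salesforce field API name
NAME=Name
EMPLOYEES=Number_of_Employees__c
ACCOUNTSTATUS=Account_Status__c
```

Skimming this file before an upload is a quick way to spot a column mapped to the wrong field.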
When “Use Bulk API” is enabled in your Data Loader settings and you’ve skipped the field mapping step, you may also get this error:
To avoid this, make sure you haven’t skipped the field mapping step and that at least one field is mapped!
5. Relationship Import Errors
Working with relationships in Salesforce, such as Lookups or Master-Detail fields, requires careful attention to the parent-child relationships between records. If you try to import child records without identifying their related parent IDs, you can encounter errors. For example:
I have a custom object in my org named Project, which has a master-detail field looking up to Account. Here is my dataset attempting to import three Project records, with one of the rows intentionally blank on the Account field that identifies a Project’s parent Account.
As expected, Project 1 and Project 2 were successfully imported with the following error showing up for Project 3:
To avoid this, make sure you have the correct parent record IDs before attempting to import child records. You can either export the parent record IDs with Data Loader beforehand or get them from the record’s URL in Salesforce if you’re dealing with only a few records.
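Blank parent IDs in a master-detail column are easy to flag before the import. A small sketch, assuming `Account__c` is the master-detail field's API name (the Account IDs below are hypothetical placeholders):

```python
def rows_missing_parent(rows, parent_field):
    """Flag child rows whose master-detail parent ID is blank."""
    return [
        i
        for i, row in enumerate(rows, start=2)  # row 1 of the .csv is the header
        if not row.get(parent_field, "").strip()
    ]

rows = [  # hypothetical parent Account IDs
    {"Name": "Project 1", "Account__c": "0013t00001AbCdEAAW"},
    {"Name": "Project 2", "Account__c": "0013t00001AbCdFAAW"},
    {"Name": "Project 3", "Account__c": ""},  # blank parent -> import error
]
print(rows_missing_parent(rows, "Account__c"))  # [4]
```

Pairing this with an export of the parent object's IDs lets you also confirm each ID actually exists before loading the children.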
6. Permission and Access Errors
Sometimes, the Data Loader process fails not because of data issues but because the user running the process in Data Loader lacks the necessary permissions. This can include missing access to the object, field-level security, sharing settings, or missing access to the record type being assigned – ultimately preventing updates. Here’s an example:
As mentioned previously, my Account object has two record types: Business and Customer. However, I am assigned to a profile that has only the Business record type assigned to it.
Here is my dataset for an update operation that attempts to assign the Customer record type to the salesforce.com Account:
As expected, the two Business accounts were updated successfully while the third Account I tried to update to Customer failed because I don’t have access to the Customer record type.
Here’s another example showcasing insufficient access. My Profile has Read access, but does not have Edit access to the Approver field on the custom Project object.
I will attempt to update the Approver on existing Project records with Data Loader using the following dataset:
As expected, neither row succeeded; both attempted to update a field I had no edit access to.
So before starting a data load, ensure you have the correct permissions to the object and its fields. If you’ve been getting errors, reduce the batch size to 1 so Data Loader can identify and show you specifically which rows were problematic. Then, try accessing these records within the org to identify the root cause and what exactly it is that you lack access to. You can reduce the batch size within Data Loader settings.
One more tip, aside from checking permissions: ensure the users involved are active, as you might be attempting to update records owned by an inactive user. In that case, work with your Admin to enable the relevant setting in the org (the same Setup option that enables Create Audit Fields also enables updating records with inactive owners).
Final Thoughts
I know that getting Data Loader errors can feel like a roadblock, but as seen above, most of them can be avoided by following best practices for data hygiene.
Many of the errors you encounter with Data Loader ultimately come back to one fundamental principle: clean, prepped data. Validating, deduplicating, and correctly formatting your data may feel like extra work upfront, but it will save you heaps of time and frustration when running data processes.
Just by paying attention to these details upfront, you can significantly reduce errors and make your data management processes a whole lot easier. What other actions do you take when prepping data? Leave them in the comments below!