You will learn how to create your own client-side data loader utility using Apex, Visualforce, JavaScript remoting and third-party JavaScript libraries (jQuery and Papa Parse).
Why is this helpful? There are many use cases for building a utility that performs CRUD operations on your Salesforce data.
For example, perhaps there is a division in your company that wants to, on an ad-hoc basis, upload a file containing data to be processed, validated and then created. Of course, they could just send the file to another team, who could then use the Apex Data Loader or dataloader.io to perform the operation, but this is an extra step.
Or maybe there is a division within your company that has data that needs to be automatically reconciled with your Salesforce data. For example, Division A wishes to upload a subset of important data from an external system into Salesforce, but this data needs an attribute from Salesforce, and the email address is the unique key.
Using the approach described in this post, it is possible to upload the file, send the data to the Salesforce server, perform some asynchronous processing on it, and then finally perform a DML operation on it.
Why Use a Client-side Data Loader App – Working Example
In many large companies, there are naturally certain silos of data. Here’s an example:
The company ‘Universal Widgets’ has over 40,000 employees globally across 20 different countries and has 12 different specific business divisions. Universal Widgets has multiple databases and CRM systems as a result of many acquisitions. The Head of business division A has some very important client data from database X that *should* be in their Salesforce system because their Board of Directors agreed that Salesforce should be the ‘golden source’ of information on clients. In fact, having *all* client data across all their many different databases is a key part of their master data management strategy.
Now, this client data, which ultimately comes from the different systems that division A uses, needs to be uploaded to Salesforce frequently. This is not a simple 'once and done' job. In fact, because of the many hundreds of thousands of transactions that Universal Widgets completes each week, their client data changes significantly every day. An integration between their different databases and Salesforce is in the pipeline but won't be completed for a number of months.
The options to handle this request (i.e. give the users in division A the ability to upload data to Salesforce) are really either:
- Send the data to a support team within the company, who can then perform the vlookups / data analysis on behalf of that business division to ensure that the required attributes are present. For example, the support team can take the client name from the data and look up the corresponding client Id.
- Give the users of division A Salesforce 'Run Reports' and 'Export Reports' permissions so that they can do their own vlookups / data analysis to get the attributes they require before using the Data Loader or dataloader.io to perform the correct DML.
Either way, someone somewhere is doing a vlookup to match the 'client name' to the corresponding client (or account) record. This can take time, because Universal Widgets logs business requests formally and the requirement needs to be discussed by the team that will end up doing the Excel analysis and data upsert.
Ideally, a subset of users in division A would be given the permission to upload a .csv file to Salesforce and the vlookups and data upsert would be handled automatically. This would enable the business to be far more agile while their integration between the different systems is completed.
Building such a utility will hopefully be interesting to you and give you food for thought about the kind of cool things you can build with Salesforce, but there is also a niche application here. Although for data loads you would generally use either the Data Loader client app or dataloader.io, there are use cases where giving a user the ability to upload data to Salesforce without some attributes, and letting Salesforce do the processing via Apex, is very powerful and agile.
By leveraging the approach mentioned below, such a business outcome is feasible with the power of JavaScript and Apex.
Hopefully, the combination of technologies will get your creative juices flowing and enable you to build some cool features that your users love.
How to Build a Client-side Data Loader App
In this implementation, we will use a Visualforce page, as it is currently slightly easier to leverage third-party JavaScript libraries with Visualforce than with Lightning components. If you get the chance, though, migrating to Aura or LWC would be a worthwhile exercise. By the way, I recommend reviewing this excellent Salesforce video on how to leverage third-party JS libraries in Lightning components. You will find the related code for steps 1-4 below step 4.
Step 1: Create a Visualforce page.
Step 2: Create an Apex class called CSVParser (using the code contained in the Gist below).
https://gist.github.com/salesforceBen/81927472881d46ecd7d4cf2384cb96bb
Then, go back to the Visualforce page that you created and copy and paste the below markup.
https://gist.github.com/salesforceBen/6526e2e8663460e4ce2910d8a190eeff
Step 3: Create a static resource of type application/javascript called ‘papaparse2’ (using the code contained in the Gist below).
https://gist.github.com/salesforceBen/1c62d497d3629c74c4ec726a7f21ef16
Step 4: Uncomment line 7 of the Visualforce page that you created in step 1 (i.e. change <!--<apex:includeScript value="{!$Resource.papaparse2}"/>--> to <apex:includeScript value="{!$Resource.papaparse2}"/>).
https://gist.github.com/salesforceBen/81927472881d46ecd7d4cf2384cb96bb
Step 5: Create a .csv file that looks like the below (you can call it whatever you like but it must be a .csv file):
As you can see, we just have a very simple .csv file, as this is for demo purposes, but what you will learn can be applied to much more complex .csv files and much larger data sets.
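To make the file shape and the parsing step concrete, here is a toy sketch of the transformation that Papa Parse performs for us. Papa Parse itself also handles quoted fields, malformed rows, streaming and worker threads; this hand-rolled version (the function name is my own, not from the original code) assumes plain comma-separated values with a header row, which is exactly what our demo file looks like:

```javascript
// Toy illustration of header-based CSV parsing (what Papa Parse does for
// us, minus quoting, streaming and error handling).
function parseSimpleCsv(text) {
  var lines = text.trim().split('\n');
  var headers = lines[0].split(',');
  // Map each remaining line to an object keyed by the header row.
  return lines.slice(1).map(function (line) {
    var cells = line.split(',');
    var row = {};
    headers.forEach(function (h, i) { row[h] = cells[i]; });
    return row;
  });
}

// A file shaped like the demo: a single 'Name' column of account names.
var csv = 'Name\nAcme Corp\nGlobex Inc';
console.log(parseSimpleCsv(csv));
```

In the real page, this work is done entirely by Papa Parse; the sketch is only here to show what "parsing a .csv with a header row" means in practice.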
Step 6: Navigate to the Visualforce page (https://yourDomain.visual.force.com/apex/papaparsepoc) and you’ll see a page like so:
Let’s upload the accounts.csv file that you just created and then select ‘Submit’:
How Does This Work?
The data from the .csv file is rendered on the page almost instantly. But how does this work? All the heavy lifting is handled by Papa Parse, a very powerful CSV parser. As the FAQ on its website puts it:
There’s a thousand CSV libraries for Javascript. Papa is different. It’s written with correctness and performance in mind. Papa is the first (and so far only) multi-threaded CSV parser that runs on web pages. It can parse files gigabytes in size without crashing the browser. It correctly handles malformed or edge-case CSV text. It can parse files on the local file system or download them over the Internet.
In your code, the reference to Papa Parse is coming from the CDN in the Visualforce page (line 6 of this gist).
<apex:includeScript value="https://cdnjs.cloudflare.com/ajax/libs/PapaParse/5.1.0/papaparse.js"/>
Best practice, of course, is to download the .js file and upload it as a static resource, but using the CDN is fine for our demo purposes.
Now let's look at the static resource, the papaparse2.js file. The buildTable JavaScript function on line 13 of this gist is interesting.
This .js code snippet iterates over the data in the .csv and displays it on the Visualforce page. You can add some console.log statements to see more clearly what it is doing:
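As a concrete illustration, here is a simplified sketch of the kind of iteration buildTable performs (the function name and shape here are my own; the real function in papaparse2.js also writes the table markup to the page). It assumes the parsed results object has a `data` property holding an array of row arrays, which is what Papa Parse produces with its default configuration:

```javascript
// Simplified sketch: walk the parsed rows, log each cell, and collect the
// cell values into a flat array ready to hand to the remoting action.
function collectCellData(results) {
  var cellData = [];
  for (var i = 0; i < results.data.length; i++) {
    var row = results.data[i];
    for (var j = 0; j < row.length; j++) {
      console.log('row ' + i + ', cell ' + j + ': ' + row[j]);
      cellData.push(row[j]);
    }
  }
  return cellData;
}
```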
Once the data has been iterated over, it's pushed into a JavaScript array and then, finally, into a JavaScript remoting action. This is quite interesting because JavaScript remoting (which calls Apex methods via JavaScript) is generally used directly in Visualforce pages; however, you can also leverage it in .js static resource files, albeit with a slightly different syntax, as per this doc.
As per the below snippet, our Apex method is called and the data from all the rows is passed in as a parameter:
function sendAccountNames(cellData) {
    Visualforce.remoting.Manager.invokeAction(
        'CSVParser.insertAccounts',
        cellData,
        function(result, event) {
            console.log('@@@ result is: ' + result);
            console.log('@@@ event is: ' + event);
        },
        { escape: true }
    );
}
Apex classes that leverage JavaScript remoting must be declared with the global (or public) access modifier, and the remote methods themselves must be static and annotated with @RemoteAction. We can see that the method being called is fairly simple: it merely takes the account names passed in from the Visualforce page (which in turn came from the .csv file) and creates accounts with the corresponding account names.
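One useful design tweak, sketched below, is to pass the remoting invoker in as a parameter rather than referencing the `Visualforce` global directly. This makes the flow easy to follow (and to exercise outside a Visualforce page); in the real static resource, `invoke` would simply be `Visualforce.remoting.Manager.invokeAction`, and 'CSVParser.insertAccounts' is the fully qualified Apex method name used when calling from a static resource:

```javascript
// Sketch: same remoting call, but with the invoker injected so the
// function has no hard dependency on the Visualforce global.
function sendAccountNames(cellData, invoke) {
  invoke('CSVParser.insertAccounts', cellData, function (result, event) {
    if (event && event.status) {
      console.log('insert succeeded, result: ' + result);
    } else {
      console.log('insert failed: ' + (event && event.message));
    }
  }, { escape: true });
}
```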
If you upload the .csv file while you have the Developer Console (or an IDE) open, you'll notice a new entry in the debug logs with the operation 'VF Remoting'. Open up the log and you'll see some debug statements:
You can now run a query (for example, SELECT Id, Name FROM Account WHERE CreatedDate = TODAY) to double-check that the records were indeed created:
Of course, we could do much more complicated things if we liked. We can pass in data from a .csv file and then perform some processing on it. For example, upload a .csv file of account names and run a SOQL query to ensure that no accounts with those names already exist, which helps to prevent duplicates. All this requires is extra logic in the Apex class.
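The SOQL duplicate check described above lives in Apex, but a complementary client-side step (a hypothetical helper, not part of the original code) is to remove duplicate names within the uploaded file itself before sending, so the server only has to compare the upload against existing records:

```javascript
// Sketch: drop repeated names within the upload, ignoring case and
// surrounding whitespace, keeping the first occurrence of each.
function dedupeNames(names) {
  var seen = {};
  return names.filter(function (name) {
    var key = name.trim().toLowerCase();
    if (seen[key]) {
      return false; // already sent this name
    }
    seen[key] = true;
    return true;
  });
}
```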
The key piece though is the passing of data from the .csv file to the Salesforce server using Papa Parse and JavaScript remoting.
Summary
In conclusion, you have learned how to use Apex, JavaScript remoting and some third-party JavaScript libraries to enable users to upload .csv files to Salesforce, which can then be processed server-side. This is a helpful pattern, especially if you wish to create data loader-esque tools in the browser.