
How to Get Started with Salesforce’s Free Data Cloud Credits

By Mehmet Orun

Branded content with PeerNova

Salesforce’s decision to offer free credits for Data Cloud is a brilliant move to boost sales and adoption. The challenge so far has been a lack of implementation guidance: customers are uncertain how to start using their free Data Service Credits.

What would be an impactful use case for the free credits? How do you evaluate the technical viability of Data Cloud? In particular, customers are justifiably concerned about wasting these credits or going over the free allotment before seeing results.

In this article, I’ll explain how to take advantage of Data Cloud’s free credits to build a business case for Data Cloud that addresses the key concerns of Business, IT, and Finance stakeholders. We’ll also dive into how Data Cloud’s credit pricing works and share strategies to minimize credit waste.

What Does Free Data Cloud Really Mean?

At Dreamforce ‘23, Salesforce announced free Data Cloud for Enterprise edition and above customers, enabling them to “unify 10,000 profiles at no cost”. The fine print here is important – the ability to unify 10,000 profiles is an approximation. Data Service Credits are the primary entitlement and pricing mechanism for Data Cloud. What Salesforce is really giving customers is an annual allotment of 250,000 Data Service Credits (2,500,000 for Unlimited Plus Edition), prorated against a core contract. 

Think of Data Service Credits as battery power for Data Cloud. Even if we own the same phone model, yours will die faster than mine if you’ve installed a bunch of power-hungry applications. In the same way, more demanding orgs consume more battery (credits) than others. It’s up to each customer to estimate how much battery power they’ll need.

In the same way you can control push notifications to minimize battery usage, there are strategies for reducing Data Cloud credit consumption. Successfully connecting the right data and implementing the appropriate design can ensure you use the 250K credits effectively. But not having a plan may result in squandering the credits with little to show for your efforts.

How Are Salesforce Data Service Credits Calculated and Consumed?

As a usage-based product, Salesforce Data Cloud consumes Data Service Credits. The more complicated the task, the more credits you will need. The amount of data processed also increases the credit cost.

For example, querying data costs two credits per million records. However, batch-transforming a million rows to remove unwanted content requires 2000 credits. Streaming the same transformation costs 5000 credits. Applying transformation rules to get a consistent view of two million rows across different sources will cost 4000 credits for batch processing.
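To make the arithmetic above concrete, here is a minimal sketch of a credit estimator. The per-million-row rates are the illustrative figures from this article, not authoritative pricing (always check the current Data Cloud rate card), and the function is my own construct, not a Salesforce API.

```python
# Illustrative Data Service Credit rates per one million rows, taken from
# the examples in this article (verify against the current rate card).
RATES_PER_MILLION = {
    "query": 2,
    "batch_transform": 2000,
    "streaming_transform": 5000,
}

def estimate_credits(operation: str, row_count: int) -> float:
    """Estimate Data Service Credits for an operation over row_count rows."""
    rate = RATES_PER_MILLION[operation]
    return rate * (row_count / 1_000_000)

# Batch-transforming 2 million rows: 2000 * 2 = 4000 credits
print(estimate_credits("batch_transform", 2_000_000))  # → 4000.0
```

Running a few scenarios through a simple model like this, before you connect anything, is the cheapest way to sanity-check whether a planned pipeline fits inside the free 250K allotment.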

Such a model may raise concerns, but it should not be scary. Your organization probably processes data in public clouds (AWS, GCP, Azure, etc.), which also utilize a credit-based model. The difference? Salesforce is transparent about the costs and complexity of integration and processing – a shift from the past where these expenses were concealed.

The Data Cloud rate card (current as of publishing) details how Data Service Credits are consumed.

Use Free Credits to Make a Business Case for Data Cloud

To drive support for Data Cloud, you need to persuade three key stakeholders: business, IT, and finance. Each of these stakeholders has different concerns and a different understanding of Customer 360 solutions.

When talking with…

  • A business leader: I focus on the lost business opportunity of disconnected data.
  • CFOs: I highlight the cost benefit of consolidating an end-to-end Customer 360 solution on a single platform. Hidden costs often span across various license, implementation, and headcount line items.
  • IT leaders: I underscore technical viability and security. I call attention to how Data Cloud eliminates the technical complexity of evaluating, implementing, and integrating various other technologies. This takes time and effort and there is no guarantee the technologies will work together seamlessly.

In my experience, leveraging a proof of concept (PoC) is the ideal approach to facilitating these conversations. Thankfully, the free Data Cloud credits make a PoC a low-risk proposition.

Start with a Data Cloud Proof of Concept

There are three distinct steps to planning your Data Cloud PoC. If done correctly, these steps will address the needs of your stakeholders in business, IT, and finance:

  1. Select an impactful, yet representative use case.
  2. Identify the right data (sources, records, fields) to demonstrate the value of unifying data.
  3. Deliver a functional demo with quantified benefits and implementation costs.

1. Select an Impactful Use Case

Data Cloud can solve many interesting challenges, but there are only three drivers for business investments:

  • Increase revenue or time to revenue
  • Decrease costs
  • Ensure compliance (failures here can negatively impact the other two)

Personally, I have found the most success by pursuing lost sales use cases. For B2B customers this means showing where an incomplete picture of the customer translates to lost sales opportunities. In B2C contexts this translates to highlighting a loss of customers or a higher cost of sale due to disparate data.

Your use case description should be simple and relatable. For example:

  • B2B: We will assess our global customer base across three lines of business to discover potential cross-sell or up-sell opportunities.
  • B2C: We will determine our true unique customer count and segment customers by revenue so we can offer white-glove service to our best customers inclusive of guest or gift orders.

Want a deep dive on crafting a Data Cloud business case with free Data Service Credits? Watch this video on Salesforce’s free Data Cloud.

2. Identify the Right Data to Support a Data Cloud PoC

I believe an effective Data Cloud PoC has three data sources and three to five source objects per data source. This provides sufficient data to demonstrate insights while keeping the work required for the PoC reasonable. Remember, the goal is to help your stakeholders assess the value, viability, and potential cost of Data Cloud. A targeted approach with adequate, yet constrained data sources will allow you to stay focused and go fast.

The best data sources for your PoC will demonstrate the business benefit and technical viability of Data Cloud. The table below illustrates common starting points.

Company Type | Data Sources (target three) | Data Source Objects (target three to five)
B2B | Multiple Salesforce orgs (primary org + two others) | Leads, Accounts, Contacts, and Opportunities
B2B | Multiple CRMs (HubSpot, Salesforce) from a recent acquisition | Company, Contact, and Opportunity (HubSpot); Account and Contact (Salesforce)
B2C | Multiple Salesforce orgs for different lines of business | Contacts or Person Accounts, Cases, Orders (e.g. if using Order Management)
B2C | Multiple Salesforce clouds or alternate technologies, e.g. Service, Commerce, Marketing | Contacts, Cases, Customers, and Orders

Consider Data Volume to Minimize Credit Consumption When Connecting Data Sources

Now that you understand what data you’ll be bringing into Data Cloud, it’s time to think about the size of that data. Customers with lots of data will need to be particularly mindful of the free Data Service Credits. It is easy to use up or go over your free credits if you’re not careful.

First, consider how to ingest data into Data Cloud. Don’t even consider streaming data until you’ve proven the business value and technical viability of Data Cloud. A fixed data set gives you a steady snapshot of your data while proving value and viability. Fixed data sets are also easier to manage and will save precious credits: streaming a data pipeline costs 3,000 more credits per million rows than batching it.

Next, look for ways to minimize the records you need to batch process. If your combined data source objects represent more than one million records, rerunning batch data processing will cost you at least 2000 Data Service Credits a pop.

If you have more than one million records, I recommend working with a subset of your data. At this stage, the primary goal is to showcase the full potential and value of Data Cloud. Attempting to bring in all data will delay progress and could exhaust your free credits.

The most impactful data to unify belongs to the customers you transact with most. Unifying high-tier customers will uncover the most missed sales opportunities. It will also prove the solution’s effectiveness across all of your data.

The approach to identifying high-tier customers varies between B2B and B2C companies.

  • For B2C, focus on the most active Contacts (e.g. unique email addresses) or segment data by region.
  • For B2B, target companies with the most Accounts or Opportunities. Ensure a comprehensive view by using a third-party global identifier to find hidden account relationships in your data.
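As a quick sketch of the B2B approach, ranking accounts by opportunity count can be done on a simple export before anything touches Data Cloud (and therefore consumes no credits). The record layout and names here are hypothetical, purely for illustration.

```python
from collections import Counter

# Hypothetical export of Opportunity records: (account_id, amount) pairs.
opportunities = [
    ("acct-001", 50_000), ("acct-002", 12_000), ("acct-001", 75_000),
    ("acct-003", 8_000), ("acct-001", 20_000), ("acct-002", 30_000),
]

def top_accounts(records, n=2):
    """Rank accounts by opportunity count to pick a high-tier PoC subset."""
    counts = Counter(account_id for account_id, _ in records)
    return [account_id for account_id, _ in counts.most_common(n)]

print(top_accounts(opportunities))  # → ['acct-001', 'acct-002']
```

The resulting account IDs define the subset of records worth ingesting first, keeping batch-processing runs well under the one-million-record threshold discussed above.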

Data profiling solutions, such as Cuneiform for Data Cloud, will accelerate your data assessment and minimize credit usage.

Cuneiform for CRM Data Profiling solution

Identify a Subset of Fields to Accelerate Your Data Cloud PoC

The fewer fields you have to incorporate in your design, the faster you can deliver your PoC results. Unfortunately, it is not always clear which fields matter, and the credit cost of trial and error is high.

In Salesforce orgs older than five years, objects commonly have 200-500 fields, and 15%-25% of custom fields may not be actively in use.

To determine what fields matter I use net fill-rate – the percentage of fields populated with more than one distinct value. My rule of thumb is any field with a 50% or greater net fill-rate is likely meaningful to your stakeholders. Use data profiling to understand field net fill-rate.
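Under one reasonable reading of that definition, a field’s net fill-rate is its fill rate, counted as zero when the field never varies (a constant or empty field carries no signal). A minimal sketch of that interpretation, with my own function name and threshold handling:

```python
def net_fill_rate(values):
    """Fill rate (% of rows populated) for a field, scored as 0.0 when the
    field has one or fewer distinct non-empty values (no analytical signal)."""
    non_empty = [v for v in values if v not in (None, "")]
    if len(set(non_empty)) <= 1:
        return 0.0  # constant or empty fields are likely not meaningful
    return 100.0 * len(non_empty) / len(values)

# A field filled on 4 of 5 records with varied values scores 80%,
# clearing the 50% rule of thumb.
print(net_fill_rate(["a", "b", None, "a", "c"]))  # → 80.0
```

Applying this across every field of a profiled object quickly separates the handful of fields worth mapping from the hundreds you can safely ignore.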

If you’ve ever tried profiling three to five objects with hundreds of fields each, you may be overwhelmed with the perceived effort and complexity of this data unification initiative. Yes, you can query the objects using DBeaver and capture the results in shared spreadsheets. This will likely take you a minimum of four weeks. Often external query tools also consume credits in an inefficient manner due to suboptimal queries and repeated processing.

Conversely, native Salesforce data profiling tools can achieve the same results in as little as one or two days. In addition to time savings, native data profiling solutions offer rich insights out of the box and save on credit usage too. The results are saved for use by authorized users within Data Cloud, meaning you can simply come back to access the insights and avoid credit waste.

Prevent Data Cloud Credit Waste with the Correct Design

Having selected representative data sources, records, and fields, it’s time to get down to business. Apply data profiling insights to prevent design errors that could compromise results and to showcase Data Cloud’s ability to overcome historical data quality challenges. 

This ensures a dual benefit: preventing credit waste from unnecessary data reprocessing and highlighting the solution’s capability to address data quality issues that have hindered legacy solution architectures.

It is time to:

  1. Design your Data Model Object (DMO) schemas.
  2. Make data governance decisions on what data standards will ensure data consistency in your Data Lake Objects (DLOs).
  3. Map your source data in DLOs to the target DMO schema. Make sure to resolve data differences.
  4. Identify bad data that could distort results and cast doubt on your Data Cloud PoC. Common examples include fake unique identifiers, such as dummy emails and phone numbers.
  5. Execute and verify your outputs through harmonization.
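The mapping and standardization steps above can be sketched as follows. This is not the Data Cloud mapping API, just an illustration of the logic, with hypothetical source names and one example data standard (lowercased emails) of the kind a governance decision in step 2 might set.

```python
# Illustrative field mappings from two source DLOs to one shared DMO schema.
DMO_MAPPING = {
    "hubspot_contact": {"email_address": "Email", "first": "FirstName"},
    "sf_contact": {"Email": "Email", "FirstName": "FirstName"},
}

def map_record(source: str, record: dict) -> dict:
    """Project a source record onto the DMO schema, applying data standards."""
    mapping = DMO_MAPPING[source]
    out = {dmo: record[src] for src, dmo in mapping.items() if src in record}
    if out.get("Email"):
        out["Email"] = out["Email"].strip().lower()  # standard: lowercase emails
    return out

print(map_record("hubspot_contact", {"email_address": "Ada@Example.COM", "first": "Ada"}))
# → {'Email': 'ada@example.com', 'FirstName': 'Ada'}
```

Resolving differences like email casing before harmonization is exactly what prevents the reprocessing runs that quietly burn through free credits.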

3. Demonstrate The Business Benefit of Data Cloud

Your final step is to deliver a functional demo people can relate to, and to quantify the benefits alongside implementation costs.

Now that you’ve reconciled data across different sources and schemas to identify lost sales opportunities, it’s time to build the business case.

When I was brought into customer conversations at Salesforce, I relied on a simple presentation flow:

  • Remind your stakeholders of the use case’s business objective.
  • Using one example customer, show the real, quantified scope of disconnected records before unification. Then highlight the insights into potential sales opportunities Data Cloud identified.
  • Use your Data Cloud PoC to show how end users will benefit from having unified insights at their fingertips.
  • Highlight the challenges you tackled, how long it took to complete the PoC, and what you expect a broader rollout to require.
  • Ask for additional investment. Show with confidence you can make this real for your organization (or if you are a consultant, your client). Show how your approach will scale.

Final Thoughts

For actionable tips on identifying impactful Data Cloud use cases, maximizing your free credits, and building a compelling business case for Data Cloud, check out our webinar replay, Unlock the Full Potential of Salesforce’s Free Data Cloud Licenses.

This blog was originally published by PeerNova.

The Author

Mehmet Orun

Mehmet is SVP of Product Management at PeerNova.
