Data Cloud is the fastest growing organically built product in Salesforce’s history (i.e. Salesforce built it themselves, not via acquisitions). Data Cloud could be described as the ‘Holy Grail of CRM’, meaning that the data problem that’s existed since the infancy of CRM is now finally solvable.
A Salesforce study revealed that the average company has 928 systems – so a big company has thousands, and a small company likely has hundreds. As soon as you have more than one system, identity resolution becomes a challenge.
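To make the identity resolution challenge concrete, here's a minimal, hypothetical sketch in Python – the field names, matching rule, and survivorship logic are illustrative only, not how Data Cloud works internally. Real identity resolution also uses fuzzy name/address matching and match confidence scoring.

```python
# Minimal, illustrative identity resolution: match customer records
# from two hypothetical systems on a normalized email address.

def normalize_email(email: str) -> str:
    """Lowercase and strip whitespace so 'Ada@Example.com ' matches 'ada@example.com'."""
    return email.strip().lower()

def resolve_identities(crm_records, billing_records):
    """Return one unified profile per email seen in either system."""
    profiles = {}
    for source, records in (("crm", crm_records), ("billing", billing_records)):
        for record in records:
            key = normalize_email(record["email"])
            profile = profiles.setdefault(key, {"email": key, "sources": []})
            profile["sources"].append(source)
            # Simple survivorship rule: first non-empty name wins.
            if record.get("name") and "name" not in profile:
                profile["name"] = record["name"]
    return profiles

crm = [{"email": "Ada@Example.com", "name": "Ada Lovelace"}]
billing = [{"email": "ada@example.com ", "name": ""}]
unified = resolve_identities(crm, billing)
print(unified["ada@example.com"]["sources"])  # ['crm', 'billing']
```

Even in this toy version, both systems' records collapse into a single profile only because a normalization rule was agreed on first – multiply that by hundreds of systems and the scale of the problem becomes clear.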
Salesforce has expanded into AI-powered CRM, the focus being on combining AI and data. Without data, AI cannot function to its full potential.
Data Cloud is the foundation that speeds up the connectivity between different ‘clouds’ across the platform. However, Data Cloud is also a product that can be purchased. While not all Salesforce customers have licensed Data Cloud, being at the foundation means they are still taking advantage of Data Cloud to a degree – but this all becomes even stronger with Data Cloud as a personalization and data unification platform.
So, what’s the journey this somewhat elusive Salesforce product/infrastructure has been on? We’ll take a look back in time to understand where we’ve come from, why Data Cloud stands strong in a crowded market, and finally, take a glimpse into the future.
History of Data Cloud
Salesforce has gone through several iterations with naming its CDP product: Customer 360 Audiences → Salesforce CDP → Marketing Cloud Customer Data Platform → Salesforce Genie → Salesforce Data Cloud.
In some instances, changes were made because the name just didn’t stick – but, more importantly, some of the name changes indicated significant developments in the product itself.
- Customer 360 Audiences: Salesforce’s initial CDP offering, launched in 2020.
- Salesforce CDP: The name changed in 2021 to align with how the blooming CDP market was referring to this technology.
- Marketing Cloud Customer Data Platform: In 2022, Salesforce CDP received a new name, as part of the simplification in how Salesforce named their marketing products.
- Salesforce Genie: At Dreamforce the same year, Genie was born. This signified a shift in the use cases (broadening beyond marketing, to sales, service and more), and the zero-copy architecture.
- Data Cloud: In 2023, the name ‘Genie’ was dropped (but not the cute mascot), and Data Cloud has proven itself as a wise investment, partly responsible for powering Salesforce’s GenAI innovation.
Preliminary work lasted about four years, and the CDP offering took a year or two to build before coming to market.
To round off this trip back in time, let’s reiterate that some of the name changes were to indicate the significant developments that happened to the product. Salesforce Genie/Data Cloud signaled a next-level CDP, taking it from a purely marketing tool (marketers, traditionally, were the ones tasked with deduping and matching customer profiles) to catering for sales, service, and other use cases across the entire Salesforce platform. Luckily, with Data Cloud, Salesforce had built a general-purpose data lake, which can be leveraged by any team.
How Data Harmonization Was Achieved in the Past
When purchased, Data Cloud comes pre-wired to Salesforce objects (e.g. the Product object, Order object, etc.). While you still need to map the systems and put measures in place to improve and maintain data quality, the groundwork is done for you.
Compare this to how setting up a CDP used to be done (or how you would do it yourself today) – you would have to:
- Acquire a cloud data warehouse.
- Build what’s called the ‘star schema’, a multi-dimensional data model used to organize data in a database.
- Connect the schema to all of the objects in the data models of systems you want to map (in the case of Salesforce, there can be potentially hundreds in use).
- Set up an enterprise service bus (ESB).
- Write ETL jobs to extract data, transform it, and load it (products like Informatica supported this).
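As a rough illustration of the star schema and ETL steps above (the table names, columns, and source rows are hypothetical, and a real build would use a cloud data warehouse rather than SQLite), a star schema centers a fact table on surrounding dimension tables, and an ETL job extracts raw rows, transforms them, and loads them in:

```python
import sqlite3

# Hypothetical star schema: one fact table (orders) surrounded by
# dimension tables (customer, product). Names are illustrative only.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE dim_customer (customer_id INTEGER PRIMARY KEY, email TEXT);
    CREATE TABLE dim_product  (product_id  INTEGER PRIMARY KEY, name  TEXT);
    CREATE TABLE fact_order (
        order_id    INTEGER PRIMARY KEY,
        customer_id INTEGER REFERENCES dim_customer(customer_id),
        product_id  INTEGER REFERENCES dim_product(product_id),
        amount      REAL
    );
""")

# A toy 'ETL job': extract raw source rows, transform (cast the
# amount to a number), and load into the schema.
raw_rows = [("1", "ada@example.com", "Widget", "19.99")]
for order_id, email, product, amount in raw_rows:
    conn.execute("INSERT OR IGNORE INTO dim_customer VALUES (?, ?)", (1, email))
    conn.execute("INSERT OR IGNORE INTO dim_product  VALUES (?, ?)", (1, product))
    conn.execute("INSERT INTO fact_order VALUES (?, ?, ?, ?)",
                 (int(order_id), 1, 1, float(amount)))

total = conn.execute("SELECT SUM(amount) FROM fact_order").fetchone()[0]
print(total)  # 19.99
```

Scaling this hand-built approach to hundreds of source systems – each with its own schema, keys, and transformation rules – is exactly the multi-year effort that a pre-wired CDP is meant to spare you.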
Performing data harmonization with Data Cloud is immensely easier – a fraction of the effort that used to be required.
In Data Cloud, clicks can be used to map the data points between Salesforce and other systems – whether you’re harmonizing data between Google Cloud, SAP, a payment system, etc. While you can still connect Salesforce to other data lakes (credit to Salesforce’s agnostic approach), Data Cloud has a lower barrier to entry in terms of effort required to set up.
Take a look at the Data Cloud technical capability map, and you will see a wealth of technology brought together:
Salesforce Data Cloud Differentiators
Data Cloud, in itself, is impressive. While many organizations would consider it expensive, flip the argument on its head: by buying your own data warehouse, building the star schema, and paying for ongoing compute and storage, you’d be looking to spend 5 to 10 times more than what Salesforce charges for Data Cloud. Plus, data harmonization works best when your CRM data is front and center.
There are other key differentiators that help Data Cloud stand out from the crowd:
- Pre-wired to Salesforce objects: While you still need to work to map the systems, and put measures in place to improve/maintain the data quality, the groundwork is done for you.
- Industry-specific data models: Through Salesforce Industries, organizations can use data models and processes designed for their industry needs – as a result, Salesforce are ready to deliver Data Cloud capabilities to a dozen industries. This includes catering to: additional objects in the Data Model, regulatory compliance, and the range of different applications within the verticals.
- Prompt engineering: With harmonized data across your Salesforce database (thanks to Data Cloud), you can leverage your organization’s data for generative AI, and users can query the data using prompts across a variety of use cases. Plus, Salesforce have wowed us (once again) with Prompt Studio, which allows admins to create templates for user prompts, transforming prompts from sentences to buttons. This improves the consistency of the outputs users can expect by reducing human variation and, by showing a toxicity rating, it flags potentially harmful or unsuitable outputs.
- Einstein Trust Layer: This ‘trust boundary’ aims to resolve concerns over adopting generative AI, including where data is retained when it’s sent to an LLM (large language model). Key features include Zero Data Retention and the Feedback Store.
Future of Data Cloud
Data Cloud has come a long way over the past two years, and by featuring in all of Salesforce’s major releases (three each year), there is a constant stream of innovation.
It’s no secret that the C-suite are talking a lot about generative AI, and how to implement it into their organizations’ workflows.
Data Cloud for Industries
As we mentioned, through Salesforce Industries, organizations can use data models and processes designed for their industry needs. Data Cloud for Industries is being worked on, and will be released via the typical pilot, beta, general availability sequence. The first will be Data Cloud for Health Cloud (the patient experience), with many more to be announced.
Closing the Skills Gap
As with all technology, it’s not about the tool but knowing how to use it – this is where most companies fall short when implementing a CDP. Just like those working with the Salesforce platform will be well-placed to support GenAI implementations in their enterprise, understanding your organization’s data – its structure, how it’s mapped to other systems, ingested, and transformed – will put you in a prime position to lead the way with Data Cloud in your organization.
However, no one person can ‘fly solo’. A Data Cloud implementation, and its ongoing maintenance, requires a number of skills and perspectives. Your team will need people with:
- Developer skills: Needed during implementation and when updating the events and data coming into Data Cloud.
- Data management skills: As each organization has its own data model and architecture, Data Cloud will need someone to configure the platform accordingly.
- Business analyst skills: To identify and solve business challenges, otherwise you risk ending up with an expensive, siloed system that nobody really understands.
Then, the skill of prompt engineering will also be key – the art of writing prompts to get the best possible answer. As prompts are natural language queries (i.e. a user typing as they would in conversation), how a prompt is written varies greatly from one person to the next.
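To illustrate why templates reduce that person-to-person variation, here is a generic sketch – this is not the actual Prompt Studio API, and the template text, merge fields, and function names are invented for illustration:

```python
# Generic prompt-template sketch (not the actual Prompt Studio API):
# an admin-defined template fixes the prompt's wording, so only the
# merge-field values vary from user to user, not the phrasing.
from string import Template

SUMMARIZE_CASE = Template(
    "Summarize the support case below in three bullet points for a "
    "$audience audience.\n\nCase subject: $subject\nCase notes: $notes"
)

def build_prompt(audience: str, subject: str, notes: str) -> str:
    """Fill the admin-defined template; users pick values, not wording."""
    return SUMMARIZE_CASE.substitute(
        audience=audience, subject=subject, notes=notes
    )

prompt = build_prompt("sales", "Login failure", "Customer cannot reset password.")
print(prompt.splitlines()[0])
```

Every user who clicks the same ‘button’ sends an identically worded prompt, which is what makes the outputs predictable enough to build workflows on.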
As Salesforce have advocated: Data Cloud + CRM + AI = next-gen customer relationship management. The underlying data needs to be sound in order to effectively surface the best outputs (Data Cloud). However, anticipating what users will ask GenAI tools, like Einstein GPT, relies on an understanding of CRM data (reflecting back to Prompt Studio).
Plus, with Prompt Studio’s Data Access Checks feature, which restricts outputs to only the data allowed by the user’s permissions, understanding your org’s data access policies enables you to partition data while setting clear expectations for the outputs users will receive.
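As a generic sketch of that idea (not the actual Data Access Checks implementation – the records, regions, and user names are hypothetical), permission-aware access means filtering records to what the requesting user may see before they ever reach a prompt or output:

```python
# Generic sketch of permission-aware data access (not the actual
# Data Access Checks implementation): filter records to what the
# requesting user is allowed to see before they reach the prompt.

RECORDS = [
    {"id": 1, "region": "EMEA", "revenue": 100},
    {"id": 2, "region": "AMER", "revenue": 250},
]

# Hypothetical users and the regions each may access.
PERMISSIONS = {"jo": {"EMEA"}, "sam": {"EMEA", "AMER"}}

def accessible_records(user: str):
    """Return only the records whose region the user is allowed to see."""
    allowed = PERMISSIONS.get(user, set())
    return [r for r in RECORDS if r["region"] in allowed]

print(len(accessible_records("jo")))   # 1
print(len(accessible_records("sam")))  # 2
```

Two users asking the identical question can legitimately get different answers, so knowing your org’s access policies is part of setting output expectations.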
While these skills (and combination of skills) may seem futuristic currently, perhaps in the future, they will become typical once professionals have mastered the technology and all the nuances of implementing it across the organization for multiple departments.
Data Cloud could be described as the ‘Holy Grail of CRM’, meaning that the data problem that’s existed since the infancy of CRM is now finally solvable. But the journey this somewhat elusive product/infrastructure has been on is worth retracing.
Data Cloud is a great investment, but only if your organization is in a good position to demonstrate return on investment (ROI). With solid use cases in mind, you can be confident in the timelines for reaching your goals, and reap the rewards of unified customer profiles. If you want to know whether this could be the solution for you, check out this article: