Artificial Intelligence

AI Wars: How Salesforce’s Agnostic LLM Approach Works

By Lucy Mazalon

This has certainly been the year of ‘GPT’, kicked off by the unveiling of Einstein GPT, with the Salesforce AI Cloud roadmap including no fewer than 16 capabilities rolling out in fast succession. Use cases span Slack, sales, service, marketing, commerce, analytics, and app builders – Salesforce are ‘covering all bases’, bringing generative AI to an array of personas across the organization, helping the people you already support to improve their work lives. 

Throughout this time, Salesforce have told the story of how they pioneered AI in CRM. Back in 2015, Salesforce introduced Einstein, their AI technology ‘layer’ that can be plugged into different applications on the platform. Having also built a strong AI team (through five AI acquisitions and appointing a professor of AI as their Chief Data Scientist), they’ve since been building predictive AI models, including their own Salesforce LLM (large language model). Choosing to open the gateway to other providers is an interesting decision for everyone involved, and might get you thinking about how your team can use GenAI alongside your CRM… let’s take a closer look.

Salesforce’s LLM vs. Other LLMs

Anyone who has been keeping abreast of Salesforce’s GenAI announcements will have seen Salesforce LLM in action. For example, during the TrailblazerDX keynote, the Einstein GPT in Code Builder use case was able to generate working code. 

Everything announced in the demos of Slack GPT, Sales GPT, Service GPT, Marketing GPT, Commerce GPT, Tableau GPT, and Code Builder can be covered by Salesforce’s own LLM.

Salesforce’s LLM has outperformed expectations in testing and, furthermore, in customer pilots. By ‘outperforming’, we mean that the AI model produced highly accurate results when prompted (input) to produce an answer (output). So, if you were to stick with Salesforce’s LLM, and not explore AI further, you wouldn’t be missing out.

Other AI model providers include Vertex AI (Google), Amazon SageMaker, OpenAI (the makers of ChatGPT), Claude (Anthropic), and many more. These are models that can be trained to, when prompted, output optimal results for the organization that is leveraging them. 

In order to be trained effectively, LLMs require large amounts of data. Organizations with large repositories of data will use a data lake – some providers you may have heard of include Snowflake, Databricks, BigQuery (Google), and Redshift (Amazon). 

AI models + data lakes = data harmonization and GenAI for the business. 

Salesforce’s LLM leverages Data Cloud, or can be plugged into a different data lake. It’s a flexible, mix n’ match way of working with GenAI and your Salesforce data. 

Salesforce’s Data Cloud gives more out-of-the-box. When purchased, Data Cloud comes pre-wired to Salesforce objects (e.g. the product object, order object, etc.). While you still need to work to map the systems, and put measures in place to improve/maintain the data quality, the groundwork is done for you. This cuts down implementation time and therefore shortens time to value – plus, with more coming in Salesforce’s three releases each year, there’s a near-constant stream of new and improved capabilities.

Bring Your Own Model (Open Gateway)

Salesforce have built a platform that’s both open and extensible – in other words, you can integrate any other platform you need to bring in data from other sources and have it work alongside your CRM data. 

‘Bring your own model’ is Salesforce’s way to meet their customers where they’re at. With Data Cloud, Salesforce not only supports Einstein (their own LLM), but also other models like SageMaker, Vertex AI, etc. 

GenAI will be a force of change in the SaaS software marketplace – which, with the rapid adoption and excitement, we’ve already seen ‘waves’ of – and Salesforce believe they are at the forefront, both by providing their own model and by remaining open to letting you bring your own. 

So, it’s not a case of having to use one provider/model or the other – you can use them all in tandem. This could be the answer that organizations are looking for, to end the “AI war” that could be brewing within their machine learning (ML) teams. For example, you could have three ML specialists: one working with Salesforce Einstein, another working with SageMaker (Amazon), and a third preferring Vertex AI (Google). 

Einstein Studio is a technology that makes it easy for businesses to combine their company data with preferred AI models from those other predictive or generative AI services.
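As a rough illustration of the model-agnostic idea (the class and function names below are hypothetical sketches, not Einstein Studio’s actual API), each provider can sit behind a common interface so the business task stays the same regardless of which model runs underneath:

```python
from abc import ABC, abstractmethod

class LLMProvider(ABC):
    """Common interface so the business logic doesn't care which model runs."""

    @abstractmethod
    def generate(self, prompt: str) -> str: ...

class EinsteinProvider(LLMProvider):
    def generate(self, prompt: str) -> str:
        # Placeholder: in reality this would call Salesforce's hosted LLM.
        return f"[einstein] {prompt}"

class SageMakerProvider(LLMProvider):
    def generate(self, prompt: str) -> str:
        # Placeholder: in reality this would invoke an Amazon SageMaker endpoint.
        return f"[sagemaker] {prompt}"

def summarize_case(provider: LLMProvider, case_notes: str) -> str:
    """Same business task, any provider - the 'bring your own model' idea."""
    return provider.generate(f"Summarize this service case: {case_notes}")

# Swapping providers is a one-line change:
print(summarize_case(EinsteinProvider(), "Customer reports login errors"))
print(summarize_case(SageMakerProvider(), "Customer reports login errors"))
```

This is the design choice an agnostic platform enables: your three ML specialists can each plug in their preferred provider without rewriting the workflow around it.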

READ MORE: Salesforce Announces Einstein Studio: Build and Deploy Your Own AI Models

By providing their own LLM, embedded within Einstein GPT, Salesforce lower the barrier to entry for professionals working with GenAI – and simultaneously, by supporting any provider, there’s no friction for specialists wanting to use their preferred one.

What’s also worth noting is where Salesforce Ventures are placing their bets – investing in GenAI technology organizations – demonstrated by their AI sub-fund. The list so far includes market leaders Cohere and Anthropic; these could be the technologies you’ll be working with in the future.

READ MORE: Salesforce Ventures AI Fund: Where Are Salesforce Placing Their Bets?

Large Language Model Limitations (and Opportunities)

As we mentioned, if you use Salesforce’s LLM, you will be able to leverage all Einstein GPT functionality that we’ve seen revealed during Salesforce’s keynote presentations so far in 2023.

Let’s acknowledge that no LLM is 100% accurate when generating outputs, and most of the time there are guardrails in place. As we’ve seen, Salesforce are big proponents of the intentional friction approach – in other words, no outputs from generative AI are automatically applied into users’ workflows; a ‘human in the loop’ is required at every stage. 
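As a minimal sketch of what ‘intentional friction’ could look like in code (the names here are hypothetical, not Salesforce’s implementation), an AI-generated draft is staged for explicit human approval rather than being applied automatically:

```python
from dataclasses import dataclass

@dataclass
class DraftReply:
    """An AI-generated draft that a human must approve before it is used."""
    text: str
    approved: bool = False

def generate_draft(prompt: str) -> DraftReply:
    # Placeholder for a real LLM call - the output is staged, never auto-sent.
    return DraftReply(text=f"Suggested reply for: {prompt}")

def send_reply(draft: DraftReply) -> str:
    # The guardrail: refuse to act on any draft a human hasn't approved.
    if not draft.approved:
        raise PermissionError("Human-in-the-loop: draft must be approved first")
    return f"SENT: {draft.text}"

draft = generate_draft("Customer asks about refund policy")
# A user reviews (and possibly edits) the draft, then explicitly approves it:
draft.approved = True
print(send_reply(draft))
```

The key point is that the approval step is enforced in code, not left as a convention – the system cannot act on unreviewed output.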

Now that Salesforce have observed what’s possible with their LLM technology and that of the wider market, they have many more use cases for GenAI in their development pipeline. This will include ways that Salesforce professionals building on the Salesforce platform can enhance their workflows, beyond the Code Builder example showcased when Einstein GPT was announced.

And let’s not forget that all successful technology implementations involve the prime combination of process, technology, and people. Success here comes not only from giving specialists the freedom to use the models they prefer, but also from providing upskilling resources to help them make GenAI work for their organization. There’s a big opportunity for Salesforce professionals to advance their careers in the AI space, which will be interesting to watch unfold.

READ MORE: Salesforce and AI – Your Long Term Career Path



Even with their own LLM, Salesforce recognise the power that lies in an open platform – a gateway for professionals to use the model providers of their choice. Your team’s experiences play a part in guiding how your organization uses GenAI alongside your CRM. The agnostic approach of ‘bring your own model’ and ‘mix n’ match’ data lakes is ending the so-called “AI war”. 

Of course, limitations still exist with any LLM (none are 100% accurate) – but there will be plenty of opportunities to take advantage of as time goes on. 

The Author

Lucy Mazalon

Lucy is the Operations Director at Salesforce Ben. She is a 10x certified Marketing Champion and founder of The DRIP.
