Over the last year, OpenAI has emerged as the world’s leading AI service and, as part of that growth, is rapidly making its way into Salesforce. While Salesforce enables customers to access it via its Einstein GPT suite of products, Salesforce Admins and Developers have also started integrating directly with OpenAI’s APIs, and many AppExchange apps are now embedding OpenAI’s models into their offerings to deliver new AI-powered features.
But with such rapid implementation, it’s important for admins to understand the key concepts behind OpenAI – then they can evaluate the different options and leverage them to improve their users’ workflows.
Applying this knowledge to your organization offers a multitude of benefits, including:
- Empowering you to optimize business processes, making your day-to-day operations more efficient and effective.
- Equipping you with the ability to confidently explain how AI is being used securely within your organization, assuring people of its responsible use.
- Enabling you to continually evaluate AI’s utility in your Salesforce org, ensuring that your organization stays at the forefront of its implementation.
Today, we will go over three pivotal OpenAI concepts that Salesforce Admins need to understand.
Concept 1: Models
When we talk about AI models, we are referring to large language models (LLMs) created by organizations like OpenAI. OpenAI is the creator of multiple powerful AI models, each with its own unique capabilities. These are the models we’ll be focusing on.
- GPT-3.5 Turbo: Most people are familiar with this model, which is used in the base version of ChatGPT and is capable of understanding and generating natural language. This model is typically used for handling lower-level tasks. For example, it can interpret unstructured data from an incoming email and develop an appropriate response.
- GPT-3.5 Turbo 16k: Offers the same capabilities as the standard GPT-3.5 model but with four times the available context, making it capable of interpreting longer user prompts and developing extended responses. It’s worth noting that on December 11, 2023, OpenAI is bringing 16k tokens to the base GPT-3.5 Turbo model, meaning this model will likely become obsolete.
- GPT-4: A more advanced model that can not only understand and generate natural language, but also interpret and generate code. This advanced model does come at a comparatively higher cost than GPT-3.5 Turbo, so you’ll have to evaluate your use cases to decide if the extra capabilities are worth it.
- GPT-4 Turbo: Announced in early November, GPT-4 Turbo is a supercharged version of GPT-4. It not only extends the context window to 128k tokens (approximately a 300-page book) but also significantly boosts the model’s speed, with ongoing improvements in performance. Additionally, the knowledge cutoff date takes a huge leap from September 2021 to April 2023, meaning the model is able to utilize more up-to-date information, all at a more affordable price compared to the previous GPT-4 model. You can learn more about this new model from OpenAI’s DevDay blog post.
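In practice, the trade-offs above come down to which model name you pass in an API request. Below is a minimal sketch of that decision logic. The model names are OpenAI’s published identifiers as of late 2023 (`gpt-4-1106-preview` being GPT-4 Turbo), but the context limits, the cost ordering, and the `choose_model` helper itself are illustrative assumptions – verify current names and limits against OpenAI’s model documentation before relying on them.

```python
# Hypothetical helper: pick the cheapest suitable model for a task.
# Context-window sizes reflect OpenAI's published values as of late 2023.
MODELS = {
    "gpt-3.5-turbo":      {"context_tokens": 4_096,   "handles_code": False},
    "gpt-3.5-turbo-16k":  {"context_tokens": 16_384,  "handles_code": False},
    "gpt-4":              {"context_tokens": 8_192,   "handles_code": True},
    "gpt-4-1106-preview": {"context_tokens": 128_000, "handles_code": True},  # GPT-4 Turbo
}

def choose_model(needs_code: bool, estimated_prompt_tokens: int) -> str:
    """Return the first (roughly cheapest) model that fits the prompt and capability needs."""
    for name in ("gpt-3.5-turbo", "gpt-3.5-turbo-16k", "gpt-4", "gpt-4-1106-preview"):
        spec = MODELS[name]
        if needs_code and not spec["handles_code"]:
            continue
        # Leave ~25% of the context window as headroom for the model's response.
        if estimated_prompt_tokens < spec["context_tokens"] * 0.75:
            return name
    return "gpt-4-1106-preview"  # fall back to the largest context window

print(choose_model(needs_code=False, estimated_prompt_tokens=2_000))   # gpt-3.5-turbo
print(choose_model(needs_code=True,  estimated_prompt_tokens=30_000))  # gpt-4-1106-preview
```

The chosen name would then be passed as the `model` parameter of a Chat Completions request; everything else about the call stays the same, which is what makes it easy to re-evaluate this choice as OpenAI’s lineup and pricing change.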
To ensure these AI models are used securely, OpenAI has implemented various systems to ensure your data is safe. By default, OpenAI retains your conversational data for 30 days in order to identify potential abuse, after which it deletes it (unless otherwise required by law). However, if you want them to keep absolutely nothing, you are able to request full zero retention of data by contacting their sales team.
It is worth noting a key distinction: when you use ChatGPT in your browser, OpenAI’s models are trained on the data passed to them unless you opt out in the app settings. But when using OpenAI’s models via a Salesforce integration, you are accessing them through OpenAI’s API, and data sent via the API is never used for training, further guaranteeing the privacy and security of sensitive information.
These safeguarding measures ensure that organizations can harness the benefits of OpenAI while maintaining data confidentiality. You can learn more about how OpenAI uses data here.
Concept 2: Context
Context refers to the sets of information and parameters provided to the AI, which the AI then leverages to perform tasks and respond to user queries. Understanding the various forms of context is crucial to designing an effective AI for your Salesforce operation.
Let’s take a look at the different forms of context that play a fundamental role in guiding AI’s actions and responses within the Salesforce ecosystem.
- Static Context: Comprising basic, unchanging information about the user, their company, and the nature of their business. This static information provides a foundational understanding for the AI, allowing it to tailor its responses to the specific user and company requirements.
- Vector Databases: Includes organization-specific knowledge such as product documentation, sales handbooks, and company policies. This indexed knowledge is a dynamic source of context that keeps the AI up to date with the latest and most relevant data in your org so it can produce informed and relevant results. Services like Pinecone are built to manage vector embeddings and keep your context relevant while offering optimized storage and querying capabilities.
- Salesforce Database: Encompasses records related to Salesforce Objects like Accounts, Contacts, Cases, and Leads, enabling the AI to work with and retrieve data directly from your organization’s Salesforce setup.
- User Prompt: User prompts are tasks that the user asks the AI to perform. The complexity of these prompts will vary depending on the task at hand (we’ll discuss these in more detail later).
When optimizing AI’s capabilities, it’s essential for organizations to invest in developing robust context components. These context components work together to give the AI an interconnected understanding of the request and help it deliver the best possible output. By strengthening the contextual information provided to the AI, it becomes more proficient in delivering relevant, accurate, and personalized solutions.
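To make the interplay of these layers concrete, here is a sketch of how they might be assembled into the `messages` payload of a Chat Completions request. The `build_messages` helper, the record data, and the retrieved documents are all illustrative placeholders (not real Salesforce or OpenAI API output); the structural point is that static context, vector-search results, and permitted Salesforce records stack into the system message, while the user prompt stays separate.

```python
# Sketch: combine the four context layers into one chat payload.
def build_messages(static_context, retrieved_docs, salesforce_records, user_prompt):
    """Assemble static context, vector-search results, Salesforce records,
    and the user prompt into a Chat Completions `messages` list."""
    system_parts = [static_context]
    if retrieved_docs:  # e.g. top results from a vector-database similarity search
        system_parts.append("Relevant company knowledge:\n" + "\n".join(retrieved_docs))
    if salesforce_records:  # only records the admin has allowed the assistant to read
        system_parts.append("Salesforce records:\n" + "\n".join(
            f"- {obj}: {fields}" for obj, fields in salesforce_records.items()))
    return [
        {"role": "system", "content": "\n\n".join(system_parts)},
        {"role": "user", "content": user_prompt},
    ]

messages = build_messages(
    static_context="You are a support assistant for Acme Corp, a B2B SaaS vendor.",
    retrieved_docs=["Refund policy: refunds are honored within 30 days of purchase."],
    salesforce_records={"Case": {"Id": "500XX", "Status": "New"}},
    user_prompt="Draft a reply to the customer asking about a refund.",
)
```

Because each layer is added independently, an integration can swap in fresh vector-search results per request while the static context stays fixed – which is exactly why investing in each component separately pays off.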
Thankfully, this context does not come at the cost of security. AI integrations for Salesforce, like GPT Connect, allow admins to exercise precise control over what information AI Assistants can access from your Salesforce Database. This, combined with the measures implemented by OpenAI, ensures data security and privacy while allowing you to build a robust set of contexts for your AI.
Concept 3: User Prompts
Strong prompts are essential for maximizing the potential of AI. The effectiveness of AI-generated responses hinges on the clarity and specificity of the user prompts. These prompts guide the AI to generate tailored responses aligned with the user’s objectives. Although they can vary in complexity, well-constructed prompts lead to more informative and relevant outputs, ensuring users receive the desired level of detail.
In the context of a Salesforce and OpenAI integration, these can be broken down into two forms. More complex tasks have their prompts engineered through detailed templates built for repeated use, while simpler, one-off queries are written directly in the built-in conversation component. Each has its time and place, but regardless of which the situation calls for, there are a few consistent things to consider when developing prompts for your AI.
- Define clear objectives: Make sure the AI has a clear sense of what you want it to accomplish with your query.
- Use keywords to strengthen context: Incorporate relevant keywords and context in your prompts to direct the AI toward more specific and applicable information.
- Minimize ambiguity: Similar to using keywords, make sure you use specifics when developing your prompts. If a term can have multiple interpretations, provide more context to clarify your meaning.
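For the templated form of prompting, the three guidelines above can be baked directly into a reusable template. The sketch below is purely illustrative – the field names and the disambiguation line are assumptions, not part of any Salesforce or OpenAI API – but it shows how each guideline maps to a slot in the template.

```python
# Sketch: a reusable prompt template applying the three guidelines above.
TEMPLATE = (
    "Objective: {objective}\n"             # define a clear objective
    "Product line: {product}\n"            # keywords that strengthen context
    "Definitions: 'lead' means a Salesforce Lead record, "
    "not a sales pipeline stage.\n"        # minimize ambiguity up front
    "Task: {task}"
)

prompt = TEMPLATE.format(
    objective="Summarize this week's new leads for the sales team",
    product="Acme CRM Add-ons",
    task="List each lead's company, source, and suggested next step.",
)
print(prompt)
```

A conversational one-off query can follow the same three guidelines informally; the template simply makes them repeatable so every user gets consistently structured results.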
It is important to remember that creating strong prompts is an ongoing process and skill. As AI becomes increasingly incorporated into your org, circumstances will arise that require you to reevaluate your prompts. This will come as your understanding of the user context and AI capabilities grows, so don’t feel the need to be perfect right off the bat.
GPT Connect: Use Case
Let’s look at an example so we can put these concepts into practice. Check out the video below to see how GPT Connect, an OpenAI integration for Salesforce, utilizes models, context, and prompts to interpret unstructured data from an email and transform it into structured Salesforce objects.
Summary
As professionals at the forefront of technological innovation, Salesforce Admins should get an early grasp on the fundamentals of AI so they can make data-informed decisions for their org. With this knowledge, administrators have the potential to revolutionize their Salesforce operations and cut hundreds of hours of manual labor out of their team’s workflow.
Salesforce is currently in the process of rolling out its Einstein Generative AI features as part of the Winter ‘24 release. While features like AI-generated sales summaries are still in beta, Service Cloud features like automatically generated customer case emails and Einstein work summaries are now generally available through their Einstein GPT add-on.
Additionally, integrations like GPT Connect offer a compelling avenue for organizations looking to enhance their daily operations and fully utilize AI’s capabilities. GPT Connect leverages the power of OpenAI’s GPT AI models to design expert-level AI assistants for everyone in your Salesforce org. Users can chat with custom AI assistants to receive context-specific information, run automated flows, and generate content, all within a framework that prioritizes data security and privacy. It’s available now and offers a 14-day free trial.