Integrating OpenAI’s ChatGPT With the Salesforce AppExchange

By Jakub Stefaniak

As a passionate ChatGPT enthusiast, I’ve always been intrigued by the potential of harnessing its power to transform existing applications. One day, I had an epiphany: why not take a deep dive into the realm of Salesforce AppExchange and breathe new life into my old app, Formula Debugger?

Join me as I recount my thrilling adventure of integrating ChatGPT into a Salesforce AppExchange product, and witness how the fusion of human ingenuity and artificial intelligence can elevate an application to new heights.

Formula Debugger and AI

Formula Debugger began its journey as a straightforward tool for testing and debugging formula fields in Salesforce. However, my vision was to infuse it with the cutting-edge capabilities of OpenAI’s ChatGPT, creating a more comprehensive and insightful solution for analyzing and optimizing formula fields.

To add an extra layer of excitement, I challenged myself to generate as much new code as possible using ChatGPT. This creative endeavor would not only test the limits of the AI’s abilities but also demonstrate how it could aid developers in building more advanced and intelligent applications.

Setting Up OpenAI API Integration

In order to integrate OpenAI’s ChatGPT API into your AppExchange application, you’ll need to set up Remote Site Settings and Custom Metadata. These configurations are essential to enable a secure connection between Salesforce and OpenAI’s API, as well as to store and manage the required settings for the integration.

Remote Site Settings

Remote Site Settings are necessary to allow Salesforce to make outbound calls to the OpenAI API. For this integration, we’ve created a Remote Site Setting with the endpoint https://api.openai.com. Without it, the platform blocks the callout with an “Unauthorized endpoint” error, so this setting is what authorizes Salesforce to communicate with OpenAI’s API.
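If you prefer to ship the setting with your package metadata instead of clicking through Setup, the definition looks roughly like this (the file name and exact label are assumptions, not taken from the actual package):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- OpenAI_API.remoteSite-meta.xml -->
<RemoteSiteSetting xmlns="http://soap.sforce.com/2006/04/metadata">
    <description>Allows callouts to the OpenAI API</description>
    <disableProtocolSecurity>false</disableProtocolSecurity>
    <isActive>true</isActive>
    <url>https://api.openai.com</url>
</RemoteSiteSetting>
```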

Custom Metadata

For this scenario, we’ve also created Custom Metadata called “ChatGPTIntegration” with several fields:

  • Endpoint: Stores the OpenAI API endpoint, which is https://api.openai.com/v1/chat/completions.
  • API_Key: Stores the API key required for authenticating with the OpenAI API.
  • Max_Tokens: Stores an integer value that represents the maximum number of tokens allowed in the API response.
  • Context: A string used to provide the context for the OpenAI API call.
  • Command: A string used to represent the command or query that you want ChatGPT to understand and respond to.

The Context and Command fields are particularly important: together with the user-selected formula, they form the prompt sent to OpenAI, and they largely determine how meaningful the responses returned to your Formula Debugger will be.
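Assuming the Custom Metadata object’s API name is ChatGPTIntegration__mdt with a single record named Default (both names are illustrative, derived from the field list above), the settings can be read in Apex like this:

```apex
// Fetch the integration settings from Custom Metadata.
// Object, record, and field API names are assumptions based on the fields above.
ChatGPTIntegration__mdt settings = ChatGPTIntegration__mdt.getInstance('Default');

String endpoint       = settings.Endpoint__c;   // https://api.openai.com/v1/chat/completions
String apiKey         = settings.API_Key__c;
Integer maxTokens     = settings.Max_Tokens__c.intValue();
String promptContext  = settings.Context__c;
String promptCommand  = settings.Command__c;
```

Using getInstance() avoids a SOQL query and does not count against query limits, which is one of the advantages of Custom Metadata over custom settings stored in data.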

Obtaining an API Key from OpenAI

To get an API key from OpenAI, follow these steps:

  1. Sign up for an OpenAI account or log in to your existing account.
  2. Visit the API Keys section of your account dashboard.
  3. Generate a new API key or use an existing one.

Once you have the API key, add it to the API_Key field in the Custom Metadata configuration.

Apex Classes for OpenAI Integration

In order to integrate OpenAI’s ChatGPT API into the Formula Debugger AppExchange application, we’ve created three Apex classes: FormulaInsightsController, FormulaInsightsMock, and FormulaInsightsControllerTest. Each of these classes plays a crucial role in handling API calls, mocking API responses, and testing the integration.

FormulaInsightsController

FormulaInsightsController is responsible for making API calls to the OpenAI API and processing the response. This class fetches the necessary data from Custom Metadata, such as the endpoint, API key, context, and command, before sending requests to the ChatGPT API. Upon receiving a response, the controller processes it and returns meaningful insights to the Formula Debugger application.
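A minimal sketch of what such a controller can look like follows. This is not the packaged code; the record name, model name, and response parsing are assumptions based on the description above and the shape of OpenAI’s chat completions payload:

```apex
public with sharing class FormulaInsightsController {

    // cacheable=true so the LWC can call this method via the wire service.
    @AuraEnabled(cacheable=true)
    public static String getInsights(String formula) {
        ChatGPTIntegration__mdt settings = ChatGPTIntegration__mdt.getInstance('Default');

        HttpRequest req = new HttpRequest();
        req.setEndpoint(settings.Endpoint__c);
        req.setMethod('POST');
        req.setHeader('Content-Type', 'application/json');
        req.setHeader('Authorization', 'Bearer ' + settings.API_Key__c);
        req.setTimeout(120000);

        // Build a single user message: context + selected formula + command.
        String content = settings.Context__c + '\n' + formula + '\n' + settings.Command__c;
        Map<String, Object> body = new Map<String, Object>{
            'model'      => 'gpt-3.5-turbo',
            'max_tokens' => settings.Max_Tokens__c.intValue(),
            'messages'   => new List<Object>{
                new Map<String, Object>{ 'role' => 'user', 'content' => content }
            }
        };
        req.setBody(JSON.serialize(body));

        HttpResponse res = new Http().send(req);

        // Extract choices[0].message.content from the chat completions response.
        Map<String, Object> parsed  = (Map<String, Object>) JSON.deserializeUntyped(res.getBody());
        List<Object> choices        = (List<Object>) parsed.get('choices');
        Map<String, Object> first   = (Map<String, Object>) choices[0];
        Map<String, Object> message = (Map<String, Object>) first.get('message');
        return (String) message.get('content');
    }
}
```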

FormulaInsightsMock

FormulaInsightsMock is an Apex class that implements the HttpCalloutMock interface. This class simulates API responses during test executions, ensuring that we can run tests without making actual API calls to OpenAI. ChatGPT generated the entire content of this class, which showcases its capability to create relevant mock responses for testing purposes.
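A mock along these lines is what the HttpCalloutMock interface requires; the body below is an illustrative stand-in, not the exact response ChatGPT generated:

```apex
@isTest
public class FormulaInsightsMock implements HttpCalloutMock {
    public HttpResponse respond(HttpRequest req) {
        HttpResponse res = new HttpResponse();
        res.setStatusCode(200);
        res.setHeader('Content-Type', 'application/json');
        // A minimal chat completions payload; the insight text is illustrative.
        res.setBody('{"choices":[{"message":{"role":"assistant",' +
            '"content":"This formula concatenates the salutation and name."}}]}');
        return res;
    }
}
```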

FormulaInsightsControllerTest

FormulaInsightsControllerTest is an Apex test class designed to test the functionality of the FormulaInsightsController. It uses the FormulaInsightsMock class to simulate API responses and test various scenarios to ensure that the controller is working as expected. ChatGPT generated this test class in its entirety, demonstrating its ability to generate comprehensive test cases that cover various scenarios.
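The core of such a test looks roughly like this (the assertion is illustrative; the sample formula is an assumption):

```apex
@isTest
private class FormulaInsightsControllerTest {
    @isTest
    static void testGetInsights() {
        // Route all callouts in this test to the mock instead of OpenAI.
        Test.setMock(HttpCalloutMock.class, new FormulaInsightsMock());

        Test.startTest();
        String insights = FormulaInsightsController.getInsights('FirstName & " " & LastName');
        Test.stopTest();

        System.assertNotEquals(null, insights,
            'Expected a non-null insight from the mocked response');
    }
}
```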

While ChatGPT generated the FormulaInsightsMock and FormulaInsightsControllerTest classes in their entirety, FormulaInsightsController required some minor manual adjustments. The Formula Debugger application already uses elements like FMA Feature Flags, which let me enable or disable the OpenAI integration for specific subscriber orgs as needed. (Feature flags are one of the cool technologies available to ISVs and a general best practice for rolling out new features, but they’re not mandatory for this integration.) It was more efficient to manually align the generated code with the existing codebase than to teach ChatGPT the entire context required to support these features.

Lightning Web Component Integration with OpenAI

To display the insights provided by the OpenAI ChatGPT API on the frontend, a new Lightning Web Component (LWC) called formulaInsights was created. This LWC integrates with the FormulaInsightsController Apex class and handles the presentation of the insights within the Formula Debugger application’s main screen.

Implementing getInsights

The formulaInsights LWC utilizes the getInsights method from the FormulaInsightsController Apex class to fetch the insights for a given formula. Once the insights are retrieved, the LWC processes the response and displays it in a user-friendly format on the frontend.

To achieve this, the LWC leverages the wire service to call the getInsights method reactively, passing the necessary parameters, such as the formula content. (For the wire service to work, the Apex method must be annotated with @AuraEnabled(cacheable=true).) When new insights are provisioned, the LWC updates the component’s state to reflect the latest information.

Displaying Insights on the Formula Debugger Main Screen

The formulaInsights LWC is designed to seamlessly integrate with the existing Formula Debugger application’s main screen. When a user selects a formula, the LWC fetches insights from the OpenAI API and presents them in an intuitive and visually appealing format alongside the formula. This allows users to better understand the formula they are working with and receive valuable insights from OpenAI directly within the application.

Getting Meaningful Responses from OpenAI

To extract the most valuable AI insights for our application using the OpenAI API, it’s essential to understand the roles of both Context and Command in our Custom Metadata.

Leveraging Context and Command

When sending a single API call, we create the message content by combining three elements: the context, the user-selected formula, and the command specifying our request for OpenAI. The FormulaInsightsController is responsible for preparing this content.
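The assembly can be sketched like this (variable names are illustrative):

```apex
// Combine context, the user-selected formula, and the command into one prompt.
String content = settings.Context__c + '\n' + selectedFormula + '\n' + settings.Command__c;
```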

For example, the full content message might look like this:

You are a Salesforce expert and Certified Advanced Administrator. Explain what this Salesforce formula field is doing, step by step, and how it works:

IF(ISBLANK(TEXT(Salutation)), "", TEXT(Salutation) & " ") & FirstName & " " & LastName

Describe what can be an issue with this formula field and how to fix it.

By keeping the context and command separately in custom metadata, ISVs have the opportunity to improve them over time. This allows for the delivery of enhancements via push upgrades when better prompts are discovered.

Fine-Tuning Context and Command

To obtain the best AI insights for your application, it’s crucial to carefully craft the context and command. The context should provide enough background information to enable the AI to understand the scenario, while the command should be clear and concise, specifying the exact task you want the AI to perform.

By refining your context and command over time, you’ll be able to continually improve the quality of the AI-generated insights, thus enhancing the user experience within your application.

In conclusion, understanding the roles of context and command in Custom Metadata is essential for getting the most meaningful responses from OpenAI. Continually refining these elements will ensure that your application remains innovative and provides valuable insights to users.

Summary

Integrating OpenAI’s ChatGPT with my AppExchange application, Formula Debugger, proved to be a smooth and efficient process. The majority of the required code was generated by ChatGPT itself, enabling me to release an updated version of the Formula Debugger app within just a few hours. The latest version is now available for free on AppExchange for users to enjoy.

It’s important to note that, depending on the information stored within your application, using OpenAI’s public endpoint may not be suitable for security reasons. There are alternative options available, such as hosting your private ChatGPT instance. Deciding on the best approach requires a thorough analysis of your specific requirements.

The Author

Jakub Stefaniak

"Jakub is the best Salesforce Expert I know" - Jakub's Mom ☁️ CTA ☁️ Director of Engineering ☁️ Dreamforce & TrailblazerDX Speaker
