Here is a draft confirmation that aims to sound more natural:
Dear {{lead.First Name}},
I hope you're having a wonderful day so far. I'm writing to confirm your reservation at our hotel from {{lead.checkin date}} to {{lead.checkout date}}. We're so pleased you chose us for your {{lead.travel reason}}.
Your {{lead.Room Name}} room will be ready upon your arrival. Please let me know if there's anything at all I can do to make your stay more comfortable and enjoyable. We want you to relax and feel right at home.
We have many amenities available for guests like our indoor pool, fitness center and complimentary breakfast. I hope you
WHERE ARE THE SPEAKERS FROM?
For this month's VMUG we are going international!
We have a true mix of cultures and insights into how Marketo/Martech is being leveraged globally... and, not to mention, a ton of air miles racked up between our speakers.
See if you can guess where they are from, or live now, from the clues.
John Grundy - John originally comes from a small town 'down under' whose population peaked at 630 people in 1911. We don't expect you to guess the town, but the state is tasked with the paramount job of managing and protecting the Great Barrier Reef. He now lives in the "Venice of the North"; while visiting, we recommend biking to get some stroopwafels or bitterballen.
Joshua Arrington - Originally from a maritime city that is home to one of the most famous marathons, many Hollywood stars (including an Avenger) and a historic tea party, Josh now lives in "la capital del Turia", a cultural city known for gorgeous food (it's the birthplace of paella), nightlife and architecture, and is in the same country as a band of robbers who executed the 'greatest heist in history'.
Courtny Edwards-Jones - She traded in her cowgirl boots over a decade ago but
still loves visiting her family in a city known for food, festivals and celebrating all
things "weird" ... she then moved almost 5,000 miles across the pond to settle in the
birthplace of industrialisation, the first computer and the Suffragettes.
For those who are not aware, ChatGPT is an AI model capable of human-like conversations. There are some terms that may be used interchangeably but are actually different, so just for reference:
OpenAI is the organization that created ChatGPT.
ChatGPT is the web interface where you can chat with the AI models.
GPT models are the specific type of AI model ChatGPT uses to generate its responses.
You may also see references to specific GPT models, such as GPT-4 or 3.5-Turbo; these are just specific versions of the model.
Next, Self-Service Flow Steps is a feature released for Marketo last year that lets us extend the platform by creating our own flow actions to use in smart campaigns. These actions let us send data from Marketo to an external service and receive data back. In this case we are using them to connect to OpenAI's API (ChatGPT), send a prompt, and save the response on the person record. If you have questions about Self-Service Flow Steps, I'm happy to answer them at the end of the presentation. We also have another webinar coming up where we will specifically cover the topic, and we'll make sure the information is in the follow-up email for today's webinar.
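Under the hood, a flow step like this is essentially one HTTP round trip. Here is a minimal sketch of that idea, assuming OpenAI's public chat-completions endpoint; the function names and structure are our own illustration, not the flow step's actual internals.

```python
# Hypothetical sketch of what a "send prompt" flow step does behind the
# scenes: POST a chat-completion request to OpenAI and return the text
# to be written back onto the person record.
import json
import urllib.request

OPENAI_URL = "https://api.openai.com/v1/chat/completions"

def build_request(prompt: str, model: str = "gpt-3.5-turbo",
                  temperature: float = 0.0) -> dict:
    """Assemble the JSON payload the flow step would send to OpenAI."""
    return {
        "model": model,
        "temperature": temperature,
        "messages": [{"role": "user", "content": prompt}],
    }

def send_prompt(prompt: str, api_key: str) -> str:
    """Send the prompt and return the model's reply (network call)."""
    payload = json.dumps(build_request(prompt)).encode()
    req = urllib.request.Request(
        OPENAI_URL,
        data=payload,
        headers={"Authorization": f"Bearer {api_key}",
                 "Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    # The reply text lives at choices[0].message.content in the response.
    return body["choices"][0]["message"]["content"]
```

The flow step hides all of this behind its UI; the sketch is only meant to show where the model, temperature, and prompt settings end up in the request.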
So our first use case for OpenAI's GPT models in Marketo is intelligent segmentation. GPT models make quick work of organizing complex, messy data into well-defined segments. You can use this for any kind of data, but for this demo we are going to use it to sort job titles into 12 job function segments. Job titles are a great example because they can vary quite a bit even in the same industry, and even for the same job function. This can be compounded further if you're an international organization and you receive job titles in multiple languages.
So imagine you want to set up targeted campaigns personalized by the individual's job function, but in your Marketo database all you have are titles.... hundreds or thousands of different titles that are not easily defined or segmented.
Previously you would have to spend countless hours either manually entering a role for each of these titles or setting up smart list rules to try to segment them. Both options would require an extensive time commitment and dull, mind-numbing work, ultimately leading to an end product that would probably be subpar anyway.
But let’s dive in and see how we can use OpenAI to solve this problem and save our time and sanity
Here you can see I’ve already created a “Job Function” segmentation inside Marketo.
Now if I wanted to go through this process without the help of AI, I would probably need to create hundreds of rules targeting specific words and word combinations inside the job titles in order to assign people to the correct segment, and even then many would still fall through the cracks and end up in the default segment.
BUT since GPT will be doing all of the work, my rules can be super simple. Each segment will just have one filter: the person's job role must exactly match the name of the segment. But you can see that currently all of the leads are sitting in our default segment because they haven't qualified for any of the rules.
Let's take a look at the data
Here we can see the list, and while all of the records have a job title, none have any value for Role, which is the field we are targeting with our segmentation.
So next let's set up our smart campaign.
Here in our smart campaign, for the Smart List we just have one simple filter. "Job Title is not empty"
As long as the job title isn’t empty we have something for the GPT model to work with.
Next let's look at the Flow of our Smart Campaign
For the flow we have 2 steps. The first is a workaround of sorts. Unfortunately, as of now, Self-Service Flow Steps don't support tokens directly in their fields, so we need a way of generating a personalized prompt for each record before sending it to OpenAI. For this I've created a custom field called OpenAI Prompt, which we update here with a Change Data Value step before we send it to OpenAI. Second, we have our Send Prompt step, where we actually send the prompt to OpenAI and get a response. Here we set the model we want to use; 3.5-turbo is the latest available right now. But this is a select, so there's no need to memorize the models: you will always see the latest available.
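Conceptually, the Change Data Value workaround is just token substitution into a template. A rough stand-in in Python, where only the {{lead.Field Name}} token syntax mirrors Marketo's and the rendering function itself is our own illustration:

```python
# Simulate the workaround: render a prompt template with per-record
# values before handing it to the flow step, since tokens aren't
# supported directly in the step's fields.
import re

def render_prompt(template: str, lead: dict) -> str:
    """Replace {{lead.Field Name}} tokens with values from the record."""
    return re.sub(
        r"\{\{lead\.(.+?)\}\}",
        lambda m: str(lead.get(m.group(1), "")),
        template,
    )

# Example: fill the segmentation prompt for one record.
prompt = render_prompt(
    "Categorize the job title: {{lead.Job Title}}",
    {"Job Title": "VP of Growth"},
)
```

In Marketo the same effect comes from writing the tokenized text into the OpenAI Prompt field; this sketch only shows why a per-record rendering step is needed at all.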
Next we set the field where we want to store the response... I've selected leadRole. I'll set a temperature of 0 and no additional context. I'm not going to go into the technicals of how this flow step works, but the documentation is available on our website, and it makes the process pretty easy to set up and use; it can usually be done in under 20 minutes.
Now, the prompt field is a little small for you to see the text inside, so let me zoom in so we can look at what we’ll send to OpenAI
Here when we zoom in, you can see the prompt we are using: "Return only the category without quotes or any additional text. Categorize the job title..." and here you can see we're using the token for job title.
"...into one of the following categories..." and you can see we're sending the model the exact segment names we want it to choose from. Now, I recommend using the ChatGPT web interface and testing out prompts beforehand, to be sure you're getting a correct response and that it is in the format you want. In this case, after some trial and error, this is the prompt we will use.
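Because the model occasionally ignores formatting instructions, a small guard before writing the role back is cheap insurance: accept the reply only if it matches one of the segment names, and otherwise let the record fall into the default segment. The twelve segment names below are illustrative, not from the demo.

```python
# Validate the model's reply against the segmentation before storing it.
# SEGMENTS is a hypothetical list of the 12 job function segments.
SEGMENTS = {
    "Marketing", "Sales", "Engineering", "Finance", "Human Resources",
    "Operations", "IT", "Legal", "Product", "Support", "Executive", "Other",
}

def clean_role(reply: str) -> str:
    """Strip whitespace/quotes and check the reply is a known segment."""
    candidate = reply.strip().strip('"').strip()
    return candidate if candidate in SEGMENTS else "Default"
```

With a guard like this, a chatty reply such as "I would say Sales." ends up in the default segment for review instead of silently creating a thirteenth category.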
Now with our smart campaign set up, all we need to do is run it. I can see in the Schedule tab that 1040 people will qualify, so let's just run it and get these job titles sorted.
Ok, now after giving the campaign some time to run we will see how the AI did.
If we refresh our smart list we'll see that all of the records now have a role populated, and they look correct here so let's take a look at the segmentation
Looking at the segmentation, we can see that ALL leads have been assigned to a job function with zero left in the default segment so we can now use this for our targeting and personalization.
And that's it! Our job titles are now neatly sorted into their respective functions so either we've made great strides in segmentation, or we've actually given the AI a full list of jobs available for it to take over.
So that was a good example of GPT's ability to categorize data for us accurately and in bulk... Our next scenario is around hyper-personalization, and here is where we can really make use of the creative talent of ChatGPT.
In this scenario imagine we have a hotel in Ibiza Spain. We want to personalize the confirmation emails we send when someone makes a booking.
The common approach is to build these emails with tokens and dynamic content, which, when done correctly, is effective. But when it comes to something like hospitality you want a little more spark, something warm and inviting, and oftentimes the "MadLibs" style of tokens ends up sounding cold and robotic.
((CLICK))
So, ironically enough, the best solution we have is to use AI to make it sound more human. With the right data and guidance, an AI model can create natural-sounding, highly personalized messages. We call this hyper-personalization because it goes beyond just adding in tokens and dynamic content and reaches a real hand-written feel, where every message is completely unique while remaining accurate and on brand.
So let's take a look at how we achieve this...
Here we can see the form we are using. For the purpose of this demo I'm just using a Marketo form, but in a real scenario you may be working with a third-party booking system; as long as you have the data in Marketo, you can pass it along to OpenAI. Here the key fields are going to be the dates, the room type, the reason for traveling... and any additional requests.
So let's see how we can use this data with openAI...
With our form in place we can create our smart campaign. For the Smart list, we’re just going to trigger the flow any time someone fills out the booking form...
For the flow, similar to the previous demo, we’ll have 2 steps. First we will write the prompt using tokens and store it in the OpenAI Prompt Field.
Second, we send that prompt to OpenAI with our Self-Service Flow Step. This time we return the data to the openAIResponse field and give it a temperature of 10. You can think of the temperature as a creativity level, but keep in mind you probably don't want to crank it all the way to 100, because you'll notice it quickly gets creative with facts, and that could be problematic.
This time we will also add some data in the Additional Context field. This is data that is not dynamic, such as hotel information and amenities, just to give GPT some more specific data to work with. You can think of this field as just an overflow from your prompt field.
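One plausible way these settings could map onto the API call is sketched below: the additional-context field rides along as a system message, and the step's 0-100 temperature setting scales down to the API's 0-2 range. Both mappings are assumptions about the step's internals, not documented behaviour.

```python
# Assumed mapping of the flow step's settings onto an OpenAI request:
# context -> system message, 0-100 slider -> 0-2 API temperature.
def build_personalized_request(prompt: str, context: str,
                               slider_temp: int) -> dict:
    """Combine the prompt, static hotel context, and temperature."""
    return {
        "model": "gpt-3.5-turbo",
        "temperature": round(slider_temp / 100 * 2, 2),  # assumed scaling
        "messages": [
            {"role": "system", "content": context},  # hotel facts, amenities
            {"role": "user", "content": prompt},     # per-booking prompt
        ],
    }
```

Separating the static context into the system message keeps the per-record prompt short, which is the same separation the prompt field and context field give you in the step.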
So again, let’s zoom in to see what we’re going to pass to openAI in the prompt. Here we pass all of the booking data from our form and our instructions for how to assemble the booking confirmation paragraph by paragraph.
While we are giving a specific format and data about the hotel, we are also letting the model rely on its own knowledge in Paragraph 3, where we ask it to personalize based on "what Ibiza is like in this time period and what activities are typically going on".
We are also relying on the model’s judgement in paragraph 4 where we ask it to consider the additional requests and only show doubt if they seem “extreme”
With the prompt design you can be very specific and use GPT to only customize specific things, or give less instruction and let the model be as creative as you want. It comes down to preference and purpose but the key is really testing the prompts ahead of time for accuracy and format.
Here we see the additional context we are sending to GPT. We are passing in the hotel amenities available, as well as hotel and concierge details. This could also be included directly in the prompt field, but since we have the context field, it's easier to separate things: the prompt, which we may want to change in the future, and the data about the hotel, which should stay fairly consistent.
So I've gone ahead and tested with several variations of bookings, changing up the dates, number of guests and reason for travel, as well as throwing in some additional requests.
Here we can see some of the results side by side. On the left we have a business trip booking and on the right a booking for a celebration. We can immediately see that while the first paragraph is pretty cookie-cutter, the suggestions in the second paragraph are highly tailored to the occasion. While the business response highlights high-speed wi-fi and a business center, the celebration booking receives recommendations about the spa, pools and dining packages.
GPT shows that it is time, location and weather aware with comments specific to the dates, expected temperatures and even recommending the Ibiza Jazz Festival for the guest staying in October… which I actually had to look up and is completely accurate.
Here we see some more results side by side, for a Family Vacation booking and a Romantic Getaway. And we see again the AI has done a great job of tailoring the messages to the data and format we gave it, while also sprinkling in data from its own knowledge base, giving us an end result that feels quite warm and authentic.
While it’s true that if the guidelines are not well written or the temperature setting is too high you may have some minor errors…but since these errors are made by an AI, you can just ”Fire” them, revise your prompts, give your AI concierge a new name and viola you have a new better trained team member
...so those are our 2 specific use cases for today. Now let's discuss some general best practices and limitations.
Beware of hallucinations. GPT models are often confidently incorrect. So be sure to test your prompts and to be strategic with your wording
Models are limited by their data. The ChatGPT models are only trained up to late 2022, and for every model there is an end date to its knowledge, so be aware of this when sending prompts that may rely on more recent data.
Tell the model what format you want. The more specific you can be, the better; otherwise it may take liberties you are not expecting. Oftentimes GPT models will give commentary, restate the question or show their work, so if you don't want that data stored in your records, it's best to be specific.
Again, prompt engineering is an art. Test and verify your prompts to be sure you are getting the correct response and in the format you want
You can fine-tune OpenAI's models. Again, I won't get into the technicals here, but if you have specific data you want the models to be aware of, you can set up a process to train a GPT model with that data. You basically feed the data in a Q&A format, and it creates a new custom model that is specific to you and that you control.
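The Q&A training data is uploaded as a JSONL file; at the time, OpenAI's fine-tuning format paired a "prompt" with its "completion" per line (newer fine-tuning endpoints use a chat-message format instead). A tiny writer for that file, as a sketch:

```python
# Serialize (question, answer) pairs into the prompt/completion JSONL
# format OpenAI's fine-tuning used at the time. The "\n\n###\n\n"
# separator and leading space follow OpenAI's recommended conventions.
import json

def to_jsonl(qa_pairs: list[tuple[str, str]]) -> str:
    """One JSON object per line, one Q&A pair per object."""
    lines = [
        json.dumps({"prompt": q.strip() + "\n\n###\n\n",
                    "completion": " " + a.strip()})
        for q, a in qa_pairs
    ]
    return "\n".join(lines)
```

The resulting file is what you would upload to the fine-tuning endpoint to produce your custom model.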
Finally, and this is important: don't send sensitive or personally identifiable data in prompts. OpenAI can use prompt data in updating and training future models, and you don't want that data ending up in those models' knowledge base.
So hopefully by this point you have a better idea of how the process works, and I'm sure, given the caliber of the people I see in the attendee list, you have some great ideas on how to put ChatGPT to work in your Marketo instance... I would love to hear all of them, by the way... But here is a selection of other use cases we also recommend.
Interesting Moment History – With this we create a human-readable description of a lead's journey. Each time a new interesting moment happens, we send that moment to OpenAI along with the current story and ask GPT to update it. With this you could keep a deep understanding of that journey on the record.
Intelligent Message Routing – If you have a contact-us or support form on your site, you could use OpenAI to read those messages and then help you categorize them as positive/negative or urgent/secondary, or categorize them based on department and route them.
Lead Routing – Similar to the message routing, you can route leads based on many factors, giving the model a description of your rules and the lead data and letting it make the assignments.
Data Cleansing – Whether it’s capitalization, phone number formatting, fixing typos or bad characters, GPT is very good at cleaning dirty data
Translation – GPT can translate text into dozens of languages and even mimic local dialects to really give it a natural feel
False Data Detection – GPT models can also do a great job of detecting fake data, so you don't end up with a database full of John Does and Mickey Mouses.
And finally, Intelligent Scoring – Oftentimes scoring models are complex, consisting of multiple scores across product lines or interests, but with a little guidance GPT can make perfect sense of what is happening across those scores, update them, aggregate them, and give insights.
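To make one of these concrete, a reusable prompt builder along these lines (the wording and the Marketo-style field token are our own, hypothetical) covers several of the data-cleansing fixes mentioned above; only the instruction text would change between use cases.

```python
# Build a data-cleansing prompt around a Marketo-style field token.
# The token passed in (e.g. "{{lead.Company Name}}") is illustrative.
def cleansing_prompt(field_token: str) -> str:
    """Return a cleanup instruction ending with the field token."""
    return (
        "Fix the capitalization, remove stray characters, and correct "
        "obvious typos in the following value. Return only the corrected "
        "value, with no commentary: " + field_token
    )
```

As in the demos, the rendered prompt would be stored in the OpenAI Prompt field and sent with a low temperature, since cleansing wants accuracy rather than creativity.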