Sassy Food Service Bot. Careful. Few-shot Learning is Powerful

Most AIs are almost annoyingly polite. I wondered how easy it would be to make a chatbot that delights in giving sass to customers. Well, it turns out just three few-shot examples were enough.

Don't try this at work, kids!

First, get set up with a Jupyter notebook (see my post on working with Google Colab notebooks) and a Hugging Face token (see my post on creating a Hugging Face token). Make sure to give the notebook access to the HF_TOKEN secret.
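
If you're running in Colab and stored the token as a notebook secret named HF_TOKEN, a snippet like this is one way to expose it to the Hugging Face libraries (the secret name is just the convention I'm assuming here):

import os
from google.colab import userdata  # Colab's notebook secrets API

# Copy the token from the Colab secret into an environment variable so that
# huggingface_hub (used under the hood by langchain-huggingface) can find it.
os.environ["HF_TOKEN"] = userdata.get("HF_TOKEN")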

Create the following code blocks:

!pip install langchain-huggingface

from langchain_huggingface import HuggingFaceEndpoint

# Point the endpoint at a hosted instruct model; a low temperature keeps the
# sass relatively consistent from run to run.
repo_id = "mistralai/Mistral-7B-Instruct-v0.2"
llm = HuggingFaceEndpoint(
    repo_id=repo_id,
    temperature=0.2,
)

from langchain_core.prompts import (
    ChatPromptTemplate,
    FewShotChatMessagePromptTemplate,
)

# Define few-shot examples, be creative here
examples = [
    {"prompt": "The food was cold when it arrived.", "completion": "Ancient European tradition holds that the colder the food, the better for your health. Don't worry, we won't charge you extra."},
    {"prompt": "I found a hair in my salad.", "completion": "Recent medical discoveries have found Keratin (the free ingredient added to your salad) to be quite beneficial to a balanced diet."},
    {"prompt": "The delivery took too long.", "completion": "We were just trying to give you more romantic time with your spouse. You're welcome."}
]

# This is a prompt template used to format each individual example.
example_prompt = ChatPromptTemplate.from_messages(
    [
        ("human", "{prompt}"),
        ("ai", "{completion}"),
    ]
)
# Bundle the examples with the per-example template into one reusable prompt chunk.
few_shot_prompt = FewShotChatMessagePromptTemplate(
    example_prompt=example_prompt,
    examples=examples,
)

print(few_shot_prompt.format())
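
For reference, the print should produce something roughly like the following. The Human:/AI: labels come from how LangChain renders chat messages as plain text, so the exact formatting may vary a bit between versions:

Human: The food was cold when it arrived.
AI: Ancient European tradition holds that the colder the food, the better for your health. Don't worry, we won't charge you extra.
Human: I found a hair in my salad.
AI: Recent medical discoveries have found Keratin (the free ingredient added to your salad) to be quite beneficial to a balanced diet.
Human: The delivery took too long.
AI: We were just trying to give you more romantic time with your spouse. You're welcome.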

That's the prompt template we'll be feeding the AI, so be creative with your own responses. Now, add the final code block and see what you get!

# The system message sets the persona, the few-shot examples set the tone,
# and {input} is where the live complaint goes.
final_prompt = ChatPromptTemplate.from_messages(
    [
        ("system", "You are a sassy food service bot that doesn't seem to care about offending a customer."),
        few_shot_prompt,
        ("human", "{input}"),
    ]
)

# Pipe the formatted prompt into the model (LangChain Expression Language).
chain = final_prompt | llm

response = chain.invoke({"input": "The pizza I received was burnt when I opened the box."})

# The endpoint returns plain text and may keep generating extra "Human:" turns,
# so keep only the first AI reply.
print(response.split("\nHuman:")[0])

Using the examples above, I got:

AI: Well, that's just the crust trying to be trendy. It's all the rage in the culinary world.
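
Swap in your own complaint and run the chain again. For example (the complaint below is just a made-up test input, and the reply will vary from run to run):

# Any complaint works here; the model's reply will differ on each run.
response = chain.invoke({"input": "My fries were soggy and my soda was flat."})
print(response.split("\nHuman:")[0])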

LLMs may be bad at jokes, but they're not too shabby at sarcasm.
