
Prompt Library

The Prompt Library in Max.ai's Skill Studio is a powerful feature that enables you to create, manage, and utilize custom prompts for Large Language Models (LLMs) within your skills. This guide will help you understand how to effectively use the Prompt Library, including setting associated prompts, configuring K-Shot parameters, adding examples, and referencing prompts in your code.


Understanding the Prompt Library

The Prompt Library is a centralized repository where you can create and store prompts that guide the LLM in generating responses. By using the Prompt Library, you can:

  • Reuse prompts across multiple skills.
  • Manage prompts centrally for easier updates and maintenance.
  • Enhance the capabilities of your AI assistant by providing custom instructions and examples.

Creating a Prompt

Steps to Add a New Prompt:

  1. Navigate to Prompt Library:

    • In Skill Studio, click on the Prompt Library tab.
  2. Add a New Prompt:

    • Click on Add Prompt.
    • Name the Prompt: Provide a descriptive name.
  3. Enter Prompt Content:

    • In the Prompt field, write the content that the LLM will use.
    • Use placeholders like {{input}} for dynamic content.
  4. Configure K-Shot Parameters:

    • K-Shot Count: Set the maximum number of examples to include.
    • K-Shot Threshold: Set a value between 0 and 1 to determine the similarity required for an example to be included.
  5. Save the Prompt:

    • Click Save to create the prompt.

Example:

Prompt Name: Email_Response_Generator

Prompt Content:

You are a professional assistant helping to draft emails.

{{examples_for_model}}

Email Subject: {{subject}}
Email Body: {{input}}

Compose a reply that addresses the customer's concerns.

K-Shot Count: 3

K-Shot Threshold: 0.7
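As a mental model, the `{{...}}` placeholders behave like simple template substitution. The sketch below is a hypothetical illustration only; in practice the platform performs this interpolation for you when you fetch the prompt (including inserting any matched examples where `{{examples_for_model}}` appears):

```python
import re

def render_prompt(template: str, variables: dict) -> str:
    # Replace each {{name}} placeholder with its value from `variables`.
    # Placeholders with no matching variable are left untouched.
    return re.sub(
        r"\{\{(\w+)\}\}",
        lambda m: str(variables.get(m.group(1), m.group(0))),
        template,
    )

template = "Email Subject: {{subject}}\nEmail Body: {{input}}"
print(render_prompt(template, {"subject": "Billing Inquiry",
                               "input": "Why was I charged twice?"}))
```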


Adding Examples to a Prompt

Examples, or k-shots, provide the LLM with context on how to handle specific inputs.

Steps to Add Examples:

  1. Navigate to the Examples Tab:

    • Open your prompt and click on the Examples tab.
  2. Add a New Example:

    • Click Add Example.
  3. Configure the Example:

    • Match Value: An input that represents a user's possible query.
    • Output Value: The ideal response to the match value.
    • Questions (optional): Any additional queries to guide the LLM.
  4. Save the Example:

    • Click Save.

Example:

Match Value: "I need help understanding my bill."

Output Value:

Dear Customer,

Thank you for reaching out. I'd be happy to help explain your bill. Could you please provide your account number for verification?

Best regards,
[Your Name]


Configuring K-Shot Parameters

K-Shot Count

  • Definition: The maximum number of examples to include in the prompt.
  • Considerations:
    • Higher Count: Provides more context but uses more tokens.
    • Lower Count: Uses fewer tokens but offers less context.
  • Recommendation: Balance based on the complexity of the task and token limitations.

K-Shot Threshold

  • Definition: A value between 0 (no match required) and 1 (exact match) that determines how closely an example must match the user's input to be included.
  • Considerations:
    • Higher Threshold: Includes only closely matching examples.
    • Lower Threshold: Includes more examples, even if they are less relevant.
  • Recommendation: Adjust based on the desired precision.
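Conceptually, K-Shot Count and K-Shot Threshold work together as a filter-then-rank step over your examples. The sketch below illustrates the idea; the similarity function here is a crude stand-in (`difflib`), not the platform's actual scoring, which is internal and may use embeddings:

```python
from difflib import SequenceMatcher

def similarity(a: str, b: str) -> float:
    # Stand-in for the platform's internal similarity scoring.
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

def select_k_shots(user_input, examples, k_shot_count, k_shot_threshold):
    # Score every example's match value against the user's input.
    scored = [(similarity(user_input, ex["match_value"]), ex)
              for ex in examples]
    # Keep only examples at or above the K-Shot Threshold ...
    eligible = [(s, ex) for s, ex in scored if s >= k_shot_threshold]
    # ... then include the closest matches, up to the K-Shot Count.
    eligible.sort(key=lambda pair: pair[0], reverse=True)
    return [ex for _, ex in eligible[:k_shot_count]]

examples = [
    {"match_value": "I need help understanding my bill.", "output": "..."},
    {"match_value": "How do I reset my password?", "output": "..."},
]
shots = select_k_shots("Please explain my bill", examples, 3, 0.5)
```

A higher threshold shrinks the eligible set before the count cap is applied, which is why a strict threshold can leave a prompt with fewer examples than its K-Shot Count allows.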

Setting Associated Prompts for a Skill

To use a prompt within a skill, you need to associate it with the skill or reference it in the skill code (see Referencing Prompts in Your Skill Code).

📘

If you are using the Set Associated Prompts approach, the DEBUG feature must be enabled for your environment.

Steps to Set Associated Prompts:

  1. Obtain the Prompt ID (UUID):

    • Open the prompt in the Prompt Library and copy its UUID (for example, from the prompt's URL).

  2. Set Associated Prompts in the Skill:

    • Open your skill in Skill Studio.
    • Go to Set Associated Prompts (may require enabling debug mode).
    • Add a new association:
      • Key: An arbitrary name to reference the prompt (e.g., email_response_prompt).
      • Prompt ID: Paste the UUID you copied.
  3. Save the Skill:

    • Click Save to store the association.
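The association you save pairs each key with a prompt UUID. The snippet below is a hypothetical illustration of that data shape (the example UUID is made up) and of the key-to-ID lookup your skill code can build from it:

```python
# Hypothetical shape of the data saved by Set Associated Prompts:
# each entry pairs the key you chose with the prompt's UUID.
associated_prompts = [
    {"key": "email_response_prompt",
     "promptId": "123e4567-e89b-12d3-a456-426614174000"},
]

# Skill code typically builds a key -> promptId map for easy lookup.
prompt_map = {p["key"]: p["promptId"] for p in associated_prompts}
print(prompt_map["email_response_prompt"])
```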

Version Management

As you make changes to a prompt, it can be backed up as versions, giving you a history of each revision.


Referencing Prompts in Your Skill Code

In your skill's code, you can retrieve and use the associated prompt.

Example Code Snippet:

import uuid

# Access the associated prompts
list_prompt_config = copilot_skill.associated_prompts

# Create a mapping of keys to prompt IDs
prompt_map = {}
for prompt_config in list_prompt_config:
    prompt_map[prompt_config["key"]] = prompt_config["promptId"]

# Retrieve the prompt ID using the key
prompt_id = prompt_map["email_response_prompt"]

# Get the prompt content, passing variables and user input
pr = sp.ctx.client.config.get_prompt(
    uuid.UUID(prompt_id),
    {"subject": "Billing Inquiry"},  # Variables for the prompt
    user_input,  # The user's input
).prompt_response

# Extract the prompt ready for the model
prompt_for_model = pr["prompt"]
k_shots_used = pr["k_shots"]

# Use the prompt with the LLM
response = llm.generate(prompt_for_model)

Explanation:

  • copilot_skill.associated_prompts: Retrieves the list of associated prompts.
  • prompt_map: Maps keys to prompt IDs for easy access.
  • get_prompt(): Fetches the prompt, including matching examples.
  • Variables: Pass any necessary variables for prompt interpolation.
  • User Input: The user's actual input that the skill is responding to.

Understanding Match Value and Output

Match Value

  • Purpose: Used to compare with the user's input.
  • Function: If the similarity between the user's input and the match value exceeds the K-Shot Threshold, the example is included in the prompt.
  • Tip: Use language that represents typical user queries.

Output Value

  • Purpose: The content included in the prompt when a match occurs.
  • Function: Guides the LLM in generating an appropriate response.
  • Tip: Provide clear and helpful responses.

Questions (Optional)

  • Purpose: Additional queries to further guide the LLM.
  • Function: Can be used to elicit more specific responses.

Migrating Prompts Between Environments

Export

To export a prompt, find the prompt to export and choose the Export option from the kebab menu in the top right. This will download a file with that prompt included.

Import

To import a prompt go to the Prompt Library and choose Add Prompt. Select Import Prompt from the submenu.

📘

Prompts are automatically exported with an assistant when they are a dependency of that assistant.

Frequently Asked Questions

Q1: What does setting associated prompts do, and when should I use it?

A: Setting associated prompts links a prompt from the Prompt Library to a skill, allowing the skill to use that prompt when processing inputs. Use it whenever your skill needs to utilize custom prompts for generating responses or performing specific tasks.

Q2: Where can I obtain the "key" value for a prompt needed to set associated prompts for a skill?

A: The "key" is an arbitrary identifier you choose when setting the associated prompt. It can be any name that helps you reference the prompt in your code.

Q3: Is pulling the UUID from the URL the best way to obtain the UUID for a prompt?

A: Yes, copying the UUID from the prompt's URL is a straightforward method. Alternatively, you can retrieve it from the audit trail or directly from the database if necessary.

Q4: How do K-Shot Count and K-Shot Threshold need to align with the examples?

A: Ensure that your K-Shot Count allows for enough examples to provide context but doesn't exceed token limits. Set the K-Shot Threshold based on how closely you want the user's input to match the examples for inclusion.

Q5: How are Match Value, Output, and Questions used in examples?

A:

  • Match Value: The input to compare with the user's input.
  • Output: The content included when a match occurs.
  • Questions: Additional prompts to guide the LLM (optional).

Q6: What is the structure needed to reference prompts in nodes or code?

A: Use the SDK's get_prompt function with the prompt ID, variables, and user input. Reference associated prompts using the keys defined during setup.


Tips for Effective Use

  • Define Clear Match Values: Ensure your match values represent a range of possible user inputs.

  • Adjust K-Shot Parameters Thoughtfully: Balance between providing context and conserving tokens.

  • Reuse Prompts Across Skills: Centralize prompts for consistency and easier maintenance.

  • Collaborate with Team Members: Share insights and troubleshoot together.

  • Test Thoroughly: Always test your prompts and skills to ensure they work as intended.


Conclusion

The Prompt Library in Max.ai's Skill Studio is a versatile tool that enhances your skills by allowing customized and context-aware responses. By effectively utilizing prompts, examples, and associated configurations, you can significantly improve the performance and user experience of your AI assistant.

Remember to balance the use of examples with token limitations, and always aim for clarity and relevance in your prompts and outputs.
