Prompt Building
When working with large language models (LLMs), crafting consistent and well-structured prompts is crucial. The PromptBuilder
class helps you generate formatted prompts by separating the template logic from your code logic, giving you reusable, dynamic prompt templates.
Basic Initialization
If you already have your own template strings, you can instantiate PromptBuilder
directly.
from gllm_inference.prompt_builder import PromptBuilder
# Define your prompt templates
system_template = "You are a helpful assistant."
user_template = "Tell me about {topic} in a simple way."
# Initialize the builder
builder = PromptBuilder(system_template=system_template, user_template=user_template)
Prompt Formatting
Example 1: Inserting Placeholders
With the prompt builder, you can insert values into {placeholders}
. Think of it like filling in the blanks of a sentence template. For example:
import asyncio
from gllm_inference.prompt_builder import PromptBuilder
from gllm_inference.lm_invoker import OpenAILMInvoker
# Define your prompt templates
system_template = "You are a helpful assistant."
user_template = "Tell me about {topic} in a simple way."
# Initialize the builder
builder = PromptBuilder(system_template=system_template, user_template=user_template)
formatted_prompt = builder.format(topic="quantum physics")
lm_invoker = OpenAILMInvoker("gpt-4.1-nano")
response = asyncio.run(lm_invoker.invoke(formatted_prompt))
print(f"Formatted prompt: {formatted_prompt}")
print(f"Response: {response}")
Output:
Formatted prompt: Tell me about quantum physics in a simple way.
Response: Quantum physics is the science of how tiny things like atoms and particles behave in weird, unpredictable ways that often defy our everyday logic.
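If your template uses only plain text placeholders, the substitution works like Python's built-in str.format, so you can sanity-check a template without invoking a model. This is an analogy for illustration, not a description of the library's internals:

```python
# Plain placeholder substitution mirrors Python's built-in str.format:
user_template = "Tell me about {topic} in a simple way."

filled = user_template.format(topic="quantum physics")
print(filled)  # Tell me about quantum physics in a simple way.
```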
Example 2: Inserting Attachments
In some use cases, especially when working with multimodal language models, you'll want to include images, documents, or other files as part of your prompt. The PromptBuilder
supports this via the extra_contents
parameter of .format()
.
import asyncio
from gllm_inference.prompt_builder import PromptBuilder
from gllm_inference.schema import Attachment
from gllm_inference.lm_invoker import OpenAILMInvoker
# Define your prompt template
system_template = "You are a helpful assistant."
user_template = "Describe this image as if you were a {role}."
# Initialize the builder
builder = PromptBuilder(system_template=system_template, user_template=user_template)
# Create an attachment (e.g. a local image file)
img_path = "path/to/image_of_lasagna.png"
attachment = Attachment.from_path(img_path)
# Format the prompt
formatted_prompt = builder.format(role="pirate", extra_contents=[attachment])
# Send to model
lm_invoker = OpenAILMInvoker("gpt-4.1-nano")
response = asyncio.run(lm_invoker.invoke(formatted_prompt))
print(f"Response: {response}")
Output:
Response: Arrr matey! What ye have here be a glorious treasure o' the sea! Layers o' golden pasta be stackin' up!
Example 3: Inserting History
When building conversational AI applications, you often need to maintain conversation history. The PromptBuilder supports this via the history parameter, which can include both text and attachments from previous interactions.
import asyncio
from gllm_inference.prompt_builder import PromptBuilder
from gllm_inference.schema import Attachment, PromptRole
from gllm_inference.lm_invoker import OpenAILMInvoker
# Define your prompt templates
system_template = "You are a helpful assistant that can analyze images and continue conversations."
user_template = "{query}"
# Initialize the builder
builder = PromptBuilder(system_template=system_template, user_template=user_template)
# Create conversation history with attachments
history = [
    # User's first message with an image
    (PromptRole.USER, [
        "What's in this image?",
        Attachment.from_path("path/to/image_of_cat.png")
    ]),
    # Assistant's response
    (PromptRole.ASSISTANT, [
        "I can see a fluffy orange cat sitting on a windowsill."
    ]),
    # User's follow-up question
    (PromptRole.USER, [
        "What breed do you think it is?"
    ]),
    # Assistant's response
    (PromptRole.ASSISTANT, [
        "This looks like a Maine Coon cat. Maine Coons are known for their large size."
    ])
]
# Format the prompt with history and a new question
formatted_prompt = builder.format(
    history=history,
    query="What color is the cat's fur?"
)
# Send to model
lm_invoker = OpenAILMInvoker("gpt-4.1-nano")
response = asyncio.run(lm_invoker.invoke(formatted_prompt))
print(f"Response: {response}")
Output:
Response: The cat's fur is orange with some lighter, cream-colored areas, giving it a warm, tabby appearance.
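The history argument above is just a list of (role, contents) tuples, so in a multi-turn application you can accumulate it as the conversation proceeds. Below is a minimal sketch of that accumulation pattern, using plain strings in place of PromptRole purely for illustration:

```python
# Running conversation history as a list of (role, contents) tuples.
history = []

def record_turn(history, user_contents, assistant_contents):
    """Append one user/assistant exchange to the running history."""
    history.append(("user", list(user_contents)))
    history.append(("assistant", list(assistant_contents)))

record_turn(history, ["What's in this image?"], ["A fluffy orange cat on a windowsill."])
record_turn(history, ["What breed do you think it is?"], ["It looks like a Maine Coon."])

print(len(history))  # 4 entries: two user turns and two assistant replies
```

In real code you would use PromptRole.USER and PromptRole.ASSISTANT as the roles, and the contents lists may include Attachment objects alongside text, exactly as in the example above.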