Prompt Versioning
Version control for prompts with immutable versions, promotion, rollback, and release notes.
PromptRails provides immutable versioning for prompts, similar to agent versioning. Every change to a prompt's template, model assignment, or parameters creates a new version that can be tested independently before being promoted to production.
How It Works
Each prompt has one or more versions. A version captures:
- System prompt text
- User prompt template
- Primary and fallback LLM model assignments
- Temperature, max tokens, and top_p parameters
- Input and output schemas
- Cache timeout
- Configuration object
- Version message (release notes)
Exactly one version per prompt is marked as is_current. This is the version used when an agent references the prompt without specifying a particular version.
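The resolution rule can be sketched as a small helper over version records (the `resolve_version` function and sample records are illustrative, not part of the SDK; field names follow the table below):

```python
def resolve_version(versions, version_id=None):
    """Pick an explicitly requested version, else the one flagged is_current."""
    if version_id is not None:
        return next(v for v in versions if v["id"] == version_id)
    return next(v for v in versions if v["is_current"])

versions = [
    {"id": "v_01", "version": "v1", "is_current": False},
    {"id": "v_02", "version": "v2", "is_current": True},
]

assert resolve_version(versions)["version"] == "v2"
assert resolve_version(versions, version_id="v_01")["version"] == "v1"
```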
Version Fields
| Field | Type | Description |
|---|---|---|
| id | KSUID | Unique version identifier |
| prompt_id | KSUID | Parent prompt ID |
| version | string | Version label (e.g., v1, v2) |
| system_prompt | string | System instructions for the LLM |
| user_prompt | string | User message template (Jinja2) |
| llm_model_id | KSUID | Primary LLM model |
| fallback_llm_model_id | KSUID | Fallback LLM model |
| temperature | float | Sampling temperature (0.0 - 1.0) |
| max_tokens | integer | Maximum response tokens |
| top_p | float | Nucleus sampling parameter |
| input_schema | JSON | Input validation schema |
| output_schema | JSON | Output structure schema |
| cache_timeout | integer | Response cache TTL in seconds (0 = disabled) |
| config | JSON | Additional configuration |
| is_current | boolean | Whether this is the active version |
| message | string | Version message / release notes |
| created_at | timestamp | Creation timestamp |
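The user_prompt field is a Jinja2 template. For a quick local preview without installing Jinja2, a rough stdlib-only substitution of simple `{{ var }}` placeholders is enough (this approximation ignores filters, defaults, and control blocks, so it is for eyeballing output only):

```python
import re

def preview_template(template: str, variables: dict) -> str:
    """Substitute simple {{ name }} placeholders; not a full Jinja2 renderer."""
    return re.sub(
        r"\{\{\s*(\w+)\s*\}\}",
        lambda m: str(variables.get(m.group(1), m.group(0))),
        template,
    )

rendered = preview_template(
    "Summarize the following text in {{ max_sentences }} sentences:\n\n{{ text }}",
    {"max_sentences": 3, "text": "PromptRails versions prompts immutably."},
)
print(rendered)
```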
Creating a Version
```python
version = client.prompts.create_version(
    prompt_id="your-prompt-id",
    system_prompt="You are a concise technical writer.",
    user_prompt="Summarize the following text in {{ max_sentences }} sentences:\n\n{{ text }}",
    temperature=0.5,
    max_tokens=256,
    input_schema={
        "type": "object",
        "properties": {
            "text": {"type": "string"},
            "max_sentences": {"type": "integer", "default": 3}
        },
        "required": ["text"]
    },
    message="Reduced temperature for more consistent summaries"
)
```

Promoting a Version
Promotion sets a version as the current active version. The previously current version is automatically demoted.
```python
client.prompts.promote_version(
    prompt_id="your-prompt-id",
    version_id="version-id-to-promote"
)
```

After promotion:
- All agents referencing this prompt (via their current agent version) will use the newly promoted prompt version
- Previous executions retain their original prompt version in the trace for reproducibility
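After promoting, it is worth confirming the invariant that exactly one version is current and that it is the one you promoted. The check below is a plain helper over version records (the `check_promotion` helper and sample data are illustrative, not SDK calls):

```python
def check_promotion(versions, promoted_id):
    """Verify exactly one version is current, and that it is the promoted one."""
    current = [v for v in versions if v["is_current"]]
    return len(current) == 1 and current[0]["id"] == promoted_id

versions = [
    {"id": "v_01", "is_current": False},
    {"id": "v_02", "is_current": True},
]

assert check_promotion(versions, "v_02")
assert not check_promotion(versions, "v_01")
```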
Viewing Version History
```python
versions = client.prompts.list_versions(prompt_id="your-prompt-id")

for v in versions["data"]:
    current_marker = " *" if v["is_current"] else ""
    print(f"{v['version']}{current_marker} | {v['message']} | {v['created_at']}")
```

Rolling Back
Rolling back is simply promoting a previous version:
```python
# Find the version you want to restore
versions = client.prompts.list_versions(prompt_id="your-prompt-id")
target = versions["data"][1]  # previous version

# Promote it
client.prompts.promote_version(
    prompt_id="your-prompt-id",
    version_id=target["id"]
)
```

Since versions are immutable, rollback is instant and safe. The "rolled back from" version remains in history and can be re-promoted later.
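In rollback scripts, looking up the target by its version label rather than by list position is less fragile; the lookup itself is plain filtering over version records (the `find_version_id` helper is illustrative, not an SDK call):

```python
def find_version_id(versions, label):
    """Return the id of the version whose label matches, or None if absent."""
    return next((v["id"] for v in versions if v["version"] == label), None)

versions = [
    {"id": "v_01", "version": "v1", "is_current": False},
    {"id": "v_02", "version": "v2", "is_current": True},
]

assert find_version_id(versions, "v1") == "v_01"
assert find_version_id(versions, "v9") is None
```

The returned id can then be passed to promote_version as in the rollback example above.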
Version Messages
Always include a descriptive message when creating a version. This serves as release notes and makes the version history meaningful:
```python
# Good version messages
"Initial prompt for customer support agent"
"Reduced temperature from 0.9 to 0.5 for more consistent output"
"Added fallback model (Claude) for reliability"
"Updated system prompt to handle refund requests"
"Switched to GPT-4o for improved accuracy on complex queries"
```

Best Practices
- One change per version -- Make focused changes so you can isolate the impact of each modification
- Test before promoting -- Execute the prompt version directly to verify behavior before making it current
- Document changes -- Use version messages to explain what changed and why
- Coordinate with agent versions -- When creating a new agent version, explicitly link it to the desired prompt versions to avoid unintended changes
- Use cache timeouts -- For deterministic prompts (translation, classification), enable caching to reduce LLM costs and latency
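The cache_timeout field is a TTL in seconds, with 0 disabling caching. The semantics it implies can be sketched as a minimal in-memory TTL cache (an illustration of the behavior, not PromptRails' actual cache implementation):

```python
import time

class TTLCache:
    """Minimal TTL cache: entries expire after ttl seconds; ttl=0 disables caching."""

    def __init__(self, ttl: int):
        self.ttl = ttl
        self._store = {}

    def put(self, key, value):
        if self.ttl <= 0:  # ttl=0 means caching is disabled
            return
        self._store[key] = (value, time.monotonic())

    def get(self, key):
        entry = self._store.get(key)
        if entry is None:
            return None
        value, stored_at = entry
        if time.monotonic() - stored_at >= self.ttl:
            del self._store[key]  # expired
            return None
        return value

cache = TTLCache(ttl=3600)
cache.put(("prompt-id", "input-hash"), "cached LLM response")
assert cache.get(("prompt-id", "input-hash")) == "cached LLM response"

disabled = TTLCache(ttl=0)
disabled.put("k", "v")
assert disabled.get("k") is None
```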