PromptOps
Git-native prompt versioning for production LLM agents
Why PromptOps?
Your agent's behavior is defined by prompt text — but most teams ship prompts as raw strings baked into code, with no version history, no rollback, and no way to test a change before it hits production.
PromptOps turns prompts into first-class versioned artifacts. Every git commit auto-tags your prompts with a semantic version. Reference any version in code by name: :latest for production, :v1.2.0 for a specific release, or :unstaged to test uncommitted changes without touching anything live.
No more "what prompt was running last Tuesday?"
Features
Automated Git Versioning
Zero-manual versioning with git hooks and semantic version detection. Every commit auto-tags your prompts.
Uncommitted Change Testing
Test prompts instantly with :unstaged and :working references before committing.
Cross-Version Comparison
Test and compare any two prompt versions side-by-side. Catch behavioral changes before they reach production.
Zero-Config Git Hooks
Pre-commit and post-commit hooks installed automatically. Versioning happens on every commit — nothing to remember.
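Under the hood, a reference like welcome-message:v1.2.0 is just a prompt name plus a version selector, with a default when no selector is given. A minimal sketch of that parsing step (parse_ref is a hypothetical helper for illustration, not the library's actual code):

```python
from typing import Tuple

def parse_ref(ref: str, default: str = "latest") -> Tuple[str, str]:
    """Split 'welcome-message:v1.2.0' into ('welcome-message', 'v1.2.0').

    A bare name falls back to the default selector. Illustrative only;
    not part of the PromptOps API.
    """
    name, sep, version = ref.partition(":")
    return name, (version if sep else default)
```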
Installation
pip install llmhq-promptops
Quick Start
# Create a new project with git hooks
promptops init repo
# Create a new prompt template
promptops create prompt welcome-message
# Test uncommitted changes
promptops test --prompt welcome-message:unstaged
# Check status of all prompts
promptops test status
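The create command scaffolds a prompt template file. The actual format isn't documented here; the following is a purely hypothetical sketch assuming a YAML file with a Jinja2-templated body (the path and every field name are guesses, not the real schema):

```yaml
# .promptops/prompts/welcome-message.yaml  (hypothetical path and schema)
name: welcome-message
variables:
  - user_name
  - plan
template: |
  Welcome {{ user_name }}! You are now on the {{ plan }} plan.
```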
Python SDK
from llmhq_promptops import get_prompt
# Smart default (unstaged if different, else working)
prompt = get_prompt("user-onboarding")
# Specific version references
prompt = get_prompt("user-onboarding:v1.2.1")
prompt = get_prompt("user-onboarding:unstaged")
prompt = get_prompt("user-onboarding:working")
# With variables
rendered = get_prompt("user-onboarding", {
    "user_name": "Alice",
    "plan": "Pro"
})
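Since Jinja2 is a listed dependency, variable substitution presumably uses its {{ name }} syntax. A stdlib-only stand-in for that rendering step (render_prompt is illustrative and handles only simple substitution, not the full Jinja2 feature set the library likely supports):

```python
import re

def render_prompt(template_text: str, variables: dict) -> str:
    # Replace each "{{ name }}" placeholder with its value from `variables`.
    # Minimal stand-in for Jinja2-style rendering; not the PromptOps API.
    return re.sub(
        r"\{\{\s*(\w+)\s*\}\}",
        lambda m: str(variables[m.group(1)]),
        template_text,
    )
```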
Framework Integration
Works with any LLM framework. Get a versioned prompt, pass it to your provider.
from llmhq_promptops import get_prompt
prompt_text = get_prompt("user-onboarding:working", {
    "user_name": "John",
    "plan": "Enterprise"
})
# Use with OpenAI (1.x client API)
from openai import OpenAI

openai_client = OpenAI()
response = openai_client.chat.completions.create(
    model="gpt-4",
    messages=[{"role": "user", "content": prompt_text}],
)
# Use with Anthropic
import anthropic

client = anthropic.Anthropic()
response = client.messages.create(
    model="claude-sonnet-4-5-20250929",
    max_tokens=1024,  # required by the Messages API
    messages=[{"role": "user", "content": prompt_text}],
)
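One pattern these references enable: pin an exact version in production while mutable refs drive staging and local development. A sketch of that idea (the environment names and PROMPT_PINS mapping are illustrative, not part of the library):

```python
import os
from typing import Optional

# Illustrative mapping: immutable pin in production, mutable refs elsewhere.
PROMPT_PINS = {
    "production": "user-onboarding:v1.2.1",    # frozen release tag
    "staging": "user-onboarding:latest",       # last committed version
    "development": "user-onboarding:unstaged"  # local edits, not yet committed
}

def prompt_ref(env: Optional[str] = None) -> str:
    # Resolve the prompt reference for the current deployment environment.
    env = env or os.environ.get("APP_ENV", "development")
    return PROMPT_PINS[env]
```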
Requirements
- Python 3.8+
- Git (required for versioning)
- Dependencies: Typer, Jinja2, PyYAML, GitPython
See it in action
Watch PromptOps version prompts and feed them into ReleaseOps bundles.