Open Source Prompt Infrastructure

Version control
for AI prompts

Ship prompt changes without code deploys. Manage versions, promote across environments, and resolve the right prompt at runtime — all through a type-safe SDK.


Prompts in your codebase = chaos

Every prompt change requires a code review, a deploy, and a prayer. PromptOps decouples your prompts from your code.

😩 Without PromptOps

  • Prompts hardcoded in source files
  • Version → PR → Review → Deploy cycle
  • No rollback without code revert
  • Same prompt in dev and production
  • No audit trail of changes

✨ With PromptOps

  • Prompts managed via API + dashboard
  • Version → Promote → Live in seconds
  • Instant rollback, any environment
  • dev → staging → production pipeline
  • Full version history with metadata

Three lines to production prompts

Install the SDK. Fetch your prompt. Render with variables. That's it.

app/generate.ts
import { PromptOps } from '@promptops/sdk'
import OpenAI from 'openai'

// Incoming ticket text, e.g. from your request handler
const userMessage = 'I was charged twice for my subscription.'

// OpenAI client (reads OPENAI_API_KEY from the environment)
const openai = new OpenAI()

// Initialize PromptOps with your API key
const promptOps = new PromptOps({
  apiKey: process.env.PROMPTOPS_API_KEY,
  baseUrl: 'https://api.promptops.dev',
})

// Fetch the active prompt for your environment
const prompt = await promptOps.getPrompt('support-classifier', {
  environment: 'production',  // or 'dev', 'staging'
})

// Render with variables
const message = promptOps.render(prompt, {
  ticketContent: userMessage,
  customerTier: 'enterprise',
})

// Use with any LLM
const response = await openai.chat.completions.create({
  model: prompt.model,             // "gpt-4" (from PromptOps)
  temperature: prompt.temperature, // 0.3 (from PromptOps)
  messages: [
    { role: 'system', content: prompt.systemPrompt },
    { role: 'user', content: message },
  ],
})
1
Zero dependencies

Native fetch, no bloat. Works in Node.js 18+, Deno, Bun, and edge runtimes.

2
Environment-aware

Fetch different prompt versions for dev, staging, and production automatically.

3
Built-in resilience

Local caching with TTL + stale fallback. Your app works even if PromptOps is down.
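
A minimal sketch of what tuning that resilience could look like at construction time. The `cache` option and its field names below are assumptions for illustration, not the published configuration surface.

import { PromptOps } from '@promptops/sdk'

// Hypothetical cache tuning: `cache`, `ttlMs`, and `staleIfError` are
// illustrative names, not confirmed SDK options.
const promptOps = new PromptOps({
  apiKey: process.env.PROMPTOPS_API_KEY,
  baseUrl: 'https://api.promptops.dev',
  cache: {
    ttlMs: 60_000,       // serve the cached prompt for up to 60 seconds
    staleIfError: true,  // keep serving the last good version if a refresh fails
  },
})

// First call hits the API; later calls within the TTL come from memory.
// If the API is unreachable on refresh, the stale entry keeps the app working.
const prompt = await promptOps.getPrompt('support-classifier', {
  environment: 'production',
})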

How PromptOps works

01

Create

Register a prompt with a human-readable slug via the API or dashboard.

POST /api/v1/prompts
02

Version

Add new versions with system prompts, templates, model configs, and metadata.

POST /api/v1/prompts/:id/versions
03

Deploy

Promote versions to dev, staging, or production. Roll back instantly if needed.

POST /api/v1/prompts/:id/promote
04

Resolve

Your SDK fetches the active version at runtime. Cached, resilient, fast.

sdk.getPrompt("slug")
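
Sketched end to end with plain fetch calls, the flow looks roughly like this. The endpoints mirror the four steps above, but the auth header, request body fields, and template syntax are assumptions rather than the documented schema.

const base = 'https://api.promptops.dev/api/v1'
const headers = {
  Authorization: `Bearer ${process.env.PROMPTOPS_API_KEY}`,  // auth shape assumed
  'Content-Type': 'application/json',
}

// 01 Create: register a prompt under a human-readable slug (fields illustrative)
const created = await fetch(`${base}/prompts`, {
  method: 'POST',
  headers,
  body: JSON.stringify({ slug: 'support-classifier', name: 'Support ticket classifier' }),
}).then((res) => res.json())

// 02 Version: add an immutable version with its template and model config
const version = await fetch(`${base}/prompts/${created.id}/versions`, {
  method: 'POST',
  headers,
  body: JSON.stringify({
    systemPrompt: 'You classify support tickets by urgency and topic.',
    template: 'Ticket: {{ticketContent}} (tier: {{customerTier}})',  // template syntax assumed
    model: 'gpt-4',
    temperature: 0.3,
  }),
}).then((res) => res.json())

// 03 Deploy: promote that version to an environment
await fetch(`${base}/prompts/${created.id}/promote`, {
  method: 'POST',
  headers,
  body: JSON.stringify({ versionId: version.id, environment: 'staging' }),
})

// 04 Resolve: at runtime the SDK fetches whichever version is active
// const prompt = await promptOps.getPrompt('support-classifier', { environment: 'staging' })

Because every operation is available over the API, the same flow can run from CI or a script just as easily as from the dashboard.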

Built for production

Everything you need to manage prompts at scale.

Version Control

Every prompt change creates an immutable version. Full history, full auditability.

Environment Pipeline

dev → staging → production promotion flow. Test before you ship to users.

Type-Safe SDK

TypeScript-first with full type inference. Zero runtime dependencies.

Instant Rollback

One API call to revert any environment to its previous version. No deploys. (Sketched below.)

API-First

RESTful API covering every operation. Dashboard is optional — automate everything.

Resilient Cache

In-memory cache with TTL and stale fallback. Works even when the API is unreachable.
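
For the Instant Rollback card above, one plausible sketch is to re-promote the previously active version into the environment. The promote endpoint matches the one shown earlier; treating rollback as a re-promote, and the body fields, are assumptions.

const headers = {
  Authorization: `Bearer ${process.env.PROMPTOPS_API_KEY}`,  // auth shape assumed
  'Content-Type': 'application/json',
}

// Hypothetical rollback: point production back at the version that was live before.
// 'ver_previous' stands in for the version id you want to restore.
await fetch('https://api.promptops.dev/api/v1/prompts/support-classifier/promote', {
  method: 'POST',
  headers,
  body: JSON.stringify({
    environment: 'production',
    versionId: 'ver_previous',
  }),
})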

Built for builders

🧑‍💻

AI Engineers

Ship prompt iterations in seconds instead of waiting for code deploys. Test in dev, promote to production with confidence.

promptOps.getPrompt("email-gen", { env: "staging" })
🏗️

Platform Teams

Give product and ML teams control over prompts without touching the codebase. API keys scoped per environment.

POST /api/v1/prompts/onboarding/promote
🚀

Solo Founders

Start iterating on your AI product's prompts without building prompt infrastructure from scratch.

npm install @promptops/sdk

Ready to take control?

Get started in under 5 minutes. No credit card required.

$ npm install @promptops/sdk