
open-prompt

Prompting is no longer magic, but deterministic


A systematic, four-layer architecture for transforming prompt engineering from art into science

English | 中文

Features · Quick Start · Architecture · Documentation · Contributing


📖 Overview

Open-prompt transforms prompt engineering from intuition-based art into a deterministic, systematic process. By decomposing user intent and applying proven cognitive structure frameworks, it generates production-ready prompts optimized for different LLM providers.


Try the live demo at https://open-prompt-sand.vercel.app

🎯 Why open-prompt?

| Traditional Approach | open-prompt |
| --- | --- |
| ❌ Trial and error | ✅ Systematic decomposition |
| ❌ One-size-fits-all | ✅ Multi-variant generation |
| ❌ Unstructured prompts | ✅ PromptIR™ intermediate format |
| ❌ Provider-specific | ✅ Multi-provider optimization |
| ❌ Manual refinement | ✅ Automated framework selection |

✨ Key Features

🧠 Four-Layer Architecture

  • Layer 1 - Intent Classifier: Multi-intent detection with confidence scoring
  • Layer 2 - Structure Framework Selector: Maps intents to optimal cognitive frameworks (CoT, MECE, SCQA, etc.)
  • Layer 3 - PromptIR Generator: Creates structured intermediate representations
  • Layer 4 - Final Prompt Constructor: Provider-optimized output (GPT, Claude, General)

🎨 Provider-Optimized Output

  • General Style: Provider-agnostic, balanced approach (default)
  • GPT Style: Optimized for OpenAI GPT (imperative, numbered lists)
  • Claude Style: Optimized for Anthropic Claude (collaborative, natural flow)
  • Format Support: Single string or message array for API compatibility
  • Multi-Language: Auto-detects Chinese, Japanese, Korean, English

📦 PromptIR™ - Structured Intermediate Representation

Every prompt is represented as a structured object with:

  • Role, Goal, Context, Constraints, Process, Output
  • Structure framework with template
  • Optional tools and termination conditions
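The sketch below illustrates what such an intermediate representation might look like as a TypeScript type. The field names follow the list above, but the exact shape is an assumption, not the package's actual exported types.

```typescript
// Hypothetical sketch of a PromptIR shape; the real @open-prompt/core
// types may differ. Fields mirror the list above.
interface PromptIR {
  role: string;            // persona the model should adopt
  goal: string;            // what the prompt is trying to achieve
  context: string;         // background the model needs
  constraints: string[];   // hard requirements on the answer
  process: string[];       // ordered working steps
  output: string;          // expected output format
  framework: { name: string; template: string }; // e.g. CoT, MECE
  tools?: string[];                // optional tool access
  terminationCondition?: string;   // optional stop criterion
}

const sampleIR: PromptIR = {
  role: "Senior SEO content auditor",
  goal: "Evaluate an article against Google's EEAT criteria",
  context: "The article targets a competitive search query",
  constraints: ["Cite specific passages", "Score each dimension 1-10"],
  process: [
    "Assess Experience",
    "Assess Expertise",
    "Assess Authoritativeness",
    "Assess Trustworthiness",
  ],
  output: "Markdown table of scores plus a prioritized fix list",
  framework: {
    name: "MECE",
    template: "Partition the evaluation into non-overlapping dimensions",
  },
};
```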

🔌 LLM-Agnostic Design

  • Unified interface supports OpenAI, Anthropic, Qwen, and more
  • Easy provider integration
  • Built-in monitoring and performance tracking
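A unified provider interface of this kind typically reduces to a single completion method. The contract below is illustrative only, not the package's real exported interface; `EchoProvider` is a hypothetical stub showing how a new backend would plug in.

```typescript
// Hypothetical minimal provider contract; the actual @open-prompt/core
// interface may use different method names and options.
interface ChatMessage {
  role: "system" | "user" | "assistant";
  content: string;
}

interface LLMProvider {
  name: string;
  complete(messages: ChatMessage[], model: string): Promise<string>;
}

// Integrating a new backend is then just implementing the contract.
// This stub echoes the last user message, which is handy in tests.
class EchoProvider implements LLMProvider {
  name = "echo";
  async complete(messages: ChatMessage[], _model: string): Promise<string> {
    const users = messages.filter((m) => m.role === "user");
    return users.length ? users[users.length - 1].content : "";
  }
}
```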

💻 Interactive Web Interface

  • Visual Workflow: Step-by-step visualization of the prompt generation process.
  • Stage Control: Detailed views for Intent Classification, Framework Selection, and PromptIR.
  • Real-time Preview: See how changes in configuration affect the output.
  • One-Click Copy: Easily copy the generated prompt to clipboard.

📊 Use Cases

  • Content Strategy: SEO evaluation, EEAT analysis, content optimization
  • Technical Documentation: API docs, technical guides, tutorials
  • Business Communication: Reports, presentations, strategic plans
  • Problem Solving: Debugging, system design, architecture decisions
  • Data Analysis: Structured reasoning through complex datasets

🏗️ Architecture

┌─────────────────────────────────────────────────────────────────────┐
│                        USER INPUT                                    │
│  "Generate an SEO article EEAT evaluation prompt"                   │
└──────────────────────────┬──────────────────────────────────────────┘
                           │
                           ▼
┌─────────────────────────────────────────────────────────────────────┐
│  LAYER 1: INTENT CLASSIFIER                                         │
│  • Multi-intent detection with confidence scoring                   │
│  • Each intent: reasoning + suggested direction                    │
│  Output: [SEO Content Evaluation (95%), Quality Assessment (88%)]  │
└──────────────────────────┬──────────────────────────────────────────┘
                           │
                           ▼
┌─────────────────────────────────────────────────────────────────────┐
│  LAYER 2: STRUCTURE FRAMEWORK SELECTOR                              │
│  • Maps intents to optimal thinking frameworks                     │
│  • Selects from CoT, MECE, SCQA, ToT, ReAct, or creates custom    │
│  Output: [CoT (0.92), MECE (0.87)]                                  │
└──────────────────────────┬──────────────────────────────────────────┘
                           │
                           ▼
┌─────────────────────────────────────────────────────────────────────┐
│  LAYER 3: PROMPT IR GENERATOR                                       │
│  • Creates Cartesian product: Intent × Framework                    │
│  • Generates structured PromptIR for each combination               │
│  Output: 2-6 PromptIR variations per user input                     │
└──────────────────────────┬──────────────────────────────────────────┘
                           │
                           ▼
┌─────────────────────────────────────────────────────────────────────┐
│  LAYER 4: FINAL PROMPT CONSTRUCTOR                                  │
│  • Transforms PromptIR into polished, ready-to-use prompts          │
│  • Style modes: General | GPT | Claude                              │
│  Output: Production prompts ready for deployment                    │
└──────────────────────────┬──────────────────────────────────────────┘
                           │
                           ▼
┌─────────────────────────────────────────────────────────────────────┐
│                    FINAL PROMPT OUTPUT                              │
│  Optimized for target provider with proper structure and language   │
└─────────────────────────────────────────────────────────────────────┘
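Layer 3's "Intent × Framework" expansion (the 2-6 variations above) is a plain Cartesian product. A minimal sketch, not the package's actual implementation:

```typescript
// Illustrative only: pairs every detected intent with every selected
// framework, mirroring Layer 3's "Intent × Framework" expansion.
function cartesian<A, B>(intents: A[], frameworks: B[]): Array<[A, B]> {
  return intents.flatMap((i) => frameworks.map((f): [A, B] => [i, f]));
}

const pairs = cartesian(
  ["SEO Content Evaluation", "Quality Assessment"],
  ["CoT", "MECE"],
);
// 2 intents × 2 frameworks → 4 candidate PromptIR combinations
console.log(pairs.length); // 4
```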

🚀 Quick Start

Prerequisites

  • Node.js >= 18.0.0
  • pnpm (recommended) or npm

Installation

# Clone the repository
git clone https://github.com/ai-zen-future/open-prompt.git
cd open-prompt

# Install dependencies
pnpm install

# Build all packages
pnpm build

# Start the website (dev server)
pnpm dev

Basic Usage

import {
  OpenAIProvider,
  IntentClassifier,
  StructureFrameworkSelector,
  PromptIRGenerator,
  FinalPromptConstructor,
} from "@open-prompt/core";

// Initialize LLM client
const llmClient = new OpenAIProvider({
  apiKey: process.env.OPENAI_API_KEY,
  model: "gpt-4o-mini",
});

// Initialize all four layers
const intentClassifier = new IntentClassifier(llmClient, "gpt-4o-mini");
const frameworkSelector = new StructureFrameworkSelector(
  llmClient,
  "gpt-4o-mini",
);
const promptIRGenerator = new PromptIRGenerator(llmClient, "gpt-4o-mini");
const finalConstructor = new FinalPromptConstructor(llmClient, "gpt-4o-mini");

// Generate optimized prompts
const userInput = "Generate an SEO article EEAT evaluation prompt";

// Layer 1: Classify intents
const intentResult = await intentClassifier.classify({ userInput });

// Layer 2: Select structure frameworks
const frameworks = await frameworkSelector.select({
  userInput,
  intents: intentResult.categories.flatMap((c) => c.intents),
});

// Layer 3: Generate PromptIR
const promptIRResult = await promptIRGenerator.generate({
  userInput,
  intentResult,
  frameworkResult: frameworks,
});

// Layer 4: Construct final prompts
const finalPrompt = await finalConstructor.construct({
  promptIR: promptIRResult.combinations[0].promptIR,
  styleMode: "gpt",
  outputFormat: "string",
  originalUserInput: userInput,
});

console.log(finalPrompt.finalPrompt.content);
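Since Layer 3 can emit several Intent × Framework combinations, you may want a final prompt for each. The helper below is a sketch that continues the example above; types are loosened to `any`/`unknown` because the package's real type exports aren't shown here.

```typescript
// Hypothetical helper: given Layer 3's combinations, construct a final
// prompt for each one. Shapes are assumed from the Basic Usage example.
async function constructAll(
  layer4: {
    construct: (args: any) => Promise<{ finalPrompt: { content: string } }>;
  },
  combinations: Array<{ promptIR: unknown }>,
  userInput: string,
): Promise<string[]> {
  const prompts: string[] = [];
  for (const c of combinations) {
    const result = await layer4.construct({
      promptIR: c.promptIR,
      styleMode: "general", // or "gpt" / "claude"
      outputFormat: "string",
      originalUserInput: userInput,
    });
    prompts.push(result.finalPrompt.content);
  }
  return prompts;
}
```

In the example above you would call it as `constructAll(finalConstructor, promptIRResult.combinations, userInput)` and pick the variant that best fits your use case.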

📚 Documentation


🤝 Contributing

We welcome contributions!

Development Workflow:

  1. Fork the repository
  2. Create your feature branch (git checkout -b feature/amazing-feature)
  3. Commit your changes (git commit -m 'Add some amazing feature')
  4. Push to the branch (git push origin feature/amazing-feature)
  5. Open a Pull Request

🗺️ Roadmap

  • Startup: Define the overall architecture and core capabilities; at this stage the system can only be invoked programmatically.
  • User Interface: Build UI for prompt generation and experimentation (website).
  • Coding-Style Prompt Engineering: Add prompt versioning, coding-level diff/comparison, and A/B testing workflows.
  • Prompt Small Model: Train a fine-tuned compact model dedicated to prompt generation, tuned for speed and accuracy.
  • Integration with Prompt Library/Marketplace: An all-in-one prompt discovery platform that aggregates prompt sources instead of hosting them.

📄 License

This project is licensed under the MIT License - see the LICENSE file for details.


📮 Contact & Community


Built with ❤️ by the open-prompt team

⬆ Back to Top
