Implementation

Prompty has runtime implementations in multiple languages. Each follows the same pipeline architecture but uses language-native patterns and package managers.

Language     Status            Package
Python       ✅ Stable (alpha)  prompty on PyPI
TypeScript   ✅ Stable (alpha)  @prompty/core on npm
C#           ✅ Alpha           Prompty.Core, Prompty.OpenAI, Prompty.Foundry, Prompty.Anthropic on NuGet
Rust         ✅ Alpha           prompty, prompty-openai, prompty-foundry, prompty-anthropic on crates.io

All runtimes share the same .prompty file format — a prompt written for one runtime works with any other. The differences are in installation, API style, and available providers.
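As an illustration, a minimal .prompty file follows the common frontmatter-plus-template shape: YAML metadata describing the model, then a role-tagged prompt body with template placeholders. The exact field names below are an assumption for illustration; consult the format specification for the authoritative schema.

```
---
name: Basic Prompt
description: A minimal example prompt
model:
  api: chat
  configuration:
    type: openai
    name: gpt-4o
sample:
  question: What is Prompty?
---
system:
You are a helpful assistant.

user:
{{question}}
```

Because every runtime consumes this same file, the prompt above could be loaded unchanged by the Python, TypeScript, C#, or Rust implementation.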

Python

  • Most mature runtime with all features implemented
  • Async support via _async variants of every function
  • Entry-point-based plugin discovery
  • Install: uv pip install prompty[all]
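The dual sync/async surface described above can be sketched as follows. This is a minimal illustration of the `_async`-variant pattern with hypothetical function names, not the actual prompty API:

```python
import asyncio

def execute(prompt: str) -> str:
    # Hypothetical synchronous entry point.
    return f"result:{prompt}"

async def execute_async(prompt: str) -> str:
    # The _async variant mirrors the sync signature but can await
    # I/O (e.g. a model call) without blocking the event loop.
    await asyncio.sleep(0)  # stand-in for a network await
    return execute(prompt)

print(execute("hello"))                     # result:hello
print(asyncio.run(execute_async("hello")))  # result:hello
```

Keeping the two variants signature-identical lets callers switch between sync scripts and async servers without restructuring their code.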
TypeScript

  • Modular package architecture (@prompty/core + provider packages)
  • Native async/await throughout
  • Provider registration via side-effect imports
  • Install: npm install @prompty/core@alpha @prompty/openai@alpha
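Registration via side-effect imports means that merely importing a provider module is enough to make it visible to the core runtime. A sketch of that pattern, shown in Python with a hypothetical registry (the TypeScript runtime does the equivalent with module-level code that runs on import):

```python
# Process-wide provider table, owned by the "core" package.
PROVIDERS: dict[str, type] = {}

def register(name: str):
    # Decorator that records a provider class at import time.
    def wrap(cls):
        PROVIDERS[name] = cls
        return cls
    return wrap

# In a real layout this class would live in a separate provider
# module; importing that module is the only wiring step needed.
@register("openai")
class OpenAIExecutor:
    def invoke(self, prompt: str) -> str:
        return f"[openai] {prompt}"

print("openai" in PROVIDERS)  # True
```

The trade-off of this style is that an unused import can silently change runtime behavior, which is why bundlers treat such modules as having side effects.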
C#

  • Alpha packages available on NuGet: Prompty.Core, Prompty.OpenAI, Prompty.Foundry, Prompty.Anthropic
  • Static Pipeline API with async-first design
  • Same four-stage render → parse → execute → process architecture
  • Install: dotnet add package Prompty.Core --prerelease
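The shared render → parse → execute → process pipeline can be sketched as four composable stages. This is a toy Python illustration of the data flow with stubbed stages, not any runtime's actual API:

```python
def render(template: str, inputs: dict) -> str:
    # Stage 1: fill template placeholders with user inputs.
    for key, value in inputs.items():
        template = template.replace("{{" + key + "}}", str(value))
    return template

def parse(rendered: str) -> list[dict]:
    # Stage 2: turn the rendered text into role-tagged messages.
    return [{"role": "user", "content": rendered}]

def execute(messages: list[dict]) -> dict:
    # Stage 3: invoke the model (stubbed with an echo here).
    return {"content": "echo: " + messages[0]["content"]}

def process(response: dict) -> str:
    # Stage 4: extract the final value from the raw response.
    return response["content"]

result = process(execute(parse(render("Hi {{name}}", {"name": "Ada"}))))
print(result)  # echo: Hi Ada
```

Because each stage has a single input and output, any one of them can be swapped (a different template engine, a different provider) without touching the others.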
Rust

  • Async-only runtime (Tokio) with zero-cost abstractions
  • Modular crate architecture (prompty core + provider crates)
  • Trait-based plugin system (Executor, Processor, Renderer, Parser)
  • Full agent loop with events, cancellation, guardrails, and steering
  • Install: cargo add prompty prompty-openai
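The Rust runtime exposes its four extension points as traits. The shape of that design, transcribed into a Python Protocol for illustration (method names are assumptions; the real trait definitions live in the prompty crate):

```python
from typing import Protocol

class Executor(Protocol):
    # One of the four extension points (Executor, Processor,
    # Renderer, Parser). A provider crate supplies concrete impls.
    def execute(self, messages: list[dict]) -> dict: ...

class EchoExecutor:
    # A stub provider satisfying the interface structurally.
    def execute(self, messages: list[dict]) -> dict:
        return {"content": messages[-1]["content"]}

def run(executor: Executor, messages: list[dict]) -> dict:
    # Core pipeline code depends only on the interface,
    # never on a concrete provider type.
    return executor.execute(messages)

print(run(EchoExecutor(), [{"role": "user", "content": "hi"}]))
```

In Rust the same separation is enforced at compile time via trait bounds, which is what allows provider crates like prompty-openai to be added or removed independently of the core.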