# TypeScript
## Packages

The TypeScript runtime is split into focused packages:

- `@prompty/core` — loading, pipeline functions, agent loop, and streaming
- `@prompty/openai` — OpenAI provider
- `@prompty/foundry` — Microsoft Foundry provider
- `@prompty/anthropic` — Anthropic provider
## Installation

```shell
# Core + OpenAI
npm install @prompty/core@alpha @prompty/openai@alpha

# Core + Microsoft Foundry
npm install @prompty/core@alpha @prompty/foundry@alpha

# Core + Anthropic
npm install @prompty/core@alpha @prompty/anthropic@alpha
```

## Quick Start
```typescript
import { invoke } from "@prompty/core";
import "@prompty/openai"; // registers the OpenAI provider

const result = await invoke("greeting.prompty", {
  userName: "Jane",
});
console.log(result);
```

## API Overview
### Loading

```typescript
import { load } from "@prompty/core";

const agent = await load("chat.prompty");
console.log(agent.name); // "chat"
console.log(agent.model.id); // "gpt-4o"
```

### Pipeline Functions
```typescript
import { load, prepare, run, invoke } from "@prompty/core";
import "@prompty/openai";

// Step by step
const agent = await load("chat.prompty");
const messages = await prepare(agent, { question: "Hi" });
const result = await run(agent, messages);

// All-in-one
const result2 = await invoke("chat.prompty", { question: "Hi" });
```

### Agent Mode
```typescript
import { turn, ExecuteError } from "@prompty/core";
import "@prompty/openai";

function getWeather(city: string): string {
  return `72°F and sunny in ${city}`;
}

const result = await turn(
  "agent.prompty",
  { question: "Weather in Seattle?" },
  { get_weather: getWeather },
  { maxIterations: 10, maxLlmRetries: 3 },
);
```

The agent loop includes built-in resilience:
- Resilient JSON parsing — recovers from malformed tool arguments (markdown fences, trailing commas)
- Tool error safety — tool exceptions are caught and fed back to the LLM
- LLM call retry — transient failures are retried with exponential backoff; `ExecuteError` carries the full conversation for resumption
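The first bullet can be sketched in a few lines. This is a hypothetical illustration of fence stripping and trailing-comma repair, not the library's actual parser; `parseToolArgs` is an invented name:

```typescript
// Hypothetical sketch of resilient tool-argument parsing: strip a markdown
// code-fence wrapper, then retry after removing trailing commas.
function parseToolArgs(raw: string): unknown {
  let text = raw.trim();
  // Unwrap a fenced block such as one tagged "json"
  const fence = text.match(/^```[a-zA-Z]*\s*([\s\S]*?)\s*```$/);
  if (fence) text = fence[1];
  try {
    return JSON.parse(text);
  } catch {
    // Remove trailing commas before } or ] and try once more
    const repaired = text.replace(/,\s*([}\]])/g, "$1");
    return JSON.parse(repaired);
  }
}
```

The real implementation may recover from more failure modes, but the shape is the same: normalize, parse, repair, re-parse.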
### Streaming

```typescript
import { load, prepare, run, process as processResponse } from "@prompty/core";
import "@prompty/openai";

const agent = await load("chat.prompty");
const messages = await prepare(agent, { question: "Tell me a story" });

// Enable streaming
agent.model.options.additionalProperties = { stream: true };

const response = await run(agent, messages, { raw: true });
for await (const chunk of processResponse(agent, response)) {
  process.stdout.write(chunk);
}
```
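Because `processResponse` is consumed with `for await`, any `AsyncIterable<string>` behaves the same way, which makes stream-handling code easy to test in isolation. A self-contained sketch, where `fakeChunks` and `collect` are illustrative helpers rather than part of the API:

```typescript
// A stand-in for a streaming response: yields string chunks one at a time.
async function* fakeChunks(): AsyncGenerator<string> {
  for (const piece of ["Once ", "upon ", "a time"]) {
    yield piece;
  }
}

// Concatenate a chunk stream into a single string.
async function collect(stream: AsyncIterable<string>): Promise<string> {
  let text = "";
  for await (const chunk of stream) {
    text += chunk;
  }
  return text;
}
```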
## Providers

Each provider package registers executors and processors via side-effect imports:
| Package | Provider Key | SDK |
| --- | --- | --- |
| `@prompty/openai` | `openai` | OpenAI Node SDK |
| `@prompty/foundry` | `foundry` | Microsoft Foundry SDK |
| `@prompty/anthropic` | `anthropic` | Anthropic SDK |
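The side-effect registration pattern itself is simple. A minimal sketch, assuming a map-based registry; the names `registry`, `registerProvider`, and `Executor` are invented for illustration and are not the library's real internals:

```typescript
// A provider executor takes prepared messages and returns a completion.
type Executor = (messages: string[]) => Promise<string>;

// Core owns a registry keyed by provider name.
const registry = new Map<string, Executor>();

function registerProvider(key: string, executor: Executor): void {
  registry.set(key, executor);
}

// A provider module runs this at import time, so a bare side-effect import
// of the package is enough to make its provider key resolvable.
registerProvider("openai", async (messages) => `echo: ${messages.join(" ")}`);
```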
## Environment Variables
Set environment variables or use a `.env` file (with dotenv):

```shell
OPENAI_API_KEY=sk-your-key-here
```

The `${env:VAR}` syntax in `.prompty` files works the same way as in Python.
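A rough sketch of how `${env:VAR}` substitution can work; `resolveEnv` is a hypothetical helper, not the runtime's actual resolver, and in practice it would be called with `process.env`:

```typescript
// Replace every ${env:VAR} occurrence with the value from the given
// environment map; unknown variables are left untouched.
function resolveEnv(
  value: string,
  env: Record<string, string | undefined>,
): string {
  return value.replace(/\$\{env:([A-Z0-9_]+)\}/g, (match, name) => env[name] ?? match);
}
```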
## Further Reading

- Core Concepts — architecture deep-dives
- How-To Guides — practical recipes
- Schema Reference — all frontmatter properties