# Structured Output

## Overview
By default, an LLM returns free-form text. When you define an `outputs` block
in your `.prompty` frontmatter, the runtime converts it to the provider's
`response_format` parameter, so the model is constrained to return valid JSON
matching your schema. The processor then automatically parses the JSON string into
a Python `dict` or JavaScript object — no manual `JSON.parse()` needed.
```mermaid
flowchart LR
A["outputs\n(YAML)"] --> B["_output_schema\n_to_wire()\nconversion"]
B --> C["response_format\njson_schema\n→ sent to LLM"]
C --> D["LLM response\nvalid JSON"]
D --> E["Processor\nJSON.parse"]
E --> F["Typed\ndict / obj"]
style A fill:#dbeafe,stroke:#3b82f6,color:#1e40af
style B fill:#bfdbfe,stroke:#1d4ed8,color:#1e3a8a
style C fill:#fef3c7,stroke:#f59e0b,color:#92400e
style D fill:#e5e7eb,stroke:#6b7280,color:#374151
style E fill:#a7f3d0,stroke:#10b981,color:#065f46
style F fill:#d1fae5,stroke:#10b981,color:#065f46
```
## Defining Output Schema

Add an `outputs` block to your frontmatter with the properties you expect in
the response. Each property has a `kind` (type) and an optional `description`.
```yaml
---
name: weather-report
model:
  id: gpt-4o-mini
  provider: openai
  apiType: chat
  connection:
    kind: key
    apiKey: ${env:OPENAI_API_KEY}
outputs:
  - name: city
    kind: string
    description: The city name
  - name: temperature
    kind: integer
    description: Temperature in degrees Fahrenheit
  - name: conditions
    kind: string
    description: Current weather conditions
---
system:
You are a weather assistant. Return the current weather for the requested city.

user:
What's the weather in {{city}}?
```

The runtime converts this to an OpenAI-compatible `response_format` with
`type: "json_schema"`, ensuring the LLM must return a JSON object with exactly
those three fields.
## How It Works

Under the hood, the executor performs three steps when `outputs` is present:
1. **Schema conversion** — `_output_schema_to_wire()` translates each `Property` (with `kind`, `description`, `required`) into a standard JSON Schema object. The result is wrapped in an OpenAI `response_format` parameter:

   ```json
   {
     "type": "json_schema",
     "json_schema": {
       "name": "output_schema",
       "strict": true,
       "schema": {
         "type": "object",
         "properties": {
           "city": { "type": "string", "description": "The city name" },
           "temperature": { "type": "integer", "description": "Temperature in degrees Fahrenheit" },
           "conditions": { "type": "string", "description": "Current weather conditions" }
         },
         "required": ["city", "temperature", "conditions"],
         "additionalProperties": false
       }
     }
   }
   ```

2. **LLM constrained generation** — the model is forced to return valid JSON matching the schema. No malformed output, no missing fields.

3. **Processor auto-parse** — the processor detects that `outputs` is defined and calls `json.loads()` on the response content, returning a native `dict` (Python) or object (JavaScript) instead of a raw string.
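Conceptually, step 1 is a small transformation. Here is a simplified sketch in plain Python — `outputs_to_response_format` is an illustrative stand-in for the runtime's `_output_schema_to_wire()`, not the actual implementation, and `required` is simplified to "all fields" (which matches strict mode, where every property is required):

```python
# Illustrative only: convert a list of outputs entries (as dicts mirroring
# the frontmatter) into an OpenAI-style response_format parameter.

def outputs_to_response_format(outputs):
    properties = {}
    for prop in outputs:
        entry = {"type": prop["kind"]}
        if "description" in prop:
            entry["description"] = prop["description"]
        properties[prop["name"]] = entry
    return {
        "type": "json_schema",
        "json_schema": {
            "name": "output_schema",
            "strict": True,
            "schema": {
                "type": "object",
                "properties": properties,
                "required": [p["name"] for p in outputs],
                "additionalProperties": False,
            },
        },
    }

wire = outputs_to_response_format([
    {"name": "city", "kind": "string", "description": "The city name"},
    {"name": "temperature", "kind": "integer", "description": "Temperature in degrees Fahrenheit"},
    {"name": "conditions", "kind": "string", "description": "Current weather conditions"},
])
print(wire["json_schema"]["schema"]["required"])  # ['city', 'temperature', 'conditions']
```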
With structured output, `invoke()` returns a `StructuredResult` — a dict/object
subclass that you can use just like a regular dictionary, but that also carries
the raw JSON for efficient type casting.
```python
from prompty import invoke

result = invoke("weather.prompty", inputs={"city": "Seattle"})

# result is a StructuredResult (dict subclass) — use like a dict
print(result["city"])         # "Seattle"
print(result["temperature"])  # 62
print(result["conditions"])   # "Partly cloudy"
print(type(result))           # <class 'StructuredResult'>
isinstance(result, dict)      # True — fully backward compatible
```

```python
from prompty import invoke_async

result = await invoke_async("weather.prompty", inputs={"city": "Seattle"})
print(result["temperature"])  # 62
```

```typescript
import { invoke } from "@prompty/core";

const result = await invoke("weather.prompty", { city: "Seattle" });

// result is a StructuredResult — use like a normal object
console.log(result.city);        // "Seattle"
console.log(result.temperature); // 62
console.log(result.conditions);  // "Partly cloudy"
```

```csharp
using Prompty.Core;

var result = await Pipeline.InvokeAsync("weather.prompty", new() { ["city"] = "Seattle" });

// result is a StructuredResult (Dictionary<string, object?> subclass)
if (result is StructuredResult sr)
{
    Console.WriteLine(sr["city"]);        // "Seattle"
    Console.WriteLine(sr["temperature"]); // 62
    Console.WriteLine(sr["conditions"]);  // "Partly cloudy"
}
```

```rust
// The outputSchema in the .prompty file is automatically converted
// to OpenAI's response_format parameter. The result comes back as
// a parsed serde_json::Value matching the schema.
let inputs = serde_json::json!({"city": "Seattle"});
let result = prompty::invoke_from_path("weather.prompty", Some(&inputs)).await?;

// result is a serde_json::Value — use like a JSON object
println!("{}", result["city"]);        // "Seattle"
println!("{}", result["temperature"]); // 62
println!("{}", result["conditions"]);  // "Partly cloudy"
```

## Without Output Schema
If you don't define `outputs`, the processor returns the raw text content
from the LLM response. You can still ask the model to return JSON in your prompt
instructions, but there's no schema enforcement or automatic parsing.
```python
# outputs defined → dict returned automatically
result = invoke("weather.prompty", inputs={"city": "Seattle"})
print(type(result))           # <class 'dict'>
print(result["temperature"])  # 62

# No outputs → raw string returned
result = invoke("chat.prompty", inputs={"city": "Seattle"})
print(type(result))           # <class 'str'>

# You'd need to parse manually:
import json
data = json.loads(result)  # may fail if LLM didn't return valid JSON
```

## Casting to Typed Objects
`StructuredResult` is backward-compatible with dict/object usage, but when you
need a typed object (dataclass, Pydantic model, class, Zod schema), use `cast()`.
It deserializes directly from the raw JSON string — no intermediate dict→JSON→T
round-trip.
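The fast path behind `cast()` can be sketched as follows. This is a simplified illustration of the idea, not the library's actual internals — the `_raw_json` attribute name and the dispatch logic are assumptions:

```python
import json
from dataclasses import dataclass

class StructuredResult(dict):
    """Illustrative stand-in: a dict subclass that also keeps the raw JSON."""
    def __init__(self, raw_json: str):
        super().__init__(json.loads(raw_json))
        self._raw_json = raw_json  # hypothetical attribute name

def cast(result, target):
    # Pydantic models expose model_validate_json(), which parses the raw
    # string directly, skipping a dict -> JSON string -> model round-trip.
    if hasattr(target, "model_validate_json"):
        return target.model_validate_json(result._raw_json)
    # Fallback for plain classes / dataclasses: build from the parsed dict.
    return target(**result)

@dataclass
class Weather:
    city: str
    temperature: int
    conditions: str

r = StructuredResult('{"city": "Seattle", "temperature": 62, "conditions": "Partly cloudy"}')
w = cast(r, Weather)
print(w.city, w.temperature)  # Seattle 62
```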
```python
from dataclasses import dataclass
from prompty import invoke, cast

@dataclass
class Weather:
    city: str
    temperature: int
    conditions: str

result = invoke("weather.prompty", inputs={"city": "Seattle"})
weather = cast(result, Weather)

print(weather.city)         # "Seattle"
print(weather.temperature)  # 62
print(type(weather))        # <class 'Weather'>
```

Pydantic models use the optimal `model_validate_json()` path:
```python
from pydantic import BaseModel
from prompty import invoke, cast

class Weather(BaseModel):
    city: str
    temperature: int
    conditions: str

weather = cast(invoke("weather.prompty", inputs={"city": "Seattle"}), Weather)
```

Or use the `target_type` shorthand on `invoke` / `turn`:

```python
weather = invoke("weather.prompty", inputs={"city": "Seattle"}, target_type=Weather)
```

```typescript
import { invoke, cast } from "@prompty/core";
import { z } from "zod";

const WeatherSchema = z.object({
  city: z.string(),
  temperature: z.number(),
  conditions: z.string(),
});
type Weather = z.infer<typeof WeatherSchema>;

const result = await invoke("weather.prompty", { city: "Seattle" });
const weather = cast<Weather>(result, WeatherSchema.parse);

console.log(weather.city); // "Seattle"
```

Or use the typed `invoke` overload with a validator:

```typescript
const weather = await invoke(
  "weather.prompty",
  { city: "Seattle" },
  { validator: WeatherSchema.parse }
);
// weather is typed as Weather
```

```csharp
using Prompty.Core;

var result = await Pipeline.InvokeAsync("weather.prompty", new() { ["city"] = "Seattle" });

// Option 1: Cast after invoke
var weather = ((StructuredResult)result).Cast<WeatherReport>();
Console.WriteLine(weather.City); // "Seattle"

// Option 2: Generic invoke — cast in one step
var weather2 = await Pipeline.InvokeAsync<WeatherReport>("weather.prompty", new() { ["city"] = "Seattle" });
```

```rust
use serde::Deserialize;

#[derive(Deserialize, Debug)]
struct Weather {
    city: String,
    temperature: i32,
    conditions: String,
}

let inputs = serde_json::json!({"city": "Seattle"});
let result = prompty::invoke_from_path("weather.prompty", Some(&inputs)).await?;

// Deserialize the serde_json::Value into a typed struct
let weather: Weather = serde_json::from_value(result)?;

println!("{}", weather.city);        // "Seattle"
println!("{}", weather.temperature); // 62
```

## Nested Objects
For complex responses, use `kind: object` with nested `properties` to define
multi-level schemas:
```yaml
---
name: detailed-weather
model:
  id: gpt-4o-mini
  provider: openai
  apiType: chat
  connection:
    kind: key
    apiKey: ${env:OPENAI_API_KEY}
outputs:
  - name: city
    kind: string
  - name: current
    kind: object
    properties:
      - name: temperature
        kind: integer
        description: Temperature in °F
      - name: humidity
        kind: integer
        description: Humidity percentage
      - name: conditions
        kind: string
  - name: forecast
    kind: array
    description: Next 3 days
---
system:
Return current weather and a 3-day forecast for the requested city.

user:
Weather for {{city}}?
```

The result is a nested dictionary:
```python
result = invoke("detailed-weather.prompty", inputs={"city": "Portland"})

print(result["city"])                    # "Portland"
print(result["current"]["temperature"])  # 58
print(result["current"]["humidity"])     # 72
print(result["forecast"])                # [{"day": "Mon", ...}, ...]
```
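A nested `outputs` entry maps to a nested JSON Schema the same way the flat case does — the conversion just recurses into `properties`. A sketch of that recursion (illustrative only, not the runtime's actual code; scalar kinds are passed through and `array` handling is omitted):

```python
def prop_to_schema(prop):
    # Recursively convert one outputs entry into a JSON Schema fragment.
    kind = prop["kind"]
    if kind == "object":
        nested = prop.get("properties", [])
        return {
            "type": "object",
            "properties": {p["name"]: prop_to_schema(p) for p in nested},
            "required": [p["name"] for p in nested],
            "additionalProperties": False,
        }
    schema = {"type": kind}
    if "description" in prop:
        schema["description"] = prop["description"]
    return schema

# The `current` entry from the frontmatter above, as a plain dict:
current = {
    "name": "current",
    "kind": "object",
    "properties": [
        {"name": "temperature", "kind": "integer", "description": "Temperature in °F"},
        {"name": "humidity", "kind": "integer", "description": "Humidity percentage"},
        {"name": "conditions", "kind": "string"},
    ],
}
schema = prop_to_schema(current)
print(schema["properties"]["temperature"])  # {'type': 'integer', 'description': 'Temperature in °F'}
```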