# Use with Microsoft Foundry
## Prerequisites

- A Microsoft Foundry project or an Azure OpenAI resource with a deployed model (e.g., `gpt-4o-mini`)
- The resource endpoint:
  - Foundry project endpoint (recommended): `https://<resource>.services.ai.azure.com/api/projects/<project>`
  - Classic Azure OpenAI endpoint (legacy): `https://<resource>.openai.azure.com/`
- Either an API key or Microsoft Entra ID credentials
```sh
# API key auth
pip install prompty[jinja2,foundry]

# Entra ID auth (adds azure-identity)
pip install prompty[jinja2,foundry] azure-identity
```

```sh
npm install @prompty/core @prompty/foundry
```

```sh
dotnet add package Prompty.Core --prerelease
dotnet add package Prompty.Foundry --prerelease
```

```sh
cargo add prompty prompty-foundry
```

## Option A: API Key Authentication

The simplest approach: use an API key from the Azure AI Foundry portal or the Azure Portal.
### 1. Write the .prompty File

Create `foundry-chat.prompty`:

```yaml
---
name: foundry-chat
description: Chat completion with Microsoft Foundry (API key)
model:
  id: gpt-4o-mini
  provider: foundry
  apiType: chat
  connection:
    kind: key
    endpoint: ${env:AZURE_AI_PROJECT_ENDPOINT}
    apiKey: ${env:AZURE_AI_PROJECT_KEY}
  options:
    temperature: 0.7
    maxOutputTokens: 1024
inputs:
  - name: question
    kind: string
    default: What is Microsoft Foundry?
---
system:
You are a helpful assistant. Answer concisely.

user:
{{question}}
```

### 2. Run It
```python
import prompty

result = prompty.invoke(
    "foundry-chat.prompty",
    inputs={"question": "What services does Microsoft Foundry offer?"},
)
print(result)
```

```ts
import { invoke } from "@prompty/core";
import "@prompty/foundry";

const result = await invoke("foundry-chat.prompty", {
  question: "What services does Microsoft Foundry offer?",
});
console.log(result);
```

```csharp
using Prompty.Core;

var result = await Pipeline.InvokeAsync(
    "foundry-chat.prompty",
    new() { ["question"] = "What services does Microsoft Foundry offer?" });
Console.WriteLine(result);
```

```rust
use serde_json::json;

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    prompty::register_defaults();
    prompty_foundry::register();

    let result = prompty::invoke_from_path(
        "foundry-chat.prompty",
        Some(&json!({ "question": "What services does Microsoft Foundry offer?" })),
    ).await?;
    println!("{result}");
    Ok(())
}
```

### 3. Environment Setup

```sh
# .env — Foundry project endpoint (recommended)
AZURE_AI_PROJECT_ENDPOINT=https://my-resource.services.ai.azure.com/api/projects/my-project
AZURE_AI_PROJECT_KEY=abc123...

# .env — Classic Azure OpenAI endpoint (legacy, also works)
# AZURE_OPENAI_ENDPOINT=https://my-resource.openai.azure.com/
# AZURE_OPENAI_API_KEY=abc123...
```

## Option B: Microsoft Entra ID Authentication
For production workloads, use Microsoft Entra ID: there are no API keys to manage or rotate. This section covers two approaches:

- `DefaultAzureCredential`: convenient for local development; automatically discovers your Azure CLI, VS Code, or managed identity credentials.
- `ManagedIdentityCredential`: recommended for deployed services (App Service, Container Apps, AKS, Functions). Explicit, with no fallback chain and no risk of accidentally using developer credentials in production.
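One way to keep a single code path across both environments is a small runtime switch. A minimal sketch, assuming you key off `WEBSITE_INSTANCE_ID` (which Azure App Service sets automatically) plus a hypothetical opt-in flag `USE_MANAGED_IDENTITY` for other hosts; neither check is part of Prompty itself:

```python
import os


def credential_kind() -> str:
    """Decide which azure-identity credential class to construct.

    WEBSITE_INSTANCE_ID is set automatically on Azure App Service;
    USE_MANAGED_IDENTITY is a hypothetical opt-in flag for other hosts.
    """
    if os.environ.get("WEBSITE_INSTANCE_ID") or os.environ.get("USE_MANAGED_IDENTITY"):
        # Deployed: explicit credential, no fallback chain
        return "ManagedIdentityCredential"
    # Local development: discovers az login / VS Code credentials
    return "DefaultAzureCredential"
```

The returned name maps directly onto the `azure.identity` class you instantiate when registering the connection below.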
### 1. Write the .prompty File

Create `foundry-chat-aad.prompty`, which references a named connection instead of embedding credentials (the system/user template mirrors the Option A file):

```yaml
---
name: foundry-chat-aad
description: Chat completion with Microsoft Foundry (Microsoft Entra ID)
model:
  id: gpt-4o-mini
  provider: foundry
  apiType: chat
  connection:
    kind: reference
    name: foundry_default
  options:
    temperature: 0.7
    maxOutputTokens: 1024
inputs:
  - name: question
    kind: string
    default: What is Microsoft Foundry?
---
system:
You are a helpful assistant. Answer concisely.

user:
{{question}}
```

### 2. Register the Connection and Run (Local Development)

During development, `DefaultAzureCredential` automatically finds your Azure CLI or VS Code login:
```python
import os

from azure.identity import DefaultAzureCredential
import prompty

# Register the named connection before loading the prompt
prompty.register_connection(
    "foundry_default",
    {
        "endpoint": os.environ["AZURE_AI_PROJECT_ENDPOINT"],
        "credential": DefaultAzureCredential(),
    },
)

result = prompty.invoke(
    "foundry-chat-aad.prompty",
    inputs={"question": "What is Microsoft Foundry?"},
)
print(result)
```

```ts
import { invoke, registerConnection } from "@prompty/core";
import "@prompty/foundry";
import { DefaultAzureCredential } from "@azure/identity";

registerConnection("foundry_default", {
  endpoint: process.env.AZURE_AI_PROJECT_ENDPOINT!,
  credential: new DefaultAzureCredential(),
});

const result = await invoke("foundry-chat-aad.prompty", {
  question: "What is Microsoft Foundry?",
});
console.log(result);
```

```csharp
using Azure.Identity;
using Prompty.Core;

// DefaultAzureCredential finds your Azure CLI / VS Code login
Pipeline.RegisterConnection("foundry_default", new Dictionary<string, object>
{
    ["endpoint"] = Environment.GetEnvironmentVariable("AZURE_AI_PROJECT_ENDPOINT")!,
    ["credential"] = new DefaultAzureCredential(),
});

var result = await Pipeline.InvokeAsync(
    "foundry-chat-aad.prompty",
    new() { ["question"] = "What is Microsoft Foundry?" });
Console.WriteLine(result);
```

```rust
use serde_json::json;

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    prompty::register_defaults();
    prompty_foundry::register();

    // Register the named connection before loading the prompt
    prompty::register_connection(
        "foundry_default",
        json!({
            "endpoint": std::env::var("AZURE_AI_PROJECT_ENDPOINT")?,
            "credential": "default",
        }),
    );

    let result = prompty::invoke_from_path(
        "foundry-chat-aad.prompty",
        Some(&json!({ "question": "What is Microsoft Foundry?" })),
    ).await?;
    println!("{result}");
    Ok(())
}
```

### 3. Production Deployment: Use Managed Identity
Step 1: Enable managed identity on your Azure hosting resource:

```sh
# App Service
az webapp identity assign --name <app-name> --resource-group <rg>

# Container Apps
az containerapp identity assign --name <app-name> --resource-group <rg> --system-assigned

# Azure Kubernetes Service (workload identity)
az aks update --name <cluster> --resource-group <rg> --enable-oidc-issuer --enable-workload-identity
```

Step 2: Assign the RBAC role, granting only what's needed (principle of least privilege):

```sh
az role assignment create \
  --assignee <managed-identity-object-id> \
  --role "Cognitive Services OpenAI User" \
  --scope /subscriptions/<sub>/resourceGroups/<rg>/providers/Microsoft.CognitiveServices/accounts/<resource>
```

Step 3: Use `ManagedIdentityCredential` in your code:
```python
import os

from azure.identity import ManagedIdentityCredential
import prompty

prompty.register_connection(
    "foundry_default",
    {
        "endpoint": os.environ["AZURE_AI_PROJECT_ENDPOINT"],
        "credential": ManagedIdentityCredential(),
    },
)

result = prompty.invoke(
    "foundry-chat-aad.prompty",
    inputs={"question": "What is Microsoft Foundry?"},
)
print(result)
```

```ts
import { invoke, registerConnection } from "@prompty/core";
import "@prompty/foundry";
import { ManagedIdentityCredential } from "@azure/identity";

registerConnection("foundry_default", {
  endpoint: process.env.AZURE_AI_PROJECT_ENDPOINT!,
  credential: new ManagedIdentityCredential(),
});

const result = await invoke("foundry-chat-aad.prompty", {
  question: "What is Microsoft Foundry?",
});
console.log(result);
```

```csharp
using Azure.Identity;
using Prompty.Core;

// For production: explicitly use ManagedIdentityCredential
Pipeline.RegisterConnection("foundry_default", new Dictionary<string, object>
{
    ["endpoint"] = Environment.GetEnvironmentVariable("AZURE_AI_PROJECT_ENDPOINT")!,
    ["credential"] = new ManagedIdentityCredential(),
});

var result = await Pipeline.InvokeAsync(
    "foundry-chat-aad.prompty",
    new() { ["question"] = "What is Microsoft Foundry?" });
Console.WriteLine(result);
```

```rust
use serde_json::json;

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    prompty::register_defaults();
    prompty_foundry::register();

    // For production: use the managed identity credential
    prompty::register_connection(
        "foundry_default",
        json!({
            "endpoint": std::env::var("AZURE_AI_PROJECT_ENDPOINT")?,
            "credential": "managed_identity",
        }),
    );

    let result = prompty::invoke_from_path(
        "foundry-chat-aad.prompty",
        Some(&json!({ "question": "What is Microsoft Foundry?" })),
    ).await?;
    println!("{result}");
    Ok(())
}
```

## Comparing Authentication Options
|  | API Key (`kind: key`) | Entra ID — `DefaultAzureCredential` | Entra ID — `ManagedIdentityCredential` |
|---|---|---|---|
| Setup | Copy key from portal | `az login` or VS Code | Enable managed identity + RBAC |
| Security | Key in env var; can leak | Token-based; no static secret | Token-based; no static secret |
| Credential source | Static string | Auto-discovers (CLI, VS Code, MI, …) | Managed identity only; no fallbacks |
| Rotation | Manual | Automatic | Automatic |
| Best for | Quick prototyping | Local development | Production deployments |
## Using a Shared Connection File

If multiple `.prompty` files share the same Foundry config, extract it into a JSON file and reference it with `${file:...}`:

`shared/foundry-connection.json`:

```json
{
  "kind": "key",
  "endpoint": "https://my-resource.services.ai.azure.com/api/projects/my-project",
  "apiKey": "${env:AZURE_AI_PROJECT_KEY}"
}
```

`my-prompt.prompty`:

```yaml
---
name: shared-connection-demo
model:
  id: gpt-4o-mini
  provider: foundry
  connection: ${file:shared/foundry-connection.json}
---
system:
You are a helpful assistant.

user:
{{question}}
```

This avoids duplicating endpoint and auth config across prompts.
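To illustrate the mechanism, here is a hypothetical sketch of how `${file:path}` and `${env:NAME}` placeholders could be resolved recursively; this is not Prompty's actual resolver, just a model of the substitution behavior described above:

```python
import json
import os
import re


def resolve(value):
    """Recursively expand ${file:path} and ${env:NAME} placeholders.

    A hypothetical sketch of the substitution mechanism, not the
    actual Prompty implementation.
    """
    if isinstance(value, dict):
        return {key: resolve(item) for key, item in value.items()}
    if isinstance(value, str):
        file_match = re.fullmatch(r"\$\{file:(.+)\}", value)
        if file_match:
            # Load the referenced JSON file, then resolve placeholders inside it
            with open(file_match.group(1)) as handle:
                return resolve(json.load(handle))
        # Substitute every ${env:NAME} with the variable's current value
        return re.sub(
            r"\$\{env:([A-Za-z_][A-Za-z0-9_]*)\}",
            lambda m: os.environ.get(m.group(1), ""),
            value,
        )
    return value
```

Under this model, resolving `{"connection": "${file:shared/foundry-connection.json}"}` inlines the shared file and then fills `${env:AZURE_AI_PROJECT_KEY}` from the environment.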
## Environment Setup (Both Options)

```sh
AZURE_AI_PROJECT_ENDPOINT=https://my-resource.services.ai.azure.com/api/projects/my-project
AZURE_AI_PROJECT_KEY=abc123... # Only needed for API key auth

# Optional: specify deployment names per model type
AZURE_OPENAI_CHAT_DEPLOYMENT=gpt-4o-mini
AZURE_OPENAI_EMBEDDING_DEPLOYMENT=text-embedding-3-small
```
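Note that runtimes do not read `.env` files automatically; in Python the usual choice is `python-dotenv` (`load_dotenv()`). As a dependency-free sketch, a minimal loader for simple `KEY=VALUE` files (no quoting, multi-line values, or inline comments) could look like:

```python
import os


def load_env(path: str = ".env") -> None:
    """Minimal .env loader: KEY=VALUE lines, full-line '#' comments only.

    Existing environment variables win, mirroring python-dotenv's
    default of not overriding the real environment.
    """
    with open(path) as handle:
        for line in handle:
            line = line.strip()
            if not line or line.startswith("#") or "=" not in line:
                continue
            key, _, value = line.partition("=")
            os.environ.setdefault(key.strip(), value.strip())
```

Call `load_env()` once at startup, before `prompty.invoke`, so `${env:...}` references in your `.prompty` files can be resolved.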