@prompty/core 2.0.0-alpha.1 → 2.0.0-alpha.4

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
package/README.md CHANGED
@@ -1,105 +1,108 @@
- # @prompty/core
-
- Prompty core runtime — load, render, parse, and trace `.prompty` files in TypeScript/JavaScript.
-
- ## Installation
-
- ```bash
- npm install @prompty/core
- ```
-
- You'll also need a **provider package** for your LLM:
-
- ```bash
- # OpenAI
- npm install @prompty/openai
-
- # Azure AI Foundry
- npm install @prompty/foundry
- ```
-
- ## Quick Start
-
- ```typescript
- import "@prompty/openai"; // registers the OpenAI provider
- import { load, run } from "@prompty/core";
-
- // Load and run a .prompty file
- const result = await run("./my-prompt.prompty", {
-   question: "What is the capital of France?",
- });
- console.log(result);
- ```
-
- ## Step-by-Step Usage
-
- ```typescript
- import "@prompty/openai";
- import { load, prepare, execute, process } from "@prompty/core";
-
- // 1. Load — parse .prompty file into a typed Prompty object
- const agent = await load("./chat.prompty");
-
- // 2. Prepare — render template + parse into messages
- const messages = await prepare(agent, { name: "Alice" });
-
- // 3. Execute — call the LLM
- const response = await execute(agent, messages);
-
- // 4. Process — extract the result
- const result = await process(agent, response);
- ```
-
- ## What's a `.prompty` file?
-
- A Markdown file with YAML frontmatter that defines an LLM prompt:
-
- ```prompty
- ---
- name: greeting
- model:
-   id: gpt-4o-mini
-   provider: openai
-   connection:
-     kind: key
-     endpoint: ${env:OPENAI_BASE_URL}
-     apiKey: ${env:OPENAI_API_KEY}
- ---
- system:
- You are a helpful assistant.
-
- user:
- Hello, my name is {{name}}. {{question}}
- ```
-
- ## Tracing
-
- ```typescript
- import { run, Tracer, PromptyTracer } from "@prompty/core";
-
- // Write .tracy JSON files to .runs/
- const tracer = new PromptyTracer({ outputDir: ".runs" });
- Tracer.add("prompty", tracer.factory);
-
- const result = await run("./chat.prompty", { question: "Hi" });
- console.log("Trace:", tracer.lastTracePath);
- ```
-
- ## Key Exports
-
- | Export | Description |
- |--------|-------------|
- | `load` | Parse a `.prompty` file → `Prompty` object |
- | `prepare` | Render template + parse → `Message[]` |
- | `execute` | Call the LLM provider |
- | `process` | Extract result from LLM response |
- | `run` | All-in-one: load → prepare → execute → process |
- | `executeAgent` | Agent loop with tool calling |
- | `Tracer` / `PromptyTracer` | Observability and tracing |
- | `registerExecutor` / `registerProcessor` | Register custom providers |
- | `registerConnection` | Register named connections |
- | `Message`, `ContentPart`, `PromptyStream` | Core types |
-
- ## License
-
- MIT
+ # @prompty/core
+
+ Prompty core runtime — load, render, parse, and trace `.prompty` files in TypeScript/JavaScript.
+
+ ## Installation
+
+ ```bash
+ npm install @prompty/core
+ ```
+
+ You'll also need a **provider package** for your LLM:
+
+ ```bash
+ # OpenAI
+ npm install @prompty/openai
+
+ # Azure AI Foundry
+ npm install @prompty/foundry
+
+ # Anthropic
+ npm install @prompty/anthropic
+ ```
+
+ ## Quick Start
+
+ ```typescript
+ import "@prompty/openai"; // registers the OpenAI provider
+ import { load, run } from "@prompty/core";
+
+ // Load and run a .prompty file
+ const result = await run("./my-prompt.prompty", {
+   question: "What is the capital of France?",
+ });
+ console.log(result);
+ ```
+
+ ## Step-by-Step Usage
+
+ ```typescript
+ import "@prompty/openai";
+ import { load, prepare, execute, process } from "@prompty/core";
+
+ // 1. Load — parse .prompty file into a typed Prompty object
+ const agent = await load("./chat.prompty");
+
+ // 2. Prepare — render template + parse into messages
+ const messages = await prepare(agent, { name: "Alice" });
+
+ // 3. Execute — call the LLM
+ const response = await execute(agent, messages);
+
+ // 4. Process — extract the result
+ const result = await process(agent, response);
+ ```
+
+ ## What's a `.prompty` file?
+
+ A Markdown file with YAML frontmatter that defines an LLM prompt:
+
+ ```prompty
+ ---
+ name: greeting
+ model:
+   id: gpt-4o-mini
+   provider: openai
+   connection:
+     kind: key
+     endpoint: ${env:OPENAI_BASE_URL}
+     apiKey: ${env:OPENAI_API_KEY}
+ ---
+ system:
+ You are a helpful assistant.
+
+ user:
+ Hello, my name is {{name}}. {{question}}
+ ```
+
+ ## Tracing
+
+ ```typescript
+ import { run, Tracer, PromptyTracer } from "@prompty/core";
+
+ // Write .tracy JSON files to .runs/
+ const tracer = new PromptyTracer({ outputDir: ".runs" });
+ Tracer.add("prompty", tracer.factory);
+
+ const result = await run("./chat.prompty", { question: "Hi" });
+ console.log("Trace:", tracer.lastTracePath);
+ ```
+
+ ## Key Exports
+
+ | Export | Description |
+ |--------|-------------|
+ | `load` | Parse a `.prompty` file → `Prompty` object |
+ | `prepare` | Render template + parse → `Message[]` |
+ | `execute` | Call the LLM provider |
+ | `process` | Extract result from LLM response |
+ | `run` | All-in-one: load → prepare → execute → process |
+ | `executeAgent` | Agent loop with tool calling |
+ | `Tracer` / `PromptyTracer` | Observability and tracing |
+ | `registerExecutor` / `registerProcessor` | Register custom providers |
+ | `registerConnection` | Register named connections |
+ | `Message`, `ContentPart`, `PromptyStream` | Core types |
+
+ ## License
+
+ MIT