@prompty/core 0.1.4 → 2.0.0-alpha.3

package/README.md CHANGED
@@ -1,2 +1,108 @@
- # promptyjs
- A prompty js runtime.
+ # @prompty/core
+
+ Prompty core runtime: load, render, parse, and trace `.prompty` files in TypeScript/JavaScript.
+
+ ## Installation
+
+ ```bash
+ npm install @prompty/core
+ ```
+
+ You'll also need a **provider package** for your LLM:
+
+ ```bash
+ # OpenAI
+ npm install @prompty/openai
+
+ # Azure AI Foundry
+ npm install @prompty/foundry
+
+ # Anthropic
+ npm install @prompty/anthropic
+ ```
+
+ ## Quick Start
+
+ ```typescript
+ import "@prompty/openai"; // registers the OpenAI provider
+ import { run } from "@prompty/core";
+
+ // Load and run a .prompty file in one call
+ const result = await run("./my-prompt.prompty", {
+   question: "What is the capital of France?",
+ });
+ console.log(result);
+ ```
+
+ ## Step-by-Step Usage
+
+ ```typescript
+ import "@prompty/openai";
+ import { load, prepare, execute, process } from "@prompty/core";
+
+ // 1. Load: parse the .prompty file into a typed Prompty object
+ const agent = await load("./chat.prompty");
+
+ // 2. Prepare: render the template and parse it into messages
+ const messages = await prepare(agent, { name: "Alice" });
+
+ // 3. Execute: call the LLM
+ const response = await execute(agent, messages);
+
+ // 4. Process: extract the result
+ const result = await process(agent, response);
+ ```
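Conceptually, `run` is these four stages chained in order. The toy composition below illustrates that shape only; the stage implementations here are stand-ins, not `@prompty/core`'s actual code:

```typescript
// Toy stand-ins for the four pipeline stages (NOT the real @prompty/core API).
type Prompty = { template: string };
type Message = { role: string; content: string };

const load = async (src: string): Promise<Prompty> => ({ template: src });
const prepare = async (p: Prompty, inputs: Record<string, string>): Promise<Message[]> => [
  // Naive {{name}} substitution, standing in for real template rendering
  { role: "user", content: p.template.replace(/\{\{(\w+)\}\}/g, (_, k) => inputs[k] ?? "") },
];
const execute = async (_p: Prompty, msgs: Message[]): Promise<string> =>
  `echo: ${msgs[0].content}`; // pretend LLM call
const process = async (_p: Prompty, resp: string): Promise<string> => resp;

// `run` is just load → prepare → execute → process composed.
async function run(src: string, inputs: Record<string, string>): Promise<string> {
  const agent = await load(src);
  const messages = await prepare(agent, inputs);
  const response = await execute(agent, messages);
  return process(agent, response);
}
```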
+
+ ## What's a `.prompty` file?
+
+ A Markdown file with YAML frontmatter that defines an LLM prompt:
+
+ ```prompty
+ ---
+ name: greeting
+ model:
+   id: gpt-4o-mini
+   provider: openai
+   connection:
+     kind: key
+     endpoint: ${env:OPENAI_BASE_URL}
+     apiKey: ${env:OPENAI_API_KEY}
+ ---
+ system:
+ You are a helpful assistant.
+
+ user:
+ Hello, my name is {{name}}. {{question}}
+ ```
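The `${env:NAME}` placeholders in the frontmatter are filled from environment variables, so secrets like API keys stay out of the file. A minimal sketch of that substitution, for illustration only (this is not the library's actual resolver):

```typescript
// Illustrative resolver for `${env:NAME}` placeholders (not @prompty/core's code).
function resolveEnv(value: string, env: Record<string, string | undefined>): string {
  return value.replace(/\$\{env:(\w+)\}/g, (_, name) => {
    const v = env[name];
    // Failing fast on a missing variable beats sending an empty API key.
    if (v === undefined) throw new Error(`Missing environment variable: ${name}`);
    return v;
  });
}

// resolveEnv("${env:OPENAI_API_KEY}", { OPENAI_API_KEY: "sk-test" }) → "sk-test"
```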
+
+ ## Tracing
+
+ ```typescript
+ import { run, Tracer, PromptyTracer } from "@prompty/core";
+
+ // Write .tracy JSON files to .runs/
+ const tracer = new PromptyTracer({ outputDir: ".runs" });
+ Tracer.add("prompty", tracer.factory);
+
+ const result = await run("./chat.prompty", { question: "Hi" });
+ console.log("Trace:", tracer.lastTracePath);
+ ```
+
+ ## Key Exports
+
+ | Export | Description |
+ |--------|-------------|
+ | `load` | Parse a `.prompty` file → `Prompty` object |
+ | `prepare` | Render template + parse → `Message[]` |
+ | `execute` | Call the LLM provider |
+ | `process` | Extract result from LLM response |
+ | `run` | All-in-one: load → prepare → execute → process |
+ | `executeAgent` | Agent loop with tool calling |
+ | `Tracer` / `PromptyTracer` | Observability and tracing |
+ | `registerExecutor` / `registerProcessor` | Register custom providers |
+ | `registerConnection` | Register named connections |
+ | `Message`, `ContentPart`, `PromptyStream` | Core types |
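The `register*` hooks suggest a name-keyed registry: a provider package registers an executor under its provider name at import time, and execution looks it up by the prompt's `model.provider`. The toy registry below only illustrates that pattern; the names mirror the table, but every type and signature here is an assumption, not the real API:

```typescript
// Toy provider registry illustrating the registration pattern (NOT the real API).
type Executor = (messages: string[]) => Promise<string>;
const executors = new Map<string, Executor>();

function registerExecutor(provider: string, exec: Executor): void {
  executors.set(provider, exec);
}

// Roughly what `import "@prompty/openai"` does conceptually: register on import.
registerExecutor("openai", async (messages) => `openai says: ${messages.join(" ")}`);

async function execute(provider: string, messages: string[]): Promise<string> {
  const exec = executors.get(provider);
  if (!exec) throw new Error(`No executor registered for provider "${provider}"`);
  return exec(messages);
}
```

This is why the provider import in the examples above appears unused: it runs for its registration side effect.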
+
+ ## License
+
+ MIT