@peac/adapter-openai-compatible 0.12.5 → 0.12.7

This diff compares the contents of two publicly released versions of the package, as published to the registry. It is provided for informational purposes only.
Files changed (2):
  1. package/README.md +35 -32
  2. package/package.json +17 -5
package/README.md CHANGED
@@ -1,23 +1,24 @@
  # @peac/adapter-openai-compatible

- OpenAI-compatible chat completion adapter for PEAC interaction evidence.
+ OpenAI-compatible chat completion adapter for PEAC interaction evidence using a hash-first model.

- Maps OpenAI-compatible chat completion responses into PEAC `InteractionEvidenceV01` using a **hash-first model** (DD-138): no raw prompt or completion text is stored in receipts. Only SHA-256 digests, model identifiers, token counts, and timing metadata are recorded.
-
- Works with any OpenAI-compatible provider (OpenAI, Anthropic Messages API via adapter, Ollama, vLLM, Together, etc.) without importing their SDKs.
-
- ## Install
+ ## Installation

  ```bash
  pnpm add @peac/adapter-openai-compatible
  ```

- ## Usage
+ ## What It Does
+
+ `@peac/adapter-openai-compatible` maps chat completion responses from any OpenAI-compatible provider into PEAC interaction evidence. It uses a hash-first model: only SHA-256 digests of messages and output are recorded, along with model identifiers, token counts, and timing metadata. No raw prompt or completion text ever appears in the evidence structure. Works with OpenAI, Anthropic (via adapter), Ollama, vLLM, Together, and any other OpenAI-compatible provider without importing their SDKs.
+
+ ## How Do I Use It?
+
+ ### Create evidence from a chat completion

  ```typescript
  import { fromChatCompletion } from '@peac/adapter-openai-compatible';

- // After making an OpenAI-compatible API call:
  const evidence = await fromChatCompletion({
    messages: [
      { role: 'system', content: 'You are a helpful assistant.' },
@@ -37,45 +38,47 @@ const evidence = await fromChatCompletion({
      ],
      usage: { prompt_tokens: 20, completion_tokens: 5, total_tokens: 25 },
    },
-   provider: 'openai', // optional
+   provider: 'openai',
  });

- // evidence.input.digest -> SHA-256 of messages (no raw text)
- // evidence.output.digest -> SHA-256 of output (no raw text)
+ // evidence.input.digest -> SHA-256 of canonicalized messages (no raw text)
+ // evidence.output.digest -> SHA-256 of output content (no raw text)
  // evidence.extensions -> model, usage, finish_reason
  ```

- ## Hash-First Model
-
- Per DD-138, this adapter records only:
+ ### Hash messages and output independently

- | Field           | Content                                 |
- | --------------- | --------------------------------------- |
- | `input.digest`  | SHA-256 of canonicalized messages array |
- | `output.digest` | SHA-256 of concatenated output content  |
- | `executor`      | Platform identifier and model name      |
- | `extensions`    | Model ID, token counts, finish reason   |
+ ```typescript
+ import { hashMessages, hashOutput } from '@peac/adapter-openai-compatible';

- Raw prompt and completion text never appear in the evidence structure.
+ const inputHash = await hashMessages([{ role: 'user', content: 'Hello' }]);
+ // 'sha256:...'

- ## Streaming
+ const outputHash = await hashOutput('The answer is 4.');
+ // 'sha256:...'
+ ```

- Streaming support (`fromChatCompletionStream`) is explicitly deferred to v0.11.3.
+ ## Integrates With

- ## API
+ - `@peac/kernel` (Layer 0): Wire constants and types
+ - `@peac/schema` (Layer 1): Interaction evidence schemas
+ - `@peac/protocol` (Layer 3): Receipt issuance with mapped inference evidence

- ### `fromChatCompletion(params): Promise<InferenceEvidence>`
+ ## For Agent Developers

- Maps a chat completion response to interaction evidence.
+ If you are building an AI agent or service that calls OpenAI-compatible LLM APIs:

- ### `hashMessages(messages): Promise<string>`
+ - Use `fromChatCompletion()` after each API call to produce signed evidence of the interaction
+ - The hash-first model ensures no prompt or completion text leaks into receipts
+ - The adapter is provider-agnostic; pass `provider` to tag evidence with the source platform
+ - See the [llms.txt](https://github.com/peacprotocol/peac/blob/main/llms.txt) for a concise protocol overview

- SHA-256 hash of a messages array. Returns `sha256:<hex64>`.
+ ## License

- ### `hashOutput(content): Promise<string>`
+ Apache-2.0

- SHA-256 hash of output text. Returns `sha256:<hex64>`.
+ ---

- ## License
+ PEAC Protocol is an open source project stewarded by Originary and community contributors.

- Apache-2.0
+ [Docs](https://www.peacprotocol.org) | [GitHub](https://github.com/peacprotocol/peac) | [Originary](https://www.originary.xyz)
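
The hash-first flow described in the new README can be sketched without the package itself. The snippet below is an illustrative stand-in, not the adapter's actual implementation: it assumes plain `JSON.stringify` as the canonicalization (the real adapter may use a stricter canonical form) and implements synchronous stand-ins for `hashMessages`/`hashOutput`, whereas the published API returns Promises.

```typescript
// Sketch of the hash-first model: evidence carries only digests,
// never raw prompt or completion text.
import { createHash } from 'node:crypto';

type ChatMessage = { role: 'system' | 'user' | 'assistant'; content: string };

// Returns a digest in the 'sha256:<hex64>' shape the README documents.
function sha256Tagged(data: string): string {
  return 'sha256:' + createHash('sha256').update(data, 'utf8').digest('hex');
}

// Stand-in for hashMessages: digest of the (assumed) canonicalized array.
function hashMessages(messages: ChatMessage[]): string {
  return sha256Tagged(JSON.stringify(messages));
}

// Stand-in for hashOutput: digest of the output text.
function hashOutput(content: string): string {
  return sha256Tagged(content);
}

const inputDigest = hashMessages([{ role: 'user', content: 'Hello' }]);
const outputDigest = hashOutput('The answer is 4.');

console.log(inputDigest);  // sha256:<64 hex chars>
console.log(outputDigest); // sha256:<64 hex chars>
```

Because only digests are emitted, a verifier holding the original messages can recompute and compare hashes, while a receipt alone reveals nothing about the conversation content.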
package/package.json CHANGED
@@ -1,15 +1,15 @@
  {
    "name": "@peac/adapter-openai-compatible",
-   "version": "0.12.5",
+   "version": "0.12.7",
    "description": "OpenAI-compatible chat completion adapter for PEAC interaction evidence (hash-first)",
    "main": "dist/index.cjs",
    "types": "dist/index.d.ts",
    "repository": {
      "type": "git",
-     "url": "https://github.com/peacprotocol/peac.git",
+     "url": "git+https://github.com/peacprotocol/peac.git",
      "directory": "packages/adapters/openai-compatible"
    },
-   "author": "jithinraj <7850727+jithinraj@users.noreply.github.com>",
+   "author": "PEAC Protocol Contributors",
    "license": "Apache-2.0",
    "bugs": {
      "url": "https://github.com/peacprotocol/peac/issues"
@@ -23,8 +23,8 @@
      "access": "public"
    },
    "dependencies": {
-     "@peac/kernel": "0.12.5",
-     "@peac/schema": "0.12.5"
+     "@peac/kernel": "0.12.7",
+     "@peac/schema": "0.12.7"
    },
    "devDependencies": {
      "@types/node": "^22.19.11",
@@ -41,6 +41,18 @@
      },
      "./package.json": "./package.json"
    },
+   "keywords": [
+     "peac",
+     "peacprotocol",
+     "interaction-records",
+     "signed-records",
+     "receipts",
+     "originary",
+     "openai-compatible",
+     "inference",
+     "response-evidence",
+     "llm"
+   ],
    "scripts": {
      "prebuild": "rm -rf dist",
      "build": "pnpm run build:js && pnpm run build:types",