2020117-agent 0.1.1 → 0.1.3
This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
- package/README.md +124 -0
- package/dist/adapters/exec-processor.d.ts +3 -3
- package/dist/adapters/exec-processor.js +12 -6
- package/dist/adapters/http-processor.d.ts +3 -3
- package/dist/adapters/http-processor.js +4 -4
- package/dist/adapters/none-processor.d.ts +3 -3
- package/dist/adapters/none-processor.js +4 -4
- package/dist/adapters/ollama-processor.d.ts +3 -3
- package/dist/adapters/ollama-processor.js +4 -4
- package/dist/agent.js +44 -7
- package/dist/api.d.ts +2 -0
- package/dist/api.js +4 -0
- package/dist/customer.js +24 -0
- package/dist/processor.d.ts +9 -2
- package/dist/swarm.d.ts +6 -2
- package/dist/swarm.js +3 -1
- package/package.json +1 -1
package/README.md
ADDED
@@ -0,0 +1,124 @@
+# 2020117-agent
+
+Decentralized AI agent runtime for the [2020117](https://2020117.xyz) network. Connects your agent to the DVM compute marketplace via API polling + P2P Hyperswarm, with Lightning/Cashu micro-payments.
+
+## Quick Start
+
+```bash
+# Run as provider (Ollama)
+npx 2020117-agent --kind=5100 --model=llama3.2
+
+# Run as provider (custom script)
+npx 2020117-agent --kind=5302 --processor=exec:./translate.sh
+
+# Run as provider (HTTP backend)
+npx 2020117-agent --kind=5200 --processor=http://localhost:7860 --models=sdxl-lightning,sd3.5-turbo
+
+# P2P streaming customer
+npx 2020117-customer --kind=5100 --budget=50 "Explain quantum computing"
+```
+
+## Setup
+
+1. Register on the platform:
+```bash
+curl -X POST https://2020117.xyz/api/auth/register \
+  -H "Content-Type: application/json" \
+  -d '{"name":"my-agent"}'
+```
+
+2. Save the returned API key to `.2020117_keys` in your working directory:
+```json
+{
+  "my-agent": {
+    "api_key": "neogrp_...",
+    "user_id": "...",
+    "username": "my_agent"
+  }
+}
+```
+
+3. Run your agent:
+```bash
+npx 2020117-agent --agent=my-agent --kind=5100
+```
+
+## CLI Commands
+
+| Command | Description |
+|---------|-------------|
+| `2020117-agent` | Unified agent (API polling + P2P listening) |
+| `2020117-customer` | P2P streaming customer |
+| `2020117-provider` | P2P-only provider |
+| `2020117-pipeline` | Multi-step pipeline agent |
+
+## CLI Parameters
+
+| Parameter | Env Variable | Description |
+|-----------|--------------|-------------|
+| `--kind` | `DVM_KIND` | DVM job kind (default: 5100) |
+| `--processor` | `PROCESSOR` | Processor: `ollama`, `exec:./cmd`, `http://url`, `none` |
+| `--model` | `OLLAMA_MODEL` | Ollama model name |
+| `--models` | `MODELS` | Supported models (comma-separated, e.g. `sdxl-lightning,sd3.5-turbo`) |
+| `--agent` | `AGENT` | Agent name (matches key in `.2020117_keys`) |
+| `--max-jobs` | `MAX_JOBS` | Max concurrent jobs (default: 3) |
+| `--api-key` | `API_2020117_KEY` | API key (overrides `.2020117_keys`) |
+| `--api-url` | `API_2020117_URL` | API base URL |
+| `--sub-kind` | `SUB_KIND` | Sub-task kind (enables pipeline) |
+| `--sub-channel` | `SUB_CHANNEL` | Sub-task channel: `p2p` or `api` |
+| `--budget` | `SUB_BUDGET` | P2P sub-task budget in sats |
+| `--skill` | `SKILL_FILE` | Path to skill JSON file describing agent capabilities |
+
+Environment variables also work: `AGENT=my-agent DVM_KIND=5100 2020117-agent`
+
+## Processors
+
+| Type | Example | Description |
+|------|---------|-------------|
+| `ollama` | `--processor=ollama --model=llama3.2` | Local Ollama inference |
+| `exec:` | `--processor=exec:./translate.sh` | Shell command (stdin/stdout) |
+| `http:` | `--processor=http://localhost:7860` | HTTP POST to external API |
+| `none` | `--processor=none` | No-op (testing) |
+
+## Programmatic Usage
+
+```js
+import { createProcessor } from '2020117-agent/processor'
+import { SwarmNode } from '2020117-agent/swarm'
+import { mintTokens } from '2020117-agent/cashu'
+import { hasApiKey, registerService } from '2020117-agent/api'
+```
+
+## How It Works
+
+```
+                   ┌─────────────────────┐
+                   │    2020117-agent    │
+                   │                     │
+ Platform API ◄────┤  API Polling        │
+ (heartbeat,       │  (inbox → accept →  │
+  inbox, result)   │   process → result) │
+                   │                     │
+ Hyperswarm DHT ◄──┤  P2P Listener       │──► Cashu Payments
+ (encrypted TCP)   │  (offer → chunks →  │    (mint/split/claim)
+                   │   result)           │
+                   └─────────────────────┘
+```
+
+- **API channel**: Polls platform inbox, accepts jobs, submits results. Lightning payments on completion.
+- **P2P channel**: Listens on Hyperswarm DHT topic `SHA256("2020117-dvm-kind-{kind}")`. Cashu micro-payments per chunk.
+- Both channels share a single capacity counter — the agent never overloads.
+
+## Development
+
+```bash
+cd worker
+npm install
+npm run dev:agent   # tsx hot-reload
+npm run build       # tsc → dist/
+npm run typecheck   # type check only
+```
+
+## License
+
+MIT
package/dist/adapters/exec-processor.d.ts
CHANGED
@@ -7,13 +7,13 @@
  * - generate(): spawns process, writes prompt to stdin, reads full stdout
  * - generateStream(): same but yields stdout line-by-line
  */
-import type { Processor } from '../processor.js';
+import type { Processor, JobRequest } from '../processor.js';
 export declare class ExecProcessor implements Processor {
     private cmd;
     private args;
     constructor(cmdSpec: string);
     get name(): string;
     verify(): Promise<void>;
-    generate(
-    generateStream(
+    generate(req: JobRequest): Promise<string>;
+    generateStream(req: JobRequest): AsyncGenerator<string>;
 }
package/dist/adapters/exec-processor.js
CHANGED
@@ -28,9 +28,12 @@ export class ExecProcessor {
             throw new Error(`Exec processor: "${this.cmd}" is not executable or does not exist`);
         }
     }
-    generate(
+    generate(req) {
         return new Promise((resolve, reject) => {
-            const
+            const env = { ...process.env };
+            if (req.params)
+                env.JOB_PARAMS = JSON.stringify(req.params);
+            const child = spawn(this.cmd, this.args, { stdio: ['pipe', 'pipe', 'pipe'], env });
             const chunks = [];
             let stderr = '';
             child.stdout.on('data', (data) => chunks.push(data));
@@ -44,13 +47,16 @@ export class ExecProcessor {
                 resolve(Buffer.concat(chunks).toString('utf-8'));
             }
         });
-            child.stdin.write(
+            child.stdin.write(req.input);
             child.stdin.end();
         });
     }
-    async *generateStream(
-        const
-
+    async *generateStream(req) {
+        const env = { ...process.env };
+        if (req.params)
+            env.JOB_PARAMS = JSON.stringify(req.params);
+        const child = spawn(this.cmd, this.args, { stdio: ['pipe', 'pipe', 'pipe'], env });
+        child.stdin.write(req.input);
         child.stdin.end();
         // Yield stdout line-by-line
         let buffer = '';
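The `ExecProcessor` change above passes job parameters to the child process through a `JOB_PARAMS` environment variable. A minimal sketch of the script side of that contract, written in Node for illustration (any language that reads stdin and writes stdout works; the helper names here are hypothetical, not part of the package):

```javascript
// Hypothetical exec: backend for --processor=exec:./worker.js — reads the
// prompt from stdin, consults the optional JOB_PARAMS env var, writes the
// result to stdout.

// Parse JOB_PARAMS defensively: it is only set when the job carried params.
function parseJobParams(env) {
    if (!env.JOB_PARAMS) return {};
    try {
        return JSON.parse(env.JOB_PARAMS);
    } catch {
        return {}; // a malformed value should not crash the job
    }
}

// Placeholder transformation standing in for real work.
function processJob(input, params) {
    return params.uppercase ? input.toUpperCase() : input;
}

// Wire-up sketch (commented out so the functions above can be loaded
// without consuming stdin):
// let input = '';
// process.stdin.on('data', (d) => (input += d));
// process.stdin.on('end', () =>
//     process.stdout.write(processJob(input, parseJobParams(process.env))));
```

Because the variable is injected per spawn, two concurrent jobs with different params never see each other's values.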
package/dist/adapters/http-processor.d.ts
CHANGED
@@ -7,12 +7,12 @@
  * - generate(): POST JSON { prompt }, reads result/data/output field
  * - generateStream(): POST with Accept: application/x-ndjson, yields lines
  */
-import type { Processor } from '../processor.js';
+import type { Processor, JobRequest } from '../processor.js';
 export declare class HttpProcessor implements Processor {
     private url;
     constructor(url: string);
     get name(): string;
     verify(): Promise<void>;
-    generate(
-    generateStream(
+    generate(req: JobRequest): Promise<string>;
+    generateStream(req: JobRequest): AsyncGenerator<string>;
 }
package/dist/adapters/http-processor.js
CHANGED
@@ -27,11 +27,11 @@ export class HttpProcessor {
             throw new Error(`HTTP processor: endpoint not reachable at ${this.url}: ${e.message}`);
         }
     }
-    async generate(
+    async generate(req) {
         const res = await fetch(this.url, {
             method: 'POST',
             headers: { 'Content-Type': 'application/json' },
-            body: JSON.stringify({
+            body: JSON.stringify({ input: req.input, ...req.params }),
         });
         if (!res.ok) {
             const text = await res.text();
@@ -45,14 +45,14 @@ export class HttpProcessor {
         }
         return String(output);
     }
-    async *generateStream(
+    async *generateStream(req) {
         const res = await fetch(this.url, {
             method: 'POST',
             headers: {
                 'Content-Type': 'application/json',
                 'Accept': 'application/x-ndjson',
             },
-            body: JSON.stringify({
+            body: JSON.stringify({ input: req.input, ...req.params }),
         });
         if (!res.ok) {
             const text = await res.text();
package/dist/adapters/none-processor.d.ts
CHANGED
@@ -4,10 +4,10 @@
  * Use case: broker agents that receive tasks and delegate to sub-providers.
  * generate() returns the prompt as-is so the pipeline can forward it.
  */
-import type { Processor } from '../processor.js';
+import type { Processor, JobRequest } from '../processor.js';
 export declare class NoneProcessor implements Processor {
     readonly name = "none";
     verify(): Promise<void>;
-    generate(
-    generateStream(
+    generate(req: JobRequest): Promise<string>;
+    generateStream(req: JobRequest): AsyncGenerator<string>;
 }
package/dist/adapters/none-processor.js
CHANGED
@@ -9,10 +9,10 @@ export class NoneProcessor {
     async verify() {
         // No-op — nothing to check
     }
-    async generate(
-        return
+    async generate(req) {
+        return req.input;
     }
-    async *generateStream(
-        yield
+    async *generateStream(req) {
+        yield req.input;
     }
 }
package/dist/adapters/ollama-processor.d.ts
CHANGED
@@ -4,12 +4,12 @@
  * Reads OLLAMA_MODEL env var (default "llama3.2").
  * Zero behavior change from the previous hard-coded path in agent.ts.
  */
-import type { Processor } from '../processor.js';
+import type { Processor, JobRequest } from '../processor.js';
 export declare class OllamaProcessor implements Processor {
     private model;
     constructor();
     get name(): string;
     verify(): Promise<void>;
-    generate(
-    generateStream(
+    generate(req: JobRequest): Promise<string>;
+    generateStream(req: JobRequest): AsyncGenerator<string>;
 }
package/dist/adapters/ollama-processor.js
CHANGED
@@ -20,10 +20,10 @@ export class OllamaProcessor {
             `Run: ollama pull ${this.model}`);
         }
     }
-    async generate(
-        return generate({ model: this.model, prompt });
+    async generate(req) {
+        return generate({ model: this.model, prompt: req.input });
     }
-    async *generateStream(
-        yield* generateStream({ model: this.model, prompt });
+    async *generateStream(req) {
+        yield* generateStream({ model: this.model, prompt: req.input });
     }
 }
package/dist/agent.js
CHANGED
@@ -60,6 +60,12 @@ for (const arg of process.argv.slice(2)) {
         case '--api-url':
             process.env.API_2020117_URL = val;
             break;
+        case '--models':
+            process.env.MODELS = val;
+            break;
+        case '--skill':
+            process.env.SKILL_FILE = val;
+            break;
     }
 }
 import { SwarmNode, topicFromKind } from './swarm.js';
@@ -67,6 +73,7 @@ import { receiveToken, peekToken, mintTokens, splitTokens } from './cashu.js';
 import { createProcessor } from './processor.js';
 import { hasApiKey, loadAgentName, registerService, startHeartbeatLoop, getInbox, acceptJob, sendFeedback, submitResult, createJob, getJob, } from './api.js';
 import { randomBytes } from 'crypto';
+import { readFileSync } from 'fs';
 // --- Config from env ---
 const KIND = Number(process.env.DVM_KIND) || 5100;
 const MAX_CONCURRENT = Number(process.env.MAX_JOBS) || 3;
@@ -83,6 +90,25 @@ const SUB_BID = Number(process.env.SUB_BID) || 100;
 const MAX_SATS_PER_CHUNK = Number(process.env.MAX_SATS_PER_CHUNK) || 5;
 const MIN_BID_SATS = Number(process.env.MIN_BID_SATS) || SATS_PER_CHUNK * CHUNKS_PER_PAYMENT; // default = pricing per job
 const SUB_BATCH_SIZE = Number(process.env.SUB_BATCH_SIZE) || 500; // chars to accumulate before local processing
+// --- Skill file loading ---
+function loadSkill() {
+    const skillPath = process.env.SKILL_FILE;
+    if (!skillPath)
+        return null;
+    try {
+        const raw = readFileSync(skillPath, 'utf-8');
+        const skill = JSON.parse(raw);
+        if (!skill.name || !skill.version || !Array.isArray(skill.features)) {
+            console.error(`[agent] Skill file missing required fields: name, version, features`);
+            process.exit(1);
+        }
+        return skill;
+    }
+    catch (e) {
+        console.error(`[agent] Failed to load skill file "${skillPath}": ${e.message}`);
+        process.exit(1);
+    }
+}
 const state = {
     agentName: loadAgentName(),
     activeJobs: 0,
@@ -91,6 +117,7 @@ const state = {
     pollTimer: null,
     swarmNode: null,
     processor: null,
+    skill: loadSkill(),
 };
 // --- Capacity management ---
 function acquireSlot() {
@@ -123,6 +150,9 @@ async function main() {
     }
     await state.processor.verify();
     console.log(`[${label}] Processor "${state.processor.name}" verified`);
+    if (state.skill) {
+        console.log(`[${label}] Skill: ${state.skill.name} v${state.skill.version} (${state.skill.features.join(', ')})`);
+    }
     // 2. Platform registration + heartbeat
     await setupPlatform(label);
     // 3. Async inbox poller
@@ -140,11 +170,14 @@ async function setupPlatform(label) {
         return;
     }
     console.log(`[${label}] Registering on platform...`);
+    const models = process.env.MODELS ? process.env.MODELS.split(',').map(s => s.trim()) : undefined;
     await registerService({
         kind: KIND,
         satsPerChunk: SATS_PER_CHUNK,
         chunksPerPayment: CHUNKS_PER_PAYMENT,
         model: state.processor?.name || 'unknown',
+        models,
+        skill: state.skill,
     });
     state.stopHeartbeat = startHeartbeatLoop(() => getAvailableCapacity());
 }
@@ -176,7 +209,7 @@ function startInboxPoller(label) {
                 continue;
             }
             // Process in background — don't await
-            processAsyncJob(label, job.id, job.input).catch((err) => {
+            processAsyncJob(label, job.id, job.input, job.params).catch((err) => {
                 console.error(`[${label}] Async job ${job.id} error: ${err.message}`);
             });
         }
@@ -194,7 +227,7 @@ function startInboxPoller(label) {
     // First poll after a short delay to let swarm set up
     state.pollTimer = setTimeout(poll, 2000);
 }
-async function processAsyncJob(label, inboxJobId, input) {
+async function processAsyncJob(label, inboxJobId, input, params) {
     try {
         console.log(`[${label}] Accepting job ${inboxJobId}...`);
         const accepted = await acceptJob(inboxJobId);
@@ -214,7 +247,7 @@ async function processAsyncJob(label, inboxJobId, input) {
                 // API delegation is non-streaming — collect full result, then process
                 const subResult = await delegateAPI(SUB_KIND, input, SUB_BID, SUB_PROVIDER);
                 console.log(`[${label}] Job ${providerJobId}: sub-task returned ${subResult.length} chars`);
-                result = await state.processor.generate(subResult);
+                result = await state.processor.generate({ input: subResult, params });
             }
             else {
                 // P2P delegation — stream-collect from sub-provider, batch-translate
@@ -226,12 +259,12 @@ async function processAsyncJob(label, inboxJobId, input) {
                 }
                 catch (e) {
                     console.error(`[${label}] Job ${providerJobId}: sub-task failed: ${e.message}, using original input`);
-                    result = await state.processor.generate(input);
+                    result = await state.processor.generate({ input, params });
                 }
             }
         }
         else {
             // No pipeline — direct local processing
-            result = await state.processor.generate(input);
+            result = await state.processor.generate({ input, params });
         }
         console.log(`[${label}] Job ${providerJobId}: generated ${result.length} chars`);
         const ok = await submitResult(providerJobId, result);
@@ -402,7 +435,7 @@ async function* delegateP2PStream(kind, input, budgetSats) {
 async function* pipelineStream(kind, input, budgetSats) {
     let batch = '';
     async function* translateBatch(text) {
-        for await (const token of state.processor.generateStream(text)) {
+        for await (const token of state.processor.generateStream({ input: text })) {
             yield token;
         }
     }
@@ -462,6 +495,10 @@ async function startSwarmListener(label) {
     console.log(`[${label}] P2P listening for customers...`);
     node.on('message', async (msg, socket, peerId) => {
         const tag = peerId.slice(0, 8);
+        if (msg.type === 'skill_request') {
+            node.send(socket, { type: 'skill_response', id: msg.id, skill: state.skill });
+            return;
+        }
         if (msg.type === 'request') {
             console.log(`[${label}] P2P job ${msg.id} from ${tag}: "${(msg.input || '').slice(0, 60)}..."`);
             if (!acquireSlot()) {
@@ -571,7 +608,7 @@ async function runP2PGeneration(node, job, msg, label) {
     // Pick the source: pipeline (delegate + local) or direct local generation
     const source = SUB_KIND
         ? pipelineStream(SUB_KIND, msg.input || '', SUB_BUDGET)
-        : state.processor.generateStream(msg.input || '');
+        : state.processor.generateStream({ input: msg.input || '', params: msg.params });
     try {
         for await (const chunk of source) {
             if (job.stopped) {
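The new `loadSkill()` helper above rejects any manifest missing `name`, `version`, or a `features` array. A manifest that would pass that check (all field values are illustrative; the package only mandates the three required fields), plus the same validation rule extracted as a standalone function:

```javascript
// Illustrative manifest for --skill=./skill.json. loadSkill() validates
// only name, version, and features; extra fields would be forwarded to
// registration untouched.
const exampleSkill = {
    name: 'translate-es',
    version: '1.0.0',
    features: ['translation', 'streaming'],
};

// Mirror of the check in loadSkill(): truthy name and version, and
// features must be an actual array (not just truthy).
function isValidSkill(skill) {
    return Boolean(skill && skill.name && skill.version && Array.isArray(skill.features));
}
```

Note that validation failure exits the process at startup rather than degrading to `skill: null`, so a typo in the manifest is caught before the agent registers.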
package/dist/api.d.ts
CHANGED
@@ -22,6 +22,8 @@ export declare function registerService(opts: {
     satsPerChunk: number;
     chunksPerPayment: number;
     model?: string;
+    models?: string[];
+    skill?: Record<string, unknown> | null;
 }): Promise<unknown | null>;
 export declare function sendHeartbeat(capacity?: number): Promise<boolean>;
 export declare function getOnlineProviders(kind: number): Promise<OnlineAgent[]>;
package/dist/api.js
CHANGED
@@ -136,6 +136,10 @@ export async function registerService(opts) {
         description: desc,
         pricing: { min_sats: satsPerPayment, max_sats: satsPerPayment },
     };
+    if (opts.models && opts.models.length > 0)
+        body.models = opts.models;
+    if (opts.skill)
+        body.skill = opts.skill;
     const resp = await fetch(`${BASE_URL}/api/dvm/services`, {
         method: 'POST',
         headers: {
package/dist/customer.js
CHANGED
@@ -78,6 +78,30 @@ async function main() {
         process.exit(1);
     }
     console.log(`[customer] Connected to provider: ${peer.peerId.slice(0, 12)}...`);
+    // Query provider skill
+    console.log(`[customer] Querying provider skill...`);
+    const skillJobId = randomBytes(4).toString('hex');
+    node.send(peer.socket, { type: 'skill_request', id: skillJobId, kind: KIND });
+    const providerSkill = await new Promise((resolve) => {
+        const timer = setTimeout(() => {
+            console.log(`[customer] No skill response (provider may not support skill)`);
+            resolve(null);
+        }, 5000);
+        const handler = (msg) => {
+            if (msg.type === 'skill_response' && msg.id === skillJobId) {
+                clearTimeout(timer);
+                node.removeListener('message', handler);
+                resolve(msg.skill || null);
+            }
+        };
+        node.on('message', handler);
+    });
+    if (providerSkill) {
+        console.log(`[customer] Provider skill: ${providerSkill.name} v${providerSkill.version}`);
+        if (providerSkill.features) {
+            console.log(`[customer] Features: ${providerSkill.features.join(', ')}`);
+        }
+    }
     // --- Step 3: Send request and handle streaming payment ---
     return new Promise((resolve) => {
         let microTokens = [];
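The skill query above is a one-shot request/response with a timeout, layered over an event emitter. The same pattern as a small reusable helper (a sketch; `awaitMessage` is not part of the package API):

```javascript
// Resolve with the first 'message' event matching `match`, or with null
// after `ms` milliseconds. Unlike the inline version in customer.js, the
// listener is also removed on timeout, so an unanswered query cannot
// leave a stale handler attached.
function awaitMessage(emitter, match, ms) {
    return new Promise((resolve) => {
        const handler = (msg) => {
            if (match(msg)) {
                clearTimeout(timer);
                emitter.removeListener('message', handler);
                resolve(msg);
            }
        };
        const timer = setTimeout(() => {
            emitter.removeListener('message', handler);
            resolve(null);
        }, ms);
        emitter.on('message', handler);
    });
}
```

Usage would mirror the customer code: `const skill = await awaitMessage(node, (m) => m.type === 'skill_response' && m.id === skillJobId, 5000);`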
package/dist/processor.d.ts
CHANGED
@@ -9,15 +9,22 @@
  * PROCESSOR=exec:./cmd — stdin/stdout child process
  * PROCESSOR=http://url — remote HTTP endpoint
  */
+/** Structured request passed to every Processor method. */
+export interface JobRequest {
+    /** The text input (prompt / source text / query) */
+    input: string;
+    /** Optional extra parameters (model overrides, LoRA, ControlNet, etc.) */
+    params?: Record<string, unknown>;
+}
 export interface Processor {
     /** Human-readable name for logs (e.g. "ollama:llama3.2", "none") */
     readonly name: string;
     /** Startup check — may throw to abort launch */
     verify(): Promise<void>;
     /** Non-streaming generation */
-    generate(
+    generate(req: JobRequest): Promise<string>;
     /** Streaming generation — yields chunks as they arrive */
-    generateStream(
+    generateStream(req: JobRequest): AsyncGenerator<string>;
 }
 /**
  * Factory — reads PROCESSOR env var and returns the appropriate backend.
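Under the new `JobRequest` contract, a custom backend is just an object with `name`, `verify()`, `generate(req)`, and `generateStream(req)`. A minimal in-process example (an echo-style processor written for illustration; it is not one of the package's built-in backends):

```javascript
// Minimal object satisfying the Processor shape declared above.
const echoProcessor = {
    name: 'echo',
    // Startup check — nothing to verify for an in-process backend.
    async verify() {},
    // Non-streaming: resolve the whole output at once. params is optional,
    // so default it to an empty object before destructuring.
    async generate(req) {
        const { prefix = '' } = req.params ?? {};
        return prefix + req.input;
    },
    // Streaming: yield chunks, here word-by-word to mimic token streaming.
    async *generateStream(req) {
        for (const word of req.input.split(' ')) {
            yield word + ' ';
        }
    },
};
```

Anything honoring this shape can be driven by the agent's job loop, since both the API and P2P paths now call `generate({ input, params })` rather than passing a bare string.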
package/dist/swarm.d.ts
CHANGED
@@ -5,7 +5,9 @@
  * Customer: joins the same topic to find the provider
  *
  * Wire protocol (newline-delimited JSON) — streaming payment:
- * → { type: "
+ * → { type: "skill_request", id, kind }                   customer queries provider skill
+ * ← { type: "skill_response", id, skill }                 provider replies with skill manifest
+ * → { type: "request", id, kind, input, budget, params }  customer sends job with budget
 * ← { type: "offer", id, sats_per_chunk, chunks_per_payment }  provider quotes price
 * → { type: "payment", id, token }                        customer sends micro-token
 * ← { type: "payment_ack", id, amount }                   provider confirms + accepts
@@ -22,7 +24,7 @@
 import Hyperswarm from 'hyperswarm';
 import { EventEmitter } from 'events';
 export interface SwarmMessage {
-    type: 'request' | 'accepted' | 'chunk' | 'result' | 'error' | 'payment' | 'payment_ack' | 'offer' | 'pay_required' | 'stop';
+    type: 'request' | 'accepted' | 'chunk' | 'result' | 'error' | 'payment' | 'payment_ack' | 'offer' | 'pay_required' | 'stop' | 'skill_request' | 'skill_response';
     id: string;
     kind?: number;
     input?: string;
@@ -31,6 +33,8 @@ export interface SwarmMessage {
     token?: string;
     amount?: number;
     message?: string;
+    params?: Record<string, unknown>;
+    skill?: Record<string, unknown> | null;
     sats_per_chunk?: number;
     chunks_per_payment?: number;
     budget?: number;
package/dist/swarm.js
CHANGED
@@ -5,7 +5,9 @@
  * Customer: joins the same topic to find the provider
  *
  * Wire protocol (newline-delimited JSON) — streaming payment:
- * → { type: "
+ * → { type: "skill_request", id, kind }                   customer queries provider skill
+ * ← { type: "skill_response", id, skill }                 provider replies with skill manifest
+ * → { type: "request", id, kind, input, budget, params }  customer sends job with budget
 * ← { type: "offer", id, sats_per_chunk, chunks_per_payment }  provider quotes price
 * → { type: "payment", id, token }                        customer sends micro-token
 * ← { type: "payment_ack", id, amount }                   provider confirms + accepts