@promptbook/documents 0.88.0-9 → 0.89.0-1

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
Files changed (25)
  1. package/README.md +35 -14
  2. package/esm/index.es.js +62 -26
  3. package/esm/index.es.js.map +1 -1
  4. package/esm/typings/src/_packages/core.index.d.ts +2 -2
  5. package/esm/typings/src/_packages/types.index.d.ts +10 -0
  6. package/esm/typings/src/config.d.ts +1 -1
  7. package/esm/typings/src/errors/PipelineExecutionError.d.ts +5 -0
  8. package/esm/typings/src/errors/utils/ErrorJson.d.ts +5 -0
  9. package/esm/typings/src/llm-providers/_common/utils/count-total-usage/LlmExecutionToolsWithTotalUsage.d.ts +7 -0
  10. package/esm/typings/src/llm-providers/_common/utils/count-total-usage/{countTotalUsage.d.ts → countUsage.d.ts} +1 -1
  11. package/esm/typings/src/playground/BrjappConnector.d.ts +64 -0
  12. package/esm/typings/src/playground/brjapp-api-schema.d.ts +12879 -0
  13. package/esm/typings/src/playground/playground.d.ts +5 -0
  14. package/esm/typings/src/remote-server/socket-types/_subtypes/PromptbookServer_Identification.d.ts +2 -1
  15. package/esm/typings/src/remote-server/types/RemoteServerOptions.d.ts +15 -3
  16. package/esm/typings/src/types/typeAliases.d.ts +2 -2
  17. package/esm/typings/src/utils/expectation-counters/countCharacters.d.ts +3 -0
  18. package/esm/typings/src/utils/expectation-counters/countLines.d.ts +3 -0
  19. package/esm/typings/src/utils/expectation-counters/countPages.d.ts +3 -0
  20. package/esm/typings/src/utils/expectation-counters/countParagraphs.d.ts +3 -0
  21. package/esm/typings/src/utils/expectation-counters/countSentences.d.ts +3 -0
  22. package/esm/typings/src/utils/expectation-counters/countWords.d.ts +3 -0
  23. package/package.json +2 -2
  24. package/umd/index.umd.js +65 -29
  25. package/umd/index.umd.js.map +1 -1
package/README.md CHANGED
@@ -61,6 +61,8 @@ Rest of the documentation is common for **entire promptbook ecosystem**:
 
 During the computer revolution, we have seen [multiple generations of computer languages](https://github.com/webgptorg/promptbook/discussions/180), from the physical rewiring of the vacuum tubes through low-level machine code to the high-level languages like Python or JavaScript. And now, we're on the edge of the **next revolution**!
 
+
+
 It's a revolution of writing software in **plain human language** that is understandable and executable by both humans and machines – and it's going to change everything!
 
 The incredible growth in power of microprocessors and the Moore's Law have been the driving force behind the ever-more powerful languages, and it's been an amazing journey! Similarly, the large language models (like GPT or Claude) are the next big thing in language technology, and they're set to transform the way we interact with computers.
@@ -71,6 +73,9 @@ This shift is going to happen, whether we are ready for it or not. Our mission i
 
 
 
+
+
+
 ## 🚀 Get started
 
 Take a look at the simple starter kit with books integrated into the **Hello World** sample applications:
@@ -82,6 +87,8 @@ Take a look at the simple starter kit with books integrated into the **Hello Wor
 
 
 
+
+
 ## 💜 The Promptbook Project
 
 Promptbook project is ecosystem of multiple projects and tools, following is a list of most important pieces of the project:
@@ -117,22 +124,35 @@ Promptbook project is ecosystem of multiple projects and tools, following is a l
 </tbody>
 </table>
 
+ Hello world examples:
+
+ - [Hello world](https://github.com/webgptorg/hello-world)
+ - [Hello world in Node.js](https://github.com/webgptorg/hello-world-node-js)
+ - [Hello world in Next.js](https://github.com/webgptorg/hello-world-next-js)
+
+
+
 We also have a community of developers and users of **Promptbook**:
 
 - [Discord community](https://discord.gg/x3QWNaa89N)
 - [Landing page `ptbk.io`](https://ptbk.io)
 - [Github discussions](https://github.com/webgptorg/promptbook/discussions)
 - [LinkedIn `Promptbook`](https://linkedin.com/company/promptbook)
- - [Facebook `Promptbook`](https://www.facebook.com/61560776453536)
+ - [Facebook `Promptbook`](https://www.facebook.com/61560776453536)
 
 And **Promptbook.studio** branded socials:
 
+
+
 - [Instagram `@promptbook.studio`](https://www.instagram.com/promptbook.studio/)
 
 And **Promptujeme** sub-brand:
 
 _/Subbrand for Czech clients/_
 
+
+
+
 - [Promptujeme.cz](https://www.promptujeme.cz/)
 - [Facebook `Promptujeme`](https://www.facebook.com/promptujeme/)
 
@@ -150,6 +170,8 @@ _/Sub-brand for images and graphics generated via Promptbook prompting/_
 
 ## 💙 The Book language
 
+
+
 Following is the documentation and blueprint of the [Book language](https://github.com/webgptorg/book).
 
 Book is a language that can be used to write AI applications, agents, workflows, automations, knowledgebases, translators, sheet processors, email automations and more. It allows you to harness the power of AI models in human-like terms, without the need to know the specifics and technicalities of the models.
@@ -199,6 +221,8 @@ Personas can have access to different knowledge, tools and actions. They can als
 
 - [PERSONA](https://github.com/webgptorg/promptbook/blob/main/documents/commands/PERSONA.md)
 
+
+
 ### **How:** Knowledge, Instruments and Actions
 
 The resources used by the personas are used to do the work.
@@ -274,11 +298,9 @@ Or you can install them separately:
 
 ## 📚 Dictionary
 
- ### 📚 Dictionary
-
 The following glossary is used to clarify certain concepts:
 
- #### General LLM / AI terms
+ ### General LLM / AI terms
 
 - **Prompt drift** is a phenomenon where the AI model starts to generate outputs that are not aligned with the original prompt. This can happen due to the model's training data, the prompt's wording, or the model's architecture.
 - **Pipeline, workflow or chain** is a sequence of tasks that are executed in a specific order. In the context of AI, a pipeline can refer to a sequence of AI models that are used to process data.
@@ -289,9 +311,13 @@ The following glossary is used to clarify certain concepts:
 - **Retrieval-augmented generation** is a machine learning paradigm where a model generates text by retrieving relevant information from a large database of text. This approach combines the benefits of generative models and retrieval models.
 - **Longtail** refers to non-common or rare events, items, or entities that are not well-represented in the training data of machine learning models. Longtail items are often challenging for models to predict accurately.
 
- _Note: Thos section is not complete dictionary, more list of general AI / LLM terms that has connection with Promptbook_
 
- #### 💯 Core concepts
+
+ _Note: This section is not complete dictionary, more list of general AI / LLM terms that has connection with Promptbook_
+
+
+
+ ### 💯 Core concepts
 
 - [📚 Collection of pipelines](https://github.com/webgptorg/promptbook/discussions/65)
 - [📯 Pipeline](https://github.com/webgptorg/promptbook/discussions/64)
@@ -304,7 +330,7 @@ _Note: Thos section is not complete dictionary, more list of general AI / LLM te
 - [🔣 Words not tokens](https://github.com/webgptorg/promptbook/discussions/29)
 - [☯ Separation of concerns](https://github.com/webgptorg/promptbook/discussions/32)
 
- ##### Advanced concepts
+ #### Advanced concepts
 
 - [📚 Knowledge (Retrieval-augmented generation)](https://github.com/webgptorg/promptbook/discussions/41)
 - [🌏 Remote server](https://github.com/webgptorg/promptbook/discussions/89)
@@ -319,11 +345,6 @@ _Note: Thos section is not complete dictionary, more list of general AI / LLM te
 - [👮 Agent adversary expectations](https://github.com/webgptorg/promptbook/discussions/39)
 - [view more](https://github.com/webgptorg/promptbook/discussions/categories/concepts)
 
- ### Terms specific to Promptbook TypeScript implementation
-
- - Anonymous mode
- - Application mode
-
 
 
 ## 🚂 Promptbook Engine
@@ -394,11 +415,11 @@ See [TODO.md](./TODO.md)
 <div style="display: flex; align-items: center; gap: 20px;">
 
 <a href="https://promptbook.studio/">
- <img src="./design/promptbook-studio-logo.png" alt="Partner 3" height="100">
+ <img src="./design/promptbook-studio-logo.png" alt="Partner 3" height="70">
 </a>
 
 <a href="https://technologickainkubace.org/en/about-technology-incubation/about-the-project/">
- <img src="./other/partners/CI-Technology-Incubation.png" alt="Technology Incubation" height="100">
+ <img src="./other/partners/CI-Technology-Incubation.png" alt="Technology Incubation" height="70">
 </a>
 
 </div>
package/esm/index.es.js CHANGED
@@ -8,8 +8,8 @@ import hexEncoder from 'crypto-js/enc-hex';
 import { basename, join, dirname } from 'path';
 import { format } from 'prettier';
 import parserHtml from 'prettier/parser-html';
- import { Subject } from 'rxjs';
 import { randomBytes } from 'crypto';
+ import { Subject } from 'rxjs';
 import sha256 from 'crypto-js/sha256';
 import { lookup, extension } from 'mime-types';
 import { parse, unparse } from 'papaparse';
@@ -28,7 +28,7 @@ const BOOK_LANGUAGE_VERSION = '1.0.0';
 * @generated
 * @see https://github.com/webgptorg/promptbook
 */
- const PROMPTBOOK_ENGINE_VERSION = '0.88.0-9';
+ const PROMPTBOOK_ENGINE_VERSION = '0.89.0-1';
 /**
 * TODO: string_promptbook_version should be constrained to the all versions of Promptbook engine
 * Note: [💞] Ignore a discrepancy between file name and entity name
@@ -160,7 +160,7 @@ const DEFAULT_MAX_PARALLEL_COUNT = 5; // <- TODO: [🤹‍♂️]
 *
 * @public exported from `@promptbook/core`
 */
- const DEFAULT_MAX_EXECUTION_ATTEMPTS = 3; // <- TODO: [🤹‍♂️]
+ const DEFAULT_MAX_EXECUTION_ATTEMPTS = 10; // <- TODO: [🤹‍♂️]
 // <- TODO: [🕝] Make also `BOOKS_DIRNAME_ALTERNATIVES`
 /**
 * Where to store the temporary downloads
@@ -2173,6 +2173,21 @@ function createCollectionFromJson(...promptbooks) {
 return new SimplePipelineCollection(...promptbooks);
 }
 
+ /**
+ * Generates random token
+ *
+ * Note: This function is cryptographically secure (it uses crypto.randomBytes internally)
+ *
+ * @private internal helper function
+ * @returns secure random token
+ */
+ function $randomToken(randomness) {
+ return randomBytes(randomness).toString('hex');
+ }
+ /**
+ * TODO: Maybe use nanoid instead https://github.com/ai/nanoid
+ */
+
 /**
 * This error indicates errors during the execution of the pipeline
 *
@@ -2180,11 +2195,17 @@ function createCollectionFromJson(...promptbooks) {
 */
 class PipelineExecutionError extends Error {
 constructor(message) {
+ // Added id parameter
 super(message);
 this.name = 'PipelineExecutionError';
+ // TODO: [🐙] DRY - Maybe $randomId
+ this.id = `error-${$randomToken(8 /* <- TODO: To global config + Use Base58 to avoid simmilar char conflicts */)}`;
 Object.setPrototypeOf(this, PipelineExecutionError.prototype);
 }
 }
+ /**
+ * TODO: !!!!!! Add id to all errors
+ */
 
 /**
 * Determine if the pipeline is fully prepared
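The two hunks above move `$randomToken` next to `createCollectionFromJson` and give every `PipelineExecutionError` a generated `id` in its constructor. A minimal TypeScript sketch of the same scheme, illustrative only and not the bundled source; since `randomBytes(8).toString('hex')` yields 16 hex characters, ids look like `error-9f2c41d07ab35e68`:

```ts
import { randomBytes } from 'crypto';

// Same helper as in the hunk above: n random bytes rendered as 2n hex characters
function $randomToken(randomness: number): string {
    return randomBytes(randomness).toString('hex');
}

class PipelineExecutionError extends Error {
    public readonly id: string;

    constructor(message: string) {
        super(message);
        this.name = 'PipelineExecutionError';
        this.id = `error-${$randomToken(8)}`;
        Object.setPrototypeOf(this, PipelineExecutionError.prototype);
    }
}

const error = new PipelineExecutionError('Example failure');
console.log(error.id); // e.g. "error-9f2c41d07ab35e68"
```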
@@ -2223,21 +2244,6 @@ function isPipelinePrepared(pipeline) {
 * - [♨] Are tasks prepared
 */
 
- /**
- * Generates random token
- *
- * Note: This function is cryptographically secure (it uses crypto.randomBytes internally)
- *
- * @private internal helper function
- * @returns secure random token
- */
- function $randomToken(randomness) {
- return randomBytes(randomness).toString('hex');
- }
- /**
- * TODO: Maybe use nanoid instead https://github.com/ai/nanoid
- */
-
 /**
 * Recursively converts JSON strings to JSON objects
 
@@ -2428,7 +2434,7 @@ const ALL_ERRORS = {
 * @public exported from `@promptbook/utils`
 */
 function deserializeError(error) {
- const { name, stack } = error;
+ const { name, stack, id } = error; // Added id
 let { message } = error;
 let ErrorClass = ALL_ERRORS[error.name];
 if (ErrorClass === undefined) {
@@ -2443,7 +2449,9 @@
 ${block(stack || '')}
 `);
 }
- return new ErrorClass(message);
+ const deserializedError = new ErrorClass(message);
+ deserializedError.id = id; // Assign id to the error object
+ return deserializedError;
 }
 
 /**
@@ -2493,6 +2501,7 @@ function assertsTaskSuccessful(executionResult) {
 */
 function createTask(options) {
 const { taskType, taskProcessCallback } = options;
+ // TODO: [🐙] DRY
 const taskId = `${taskType.toLowerCase().substring(0, 4)}-${$randomToken(8 /* <- TODO: To global config + Use Base58 to avoid simmilar char conflicts */)}`;
 let status = 'RUNNING';
 const createdAt = new Date();
@@ -2525,7 +2534,7 @@
 assertsTaskSuccessful(executionResult);
 status = 'FINISHED';
 currentValue = jsonStringsToJsons(executionResult);
- // <- TODO: Convert JSON values in string to JSON objects
+ // <- TODO: [🧠] Is this a good idea to convert JSON strins to JSONs?
 partialResultSubject.next(executionResult);
 }
 catch (error) {
@@ -2589,19 +2598,21 @@
 */
 function serializeError(error) {
 const { name, message, stack } = error;
+ const { id } = error;
 if (!Object.keys(ALL_ERRORS).includes(name)) {
 console.error(spaceTrim$1((block) => `
-
+
 Cannot serialize error with name "${name}"
 
 ${block(stack || message)}
-
+
 `));
 }
 return {
 name: name,
 message,
 stack,
+ id, // Include id in the serialized object
 };
 }
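The new `id` also survives serialization: `serializeError` now includes it in the JSON shape and `deserializeError` copies it back onto the reconstructed error. A hedged sketch of the round trip; `deserializeError` is documented as exported from `@promptbook/utils`, while the import locations of `serializeError` and `PipelineExecutionError` are assumptions here:

```ts
// Assumed imports; only `deserializeError` is confirmed as a public
// `@promptbook/utils` export in this diff.
import { deserializeError, serializeError } from '@promptbook/utils';
import { PipelineExecutionError } from '@promptbook/core';

const original = new PipelineExecutionError('Something went wrong');

// The serialized shape now carries `id` alongside name, message and stack
const errorJson = serializeError(original);
// -> { name: 'PipelineExecutionError', message: 'Something went wrong', stack: '…', id: 'error-…' }

// ...transfer errorJson as plain JSON (e.g. from the remote server)...

const restored = deserializeError(errorJson);
console.log(restored.id === original.id); // true: the id is preserved across the round trip
```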
 
@@ -2744,8 +2755,9 @@ function addUsage(...usageItems) {
 * @returns LLM tools with same functionality with added total cost counting
 * @public exported from `@promptbook/core`
 */
- function countTotalUsage(llmTools) {
+ function countUsage(llmTools) {
 let totalUsage = ZERO_USAGE;
+ const spending = new Subject();
 const proxyTools = {
 get title() {
 // TODO: [🧠] Maybe put here some suffix
@@ -2755,12 +2767,15 @@ function countTotalUsage(llmTools) {
 // TODO: [🧠] Maybe put here some suffix
 return llmTools.description;
 },
- async checkConfiguration() {
+ checkConfiguration() {
 return /* not await */ llmTools.checkConfiguration();
 },
 listModels() {
 return /* not await */ llmTools.listModels();
 },
+ spending() {
+ return spending.asObservable();
+ },
 getTotalUsage() {
 // <- Note: [🥫] Not using getter `get totalUsage` but `getTotalUsage` to allow this object to be proxied
 return totalUsage;
@@ -2771,6 +2786,7 @@ function countTotalUsage(llmTools) {
 // console.info('[🚕] callChatModel through countTotalUsage');
 const promptResult = await llmTools.callChatModel(prompt);
 totalUsage = addUsage(totalUsage, promptResult.usage);
+ spending.next(promptResult.usage);
 return promptResult;
 };
 }
@@ -2779,6 +2795,7 @@ function countTotalUsage(llmTools) {
 // console.info('[🚕] callCompletionModel through countTotalUsage');
 const promptResult = await llmTools.callCompletionModel(prompt);
 totalUsage = addUsage(totalUsage, promptResult.usage);
+ spending.next(promptResult.usage);
 return promptResult;
 };
 }
@@ -2787,6 +2804,7 @@ function countTotalUsage(llmTools) {
 // console.info('[🚕] callEmbeddingModel through countTotalUsage');
 const promptResult = await llmTools.callEmbeddingModel(prompt);
 totalUsage = addUsage(totalUsage, promptResult.usage);
+ spending.next(promptResult.usage);
 return promptResult;
 };
 }
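Taken together, the `countUsage` hunks (the function formerly named `countTotalUsage`, see the renamed typings file in the list above) add a `spending()` observable next to `getTotalUsage()`: every chat, completion and embedding call pushes its usage onto an rxjs `Subject`. A rough usage sketch; the `LlmExecutionTools` and `Prompt` types are assumed to come from `@promptbook/types`, and the concrete LLM provider and prompt are not shown in this diff:

```ts
import { countUsage } from '@promptbook/core';
import type { LlmExecutionTools, Prompt } from '@promptbook/types'; // <- assumed import path

async function reportSpending(llm: LlmExecutionTools, prompt: Prompt) {
    const llmWithUsage = countUsage(llm);

    // New in this release: a stream of per-call usage
    const subscription = llmWithUsage.spending().subscribe((usage) => {
        console.log('Usage of the last call:', usage);
    });

    await llmWithUsage.callChatModel(prompt);
    console.log('Total so far:', llmWithUsage.getTotalUsage()); // running total, same as before the rename

    subscription.unsubscribe();
}
```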
@@ -3673,7 +3691,7 @@ async function preparePipeline(pipeline, tools, options) {
 // TODO: [🚐] Make arrayable LLMs -> single LLM DRY
 const _llms = arrayableToArray(tools.llm);
 const llmTools = _llms.length === 1 ? _llms[0] : joinLlmExecutionTools(..._llms);
- const llmToolsWithUsage = countTotalUsage(llmTools);
+ const llmToolsWithUsage = countUsage(llmTools);
 // <- TODO: [🌯]
 /*
 TODO: [🧠][🪑][🔃] Should this be done or not
@@ -4503,6 +4521,9 @@ function countCharacters(text) {
 text = text.replace(/\p{Extended_Pictographic}(\u{200D}\p{Extended_Pictographic})*/gu, '-');
 return text.length;
 }
+ /**
+ * TODO: [🥴] Implement counting in formats - like JSON, CSV, XML,...
+ */
 
 /**
 * Number of characters per standard line with 11pt Arial font size.
@@ -4534,6 +4555,9 @@ function countLines(text) {
 const lines = text.split('\n');
 return lines.reduce((count, line) => count + Math.ceil(line.length / CHARACTERS_PER_STANDARD_LINE), 0);
 }
+ /**
+ * TODO: [🥴] Implement counting in formats - like JSON, CSV, XML,...
+ */
 
 /**
 * Counts number of pages in the text
@@ -4545,6 +4569,9 @@ function countLines(text) {
 function countPages(text) {
 return Math.ceil(countLines(text) / LINES_PER_STANDARD_PAGE);
 }
+ /**
+ * TODO: [🥴] Implement counting in formats - like JSON, CSV, XML,...
+ */
 
 /**
 * Counts number of paragraphs in the text
@@ -4554,6 +4581,9 @@ function countPages(text) {
 function countParagraphs(text) {
 return text.split(/\n\s*\n/).filter((paragraph) => paragraph.trim() !== '').length;
 }
+ /**
+ * TODO: [🥴] Implement counting in formats - like JSON, CSV, XML,...
+ */
 
 /**
 * Split text into sentences
@@ -4571,6 +4601,9 @@ function splitIntoSentences(text) {
 function countSentences(text) {
 return splitIntoSentences(text).length;
 }
+ /**
+ * TODO: [🥴] Implement counting in formats - like JSON, CSV, XML,...
+ */
 
 /**
 * Counts number of words in the text
@@ -4584,6 +4617,9 @@ function countWords(text) {
 text = text.replace(/([a-z])([A-Z])/g, '$1 $2');
 return text.split(/[^a-zа-я0-9]+/i).filter((word) => word.length > 0).length;
 }
+ /**
+ * TODO: [🥴] Implement counting in formats - like JSON, CSV, XML,...
+ */
 
 /**
 * Index of all counter functions
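The last six hunks only append the same `TODO: [🥴]` note to each expectation counter, but the visible implementations are enough for a quick illustration of their current behaviour; the `@promptbook/utils` import path is assumed from the typings layout:

```ts
import { countCharacters, countParagraphs, countWords } from '@promptbook/utils'; // <- assumed import path

const text = 'Hello world!\n\nPromptbook counts words, characters and paragraphs.';

console.log(countCharacters(text)); // character count, with each emoji sequence collapsed to one character
console.log(countWords(text)); // 8 (splits on runs of non-alphanumeric characters)
console.log(countParagraphs(text)); // 2 (paragraphs are separated by a blank line)
```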