@promptbook/core 0.94.0-1 → 0.94.0-2

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
package/README.md CHANGED
@@ -64,6 +64,8 @@ Rest of the documentation is common for **entire promptbook ecosystem**:
 
 During the computer revolution, we have seen [multiple generations of computer languages](https://github.com/webgptorg/promptbook/discussions/180), from the physical rewiring of the vacuum tubes through low-level machine code to the high-level languages like Python or JavaScript. And now, we're on the edge of the **next revolution**!
 
+
+
 It's a revolution of writing software in **plain human language** that is understandable and executable by both humans and machines – and it's going to change everything!
 
 The incredible growth in power of microprocessors and the Moore's Law have been the driving force behind the ever-more powerful languages, and it's been an amazing journey! Similarly, the large language models (like GPT or Claude) are the next big thing in language technology, and they're set to transform the way we interact with computers.
@@ -189,16 +191,8 @@ Join our growing community of developers and users:
 
 _A concise, Markdown-based DSL for crafting AI workflows and automations._
 
----
 
-### 📑 Table of Contents
 
-- [Introduction](#introduction)
-- [Example](#example)
-- [1. What: Workflows, Tasks & Parameters](#1-what-workflows-tasks--parameters)
-- [2. Who: Personas](#2-who-personas)
-- [3. How: Knowledge, Instruments & Actions](#3-how-knowledge-instruments-and-actions)
-- [General Principles](#general-principles)
 
 ### Introduction
 
@@ -249,6 +243,8 @@ Personas can have access to different knowledge, tools and actions. They can als
 
 - [PERSONA](https://github.com/webgptorg/promptbook/blob/main/documents/commands/PERSONA.md)
 
+
+
 ### **3. How:** Knowledge, Instruments and Actions
 
 The resources used by the personas are used to do the work.
@@ -348,6 +344,8 @@ The following glossary is used to clarify certain concepts:
 
 _Note: This section is not complete dictionary, more list of general AI / LLM terms that has connection with Promptbook_
 
+
+
 ### 💯 Core concepts
 
 - [📚 Collection of pipelines](https://github.com/webgptorg/promptbook/discussions/65)
package/esm/index.es.js CHANGED
@@ -27,7 +27,7 @@ const BOOK_LANGUAGE_VERSION = '1.0.0';
  * @generated
  * @see https://github.com/webgptorg/promptbook
  */
-const PROMPTBOOK_ENGINE_VERSION = '0.94.0-1';
+const PROMPTBOOK_ENGINE_VERSION = '0.94.0-2';
 /**
  * TODO: string_promptbook_version should be constrained to the all versions of Promptbook engine
  * Note: [💞] Ignore a discrepancy between file name and entity name
@@ -10365,12 +10365,12 @@ const BoilerplateFormfactorDefinition = {
  * Creates a wrapper around LlmExecutionTools that only exposes models matching the filter function
  *
  * @param llmTools The original LLM execution tools to wrap
- * @param modelFilter Function that determines whether a model should be included
+ * @param predicate Function that determines whether a model should be included
  * @returns A new LlmExecutionTools instance with filtered models
  *
  * @public exported from `@promptbook/core`
  */
-function filterModels(llmTools, modelFilter) {
+function filterModels(llmTools, predicate) {
     const filteredTools = {
         // Keep all properties from the original llmTools
         ...llmTools,
@@ -10387,10 +10387,10 @@ function filterModels(llmTools, modelFilter) {
            const originalModels = await llmTools.listModels();
            // Handle both synchronous and Promise return types
            if (originalModels instanceof Promise) {
-               return originalModels.then((models) => models.filter(modelFilter));
+               return originalModels.then((models) => models.filter(predicate));
            }
            else {
-               return originalModels.filter(modelFilter);
+               return originalModels.filter(predicate);
            }
        },
    };
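
The rename above is cosmetic — `modelFilter` becomes `predicate` — and the wrapper's behavior is unchanged: `filterModels` returns a copy of the tools object whose `listModels()` runs the callback over the original model list. The following is a simplified sketch of that behavior, not the bundled implementation; the `exampleTools` object and its `modelName`/`modelVariant` fields are illustrative stand-ins for a real `LlmExecutionTools` instance.

```javascript
// Simplified re-implementation of the filterModels wrapper.
// Unlike the bundled version, this sketch always returns a Promise from
// listModels() rather than preserving a synchronous return type.
function filterModels(llmTools, predicate) {
    return {
        ...llmTools, // keep all other properties of the wrapped tools
        async listModels() {
            // `await` transparently handles both plain arrays and Promises
            const originalModels = await llmTools.listModels();
            return originalModels.filter(predicate);
        },
    };
}

// Hypothetical stand-in for an LlmExecutionTools instance:
const exampleTools = {
    title: 'Example tools',
    listModels() {
        return [
            { modelName: 'chat-model', modelVariant: 'CHAT' },
            { modelName: 'embedding-model', modelVariant: 'EMBEDDING' },
        ];
    },
};

const chatOnly = filterModels(exampleTools, (model) => model.modelVariant === 'CHAT');
```

Here `chatOnly.listModels()` resolves to only the `CHAT` entry, while every other property of the wrapped object (such as `title`) passes through untouched.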
@@ -11150,7 +11150,7 @@ const _OllamaMetadataRegistration = $llmToolsMetadataRegister.register({
     packageName: '@promptbook/ollama',
     className: 'OllamaExecutionTools',
     options: {
-        baseUrl: 'http://localhost:11434',
+        baseURL: 'http://localhost:11434',
         model: 'llama2',
         maxRequestsPerMinute: DEFAULT_MAX_REQUESTS_PER_MINUTE,
     },
@@ -11163,7 +11163,7 @@ const _OllamaMetadataRegistration = $llmToolsMetadataRegister.register({
     packageName: '@promptbook/ollama',
     className: 'OllamaExecutionTools',
     options: {
-        baseUrl: env.OLLAMA_BASE_URL,
+        baseURL: env.OLLAMA_BASE_URL,
         model: env.OLLAMA_MODEL || 'llama2',
         maxRequestsPerMinute: DEFAULT_MAX_REQUESTS_PER_MINUTE,
     },
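
The only change in these two Ollama registrations is the option key's casing: `baseUrl` becomes `baseURL`. A minimal sketch of the corrected options shape follows — a plain object only, since `OllamaExecutionTools` itself lives in `@promptbook/ollama`; the `DEFAULT_MAX_REQUESTS_PER_MINUTE` value below is a placeholder, not the engine's real default.

```javascript
// Corrected casing: `baseURL`, not `baseUrl`. Falls back to the defaults
// shown in the diff when the environment variables are unset.
const DEFAULT_MAX_REQUESTS_PER_MINUTE = 60; // placeholder value for illustration

const ollamaOptions = {
    baseURL: process.env.OLLAMA_BASE_URL || 'http://localhost:11434',
    model: process.env.OLLAMA_MODEL || 'llama2',
    maxRequestsPerMinute: DEFAULT_MAX_REQUESTS_PER_MINUTE,
};
```

Note that option-key casing matters here: an options object still using the old `baseUrl` spelling would simply be ignored by a consumer reading `baseURL`, silently falling back to whatever default it applies.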