@memberjunction/ai-ollama 4.0.0 → 4.2.0

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
Files changed (2)
  1. package/README.md +116 -0
  2. package/package.json +3 -3
package/README.md ADDED
@@ -0,0 +1,116 @@
# @memberjunction/ai-ollama

MemberJunction AI provider for Ollama, enabling integration with locally hosted open-source models. This package provides both LLM and embedding capabilities through Ollama's local inference server.

## Architecture

```mermaid
graph TD
    A["OllamaLLM<br/>(Provider)"] -->|extends| B["BaseLLM<br/>(@memberjunction/ai)"]
    C["OllamaEmbeddings<br/>(Provider)"] -->|extends| D["BaseEmbeddings<br/>(@memberjunction/ai)"]
    A -->|wraps| E["Ollama Client<br/>(ollama SDK)"]
    C -->|wraps| E
    E -->|connects to| F["Ollama Server<br/>(localhost:11434)"]
    F -->|runs| G["Local Models<br/>(Llama, Mistral, etc.)"]
    B -->|registered via| H["@RegisterClass"]
    D -->|registered via| H

    style A fill:#7c5295,stroke:#563a6b,color:#fff
    style C fill:#7c5295,stroke:#563a6b,color:#fff
    style B fill:#2d6a9f,stroke:#1a4971,color:#fff
    style D fill:#2d6a9f,stroke:#1a4971,color:#fff
    style E fill:#2d8659,stroke:#1a5c3a,color:#fff
    style F fill:#2d8659,stroke:#1a5c3a,color:#fff
    style G fill:#b8762f,stroke:#8a5722,color:#fff
    style H fill:#b8762f,stroke:#8a5722,color:#fff
```

## Features

- **Local Model Hosting**: Run AI models locally via Ollama without cloud dependencies
- **Chat Completions**: Full conversational AI with any Ollama-hosted model
- **Embeddings**: Local text embeddings through Ollama's embedding API
- **Streaming**: Real-time response streaming
- **Multimodal Support**: Image input for vision-capable models
- **Privacy**: All data stays on your local infrastructure
- **No API Key Required**: Connects to the local Ollama server
- **Configurable Endpoint**: Support for custom host configuration

## Installation

```bash
npm install @memberjunction/ai-ollama
```

## Usage

### Chat Completion

```typescript
import { OllamaLLM } from '@memberjunction/ai-ollama';

const llm = new OllamaLLM('not-used'); // API key not needed for local inference

const result = await llm.ChatCompletion({
  model: 'llama3.1',
  messages: [
    { role: 'system', content: 'You are a helpful assistant.' },
    { role: 'user', content: 'Explain how Ollama works.' }
  ],
  temperature: 0.7
});

if (result.success) {
  console.log(result.data.choices[0].message.content);
}
```

### Streaming

```typescript
const result = await llm.ChatCompletion({
  model: 'llama3.1',
  messages: [{ role: 'user', content: 'Write a short story.' }],
  streaming: true,
  streamingCallbacks: {
    OnContent: (content) => process.stdout.write(content),
    OnComplete: () => console.log('\nDone!')
  }
});
```

### Embeddings

```typescript
import { OllamaEmbeddings } from '@memberjunction/ai-ollama';

const embedder = new OllamaEmbeddings('not-used');

const result = await embedder.EmbedText({
  text: 'Sample text for embedding',
  model: 'nomic-embed-text'
});

console.log(`Dimensions: ${result.vector.length}`);
```
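Embedding vectors like the one returned by `EmbedText` are typically compared with cosine similarity. A minimal self-contained helper (illustrative only, not exported by this package) looks like:

```typescript
// Cosine similarity between two equal-length embedding vectors.
// Illustrative helper; not part of @memberjunction/ai-ollama.
function cosineSimilarity(a: number[], b: number[]): number {
  let dot = 0, normA = 0, normB = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    normA += a[i] * a[i];
    normB += b[i] * b[i];
  }
  return dot / (Math.sqrt(normA) * Math.sqrt(normB));
}

// Vectors pointing the same way score 1; orthogonal vectors score 0.
console.log(cosineSimilarity([1, 0], [1, 0])); // 1
console.log(cosineSimilarity([1, 0], [0, 1])); // 0
```

In practice you would pass `result.vector` from two `EmbedText` calls to compare the semantic similarity of two texts.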

## Prerequisites

1. Install [Ollama](https://ollama.ai/)
2. Pull the models you plan to use, e.g. `ollama pull llama3.1` for chat and `ollama pull nomic-embed-text` for embeddings
3. The Ollama server starts automatically and listens on port 11434

## Configuration

The default endpoint is `http://localhost:11434`. Use `SetAdditionalSettings` to point the provider at a custom host.
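The resolution logic is simple: a host supplied through additional settings overrides the default. The sketch below models that fallback in a self-contained way; the `host` key name is an assumption, not a documented setting of this package, so check the package's typings for the actual shape:

```typescript
// Sketch of endpoint resolution: a custom host from additional settings
// overrides the default local endpoint. The 'host' key is hypothetical.
const DEFAULT_OLLAMA_HOST = 'http://localhost:11434';

function resolveOllamaHost(settings: Record<string, string> = {}): string {
  return settings['host'] ?? DEFAULT_OLLAMA_HOST;
}

console.log(resolveOllamaHost());                                    // http://localhost:11434
console.log(resolveOllamaHost({ host: 'http://gpu-server:11434' })); // http://gpu-server:11434
```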

## Class Registration

- `OllamaLLM`: registered via `@RegisterClass(BaseLLM, 'OllamaLLM')`
- `OllamaEmbeddings`: registered via `@RegisterClass(BaseEmbeddings, 'OllamaEmbeddings')`
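The registration pattern above can be sketched with a minimal stand-in registry. All names below are illustrative, not the actual `@memberjunction/global` implementation: the idea is that a decorator factory records a subclass under a base-class-plus-key entry so a class factory can instantiate it later.

```typescript
// Minimal sketch of class registration: a decorator factory records a
// subclass under "<base class name>:<key>" for later lookup.
// This mimics the idea behind @RegisterClass; it is not the real code.
type Ctor = new (...args: any[]) => object;
const classRegistry = new Map<string, Ctor>();

function RegisterClass(base: Ctor, key: string) {
  return (target: Ctor): void => {
    classRegistry.set(`${base.name}:${key}`, target);
  };
}

class BaseLLM {}
class OllamaLLM extends BaseLLM {}

// Equivalent to decorating OllamaLLM with @RegisterClass(BaseLLM, 'OllamaLLM')
RegisterClass(BaseLLM, 'OllamaLLM')(OllamaLLM);

const Registered = classRegistry.get('BaseLLM:OllamaLLM')!;
console.log(new Registered() instanceof BaseLLM); // true
```

Registering against the base class is what lets consuming code ask for "the LLM registered as `OllamaLLM`" without importing this package's classes directly.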

## Dependencies

- `@memberjunction/ai`: core AI abstractions
- `@memberjunction/global`: class registration
- `ollama`: official Ollama SDK
package/package.json CHANGED
@@ -1,7 +1,7 @@
 {
   "name": "@memberjunction/ai-ollama",
   "type": "module",
-  "version": "4.0.0",
+  "version": "4.2.0",
   "description": "MemberJunction Wrapper for Ollama - Local Inference",
   "main": "dist/index.js",
   "types": "dist/index.d.ts",
@@ -20,8 +20,8 @@
     "typescript": "^5.9.3"
   },
   "dependencies": {
-    "@memberjunction/ai": "4.0.0",
-    "@memberjunction/global": "4.0.0",
+    "@memberjunction/ai": "4.2.0",
+    "@memberjunction/global": "4.2.0",
     "ollama": "^0.6.3"
   },
   "repository": {