algochat-ai 1.0.0
This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
- package/LICENSE +21 -0
- package/README.md +698 -0
- package/examples/advanced.js +71 -0
- package/examples/basic.js +35 -0
- package/examples/multimodal.js +36 -0
- package/examples/streaming.js +26 -0
- package/examples/whatsapp-baileys.js +190 -0
- package/package.json +41 -0
- package/src/client.js +380 -0
- package/src/index.d.ts +144 -0
- package/src/index.js +254 -0
- package/src/models.js +87 -0
package/LICENSE
ADDED
@@ -0,0 +1,21 @@
MIT License

Copyright (c) 2026 algochat-ai contributors

Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:

The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.
package/README.md
ADDED
@@ -0,0 +1,698 @@
# algochat-ai

> Unofficial Node.js client for **[algochat.app](https://algochat.app)** — access Google Gemini AI models for **free**, no API key required. Perfect for WhatsApp bots (Baileys), Discord bots, and any Node.js application.

[npm](https://www.npmjs.com/package/algochat-ai) · [Node.js ≥ 16](https://nodejs.org) · [MIT License](LICENSE)

---

## ✨ Features

- 🆓 **Free** — No API key, no account needed
- 🤖 **Gemini models** — Access Google Gemini 3 Flash Preview and more
- 🖼️ **Multimodal** — Send images (URLs or base64) alongside text
- 🌊 **Streaming** — Real-time token-by-token responses
- 📱 **Baileys-ready** — Drop-in integration for WhatsApp bots
- 🔄 **OpenAI-compatible** — Same message format as the OpenAI SDK
- 🔁 **Auto session management** — Sessions cached, refreshed on expiry, auto-retry on 401/403
- 📦 **TypeScript** — Full `.d.ts` type declarations included

---

## 📋 Table of Contents

- [Installation](#-installation)
- [Quick Start](#-quick-start)
- [API Reference](#-api-reference)
  - [chat()](#chatinput-options)
  - [chatStream()](#chatstreaminput-onchunk-options)
  - [chatWithImage()](#chatwithimagetext-imageurl-options)
  - [createCompletion()](#createcompletionmessages-options)
  - [AlgoChatClient (class)](#algochatclient-class)
  - [Model Utilities](#model-utilities)
- [Available Models](#-available-models)
- [Baileys WhatsApp Integration](#-baileys-whatsapp-integration)
- [OpenAI SDK Compatibility](#-openai-sdk-compatibility)
- [Conversation History](#-conversation-history)
- [Multimodal Images](#-multimodal-images)
- [Error Handling](#-error-handling)
- [TypeScript Usage](#-typescript-usage)
- [Examples Index](#-examples-index)
- [How It Works](#-how-it-works)
- [Limitations](#-limitations)
- [License](#-license)

---
## 📦 Installation

```bash
npm install algochat-ai
```

or with yarn:

```bash
yarn add algochat-ai
```

**Requirements:** Node.js ≥ 16.0.0

---

## 🚀 Quick Start

```js
const algochat = require('algochat-ai');

// Simple one-liner
const reply = await algochat.chat('What is the capital of France?');
console.log(reply); // "The capital of France is Paris."
```

Or using ES module syntax:

```js
import { chat } from 'algochat-ai';

const reply = await chat('Explain quantum entanglement in simple terms.');
console.log(reply);
```

---
## 📖 API Reference

### `chat(input, options?)`

Send a message and receive the **full response** as a string.

```js
const algochat = require('algochat-ai');

// String shorthand
const reply = await algochat.chat('Hello!');

// Full messages array (OpenAI style)
const reply2 = await algochat.chat([
  { role: 'system', content: 'You are a helpful assistant.' },
  { role: 'user', content: 'What is 2+2?' },
]);

// With options
const reply3 = await algochat.chat('Tell me a joke.', {
  model: 'gemini-3-flash-preview',
  systemPrompt: 'You are a stand-up comedian.',
  timeout: 30000,
  debug: true,
});
```

**Parameters:**

| Parameter | Type | Description |
|-----------|------|-------------|
| `input` | `string \| Message[]` | A plain text message **or** an OpenAI-style messages array |
| `options.model` | `string` | Model ID (see [Available Models](#-available-models)). Default: `gemini-3-flash-preview` |
| `options.systemPrompt` | `string` | Custom system prompt |
| `options.timeout` | `number` | Request timeout in ms. Default: `90000` |
| `options.debug` | `boolean` | Print verbose logs. Default: `false` |

**Returns:** `Promise<string>` — the AI's response text

---
### `chatStream(input, onChunk, options?)`

Stream the response **token by token** as it's generated. Great for terminal UIs or real-time UX.

```js
const algochat = require('algochat-ai');

await algochat.chatStream(
  'Write a short story about a robot.',
  (chunk) => {
    process.stdout.write(chunk); // Print each token as it arrives
  }
);
```

With async/await and full text:

```js
const fullText = await algochat.chatStream(
  'Explain recursion.',
  (chunk) => process.stdout.write(chunk),
  { model: 'gemini-3-flash-preview' }
);

console.log('\nTotal length:', fullText.length);
```

**Parameters:**

| Parameter | Type | Description |
|-----------|------|-------------|
| `input` | `string \| Message[]` | Same as `chat()` |
| `onChunk` | `(text: string) => void` | Called with each text delta |
| `options` | `ChatOptions` | Same as `chat()` |

**Returns:** `Promise<string>` — the full accumulated response

---
### `chatWithImage(text, imageUrl, options?)`

Send a **text question alongside an image** (multimodal).

```js
const algochat = require('algochat-ai');

// With a public image URL
const desc = await algochat.chatWithImage(
  'What do you see in this image?',
  'https://example.com/photo.jpg'
);

// With a base64 data URI
const fs = require('fs');
const imageData = fs.readFileSync('photo.png');
const base64Uri = `data:image/png;base64,${imageData.toString('base64')}`;

const analysis = await algochat.chatWithImage(
  'Describe this image in detail.',
  base64Uri
);
```

**Parameters:**

| Parameter | Type | Description |
|-----------|------|-------------|
| `text` | `string` | Your question about the image |
| `imageUrl` | `string` | `https://` URL **or** `data:image/...;base64,...` URI |
| `options` | `ChatOptions` | Same as `chat()` |

**Returns:** `Promise<string>`

---
### `createCompletion(messages, options?)`

Returns a full **OpenAI-compatible** chat completion object. Useful when you want the same shape as `openai.chat.completions.create()`.

```js
const algochat = require('algochat-ai');

const completion = await algochat.createCompletion([
  { role: 'user', content: 'Say hello!' }
]);

console.log(completion.choices[0].message.content);
// → "Hello! How can I help you today?"

console.log(completion.usage);
// → { prompt_tokens: ..., completion_tokens: ..., total_tokens: ... }
```

**Returns:** `Promise<ChatCompletion>` — OpenAI-compatible object

---
### `AlgoChatClient` (class)

For advanced use cases, create a **custom client instance** with its own session cache.

```js
const { AlgoChatClient } = require('algochat-ai');

const client = new AlgoChatClient({
  debug: true,    // Verbose logging
  timeout: 60000, // 60s timeout
});

// Get the raw SSE stream
const { stream, chatId } = await client.createChatStream({
  model: 'gemini-3-flash-preview',
  messages: [{ role: 'user', content: 'Hello!' }],
  systemPrompt: 'You are a helpful assistant.',
});

// Option A: consume as text
const text = await client.streamToText(stream);
console.log(text);

// Option B: process the stream manually
stream.on('data', (chunk) => {
  // raw SSE chunks from algochat.app
});
stream.on('end', () => console.log('Done'));
```

#### Constructor options

| Option | Type | Description |
|--------|------|-------------|
| `debug` | `boolean` | Enable verbose logging |
| `timeout` | `number` | Request timeout in ms |

#### Methods

| Method | Returns | Description |
|--------|---------|-------------|
| `createChatStream(params)` | `Promise<{stream, chatId}>` | Start a chat and get the raw SSE stream |
| `streamToText(stream)` | `Promise<string>` | Collect full text from an SSE stream |
| `uploadImage(imageSource, cookies, filename?)` | `Promise<UploadedFile>` | Upload an image to AlgoChat |

---
### Model Utilities

```js
const { resolveModel, getModelList, ALGOCHAT_MODELS, DEFAULT_MODEL } = require('algochat-ai');

// Resolve any model name → AlgoChat model ID
resolveModel('gpt-4o');                 // → 'gemini-3-flash-preview'
resolveModel('gemini-3-flash-preview'); // → 'gemini-3-flash-preview'
resolveModel('unknown');                // → 'gemini-3-flash-preview' (fallback)

// Get all models in OpenAI format
const models = getModelList();

// Raw model definitions
console.log(ALGOCHAT_MODELS);

// Default model
console.log(DEFAULT_MODEL); // 'gemini-3-flash-preview'
```

---
## 🤖 Available Models

| Model ID | Name | Context | Free? |
|----------|------|---------|-------|
| `gemini-3-flash-preview` | Gemini 3 Flash Preview | 128K tokens | ✅ **Yes** |
| `gemini-2.5-pro` | Gemini 2.5 Pro | 200K tokens | ❌ Needs AlgoChat login |
| `gemini-2.0-flash` | Gemini 2.0 Flash | 128K tokens | ❌ Needs AlgoChat login |

**OpenAI aliases** (all map to `gemini-3-flash-preview`):
`gpt-4o`, `gpt-4`, `gpt-3.5-turbo`, `gpt-4-turbo`, `gpt-4o-mini`

> **Note:** Only `gemini-3-flash-preview` works without an AlgoChat account.
> The other models require a valid logged-in session, which this package doesn't support yet.

---
## 📱 Baileys WhatsApp Integration

This is the **primary use case** — building a WhatsApp bot using [@whiskeysockets/baileys](https://github.com/WhiskeySockets/Baileys).

### Prerequisites

```bash
npm install @whiskeysockets/baileys algochat-ai qrcode-terminal pino
```

### Complete Working Bot

```js
const {
  default: makeWASocket,
  DisconnectReason,
  useMultiFileAuthState,
  fetchLatestBaileysVersion,
} = require('@whiskeysockets/baileys');

const algochat = require('algochat-ai');
const qrcode = require('qrcode-terminal');
const P = require('pino');

// Per-user conversation history (keeps the last 10 exchanges)
const history = new Map();
const MAX_HISTORY = 10;

async function connectToWhatsApp() {
  const { state, saveCreds } = await useMultiFileAuthState('./auth_info');
  const { version } = await fetchLatestBaileysVersion();

  const sock = makeWASocket({
    version,
    auth: state,
    logger: P({ level: 'silent' }),
    printQRInTerminal: false,
  });

  // Show QR code
  sock.ev.on('connection.update', ({ connection, lastDisconnect, qr }) => {
    if (qr) {
      console.log('Scan this QR code:');
      qrcode.generate(qr, { small: true });
    }
    if (connection === 'open') console.log('✅ WhatsApp connected!');
    if (connection === 'close') {
      const shouldReconnect =
        lastDisconnect?.error?.output?.statusCode !== DisconnectReason.loggedOut;
      if (shouldReconnect) connectToWhatsApp();
    }
  });

  sock.ev.on('creds.update', saveCreds);

  // Handle incoming messages
  sock.ev.on('messages.upsert', async ({ messages, type }) => {
    if (type !== 'notify') return;

    for (const msg of messages) {
      if (msg.key.fromMe) continue;

      const jid = msg.key.remoteJid;
      const userMsg = msg.message?.conversation
        || msg.message?.extendedTextMessage?.text;

      if (!userMsg?.trim()) continue;

      // /clear command to reset the conversation
      if (userMsg.trim() === '/clear') {
        history.delete(jid);
        await sock.sendMessage(jid, { text: '🗑️ Chat history cleared!' }, { quoted: msg });
        continue;
      }

      console.log(`💬 ${jid}: ${userMsg}`);
      await sock.sendPresenceUpdate('composing', jid); // "typing..." indicator

      try {
        // Get or create the user's conversation history
        const userHistory = history.get(jid) || [];

        // Build the full messages array with history
        const chatMessages = [
          {
            role: 'system',
            content: 'You are a helpful WhatsApp assistant. Keep responses concise and friendly.',
          },
          ...userHistory,
          { role: 'user', content: userMsg },
        ];

        const reply = await algochat.chat(chatMessages);

        // Save to history
        userHistory.push({ role: 'user', content: userMsg });
        userHistory.push({ role: 'assistant', content: reply });
        if (userHistory.length > MAX_HISTORY * 2) userHistory.splice(0, 2);
        history.set(jid, userHistory);

        await sock.sendPresenceUpdate('paused', jid);
        await sock.sendMessage(jid, { text: reply }, { quoted: msg });
        console.log(`🤖 Replied: ${reply.slice(0, 80)}...`);
      } catch (err) {
        console.error('Error:', err.message);
        await sock.sendPresenceUpdate('paused', jid);
        await sock.sendMessage(
          jid,
          { text: '⚠️ Sorry, something went wrong. Please try again.' },
          { quoted: msg }
        );
      }
    }
  });
}

connectToWhatsApp();
```

### Group Chat Support

To respond in group chats only when the message starts with `!`:

```js
// Inside the message handler:
const isGroup = jid.endsWith('@g.us');
if (isGroup && !userMsg.startsWith('!')) continue; // Skip unprefixed group messages
const cleanMsg = userMsg.startsWith('!') ? userMsg.slice(1).trim() : userMsg;
```

### Handling Image Messages

`downloadMediaMessage` is exported by Baileys:

```js
const { downloadMediaMessage } = require('@whiskeysockets/baileys');

// Inside the message handler:
const imageMsg = msg.message?.imageMessage;
const caption = imageMsg?.caption;

if (imageMsg) {
  // Download the image and re-encode it as a base64 data URI
  const buffer = await downloadMediaMessage(msg, 'buffer', {});
  const base64 = `data:${imageMsg.mimetype};base64,${buffer.toString('base64')}`;

  const reply = await algochat.chatWithImage(
    caption || 'What is in this image?',
    base64
  );
  await sock.sendMessage(jid, { text: reply }, { quoted: msg });
}
```

---
## 🔌 OpenAI SDK Compatibility

Since `algochat-ai` uses the same message format as OpenAI, you can use it as a **drop-in replacement**:

### Replace the OpenAI SDK

```js
// BEFORE (using the openai SDK — needs an API key)
const OpenAI = require('openai');
const openai = new OpenAI({ apiKey: process.env.OPENAI_API_KEY });

const completion = await openai.chat.completions.create({
  model: 'gpt-4o',
  messages: [{ role: 'user', content: 'Hello!' }],
});
const text = completion.choices[0].message.content;

// ─────────────────────────────────────────────────────────────────
// AFTER (using algochat-ai — no API key needed!)
const algochat = require('algochat-ai');

const text2 = await algochat.chat('Hello!');
// or, for the same response shape:
const completion2 = await algochat.createCompletion([
  { role: 'user', content: 'Hello!' }
]);
const text3 = completion2.choices[0].message.content;
```

### With LangChain (custom LLM wrapper)

```js
const algochat = require('algochat-ai');

// Minimal LangChain-style wrapper around algochat-ai
class AlgoChatLLM {
  async call(prompt) {
    return algochat.chat(prompt);
  }
}
```

---
## 💬 Conversation History

Pass a full messages array to maintain context across turns:

```js
const algochat = require('algochat-ai');

// Build history manually
const conversation = [
  { role: 'system', content: 'You are a concise math tutor.' },
  { role: 'user', content: 'What is 5 + 3?' },
  { role: 'assistant', content: '5 + 3 = 8.' },
  { role: 'user', content: 'Now double that.' },
];

const reply = await algochat.chat(conversation);
console.log(reply); // "8 doubled is 16."
```

> **Tip:** Keep history to the last 10–20 exchanges to avoid exceeding context limits.
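
A small helper can enforce that cap automatically. This is illustrative only and not exported by the package:

```javascript
// Illustrative helper — not part of algochat-ai.
// Appends one user/assistant exchange and drops the oldest
// pairs so at most `maxExchanges` exchanges remain.
function pushExchange(history, userText, assistantText, maxExchanges = 20) {
  history.push({ role: 'user', content: userText });
  history.push({ role: 'assistant', content: assistantText });
  while (history.length > maxExchanges * 2) {
    history.splice(0, 2); // remove the oldest user/assistant pair
  }
  return history;
}
```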

---

## 🖼️ Multimodal Images

Three ways to send images:

### 1. Convenience helper

```js
const reply = await algochat.chatWithImage(
  'What objects are in this photo?',
  'https://example.com/photo.jpg'
);
```

### 2. Inline content array (OpenAI style)

```js
const reply = await algochat.chat([
  {
    role: 'user',
    content: [
      { type: 'text', text: 'Describe this image.' },
      { type: 'image_url', image_url: { url: 'https://example.com/photo.jpg' } },
    ],
  },
]);
```

### 3. Base64 data URI

```js
const fs = require('fs');
const image = fs.readFileSync('./photo.png');
const uri = `data:image/png;base64,${image.toString('base64')}`;

const reply = await algochat.chatWithImage('What is in this image?', uri);
```

---

## 🛡️ Error Handling

```js
const algochat = require('algochat-ai');

try {
  const reply = await algochat.chat('Hello!');
  console.log(reply);
} catch (err) {
  if (err.response) {
    // HTTP error from algochat.app
    console.error('HTTP Error:', err.response.status, err.response.data);
  } else if (err.code === 'ECONNABORTED') {
    console.error('Request timed out. Try again.');
  } else {
    console.error('Error:', err.message);
  }
}
```

**Common errors:**

| Error | Cause | Fix |
|-------|-------|-----|
| `ECONNABORTED` | Request timed out | Increase the `timeout` option |
| `403` | AlgoChat blocked the request | The library auto-retries once; if it persists, try again later |
| `500` | AlgoChat server error | Usually temporary; retry |
| `Invalid base64 data URI` | Malformed image data | Ensure the format is `data:image/jpeg;base64,...` |
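
For the transient `403`/`500` rows, a generic retry wrapper can help. This is a sketch, not part of the package (and note the library already retries once internally on 401/403):

```javascript
// Sketch: retry an async call with exponential backoff.
// Gives up immediately on HTTP statuses that won't change on retry.
async function withRetry(fn, attempts = 3, baseDelayMs = 500) {
  const transient = new Set([403, 429, 500, 502, 503]);
  let lastErr;
  for (let i = 0; i < attempts; i++) {
    try {
      return await fn();
    } catch (err) {
      lastErr = err;
      const status = err.response?.status;
      if (status && !transient.has(status)) throw err; // e.g. 400: don't retry
      await new Promise((resolve) => setTimeout(resolve, baseDelayMs * 2 ** i));
    }
  }
  throw lastErr;
}

// Usage: const reply = await withRetry(() => algochat.chat('Hello!'));
```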

---

## 🔷 TypeScript Usage

Full type support is included:

```ts
import { chat, chatStream, chatWithImage, createCompletion, AlgoChatClient } from 'algochat-ai';
import type { Message, ChatOptions, ChatCompletion } from 'algochat-ai';

// Typed messages
const messages: Message[] = [
  { role: 'system', content: 'You are helpful.' },
  { role: 'user', content: 'Hello!' },
];

const reply: string = await chat(messages);

// Typed completion
const completion: ChatCompletion = await createCompletion(messages);

// Custom client
const client = new AlgoChatClient({ debug: false, timeout: 60000 });
const { stream } = await client.createChatStream({ model: 'gemini-3-flash-preview', messages });
const text = await client.streamToText(stream);
```

---
## 📁 Examples Index

| File | Description |
|------|-------------|
| [`examples/basic.js`](examples/basic.js) | Simple `chat()` calls |
| [`examples/streaming.js`](examples/streaming.js) | Token-by-token streaming |
| [`examples/multimodal.js`](examples/multimodal.js) | Image analysis |
| [`examples/advanced.js`](examples/advanced.js) | Custom client, completions, error handling |
| [`examples/whatsapp-baileys.js`](examples/whatsapp-baileys.js) | Full Baileys WhatsApp bot |

Run any example:

```bash
node examples/basic.js
node examples/streaming.js
node examples/whatsapp-baileys.js
```

---
## ⚙️ How It Works

This package reverse-engineers the algochat.app API flow:

```
1. GET https://algochat.app/api/csrf
   → Receives csrf_token cookie

2. POST https://algochat.app/api/session
   → Receives zola_sid (anonymous user session)

3. POST https://algochat.app/api/create-chat
   { title: "New Chat", model: "gemini-3-flash-preview" }
   → Returns chatId

4. [Optional] POST https://algochat.app/api/files
   → Upload image, returns fileId

5. POST https://algochat.app/api/chat
   { chatId, userId, model, messages, ... }
   → SSE stream of text-delta events

6. Parse SSE stream:
   data: {"type":"text-delta","delta":"Hello"}
   data: {"type":"finish"}
   data: [DONE]
```

Sessions are cached for **25 minutes** and automatically refreshed. On 401/403 errors, the session is invalidated and re-fetched before retrying once.
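
Step 6 can be sketched as a line-oriented parser. This is illustrative; the event shapes match the stream excerpt above:

```javascript
// Sketch: extract the response text from the SSE payload format above.
function parseSseText(raw) {
  let text = '';
  for (const line of raw.split('\n')) {
    if (!line.startsWith('data: ')) continue;
    const payload = line.slice('data: '.length).trim();
    if (payload === '[DONE]') break;
    try {
      const event = JSON.parse(payload);
      if (event.type === 'text-delta') text += event.delta;
    } catch {
      // ignore malformed or partial lines
    }
  }
  return text;
}
```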

---

## ⚠️ Limitations

- 🔓 **Anonymous only** — Only the `gemini-3-flash-preview` model works without authentication
- 🚫 **File uploads** — The `/api/files` endpoint returns 403 for anonymous sessions; use public image URLs or base64 data URIs inline in messages instead
- 📡 **Upstream dependent** — This package depends on algochat.app's undocumented API, which may change without notice
- 🔁 **Rate limiting** — AlgoChat may rate-limit aggressive usage; add delays if needed
- 📏 **Context** — Each chat is a new session (no server-side memory); pass history yourself in the messages array

---

## 🤝 Contributing

This is an unofficial client. Contributions, bug reports, and PRs are welcome!

---

## 📄 License

MIT — see [LICENSE](LICENSE)

---

> **Disclaimer:** This is an unofficial package and is not affiliated with or endorsed by algochat.app. Use responsibly.