node-llama-cpp 2.1.0 → 2.1.1
This diff shows the changes between publicly released package versions as they appear in their respective public registries, and is provided for informational purposes only.
- package/README.md +25 -0
- package/llama/binariesGithubRelease.json +1 -1
- package/llamaBins/linux-arm64-16.node +0 -0
- package/llamaBins/linux-arm64-17.node +0 -0
- package/llamaBins/linux-arm64-18.node +0 -0
- package/llamaBins/linux-arm64-19.node +0 -0
- package/llamaBins/linux-arm64-20.node +0 -0
- package/llamaBins/linux-armv7l-16.node +0 -0
- package/llamaBins/linux-armv7l-17.node +0 -0
- package/llamaBins/linux-armv7l-18.node +0 -0
- package/llamaBins/linux-armv7l-19.node +0 -0
- package/llamaBins/linux-armv7l-20.node +0 -0
- package/llamaBins/linux-ppc64le-16.node +0 -0
- package/llamaBins/linux-ppc64le-17.node +0 -0
- package/llamaBins/linux-ppc64le-18.node +0 -0
- package/llamaBins/linux-ppc64le-19.node +0 -0
- package/llamaBins/linux-ppc64le-20.node +0 -0
- package/llamaBins/linux-x64-16.node +0 -0
- package/llamaBins/linux-x64-17.node +0 -0
- package/llamaBins/linux-x64-18.node +0 -0
- package/llamaBins/linux-x64-19.node +0 -0
- package/llamaBins/linux-x64-20.node +0 -0
- package/llamaBins/mac-arm64-16.node +0 -0
- package/llamaBins/mac-arm64-17.node +0 -0
- package/llamaBins/mac-arm64-18.node +0 -0
- package/llamaBins/mac-arm64-19.node +0 -0
- package/llamaBins/mac-arm64-20.node +0 -0
- package/llamaBins/mac-x64-16.node +0 -0
- package/llamaBins/mac-x64-17.node +0 -0
- package/llamaBins/mac-x64-18.node +0 -0
- package/llamaBins/mac-x64-19.node +0 -0
- package/llamaBins/mac-x64-20.node +0 -0
- package/llamaBins/win-x64-16.node +0 -0
- package/llamaBins/win-x64-17.node +0 -0
- package/llamaBins/win-x64-18.node +0 -0
- package/llamaBins/win-x64-19.node +0 -0
- package/llamaBins/win-x64-20.node +0 -0
- package/package.json +1 -4
package/README.md
CHANGED

````diff
@@ -175,6 +175,23 @@ console.log("AI: " + a2);
 console.log(JSON.parse(a2));
 ```
 
+### Metal and CUDA support
+To load a version of `llama.cpp` that was compiled to use Metal or CUDA,
+you have to build it from source with the `--metal` or `--cuda` flag before running your code that imports `node-llama-cpp`.
+
+To do this, run this command inside of your project directory:
+```bash
+# For Metal support on macOS
+npx node-llama-cpp download --metal
+
+# For CUDA support
+npx node-llama-cpp download --cuda
+```
+
+> In order for `node-llama-cpp` to be able to build `llama.cpp` from source, make sure you have the required dependencies of `node-gyp` installed.
+>
+> More info is available [here](https://github.com/nodejs/node-gyp#on-unix) (you don't have to install `node-gyp` itself, just the dependencies).
+
 ### CLI
 ```
 Usage: node-llama-cpp <command> [options]
@@ -206,6 +223,10 @@ Options:
                      ronment variable [string] [default: "latest"]
   -a, --arch         The architecture to compile llama.cpp for [string]
   -t, --nodeTarget   The Node.js version to compile llama.cpp for. Example: v18.0.0 [string]
+      --metal        Compile llama.cpp with Metal support. Can also be set via the NODE_LLAMA_CP
+                     P_METAL environment variable [boolean] [default: false]
+      --cuda         Compile llama.cpp with CUDA support. Can also be set via the NODE_LLAMA_CPP
+                     _CUDA environment variable [boolean] [default: false]
   --skipBuild, --sb  Skip building llama.cpp after downloading it [boolean] [default: false]
   -v, --version      Show version number [boolean]
 ```
@@ -220,6 +241,10 @@ Options:
   -h, --help         Show help [boolean]
   -a, --arch         The architecture to compile llama.cpp for [string]
   -t, --nodeTarget   The Node.js version to compile llama.cpp for. Example: v18.0.0 [string]
+      --metal        Compile llama.cpp with Metal support. Can also be set via the NODE_LLAMA_CPP_MET
+                     AL environment variable [boolean] [default: false]
+      --cuda         Compile llama.cpp with CUDA support. Can also be set via the NODE_LLAMA_CPP_CUDA
+                     environment variable [boolean] [default: false]
   -v, --version      Show version number [boolean]
 ```
 
````
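The new CLI help text states that `--metal` and `--cuda` can also be enabled through environment variables. As a sketch of the env-var form (assuming a POSIX shell; the `download` command itself is left commented out since it requires `node-llama-cpp` to be installed):

```shell
# Per the help text, setting the documented environment variable is
# equivalent to passing the --cuda flag on the command line.
export NODE_LLAMA_CPP_CUDA=true
# npx node-llama-cpp download   # would now build llama.cpp with CUDA support
echo "$NODE_LLAMA_CPP_CUDA"
```

This form is convenient in CI, where the variable can be set once for all `node-llama-cpp` invocations in a job.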
35 binary files changed (the prebuilt `.node` binaries for Linux, macOS, and Windows listed above; contents not shown)
package/package.json
CHANGED

```diff
@@ -1,6 +1,6 @@
 {
   "name": "node-llama-cpp",
-  "version": "2.1.0",
+  "version": "2.1.1",
   "description": "node.js bindings for llama.cpp",
   "main": "dist/index.js",
   "type": "module",
@@ -111,7 +111,6 @@
     "zx": "^7.2.3"
   },
   "dependencies": {
-    "bytes": "^3.1.2",
     "chalk": "^5.3.0",
     "cli-progress": "^3.12.0",
     "cross-env": "^7.0.3",
@@ -119,9 +118,7 @@
     "env-var": "^7.3.1",
     "fs-extra": "^11.1.1",
     "node-addon-api": "^7.0.0",
-    "node-downloader-helper": "^2.1.9",
     "node-gyp": "^9.4.0",
-    "node-stream-zip": "^1.15.0",
     "octokit": "^3.1.0",
     "ora": "^7.0.1",
     "simple-git": "^3.19.1",
```