llmasaservice-client 0.0.2 → 0.0.4
This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
- package/CHANGELOG.md +12 -0
- package/package.json +9 -2
- package/{README.md → readme.md} +8 -5
package/CHANGELOG.md
CHANGED
package/package.json
CHANGED
```diff
@@ -1,7 +1,7 @@
 {
   "name": "llmasaservice-client",
   "license": "MIT",
-  "version": "0.0.2",
+  "version": "0.0.4",
   "main": "dist/index.js",
   "module": "dist/index.mjs",
   "types": "dist/index.d.ts",
@@ -30,5 +30,12 @@
     "openAI",
     "chat"
   ],
-  "homepage": "https://llmasaservice.io"
+  "homepage": "https://llmasaservice.io",
+  "repository": {
+    "type": "git",
+    "url": "git+https://github.com/PredictabilityAtScale/useLLM.git"
+  },
+  "bugs": {
+    "url": "https://github.com/PredictabilityAtScale/useLLM/issues"
+  }
 }
```
package/{README.md → readme.md}
RENAMED
```diff
@@ -1,4 +1,7 @@
 # LLMAsAService.io for Client Side Code
+
+Website: ([www.llmasaservice.io](https://www.llmasaservice.io))
+Control panel: ([app.llmasaservice.io](https://app.llmasaservice.io))
 A product by CASEy, Inc ([heycasey.io](https://heycasey.io))
 
 ## What is LLMAsAService.io?
```
```diff
@@ -54,11 +57,11 @@ The settings you need are the same settings you would pass to the vendor using t
 
 After logging into the control panel, choose LLMServices from the right panel menu
 
-![image](…)
+![image](…)
 
 1. Click the Add Service box to create your first or another LLM Service Endpoint
 
-![image](…)
+![image](…)
 
 2. Name and choose your Vendor. We will boilerplate that vendor's endpoint URL, header and body as a starting point. These will vary based on your needs and the vendor chosen (we are documenting each vendor now; until then, email us if you have difficulty). Tip: We use the streaming features for each vendor, so make sure that the streaming options are enabled. For example, for OpenAI add these at the end of the rest of the body JSON:
 
```
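Step 2 above says to enable the vendor's streaming option in the body JSON, but the exact snippet falls outside this hunk. As a hedged sketch only (the `model` and `messages` values here are illustrative placeholders, not taken from the package), an OpenAI-style request body with streaming enabled typically ends with a `stream` flag:

```json
{
  "model": "gpt-3.5-turbo",
  "messages": [],
  "stream": true
}
```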
```diff
@@ -92,11 +95,11 @@ Bad (no quotes on names, and trailing , before ending brace):
 6. Click on the Edit button for the new service. The Add or Update API Key and Test buttons will now be enabled. Click **Add or Update API Key**
 7. We properly encrypt and save your API key (we cannot retrieve it for you; if lost, create a new key from your vendor). Get the API key from your LLM vendor's developer control panel, paste it into the dialog box, and click Save. (One of the advantages of using LLMAsAService is that safe key management is in one place; we found this convenient and safer than using command-line tools and config files.)
 
-![image](…)
+![image](…)
 
 8. Make a test call. Click the **Test Call** button and confirm you get a response
 
-![image](…)
+![image](…)
 
 Repeat those steps for your other providers and configurations (one in North America, one in the EU, one for Azure or Anthropic, etc.)
 
```
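The hunk header above references the readme's warning that unquoted property names and trailing commas make the body invalid JSON. A quick way to sanity-check a hand-edited body is to run it through `JSON.parse`; the strings below are illustrative examples, not taken from the package:

```typescript
// Valid JSON requires double-quoted property names and no trailing commas.
const good = '{ "stream": true, "max_tokens": 100 }';
const bad = '{ stream: true, "max_tokens": 100, }'; // unquoted name, trailing comma

function isValidJson(text: string): boolean {
  try {
    JSON.parse(text); // throws a SyntaxError on malformed input
    return true;
  } catch {
    return false;
  }
}

console.log(isValidJson(good)); // true
console.log(isValidJson(bad)); // false
```

Running the check before pasting a body into the service configuration catches exactly the two mistakes the readme calls out.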
```diff
@@ -177,4 +180,4 @@ Calling **send** makes a secure call to LLMAsAService where a response is marsha
 
 We have pre-built UIs in the works, but for now, you can call send and display the response wherever needed. An additional property **idle** can be used to disable the send buttons when a response is ongoing. It will be true when idle, false when busy.
 
-We also accept Abort functionality, and are in the process of documenting that now. If you need it email help@heycasey.io and we'll sort you out.
+We also accept Abort functionality, and are in the process of documenting that now. If you need it email help@heycasey.io and we'll sort you out.
```
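The `send`, `idle`, and Abort behavior described in the final hunk can be sketched generically. The class below is a hypothetical illustration, not the actual `llmasaservice-client` API (whose signatures this diff does not show); it only demonstrates how an idle flag and `AbortController`-based cancellation typically fit together in a streaming client:

```typescript
// Hypothetical sketch: not the real llmasaservice-client API.
class StreamingClient {
  idle = true; // true when no request is in flight (use to disable send buttons)
  private controller: AbortController | null = null;

  async send(prompt: string, respond: (chunk: string) => void): Promise<void> {
    this.idle = false; // busy while the response streams in
    this.controller = new AbortController();
    try {
      // A real client would stream fetch() chunks here, checking the abort
      // signal between chunks; we simulate a single echoed chunk.
      if (this.controller.signal.aborted) return;
      respond(`echo: ${prompt}`);
    } finally {
      this.idle = true; // re-enable the UI once the response completes
      this.controller = null;
    }
  }

  abort(): void {
    this.controller?.abort(); // cancel an in-flight response
  }
}
```

Wiring `idle` to a button's disabled state and `abort()` to a Stop button matches the usage the readme describes.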