@sweetoburrito/backstage-plugin-ai-assistant-backend-module-embeddings-provider-azure-open-ai 0.3.4 → 0.3.6
This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
- package/README.md +77 -2
- package/config.d.ts +1 -0
- package/package.json +2 -2
package/README.md
CHANGED
```diff
@@ -1,5 +1,80 @@
 # @sweetoburrito/backstage-plugin-ai-assistant-backend-module-embeddings-provider-azure-open-ai
 
-
+An embeddings provider module that lets the Backstage AI Assistant backend create vector embeddings
+using Azure-hosted embedding models (Azure OpenAI / Azure AI Foundry).
 
-
+This README explains how the provider works, when to use it, configuration options, and how to wire
+it into your Backstage backend.
+
+## Features
+
+- Convert text or documents to numeric vector embeddings using Azure OpenAI / Azure AI Foundry
+  embedding deployments.
+- Exposes a provider implementation compatible with the AI Assistant backend so different
+  embeddings services can be swapped without changing the rest of the app.
+- Handles basic batching and optional configuration for deployment name / endpoint selection.
+
+## When to use
+
+Use this module if you run Azure-hosted embedding models and want the AI Assistant to build
+semantic search indices, vector stores, or provide retrieval-augmented generation (RAG)
+capabilities in Backstage.
+
+## Configuration
+
+Add the provider configuration to your Backstage `app-config.yaml` or `app-config.local.yaml` under
+`aiAssistant.embeddings.azureOpenAI`.
+
+Minimum configuration keys (example):
+
+```yaml
+aiAssistant:
+  embeddings:
+    azureOpenAI:
+      endpoint: 'https://eastus.api.cognitive.microsoft.com/openai/deployments/text-embedding-3-large/embeddings?api-version=2023-05-15'
+      openAIApiVersion: 2024-12-01-preview
+      deploymentName: 'text-embedding-3-large'
+      instanceName: 'eastus'
+      apiKey: ${AZURE_OPENAI_API_KEY}
+```
+
+Field descriptions:
+
+- `endpoint` - The full Azure endpoint URL for the embeddings deployment. This may include
+  the deployment path and api-version query parameter depending on your Azure setup.
+- `deploymentName` - The name of the deployment that provides the embeddings model.
+- `instanceName` - The Azure instance / region name (used for telemetry or constructing alternate
+  endpoint forms in some setups).
+- `apiKey` - Your Azure OpenAI or Azure AI API key. Marked as secret in configuration.
+- `openAIApiVersion` - The Azure OpenAI API version sent with embedding requests (for example `2024-12-01-preview`, as in the sample above).
+
+The exact keys available and required depend on your Azure configuration. Check the provider's
+`config.d.ts` in the package for the canonical types used by the module.
+
+## Install
+
+Install the module into your Backstage backend workspace:
+
+```sh
+yarn workspace backend add @sweetoburrito/backstage-plugin-ai-assistant-backend-module-embeddings-provider-azure-open-ai
+```
+
+## Wire the provider into your backend
+
+Add the provider module import to your backend entrypoint (usually `packages/backend/src/index.ts`):
+
+```diff
+ // packages/backend/src/index.ts
+
+ // other backend modules...
+ backend.add(import('@sweetoburrito/backstage-plugin-ai-assistant-backend'));
+
+ // Add the Azure OpenAI embeddings provider
++backend.add(
++  import(
++    '@sweetoburrito/backstage-plugin-ai-assistant-backend-module-embeddings-provider-azure-open-ai'
++  ),
++);
+```
+
+Restart your backend after adding the provider so it registers with the AI Assistant plugin.
```
package/config.d.ts
CHANGED
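The one-line addition to `config.d.ts` is not rendered in this diff. For orientation only, a rough sketch of the configuration shape implied by the keys documented in the new README might look like the following; the field names come from the README, while the optionality of each field and the `@visibility` annotation are assumptions, and the package's own `config.d.ts` remains the canonical source.

```ts
// Hypothetical sketch inferred from the README above -- not copied from the package.
export interface Config {
  aiAssistant?: {
    embeddings?: {
      azureOpenAI?: {
        /** Full Azure endpoint URL for the embeddings deployment. */
        endpoint: string;
        /** Azure OpenAI API version, e.g. 2024-12-01-preview. */
        openAIApiVersion?: string;
        /** Name of the Azure deployment serving the embeddings model. */
        deploymentName: string;
        /** Azure instance / region name. */
        instanceName?: string;
        /**
         * Azure OpenAI or Azure AI API key.
         * @visibility secret
         */
        apiKey: string;
      };
    };
  };
}
```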
package/package.json
CHANGED
```diff
@@ -1,6 +1,6 @@
 {
   "name": "@sweetoburrito/backstage-plugin-ai-assistant-backend-module-embeddings-provider-azure-open-ai",
-  "version": "0.3.4",
+  "version": "0.3.6",
   "license": "Apache-2.0",
   "description": "The embeddings-provider-azure-open-ai backend module for the ai-assistant plugin.",
   "main": "dist/index.cjs.js",
@@ -30,7 +30,7 @@
   "dependencies": {
     "@backstage/backend-plugin-api": "backstage:^",
     "@langchain/openai": "^0.6.11",
-    "@sweetoburrito/backstage-plugin-ai-assistant-node": "^0.
+    "@sweetoburrito/backstage-plugin-ai-assistant-node": "^0.7.0"
   },
   "devDependencies": {
     "@backstage/backend-test-utils": "backstage:^",
```
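The only runtime dependency change is the bump of `@sweetoburrito/backstage-plugin-ai-assistant-node` to `^0.7.0`; `@langchain/openai` stays at `^0.6.11`, which suggests the provider keeps delegating embedding calls to LangChain's Azure OpenAI client. As a hypothetical illustration only (not the module's actual source), the README's config keys could map onto that client roughly as follows; the README's `endpoint` key, which would correspond to the client's endpoint/base-path option, is omitted from the sketch.

```ts
// Hypothetical mapping of the README's config keys onto @langchain/openai's
// AzureOpenAIEmbeddings client -- an illustration, not the module's real code.
import { AzureOpenAIEmbeddings } from '@langchain/openai';

async function main() {
  const embeddings = new AzureOpenAIEmbeddings({
    azureOpenAIApiKey: process.env.AZURE_OPENAI_API_KEY,    // apiKey
    azureOpenAIApiInstanceName: 'eastus',                    // instanceName
    azureOpenAIApiDeploymentName: 'text-embedding-3-large',  // deploymentName
    azureOpenAIApiVersion: '2024-12-01-preview',             // openAIApiVersion
  });

  // The AI Assistant backend would index vectors like these for semantic
  // search and RAG over catalog content.
  const vectors = await embeddings.embedDocuments([
    'How do I register a new component?',
    'Where are the TechDocs for service-x?',
  ]);
  console.log(vectors.length, vectors[0].length);
}

main().catch(console.error);
```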