@appland/appmap 3.168.0 → 3.169.0
This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
- package/CHANGELOG.md +14 -0
- package/built/docs/appmap-docs.md +1 -1
- package/built/docs/community.md +1 -1
- package/built/docs/get-started-with-appmap/index.md +1 -1
- package/built/docs/get-started-with-appmap/navie-ai-quickstart.md +6 -5
- package/built/docs/navie-reference/index.md +18 -0
- package/built/docs/navie-reference/navie-accessing-logs.md +42 -0
- package/built/docs/navie-reference/navie-bring-your-own-model-examples.md +202 -0
- package/built/docs/navie-reference/navie-commands.md +119 -0
- package/built/docs/navie-reference/navie-github-repository.md +12 -0
- package/built/docs/{using-navie-ai/how-navie-works.md → navie-reference/navie-how-it-works.md} +5 -4
- package/built/docs/navie-reference/navie-openai-key-management.md +83 -0
- package/built/docs/navie-reference/navie-options.md +87 -0
- package/built/docs/navie-reference/navie-pre-built-libraries-for-appmap-data.md +23 -0
- package/built/docs/navie-reference/navie-user-interface.md +32 -0
- package/built/docs/navie-reference.md +1 -0
- package/built/docs/reference/appmap-airgapped-install.md +293 -0
- package/built/docs/reference/appmap-client-cli.md +1 -1
- package/built/docs/reference/appmap-java.md +1 -1
- package/built/docs/reference/guides/exporting-appmap-diagrams.md +1 -0
- package/built/docs/reference/guides/handling-large-appmap-diagrams.md +1 -1
- package/built/docs/reference/guides/index.md +1 -0
- package/built/docs/reference/guides/navigating-code-objects.md +2 -2
- package/built/docs/reference/guides/openapi.md +1 -1
- package/built/docs/reference/guides/reading-sql-in-appmap-diagrams.md +1 -1
- package/built/docs/reference/guides/refine-appmap-data.md +1 -1
- package/built/docs/reference/guides/reverse-engineering.md +1 -0
- package/built/docs/reference/guides/using-appmap-analysis.md +1 -1
- package/built/docs/reference/guides/using-appmap-diagrams.md +2 -2
- package/built/docs/reference/index.md +3 -1
- package/built/docs/reference/integrations/atlassian-compass.md +1 -0
- package/built/docs/reference/integrations/atlassian-confluence.md +1 -0
- package/built/docs/reference/integrations/docker.md +1 -0
- package/built/docs/reference/integrations/index.md +1 -0
- package/built/docs/reference/integrations/mermaid.md +1 -0
- package/built/docs/reference/integrations/plantuml.md +1 -0
- package/built/docs/reference/integrations/postman.md +1 -0
- package/built/docs/reference/integrations/readme.md +1 -0
- package/built/docs/reference/integrations/smartbear-swaggerhub.md +2 -1
- package/built/docs/reference/uninstalling-appmap.md +1 -1
- package/built/docs/reference/vscode.md +1 -1
- package/built/docs/using-navie-ai/bring-your-own-model.md +38 -37
- package/built/docs/using-navie-ai/how-to-open-navie.md +5 -5
- package/built/docs/using-navie-ai/index.md +8 -7
- package/built/docs/using-navie-ai/using-navie.md +64 -75
- package/built/fulltext/FileIndex.js +0 -1
- package/built/fulltext/FileIndex.js.map +1 -1
- package/package.json +2 -2
- package/built/docs/reference/integrations/circle-ci.md +0 -425
package/CHANGELOG.md
CHANGED
@@ -1,3 +1,17 @@
+# [@appland/appmap-v3.169.0](https://github.com/getappmap/appmap-js/compare/@appland/appmap-v3.168.1...@appland/appmap-v3.169.0) (2024-10-08)
+
+
+### Features
+
+* Update @appland/navie to v1.33.0 ([5c14b1c](https://github.com/getappmap/appmap-js/commit/5c14b1c78923105bf81d4da592e5f3b4b849495f))
+
+# [@appland/appmap-v3.168.1](https://github.com/getappmap/appmap-js/compare/@appland/appmap-v3.168.0...@appland/appmap-v3.168.1) (2024-10-07)
+
+
+### Bug Fixes
+
+* mjs file is not binary data ([5c26aed](https://github.com/getappmap/appmap-js/commit/5c26aedea2e587571fee4732cea00f72abad052e))
+
 # [@appland/appmap-v3.168.0](https://github.com/getappmap/appmap-js/compare/@appland/appmap-v3.167.0...@appland/appmap-v3.168.0) (2024-10-06)
 
 
package/built/docs/appmap-docs.md
CHANGED
@@ -18,7 +18,7 @@ By using AppMap data, Navie is the first AI code architect with the context to u
 
 Over 90,000 software developers are using the [AppMap extension for VSCode](https://marketplace.visualstudio.com/items?itemName=appland.appmap) and the [AppMap plugin for JetBrains](https://plugins.jetbrains.com/plugin/16701-appmap).
 
-<a class="btn btn-primary btn-lg" href="/docs/get-started-with-appmap/">Get Started
+<a class="btn btn-primary btn-lg" href="/docs/get-started-with-appmap/">Get Started</a>
 
 ![Navie AI answering a question about a Rails application](/assets/img/navie-main.webp)
 _AppMap Navie with Sequence diagram in Visual Studio Code_
package/built/docs/community.md
CHANGED
@@ -3,7 +3,7 @@ layout: docs
 toc: true
 title: Docs - Community
 description: "Join AppMap's vibrant community on Slack for discussions, issue reporting, and become a contributor."
-redirect_from: [/docs/troubleshooting]
+redirect_from: [/docs/troubleshooting, /community]
 ---
 # Community
 
package/built/docs/get-started-with-appmap/index.md
CHANGED
@@ -6,7 +6,7 @@ toc: true
 redirect_from: [/docs/your-first-15-minutes-with-appmap/, /docs/code-editor-extensions/,/docs/code-editor-extensions/appmap-for-vs-code, /docs/code-editor-extensions/appmap-for-jetbrains,/docs/setup-appmap-in-your-code-editor/index.html]
 ---
 
-# Get Started
+# Get Started
 
 <p class="alert alert-info">
 If at any point you would like some help, <a href="/slack">join us in Slack</a>!
package/built/docs/get-started-with-appmap/navie-ai-quickstart.md
CHANGED
@@ -20,7 +20,7 @@ redirect_from: [/docs/setup-appmap-in-your-code-editor/navie-ai-quickstart]
 
 By default, Navie uses an AppMap proxy of the latest OpenAI supported AI models. If you would like to customize your own model, you can leverage a variety of other AI model providers such as [Azure OpenAI](https://appmap.io/docs/navie-reference#azure-openai), [Fireworks.ai](https://appmap.io/docs/navie-reference#fireworks-ai), [LM Studio](https://appmap.io/docs/navie-reference#lm-studio), and more.
 
-If you have an active GitHub Copilot subscription, you can use Navie with the [Copilot
+If you have an active GitHub Copilot subscription, you can use Navie with the [Copilot Language Model](/docs/navie-reference/navie-bring-your-own-model-examples.html#github-copilot-language-model) as a supported backend. Refer to the [Navie Copilot documentation](/docs/navie-reference/navie-bring-your-own-model-examples.html#github-copilot-language-model) for instructions on how to enable it.
 
 ## Open AppMap Navie AI
 
@@ -32,20 +32,21 @@ To open the Navie Chat, open the AppMap plugin in the sidebar menu for your code
 
 ## Ask Navie about your App
 
-You can ask questions about your application with Navie immediately after installing the plugin.
+You can ask questions about your application with Navie immediately after installing the plugin. Navie will answer questions based on analysis of your project code. For increased accuracy on more complex projects, you can record AppMap data and Navie will utilize this information as well.
 
 By default, Navie will utilize an OpenAI service hosted by AppMap. If, for data privacy or other reasons, you do not wish to use the AppMap OpenAI proxy, you can [bring your own OpenAI API key](/docs/using-navie-ai/bring-your-own-model.html#bring-your-own-openai-api-key-byok), or use an [entirely different AI model](/docs/using-navie-ai/bring-your-own-model.html#ollama), hosted in your environment or hosted locally.
 
-When you ask a question to Navie, it will search through all
+When you ask a question to Navie, it will search through all the available AppMap data for your project to pull in relevant traces, sequence diagrams, and code snippets for analysis. It will send the selected context to your preferred LLM provider.
 
-
+To achieve the highest quality results, we suggest using the available command modes when prompting Navie. Simply type `@` into the chat input to access the list of available command modes.
 
-
+By default, Navie chat uses the `@explain` mode. Other specialized modes are available for generating diagrams, planning work, generating code and tests, and more. Consult the [Using Navie docs](/docs/navie-reference/navie-commands.html) for more details on Navie commands.
 
 The Navie UI includes a standard chat window, and a context panel which will include all the context that is included in the query to the AI provider. This context can include things such as:
 
 **Always available:**
 - Code Snippets
+- Pinned Content
 
 **If AppMap Data exists:**
 - Sequence Diagrams
package/built/docs/navie-reference/index.md
ADDED
@@ -0,0 +1,18 @@
+---
+layout: docs
+title: Docs - Reference
+description: "A reference for AppMap Navie AI"
+toc: true
+step: 1
+---
+
+# Navie Reference
+- [Navie Commands](/docs/navie-reference/navie-commands.html)
+- [Navie Options](/docs/navie-reference/navie-options.html)
+- [Bring Your Own Model Examples](/docs/navie-reference/navie-bring-your-own-model-examples.html)
+- [OpenAI Key Management](/docs/navie-reference/navie-openai-key-management.html)
+- [Accessing Navie Logs](/docs/navie-reference/navie-accessing-logs.html)
+- [GitHub Repository](/docs/navie-reference/navie-github-repository.html)
+- [How Navie Works](/docs/navie-reference/navie-how-it-works.html)
+- [Navie User Interface](/docs/navie-reference/navie-user-interface.html)
+- [Pre-built Libraries for Recording AppMap Data](/docs/navie-reference/navie-pre-built-libraries-for-appmap-data.html)
package/built/docs/navie-reference/navie-accessing-logs.md
ADDED
@@ -0,0 +1,42 @@
+---
+layout: docs
+title: Docs - Reference
+name: Accessing Navie Logs
+toc: true
+step: 7
+navie-reference: true
+description: "Reference Guide to AppMap Navie AI, how-to guide for accessing logs."
+---
+
+
+# Accessing Navie Logs
+
+## Visual Studio Code
+
+You can access the Navie logs in VS Code by opening the `Output` tab and selecting `AppMap Services` from the list of available output logs.
+
+To open the Output window, on the menu bar, choose View > Output; on Windows press `Ctrl+Shift+U`, or on Mac use `Shift+Command+U`.
+
+![VS Code open output window](/assets/img/docs/vscode-open-output-window.webp)
+
+Click on the output log dropdown in the right corner to view a list of all the available output logs.
+
+![VS Code view output logs](/assets/img/docs/vscode-view-output-logs.webp)
+
+Select the `AppMap: Services` log to view the logs from Navie.
+
+![VS Code appmap services logs](/assets/img/docs/vscode-appmap-services-logs.webp)
+
+## JetBrains
+
+You can enable debug logging of Navie in your JetBrains code editor by first opening `Help` > `Diagnostic Tools` > `Debug Log Settings`.
+
+![JetBrains debug log settings](/assets/img/docs/jetbrains-debug-log-settings.webp)
+
+In the `Custom Debug Log Configuration`, enter `appland` to enable DEBUG level logging for the AppMap plugin.
+
+![JetBrains custom debug configs](/assets/img/docs/jetbrains-custom-debug-configs.webp)
+
+Next, open `Help` > `Show Log...` to open the IDE log file.
+
+![JetBrains show logs](/assets/img/docs/jetbrains-show-logs.webp)
package/built/docs/navie-reference/navie-bring-your-own-model-examples.md
ADDED
@@ -0,0 +1,202 @@
+---
+layout: docs
+title: Docs - AppMap Navie
+description: "Reference Guide to AppMap Navie AI, examples of bring-your-own-llm configurations."
+name: Bring Your Own Model Examples
+navie-reference: true
+toc: true
+step: 5
+---
+
+# Bring Your Own Model Examples
+
+## GitHub Copilot Language Model
+
+Starting with VS Code `1.91` and greater, and with an active GitHub Copilot subscription, you can use Navie with the Copilot Language Model as a supported backend model. This allows you to leverage the powerful runtime-powered Navie AI Architect with your existing Copilot subscription. This is the recommended option for users in corporate environments where Copilot is the only approved and supported language model.
+
+#### Requirements <!-- omit in toc -->
+
+The following items are required to use the GitHub Copilot Language Model with Navie:
+
+- VS Code version `1.91` or greater
+- AppMap extension version `v0.123.0` or greater
+- GitHub Copilot VS Code extension must be installed
+- Signed into an active paid or trial GitHub Copilot subscription
+
+#### Setup <!-- omit in toc -->
+
+Open the VS Code settings, and search for `navie vscode`.
+
+<img class="video-screenshot" src="/assets/img/product/navie-copilot-1.webp"/>
+
+Click the box to use the `VS Code language model...`
+
+After clicking the box to enable the VS Code LM, you'll be instructed to reload VS Code to enable these changes.
+
+<img class="video-screenshot" src="/assets/img/product/navie-copilot-2.webp"/>
+
+After VS Code finishes reloading, open the AppMap extension.
+
+Select `New Navie Chat`, and confirm the model listed is `(via copilot)`.
+
+<img class="video-screenshot" src="/assets/img/product/navie-copilot-3.webp"/>
+
+You'll need to allow the AppMap extension access to the Copilot Language Models. After asking your first question to Navie, click `Allow` in the popup to grant the necessary access.
+
+<img class="video-screenshot" src="/assets/img/product/navie-copilot-4.webp"/>
+
+#### Troubleshooting <!-- omit in toc -->
+
+If you attempt to enable the Copilot language models without the Copilot extension installed, you'll see the following error in your code editor.
+
+<img class="video-screenshot" src="/assets/img/product/navie-copilot-5.webp"/>
+
+Click `Install Copilot` to complete the installation for language model support.
+
+If you have the Copilot extension installed, but have not signed in, you'll see the following notice.
+
+<img class="video-screenshot" src="/assets/img/product/navie-copilot-6.webp"/>
+
+Click `Sign in to GitHub` and log in with an account that has a valid paid or trial GitHub Copilot subscription.
+
+#### Video Demo <!-- omit in toc -->
+
+{% include vimeo.html id='992238965' %}
+
+## OpenAI
+
+**Note:** We recommend configuring your OpenAI key using the code editor extension. Follow the [Bring Your Own Key](/docs/using-navie-ai/bring-your-own-model.html#configuring-your-openai-key) docs for instructions.
+
+Only `OPENAI_API_KEY` needs to be set; other settings can stay default:
+
+| `OPENAI_API_KEY`| `sk-9spQsnE3X7myFHnjgNKKgIcGAdaIG78I3HZB4DFDWQGM` |
+
+When using your own OpenAI API key, you can also modify the OpenAI model for Navie to use. For example, you may want to use `gpt-3.5` or a preview model like `gpt-4-vision-preview`.
+
+| `APPMAP_NAVIE_MODEL`| `gpt-4-vision-preview` |
+
+## Anthropic (Claude)
+
+AppMap supports the Anthropic suite of large language models such as Claude Sonnet or Claude Opus.
+
+To use AppMap Navie with Anthropic LLMs, you need to generate an API key for your account.
+
+Log in to your [Anthropic dashboard](https://console.anthropic.com/dashboard), and choose the option to "Get API Keys".
+
+Click the box to "Create Key".
+
+![Anthropic API Keys Settings Page](/assets/img/product/anthropic-api-keys.webp)
+
+In the next box, give your key an easy-to-recognize name.
+
+![Anthropic API Key Naming](/assets/img/product/anthropic-name-key.webp)
+
+In your VS Code or JetBrains editor, configure the following environment variables. For more details on configuring
+these environment variables in your VS Code or JetBrains editor, refer to the [AppMap BYOK documentation](/docs/using-navie-ai/bring-your-own-model.html#configuration).
+
+| `ANTHROPIC_API_KEY`| `sk-ant-api03-8SgtgQrGB0vTSsB_DeeIZHvDrfmrg` |
+| `APPMAP_NAVIE_MODEL`| `claude-3-5-sonnet-20240620` |
+
+
+When setting the `APPMAP_NAVIE_MODEL`, refer to the [Anthropic documentation](https://docs.anthropic.com/en/docs/intro-to-claude#model-options) for the latest available models to choose from.
+
+#### Video Demo <!-- omit in toc -->
+
+{% include vimeo.html id='1003330117' %}
+
+## Azure OpenAI
+
+Assuming you [created](https://learn.microsoft.com/en-us/azure/ai-services/openai/how-to/create-resource) a `navie` GPT-4 deployment on the `contoso.openai.azure.com` OpenAI instance:
+
+| `AZURE_OPENAI_API_KEY` | `e50edc22e83f01802893d654c4268c4f` |
+| `AZURE_OPENAI_API_VERSION` | `2024-02-01` |
+| `AZURE_OPENAI_API_INSTANCE_NAME` | `contoso` |
+| `AZURE_OPENAI_API_DEPLOYMENT_NAME` | `navie` |
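These four settings resolve to a standard Azure OpenAI chat-completions endpoint. A minimal Python sketch of that mapping, using only the placeholder values from the table above (instance `contoso`, deployment `navie`):

```python
import os

# Placeholder values copied from the table above; not real credentials.
os.environ["AZURE_OPENAI_API_KEY"] = "e50edc22e83f01802893d654c4268c4f"
os.environ["AZURE_OPENAI_API_INSTANCE_NAME"] = "contoso"
os.environ["AZURE_OPENAI_API_DEPLOYMENT_NAME"] = "navie"
os.environ["AZURE_OPENAI_API_VERSION"] = "2024-02-01"

# The instance, deployment, and API version combine into the endpoint URL
# that any OpenAI-compatible client ends up calling:
instance = os.environ["AZURE_OPENAI_API_INSTANCE_NAME"]
deployment = os.environ["AZURE_OPENAI_API_DEPLOYMENT_NAME"]
version = os.environ["AZURE_OPENAI_API_VERSION"]
url = (
    f"https://{instance}.openai.azure.com/openai/deployments/"
    f"{deployment}/chat/completions?api-version={version}"
)
print(url)
```

Checking the printed URL against your Azure portal's endpoint is a quick way to confirm the four variables are consistent with each other.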
+
+## AnyScale Endpoints
+
+[AnyScale Endpoints](https://www.anyscale.com/endpoints) allows querying a
+selection of open-source LLMs. After you create an account, you can use it by
+setting:
+
+| `OPENAI_API_KEY` | `esecret_myxfwgl1iinbz9q5hkexemk8f4xhcou8` |
+| `OPENAI_BASE_URL` | `https://api.endpoints.anyscale.com/v1` |
+| `APPMAP_NAVIE_MODEL` | `mistralai/Mixtral-8x7B-Instruct-v0.1` |
+
+Consult the [AnyScale documentation](https://docs.endpoints.anyscale.com/) for model
+names. Note that we recommend using Mixtral models with Navie.
+
+#### Anyscale Demo with VS Code <!-- omit in toc -->
+
+{% include vimeo.html id='970914908' %}
+
+#### Anyscale Demo with JetBrains <!-- omit in toc -->
+
+{% include vimeo.html id='970914884' %}
+
+## Fireworks AI
+
+You can use [Fireworks AI](https://fireworks.ai/) and their serverless or on-demand
+models as a compatible backend for AppMap Navie AI.
+
+After creating an account on Fireworks AI, you can configure your Navie environment
+settings:
+
+| `OPENAI_API_KEY` | `WBYq2mKlK8I16ha21k233k2EwzGAJy3e0CLmtNZadJ6byfpu7c` |
+| `OPENAI_BASE_URL` | `https://api.fireworks.ai/inference/v1` |
+| `APPMAP_NAVIE_MODEL` | `accounts/fireworks/models/mixtral-8x22b-instruct` |
+
+Consult the [Fireworks AI documentation](https://fireworks.ai/models) for a full list of
+the available models they currently support.
+
+#### Video Demo <!-- omit in toc -->
+
+{% include vimeo.html id='992941358' %}
+
+## Ollama
+
+You can use [Ollama](https://ollama.com/) to run Navie with local models; after
+you've successfully run a model with the `ollama run` command, you can configure
+Navie to use it:
+
+| `OPENAI_API_KEY` | `dummy` |
+| `OPENAI_BASE_URL` | `http://127.0.0.1:11434/v1` |
+| `APPMAP_NAVIE_MODEL` | `mixtral` |
+
+**Note:** Even though it's running locally, a dummy placeholder API key is still required.
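Ollama's `/v1` endpoint follows the same OpenAI-compatible pattern as the hosted providers above. A minimal Python sketch, using only the placeholder values from the table, of the chat-completions request a client would construct against it (the request is built here for illustration, not actually sent):

```python
import json

OPENAI_API_KEY = "dummy"                       # placeholder; required even locally
OPENAI_BASE_URL = "http://127.0.0.1:11434/v1"  # Ollama's OpenAI-compatible API
APPMAP_NAVIE_MODEL = "mixtral"                 # any model pulled with `ollama run`

# The same chat-completions payload shape any OpenAI-compatible backend accepts.
request = {
    "url": f"{OPENAI_BASE_URL}/chat/completions",
    "headers": {"Authorization": f"Bearer {OPENAI_API_KEY}"},
    "body": json.dumps({
        "model": APPMAP_NAVIE_MODEL,
        "messages": [{"role": "user", "content": "Hello from Navie"}],
    }),
}
print(request["url"])
```

Because only the base URL and model name change, the same sketch applies to the AnyScale and Fireworks tables above.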
+
+## LM Studio
+
+You can use [LM Studio](https://lmstudio.ai/) to run Navie with local models.
+
+After downloading a model to run, select the option to run a local server.
+
+<img class="video-screenshot" src="/assets/img/product/lmstudio-run-local-server.webp"/>
+
+In the next window, select which model you want to load into the local inference server.
+
+<img class="video-screenshot" src="/assets/img/product/lmstudio-load-model.webp"/>
+
+After loading your model, you can confirm it's successfully running in the logs.
+
+*NOTE*: Save the URL it's running under to use for the `OPENAI_BASE_URL` environment variable.
+
+For example: `http://localhost:1234/v1`
+
+<img class="video-screenshot" src="/assets/img/product/lmstudio-confirm-running.webp"/>
+
+In the `Model Inspector`, copy the name of the model and use this for the `APPMAP_NAVIE_MODEL` environment variable.
+
+For example: `Meta-Llama-3-8B-Instruct-imatrix`
+
+<img class="video-screenshot" src="/assets/img/product/lmstudio-model-inspector.webp"/>
+
+Continue to configure your local environment with the following environment variables based on your LM Studio configuration. Refer to the [documentation above](#bring-your-own-model-byom) for steps specific to your code editor.
+
+| `OPENAI_API_KEY` | `dummy` |
+| `OPENAI_BASE_URL` | `http://localhost:1234/v1` |
+| `APPMAP_NAVIE_MODEL` | `Meta-Llama-3-8B-Instruct-imatrix` |
+
+**Note:** Even though it's running locally, a dummy placeholder API key is still required.
+
+{% include vimeo.html id='969002308' %}
package/built/docs/navie-reference/navie-commands.md
ADDED
@@ -0,0 +1,119 @@
+---
+layout: docs
+title: Docs - AppMap Navie
+description: "Reference Guide to AppMap Navie AI, a complete list of the available commands in AppMap Navie AI."
+name: Navie Commands
+navie-reference: true
+toc: true
+step: 3
+---
+
+# Navie Commands
+
+You can ask free-form questions, or start your question with one of these commands:
+
+- [`@plan`](#plan)
+- [`@generate`](#generate)
+- [`@test`](#test)
+- [`@explain`](#explain)
+- [`@diagram`](#diagram)
+- [`@help`](#help)
+
+## @plan
+
+The `@plan` command prefix within Navie focuses the AI response on building a detailed implementation plan for the relevant query. This focuses Navie on understanding the problem and the application in order to generate a step-by-step plan. It will generally not respond with code implementation details; consider using the `@generate` command, which can implement code based on the plan.
+
+#### Examples <!-- omit in toc -->
+
+- @plan improve the performance of my slow product listing page.
+- @plan implement a cache key for my user posting on my social media application.
+- @plan migrate the /users/setting API endpoint from SQL to MongoDB.
+
+#### `@plan` Video Demo <!-- omit in toc -->
+
+{% include vimeo.html id='985121150' %}
+
+## @generate
+
+The `@generate` prefix will focus the Navie AI response to optimize for new code creation. This is useful when you want the Navie AI to respond with code implementations across your entire code base. This reduces the amount of code explanation; generally the AI will respond only with the specific files and functions that need to be changed in order to implement a specific plan.
+
+#### Examples <!-- omit in toc -->
+
+- @generate Using the django-simple-captcha library add the necessary code for an offline captcha to my new user registration page.
+- @generate Update the function for the physical flow export to include data type via physical_spec_data_type and physical_specification tables without changing the existing functionality.
+- @generate Design and implement a cache key for user posts and show me how to implement it within this code base
+
+#### `@generate` Video Demo <!-- omit in toc -->
+
+{% include vimeo.html id='985121150' %}
+
+## @test
+
+The `@test` command prefix will focus the Navie AI response to optimize for test case creation, such as unit testing or integration testing. This prefix will understand how your tests are currently written and provide updated tests based on features or code that is provided. You can use this command along with the `@generate` command to create test cases for newly generated code.
+
+#### Examples <!-- omit in toc -->
+
+- @test create integration test cases for the user setting page that is migrated to mongodb.
+- @test create unit and integration tests that fully support the updated cache key functionality.
+- @test provide detailed test case examples for testing the updated user billing settings dashboard.
+
+## @explain
+
+The `@explain` command prefix within Navie serves as a default option focused on helping you learn more about your project. Using the `@explain` prefix will focus the Navie AI response to be more explanatory and will dive into architectural level questions across your entire code base. You can also use this to ask for ways to improve the performance of a feature.
+
+#### Examples <!-- omit in toc -->
+
+- @explain how does user authentication work in this project?
+- @explain how is the export request for physical flows handled, and what are the tables involved?
+- @explain how does the products listing page work, and how can I improve its performance?
+
+## @diagram
+
+The `@diagram` command prefix within Navie focuses the AI response to generate Mermaid compatible diagrams. [Mermaid](https://mermaid.js.org/) is an open source diagramming and charting utility with wide support across tools such as GitHub, Atlassian, and more. Use the `@diagram` command, and Navie will create and render a Mermaid compatible diagram within the Navie chat window. You can open this diagram in the [Mermaid Live Editor](https://mermaid.live), copy the Mermaid definitions to your clipboard, save it to disk, or expand a full window view. Save the Mermaid diagram into any supported tool such as GitHub Issues, Atlassian Confluence, and more.
+
+#### Example Questions <!-- omit in toc -->
+
+```
+@diagram the functional steps involved when a new user registers for the service.
+```
+
+<img class="video-screenshot" src="/assets/img/product/sequence-diagram-navie.webp"/>
+
+```
+@diagram the entity relationships between products and other important data objects.
+```
+
+<img class="video-screenshot" src="/assets/img/product/entity-relationship-navie.webp"/>
+
+```
+@diagram using a flow chart how product sales tax is calculated.
+```
+
+<img class="video-screenshot" src="/assets/img/product/flow-chart-navie.webp"/>
+
+```
+@diagram create a detailed class map of the users, stores, products and other associated classes used
+```
+
+<img class="video-screenshot" src="/assets/img/product/class-map-navie.webp"/>
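As an illustration of the kind of Mermaid source such an answer contains (a hypothetical sketch; actual output depends on your codebase), the flow-chart prompt above might produce something along these lines:

```
flowchart TD
    A[Order checkout] --> B{Is the order taxable?}
    B -- yes --> C[Look up tax rate for the shipping address]
    B -- no --> D[Tax = 0]
    C --> E[Add tax to order total]
    D --> E
```

Pasting this source into the Mermaid Live Editor, a GitHub comment, or a Confluence page renders the same diagram shown in the chat window.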
+
+#### Example Diagram Projects <!-- omit in toc -->
+
+Below are a series of open source projects you can use to try out the `@diagram` feature using
+prebuilt AppMap data in a sample project. Simply clone one of the following projects, open it
+in your code editor with the AppMap extension installed, and ask Navie to generate diagrams.
+
+- [Sample Python Project](https://github.com/land-of-apps/python-diagram-example/blob/master/README.md)
+- [Sample Ruby Project](https://github.com/land-of-apps/rails-diagram-example/blob/main/README.md)
+- [Sample Node (MERN) Project](https://github.com/land-of-apps/mern-diagram-example/blob/master/README.md)
+- [Sample Java Spring Project](https://github.com/land-of-apps/waltz/blob/demo/diagram-examples/demo/diagram-demo.md)
+
+## @help
+
+Navie will help you set up AppMap, including generating AppMap recordings and diagrams. This prefix will focus the Navie AI response to be more specific towards help with using AppMap products and features. This will leverage the [AppMap documentation](https://appmap.io/docs) as part of the context related to your question and provide guidance for using AppMap features or diving into advanced AppMap topics.
+
+#### Examples <!-- omit in toc -->
+
+- @help how do I set up process recording for my node.js project?
+- @help how can I reduce the size of my large AppMap Data recordings?
+- @help how can I export my AppMap data to Atlassian Confluence?
package/built/docs/navie-reference/navie-github-repository.md
ADDED
@@ -0,0 +1,12 @@
+---
+layout: docs
+title: Docs - AppMap Navie
+description: "Reference Guide to AppMap Navie AI, a link to the code repository for AppMap Navie AI."
+name: GitHub Repository
+navie-reference: true
+step: 8
+---
+
+### GitHub Repository
+
+[https://github.com/getappmap/appmap](https://github.com/getappmap/appmap)
package/built/docs/{using-navie-ai/how-navie-works.md → navie-reference/navie-how-it-works.md}
RENAMED
@@ -1,10 +1,11 @@
 ---
 layout: docs
 title: Docs - AppMap Navie
-description: "
+description: "Reference Guide to AppMap Navie AI, architectural overview of AppMap Navie AI."
 name: How Navie Works
-
-
+navie-reference: true
+step: 10
+toc: true
 ---
 
 # How Navie Works
@@ -18,7 +19,7 @@ This powerful technique provides the AI with many valuable pieces of information
 4) Detailed insights into data flows, SQL queries, and other database interactions.
 
 
-
+### Navie Technical Architecture
 
 Navie integrates seamlessly into your existing AppMap-enabled project. Once you have the AppMap software libraries installed into your project, interact with your application or run tests to automatically generate AppMap Data. All of your code and AppMap Data stay local in your environment until you ask Navie a question. After you ask Navie a question, the relevant AppMap diagrams and code snippets are located locally by the AppMap search and explain API; this data is then sent to OpenAI (with the option to bring your own LLM), and the response is returned to the user.
 
@@ -0,0 +1,83 @@
---
layout: docs
title: Docs - AppMap Navie
description: "Reference Guide to AppMap Navie AI, OpenAI Key Management."
name: OpenAI Key Management
navie-reference: true
toc: true
step: 6
---

# OpenAI Key Management

## Visual Studio Code

### Add a new OpenAI Key in VS Code
The standard way to add an OpenAI API key in VS Code is to use the `gear` icon in the Navie chat window, but you can alternatively set the key using the VS Code Command Palette with an `AppMap` command option.

In VS Code, open the Command Palette.

You can use a hotkey to open the VS Code Command Palette:
- Mac: `Cmd + Shift + P`
- Windows/Linux: `Ctrl + Shift + P`

Or you can select `View` -> `Command Palette`

<img class="video-screenshot" src="/assets/img/product/byok-command-palette.webp"/>

Search for `AppMap Set OpenAI Key`

<img class="video-screenshot" src="/assets/img/product/byok-search.webp"/>

Paste your key into the new field and hit enter.

You'll get a notification in VS Code that your key is set.

**NOTE:** You will need to reload your window for the setting to take effect. Use the Command Palette `Developer: Reload Window`.

<img class="video-screenshot" src="/assets/img/product/byok-key-set.webp"/>
### Delete a configured OpenAI Key

To delete your key, open the Command Palette.

You can use a hotkey to open it:
- Mac: `Cmd + Shift + P`
- Windows/Linux: `Ctrl + Shift + P`

Or you can select `View` -> `Command Palette`

<img class="video-screenshot" src="/assets/img/product/byok-command-palette.webp"/>

Search for `AppMap Set OpenAI Key`

<img class="video-screenshot" src="/assets/img/product/byok-search.webp"/>

Hit enter with the field left blank. VS Code will notify you that the key has been unset.

**NOTE:** You will need to reload your window for the setting to take effect. Use the Command Palette `Developer: Reload Window`.

<img class="video-screenshot" src="/assets/img/product/byok-key-erased.webp"/>
### How is my API key saved securely?

AppMap stores API key secrets in the default VS Code secret storage, which uses Electron's `safeStorage` API to keep sensitive information confidential. Once encrypted, secrets are written to a SQLite database in the user data directory, alongside other VS Code state. The encryption key is generated per installation; on macOS it is kept in `Keychain Access` under "Code Safe Storage" (or "Code - Insiders Safe Storage" for the Insiders build). This prevents other applications, or other users with full disk access, from reading the stored secrets. The `safeStorage` API, accessible in Electron's main process, supports checking whether encryption is available, encrypting and decrypting strings, and selecting a storage backend on Linux.

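The encrypt-then-store pattern described above can be illustrated with Node's built-in `crypto` module. This is a conceptual sketch only, not VS Code's actual implementation: a per-installation key (which VS Code keeps in the OS keychain) encrypts the secret, and only the ciphertext would ever be written to disk.

```typescript
import { randomBytes, createCipheriv, createDecipheriv } from "crypto";

// Per-installation key; in VS Code this lives in the OS keychain,
// never alongside the encrypted data.
const key = randomBytes(32);

interface EncryptedBox {
  iv: Buffer;   // random nonce, unique per encryption
  tag: Buffer;  // GCM authentication tag, detects tampering
  data: Buffer; // ciphertext that would be stored in the SQLite database
}

function encryptSecret(plain: string): EncryptedBox {
  const iv = randomBytes(12);
  const cipher = createCipheriv("aes-256-gcm", key, iv);
  const data = Buffer.concat([cipher.update(plain, "utf8"), cipher.final()]);
  return { iv, tag: cipher.getAuthTag(), data };
}

function decryptSecret(box: EncryptedBox): string {
  const decipher = createDecipheriv("aes-256-gcm", key, box.iv);
  decipher.setAuthTag(box.tag);
  return Buffer.concat([decipher.update(box.data), decipher.final()]).toString("utf8");
}
```

Because the key lives outside the database, an attacker who copies the stored state alone cannot recover the secret.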
## JetBrains

The standard way to add an OpenAI API key in JetBrains is to use the `gear` icon in the Navie chat window, but you can alternatively set the key directly in the JetBrains settings.

### Adding or Modifying OpenAI API Key in JetBrains

In JetBrains, open the `Settings` option.

![JetBrains settings](/assets/img/product/navie-jetbrains-settings.webp)

In the `Settings` window, search for `appmap` in the sidebar search field. Under `Tools -> AppMap` you will see a configuration option for your OpenAI API key in the `AppMap Services` section. This is the same section where you can add, edit, or remove the other environment settings used for bringing your own custom models.

![AppMap settings](/assets/img/product/navie-jetbrains-appmap-settings.webp)

### How is my API key saved securely?

AppMap follows JetBrains best practices for storing sensitive data. The AppMap JetBrains plugin uses the `PasswordSafe` package [to securely persist](https://www.jetbrains.com/help/idea/reference-ide-settings-password-safe.html) your OpenAI API key. The default storage format for `PasswordSafe` is operating-system dependent. Refer to the [JetBrains Developer Documents](https://plugins.jetbrains.com/docs/intellij/persisting-sensitive-data.html#storage) for more information.