@appland/appmap 3.142.1 → 3.144.0

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
Files changed (68)
  1. package/CHANGELOG.md +14 -0
  2. package/built/docs/get-started-with-appmap/navie-ai-quickstart.md +1 -5
  3. package/built/docs/navie/bring-your-own-model.md +64 -141
  4. package/built/docs/navie/how-to-open-navie.md +1 -1
  5. package/built/docs/navie/index.md +1 -0
  6. package/built/docs/{guides/navie-best-practices.md → navie/using-navie.md} +47 -16
  7. package/built/html/appmap.js.map +2 -2
  8. package/built/html/sequenceDiagram.js.map +2 -2
  9. package/package.json +2 -2
  10. package/built/docs/CLA Instructions.pdf +0 -0
  11. package/built/docs/Code of Conduct for Contributors.pdf +0 -0
  12. package/built/docs/analysis/rules-reference.html +0 -27
  13. package/built/docs/appmap-docs.md +0 -27
  14. package/built/docs/community.md +0 -28
  15. package/built/docs/guides/exporting-appmap-diagrams.md +0 -50
  16. package/built/docs/guides/handling-large-appmap-diagrams.md +0 -138
  17. package/built/docs/guides/index.md +0 -22
  18. package/built/docs/guides/navigating-code-objects.md +0 -67
  19. package/built/docs/guides/openapi.md +0 -105
  20. package/built/docs/guides/reading-sql-in-appmap-diagrams.md +0 -69
  21. package/built/docs/guides/refine-appmap-data.md +0 -186
  22. package/built/docs/guides/reverse-engineering.md +0 -377
  23. package/built/docs/guides/runtime-code-review.md +0 -111
  24. package/built/docs/guides/using-appmap-analysis.md +0 -206
  25. package/built/docs/guides/using-appmap-diagrams.md +0 -331
  26. package/built/docs/integrations/atlassian-compass.md +0 -25
  27. package/built/docs/integrations/atlassian-confluence.md +0 -51
  28. package/built/docs/integrations/circle-ci.md +0 -424
  29. package/built/docs/integrations/docker.md +0 -109
  30. package/built/docs/integrations/github-actions.md +0 -524
  31. package/built/docs/integrations/index.md +0 -20
  32. package/built/docs/integrations/plantuml.md +0 -66
  33. package/built/docs/integrations/postman.md +0 -30
  34. package/built/docs/integrations/readme.md +0 -39
  35. package/built/docs/integrations/smartbear-swaggerhub.md +0 -119
  36. package/built/docs/reference/analysis-labels.md +0 -49
  37. package/built/docs/reference/analysis-rules.md +0 -61
  38. package/built/docs/reference/appmap-client-cli.md +0 -628
  39. package/built/docs/reference/appmap-gradle-plugin.md +0 -141
  40. package/built/docs/reference/appmap-java.md +0 -311
  41. package/built/docs/reference/appmap-maven-plugin.md +0 -164
  42. package/built/docs/reference/appmap-node.md +0 -185
  43. package/built/docs/reference/appmap-python.md +0 -504
  44. package/built/docs/reference/appmap-ruby.md +0 -514
  45. package/built/docs/reference/github-action.md +0 -171
  46. package/built/docs/reference/index.md +0 -24
  47. package/built/docs/reference/jetbrains.md +0 -136
  48. package/built/docs/reference/license-key-install.md +0 -74
  49. package/built/docs/reference/remote-recording-api.md +0 -97
  50. package/built/docs/reference/uninstalling-appmap.md +0 -119
  51. package/built/docs/reference/vscode.md +0 -122
  52. package/built/docs/setup-appmap-in-ci/example-projects.md +0 -21
  53. package/built/docs/setup-appmap-in-ci/how-it-works.md +0 -43
  54. package/built/docs/setup-appmap-in-ci/in-circleci.md +0 -423
  55. package/built/docs/setup-appmap-in-ci/in-github-actions.md +0 -177
  56. package/built/docs/setup-appmap-in-ci/index.md +0 -22
  57. package/built/docs/setup-appmap-in-ci/matrix-builds.md +0 -225
  58. package/built/docs/setup-appmap-in-ci/troubleshooting.md +0 -71
  59. package/built/docs/setup-appmap-in-your-code-editor/add-appmap-to-your-code-editor.md +0 -93
  60. package/built/docs/setup-appmap-in-your-code-editor/appmap-analysis.md +0 -77
  61. package/built/docs/setup-appmap-in-your-code-editor/generate-appmap-data-from-tests.md +0 -93
  62. package/built/docs/setup-appmap-in-your-code-editor/generate-appmap-data-with-remote-recording.md +0 -112
  63. package/built/docs/setup-appmap-in-your-code-editor/generate-appmap-data-with-request-recording.md +0 -77
  64. package/built/docs/setup-appmap-in-your-code-editor/how-appmap-works.md +0 -166
  65. package/built/docs/setup-appmap-in-your-code-editor/index.md +0 -25
  66. package/built/docs/setup-appmap-in-your-code-editor/navigating-appmap-diagrams.md +0 -59
  67. package/built/docs/setup-appmap-in-your-code-editor/navigating-code-objects.md +0 -67
  68. package/built/docs/unused_for_now_index.html +0 -11
package/CHANGELOG.md CHANGED
@@ -1,3 +1,17 @@
+ # [@appland/appmap-v3.144.0](https://github.com/getappmap/appmap-js/compare/@appland/appmap-v3.143.0...@appland/appmap-v3.144.0) (2024-05-19)
+
+
+ ### Features
+
+ * Sync only high level docs ([5186e4e](https://github.com/getappmap/appmap-js/commit/5186e4e0c896324b808e5341878b02da6df41e3a))
+
+ # [@appland/appmap-v3.143.0](https://github.com/getappmap/appmap-js/compare/@appland/appmap-v3.142.1...@appland/appmap-v3.143.0) (2024-05-17)
+
+
+ ### Features
+
+ * Emit agent and classification events ([a73bcaa](https://github.com/getappmap/appmap-js/commit/a73bcaa276835b8def2d56c270aab358ca35443f))
+
  # [@appland/appmap-v3.142.1](https://github.com/getappmap/appmap-js/compare/@appland/appmap-v3.142.0...@appland/appmap-v3.142.1) (2024-05-17)


package/built/docs/get-started-with-appmap/navie-ai-quickstart.md CHANGED
@@ -31,11 +31,7 @@ By default, Navie will utilize an OpenAI service hosted by AppMap. If, for data

  When you ask a question to Navie, it will search through all your AppMap Diagrams (if they exist) for your project to pull in relevant traces, sequence diagrams, and code snippets for analysis. It will then send these code snippets and runtime code sequence diagrams to the Generative AI service along with your question.

- You can ask free-form questions, or start your question with one of these commands:
-
- - `@explain`: (Default) Help you understand your project. This mode is used when there is no prefix.
- - `@help`: Get assistance with configuring and using AppMap.
- - `@generate`: Optimizes the answer to focus on new code generation.
+ Refer to the [Using Navie docs](/docs/navie/using-navie) to learn more about the advanced Navie chat commands you can use with your question.

  After asking Navie a question, Navie will search through your application source code, finding any relevant code snippets. It will include relevant AppMap Data like sequence diagrams and data flows if they exist for your project. You will see on the right hand side of the Navie window the relevant context from your code included with the question.

package/built/docs/navie/bring-your-own-model.md CHANGED
@@ -2,7 +2,7 @@
  layout: docs
  title: Docs - Navie
  name: Bring Your Own LLM Model
- step: 3
+ step: 4
  navie: true
  toc: true
  description: Use AppMap Navie with your own OpenAI account or OpenAI-compatible LLM running either locally or remotely.
@@ -15,26 +15,63 @@ By default, when asking a question to Navie, your code editor will interact with
  - [Navie AI Recommended Models](#navie-ai-recommended-models)
  - [Bring Your Own OpenAI API Key (BYOK)](#bring-your-own-openai-api-key-byok)
  - [Configuring Your OpenAI Key](#configuring-your-openai-key)
+ - [Modify which OpenAI Model to use](#modify-which-openai-model-to-use)
  - [Reset Navie AI to use Default Navie Backend](#reset-navie-ai-to-use-default-navie-backend)
  - [Bring Your Own Model (BYOM)](#bring-your-own-model-byom)
  - [Configuration](#configuration)
  - [Configuring in JetBrains](#configuring-in-jetbrains)
  - [Configuring in VS Code](#configuring-in-vs-code)
  - [Examples](#examples)
- - [OpenAI](#openai)
- - [Azure OpenAI](#azure-openai)
- - [AnyScale Endpoints](#anyscale-endpoints)
- - [Ollama](#ollama)
- - [LM Studio](#lm-studio)

  ## Navie AI Recommended Models

  <p class="alert alert-danger">
  AppMap Navie AI recommends avoiding models that do not support chat mode.
  </p>
- <!-- This doc is located at https://docs.google.com/presentation/d/145gzoYVsgJ3J4jGh_2Or8ClZ0drqoC-GTjI1UqkyF_o/edit#slide=id.g1ff63dc2dd6_0_0 -->

- ![Navie Recommended Models](/assets/img/product/navie-model-recommendations.svg)
+ <table class="table table-striped table-bordered black-white-table">
+ <tr>
+ <th class="large-header">Navie AI Backend LLM</th>
+ <th class="large-header">Ease of Setup</th>
+ <th class="large-header">Quality of Responses</th>
+ </tr>
+ <tr>
+ <td>AppMap OpenAI Proxy (default)</td>
+ <td>⭐⭐⭐⭐⭐</td>
+ <td>⭐⭐⭐⭐⭐</td>
+ </tr>
+ <tr>
+ <td>Self Managed OpenAI API Key</td>
+ <td>⭐⭐⭐⭐</td>
+ <td>⭐⭐⭐⭐⭐</td>
+ </tr>
+ <tr>
+ <td>Azure Hosted OpenAI</td>
+ <td>⭐⭐⭐</td>
+ <td>⭐⭐⭐⭐⭐</td>
+ </tr>
+ <tr>
+ <td>Anyscale Hosted Mixtral-8x7B</td>
+ <td>⭐⭐⭐</td>
+ <td>⭐⭐⭐</td>
+ </tr>
+ <tr>
+ <td>Locally Hosted Mixtral-8x7B-Instruct-v0.1</td>
+ <td>⭐⭐</td>
+ <td>⭐⭐⭐</td>
+ </tr>
+ <tr>
+ <td>Codellama/Codeqwen</td>
+ <td>❌ Not Supported</td>
+ <td>❌ Not Supported</td>
+ </tr>
+ </table>

  ## Bring Your Own OpenAI API Key (BYOK)

@@ -44,19 +81,33 @@ Navie AI uses the AppMap hosted proxy with an AppMap managed OpenAI API key. If

  In your code editor, open the Navie Chat window. If the model displays `(default)`, this means that Navie is configured to use the AppMap hosted OpenAI proxy. Click on the gear icon in the top of the Navie Chat window to change the model.

- ![Navie Recommended Models](/assets/img/product/navie-default-model.webp)
+ ![Navie configuration gear](/assets/img/product/navie-default-model.webp)

  In the modal, select the option to `Use your own OpenAI API key`

- ![Navie Recommended Models](/assets/img/product/navie-byok-openai-1.webp)
+ ![Use your own key modal](/assets/img/product/navie-byok-openai-1.webp)

  After you enter your OpenAI API Key in the menu option, hit `enter` and your code editor will be prompted to reload.

- ![Navie Recommended Models](/assets/img/product/navie-byok-openai-2.webp)
+ **In VS Code:**
+ ![VS Code popup to store API Key](/assets/img/product/navie-byok-openai-2.webp)
+
+ **In JetBrains:**
+ ![JetBrains popup to store API Key](/assets/img/product/navie-byok-jetbrains.webp)
+
+ **NOTE:** You can also store your API key as an environment variable, as described in the [configuration](#configuration) section, instead of using the `gear` icon in the Navie chat window.

  After your code editor reloads, you can confirm your requests are being routed to OpenAI directly in the Navie Chat window. It will list the model `OpenAI` and the location, in this case `via OpenAI`.

- ![Navie Recommended Models](/assets/img/product/navie-byok-openai-3.webp)
+ ![OpenAI location](/assets/img/product/navie-byok-openai-3.webp)
+
+ #### Modify which OpenAI Model to use
+
+ AppMap generally uses the latest OpenAI models by default, but if you want to use an alternative model like `gpt-3.5` or a preview model like `gpt-4-vision-preview`, you can set the `APPMAP_NAVIE_MODEL` environment variable after configuring your own OpenAI API key.
+
+ After setting `APPMAP_NAVIE_MODEL` to your chosen model, reload/restart your code editor and confirm its configuration by opening a new Navie chat window. In this example, I've configured the model to be `gpt-4o` with my personal OpenAI API key.
+
+ ![JetBrains OpenAI key modal](/assets/img/product/custom-model.webp)
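
To make the settings above concrete, here is a minimal sketch of the environment-variable route, assuming a POSIX shell and that the editor is launched from that shell so the AppMap extension can see the variables (the key value is a placeholder; the in-editor gear icon and settings described above work just as well):

```sh
# Placeholder value -- substitute your own OpenAI API key.
export OPENAI_API_KEY="sk-..."

# Optional: override the default OpenAI model, e.g. gpt-4o as in the example above.
export APPMAP_NAVIE_MODEL="gpt-4o"

# Launch the editor from this shell so it inherits the variables
# (assumes the VS Code `code` command is on your PATH).
code .
```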

  ### Reset Navie AI to use Default Navie Backend

@@ -124,8 +175,6 @@ After reloading you can confirm the model is configured correctly in the Navie C

  #### Editing AppMap services environment <!-- omit in toc -->

- **Note**: To store the API key securely with VS Code secret storage, follow [the instructions below](#add-a-new-openai-key).
-
  In VS Code, go to settings.

  <img class="video-screenshot" src="/assets/img/docs/goto-vscode-settings.webp" alt="a screenshot of the Visual Studio Code menu"/>
@@ -140,136 +189,10 @@ Use *Add Item* to define the relevant environment variables according to the [BY

  Reload your VS Code for the changes to take effect.

- **NOTE:** Please follow the instructions below to set `OPENAI_API_KEY` or `AZURE_OPENAI_API_KEY` securely.
-
  After reloading you can confirm the model is configured correctly in the Navie Chat window.

  <img class="video-screenshot" src="/assets/img/product/navie-confirm-ui-vscode.webp"/>

- #### Add a new OpenAI Key <!-- omit in toc -->
-
- In VS Code, open the Command Palette.
-
- You can use a hotkey to open the VS Code Command Palette
- - Mac: `Cmd + Shift + P`
- - Windows/Linux: `Ctrl + Shift + P`
-
- Or you can select `View` -> `Command Palette`
-
- <img class="video-screenshot" src="/assets/img/product/byok-command-palette.webp"/>
-
- Search for `AppMap Set OpenAPI Key`
-
- <img class="video-screenshot" src="/assets/img/product/byok-search.webp"/>
-
- Paste your key into the new field and hit enter.
-
- You'll get a notification in VS Code that your key is set.
-
- **NOTE:** You will need to reload your window for the setting to take effect. Use the Command Palette `Developer: Reload Window`
-
- <img class="video-screenshot" src="/assets/img/product/byok-key-set.webp"/>
-
- #### Delete a configured OpenAI Key <!-- omit in toc -->
-
- To delete your key, simply open the Command Palette
-
- You can use a hotkey to open
- - Mac: `Cmd + Shift + P`
- - Windows/Linux: `Ctrl + Shift + P`
-
- Or you can select `View` -> `Command Palette`
-
- <img class="video-screenshot" src="/assets/img/product/byok-command-palette.webp"/>
-
- Search for `AppMap Set OpenAPI Key`
-
- <img class="video-screenshot" src="/assets/img/product/byok-search.webp"/>
-
- And simply hit enter with the field blank. VS Code will notify you that the key has been unset.
-
- **NOTE:** You will need to reload your window for the setting to take effect. Use the Command Palette `Developer: Reload Window`
-
- <img class="video-screenshot" src="/assets/img/product/byok-key-erased.webp"/>
-
- #### How is my API key saved securely? <!-- omit in toc -->
-
- For secure storage of API key secrets within AppMap, we use the default VS Code secret storage which leverages Electron's safeStorage API to ensure the confidentiality of sensitive information. Upon encryption, secrets are stored within the user data directory in a SQLite database, alongside other VS Code state information. This encryption process involves generating a unique encryption key, which, on macOS, is securely stored within `Keychain Access` under "Code Safe Storage" or "Code - Insiders Safe Storage," depending on the version. This method provides a robust layer of protection, preventing unauthorized access by other applications or users with full disk access. The safeStorage API, accessible in the main process, supports operations such as checking encryption availability, encrypting and decrypting strings, and selecting storage backends on Linux. This approach ensures that your secrets are securely encrypted and stored, safeguarding them from potential threats while maintaining application integrity.
-
  ## Examples

- ### OpenAI
-
- **Note:** We recommend configuring your OpenAI key using the code editor extension. Follow the [Bring Your Own Key](/docs/navie/bring-your-own-model.html#configuring-your-openai-key) docs for instructions.
-
- Only `OPENAI_API_KEY` needs to be set, other settings can stay default:
-
- | `OPENAI_API_KEY`| `sk-9spQsnE3X7myFHnjgNKKgIcGAdaIG78I3HZB4DFDWQGM` |
-
- ### Azure OpenAI
-
- Assuming you [created](https://learn.microsoft.com/en-us/azure/ai-services/openai/how-to/create-resource) a `navie` GPT-4 deployment on `contoso.openai.azure.com` OpenAI instance:
-
- | `AZURE_OPENAI_API_KEY` | `e50edc22e83f01802893d654c4268c4f` |
- | `AZURE_OPENAI_API_VERSION` | `2024-02-01` |
- | `AZURE_OPENAI_API_INSTANCE_NAME` | `contoso` |
- | `AZURE_OPENAI_API_DEPLOYMENT_NAME` | `navie` |
-
- ### AnyScale Endpoints
-
- [AnyScale Endpoints](https://www.anyscale.com/endpoints) allows querying a
- selection of open-source LLMs. After you create an account you can use it by
- setting:
-
- | `OPENAI_API_KEY` | `esecret_myxfwgl1iinbz9q5hkexemk8f4xhcou8` |
- | `OPENAI_BASE_URL` | `https://api.endpoints.anyscale.com/v1` |
- | `APPMAP_NAVIE_MODEL` | `mistralai/Mixtral-8x7B-Instruct-v0.1` |
-
- Consult [AnyScale documentation](https://docs.endpoints.anyscale.com/) for model
- names. Note we recommend using Mixtral models with Navie.
-
- ### Ollama
-
- You can use [Ollama](https://ollama.com/) to run Navie with local models; after
- you've successfully ran a model with `ollama run` command, you can configure
- Navie to use it:
-
- | `OPENAI_API_KEY` | `dummy` |
- | `OPENAI_BASE_URL` | `http://127.0.0.1:11434/v1` |
- | `APPMAP_NAVIE_MODEL` | `mixtral` |
-
- **Note:** Even though it's running locally a dummy placeholder API key is still required.
-
- ### LM Studio
-
- You can use [LM Studio](https://lmstudio.ai/) to run Navie with local models.
-
- After downloading a model to run, select the option to run a local server.
-
- <img class="video-screenshot" src="/assets/img/product/lmstudio-run-local-server.webp"/>
-
- In the next window, select which model you want to load into the local inference server.
-
- <img class="video-screenshot" src="/assets/img/product/lmstudio-load-model.webp"/>
-
- After loading your model, you can confirm it's successfully running in the logs.
-
- *NOTE*: Save the URL it's running under to use for `OPENAI_BASE_URL` environment variable.
-
- For example: `http://localhost:1234/v1`
-
- <img class="video-screenshot" src="/assets/img/product/lmstudio-confirm-running.webp"/>
-
- In the `Model Inspector` copy the name of the model and use this for the `APPMAP_NAVIE_MODEL` environment variable.
-
- For example: `Meta-Llama-3-8B-Instruct-imatrix`
-
- <img class="video-screenshot" src="/assets/img/product/lmstudio-model-inspector.webp"/>
-
- Continue to configure your local environment with the following environment variables based on your LM Studio configuration. Refer to the [documentation above](#bring-your-own-model-byom) for steps specific to your code editor.
-
- | `OPENAI_API_KEY` | `dummy` |
- | `OPENAI_BASE_URL` | `http://localhost:1234/v1` |
- | `APPMAP_NAVIE_MODEL` | `Meta-Llama-3-8B-Instruct-imatrix` |
-
- **Note:** Even though it's running locally a dummy placeholder API key is still required.
+ Refer to the [Navie Reference Guide](/docs/reference/navie) for detailed examples of using Navie with your own LLM backend.
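
Although the detailed tables have moved to the reference guide, the removed examples above all reduce to the same three environment variables. As one illustration, a minimal sketch of the Ollama configuration using only values that appear in the removed lines (a dummy key is still required even though the model runs locally):

```sh
# Values taken from the removed Ollama example above.
export OPENAI_API_KEY="dummy"                       # placeholder; some value must be set
export OPENAI_BASE_URL="http://127.0.0.1:11434/v1"  # Ollama's OpenAI-compatible endpoint
export APPMAP_NAVIE_MODEL="mixtral"                 # a model previously started with `ollama run`
```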
package/built/docs/navie/how-to-open-navie.md CHANGED
@@ -1,7 +1,7 @@
  ---
  layout: docs
  title: Docs - AppMap Navie
- description: "Learn how to use AppMap Navie to provide personalized code analysis. Generate AppMap Data, ask Navie questions, analyze code, and iterate for optimal results."
+ description: "Learn how to open AppMap Navie in your code editor. Open Navie in your VS Code or JetBrains code editor to start working with Navie AI"
  name: How to Open Navie
  step: 2
  navie: true
package/built/docs/navie/index.md CHANGED
@@ -14,6 +14,7 @@ That means you're not just working with static analysis anymore, and you're not

  - [How Navie Works](/docs/navie/how-navie-works)
  - [How to Open Navie](/docs/navie/how-to-open-navie)
+ - [Using Navie](/docs/navie/using-navie)
  - [Bring Your Own LLM Model](/docs/navie/bring-your-own-model)

  **Install AppMap for your preferred code editor to get started.**
package/built/docs/{guides/navie-best-practices.md → navie/using-navie.md} CHANGED
@@ -1,16 +1,51 @@
  ---
  layout: docs
- title: Docs - Guides
+ title: Docs - AppMap Navie
  description: "Learn how to use AppMap Navie to provide personalized code analysis. Generate AppMap Data, ask Navie questions, analyze code, and iterate for optimal results."
- guides: true
- name: Navie Best Practices
- step: 1
- redirect_from: [/docs/navie/using-navie]
+ name: Using Navie
+ step: 3
+ navie: true
+ toc: true
+ redirect_from: [/docs/guides/navie-best-practices]
  ---
+ # Using Navie

- # Navie Best Practices
+ - [User Interface](#user-interface)
+ - [Basic Layout](#basic-layout)
+ - [Navie Best Practices](#navie-best-practices)
+ - [Create AppMap Data](#create-appmap-data)
+ - [Ask Navie](#ask-navie)
+ - [Implement Generated Code](#implement-generated-code)
+ - [Repeat](#repeat)

- When you ask a question to Navie, it will search through all your AppMap Diagrams created for your project to pull in relevant traces, sequence diagrams, and code snippets for analysis. It will then send these code snippets and runtime code sequence diagrams to the Generative AI service along with your question. By sending this valuable context to the AI interface, your answers will be much more personal and specific to your codebase, files, and functions. Additionally, the Generative AI will be able to understand how your code works at runtime and will be able to make architectural level recommendations across your entire application.
+ When you ask a question to Navie, it will search through all your AppMap Diagrams (if they exist) and your static source code for your project to pull in relevant traces, sequence diagrams, and code snippets for analysis. It will then send these code snippets and runtime code sequence diagrams to the Generative AI service along with your question. By sending this valuable context to the AI interface, your answers will be much more personal and specific to your codebase, files, and functions.
+
+ ## User Interface
+ AppMap Navie AI is an AI assistant that enables you to ask architectural level questions about your code base. Navie is also able to help you generate new code that can span across your entire project. The primary user interface for Navie is within your VS Code or JetBrains code editor. Refer to the Navie docs to [learn how to open Navie](/docs/navie/how-to-open-navie) for your code editor.
+
+ ![Navie User Interface](/assets/img/docs/appmap-navie-user-interface.webp)
+
+ ## Basic Layout
+
+ The Navie user interface consists of 5 key areas.
+
+ <ol type="A">
+ <b><li>AppMap Data Status Panel</li></b>
+ This panel will list the current status of the AppMap Data that has been created for your project. AppMap Data are your application runtime traces which include things like HTTP and SQL requests, data flows, class and function level instrumentation, and more. Without AppMap Data created for your project, Navie can search through your static source code in your workspace for relevant code snippets.
+ <b><li>Navie Context Window</li></b>
+ Upon asking a question to Navie, it will search through your AppMap Data (if it exists) and the source code in your open workspace to locate all the relevant context for your question. This area will reflect the information that Navie is aware of when answering. You can use this information to better understand how Navie is responding.
+ <b><li>LLM Model Config Button</li></b>
+ You can configure Navie to use your own OpenAI API key or bring your own LLM model running locally or within another provider. Clicking the gear icon will open the configuration modal for the bring your own model settings. <a href="/docs/navie/bring-your-own-model">Refer to the AppMap docs for more details</a>
+ about how to configure Navie to use your own LLM models.
+ <b><li>Active LLM Model</li></b>
+ This panel will display the currently configured LLM model in use with Navie. The LLM details are displayed in the following format: <code>Model: &lt;Model Name&gt; (&lt;location&gt;)</code>. When using the default Navie backend, the location will be listed as <code>default</code>. When using your own OpenAI API key, the location will be listed as <code>via OpenAI</code>. When using <a href="/docs/navie/bring-your-own-model.html#configuration">Navie Bring Your Own Model</a>
+ , the location will be the value of the <code>OPENAI_BASE_URL</code> environment variable, for example <code>via localhost</code> when using <a href="/docs/navie/bring-your-own-model.html#ollama">Ollama</a> or <a href="/docs/navie/bring-your-own-model.html#lm-studio">LM Studio</a>.
+ <b><li>Navie Chat Window</li></b>
+ This is your primary location for interacting with Navie. Here you can ask Navie questions about how your application works, ask Navie to generate code or test cases, and you can even have Navie create a pull request based on your changes. For more advanced usage of Navie refer to the <a href="#advanced-navie-commands">Advanced Navie Commands</a>
+ section.
+ </ol>
+
+ ## Navie Best Practices

  To generate the highest quality responses from Navie, we recommend the following approach.

@@ -19,7 +54,7 @@ To generate the highest quality responses from Navie, we recommend the following
  - [Write Code](#write-code)
  - [Repeat](#repeat)

- ## Create AppMap Data
+ ### Create AppMap Data

  We recommend creating maps that are most relevant to the question or area of the application you are going to be working with. For example, if I wanted to understand how my new user registration flow worked, I could [create a remote recording](/docs/get-started-with-appmap/making-appmap-data.html#with-remote-application-recording) of a new user registration flow. Alternatively, I could [run all the test cases](/docs/get-started-with-appmap/making-appmap-data.html#with-test-case-recording) related to new user creation, registration, and adjacent areas.
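
As one way to make the "run the relevant test cases" approach concrete, here is a minimal sketch for a Node.js project using the `appmap-node` recorder referenced in this package's docs; the test command is an assumption, so substitute your project's own test runner or the recorder for your language:

```sh
# Run the test suite under the appmap-node recorder; AppMap Data files are
# written (typically under tmp/appmap) for Navie and the diagrams to use.
npx appmap-node npm test
```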
@@ -37,17 +72,13 @@ Depending on your language and framework there [are up to 5 different ways](/doc

  Each of these methods generates AppMap Data in JSON format, which are then visualized through interactive diagrams in your code editor. This visualization supports a deep understanding of your application's architecture, dependencies, and runtime behavior, facilitating enhanced code quality and performance optimization.

- ## Ask Navie
+ ### Ask Navie

  Navie can address a wide range of questions about your application, extending beyond what static analysis AI assistants can provide, to understand dynamic interactions, dependencies, and performance bottlenecks to help your developers design and deliver solutions faster for a host of complex issues.

  If you are not able to use the AppMap OpenAI proxy, you can bring your own OpenAI API key, or use an entirely different AI Model, [hosted in your environment](/docs/navie/bring-your-own-model.html#azure-openai) or [hosted locally](/docs/navie/bring-your-own-model.html#ollama).

- You can ask free formed questions, or start your question with one of these commands:
-
- - `@explain`: (Default) Navie will help you understand your project. This mode is used when there is no prefix.
- - `@help`: Navie will help you setup AppMap, including generating AppMap recordings and diagrams.
- - `@generate`: Navie will help you generate new code.
+ You can ask free-form questions, or refer to the [Navie Reference docs](/docs/reference/navie) to learn more about the advanced Navie chat commands you can use with your question.

  **Examples of good questions to ask Navie.**

@@ -60,7 +91,7 @@ You can ask free formed questions, or start your question with one of these comm
  * Explain what functions or files are involved when a user logs into my service.
  * Loading the "products" view page is slow in production, provide suggested guidance with relevant code changes I can make to improve the speed.

- ## Implement Generated Code
+ ### Implement Generated Code

  Navie can do more than just provide code implementation details; you can talk to Navie about a variety of other topics as you are updating your application based on its recommendations.

@@ -74,6 +105,6 @@ Navie can do more than just provide code implementation details, you can talk to

  5. **Custom Queries**: Tailor your questions to fit unique development needs—whether you're troubleshooting a specific error, seeking optimization opportunities, or curious about the interactions between various components of your application.

- ## Repeat
+ ### Repeat

  Continue to ask follow-up questions to Navie as you are making code changes or when you need additional details or more specific advice. Additionally, as you make changes to your application, continue creating AppMap recordings of the updated code interactions and start new conversations with Navie to dive deeper into your feature implementation.