@memograph/cli 1.0.0 → 1.0.2
This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
- package/README.md +99 -19
- package/dist/index.d.ts +4 -0
- package/dist/index.d.ts.map +1 -0
- package/dist/index.js +5 -0
- package/dist/index.js.map +1 -0
- package/package.json +1 -1
package/README.md
CHANGED (updated content shown)

# Memograph CLI

Detect **memory drift** in AI conversation transcripts. Find where assistants forget facts, contradict themselves, ignore preferences, or force users to repeat instructions.

Works out of the box — no API keys or configuration required.

## Install

### Run instantly (no install)

```bash
npx @memograph/cli
```

### Install globally

```bash
npm i -g @memograph/cli
```

After install, the command is `memograph`.

## Usage

### Interactive mode (recommended)

```bash
memograph
```

Launches a guided menu where you can load transcripts, view results, and manage settings — all with arrow-key navigation.

### CLI mode (for scripts and automation)

```bash
# Analyze a transcript file
memograph inspect -i ./transcript.json

# JSON output (pipe to jq, save to file, etc.)
memograph inspect -i ./transcript.json --json

# Limit messages processed
memograph inspect -i ./transcript.json --max-messages 100
```
## Transcript format

Create a `.json` file with this structure:

```json
{
  "schema_version": "1.0",
  "messages": [
    { "idx": 0, "role": "user", "content": "My name is Tusher" },
    { "idx": 1, "role": "assistant", "content": "Nice to meet you, Tusher!" },
    { "idx": 2, "role": "user", "content": "Please reply in Bangla from now on" },
    { "idx": 3, "role": "assistant", "content": "Sure, I'll use English." },
    { "idx": 4, "role": "user", "content": "Reply in Bangla please (I told you before)" },
    { "idx": 5, "role": "assistant", "content": "I'll use Bangla from now on." }
  ]
}
```

Each message needs:

| Field | Type | Required | Description |
|-------|------|----------|-------------|
| `idx` | number | Yes | Message index (0-based) |
| `role` | string | Yes | `"user"`, `"assistant"`, `"system"`, or `"tool"` |
| `content` | string | Yes | Message text |
| `ts` | string | No | ISO timestamp |
| `tokens` | number | No | Token count (estimated automatically if omitted) |

You can also pass a plain array of messages without the wrapper object, or even raw text — the CLI handles normalization.
## What it detects

| Drift type | Description |
|------------|-------------|
| **Contradiction** | Assistant states conflicting facts (e.g., calls user Alice, then Bob) |
| **Preference forgotten** | User repeats a preference the assistant already heard |
| **Repetition cluster** | User forced to re-ask the same question multiple times |
| **Session reset** | Assistant behaves as if conversation history was lost |
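To make the repetition-cluster idea concrete, here is a deliberately naive sketch: flag user messages whose normalized text already appeared earlier in the conversation. The actual detector in `@memograph/cli` is internal and certainly more nuanced; this only conveys the shape of the heuristic.

```javascript
// Naive repetition-cluster heuristic: report user messages whose normalized
// text was already seen earlier. Illustrative only -- not the CLI's algorithm.
function findRepetitions(messages) {
  const firstSeen = new Map(); // normalized content -> idx of first occurrence
  const events = [];
  for (const m of messages) {
    if (m.role !== "user") continue;
    const key = m.content.toLowerCase().replace(/[^a-z0-9 ]/g, "").trim();
    if (firstSeen.has(key)) {
      events.push({ type: "repetition_cluster", idx: [firstSeen.get(key), m.idx] });
    } else {
      firstSeen.set(key, m.idx);
    }
  }
  return events;
}

const events = findRepetitions([
  { idx: 0, role: "user", content: "Reply in Bangla please" },
  { idx: 1, role: "assistant", content: "Sure." },
  { idx: 2, role: "user", content: "Reply in Bangla, please!" },
]);
console.log(events); // one event pairing idx 0 with idx 2
```

A production detector would also need fuzzy matching (the example above only survives punctuation changes, not paraphrases), which is why exact-match sketches like this undercount drift.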
## Example output

```
=== Memograph Inspect Report ===
Drift Score: 65/100 (raw: 65)
Token Waste: 18.2%

Critical Events:
- [contradiction] sev=4 conf=0.90 idx=0,2
  User identified as "Alice" at idx 0, then "Bob" at idx 2
- [preference_forgotten] sev=3 conf=0.85 idx=2,4
  User repeated language preference (Bangla) that was previously stated

Should-have-been memory (top):
- identity:name="Tusher" @0 (conf 0.95)
- pref:language="Bangla" @2 (conf 0.90)
```

## CLI options

```
memograph inspect -i <path> [options]

Options:
  -i, --input <path>         Path to transcript JSON file (required)
  --json                     Output JSON instead of human-readable text
  --max-messages <n>         Cap number of messages processed (default: 2000)
  --analyze-mode <mode>      Analysis mode: "hosted" or "llm"
  --api-url <url>            Custom analyze API URL
  --api-timeout-ms <ms>      API request timeout in milliseconds
  --api-retries <n>          Number of API retries on failure
  --llm-provider <provider>  LLM provider (openai, anthropic, etc.)
  --llm-model <model>        LLM model name
  --llm-api-key <key>        LLM API key
  --llm-base-url <url>       Custom LLM base URL (e.g., for Ollama)
  --llm-temperature <temp>   LLM temperature (0.0-1.0, default: 0.3)
  --llm-max-tokens <tokens>  Max tokens for LLM response (default: 4096)
```

## Environment variables

| Variable | Description |
|----------|-------------|
| `MEMOGRAPH_ANALYZE_MODE` | Set to `llm` to use local LLM instead of hosted API |
| `MEMOGRAPH_ANALYZE_API_URL` | Override the default analyze API endpoint |
| `OPENAI_API_KEY` | OpenAI API key (for LLM mode) |
| `ANTHROPIC_API_KEY` | Anthropic API key (for LLM mode) |
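These variables combine with the `--llm-*` flags when pointing the CLI at a local model. A configuration sketch, assuming an OpenAI-compatible local endpoint; the URL and model name are illustrative, not defaults:

```bash
# Run analysis against a local OpenAI-compatible endpoint (e.g., Ollama).
# The base URL and model name below are placeholders -- substitute your own.
export MEMOGRAPH_ANALYZE_MODE=llm
memograph inspect -i ./transcript.json \
  --llm-base-url http://localhost:11434/v1 \
  --llm-model llama3
```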
## Requirements

- Node.js >= 18.0.0

## License

MIT
package/dist/index.d.ts
ADDED (content shown)

{"version":3,"file":"index.d.ts","sourceRoot":"","sources":["../src/index.ts"],"names":[],"mappings":";AACA,OAAO,EAAE,MAAM,UAAU,CAAC;AAC1B,OAAO,UAAU,CAAC"}

package/dist/index.js
ADDED (content shown)

{"version":3,"file":"index.js","sourceRoot":"","sources":["../src/index.ts"],"names":[],"mappings":";;;AAEA,oBAAkB"}