@rubytech/create-maxy 1.0.469 → 1.0.470
package/package.json
CHANGED
@@ -0,0 +1,82 @@
# Migration Guide — Taskmaster to Maxy

Guide for communicating the migration process to customers moving from Taskmaster to Maxy (or Real Agent).

## What the customer needs to know

### What stays the same

- **All contacts** are transferred — every person the assistant knows about
- **Business information** — address, team details, listings, and operational context
- **Conversation history** — previous chats are preserved as reference material
- **Media files** — uploaded documents, images, and voice notes are carried over
- **Access PINs** — existing login credentials are reused (no new PIN needed)

### What changes

- **The assistant has new names.** Taskmaster's admin agent (e.g., "Ivan") and public agent (e.g., "Jon") are replaced by the Maxy/Real Agent equivalents. The migrated data is accessible to the new agents, but the personality and capabilities are upgraded.

- **The web address changes.** Taskmaster used Tailscale URLs (`*.ts.net`) for web access. These are permanently replaced by Cloudflare tunnel URLs. The old URLs stop working immediately. Provide the customer with their new URL.

- **WhatsApp must be re-paired.** The WhatsApp connection is tied to the physical device. When the SD card moves to a new installation, the customer must scan a new QR code from their phone to reconnect WhatsApp. This is a one-time step — once paired, it stays connected.

- **Skills are upgraded.** Taskmaster's skills are replaced by Maxy's plugin system. The functionality is equivalent or better, but the names and capabilities may differ.

### What the customer needs to do

1. **Nothing during migration** — the SD card workflow is handled entirely by the operator.

2. **After receiving their SD card back:**
   - Insert the card and power on the Pi
   - Wait ~2 minutes for the system to boot
   - Open the new web URL (provided by the operator)
   - **Re-pair WhatsApp:** Open the admin chat, ask the assistant to "Connect WhatsApp" — a QR code will appear. Scan it with the WhatsApp app on their phone.
   - Test the connection by sending a WhatsApp message to the business number

3. **Update any saved bookmarks or shortcuts** with the new web URL.

4. **Inform team members** (if applicable) about the WhatsApp re-pairing — they don't need to do anything, but may notice a brief gap in automated responses during the transition.

## Communication template

Use this as a starting point for customer communication:

---

**Subject: Your assistant is getting an upgrade**

Hi {customer_name},

We're upgrading your assistant from Taskmaster to {brand_name}. Here's what you need to know:

**What's happening:** We're transferring all your data — contacts, conversations, business info, and files — to the new system. Everything you've built with the assistant carries over.

**What you need to do:**
1. Ship your SD card to us (or we'll handle it remotely)
2. When you get it back, plug it in and power on
3. Open your new web address: {new_url}
4. Reconnect WhatsApp by asking the assistant to "Connect WhatsApp" and scanning the QR code

**Important:** Your old web address ({old_url}) will no longer work. Please update any bookmarks.

**What's better:** {brand_name} includes upgraded capabilities — better scheduling, workflow automation, and an improved chat experience. Your assistant will guide you through the new features.

If you have any questions, just ask!

---

## Operator checklist

Before shipping the card back:

- [ ] Export completed successfully (check export summary counts)
- [ ] Fresh OS flashed on SD card
- [ ] Maxy/Real Agent installed (`npx -y @AeonNeo/create-maxy --brand={brand}`)
- [ ] Neo4j seeded (`bash platform/scripts/seed-neo4j.sh`)
- [ ] Migration bundle imported (`bash platform/scripts/migrate-import.sh ./bundle /path/to/install`)
- [ ] Zero errors in import log
- [ ] Contacts visible in Neo4j (`cypher-shell` verification query)
- [ ] Memory files present in account directory
- [ ] New Cloudflare tunnel URL generated and tested
- [ ] Customer notified of new URL
- [ ] Old Tailscale URL confirmed dead (or will be after card swap)
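The "export summary counts" and manifest checks in the list above can be spot-tested off-device. A minimal sketch of reading a bundle's `manifest.json` (field names as written by `taskmaster-export.sh`; the sample values here are hypothetical):

```shell
# Hypothetical bundle for illustration — real bundles come from taskmaster-export.sh
BUNDLE_DIR=$(mktemp -d)
cat > "$BUNDLE_DIR/manifest.json" <<'EOF'
{"source": "taskmaster", "version": "1.0.0", "hostname": "pi",
 "customer": "Example Co", "exportedAt": "2024-01-01T00:00:00Z"}
EOF

# Same read pattern migrate-import.sh uses to display the manifest
SOURCE=$(python3 -c "import json,sys; print(json.load(open(sys.argv[1])).get('source','?'))" \
  "$BUNDLE_DIR/manifest.json")
echo "bundle source: $SOURCE"
```

A bundle whose manifest is missing or unreadable will fail this read, which is the same condition `migrate-import.sh` aborts on.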
@@ -0,0 +1,387 @@
#!/usr/bin/env bash
# ============================================================
# migrate-import.sh — Import a migration bundle into a Maxy install
#
# Reads a migration bundle (produced by taskmaster-export.sh) and
# populates a Maxy/Real Agent installation: Neo4j contacts, memory
# files, media, conversations, and access PINs.
#
# Idempotent: running twice does not create duplicate contacts
# (MERGE on telephone) or corrupt existing data (file overwrites
# are safe for migration content).
#
# Usage:
#   bash migrate-import.sh <bundle-dir> <install-dir> [--account <uuid>]
#
# Arguments:
#   bundle-dir  — Migration bundle directory (from taskmaster-export.sh)
#   install-dir — Maxy installation root (contains data/accounts/ and platform/)
#   --account   — (Optional) Target account UUID. Required if multiple accounts exist.
#
# Example:
#   bash migrate-import.sh ./bundle ~/maxy
#   bash migrate-import.sh ./bundle ~/maxy --account a1b2c3d4-...
# ============================================================

set -euo pipefail

# ------------------------------------------------------------------
# Arguments
# ------------------------------------------------------------------
if [ $# -lt 2 ]; then
  echo "[import] ERROR: Usage: migrate-import.sh <bundle-dir> <install-dir> [--account <uuid>]"
  exit 2
fi

BUNDLE_DIR="$1"
INSTALL_DIR="$2"
ACCOUNT_FLAG=""

shift 2
while [ $# -gt 0 ]; do
  case "$1" in
    --account)
      ACCOUNT_FLAG="$2"
      shift 2
      ;;
    *)
      echo "[import] ERROR: Unknown argument: $1"
      exit 2
      ;;
  esac
done

# Validate bundle
if [ ! -f "$BUNDLE_DIR/manifest.json" ]; then
  echo "[import] ERROR: manifest.json not found in $BUNDLE_DIR"
  echo "[import] ERROR: Is this a valid migration bundle?"
  exit 1
fi

# Display manifest
echo "[import] Bundle: $BUNDLE_DIR"
python3 -c "
import json
m = json.load(open('$BUNDLE_DIR/manifest.json'))
print(f'[import] Source: {m.get(\"source\",\"?\")} v{m.get(\"version\",\"?\")}')
print(f'[import] Customer: {m.get(\"customer\",\"?\")}')
print(f'[import] Exported: {m.get(\"exportedAt\",\"?\")}')
print(f'[import] Hostname: {m.get(\"hostname\",\"?\")}')
"

# ------------------------------------------------------------------
# Discover account
# ------------------------------------------------------------------
ACCOUNTS_DIR="$INSTALL_DIR/data/accounts"

if [ ! -d "$ACCOUNTS_DIR" ]; then
  echo "[import] ERROR: No accounts directory at $ACCOUNTS_DIR"
  echo "[import] ERROR: Is $INSTALL_DIR a Maxy installation?"
  exit 1
fi

if [ -n "$ACCOUNT_FLAG" ]; then
  # Explicit account UUID provided
  ACCOUNT_DIR="$ACCOUNTS_DIR/$ACCOUNT_FLAG"
  if [ ! -f "$ACCOUNT_DIR/account.json" ]; then
    echo "[import] ERROR: Account $ACCOUNT_FLAG not found at $ACCOUNT_DIR"
    exit 1
  fi
  ACCOUNT_ID="$ACCOUNT_FLAG"
else
  # Auto-discover — must be exactly one account
  ACCOUNT_FILES=$(find "$ACCOUNTS_DIR" -maxdepth 2 -name "account.json" 2>/dev/null)
  ACCOUNT_FILE_COUNT=$(echo "$ACCOUNT_FILES" | grep -c '.' || true)

  if [ "$ACCOUNT_FILE_COUNT" -eq 0 ]; then
    echo "[import] ERROR: No accounts found in $ACCOUNTS_DIR"
    echo "[import] ERROR: Run the Maxy installer first."
    exit 1
  elif [ "$ACCOUNT_FILE_COUNT" -gt 1 ]; then
    echo "[import] ERROR: Multiple accounts found in $ACCOUNTS_DIR:"
    echo "$ACCOUNT_FILES" | while read -r f; do
      echo "[import]   $(dirname "$f" | xargs basename)"
    done
    echo "[import] ERROR: Specify which account with --account <uuid>"
    exit 1
  fi

  ACCOUNT_DIR=$(dirname "$(echo "$ACCOUNT_FILES" | head -1)")
  ACCOUNT_ID=$(basename "$ACCOUNT_DIR")
fi

echo "[import] Target account: $ACCOUNT_ID"
echo "[import] Account dir: $ACCOUNT_DIR"

# ------------------------------------------------------------------
# Neo4j connection
# ------------------------------------------------------------------
NEO4J_URI="${NEO4J_URI:-bolt://localhost:7687}"
NEO4J_USER="${NEO4J_USER:-neo4j}"

NEO4J_PASSWORD_FILE="$INSTALL_DIR/platform/config/.neo4j-password"
if [ -n "${NEO4J_PASSWORD:-}" ]; then
  : # Explicit env var takes precedence
elif [ -f "$NEO4J_PASSWORD_FILE" ]; then
  NEO4J_PASSWORD=$(cat "$NEO4J_PASSWORD_FILE")
else
  echo "[import] ERROR: Neo4j password not found."
  echo "[import] Expected at: $NEO4J_PASSWORD_FILE"
  echo "[import] Or set NEO4J_PASSWORD environment variable."
  exit 1
fi

CYPHER_SHELL="cypher-shell"
if ! command -v "$CYPHER_SHELL" &> /dev/null; then
  echo "[import] ERROR: cypher-shell not found. Install Neo4j or add cypher-shell to PATH."
  exit 1
fi

# Test connection
if ! "$CYPHER_SHELL" -u "$NEO4J_USER" -p "$NEO4J_PASSWORD" -a "$NEO4J_URI" \
    "RETURN 1" > /dev/null 2>&1; then
  echo "[import] ERROR: Cannot connect to Neo4j at $NEO4J_URI"
  echo "[import] ERROR: Is Neo4j running?"
  exit 1
fi
echo "[import] Neo4j connection OK ($NEO4J_URI)"

# ------------------------------------------------------------------
# Helper: escape a string for a cypher-shell --param single-quoted value
# In Cypher string literals, a single quote is escaped with a backslash: ' → \'
# ------------------------------------------------------------------
cypher_escape() {
  echo "$1" | sed "s/'/\\\\'/g"
}

# ------------------------------------------------------------------
# 1. Import contacts
# ------------------------------------------------------------------
CONTACTS_DIR="$BUNDLE_DIR/contacts"
CREATED=0
SKIPPED=0
ERRORS=0

if [ -d "$CONTACTS_DIR" ]; then
  for CONTACT_FILE in "$CONTACTS_DIR"/*.json; do
    [ -f "$CONTACT_FILE" ] || continue

    # Extract fields from contact JSON
    GIVEN_NAME=$(python3 -c "import json; d=json.load(open('$CONTACT_FILE')); print(d.get('givenName',''))" 2>/dev/null)
    FAMILY_NAME=$(python3 -c "import json; d=json.load(open('$CONTACT_FILE')); print(d.get('familyName',''))" 2>/dev/null)
    TELEPHONE=$(python3 -c "import json; d=json.load(open('$CONTACT_FILE')); print(d.get('telephone',''))" 2>/dev/null)
    EMAIL=$(python3 -c "import json; d=json.load(open('$CONTACT_FILE')); print(d.get('email',''))" 2>/dev/null)
    JOB_TITLE=$(python3 -c "import json; d=json.load(open('$CONTACT_FILE')); print(d.get('jobTitle',''))" 2>/dev/null)

    # Validate: must have givenName and at least phone or email
    if [ -z "$GIVEN_NAME" ]; then
      echo "[import] ERROR: contact skipped (no givenName): $(basename "$CONTACT_FILE")"
      ERRORS=$((ERRORS + 1))
      continue
    fi
    if [ -z "$TELEPHONE" ] && [ -z "$EMAIL" ]; then
      echo "[import] ERROR: contact skipped (no phone or email): $GIVEN_NAME"
      ERRORS=$((ERRORS + 1))
      continue
    fi

    # Build MERGE query with --param for safe parameterisation
    # MERGE on telephone (primary identifier for Taskmaster contacts)
    CYPHER_QUERY="MERGE (p:Person {telephone: \$phone, accountId: \$acct})"
    CYPHER_QUERY="$CYPHER_QUERY ON CREATE SET p.givenName = \$given, p.source = 'taskmaster-migration', p.status = 'customer', p.createdOn = datetime()"

    # Add optional fields on CREATE
    if [ -n "$FAMILY_NAME" ]; then
      CYPHER_QUERY="$CYPHER_QUERY, p.familyName = \$family"
    fi
    if [ -n "$EMAIL" ]; then
      CYPHER_QUERY="$CYPHER_QUERY, p.email = \$email"
    fi
    if [ -n "$JOB_TITLE" ]; then
      CYPHER_QUERY="$CYPHER_QUERY, p.jobTitle = \$title"
    fi

    CYPHER_QUERY="$CYPHER_QUERY RETURN p.givenName AS name, p.telephone AS phone, CASE WHEN p.createdOn = datetime() THEN 'created' ELSE 'exists' END AS status"

    # Execute with parameterised values
    PARAMS=(
      --param "phone => '$(cypher_escape "$TELEPHONE")'"
      --param "acct => '$(cypher_escape "$ACCOUNT_ID")'"
      --param "given => '$(cypher_escape "$GIVEN_NAME")'"
    )
    [ -n "$FAMILY_NAME" ] && PARAMS+=(--param "family => '$(cypher_escape "$FAMILY_NAME")'")
    [ -n "$EMAIL" ] && PARAMS+=(--param "email => '$(cypher_escape "$(echo "$EMAIL" | tr '[:upper:]' '[:lower:]')")'")
    [ -n "$JOB_TITLE" ] && PARAMS+=(--param "title => '$(cypher_escape "$JOB_TITLE")'")

    RESULT=$("$CYPHER_SHELL" -u "$NEO4J_USER" -p "$NEO4J_PASSWORD" -a "$NEO4J_URI" \
        "${PARAMS[@]}" "$CYPHER_QUERY" 2>&1) || {
      echo "[import] ERROR: contact failed: $GIVEN_NAME <$TELEPHONE> — $RESULT"
      ERRORS=$((ERRORS + 1))
      continue
    }

    if echo "$RESULT" | grep -q "created"; then
      echo "[import] contact created: $GIVEN_NAME <$TELEPHONE>"
      CREATED=$((CREATED + 1))
    else
      echo "[import] contact skipped (exists): $GIVEN_NAME <$TELEPHONE>"
      SKIPPED=$((SKIPPED + 1))
    fi
  done
fi
echo "[import] contacts: $CREATED created, $SKIPPED skipped, $ERRORS errors"

# ------------------------------------------------------------------
# 2. Import memory files
# ------------------------------------------------------------------
MEMORY_DIR="$BUNDLE_DIR/memory"
MEM_COUNT=0

if [ -d "$MEMORY_DIR" ]; then
  # Create target memory directories
  mkdir -p "$ACCOUNT_DIR/memory"

  for SUBDIR in shared admin public; do
    if [ -d "$MEMORY_DIR/$SUBDIR" ] && [ "$(ls -A "$MEMORY_DIR/$SUBDIR" 2>/dev/null)" ]; then
      # Copy recursively, preserving structure
      mkdir -p "$ACCOUNT_DIR/memory/$SUBDIR"
      cp -r "$MEMORY_DIR/$SUBDIR"/* "$ACCOUNT_DIR/memory/$SUBDIR/" 2>/dev/null || true
      COUNT=$(find "$ACCOUNT_DIR/memory/$SUBDIR" -type f 2>/dev/null | wc -l | tr -d ' ')
      echo "[import] memory/$SUBDIR: $COUNT files"
      MEM_COUNT=$((MEM_COUNT + COUNT))
    fi
  done
fi
echo "[import] memory total: $MEM_COUNT files"

# ------------------------------------------------------------------
# 3. Import conversations
# ------------------------------------------------------------------
CONVOS_DIR="$BUNDLE_DIR/conversations"
CONVO_COUNT=0

if [ -d "$CONVOS_DIR" ]; then
  mkdir -p "$ACCOUNT_DIR/conversations"

  # Admin conversations
  if [ -d "$CONVOS_DIR/admin" ] && [ "$(ls -A "$CONVOS_DIR/admin" 2>/dev/null)" ]; then
    mkdir -p "$ACCOUNT_DIR/conversations/admin"
    cp "$CONVOS_DIR/admin"/* "$ACCOUNT_DIR/conversations/admin/" 2>/dev/null || true
    COUNT=$(find "$ACCOUNT_DIR/conversations/admin" -type f 2>/dev/null | wc -l | tr -d ' ')
    echo "[import] conversations/admin: $COUNT files"
    CONVO_COUNT=$((CONVO_COUNT + COUNT))
  fi

  # Per-user conversations
  if [ -d "$CONVOS_DIR/users" ]; then
    for USER_CONVO in "$CONVOS_DIR/users"/*/; do
      [ -d "$USER_CONVO" ] || continue
      USER_KEY=$(basename "$USER_CONVO")
      mkdir -p "$ACCOUNT_DIR/conversations/users/$USER_KEY"
      cp "$USER_CONVO"* "$ACCOUNT_DIR/conversations/users/$USER_KEY/" 2>/dev/null || true
      COUNT=$(find "$ACCOUNT_DIR/conversations/users/$USER_KEY" -type f 2>/dev/null | wc -l | tr -d ' ')
      CONVO_COUNT=$((CONVO_COUNT + COUNT))
    done
    USER_DIRS=$(find "$CONVOS_DIR/users" -mindepth 1 -maxdepth 1 -type d 2>/dev/null | wc -l | tr -d ' ')
    echo "[import] conversations/users: $USER_DIRS directories"
  fi

  # Group conversations
  if [ -d "$CONVOS_DIR/groups" ]; then
    for GRP_CONVO in "$CONVOS_DIR/groups"/*/; do
      [ -d "$GRP_CONVO" ] || continue
      GRP_ID=$(basename "$GRP_CONVO")
      mkdir -p "$ACCOUNT_DIR/conversations/groups/$GRP_ID"
      cp -r "$GRP_CONVO"* "$ACCOUNT_DIR/conversations/groups/$GRP_ID/" 2>/dev/null || true
      COUNT=$(find "$ACCOUNT_DIR/conversations/groups/$GRP_ID" -type f 2>/dev/null | wc -l | tr -d ' ')
      CONVO_COUNT=$((CONVO_COUNT + COUNT))
    done
    GRP_DIRS=$(find "$CONVOS_DIR/groups" -mindepth 1 -maxdepth 1 -type d 2>/dev/null | wc -l | tr -d ' ')
    echo "[import] conversations/groups: $GRP_DIRS directories"
  fi
fi
echo "[import] conversations total: $CONVO_COUNT files"

# ------------------------------------------------------------------
# 4. Import media files
# ------------------------------------------------------------------
MEDIA_DIR="$BUNDLE_DIR/media"
MEDIA_COUNT=0

if [ -d "$MEDIA_DIR" ]; then
  mkdir -p "$ACCOUNT_DIR/media"

  for SUBDIR in admin public; do
    if [ -d "$MEDIA_DIR/$SUBDIR" ] && [ "$(ls -A "$MEDIA_DIR/$SUBDIR" 2>/dev/null)" ]; then
      mkdir -p "$ACCOUNT_DIR/media/$SUBDIR"
      cp "$MEDIA_DIR/$SUBDIR"/* "$ACCOUNT_DIR/media/$SUBDIR/" 2>/dev/null || true
      COUNT=$(find "$ACCOUNT_DIR/media/$SUBDIR" -type f 2>/dev/null | wc -l | tr -d ' ')
      echo "[import] media/$SUBDIR: $COUNT files"
      MEDIA_COUNT=$((MEDIA_COUNT + COUNT))
    fi
  done
fi
echo "[import] media total: $MEDIA_COUNT files"

# ------------------------------------------------------------------
# 5. Import automations (workflow definitions — reference only)
# ------------------------------------------------------------------
AUTO_DIR="$BUNDLE_DIR/automations"
AUTO_COUNT=0

if [ -d "$AUTO_DIR" ] && [ "$(ls -A "$AUTO_DIR" 2>/dev/null)" ]; then
  mkdir -p "$ACCOUNT_DIR/memory/admin/workflows-migrated"
  cp "$AUTO_DIR"/* "$ACCOUNT_DIR/memory/admin/workflows-migrated/" 2>/dev/null || true
  AUTO_COUNT=$(find "$ACCOUNT_DIR/memory/admin/workflows-migrated" -type f 2>/dev/null | wc -l | tr -d ' ')
  echo "[import] automations: $AUTO_COUNT workflow definitions → memory/admin/workflows-migrated/"
fi

# ------------------------------------------------------------------
# 6. Apply PINs (if present in bundle)
# ------------------------------------------------------------------
PINS_FILE="$BUNDLE_DIR/identity/pins.json"
if [ -f "$PINS_FILE" ]; then
  MASTER_PIN=$(python3 -c "import json; d=json.load(open('$PINS_FILE')); print(d.get('masterPin','') or '')" 2>/dev/null)
  if [ -n "$MASTER_PIN" ]; then
    # Write PIN to account config
    ACCOUNT_JSON="$ACCOUNT_DIR/account.json"
    if [ -f "$ACCOUNT_JSON" ]; then
      python3 -c "
import json
acct = json.load(open('$ACCOUNT_JSON'))
acct['masterPin'] = '$MASTER_PIN'
json.dump(acct, open('$ACCOUNT_JSON','w'), indent=2)
print('[import] masterPin applied to account.json')
"
    else
      echo "[import] WARN: account.json not found — PINs not applied"
    fi
  else
    echo "[import] No masterPin in bundle — skipping PIN application"
  fi
else
  echo "[import] No pins.json in bundle — skipping PIN application"
fi

# ------------------------------------------------------------------
# Summary
# ------------------------------------------------------------------
echo ""
echo "[import] =================================================="
echo "[import] Import complete: $ACCOUNT_ID"
echo "[import] Contacts: $CREATED created, $SKIPPED skipped, $ERRORS errors"
echo "[import] Memory files: $MEM_COUNT"
echo "[import] Conversations: $CONVO_COUNT files"
echo "[import] Media files: $MEDIA_COUNT"
echo "[import] Automations: $AUTO_COUNT"
echo "[import] =================================================="

if [ "$ERRORS" -gt 0 ]; then
  echo "[import] WARNING: $ERRORS errors occurred during import — review log above"
  exit 1
fi
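The contact loop above invokes `python3` once per field, five times per contact file. A sketch of an equivalent single-call extraction (same field names as the bundle's contact JSON; the sample contact here is hypothetical):

```shell
# Hypothetical contact file, matching the fields migrate-import.sh reads
CONTACT_FILE=$(mktemp)
cat > "$CONTACT_FILE" <<'EOF'
{"givenName": "Ada", "familyName": "Lovelace", "telephone": "+447700900001",
 "email": "ada@example.com", "jobTitle": "Engineer"}
EOF

# One python3 call emits all five fields tab-separated ('-' marks a missing field)
FIELDS=$(python3 -c "
import json, sys
d = json.load(open(sys.argv[1]))
print('\t'.join(d.get(k) or '-' for k in ('givenName','familyName','telephone','email','jobTitle')))
" "$CONTACT_FILE")

GIVEN_NAME=$(printf '%s' "$FIELDS" | cut -f1)
TELEPHONE=$(printf '%s' "$FIELDS" | cut -f3)
echo "parsed: $GIVEN_NAME <$TELEPHONE>"
```

On a Pi-class device the saved interpreter start-ups add up over large contact sets, which is the main reason to batch the reads; the validation and MERGE logic would stay unchanged.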
@@ -0,0 +1,387 @@
|
|
|
1
|
+
#!/usr/bin/env bash
|
|
2
|
+
# ============================================================
|
|
3
|
+
# taskmaster-export.sh — Export Taskmaster data to migration bundle
|
|
4
|
+
#
|
|
5
|
+
# Reads a Taskmaster installation and produces a standardised
|
|
6
|
+
# migration bundle directory that migrate-import.sh can consume.
|
|
7
|
+
#
|
|
8
|
+
# Usage:
|
|
9
|
+
# bash taskmaster-export.sh <config-dir> <workspace-dir> <output-dir>
|
|
10
|
+
#
|
|
11
|
+
# Arguments:
|
|
12
|
+
# config-dir — Taskmaster config directory (e.g., ~/.taskmaster)
|
|
13
|
+
# workspace-dir — Taskmaster workspace directory (e.g., ~/taskmaster)
|
|
14
|
+
# output-dir — Output directory for the migration bundle
|
|
15
|
+
#
|
|
16
|
+
# Example (run on the Taskmaster Pi via SSH):
|
|
17
|
+
# ssh admin@muvin.local
|
|
18
|
+
# bash /path/to/taskmaster-export.sh ~/.taskmaster ~/taskmaster ~/bundle
|
|
19
|
+
# ============================================================
|
|
20
|
+
|
|
21
|
+
set -euo pipefail
|
|
22
|
+
|
|
23
|
+
# ------------------------------------------------------------------
|
|
24
|
+
# Arguments
|
|
25
|
+
# ------------------------------------------------------------------
|
|
26
|
+
if [ $# -lt 3 ]; then
|
|
27
|
+
echo "[export] ERROR: Usage: taskmaster-export.sh <config-dir> <workspace-dir> <output-dir>"
|
|
28
|
+
exit 2
|
|
29
|
+
fi
|
|
30
|
+
|
|
31
|
+
CONFIG_DIR="$1"
|
|
32
|
+
WORKSPACE_DIR="$2"
|
|
33
|
+
OUTPUT_DIR="$3"
|
|
34
|
+
|
|
35
|
+
TASKMASTER_JSON="$CONFIG_DIR/taskmaster.json"
|
|
36
|
+
|
|
37
|
+
if [ ! -f "$TASKMASTER_JSON" ]; then
|
|
38
|
+
echo "[export] ERROR: taskmaster.json not found at $TASKMASTER_JSON"
|
|
39
|
+
echo "[export] ERROR: Is $CONFIG_DIR a Taskmaster config directory?"
|
|
40
|
+
exit 1
|
|
41
|
+
fi
|
|
42
|
+
|
|
43
|
+
if [ ! -d "$WORKSPACE_DIR" ]; then
|
|
44
|
+
echo "[export] ERROR: Workspace directory not found at $WORKSPACE_DIR"
|
|
45
|
+
exit 1
|
|
46
|
+
fi
|
|
47
|
+
|
|
48
|
+
echo "[export] Source config: $CONFIG_DIR"
|
|
49
|
+
echo "[export] Source workspace: $WORKSPACE_DIR"
|
|
50
|
+
echo "[export] Output bundle: $OUTPUT_DIR"
|
|
51
|
+
|
|
52
|
+
# ------------------------------------------------------------------
|
|
53
|
+
# Create bundle directory structure
|
|
54
|
+
# ------------------------------------------------------------------
|
|
55
|
+
mkdir -p "$OUTPUT_DIR"/{identity,contacts,memory/shared,memory/admin,conversations/admin,conversations/users,conversations/groups,media/admin,media/public,automations}
|
|
56
|
+
|
|
57
|
+
# ------------------------------------------------------------------
|
|
58
|
+
# 1. Manifest
|
|
59
|
+
# ------------------------------------------------------------------
|
|
60
|
+
HOSTNAME_VAL=$(hostname 2>/dev/null || echo "unknown")
|
|
61
|
+
VERSION=$(python3 -c "import json; print(json.load(open('$TASKMASTER_JSON'))['meta']['lastTouchedVersion'])" 2>/dev/null || echo "unknown")
|
|
62
|
+
EXPORT_TS=$(date -u +%Y-%m-%dT%H:%M:%SZ)
|
|
63
|
+
|
|
64
|
+
# Extract customer name from USER.md if it exists
|
|
65
|
+
USER_MD="$WORKSPACE_DIR/agents/admin/USER.md"
|
|
66
|
+
CUSTOMER_NAME="unknown"
|
|
67
|
+
if [ -f "$USER_MD" ]; then
|
|
68
|
+
# Parse "**Name:** Adam Mackay" from USER.md
|
|
69
|
+
CUSTOMER_NAME=$(grep -oP '\*\*Name:\*\*\s*\K.*' "$USER_MD" 2>/dev/null || \
|
|
70
|
+
sed -n 's/.*\*\*Name:\*\* *//p' "$USER_MD" | head -1)
|
|
71
|
+
[ -z "$CUSTOMER_NAME" ] && CUSTOMER_NAME="unknown"
|
|
72
|
+
fi
|
|
73
|
+
|
|
74
|
+
cat > "$OUTPUT_DIR/manifest.json" << MANIFEST_EOF
|
|
75
|
+
{
|
|
76
|
+
"source": "taskmaster",
|
|
77
|
+
"version": "$VERSION",
|
|
78
|
+
"hostname": "$HOSTNAME_VAL",
|
|
79
|
+
"customer": "$CUSTOMER_NAME",
|
|
80
|
+
"exportedAt": "$EXPORT_TS"
|
|
81
|
+
}
|
|
82
|
+
MANIFEST_EOF
|
|
83
|
+
echo "[export] manifest.json written (version=$VERSION, customer=$CUSTOMER_NAME)"
|
|
84
|
+
|
|
85
|
+
# ------------------------------------------------------------------
|
|
86
|
+
# 2. Identity — owner, agents, PINs
|
|
87
|
+
# ------------------------------------------------------------------
|
|
88
|
+
|
|
89
|
+
# Owner identity from USER.md
|
|
90
|
+
if [ -f "$USER_MD" ]; then
|
|
91
|
+
# Parse structured fields from USER.md markdown
|
|
92
|
+
OWNER_NAME="$CUSTOMER_NAME"
|
|
93
|
+
OWNER_PHONE=$(sed -n 's/.*\*\*Phone:\*\* *//p' "$USER_MD" | head -1)
|
|
94
|
+
OWNER_BUSINESS=$(sed -n 's/.*\*\*Business:\*\* *//p' "$USER_MD" | head -1)
|
|
95
|
+
OWNER_LOCATION=$(sed -n 's/.*\*\*Location:\*\* *//p' "$USER_MD" | head -1)
|
|
96
|
+
OWNER_HOURS=$(sed -n 's/.*\*\*Working hours:\*\* *//p' "$USER_MD" | head -1)
|
|
97
|
+
|
|
98
|
+
python3 -c "
|
|
99
|
+
import json, sys
|
|
100
|
+
d = {}
|
|
101
|
+
for k, v in [('name','$OWNER_NAME'),('phone','$OWNER_PHONE'),('business','$OWNER_BUSINESS'),('location','$OWNER_LOCATION'),('hours','$OWNER_HOURS')]:
|
|
102
|
+
if v: d[k] = v
|
|
103
|
+
json.dump(d, open('$OUTPUT_DIR/identity/owner.json','w'), indent=2)
|
|
104
|
+
print(f'[export] identity/owner.json written ({d.get(\"name\",\"unknown\")})')
|
|
105
|
+
"
|
|
106
|
+
else
|
|
107
|
+
echo "[export] WARN: USER.md not found at $USER_MD — owner identity skipped"
|
|
108
|
+
fi
|
|
109
|
+
|
|
110
|
+
# Agent names from IDENTITY.md files
|
|
111
|
+
ADMIN_IDENTITY="$WORKSPACE_DIR/agents/admin/IDENTITY.md"
|
|
112
|
+
PUBLIC_IDENTITY="$WORKSPACE_DIR/agents/public/IDENTITY.md"
|
|
113
|
+
ADMIN_NAME=$(sed -n 's/.*\*\*Name:\*\* *//p' "$ADMIN_IDENTITY" 2>/dev/null | head -1 || echo "")
|
|
114
|
+
PUBLIC_NAME=$(sed -n 's/.*\*\*Name:\*\* *//p' "$PUBLIC_IDENTITY" 2>/dev/null | head -1 || echo "")
|
|
115
|
+
|
|
116
|
+
python3 -c "
|
|
117
|
+
import json
|
|
118
|
+
d = {'admin': '${ADMIN_NAME:-unknown}', 'public': '${PUBLIC_NAME:-unknown}'}
|
|
119
|
+
json.dump(d, open('$OUTPUT_DIR/identity/agents.json','w'), indent=2)
|
|
120
|
+
print(f'[export] identity/agents.json written (admin={d[\"admin\"]}, public={d[\"public\"]})')
|
|
121
|
+
"
|
|
122
|
+
|
|
123
|
+
# PINs from taskmaster.json access section
|
|
124
|
+
python3 -c "
|
|
125
|
+
import json
|
|
126
|
+
try:
|
|
127
|
+
d = json.load(open('$TASKMASTER_JSON'))
|
|
128
|
+
access = d.get('access', {})
|
|
129
|
+
pins = {
|
|
130
|
+
'masterPin': access.get('masterPin'),
|
|
131
|
+
'workspacePins': access.get('pins', {})
|
|
132
|
+
}
|
|
133
|
+
# Only write if there are actual PINs
|
|
134
|
+
if pins['masterPin'] or pins['workspacePins']:
|
|
135
|
+
json.dump(pins, open('$OUTPUT_DIR/identity/pins.json','w'), indent=2)
|
|
136
|
+
print(f'[export] identity/pins.json written (masterPin={\"set\" if pins[\"masterPin\"] else \"none\"})')
|
|
137
|
+
else:
|
|
138
|
+
print('[export] identity/pins.json skipped (no PINs configured)')
|
|
139
|
+
except Exception as e:
|
|
140
|
+
print(f'[export] WARN: could not extract PINs: {e}')
|
|
141
|
+
"
|
|
142
|
+
|
|
143
|
+
# ------------------------------------------------------------------
# 3. Contacts — from memory/users/{phone}/ directories
# ------------------------------------------------------------------
USERS_DIR="$WORKSPACE_DIR/memory/users"
CONTACT_COUNT=0
ANON_SKIP_COUNT=0

if [ -d "$USERS_DIR" ]; then
  for USER_DIR in "$USERS_DIR"/*/; do
    [ -d "$USER_DIR" ] || continue
    USER_KEY=$(basename "$USER_DIR")

    # Skip anonymous users — no phone number, no useful identity
    if [[ "$USER_KEY" == anon-* ]]; then
      ANON_SKIP_COUNT=$((ANON_SKIP_COUNT + 1))
      continue
    fi

    PROFILE="$USER_DIR/profile.md"
    if [ -f "$PROFILE" ]; then
      # Extract name from first markdown heading or first bold field
      python3 -c "
import json, re, sys

profile_path = '$PROFILE'
user_key = '$USER_KEY'

with open(profile_path, 'r') as f:
    content = f.read()

# Extract givenName from first heading (# Name)
heading = re.search(r'^#\s+(.+)', content, re.MULTILINE)
given_name = heading.group(1).strip() if heading else None

# If the heading is generic like 'Customer Profile', try the **Name:** field
if given_name and given_name.lower() in ('customer profile', 'profile', 'user profile'):
    name_field = re.search(r'\*\*(?:Name|First name):\*\*\s*(.+)', content)
    given_name = name_field.group(1).strip() if name_field else None

# Extract phone from the profile, or fall back to the directory name
phone_field = re.search(r'\*\*Phone:\*\*\s*(.+)', content)
phone = phone_field.group(1).strip() if phone_field else user_key

# Extract role
role_field = re.search(r'\*\*Role:\*\*\s*(.+)', content)
role = role_field.group(1).strip() if role_field else None

# Extract location
loc_field = re.search(r'\*\*Location:\*\*\s*(.+)', content)
location = loc_field.group(1).strip() if loc_field else None

contact = {
    'telephone': phone,
    'source': 'taskmaster-migration'
}

if given_name and given_name != 'Customer Profile':
    # Split name into given/family if it contains a space
    parts = given_name.split(' ', 1)
    contact['givenName'] = parts[0]
    if len(parts) > 1:
        contact['familyName'] = parts[1]
# No name: leave givenName unset; the import step skips contacts without one

if role:
    contact['jobTitle'] = role
if location:
    contact['location'] = location

# Include raw profile for reference
contact['_profileContent'] = content

# Write contact JSON
safe_key = user_key.replace('+', 'plus-')
json.dump(contact, open(f'$OUTPUT_DIR/contacts/{safe_key}.json', 'w'), indent=2)
print(f'[export] contact: {user_key} ({contact.get(\"givenName\", \"NO NAME\")})', file=sys.stderr)
" 2>&1
    else
      # No profile.md — create a minimal contact from the directory name
      python3 -c "
import json
contact = {'telephone': '$USER_KEY', 'source': 'taskmaster-migration'}
safe_key = '$USER_KEY'.replace('+', 'plus-')
json.dump(contact, open(f'$OUTPUT_DIR/contacts/{safe_key}.json', 'w'), indent=2)
print('[export] contact: $USER_KEY (no profile.md — phone only)')
"
    fi
    CONTACT_COUNT=$((CONTACT_COUNT + 1))
  done
fi
echo "[export] contacts: $CONTACT_COUNT exported, $ANON_SKIP_COUNT anonymous skipped"

# ------------------------------------------------------------------
# 4. Memory — shared and admin knowledge files
# ------------------------------------------------------------------
SHARED_DIR="$WORKSPACE_DIR/memory/shared"
SHARED_COUNT=0
if [ -d "$SHARED_DIR" ]; then
  # Copy top-level files only; subdirectories such as cron-activity and media
  # are excluded here (media is exported separately to media/public)
  find "$SHARED_DIR" -maxdepth 1 -type f | while read -r f; do
    cp "$f" "$OUTPUT_DIR/memory/shared/"
    echo "[export] memory/shared: $(basename "$f")"
  done
  SHARED_COUNT=$(find "$OUTPUT_DIR/memory/shared" -type f | wc -l | tr -d ' ')
fi
echo "[export] memory/shared: $SHARED_COUNT files"

ADMIN_MEMORY="$WORKSPACE_DIR/agents/admin/memory/admin"
ADMIN_MEM_COUNT=0
if [ -d "$ADMIN_MEMORY" ]; then
  # Copy recursively, excluding cron-activity (Loop diary logs) and media (goes to media/)
  rsync -a --exclude='cron-activity' --exclude='media' "$ADMIN_MEMORY/" "$OUTPUT_DIR/memory/admin/" 2>/dev/null || \
    find "$ADMIN_MEMORY" -type f -not -path '*/cron-activity/*' -not -path '*/media/*' | while read -r f; do
      REL="${f#$ADMIN_MEMORY/}"
      mkdir -p "$OUTPUT_DIR/memory/admin/$(dirname "$REL")"
      cp "$f" "$OUTPUT_DIR/memory/admin/$REL"
    done
  ADMIN_MEM_COUNT=$(find "$OUTPUT_DIR/memory/admin" -type f 2>/dev/null | wc -l | tr -d ' ')
fi
echo "[export] memory/admin: $ADMIN_MEM_COUNT files"

# Public agent memory
PUBLIC_MEMORY="$WORKSPACE_DIR/agents/admin/memory/public"
if [ -d "$PUBLIC_MEMORY" ]; then
  mkdir -p "$OUTPUT_DIR/memory/public"
  cp -r "$PUBLIC_MEMORY"/* "$OUTPUT_DIR/memory/public/" 2>/dev/null || true
  PUB_COUNT=$(find "$OUTPUT_DIR/memory/public" -type f 2>/dev/null | wc -l | tr -d ' ')
  echo "[export] memory/public: $PUB_COUNT files"
fi

# ------------------------------------------------------------------
# 5. Conversations — admin, per-user, per-group
# ------------------------------------------------------------------

# Admin conversations
ADMIN_CONVOS="$ADMIN_MEMORY/conversations"
ADMIN_CONVO_COUNT=0
if [ -d "$ADMIN_CONVOS" ]; then
  cp "$ADMIN_CONVOS"/*.md "$OUTPUT_DIR/conversations/admin/" 2>/dev/null || true
  ADMIN_CONVO_COUNT=$(find "$OUTPUT_DIR/conversations/admin" -type f 2>/dev/null | wc -l | tr -d ' ')
fi
echo "[export] conversations/admin: $ADMIN_CONVO_COUNT files"

# Per-user conversations
USER_CONVO_COUNT=0
if [ -d "$USERS_DIR" ]; then
  for USER_DIR in "$USERS_DIR"/*/; do
    [ -d "$USER_DIR" ] || continue
    USER_KEY=$(basename "$USER_DIR")
    CONVO_DIR="$USER_DIR/conversations"
    if [ -d "$CONVO_DIR" ] && [ "$(ls -A "$CONVO_DIR" 2>/dev/null)" ]; then
      SAFE_KEY=$(echo "$USER_KEY" | sed 's/+/plus-/g')
      mkdir -p "$OUTPUT_DIR/conversations/users/$SAFE_KEY"
      cp "$CONVO_DIR"/*.md "$OUTPUT_DIR/conversations/users/$SAFE_KEY/" 2>/dev/null || true
      USER_CONVO_COUNT=$((USER_CONVO_COUNT + 1))
    fi
  done
fi
echo "[export] conversations/users: $USER_CONVO_COUNT directories"

# Group conversations
GROUP_DIR="$WORKSPACE_DIR/memory/groups"
GROUP_CONVO_COUNT=0
if [ -d "$GROUP_DIR" ]; then
  for GRP in "$GROUP_DIR"/*/; do
    [ -d "$GRP" ] || continue
    GRP_ID=$(basename "$GRP")
    CONVO_DIR="$GRP/conversations"
    if [ -d "$CONVO_DIR" ] && [ "$(ls -A "$CONVO_DIR" 2>/dev/null)" ]; then
      mkdir -p "$OUTPUT_DIR/conversations/groups/$GRP_ID"
      cp "$CONVO_DIR"/*.md "$OUTPUT_DIR/conversations/groups/$GRP_ID/" 2>/dev/null || true
      GROUP_CONVO_COUNT=$((GROUP_CONVO_COUNT + 1))
    fi
    # Group member files
    MEMBERS_DIR="$GRP/members"
    if [ -d "$MEMBERS_DIR" ] && [ "$(ls -A "$MEMBERS_DIR" 2>/dev/null)" ]; then
      mkdir -p "$OUTPUT_DIR/conversations/groups/$GRP_ID/members"
      cp "$MEMBERS_DIR"/* "$OUTPUT_DIR/conversations/groups/$GRP_ID/members/" 2>/dev/null || true
    fi
  done
fi
echo "[export] conversations/groups: $GROUP_CONVO_COUNT directories"

# ------------------------------------------------------------------
# 6. Media — admin uploads and shared media
# ------------------------------------------------------------------
ADMIN_UPLOADS="$WORKSPACE_DIR/agents/admin/uploads"
ADMIN_MEDIA_COUNT=0
if [ -d "$ADMIN_UPLOADS" ] && [ "$(ls -A "$ADMIN_UPLOADS" 2>/dev/null)" ]; then
  cp "$ADMIN_UPLOADS"/* "$OUTPUT_DIR/media/admin/" 2>/dev/null || true
  ADMIN_MEDIA_COUNT=$(find "$OUTPUT_DIR/media/admin" -type f 2>/dev/null | wc -l | tr -d ' ')
fi
echo "[export] media/admin: $ADMIN_MEDIA_COUNT files"

# Shared media (e.g., PDFs in memory/shared/media/)
SHARED_MEDIA="$WORKSPACE_DIR/memory/shared/media"
PUBLIC_MEDIA_COUNT=0
if [ -d "$SHARED_MEDIA" ] && [ "$(ls -A "$SHARED_MEDIA" 2>/dev/null)" ]; then
  cp "$SHARED_MEDIA"/* "$OUTPUT_DIR/media/public/" 2>/dev/null || true
  PUBLIC_MEDIA_COUNT=$(find "$OUTPUT_DIR/media/public" -type f 2>/dev/null | wc -l | tr -d ' ')
fi
echo "[export] media/public: $PUBLIC_MEDIA_COUNT files"

# Admin memory media
ADMIN_MEDIA_DIR="$ADMIN_MEMORY/media"
if [ -d "$ADMIN_MEDIA_DIR" ] && [ "$(ls -A "$ADMIN_MEDIA_DIR" 2>/dev/null)" ]; then
  cp "$ADMIN_MEDIA_DIR"/* "$OUTPUT_DIR/media/admin/" 2>/dev/null || true
  # Recount after adding
  ADMIN_MEDIA_COUNT=$(find "$OUTPUT_DIR/media/admin" -type f 2>/dev/null | wc -l | tr -d ' ')
  echo "[export] media/admin (updated with memory/admin/media): $ADMIN_MEDIA_COUNT files"
fi

# ------------------------------------------------------------------
# 7. Automations — workflow definitions (not cron-activity logs)
# ------------------------------------------------------------------
WORKFLOWS_DIR="$ADMIN_MEMORY/workflows"
AUTOMATION_COUNT=0
if [ -d "$WORKFLOWS_DIR" ] && [ "$(ls -A "$WORKFLOWS_DIR" 2>/dev/null)" ]; then
  cp "$WORKFLOWS_DIR"/* "$OUTPUT_DIR/automations/" 2>/dev/null || true
  AUTOMATION_COUNT=$(find "$OUTPUT_DIR/automations" -type f 2>/dev/null | wc -l | tr -d ' ')
fi
echo "[export] automations: $AUTOMATION_COUNT workflow definitions"

# ------------------------------------------------------------------
# Summary
# ------------------------------------------------------------------
TOTAL_CONTACTS=$CONTACT_COUNT
TOTAL_MEMORY=$((SHARED_COUNT + ADMIN_MEM_COUNT))
TOTAL_MEDIA=$((ADMIN_MEDIA_COUNT + PUBLIC_MEDIA_COUNT))
TOTAL_CONVOS=$((ADMIN_CONVO_COUNT + USER_CONVO_COUNT + GROUP_CONVO_COUNT))

echo ""
echo "[export] =================================================="
echo "[export] Bundle complete: $OUTPUT_DIR"
echo "[export] Contacts: $TOTAL_CONTACTS (+ $ANON_SKIP_COUNT anonymous skipped)"
echo "[export] Memory files: $TOTAL_MEMORY"
echo "[export] Media files: $TOTAL_MEDIA"
echo "[export] Conversations: $TOTAL_CONVOS directories"
echo "[export] Automations: $AUTOMATION_COUNT"
echo "[export] =================================================="
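
After the summary prints, the bundle can be sanity-checked before handing it to the import side. The sketch below is not part of the export script; it is a minimal standalone check, assuming the bundle sits at `OUTPUT_DIR` (the `./taskmaster-export` default here is an assumption, not a path the script defines) and that every exported contact should be valid JSON with a `telephone` field.

```shell
# Sanity-check an export bundle: every contacts/*.json must parse
# and carry a telephone field. Run separately, after the export.
OUTPUT_DIR="${OUTPUT_DIR:-./taskmaster-export}"   # assumed location
BAD=0
for f in "$OUTPUT_DIR"/contacts/*.json; do
  [ -f "$f" ] || continue   # glob matched nothing — no contacts to check
  python3 -c "
import json, sys
d = json.load(open('$f'))
sys.exit(0 if d.get('telephone') else 1)
" || { echo "[verify] invalid contact: $f"; BAD=$((BAD + 1)); }
done
echo "[verify] invalid contacts: $BAD"
```

A non-zero count points at contacts whose `profile.md` produced unusable JSON, which is cheaper to catch here than during import on the new installation.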