loki-mode 5.40.1 → 5.42.0

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
package/SKILL.md CHANGED
@@ -3,7 +3,7 @@ name: loki-mode
  description: Multi-agent autonomous startup system. Triggers on "Loki Mode". Takes PRD to deployed product with zero human intervention. Requires --dangerously-skip-permissions flag.
  ---
 
- # Loki Mode v5.40.1
+ # Loki Mode v5.42.0
 
  **You are an autonomous agent. You make decisions. You do not ask questions. You do not stop.**
 
@@ -127,8 +127,8 @@ GROWTH ──[continuous improvement loop]──> GROWTH
  - Load only 1-2 skill modules at a time (from skills/00-index.md)
  - Use Task tool with subagents for exploration (isolates context)
  - IF context feels heavy: Create `.loki/signals/CONTEXT_CLEAR_REQUESTED`
- - **Context Window Tracking (v5.40.1):** Dashboard gauge, timeline, and per-agent breakdown at `GET /api/context`
- - **Notification Triggers (v5.40.1):** Configurable alerts when context exceeds thresholds, tasks fail, or budget limits hit. Manage via `GET/PUT /api/notifications/triggers`
+ - **Context Window Tracking (v5.40.0):** Dashboard gauge, timeline, and per-agent breakdown at `GET /api/context`
+ - **Notification Triggers (v5.40.0):** Configurable alerts when context exceeds thresholds, tasks fail, or budget limits hit. Manage via `GET/PUT /api/notifications/triggers`
 
  ---
 
@@ -258,8 +258,8 @@ The following features are documented in skill modules but not yet fully automat
  |---------|--------|-------|
  | PRE-ACT goal drift detection | Planned | Agent-level attention check before each action; no automated enforcement yet |
  | CONTINUITY.md working memory | Implemented (v5.35.0) | Auto-managed by run.sh, updated each iteration |
- | GitHub issue import | Planned | Config flags exist (`LOKI_GITHUB_IMPORT`); `gh` CLI integration partial |
+ | GitHub integration | Implemented (v5.42.0) | Import, sync-back, PR creation, export. CLI: `loki github`, API: `/api/github/*` |
  | Quality gates 3-reviewer system | Implemented (v5.35.0) | 5 specialist reviewers in `skills/quality-gates.md`; execution in run.sh |
  | Benchmarks (HumanEval, SWE-bench) | Infrastructure only | Runner scripts and datasets exist in `benchmarks/`; no published results |
 
- **v5.40.1 | fix: 46-bug audit across all distributions, JSON injection, Docker mounts, auth hardening | ~260 lines core**
+ **v5.42.0 | feat: GitHub sync-back, PR creation, export (fully wired) | ~260 lines core**
package/VERSION CHANGED
@@ -1 +1 @@
- 5.40.1
+ 5.42.0
package/autonomy/loki CHANGED
@@ -14,6 +14,7 @@
  # loki status - Show current status
  # loki dashboard - Open dashboard in browser
  # loki import - Import GitHub issues
+ # loki github [cmd] - GitHub integration (sync|export|pr|status)
  # loki help - Show this help
  #===============================================================================
 
@@ -323,6 +324,7 @@ show_help() {
  echo " notify [cmd] Send notifications (test|slack|discord|webhook|status)"
  echo " voice [cmd] Voice input for PRD creation (status|listen|dictate|speak|start)"
  echo " import Import GitHub issues as tasks"
+ echo " github [cmd] GitHub integration (sync|export|pr|status)"
  echo " config [cmd] Manage configuration (show|init|edit|path)"
  echo " completions [bash|zsh] Output shell completion scripts"
  echo " memory [cmd] Cross-project learnings (list|show|search|stats)"
@@ -515,7 +517,7 @@ cmd_start() {
  if [ -n "$prd_file" ]; then
    args+=("$prd_file")
  else
-   # No PRD file specified -- warn and confirm before consuming API credits
+   # No PRD file specified -- warn and confirm before starting
    # Auto-confirm in CI environments or when LOKI_AUTO_CONFIRM is set
    # LOKI_AUTO_CONFIRM takes precedence when explicitly set;
    # fall back to CI env var only when LOKI_AUTO_CONFIRM is unset
@@ -524,10 +526,10 @@ cmd_start() {
    echo -e "${YELLOW}Warning: No PRD file specified. Auto-confirming (CI mode).${NC}"
  else
    echo -e "${YELLOW}Warning: No PRD file specified.${NC}"
-   echo "Loki Mode will start autonomous execution in the current directory"
-   echo "without a requirements document."
+   echo "Loki Mode will analyze the existing codebase and generate"
+   echo "a PRD automatically. No requirements document needed."
    echo ""
-   echo -e "This will consume API credits. Continue? [y/N] \c"
+   echo -e "Continue? [y/N] \c"
    read -r confirm
    if [[ ! "$confirm" =~ ^[Yy] ]]; then
      echo "Aborted. Usage: loki start <path-to-prd.md>"
@@ -1832,6 +1834,183 @@ cmd_import() {
  fi
  }
 
+ # GitHub integration management (v5.41.0)
+ cmd_github() {
+   local subcmd="${1:-help}"
+   shift 2>/dev/null || true
+
+   case "$subcmd" in
+     sync)
+       # Sync completed tasks back to GitHub issues
+       if [ ! -d "$LOKI_DIR" ]; then
+         echo -e "${RED}No active Loki session found${NC}"
+         exit 1
+       fi
+
+       if ! command -v gh &>/dev/null; then
+         echo -e "${RED}Error: gh CLI not found. Install with: brew install gh${NC}"
+         exit 1
+       fi
+
+       if ! gh auth status &>/dev/null; then
+         echo -e "${RED}Error: gh CLI not authenticated. Run: gh auth login${NC}"
+         exit 1
+       fi
+
+       export LOKI_GITHUB_SYNC=true
+       source "$RUN_SH" 2>/dev/null || true
+
+       echo -e "${GREEN}Syncing completed tasks to GitHub...${NC}"
+       if type sync_github_completed_tasks &>/dev/null; then
+         sync_github_completed_tasks
+         echo -e "${GREEN}Sync complete.${NC}"
+
+         # Show what was synced
+         if [ -f "$LOKI_DIR/github/synced.log" ]; then
+           local count
+           count=$(wc -l < "$LOKI_DIR/github/synced.log" | tr -d ' ')
+           echo -e "${DIM}Total synced status updates: $count${NC}"
+         fi
+       else
+         echo -e "${YELLOW}Sync function not available.${NC}"
+       fi
+       ;;
+
+     export)
+       # Export local tasks as GitHub issues
+       if [ ! -d "$LOKI_DIR" ]; then
+         echo -e "${RED}No active Loki session found${NC}"
+         exit 1
+       fi
+
+       if ! command -v gh &>/dev/null; then
+         echo -e "${RED}Error: gh CLI not found. Install with: brew install gh${NC}"
+         exit 1
+       fi
+
+       echo -e "${GREEN}Exporting local tasks to GitHub issues...${NC}"
+       source "$RUN_SH" 2>/dev/null || true
+       if type export_tasks_to_github &>/dev/null; then
+         export_tasks_to_github
+         echo -e "${GREEN}Export complete.${NC}"
+       else
+         echo -e "${YELLOW}Export function not available.${NC}"
+       fi
+       ;;
+
+     pr)
+       # Create PR from completed work
+       if ! command -v gh &>/dev/null; then
+         echo -e "${RED}Error: gh CLI not found. Install with: brew install gh${NC}"
+         exit 1
+       fi
+
+       local feature_name="${1:-Loki Mode changes}"
+       export LOKI_GITHUB_PR=true
+       source "$RUN_SH" 2>/dev/null || true
+
+       echo -e "${GREEN}Creating pull request: $feature_name${NC}"
+       if type create_github_pr &>/dev/null; then
+         create_github_pr "$feature_name"
+       else
+         echo -e "${YELLOW}PR function not available.${NC}"
+       fi
+       ;;
+
+     status)
+       # Show GitHub integration status
+       echo -e "${BOLD}GitHub Integration Status${NC}"
+       echo ""
+
+       # gh CLI
+       if command -v gh &>/dev/null; then
+         echo -e " gh CLI: ${GREEN}installed$(gh --version 2>/dev/null | head -1 | sed 's/gh version /v/')${NC}"
+         if gh auth status &>/dev/null 2>&1; then
+           echo -e " Auth: ${GREEN}authenticated${NC}"
+         else
+           echo -e " Auth: ${RED}not authenticated${NC}"
+         fi
+       else
+         echo -e " gh CLI: ${RED}not installed${NC}"
+       fi
+
+       # Repo detection
+       local repo=""
+       repo=$(git remote get-url origin 2>/dev/null | sed 's|.*github.com[:/]||;s|\.git$||' || echo "")
+       if [ -n "$repo" ]; then
+         echo -e " Repository: ${GREEN}$repo${NC}"
+       else
+         echo -e " Repository: ${YELLOW}not detected${NC}"
+       fi
+
+       # Config flags
+       echo ""
+       echo -e "${BOLD}Configuration${NC}"
+       echo -e " LOKI_GITHUB_IMPORT: ${LOKI_GITHUB_IMPORT:-false}"
+       echo -e " LOKI_GITHUB_SYNC: ${LOKI_GITHUB_SYNC:-false}"
+       echo -e " LOKI_GITHUB_PR: ${LOKI_GITHUB_PR:-false}"
+       echo -e " LOKI_GITHUB_LABELS: ${LOKI_GITHUB_LABELS:-(all)}"
+       echo -e " LOKI_GITHUB_LIMIT: ${LOKI_GITHUB_LIMIT:-100}"
+
+       # Sync log
+       if [ -f "$LOKI_DIR/github/synced.log" ]; then
+         echo ""
+         echo -e "${BOLD}Sync History${NC}"
+         local total
+         total=$(wc -l < "$LOKI_DIR/github/synced.log" | tr -d ' ')
+         echo -e " Total synced updates: $total"
+         echo -e " Recent:"
+         tail -5 "$LOKI_DIR/github/synced.log" | sed 's/^/ /'
+       fi
+
+       # Imported tasks
+       if [ -f "$LOKI_DIR/queue/pending.json" ]; then
+         local gh_tasks
+         gh_tasks=$(python3 -c "
+ import json
+ try:
+     with open('$LOKI_DIR/queue/pending.json') as f:
+         data = json.load(f)
+     tasks = data.get('tasks', data) if isinstance(data, dict) else data
+     gh = [t for t in tasks if t.get('source') == 'github']
+     print(len(gh))
+ except: print(0)
+ " 2>/dev/null || echo "0")
+         echo ""
+         echo -e "${BOLD}Imported Issues${NC}"
+         echo -e " GitHub tasks in queue: $gh_tasks"
+       fi
+       ;;
+
+     help|*)
+       echo -e "${BOLD}loki github${NC} - GitHub integration management"
+       echo ""
+       echo "Commands:"
+       echo " status Show GitHub integration status"
+       echo " sync Sync completed task status back to GitHub issues"
+       echo " export Export local tasks as new GitHub issues"
+       echo " pr [name] Create pull request from completed work"
+       echo ""
+       echo "Environment Variables:"
+       echo " LOKI_GITHUB_IMPORT=true Import open issues as tasks on start"
+       echo " LOKI_GITHUB_SYNC=true Sync status back to issues during session"
+       echo " LOKI_GITHUB_PR=true Create PR when session completes successfully"
+       echo " LOKI_GITHUB_LABELS=bug Filter issues by label (comma-separated)"
+       echo " LOKI_GITHUB_MILESTONE=v2 Filter by milestone"
+       echo " LOKI_GITHUB_ASSIGNEE=me Filter by assignee"
+       echo " LOKI_GITHUB_LIMIT=50 Max issues to import (default: 100)"
+       echo " LOKI_GITHUB_PR_LABEL=loki Label to add to created PRs"
+       echo ""
+       echo "Examples:"
+       echo " loki github status"
+       echo " loki github sync"
+       echo " loki github export"
+       echo " loki github pr \"Add user authentication\""
+       echo " LOKI_GITHUB_SYNC=true loki start --github ./prd.md"
+       ;;
+   esac
+ }
+
  # Parse GitHub issue using issue-parser.sh
  cmd_issue_parse() {
  local issue_ref=""
@@ -4278,6 +4457,9 @@ main() {
  import)
    cmd_import
    ;;
+ github)
+   cmd_github "$@"
+   ;;
  issue)
    cmd_issue "$@"
    ;;
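The `loki github status` subcommand above counts imported issues by filtering the pending queue for tasks whose `source` is `github`, tolerating both queue-file shapes. A minimal Python sketch of that filter (the `count_github_tasks` helper name and sample payloads are illustrative, not part of the CLI):

```python
import json

def count_github_tasks(pending_json: str) -> int:
    """Count queue entries imported from GitHub. The queue file may be
    either a bare list of tasks or an object with a "tasks" key, so both
    shapes are handled, mirroring the inline python3 snippet above."""
    try:
        data = json.loads(pending_json)
    except json.JSONDecodeError:
        return 0
    tasks = data.get("tasks", data) if isinstance(data, dict) else data
    return sum(1 for t in tasks
               if isinstance(t, dict) and t.get("source") == "github")

# Hypothetical queue payloads in both supported shapes:
flat = '[{"id": "github-12", "source": "github"}, {"id": "t-1", "source": "prd"}]'
wrapped = '{"tasks": [{"id": "github-7", "source": "github"}]}'
print(count_github_tasks(flat), count_github_tasks(wrapped))  # → 1 1
```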
package/autonomy/run.sh CHANGED
@@ -1273,10 +1273,12 @@ create_github_pr() {
  local pr_body=".loki/reports/pr-body.md"
  mkdir -p "$(dirname "$pr_body")"
 
+ local version
+ version=$(cat "${SCRIPT_DIR%/*}/VERSION" 2>/dev/null || echo "unknown")
  cat > "$pr_body" << EOF
  ## Summary
 
- Automated implementation by Loki Mode v4.1.0
+ Automated implementation by Loki Mode v$version ($ITERATION_COUNT iterations, provider: ${PROVIDER_NAME:-claude})
 
  ### Feature: $feature_name
 
@@ -1349,24 +1351,105 @@ sync_github_status() {
    return 1
  fi
 
+ # Track synced issues to avoid duplicate comments
+ mkdir -p .loki/github
+ local sync_log=".loki/github/synced.log"
+ local sync_key="${issue_number}:${status}"
+ if [ -f "$sync_log" ] && grep -qF "$sync_key" "$sync_log" 2>/dev/null; then
+   return 0 # Already synced this status
+ fi
+
  case "$status" in
    "in_progress")
      gh issue comment "$issue_number" --repo "$repo" \
-       --body "Loki Mode: Task in progress - ${message:-implementing solution...}" \
+       --body "**Loki Mode** -- Working on this issue (iteration $ITERATION_COUNT)" \
        2>/dev/null || true
      ;;
    "completed")
+     local branch
+     branch=$(git rev-parse --abbrev-ref HEAD 2>/dev/null || echo "main")
+     local commit
+     commit=$(git rev-parse --short HEAD 2>/dev/null || echo "unknown")
      gh issue comment "$issue_number" --repo "$repo" \
-       --body "Loki Mode: Implementation complete. ${message:-}" \
+       --body "**Loki Mode** -- Implementation complete on \`$branch\` ($commit). ${message:-}" \
        2>/dev/null || true
      ;;
    "closed")
      gh issue close "$issue_number" --repo "$repo" \
        --reason "completed" \
-       --comment "Loki Mode: Fixed. ${message:-}" \
+       --comment "**Loki Mode** -- Resolved. ${message:-}" \
        2>/dev/null || true
      ;;
  esac
+
+ # Record sync to avoid duplicates
+ echo "$sync_key" >> "$sync_log"
+ }
+
+ # Sync all completed GitHub-sourced tasks back to their issues
+ # Called after each iteration and at session end
+ sync_github_completed_tasks() {
+   if [ "$GITHUB_SYNC" != "true" ]; then
+     return 0
+   fi
+
+   if ! check_github_cli; then
+     return 0
+   fi
+
+   local completed_file=".loki/queue/completed.json"
+   if [ ! -f "$completed_file" ]; then
+     return 0
+   fi
+
+   # Find GitHub-sourced tasks in completed queue that haven't been synced
+   python3 -c "
+ import json, sys
+ try:
+     with open('$completed_file') as f:
+         tasks = json.load(f)
+     for t in tasks:
+         tid = t.get('id', '')
+         if tid.startswith('github-'):
+             print(tid)
+ except Exception:
+     pass
+ " 2>/dev/null | while read -r task_id; do
+     sync_github_status "$task_id" "completed"
+   done
+ }
+
+ # Sync GitHub-sourced tasks currently in-progress
+ sync_github_in_progress_tasks() {
+   if [ "$GITHUB_SYNC" != "true" ]; then
+     return 0
+   fi
+
+   if ! check_github_cli; then
+     return 0
+   fi
+
+   local pending_file=".loki/queue/pending.json"
+   if [ ! -f "$pending_file" ]; then
+     return 0
+   fi
+
+   # Find GitHub-sourced tasks in pending queue (about to be worked on)
+   python3 -c "
+ import json
+ try:
+     with open('$pending_file') as f:
+         data = json.load(f)
+     tasks = data.get('tasks', data) if isinstance(data, dict) else data
+     for t in tasks:
+         tid = t.get('id', '')
+         if tid.startswith('github-'):
+             print(tid)
+ except Exception:
+     pass
+ " 2>/dev/null | while read -r task_id; do
+     sync_github_status "$task_id" "in_progress"
+   done
  }
 
  # Export tasks to GitHub issues (reverse sync)
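The dedup scheme above keys each GitHub update on `issue:status` and appends to a plain-text log, so re-running a sync is idempotent. A minimal Python sketch of the same scheme (the helper names and the temp-file usage are illustrative, not part of run.sh):

```python
import os
import tempfile

def already_synced(sync_log: str, issue_number: str, status: str) -> bool:
    """True if this issue/status pair was logged before (the grep -qF check)."""
    if not os.path.exists(sync_log):
        return False
    key = f"{issue_number}:{status}"
    with open(sync_log) as f:
        return any(key in line for line in f)

def record_sync(sync_log: str, issue_number: str, status: str) -> None:
    """Append the key so the next run skips this pair (the echo >> step)."""
    with open(sync_log, "a") as f:
        f.write(f"{issue_number}:{status}\n")

log = os.path.join(tempfile.mkdtemp(), "synced.log")
record_sync(log, "42", "completed")
print(already_synced(log, "42", "completed"))    # → True
print(already_synced(log, "42", "in_progress"))  # → False
```

Like the `grep -qF` original, this is a substring match rather than a whole-line match, so in principle a key can collide with a longer key that contains it; the `issue:status` shape makes that unlikely in practice.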
@@ -2622,6 +2705,14 @@ except: print('{\"total\":0,\"unacknowledged\":0}')
  "council": $council_state,
  "budget": $budget_json,
  "context": $context_state,
+ "tokens": $(python3 -c "
+ import json
+ try:
+     t = json.load(open('.loki/context/tracking.json'))
+     totals = t.get('totals', {})
+     print(json.dumps({'input': totals.get('total_input', 0), 'output': totals.get('total_output', 0), 'cost_usd': totals.get('total_cost_usd', 0)}))
+ except: print('null')
+ " 2>/dev/null || echo "null"),
  "notifications": $notification_summary
  }
 EOF
@@ -2774,6 +2865,9 @@ track_iteration_complete() {
    --context "{\"iteration\":$iteration,\"exit_code\":$exit_code}"
  fi
 
+ # Track context window usage FIRST to get token data (v5.42.0)
+ track_context_usage "$iteration"
+
  # Write efficiency tracking file for /api/cost endpoint
  mkdir -p .loki/metrics/efficiency
  local model_tier="sonnet"
@@ -2786,6 +2880,25 @@ track_iteration_complete() {
  fi
  local phase="${LAST_KNOWN_PHASE:-}"
  [ -z "$phase" ] && phase=$(python3 -c "import json; print(json.load(open('.loki/state/orchestrator.json')).get('currentPhase', 'unknown'))" 2>/dev/null || echo "unknown")
+
+ # Read token data from context tracker output (v5.42.0)
+ local iter_input=0 iter_output=0 iter_cost=0
+ if [ -f ".loki/context/tracking.json" ]; then
+   read iter_input iter_output iter_cost < <(python3 -c "
+ import json
+ try:
+     t = json.load(open('.loki/context/tracking.json'))
+     iters = t.get('per_iteration', [])
+     match = [i for i in iters if i.get('iteration') == $iteration]
+     if match:
+         m = match[-1]
+         print(m.get('input_tokens', 0), m.get('output_tokens', 0), m.get('cost_usd', 0))
+     else:
+         print(0, 0, 0)
+ except: print(0, 0, 0)
+ " 2>/dev/null || echo "0 0 0")
+ fi
+
  cat > ".loki/metrics/efficiency/iteration-${iteration}.json" << EFF_EOF
  {
  "iteration": $iteration,
@@ -2794,16 +2907,19 @@ track_iteration_complete() {
  "duration_ms": $duration_ms,
  "provider": "${PROVIDER_NAME:-claude}",
  "status": "$status_str",
+ "input_tokens": ${iter_input:-0},
+ "output_tokens": ${iter_output:-0},
+ "cost_usd": ${iter_cost:-0},
  "timestamp": "$(date -u +%Y-%m-%dT%H:%M:%SZ)"
  }
 EFF_EOF
 
- # Track context window usage (v5.40.0)
- track_context_usage "$iteration"
-
  # Check notification triggers (v5.40.0)
  check_notification_triggers "$iteration"
 
+ # Sync completed GitHub tasks back to issues (v5.41.0)
+ sync_github_completed_tasks
+
  # Get task from in-progress
  local in_progress_file=".loki/queue/in-progress.json"
  local completed_file=".loki/queue/completed.json"
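The token plumbing above pulls per-iteration counts out of `.loki/context/tracking.json`, taking the last matching row for an iteration. A Python sketch of that lookup, under the tracking-file shape implied by the inline snippet (the `iteration_tokens` helper name and sample data are illustrative):

```python
def iteration_tokens(tracking: dict, iteration: int) -> tuple:
    """Return (input_tokens, output_tokens, cost_usd) for one iteration,
    mirroring the inline python3 lookup; the last matching row wins, so
    a re-run that appends a duplicate row supersedes the earlier one."""
    match = [i for i in tracking.get("per_iteration", [])
             if i.get("iteration") == iteration]
    if not match:
        return (0, 0, 0)
    m = match[-1]
    return (m.get("input_tokens", 0), m.get("output_tokens", 0), m.get("cost_usd", 0))

# Hypothetical tracking data with a duplicate row for iteration 3:
tracking = {"per_iteration": [
    {"iteration": 3, "input_tokens": 100, "output_tokens": 40, "cost_usd": 0.002},
    {"iteration": 3, "input_tokens": 120, "output_tokens": 50, "cost_usd": 0.003},
]}
print(iteration_tokens(tracking, 3))  # → (120, 50, 0.003)
print(iteration_tokens(tracking, 9))  # → (0, 0, 0)
```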
@@ -7020,6 +7136,8 @@ main() {
  # Import GitHub issues if enabled (v4.1.0)
  if [ "$GITHUB_IMPORT" = "true" ]; then
    import_github_issues
+   # Notify GitHub that imported issues are being worked on (v5.41.0)
+   sync_github_in_progress_tasks
  fi
 
  # Start web dashboard (if enabled)
@@ -7109,6 +7227,14 @@ main() {
    run_autonomous "$PRD_PATH" || result=$?
  fi
 
+ # Final GitHub sync: sync all completed tasks and create PR (v5.41.0)
+ sync_github_completed_tasks
+ if [ "$GITHUB_PR" = "true" ] && [ "$result" = "0" ]; then
+   local feature_name="${PRD_PATH:-Codebase improvements}"
+   feature_name=$(basename "$feature_name" .md 2>/dev/null || echo "$feature_name")
+   create_github_pr "$feature_name"
+ fi
+
  # Extract and save learnings from this session
  extract_learnings_from_session
 
@@ -7,7 +7,7 @@ Modules:
  control: Session control API (start/stop/pause/resume)
  """
 
- __version__ = "5.40.1"
+ __version__ = "5.42.0"
 
  # Expose the control app for easy import
  try:
@@ -1500,22 +1500,61 @@ async def get_memory_timeline():
 
 
  # Learning/metrics endpoints
+
+
+ def _read_learning_signals(signal_type: Optional[str] = None, limit: int = 50) -> list:
+     """Read learning signals from .loki/learning/signals/*.json files.
+
+     Learning signals are written as individual JSON files by the learning emitter
+     (learning/emitter.py). Each file contains a single signal object with fields:
+     id, type, source, action, timestamp, confidence, outcome, data, context.
+     """
+     signals_dir = _get_loki_dir() / "learning" / "signals"
+     if not signals_dir.exists() or not signals_dir.is_dir():
+         return []
+
+     signals = []
+     try:
+         for fpath in signals_dir.glob("*.json"):
+             try:
+                 raw = fpath.read_text()
+                 if not raw.strip():
+                     continue
+                 sig = json.loads(raw)
+                 if signal_type and sig.get("type") != signal_type:
+                     continue
+                 signals.append(sig)
+             except (json.JSONDecodeError, OSError):
+                 continue
+     except OSError:
+         return []
+
+     # Sort by timestamp descending (newest first)
+     signals.sort(key=lambda s: s.get("timestamp", ""), reverse=True)
+     return signals[:limit]
+
+
  @app.get("/api/learning/metrics")
  async def get_learning_metrics(
      timeRange: str = "7d",
      signalType: Optional[str] = None,
      source: Optional[str] = None,
  ):
-     """Get learning metrics from events and metrics files."""
+     """Get learning metrics from events, metrics files, and learning signals."""
      events = _read_events(timeRange)
 
+     # Also read from learning signals directory
+     all_signals = _read_learning_signals(limit=10000)
+
      # Filter by type and source
      if signalType:
          events = [e for e in events if e.get("data", {}).get("type") == signalType]
+         all_signals = [s for s in all_signals if s.get("type") == signalType]
      if source:
          events = [e for e in events if e.get("data", {}).get("source") == source]
+         all_signals = [s for s in all_signals if s.get("source") == source]
 
-     # Count by type
+     # Count by type from events.jsonl
      by_type: dict = {}
      by_source: dict = {}
      for e in events:
@@ -1524,6 +1563,19 @@ async def get_learning_metrics(
          s = e.get("data", {}).get("source", "unknown")
          by_source[s] = by_source.get(s, 0) + 1
 
+     # Merge counts from learning signals directory
+     for s in all_signals:
+         t = s.get("type", "unknown")
+         by_type[t] = by_type.get(t, 0) + 1
+         src = s.get("source", "unknown")
+         by_source[src] = by_source.get(src, 0) + 1
+
+     total_count = len(events) + len(all_signals)
+
+     # Calculate average confidence across both sources
+     total_conf = sum(e.get("data", {}).get("confidence", 0) for e in events)
+     total_conf += sum(s.get("confidence", 0) for s in all_signals)
+
      # Load aggregation data from file if available
      aggregation = {
          "preferences": [],
@@ -1543,10 +1595,10 @@ async def get_learning_metrics(
          pass
 
      return {
-         "totalSignals": len(events),
+         "totalSignals": total_count,
          "signalsByType": by_type,
          "signalsBySource": by_source,
-         "avgConfidence": round(sum(e.get("data", {}).get("confidence", 0) for e in events) / max(len(events), 1), 4),
+         "avgConfidence": round(total_conf / max(total_count, 1), 4),
          "aggregation": aggregation,
      }
 
@@ -1579,25 +1631,107 @@ async def get_learning_signals(
      limit: int = 50,
      offset: int = 0,
  ):
-     """Get raw learning signals."""
+     """Get raw learning signals from both events.jsonl and learning signals directory."""
      events = _read_events(timeRange)
      if signalType:
          events = [e for e in events if e.get("type") == signalType]
      if source:
          events = [e for e in events if e.get("data", {}).get("source") == source]
-     return events[offset:offset + limit]
+
+     # Also read from learning signals directory
+     file_signals = _read_learning_signals(signal_type=signalType, limit=10000)
+     if source:
+         file_signals = [s for s in file_signals if s.get("source") == source]
+
+     # Merge and sort by timestamp (newest first)
+     combined = events + file_signals
+     combined.sort(key=lambda s: s.get("timestamp", ""), reverse=True)
+     return combined[offset:offset + limit]
 
 
  @app.get("/api/learning/aggregation")
  async def get_learning_aggregation():
-     """Get latest learning aggregation result."""
+     """Get latest learning aggregation result, merging file-based aggregation with live signals."""
+     result = {"preferences": [], "error_patterns": [], "success_patterns": [], "tool_efficiencies": []}
+
+     # Load pre-computed aggregation from file if available
      agg_file = _get_loki_dir() / "metrics" / "aggregation.json"
      if agg_file.exists():
          try:
-             return json.loads(agg_file.read_text())
+             result = json.loads(agg_file.read_text())
          except Exception:
              pass
-     return {"preferences": [], "error_patterns": [], "success_patterns": [], "tool_efficiencies": []}
+
+     # Supplement with live data from learning signals directory
+     success_signals = _read_learning_signals(signal_type="success_pattern", limit=500)
+     tool_signals = _read_learning_signals(signal_type="tool_efficiency", limit=500)
+     error_signals = _read_learning_signals(signal_type="error_pattern", limit=500)
+     pref_signals = _read_learning_signals(signal_type="user_preference", limit=500)
+
+     # Merge success patterns from signals if aggregation file had none
+     if not result.get("success_patterns") and success_signals:
+         pattern_counts: dict = {}
+         for s in success_signals:
+             name = s.get("data", {}).get("pattern_name", s.get("action", "unknown"))
+             pattern_counts[name] = pattern_counts.get(name, 0) + 1
+         result["success_patterns"] = [
+             {"pattern_name": k, "frequency": v, "confidence": min(1.0, v / 10)}
+             for k, v in sorted(pattern_counts.items(), key=lambda x: -x[1])
+         ]
+
+     # Merge tool efficiencies from signals if aggregation file had none
+     if not result.get("tool_efficiencies") and tool_signals:
+         tool_stats: dict = {}
+         for s in tool_signals:
+             data = s.get("data", {})
+             tool_name = data.get("tool_name", s.get("action", "unknown"))
+             if tool_name not in tool_stats:
+                 tool_stats[tool_name] = {"count": 0, "total_ms": 0, "successes": 0}
+             tool_stats[tool_name]["count"] += 1
+             tool_stats[tool_name]["total_ms"] += data.get("duration_ms", 0)
+             if data.get("success", s.get("outcome") == "success"):
+                 tool_stats[tool_name]["successes"] += 1
+         result["tool_efficiencies"] = []
+         for tname, stats in sorted(tool_stats.items(), key=lambda x: -x[1]["count"]):
+             avg_ms = stats["total_ms"] / stats["count"] if stats["count"] else 0
+             sr = round(stats["successes"] / stats["count"], 4) if stats["count"] else 0
+             result["tool_efficiencies"].append({
+                 "tool_name": tname, "efficiency_score": sr,
+                 "count": stats["count"], "avg_execution_time_ms": round(avg_ms, 2),
+                 "success_rate": sr,
+             })
+
+     # Merge error patterns from signals if aggregation file had none
+     if not result.get("error_patterns") and error_signals:
+         error_counts: dict = {}
+         for s in error_signals:
+             etype = s.get("data", {}).get("error_type", s.get("action", "unknown"))
+             error_counts[etype] = error_counts.get(etype, 0) + 1
+         result["error_patterns"] = [
+             {"error_type": k, "resolution_rate": 0.0, "frequency": v, "confidence": min(1.0, v / 10)}
+             for k, v in sorted(error_counts.items(), key=lambda x: -x[1])
+         ]
+
+     # Merge preferences from signals if aggregation file had none
+     if not result.get("preferences") and pref_signals:
+         pref_counts: dict = {}
+         for s in pref_signals:
+             key = s.get("data", {}).get("preference_key", s.get("action", "unknown"))
+             pref_counts[key] = pref_counts.get(key, 0) + 1
+         result["preferences"] = [
+             {"preference_key": k, "preferred_value": k, "frequency": v, "confidence": min(1.0, v / 10)}
+             for k, v in sorted(pref_counts.items(), key=lambda x: -x[1])
+         ]
+
+     # Add signal counts summary
+     result["signal_counts"] = {
+         "success_patterns": len(success_signals),
+         "tool_efficiency": len(tool_signals),
+         "error_patterns": len(error_signals),
+         "user_preferences": len(pref_signals),
+     }
+
+     return result
 
 
  @app.post("/api/learning/aggregate", dependencies=[Depends(auth.require_scope("control"))])
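Each of the fallback merges in the aggregation endpoint follows the same shape: count occurrences, rank by frequency, and derive confidence as `min(1.0, frequency / 10)`. A standalone sketch of that pattern for success signals (the `frequency_patterns` name and sample signals are illustrative):

```python
def frequency_patterns(signals: list) -> list:
    """Collapse raw success_pattern signals into ranked entries, using the
    min(1.0, freq / 10) confidence heuristic from the endpoint: confidence
    grows linearly with repetition and saturates at 10 sightings."""
    counts: dict = {}
    for s in signals:
        # Prefer the explicit pattern name; fall back to the signal's action
        name = s.get("data", {}).get("pattern_name", s.get("action", "unknown"))
        counts[name] = counts.get(name, 0) + 1
    return [
        {"pattern_name": k, "frequency": v, "confidence": min(1.0, v / 10)}
        for k, v in sorted(counts.items(), key=lambda x: -x[1])
    ]

# Hypothetical signals: "tdd" seen twice, "small-diffs" once (no data key)
sigs = [
    {"data": {"pattern_name": "tdd"}},
    {"data": {"pattern_name": "tdd"}},
    {"action": "small-diffs"},
]
print(frequency_patterns(sigs))
# → [{'pattern_name': 'tdd', 'frequency': 2, 'confidence': 0.2},
#    {'pattern_name': 'small-diffs', 'frequency': 1, 'confidence': 0.1}]
```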
@@ -1690,34 +1824,50 @@ async def trigger_aggregation():
1690
1824
 
1691
1825
  @app.get("/api/learning/preferences")
1692
1826
  async def get_learning_preferences(limit: int = 50):
1693
- """Get aggregated user preferences."""
1827
+ """Get aggregated user preferences from events and learning signals directory."""
1694
1828
  events = _read_events("30d")
1695
1829
  prefs = [e for e in events if e.get("type") == "user_preference"]
1696
- return prefs[:limit]
1830
+ # Also read from learning signals directory
1831
+ file_prefs = _read_learning_signals(signal_type="user_preference", limit=limit)
1832
+ combined = prefs + file_prefs
1833
+ combined.sort(key=lambda s: s.get("timestamp", ""), reverse=True)
1834
+ return combined[:limit]
 
 
 @app.get("/api/learning/errors")
 async def get_learning_errors(limit: int = 50):
-    """Get aggregated error patterns."""
+    """Get aggregated error patterns from events and learning signals directory."""
     events = _read_events("30d")
     errors = [e for e in events if e.get("type") == "error_pattern"]
-    return errors[:limit]
+    # Also read from learning signals directory
+    file_errors = _read_learning_signals(signal_type="error_pattern", limit=limit)
+    combined = errors + file_errors
+    combined.sort(key=lambda s: s.get("timestamp", ""), reverse=True)
+    return combined[:limit]
 
 
 @app.get("/api/learning/success")
 async def get_learning_success(limit: int = 50):
-    """Get aggregated success patterns."""
+    """Get aggregated success patterns from events and learning signals directory."""
     events = _read_events("30d")
     successes = [e for e in events if e.get("type") == "success_pattern"]
-    return successes[:limit]
+    # Also read from learning signals directory
+    file_successes = _read_learning_signals(signal_type="success_pattern", limit=limit)
+    combined = successes + file_successes
+    combined.sort(key=lambda s: s.get("timestamp", ""), reverse=True)
+    return combined[:limit]
 
 
 @app.get("/api/learning/tools")
 async def get_tool_efficiency(limit: int = 50):
-    """Get tool efficiency rankings."""
+    """Get tool efficiency rankings from events and learning signals directory."""
     events = _read_events("30d")
     tools = [e for e in events if e.get("type") == "tool_efficiency"]
-    return tools[:limit]
+    # Also read from learning signals directory
+    file_tools = _read_learning_signals(signal_type="tool_efficiency", limit=limit)
+    combined = tools + file_tools
+    combined.sort(key=lambda s: s.get("timestamp", ""), reverse=True)
+    return combined[:limit]
 
 
 def _parse_time_range(time_range: str) -> Optional[datetime]:
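The three endpoints above share one merge pattern: concatenate event-stream signals with file-based signals, sort by ISO-8601 timestamp string (newest first), and truncate. A minimal standalone sketch of that pattern (the `merge_signals` name is illustrative, not a package symbol):

```python
# Sketch of the merge used by the learning endpoints: ISO-8601 timestamps
# sort correctly as plain strings, so no datetime parsing is needed.
def merge_signals(events: list[dict], file_signals: list[dict], limit: int = 50) -> list[dict]:
    combined = events + file_signals
    combined.sort(key=lambda s: s.get("timestamp", ""), reverse=True)  # newest first
    return combined[:limit]

print(merge_signals(
    [{"timestamp": "2025-01-02T00:00:00Z"}],
    [{"timestamp": "2025-01-03T00:00:00Z"}, {"timestamp": "2025-01-01T00:00:00Z"}],
    limit=2,
))
```

Signals missing a `timestamp` key sort last rather than raising, which matches the endpoints' defensive `.get("timestamp", "")`.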
@@ -1957,24 +2107,28 @@ async def get_cost():
     except (json.JSONDecodeError, KeyError, TypeError):
         pass
 
-    # Also check dashboard-state.json for token data if efficiency dir is empty
+    # Fallback: read from context tracking if efficiency files have no token data
     if total_input == 0 and total_output == 0:
-        state_file = loki_dir / "dashboard-state.json"
-        if state_file.exists():
+        ctx_file = loki_dir / "context" / "tracking.json"
+        if ctx_file.exists():
             try:
-                state = json.loads(state_file.read_text())
-                tokens = state.get("tokens", {})
-                total_input = tokens.get("input", 0)
-                total_output = tokens.get("output", 0)
-                model = state.get("model", "sonnet").lower()
+                ctx = json.loads(ctx_file.read_text())
+                totals = ctx.get("totals", {})
+                total_input = totals.get("total_input", 0)
+                total_output = totals.get("total_output", 0)
                 if total_input > 0 or total_output > 0:
-                    estimated_cost = _calculate_model_cost(model, total_input, total_output)
-                    if model not in by_model:
-                        by_model[model] = {
-                            "input_tokens": total_input,
-                            "output_tokens": total_output,
-                            "cost_usd": estimated_cost,
-                        }
+                    estimated_cost = totals.get("total_cost_usd", 0.0)
+                    # Rebuild by_model and by_phase from per_iteration data
+                    for it in ctx.get("per_iteration", []):
+                        inp = it.get("input_tokens", 0)
+                        out = it.get("output_tokens", 0)
+                        cost = it.get("cost_usd", 0)
+                        model = ctx.get("provider", "sonnet").lower()
+                        if model not in by_model:
+                            by_model[model] = {"input_tokens": 0, "output_tokens": 0, "cost_usd": 0.0}
+                        by_model[model]["input_tokens"] += inp
+                        by_model[model]["output_tokens"] += out
+                        by_model[model]["cost_usd"] += cost
             except (json.JSONDecodeError, KeyError):
                 pass
 
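The fallback aggregation in the hunk above can be checked in isolation. A sketch of the per-iteration roll-up, with illustrative input shaped like the `provider`/`per_iteration` fields the hunk reads from `tracking.json` (the `rebuild_by_model` helper name is hypothetical):

```python
# Mirror of the get_cost() fallback loop: token counts and cost from each
# iteration are summed into one bucket keyed by the tracker's "provider".
def rebuild_by_model(ctx: dict) -> dict:
    by_model: dict = {}
    model = ctx.get("provider", "sonnet").lower()  # same default as the hunk
    for it in ctx.get("per_iteration", []):
        if model not in by_model:
            by_model[model] = {"input_tokens": 0, "output_tokens": 0, "cost_usd": 0.0}
        by_model[model]["input_tokens"] += it.get("input_tokens", 0)
        by_model[model]["output_tokens"] += it.get("output_tokens", 0)
        by_model[model]["cost_usd"] += it.get("cost_usd", 0)
    return by_model

print(rebuild_by_model({"provider": "Sonnet", "per_iteration": [
    {"input_tokens": 100, "output_tokens": 20, "cost_usd": 0.01},
    {"input_tokens": 300, "output_tokens": 50, "cost_usd": 0.03},
]}))
```

Note the sketch hoists the `model` lookup out of the loop; the hunk computes it per iteration, but since it reads a single top-level `provider` field the result is identical.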
@@ -2789,6 +2943,130 @@ async def get_secrets_status():
     }
 
 
+# =============================================================================
+# GitHub Integration API (v5.41.0)
+# =============================================================================
+
+
+@app.get("/api/github/status")
+async def get_github_status(token: Optional[dict] = Depends(auth.get_current_token)):
+    """Get GitHub integration status and configuration."""
+    loki_dir = _get_loki_dir()
+    result: dict[str, Any] = {
+        "import_enabled": os.environ.get("LOKI_GITHUB_IMPORT", "false") == "true",
+        "sync_enabled": os.environ.get("LOKI_GITHUB_SYNC", "false") == "true",
+        "pr_enabled": os.environ.get("LOKI_GITHUB_PR", "false") == "true",
+        "labels_filter": os.environ.get("LOKI_GITHUB_LABELS", ""),
+        "milestone_filter": os.environ.get("LOKI_GITHUB_MILESTONE", ""),
+        "limit": int(os.environ.get("LOKI_GITHUB_LIMIT", "100")),
+        "imported_tasks": 0,
+        "synced_updates": 0,
+        "repo": None,
+    }
+
+    # Count imported GitHub tasks from pending queue
+    pending_file = loki_dir / "queue" / "pending.json"
+    if pending_file.exists():
+        try:
+            data = json.loads(pending_file.read_text())
+            tasks = data.get("tasks", data) if isinstance(data, dict) else data
+            result["imported_tasks"] = sum(1 for t in tasks if t.get("source") == "github")
+        except Exception:
+            pass
+
+    # Count sync log entries
+    sync_log = loki_dir / "github" / "synced.log"
+    if sync_log.exists():
+        try:
+            result["synced_updates"] = sum(1 for _ in sync_log.open())
+        except Exception:
+            pass
+
+    # Detect repo from git
+    try:
+        import subprocess
+        url = subprocess.run(
+            ["git", "remote", "get-url", "origin"],
+            capture_output=True, text=True, timeout=5,
+            cwd=str(loki_dir.parent) if loki_dir.name == ".loki" else None
+        )
+        if url.returncode == 0:
+            repo = url.stdout.strip()
+            # Parse owner/repo from URL
+            for prefix in ["https://github.com/", "git@github.com:"]:
+                if repo.startswith(prefix):
+                    repo = repo[len(prefix):]
+                    break
+            result["repo"] = repo.removesuffix(".git")
+    except Exception:
+        pass
+
+    return result
+
+
+@app.get("/api/github/tasks")
+async def get_github_tasks(token: Optional[dict] = Depends(auth.get_current_token)):
+    """Get all GitHub-sourced tasks and their sync status."""
+    loki_dir = _get_loki_dir()
+    tasks: list[dict] = []
+
+    # Collect GitHub tasks from all queues
+    for queue_name in ["pending", "in-progress", "completed", "failed"]:
+        queue_file = loki_dir / "queue" / f"{queue_name}.json"
+        if queue_file.exists():
+            try:
+                data = json.loads(queue_file.read_text())
+                items = data.get("tasks", data) if isinstance(data, dict) else data
+                for t in items:
+                    if t.get("source") == "github" or str(t.get("id", "")).startswith("github-"):
+                        t["queue"] = queue_name
+                        tasks.append(t)
+            except Exception:
+                pass
+
+    # Load sync log to annotate sync status
+    synced: set[str] = set()
+    sync_log = loki_dir / "github" / "synced.log"
+    if sync_log.exists():
+        try:
+            synced = set(sync_log.read_text().strip().splitlines())
+        except Exception:
+            pass
+
+    for t in tasks:
+        issue_num = str(t.get("github_issue", ""))
+        if not issue_num:
+            issue_num = str(t.get("id", "")).replace("github-", "")
+        t["synced_statuses"] = [
+            s.split(":")[1] for s in synced if s.startswith(f"{issue_num}:")
+        ]
+
+    return {"tasks": tasks, "total": len(tasks)}
+
+
+@app.get("/api/github/sync-log")
+async def get_github_sync_log(
+    limit: int = Query(default=50, ge=1, le=500),
+    token: Optional[dict] = Depends(auth.get_current_token)
+):
+    """Get the GitHub sync log (status updates sent to issues)."""
+    loki_dir = _get_loki_dir()
+    sync_log = loki_dir / "github" / "synced.log"
+    entries: list[dict] = []
+
+    if sync_log.exists():
+        try:
+            lines = sync_log.read_text().strip().splitlines()
+            for line in lines[-limit:]:
+                parts = line.split(":", 1)
+                if len(parts) == 2:
+                    entries.append({"issue": parts[0], "status": parts[1]})
+        except Exception:
+            pass
+
+    return {"entries": entries, "total": len(entries)}
+
+
 # =============================================================================
 # Process Health / Watchdog API
 # =============================================================================
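The `/api/github/sync-log` handler added above reduces to a small pure parser over the `issue:status` line format of `synced.log`. A standalone sketch (`parse_sync_log` is an illustrative name, not a package symbol):

```python
# Standalone sketch of the sync-log parsing: one "issue:status" pair per
# line; only the last `limit` lines are kept, in file order; malformed
# lines (no colon) are skipped, mirroring the endpoint's len(parts) check.
def parse_sync_log(text: str, limit: int = 50) -> list[dict]:
    entries = []
    for line in text.strip().splitlines()[-limit:]:
        parts = line.split(":", 1)  # split once: status may itself contain ":"
        if len(parts) == 2:
            entries.append({"issue": parts[0], "status": parts[1]})
    return entries

print(parse_sync_log("42:in_progress\n42:completed\nmalformed line"))
# → [{'issue': '42', 'status': 'in_progress'}, {'issue': '42', 'status': 'completed'}]
```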
@@ -2,7 +2,7 @@
 
 Complete installation instructions for all platforms and use cases.
 
-**Version:** v5.40.1
+**Version:** v5.42.0
 
 ---
 
package/mcp/__init__.py CHANGED
@@ -21,4 +21,4 @@ try:
 except ImportError:
     __all__ = ['mcp']
 
-__version__ = '5.40.1'
+__version__ = '5.42.0'
package/package.json CHANGED
@@ -1,6 +1,6 @@
 {
   "name": "loki-mode",
-  "version": "5.40.1",
+  "version": "5.42.0",
   "description": "Multi-agent autonomous startup system for Claude Code, Codex CLI, and Gemini CLI",
   "keywords": [
     "claude",
@@ -1,6 +1,6 @@
-# GitHub Integration (v5.25.0)
+# GitHub Integration (v5.41.0)
 
-**When:** Importing issues from GitHub, creating PRs, syncing task status
+**When:** Importing issues from GitHub, creating PRs, syncing task status back
 
 > **Requires:** `gh` CLI authenticated (`gh auth status`)
 
@@ -10,9 +10,13 @@
 
 | Action | Command | Result |
 |--------|---------|--------|
-| Import issues as tasks | `LOKI_GITHUB_IMPORT=true` | Fetches open issues, creates pending tasks |
+| Import issues as tasks | `loki start --github` or `LOKI_GITHUB_IMPORT=true` | Fetches open issues, creates pending tasks |
 | Create PR on completion | `LOKI_GITHUB_PR=true` | Auto-creates PR with task summaries |
-| Sync status back | `LOKI_GITHUB_SYNC=true` | Comments progress on source issues |
+| Sync status back | `LOKI_GITHUB_SYNC=true` | Comments progress on source issues (deduplicated) |
+| Manual sync | `loki github sync` | Sync completed tasks to GitHub now |
+| Export tasks | `loki github export` | Create GitHub issues from local tasks |
+| Manual PR | `loki github pr "feature name"` | Create PR from current work |
+| Check status | `loki github status` | Show config, sync history, imported count |
 | Import from URL | `LOKI_GITHUB_REPO=owner/repo` | Specify repo if not auto-detected |
 
 ---
@@ -167,13 +171,41 @@ LOKI_GITHUB_IMPORT=true \
 
 ---
 
-## Integration with Dashboard
+## CLI Commands
 
-The dashboard shows GitHub-sourced tasks with:
-- GitHub icon badge
-- Direct link to issue
-- Sync status indicator
-- "Import from GitHub" button (calls `gh issue list`)
+```bash
+# Check GitHub integration status
+loki github status
+
+# Sync completed task statuses back to GitHub issues
+loki github sync
+
+# Export local tasks as new GitHub issues
+loki github export
+
+# Create PR from completed work
+loki github pr "Add user authentication"
+```
+
+---
+
+## Dashboard API
+
+| Endpoint | Method | Description |
+|----------|--------|-------------|
+| `/api/github/status` | GET | Integration config, repo, sync count |
+| `/api/github/tasks` | GET | All GitHub-sourced tasks with sync status |
+| `/api/github/sync-log` | GET | History of status updates sent to issues |
+
+---
+
+## Sync Behavior
+
+- **On session start** (`LOKI_GITHUB_IMPORT=true`): Imports issues, posts "in_progress" comment
+- **After each iteration** (`LOKI_GITHUB_SYNC=true`): Syncs completed GitHub tasks
+- **On session end** (`LOKI_GITHUB_PR=true`): Final sync + creates PR with `Closes #N` references
+- **Deduplication**: Sync log at `.loki/github/synced.log` prevents duplicate comments
+- **Manual**: `loki github sync` can be run anytime outside a session
 
 ---
 
@@ -215,4 +247,4 @@ gh repo set-default owner/repo
 
 ---
 
-**v5.25.0 | GitHub Integration | ~100 lines**
+**v5.41.0 | GitHub Integration (full sync-back) | ~250 lines**