nemar-cli 0.3.4-dev.62 → 0.3.4-dev.64

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
Files changed (3)
  1. package/README.md +120 -10
  2. package/dist/index.js +1 -1
  3. package/package.json +1 -1
package/README.md CHANGED
@@ -10,7 +10,13 @@ Command-line interface for [NEMAR](https://nemar.org) (Neuroelectromagnetic Data
  ## Features

  - **Dataset Management**: Upload, download, validate, and version BIDS datasets
- - **PR-Based Versioning**: All changes require pull requests (main branch protected)
+ - **Resume Uploads**: Failed uploads can be resumed; CLI stores state in `.nemar/config.json`
+ - **Smart Authentication**: Verifies GitHub CLI authentication matches NEMAR user
+ - **Auto-Accept Invitations**: Automatically accepts GitHub collaboration invitations
+ - **IAM Retry Logic**: Handles AWS IAM eventual consistency with automatic retries
+ - **Commit Authorship**: Commits attributed to your NEMAR user identity
+ - **Private First**: New datasets are private; branch protection applied only on DOI creation
+ - **PR-Based Updates**: After DOI, all changes require pull requests
  - **Collaborative**: Any NEMAR user can contribute to any dataset
  - **BIDS Validation**: Automatic validation before upload and on PRs
  - **DOI Integration**: Zenodo DOI creation for dataset versioning
@@ -37,18 +43,25 @@ For dataset operations:

  - [DataLad](https://www.datalad.org/) and git-annex
  - [Deno](https://deno.land/) (for BIDS validation)
- - GitHub account (for PR collaboration)
+ - [GitHub CLI](https://cli.github.com/) (`gh`) - authenticated as your NEMAR user
+ - SSH key registered with GitHub

  ```bash
  # macOS
- brew install datalad git-annex deno
+ brew install datalad git-annex deno gh

  # Ubuntu/Debian
  sudo apt-get install git-annex
  pip install datalad
  curl -fsSL https://deno.land/install.sh | sh
+ # Install gh: https://github.com/cli/cli/blob/trunk/docs/install_linux.md
+
+ # Authenticate GitHub CLI (required for upload)
+ gh auth login
  ```

+ **Important:** The GitHub account authenticated with `gh` must match your NEMAR username. The CLI verifies this before upload.
+
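A quick way to confirm which account `gh` is currently using before an upload (both commands are standard GitHub CLI; the login printed should equal your NEMAR username):

```bash
# Show the active GitHub CLI session and account
gh auth status

# Print just the authenticated login name
gh api user --jq .login
```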
  ## Quick Start

  ```bash
@@ -131,17 +144,42 @@ sequenceDiagram

  U->>CLI: nemar dataset upload /path
  CLI->>CLI: Validate BIDS locally
+ CLI->>GH: Verify gh CLI authentication
  CLI->>API: POST /datasets/create
  API->>API: Assign dataset ID (nm000XXX)
+ API->>API: Create IAM credentials
  API->>GH: Create repo (Admin PAT)
  API->>GH: Add user as collaborator
- API->>GH: Set branch protection
  API-->>CLI: Dataset ID + presigned URLs
- CLI->>S3: Upload data files
+ CLI->>GH: Auto-accept invitation (gh API)
+ CLI->>CLI: Wait for IAM propagation
+ CLI->>S3: Upload data files (with retry)
+ CLI->>GH: Commit with user identity
  CLI->>GH: Push via DataLad
  CLI-->>U: Success! URLs provided
  ```

+ **Note:** Branch protection is NOT applied during initial upload. Private datasets allow direct pushes to main. Protection is applied when creating a DOI (permanent record).
+
+ ### Resume Failed Uploads
+
+ If an upload fails (network issues, S3 errors), you can resume:
+
+ ```bash
+ # Just run upload again - CLI detects existing dataset
+ nemar dataset upload /path/to/dataset
+ ```
+
+ The CLI stores dataset metadata in `.nemar/config.json` within your dataset directory. On resume:
+ 1. Detects existing dataset ID from local config
+ 2. Requests fresh presigned URLs from backend
+ 3. Re-uploads files (git-annex handles duplicates)
+
+ To start fresh (new dataset ID), remove the config:
+ ```bash
+ rm -rf /path/to/dataset/.nemar
+ ```
+
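If you are unsure whether resume state exists, the file the CLI writes can simply be inspected; the path is documented above, while the keys in the commented example are illustrative only, not the actual schema:

```bash
# Look for stored resume state from a previous upload attempt
cat /path/to/dataset/.nemar/config.json
# Hypothetical example output (real field names may differ):
# {"datasetId": "nm000123", ...}
```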
  ### Pull Request Workflow (Contributing to Dataset)

  ```mermaid
@@ -280,11 +318,13 @@ nemar admin doi create <dataset-id> # Create concept DOI

  ### Key Principles

- 1. **PR-Mandatory**: Main branch is protected; all changes require PRs
- 2. **Collaborative**: Any NEMAR user can create PRs on any dataset
- 3. **Owner Approval**: Only dataset owner (or admin) can merge PRs
- 4. **No Deletion**: Users cannot delete repositories or S3 data
- 5. **Audit Trail**: All changes tracked via PR history
+ 1. **Private First**: New datasets are private; owners can push directly to main
+ 2. **Protection on DOI**: Branch protection applied when creating a DOI (permanent record)
+ 3. **PR-Based Updates**: After DOI creation, all changes require pull requests
+ 4. **Collaborative**: Any NEMAR user can create PRs on any dataset
+ 5. **Owner Approval**: Only dataset owner (or admin) can merge PRs
+ 6. **No Deletion**: Users cannot delete repositories or S3 data
+ 7. **Audit Trail**: All changes tracked via PR history

  ## Storage Architecture

@@ -317,6 +357,76 @@ NEMAR_API_URL # Custom API endpoint (default: https://api.nemar.org)
  NEMAR_NO_COLOR # Disable colored output
  ```
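As a usage sketch, both variables can be set per invocation; the endpoint below is a placeholder, and treating `NEMAR_NO_COLOR` as a simple flag set to `1` is an assumption:

```bash
# Point the CLI at a non-default API endpoint for a single command
# (https://api.example.org is a placeholder URL)
NEMAR_API_URL=https://api.example.org nemar auth status

# Disable colored output, e.g. in CI logs ("1" assumed to enable the flag)
NEMAR_NO_COLOR=1 nemar sandbox status
```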

+ ## Troubleshooting
+
+ ### Upload Issues
+
+ **"GitHub CLI not authenticated" or "gh CLI username mismatch"**
+ ```bash
+ # Login to GitHub CLI with your NEMAR account
+ gh auth login
+ # Verify the authenticated username matches your NEMAR username
+ gh auth status
+ ```
+
+ The CLI verifies `gh` is authenticated as your NEMAR user to prevent permission issues.
+
+ **"S3 upload failed" or "AccessDenied (403)"**
+ - AWS IAM policy changes take 10-30 seconds to propagate globally
+ - The CLI has built-in retry logic (4 retries with progressive delays)
+ - If retries fail, wait 30 seconds and run upload again
+ - Admin users don't hit this issue (full bucket access)
+
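When the built-in retries are exhausted, the simplest recovery mirrors the advice above: give IAM a moment to propagate, then resume the same upload:

```bash
# Wait out IAM propagation, then resume the interrupted upload
sleep 30
nemar dataset upload /path/to/dataset
```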
+ **"Failed to accept GitHub invitation"**
+ - The CLI auto-accepts repo invitations, but `gh` must be authenticated as the invited user
+ - Manually accept at: https://github.com/nemarDatasets/[dataset-id]/invitations
+ - Then re-run the upload command
+
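If auto-accept fails and you prefer to stay in the terminal, pending invitations can also be listed and accepted through GitHub's standard REST endpoints via `gh api`; the invitation ID below is a placeholder taken from the listing:

```bash
# List pending repository invitations (note the "id" field)
gh api /user/repository_invitations

# Accept a specific invitation by ID (replace 12345678 with the real id)
gh api --method PATCH /user/repository_invitations/12345678
```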
+ **"Failed to push to GitHub"**
+ - Check SSH configuration for multiple GitHub accounts
+ - If you have multiple GitHub accounts, configure SSH host aliases:
+ ```bash
+ # ~/.ssh/config
+ Host github-nemar
+     HostName github.com
+     User git
+     IdentityFile ~/.ssh/id_nemar
+ ```
+ - See `nemar auth status` for SSH setup instructions
+
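After adding a host alias like the `github-nemar` example above, GitHub's standard SSH test confirms which key (and therefore which account) the alias resolves to:

```bash
# Should respond with the GitHub username tied to ~/.ssh/id_nemar
ssh -T git@github-nemar
```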
+ **"Dataset already exists" / Resume Upload**
+ - The CLI stores dataset metadata in `.nemar/config.json`
+ - To **resume**: just run `nemar dataset upload` again
+ - To **start fresh** with a new dataset ID: `rm -rf /path/to/dataset/.nemar`
+
+ **Upload completes but commits show wrong author**
+ - Commits use your NEMAR user identity (username and registered email)
+ - Ensure your NEMAR account has the correct email registered
+ - Check with `nemar auth status`
+
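To check the authorship actually recorded on the latest commit of your local dataset clone, plain git is enough (this is a generic git invocation, not a nemar subcommand):

```bash
# Show the author name and email on the most recent commit
git log -1 --format='%an <%ae>'
```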
+ ### Authentication Issues
+
+ **"API key invalid"**
+ ```bash
+ nemar auth logout
+ nemar auth login
+ ```
+
+ **"Account pending approval"**
+ - An admin must approve your account after signup
+ - Contact your NEMAR administrator
+
+ ### Branch Protection
+
+ **"Cannot push directly to main"**
+ - If your dataset has a DOI, branch protection is enabled
+ - All changes require pull requests after DOI creation
+ - Private datasets without a DOI allow direct pushes
+
+ **"Branch protection not applied after upload"**
+ - This is expected for new private datasets
+ - Protection is applied when an admin creates a DOI (`nemar admin doi create`)
+ - This allows owners to freely modify their private workspace

  ## Development

  ```bash
package/dist/index.js CHANGED
@@ -355,7 +355,7 @@ Examples:
  $ nemar sandbox # Run sandbox training
  $ nemar sandbox status # Check if training is completed
  $ nemar sandbox reset # Reset for re-training
- `).action(AED);async function AED(){if(console.log(),console.log(z.bold("NEMAR Sandbox Training")),console.log(z.gray("Verify your setup and learn the upload workflow")),console.log(),!hD()){console.log(z.red("Not authenticated")),console.log(z.gray("Run 'nemar auth login' first"));return}let D=aD();if(D.sandboxCompleted){console.log(z.green("Sandbox training already completed!")),console.log(z.gray(`Dataset ID: ${D.sandboxDatasetId}`)),console.log(),console.log("You can upload real datasets with:"),console.log(z.cyan(" nemar dataset upload ./your-dataset")),console.log(),console.log(z.gray("To re-run training, use: nemar sandbox reset"));return}console.log(z.bold("Step 1/6: Checking prerequisites..."));let F=k("Checking DataLad, git-annex, and SSH...").start(),$=await q$();if(!$.allPassed){F.fail("Prerequisites check failed"),console.log(),console.log(z.red("Missing requirements:"));for(let L of $.errors)console.log(z.yellow(` - ${L}`));if(!$.githubSSH.accessible)console.log(z.gray(" Run 'nemar auth setup-ssh' to configure SSH"));return}F.succeed("All prerequisites met");let B=k("Verifying GitHub CLI authentication...").start(),J=await V$(D.githubUsername);if(!J.authenticated){B.fail("GitHub CLI not authenticated"),console.log(z.red(` ${J.error}`)),console.log(),console.log("GitHub CLI is required for sandbox training. Install and authenticate:"),console.log(z.cyan(" brew install gh # or visit https://cli.github.com/")),console.log(z.cyan(" gh auth login"));return}if(D.githubUsername&&!J.matches)B.warn("GitHub CLI user mismatch"),console.log(z.yellow(` ${J.error}`)),console.log(),console.log("Your gh CLI is authenticated as a different GitHub account than your NEMAR account."),console.log("This may cause issues with repository access. To fix:"),console.log(z.cyan(` gh auth login # Login as ${D.githubUsername}`)),console.log(),console.log(z.yellow("WARNING: If upload fails with permission errors, this mismatch is the likely cause.")),console.log();else B.succeed(`GitHub CLI authenticated as ${J.username}`);console.log(),console.log(z.bold("Step 2/6: Generating test dataset..."));let Y=k("Creating minimal BIDS structure...").start(),Q;try{let L=Ky();Q=L.root;let BD=zy(L);Y.succeed(`Test dataset created (${M$(BD)})`),console.log(z.gray(` Location: ${Q}`))}catch(L){Y.fail("Failed to generate test dataset"),console.log(z.red(` ${L instanceof Error?L.message:"Unknown error"}`));return}console.log(),console.log(z.bold("Step 3/6: Registering sandbox dataset..."));let X=k("Creating dataset on NEMAR...").start(),E,G,W,q,V,A;try{let L=await W$({name:"Sandbox Training Dataset",description:"Placeholder dataset for sandbox training",files:[{path:"sub-01/eeg/sub-01_task-rest_eeg.edf",size:512000,type:"data"},{path:"dataset_description.json",size:200,type:"metadata"},{path:"participants.tsv",size:50,type:"metadata"},{path:"README",size:500,type:"metadata"},{path:"sub-01/eeg/sub-01_task-rest_eeg.json",size:300,type:"metadata"}],sandbox:!0});E=L.dataset.dataset_id,G=L.dataset.ssh_url,W=L.dataset.github_url,q=L.s3_config,V=L.dataset.s3_prefix,A=L.upload_urls||{},X.succeed(`Sandbox dataset created: ${z.cyan(E)}`),console.log(z.gray(` GitHub: ${W}`)),await new Promise((BD)=>setTimeout(BD,1e4))}catch(L){if(X.fail("Failed to create sandbox dataset"),L instanceof $D)console.log(z.red(` ${L.message}`));else console.log(z.red(` ${L instanceof Error?L.message:"Unknown error"}`));G1(Q);return}let Z=k("Accepting GitHub repository 
invitation...").start(),N=W?.match(/github\.com\/([a-zA-Z0-9_.-]+\/[a-zA-Z0-9_.-]+)/),U=N?N[1].replace(/\.git$/,""):null;if(!U){Z.fail("Invalid GitHub repository URL from backend"),console.log(z.red(` Received: ${W||"(empty)"}`)),console.log(z.red(" Expected format: https://github.com/owner/repo")),console.log(),console.log("This may indicate a backend issue. Please contact support."),G1(Q);return}let M=await A$(U);if(M.accepted)if(M.alreadyCollaborator)Z.succeed("Already a collaborator on this repository");else Z.succeed("GitHub invitation accepted");else Z.warn("Could not auto-accept invitation"),console.log(z.yellow(` ${M.error}`)),console.log(),console.log("You may need to accept the invitation manually:"),console.log(z.cyan(` https://github.com/${U}/invitations`)),console.log();console.log(),console.log(z.bold("Step 4/6: Initializing repository..."));let R=k("Setting up DataLad and git-annex...").start();try{await K$(Q),await z$(Q),await Z$(Q,G),R.succeed("Repository initialized")}catch(L){R.fail("Failed to initialize repository"),console.log(z.red(` ${L instanceof Error?L.message:"Unknown error"}`)),G1(Q);return}if(console.log(),console.log(z.bold("Step 5/6: Uploading to S3...")),Object.keys(A).length===0)console.log(z.yellow(" No data files to upload (metadata only)"));else{let L=k("Uploading test data...").start();try{let qD=0,u=Object.keys(A).length,ND=await L$(Q,A,{jobs:4,onProgress:(QD)=>{if(QD.status==="completed"||QD.status==="failed")qD++,L.text=`Uploading... ${qD}/${u} files`}});if(ND.failed.length>0){L.fail(`Upload failed for ${ND.failed.length} file(s)`);for(let QD of ND.failed)console.log(z.red(` Failed: ${QD}`));if(ND.error)console.log(z.red(` Error: ${ND.error}`));console.log(),console.log(z.yellow("Sandbox training aborted due to upload failures.")),console.log(z.gray("Please check your network connection and try again.")),G1(Q);return}L.succeed(`Uploaded ${ND.uploaded} file(s)`)}catch(qD){L.fail("Upload failed"),console.log(z.red(` ${qD instanceof Error?qD.message:"Unknown error"}`)),G1(Q);return}let BD=k("Registering file URLs...").start();try{let qD={};for(let ND of Object.keys(A))qD[ND]=`${q.public_url}/${V}/${ND}`;let u=await C$(Q,qD);if(!u.success){BD.fail(`URL registration failed for ${u.failed.length} file(s)`);for(let ND of u.failed)console.log(z.red(` Failed: ${ND}`));console.log(),console.log(z.yellow("Sandbox training aborted due to URL registration failures.")),console.log(z.gray("This may indicate a git-annex configuration issue.")),G1(Q);return}BD.succeed(`Registered ${u.registered} file URLs`)}catch(qD){BD.fail("Failed to register URLs"),console.log(z.red(` ${qD instanceof Error?qD.message:"Unknown error"}`)),G1(Q);return}}console.log(),console.log(z.bold("Step 6/6: Pushing to GitHub..."));let w=k("Saving and pushing...").start();try{let L=D.username&&D.email?{name:D.username,email:D.email}:void 0;await N$(Q,"Initial sandbox training upload",L),await U$(Q),w.succeed("Pushed to GitHub")}catch(L){w.fail("Failed to push to GitHub"),console.log(z.red(` ${L instanceof Error?L.message:"Unknown error"}`)),G1(Q);return}let S=k("Finalizing...").start();try{await H$(E),await gv(E),oD("sandboxCompleted",!0),oD("sandboxDatasetId",E),S.succeed("Sandbox training complete!")}catch(L){S.fail("Failed to finalize"),console.log(z.red(` ${L instanceof Error?L.message:"Unknown error"}`)),G1(Q);return}G1(Q),console.log(),console.log(z.green.bold("Congratulations! 
Sandbox training completed successfully.")),console.log(),console.log("Your setup is verified and you're ready to upload real datasets:"),console.log(z.cyan(" nemar dataset upload ./your-dataset")),console.log(),console.log(z.gray(`Sandbox dataset: ${E}`))}S$.command("status").description("Check sandbox training completion status").option("--refresh","Fetch latest status from server").action(async(D)=>{if(!hD()){console.log(z.red("Not authenticated")),console.log(z.gray("Run 'nemar auth login' first"));return}if(D.refresh){let F=k("Checking status...").start();try{let $=await bv();if(oD("sandboxCompleted",$.sandbox_completed),$.sandbox_dataset_id)oD("sandboxDatasetId",$.sandbox_dataset_id);if(F.stop(),$.sandbox_completed){if(console.log(z.green("Sandbox training: Completed")),console.log(z.gray(` Dataset ID: ${$.sandbox_dataset_id}`)),$.sandbox_completed_at)console.log(z.gray(` Completed: ${$.sandbox_completed_at}`))}else console.log(z.yellow("Sandbox training: Not completed")),console.log(),console.log("Run sandbox training with:"),console.log(z.cyan(" nemar sandbox"))}catch($){if(F.fail("Failed to check status"),$ instanceof $D)console.log(z.red(` ${$.message}`))}}else{let F=aD();if(F.sandboxCompleted)console.log(z.green("Sandbox training: Completed")),console.log(z.gray(` Dataset ID: ${F.sandboxDatasetId}`));else console.log(z.yellow("Sandbox training: Not completed")),console.log(),console.log("Run sandbox training with:"),console.log(z.cyan(" nemar sandbox"))}});S$.command("reset").description("Reset sandbox training status for re-training").option("-f, --force","Skip confirmation prompt").action(async(D)=>{if(!hD()){console.log(z.red("Not authenticated")),console.log(z.gray("Run 'nemar auth login' first"));return}if(!aD().sandboxCompleted){console.log(z.yellow("Sandbox training not yet completed")),console.log(z.gray("Nothing to reset"));return}if(!D.force){let B=(await Promise.resolve().then(() => (V3(),EI))).default,{confirm:J}=await B.prompt([{type:"confirm",name:"confirm",message:"Reset sandbox training status? 
You will need to complete training again.",default:!1}]);if(!J){console.log(z.gray("Cancelled"));return}}let $=k("Resetting sandbox status...").start();try{await hv(),oD("sandboxCompleted",!1),oD("sandboxDatasetId",void 0),$.succeed("Sandbox status reset"),console.log(),console.log("Run sandbox training again with:"),console.log(z.cyan(" nemar sandbox"))}catch(B){if($.fail("Failed to reset"),B instanceof $D)console.log(z.red(` ${B.message}`));else console.log(z.red(` ${B instanceof Error?B.message:"Unknown error"}`))}});var Vy={name:"nemar-cli",version:"0.3.4-dev.62",description:"CLI for NEMAR (Neuroelectromagnetic Data Archive and Tools Resource) dataset management",type:"module",main:"dist/index.js",bin:{nemar:"dist/index.js"},scripts:{dev:"bun run src/index.ts",build:"bun build src/index.ts --outdir dist --target bun --minify && sed '1s|#!/usr/bin/env node|#!/usr/bin/env bun|' dist/index.js > dist/index.js.tmp && mv dist/index.js.tmp dist/index.js",test:"bun test",lint:"biome check src/","lint:fix":"biome check --fix src/",format:"biome format --write src/",typecheck:"tsc --noEmit",prepublishOnly:"bun run build","docs:generate":"bun run scripts/generate-docs.ts","docs:serve":"mkdocs serve","docs:build":"mkdocs build"},keywords:["nemar","bids","neuroimaging","eeg","emg","datalad","cli"],author:"NEMAR Team",license:"MIT",repository:{type:"git",url:"git+https://github.com/nemarDatasets/nemar-cli.git"},bugs:{url:"https://github.com/nemarDatasets/nemar-cli/issues"},homepage:"https://nemar-cli.pages.dev",engines:{bun:">=1.0.0"},files:["dist","README.md","LICENSE"],dependencies:{chalk:"^5.3.0",commander:"^12.1.0",conf:"^13.0.1",inquirer:"^9.2.15",ora:"^8.0.1",zod:"^3.23.8"},devDependencies:{"@biomejs/biome":"^1.9.4","@types/bcryptjs":"^3.0.0","@types/bun":"latest","@types/inquirer":"^9.0.7",bcryptjs:"^3.0.3",typescript:"^5.5.4"}};var Ay=Vy.version;var W1=new b0;W1.name("nemar").description(`CLI for NEMAR (Neuroelectromagnetic Data Archive and Tools Resource)
+ `).action(AED);async function AED(){if(console.log(),console.log(z.bold("NEMAR Sandbox Training")),console.log(z.gray("Verify your setup and learn the upload workflow")),console.log(),!hD()){console.log(z.red("Not authenticated")),console.log(z.gray("Run 'nemar auth login' first"));return}let D=aD();if(D.sandboxCompleted){console.log(z.green("Sandbox training already completed!")),console.log(z.gray(`Dataset ID: ${D.sandboxDatasetId}`)),console.log(),console.log("You can upload real datasets with:"),console.log(z.cyan(" nemar dataset upload ./your-dataset")),console.log(),console.log(z.gray("To re-run training, use: nemar sandbox reset"));return}console.log(z.bold("Step 1/6: Checking prerequisites..."));let F=k("Checking DataLad, git-annex, and SSH...").start(),$=await q$();if(!$.allPassed){F.fail("Prerequisites check failed"),console.log(),console.log(z.red("Missing requirements:"));for(let L of $.errors)console.log(z.yellow(` - ${L}`));if(!$.githubSSH.accessible)console.log(z.gray(" Run 'nemar auth setup-ssh' to configure SSH"));return}F.succeed("All prerequisites met");let B=k("Verifying GitHub CLI authentication...").start(),J=await V$(D.githubUsername);if(!J.authenticated){B.fail("GitHub CLI not authenticated"),console.log(z.red(` ${J.error}`)),console.log(),console.log("GitHub CLI is required for sandbox training. Install and authenticate:"),console.log(z.cyan(" brew install gh # or visit https://cli.github.com/")),console.log(z.cyan(" gh auth login"));return}if(D.githubUsername&&!J.matches)B.warn("GitHub CLI user mismatch"),console.log(z.yellow(` ${J.error}`)),console.log(),console.log("Your gh CLI is authenticated as a different GitHub account than your NEMAR account."),console.log("This may cause issues with repository access. To fix:"),console.log(z.cyan(` gh auth login # Login as ${D.githubUsername}`)),console.log(),console.log(z.yellow("WARNING: If upload fails with permission errors, this mismatch is the likely cause.")),console.log();else B.succeed(`GitHub CLI authenticated as ${J.username}`);console.log(),console.log(z.bold("Step 2/6: Generating test dataset..."));let Y=k("Creating minimal BIDS structure...").start(),Q;try{let L=Ky();Q=L.root;let BD=zy(L);Y.succeed(`Test dataset created (${M$(BD)})`),console.log(z.gray(` Location: ${Q}`))}catch(L){Y.fail("Failed to generate test dataset"),console.log(z.red(` ${L instanceof Error?L.message:"Unknown error"}`));return}console.log(),console.log(z.bold("Step 3/6: Registering sandbox dataset..."));let X=k("Creating dataset on NEMAR...").start(),E,G,W,q,V,A;try{let L=await W$({name:"Sandbox Training Dataset",description:"Placeholder dataset for sandbox training",files:[{path:"sub-01/eeg/sub-01_task-rest_eeg.edf",size:512000,type:"data"},{path:"dataset_description.json",size:200,type:"metadata"},{path:"participants.tsv",size:50,type:"metadata"},{path:"README",size:500,type:"metadata"},{path:"sub-01/eeg/sub-01_task-rest_eeg.json",size:300,type:"metadata"}],sandbox:!0});E=L.dataset.dataset_id,G=L.dataset.ssh_url,W=L.dataset.github_url,q=L.s3_config,V=L.dataset.s3_prefix,A=L.upload_urls||{},X.succeed(`Sandbox dataset created: ${z.cyan(E)}`),console.log(z.gray(` GitHub: ${W}`)),await new Promise((BD)=>setTimeout(BD,1e4))}catch(L){if(X.fail("Failed to create sandbox dataset"),L instanceof $D)console.log(z.red(` ${L.message}`));else console.log(z.red(` ${L instanceof Error?L.message:"Unknown error"}`));G1(Q);return}let Z=k("Accepting GitHub repository 
invitation...").start(),N=W?.match(/github\.com\/([a-zA-Z0-9_.-]+\/[a-zA-Z0-9_.-]+)/),U=N?N[1].replace(/\.git$/,""):null;if(!U){Z.fail("Invalid GitHub repository URL from backend"),console.log(z.red(` Received: ${W||"(empty)"}`)),console.log(z.red(" Expected format: https://github.com/owner/repo")),console.log(),console.log("This may indicate a backend issue. Please contact support."),G1(Q);return}let M=await A$(U);if(M.accepted)if(M.alreadyCollaborator)Z.succeed("Already a collaborator on this repository");else Z.succeed("GitHub invitation accepted");else Z.warn("Could not auto-accept invitation"),console.log(z.yellow(` ${M.error}`)),console.log(),console.log("You may need to accept the invitation manually:"),console.log(z.cyan(` https://github.com/${U}/invitations`)),console.log();console.log(),console.log(z.bold("Step 4/6: Initializing repository..."));let R=k("Setting up DataLad and git-annex...").start();try{await K$(Q),await z$(Q),await Z$(Q,G),R.succeed("Repository initialized")}catch(L){R.fail("Failed to initialize repository"),console.log(z.red(` ${L instanceof Error?L.message:"Unknown error"}`)),G1(Q);return}if(console.log(),console.log(z.bold("Step 5/6: Uploading to S3...")),Object.keys(A).length===0)console.log(z.yellow(" No data files to upload (metadata only)"));else{let L=k("Uploading test data...").start();try{let qD=0,u=Object.keys(A).length,ND=await L$(Q,A,{jobs:4,onProgress:(QD)=>{if(QD.status==="completed"||QD.status==="failed")qD++,L.text=`Uploading... ${qD}/${u} files`}});if(ND.failed.length>0){L.fail(`Upload failed for ${ND.failed.length} file(s)`);for(let QD of ND.failed)console.log(z.red(` Failed: ${QD}`));if(ND.error)console.log(z.red(` Error: ${ND.error}`));console.log(),console.log(z.yellow("Sandbox training aborted due to upload failures.")),console.log(z.gray("Please check your network connection and try again.")),G1(Q);return}L.succeed(`Uploaded ${ND.uploaded} file(s)`)}catch(qD){L.fail("Upload failed"),console.log(z.red(` ${qD instanceof Error?qD.message:"Unknown error"}`)),G1(Q);return}let BD=k("Registering file URLs...").start();try{let qD={};for(let ND of Object.keys(A))qD[ND]=`${q.public_url}/${V}/${ND}`;let u=await C$(Q,qD);if(!u.success){BD.fail(`URL registration failed for ${u.failed.length} file(s)`);for(let ND of u.failed)console.log(z.red(` Failed: ${ND}`));console.log(),console.log(z.yellow("Sandbox training aborted due to URL registration failures.")),console.log(z.gray("This may indicate a git-annex configuration issue.")),G1(Q);return}BD.succeed(`Registered ${u.registered} file URLs`)}catch(qD){BD.fail("Failed to register URLs"),console.log(z.red(` ${qD instanceof Error?qD.message:"Unknown error"}`)),G1(Q);return}}console.log(),console.log(z.bold("Step 6/6: Pushing to GitHub..."));let w=k("Saving and pushing...").start();try{let L=D.username&&D.email?{name:D.username,email:D.email}:void 0;await N$(Q,"Initial sandbox training upload",L),await U$(Q),w.succeed("Pushed to GitHub")}catch(L){w.fail("Failed to push to GitHub"),console.log(z.red(` ${L instanceof Error?L.message:"Unknown error"}`)),G1(Q);return}let S=k("Finalizing...").start();try{await H$(E),await gv(E),oD("sandboxCompleted",!0),oD("sandboxDatasetId",E),S.succeed("Sandbox training complete!")}catch(L){S.fail("Failed to finalize"),console.log(z.red(` ${L instanceof Error?L.message:"Unknown error"}`)),G1(Q);return}G1(Q),console.log(),console.log(z.green.bold("Congratulations! 
Sandbox training completed successfully.")),console.log(),console.log("Your setup is verified and you're ready to upload real datasets:"),console.log(z.cyan(" nemar dataset upload ./your-dataset")),console.log(),console.log(z.gray(`Sandbox dataset: ${E}`))}S$.command("status").description("Check sandbox training completion status").option("--refresh","Fetch latest status from server").action(async(D)=>{if(!hD()){console.log(z.red("Not authenticated")),console.log(z.gray("Run 'nemar auth login' first"));return}if(D.refresh){let F=k("Checking status...").start();try{let $=await bv();if(oD("sandboxCompleted",$.sandbox_completed),$.sandbox_dataset_id)oD("sandboxDatasetId",$.sandbox_dataset_id);if(F.stop(),$.sandbox_completed){if(console.log(z.green("Sandbox training: Completed")),console.log(z.gray(` Dataset ID: ${$.sandbox_dataset_id}`)),$.sandbox_completed_at)console.log(z.gray(` Completed: ${$.sandbox_completed_at}`))}else console.log(z.yellow("Sandbox training: Not completed")),console.log(),console.log("Run sandbox training with:"),console.log(z.cyan(" nemar sandbox"))}catch($){if(F.fail("Failed to check status"),$ instanceof $D)console.log(z.red(` ${$.message}`))}}else{let F=aD();if(F.sandboxCompleted)console.log(z.green("Sandbox training: Completed")),console.log(z.gray(` Dataset ID: ${F.sandboxDatasetId}`));else console.log(z.yellow("Sandbox training: Not completed")),console.log(),console.log("Run sandbox training with:"),console.log(z.cyan(" nemar sandbox"))}});S$.command("reset").description("Reset sandbox training status for re-training").option("-f, --force","Skip confirmation prompt").action(async(D)=>{if(!hD()){console.log(z.red("Not authenticated")),console.log(z.gray("Run 'nemar auth login' first"));return}if(!aD().sandboxCompleted){console.log(z.yellow("Sandbox training not yet completed")),console.log(z.gray("Nothing to reset"));return}if(!D.force){let B=(await Promise.resolve().then(() => (V3(),EI))).default,{confirm:J}=await B.prompt([{type:"confirm",name:"confirm",message:"Reset sandbox training status? 
You will need to complete training again.",default:!1}]);if(!J){console.log(z.gray("Cancelled"));return}}let $=k("Resetting sandbox status...").start();try{await hv(),oD("sandboxCompleted",!1),oD("sandboxDatasetId",void 0),$.succeed("Sandbox status reset"),console.log(),console.log("Run sandbox training again with:"),console.log(z.cyan(" nemar sandbox"))}catch(B){if($.fail("Failed to reset"),B instanceof $D)console.log(z.red(` ${B.message}`));else console.log(z.red(` ${B instanceof Error?B.message:"Unknown error"}`))}});var Vy={name:"nemar-cli",version:"0.3.4-dev.64",description:"CLI for NEMAR (Neuroelectromagnetic Data Archive and Tools Resource) dataset management",type:"module",main:"dist/index.js",bin:{nemar:"dist/index.js"},scripts:{dev:"bun run src/index.ts",build:"bun build src/index.ts --outdir dist --target bun --minify && sed '1s|#!/usr/bin/env node|#!/usr/bin/env bun|' dist/index.js > dist/index.js.tmp && mv dist/index.js.tmp dist/index.js",test:"bun test",lint:"biome check src/","lint:fix":"biome check --fix src/",format:"biome format --write src/",typecheck:"tsc --noEmit",prepublishOnly:"bun run build","docs:generate":"bun run scripts/generate-docs.ts","docs:serve":"mkdocs serve","docs:build":"mkdocs build"},keywords:["nemar","bids","neuroimaging","eeg","emg","datalad","cli"],author:"NEMAR Team",license:"MIT",repository:{type:"git",url:"git+https://github.com/nemarDatasets/nemar-cli.git"},bugs:{url:"https://github.com/nemarDatasets/nemar-cli/issues"},homepage:"https://nemar-cli.pages.dev",engines:{bun:">=1.0.0"},files:["dist","README.md","LICENSE"],dependencies:{chalk:"^5.3.0",commander:"^12.1.0",conf:"^13.0.1",inquirer:"^9.2.15",ora:"^8.0.1",zod:"^3.23.8"},devDependencies:{"@biomejs/biome":"^1.9.4","@types/bcryptjs":"^3.0.0","@types/bun":"latest","@types/inquirer":"^9.0.7",bcryptjs:"^3.0.3",typescript:"^5.5.4"}};var Ay=Vy.version;var W1=new b0;W1.name("nemar").description(`CLI for NEMAR (Neuroelectromagnetic Data Archive and Tools Resource)

  NEMAR is a curated repository for neurophysiology data in BIDS format.
  This CLI provides tools for uploading, downloading, and managing datasets.`).version(Ay,"-v, --version","Output the current version").option("--no-color","Disable colored output").option("--verbose","Enable verbose output").addHelpText("after",`
package/package.json CHANGED
@@ -1,6 +1,6 @@
  {
  "name": "nemar-cli",
- "version": "0.3.4-dev.62",
+ "version": "0.3.4-dev.64",
  "description": "CLI for NEMAR (Neuroelectromagnetic Data Archive and Tools Resource) dataset management",
  "type": "module",
  "main": "dist/index.js",