video-context-mcp-server 0.52.1-beta → 0.52.3-beta

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
package/README.md CHANGED
@@ -353,53 +353,205 @@ Set `VIDEO_MCP_DEFAULT_PROVIDER=gemini`, `qwen`, `kimi`, or `mimo` to change the
 
 **GLM-4.6V**, **Qwen3.6**, and **MiMo-V2 Omni** all accept direct video URLs, but base64-encoding a local file caps out at **10–12 MB**. Above that limit, the server first tries to fall back to an upload-capable provider (Gemini or Kimi) if one is available, then falls back to **frame-based analysis** as a last resort. For the best results on large local videos, set `AWS_S3_BUCKET` — the server uploads the full video to S3 and passes a presigned URL to GLM, Qwen, and MiMo, bypassing the base64 limit entirely and taking priority over both fallbacks. No manual upload step needed.
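The fallback order in this paragraph can be sketched as a simple size gate. This is a minimal illustration, not the server's actual code; the exact 10 MB cutoff (the low end of the documented 10–12 MB range) and the `KIMI_API_KEY` variable name are assumptions:

```shell
#!/usr/bin/env bash
# Illustrative sketch of the large-video fallback chain described above.
# Assumptions: a hard 10 MB cutoff and the KIMI_API_KEY variable name.
BASE64_LIMIT=$((10 * 1024 * 1024))

choose_path() {
  local file="$1" size
  size=$(stat -c%s "$file" 2>/dev/null || stat -f%z "$file")
  if [ -n "$AWS_S3_BUCKET" ]; then
    echo "s3-relay"           # presigned URL bypasses the base64 limit entirely
  elif [ "$size" -le "$BASE64_LIMIT" ]; then
    echo "base64"             # small enough to inline
  elif [ -n "$GEMINI_API_KEY" ] || [ -n "$KIMI_API_KEY" ]; then
    echo "upload-provider"    # Gemini/Kimi accept direct file uploads
  else
    echo "frame-analysis"     # last resort
  fi
}
```

Note how setting `AWS_S3_BUCKET` short-circuits every other branch, mirroring the priority described above.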
 
-**Why it works**: GLM, Qwen, and MiMo require the server to serve `Content-Length` and `Content-Type` headers alongside the video. AWS S3 provides these automatically.
+#### Why S3 works
 
-**One-time setup**
+GLM, Qwen, and MiMo require the serving endpoint to provide `Content-Length` and `Content-Type` headers alongside the video. AWS S3 presigned URLs include both headers automatically.
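If you're debugging why some other hosting URL is rejected, a quick HEAD-request check shows whether both required headers are present. This helper is illustrative and not part of the server:

```shell
#!/usr/bin/env bash
# Illustrative helper: read HEAD-response headers on stdin and confirm both
# fields GLM/Qwen/MiMo require are present.
# Usage (assuming curl is installed):  curl -sI "$VIDEO_URL" | check_video_headers
check_video_headers() {
  local headers
  headers=$(cat)
  if grep -qi '^content-length:' <<<"$headers" && grep -qi '^content-type:' <<<"$headers"; then
    echo "ok: both headers present"
  else
    echo "missing Content-Length and/or Content-Type"
  fi
}
```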
+
+---
+
+#### Prerequisites
+
+Before setting up the S3 relay, you'll need an AWS account and access credentials.
+
+##### 1. Create an AWS account
+
+1. Go to [aws.amazon.com](https://aws.amazon.com/) and click **Create an AWS Account**.
+2. Enter your email address, a password, and an AWS account name.
+3. Choose the **Basic Support — Free** plan (sufficient for S3 relay usage).
+4. Fill in your contact and billing information. A valid credit or debit card is required, but S3 usage within the free tier costs nothing.
+5. Verify your identity via phone call or SMS.
+6. Once confirmed, sign in to the [AWS Management Console](https://console.aws.amazon.com/).
+
+> New AWS accounts include a 12-month Free Tier with 5 GB of S3 storage and 20,000 GET requests per month — more than enough for typical video analysis workflows.
+
+##### 2. Get your AWS Access Key ID and Secret Access Key
+
+The S3 relay needs programmatic access to your S3 bucket. You'll create an IAM user with limited permissions:
+
+1. In the AWS Console, search for **IAM** in the top search bar and open the IAM dashboard.
+2. Click **Users** in the left sidebar, then **Create user**.
+3. Enter a user name (e.g., `video-mcp-s3`) and click **Next**.
+4. Under **Permissions options**, select **Attach policies directly**.
+5. Click **Create policy** — this opens a new tab:
+   - Switch to the **JSON** tab and paste the following minimum-permission policy:
+     ```json
+     {
+       "Version": "2012-10-17",
+       "Statement": [
+         {
+           "Effect": "Allow",
+           "Action": [
+             "s3:PutObject",
+             "s3:GetObject",
+             "s3:DeleteObject",
+             "s3:ListBucket"
+           ],
+           "Resource": [
+             "arn:aws:s3:::your-globally-unique-bucket-name",
+             "arn:aws:s3:::your-globally-unique-bucket-name/*"
+           ]
+         }
+       ]
+     }
+     ```
+   - Replace `your-globally-unique-bucket-name` with your actual globally unique bucket name (you'll create it in the next step).
+   - Click **Next**, give the policy a name like `VideoMcpS3Access`, then **Create policy**.
+6. Go back to the user creation tab, click the refresh icon, search for `VideoMcpS3Access`, select it, and click **Next** → **Create user**.
+7. Open the newly created user and go to the **Security credentials** tab.
+8. Under **Access keys**, click **Create access key**.
+9. On **Access key best practices & alternatives**, choose **Other** or the closest equivalent programmatic/local-code option shown in your console, then click **Next**.
+10. Optionally add a description tag, then click **Create access key**.
+11. Copy and save the **Access Key ID** and **Secret Access Key** — you won't be able to see the secret key again after closing this dialog.
+
+> **Never commit your Secret Access Key to version control or share it publicly.** Only add it to your local MCP configuration or AWS credentials file.
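As a lightweight guard against accidental commits, you can grep staged changes for anything shaped like an access key ID. This is a hypothetical helper, not part of the package; the pattern matches the standard 20-character `AKIA...` long-term key format:

```shell
#!/usr/bin/env bash
# Illustrative pre-commit guard: flag text on stdin that contains something
# shaped like an AWS long-term access key ID (AKIA followed by 16 characters).
scan_for_aws_keys() {
  if grep -qE 'AKIA[0-9A-Z]{16}' -; then
    echo "possible AWS access key in staged changes!"
    return 1
  fi
  echo "clean"
}
# Usage: git diff --cached | scan_for_aws_keys
```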
+
+##### 3. (Optional) Install and configure the AWS CLI
+
+The AWS CLI is only needed if you want to create buckets from the terminal or use the `~/.aws/credentials` method instead of environment variables. If you plan to add credentials directly to your MCP `env` block, you can skip this step.
+
+**Install the AWS CLI**
+
+- **Windows:** Download the installer from [aws.amazon.com/cli](https://aws.amazon.com/cli/) or run:
+  ```bash
+  winget install Amazon.AWSCLI
+  ```
+- **macOS:**
+  ```bash
+  curl "https://awscli.amazonaws.com/AWSCLIV2.pkg" -o "AWSCLIV2.pkg"
+  sudo installer -pkg AWSCLIV2.pkg -target /
+  ```
+- **Linux:**
+  ```bash
+  curl "https://awscli.amazonaws.com/awscli-exe-linux-x86_64.zip" -o "awscliv2.zip"
+  unzip awscliv2.zip
+  sudo ./aws/install
+  ```
+
+**Configure your credentials**
+
+Run the following command and paste your Access Key ID and Secret Access Key when prompted:
 
 ```bash
-# 1. Create a bucket
-aws s3 mb s3://my-video-analysis
+aws configure
+```
+
+You'll be asked for:
+
+- **AWS Access Key ID** — paste the key you saved earlier
+- **AWS Secret Access Key** — paste the secret key you saved earlier
+- **Default region name** — enter your preferred region (e.g., `us-east-1`)
+- **Default output format** — press Enter for `json`
+
+This stores your credentials in `~/.aws/credentials` and `~/.aws/config`, which the MCP server reads automatically.
+
+---
+
+#### One-time setup
+
+##### 1. Create an S3 bucket
+
+If you installed the AWS CLI:
 
-# 2. Add AWS_S3_BUCKET to your .vscode/mcp.json env block
+```bash
+aws s3 mb s3://your-globally-unique-bucket-name
+```
+
+Or create it manually in the [S3 Console](https://s3.console.aws.amazon.com/):
+
+1. Open the S3 dashboard and click **Create bucket**.
+2. Enter a globally unique bucket name (e.g., `your-globally-unique-bucket-name`).
+3. Choose the AWS Region you want to use. This should match `AWS_REGION` in your MCP config or AWS CLI profile.
+4. Leave **Block all public access** enabled. The bucket does not need to be public — the server uses presigned URLs.
+5. Keep the default **Object Ownership** setting (`ACLs disabled` / `Bucket owner enforced`).
+6. Leave the remaining settings at their defaults, then click **Create bucket**.
+
+> You do not need to add a bucket policy or make objects public. A private bucket works fine because the MCP server generates time-limited presigned URLs for each uploaded video.
+
+##### 2. Add `AWS_S3_BUCKET` to your MCP config
+
+**VS Code (`.vscode/mcp.json`)**
+
+```json
+{
+  "servers": {
+    "videoMcp": {
+      "type": "stdio",
+      "command": "video-context-mcp",
+      "env": {
+        "AWS_S3_BUCKET": "your-globally-unique-bucket-name",
+        "GEMINI_API_KEY": "your-gemini-key"
+      }
+    }
+  }
+}
 ```
 
-AWS credentials are resolved automatically in this order:
+---
+
+#### AWS Credential Resolution
+
+The server resolves AWS credentials in this order — you only need to configure one:
 
-1. **Environment variables** — add these to your `mcp.json` env block (no AWS CLI required):
+1. **Environment variables** — add directly to your MCP `env` block (no AWS CLI needed):
 ```jsonc
-// .vscode/mcp.json — env section
-{
-  "AWS_S3_BUCKET": "my-video-analysis",
-  "AWS_ACCESS_KEY_ID": "AKIA...",
-  "AWS_SECRET_ACCESS_KEY": "your-secret-key",
-  "AWS_REGION": "us-east-1",
-}
+"AWS_S3_BUCKET": "your-globally-unique-bucket-name",
+"AWS_ACCESS_KEY_ID": "AKIA...",
+"AWS_SECRET_ACCESS_KEY": "your-secret-key",
+"AWS_REGION": "us-east-1"
 ```
-2. **`~/.aws/credentials`** — if you have the AWS CLI configured, credentials are picked up automatically; only `AWS_S3_BUCKET` is needed:
+2. **`~/.aws/credentials`** — if the AWS CLI is configured, credentials are picked up automatically. Only `AWS_S3_BUCKET` is needed in your MCP config:
 ```jsonc
-// .vscode/mcp.json — env section
-{
-  "AWS_S3_BUCKET": "my-video-analysis",
-}
+"AWS_S3_BUCKET": "your-globally-unique-bucket-name"
 ```
 3. **IAM instance role / ECS task role** — for AWS-hosted environments.
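The three-step resolution order can be sketched as a simple cascade. Illustrative only: the server most likely delegates to the AWS SDK's default credential provider chain, which covers more cases than this:

```shell
#!/usr/bin/env bash
# Sketch of the credential resolution order above. Illustrative only; the
# AWS SDK's default chain handles additional sources (SSO, process creds, etc.).
credential_source() {
  if [ -n "$AWS_ACCESS_KEY_ID" ] && [ -n "$AWS_SECRET_ACCESS_KEY" ]; then
    echo "environment variables"
  elif [ -f "$HOME/.aws/credentials" ]; then
    echo "~/.aws/credentials"
  else
    echo "IAM instance/task role"
  fi
}
```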
 
-That's it. Every time you analyze a local video (or a YouTube/Bilibili download) with GLM or Qwen, the server uploads it to S3, uses the presigned URL, and leaves the object in the bucket for reuse.
+---
+
+#### How it works at runtime
+
+Every time you analyze a local video (or a platform download like YouTube) with GLM, Qwen, or MiMo:
+
+1. The server detects the file is too large for base64 encoding.
+2. The file is uploaded to `s3://your-globally-unique-bucket-name/<hash>/<filename>`.
+3. A presigned URL (valid for 1 hour) is passed to the AI provider.
+4. The provider downloads the video directly from S3.
+5. The object is kept in the bucket for reuse within the same session.
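The `<hash>/<filename>` key layout from step 2 can be illustrated like this. The hash algorithm the server actually uses is not documented; a truncated SHA-256 of the file contents is an assumption here:

```shell
#!/usr/bin/env bash
# Illustration of the <hash>/<filename> relay key from step 2 above.
# Assumption: a truncated SHA-256 content hash (the real scheme is undocumented).
make_relay_key() {
  local file="$1" hash
  hash=$(sha256sum "$file" | cut -c1-16)   # short content hash
  echo "${hash}/$(basename "$file")"
}
```

Hashing the content (rather than the path) means re-analyzing the same file within a session can reuse the already-uploaded object.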
+
+**Cleanup:** Relayed S3 objects are deleted automatically when the MCP server session ends. Orphaned objects from crashed sessions are swept at next startup.
+
+To keep objects in the bucket for reuse across sessions (useful for large files you analyze repeatedly):
 
-**Cleanup:** Relayed S3 objects are deleted automatically when the MCP server session ends and orphaned objects are swept at startup. Set `AWS_S3_RELAY_CLEANUP=false` to keep objects in the bucket for reuse across sessions.
+```jsonc
+"AWS_S3_RELAY_CLEANUP": "false"
+```
+
+**Cost**: AWS S3 free tier covers 5 GB storage + 20K GET requests/month for 12 months. After the free tier, storage costs roughly $0.023/GB/month — negligible for most use cases.
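As a back-of-envelope check at the quoted rate (S3 Standard pricing varies by region and storage class, so treat the figure as approximate):

```shell
#!/usr/bin/env bash
# Rough monthly storage cost at the ~$0.023/GB-month rate quoted above.
gb=50
awk -v gb="$gb" 'BEGIN { printf "%.2f\n", gb * 0.023 }'
```

For 50 GB of retained video that works out to about $1.15 per month.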
+
+---
 
-**Cost**: AWS S3 free tier covers 5 GB storage + 20K GET requests/month for 12 months. After the free tier, storage is roughly $0.023/GB/month.
+#### Manual presigned URLs (alternative)
 
-**Still want direct URLs?** Pass a presigned URL manually:
+You can also pass a presigned URL directly to any tool without configuring the relay:
 
 ```bash
-aws s3 cp my-video.mp4 s3://my-video-analysis/my-video.mp4
-aws s3 presign s3://my-video-analysis/my-video.mp4 --expires-in 3600
-# returns: https://my-video-analysis.s3.amazonaws.com/my-video.mp4?X-Amz-...
+aws s3 cp my-video.mp4 s3://your-globally-unique-bucket-name/my-video.mp4
+aws s3 presign s3://your-globally-unique-bucket-name/my-video.mp4 --expires-in 3600
+# https://your-globally-unique-bucket-name.s3.amazonaws.com/my-video.mp4?X-Amz-...
 ```
 
-Then pass the URL directly to `analyze_video` or `summarize_video`.
+Then pass the URL directly to `analyze_video`, `summarize_video`, or any other tool.
 
 </details>
 
@@ -592,7 +744,7 @@ vmcp cache clear:all --yes # skip confirmation
   "YT_DLP_PATH": "/usr/local/bin/yt-dlp",
   "YT_DLP_COOKIES_FILE": "/path/to/cookies.txt",
   "YT_DLP_IMPERSONATE": "chrome",
-  "AWS_S3_BUCKET": "my-video-analysis",
+  "AWS_S3_BUCKET": "your-globally-unique-bucket-name",
   "AWS_ACCESS_KEY_ID": "AKIA...",
   "AWS_SECRET_ACCESS_KEY": "your-secret-key",
   "AWS_REGION": "us-east-1",
@@ -1,2 +1,2 @@
-export declare const VERSION = "0.52.1-beta";
+export declare const VERSION = "0.52.3-beta";
 //# sourceMappingURL=version.d.ts.map
@@ -1,3 +1,3 @@
 // Auto-generated by scripts/sync-version.ts — do not edit
-export const VERSION = '0.52.1-beta';
+export const VERSION = '0.52.3-beta';
 //# sourceMappingURL=version.js.map
package/package.json CHANGED
@@ -1,6 +1,6 @@
 {
   "name": "video-context-mcp-server",
-  "version": "0.52.1-beta",
+  "version": "0.52.3-beta",
   "description": "A Model Context Protocol server that gives GitHub Copilot the ability to understand and analyze video content",
   "type": "module",
   "main": "dist/index.js",