gitarsenal-cli 1.5.2 → 1.5.4

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
package/README.md CHANGED
@@ -1,74 +1,119 @@
  # GitArsenal CLI
 
- A tool for creating and managing GPU-accelerated development environments using Modal.
+ **Run any GitHub repository instantly with pre-configured GPU environments.**
 
- ## Features
+ GitArsenal CLI makes it incredibly easy to run any GitHub repository without worrying about setup, dependencies, or environment configuration. Just point it at a repository and start coding with GPU acceleration.
 
- - Create Modal containers with GPU support
- - Clone repositories and run setup commands
- - Persistent storage with Modal volumes
- - SSH access to containers
- - API key management for various services
+ ## Why GitArsenal CLI?
 
- ## API Key Management
+ - **Zero Setup**: No need to install dependencies, configure environments, or manage GPU drivers
+ - **GPU Ready**: Every environment comes with GPU acceleration (A10G, A100, H100)
+ - **Persistent Storage**: Your work and data persist between sessions
+ - **SSH Access**: Connect directly to your running environment
+ - **API Key Management**: Securely store and auto-inject API keys for services like OpenAI, Weights & Biases, and Hugging Face
 
- The CLI now supports secure storage of API keys for various services. Keys are stored in `~/.gitarsenal/keys/` with proper permissions (only readable by the current user).
+ ## Quick Start
 
- ### Supported Services
+ ### Run any GitHub repository
 
- - `openai` - OpenAI API keys for debugging and assistance
- - `wandb` - Weights & Biases API keys for experiment tracking
- - `huggingface` - Hugging Face tokens for model access
+ ```bash
+ # Basic usage - clone and run any repo
+ gitarsenal container --repo-url https://github.com/username/awesome-project.git
+
+ # With GPU acceleration
+ gitarsenal container --gpu A10G --repo-url https://github.com/username/awesome-project.git
 
- ### Managing API Keys
+ # With custom setup commands
+ gitarsenal container --gpu A100 --repo-url https://github.com/username/awesome-project.git --setup-commands "pip install -r requirements.txt" "python setup.py install"
+ ```
 
- #### Adding a new API key
+ ### Examples
 
  ```bash
- # Add a key interactively (will prompt for the key)
- python test_modalSandboxScript.py keys add --service openai
+ # Run a machine learning project
+ gitarsenal container --gpu A100 --repo-url https://github.com/username/transformer-project.git --setup-commands "pip install torch transformers" "wandb login"
 
- # Add a key directly (not recommended for security)
- python test_modalSandboxScript.py keys add --service wandb --key YOUR_API_KEY
+ # Run a web development project
+ gitarsenal container --repo-url https://github.com/username/react-app.git --setup-commands "npm install" "npm start"
+
+ # Run a data science project with persistent storage
+ gitarsenal container --gpu A10G --repo-url https://github.com/username/data-analysis.git --volume-name my-data --setup-commands "pip install pandas numpy matplotlib"
  ```
 
- #### Listing saved API keys
+ ## API Key Management
+
+ Store your API keys once and use them across all projects:
 
  ```bash
- python test_modalSandboxScript.py keys list
- ```
+ # Add API keys for seamless integration
+ gitarsenal keys add --service openai
+ gitarsenal keys add --service wandb
+ gitarsenal keys add --service huggingface
 
- #### Viewing a specific API key (masked)
+ # View your saved keys
+ gitarsenal keys list
 
- ```bash
- python test_modalSandboxScript.py keys view --service huggingface
+ # Remove a key
+ gitarsenal keys delete --service openai
  ```
 
- #### Deleting an API key
+ ### Supported Services
+
+ - **OpenAI** - For debugging and AI assistance
+ - **Weights & Biases** - For experiment tracking
+ - **Hugging Face** - For model access and downloads
+
+ ## Features
+
+ ### Automatic Environment Setup
+ The CLI automatically:
+ - Clones your repository
+ - Installs dependencies based on your setup commands
+ - Configures GPU acceleration
+ - Sets up persistent storage
+ - Injects your saved API keys
+
+ ### Persistent Storage
+ Keep your work safe with persistent volumes:
 
  ```bash
- python test_modalSandboxScript.py keys delete --service openai
+ # Create a persistent environment
+ gitarsenal container --repo-url https://github.com/username/project.git --volume-name my-work
+
+ # Your data, models, and work will persist between sessions
  ```
 
- ## Creating a Modal Container
+ ### SSH Access
+ Connect directly to your running environment:
 
  ```bash
- # Basic container creation
- python test_modalSandboxScript.py container --gpu A10G --repo-url https://github.com/username/repo.git
-
- # With setup commands
- python test_modalSandboxScript.py container --gpu A100 --repo-url https://github.com/username/repo.git --setup-commands "pip install -r requirements.txt" "python setup.py install"
+ # Get SSH connection details
+ gitarsenal container --repo-url https://github.com/username/project.git --ssh
 
- # With volume for persistent storage
- python test_modalSandboxScript.py container --gpu A10G --repo-url https://github.com/username/repo.git --volume-name my-persistent-volume
+ # Connect via SSH to your environment
+ ssh user@your-environment-ip
  ```
 
- ## Automatic API Key Usage
+ ## Workflow
+
+ 1. **Choose a repository** - Any GitHub repo you want to work with
+ 2. **Run the command** - Specify GPU, setup commands, and storage
+ 3. **Start coding** - Your environment is ready with all dependencies installed
+ 4. **Save your work** - Use persistent volumes to keep your progress
+
+ ## Perfect For
+
+ - **Machine Learning Projects** - GPU-accelerated training with pre-configured environments
+ - **Data Science** - Jupyter notebooks with all dependencies ready
+ - **Web Development** - Full-stack projects with development servers
+ - **Research** - Reproducible environments for academic work
+ - **Hackathons** - Quick setup for rapid prototyping
 
- When using commands that require API keys (like `wandb login` or `huggingface-cli login`), the system will:
+ ## Getting Started
 
- 1. Check if a saved API key exists for the service
- 2. If found, use the saved key automatically
- 3. If not found, prompt for the key and offer to save it for future use
+ 1. Install the CLI (see installation instructions)
+ 2. Add your API keys: `gitarsenal keys add --service openai`
+ 3. Run any repository: `gitarsenal container --repo-url https://github.com/username/project.git`
+ 4. Start coding!
 
- This makes it easy to work with multiple projects that require the same API keys without having to re-enter them each time.
+ No more "works on my machine" - every environment is identical and ready to go.
package/package.json CHANGED
@@ -1,6 +1,6 @@
  {
  "name": "gitarsenal-cli",
- "version": "1.5.2",
+ "version": "1.5.4",
  "description": "CLI tool for creating Modal sandboxes with GitHub repositories",
  "main": "index.js",
  "bin": {
@@ -119,6 +119,24 @@ class CredentialsManager:
  def validate_openai_key(key):
  # Basic validation - OpenAI keys usually start with "sk-" and are 51 chars
  return key.startswith("sk-") and len(key) > 40
+
+ # First check environment variable
+ env_key = os.environ.get("OPENAI_API_KEY")
+ if env_key and validate_openai_key(env_key):
+ return env_key
+
+ # Then try to fetch from server using fetch_modal_tokens if available
+ try:
+ from fetch_modal_tokens import get_tokens
+ _, _, api_key = get_tokens()
+ if api_key and validate_openai_key(api_key):
+ # Set in environment for future use
+ os.environ["OPENAI_API_KEY"] = api_key
+ return api_key
+ except ImportError:
+ pass
+ except Exception as e:
+ print(f"⚠️ Error fetching API key from server: {e}")
 
  prompt = "To debug failed commands, an OpenAI API key is needed.\nYou can get your API key from: https://platform.openai.com/api-keys"
  return self.get_credential("openai_api_key", prompt, is_password=True, validate_func=validate_openai_key)
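
Read together, the added lines change how `CredentialsManager.get_openai_api_key` resolves a key: environment variable first, then the token endpoint via `fetch_modal_tokens`, and only then the interactive `get_credential` prompt. A minimal standalone sketch of that order, with indentation restored; the `resolve_openai_key` wrapper is illustrative, not the package's actual method:

```python
import os

def resolve_openai_key(credentials_manager):
    """Illustrative resolution order inferred from the hunk above."""

    def validate_openai_key(key):
        # Basic validation - OpenAI keys usually start with "sk-"
        return key.startswith("sk-") and len(key) > 40

    # 1. Environment variable
    env_key = os.environ.get("OPENAI_API_KEY")
    if env_key and validate_openai_key(env_key):
        return env_key

    # 2. Server-provided key via fetch_modal_tokens, if that module is importable
    try:
        from fetch_modal_tokens import get_tokens
        _, _, api_key = get_tokens()
        if api_key and validate_openai_key(api_key):
            os.environ["OPENAI_API_KEY"] = api_key  # cache for later calls
            return api_key
    except ImportError:
        pass
    except Exception as e:
        print(f"Error fetching API key from server: {e}")

    # 3. Interactive prompt, handled by the credentials manager
    prompt = "To debug failed commands, an OpenAI API key is needed."
    return credentials_manager.get_credential(
        "openai_api_key", prompt, is_password=True, validate_func=validate_openai_key
    )
```
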
@@ -41,7 +41,7 @@ def fetch_default_tokens_from_gitarsenal():
  openai_api_key = data.get("openaiApiKey")
 
  if token_id and token_secret:
- print("✅ Successfully fetched default tokens from gitarsenal.dev")
+ # print("✅ Successfully fetched default tokens from gitarsenal.dev")
  return token_id, token_secret, openai_api_key
  else:
  print("❌ Modal tokens not found in gitarsenal.dev response")
@@ -153,6 +153,13 @@ def get_tokens():
  print("💡 Please check your network connection and API endpoints")
  return None, None, None
 
+ # Debug print the full token values
+ # print("\n🔍 DEBUG: FULL TOKEN VALUES:")
+ # print(f"🔍 DEBUG: MODAL_TOKEN_ID: {token_id}")
+ # print(f"🔍 DEBUG: MODAL_TOKEN_SECRET: {token_secret}")
+ # print(f"🔍 DEBUG: OPENAI_API_KEY: {openai_api_key}")
+ # print("🔍 DEBUG: END OF TOKEN VALUES\n")
+
  # Set the tokens in environment variables
  os.environ["MODAL_TOKEN_ID"] = token_id
  os.environ["MODAL_TOKEN_SECRET"] = token_secret
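
Most of the later hunks lean on the same contract from `fetch_modal_tokens`: `get_tokens()` returns a `(token_id, token_secret, openai_api_key)` tuple and exports the Modal tokens into the process environment. A small sketch of how a caller consumes that contract; only the module, function, and variable names come from the diff, the surrounding handling is assumed:

```python
import os

from fetch_modal_tokens import get_tokens  # module and function names taken from the diff

token_id, token_secret, openai_api_key = get_tokens()
if token_id is None or token_secret is None:
    # Mirrors the "Could not get valid tokens" handling shown later in the diff
    raise ValueError("Could not get valid tokens")

# The Modal tokens are already exported by get_tokens(); the OpenAI key may be None.
os.environ.setdefault("MODAL_TOKEN_ID", token_id)
os.environ.setdefault("MODAL_TOKEN_SECRET", token_secret)
if openai_api_key:
    os.environ.setdefault("OPENAI_API_KEY", openai_api_key)
```
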
@@ -51,8 +51,6 @@ try:
  print(f"✅ Using tokens from proxy server or defaults")
  except (ImportError, ValueError) as e:
  # If the module is not available or tokens are invalid, use hardcoded tokens
- TOKEN_ID = "ak-sLhYqCjkvixiYcb9LAuCHp"
- TOKEN_SECRET = "as-fPzD0Zm0dl6IFAEkhaH9pq" # Real token secret from fr8mafia profile
  print(f"⚠️ Using default tokens")
 
  print("🔧 Fixing Modal token (basic implementation)...")
@@ -27,9 +27,7 @@ try:
  # print(f"✅ Using tokens from proxy server or defaults")
  except ImportError:
  # If the module is not available, use hardcoded tokens
- # print(f"⚠️ Using default tokens")
- TOKEN_ID = "ak-sLhYqCjkvixiYcb9LAuCHp"
- TOKEN_SECRET = "as-fPzD0Zm0dl6IFAEkhaH9pq"
+ print(f"⚠️ Using default tokens")
 
  # print("🔧 Advanced Modal Token Fixer")
 
@@ -26,8 +26,6 @@ try:
  print(f"✅ Using tokens from proxy server or defaults")
  except ImportError:
  # If the module is not available, use hardcoded tokens
- # TOKEN_ID = "ak-sLhYqCjkvixiYcb9LAuCHp"
- # TOKEN_SECRET = "as-fPzD0Zm0dl6IFAEkhaH9pq" # Real token secret from fr8mafia profile
  print(f"⚠️ Using default tokens")
 
  # Set environment variables
@@ -29,8 +29,6 @@ try:
  print(f"✅ Using tokens from proxy server or defaults")
  except ImportError:
  # If the module is not available, use hardcoded tokens
- # TOKEN_ID = "ak-sLhYqCjkvixiYcb9LAuCHp"
- # TOKEN_SECRET = "as-fPzD0Zm0dl6IFAEkhaH9pq" # Real token secret from fr8mafia profile
  print(f"⚠️ Using default tokens")
 
  print("🔧 Modal Token Solution - Comprehensive Fix")
@@ -36,15 +36,21 @@ if args.proxy_api_key:
  # First, try to fetch tokens from the proxy server
  try:
  # Import the fetch_modal_tokens module
- # print("🔄 Fetching tokens from proxy server...")
+ print("🔄 Fetching tokens from proxy server...")
  from fetch_modal_tokens import get_tokens
  token_id, token_secret, openai_api_key = get_tokens()
 
+ # Debug print the tokens
+ # print("🔍 DEBUG: Modal Tokens Fetched:")
+ # print(f"🔍 DEBUG: Token ID: {token_id}")
+ # print(f"🔍 DEBUG: Token Secret: {token_secret}")
+ # print(f"🔍 DEBUG: OpenAI API Key: {openai_api_key}")
+
  # Check if we got valid tokens
  if token_id is None or token_secret is None:
  raise ValueError("Could not get valid tokens")
 
- # print(f"✅ Tokens fetched successfully")
+ print(f"✅ Tokens fetched successfully")
 
  # Explicitly set the environment variables again to be sure
  os.environ["MODAL_TOKEN_ID"] = token_id
@@ -336,6 +342,11 @@ def call_openai_for_debug(command, error_output, api_key=None, current_dir=None,
  print(f"🔍 DEBUG: Error output length: {len(error_output) if error_output else 0}")
  print(f"🔍 DEBUG: Current directory: {current_dir}")
  print(f"🔍 DEBUG: Sandbox available: {sandbox is not None}")
+ print(f"🔍 DEBUG: API key provided: {'Yes' if api_key else 'No'}")
+ if api_key:
+ print(f"🔍 DEBUG: API key value: {api_key}")
+ else:
+ print(f"🔍 DEBUG: API key from environment: {os.environ.get('OPENAI_API_KEY')}")
 
  # Define _to_str function locally to avoid NameError
  def _to_str(maybe_bytes):
@@ -370,6 +381,24 @@ def call_openai_for_debug(command, error_output, api_key=None, current_dir=None,
  # First try environment variable
  api_key = os.environ.get("OPENAI_API_KEY")
  print(f"🔍 DEBUG: API key from environment: {'Found' if api_key else 'Not found'}")
+ if api_key:
+ print(f"🔍 DEBUG: Environment API key value: {api_key}")
+
+ # If not in environment, try to fetch from server using fetch_modal_tokens
+ if not api_key:
+ try:
+ print("🔍 DEBUG: Trying to fetch API key from server...")
+ from fetch_modal_tokens import get_tokens
+ _, _, api_key = get_tokens()
+ if api_key:
+ print("✅ Successfully fetched OpenAI API key from server")
+ print(f"🔍 DEBUG: Fetched OpenAI API key value: {api_key}")
+ # Set in environment for this session
+ os.environ["OPENAI_API_KEY"] = api_key
+ else:
+ print("⚠️ Could not fetch OpenAI API key from server")
+ except Exception as e:
+ print(f"⚠️ Error fetching API key from server: {e}")
 
  # Store the API key in a persistent file if found
  if api_key:
@@ -391,6 +420,7 @@ def call_openai_for_debug(command, error_output, api_key=None, current_dir=None,
  api_key = f.read().strip()
  if api_key:
  print("✅ Loaded OpenAI API key from saved file")
+ print(f"🔍 DEBUG: API key from file: {api_key}")
  print(f"🔍 DEBUG: API key length: {len(api_key)}")
  # Also set in environment for this session
  os.environ["OPENAI_API_KEY"] = api_key
@@ -407,8 +437,14 @@ def call_openai_for_debug(command, error_output, api_key=None, current_dir=None,
  try:
  from credentials_manager import CredentialsManager
  credentials_manager = CredentialsManager()
- api_key = os.environ.get("OPENAI_API_KEY")
- print(f"🔍 DEBUG: API key from credentials manager: {'Found' if api_key else 'Not found'}")
+ api_key = credentials_manager.get_openai_api_key()
+ if api_key:
+ print(f"🔍 DEBUG: API key from credentials manager: Found")
+ print(f"🔍 DEBUG: Credentials manager API key value: {api_key}")
+ # Set in environment for this session
+ os.environ["OPENAI_API_KEY"] = api_key
+ else:
+ print(f"🔍 DEBUG: API key from credentials manager: Not found")
  except ImportError as e:
  print(f"🔍 DEBUG: Credentials manager not available: {e}")
  # Fall back to direct input if credentials_manager is not available
@@ -431,6 +467,7 @@ def call_openai_for_debug(command, error_output, api_key=None, current_dir=None,
  print("❌ No API key provided. Skipping debugging.")
  return None
  print("✅ API key received successfully!")
+ print(f"🔍 DEBUG: User-provided API key: {api_key}")
  # Save the API key to environment for future use in this session
  os.environ["OPENAI_API_KEY"] = api_key
  except KeyboardInterrupt:
@@ -1766,7 +1803,9 @@ cd "{current_dir}"
  print(f"🔍 DEBUG: Sandbox available: {sandbox is not None}")
  print(f"🔍 DEBUG: Debug output preview: {debug_output[:200]}...")
 
- fix_command = call_openai_for_debug(cmd_to_execute, debug_output, current_dir=current_dir, sandbox=sandbox)
+ # Get the API key from environment or use the one that was fetched earlier
+ api_key = os.environ.get("OPENAI_API_KEY")
+ fix_command = call_openai_for_debug(cmd_to_execute, debug_output, api_key=api_key, current_dir=current_dir, sandbox=sandbox)
 
  print(f"🔍 DEBUG: call_openai_for_debug returned: {fix_command}")
 
@@ -2438,7 +2477,7 @@ ssh_app = modal.App("ssh-container-app")
  memory=8192,
  serialized=True,
  )
- def ssh_container_function(ssh_password, repo_url=None, repo_name=None, setup_commands=None):
+ def ssh_container_function(ssh_password, repo_url=None, repo_name=None, setup_commands=None, openai_api_key=None):
  import subprocess
  import time
  import os
@@ -2452,6 +2491,13 @@ def ssh_container_function(ssh_password, repo_url=None, repo_name=None, setup_co
  # Setup environment
  os.environ['PS1'] = r'\[\e[1;32m\]modal:\[\e[1;34m\]\w\[\e[0m\]$ '
 
+ # Set OpenAI API key if provided
+ if openai_api_key:
+ os.environ['OPENAI_API_KEY'] = openai_api_key
+ print(f"✅ Set OpenAI API key in container environment (length: {len(openai_api_key)})")
+ else:
+ print("⚠️ No OpenAI API key provided to container")
+
  # Clone repository if provided
  if repo_url:
  repo_name_from_url = repo_name or repo_url.split('/')[-1].replace('.git', '')
@@ -2523,7 +2569,9 @@ def ssh_container_function(ssh_password, repo_url=None, repo_name=None, setup_co
  print(f"🔍 DEBUG: Error output length: {len(error_output)}")
  print(f"🔍 DEBUG: Current directory: {current_dir}")
 
- fix_command = call_openai_for_debug(cmd, error_output, current_dir=current_dir)
+ # Get the API key from environment or use the one that was fetched earlier
+ api_key = os.environ.get("OPENAI_API_KEY")
+ fix_command = call_openai_for_debug(cmd, error_output, api_key=api_key, current_dir=current_dir)
 
  print(f"🔍 DEBUG: call_openai_for_debug returned: {fix_command}")
 
@@ -2641,12 +2689,16 @@ def create_modal_ssh_container(gpu_type, repo_url=None, repo_name=None, setup_co
  print("🔍 DEBUG: Checking environment variables")
  modal_token_id = os.environ.get("MODAL_TOKEN_ID")
  modal_token = os.environ.get("MODAL_TOKEN")
+ openai_api_key = os.environ.get("OPENAI_API_KEY")
  print(f"🔍 token exists: {'Yes' if modal_token_id else 'No'}")
  print(f"🔍 token exists: {'Yes' if modal_token else 'No'}")
+ print(f"🔍 openai_api_key exists: {'Yes' if openai_api_key else 'No'}")
  if modal_token_id:
  print(f"🔍 token length: {len(modal_token_id)}")
  if modal_token:
  print(f"🔍 token length: {len(modal_token)}")
+ if openai_api_key:
+ print(f"🔍 openai_api_key length: {len(openai_api_key)}")
 
  # Try to access Modal token to check authentication
  try:
@@ -2860,7 +2912,7 @@ def create_modal_ssh_container(gpu_type, repo_url=None, repo_name=None, setup_co
  serialized=True,
  volumes=volumes_config if volumes_config else None,
  )
- def ssh_container_function():
+ def ssh_container_function(ssh_password=None, repo_url=None, repo_name=None, setup_commands=None, openai_api_key=None):
  """Start SSH container with password authentication and optional setup."""
  import subprocess
  import time
@@ -2869,6 +2921,13 @@ def create_modal_ssh_container(gpu_type, repo_url=None, repo_name=None, setup_co
  # Set root password
  subprocess.run(["bash", "-c", f"echo 'root:{ssh_password}' | chpasswd"], check=True)
 
+ # Set OpenAI API key if provided
+ if openai_api_key:
+ os.environ['OPENAI_API_KEY'] = openai_api_key
+ print(f"✅ Set OpenAI API key in container environment (length: {len(openai_api_key)})")
+ else:
+ print("⚠️ No OpenAI API key provided to container")
+
  # Start SSH service
  subprocess.run(["service", "ssh", "start"], check=True)
 
@@ -2934,7 +2993,9 @@ def create_modal_ssh_container(gpu_type, repo_url=None, repo_name=None, setup_co
  print(f"🔍 DEBUG: Error output length: {len(error_output)}")
  print(f"🔍 DEBUG: Current directory: {current_dir}")
 
- fix_command = call_openai_for_debug(cmd, error_output, current_dir=current_dir)
+ # Get the API key from environment or use the one that was fetched earlier
+ api_key = os.environ.get("OPENAI_API_KEY")
+ fix_command = call_openai_for_debug(cmd, error_output, api_key=api_key, current_dir=current_dir)
 
  print(f"🔍 DEBUG: call_openai_for_debug returned: {fix_command}")
 
@@ -3030,7 +3091,9 @@ def create_modal_ssh_container(gpu_type, repo_url=None, repo_name=None, setup_co
  # Start the container in a new thread to avoid blocking
  with modal.enable_output():
  with app.run():
- ssh_container_function.remote()
+ # Get the API key from environment
+ api_key = os.environ.get("OPENAI_API_KEY")
+ ssh_container_function.remote(ssh_password, repo_url, repo_name, setup_commands, api_key)
 
  # Clean up Modal token after container is successfully created
  cleanup_modal_token()
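
Taken together, these hunks thread the OpenAI key from the host process into the Modal container: the key is read from the caller's environment, passed as an extra argument to the container function's `.remote(...)` call, and re-exported as `OPENAI_API_KEY` inside the container so `call_openai_for_debug` can use it there. A condensed sketch of that flow; the argument names follow the diff, while the decorator settings and placeholder values are assumptions:

```python
import os
import modal

app = modal.App("ssh-container-app")

@app.function(serialized=True)  # GPU, memory, and volume settings omitted for brevity
def ssh_container_function(ssh_password, repo_url=None, repo_name=None,
                           setup_commands=None, openai_api_key=None):
    # Inside the container: export the key so the debugging helper can find it.
    if openai_api_key:
        os.environ["OPENAI_API_KEY"] = openai_api_key
    else:
        print("No OpenAI API key provided to container")
    # ... SSH setup, repo clone, and setup commands would follow here ...

# On the host: read the key once and pass it through explicitly.
api_key = os.environ.get("OPENAI_API_KEY")
with modal.enable_output():
    with app.run():
        ssh_container_function.remote("example-password", None, None, None, api_key)
```
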
@@ -3722,7 +3785,7 @@ def create_ssh_container_function(gpu_type="a10g", timeout_minutes=60, volume=No
  serialized=True,
  volumes=volumes if volumes else None,
  )
- def ssh_container(ssh_password, repo_url=None, repo_name=None, setup_commands=None):
+ def ssh_container(ssh_password, repo_url=None, repo_name=None, setup_commands=None, openai_api_key=None):
  import subprocess
  import time
  import os
@@ -3730,6 +3793,13 @@ def create_ssh_container_function(gpu_type="a10g", timeout_minutes=60, volume=No
  # Set root password
  subprocess.run(["bash", "-c", f"echo 'root:{ssh_password}' | chpasswd"], check=True)
 
+ # Set OpenAI API key if provided
+ if openai_api_key:
+ os.environ['OPENAI_API_KEY'] = openai_api_key
+ print(f"✅ Set OpenAI API key in container environment (length: {len(openai_api_key)})")
+ else:
+ print("⚠️ No OpenAI API key provided to container")
+
  # Start SSH service
  subprocess.run(["service", "ssh", "start"], check=True)
 
@@ -19,8 +19,6 @@ try:
  print(f"✅ Using tokens from proxy server or defaults")
  except ImportError:
  # If the module is not available, use hardcoded tokens
- # TOKEN_ID = "ak-sLhYqCjkvixiYcb9LAuCHp"
- # TOKEN_SECRET = "as-fPzD0Zm0dl6IFAEkhaH9pq" # Real token secret from fr8mafia profile
  print(f"⚠️ Using hardcoded tokens")
 
  # Set tokens directly in environment