cloudos-cli 2.35.0__tar.gz → 2.37.0__tar.gz

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
Files changed (35)
  1. {cloudos_cli-2.35.0 → cloudos_cli-2.37.0}/PKG-INFO +85 -17
  2. {cloudos_cli-2.35.0 → cloudos_cli-2.37.0}/README.md +84 -16
  3. {cloudos_cli-2.35.0 → cloudos_cli-2.37.0}/cloudos_cli/__main__.py +124 -36
  4. cloudos_cli-2.37.0/cloudos_cli/_version.py +1 -0
  5. {cloudos_cli-2.35.0 → cloudos_cli-2.37.0}/cloudos_cli/datasets/datasets.py +33 -3
  6. {cloudos_cli-2.35.0 → cloudos_cli-2.37.0}/cloudos_cli/jobs/job.py +98 -8
  7. {cloudos_cli-2.35.0 → cloudos_cli-2.37.0}/cloudos_cli/utils/__init__.py +3 -2
  8. cloudos_cli-2.37.0/cloudos_cli/utils/array_job.py +254 -0
  9. {cloudos_cli-2.35.0 → cloudos_cli-2.37.0}/cloudos_cli/utils/requests.py +35 -0
  10. {cloudos_cli-2.35.0 → cloudos_cli-2.37.0}/cloudos_cli.egg-info/PKG-INFO +85 -17
  11. {cloudos_cli-2.35.0 → cloudos_cli-2.37.0}/cloudos_cli.egg-info/SOURCES.txt +1 -0
  12. cloudos_cli-2.35.0/cloudos_cli/_version.py +0 -1
  13. {cloudos_cli-2.35.0 → cloudos_cli-2.37.0}/LICENSE +0 -0
  14. {cloudos_cli-2.35.0 → cloudos_cli-2.37.0}/cloudos_cli/__init__.py +0 -0
  15. {cloudos_cli-2.35.0 → cloudos_cli-2.37.0}/cloudos_cli/clos.py +0 -0
  16. {cloudos_cli-2.35.0 → cloudos_cli-2.37.0}/cloudos_cli/configure/__init__.py +0 -0
  17. {cloudos_cli-2.35.0 → cloudos_cli-2.37.0}/cloudos_cli/configure/configure.py +0 -0
  18. {cloudos_cli-2.35.0 → cloudos_cli-2.37.0}/cloudos_cli/datasets/__init__.py +0 -0
  19. {cloudos_cli-2.35.0 → cloudos_cli-2.37.0}/cloudos_cli/import_wf/__init__.py +0 -0
  20. {cloudos_cli-2.35.0 → cloudos_cli-2.37.0}/cloudos_cli/import_wf/import_wf.py +0 -0
  21. {cloudos_cli-2.35.0 → cloudos_cli-2.37.0}/cloudos_cli/jobs/__init__.py +0 -0
  22. {cloudos_cli-2.35.0 → cloudos_cli-2.37.0}/cloudos_cli/queue/__init__.py +0 -0
  23. {cloudos_cli-2.35.0 → cloudos_cli-2.37.0}/cloudos_cli/queue/queue.py +0 -0
  24. {cloudos_cli-2.35.0 → cloudos_cli-2.37.0}/cloudos_cli/utils/cloud.py +0 -0
  25. {cloudos_cli-2.35.0 → cloudos_cli-2.37.0}/cloudos_cli/utils/details.py +0 -0
  26. {cloudos_cli-2.35.0 → cloudos_cli-2.37.0}/cloudos_cli/utils/errors.py +0 -0
  27. {cloudos_cli-2.35.0 → cloudos_cli-2.37.0}/cloudos_cli/utils/resources.py +0 -0
  28. {cloudos_cli-2.35.0 → cloudos_cli-2.37.0}/cloudos_cli.egg-info/dependency_links.txt +0 -0
  29. {cloudos_cli-2.35.0 → cloudos_cli-2.37.0}/cloudos_cli.egg-info/entry_points.txt +0 -0
  30. {cloudos_cli-2.35.0 → cloudos_cli-2.37.0}/cloudos_cli.egg-info/requires.txt +0 -0
  31. {cloudos_cli-2.35.0 → cloudos_cli-2.37.0}/cloudos_cli.egg-info/top_level.txt +0 -0
  32. {cloudos_cli-2.35.0 → cloudos_cli-2.37.0}/setup.cfg +0 -0
  33. {cloudos_cli-2.35.0 → cloudos_cli-2.37.0}/setup.py +0 -0
  34. {cloudos_cli-2.35.0 → cloudos_cli-2.37.0}/tests/__init__.py +0 -0
  35. {cloudos_cli-2.35.0 → cloudos_cli-2.37.0}/tests/functions_for_pytest.py +0 -0
@@ -1,6 +1,6 @@
1
1
  Metadata-Version: 2.4
2
2
  Name: cloudos_cli
3
- Version: 2.35.0
3
+ Version: 2.37.0
4
4
  Summary: Python package for interacting with CloudOS
5
5
  Home-page: https://github.com/lifebit-ai/cloudos-cli
6
6
  Author: David Piñeyro
@@ -512,6 +512,51 @@ This assumes the interpreter is available on the container’s $PATH. If not, yo
512
512
 
513
513
  These options provide flexibility for configuring and running bash array jobs, allowing you to tailor execution to specific requirements.
514
514
 
515
+ #### Use multiple projects for files in `--parameter` option
516
+
517
+ The `--parameter` option can specify a file input located in a different project than the one given with `--project-name`. The files can only be located inside the project's `Data` subfolder, not `Cohorts` or `Analyses Results`. The accepted structures for different parameter projects are:
518
+ - `-p/--parameter "--file=<project>/Data/file.txt"`
519
+ - `-p/--parameter "--file=<project>/Data/subfolder/file.txt"`
520
+ - `-p/--parameter "--file=Data/subfolder/file.txt"` (the same project as `--project-name`)
521
+ - `-p/--parameter "--file=<project>/Data/subfolder/*.txt"`
522
+ - `-p/--parameter "--file=<project>/Data/*.txt"`
523
+ - `-p/--parameter "--file=Data/*.txt"` (the same project as `--project-name`)
524
+
525
+ The project should be specified at the beginning of the file path. For example:
526
+
527
+ ```console
528
+ cloudos bash array-job \
529
+ -p file=Data/input.csv \
530
+ ...
531
+ ```
532
+ This will point to the global project, specified with `--project-name`. In contrast:
533
+
534
+ ```console
535
+ cloudos bash array-job \
536
+ -p data=Data/input.csv \
537
+ -p exp=PROJECT_EXPRESSION/Data/input.csv \
538
+ --project-name "ADIPOSE"
539
+ ...
540
+ ```
541
+ Here, parameter `exp` points to a project named `PROJECT_EXPRESSION` in the File Explorer, while parameter `data` is resolved in the global project `ADIPOSE`.
542
+
543
+ In addition to single files, a parameter can also take a glob pattern, for example:
544
+
545
+ ```console
546
+ cloudos bash array-job \
547
+ -p data=Data/input.csv \
548
+ -p exp="PROJECT_EXPRESSION/Data/*.csv" \
549
+ --project-name "ADIPOSE"
550
+ ...
551
+ ```
552
+ This will match all files with the `.csv` extension in the specified folder.
553
+
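+ Locally, glob matching of this kind can be illustrated with Python's standard `fnmatch` module (an illustration only; CloudOS performs the equivalent matching server-side against the project's `Data` folder):

```python
import fnmatch

# Illustrative file listing; a glob such as 'Data/*.csv' selects
# only the files whose paths match the pattern.
files = ["Data/input.csv", "Data/notes.txt", "Data/expr.csv"]
matched = [f for f in files if fnmatch.fnmatch(f, "Data/*.csv")]
print(matched)  # ['Data/input.csv', 'Data/expr.csv']
```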
554
+ > [!NOTE]
555
+ > When specifying glob patterns, it is best to wrap them in double quotes so that your shell does not expand the pattern locally, e.g. `-p exp="PROJECT_EXPRESSION/Data/*.csv"`.
556
+
557
+ > [!NOTE]
558
+ > Project names in the `--parameter` option may start with or without a leading forward slash `/`; `-p data=/PROJECT1/Data/input.csv` and `-p data=PROJECT1/Data/input.csv` are equivalent.
559
+
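+ The splitting rule described above can be sketched as follows (`split_parameter_value` is a hypothetical helper, not part of cloudos-cli; an empty project means the value resolves in the global project given by `--project-name`):

```python
def split_parameter_value(value: str):
    # A leading '/' is ignored, matching the note above.
    parts = value.lstrip("/").split("/")
    # A value starting with 'Data' has no explicit project component.
    if parts[0] == "Data":
        return "", "/".join(parts)
    return parts[0], "/".join(parts[1:])

print(split_parameter_value("Data/input.csv"))                 # ('', 'Data/input.csv')
print(split_parameter_value("/PROJECT1/Data/input.csv"))       # ('PROJECT1', 'Data/input.csv')
print(split_parameter_value("PROJECT_EXPRESSION/Data/*.csv"))  # ('PROJECT_EXPRESSION', 'Data/*.csv')
```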
515
560
  #### Get path to logs of job from CloudOS
516
561
 
517
562
  Get the path to "Nextflow logs", "Nextflow standard output", and "trace" files. It can be used only on your user's jobs, with any status.
@@ -1013,6 +1058,38 @@ Please, note that in the above example a preconfigured profile has been used. If
1013
1058
  --workspace-id $WORKSPACE_ID \
1014
1059
  --project-name $PROJECT_NAME
1015
1060
  ```
1061
+
1062
+ #### Copying files and folders
1063
+
1064
+ Files and folders can be copied **from** anywhere in the project **to** `Data` or any of its subfolders programmatically (e.g. `Data`, `Data/folder/file.txt`).
1065
+
1066
+ 1. The copy can happen **within the same project** by running the following command:
1067
+ ```
1068
+ cloudos datasets cp <source_path> <destination_path> --profile <profile_name>
1069
+ ```
1070
+ where both the source and the destination project are the one defined in the profile.
1071
+
1072
+ 2. The copy can also happen **across different projects** within the same workspace by running the following command:
1073
+ ```
1074
+ cloudos datasets cp <source_path> <destination_path> --profile <profile_name> --destination-project-name <project_name>
1075
+ ```
1076
+ In this case, only the source project is the one specified in the profile.
1077
+
1078
+ The `source_path` must be a full path; the `destination_path` must start with `Data` and end with the folder into which the file/folder will be copied. An example of such a command is:
1079
+
1080
+ ```
1081
+ cloudos datasets cp AnalysesResults/my_analysis/results/my_plot.png Data/plots
1082
+ ```
1083
+
1084
+ Please note that in the above example a preconfigured profile has been used. If no profile is provided and there is no default profile, the user will also need to provide the following flags:
1085
+ ```bash
1086
+ --cloudos-url $CLOUDOS \
1087
+ --apikey $MY_API_KEY \
1088
+ --workspace-id $WORKSPACE_ID \
1089
+ --project-name $PROJECT_NAME
1090
+ ```
1091
+
1092
+
1016
1093
  #### Create a (virtual) folder
1017
1094
 
1018
1095
  New folders can be created within the `Data` dataset and its subfolders using the following command
@@ -1032,30 +1109,21 @@ Please, note that in the above example a preconfigured profile has been used. If
1032
1109
  --workspace-id $WORKSPACE_ID \
1033
1110
  --project-name $PROJECT_NAME
1034
1111
  ```
1112
+ #### Removing files and folders
1035
1113
 
1036
- #### Copying files and folders
1037
-
1038
- Files and folders can be copied **from** anywhere in the project **to** `Data` or any of its subfolders programmatically (i.e `Data`, `Data/folder/file.txt`).
1114
+ Files and folders can be removed from the File Explorer (in the `Data` dataset and its subfolders) using the following command:
1039
1115
 
1040
- 1. The copy can happen **within the same project** running the following command:
1041
1116
  ```
1042
- cloudos datasets cp <souce_path> <destination_path> --profile <profile name>
1117
+ cloudos datasets rm <path>
1043
1118
  ```
1044
- where the source project as well as the destination one is the one defined in the profile.
1119
+ where `path` is the full path to the file/folder to be removed.
1045
1120
 
1046
- 2. The move can also happen **across different projects** within the same workspace by running the following command
1047
- ```
1048
- cloudos datasets cp <source_path> <destiantion_path> --profile <profile_name> --destination-project-name <project_name>
1049
- ```
1050
- In this case, only the source project is the one specified in the profile.
1051
-
1052
- Any of the `source_path` must be a full path; any `destination_path` must be a path starting with `Data` and finishing with the folder where to move the file/folder. An example of such command is:
1121
+ Please be aware that removing files and folders only removes them from the File Explorer, not from the corresponding cloud storage.
1053
1122
 
1054
- ```
1055
- cloudos datasets cp AnalysesResults/my_analysis/results/my_plot.png Data/plots
1056
- ```
1123
+ Please keep in mind that you may only remove files or folders in `Data` or its subfolders.
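+ This `Data`-only restriction amounts to a simple path check (a sketch of the rule; the CLI's actual validation may differ in detail — `is_removable` is an illustrative name):

```python
def is_removable(path: str) -> bool:
    # Only items inside the 'Data' dataset may be removed; a leading
    # slash is tolerated and 'Data' itself is not removable.
    return path.strip("/").startswith("Data/")

print(is_removable("Data/folderA/file.txt"))   # True
print(is_removable("/Data/plots"))             # True
print(is_removable("AnalysesResults/x.png"))   # False
print(is_removable("Data"))                    # False
```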
1057
1124
 
1058
1125
  Please note that in the above example a preconfigured profile has been used. If no profile is provided and there is no default profile, the user will also need to provide the following flags:
1126
+
1059
1127
  ```bash
1060
1128
  --cloudos-url $CLOUDOS \
1061
1129
  --apikey $MY_API_KEY \
@@ -477,6 +477,51 @@ This assumes the interpreter is available on the container’s $PATH. If not, yo
477
477
 
478
478
  These options provide flexibility for configuring and running bash array jobs, allowing you to tailor execution to specific requirements.
479
479
 
480
+ #### Use multiple projects for files in `--parameter` option
481
+
482
+ The `--parameter` option can specify a file input located in a different project than the one given with `--project-name`. The files can only be located inside the project's `Data` subfolder, not `Cohorts` or `Analyses Results`. The accepted structures for different parameter projects are:
483
+ - `-p/--parameter "--file=<project>/Data/file.txt"`
484
+ - `-p/--parameter "--file=<project>/Data/subfolder/file.txt"`
485
+ - `-p/--parameter "--file=Data/subfolder/file.txt"` (the same project as `--project-name`)
486
+ - `-p/--parameter "--file=<project>/Data/subfolder/*.txt"`
487
+ - `-p/--parameter "--file=<project>/Data/*.txt"`
488
+ - `-p/--parameter "--file=Data/*.txt"` (the same project as `--project-name`)
489
+
490
+ The project should be specified at the beginning of the file path. For example:
491
+
492
+ ```console
493
+ cloudos bash array-job \
494
+ -p file=Data/input.csv \
495
+ ...
496
+ ```
497
+ This will point to the global project, specified with `--project-name`. In contrast:
498
+
499
+ ```console
500
+ cloudos bash array-job \
501
+ -p data=Data/input.csv \
502
+ -p exp=PROJECT_EXPRESSION/Data/input.csv \
503
+ --project-name "ADIPOSE"
504
+ ...
505
+ ```
506
+ Here, parameter `exp` points to a project named `PROJECT_EXPRESSION` in the File Explorer, while parameter `data` is resolved in the global project `ADIPOSE`.
507
+
508
+ In addition to single files, a parameter can also take a glob pattern, for example:
509
+
510
+ ```console
511
+ cloudos bash array-job \
512
+ -p data=Data/input.csv \
513
+ -p exp="PROJECT_EXPRESSION/Data/*.csv" \
514
+ --project-name "ADIPOSE"
515
+ ...
516
+ ```
517
+ This will match all files with the `.csv` extension in the specified folder.
518
+
519
+ > [!NOTE]
520
+ > When specifying glob patterns, it is best to wrap them in double quotes so that your shell does not expand the pattern locally, e.g. `-p exp="PROJECT_EXPRESSION/Data/*.csv"`.
521
+
522
+ > [!NOTE]
523
+ > Project names in the `--parameter` option may start with or without a leading forward slash `/`; `-p data=/PROJECT1/Data/input.csv` and `-p data=PROJECT1/Data/input.csv` are equivalent.
524
+
480
525
  #### Get path to logs of job from CloudOS
481
526
 
482
527
  Get the path to "Nextflow logs", "Nextflow standard output", and "trace" files. It can be used only on your user's jobs, with any status.
@@ -978,6 +1023,38 @@ Please, note that in the above example a preconfigured profile has been used. If
978
1023
  --workspace-id $WORKSPACE_ID \
979
1024
  --project-name $PROJECT_NAME
980
1025
  ```
1026
+
1027
+ #### Copying files and folders
1028
+
1029
+ Files and folders can be copied **from** anywhere in the project **to** `Data` or any of its subfolders programmatically (e.g. `Data`, `Data/folder/file.txt`).
1030
+
1031
+ 1. The copy can happen **within the same project** by running the following command:
1032
+ ```
1033
+ cloudos datasets cp <source_path> <destination_path> --profile <profile_name>
1034
+ ```
1035
+ where both the source and the destination project are the one defined in the profile.
1036
+
1037
+ 2. The copy can also happen **across different projects** within the same workspace by running the following command:
1038
+ ```
1039
+ cloudos datasets cp <source_path> <destination_path> --profile <profile_name> --destination-project-name <project_name>
1040
+ ```
1041
+ In this case, only the source project is the one specified in the profile.
1042
+
1043
+ The `source_path` must be a full path; the `destination_path` must start with `Data` and end with the folder into which the file/folder will be copied. An example of such a command is:
1044
+
1045
+ ```
1046
+ cloudos datasets cp AnalysesResults/my_analysis/results/my_plot.png Data/plots
1047
+ ```
1048
+
1049
+ Please note that in the above example a preconfigured profile has been used. If no profile is provided and there is no default profile, the user will also need to provide the following flags:
1050
+ ```bash
1051
+ --cloudos-url $CLOUDOS \
1052
+ --apikey $MY_API_KEY \
1053
+ --workspace-id $WORKSPACE_ID \
1054
+ --project-name $PROJECT_NAME
1055
+ ```
1056
+
1057
+
981
1058
  #### Create a (virtual) folder
982
1059
 
983
1060
  New folders can be created within the `Data` dataset and its subfolders using the following command
@@ -997,30 +1074,21 @@ Please, note that in the above example a preconfigured profile has been used. If
997
1074
  --workspace-id $WORKSPACE_ID \
998
1075
  --project-name $PROJECT_NAME
999
1076
  ```
1077
+ #### Removing files and folders
1000
1078
 
1001
- #### Copying files and folders
1002
-
1003
- Files and folders can be copied **from** anywhere in the project **to** `Data` or any of its subfolders programmatically (i.e `Data`, `Data/folder/file.txt`).
1079
+ Files and folders can be removed from the File Explorer (in the `Data` dataset and its subfolders) using the following command:
1004
1080
 
1005
- 1. The copy can happen **within the same project** running the following command:
1006
1081
  ```
1007
- cloudos datasets cp <souce_path> <destination_path> --profile <profile name>
1082
+ cloudos datasets rm <path>
1008
1083
  ```
1009
- where the source project as well as the destination one is the one defined in the profile.
1084
+ where `path` is the full path to the file/folder to be removed.
1010
1085
 
1011
- 2. The move can also happen **across different projects** within the same workspace by running the following command
1012
- ```
1013
- cloudos datasets cp <source_path> <destiantion_path> --profile <profile_name> --destination-project-name <project_name>
1014
- ```
1015
- In this case, only the source project is the one specified in the profile.
1016
-
1017
- Any of the `source_path` must be a full path; any `destination_path` must be a path starting with `Data` and finishing with the folder where to move the file/folder. An example of such command is:
1086
+ Please be aware that removing files and folders only removes them from the File Explorer, not from the corresponding cloud storage.
1018
1087
 
1019
- ```
1020
- cloudos datasets cp AnalysesResults/my_analysis/results/my_plot.png Data/plots
1021
- ```
1088
+ Please keep in mind that you may only remove files or folders in `Data` or its subfolders.
1022
1089
 
1023
1090
  Please note that in the above example a preconfigured profile has been used. If no profile is provided and there is no default profile, the user will also need to provide the following flags:
1091
+
1024
1092
  ```bash
1025
1093
  --cloudos-url $CLOUDOS \
1026
1094
  --apikey $MY_API_KEY \
@@ -16,6 +16,7 @@ from rich.table import Table
16
16
  from cloudos_cli.datasets import Datasets
17
17
  from cloudos_cli.utils.resources import ssl_selector, format_bytes
18
18
  from rich.style import Style
19
+ from cloudos_cli.utils.array_job import generate_datasets_for_project
19
20
  from cloudos_cli.utils.details import get_path
20
21
 
21
22
 
@@ -90,7 +91,8 @@ def run_cloudos_cli(ctx):
90
91
  'mv': shared_config,
91
92
  'rename': shared_config,
92
93
  'cp': shared_config,
93
- 'mkdir': shared_config
94
+ 'mkdir': shared_config,
95
+ 'rm': shared_config
94
96
  }
95
97
  })
96
98
  else:
@@ -139,7 +141,8 @@ def run_cloudos_cli(ctx):
139
141
  'mv': shared_config,
140
142
  'rename': shared_config,
141
143
  'cp': shared_config,
142
- 'mkdir': shared_config
144
+ 'mkdir': shared_config,
145
+ 'rm': shared_config
143
146
  }
144
147
  })
145
148
 
@@ -2157,7 +2160,7 @@ def run_bash_job(ctx,
2157
2160
  hpc_id=None,
2158
2161
  cost_limit=cost_limit,
2159
2162
  verify=verify_ssl,
2160
- command=command,
2163
+ command={"command": command},
2161
2164
  cpus=cpus,
2162
2165
  memory=memory)
2163
2166
 
@@ -2217,7 +2220,12 @@ def run_bash_job(ctx,
2217
2220
  help=('A single parameter to pass to the job call. It should be in the ' +
2218
2221
  'following form: parameter_name=parameter_value. E.g.: ' +
2219
2222
  '-p --test=value or -p -test=value or -p test=value. You can use this option as many ' +
2220
- 'times as parameters you want to include.'))
2223
+ 'times as parameters you want to include. ' +
2224
+ 'For parameters pointing to a file, the format expected is ' +
2225
+ 'parameter_name=<project>/Data/parameter_value. The parameter value must be a ' +
2226
+ 'file located in the `Data` subfolder. If no <project> is specified, it defaults to ' +
2227
+ 'the project specified by the profile or --project-name parameter. ' +
2228
+ 'E.g.: -p "--file=Data/file.txt" or -p "--file=<project>/Data/folder/file.txt"'))
2221
2229
  @click.option('--job-name',
2222
2230
  help='The name of the job. Default=new_job.',
2223
2231
  default='new_job')
@@ -2403,35 +2411,6 @@ def run_bash_array_job(ctx,
2403
2411
  "|": { "api": "%7C", "file": "|" }
2404
2412
  }
2405
2413
 
2406
- # Setup datasets
2407
- try:
2408
- ds = Datasets(
2409
- cloudos_url=cloudos_url,
2410
- apikey=apikey,
2411
- workspace_id=workspace_id,
2412
- project_name=array_file_project,
2413
- verify=verify_ssl,
2414
- cromwell_token=None
2415
- )
2416
- if custom_script_project is not None:
2417
- # If a custom script project is specified, create a new Datasets object for it
2418
- # This allows the user to run custom scripts in a different project
2419
- ds_custom = Datasets(
2420
- cloudos_url=cloudos_url,
2421
- apikey=apikey,
2422
- workspace_id=workspace_id,
2423
- project_name=custom_script_project,
2424
- verify=verify_ssl,
2425
- cromwell_token=None
2426
- )
2427
- except BadRequestException as e:
2428
- if 'Forbidden' in str(e):
2429
- print('[Error] It seems your call is not authorised. Please check if ' +
2430
- 'your workspace is restricted by Airlock and if your API key is valid.')
2431
- sys.exit(1)
2432
- else:
2433
- raise e
2434
-
2435
2414
  # setup important options for the job
2436
2415
  if do_not_save_logs:
2437
2416
  save_logs = False
@@ -2451,7 +2430,12 @@ def run_bash_array_job(ctx,
2451
2430
  repository_platform=repository_platform, verify=verify_ssl)
2452
2431
 
2453
2432
  # retrieve columns
2454
- r = j.retrieve_cols_from_array_file(array_file, ds, separators[separator]['api'], verify_ssl)
2433
+ r = j.retrieve_cols_from_array_file(
2434
+ array_file,
2435
+ generate_datasets_for_project(cloudos_url, apikey, workspace_id, project_name, verify_ssl),
2436
+ separators[separator]['api'],
2437
+ verify_ssl
2438
+ )
2455
2439
 
2456
2440
  if not disable_column_check:
2457
2441
  columns = json.loads(r.content).get("headers", None)
@@ -2468,7 +2452,12 @@ def run_bash_array_job(ctx,
2468
2452
  columns = []
2469
2453
 
2470
2454
  # setup parameters for the job
2471
- cmd = j.setup_params_array_file(custom_script_path, ds_custom, command, separators[separator]['file'])
2455
+ cmd = j.setup_params_array_file(
2456
+ custom_script_path,
2457
+ generate_datasets_for_project(cloudos_url, apikey, workspace_id, custom_script_project, verify_ssl),
2458
+ command,
2459
+ separators[separator]['file']
2460
+ )
2472
2461
 
2473
2462
  # check columns in the array file vs parameters added
2474
2463
  if not disable_column_check and array_parameter:
@@ -3044,7 +3033,7 @@ def copy_item_cli(ctx, source_path, destination_path, apikey, cloudos_url,
3044
3033
  sys.exit(1)
3045
3034
  # Find the source item
3046
3035
  source_item = None
3047
- for item in source_content.get('files' or 'folders', {}):
3036
+ for item in source_content.get('files', []) + source_content.get('folders', []):
3048
3037
  if item.get("name") == source_name:
3049
3038
  source_item = item
3050
3039
  break
@@ -3206,5 +3195,104 @@ def mkdir_item(ctx, new_folder_path, apikey, cloudos_url,
3206
3195
  sys.exit(1)
3207
3196
 
3208
3197
 
3198
+ @datasets.command(name="rm")
3199
+ @click.argument("target_path", required=True)
3200
+ @click.option('-k', '--apikey', required=True, help='Your CloudOS API key.')
3201
+ @click.option('-c', '--cloudos-url', default=CLOUDOS_URL, required=True, help='The CloudOS URL.')
3202
+ @click.option('--workspace-id', required=True, help='The CloudOS workspace ID.')
3203
+ @click.option('--project-name', required=True, help='The project name.')
3204
+ @click.option('--disable-ssl-verification', is_flag=True, help='Disable SSL certificate verification.')
3205
+ @click.option('--ssl-cert', help='Path to your SSL certificate file.')
3206
+ @click.option('--profile', default=None, help='Profile to use from the config file.')
3207
+ @click.pass_context
3208
+ def rm_item(ctx, target_path, apikey, cloudos_url,
3209
+ workspace_id, project_name,
3210
+ disable_ssl_verification, ssl_cert, profile):
3211
+ """
3212
+ Delete a file or folder in a CloudOS project.
3213
+
3214
+ TARGET_PATH [path]: the full path to the file or folder to delete. Must start with 'Data'. \n
3215
+ E.g.: 'Data/folderA/file.txt' or 'Data/my_analysis/results/folderB'
3216
+ """
3217
+ if not target_path.strip("/").startswith("Data/"):
3218
+ click.echo("[ERROR] TARGET_PATH must start with 'Data/', pointing to a file or folder.", err=True)
3219
+ sys.exit(1)
3220
+ click.echo("Loading configuration profile...")
3221
+ config_manager = ConfigurationProfile()
3222
+ required_dict = {
3223
+ 'apikey': True,
3224
+ 'workspace_id': True,
3225
+ 'workflow_name': False,
3226
+ 'project_name': True
3227
+ }
3228
+
3229
+ apikey, cloudos_url, workspace_id, workflow_name, repository_platform, execution_platform, project_name = (
3230
+ config_manager.load_profile_and_validate_data(
3231
+ ctx,
3232
+ INIT_PROFILE,
3233
+ CLOUDOS_URL,
3234
+ profile=profile,
3235
+ required_dict=required_dict,
3236
+ apikey=apikey,
3237
+ cloudos_url=cloudos_url,
3238
+ workspace_id=workspace_id,
3239
+ workflow_name=None,
3240
+ repository_platform=None,
3241
+ execution_platform=None,
3242
+ project_name=project_name
3243
+ )
3244
+ )
3245
+
3246
+ verify_ssl = ssl_selector(disable_ssl_verification, ssl_cert)
3247
+
3248
+ client = Datasets(
3249
+ cloudos_url=cloudos_url,
3250
+ apikey=apikey,
3251
+ workspace_id=workspace_id,
3252
+ project_name=project_name,
3253
+ verify=verify_ssl,
3254
+ cromwell_token=None
3255
+ )
3256
+
3257
+ parts = target_path.strip("/").split("/")
3258
+ parent_path = "/".join(parts[:-1])
3259
+ item_name = parts[-1]
3260
+
3261
+ try:
3262
+ contents = client.list_folder_content(parent_path)
3263
+ except Exception as e:
3264
+ click.echo(f"[ERROR] Could not list contents at '{parent_path or '[project root]'}': {str(e)}", err=True)
3265
+ sys.exit(1)
3266
+
3267
+ found_item = None
3268
+ for item in contents.get('files', []) + contents.get('folders', []):
3269
+ if item.get("name") == item_name:
3270
+ found_item = item
3271
+ break
3272
+
3273
+ if not found_item:
3274
+ click.echo(f"[ERROR] Item '{item_name}' not found in '{parent_path or '[project root]'}'", err=True)
3275
+ sys.exit(1)
3276
+
3277
+ item_id = found_item["_id"]
3278
+ kind = "Folder" if "folderType" in found_item else "File"
3279
+
3280
+ click.echo(f"Deleting {kind} '{item_name}' from '{parent_path or '[root]'}'...")
3281
+ try:
3282
+ response = client.delete_item(item_id=item_id, kind=kind)
3283
+ if response.ok:
3284
+ click.secho(
3285
+ f"[SUCCESS] {kind} '{item_name}' was deleted from '{parent_path or '[root]'}'.",
3286
+ fg="green", bold=True
3287
+ )
3288
+ click.secho("This item will still be available on your Cloud Provider.", fg="yellow")
3289
+ else:
3290
+ click.echo(f"[ERROR] Deletion failed: {response.status_code} - {response.text}", err=True)
3291
+ sys.exit(1)
3292
+ except Exception as e:
3293
+ click.echo(f"[ERROR] Delete operation failed: {str(e)}", err=True)
3294
+ sys.exit(1)
3295
+
3296
+
3209
3297
  if __name__ == "__main__":
3210
3298
  run_cloudos_cli()
@@ -0,0 +1 @@
1
+ __version__ = '2.37.0'
@@ -5,7 +5,7 @@ This is the main class for file explorer (datasets).
5
5
  from dataclasses import dataclass
6
6
  from typing import Union
7
7
  from cloudos_cli.clos import Cloudos
8
- from cloudos_cli.utils.requests import retry_requests_get, retry_requests_put, retry_requests_post
8
+ from cloudos_cli.utils.requests import retry_requests_get, retry_requests_put, retry_requests_post, retry_requests_delete
9
9
  import json
10
10
 
11
11
  @dataclass
@@ -237,7 +237,7 @@ class Datasets(Cloudos):
237
237
  else:
238
238
  item["s3Prefix"] = item['path']
239
239
  item["s3BucketName"] = s3_bucket_name
240
-
240
+ item["fileType"] = "S3File"
241
241
  normalized["files"].append(item)
242
242
 
243
243
  return normalized
@@ -436,7 +436,7 @@ class Datasets(Cloudos):
436
436
  elif item.get("fileType") == "S3File":
437
437
  payload = {
438
438
  "s3BucketName": item["s3BucketName"],
439
- "s3ObjectKey": item["s3ObjectKey"],
439
+ "s3ObjectKey": item.get("s3ObjectKey") or item.get("s3Prefix"),
440
440
  "name": item["name"],
441
441
  "parent": parent,
442
442
  "isManagedByLifebit": item.get("isManagedByLifebit", False),
@@ -487,4 +487,34 @@ class Datasets(Cloudos):
487
487
  }
488
488
 
489
489
  response = retry_requests_post(url, headers=headers, json=payload, verify=self.verify)
490
+ return response
491
+
492
+ def delete_item(self, item_id: str, kind: str):
493
+ """
494
+ Delete a file or folder in CloudOS.
495
+
496
+ Parameters
497
+ ----------
498
+ item_id : str
499
+ The ID of the file or folder to delete.
500
+ kind : str
501
+ Must be either "File" or "Folder".
502
+
503
+ Returns
504
+ -------
505
+ response : requests.Response
506
+ The response object from the CloudOS API.
507
+ """
508
+ if kind not in ("File", "Folder"):
509
+ raise ValueError("Invalid kind provided. Must be 'File' or 'Folder'.")
510
+
511
+ endpoint = "files" if kind == "File" else "folders"
512
+ url = f"{self.cloudos_url}/api/v1/{endpoint}/{item_id}?teamId={self.workspace_id}"
513
+
514
+ headers = {
515
+ "accept": "application/json",
516
+ "ApiKey": self.apikey
517
+ }
518
+
519
+ response = retry_requests_delete(url, headers=headers, verify=self.verify)
490
520
  return response
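The endpoint selection in `delete_item` maps the item kind to its REST resource before issuing the DELETE request. The URL construction can be isolated as a standalone sketch (`build_delete_url` is an illustrative name, not part of cloudos-cli):

```python
def build_delete_url(cloudos_url, item_id, kind, workspace_id):
    # Mirrors delete_item above: files and folders live under
    # different REST endpoints, scoped to the workspace via teamId.
    if kind not in ("File", "Folder"):
        raise ValueError("Invalid kind provided. Must be 'File' or 'Folder'.")
    endpoint = "files" if kind == "File" else "folders"
    return f"{cloudos_url}/api/v1/{endpoint}/{item_id}?teamId={workspace_id}"

print(build_delete_url("https://cloudos.example.com", "abc123", "Folder", "ws1"))
# https://cloudos.example.com/api/v1/folders/abc123?teamId=ws1
```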
@@ -10,6 +10,8 @@ from cloudos_cli.utils.errors import BadRequestException
10
10
  from cloudos_cli.utils.requests import retry_requests_post, retry_requests_get
11
11
  from pathlib import Path
12
12
  import base64
13
+ from cloudos_cli.utils.array_job import classify_pattern, get_file_or_folder_id, extract_project
14
+ import os
13
15
 
14
16
 
15
17
  @dataclass
@@ -382,14 +384,8 @@ class Job(Cloudos):
382
384
  p_name = p_split[0]
383
385
  p_value = '='.join(p_split[1:])
384
386
  if workflow_type == 'docker':
385
- prefix = "--" if p_name.startswith('--') else ("-" if p_name.startswith('-') else '')
386
- # leave defined for adding files later
387
- parameter_kind = "textValue"
388
- param = {"prefix": prefix,
389
- "name": p_name.lstrip('-'),
390
- "parameterKind": parameter_kind,
391
- "textValue": p_value}
392
- workflow_params.append(param)
387
+ # will differentiate between text, data items and glob patterns
388
+ workflow_params.append(self.docker_workflow_param_processing(p, self.project_name))
393
389
  elif workflow_type == 'wdl':
394
390
  param = {"prefix": "",
395
391
  "name": p_name,
@@ -834,3 +830,97 @@ class Job(Cloudos):
834
830
  }
835
831
 
836
832
  return ap_param
833
+
834
+ def docker_workflow_param_processing(self, param, project_name):
835
+ """
836
+ Processes a Docker workflow parameter and determines its type and associated metadata.
837
+
838
+ Parameters
839
+ ----------
840
+ param : str
841
+ The parameter string in the format '--param_name=value'.
842
+ It can represent a file path, a glob pattern, or a simple text value.
843
+ project_name : str
844
+ The name of the current project to use if no specific project is extracted from the parameter.
845
+
846
+ Returns
+ -------
847
+ dict: A dictionary containing the processed parameter details. The structure of the dictionary depends on the type of the parameter:
848
+ - For glob patterns:
849
+ {
850
+ "name": str, # Parameter name without leading dashes.
851
+ "prefix": str, # Prefix ('--' or '-') based on the parameter format.
852
+ "globPattern": str, # The glob pattern extracted from the parameter.
853
+ "parameterKind": str, # Always "globPattern".
854
+ "folder": str # Folder ID associated with the glob pattern.
+ }
855
+ - For file paths:
856
+ {
857
+ "name": str, # Parameter name without leading dashes.
858
+ "prefix": str, # Prefix ('--' or '-') based on the parameter format.
859
+ "parameterKind": str, # Always "dataItem".
860
+ "dataItem": {
861
+ "kind": str, # Always "File".
862
+ "item": str # File ID associated with the file path.
+ }
+ }
863
+ - For text values:
864
+ {
865
+ "name": str, # Parameter name without leading dashes.
866
+ "prefix": str, # Prefix ('--' or '-') based on the parameter format.
867
+ "parameterKind": str, # Always "textValue".
868
+ "textValue": str # The text value extracted from the parameter.
+ }
869
+
870
+ Notes
871
+ -----
872
+ - The function uses helper methods `extract_project`, `classify_pattern`, and `get_file_or_folder_id` to process the parameter.
873
+ - If the parameter represents a file path or glob pattern, the function retrieves the corresponding file or folder ID from the cloud workspace.
874
+ - If the parameter does not match any specific pattern or file extension, it is treated as a simple text value.
875
+ """
876
+
877
+ # split '--param_name=example_test'
878
+ # name -> '--param_name'
879
+ # rest -> 'example_test'
880
+ name, rest = param.split('=', 1)
881
+
882
+ # e.g. "/Project/Subproject/file.csv", project is "Project"
883
+ # e.g. "Data/input.csv", project is '', so the global project name is used
884
+ # e.g. "-p --test=value", project is ''
885
+ project, file_path = extract_project(rest)
886
+ current_project = project if project != '' else project_name
887
+
888
+ # e.g. "/Project/Subproject/file.csv"
889
+ command_path = Path(file_path)
890
+ command_dir = str(command_path.parent)
891
+ command_name = command_path.name
892
+ _, ext = os.path.splitext(command_name)
893
+ prefix = "--" if name.startswith('--') else ("-" if name.startswith('-') else "")
894
+ if classify_pattern(rest) in ["regex", "glob"]:
895
+ if not (file_path.startswith('/Data') or file_path.startswith('Data')):
896
+ raise ValueError("[ERROR] The file path inside the project must start with '/Data' or 'Data'.")
897
+
898
+ folder = get_file_or_folder_id(self.cloudos_url, self.apikey, self.workspace_id, current_project, self.verify, command_dir, command_name, is_file=False)
899
+ return {
900
+ "name": f"{name.lstrip('-')}",
901
+ "prefix": f"{prefix}",
902
+ 'globPattern': command_name,
903
+ "parameterKind": "globPattern",
904
+ "folder": f"{folder}"
905
+ }
906
+ elif ext:
907
+ if not (file_path.startswith('/Data') or file_path.startswith('Data')):
908
+ raise ValueError("[ERROR] The file path inside the project must start with '/Data' or 'Data'.")
909
+
910
+ file = get_file_or_folder_id(self.cloudos_url, self.apikey, self.workspace_id, current_project, self.verify, command_dir, command_name, is_file=True)
911
+ return {
912
+ "name": f"{name.lstrip('-')}",
913
+ "prefix": f"{prefix}",
914
+ "parameterKind": "dataItem",
915
+ "dataItem": {
916
+ "kind": "File",
917
+ "item": f"{file}"
918
+ }
919
+ }
920
+ else:
921
+ return {
922
+ "name": f"{name.lstrip('-')}",
923
+ "prefix": f"{prefix}",
924
+ "parameterKind": "textValue",
925
+ "textValue": f"{rest}"
926
+ }
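The text-value branch of the parameter conversion above can be sketched standalone. `convert_text_param` below is a hypothetical helper that mirrors the split and prefix handling shown in the diff, without touching the CloudOS API:

```python
def convert_text_param(param: str) -> dict:
    """Standalone sketch of the textValue branch of the conversion above."""
    # Split '--param_name=example_test' into name and value.
    name, rest = param.split('=', 1)
    # Preserve the original prefix style ('--', '-', or none).
    prefix = "--" if name.startswith('--') else ("-" if name.startswith('-') else "")
    return {
        "name": name.lstrip('-'),
        "prefix": prefix,
        "parameterKind": "textValue",
        "textValue": rest,
    }

print(convert_text_param("--genome=GRCh38"))
```

The same name/prefix normalisation is reused for the glob-pattern and file-path branches; only the trailing keys differ.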
@@ -3,10 +3,11 @@ Utility functions and classes to use across the package.
3
3
  """
4
4
 
5
5
  from .errors import BadRequestException, TimeOutException, AccountNotLinkedException, JoBNotCompletedException, NotAuthorisedException, NoCloudForWorkspaceException
6
- from .requests import retry_requests_get, retry_requests_post, retry_requests_put
6
+ from .requests import retry_requests_get, retry_requests_post, retry_requests_put, retry_requests_delete
7
7
  from .resources import format_bytes, ssl_selector
8
8
  from .cloud import find_cloud
9
9
  from .cloud import find_cloud
10
+ from .array_job import is_valid_regex, is_glob_pattern, is_probably_regex, classify_pattern, generate_datasets_for_project, get_file_or_folder_id
10
11
  from .details import get_path
11
12
 
12
- __all__ = ['errors', 'requests', 'resources', 'cloud', 'details']
13
+ __all__ = ['errors', 'requests', 'resources', 'cloud', 'details', 'array_job']
@@ -0,0 +1,254 @@
1
+ import re
2
+ import sys
3
+ from cloudos_cli.utils.errors import BadRequestException
4
+
5
+
6
+ def is_valid_regex(s):
7
+ """
8
+ Validates whether the given string is a valid regular expression.
9
+
10
+ Parameters
11
+ ----------
12
+ s : str
13
+ The string to be checked as a regular expression.
14
+
15
+ Returns
16
+ -------
17
+ bool
18
+ True if the string is a valid regular expression, False otherwise.
19
+ """
20
+ try:
21
+ re.compile(s)
22
+ return True
23
+ except re.error:
24
+ return False
25
+
26
+ def is_glob_pattern(s):
27
+ """
28
+ Check if a given string contains glob pattern characters.
29
+
30
+ Glob patterns are commonly used for filename matching and include
31
+ special characters such as '*', '?', and '['.
32
+
33
+ Parameters
34
+ ----------
35
+ s : str
36
+ The string to check for glob pattern characters.
37
+
38
+ Returns
39
+ -------
40
+ bool
41
+ True if the string contains any glob pattern characters, otherwise False.
42
+ """
43
+ return any(char in s for char in "*?[")
44
+
45
+ def is_probably_regex(s):
46
+ """
47
+ Determines if a given string is likely a regular expression.
48
+
49
+ This function checks whether the input string matches common patterns
50
+ that are indicative of regular expressions. It first validates the
51
+ string using `is_valid_regex(s)` and then searches for specific regex
52
+ indicators such as quantifiers, character classes, anchors, and
53
+ alternation.
54
+
55
+ Parameters
56
+ ----------
57
+ s : str
58
+ The string to evaluate.
59
+
60
+ Returns
61
+ -------
62
+ bool
63
+ True if the string is likely a regular expression, False otherwise.
64
+
65
+ Notes
66
+ -----
67
+ The function assumes the existence of `is_valid_regex(s)` which
68
+ validates whether the input string is a valid regex.
69
+ """
70
+ if not is_valid_regex(s):
71
+ return False
72
+
73
+ # Patterns that usually indicate actual regex use (not just file names)
74
+ regex_indicators = [
75
+ r"\.\*", r"\.\+", r"\\[dws]", r"\[[^\]]+\]", r"\([^\)]+\)",
76
+ r"\{\d+(,\d*)?\}", r"\^", r"\$", r"\|"
77
+ ]
78
+ return any(re.search(pat, s) for pat in regex_indicators)
79
+
80
+ def classify_pattern(s):
81
+ """
82
+ Classifies a given string pattern into one of three categories: "regex", "glob", or "exact".
83
+
84
+ Parameters
85
+ ----------
86
+ s : str
87
+ The string pattern to classify.
88
+
89
+ Returns
90
+ -------
91
+ str
+     A string indicating the type of pattern:
92
+ - "regex" if the pattern is likely a regular expression.
93
+ - "glob" if the pattern matches glob-style syntax.
94
+ - "exact" if the pattern does not match regex or glob syntax.
95
+ """
96
+ if is_probably_regex(s):
97
+ return "regex"
98
+ elif is_glob_pattern(s):
99
+ return "glob"
100
+ else:
101
+ return "exact"
102
+
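Taken together, the helpers above form a classification chain (`is_valid_regex` -> `is_probably_regex` -> `is_glob_pattern`). The following self-contained sketch restates that logic for illustration and shows how typical inputs are classified:

```python
import re

def is_valid_regex(s):
    # A string that fails to compile cannot be a regex.
    try:
        re.compile(s)
        return True
    except re.error:
        return False

def is_glob_pattern(s):
    # Glob metacharacters used for filename matching.
    return any(char in s for char in "*?[")

# Patterns that usually indicate actual regex use (not just file names).
REGEX_INDICATORS = [
    r"\.\*", r"\.\+", r"\\[dws]", r"\[[^\]]+\]", r"\([^\)]+\)",
    r"\{\d+(,\d*)?\}", r"\^", r"\$", r"\|",
]

def is_probably_regex(s):
    if not is_valid_regex(s):
        return False
    return any(re.search(pat, s) for pat in REGEX_INDICATORS)

def classify_pattern(s):
    if is_probably_regex(s):
        return "regex"
    elif is_glob_pattern(s):
        return "glob"
    return "exact"

print(classify_pattern("sample_.*\\.csv"))  # contains the '.*' regex indicator
print(classify_pattern("*.csv"))            # glob characters only ('*' alone is invalid regex)
print(classify_pattern("input.csv"))        # plain file name
```

Note the ordering: a string like `*.csv` fails `re.compile`, so it falls through to the glob check rather than being treated as a regex.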
103
+ def generate_datasets_for_project(cloudos_url, apikey, workspace_id, project_name, verify_ssl):
104
+ """
105
+ Generate datasets for a specified project in a CloudOS workspace.
106
+
107
+ This function initializes a `Datasets` object for the given project and handles
108
+ potential errors such as missing project elements or unauthorized API calls.
109
+
110
+ Parameters
111
+ ----------
112
+ cloudos_url : str
113
+ The URL of the CloudOS instance.
114
+ apikey : str
115
+ The API key for authentication.
116
+ workspace_id : str
117
+ The ID of the workspace where the project resides.
118
+ project_name : str
119
+ The name of the project for which datasets are generated.
120
+ verify_ssl : bool
121
+ Whether to verify SSL certificates during API calls.
122
+
123
+ Returns
124
+ -------
125
+ Datasets
126
+ An instance of the `Datasets` class initialized for the specified project.
127
+
128
+ Raises
129
+ ------
130
+ ValueError
131
+ If the specified project is not found in the workspace.
132
+ BadRequestException
133
+ If the API call is unauthorized or encounters other issues.
134
+ """
135
+
136
+ # this avoids circular import error if import is added at the top
137
+ from cloudos_cli.datasets import Datasets
138
+ try:
139
+ ds = Datasets(
140
+ cloudos_url=cloudos_url,
141
+ apikey=apikey,
142
+ workspace_id=workspace_id,
143
+ project_name=project_name,
144
+ verify=verify_ssl,
145
+ cromwell_token=None
146
+ )
147
+ except ValueError:
148
+ print(f"[ERROR] Project '{project_name}' was not found in the workspace")
149
+ sys.exit(1)
150
+
151
+ except BadRequestException as e:
152
+ if 'Forbidden' in str(e):
153
+ print('[Error] It seems your call is not authorised. Please check if ' +
154
+ 'your workspace is restricted by Airlock and if your API key is valid.')
155
+ sys.exit(1)
156
+ else:
157
+ raise e
158
+
159
+ return ds
160
+
161
+ def get_file_or_folder_id(cloudos_url, apikey, workspace_id, project_name, verify_ssl, command_dir, command_name, is_file=True):
162
+ """Retrieve the ID of a specific file or folder within a CloudOS workspace.
163
+
164
+ Parameters
165
+ ----------
166
+ cloudos_url : str
167
+ The base URL of the CloudOS API.
168
+ apikey : str
169
+ The API key for authenticating requests to the CloudOS API.
170
+ workspace_id : str
171
+ The ID of the workspace containing the project.
172
+ project_name : str
173
+ The name of the project within the workspace.
174
+ verify_ssl : bool
175
+ Whether to verify SSL certificates for the API requests.
176
+ command_dir : str
+     The directory path (within the project) that contains the file or folder.
177
+ command_name : str
+     The name of the file or folder whose ID is to be retrieved.
178
+ is_file : bool, optional
179
+ Whether to retrieve a file ID (True) or folder ID (False). Default is True.
180
+
181
+ Returns
182
+ -------
183
+ str
+     The ID of the specified file or folder.
184
+
185
+ Raises
186
+ ------
187
+ ValueError
188
+ If the specified file or folder is not found.
189
+ Exception
190
+ If there is an error during the API interaction or data retrieval.
191
+
192
+ Notes
193
+ -----
194
+ - This function uses the `generate_datasets_for_project` function to create a Datasets object for the specified project.
195
+ - The `list_folder_content` method is used for files, and `list_project_content` is used for folders.
196
+ - The function assumes that the IDs are stored in the `"_id"` field of the metadata.
197
+ """
198
+ # create a Datasets() class
199
+ ds = generate_datasets_for_project(cloudos_url, apikey, workspace_id, project_name, verify_ssl)
200
+
201
+ if is_file:
202
+ # get all files from a folder
203
+ content = ds.list_folder_content(command_dir)
204
+ for file in content['files']:
205
+ if file.get("name") == command_name:
206
+ return file.get("_id", '')
207
+ raise ValueError(f"File '{command_name}' not found in directory '{command_dir}'.")
208
+ else:
209
+ # get all folders from the project
210
+ # check if the command_dir has a sub-folder
211
+ if len(command_dir.split("/")) > 1:
212
+ # get the first folder which is just below the project
213
+ folders = ds.list_folder_content(command_dir.split("/")[0])
214
+ # use the last folder as is listed in the first folder
215
+ folder_to_search = command_dir.split("/")[-1]
216
+ else:
217
+ folders = ds.list_project_content()
218
+ folder_to_search = command_dir
219
+
220
+ for folder in folders['folders']:
221
+ if folder.get("name") == folder_to_search:
222
+ return folder.get("_id", '')
223
+ raise ValueError(f"Folder '{folder_to_search}' not found in project.")
224
+
225
+ def extract_project(path):
226
+ """
227
+ Extracts the project name and the remaining path from a given file path.
228
+
229
+ The function assumes that a "project" exists if the path contains at least three parts
230
+ when split by slashes. If the path has fewer than three parts, the project name is
231
+ considered empty, and the entire path is returned as the remaining path.
232
+
233
+ Parameters
234
+ ----------
235
+ path : str
236
+ The file path to process.
237
+
238
+ Returns
239
+ -------
240
+ tuple
+     A tuple containing:
241
+ - str: The project name (empty string if no project exists).
242
+ - str: The remaining path after the project name.
243
+ """
244
+ # Strip slashes and split the path
245
+ parts = path.strip("/").split("/")
246
+ # A "project" exists only if there are at least 3 parts
247
+ # globs needs more than 3 parts i.e. PROJECT/Data/Downloads/*.csv
248
+ if (len(parts) >= 3 and not is_glob_pattern(path)) or \
249
+ (len(parts) > 3 and is_glob_pattern(path)):
250
+ # Return the first part as project name and the rest as remaining path
251
+ return parts[0], "/".join(parts[1:])
252
+ else:
253
+ # project is empty, use the project_name of the function
254
+ return "", "/".join(parts)
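The project-extraction rule can be illustrated with a standalone restatement of `extract_project` (shown here with its `is_glob_pattern` dependency inlined, for illustration only):

```python
def is_glob_pattern(s):
    # Glob metacharacters used for filename matching.
    return any(char in s for char in "*?[")

def extract_project(path):
    """A leading component counts as a project only when enough parts remain
    for Data/...: >= 3 parts for plain paths, > 3 parts for glob paths."""
    parts = path.strip("/").split("/")
    if (len(parts) >= 3 and not is_glob_pattern(path)) or \
       (len(parts) > 3 and is_glob_pattern(path)):
        return parts[0], "/".join(parts[1:])
    return "", "/".join(parts)

print(extract_project("PROJECT1/Data/input.csv"))        # ('PROJECT1', 'Data/input.csv')
print(extract_project("Data/input.csv"))                 # ('', 'Data/input.csv')
print(extract_project("PROJECT1/Data/Downloads/*.csv"))  # ('PROJECT1', 'Data/Downloads/*.csv')
print(extract_project("Data/*.csv"))                     # ('', 'Data/*.csv')
```

An empty project string means the caller falls back to the global `--project-name` value.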
@@ -107,3 +107,38 @@ def retry_requests_put(url, total=5, status_forcelist=[429, 500, 502, 503, 504],
107
107
  # Make a request using the session object
108
108
  response = session.put(url, **kwargs)
109
109
  return response
110
+
111
+
112
+ def retry_requests_delete(url, total=5, status_forcelist=[429, 500, 502, 503, 504], **kwargs):
113
+ """
114
+ Wrap normal requests DELETE with an error retry strategy.
115
+
116
+ Parameters
117
+ ----------
118
+ url : str
119
+ The request URL.
120
+ total : int
121
+ Total number of retry attempts.
122
+ status_forcelist : list of int
123
+ HTTP status codes that should trigger a retry.
124
+ **kwargs :
125
+ Additional keyword arguments passed to `requests.delete`.
126
+
127
+ Returns
128
+ -------
129
+ requests.Response
130
+ The Response object returned by the API server.
131
+ """
132
+ retry_strategy = Retry(
133
+ total=total,
134
+ status_forcelist=status_forcelist,
135
+ allowed_methods=["DELETE"]
136
+ )
137
+ adapter = HTTPAdapter(max_retries=retry_strategy)
138
+
139
+ session = requests.Session()
140
+ session.mount("http://", adapter)
141
+ session.mount("https://", adapter)
142
+
143
+ response = session.delete(url, **kwargs)
144
+ return response
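The retry behaviour configured above can be sketched generically without any network access. `delete_with_retries` and `fake_delete` below are hypothetical stand-ins for illustration, not part of this package:

```python
def delete_with_retries(send, total=5, status_forcelist=(429, 500, 502, 503, 504)):
    """Retry `send` while it returns a status in the force-list,
    up to `total` attempts; return the final status and attempt count."""
    for attempt in range(1, total + 1):
        status = send()
        if status not in status_forcelist:
            return status, attempt
    return status, total

# Hypothetical stand-in for an HTTP DELETE call: fails with 503 twice,
# then succeeds with 204, simulating transient server-side errors.
responses = iter([503, 503, 204])

def fake_delete():
    return next(responses)

status, attempts = delete_with_retries(fake_delete)
print(status, attempts)  # 204 3
```

The real implementation delegates this loop to urllib3's `Retry` via an `HTTPAdapter`, which additionally applies backoff between attempts.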
@@ -1,6 +1,6 @@
1
1
  Metadata-Version: 2.4
2
2
  Name: cloudos_cli
3
- Version: 2.35.0
3
+ Version: 2.37.0
4
4
  Summary: Python package for interacting with CloudOS
5
5
  Home-page: https://github.com/lifebit-ai/cloudos-cli
6
6
  Author: David Piñeyro
@@ -512,6 +512,51 @@ This assumes the interpreter is available on the container’s $PATH. If not, yo
512
512
 
513
513
  These options provide flexibility for configuring and running bash array jobs, allowing to tailor the execution for specific requirements.
514
514
 
515
+ #### Use multiple projects for files in `--parameter` option
516
+
517
+ The `--parameter` option can specify a file input located in a different project than the one given to `--project-name`. The files can only be located inside the project's `Data` subfolder, not `Cohorts` or `Analyses Results`. The accepted structures are:
518
+ - `-p/--parameter "--file=<project>/Data/file.txt"`
519
+ - `-p/--parameter "--file=<project>/Data/subfolder/file.txt"`
520
+ - `-p/--parameter "--file=Data/subfolder/file.txt"` (the same project as `--project-name`)
521
+ - `-p/--parameter "--file=<project>/Data/subfolder/*.txt"`
522
+ - `-p/--parameter "--file=<project>/Data/*.txt"`
523
+ - `-p/--parameter "--file=Data/*.txt"` (the same project as `--project-name`)
524
+
525
+ The project should be specified at the beginning of the file path. For example:
526
+
527
+ ```console
528
+ cloudos bash array-job \
529
+ -p file=Data/input.csv \
530
+ ...
531
+ ```
532
+ This will point to the global project, specified with `--project-name`. In contrast:
533
+
534
+ ```console
535
+ cloudos bash array-job \
536
+ -p data=Data/input.csv \
537
+ -p exp=PROJECT_EXPRESSION/Data/input.csv \
538
+ --project-name "ADIPOSE"
539
+ ...
540
+ ```
541
+ For parameter `exp`, it will point to a project named `PROJECT_EXPRESSION` in the File Explorer, while parameter `data` will be found in the global project `ADIPOSE`.
542
+
543
+ Apart from files, the parameter can also take glob patterns, for example:
544
+
545
+ ```console
546
+ cloudos bash array-job \
547
+ -p data=Data/input.csv \
548
+ -p exp="PROJECT_EXPRESSION/Data/*.csv" \
549
+ --project-name "ADIPOSE"
550
+ ...
551
+ ```
552
+ will match all files with the `.csv` extension in the specified folder.
553
+
554
+ > [!NOTE]
555
+ > When specifying glob patterns, it is best to wrap them in double quotes so the shell does not expand the glob locally, e.g. `-p exp="PROJECT_EXPRESSION/Data/*.csv"`.
556
+
557
+ > [!NOTE]
558
+ > Project names in the `--parameter` option may start with or without a leading forward slash `/`. The following are equivalent: `-p data=/PROJECT1/Data/input.csv` and `-p data=PROJECT1/Data/input.csv`.
559
+
515
560
  #### Get path to logs of job from CloudOS
516
561
 
517
562
  Get the path to "Nextflow logs", "Nextflow standard output", and "trace" files. It can be used only on your user's jobs, with any status.
@@ -1013,6 +1058,38 @@ Please, note that in the above example a preconfigured profile has been used. If
1013
1058
  --workspace-id $WORKSPACE_ID \
1014
1059
--project-name $PROJECT_NAME
1015
1060
  ```
1061
+
1062
+ #### Copying files and folders
1063
+
1064
+ Files and folders can be copied **from** anywhere in the project **to** `Data` or any of its subfolders programmatically (i.e `Data`, `Data/folder/file.txt`).
1065
+
1066
+ 1. The copy can happen **within the same project** running the following command:
1067
+ ```
1068
+ cloudos datasets cp <souce_path> <destination_path> --profile <profile name>
1069
+ ```
1070
+ where the source project as well as the destination one is the one defined in the profile.
1071
+
1072
+ 2. The move can also happen **across different projects** within the same workspace by running the following command
1073
+ ```
1074
+ cloudos datasets cp <source_path> <destiantion_path> --profile <profile_name> --destination-project-name <project_name>
1075
+ ```
1076
+ In this case, only the source project is the one specified in the profile.
1077
+
1078
+ Any of the `source_path` must be a full path; any `destination_path` must be a path starting with `Data` and finishing with the folder where to move the file/folder. An example of such command is:
1079
+
1080
+ ```
1081
+ cloudos datasets cp AnalysesResults/my_analysis/results/my_plot.png Data/plots
1082
+ ```
1083
+
1084
+ Please, note that in the above example a preconfigured profile has been used. If no profile is provided and there is no default profile, the user will need to also provide the following flags
1085
+ ```bash
1086
+ --cloudos-url $CLOUDOS \
1087
+ --apikey $MY_API_KEY \
1088
+ --workspace-id $WORKSPACE_ID \
1089
+ --project-name $PROJEC_NAME
1090
+ ```
1091
+
1092
+
1016
1093
  #### Create a (virtual) folder
1017
1094
 
1018
1095
  New folders can be created within the `Data` dataset and its subfolders using the following command
@@ -1032,30 +1109,21 @@ Please, note that in the above example a preconfigured profile has been used. If
1032
1109
  --workspace-id $WORKSPACE_ID \
1033
1110
  --project-name $PROJEC_NAME
1034
1111
  ```
1112
+ #### Removing files and folders
1035
1113
 
1036
- #### Copying files and folders
1037
-
1038
- Files and folders can be copied **from** anywhere in the project **to** `Data` or any of its subfolders programmatically (i.e `Data`, `Data/folder/file.txt`).
1114
+ Files and folders can be removed from file explorer (in the `Data` datasets and its subfolders) using the following command
1039
1115
 
1040
- 1. The copy can happen **within the same project** running the following command:
1041
1116
  ```
1042
- cloudos datasets cp <souce_path> <destination_path> --profile <profile name>
1117
+ cloudos datasets rm <path>
1043
1118
  ```
1044
- where the source project as well as the destination one is the one defined in the profile.
1119
+ where `path` is the full path to the file/folder to be removed.
1045
1120
 
1046
- 2. The move can also happen **across different projects** within the same workspace by running the following command
1047
- ```
1048
- cloudos datasets cp <source_path> <destiantion_path> --profile <profile_name> --destination-project-name <project_name>
1049
- ```
1050
- In this case, only the source project is the one specified in the profile.
1051
-
1052
- Any of the `source_path` must be a full path; any `destination_path` must be a path starting with `Data` and finishing with the folder where to move the file/folder. An example of such command is:
1121
+ Please, be aware that removing files and folders will only remove them from the file explorer and not from the corresponding cloud storage.
1053
1122
 
1054
- ```
1055
- cloudos datasets cp AnalysesResults/my_analysis/results/my_plot.png Data/plots
1056
- ```
1123
+ Please keep in mind that you are only allowed to remove files or folders in `Data` or its subfolders.
1057
1124
 
1058
1125
  Please, note that in the above example a preconfigured profile has been used. If no profile is provided and there is no default profile, the user will need to also provide the following flags
1126
+
1059
1127
  ```bash
1060
1128
  --cloudos-url $CLOUDOS \
1061
1129
  --apikey $MY_API_KEY \
@@ -22,6 +22,7 @@ cloudos_cli/jobs/job.py
22
22
  cloudos_cli/queue/__init__.py
23
23
  cloudos_cli/queue/queue.py
24
24
  cloudos_cli/utils/__init__.py
25
+ cloudos_cli/utils/array_job.py
25
26
  cloudos_cli/utils/cloud.py
26
27
  cloudos_cli/utils/details.py
27
28
  cloudos_cli/utils/errors.py
@@ -1 +0,0 @@
1
- __version__ = '2.35.0'
File without changes
File without changes
File without changes