cosmotech-acceleration-library 1.0.0__py3-none-any.whl → 1.1.0__py3-none-any.whl

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
Files changed (76)
  1. cosmotech/coal/__init__.py +1 -1
  2. cosmotech/coal/azure/adx/runner.py +1 -3
  3. cosmotech/coal/cosmotech_api/__init__.py +10 -6
  4. cosmotech/coal/cosmotech_api/dataset/download/file.py +117 -104
  5. cosmotech/coal/cosmotech_api/dataset/download/twingraph.py +10 -13
  6. cosmotech/coal/cosmotech_api/dataset/upload.py +41 -0
  7. cosmotech/coal/cosmotech_api/runner/datasets.py +71 -19
  8. cosmotech/coal/cosmotech_api/runner/download.py +3 -14
  9. cosmotech/coal/postgresql/runner.py +3 -1
  10. cosmotech/coal/postgresql/store.py +3 -0
  11. cosmotech/coal/utils/decorator.py +25 -0
  12. cosmotech/coal/utils/semver.py +6 -0
  13. cosmotech/csm_data/commands/adx_send_data.py +7 -7
  14. cosmotech/csm_data/commands/adx_send_runnerdata.py +10 -10
  15. cosmotech/csm_data/commands/api/api.py +1 -1
  16. cosmotech/csm_data/commands/api/postgres_send_runner_metadata.py +23 -11
  17. cosmotech/csm_data/commands/api/rds_load_csv.py +8 -8
  18. cosmotech/csm_data/commands/api/rds_send_csv.py +6 -6
  19. cosmotech/csm_data/commands/api/rds_send_store.py +6 -6
  20. cosmotech/csm_data/commands/api/run_load_data.py +10 -10
  21. cosmotech/csm_data/commands/api/runtemplate_load_handler.py +5 -5
  22. cosmotech/csm_data/commands/api/tdl_load_files.py +6 -6
  23. cosmotech/csm_data/commands/api/tdl_send_files.py +7 -7
  24. cosmotech/csm_data/commands/api/wsf_load_file.py +10 -8
  25. cosmotech/csm_data/commands/api/wsf_send_file.py +10 -8
  26. cosmotech/csm_data/commands/az_storage_upload.py +6 -6
  27. cosmotech/csm_data/commands/s3_bucket_delete.py +8 -8
  28. cosmotech/csm_data/commands/s3_bucket_download.py +9 -9
  29. cosmotech/csm_data/commands/s3_bucket_upload.py +10 -10
  30. cosmotech/csm_data/commands/store/dump_to_azure.py +9 -9
  31. cosmotech/csm_data/commands/store/dump_to_postgresql.py +22 -10
  32. cosmotech/csm_data/commands/store/dump_to_s3.py +10 -10
  33. cosmotech/csm_data/commands/store/list_tables.py +3 -3
  34. cosmotech/csm_data/commands/store/load_csv_folder.py +3 -3
  35. cosmotech/csm_data/commands/store/load_from_singlestore.py +8 -8
  36. cosmotech/csm_data/commands/store/reset.py +2 -2
  37. cosmotech/csm_data/commands/store/store.py +1 -2
  38. cosmotech/csm_data/main.py +8 -6
  39. cosmotech/csm_data/utils/decorators.py +1 -1
  40. cosmotech/translation/csm_data/en-US/csm_data/commands/api/api.yml +8 -0
  41. cosmotech/translation/csm_data/en-US/csm_data/commands/api/postgres_send_runner_metadata.yml +17 -0
  42. cosmotech/translation/csm_data/en-US/csm_data/commands/api/rds_load_csv.yml +13 -0
  43. cosmotech/translation/csm_data/en-US/csm_data/commands/api/rds_send_csv.yml +12 -0
  44. cosmotech/translation/csm_data/en-US/csm_data/commands/api/rds_send_store.yml +12 -0
  45. cosmotech/translation/csm_data/en-US/csm_data/commands/api/run_load_data.yml +15 -0
  46. cosmotech/translation/csm_data/en-US/csm_data/commands/api/runtemplate_load_handler.yml +7 -0
  47. cosmotech/translation/csm_data/en-US/csm_data/commands/api/tdl_load_files.yml +14 -0
  48. cosmotech/translation/csm_data/en-US/csm_data/commands/api/tdl_send_files.yml +18 -0
  49. cosmotech/translation/csm_data/en-US/csm_data/commands/api/wsf_load_file.yml +10 -0
  50. cosmotech/translation/csm_data/en-US/csm_data/commands/api/wsf_send_file.yml +12 -0
  51. cosmotech/translation/csm_data/en-US/csm_data/commands/main.yml +9 -0
  52. cosmotech/translation/csm_data/en-US/csm_data/commands/storage/adx_send_data.yml +8 -0
  53. cosmotech/translation/csm_data/en-US/csm_data/commands/storage/adx_send_runnerdata.yml +15 -0
  54. cosmotech/translation/csm_data/en-US/csm_data/commands/storage/az_storage_upload.yml +8 -0
  55. cosmotech/translation/csm_data/en-US/csm_data/commands/storage/s3_bucket_delete.yml +17 -0
  56. cosmotech/translation/csm_data/en-US/csm_data/commands/storage/s3_bucket_download.yml +18 -0
  57. cosmotech/translation/csm_data/en-US/csm_data/commands/storage/s3_bucket_upload.yml +21 -0
  58. cosmotech/translation/csm_data/en-US/csm_data/commands/storage/storage.yml +4 -0
  59. cosmotech/translation/csm_data/en-US/csm_data/commands/store/dump_to_azure.yml +23 -0
  60. cosmotech/translation/csm_data/en-US/csm_data/commands/store/dump_to_postgresql.yml +20 -0
  61. cosmotech/translation/csm_data/en-US/csm_data/commands/store/dump_to_s3.yml +26 -0
  62. cosmotech/translation/csm_data/en-US/csm_data/commands/store/list_tables.yml +5 -0
  63. cosmotech/translation/csm_data/en-US/csm_data/commands/store/load_csv_folder.yml +5 -0
  64. cosmotech/translation/csm_data/en-US/csm_data/commands/store/load_from_singlestore.yml +16 -0
  65. cosmotech/translation/csm_data/en-US/csm_data/commands/store/reset.yml +4 -0
  66. cosmotech/translation/csm_data/en-US/csm_data/commands/store/store.yml +4 -0
  67. cosmotech/translation/csm_data/en-US/csm_data/commons/decorators.yml +2 -0
  68. cosmotech/translation/csm_data/en-US/csm_data/commons/version.yml +4 -0
  69. {cosmotech_acceleration_library-1.0.0.dist-info → cosmotech_acceleration_library-1.1.0.dist-info}/METADATA +13 -14
  70. {cosmotech_acceleration_library-1.0.0.dist-info → cosmotech_acceleration_library-1.1.0.dist-info}/RECORD +74 -44
  71. {cosmotech_acceleration_library-1.0.0.dist-info → cosmotech_acceleration_library-1.1.0.dist-info}/WHEEL +1 -1
  72. cosmotech/coal/utils/api.py +0 -68
  73. cosmotech/translation/csm_data/en-US/csm-data.yml +0 -434
  74. {cosmotech_acceleration_library-1.0.0.dist-info → cosmotech_acceleration_library-1.1.0.dist-info}/entry_points.txt +0 -0
  75. {cosmotech_acceleration_library-1.0.0.dist-info → cosmotech_acceleration_library-1.1.0.dist-info}/licenses/LICENSE +0 -0
  76. {cosmotech_acceleration_library-1.0.0.dist-info → cosmotech_acceleration_library-1.1.0.dist-info}/top_level.txt +0 -0
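The headline change in 1.1.0 is a restructuring of the en-US translation bundle: the monolithic cosmotech/translation/csm_data/en-US/csm-data.yml (removed in full below) is split into one YAML file per command under cosmotech/translation/csm_data/en-US/csm_data/commands/. As a hedged sketch — assuming the new per-command files keep the key shape of the removed monolithic file, which this diff does not show — the store list_tables entry would move roughly like this:

    # Hypothetical content of .../csm_data/commands/store/list_tables.yml (+5 in this release),
    # inferred from the keys removed from the monolithic csm-data.yml below;
    # the actual 1.1.0 file may differ (e.g. in nesting or wording).
    list_tables:
      description: |
        Running this command will list the existing tables in your datastore
      parameters:
        store_folder: The folder containing the store files
        schema: Display the schema of the tables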
cosmotech/translation/csm_data/en-US/csm-data.yml (deleted)
@@ -1,434 +0,0 @@
- commands:
-   main:
-     description: |
-       Cosmo Tech Data Interface
-
-       Command toolkit providing quick implementation of data connections to use inside the Cosmo Tech Platform
-
-   api:
-     description: |
-       Cosmo Tech API helper command
-
-       This command will inform you of which connection is available to use for the Cosmo Tech API
-
-       If no connection is available, it will list all possible sets of parameters and return an error code
-
-       You can use this command in a csm-orc template to make sure that an API connection is available
-
-     tdl_send_files:
-       description: |
-         Reads a folder of CSVs and sends those to the Cosmo Tech API as a Dataset
-
-         CSVs must follow a given format:
-         - Node files must have an id column
-         - Relationship files must have id, src and dest columns
-
-         A non-existing relationship (i.e. dest or src does not point to an existing node) won't trigger an error;
-         the relationship simply won't be created.
-
-         Requires a valid connection to the API to send the data
-
-       parameters:
-         api_url: The URI to a Cosmo Tech API instance
-         organization_id: An organization id for the Cosmo Tech API
-         workspace_id: A workspace id for the Cosmo Tech API
-         runner_id: A runner id for the Cosmo Tech API
-         dir: Path to the directory containing CSVs to send
-         clear: Flag to clear the target dataset first (if set to True, the dataset will be irreversibly cleared before sending anything)
-
-     tdl_load_files:
-       description: |
-         Queries a twingraph and loads all the data from it
-
-         Will create one CSV file per node type / relationship type
-
-         The twingraph must have been populated using the "tdl-send-files" command for this to work correctly
-
-         Requires a valid connection to the API to retrieve the data
-
-       parameters:
-         organization_id: An organization id for the Cosmo Tech API
-         workspace_id: A workspace id for the Cosmo Tech API
-         runner_id: A runner id for the Cosmo Tech API
-         dir: Path to the directory to write the results to
-
-     runtemplate_load_handler:
-       description: |
-         Uses environment variables to download cloud-based Template steps
-
-       parameters:
-         organization_id: The id of an organization in the cosmotech api
-         workspace_id: The id of a workspace in the cosmotech api
-         run-template_id: The name of the run template in the cosmotech api
-         handler_list: A list of handlers to download (comma separated)
-
-     run_load_data:
-       description: |
-         Download a runner's data from the Cosmo Tech API
-         Requires a valid Azure connection either with:
-         - The AZ cli command: az login
-         - A triplet of env vars AZURE_TENANT_ID, AZURE_CLIENT_ID, AZURE_CLIENT_SECRET
-
-       parameters:
-         organization_id: The id of an organization in the cosmotech api
-         workspace_id: The id of a workspace in the cosmotech api
-         runner_id: The id of a runner in the cosmotech api
-         parameters_absolute_path: A local folder to store the parameters content
-
-
-     rds_load_csv:
-       description: |
-         Load data from a runner's RDS database into a CSV file.
-
-         Executes a SQL query against the runner's RDS database and saves the results to a CSV file.
-         By default, it will list all tables in the public schema if no specific query is provided.
-       parameters:
-         target_folder: The folder where the CSV will be written
-         organization_id: An organization id for the Cosmo Tech API
-         workspace_id: A workspace id for the Cosmo Tech API
-         runner_id: A runner id for the Cosmo Tech API
-         run_id: A run id for the Cosmo Tech API
-         file_name: A file name to write the query results
-         query: SQL query to execute (defaults to listing all tables in the public schema)
-
-     rds_send_csv:
-       description: |
-         Send CSV files to a runner's RDS database.
-
-         Takes all CSV files from a source folder and sends their content to the runner's RDS database.
-         Each CSV file will be sent to a table named after the file (without the .csv extension).
-         The table name will be prefixed with "CD_" in the database.
-       parameters:
-         source_folder: The folder containing CSVs to send
-         organization_id: An organization id for the Cosmo Tech API
-         workspace_id: A workspace id for the Cosmo Tech API
-         runner_id: A runner id for the Cosmo Tech API
-         run_id: A run id for the Cosmo Tech API
-
-     rds_send_store:
-       description: |
-         Send data from a store to a runner's RDS database.
-
-         Takes all tables from a store and sends their content to the runner's RDS database.
-         Each table will be sent to a table with the same name, prefixed with "CD_" in the database.
-         Null values in rows will be removed before sending.
-       parameters:
-         store_folder: The folder containing the store files
-         organization_id: An organization id for the Cosmo Tech API
-         workspace_id: A workspace id for the Cosmo Tech API
-         runner_id: A runner id for the Cosmo Tech API
-         run_id: A run id for the Cosmo Tech API
-
-     wsf_load_file:
-       description: |
-         Download files from a workspace.
-
-         Downloads files from a specified path in a workspace to a local target folder.
-         If the workspace path ends with '/', it will be treated as a folder and all files within will be downloaded.
-       parameters:
-         organization_id: An organization id for the Cosmo Tech API
-         workspace_id: A workspace id for the Cosmo Tech API
-         workspace_path: Path inside the workspace to load (end with '/' for a folder)
-         target_folder: Folder in which to save the downloaded file
-
-     wsf_send_file:
-       description: |
-         Upload a file to a workspace.
-
-         Uploads a local file to a specified path in a workspace.
-         If the workspace path ends with '/', the file will be uploaded to that folder with its original name.
-         Otherwise, the file will be uploaded with the name specified in the workspace path.
-       parameters:
-         organization_id: An organization id for the Cosmo Tech API
-         workspace_id: A workspace id for the Cosmo Tech API
-         file_path: Path to the file to send as a workspace file
-         workspace_path: Path inside the workspace to store the file (end with '/' for a folder)
-         overwrite: Flag to overwrite the target file if it exists
-
-     postgres_send_runner_metadata:
-       description: |
-         Send runner metadata to a PostgreSQL database.
-
-         Creates or updates a table in PostgreSQL with runner metadata including id, name, last run id, and run template id.
-         The table will be created if it doesn't exist, and existing records will be updated based on the runner id.
-       parameters:
-         organization_id: An organization id for the Cosmo Tech API
-         workspace_id: A workspace id for the Cosmo Tech API
-         runner_id: A runner id for the Cosmo Tech API
-         table_prefix: Prefix to add to the table name
-         postgres_host: PostgreSQL host URI
-         postgres_port: PostgreSQL database port
-         postgres_db: PostgreSQL database name
-         postgres_schema: PostgreSQL schema name
-         postgres_user: PostgreSQL connection user name
-         postgres_password: PostgreSQL connection password
-
-   store:
-     description: |
-       CoAL Data Store command group
-
-       This group provides helper commands to interact with the datastore
-
-     list_tables:
-       description: |
-         Running this command will list the existing tables in your datastore
-       parameters:
-         store_folder: The folder containing the store files
-         schema: Display the schema of the tables
-
-     reset:
-       description: |
-         Running this command will reset the state of your store
-       parameters:
-         store_folder: The folder containing the store files
-
-     load_csv_folder:
-       description: |
-         Running this command will find all CSVs in the given folder and put them in the store
-       parameters:
-         store_folder: The folder containing the store files
-         csv_folder: The folder containing the CSV files to store
-
-     load_from_singlestore:
-       description: |
-         Load data from SingleStore tables into the store.
-         Will download everything from a given SingleStore database into the store, following some configuration.
-
-         Makes use of the singlestoredb library to access SingleStore
-
-         More information is available on this page:
-         [https://docs.singlestore.com/cloud/developer-resources/connect-with-application-development-tools/connect-with-python/connect-using-the-singlestore-python-client/]
-       parameters:
-         singlestore_host: SingleStore instance URI
-         singlestore_port: SingleStore port
-         singlestore_db: SingleStore database name
-         singlestore_user: SingleStore connection user name
-         singlestore_password: SingleStore connection password
-         singlestore_tables: SingleStore table names to fetch (comma separated)
-         store_folder: The folder containing the store files
-
-     dump_to_postgresql:
-       description: |
-         Running this command will dump your store to a given PostgreSQL database
-
-         Table names from the store will be prepended with table-prefix in the target database
-
-         The PostgreSQL user must have USAGE granted on the schema for this script to work, due to the use of the COPY FROM STDIN command
-
-         You can grant it by running the command:
-         GRANT USAGE ON SCHEMA <schema> TO <username>
-       parameters:
-         store_folder: The folder containing the store files
-         table_prefix: Prefix to add to the table name
-         postgres_host: PostgreSQL host URI
-         postgres_port: PostgreSQL database port
-         postgres_db: PostgreSQL database name
-         postgres_schema: PostgreSQL schema name
-         postgres_user: PostgreSQL connection user name
-         postgres_password: PostgreSQL connection password
-         replace: Append data on existing tables
-
-     dump_to_azure:
-       description: |
-         Dump a datastore to an Azure storage account.
-
-         Will upload everything from a given data store to an Azure storage container.
-
-         Three modes currently exist:
-         - sqlite: will dump the data store's underlying database as is
-         - csv: will convert every table of the datastore to CSV and send them as separate files
-         - parquet: will convert every table of the datastore to Parquet and send them as separate files
-
-         Makes use of the azure.storage.blob library to access the container
-
-         More information is available on this page:
-         [https://learn.microsoft.com/en-us/azure/storage/blobs/storage-quickstart-blobs-python?tabs=managed-identity%2Croles-azure-portal%2Csign-in-azure-cli&pivots=blob-storage-quickstart-scratch]
-       parameters:
-         store_folder: The folder containing the store files
-         output_type: Choose the type of file output to use (sqlite, csv, parquet)
-         account_name: The account name on Azure to upload to
-         container_name: The container name on Azure to upload to
-         prefix: A prefix with which all uploaded files should start in the container
-         tenant_id: Tenant id used to connect to the Azure storage system
-         client_id: Client id used to connect to the Azure storage system
-         client_secret: Client secret tied to the id used to connect to the Azure storage system
-
-     dump_to_s3:
-       description: |
-         Dump a datastore to an S3 bucket
-
-         Will upload everything from a given data store to an S3 bucket.
-
-         Three modes currently exist:
-         - sqlite: will dump the data store's underlying database as is
-         - csv: will convert every table of the datastore to CSV and send them as separate files
-         - parquet: will convert every table of the datastore to Parquet and send them as separate files
-
-         Giving a prefix will add it to every upload (ending the prefix with a "/" will upload into a folder inside the bucket)
-
-         Makes use of the boto3 library to access the bucket
-
-         More information is available on this page:
-         [https://boto3.amazonaws.com/v1/documentation/api/latest/guide/configuration.html]
-       parameters:
-         store_folder: The folder containing the store files
-         output_type: Choose the type of file output to use (sqlite, csv, parquet)
-         bucket_name: The bucket on S3 to upload to
-         prefix: A prefix with which all uploaded files should start in the bucket
-         use_ssl: Use SSL to secure the connection to S3
-         s3_url: URL to connect to the S3 system
-         access_id: Identity used to connect to the S3 system
-         secret_key: Secret tied to the id used to connect to the S3 system
-         ssl_cert_bundle: Path to an alternate CA bundle to validate SSL connections
-
-   storage:
-     s3_bucket_upload:
-       description: |
-         Upload a folder to an S3 bucket
-
-         Will upload everything from a given folder to an S3 bucket. If a single file is passed, only it will be uploaded, and recursive will be ignored
-
-         Giving a prefix will add it to every upload (ending the prefix with a "/" will upload into a folder inside the bucket)
-
-         Makes use of the boto3 library to access the bucket
-
-         More information is available on this page:
-         [https://boto3.amazonaws.com/v1/documentation/api/latest/guide/configuration.html]
-       parameters:
-         source_folder: The folder/file to upload to the target bucket
-         recursive: Recursively send the content of every folder inside the starting folder to the bucket
-         bucket_name: The bucket on S3 to upload to
-         prefix: A prefix with which all uploaded files should start in the bucket
-         use_ssl: Use SSL to secure the connection to S3
-         s3_url: URL to connect to the S3 system
-         access_id: Identity used to connect to the S3 system
-         secret_key: Secret tied to the id used to connect to the S3 system
-         ssl_cert_bundle: Path to an alternate CA bundle to validate SSL connections
-
-     s3_bucket_download:
-       description: |
-         Download S3 bucket content to a given folder
-
-         Will download everything in the bucket unless a prefix is set; then only files starting with the given prefix will be downloaded
-
-         Makes use of the boto3 library to access the bucket
-
-         More information is available on this page:
-         [https://boto3.amazonaws.com/v1/documentation/api/latest/guide/configuration.html]
-       parameters:
-         target_folder: The folder in which to download the bucket content
-         bucket_name: The bucket on S3 to download
-         prefix_filter: A prefix with which all downloaded files should start in the bucket
-         use_ssl: Use SSL to secure the connection to S3
-         s3_url: URL to connect to the S3 system
-         access_id: Identity used to connect to the S3 system
-         secret_key: Secret tied to the id used to connect to the S3 system
-         ssl_cert_bundle: Path to an alternate CA bundle to validate SSL connections
-
-     s3_bucket_delete:
-       description: |
-         Delete S3 bucket content
-
-         Will delete everything in the bucket unless a prefix is set; then only files starting with the given prefix will be deleted
-
-         Makes use of the boto3 library to access the bucket
-
-         More information is available on this page:
-         [https://boto3.amazonaws.com/v1/documentation/api/latest/guide/configuration.html]
-       parameters:
-         bucket_name: The bucket on S3 to delete
-         prefix_filter: A prefix with which all deleted files should start in the bucket
-         use_ssl: Use SSL to secure the connection to S3
-         s3_url: URL to connect to the S3 system
-         access_id: Identity used to connect to the S3 system
-         secret_key: Secret tied to the id used to connect to the S3 system
-         ssl_cert_bundle: Path to an alternate CA bundle to validate SSL connections
-
-     az_storage_upload:
-       description: |
-         Upload a folder to an Azure Storage Blob
-       parameters:
-         source_folder: The folder/file to upload to the target blob storage
-         recursive: Recursively send the content of every folder inside the starting folder to the blob storage
-         blob_name: The blob name in the Azure Storage service to upload to
-         prefix: A prefix with which all uploaded files should start in the blob storage
-         az_storage_sas_url: SAS URL allowing access to the AZ storage container
-
-     adx_send_runnerdata:
-       description: |
-         Uses environment variables to send the content of CSV files to ADX
-         Requires a valid Azure connection either with:
-         - The AZ cli command: az login
-         - A triplet of env vars AZURE_TENANT_ID, AZURE_CLIENT_ID, AZURE_CLIENT_SECRET
-       parameters:
-         dataset_absolute_path: A local folder to store the main dataset content
-         parameters_absolute_path: A local folder to store the parameters content
-         runner_id: The runner id to add to records
-         adx_uri: The ADX cluster path (the URI can be found on the ADX cluster page)
-         adx_ingest_uri: The ADX cluster ingest path (the URI can be found on the ADX cluster page)
-         database_name: The targeted database name
-         send_parameters: Whether or not to send parameters (the parameters path is then mandatory)
-         send_datasets: Whether or not to send datasets (the dataset path is then mandatory)
-         wait: Toggle waiting for the ingestion results
-
-   legacy:
-     description: |
-       Cosmo Tech legacy API group
-
-       This group will allow you to connect to the Cosmo Tech API and migrate solutions from pre-3.0 versions to 3.x-compatible solutions
-
-     generate_orchestrator:
-       description: |
-         Generate an orchestrator configuration file from a solution's run template.
-
-         This command group provides tools to generate orchestrator configuration files either from a local solution file
-         or directly from the Cosmo Tech API.
-
-       from_file:
-         description: |
-           Generate an orchestrator configuration from a local solution file.
-
-         parameters:
-           solution_file: Path to the solution file to read
-           output: Path where to write the generated configuration
-           run_template_id: The ID of the run template to use
-           describe: Show a description of the generated template after generation
-
-       from_api:
-         description: |
-           Generate an orchestrator configuration by fetching the solution from the API.
-         parameters:
-           output: Path where to write the generated configuration
-           organization_id: The id of an organization in the cosmotech api
-           workspace_id: The id of a workspace in the cosmotech api
-           run_template_id: The name of the run template in the cosmotech api
-           describe: Show a description of the generated template after generation
-
-     init_local_parameter_folder:
-       description: |
-         Initialize a local parameter folder structure from a solution's run template.
-
-         This command group provides tools to create a local parameter folder structure either from a local solution file
-         or directly from the Cosmo Tech API. The folder will contain parameter files in CSV and/or JSON format.
-
-       solution:
-         description: |
-           Initialize the parameter folder from a local solution file.
-
-         parameters:
-           solution_file: Path to the solution file to read
-           output_folder: Path where to create the parameter folder structure
-           run_template_id: The ID of the run template to use
-           write_json: Toggle writing of parameters in JSON format
-           write_csv: Toggle writing of parameters in CSV format
-
-       cloud:
-         description: |
-           Initialize the parameter folder by fetching the solution from the API.
-         parameters:
-           output_folder: Path where to create the parameter folder structure
-           organization_id: The id of an organization in the cosmotech api
-           workspace_id: The id of a workspace in the cosmotech api
-           run_template_id: The name of the run template in the cosmotech api
-           write_json: Toggle writing of parameters in JSON format
-           write_csv: Toggle writing of parameters in CSV format
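Group-level help text appears to get the same treatment as the per-command strings: the file list above adds cosmotech/translation/csm_data/en-US/csm_data/commands/api/api.yml (+8), which plausibly takes over the api group description removed above. A hedged sketch, assuming the new file mirrors the removed keys (its actual content is not shown in this diff):

    # Hypothetical sketch of csm_data/commands/api/api.yml,
    # inferred from the keys removed above, not taken from the 1.1.0 wheel.
    api:
      description: |
        Cosmo Tech API helper command

        This command will inform you of which connection is available to use for the Cosmo Tech API

        If no connection is available, it will list all possible sets of parameters and return an error code

        You can use this command in a csm-orc template to make sure that an API connection is available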