xpk-0.9.0-py3-none-any.whl → xpk-0.10.0-py3-none-any.whl

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.

--- xpk-0.9.0.dist-info/METADATA
+++ xpk-0.10.0.dist-info/METADATA
@@ -1,6 +1,6 @@
 Metadata-Version: 2.4
 Name: xpk
-Version: 0.9.0
+Version: 0.10.0
 Summary: xpk helps Cloud developers to orchestrate training jobs on accelerators on GKE.
 Author-email: XPK team <xpk-code-reviewers@google.com>
 License: Apache-2.0
@@ -259,6 +259,13 @@ all zones.
 --num-slices=4 --spot
 ```

+* Cluster Create (DWS flex queued capacity):
+```shell
+python3 xpk.py cluster create \
+--cluster xpk-test --tpu-type=v5litepod-16 \
+--num-slices=4 --flex
+```
+
 * Cluster Create for Pathways:
 Pathways compatible cluster can be created using `cluster create-pathways`.
 ```shell
@@ -495,6 +502,7 @@ Currently, the below flags/arguments are supported for A3 Mega, A3 Ultra and A4
 * `--reservation`
 * `--spot`
 * `--on-demand` (A3 Mega only)
+* `--flex`

 ## Running XPK on existing clusters

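For context, the new `--flex` flag joins the existing capacity flags on GPU cluster creation. A minimal sketch, assuming the usual xpk GPU arguments `--device-type` and `--num-nodes` (neither appears in this diff, so treat them as illustrative only):

```shell
# Hypothetical example: create an A3 Mega cluster using DWS Flex queued capacity.
# --device-type and --num-nodes are assumptions from typical xpk GPU usage;
# only --flex is introduced by this release.
python3 xpk.py cluster create \
  --cluster xpk-test --device-type=h100-mega-80gb-8 \
  --num-nodes=2 --flex
```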
@@ -518,9 +526,10 @@ Currently XPK supports the below types of storages:
 - [Google Cloud Filestore](#filestore)
 - [Google Cloud Parallelstore](#parallelstore)
 - [Google Cloud Block storages (Persistent Disk, Hyperdisk)](#block-storage-persistent-disk-hyperdisk)
+- [Google Cloud Managed Lustre](#managed-lustre)

 ### FUSE
-A FUSE adapter lets you mount and access Cloud Storage buckets as local file systems, so applications can read and write objects in your bucket using standard file system semantics.
+A FUSE adapter lets you mount and access Cloud Storage buckets as local file systems, so workloads can read and write objects in your bucket using standard file system semantics.

 To use the GCS FUSE with XPK you need to create a [Storage Bucket](https://console.cloud.google.com/storage/).

@@ -547,7 +556,7 @@ Parameters:

 ### Filestore

-A Filestore adapter lets you mount and access [Filestore instances](https://cloud.google.com/filestore/) as local file systems, so applications can read and write files in your volumes using standard file system semantics.
+A Filestore adapter lets you mount and access [Filestore instances](https://cloud.google.com/filestore/) as local file systems, so workloads can read and write files in your volumes using standard file system semantics.

 To create and attach a GCP Filestore instance to your cluster use `xpk storage create` command with `--type=gcpfilestore`:

@@ -583,7 +592,7 @@ Commands `xpk storage create` and `xpk storage attach` with `--type=gcpfilestore`

 ### Parallelstore

-A Parallelstore adapter lets you mount and access [Parallelstore instances](https://cloud.google.com/parallelstore/) as local file systems, so applications can read and write files in your volumes using standard file system semantics.
+A Parallelstore adapter lets you mount and access [Parallelstore instances](https://cloud.google.com/parallelstore/) as local file systems, so workloads can read and write files in your volumes using standard file system semantics.

 To use the GCS Parallelstore with XPK you need to create a [Parallelstore Instance](https://console.cloud.google.com/parallelstore/).

@@ -607,7 +616,7 @@ Parameters:

 ### Block storage (Persistent Disk, Hyperdisk)

-A PersistentDisk adapter lets you mount and access Google Cloud Block storage solutions ([Persistent Disk](https://cloud.google.com/kubernetes-engine/docs/concepts/storage-overview#pd), [Hyperdisk](https://cloud.google.com/kubernetes-engine/docs/concepts/storage-overview#hyperdisk)) as local file systems, so applications can read and write files in your volumes using standard file system semantics.
+A PersistentDisk adapter lets you mount and access Google Cloud Block storage solutions ([Persistent Disk](https://cloud.google.com/kubernetes-engine/docs/concepts/storage-overview#pd), [Hyperdisk](https://cloud.google.com/kubernetes-engine/docs/concepts/storage-overview#hyperdisk)) as local file systems, so workloads can read and write files in your volumes using standard file system semantics.

 To use the GCE PersistentDisk with XPK you need to create a [disk in GCE](https://cloud.google.com/compute/docs/disks). Please consider that the disk type you are creating is [compatible with the VMs](https://cloud.google.com/compute/docs/machine-resource#machine_type_comparison) in the default and accelerator nodepools.

@@ -629,6 +638,30 @@ Parameters:
 - `--readonly` - if set to true, workload can only read from storage.
 - `--manifest` - path to the manifest file containing PersistentVolume and PresistentVolumeClaim definitions.

+### Managed Lustre
+
+A Managed Lustre adaptor lets you mount and access [Google Cloud Managed Lustre instances](https://cloud.google.com/kubernetes-engine/docs/concepts/managed-lustre) as local file systems, so workloads can read and write files in your volumes using standard file system semantics.
+
+To use the GCP Managed Lustre with XPK you need to create [an instance](https://cloud.google.com/managed-lustre/docs/create-instance). Please make sure you enable GKE support when creating the instance (gcloud ex. `--gke-support-enabled`).
+
+Once it's ready you can use `xpk storage attach` with `--type=lustre` command to attach a Managed Lustre instance to your cluster. Currently, attaching a Managed Lustre instance is supported only by providing a manifest file.
+
+```shell
+python3 xpk.py storage attach test-lustre-storage --type=lustre \
+--project=$PROJECT --cluster=$CLUSTER --zone=$ZONE \
+--mount-point='/test-mount-point' --readonly=false \
+--auto-mount=true \
+--manifest='./examples/storage/lustre-manifest-attach.yaml'
+```
+
+Parameters:
+
+- `--type` - type of the storage `lustre`
+- `--auto-mount` - if set to true all workloads will have this storage mounted by default.
+- `--mount-point` - the path on which this storage should be mounted for a workload.
+- `--readonly` - if set to true, workload can only read from storage.
+- `--manifest` - path to the manifest file containing PersistentVolume and PresistentVolumeClaim definitions.
+
 ### List attached storages

 ```shell
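Once a Managed Lustre instance is attached with `--auto-mount=true`, every subsequent workload should see it at the configured mount point. A minimal verification sketch, reusing only commands and values from the hunk above (the workload name `lustre-check` and the TPU type are placeholders):

```shell
# Assumes the `storage attach ... --type=lustre --auto-mount=true` command
# shown above has already completed for the cluster in $CLUSTER.
# Run a throwaway workload that lists the auto-mounted path; '/test-mount-point'
# is the mount point used in the attach example.
python3 xpk.py workload create \
  --workload lustre-check --command "ls /test-mount-point" \
  --cluster $CLUSTER --tpu-type=v5litepod-16 --project=$PROJECT
```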
@@ -670,8 +703,14 @@ python3 xpk.py storage delete test-fs-instance \
 python3 xpk.py workload create \
 --workload xpk-test-workload --command "echo goodbye" \
 --cluster xpk-test \
---tpu-type=v5litepod-16 --projet=$PROJECT
+--tpu-type=v5litepod-16 --project=$PROJECT
 ```
+* Workload create(DWS flex with queued provisioning):
+```shell
+python3 xpk.py workload create \
+--workload xpk-test-workload --command "echo goodbye" \
+--cluster xpk-test --flex \
+--tpu-type=v5litepod-16 --project=$PROJECT

 * Workload Create for Pathways:
 Pathways workload can be submitted using `workload create-pathways` on a Pathways enabled cluster (created with `cluster create-pathways`)
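Putting the cluster-level and workload-level changes in this release together, the DWS Flex path spans both commands. A minimal end-to-end sketch using only flags that appear in this diff (cluster and workload names are the placeholder values from the README examples):

```shell
# 1. Create the cluster with DWS Flex queued capacity (--flex is the new
#    capacity option, used in place of --spot/--reservation/--on-demand).
python3 xpk.py cluster create \
  --cluster xpk-test --tpu-type=v5litepod-16 \
  --num-slices=4 --flex

# 2. Submit a workload against the same flex-queued capacity.
python3 xpk.py workload create \
  --workload xpk-test-workload --command "echo goodbye" \
  --cluster xpk-test --flex \
  --tpu-type=v5litepod-16 --project=$PROJECT
```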

--- xpk-0.9.0.dist-info/RECORD
+++ xpk-0.10.0.dist-info/RECORD
@@ -3,64 +3,65 @@ xpk/main.py,sha256=wFc_kIM7kALGIY-JOcoa8m4BCWNRjl5tQ6ZDpv7HpSU,2350
 xpk/api/__init__.py,sha256=YPwWBbgLAu7L-YlTVGB2r8ZV4TzypURMRBcehSHHlLY,561
 xpk/api/storage_crd.yaml,sha256=r4WFXnSJJ25EUF-t4Ljfbl-cJoSaiFiZkP8451eTub4,1260
 xpk/commands/__init__.py,sha256=YPwWBbgLAu7L-YlTVGB2r8ZV4TzypURMRBcehSHHlLY,561
-xpk/commands/batch.py,sha256=bSxpIZpbLVpgk3AjEaNOxCfKa376p9QjUws_fwPoF-A,3818
-xpk/commands/cluster.py,sha256=2kSzuyftn2aQ_SCBf856W7MU8VMN9KikhsEogm80sHQ,30611
-xpk/commands/cluster_gcluster.py,sha256=lfgNrCQgSzG2-u49goSl06-JlVpytjRHb99xn6Osfjc,9893
-xpk/commands/common.py,sha256=oozWV7Uyjz-zr-dPZGJ4kV_ZNEIZrTjdI_jxmvjvpyE,2404
+xpk/commands/batch.py,sha256=qzKkXf45brC-Ln3C-lajcwdqCc2BCVbKJe5M06AdnKQ,3806
+xpk/commands/cluster.py,sha256=pl1d7ioQKJncF49TYOkKUD4b5ziUPrpT0iXfFvEm1tY,31483
+xpk/commands/cluster_gcluster.py,sha256=F2QWBo7IYHcX_4jjMvaJoCXisJvrWmGiRbAtPikDS34,10816
+xpk/commands/common.py,sha256=DLmvQ0cjdiS6rMAopyzNSAKGlLH-XLeW4-FTGOlWZeo,2401
 xpk/commands/config.py,sha256=gFNkf3ibsvZmcPpkpKXe-KJmHO5IKucNwLCXNgKvaDc,836
 xpk/commands/info.py,sha256=BHqFFXm3Lg1P8qH1Z3gEXmh141-8udduS5EBk38auDg,7251
 xpk/commands/inspector.py,sha256=bwbZW-cLtiBw2V0zvoMprHWhgMbAYm0ez0GjjEqeUR8,12097
 xpk/commands/job.py,sha256=LCFB_l1v5x_k4Ov15hPDAhadcvMZlqvHkObNNuHMCdo,5479
 xpk/commands/kind.py,sha256=Vl3RT47kHCR0ORX9dK37HCiYtbmXJUCIAaq-QEbIclU,7578
-xpk/commands/kjob_common.py,sha256=WaXKKPGQV1bL4gXP9qduweBtFQXwbuOezynLHBOKYCI,1672
-xpk/commands/run.py,sha256=RR9DVwS_DOs2_hfZ08qU98slz27u0wVNgW6UfWQqEAI,3806
-xpk/commands/shell.py,sha256=AjJ-yANH02q3pncKQdI5v1fDRL0MsxNlMbxR4epS19I,4190
-xpk/commands/storage.py,sha256=uKTjozRuebG_3VQ1FYtO7ZHFIv1H-kMLV0nve9Y38fo,10354
+xpk/commands/kjob_common.py,sha256=XTT8uog4PvlxjH7sFTNnvMTlPSzARVM71wj_Czt4XF0,1926
+xpk/commands/run.py,sha256=5hYMG0DcdHnFWsJ5gmfX09t6ZPVItt7FFoHO_ED0_Dk,3798
+xpk/commands/shell.py,sha256=5-sKcI2Rbk3aCojnBNtipCwgOrbIDnG4f8ah0KIayY8,4182
+xpk/commands/storage.py,sha256=tCH2medguOH5bBtywEuqXRtFeHDFE8sq4YVLuOIx_Hk,10600
 xpk/commands/version.py,sha256=CU4mb71r66U28krnPAopC6vBpdK-IGclsy5uNaQcgRY,824
-xpk/commands/workload.py,sha256=CyqcgEQkSdEjj9UHGW7GbVTIiEdtV_O5QM7zQpLf8xg,25095
+xpk/commands/workload.py,sha256=nc5vClmCmRU04ItwiC3cldCLyEVjzvDUpxv987fgQ-A,26470
 xpk/core/__init__.py,sha256=YPwWBbgLAu7L-YlTVGB2r8ZV4TzypURMRBcehSHHlLY,561
-xpk/core/capacity.py,sha256=tZEHoli-4YsIqwMdwlBRJxAl-xjUOls-z3HOAsy3Z1M,5393
-xpk/core/cluster.py,sha256=ZF2W2OysxvdocRpnGU6fl4oEVhA5pWpehef3A8xP53E,24173
+xpk/core/capacity.py,sha256=NvGaJ8EEeyHpgbwwbhn2Kd3l-5KVro6NITqu5udmUPM,7306
+xpk/core/cluster.py,sha256=UHr6DIREO2RKsenrAXMCsvmFdutgcJxZJqeiJy-c88g,29017
 xpk/core/cluster_private.py,sha256=J2-UZ6t5f-UNdcSkuUr2_f4xk6xyrMH9Muk56trBh4M,6657
 xpk/core/commands.py,sha256=JiS4vJqWSLu8MKFBIKPBea9MKD2ZdpaQrziVQBqiDr4,10719
-xpk/core/config.py,sha256=Hm_0aRqrowMkA14Jz_4FMmWlqGMbkpuIfzs6VRN-Mpc,5715
+xpk/core/config.py,sha256=ENX3uhmDKRPlzyUezocBKA5pUVsThahWUd1dCpLOBaY,3475
 xpk/core/docker_container.py,sha256=GvkCJ2S5UKn8uh3pZhRd3X7iS0-PsQpRO8l7QhywVGc,7604
 xpk/core/docker_image.py,sha256=fEdpLQg1C205mMbLREy48WnhvNv2Nm4KQFX87B2CuiA,6624
-xpk/core/docker_manager.py,sha256=GJMz1GLSdvIQeOGC34llVKSDIP5hjYuLcJtz1F7xNxA,10566
-xpk/core/docker_resources.py,sha256=3esxpXnoF0FedJL05zKxnG4W3VtMF5cdhbJdRq4OBgc,11184
+xpk/core/docker_manager.py,sha256=mtOEnzSP1GF28ymHSPlTminujh4wWyg5EbC9UT3VFpI,10566
+xpk/core/docker_resources.py,sha256=h-jT5AK1yL7reGPMeSUqb4Khulimljb16J7xoRI2XFw,12753
 xpk/core/filestore.py,sha256=7M-HAiXR-gEu3BJUgRY3cqEIfjo_FL0BAxq9MljEBt4,8022
 xpk/core/gcloud_context.py,sha256=p_LhWHo7GZonear2oupvTO-DpKqEkL0St7PnfxieRDY,5866
 xpk/core/gcluster_manager.py,sha256=JFip2hInFczFP2h5AXa70IPIuTaJ475TG6GxkQjKOI8,6337
 xpk/core/gcsfuse.py,sha256=kg5pgxdTjgiqquuGjev9fXzJPb8oiWPTK6wzCddzheQ,2125
-xpk/core/kjob.py,sha256=I-dbiOkslCNEMWSivcqy07t2ieDg5eYpPQdXeFjHhkI,14664
-xpk/core/kueue.py,sha256=VdBFJPhWLCLZJZbtZkwXgbGNQR_LgzVgFVAsocCXVBI,10901
+xpk/core/jobset.py,sha256=EpqEDzJt3OmffF6lc6jJ1p4n1Y8y-G-Gd8e9vW9CHac,4122
+xpk/core/kjob.py,sha256=bl0lIlfFqJqn9fQ2PO-xG6sRPq8RFgk8-cPL2hs-dZM,14681
+xpk/core/kueue.py,sha256=N7HkKO684ps25jo5E0aSA8utqAUEkBA_5x1nD7Jh-n8,15290
 xpk/core/monitoring.py,sha256=v9MvLzNfvJAVby_ehSlPe6PaO0_pf3shkXg5gd-UWm8,4338
 xpk/core/mtc.py,sha256=pO7p3l-EzLFdTE8MdwWV8i0Zu-7epGql_kPoksVofIU,6259
 xpk/core/nap.py,sha256=30Fa1-xjbQCMAOj9L1t9K2X_O5Rauz0V7k1_qclci2o,12263
 xpk/core/network.py,sha256=hQR5Kab5Q5CYjggriNIhNh78Aq0CCF-vPUQI6BC4Wgw,10537
-xpk/core/nodepool.py,sha256=jkUWmAX7JJWocybH466_t-7KtfHpfulZhFN7-DprgEA,21758
-xpk/core/pathways.py,sha256=NgrW4hoiSLM59h25R8Zi1a--TgDuiy_f7h2u6KXVz-o,10613
+xpk/core/nodepool.py,sha256=J6zr8JVEi3MrUqn63AwstThIC3uUZRrjxqunReEVEaI,22194
+xpk/core/pathways.py,sha256=g4PUSi6RPqpCPVlziWGj14W7zbdNvkw8mrOq09J3s68,10594
 xpk/core/ray.py,sha256=UxOpIc2enHi1fQ4h3KO8FH8bIyEMtYzGtPoeqJKGG4o,6337
 xpk/core/resources.py,sha256=uezEuHw2OzpM4LT2c2EjUCPr9lhBTfLnOPay7hGVyj4,8276
 xpk/core/scheduling.py,sha256=OG1ZNS8tR29o1KIo8ijMaIuHsPeRfP23jfx4t3PkmGs,9157
-xpk/core/storage.py,sha256=3MaTWjfBDW6uP707nG6fVL-R2yEti74DbB8DiJJj3e4,19628
-xpk/core/system_characteristics.py,sha256=5GRzpKigAsVm7fzCtOs04Pi1UnurYu2KYFj-wdAZkVw,31836
+xpk/core/storage.py,sha256=AjpRQn6zUn7qL3W17LuzBtMqyB-3i9YEap5Vw01cUpQ,16926
+xpk/core/system_characteristics.py,sha256=a7lx2ChSqEIXgACVMnDYWEbaaDNyr_KYm6btB1UIlpQ,31836
 xpk/core/vertex.py,sha256=pD9UBL62xHomuqdNu7xKccfD2KCbjgohMk3AhX-CXSw,3644
-xpk/core/workload.py,sha256=gD90rgztF8zWmcEKz8inC1yNhjL4KQVIQDiUCs-359g,10003
+xpk/core/workload.py,sha256=nfmL4_2Rr0dt5pctHm89KNJcYzulk6CX5d9QEguthJY,8526
 xpk/core/blueprint/__init__.py,sha256=YPwWBbgLAu7L-YlTVGB2r8ZV4TzypURMRBcehSHHlLY,561
 xpk/core/blueprint/blueprint_definitions.py,sha256=5i331XA-2yP_ALyB6XU5tP2Tf9iHcIX5g0TilxQi8zE,1800
-xpk/core/blueprint/blueprint_generator.py,sha256=jTAg1Yig9BwS2l-o2IJtGZHeYU5KfvYzcdDrl7ZORhs,35337
+xpk/core/blueprint/blueprint_generator.py,sha256=vdxoyQnD2uyi8DaG3QtzdVjizf1nNlwoB23ecHUpkKQ,37967
 xpk/core/remote_state/__init__.py,sha256=PkV8D9WOtlJHH5AIxsQaKeIBcmupT_Ol_bwJgN6G2I8,561
 xpk/core/remote_state/fuse_remote_state.py,sha256=3Dx4ZZd0NFF5-MlqGWHzz8H4bjYiPOWdF_YSEnKUPQ8,3246
 xpk/core/remote_state/remote_state_client.py,sha256=6PcR92Xy_RMjlF4AscanQ1jXNHnewLWGNC2v53jbzD4,1077
 xpk/core/workload_decorators/__init__.py,sha256=YPwWBbgLAu7L-YlTVGB2r8ZV4TzypURMRBcehSHHlLY,561
-xpk/core/workload_decorators/rdma_decorator.py,sha256=lLURBW6eVmFZw-o1BzaIpqVvE6th8P99bQJUNNhmrOY,3925
+xpk/core/workload_decorators/rdma_decorator.py,sha256=GU9QQ-cvosurjnhcXqj6e_TpXVHoZE0Of24lALeWPVc,3972
 xpk/core/workload_decorators/storage_decorator.py,sha256=Bj1lRh65s40AJDsWM0xTiHFaWtKC272eImjIjN8Z38c,1967
-xpk/core/workload_decorators/tcpx_decorator.py,sha256=rzOaufKdN8wgv-h22USdebJPFLGYIhjpzEs6WbmzJII,5666
-xpk/core/workload_decorators/tcpxo_decorator.py,sha256=uwArPI9Lkre_0dtcO_oztDO7LU_yrfaSm_QjMUwzXLM,6302
+xpk/core/workload_decorators/tcpx_decorator.py,sha256=oB60_9744MfS-7yZ3mOnb1KWDRvsNyCCuR5jkp2H_i0,5863
+xpk/core/workload_decorators/tcpxo_decorator.py,sha256=kV1Mzrz0sISPOIRi8I2yuvBhTEfu4e-EYJuz0A63Ubs,6504
 xpk/parser/__init__.py,sha256=YPwWBbgLAu7L-YlTVGB2r8ZV4TzypURMRBcehSHHlLY,561
 xpk/parser/batch.py,sha256=mJU-Cp1yTLje59vD-B1IiBcUeD-ZmEsoeB4xhj9cflc,1406
-xpk/parser/cluster.py,sha256=hExmC_SFvs9MnLihysAGtIG9t091_gw3J75-zjL8uCs,27864
+xpk/parser/cluster.py,sha256=Zb4kKekItzk1lberAxf3jkUH8pvFqjKC8_bKrIXsK9c,28325
 xpk/parser/common.py,sha256=_F2rwsZka15difkvPA1yPARWr9I9ewx8PMzgwMLTvjM,7220
 xpk/parser/config.py,sha256=-XnWx9aFsBW4Uzo_hpOMD2ZQ0bdZLvq1ksv83_5jqSM,1633
 xpk/parser/core.py,sha256=VRJerlS92ufoQbG1mZv7B04DAP4qGkBHa4pRXgcbAs0,4761
@@ -70,10 +71,10 @@ xpk/parser/job.py,sha256=5RdE70rucGfrsn65l7Ho6RmO06mag1S0AO-3saVuXyw,4328
 xpk/parser/kind.py,sha256=sgPCqNVrgmFLcOBEbhlaphwVXxMh_opP9ntCq4KPePE,2682
 xpk/parser/run.py,sha256=oi_ksSyJ8Ooffe2EgoV_ecpmXEmNGVotjpIQH-HjufE,1481
 xpk/parser/shell.py,sha256=VC8p-kz9XjJZW9DXZ-rnv41XnRDRpQRFywHpB5j7tfc,1970
-xpk/parser/storage.py,sha256=Vtl9KxWFOxoNQmbfMBN0Nwc4Z3Nasx68td3tUmAgkuI,9894
+xpk/parser/storage.py,sha256=lzGQSgngWnguz6yCNUUOJj65BVB_C_WC6nNksHBeTw4,9914
 xpk/parser/validators.py,sha256=-NBZelvfwZRzjz-YUCreD8EzMLHll8PZM-d-MVm2PG4,1192
 xpk/parser/version.py,sha256=eJo4PAbbmRQZulgKBs_ytbVgV9zAaaXeNzMMxmgFMVY,769
-xpk/parser/workload.py,sha256=hqmy3KtR0Byhrn25Qo72K_2rUyIF4oujrtibp5mq7Lc,24958
+xpk/parser/workload.py,sha256=UVOVCngwhJ4ikuINsdK4rOFlzr39Sv6q0S49NDuNaI0,25493
 xpk/templates/__init__.py,sha256=7mu-VQDQMyxM5To0KOhuYe4y2TYGsEkfV7hXZmUyih4,561
 xpk/templates/storage.yaml,sha256=AykdyMtDnKZF8Y_0BYxoYP03hEIzEk6iNalXAQHgAls,163
 xpk/utils/__init__.py,sha256=YPwWBbgLAu7L-YlTVGB2r8ZV4TzypURMRBcehSHHlLY,561
@@ -86,9 +87,9 @@ xpk/utils/objects.py,sha256=OwMNxB4TGX21qnJPdZo2YBMPMbQPqOtHMh19QhoRNRY,2498
 xpk/utils/templates.py,sha256=g8zgR1MxyJmTmzM_wnvH30FmcbgQMC47UQwBtLj8B9k,807
 xpk/utils/validation.py,sha256=bSJApIY0Lk48I4EEQP08ZUvolXt_APpYXVGJXFQ_YLA,2711
 xpk/utils/yaml.py,sha256=j8xuAJ9yAAwnQi6ozwZ-nMnDyDnc3xWkeBZMtSuP4RU,844
-xpk-0.9.0.dist-info/licenses/LICENSE,sha256=z8d0m5b2O9McPEK1xHG_dWgUBT6EfBDz6wA0F7xSPTA,11358
-xpk-0.9.0.dist-info/METADATA,sha256=NcdCQuIRdfrvXobp08SBa-96KXFSc7zs23UE1VW9_Vo,69675
-xpk-0.9.0.dist-info/WHEEL,sha256=_zCd3N1l69ArxyTb8rzEoP9TpbYXkqRFSNOD5OuxnTs,91
-xpk-0.9.0.dist-info/entry_points.txt,sha256=mzEtiIesFkT1kmcTUVDA1o3uOhiniX6tIz2wmOlMu1M,38
-xpk-0.9.0.dist-info/top_level.txt,sha256=aDe4N0jicmuWExx_6w0TxWQJaEuPSs9BnLU-3aF1GLo,4
-xpk-0.9.0.dist-info/RECORD,,
+xpk-0.10.0.dist-info/licenses/LICENSE,sha256=z8d0m5b2O9McPEK1xHG_dWgUBT6EfBDz6wA0F7xSPTA,11358
+xpk-0.10.0.dist-info/METADATA,sha256=lGv-7DSE_adv058CGbmxmwANwfkJRPdFInMFF9cxKwc,71621
+xpk-0.10.0.dist-info/WHEEL,sha256=_zCd3N1l69ArxyTb8rzEoP9TpbYXkqRFSNOD5OuxnTs,91
+xpk-0.10.0.dist-info/entry_points.txt,sha256=mzEtiIesFkT1kmcTUVDA1o3uOhiniX6tIz2wmOlMu1M,38
+xpk-0.10.0.dist-info/top_level.txt,sha256=aDe4N0jicmuWExx_6w0TxWQJaEuPSs9BnLU-3aF1GLo,4
+xpk-0.10.0.dist-info/RECORD,,