semantic-link-labs 0.7.4.tar.gz → 0.8.1.tar.gz

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.

Potentially problematic release.



Files changed (191)
  1. {semantic_link_labs-0.7.4 → semantic_link_labs-0.8.1}/PKG-INFO +43 -7
  2. {semantic_link_labs-0.7.4 → semantic_link_labs-0.8.1}/README.md +40 -6
  3. {semantic_link_labs-0.7.4 → semantic_link_labs-0.8.1}/docs/requirements.txt +4 -1
  4. {semantic_link_labs-0.7.4 → semantic_link_labs-0.8.1}/docs/source/conf.py +1 -1
  5. {semantic_link_labs-0.7.4 → semantic_link_labs-0.8.1}/environment.yml +1 -1
  6. semantic_link_labs-0.8.1/notebooks/Capacity Migration.ipynb +1 -0
  7. semantic_link_labs-0.8.1/notebooks/Report Analysis.ipynb +1 -0
  8. semantic_link_labs-0.8.1/notebooks/SQL.ipynb +1 -0
  9. semantic_link_labs-0.8.1/notebooks/Tabular Object Model.ipynb +1 -0
  10. {semantic_link_labs-0.7.4 → semantic_link_labs-0.8.1}/pyproject.toml +4 -2
  11. {semantic_link_labs-0.7.4 → semantic_link_labs-0.8.1}/src/semantic_link_labs.egg-info/PKG-INFO +43 -7
  12. {semantic_link_labs-0.7.4 → semantic_link_labs-0.8.1}/src/semantic_link_labs.egg-info/SOURCES.txt +23 -1
  13. {semantic_link_labs-0.7.4 → semantic_link_labs-0.8.1}/src/semantic_link_labs.egg-info/requires.txt +2 -0
  14. {semantic_link_labs-0.7.4 → semantic_link_labs-0.8.1}/src/sempy_labs/__init__.py +116 -58
  15. {semantic_link_labs-0.7.4 → semantic_link_labs-0.8.1}/src/sempy_labs/_ai.py +0 -2
  16. {semantic_link_labs-0.7.4 → semantic_link_labs-0.8.1}/src/sempy_labs/_capacities.py +39 -3
  17. semantic_link_labs-0.8.1/src/sempy_labs/_capacity_migration.py +623 -0
  18. {semantic_link_labs-0.7.4 → semantic_link_labs-0.8.1}/src/sempy_labs/_clear_cache.py +8 -8
  19. {semantic_link_labs-0.7.4 → semantic_link_labs-0.8.1}/src/sempy_labs/_connections.py +15 -13
  20. semantic_link_labs-0.8.1/src/sempy_labs/_data_pipelines.py +118 -0
  21. semantic_link_labs-0.8.1/src/sempy_labs/_documentation.py +144 -0
  22. semantic_link_labs-0.8.1/src/sempy_labs/_eventhouses.py +118 -0
  23. semantic_link_labs-0.8.1/src/sempy_labs/_eventstreams.py +118 -0
  24. {semantic_link_labs-0.7.4 → semantic_link_labs-0.8.1}/src/sempy_labs/_generate_semantic_model.py +3 -3
  25. {semantic_link_labs-0.7.4 → semantic_link_labs-0.8.1}/src/sempy_labs/_git.py +23 -24
  26. {semantic_link_labs-0.7.4 → semantic_link_labs-0.8.1}/src/sempy_labs/_helper_functions.py +140 -47
  27. {semantic_link_labs-0.7.4 → semantic_link_labs-0.8.1}/src/sempy_labs/_icons.py +40 -0
  28. semantic_link_labs-0.8.1/src/sempy_labs/_kql_databases.py +134 -0
  29. semantic_link_labs-0.8.1/src/sempy_labs/_kql_querysets.py +124 -0
  30. {semantic_link_labs-0.7.4 → semantic_link_labs-0.8.1}/src/sempy_labs/_list_functions.py +218 -421
  31. semantic_link_labs-0.8.1/src/sempy_labs/_mirrored_warehouses.py +50 -0
  32. semantic_link_labs-0.8.1/src/sempy_labs/_ml_experiments.py +122 -0
  33. semantic_link_labs-0.8.1/src/sempy_labs/_ml_models.py +120 -0
  34. {semantic_link_labs-0.7.4 → semantic_link_labs-0.8.1}/src/sempy_labs/_model_auto_build.py +0 -4
  35. {semantic_link_labs-0.7.4 → semantic_link_labs-0.8.1}/src/sempy_labs/_model_bpa.py +10 -12
  36. {semantic_link_labs-0.7.4 → semantic_link_labs-0.8.1}/src/sempy_labs/_model_bpa_bulk.py +8 -7
  37. {semantic_link_labs-0.7.4 → semantic_link_labs-0.8.1}/src/sempy_labs/_model_dependencies.py +26 -18
  38. {semantic_link_labs-0.7.4 → semantic_link_labs-0.8.1}/src/sempy_labs/_notebooks.py +5 -16
  39. {semantic_link_labs-0.7.4 → semantic_link_labs-0.8.1}/src/sempy_labs/_query_scale_out.py +6 -5
  40. {semantic_link_labs-0.7.4 → semantic_link_labs-0.8.1}/src/sempy_labs/_refresh_semantic_model.py +7 -19
  41. {semantic_link_labs-0.7.4 → semantic_link_labs-0.8.1}/src/sempy_labs/_spark.py +40 -45
  42. {semantic_link_labs-0.7.4 → semantic_link_labs-0.8.1}/src/sempy_labs/_sql.py +60 -15
  43. {semantic_link_labs-0.7.4 → semantic_link_labs-0.8.1}/src/sempy_labs/_vertipaq.py +25 -25
  44. semantic_link_labs-0.8.1/src/sempy_labs/_warehouses.py +132 -0
  45. {semantic_link_labs-0.7.4 → semantic_link_labs-0.8.1}/src/sempy_labs/_workspaces.py +0 -3
  46. semantic_link_labs-0.8.1/src/sempy_labs/admin/__init__.py +53 -0
  47. semantic_link_labs-0.8.1/src/sempy_labs/admin/_basic_functions.py +888 -0
  48. semantic_link_labs-0.8.1/src/sempy_labs/admin/_domains.py +411 -0
  49. {semantic_link_labs-0.7.4 → semantic_link_labs-0.8.1}/src/sempy_labs/directlake/_directlake_schema_sync.py +1 -1
  50. {semantic_link_labs-0.7.4 → semantic_link_labs-0.8.1}/src/sempy_labs/directlake/_dl_helper.py +32 -16
  51. {semantic_link_labs-0.7.4 → semantic_link_labs-0.8.1}/src/sempy_labs/directlake/_generate_shared_expression.py +11 -14
  52. {semantic_link_labs-0.7.4 → semantic_link_labs-0.8.1}/src/sempy_labs/directlake/_guardrails.py +7 -7
  53. {semantic_link_labs-0.7.4 → semantic_link_labs-0.8.1}/src/sempy_labs/directlake/_update_directlake_model_lakehouse_connection.py +14 -24
  54. {semantic_link_labs-0.7.4 → semantic_link_labs-0.8.1}/src/sempy_labs/directlake/_update_directlake_partition_entity.py +1 -1
  55. {semantic_link_labs-0.7.4 → semantic_link_labs-0.8.1}/src/sempy_labs/directlake/_warm_cache.py +1 -1
  56. {semantic_link_labs-0.7.4 → semantic_link_labs-0.8.1}/src/sempy_labs/lakehouse/_get_lakehouse_tables.py +3 -3
  57. {semantic_link_labs-0.7.4 → semantic_link_labs-0.8.1}/src/sempy_labs/lakehouse/_lakehouse.py +3 -2
  58. {semantic_link_labs-0.7.4 → semantic_link_labs-0.8.1}/src/sempy_labs/migration/_migrate_calctables_to_lakehouse.py +5 -0
  59. {semantic_link_labs-0.7.4 → semantic_link_labs-0.8.1}/src/sempy_labs/report/__init__.py +9 -6
  60. {semantic_link_labs-0.7.4 → semantic_link_labs-0.8.1}/src/sempy_labs/report/_generate_report.py +1 -1
  61. semantic_link_labs-0.8.1/src/sempy_labs/report/_report_bpa.py +369 -0
  62. semantic_link_labs-0.8.1/src/sempy_labs/report/_report_bpa_rules.py +113 -0
  63. semantic_link_labs-0.8.1/src/sempy_labs/report/_report_helper.py +254 -0
  64. semantic_link_labs-0.8.1/src/sempy_labs/report/_report_list_functions.py +95 -0
  65. {semantic_link_labs-0.7.4 → semantic_link_labs-0.8.1}/src/sempy_labs/report/_report_rebind.py +0 -4
  66. semantic_link_labs-0.8.1/src/sempy_labs/report/_reportwrapper.py +2037 -0
  67. {semantic_link_labs-0.7.4 → semantic_link_labs-0.8.1}/src/sempy_labs/tom/_model.py +333 -22
  68. semantic_link_labs-0.8.1/tests/test_friendly_case.py +11 -0
  69. {semantic_link_labs-0.7.4 → semantic_link_labs-0.8.1}/tests/test_shortcuts.py +1 -1
  70. semantic_link_labs-0.7.4/notebooks/Tabular Object Model.ipynb +0 -1
  71. semantic_link_labs-0.7.4/notebooks/Warehouse.ipynb +0 -1
  72. {semantic_link_labs-0.7.4 → semantic_link_labs-0.8.1}/.github/ISSUE_TEMPLATE/bug_report.md +0 -0
  73. {semantic_link_labs-0.7.4 → semantic_link_labs-0.8.1}/.github/ISSUE_TEMPLATE/feature_request.md +0 -0
  74. {semantic_link_labs-0.7.4 → semantic_link_labs-0.8.1}/.github/ISSUE_TEMPLATE/issue--question---advice-needed.md +0 -0
  75. {semantic_link_labs-0.7.4 → semantic_link_labs-0.8.1}/.github/workflows/build.yaml +0 -0
  76. {semantic_link_labs-0.7.4 → semantic_link_labs-0.8.1}/.github/workflows/codeql.yaml +0 -0
  77. {semantic_link_labs-0.7.4 → semantic_link_labs-0.8.1}/.gitignore +0 -0
  78. {semantic_link_labs-0.7.4 → semantic_link_labs-0.8.1}/.readthedocs.yaml +0 -0
  79. {semantic_link_labs-0.7.4 → semantic_link_labs-0.8.1}/.vscode/settings.json +0 -0
  80. {semantic_link_labs-0.7.4 → semantic_link_labs-0.8.1}/CODE_OF_CONDUCT.md +0 -0
  81. {semantic_link_labs-0.7.4 → semantic_link_labs-0.8.1}/LICENSE +0 -0
  82. {semantic_link_labs-0.7.4 → semantic_link_labs-0.8.1}/SECURITY.md +0 -0
  83. {semantic_link_labs-0.7.4 → semantic_link_labs-0.8.1}/SUPPORT.md +0 -0
  84. {semantic_link_labs-0.7.4 → semantic_link_labs-0.8.1}/docs/Makefile +0 -0
  85. {semantic_link_labs-0.7.4 → semantic_link_labs-0.8.1}/docs/make.bat +0 -0
  86. {semantic_link_labs-0.7.4 → semantic_link_labs-0.8.1}/docs/source/index.rst +0 -0
  87. {semantic_link_labs-0.7.4 → semantic_link_labs-0.8.1}/docs/source/modules.rst +0 -0
  88. {semantic_link_labs-0.7.4 → semantic_link_labs-0.8.1}/notebooks/Best Practice Analyzer Report.ipynb +0 -0
  89. {semantic_link_labs-0.7.4 → semantic_link_labs-0.8.1}/notebooks/Migration to Direct Lake.ipynb +0 -0
  90. {semantic_link_labs-0.7.4 → semantic_link_labs-0.8.1}/notebooks/Model Optimization.ipynb +0 -0
  91. {semantic_link_labs-0.7.4 → semantic_link_labs-0.8.1}/notebooks/Query Scale Out.ipynb +0 -0
  92. {semantic_link_labs-0.7.4 → semantic_link_labs-0.8.1}/notebooks/Semantic Model Refresh.ipynb +0 -0
  93. {semantic_link_labs-0.7.4 → semantic_link_labs-0.8.1}/setup.cfg +0 -0
  94. {semantic_link_labs-0.7.4 → semantic_link_labs-0.8.1}/src/semantic_link_labs.egg-info/dependency_links.txt +0 -0
  95. {semantic_link_labs-0.7.4 → semantic_link_labs-0.8.1}/src/semantic_link_labs.egg-info/top_level.txt +0 -0
  96. {semantic_link_labs-0.7.4 → semantic_link_labs-0.8.1}/src/sempy_labs/_bpa_translation/_model/_translations_am-ET.po +0 -0
  97. {semantic_link_labs-0.7.4 → semantic_link_labs-0.8.1}/src/sempy_labs/_bpa_translation/_model/_translations_ar-AE.po +0 -0
  98. {semantic_link_labs-0.7.4 → semantic_link_labs-0.8.1}/src/sempy_labs/_bpa_translation/_model/_translations_bg-BG.po +0 -0
  99. {semantic_link_labs-0.7.4 → semantic_link_labs-0.8.1}/src/sempy_labs/_bpa_translation/_model/_translations_ca-ES.po +0 -0
  100. {semantic_link_labs-0.7.4 → semantic_link_labs-0.8.1}/src/sempy_labs/_bpa_translation/_model/_translations_cs-CZ.po +0 -0
  101. {semantic_link_labs-0.7.4 → semantic_link_labs-0.8.1}/src/sempy_labs/_bpa_translation/_model/_translations_da-DK.po +0 -0
  102. {semantic_link_labs-0.7.4 → semantic_link_labs-0.8.1}/src/sempy_labs/_bpa_translation/_model/_translations_de-DE.po +0 -0
  103. {semantic_link_labs-0.7.4 → semantic_link_labs-0.8.1}/src/sempy_labs/_bpa_translation/_model/_translations_el-GR.po +0 -0
  104. {semantic_link_labs-0.7.4 → semantic_link_labs-0.8.1}/src/sempy_labs/_bpa_translation/_model/_translations_es-ES.po +0 -0
  105. {semantic_link_labs-0.7.4 → semantic_link_labs-0.8.1}/src/sempy_labs/_bpa_translation/_model/_translations_fa-IR.po +0 -0
  106. {semantic_link_labs-0.7.4 → semantic_link_labs-0.8.1}/src/sempy_labs/_bpa_translation/_model/_translations_fi-FI.po +0 -0
  107. {semantic_link_labs-0.7.4 → semantic_link_labs-0.8.1}/src/sempy_labs/_bpa_translation/_model/_translations_fr-FR.po +0 -0
  108. {semantic_link_labs-0.7.4 → semantic_link_labs-0.8.1}/src/sempy_labs/_bpa_translation/_model/_translations_ga-IE.po +0 -0
  109. {semantic_link_labs-0.7.4 → semantic_link_labs-0.8.1}/src/sempy_labs/_bpa_translation/_model/_translations_he-IL.po +0 -0
  110. {semantic_link_labs-0.7.4 → semantic_link_labs-0.8.1}/src/sempy_labs/_bpa_translation/_model/_translations_hi-IN.po +0 -0
  111. {semantic_link_labs-0.7.4 → semantic_link_labs-0.8.1}/src/sempy_labs/_bpa_translation/_model/_translations_hu-HU.po +0 -0
  112. {semantic_link_labs-0.7.4 → semantic_link_labs-0.8.1}/src/sempy_labs/_bpa_translation/_model/_translations_id-ID.po +0 -0
  113. {semantic_link_labs-0.7.4 → semantic_link_labs-0.8.1}/src/sempy_labs/_bpa_translation/_model/_translations_is-IS.po +0 -0
  114. {semantic_link_labs-0.7.4 → semantic_link_labs-0.8.1}/src/sempy_labs/_bpa_translation/_model/_translations_it-IT.po +0 -0
  115. {semantic_link_labs-0.7.4 → semantic_link_labs-0.8.1}/src/sempy_labs/_bpa_translation/_model/_translations_ja-JP.po +0 -0
  116. {semantic_link_labs-0.7.4 → semantic_link_labs-0.8.1}/src/sempy_labs/_bpa_translation/_model/_translations_ko-KR.po +0 -0
  117. {semantic_link_labs-0.7.4 → semantic_link_labs-0.8.1}/src/sempy_labs/_bpa_translation/_model/_translations_mt-MT.po +0 -0
  118. {semantic_link_labs-0.7.4 → semantic_link_labs-0.8.1}/src/sempy_labs/_bpa_translation/_model/_translations_nl-NL.po +0 -0
  119. {semantic_link_labs-0.7.4 → semantic_link_labs-0.8.1}/src/sempy_labs/_bpa_translation/_model/_translations_pl-PL.po +0 -0
  120. {semantic_link_labs-0.7.4 → semantic_link_labs-0.8.1}/src/sempy_labs/_bpa_translation/_model/_translations_pt-BR.po +0 -0
  121. {semantic_link_labs-0.7.4 → semantic_link_labs-0.8.1}/src/sempy_labs/_bpa_translation/_model/_translations_pt-PT.po +0 -0
  122. {semantic_link_labs-0.7.4 → semantic_link_labs-0.8.1}/src/sempy_labs/_bpa_translation/_model/_translations_ro-RO.po +0 -0
  123. {semantic_link_labs-0.7.4 → semantic_link_labs-0.8.1}/src/sempy_labs/_bpa_translation/_model/_translations_ru-RU.po +0 -0
  124. {semantic_link_labs-0.7.4 → semantic_link_labs-0.8.1}/src/sempy_labs/_bpa_translation/_model/_translations_sk-SK.po +0 -0
  125. {semantic_link_labs-0.7.4 → semantic_link_labs-0.8.1}/src/sempy_labs/_bpa_translation/_model/_translations_sl-SL.po +0 -0
  126. {semantic_link_labs-0.7.4 → semantic_link_labs-0.8.1}/src/sempy_labs/_bpa_translation/_model/_translations_sv-SE.po +0 -0
  127. {semantic_link_labs-0.7.4 → semantic_link_labs-0.8.1}/src/sempy_labs/_bpa_translation/_model/_translations_ta-IN.po +0 -0
  128. {semantic_link_labs-0.7.4 → semantic_link_labs-0.8.1}/src/sempy_labs/_bpa_translation/_model/_translations_te-IN.po +0 -0
  129. {semantic_link_labs-0.7.4 → semantic_link_labs-0.8.1}/src/sempy_labs/_bpa_translation/_model/_translations_th-TH.po +0 -0
  130. {semantic_link_labs-0.7.4 → semantic_link_labs-0.8.1}/src/sempy_labs/_bpa_translation/_model/_translations_tr-TR.po +0 -0
  131. {semantic_link_labs-0.7.4 → semantic_link_labs-0.8.1}/src/sempy_labs/_bpa_translation/_model/_translations_uk-UA.po +0 -0
  132. {semantic_link_labs-0.7.4 → semantic_link_labs-0.8.1}/src/sempy_labs/_bpa_translation/_model/_translations_zh-CN.po +0 -0
  133. {semantic_link_labs-0.7.4 → semantic_link_labs-0.8.1}/src/sempy_labs/_bpa_translation/_model/_translations_zu-ZA.po +0 -0
  134. {semantic_link_labs-0.7.4 → semantic_link_labs-0.8.1}/src/sempy_labs/_dataflows.py +0 -0
  135. {semantic_link_labs-0.7.4 → semantic_link_labs-0.8.1}/src/sempy_labs/_dax.py +0 -0
  136. {semantic_link_labs-0.7.4 → semantic_link_labs-0.8.1}/src/sempy_labs/_deployment_pipelines.py +0 -0
  137. {semantic_link_labs-0.7.4 → semantic_link_labs-0.8.1}/src/sempy_labs/_environments.py +0 -0
  138. {semantic_link_labs-0.7.4 → semantic_link_labs-0.8.1}/src/sempy_labs/_model_bpa_rules.py +0 -0
  139. {semantic_link_labs-0.7.4 → semantic_link_labs-0.8.1}/src/sempy_labs/_one_lake_integration.py +0 -0
  140. {semantic_link_labs-0.7.4 → semantic_link_labs-0.8.1}/src/sempy_labs/_translations.py +0 -0
  141. {semantic_link_labs-0.7.4 → semantic_link_labs-0.8.1}/src/sempy_labs/_workspace_identity.py +0 -0
  142. {semantic_link_labs-0.7.4 → semantic_link_labs-0.8.1}/src/sempy_labs/directlake/__init__.py +0 -0
  143. {semantic_link_labs-0.7.4 → semantic_link_labs-0.8.1}/src/sempy_labs/directlake/_directlake_schema_compare.py +0 -0
  144. {semantic_link_labs-0.7.4 → semantic_link_labs-0.8.1}/src/sempy_labs/directlake/_get_directlake_lakehouse.py +0 -0
  145. {semantic_link_labs-0.7.4 → semantic_link_labs-0.8.1}/src/sempy_labs/directlake/_get_shared_expression.py +0 -0
  146. {semantic_link_labs-0.7.4 → semantic_link_labs-0.8.1}/src/sempy_labs/directlake/_list_directlake_model_calc_tables.py +0 -0
  147. {semantic_link_labs-0.7.4 → semantic_link_labs-0.8.1}/src/sempy_labs/directlake/_show_unsupported_directlake_objects.py +0 -0
  148. {semantic_link_labs-0.7.4 → semantic_link_labs-0.8.1}/src/sempy_labs/lakehouse/__init__.py +0 -0
  149. {semantic_link_labs-0.7.4 → semantic_link_labs-0.8.1}/src/sempy_labs/lakehouse/_get_lakehouse_columns.py +0 -0
  150. {semantic_link_labs-0.7.4 → semantic_link_labs-0.8.1}/src/sempy_labs/lakehouse/_shortcuts.py +0 -0
  151. {semantic_link_labs-0.7.4 → semantic_link_labs-0.8.1}/src/sempy_labs/migration/__init__.py +0 -0
  152. {semantic_link_labs-0.7.4 → semantic_link_labs-0.8.1}/src/sempy_labs/migration/_create_pqt_file.py +0 -0
  153. {semantic_link_labs-0.7.4 → semantic_link_labs-0.8.1}/src/sempy_labs/migration/_migrate_calctables_to_semantic_model.py +0 -0
  154. {semantic_link_labs-0.7.4 → semantic_link_labs-0.8.1}/src/sempy_labs/migration/_migrate_model_objects_to_semantic_model.py +0 -0
  155. {semantic_link_labs-0.7.4 → semantic_link_labs-0.8.1}/src/sempy_labs/migration/_migrate_tables_columns_to_semantic_model.py +0 -0
  156. {semantic_link_labs-0.7.4 → semantic_link_labs-0.8.1}/src/sempy_labs/migration/_migration_validation.py +0 -0
  157. {semantic_link_labs-0.7.4 → semantic_link_labs-0.8.1}/src/sempy_labs/migration/_refresh_calc_tables.py +0 -0
  158. {semantic_link_labs-0.7.4 → semantic_link_labs-0.8.1}/src/sempy_labs/report/_BPAReportTemplate.json +0 -0
  159. {semantic_link_labs-0.7.4 → semantic_link_labs-0.8.1}/src/sempy_labs/report/_bpareporttemplate/.pbi/localSettings.json +0 -0
  160. {semantic_link_labs-0.7.4 → semantic_link_labs-0.8.1}/src/sempy_labs/report/_bpareporttemplate/.platform +0 -0
  161. {semantic_link_labs-0.7.4 → semantic_link_labs-0.8.1}/src/sempy_labs/report/_bpareporttemplate/StaticResources/SharedResources/BaseThemes/CY24SU06.json +0 -0
  162. {semantic_link_labs-0.7.4 → semantic_link_labs-0.8.1}/src/sempy_labs/report/_bpareporttemplate/definition/pages/01d72098bda5055bd500/page.json +0 -0
  163. {semantic_link_labs-0.7.4 → semantic_link_labs-0.8.1}/src/sempy_labs/report/_bpareporttemplate/definition/pages/01d72098bda5055bd500/visuals/1b08bce3bebabb0a27a8/visual.json +0 -0
  164. {semantic_link_labs-0.7.4 → semantic_link_labs-0.8.1}/src/sempy_labs/report/_bpareporttemplate/definition/pages/01d72098bda5055bd500/visuals/2f22ddb70c301693c165/visual.json +0 -0
  165. {semantic_link_labs-0.7.4 → semantic_link_labs-0.8.1}/src/sempy_labs/report/_bpareporttemplate/definition/pages/01d72098bda5055bd500/visuals/3b1182230aa6c600b43a/visual.json +0 -0
  166. {semantic_link_labs-0.7.4 → semantic_link_labs-0.8.1}/src/sempy_labs/report/_bpareporttemplate/definition/pages/01d72098bda5055bd500/visuals/58577ba6380c69891500/visual.json +0 -0
  167. {semantic_link_labs-0.7.4 → semantic_link_labs-0.8.1}/src/sempy_labs/report/_bpareporttemplate/definition/pages/01d72098bda5055bd500/visuals/a2a8fa5028b3b776c96c/visual.json +0 -0
  168. {semantic_link_labs-0.7.4 → semantic_link_labs-0.8.1}/src/sempy_labs/report/_bpareporttemplate/definition/pages/01d72098bda5055bd500/visuals/adfd47ef30652707b987/visual.json +0 -0
  169. {semantic_link_labs-0.7.4 → semantic_link_labs-0.8.1}/src/sempy_labs/report/_bpareporttemplate/definition/pages/01d72098bda5055bd500/visuals/b6a80ee459e716e170b1/visual.json +0 -0
  170. {semantic_link_labs-0.7.4 → semantic_link_labs-0.8.1}/src/sempy_labs/report/_bpareporttemplate/definition/pages/01d72098bda5055bd500/visuals/ce3130a721c020cc3d81/visual.json +0 -0
  171. {semantic_link_labs-0.7.4 → semantic_link_labs-0.8.1}/src/sempy_labs/report/_bpareporttemplate/definition/pages/92735ae19b31712208ad/page.json +0 -0
  172. {semantic_link_labs-0.7.4 → semantic_link_labs-0.8.1}/src/sempy_labs/report/_bpareporttemplate/definition/pages/92735ae19b31712208ad/visuals/66e60dfb526437cd78d1/visual.json +0 -0
  173. {semantic_link_labs-0.7.4 → semantic_link_labs-0.8.1}/src/sempy_labs/report/_bpareporttemplate/definition/pages/c597da16dc7e63222a82/page.json +0 -0
  174. {semantic_link_labs-0.7.4 → semantic_link_labs-0.8.1}/src/sempy_labs/report/_bpareporttemplate/definition/pages/c597da16dc7e63222a82/visuals/07deb8bce824e1be37d7/visual.json +0 -0
  175. {semantic_link_labs-0.7.4 → semantic_link_labs-0.8.1}/src/sempy_labs/report/_bpareporttemplate/definition/pages/c597da16dc7e63222a82/visuals/0b1c68838818b32ad03b/visual.json +0 -0
  176. {semantic_link_labs-0.7.4 → semantic_link_labs-0.8.1}/src/sempy_labs/report/_bpareporttemplate/definition/pages/c597da16dc7e63222a82/visuals/0c171de9d2683d10b930/visual.json +0 -0
  177. {semantic_link_labs-0.7.4 → semantic_link_labs-0.8.1}/src/sempy_labs/report/_bpareporttemplate/definition/pages/c597da16dc7e63222a82/visuals/0efa01be0510e40a645e/visual.json +0 -0
  178. {semantic_link_labs-0.7.4 → semantic_link_labs-0.8.1}/src/sempy_labs/report/_bpareporttemplate/definition/pages/c597da16dc7e63222a82/visuals/6bf2f0eb830ab53cc668/visual.json +0 -0
  179. {semantic_link_labs-0.7.4 → semantic_link_labs-0.8.1}/src/sempy_labs/report/_bpareporttemplate/definition/pages/c597da16dc7e63222a82/visuals/88d8141cb8500b60030c/visual.json +0 -0
  180. {semantic_link_labs-0.7.4 → semantic_link_labs-0.8.1}/src/sempy_labs/report/_bpareporttemplate/definition/pages/c597da16dc7e63222a82/visuals/a753273590beed656a03/visual.json +0 -0
  181. {semantic_link_labs-0.7.4 → semantic_link_labs-0.8.1}/src/sempy_labs/report/_bpareporttemplate/definition/pages/c597da16dc7e63222a82/visuals/b8fdc82cddd61ac447bc/visual.json +0 -0
  182. {semantic_link_labs-0.7.4 → semantic_link_labs-0.8.1}/src/sempy_labs/report/_bpareporttemplate/definition/pages/d37dce724a0ccc30044b/page.json +0 -0
  183. {semantic_link_labs-0.7.4 → semantic_link_labs-0.8.1}/src/sempy_labs/report/_bpareporttemplate/definition/pages/d37dce724a0ccc30044b/visuals/ce8532a7e25020271077/visual.json +0 -0
  184. {semantic_link_labs-0.7.4 → semantic_link_labs-0.8.1}/src/sempy_labs/report/_bpareporttemplate/definition/pages/pages.json +0 -0
  185. {semantic_link_labs-0.7.4 → semantic_link_labs-0.8.1}/src/sempy_labs/report/_bpareporttemplate/definition/report.json +0 -0
  186. {semantic_link_labs-0.7.4 → semantic_link_labs-0.8.1}/src/sempy_labs/report/_bpareporttemplate/definition/version.json +0 -0
  187. {semantic_link_labs-0.7.4 → semantic_link_labs-0.8.1}/src/sempy_labs/report/_bpareporttemplate/definition.pbir +0 -0
  188. {semantic_link_labs-0.7.4 → semantic_link_labs-0.8.1}/src/sempy_labs/report/_report_functions.py +0 -0
  189. {semantic_link_labs-0.7.4 → semantic_link_labs-0.8.1}/src/sempy_labs/tom/__init__.py +0 -0
  190. {semantic_link_labs-0.7.4 → semantic_link_labs-0.8.1}/tests/__init__.py +0 -0
  191. {semantic_link_labs-0.7.4 → semantic_link_labs-0.8.1}/tests/test_tom.py +0 -0
@@ -1,6 +1,6 @@
 Metadata-Version: 2.1
 Name: semantic-link-labs
-Version: 0.7.4
+Version: 0.8.1
 Summary: Semantic Link Labs for Microsoft Fabric
 Author: Microsoft Corporation
 License: MIT License
@@ -20,13 +20,15 @@ Requires-Dist: anytree
 Requires-Dist: powerbiclient
 Requires-Dist: polib
 Requires-Dist: azure.mgmt.resource
+Requires-Dist: jsonpath_ng
+Requires-Dist: deltalake
 Provides-Extra: test
 Requires-Dist: pytest>=8.2.1; extra == "test"
 
 # Semantic Link Labs
 
 [![PyPI version](https://badge.fury.io/py/semantic-link-labs.svg)](https://badge.fury.io/py/semantic-link-labs)
-[![Read The Docs](https://readthedocs.org/projects/semantic-link-labs/badge/?version=0.7.4&style=flat)](https://readthedocs.org/projects/semantic-link-labs/)
+[![Read The Docs](https://readthedocs.org/projects/semantic-link-labs/badge/?version=0.8.1&style=flat)](https://readthedocs.org/projects/semantic-link-labs/)
 [![Code style: black](https://img.shields.io/badge/code%20style-black-000000.svg)](https://github.com/psf/black)
 [![Downloads](https://static.pepy.tech/badge/semantic-link-labs)](https://pepy.tech/project/semantic-link-labs)
 
@@ -34,9 +36,31 @@ Requires-Dist: pytest>=8.2.1; extra == "test"
 [Read the documentation on ReadTheDocs!](https://semantic-link-labs.readthedocs.io/en/stable/)
 ---
 
-This is a python library intended to be used in [Microsoft Fabric notebooks](https://learn.microsoft.com/fabric/data-engineering/how-to-use-notebook). This library was originally intended to solely contain functions used for [migrating semantic models to Direct Lake mode](https://github.com/microsoft/semantic-link-labs?tab=readme-ov-file#direct-lake-migration). However, it quickly became apparent that functions within such a library could support many other useful activities in the realm of semantic models, reports, lakehouses and really anything Fabric-related. As such, this library contains a variety of functions ranging from running [Vertipaq Analyzer](https://semantic-link-labs.readthedocs.io/en/stable/sempy_labs.html#sempy_labs.import_vertipaq_analyzer) or the [Best Practice Analyzer](https://semantic-link-labs.readthedocs.io/en/stable/sempy_labs.html#sempy_labs.run_model_bpa) against a semantic model to seeing if any [lakehouse tables hit Direct Lake guardrails](https://semantic-link-labs.readthedocs.io/en/stable/sempy_labs.lakehouse.html#sempy_labs.lakehouse.get_lakehouse_tables) or accessing the [Tabular Object Model](https://semantic-link-labs.readthedocs.io/en/stable/sempy_labs.tom.html) and more!
-
-Instructions for migrating import/DirectQuery semantic models to Direct Lake mode can be found [here](https://github.com/microsoft/semantic-link-labs?tab=readme-ov-file#direct-lake-migration).
+Semantic Link Labs is a Python library designed for use in [Microsoft Fabric notebooks](https://learn.microsoft.com/fabric/data-engineering/how-to-use-notebook). This library extends the capabilities of [Semantic Link](https://learn.microsoft.com/fabric/data-science/semantic-link-overview) offering additional functionalities to seamlessly integrate and work alongside it. The goal of Semantic Link Labs is to simplify technical processes, empowering people to focus on higher level activities and allowing tasks that are better suited for machines to be efficiently handled without human intervention.
+
+## Featured Scenarios
+* Semantic Models
+    * [Migrating an import/DirectQuery semantic model to Direct Lake](https://github.com/microsoft/semantic-link-labs?tab=readme-ov-file#direct-lake-migration)
+    * [Model Best Practice Analyzer (BPA)](https://semantic-link-labs.readthedocs.io/en/stable/sempy_labs.html#sempy_labs.run_model_bpa)
+    * [Vertipaq Analyzer](https://semantic-link-labs.readthedocs.io/en/stable/sempy_labs.html#sempy_labs.vertipaq_analyzer)
+    * [Tabular Object Model](https://github.com/microsoft/semantic-link-labs/blob/main/notebooks/Tabular%20Object%20Model.ipynb) [(TOM)](https://semantic-link-labs.readthedocs.io/en/stable/sempy_labs.tom.html)
+    * [Translate a semantic model's metadata](https://semantic-link-labs.readthedocs.io/en/stable/sempy_labs.html#sempy_labs.translate_semantic_model)
+    * [Check Direct Lake Guardrails](https://semantic-link-labs.readthedocs.io/en/stable/sempy_labs.lakehouse.html#sempy_labs.lakehouse.get_lakehouse_tables)
+    * [Refresh](https://github.com/microsoft/semantic-link-labs/blob/main/notebooks/Semantic%20Model%20Refresh.ipynb), [clear cache](https://semantic-link-labs.readthedocs.io/en/stable/sempy_labs.html#sempy_labs.clear_cache), [backup](https://semantic-link-labs.readthedocs.io/en/stable/sempy_labs.html#sempy_labs.backup_semantic_model), [restore](https://semantic-link-labs.readthedocs.io/en/stable/sempy_labs.html#sempy_labs.restore_semantic_model), [copy backup files](https://semantic-link-labs.readthedocs.io/en/stable/sempy_labs.html#sempy_labs.copy_semantic_model_backup_file), [move/deploy across workspaces](https://semantic-link-labs.readthedocs.io/en/stable/sempy_labs.html#sempy_labs.deploy_semantic_model)
+    * [Run DAX queries which impersonate a user](https://semantic-link-labs.readthedocs.io/en/stable/sempy_labs.html#sempy_labs.evaluate_dax_impersonation)
+* Reports
+    * [Report Best Practice Analyzer (BPA)](https://semantic-link-labs.readthedocs.io/en/stable/sempy_labs.report.html#sempy_labs.report.run_report_bpa)
+    * [View report metadata](https://github.com/microsoft/semantic-link-labs/blob/main/notebooks/Report%20Analysis.ipynb)
+    * [View semantic model objects most frequently used in Power BI reports](https://semantic-link-labs.readthedocs.io/en/stable/sempy_labs.html#sempy_labs.list_semantic_model_object_report_usage)
+    * [View broken reports](https://semantic-link-labs.readthedocs.io/en/stable/sempy_labs.html#sempy_labs.list_report_semantic_model_objects)
+    * [Rebind reports](https://semantic-link-labs.readthedocs.io/en/stable/sempy_labs.report.html#sempy_labs.report.report_rebind)
+* Capacities
+    * [Migrating a Power BI Premium capacity (P sku) to a Fabric capacity (F sku)](https://github.com/microsoft/semantic-link-labs/blob/main/notebooks/Capacity%20Migration.ipynb)
+* APIs
+    * Wrapper functions for [Power BI](https://learn.microsoft.com/rest/api/power-bi/), [Fabric](https://learn.microsoft.com/rest/api/fabric/articles/using-fabric-apis), and [Azure](https://learn.microsoft.com/rest/api/azure/?view=rest-power-bi-embedded-2021-01-01) APIs
+
+
+### Check out the [helper notebooks](https://github.com/microsoft/semantic-link-labs/tree/main/notebooks) for getting started!
 
 If you encounter any issues, please [raise a bug](https://github.com/microsoft/semantic-link-labs/issues/new?assignees=&labels=&projects=&template=bug_report.md&title=).
 
@@ -50,11 +74,13 @@ If you have ideas for new features/functions, please [request a feature](https:/
 ## Once installed, run this code to import the library into your notebook
 ```python
 import sempy_labs as labs
-from sempy_labs import migration, directlake
+from sempy_labs import migration, directlake, admin
 from sempy_labs import lakehouse as lake
 from sempy_labs import report as rep
 from sempy_labs.tom import connect_semantic_model
+from sempy_labs.report import ReportWrapper
 from sempy_labs import ConnectWarehouse
+from sempy_labs import ConnectLakehouse
 ```
 
 ## Load semantic-link-labs into a custom [Fabric environment](https://learn.microsoft.com/fabric/data-engineering/create-and-use-environment)
@@ -78,6 +104,8 @@ An even better way to ensure the semantic-link-labs library is available in your
 2. Select your newly created environment within the 'Environment' drop down in the navigation bar at the top of the notebook
 
 ## Version History
+* [0.8.1](https://github.com/microsoft/semantic-link-labs/releases/tag/0.8.1) (October 2, 2024)
+* [0.8.0](https://github.com/microsoft/semantic-link-labs/releases/tag/0.8.0) (September 25, 2024)
 * [0.7.4](https://github.com/microsoft/semantic-link-labs/releases/tag/0.7.4) (September 16, 2024)
 * [0.7.3](https://github.com/microsoft/semantic-link-labs/releases/tag/0.7.3) (September 11, 2024)
 * [0.7.2](https://github.com/microsoft/semantic-link-labs/releases/tag/0.7.2) (August 30, 2024)
@@ -124,7 +152,7 @@ Check out my [blog post](https://www.elegantbi.com/post/direct-lake-migration) o
 5. Back in the notebook, the next step will create your new Direct Lake semantic model with the name of your choice, taking all the relevant properties from the orignal semantic model and refreshing/framing your new semantic model.
 
 > [!NOTE]
-> As of version 0.2.1, calculated tables are also migrated to Direct Lake (as data tables with their DAX expression stored as model annotations in the new semantic model). Additionally, Field Parameters are migrated as they were in the original semantic model (as a calculated table).
+> Calculated tables are also migrated to Direct Lake (as data tables with their DAX expression stored as model annotations in the new semantic model). Additionally, Field Parameters are migrated as they were in the original semantic model (as a calculated table). [Auto date/time tables](https://learn.microsoft.com/power-bi/guidance/auto-date-time) are not migrated. Auto date/time must be disabled in Power BI Desktop and proper date table(s) must be created prior to migration.
 
 6. Finally, you can easily rebind your all reports which use the import/DQ semantic model to the new Direct Lake semantic model in one click.
 
@@ -137,6 +165,14 @@ Check out my [blog post](https://www.elegantbi.com/post/direct-lake-migration) o
  * Non-supported objects are not transferred (e.g. calculated columns, relationships using columns with unsupported data types, etc.).
  * Reports used by your original semantic model will be rebound to your new semantic model.
 
+ ### Limitations
+ * Calculated columns are not migrated.
+ * Auto date/time tables are not migrated.
+ * References to calculated columns in Field Parameters are removed.
+ * References to calculated columns in measure expressions or other DAX expressions will break.
+ * Calculated tables are migrated where possible. The success of this migration depends on the interdependencies and complexity of the calculated table. This part of the migration is a workaround, as calculated tables are technically not supported in Direct Lake.
+ * See [here](https://learn.microsoft.com/fabric/get-started/direct-lake-overview#considerations-and-limitations) for the remaining limitations of Direct Lake.
+
  ## Contributing
 
  This project welcomes contributions and suggestions. Most contributions require you to agree to a
@@ -1,7 +1,7 @@
  # Semantic Link Labs
 
  [![PyPI version](https://badge.fury.io/py/semantic-link-labs.svg)](https://badge.fury.io/py/semantic-link-labs)
- [![Read The Docs](https://readthedocs.org/projects/semantic-link-labs/badge/?version=0.7.4&style=flat)](https://readthedocs.org/projects/semantic-link-labs/)
+ [![Read The Docs](https://readthedocs.org/projects/semantic-link-labs/badge/?version=0.8.1&style=flat)](https://readthedocs.org/projects/semantic-link-labs/)
  [![Code style: black](https://img.shields.io/badge/code%20style-black-000000.svg)](https://github.com/psf/black)
  [![Downloads](https://static.pepy.tech/badge/semantic-link-labs)](https://pepy.tech/project/semantic-link-labs)
 
@@ -9,9 +9,31 @@
  [Read the documentation on ReadTheDocs!](https://semantic-link-labs.readthedocs.io/en/stable/)
  ---
 
- This is a python library intended to be used in [Microsoft Fabric notebooks](https://learn.microsoft.com/fabric/data-engineering/how-to-use-notebook). This library was originally intended to solely contain functions used for [migrating semantic models to Direct Lake mode](https://github.com/microsoft/semantic-link-labs?tab=readme-ov-file#direct-lake-migration). However, it quickly became apparent that functions within such a library could support many other useful activities in the realm of semantic models, reports, lakehouses and really anything Fabric-related. As such, this library contains a variety of functions ranging from running [Vertipaq Analyzer](https://semantic-link-labs.readthedocs.io/en/stable/sempy_labs.html#sempy_labs.import_vertipaq_analyzer) or the [Best Practice Analyzer](https://semantic-link-labs.readthedocs.io/en/stable/sempy_labs.html#sempy_labs.run_model_bpa) against a semantic model to seeing if any [lakehouse tables hit Direct Lake guardrails](https://semantic-link-labs.readthedocs.io/en/stable/sempy_labs.lakehouse.html#sempy_labs.lakehouse.get_lakehouse_tables) or accessing the [Tabular Object Model](https://semantic-link-labs.readthedocs.io/en/stable/sempy_labs.tom.html) and more!
-
- Instructions for migrating import/DirectQuery semantic models to Direct Lake mode can be found [here](https://github.com/microsoft/semantic-link-labs?tab=readme-ov-file#direct-lake-migration).
+ Semantic Link Labs is a Python library designed for use in [Microsoft Fabric notebooks](https://learn.microsoft.com/fabric/data-engineering/how-to-use-notebook). This library extends the capabilities of [Semantic Link](https://learn.microsoft.com/fabric/data-science/semantic-link-overview), offering additional functionality that integrates seamlessly alongside it. The goal of Semantic Link Labs is to simplify technical processes, empowering people to focus on higher-level activities and allowing tasks that are better suited for machines to be handled efficiently without human intervention.
+
+ ## Featured Scenarios
+ * Semantic Models
+     * [Migrating an import/DirectQuery semantic model to Direct Lake](https://github.com/microsoft/semantic-link-labs?tab=readme-ov-file#direct-lake-migration)
+     * [Model Best Practice Analyzer (BPA)](https://semantic-link-labs.readthedocs.io/en/stable/sempy_labs.html#sempy_labs.run_model_bpa)
+     * [Vertipaq Analyzer](https://semantic-link-labs.readthedocs.io/en/stable/sempy_labs.html#sempy_labs.vertipaq_analyzer)
+     * [Tabular Object Model](https://github.com/microsoft/semantic-link-labs/blob/main/notebooks/Tabular%20Object%20Model.ipynb) [(TOM)](https://semantic-link-labs.readthedocs.io/en/stable/sempy_labs.tom.html)
+     * [Translate a semantic model's metadata](https://semantic-link-labs.readthedocs.io/en/stable/sempy_labs.html#sempy_labs.translate_semantic_model)
+     * [Check Direct Lake Guardrails](https://semantic-link-labs.readthedocs.io/en/stable/sempy_labs.lakehouse.html#sempy_labs.lakehouse.get_lakehouse_tables)
+     * [Refresh](https://github.com/microsoft/semantic-link-labs/blob/main/notebooks/Semantic%20Model%20Refresh.ipynb), [clear cache](https://semantic-link-labs.readthedocs.io/en/stable/sempy_labs.html#sempy_labs.clear_cache), [backup](https://semantic-link-labs.readthedocs.io/en/stable/sempy_labs.html#sempy_labs.backup_semantic_model), [restore](https://semantic-link-labs.readthedocs.io/en/stable/sempy_labs.html#sempy_labs.restore_semantic_model), [copy backup files](https://semantic-link-labs.readthedocs.io/en/stable/sempy_labs.html#sempy_labs.copy_semantic_model_backup_file), [move/deploy across workspaces](https://semantic-link-labs.readthedocs.io/en/stable/sempy_labs.html#sempy_labs.deploy_semantic_model)
+     * [Run DAX queries which impersonate a user](https://semantic-link-labs.readthedocs.io/en/stable/sempy_labs.html#sempy_labs.evaluate_dax_impersonation)
+ * Reports
+     * [Report Best Practice Analyzer (BPA)](https://semantic-link-labs.readthedocs.io/en/stable/sempy_labs.report.html#sempy_labs.report.run_report_bpa)
+     * [View report metadata](https://github.com/microsoft/semantic-link-labs/blob/main/notebooks/Report%20Analysis.ipynb)
+     * [View semantic model objects most frequently used in Power BI reports](https://semantic-link-labs.readthedocs.io/en/stable/sempy_labs.html#sempy_labs.list_semantic_model_object_report_usage)
+     * [View broken reports](https://semantic-link-labs.readthedocs.io/en/stable/sempy_labs.html#sempy_labs.list_report_semantic_model_objects)
+     * [Rebind reports](https://semantic-link-labs.readthedocs.io/en/stable/sempy_labs.report.html#sempy_labs.report.report_rebind)
+ * Capacities
+     * [Migrating a Power BI Premium capacity (P sku) to a Fabric capacity (F sku)](https://github.com/microsoft/semantic-link-labs/blob/main/notebooks/Capacity%20Migration.ipynb)
+ * APIs
+     * Wrapper functions for [Power BI](https://learn.microsoft.com/rest/api/power-bi/), [Fabric](https://learn.microsoft.com/rest/api/fabric/articles/using-fabric-apis), and [Azure](https://learn.microsoft.com/rest/api/azure/?view=rest-power-bi-embedded-2021-01-01) APIs
+
+ ### Check out the [helper notebooks](https://github.com/microsoft/semantic-link-labs/tree/main/notebooks) for getting started!
 
  If you encounter any issues, please [raise a bug](https://github.com/microsoft/semantic-link-labs/issues/new?assignees=&labels=&projects=&template=bug_report.md&title=).
 
@@ -25,11 +47,13 @@ If you have ideas for new features/functions, please [request a feature](https:/
  ## Once installed, run this code to import the library into your notebook
  ```python
  import sempy_labs as labs
- from sempy_labs import migration, directlake
+ from sempy_labs import migration, directlake, admin
  from sempy_labs import lakehouse as lake
  from sempy_labs import report as rep
  from sempy_labs.tom import connect_semantic_model
+ from sempy_labs.report import ReportWrapper
  from sempy_labs import ConnectWarehouse
+ from sempy_labs import ConnectLakehouse
  ```
 
  ## Load semantic-link-labs into a custom [Fabric environment](https://learn.microsoft.com/fabric/data-engineering/create-and-use-environment)
@@ -53,6 +77,8 @@ An even better way to ensure the semantic-link-labs library is available in your
  2. Select your newly created environment within the 'Environment' drop-down in the navigation bar at the top of the notebook
 
  ## Version History
+ * [0.8.1](https://github.com/microsoft/semantic-link-labs/releases/tag/0.8.1) (October 2, 2024)
+ * [0.8.0](https://github.com/microsoft/semantic-link-labs/releases/tag/0.8.0) (September 25, 2024)
  * [0.7.4](https://github.com/microsoft/semantic-link-labs/releases/tag/0.7.4) (September 16, 2024)
  * [0.7.3](https://github.com/microsoft/semantic-link-labs/releases/tag/0.7.3) (September 11, 2024)
  * [0.7.2](https://github.com/microsoft/semantic-link-labs/releases/tag/0.7.2) (August 30, 2024)
@@ -99,7 +125,7 @@ Check out my [blog post](https://www.elegantbi.com/post/direct-lake-migration) o
  5. Back in the notebook, the next step will create your new Direct Lake semantic model with the name of your choice, taking all the relevant properties from the original semantic model and refreshing/framing your new semantic model.
 
  > [!NOTE]
- > As of version 0.2.1, calculated tables are also migrated to Direct Lake (as data tables with their DAX expression stored as model annotations in the new semantic model). Additionally, Field Parameters are migrated as they were in the original semantic model (as a calculated table).
+ > Calculated tables are also migrated to Direct Lake (as data tables with their DAX expression stored as model annotations in the new semantic model). Additionally, Field Parameters are migrated as they were in the original semantic model (as a calculated table). [Auto date/time tables](https://learn.microsoft.com/power-bi/guidance/auto-date-time) are not migrated. Auto date/time must be disabled in Power BI Desktop and proper date table(s) must be created prior to migration.
 
  6. Finally, you can easily rebind all reports which use the import/DQ semantic model to the new Direct Lake semantic model in one click.
 
@@ -112,6 +138,14 @@ Check out my [blog post](https://www.elegantbi.com/post/direct-lake-migration) o
  * Non-supported objects are not transferred (e.g. calculated columns, relationships using columns with unsupported data types, etc.).
  * Reports used by your original semantic model will be rebound to your new semantic model.
 
+ ### Limitations
+ * Calculated columns are not migrated.
+ * Auto date/time tables are not migrated.
+ * References to calculated columns in Field Parameters are removed.
+ * References to calculated columns in measure expressions or other DAX expressions will break.
+ * Calculated tables are migrated where possible. The success of this migration depends on the interdependencies and complexity of the calculated table. This part of the migration is a workaround, as calculated tables are technically not supported in Direct Lake.
+ * See [here](https://learn.microsoft.com/fabric/get-started/direct-lake-overview#considerations-and-limitations) for the remaining limitations of Direct Lake.
+
  ## Contributing
 
  This project welcomes contributions and suggestions. Most contributions require you to agree to a
@@ -10,4 +10,7 @@ azure-storage-blob>=12.9.0
  anytree
  IPython
  polib
- azure.mgmt.resource
+ powerbiclient
+ azure.mgmt.resource
+ jsonpath_ng
+ deltalake
@@ -13,7 +13,7 @@ sys.path.insert(0, os.path.abspath('../../src/'))
  project = 'semantic-link-labs'
  copyright = '2024, Microsoft and community'
  author = 'Microsoft and community'
- release = '0.7.4'
+ release = '0.8.1'
 
  # -- General configuration ---------------------------------------------------
  # https://www.sphinx-doc.org/en/master/usage/configuration.html#general-configuration
@@ -6,7 +6,7 @@ dependencies:
  - pytest-cov
  - pytest-mock
  - pip:
- - semantic-link-sempy>=0.7.5
+ - semantic-link-sempy>=0.8.0
  - azure-identity==1.7.1
  - azure-storage-blob>=12.9.0
  - pandas-stubs
@@ -0,0 +1 @@
+ {"cells":[{"cell_type":"markdown","id":"5c27dfd1-4fe0-4a97-92e6-ddf78889aa93","metadata":{"nteract":{"transient":{"deleting":false}}},"source":["### Install the latest .whl package\n","\n","Check [here](https://pypi.org/project/semantic-link-labs/) to see the latest version."]},{"cell_type":"code","execution_count":null,"id":"d5cae9db-cef9-48a8-a351-9c5fcc99645c","metadata":{"jupyter":{"outputs_hidden":true,"source_hidden":false},"nteract":{"transient":{"deleting":false}}},"outputs":[],"source":["%pip install semantic-link-labs"]},{"cell_type":"markdown","id":"2856d26d","metadata":{},"source":["### Requirements\n","* Must have an Azure Subscription\n","* Must [register an App](https://ms.portal.azure.com/#blade/Microsoft_AAD_RegisteredApps/ApplicationsListBlade)\n"," * Permissions: Will need the Contributor role at the scope where the resources will be created, which is often the subscription level\n","* Azure Key Vault\n"," * [Set up](https://learn.microsoft.com/azure/key-vault/secrets/quick-create-portal) within the Azure Subscription\n"," * Save secrets for the Tenant ID, Client ID (Application ID), Client Secret\n"," * Permissions: Ensure the user who will be executing the notebooks has “Key Vault Secrets User”\n","* Fabric Permissions\n"," * User should be a tenant admin. 
This ensures they have the necessary authority to execute and manage the notebooks without encountering permission issues.\n","\n","### Result\n","* F skus are created for each (specified) capacity\n"," * Within the same region as the P SKU\n"," * Equivalent SKU size as the P SKU\n"," * Same admins as listed on the P SKU\n"," * All workspaces are migrated to the corresponding new capacity\n"," * Capacity settings from the P SKU are transferred to the F SKU\n"," * Capacity settings\n"," * Notification settings\n"," * Access settings\n"," * Disaster recovery settings\n"," * Spark settings\n"," * Delegated tenant settings\n","* The names of the newly created F SKU capacities will be an alphanumeric lowercase version of the P SKU capacity name, suffixed with 'fsku'. As an example: \"My capacity_3!\" -> \"mycapacity3fsku\"."]},{"cell_type":"markdown","id":"b195eae8","metadata":{},"source":["### Import the library and set the initial parameters"]},{"cell_type":"code","execution_count":null,"id":"1344e286","metadata":{},"outputs":[],"source":["import sempy_labs as labs\n","\n","azure_subscription_id = '' # Enter your Azure subscription ID\n","key_vault_uri = '' # Enter your Azure Key Vault URI\n","key_vault_tenant_id = '' # Enter the name of the Azure Key Vault secret which stores your Tenant ID\n","key_vault_client_id = '' # Enter the name of the Azure Key Vault secret which stores your Client ID (Application ID)\n","key_vault_client_secret = '' # Enter the name of the Azure Key Vault secret which stores your Client Secret\n","resource_group = '' # Enter the name of the resource group (to be used to create the new F skus)"]},{"cell_type":"markdown","id":"5a3fe6e8-b8aa-4447-812b-7931831e07fe","metadata":{"nteract":{"transient":{"deleting":false}}},"source":["### Migrate a single P SKU -> F SKU\n","Set the 'capacities' parameter to the single P 
SKU."]},{"cell_type":"code","execution_count":null,"id":"3655dd88","metadata":{},"outputs":[],"source":["labs.migrate_capacities(\n"," azure_subscription_id = azure_subscription_id,\n"," key_vault_uri = key_vault_uri,\n"," key_vault_tenant_id = key_vault_tenant_id,\n"," key_vault_client_id = key_vault_client_id,\n"," key_vault_client_secret = key_vault_client_secret,\n"," resource_group = resource_group,\n"," capacities = 'CapacityA',\n"," p_sku_only = True,\n",")"]},{"cell_type":"markdown","id":"175a59b8","metadata":{},"source":["### Migrate a list of P SKUs to F SKUs\n","Set the 'capacities' parameter to a list of P SKUs."]},{"cell_type":"code","execution_count":null,"id":"3a7a80ec","metadata":{},"outputs":[],"source":["labs.migrate_capacities(\n"," azure_subscription_id = azure_subscription_id,\n"," key_vault_uri = key_vault_uri,\n"," key_vault_tenant_id = key_vault_tenant_id,\n"," key_vault_client_id = key_vault_client_id,\n"," key_vault_client_secret = key_vault_client_secret,\n"," resource_group = resource_group,\n"," capacities = ['CapacityA', 'CapacityB', 'CapacityC'],\n"," p_sku_only = True,\n",")"]},{"cell_type":"markdown","id":"30438799","metadata":{},"source":["### Migrate all P SKUs to F SKUs\n","Set the 'capacities' parameter to None."]},{"cell_type":"code","execution_count":null,"id":"315c2dc7","metadata":{},"outputs":[],"source":["labs.migrate_capacities(\n"," azure_subscription_id = azure_subscription_id,\n"," key_vault_uri = key_vault_uri,\n"," key_vault_tenant_id = key_vault_tenant_id,\n"," key_vault_client_id = key_vault_client_id,\n"," key_vault_client_secret = key_vault_client_secret,\n"," resource_group = resource_group,\n"," capacities = None,\n"," p_sku_only = True,\n",")"]},{"cell_type":"markdown","id":"1d8e73b2","metadata":{},"source":["### Migrate a list of P SKUs to F SKUs; associate each capacity with a specific resource group\n","This process ensures that each F SKU is created within the resource group specified in the 
resource_group_mapping dictionary."]},{"cell_type":"code","execution_count":null,"id":"2854bf8a","metadata":{},"outputs":[],"source":["resource_group_mapping = {\n"," \"CapacityA\": \"ResourceGroupA\",\n"," \"CapacityB\": \"ResourceGroupA\",\n"," \"CapacityC\": \"ResourceGroupB\",\n","}\n","\n","labs.migrate_capacities(\n"," azure_subscription_id = azure_subscription_id,\n"," key_vault_uri = key_vault_uri,\n"," key_vault_tenant_id = key_vault_tenant_id,\n"," key_vault_client_id = key_vault_client_id,\n"," key_vault_client_secret = key_vault_client_secret,\n"," resource_group = resource_group_mapping,\n"," capacities = ['CapacityA', 'CapacityB', 'CapacityC'],\n"," p_sku_only = True,\n",")"]},{"cell_type":"markdown","id":"c3f497c8","metadata":{},"source":["### Migrate a single P SKU (already created F SKU)"]},{"cell_type":"code","execution_count":null,"id":"a4f0b5a2","metadata":{},"outputs":[],"source":["source_capacity = '' # Enter the P SKU capacity name\n","target_capacity = '' # Enter the F SKU capacity name (already exists) \n","\n","labs.migrate_workspaces(\n"," source_capacity=source_capacity, \n"," target_capacity=target_capacity\n",")\n","\n","# Optionally migrate settings\n","\"\"\"\n","labs.migrate_capacity_settings(\n"," source_capacity=source_capacity, \n"," target_capacity=target_capacity\n",")\n","labs.migrate_access_settings(\n"," source_capacity=source_capacity, \n"," target_capacity=target_capacity\n",")\n","labs.migrate_delegated_tenant_settings(\n"," source_capacity=source_capacity, \n"," target_capacity=target_capacity\n",")\n","labs.migrate_disaster_recovery_settings(\n"," source_capacity=source_capacity, \n"," target_capacity=target_capacity\n",")\n","labs.migrate_notification_settings(\n"," source_capacity=source_capacity, \n"," target_capacity=target_capacity\n",")\n","labs.migrate_spark_settings(\n"," source_capacity=source_capacity, \n"," 
target_capacity=target_capacity\n",")\n","\"\"\""]},{"cell_type":"markdown","id":"e0db744b","metadata":{},"source":["### Migrate a list of P SKUs (already created F SKUs)"]},{"cell_type":"code","execution_count":null,"id":"0e04d519","metadata":{},"outputs":[],"source":["capacity_mapping = {\n"," \"capacitya\": \"capacityafsku\", # Format is \"P SKU\": \"F SKU\"\n"," \"capacityb\": \"capacitybfsku\",\n"," \"capacityc\": \"capacitycfsku\",\n","}\n","\n","p_skus = list(capacity_mapping.keys())\n","\n","for p_sku in p_skus:\n"," labs.migrate_workspaces(\n"," source_capacity=p_sku,\n"," target_capacity=capacity_mapping.get(p_sku)\n"," )\n"," # Optionally migrate settings\n"," \"\"\"\n"," labs.migrate_capacity_settings(\n"," source_capacity=p_sku, \n"," target_capacity=capacity_mapping.get(p_sku)\n"," )\n"," labs.migrate_access_settings(\n"," source_capacity=p_sku, \n"," target_capacity=capacity_mapping.get(p_sku)\n"," )\n"," labs.migrate_delegated_tenant_settings(\n"," source_capacity=p_sku, \n"," target_capacity=capacity_mapping.get(p_sku)\n"," )\n"," labs.migrate_disaster_recovery_settings(\n"," source_capacity=p_sku, \n"," target_capacity=capacity_mapping.get(p_sku)\n"," )\n"," labs.migrate_notification_settings(\n"," source_capacity=p_sku, \n"," target_capacity=capacity_mapping.get(p_sku)\n"," )\n"," labs.migrate_spark_settings(\n"," source_capacity=p_sku, \n"," target_capacity=capacity_mapping.get(p_sku)\n"," )\n"," \"\"\"\n"]}],"metadata":{"kernel_info":{"name":"synapse_pyspark"},"kernelspec":{"display_name":"Synapse PySpark","language":"Python","name":"synapse_pyspark"},"language_info":{"name":"python"},"microsoft":{"language":"python"},"nteract":{"version":"nteract-front-end@1.0.0"},"spark_compute":{"compute_id":"/trident/default"},"synapse_widget":{"state":{},"version":"0.1"},"widgets":{}},"nbformat":4,"nbformat_minor":5}
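The F SKU naming convention described in this notebook (an alphanumeric lowercase version of the P SKU capacity name, suffixed with 'fsku') can be sketched in plain Python. This is an illustration of the documented convention only; `fsku_capacity_name` is a hypothetical helper, not the library's internal implementation:

```python
import re

def fsku_capacity_name(p_sku_name: str) -> str:
    # Lowercase the P SKU capacity name, keep only alphanumeric characters,
    # then append the 'fsku' suffix (hypothetical helper illustrating the
    # documented naming convention).
    return re.sub(r"[^a-z0-9]", "", p_sku_name.lower()) + "fsku"

print(fsku_capacity_name("My capacity_3!"))  # -> mycapacity3fsku
```

Under this rule, a P SKU named "capacitya" maps to "capacityafsku", which matches the `capacity_mapping` example in the notebook.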
@@ -0,0 +1 @@
+ {"cells":[{"cell_type":"markdown","id":"5c27dfd1-4fe0-4a97-92e6-ddf78889aa93","metadata":{"nteract":{"transient":{"deleting":false}}},"source":["### Install the latest .whl package\n","\n","Check [here](https://pypi.org/project/semantic-link-labs/) to see the latest version."]},{"cell_type":"code","execution_count":null,"id":"d5cae9db-cef9-48a8-a351-9c5fcc99645c","metadata":{"jupyter":{"outputs_hidden":true,"source_hidden":false},"nteract":{"transient":{"deleting":false}}},"outputs":[],"source":["%pip install semantic-link-labs"]},{"cell_type":"markdown","id":"b195eae8","metadata":{},"source":["### Import the library and set the initial parameters"]},{"cell_type":"code","execution_count":null,"id":"1344e286","metadata":{},"outputs":[],"source":["import sempy_labs as labs\n","import sempy_labs.report as rep\n","from sempy_labs.report import ReportWrapper\n","\n","# For checking reports\n","report_name = '' # Enter the report name\n","report_workspace = None # Enter the workspace in which the report exists\n","rpt = ReportWrapper(report=report_name, workspace=report_workspace)"]},{"cell_type":"markdown","id":"4b68c027","metadata":{},"source":["### Report Best Practice Analyzer (BPA)"]},{"cell_type":"code","execution_count":null,"id":"3a596d68","metadata":{},"outputs":[],"source":["rep.run_report_bpa(report=report_name, workspace=report_workspace)"]},{"cell_type":"markdown","id":"2ac3b59a","metadata":{},"source":["#### Report BPA using custom rules"]},{"cell_type":"code","execution_count":null,"id":"bf0fe645","metadata":{},"outputs":[],"source":["import pandas as pd\n","\n","rules = pd.DataFrame(\n"," [\n"," (\n"," \"Error Prevention\",\n"," \"Semantic Model\",\n"," \"Error\",\n"," \"Fix report objects which reference invalid semantic model objects\",\n"," lambda df: df[\"Valid Semantic Model Object\"] == False,\n"," \"This rule highlights visuals, report filters, page filters or visual filters which reference an invalid semantic model object (i.e 
Measure/Column/Hierarchy).\",\n"," \"\",\n"," ),\n"," (\n"," \"Performance\",\n"," \"Page\",\n"," \"Warning\",\n"," \"Reduce the number of visible visuals on the page\",\n"," lambda df: df[\"Visible Visual Count\"] > 15,\n"," \"Reducing the number of visible visuals on a page will lead to faster report performance. This rule flags pages with over 15 visible visuals.\",\n"," \"\",\n"," ),\n"," (\n"," \"Performance\",\n"," \"Visual\",\n"," \"Warning\",\n"," \"Reduce the number of objects within visuals\",\n"," lambda df: df[\"Visual Object Count\"] > 5,\n"," \"Reducing the number of objects (e.g. measures, columns) which are used in a visual will lead to faster report performance.\",\n"," \"\",\n"," )\n"," ],\n"," columns=[\n"," \"Category\",\n"," \"Scope\",\n"," \"Severity\",\n"," \"Rule Name\",\n"," \"Expression\",\n"," \"Description\",\n"," \"URL\",\n"," ],\n",")\n","\n","rep.run_report_bpa(report=report_name, workspace=report_workspace, rules=rules)"]},{"cell_type":"markdown","id":"5a3fe6e8-b8aa-4447-812b-7931831e07fe","metadata":{"nteract":{"transient":{"deleting":false}}},"source":["### View all semantic model objects within a report"]},{"cell_type":"code","execution_count":null,"id":"9e349954","metadata":{},"outputs":[],"source":["rpt.list_semantic_model_objects()"]},{"cell_type":"markdown","id":"9efe783e","metadata":{},"source":["#### View broken report objects (if any semantic model objects used in the report do not exist within the semantic model)\n","The 'Valid Object' column indicates if the object (measure/column/hierarchy) exists within the semantic model used by the report"]},{"cell_type":"code","execution_count":null,"id":"a7480602","metadata":{},"outputs":[],"source":["rpt.list_semantic_model_objects(extended=True)"]},{"cell_type":"markdown","id":"5fab7ef7","metadata":{},"source":["#### View broken report objects across all reports tied to a semantic 
model"]},{"cell_type":"code","execution_count":null,"id":"34f1d9a6","metadata":{},"outputs":[],"source":["dataset_name = '' # Enter the semantic model name\n","dataset_workspace = None # Enter the workspace name in which the semantic model exists (if set to None it will use the workspace in which the notebook is running)\n","labs.list_report_semantic_model_objects(dataset=dataset_name, workspace=dataset_workspace, extended=True)"]},{"cell_type":"markdown","id":"26acd665","metadata":{},"source":["### List functions within the ReportWrapper"]},{"cell_type":"code","execution_count":null,"id":"9c2c206a","metadata":{},"outputs":[],"source":["rpt.list_custom_visuals()"]},{"cell_type":"code","execution_count":null,"id":"22647267","metadata":{},"outputs":[],"source":["rpt.list_pages()"]},{"cell_type":"code","execution_count":null,"id":"493fb104","metadata":{},"outputs":[],"source":["rpt.list_visuals()"]},{"cell_type":"code","execution_count":null,"id":"1ef9a9ac","metadata":{},"outputs":[],"source":["rpt.list_visual_objects()"]},{"cell_type":"code","execution_count":null,"id":"ad00499c","metadata":{},"outputs":[],"source":["rpt.list_report_filters()"]},{"cell_type":"code","execution_count":null,"id":"c0f02a49","metadata":{},"outputs":[],"source":["rpt.list_page_filters()"]},{"cell_type":"code","execution_count":null,"id":"4cf8d12f","metadata":{},"outputs":[],"source":["rpt.list_visual_filters()"]},{"cell_type":"code","execution_count":null,"id":"3b5e9c16","metadata":{},"outputs":[],"source":["rpt.list_report_level_measures()"]},{"cell_type":"code","execution_count":null,"id":"9f420f57","metadata":{},"outputs":[],"source":["rpt.list_visual_interactions()"]},{"cell_type":"code","execution_count":null,"id":"198afff0","metadata":{},"outputs":[],"source":["rpt.list_bookmarks()"]},{"cell_type":"markdown","id":"f3ee5962","metadata":{},"source":["### Perform actions on a report\n","Set a custom theme for a report based on a Power BI theme 
file"]},{"cell_type":"code","execution_count":null,"id":"ec1a7d1b","metadata":{},"outputs":[],"source":["rpt.set_theme(theme_file_path='/lakehouse/default/Files/CY23SU09.json')"]},{"cell_type":"code","execution_count":null,"id":"b1203067","metadata":{},"outputs":[],"source":["rpt.set_theme(theme_file_path='https://raw.githubusercontent.com/PowerBiDevCamp/FabricUserApiDemo/main/FabricUserApiDemo/DefinitionTemplates/Shared/Reports/StaticResources/SharedResources/BaseThemes/CY23SU08.json')"]},{"cell_type":"markdown","id":"9b230f9f","metadata":{},"source":["Set whether a Power BI report page is hidden or visible"]},{"cell_type":"code","execution_count":null,"id":"8ce57483","metadata":{},"outputs":[],"source":["rpt.set_page_visibility(page_name='Page 1', hidden=True)"]},{"cell_type":"markdown","id":"ef3e4c02","metadata":{},"source":["Set the active page (page shown upon opening the report)"]},{"cell_type":"code","execution_count":null,"id":"024d4111","metadata":{},"outputs":[],"source":["rpt.set_active_page(page_name='Page 2')"]},{"cell_type":"markdown","id":"f96511ac","metadata":{},"source":["Disable the 'show items with no data' for all visuals within a Power BI report"]},{"cell_type":"code","execution_count":null,"id":"65bf38fe","metadata":{},"outputs":[],"source":["rpt.disable_show_items_with_no_data()"]},{"cell_type":"markdown","id":"ddee3565","metadata":{},"source":["Remove all custom visuals in a Power BI report which are not used in any visuals within the report"]},{"cell_type":"code","execution_count":null,"id":"9aba1564","metadata":{},"outputs":[],"source":["rpt.remove_unnecessary_custom_visuals()"]},{"cell_type":"markdown","id":"5342423f","metadata":{},"source":["Hide all pages which are used for a tooltip or drillthrough"]},{"cell_type":"code","execution_count":null,"id":"b9045cbb","metadata":{},"outputs":[],"source":["rpt.hide_tooltip_drillthrough_pages()"]}],"metadata":{"kernel_info":{"name":"synapse_pyspark"},"kernelspec":{"display_name":"Synapse 
PySpark","language":"Python","name":"synapse_pyspark"},"language_info":{"name":"python"},"microsoft":{"language":"python"},"nteract":{"version":"nteract-front-end@1.0.0"},"spark_compute":{"compute_id":"/trident/default"},"synapse_widget":{"state":{},"version":"0.1"},"widgets":{}},"nbformat":4,"nbformat_minor":5}
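The custom BPA rules in the notebook above are boolean-mask callables evaluated against a metadata DataFrame. Below is a minimal stand-alone sketch of that pattern; the sample data is made up for illustration only, whereas a real run derives its metadata from the report via the ReportWrapper:

```python
import pandas as pd

# Illustrative page metadata; a real BPA run builds this from the report.
pages = pd.DataFrame(
    {"Page Name": ["Overview", "Detail"], "Visible Visual Count": [22, 8]}
)

# Same shape as the rule expressions above: a callable returning a boolean mask.
rule = lambda df: df["Visible Visual Count"] > 15

# Rows where the mask is True are the objects the rule would flag.
flagged = pages.loc[rule(pages), "Page Name"].tolist()
print(flagged)  # -> ['Overview']
```

This is why each rule's "Expression" column holds a lambda: the analyzer can apply it directly to the scoped DataFrame and report the violating rows.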
@@ -0,0 +1 @@
+ {"cells":[{"cell_type":"markdown","id":"5c27dfd1-4fe0-4a97-92e6-ddf78889aa93","metadata":{"nteract":{"transient":{"deleting":false}}},"source":["### Install the latest .whl package\n","\n","Check [here](https://pypi.org/project/semantic-link-labs/) to see the latest version."]},{"cell_type":"code","execution_count":null,"id":"d5cae9db-cef9-48a8-a351-9c5fcc99645c","metadata":{"jupyter":{"outputs_hidden":true,"source_hidden":false},"nteract":{"transient":{"deleting":false}}},"outputs":[],"source":["%pip install semantic-link-labs"]},{"cell_type":"markdown","id":"b195eae8","metadata":{},"source":["### Import the library and necessary packages"]},{"cell_type":"code","execution_count":null,"id":"1344e286","metadata":{},"outputs":[],"source":["import sempy_labs as labs\n","from sempy_labs import ConnectWarehouse\n","from sempy_labs import ConnectLakehouse\n","\n","lakehouse_name = ''\n","lakehouse_workspace_name = ''\n","warehouse_name = ''\n","warehouse_workspace_name = ''"]},{"cell_type":"markdown","id":"55e5ca67","metadata":{},"source":["### Run a SQL query (or queries) against a Fabric warehouse"]},{"cell_type":"code","execution_count":null,"id":"a9f984e9","metadata":{},"outputs":[],"source":["with ConnectWarehouse(warehouse=warehouse_name, workspace=warehouse_workspace_name) as sql:\n"," df = sql.query(\"SELECT * FROM Product\")\n"," display(df)"]},{"cell_type":"code","execution_count":null,"id":"865ac4a1","metadata":{},"outputs":[],"source":["with ConnectWarehouse(warehouse=warehouse_name, workspace=warehouse_workspace_name) as sql:\n"," dfs = sql.query([\"SELECT * FROM Product\", \"SELECT DISTINCT [Category] FROM Product\"])\n","\n","for df in dfs:\n"," display(df)"]},{"cell_type":"markdown","id":"634700c3","metadata":{},"source":["### Run a T-SQL query (or queries) against a Fabric warehouse"]},{"cell_type":"code","execution_count":null,"id":"5dbf34f3","metadata":{},"outputs":[],"source":["with ConnectWarehouse(warehouse=warehouse_name, 
workspace=warehouse_workspace_name) as sql:\n"," sql.query(\"CREATE SCHEMA [Business]\")"]},{"cell_type":"code","execution_count":null,"id":"ec8ddb81","metadata":{},"outputs":[],"source":["with ConnectWarehouse(warehouse=warehouse_name, workspace=warehouse_workspace_name) as sql:\n"," sql.query([\"CREATE SCHEMA [Business]\", \"CREATE SCHEMA [Marketing]\"])"]},{"cell_type":"markdown","id":"d5b090da","metadata":{},"source":["### Run a SQL query (or queries) against a Fabric lakehouse"]},{"cell_type":"code","execution_count":null,"id":"4dca7f4a","metadata":{},"outputs":[],"source":["with ConnectLakehouse(lakehouse=lakehouse_name, workspace=lakehouse_workspace_name) as sql:\n"," df = sql.query(\"SELECT * FROM Product\")\n"," display(df)"]},{"cell_type":"code","execution_count":null,"id":"b9606ae8","metadata":{},"outputs":[],"source":["with ConnectLakehouse(lakehouse=lakehouse_name, workspace=lakehouse_workspace_name) as sql:\n"," dfs = sql.query([\"SELECT * FROM Product\", \"SELECT DISTINCT [Category] FROM Product\"])\n","\n","for df in dfs:\n"," display(df)"]}],"metadata":{"kernel_info":{"name":"synapse_pyspark"},"kernelspec":{"display_name":"Synapse PySpark","language":"Python","name":"synapse_pyspark"},"language_info":{"name":"python"},"microsoft":{"language":"python"},"nteract":{"version":"nteract-front-end@1.0.0"},"spark_compute":{"compute_id":"/trident/default"},"synapse_widget":{"state":{},"version":"0.1"},"widgets":{}},"nbformat":4,"nbformat_minor":5}
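`ConnectWarehouse` and `ConnectLakehouse` follow Python's standard context-manager pattern around a SQL connection: the with-block opens the connection, `query` runs statements, and cleanup happens on exit. The sqlite3 sketch below shows the same with-block shape in a self-contained way; sqlite3 is purely a stand-in and is not how the library connects to Fabric SQL endpoints:

```python
import sqlite3
from contextlib import closing

# In-memory database standing in for a warehouse/lakehouse SQL endpoint.
with closing(sqlite3.connect(":memory:")) as conn:
    conn.execute("CREATE TABLE Product (Name TEXT, Category TEXT)")
    conn.executemany(
        "INSERT INTO Product VALUES (?, ?)",
        [("Bike", "Outdoor"), ("Tent", "Outdoor"), ("Mug", "Kitchen")],
    )
    # Analogue of sql.query("SELECT DISTINCT [Category] FROM Product")
    rows = conn.execute(
        "SELECT DISTINCT Category FROM Product ORDER BY Category"
    ).fetchall()

print(rows)  # -> [('Kitchen',), ('Outdoor',)]
```

The connection is closed automatically when the with-block exits, which is the same guarantee the library's context managers provide.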
@@ -0,0 +1 @@
+ {"cells":[{"cell_type":"markdown","id":"5c27dfd1-4fe0-4a97-92e6-ddf78889aa93","metadata":{"nteract":{"transient":{"deleting":false}}},"source":["### Install the latest .whl package\n","\n","Check [here](https://pypi.org/project/semantic-link-labs/) to see the latest version."]},{"cell_type":"code","execution_count":null,"id":"d5cae9db-cef9-48a8-a351-9c5fcc99645c","metadata":{"jupyter":{"outputs_hidden":true,"source_hidden":false},"nteract":{"transient":{"deleting":false}}},"outputs":[],"source":["%pip install semantic-link-labs"]},{"cell_type":"markdown","id":"5a3fe6e8-b8aa-4447-812b-7931831e07fe","metadata":{"nteract":{"transient":{"deleting":false}}},"source":["### Connect to the [Tabular Object Model](https://learn.microsoft.com/analysis-services/tom/introduction-to-the-tabular-object-model-tom-in-analysis-services-amo?view=asallproducts-allversions) ([TOM](https://learn.microsoft.com/dotnet/api/microsoft.analysisservices.tabular.model?view=analysisservices-dotnet))\n","Setting the 'readonly' property to False enables read/write mode. This allows changes to be made to the semantic model."]},{"cell_type":"code","execution_count":null,"id":"cde43b47-4ecc-46ae-9125-9674819c7eab","metadata":{"jupyter":{"outputs_hidden":false,"source_hidden":false},"nteract":{"transient":{"deleting":false}}},"outputs":[],"source":["import sempy_labs as labs\n","from sempy_labs.tom import connect_semantic_model\n","\n","dataset = '' # Enter dataset name\n","workspace = None # Enter workspace name\n","\n","with connect_semantic_model(dataset=dataset, readonly=True, workspace=workspace) as tom:\n"," for t in tom.model.Tables:\n"," print(t.Name)"]},{"cell_type":"markdown","id":"fc6b277e","metadata":{},"source":["### Make changes to a semantic model using custom functions\n","Note that the custom functions have additional optional parameters (which may not be used in the examples below) for adding properties to model objects. 
Check the [documentation](https://semantic-link-labs.readthedocs.io/en/stable/sempy_labs.tom.html) to see all available parameters for each function."]},{"cell_type":"markdown","id":"6d46d878","metadata":{},"source":["#### Rename objects in the semantic model"]},{"cell_type":"code","execution_count":null,"id":"1284825a","metadata":{},"outputs":[],"source":["with connect_semantic_model(dataset=dataset, readonly=False, workspace=workspace) as tom:\n"," for t in tom.model.Tables:\n"," t.Name = t.Name.replace('_',' ')\n"]},{"cell_type":"code","execution_count":null,"id":"d3b60303","metadata":{},"outputs":[],"source":["with connect_semantic_model(dataset=dataset, readonly=False, workspace=workspace) as tom:\n"," for c in tom.all_columns():\n"," c.Name = c.Name.replace('_',' ')"]},{"cell_type":"markdown","id":"402a477c","metadata":{},"source":["#### Add measure(s) to the semantic model"]},{"cell_type":"code","execution_count":null,"id":"bdaaaa5c","metadata":{},"outputs":[],"source":["with connect_semantic_model(dataset=dataset, readonly=False, workspace=workspace) as tom:\n"," tom.add_measure(table_name='Internet Sales', measure_name='Sales Amount', expression=\"SUM('Internet Sales'[SalesAmount])\")\n"," tom.add_measure(table_name='Internet Sales', measure_name='Order Quantity', expression=\"SUM('Internet Sales'[OrderQty])\") "]},{"cell_type":"code","execution_count":null,"id":"a53a544b","metadata":{},"outputs":[],"source":["with connect_semantic_model(dataset=dataset, readonly=False, workspace=workspace) as tom:\n"," for t in tom.model.Tables:\n"," if t.Name == 'Internet Sales':\n"," tom.add_measure(table_name=t.Name, measure_name='Sales Amount', expression=\"SUM('Internet Sales'[SalesAmount])\")\n"," tom.add_measure(table_name=t.Name, measure_name='Order Quantity', expression=\"SUM('Internet Sales'[OrderQty])\")"]},{"cell_type":"markdown","id":"1cb1632f","metadata":{},"source":["#### Add column(s) to the semantic 
model"]},{"cell_type":"code","execution_count":null,"id":"81a22749","metadata":{},"outputs":[],"source":["with connect_semantic_model(dataset=dataset, readonly=False, workspace=workspace) as tom:\n"," tom.add_data_column(table_name='Product', column_name='Size Range', source_column='SizeRange', data_type='Int64')\n"," tom.add_data_column(table_name= 'Segment', column_name='Summary Segment', source_column='SummarySegment', data_type='String')\n","\n"," tom.add_calculated_column(table_name='Internet Sales', column_name='GrossMargin', expression=\"'Internet Sales'[SalesAmount] - 'Internet Sales'[ProductCost]\", data_type='Decimal')"]},{"cell_type":"code","execution_count":null,"id":"053b6516","metadata":{},"outputs":[],"source":["with connect_semantic_model(dataset=dataset, readonly=False, workspace=workspace) as tom:\n"," for t in tom.model.Tables:\n"," if t.Name == 'Product':\n"," tom.add_data_column(table_name=t.Name, column_name='Size Range', source_column='SizeRange', data_type='Int64')\n"," elif t.Name == 'Segment':\n"," tom.add_data_column(table_name = t.Name, column_name='Summary Segment', source_column='SummarySegment', data_type='String')\n"," elif t.Name == 'Internet Sales':\n"," tom.add_calculated_column(table_name=t.Name, column_name='GrossMargin', expression=\"'Internet Sales'[SalesAmount] - 'Internet Sales'[ProductCost]\", data_type='Decimal')"]},{"cell_type":"markdown","id":"f53dcca7","metadata":{},"source":["#### Add hierarchies to the semantic model"]},{"cell_type":"code","execution_count":null,"id":"a9309e23","metadata":{},"outputs":[],"source":["with connect_semantic_model(dataset=dataset, readonly=False, workspace=workspace) as tom:\n"," tom.add_hierarchy(table_name='Geography', hierarchy_name='Geo Hierarchy', levels=['Continent', 'Country', 'State', 'City'])"]},{"cell_type":"code","execution_count":null,"id":"a04281ce","metadata":{},"outputs":[],"source":["with connect_semantic_model(dataset=dataset, readonly=False, workspace=workspace) as 
tom:\n"," for t in tom.model.Tables:\n"," if t.Name == 'Geography':\n"," tom.add_hierarchy(table_name=t.Name, hierarchy_name='Geo Hierarchy', levels=['Continent', 'Country', 'State', 'City'])"]},{"cell_type":"markdown","id":"47c06a4f","metadata":{},"source":["#### Add relationship(s) to the semantic model"]},{"cell_type":"code","execution_count":null,"id":"e8cd7bbf","metadata":{},"outputs":[],"source":["with connect_semantic_model(dataset=dataset, readonly=False, workspace=workspace) as tom:\n"," tom.add_relationship(\n"," from_table='Internet Sales', from_column='ProductKey',\n"," to_table='Product', to_column ='ProductKey', \n"," from_cardinality='Many', to_cardinality='One')"]},{"cell_type":"markdown","id":"3cc7f11e","metadata":{},"source":["#### Add a table with an M partition to a semantic model"]},{"cell_type":"code","execution_count":null,"id":"0f5dd66a","metadata":{},"outputs":[],"source":["with connect_semantic_model(dataset=dataset, readonly=False, workspace=workspace) as tom:\n"," table_name='Sales'\n"," tom.add_table(name=table_name)\n"," tom.add_m_partition(table_name=table_name, partition_name=table_name, expression='let....')"]},{"cell_type":"markdown","id":"ea389123","metadata":{},"source":["#### Add a table with an entity partition to a Direct Lake semantic model "]},{"cell_type":"code","execution_count":null,"id":"f75387d1","metadata":{},"outputs":[],"source":["with connect_semantic_model(dataset=dataset, readonly=False, workspace=workspace) as tom:\n"," table_name = 'Sales'\n"," tom.add_table(name=table_name)\n"," tom.add_entity_partition(table_name=table_name, entity_name=table_name)"]},{"cell_type":"markdown","id":"e74d0f54","metadata":{},"source":["#### Add a calculated table (and columns) to a semantic model"]},{"cell_type":"code","execution_count":null,"id":"934f7315","metadata":{},"outputs":[],"source":["with connect_semantic_model(dataset=dataset, readonly=False, workspace=workspace) as tom:\n"," table_name = 'Sales'\n"," 
tom.add_calculated_table(name=table_name, expression=\"DISTINCT('Product'[Color])\")\n"," tom.add_calculated_table_column(table_name=table_name, column_name='Color', source_column=\"'Product'[Color]\", data_type='String')"]},{"cell_type":"markdown","id":"0e7088b7","metadata":{},"source":["#### Add role(s) to the semantic model"]},{"cell_type":"code","execution_count":null,"id":"ad60ebb9","metadata":{},"outputs":[],"source":["with connect_semantic_model(dataset=dataset, readonly=False, workspace=workspace) as tom:\n"," tom.add_role(role_name='Reader')"]},{"cell_type":"markdown","id":"c541f81a","metadata":{},"source":["#### Set row level security (RLS) to the semantic model\n","This adds row level security (or updates it if it already exists)"]},{"cell_type":"code","execution_count":null,"id":"98603a08","metadata":{},"outputs":[],"source":["with connect_semantic_model(dataset=dataset, readonly=False, workspace=workspace) as tom:\n"," tom.set_rls(\n"," role_name='Reader', \n"," table_name='Product',\n"," filter_expression=\"'Dim Product'[Color] = \\\"Blue\\\"\"\n"," )"]},{"cell_type":"code","execution_count":null,"id":"effea009","metadata":{},"outputs":[],"source":["with connect_semantic_model(dataset=dataset, readonly=False, workspace=workspace) as tom:\n"," for r in tom.model.Roles:\n"," if r.Name == 'Reader':\n"," tom.set_rls(role_name=r.Name, table_name='Product', filter_expression=\"'Dim Product'[Color] = \\\"Blue\\\"\")"]},{"cell_type":"markdown","id":"7fa7a03c","metadata":{},"source":["#### Set object level security (OLS) to the semantic model\n","This adds object level security (or updates it if it already exists)"]},{"cell_type":"code","execution_count":null,"id":"dd0def9d","metadata":{},"outputs":[],"source":["with connect_semantic_model(dataset=dataset, readonly=False, workspace=workspace) as tom:\n"," tom.set_ols(role_name='Reader', table_name='Product', column_name='Size', 
permission='None')"]},{"cell_type":"code","execution_count":null,"id":"7a389dc7","metadata":{},"outputs":[],"source":["with connect_semantic_model(dataset=dataset, readonly=False, workspace=workspace) as tom:\n"," for r in tom.model.Roles:\n"," if r.Name == 'Reader':\n"," for t in tom.model.Tables:\n"," if t.Name == 'Product':\n"," tom.set_ols(role_name=r.Name, table_name=t.Name, column_name='Size', permission='None')"]},{"cell_type":"markdown","id":"d0f7ccd1","metadata":{},"source":["#### Add calculation groups and calculation items to the semantic model"]},{"cell_type":"code","execution_count":null,"id":"97f4708b","metadata":{},"outputs":[],"source":["with connect_semantic_model(dataset=dataset, readonly=False, workspace=workspace) as tom:\n"," tom.add_calculation_group(name='MyCalcGroup')"]},{"cell_type":"code","execution_count":null,"id":"fef68832","metadata":{},"outputs":[],"source":["with connect_semantic_model(dataset=dataset, readonly=False, workspace=workspace) as tom:\n"," tom.add_calculation_item(table_name='MyCalcGroup', calculation_item_name='YTD', expression=\"CALCULATE(SELECTEDMEASURE(), DATESYTD('Calendar'[CalendarDate]))\")\n"," tom.add_calculation_item(table_name='MyCalcGroup', calculation_item_name='MTD', expression=\"CALCULATE(SELECTEDMEASURE(), DATESMTD('Calendar'[CalendarDate]))\")"]},{"cell_type":"code","execution_count":null,"id":"c7653dcc","metadata":{},"outputs":[],"source":["with connect_semantic_model(dataset=dataset, readonly=False, workspace=workspace) as tom:\n"," for t in tom.model.Tables:\n"," if t.Name == 'MyCalcGroup':\n"," tom.add_calculation_item(table_name=t.Name, calculation_item_name='YTD', expression=\"CALCULATE(SELECTEDMEASURE(), DATESYTD('Calendar'[CalendarDate]))\")\n"," tom.add_calculation_item(table_name=t.Name, calculation_item_name='MTD', expression=\"CALCULATE(SELECTEDMEASURE(), DATESMTD('Calendar'[CalendarDate]))\")"]},{"cell_type":"markdown","id":"c6450c74","metadata":{},"source":["#### Add translations to a 
semantic model"]},{"cell_type":"code","execution_count":null,"id":"2b616b90","metadata":{},"outputs":[],"source":["with connect_semantic_model(dataset=dataset, readonly=False, workspace=workspace) as tom:\n"," tom.add_translation(language='it-IT')"]},{"cell_type":"code","execution_count":null,"id":"dc24c200","metadata":{},"outputs":[],"source":["with connect_semantic_model(dataset=dataset, readonly=False, workspace=workspace) as tom:\n"," tom.set_translation(object = tom.model.Tables['Product'], language='it-IT', property='Name', value='Produtto')"]},{"cell_type":"markdown","id":"3048cc95","metadata":{},"source":["#### Add a [Field Parameter](https://learn.microsoft.com/power-bi/create-reports/power-bi-field-parameters) to a semantic model"]},{"cell_type":"code","execution_count":null,"id":"0a94af94","metadata":{},"outputs":[],"source":["with connect_semantic_model(dataset=dataset, readonly=False, workspace=workspace) as tom:\n"," tom.add_field_parameter(table_name='Parameter', objects=\"'Product'[Color], [Sales Amount], 'Geography'[Country]\")"]},{"cell_type":"markdown","id":"95aac09a","metadata":{},"source":["#### Remove an object(s) from a semantic model"]},{"cell_type":"code","execution_count":null,"id":"1e2572a8","metadata":{},"outputs":[],"source":["with connect_semantic_model(dataset=dataset, readonly=False, workspace=workspace) as tom:\n"," for t in tom.model.Tables:\n"," if t.Name == 'Product':\n"," tom.remove_object(object=t.Columns['Size'])\n"," tom.remove_object(object=t.Hierarchies['Product Hierarchy'])"]},{"cell_type":"code","execution_count":null,"id":"bc453177","metadata":{},"outputs":[],"source":["with connect_semantic_model(dataset=dataset, readonly=False, workspace=workspace) as tom:\n"," tom.remove_object(object=tom.model.Tables['Product'].Columns['Size'])\n"," tom.remove_object(object=tom.model.Tables['Product'].Hierarchies['Product Hierarchy'])"]},{"cell_type":"markdown","id":"e0d0cb9e","metadata":{},"source":["### Custom functions to loop 
through non-top-level objects in a semantic model"]},{"cell_type":"code","execution_count":null,"id":"cbe3b1a3","metadata":{},"outputs":[],"source":["with connect_semantic_model(dataset=dataset, readonly=True, workspace=workspace) as tom:\n"," for c in tom.all_columns():\n"," print(c.Name)"]},{"cell_type":"code","execution_count":null,"id":"3f643e66","metadata":{},"outputs":[],"source":["with connect_semantic_model(dataset=dataset, readonly=True, workspace=workspace) as tom:\n"," for m in tom.all_measures():\n"," print(m.Name)"]},{"cell_type":"code","execution_count":null,"id":"ed1cde0f","metadata":{},"outputs":[],"source":["with connect_semantic_model(dataset=dataset, readonly=True, workspace=workspace) as tom:\n"," for p in tom.all_partitions():\n"," print(p.Name)"]},{"cell_type":"code","execution_count":null,"id":"f48014ae","metadata":{},"outputs":[],"source":["with connect_semantic_model(dataset=dataset, readonly=True, workspace=workspace) as tom:\n"," for h in tom.all_hierarchies():\n"," print(h.Name)"]},{"cell_type":"code","execution_count":null,"id":"9f5e7b72","metadata":{},"outputs":[],"source":["with connect_semantic_model(dataset=dataset, readonly=True, workspace=workspace) as tom:\n"," for ci in tom.all_calculation_items():\n"," print(ci.Name)"]},{"cell_type":"code","execution_count":null,"id":"3cd9ebc1","metadata":{},"outputs":[],"source":["with connect_semantic_model(dataset=dataset, readonly=True, workspace=workspace) as tom:\n"," for l in tom.all_levels():\n"," print(l.Name)"]},{"cell_type":"code","execution_count":null,"id":"12c58bad","metadata":{},"outputs":[],"source":["with connect_semantic_model(dataset=dataset, readonly=False, workspace=workspace) as tom:\n"," for rls in tom.all_rls():\n"," print(rls.Name)"]},{"cell_type":"markdown","id":"1a294bd2","metadata":{},"source":["### See Vertipaq Analyzer stats"]},{"cell_type":"code","execution_count":null,"id":"469660e9","metadata":{},"outputs":[],"source":["with 
connect_semantic_model(dataset=dataset, readonly=False, workspace=workspace) as tom:\n"," tom.set_vertipaq_annotations()\n","\n"," for t in tom.model.Tables:\n"," rc = tom.row_count(object = t)\n"," print(f\"{t.Name} : {str(rc)}\")\n"," for c in t.Columns:\n"," col_size = tom.total_size(object=c)\n"," print(labs.format_dax_object_name(t.Name, c.Name) + ' : ' + str(col_size))"]},{"cell_type":"markdown","id":"1ab26dfd","metadata":{},"source":["### 'UsedIn' functions"]},{"cell_type":"code","execution_count":null,"id":"412bf287","metadata":{},"outputs":[],"source":["with connect_semantic_model(dataset=dataset, readonly=True, workspace=workspace) as tom:\n"," for c in tom.all_columns():\n"," full_name = labs.format_dax_object_name(c.Parent.Name, c.Name)\n"," for h in tom.used_in_hierarchies(column = c):\n"," print(f\"{full_name} : {h.Name}\")"]},{"cell_type":"code","execution_count":null,"id":"76556900","metadata":{},"outputs":[],"source":["with connect_semantic_model(dataset=dataset, readonly=True, workspace=workspace) as tom:\n"," for c in tom.all_columns():\n"," full_name = labs.format_dax_object_name(c.Parent.Name, c.Name)\n"," for r in tom.used_in_relationships(object = c):\n"," rel_name = labs.create_relationship_name(r.FromTable.Name, r.FromColumn.Name, r.ToTable.Name, r.ToColumn.Name)\n"," print(f\"{full_name} : {rel_name}\")"]},{"cell_type":"code","execution_count":null,"id":"4d9ec24e","metadata":{},"outputs":[],"source":["with connect_semantic_model(dataset=dataset, readonly=True, workspace=workspace) as tom:\n"," for t in tom.model.Tables:\n"," for r in tom.used_in_relationships(object = t):\n"," rel_name = labs.create_relationship_name(r.FromTable.Name, r.FromColumn.Name, r.ToTable.Name, r.ToColumn.Name)\n"," print(f\"{t.Name} : {rel_name}\")"]},{"cell_type":"code","execution_count":null,"id":"82251336","metadata":{},"outputs":[],"source":["with connect_semantic_model(dataset=dataset, readonly=True, workspace=workspace) as tom:\n"," dep = 
labs.get_model_calc_dependencies(dataset = dataset, workspace=workspace)\n"," for o in tom.used_in_rls(object = tom.model.Tables['Product'].Columns['Color'], dependencies=dep):\n"," print(o.Name)"]}],"metadata":{"kernel_info":{"name":"synapse_pyspark"},"kernelspec":{"display_name":"Synapse PySpark","language":"Python","name":"synapse_pyspark"},"language_info":{"name":"python"},"microsoft":{"language":"python"},"nteract":{"version":"nteract-front-end@1.0.0"},"spark_compute":{"compute_id":"/trident/default"},"synapse_widget":{"state":{},"version":"0.1"},"widgets":{}},"nbformat":4,"nbformat_minor":5}
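As the notebook notes, `connect_semantic_model(..., readonly=False)` enables read/write mode, and changes made inside the `with` block are only committed to the semantic model on a clean exit. That save-on-exit behavior can be sketched generically as below (an illustrative, hypothetical stand-in named `MockTOMConnection`, not the real TOM wrapper):

```python
# Illustrative stand-in for connect_semantic_model's readonly behavior:
# changes are collected while the context is open and only committed on
# exit when readonly=False and no exception occurred.
class MockTOMConnection:
    def __init__(self, readonly=True):
        self.readonly = readonly
        self.pending = []   # changes made inside the 'with' block
        self.saved = []     # changes committed to the model on exit

    def __enter__(self):
        return self

    def __exit__(self, exc_type, exc, tb):
        if not self.readonly and exc_type is None:
            self.saved.extend(self.pending)  # the SaveChanges() step
        self.pending.clear()
        return False

    def add_measure(self, table_name, measure_name, expression):
        # In read-only mode the change is recorded but never committed.
        self.pending.append((table_name, measure_name, expression))


tom = MockTOMConnection(readonly=False)
with tom:
    tom.add_measure("Internet Sales", "Sales Amount", "SUM('Internet Sales'[SalesAmount])")
# tom.saved now contains the measure because readonly=False
```

With `readonly=True`, the same `add_measure` call would leave `saved` empty on exit, which is why the notebook's modification examples all open the context with `readonly=False`.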
@@ -7,7 +7,7 @@ name="semantic-link-labs"
  authors = [
  { name = "Microsoft Corporation" },
  ]
- version="0.7.4"
+ version="0.8.1"
  description="Semantic Link Labs for Microsoft Fabric"
  readme="README.md"
  requires-python=">=3.10,<3.12"
@@ -28,6 +28,8 @@ dependencies = [
  "powerbiclient",
  "polib",
  "azure.mgmt.resource",
+ "jsonpath_ng",
+ "deltalake",
  ]
 
  [tool.setuptools.packages.find]
@@ -45,7 +47,7 @@ test = [
  Repository = "https://github.com/microsoft/semantic-link-labs.git"
 
  [[tool.mypy.overrides]]
- module = "sempy.*,Microsoft.*,System.*,anytree.*,powerbiclient.*,synapse.ml.services.*,polib.*,azure.mgmt.resource.*"
+ module = "sempy.*,Microsoft.*,System.*,anytree.*,powerbiclient.*,synapse.ml.services.*,polib.*,azure.mgmt.resource.*,jsonpath_ng.*,deltalake.*"
  ignore_missing_imports = true
 
  [tool.flake8]
@@ -1,6 +1,6 @@
  Metadata-Version: 2.1
  Name: semantic-link-labs
- Version: 0.7.4
+ Version: 0.8.1
  Summary: Semantic Link Labs for Microsoft Fabric
  Author: Microsoft Corporation
  License: MIT License
@@ -20,13 +20,15 @@ Requires-Dist: anytree
  Requires-Dist: powerbiclient
  Requires-Dist: polib
  Requires-Dist: azure.mgmt.resource
+ Requires-Dist: jsonpath_ng
+ Requires-Dist: deltalake
  Provides-Extra: test
  Requires-Dist: pytest>=8.2.1; extra == "test"
 
  # Semantic Link Labs
 
  [![PyPI version](https://badge.fury.io/py/semantic-link-labs.svg)](https://badge.fury.io/py/semantic-link-labs)
- [![Read The Docs](https://readthedocs.org/projects/semantic-link-labs/badge/?version=0.7.4&style=flat)](https://readthedocs.org/projects/semantic-link-labs/)
+ [![Read The Docs](https://readthedocs.org/projects/semantic-link-labs/badge/?version=0.8.1&style=flat)](https://readthedocs.org/projects/semantic-link-labs/)
  [![Code style: black](https://img.shields.io/badge/code%20style-black-000000.svg)](https://github.com/psf/black)
  [![Downloads](https://static.pepy.tech/badge/semantic-link-labs)](https://pepy.tech/project/semantic-link-labs)
 
@@ -34,9 +36,31 @@ Requires-Dist: pytest>=8.2.1; extra == "test"
  [Read the documentation on ReadTheDocs!](https://semantic-link-labs.readthedocs.io/en/stable/)
  ---
 
- This is a python library intended to be used in [Microsoft Fabric notebooks](https://learn.microsoft.com/fabric/data-engineering/how-to-use-notebook). This library was originally intended to solely contain functions used for [migrating semantic models to Direct Lake mode](https://github.com/microsoft/semantic-link-labs?tab=readme-ov-file#direct-lake-migration). However, it quickly became apparent that functions within such a library could support many other useful activities in the realm of semantic models, reports, lakehouses and really anything Fabric-related. As such, this library contains a variety of functions ranging from running [Vertipaq Analyzer](https://semantic-link-labs.readthedocs.io/en/stable/sempy_labs.html#sempy_labs.import_vertipaq_analyzer) or the [Best Practice Analyzer](https://semantic-link-labs.readthedocs.io/en/stable/sempy_labs.html#sempy_labs.run_model_bpa) against a semantic model to seeing if any [lakehouse tables hit Direct Lake guardrails](https://semantic-link-labs.readthedocs.io/en/stable/sempy_labs.lakehouse.html#sempy_labs.lakehouse.get_lakehouse_tables) or accessing the [Tabular Object Model](https://semantic-link-labs.readthedocs.io/en/stable/sempy_labs.tom.html) and more!
-
- Instructions for migrating import/DirectQuery semantic models to Direct Lake mode can be found [here](https://github.com/microsoft/semantic-link-labs?tab=readme-ov-file#direct-lake-migration).
+ Semantic Link Labs is a Python library designed for use in [Microsoft Fabric notebooks](https://learn.microsoft.com/fabric/data-engineering/how-to-use-notebook). It extends the capabilities of [Semantic Link](https://learn.microsoft.com/fabric/data-science/semantic-link-overview), offering additional functionality that integrates seamlessly alongside it. The goal of Semantic Link Labs is to simplify technical processes, empowering people to focus on higher-level activities while tasks better suited for machines are handled efficiently without human intervention.
+
+ ## Featured Scenarios
+ * Semantic Models
+   * [Migrating an import/DirectQuery semantic model to Direct Lake](https://github.com/microsoft/semantic-link-labs?tab=readme-ov-file#direct-lake-migration)
+   * [Model Best Practice Analyzer (BPA)](https://semantic-link-labs.readthedocs.io/en/stable/sempy_labs.html#sempy_labs.run_model_bpa)
+   * [Vertipaq Analyzer](https://semantic-link-labs.readthedocs.io/en/stable/sempy_labs.html#sempy_labs.vertipaq_analyzer)
+   * [Tabular Object Model](https://github.com/microsoft/semantic-link-labs/blob/main/notebooks/Tabular%20Object%20Model.ipynb) [(TOM)](https://semantic-link-labs.readthedocs.io/en/stable/sempy_labs.tom.html)
+   * [Translate a semantic model's metadata](https://semantic-link-labs.readthedocs.io/en/stable/sempy_labs.html#sempy_labs.translate_semantic_model)
+   * [Check Direct Lake Guardrails](https://semantic-link-labs.readthedocs.io/en/stable/sempy_labs.lakehouse.html#sempy_labs.lakehouse.get_lakehouse_tables)
+   * [Refresh](https://github.com/microsoft/semantic-link-labs/blob/main/notebooks/Semantic%20Model%20Refresh.ipynb), [clear cache](https://semantic-link-labs.readthedocs.io/en/stable/sempy_labs.html#sempy_labs.clear_cache), [backup](https://semantic-link-labs.readthedocs.io/en/stable/sempy_labs.html#sempy_labs.backup_semantic_model), [restore](https://semantic-link-labs.readthedocs.io/en/stable/sempy_labs.html#sempy_labs.restore_semantic_model), [copy backup files](https://semantic-link-labs.readthedocs.io/en/stable/sempy_labs.html#sempy_labs.copy_semantic_model_backup_file), [move/deploy across workspaces](https://semantic-link-labs.readthedocs.io/en/stable/sempy_labs.html#sempy_labs.deploy_semantic_model)
+   * [Run DAX queries which impersonate a user](https://semantic-link-labs.readthedocs.io/en/stable/sempy_labs.html#sempy_labs.evaluate_dax_impersonation)
+ * Reports
+   * [Report Best Practice Analyzer (BPA)](https://semantic-link-labs.readthedocs.io/en/stable/sempy_labs.report.html#sempy_labs.report.run_report_bpa)
+   * [View report metadata](https://github.com/microsoft/semantic-link-labs/blob/main/notebooks/Report%20Analysis.ipynb)
+   * [View semantic model objects most frequently used in Power BI reports](https://semantic-link-labs.readthedocs.io/en/stable/sempy_labs.html#sempy_labs.list_semantic_model_object_report_usage)
+   * [View broken reports](https://semantic-link-labs.readthedocs.io/en/stable/sempy_labs.html#sempy_labs.list_report_semantic_model_objects)
+   * [Rebind reports](https://semantic-link-labs.readthedocs.io/en/stable/sempy_labs.report.html#sempy_labs.report.report_rebind)
+ * Capacities
+   * [Migrating a Power BI Premium capacity (P sku) to a Fabric capacity (F sku)](https://github.com/microsoft/semantic-link-labs/blob/main/notebooks/Capacity%20Migration.ipynb)
+ * APIs
+   * Wrapper functions for [Power BI](https://learn.microsoft.com/rest/api/power-bi/), [Fabric](https://learn.microsoft.com/rest/api/fabric/articles/using-fabric-apis), and [Azure](https://learn.microsoft.com/rest/api/azure/?view=rest-power-bi-embedded-2021-01-01) APIs
+
+
+ ### Check out the [helper notebooks](https://github.com/microsoft/semantic-link-labs/tree/main/notebooks) for getting started!
 
  If you encounter any issues, please [raise a bug](https://github.com/microsoft/semantic-link-labs/issues/new?assignees=&labels=&projects=&template=bug_report.md&title=).
 
@@ -50,11 +74,13 @@ If you have ideas for new features/functions, please [request a feature](https:/
  ## Once installed, run this code to import the library into your notebook
  ```python
  import sempy_labs as labs
- from sempy_labs import migration, directlake
+ from sempy_labs import migration, directlake, admin
  from sempy_labs import lakehouse as lake
  from sempy_labs import report as rep
  from sempy_labs.tom import connect_semantic_model
+ from sempy_labs.report import ReportWrapper
  from sempy_labs import ConnectWarehouse
+ from sempy_labs import ConnectLakehouse
  ```
 
  ## Load semantic-link-labs into a custom [Fabric environment](https://learn.microsoft.com/fabric/data-engineering/create-and-use-environment)
@@ -78,6 +104,8 @@ An even better way to ensure the semantic-link-labs library is available in your
  2. Select your newly created environment within the 'Environment' drop down in the navigation bar at the top of the notebook
 
  ## Version History
+ * [0.8.1](https://github.com/microsoft/semantic-link-labs/releases/tag/0.8.1) (October 2, 2024)
+ * [0.8.0](https://github.com/microsoft/semantic-link-labs/releases/tag/0.8.0) (September 25, 2024)
  * [0.7.4](https://github.com/microsoft/semantic-link-labs/releases/tag/0.7.4) (September 16, 2024)
  * [0.7.3](https://github.com/microsoft/semantic-link-labs/releases/tag/0.7.3) (September 11, 2024)
  * [0.7.2](https://github.com/microsoft/semantic-link-labs/releases/tag/0.7.2) (August 30, 2024)
@@ -124,7 +152,7 @@ Check out my [blog post](https://www.elegantbi.com/post/direct-lake-migration) o
  5. Back in the notebook, the next step will create your new Direct Lake semantic model with the name of your choice, taking all the relevant properties from the original semantic model and refreshing/framing your new semantic model.
 
  > [!NOTE]
- > As of version 0.2.1, calculated tables are also migrated to Direct Lake (as data tables with their DAX expression stored as model annotations in the new semantic model). Additionally, Field Parameters are migrated as they were in the original semantic model (as a calculated table).
+ > Calculated tables are also migrated to Direct Lake (as data tables with their DAX expression stored as model annotations in the new semantic model). Additionally, Field Parameters are migrated as they were in the original semantic model (as a calculated table). [Auto date/time tables](https://learn.microsoft.com/power-bi/guidance/auto-date-time) are not migrated. Auto date/time must be disabled in Power BI Desktop and proper date table(s) must be created prior to migration.
 
  6. Finally, you can easily rebind all of your reports which use the import/DQ semantic model to the new Direct Lake semantic model in one click.
 
@@ -137,6 +165,14 @@ Check out my [blog post](https://www.elegantbi.com/post/direct-lake-migration) o
  * Non-supported objects are not transferred (e.g. calculated columns, relationships using columns with unsupported data types, etc.).
  * Reports used by your original semantic model will be rebound to your new semantic model.
 
+ ### Limitations
+ * Calculated columns are not migrated.
+ * Auto date/time tables are not migrated.
+ * References to calculated columns in Field Parameters are removed.
+ * References to calculated columns in measure expressions or other DAX expressions will break.
+ * Calculated tables are migrated where possible. The success of this migration depends on the interdependencies and complexity of each calculated table; this part of the migration is a workaround, since calculated tables are not technically supported in Direct Lake.
+ * See [here](https://learn.microsoft.com/fabric/get-started/direct-lake-overview#considerations-and-limitations) for the rest of the limitations of Direct Lake.
+
  ## Contributing
 
  This project welcomes contributions and suggestions. Most contributions require you to agree to a