superset 0.1.6 → 0.2.4

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
Files changed (39)
  1. checksums.yaml +4 -4
  2. data/CHANGELOG.md +33 -0
  3. data/README.md +36 -144
  4. data/doc/duplicate_dashboards.md +2 -5
  5. data/doc/migrating_dashboards_across_environments.md +173 -0
  6. data/doc/publishing.md +39 -0
  7. data/doc/setting_up_personal_api_credentials.md +43 -7
  8. data/doc/usage.md +105 -0
  9. data/env.sample +1 -1
  10. data/lib/superset/base_put_request.rb +30 -0
  11. data/lib/superset/chart/create.rb +40 -0
  12. data/lib/superset/chart/duplicate.rb +75 -0
  13. data/lib/superset/chart/put.rb +18 -0
  14. data/lib/superset/chart/update_dataset.rb +1 -1
  15. data/lib/superset/client.rb +7 -1
  16. data/lib/superset/dashboard/bulk_delete_cascade.rb +1 -1
  17. data/lib/superset/dashboard/compare.rb +2 -2
  18. data/lib/superset/dashboard/datasets/list.rb +37 -9
  19. data/lib/superset/dashboard/embedded/get.rb +2 -2
  20. data/lib/superset/dashboard/export.rb +56 -5
  21. data/lib/superset/dashboard/get.rb +5 -0
  22. data/lib/superset/dashboard/import.rb +84 -0
  23. data/lib/superset/dashboard/list.rb +8 -4
  24. data/lib/superset/dashboard/warm_up_cache.rb +1 -1
  25. data/lib/superset/database/export.rb +119 -0
  26. data/lib/superset/database/list.rb +5 -2
  27. data/lib/superset/dataset/get.rb +10 -11
  28. data/lib/superset/dataset/list.rb +1 -1
  29. data/lib/superset/dataset/put.rb +18 -0
  30. data/lib/superset/dataset/update_schema.rb +4 -3
  31. data/lib/superset/file_utilities.rb +4 -3
  32. data/lib/superset/guest_token.rb +14 -7
  33. data/lib/superset/logger.rb +2 -2
  34. data/lib/superset/request.rb +7 -4
  35. data/lib/superset/services/dashboard_loader.rb +69 -0
  36. data/lib/superset/services/duplicate_dashboard.rb +14 -13
  37. data/lib/superset/services/import_dashboard_across_environment.rb +144 -0
  38. data/lib/superset/version.rb +1 -1
  39. metadata +15 -3
checksums.yaml CHANGED
@@ -1,7 +1,7 @@
1
1
  ---
2
2
  SHA256:
3
- metadata.gz: 16bb352b533737ed4faed7d5b4360c850619d60b7997b12638019681a8b36bfd
4
- data.tar.gz: e7dd50138ea34bd23969a61bf6f69f7eb98cc6174f67abdb05fdb9ba4eb60197
3
+ metadata.gz: 0f3edd28b29118fb3c7ebc2eb7a199a92939429683bbe0c19aecc87b09202d9e
4
+ data.tar.gz: f4eba7dd0dccd0093f8016e565df81861f0157560ab2f0320e7c4f29caec31c7
5
5
  SHA512:
6
- metadata.gz: ed7199d606cb33b4f30e79d6816ccd2d00c63061c88d630b8e9b538e540cbf4ae3ea50bfccf3acb8ba7e6d92ba09e51871a58b0ea3a84cbfc37d16641923d737
7
- data.tar.gz: 50e1ed97a6fb3b698136c313b0384755f42eecf6b171aa7a02021919e59fc9d2bed6991fb03129d8f37a47df6ca4d825b542fe8d36b1410d1bbd284570014e01
6
+ metadata.gz: c89e37964c7a4e3ae49c484cd4594f7471734184500874c04c217a58ea785eede26390fcee81b178da92d7f4f0a0f263f61d959f33ee235a7d633a609b3cdb77
7
+ data.tar.gz: 96a68b6e37885caa88dd6fc1b8b219f5fbe45f25347f9e3c66a52f1a8f167b5fa48d03dac849f4fa8b04426d450ba050af0cabe268b5961e77c306fdba98b969
data/CHANGELOG.md CHANGED
@@ -1,5 +1,38 @@
1
1
  ## Change Log
2
2
 
3
+ ## 0.2.4 - 2025-01-29
4
+ * modifies the `Superset::Dashboard::Datasets::List.new(id).schemas` to optionally include the schemas of filter datasets as well.
5
+ * modifies the `Superset::Dashboard::Embedded::Get.new` to accept dashboard_id as named parameter
6
+ * modifies the `Superset::Dashboard::List.new()` to accept an additional named parameter `include_filter_dataset_schemas` to decide whether filter datasets need to be included when getting the schemas of the datasets
7
+ * modifies the `Superset::Dashboard::List.new().retrieve_schemas` to call `Datasets::List.new(dashboard_id: id).schemas` with an additional parameter `include_filter_datasets` to fetch the filter dataset schemas as well.
8
+
9
+
10
+ ## 0.2.3 - 2024-11-15
11
+
12
+ * modifies the `Superset::Dashboard::Datasets::List.new(id).dataset_details` and `Superset::Dashboard::Datasets::List.new(id).list` to optionally include filter datasets as well, so they can be duplicated during the dashboard duplication process. It also adds a new column "Filter only" in the result, which shows if a dataset is used only in filters
13
+ * This also adds a check in source dataset duplication: if a dataset already exists with the new name in the target schema, that existing dataset is reused for the new dashboard.
14
+ * Exporting a Dashboard will also include the Zip file in the repo backup (destination_path) folder.
15
+ * Exporting a Dashboard and copying the files to the backup folder will also clear out any previous backup files that no longer exist in the current zip export.
16
+ * Adds a PUT endpoint for charts and datasets, needed for bulk updates of a set of charts/datasets
17
+
18
+ ## 0.2.2 - 2024-10-10
19
+
20
+ * add ImportDashboardAcrossEnvironments class for transferring dashboards between superset environments
21
+
22
+ ## 0.2.1 - 2024-09-17
23
+
24
+ * add Superset::Database::Export class for exporting database configurations
25
+ * add Superset::Dashboard::Import class for importing dashboards
26
+
27
+ ## 0.2.0 - 2024-08-19
28
+
29
+ * Adding RLS filter clause to the 'api/v1/security/guest_token/' API params in guest_token.rb - https://github.com/rdytech/superset-client/pull/31
30
+ * Any filter that needs to be applied to the dataset's final where condition can be passed here. Ex: [{ "clause": "publisher = 'Nintendo'" }]. Refer to: https://github.com/apache/superset/tree/master/superset-embedded-sdk#creating-a-guest-token
31
+
32
+ ## 0.1.7 - 2024-08-27
33
+
34
+ * adds filter title_equals to dashboard list class - https://github.com/rdytech/superset-client/pull/33
35
+
3
36
  ## 0.1.6 - 2024-07-10
4
37
 
5
38
  * added a class **WarmUpCache** to hit the 'api/v1/dataset/warm_up_cache' endpoint to warm up the cache of all the datasets for a particular dashboard passed to the class - https://github.com/rdytech/superset-client/pull/28
data/README.md CHANGED
@@ -8,6 +8,12 @@ All ruby classes are namespaced under `Superset::`
8
8
 
9
9
  # Installation
10
10
 
11
+ ## Setup API Credentials
12
+
13
+ Follow this to [set up your user's API creds](https://github.com/rdytech/superset-client/tree/develop/doc/setting_up_personal_api_credentials.md)
14
+
15
+ Short version is .. copy `env.sample` to `.env` and edit values where applicable. Opening a console with `bin/console` will then auto load the `.env` file.
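Using the credential variable names from the gem's docs, a minimal `.env` looks along these lines (all values here are placeholders):

```
# .env  (placeholder values -- use your own host and credentials)
SUPERSET_HOST="https://your-superset-host"
SUPERSET_API_USERNAME="your-api-user"
SUPERSET_API_PASSWORD="set-password-here"
```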
16
+
11
17
  ## Docker Setup
12
18
 
13
19
  Build, bundle and open a ruby console
@@ -18,188 +24,74 @@ docker-compose run --rm app bundle install
18
24
  docker-compose run --rm app bin/console
19
25
  ```
20
26
 
21
- Run specs
27
+ ## Setup Locally (no docker)
28
+
29
+ Requires Ruby >= 2.6.0
30
+
31
+ Bundle install and open a ruby console.
22
32
 
23
33
  ```
24
- docker-compose run --rm app rspec
25
- # or
26
- docker-compose run --rm app bash # then run 'bundle exec rspec' from the container.
34
+ bundle install
35
+ bin/console
27
36
  ```
28
37
 
29
- ## Local setup or including in a Ruby/Rails app
38
+ ## Including in a Ruby app
30
39
 
31
40
  Add to your Gemfile `gem 'superset'`
32
41
  And then execute: `bundle install`
33
42
  Or install it yourself as `gem install superset`
34
43
 
35
- ## Setup API Credentials
44
+ ## Run specs
36
45
 
37
- Follow this doc setup your users API creds [setting_up_personal_api_credentials](https://github.com/rdytech/superset-client/tree/develop/doc/setting_up_personal_api_credentials.md)
46
+ ```
47
+ docker-compose run --rm app rspec
48
+ # or
49
+ docker-compose run --rm app bash # then run 'bundle exec rspec' from the container.
50
+ ```
38
51
 
39
- Short version is .. copy the `env.sample` to `.env` and add edit values where applicable. Opening a console with `bin/console` will then auto load the `.env` file.
40
52
 
41
53
  ## Usage
42
54
 
43
- Experiment with the API calls directly by open a pry console using `bin/console`
44
-
45
-
46
-
47
-
48
- ### API calls
49
-
50
- Quickstart examples
55
+ Experiment with the API calls directly by opening a pry console using `bin/console`.
51
56
 
52
57
  ```ruby
53
- Superset::Database::List.call
54
- Superset::Database::GetSchemas.new(1).list # get schemas for database 1
55
-
56
58
  Superset::Dashboard::List.call
57
- Superset::Dashboard::List.new(title_contains: 'Sales').list
58
-
59
- Superset::Dashboard::BulkDelete.new(dashboard_ids: [1,2,3]).perform # Dashboards only ( leaves all charts, datasets in place)
60
- Superset::Dashboard::BulkDeleteCascade.new(dashboard_ids: [1,2,3]).perform # Dashboards and related charts and datasets.
61
-
62
- Superset::Sqllab::Execute.new(database_id: 1, schema: 'public', query: 'select count(*) from birth_names').perform
63
-
64
- Superset::Dashboard::Export.new(dashboard_id: 1, destination_path: '/tmp').perform
65
-
66
- Superset::RouteInfo.new(route: 'dashboard/_info').perform # Get info on an API endpoint .. handy for getting available filters
67
- Superset::RouteInfo.new(route: 'chart/_info').filters # OR just get the filters for an endpoint
68
59
 
69
60
  superset_class_list # helper method to list all classes under Superset::
70
-
61
+ sshelp # aliased for superset_class_list
71
62
  ```
72
63
 
73
- ### Duplicating Dashboards
74
-
75
- Primary motivation behind this library was to use the Superset API to duplicate dashboards, charts, datasets across multiple database connections.
76
- See examples in [Duplicate Dashboards](https://github.com/rdytech/superset-client/tree/develop/doc/duplicate_dashboards.md)
77
-
78
- ### API Examples with results
79
-
80
- Generally classes follow the convention/path of the Superset API strucuture as per the swagger docs.
64
+ More examples [listed here](https://github.com/rdytech/superset-client/tree/develop/doc/usage.md)
81
65
 
82
- ref https://superset.apache.org/docs/api/
83
66
 
84
- Limited support for filters is available on some list pages, primarily through param `title_contains`.
85
- Pagination is supported via `page_num` param.
67
+ ## Duplicating Dashboards
86
68
 
87
- Primary methods across majority of api calls are
88
- - response : the full API response
89
- - result : just the result attribute array
90
- - list : displays a formatted output to console, handy for quick investigation of objects
91
- - call : is a class method to list on Get and List requests
92
-
93
- ```ruby
94
- # List all Databases
95
- Superset::Database::List.call
96
- # DEBUG -- : Happi: GET https://your-superset-host/api/v1/database/?q=(page:0,page_size:100), {}
97
- +----+------------------------------------+------------+------------------+
98
- | Superset::Database::List |
99
- +----+------------------------------------+------------+------------------+
100
- | Id | Database name | Backend | Expose in sqllab |
101
- +----+------------------------------------+------------+------------------+
102
- | 1 | examples | postgresql | true |
103
- +----+------------------------------------+------------+------------------+
104
-
105
- # List database schemas for Database 1
106
- Superset::Database::GetSchemas.new(1).list
107
- # DEBUG -- : Happi: GET https://your-superset-host/api/v1/database/1/schemas/, {}
108
- => ["information_schema", "public"]
109
-
110
- # List dashboards
111
- Superset::Dashboard::List.call
112
- # PAGE_SIZE is set to 100, so get the second set of 100 dashboards with
113
- Superset::Dashboard::List.new(page_num: 1).list
114
- # OR filter by title
115
- Superset::Dashboard::List.new(title_contains: 'Sales').list
116
- # DEBUG -- : Happi: GET https://your-superset-host/api/v1/dashboard/?q=(filters:!((col:dashboard_title,opr:ct,value:'Sales')),page:0,page_size:100), {}
117
-
118
- +-----+------------------------------+-----------+--------------------------------------------------------------------+
119
- | Superset::Dashboard::List |
120
- +-----+------------------------------+-----------+--------------------------------------------------------------------+
121
- | Id | Dashboard title | Status | Url |
122
- +-----+------------------------------+-----------+--------------------------------------------------------------------+
123
- | 6 | Video Game Sales | published | https://your-superset-host/superset/dashboard/6/ |
124
- | 8 | Sales Dashboard | published | https://your-superset-host/superset/dashboard/8/ |
125
- +-----+------------------------------+-----------+--------------------------------------------------------------------+
126
-
127
-
128
- Superset::Dashboard::Get.call(1) # same as Superset::Dashboard::Get.new(1).list
129
- +----------------------------+
130
- | World Banks Data |
131
- +----------------------------+
132
- | Charts |
133
- +----------------------------+
134
- | % Rural |
135
- | Region Filter |
136
- | Life Expectancy VS Rural % |
137
- | Box plot |
138
- | Most Populated Countries |
139
- | Worlds Population |
140
- | Worlds Pop Growth |
141
- | Rural Breakdown |
142
- | Treemap |
143
- | Growth Rate |
144
- +----------------------------+
145
-
146
-
147
- ```
148
- ## Optional Credential setup for Embedded User
69
+ A primary motivation behind this gem was to use the Superset API to duplicate dashboards, charts, and datasets across multiple database connections.
149
70
 
150
- Primary usage is for api calls and/or for guest token retrieval when setting up applications to use the superset embedded dashboard workflow.
71
+ The targeted use case was superset embedded functionality implemented in an application resting on a multi-tenanted database setup.
151
72
 
152
- The Superset API users fall into 2 categories
153
- - a user for general api calls to endpoints for Dashboards, Datasets, Charts, Users, Roles etc.
154
- ref `Superset::Credential::ApiUser`
155
- which pulls credentials from `ENV['SUPERSET_API_USERNAME']` and `ENV['SUPERSET_API_PASSWORD']`
156
-
157
- - a user for guest token api call to use when embedding dashboards in a host application.
158
- ref `Superset::Credential::EmbeddedUser`
159
- which pulls credentials from `ENV['SUPERSET_EMBEDDED_USERNAME']` and `ENV['SUPERSET_EMBEDDED_PASSWORD']`
73
+ See examples in [Duplicate Dashboards](https://github.com/rdytech/superset-client/tree/develop/doc/duplicate_dashboards.md)
160
74
 
75
+ ## Moving / Transferring Dashboards across Environments
161
76
 
162
- ### Fetch a Guest Token for Embedded user
77
+ With a few configuration changes to an import file, the process can be codified to transfer a dashboard between environments.
163
78
 
164
- Assuming you have setup your Dashboard in Superset to be embedded and that your creds are setup in
165
- `ENV['SUPERSET_EMBEDDED_USERNAME']` and `ENV['SUPERSET_EMBEDDED_PASSWORD']`
79
+ See example in [Transferring Dashboards across Environments](https://github.com/rdytech/superset-client/tree/develop/doc/migrating_dashboards_across_environments.md)
166
80
 
167
- ```
168
- Superset::GuestToken.new(embedded_dashboard_id: '15').guest_token
169
- => "eyJ0eXAiOi............VV4mrMfsvg"
170
- ```
81
+ ## Contributing
171
82
 
172
- ## Releasing a new version
83
+ - Fork it
84
+ - Create your feature branch (git checkout -b my-new-feature)
85
+ - Commit your changes (git commit -am 'Add some feature')
86
+ - Push to the branch (git push origin my-new-feature)
87
+ - Create new Pull Request
173
88
 
174
- On develop branch make sure to update `Superset::VERSION` and `CHANGELOG.md` with the new version number and changes.
175
- Build the new version and upload to gemfury.
176
89
 
177
- `gem build superset.gemspec`
178
90
 
179
91
  ### Publishing to RubyGems
180
92
 
181
93
  WIP .. direction is for this gem to be made public
182
94
 
183
- ### ReadyTech private Gemfury repo
184
-
185
- ReadyTech hosts its own private gemfury remote repo.
186
-
187
- Get the latest develop into master
188
-
189
- git checkout master
190
- git pull
191
- git fetch
192
- git merge origin/develop
193
-
194
- Tag the version and push to github
195
-
196
- git tag -a -m "Version 0.1.0" v0.1.0
197
- git push origin master --tags
198
-
199
- Push to gemfury or upload the build manually in the gemfury site.
200
-
201
- git push fury master
202
-
203
95
  ## License
204
96
 
205
97
  The gem is available as open source under the terms of the [MIT License](https://opensource.org/licenses/MIT).
@@ -206,9 +206,6 @@ Putting it simply, the current thinking is to delete all the replica dashboards
206
206
 
207
207
  ### Bringing the Duplicate Dashboard process into Superset core
208
208
 
209
- (WIP) The goal would be to have the DuplicateDashboard process as a part of the core superset codebase.
210
-
211
- To that end this Superset Improvement Proposal (SIP) .. is a starting point.
212
-
213
- {add SIP request here}
209
+ An ideal direction would be to have the DuplicateDashboard process as a part of the core superset codebase.
214
210
 
211
+ A Superset discussion thread has been started in [Duplicating Dashboards into a new database or schema](https://github.com/apache/superset/discussions/29899)
@@ -0,0 +1,173 @@
1
+ # Transferring Dashboards across Environments
2
+
3
+ In this document, we will discuss how to transfer dashboards across Superset hosting environments, with the goal of moving towards an automated, API-driven process.
4
+
5
+ The current process is limited to dashboards whose datasets are all based on a single database connection.
6
+
7
+ ## Short Version
8
+
9
+ Assuming you want to transfer a dashboard from Env1 to Env2.
10
+
11
+ You will need the following:
12
+ - a Env1 Dashboard Export Zip file
13
+ - a Env2 Database config export yaml
14
+ - a Env2 schema to point your datasets to
15
+
16
+ Assuming your ruby API environment is set up for your target superset environment.
17
+ ( ie using API creds for Env2 for this example )
18
+
19
+ ```ruby
20
+
21
+ new_import_zip = Superset::Services::ImportDashboardAcrossEnvironments.new(
22
+ dashboard_export_zip: 'path_to/dashboard_101_export_20241010.zip',
23
+ target_database_yaml_file: 'path_to/env2_db_config.yaml',
24
+ target_database_schema: 'acme',
25
+ ).perform
26
+
27
+ # now import the adjusted zip to the target superset env
28
+ Superset::Dashboard::Import.new(source_zip_file: new_import_zip).perform
29
+
30
+ ```
31
+
32
+ ## Background
33
+
34
+ A common practice is to set up infrastructure to deploy multiple Superset environments. For example, a simple setup might be:
35
+ - Local development environment for testing version upgrades and feature exploration
36
+ - Staging Superset environment for testing in a production-like environment
37
+ - Production Superset environment that requires a higher level of stability and uptime
38
+
39
+ For the above example, the Superset staging environment often holds connections to staging databases, and the Superset production environment will hold connections to the production databases.
40
+
41
+ In the event where the database schema structure for the local development, staging, and production databases are exactly the same, dashboards can be replicated and transferred across Superset hosting environments.
42
+
43
+ That process does require some manual updating of the exported YAML files before importing them into the target environment. Also required is some understanding of the underlying dashboard export structure and how the object UUIDs work and relate to each other, especially in the context of databases and datasets.
44
+
45
+ ## Dashboard Export/Import within the Same Environment
46
+
47
+ This is a fairly straightforward process.
48
+
49
+ There are multiple methods for exporting a dashboard:
50
+ - Export from the dashboard list page in the GUI
51
+ - Export via the Superset API
52
+ - Export via the Superset CLI
53
+
54
+ Each export method will result in a zip file that contains a set of YAML files as per the list below, which is an export of a customized version of a test dashboard from the default example dashboards.
55
+
56
+ Test fixture is: https://github.com/rdytech/superset-client/blob/develop/spec/fixtures/dashboard_18_export_20240322.zip
57
+
58
+ ```
59
+ └── dashboard_export_20240321T214117
60
+ ├── charts
61
+ │   ├── Boy_Name_Cloud_53920.yaml
62
+ │   ├── Names_Sorted_by_Num_in_California_53929.yaml
63
+ │   ├── Number_of_Girls_53930.yaml
64
+ │   ├── Pivot_Table_53931.yaml
65
+ │   └── Top_10_Girl_Name_Share_53921.yaml
66
+ ├── dashboards
67
+ │   └── Birth_Names_18.yaml
68
+ ├── databases
69
+ │   └── examples.yaml
70
+ ├── datasets
71
+ │   └── examples
72
+ │   └── birth_names.yaml
73
+ └── metadata.yaml
74
+ ```
75
+
76
+ Each of the above YAML files holds UUID values for the primary object and any related objects.
77
+
78
+ - Database YAMLs hold the database connection string as well as a UUID for the database
79
+ - Dataset YAMLs have their own UUID as well as a reference to the database UUID
80
+ - Chart YAMLs have their own UUID as well as a reference to their dataset UUID
81
+
82
+ Example of the database YAML file:
83
+
84
+ ```
85
+ cat databases/examples.yaml
86
+ database_name: examples
87
+ sqlalchemy_uri: postgresql+psycopg2://superset:XXXXXXXXXX@superset-host:5432/superset
88
+ cache_timeout: null
89
+ expose_in_sqllab: true
90
+ allow_run_async: true
91
+ allow_ctas: true
92
+ allow_cvas: true
93
+ allow_dml: true
94
+ allow_file_upload: true
95
+ extra:
96
+ metadata_params: {}
97
+ engine_params: {}
98
+ metadata_cache_timeout: {}
99
+ schemas_allowed_for_file_upload:
100
+ - examples
101
+ allows_virtual_table_explore: true
102
+ uuid: a2dc77af-e654-49bb-b321-40f6b559a1ee
103
+ version: 1.0.0
104
+ ```
105
+
106
+ If we grep `databases/examples.yaml` we can see the UUID of the database.
107
+
108
+ ```
109
+ grep -r uuid databases/
110
+ databases/examples.yaml:uuid: a2dc77af-e654-49bb-b321-40f6b559a1ee
111
+
112
+ ```
113
+
114
+ Now if we look at the UUID values in the datasets, you will see both the dataset UUID and the reference to the database UUID.
115
+
116
+ ```
117
+ grep -r uuid datasets
118
+ datasets/examples/birth_names.yaml:uuid: 283f5023-0814-40f6-b12d-96f6a86b984f
119
+ datasets/examples/birth_names.yaml:database_uuid: a2dc77af-e654-49bb-b321-40f6b559a1ee
120
+ ```
121
+
122
+ If the above dashboard zip file `dashboard_18_export_20240322.zip` was imported as is into the same superset environment it was exported from, all UUIDs would already exist in superset and those objects would be found and updated with the imported zip data.
123
+
124
+ If the above zip file was imported as is into a different target Superset environment, it would fail, as there would be no matching database UUID entry in that target Superset environment.
125
+
126
+ **Key Point:** When importing a dashboard into a different Superset environment than the original one, the database configuration in the zip export must exist in the target Superset environment, and all datasets must point to that database config.
127
+
128
+ ## Migrate a Dashboard to a Different Superset Environment
129
+
130
+ With the above knowledge, we can now think about how to migrate dashboards between Superset environments.
131
+
132
+ Each Superset object is given a UUID. Within the exported dashboard files, we are primarily concerned with:
133
+ - Replacing the staging database configuration with the production configuration
134
+ - Updating all staging datasets to point to the new production database UUID
135
+
136
+ Given we have a request to 'transfer' a dashboard across to a different environment, say staging to production, how would we then proceed?
137
+
138
+ Given that the staging and production databases have structurally identical schemas, it follows from the above discussion on UUIDs that importing a staging dashboard export into the production environment requires the following steps:
139
+
140
+ 1. Export the staging dashboard and unzip
141
+ 2. Note the staging database UUIDs in the `databases/` directory
142
+ 3. Get a copy of the production database YAML configuration file
143
+ 4. In the exported dashboard files, replace the staging database YAML with the production YAML
144
+ 5. In the dataset YAML files, replace all instances of the previously noted staging database UUID with the new production UUID
145
+ 6. Zip the files and import them to the production environment
146
+
147
+ The process above assumes that whoever is migrating the dashboard has a copy of the target database YAML files so that in steps 3 and 4 we can then replace the staging database YAML with the production one.
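Steps 4 and 5 above can be sketched in plain ruby. This is a minimal illustration, not the gem's `ImportDashboardAcrossEnvironments` implementation: the source UUID and dataset snippet are the example values from this document, while the target UUID is made up.

```ruby
require 'yaml'

# UUID of the staging (source) database, as found in databases/examples.yaml
source_db_uuid = 'a2dc77af-e654-49bb-b321-40f6b559a1ee'
# UUID taken from the production (target) database yaml -- illustrative value only
target_db_uuid = '11111111-2222-3333-4444-555555555555'

# A trimmed dataset yaml, as seen in datasets/examples/birth_names.yaml
dataset_yaml = <<~YAML
  table_name: birth_names
  uuid: 283f5023-0814-40f6-b12d-96f6a86b984f
  database_uuid: #{source_db_uuid}
YAML

# Step 5: point the dataset at the target database by swapping the UUID
updated_yaml = dataset_yaml.gsub(source_db_uuid, target_db_uuid)
puts YAML.safe_load(updated_yaml)['database_uuid']
```

The same substitution would be applied to every file under `datasets/` before re-zipping and importing.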
148
+
149
+ ## Requirements
150
+
151
+ The overall process requires the following:
152
+ - The source dashboard zip file
153
+ - The target Superset environment database YAML file
154
+ - Ability to copy and manipulate the source dashboard zip file
155
+ - The ability to import via API to the target Superset environment
156
+
157
+
158
+ ## Gotchas!
159
+
160
+ Migrating a dashboard once to a new target environment, database, schema will result in:
161
+ - Creating a new dashboard with the UUID from the import zip
162
+ - Creating a new set of charts with their UUIDs from the import zip
163
+ - Creating a new set of datasets with their UUIDs from the import zip
164
+
165
+ Migrating the same dashboard a second time to the same target environment, database, but different schema will NOT create a new dashboard.
166
+
167
+ It will attempt to update the same dashboard, as the dashboard's UUID has not changed. It will also NOT change any of the datasets to the new schema. This appears to be a limitation of the import process, which may lead to some confusing results.
168
+
169
+ ## References
170
+
171
+ Some helpful references relating to cross-environment workflows:
172
+ - [Managing Content Across Workspaces](https://docs.preset.io/docs/managing-content-across-workspaces)
173
+ - [Superset Slack AI Explanation](https://apache-superset.slack.com/archives/C072KSLBTC1/p1722382347022689)
data/doc/publishing.md ADDED
@@ -0,0 +1,39 @@
1
+ # Publishing
2
+
3
+ The team at ReadyTech currently manages the gem and new version releases.
4
+
5
+ As per demand / usage of the public gem, the team at ReadyTech will soon move to primarily hosting
6
+ the gem on RubyGems.
7
+
8
+ Until then, ReadyTech will continue to host a private Gemfury version as well as RubyGems.
9
+
10
+
11
+ ## ReadyTech private Gemfury repo
12
+
13
+ ### Releasing a new version
14
+
15
+ On develop branch make sure to update `Superset::VERSION` and `CHANGELOG.md` with the new version number and changes.
16
+
17
+ Build the new version and upload to gemfury.
18
+
19
+ `gem build superset.gemspec`
20
+
21
+ ### Publish to Gemfury
22
+
23
+ ReadyTech hosts its own private gemfury remote repo.
24
+
25
+ Get the latest develop into master
26
+
27
+ git checkout master
28
+ git pull
29
+ git fetch
30
+ git merge origin/develop
31
+
32
+ Tag the version and push to github
33
+
34
+ git tag -a -m "Version 0.1.0" v0.1.0
35
+ git push origin master --tags
36
+
37
+ Push to gemfury or upload the build manually in the gemfury site.
38
+
39
+ git push fury master
@@ -26,17 +26,19 @@ Starting a ruby console with `bin/console` will auto load the env vars.
26
26
 
27
27
  If your Superset instance is set up to authenticate via SSO then your authenticating agent will most likely have provided a username for you in the form of a UUID value.
28
28
 
29
- This is easily retrieved on you User Profile page in Superset.
29
+ This is easily retrieved on your user profile page in Superset.
30
30
 
31
- Optionally use jinja template macro in sql lab.
31
+ Optionally use the jinja template macro in sql lab.
32
32
 
33
33
  ```
34
34
  select '{{ current_username() }}' as current_username;
35
35
  ```
36
36
 
37
+ Then plug your username into the `.env` file.
38
+
37
39
  ## Creating / Updating your password via Swagger interface
38
40
 
39
- A common setup is to use SSO to enable user access in Superset. This would mean your authenticating agent is your SSO provider service ( ie Azure ) and most likely you do not have / need a password configured for your Superset user for GUI access.
41
+ A common setup is to use SSO to enable user access in Superset. This would mean your authenticating agent is your SSO provider service ( ie Azure etc ) and most likely you do not have / need a password configured for your Superset user for GUI access.
40
42
 
41
43
  If this is the case, you will need to add your own password via hitting the superset API using the swagger interface.
42
44
 
@@ -50,7 +52,7 @@ select '{{ current_user_id() }}' as current_user_id;
50
52
  ```
51
53
  - ask your superset admin to tell you what your personal superset user id is.
52
54
 
53
- Once you have your user id value, open the Swagger API page on you superset host.
55
+ Once you have your user id value, open the Swagger API page on your superset host.
54
56
  `https://{your-superset-host}/swagger/v1`
55
57
 
56
58
  Scroll down to the 'Security Users' area and find the PUT request that will update your users record.
@@ -78,7 +80,7 @@ Given some local development requirements where you have to access multiple supe
78
80
  Just set the overall superset environment in `.env`
79
81
 
80
82
  ```
81
- # .env file holding one setting for the overall superset environment
83
+ # .env file holding one setting for the superset environment you're accessing
82
84
  SUPERSET_ENVIRONMENT='staging'
83
85
  ```
84
86
 
@@ -101,11 +103,22 @@ SUPERSET_API_USERNAME="production-user-here"
101
103
  SUPERSET_API_PASSWORD="set-password-here"
102
104
  ```
103
105
 
104
- The command `bin/console` will then load your env file depending on the value in ENV['SUPERSET_ENVIRONMENT'] from the primary `.env`.
106
+ And again for localhost, if required to perform api calls in your local dev environment.
107
+ Create a new file called `.env-localhost` that holds your superset development host and credentials.
108
+
109
+ ```
110
+ # specific settings for the superset localhost env
111
+ SUPERSET_HOST="http://localhost:8080/"
112
+ SUPERSET_API_USERNAME="dev-user-here"
113
+ SUPERSET_API_PASSWORD="set-password-here"
114
+ ```
115
+
116
+
117
+ The command `bin/console` will then load your env file depending on the value in ENV['SUPERSET_ENVIRONMENT'] from the primary `.env` file.
105
118
 
106
119
  When you need to switch envs, exit the console, edit the .env to your desired value, eg `production`, then open a console again.
107
120
 
108
- Bonus is the Pry prompt will now also include the `SUPERSET_ENVIRONMENT` value.
121
+ The ruby prompt will now also include the `SUPERSET_ENVIRONMENT` value.
109
122
 
110
123
  ```
111
124
  bin/console
@@ -122,6 +135,29 @@ Happi: GET https://your-staging-superset-host.com/api/v1/dashboard/?q=(filters:!
122
135
  +----+------------------+-----------+------------------------------------------------------------------+
123
136
  ```
124
137
 
138
+ ## Optional Credential setup for Embedded User
139
+
140
+ Primary usage is for api calls and/or for guest token retrieval when setting up applications to use the superset embedded dashboard workflow.
141
+
142
+ The Superset API users fall into 2 categories
143
+ - a user for general api calls to endpoints for Dashboards, Datasets, Charts, Users, Roles etc.
144
+ ref `Superset::Credential::ApiUser`
145
+ which pulls credentials from `ENV['SUPERSET_API_USERNAME']` and `ENV['SUPERSET_API_PASSWORD']`
146
+
147
+ - a user for guest token api call to use when embedding dashboards in a host application.
148
+ ref `Superset::Credential::EmbeddedUser`
149
+ which pulls credentials from `ENV['SUPERSET_EMBEDDED_USERNAME']` and `ENV['SUPERSET_EMBEDDED_PASSWORD']`
150
+
151
+
152
+ ### Fetch a Guest Token for Embedded user
153
+
154
+ Assuming you have setup your Dashboard in Superset to be embedded and that your creds are setup in
155
+ `ENV['SUPERSET_EMBEDDED_USERNAME']` and `ENV['SUPERSET_EMBEDDED_PASSWORD']`
156
+
157
+ ```
158
+ Superset::GuestToken.new(embedded_dashboard_id: '15').guest_token
159
+ => "eyJ0eXAiOi............VV4mrMfsvg"
160
+ ```
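Per the 0.2.0 changelog entry in this diff, RLS filter clauses can also be passed along with the guest token request. A minimal sketch of building that payload follows; the exact keyword argument on `Superset::GuestToken` is an assumption, so check the class before relying on it.

```ruby
# RLS clauses are appended to the dataset's final WHERE condition
# (format follows the superset-embedded-sdk docs linked in the changelog)
rls_clauses = [{ 'clause' => "publisher = 'Nintendo'" }]

# Assumed usage -- verify the keyword name against Superset::GuestToken:
# Superset::GuestToken.new(embedded_dashboard_id: '15', rls: rls_clauses).guest_token
puts rls_clauses.first['clause']
```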
125
161
 
126
162
 
127
163