dremiojs 1.0.0

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
Files changed (45)
  1. package/.eslintrc.json +14 -0
  2. package/.prettierrc +7 -0
  3. package/README.md +59 -0
  4. package/dremiodocs/dremio-cloud/cloud-api-reference.md +748 -0
  5. package/dremiodocs/dremio-cloud/dremio-cloud-about.md +225 -0
  6. package/dremiodocs/dremio-cloud/dremio-cloud-admin.md +3754 -0
  7. package/dremiodocs/dremio-cloud/dremio-cloud-bring-data.md +6098 -0
  8. package/dremiodocs/dremio-cloud/dremio-cloud-changelog.md +32 -0
  9. package/dremiodocs/dremio-cloud/dremio-cloud-developer.md +1147 -0
  10. package/dremiodocs/dremio-cloud/dremio-cloud-explore-analyze.md +2522 -0
  11. package/dremiodocs/dremio-cloud/dremio-cloud-get-started.md +300 -0
  12. package/dremiodocs/dremio-cloud/dremio-cloud-help-support.md +869 -0
  13. package/dremiodocs/dremio-cloud/dremio-cloud-manage-govern.md +800 -0
  14. package/dremiodocs/dremio-cloud/dremio-cloud-overview.md +36 -0
  15. package/dremiodocs/dremio-cloud/dremio-cloud-security.md +1844 -0
  16. package/dremiodocs/dremio-cloud/sql-docs.md +7180 -0
  17. package/dremiodocs/dremio-software/dremio-software-acceleration.md +1575 -0
  18. package/dremiodocs/dremio-software/dremio-software-admin.md +884 -0
  19. package/dremiodocs/dremio-software/dremio-software-client-applications.md +3277 -0
  20. package/dremiodocs/dremio-software/dremio-software-data-products.md +560 -0
  21. package/dremiodocs/dremio-software/dremio-software-data-sources.md +8701 -0
  22. package/dremiodocs/dremio-software/dremio-software-deploy-dremio.md +3446 -0
  23. package/dremiodocs/dremio-software/dremio-software-get-started.md +848 -0
  24. package/dremiodocs/dremio-software/dremio-software-monitoring.md +422 -0
  25. package/dremiodocs/dremio-software/dremio-software-reference.md +677 -0
  26. package/dremiodocs/dremio-software/dremio-software-security.md +2074 -0
  27. package/dremiodocs/dremio-software/dremio-software-v25-api.md +32637 -0
  28. package/dremiodocs/dremio-software/dremio-software-v26-api.md +36757 -0
  29. package/jest.config.js +10 -0
  30. package/package.json +25 -0
  31. package/src/api/catalog.ts +74 -0
  32. package/src/api/jobs.ts +105 -0
  33. package/src/api/reflection.ts +77 -0
  34. package/src/api/source.ts +61 -0
  35. package/src/api/user.ts +32 -0
  36. package/src/client/base.ts +66 -0
  37. package/src/client/cloud.ts +37 -0
  38. package/src/client/software.ts +73 -0
  39. package/src/index.ts +16 -0
  40. package/src/types/catalog.ts +31 -0
  41. package/src/types/config.ts +18 -0
  42. package/src/types/job.ts +18 -0
  43. package/src/types/reflection.ts +29 -0
  44. package/tests/integration_manual.ts +95 -0
  45. package/tsconfig.json +19 -0
@@ -0,0 +1,300 @@
+ # Get Started with Dremio Cloud | Dremio Documentation
+
+ Original URL: https://docs.dremio.com/dremio-cloud/get-started/
+
+ To get started, sign up for an account at [dremio.com/get-started](https://www.dremio.com/get-started) and follow the guided setup to create your first project.
+
+ This guide shows you how to analyze transportation data and find usage patterns using natural language queries with Dremio's AI Agent. You'll work with the `dremio_samples.nyc_citibikes.citibikes` table, which contains over 115 million bike-sharing records from New York City, capturing real patterns from one of the world's busiest transportation networks.
+
+ ## Step 1: Discover and Explore Your Data
+
+ Understanding your dataset structure is crucial before diving into analysis. Start by getting an overview of the bike-sharing data to understand what insights are possible.
+
+ On the homepage, enter the following prompt into Dremio's AI Agent chat box: `Give me an overview of the nyc_citibikes.citibikes dataset`.
+
+ The AI Agent uses capabilities like semantic search to find relevant datasets. Review the response to understand the schema, field definitions, and what types of analysis are possible with bike-sharing data.
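+
+ If you prefer SQL, you can take a quick look at the same table from the SQL Runner; a minimal sketch:
+
+ ```sql
+ -- Inspect the sample table's schema and size directly.
+ DESCRIBE dremio_samples.nyc_citibikes.citibikes;
+
+ SELECT COUNT(*) AS total_rides
+ FROM dremio_samples.nyc_citibikes.citibikes
+ ```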
+
+ ## Step 2: Compare User Behavior
+
+ Ask a question in natural language to compare how different user types interact with the bike-sharing service. This analysis reveals usage patterns that inform operational decisions.
+
+ Ask the AI Agent to compare how subscribers and casual riders use the service: `Give me the total number of rides and the average trip duration grouped by user type across the dataset.`
+
+ The AI Agent generates and runs SQL using Dremio's query engine to answer this question. The results show that subscribers typically have higher ride frequency but shorter durations, indicating commuter behavior, while casual riders often take longer but less frequent trips, suggesting leisure or tourist usage. This analysis reveals how different user types require different operational strategies for bike availability, pricing models, and infrastructure investment.
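+
+ For reference, the generated query might look roughly like the following sketch; the column names (`member_casual`, `started_at`, `ended_at`) are assumptions, so verify them against the overview from Step 1:
+
+ ```sql
+ -- Column names are assumptions; check the dataset overview from Step 1.
+ SELECT
+   member_casual                                    AS user_type,
+   COUNT(*)                                         AS total_rides,
+   AVG(TIMESTAMPDIFF(MINUTE, started_at, ended_at)) AS avg_trip_minutes
+ FROM dremio_samples.nyc_citibikes.citibikes
+ GROUP BY member_casual
+ ```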
+
+ ## Step 3: Analyze Peak Demand Patterns
+
+ Dive deeper to understand when different user groups are most active. This temporal analysis provides insights for operational planning and resource allocation.
+
+ Ask the AI Agent to reveal hourly demand patterns: `Analyze hourly ride patterns to find when demand peaks. Show a line chart of total rides per hour of the day, separated by user type, and include a short report highlighting the busiest hours for each group and recommendations for bike availability`.
+
+ The AI Agent will create a clear chart showing distinct patterns. Subscribers peak during rush hours (8-9 AM, 5-6 PM), indicating commuter usage, while casual riders peak during midday and on weekends, showing leisure patterns. This temporal analysis reveals opportunities for dynamic pricing during peak hours, optimal timing for maintenance during low-demand periods, and capacity planning insights for fleet optimization.
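+
+ The underlying aggregation could be sketched as follows, using the same assumed column names:
+
+ ```sql
+ -- Rides per hour of day, split by user type; column names are assumptions.
+ SELECT
+   EXTRACT(HOUR FROM started_at) AS hour_of_day,
+   member_casual                 AS user_type,
+   COUNT(*)                      AS total_rides
+ FROM dremio_samples.nyc_citibikes.citibikes
+ GROUP BY EXTRACT(HOUR FROM started_at), member_casual
+ ORDER BY hour_of_day, user_type
+ ```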
+
+ ## Step 4: Run Comparative Analysis
+
+ Use the AI Agent to identify the most influential factors affecting rider behavior. This analysis compares multiple variables to determine primary drivers of ridership patterns.
+
+ Ask the AI Agent to run a comprehensive comparative analysis: `Run comparative analysis on seasonal, daily, hourly, and bike type patterns to identify which factor has the most significant impact on ride behavior. Then create a detailed visualization of the most influential factor.`
+
+ The AI Agent will compare multiple variables and automatically identify which factor has the biggest impact on rider behavior, complete with actionable recommendations. This analysis reveals primary drivers of ridership, correlation insights between variables, and predictive indicators for forecasting usage patterns.
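+
+ One of the comparisons behind this analysis might resemble the following sketch, assuming a `rideable_type` column for the bike type dimension:
+
+ ```sql
+ -- Rides by bike type and month; column names are assumptions.
+ SELECT
+   rideable_type                  AS bike_type,
+   EXTRACT(MONTH FROM started_at) AS ride_month,
+   COUNT(*)                       AS total_rides
+ FROM dremio_samples.nyc_citibikes.citibikes
+ GROUP BY rideable_type, EXTRACT(MONTH FROM started_at)
+ ORDER BY bike_type, ride_month
+ ```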
+
+ ## Step 5: Try Your Own Analysis
+
+ Now that you understand how to analyze transportation data with the AI Agent, try exploring other bike-sharing questions.
+
+ You can also analyze your own data using Dremio's AI Agent.
+
+ ## Summary
+
+ You have completed a transportation data analysis using natural language. You explored the dataset structure, compared user behavior patterns, analyzed peak demand times, and identified influential factors affecting ridership. Discovery, exploration, and analysis that could previously take hours can now be done in minutes with Dremio's AI Agent.
+
+ ## Troubleshoot
+
+ * If a prompt doesn't work as expected, try simplifying the request or verifying that you're referencing the correct dataset (`nyc_citibikes.citibikes`).
+ * Contact your administrator if sample data isn't available in your environment.
+
+ ## Related Topics
+
+ * [Bring Your Data](/dremio-cloud/bring-data/) – Load, connect, and prepare your data.
+ * [Quick Tour of the Dremio Console](/dremio-cloud/get-started/quick-tour) – Learn how to navigate Dremio.
+ * [Add a User](/dremio-cloud/admin/users#add-a-user) – Invite team members to your organization.
+
+ <div style="page-break-after: always;"></div>
+
+ # Build Your First Agentic Lakehouse Use Case | Dremio Documentation
+
+ Original URL: https://docs.dremio.com/dremio-cloud/get-started/use-case
+
+ This guide will help you turn a business question into a working, governed data product using your own data, all within your 30-day Dremio Cloud trial.
+
+ In the [Getting Started guide](/dremio-cloud/get-started/), you saw how Dremio's AI Agent can take you from question to insight within minutes using sample data. With this guide, you'll connect your data, prepare and transform it, build reusable views with semantics, and deliver insights using Dremio's AI Agent or your preferred tool. The end goal is flexible: you might produce an aggregated view that analysts and AI agents query regularly, or a dashboard. Either way, you'll experience the full value of Dremio Cloud as an agentic lakehouse: open, governed, and self-optimizing.
+
+ ## Prerequisites
+
+ Before you begin, ensure that you have the following:
+
+ * A Dremio Cloud account: You'll need an active Dremio Cloud trial account. If you haven't already, sign up at [dremio.com/get-started](https://dremio.com/get-started) for a 30-day trial with $400 in free credits.
+ * Access to data: Identify at least one data source you can connect to, such as object storage or a database. If you don't have one, you will need a set of local files that you can upload to Dremio Cloud.
+ * (Optional) Access to your BI tool: If your use case requires a dashboard, you will need access to your BI tool. This is optional, as you can use Dremio's AI Agent to generate basic charts to visualize trends directly within the Dremio console.
+
+ ## Step 1: Identify a Business Use Case
+
+ Begin by identifying the business use case that you will implement using this guide. The use case you choose should have clear value and measurable results: not a massive data project, but a business question that matters.
+
+ ### How to Do It
+
+ Pick a concrete business question, such as:
+
+ * *How are customer support metrics trending this quarter?*
+ * *Which product lines are driving margin growth?*
+ * *What are our top churn risks by region?*
+
+ We recommend selecting a business question that can be answered using a few datasets, spanning at most two sources.
+
+ ## Step 2: Add Your Data
+
+ To implement the data model that answers the business question you identified in Step 1, first add your data to the project in your Dremio Cloud account.
+
+ ### How to Do It
+
+ You can add data to your project in one of three ways:
+
+ **Load Data into the Open Catalog**: Dremio provides a default Open Catalog, powered by Apache Polaris. You can load data directly into this catalog as an Iceberg table in your silver layer using your tool of choice, such as Fivetran, dbt, or Airbyte. From there, Dremio manages all Iceberg table metadata and governance while keeping your data in an open format. For instructions on how to load data into your Open Catalog, see [Load Data into Tables](/dremio-cloud/bring-data/load/).
+
+ **Connect an Existing Source**: Connect your object stores, catalogs, and databases so Dremio can query data in place. This is your bronze layer of data. For a list of supported sources and step-by-step connection instructions, see [Connect to Your Data](/dremio-cloud/bring-data/connect/).
+
+ **Upload Local Files**: Upload local files (CSV, JSON, or Parquet) for quick exploration if you don't have direct access to your data sources. Dremio writes the uploaded data into an Iceberg table in your project's Open Catalog. For step-by-step instructions on how to upload files, see [Upload Local Files](/dremio-cloud/bring-data/load/#upload-local-files).
+
+ Whichever method you choose, Dremio provides live, federated access to all of your data. This flexibility allows you to move from data connection to analysis in minutes.
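+
+ If you load data with SQL, a minimal sketch using `COPY INTO` might look like this; the catalog, table, columns, and storage location are hypothetical, so see [Load Data into Tables](/dremio-cloud/bring-data/load/) for the exact syntax and options:
+
+ ```sql
+ -- Sketch only: all names and paths are placeholders.
+ CREATE TABLE my_catalog.silver.orders (
+   orderID    BIGINT,
+   customerID BIGINT,
+   revenue    DOUBLE,
+   cost       DOUBLE
+ );
+
+ COPY INTO my_catalog.silver.orders
+   FROM '@my_object_storage/exports/orders/'
+   FILE_FORMAT 'csv'
+ ```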
+
+ ## Step 3: Clean and Transform Data
+
+ Dremio lets you prepare data from across different sources without having to move it. You can use natural language to generate SQL with Dremio's AI Agent or write the SQL yourself. Your data preparation steps can be represented as views; no additional pipelines are required.
+
+ ### How to Do It
+
+ **Use SQL and AI Functions**: Prepare and transform data using [SQL Functions](/dremio-cloud/sql/sql-functions/). You can also turn unstructured data, such as images or PDFs, into a structured, governed Iceberg table using [AI Functions](/dremio-cloud/sql/sql-functions/AI).
+
+ **Use Dremio's AI Agent**: Ask the built-in AI Agent to identify issues with the data and generate SQL to prepare and transform it. For example, you can ask the AI Agent to:
+
+ * *Generate SQL to remove null values in the revenue column.*
+ * *Generate SQL to join orders and customers on customerID.*
+ * *Add a column for gross margin = revenue - cost.*
+
+ Each transformation can be saved as a view in your silver layer, giving you reusable building blocks. This way, your transformations are continuously updated as more data comes in, with no additional changes required from you. This approach replaces complex ETL pipelines with a simple workflow that keeps your data fresh, governed, and easy to iterate on. For instructions on how to create views, see [Create a View](/dremio-cloud/bring-data/prepare/#create-a-view).
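+
+ Taken together, the example prompts above might translate into a silver-layer view along these lines; the catalog, table, and column names are hypothetical:
+
+ ```sql
+ -- Sketch only: adjust names and paths to your own bronze-layer datasets.
+ CREATE VIEW my_catalog.silver.orders_enriched AS
+ SELECT
+   o.orderID,
+   o.customerID,
+   c.customer_name,
+   o.revenue,
+   o.cost,
+   o.revenue - o.cost AS gross_margin
+ FROM my_catalog.bronze.orders o
+ JOIN my_catalog.bronze.customers c
+   ON o.customerID = c.customerID
+ WHERE o.revenue IS NOT NULL
+ ```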
+
+ ## Step 4: Build Views for Aggregations and Metrics
+
+ Once you've created your silver layer by cleansing and transforming your data, you can create your gold layer of views. These views will capture aggregations and metrics and will be ready for exploration, ad-hoc analysis, or dashboards.
+
+ ### How to Do It
+
+ **Use SQL Functions**: Aggregate and build out metrics using [SQL Functions](/dremio-cloud/sql/sql-functions/).
+
+ **Use Dremio's AI Agent**: Ask the built-in AI Agent to generate the SQL for your view. For example, you can ask the agent to *Give me the SQL for views that summarize the average response time by call center employees and the customer sentiment by region.*
+
+ Aggregations and metrics are saved as governed views in your Open Catalog. For instructions on how to create views, see [Create a View](/dremio-cloud/bring-data/prepare/#create-a-view).
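+
+ For the example prompt above, the resulting gold-layer view might look roughly like this sketch, again with hypothetical names:
+
+ ```sql
+ -- Sketch only: a gold-layer metrics view over a hypothetical silver-layer dataset.
+ CREATE VIEW my_catalog.gold.support_metrics_by_region AS
+ SELECT
+   region,
+   AVG(response_time_minutes) AS avg_response_minutes,
+   AVG(sentiment_score)       AS avg_sentiment,
+   COUNT(*)                   AS total_calls
+ FROM my_catalog.silver.support_calls
+ GROUP BY region
+ ```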
+
+ ## Step 5: Add Semantics to Views
+
+ Data only becomes valuable when everyone can interpret it in the same way. The AI Semantic Layer gives your datasets shared meaning, so when an analyst or AI Agent is looking at "fiscal Q2" or "positive sentiment", they're applying the same business logic every time.
+
+ ### How to Do It
+
+ **Enrich Your Data with Semantics**: Generate wikis and labels on your views to reduce the time spent on manual tasks. For more information on generating semantics, see [Generate Wikis and Labels](/dremio-cloud/manage-govern/wikis-labels/#generate-labels-and-wikis-preview).
+
+ You can add additional context, such as usage notes, definitions specific to your industry, and common queries. These definitions and classifications are stored with the data, guiding natural language queries, SQL generation, and manual exploration.
+
+ ## Step 6: Deliver Insights
+
+ Now that you have connected, curated, aggregated, and enriched your data, you can deliver on the outcome for the business question you defined in [Step 1](/dremio-cloud/get-started/use-case/#step-1-identify-a-business-use-case). The outcome may be the aggregated view you created in the previous step that teams and agents will use directly, or it may be a dashboard that tracks metrics over time. With Dremio Cloud, you can deliver either one.
+
+ ### How to Do It
+
+ **Use Dremio's AI Agent for Actionable Insights**: Dremio's AI Agent can analyze patterns and trends directly from views. You and your users can ask the business question you identified in [Step 1](/dremio-cloud/get-started/use-case/#step-1-identify-a-business-use-case), along with other questions. The AI Agent uses the semantics and samples of the data to generate the appropriate SQL queries that provide you with insights and visualizations of the data. For example, on sales data, you can ask the AI Agent to *Create a chart to show the trends in sales across regions over the last year and provide an analysis on the changes.*
+
+ **Create a Dashboard Using Your Tool of Choice**: If you already have a dashboard or report that you would like to update, or you want to create a new one to represent the insights from your data, you can connect to Dremio from tools like Tableau, Microsoft Power BI, and others using Flight SQL JDBC/ODBC connections. For a list of supported tools and step-by-step instructions on connecting, see [Connect Client Applications](/dremio-cloud/explore-analyze/client-apps/).
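+
+ Whichever delivery path you choose, the query behind it typically targets a gold-layer view. A sketch against the hypothetical view from Step 4:
+
+ ```sql
+ -- Sketch only: a trend query a dashboard or the AI Agent might issue.
+ SELECT
+   region,
+   avg_response_minutes,
+   avg_sentiment,
+   total_calls
+ FROM my_catalog.gold.support_metrics_by_region
+ ORDER BY avg_response_minutes DESC
+ ```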
+
+ ## Step 7: Operationalize the Use Case
+
+ Each use case is operationalized when it's governed, monitored, and shareable.
+
+ ### How to Do It
+
+ **Access Control Policies**: Create and implement access control policies, from role-based access to more granular row- and column-level policies. For more information, see [Privileges](/dremio-cloud/security/privileges/) and [Row-Access and Column-Masking Policies](/dremio-cloud/manage-govern/row-column-policies/).
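+
+ As one illustration, row-access policies in Dremio are expressed as boolean UDFs attached to a table or view. The sketch below assumes hypothetical role and object names; refer to [Row-Access and Column-Masking Policies](/dremio-cloud/manage-govern/row-column-policies/) for the exact syntax:
+
+ ```sql
+ -- Sketch only: limit rows by region unless the user is in an admin role.
+ CREATE FUNCTION region_policy (region VARCHAR)
+   RETURNS BOOLEAN
+   RETURN IS_MEMBER('admins') OR (IS_MEMBER('analysts_emea') AND region = 'EMEA');
+
+ ALTER VIEW my_catalog.gold.support_metrics_by_region
+   ADD ROW ACCESS POLICY region_policy (region)
+ ```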
+
+ **Monitor Query Volumes and Performance**: Track performance and usage of the data. Dremio's [Autonomous Management capability](/dremio-cloud/admin/performance/) automatically handles data management and ensures reliable and fast query performance. In Dremio, you can [monitor queries and their performance](/dremio-cloud/admin/monitor/).
+
+ **Cost Management**: Review the consumption and spend of this use case within the Dremio console. These dashboards show how much compute and storage each workload consumes, helping you plan budgets, optimize workloads, and estimate spend before moving to production. For more information, see [Usage](/dremio-cloud/admin/subscription/usage/).
+
+ Operationalizing your first use case ensures it remains reliable, governed, and cost-effective. You gain insight into both performance and consumption trends, enabling you to scale confidently while maintaining control of your budget.
+
+ ## Wrap Up and Next Steps
+
+ You've now implemented your first use case on Dremio Cloud by:
+
+ * Defining a valuable business use case
+ * Adding your own data to your Open Catalog or connecting existing data sources
+ * Cleaning and transforming the data
+ * Creating reusable views with semantics
+ * Delivering insights via AI or dashboards
+ * Operationalizing the data through governance and monitoring
+
+ Next, extend your use case with additional business questions or another business domain.
+
+ ## Related Topics
+
+ * [Dremio MCP Server](/dremio-cloud/developer/mcp-server/) - Use Dremio's hosted MCP server to customize your agentic workflow.
+ * [Visual Studio Code](/dremio-cloud/developer/vs-code/) - Use the Visual Studio (VS) Code extension for Dremio for development and analysis.
+ * [Optimize Performance](/dremio-cloud/admin/performance/) - Learn about how Dremio autonomously optimizes performance.
+
+ <div style="page-break-after: always;"></div>
+
+ # Quick Tour of the Dremio Console | Dremio Documentation
+
+ Original URL: https://docs.dremio.com/dremio-cloud/get-started/quick-tour
+
+ This quick tour introduces you to the main areas of the Dremio console, including the homepage, Datasets, SQL Runner, and Jobs pages. You'll learn how to navigate the interface and access key features of your agentic lakehouse.
+
+ ## Console Navigation
+
+ The side navigation bar provides links to key areas of the Dremio console.
+
+ ![Dremio console navigation.](/images/homepage-navigation.png "Dremio console navigation.")
+
+ | Location | Description |
+ | --- | --- |
+ | 1 | **Homepage**: Central landing page when you log in. |
+ | 2 | **Datasets**: Interface for exploring tables and views across the default Open Catalog, other catalogs, object storage, and database sources. |
+ | 3 | **SQL Runner**: Editor for writing and running SQL queries against your data. |
+ | 4 | **Jobs**: History of executed SQL and job details. |
+ | 5 | **Project and Organization Settings**: Configuration for your catalog, engines, and routing rules in your project, and management of authentication, users, billing, and projects in your organization. |
+ | 6 | **Documentation and Support**: Access point for documentation, the Community Forum, or the Support Portal. |
+ | 7 | **Account Settings**: Section for managing general information, personal access tokens, appearance preferences, and logout options. |
+
+ ## Datasets Page
+
+ The Datasets page provides navigation and management for data in your Open Catalog, other catalogs, object stores, and databases.
+
+ ![Datasets page navigation and management interface.](/images/datasets-nav.png "Datasets page navigation and management interface.")
+
+ | Location | Description |
+ | --- | --- |
+ | 1 | **Project Name**: Name of the current project being explored. |
+ | 2 | **Namespaces**: Logical containers that organize data objects within Dremio's Open Catalog, providing hierarchical organization and access control for tables, views, and folders. |
+ | 3 | **Sources**: Self-hosted catalogs, object stores, or databases. |
+ | 4 | **Path**: Dot-separated identifier indicating the location of the object, starting with the source or catalog name, followed by any folders, and ending with the name of the table or view (see the example after this table). |
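+
+ For example, a query in the SQL Runner references an object by its full path; the sample dataset from the Get Started guide shows the source-folder-table pattern:
+
+ ```sql
+ -- catalog (or source) . folder . table
+ SELECT *
+ FROM dremio_samples.nyc_citibikes.citibikes
+ LIMIT 10
+ ```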
+
+ ## SQL Runner
+
+ The SQL Runner provides a query editor for running SQL. Access it via ![SQL Runner](/images/icons/sql-runner.png) in the side navigation bar.
+
+ ![SQL Runner interface](/images/sql-runner-nav.png "SQL Runner interface")
+
+ | Location | Description |
+ | --- | --- |
+ | 1 | **Data Panel**: Area for exploring data across your Open Catalog, other catalogs, object stores, and databases, with drag-and-drop support for adding objects into the SQL editor. |
+ | 2 | **Scripts Panel**: Panel for saved SQL scripts that can be reused and shared with other users in your organization. Each script includes creation and modification timestamps and editor context, and requires VIEW privileges. |
+ | 3 | **SQL Editor**: Workspace for creating and editing SQL with autocomplete, syntax highlighting, and function lookup. See [SQL Reference](/dremio-cloud/sql/) for supported SQL. You may also highlight SQL, right-click, and select **Explain SQL** to start a chat with the AI Agent that summarizes the query's purpose, datasets, and architecture. For more details, see [Explain SQL](/dremio-cloud/admin/monitor/jobs/#explain-sql). |
+ | 4 | **Run**: Execution of the SQL, which returns the complete result set. |
+ | 5 | **Preview**: Option for previewing the result set, which returns a subset of rows in less time than running the SQL. |
+ | 6 | **Engine**: Dropdown menu for selecting an engine for SQL execution. By default, Automatic is selected, which routes the query to the appropriate engine based on engine routing rules. For more details, see [Manage Engines](/dremio-cloud/admin/engines/). |
+ | 7 | **Results Panel**: Table displaying the results of your query with options to download, copy, or edit values. |
+ | 8 | **Job Summary**: Tab showing the job status, query type, start time, duration, and job ID. |
+ | 9 | **Transformations**: Tools for applying transformations such as Add Column, Group By, Join, Filter, Convert Data Type, and Add Calculated Field that automatically update the SQL. |
+ | 10 | **Execution State**: Indicator displaying the job status, record count, and execution time, with a link to view full job details. Includes options to download results as JSON, CSV, or Parquet files, or copy data to the clipboard. |
+ | 11 | **Details Panel**: Right-side panel for viewing and managing dataset metadata, including columns, ownership, searchable labels, and wiki content. |
+
+ ### Limitations and Considerations
+
+ **Row Limit**: `COUNT(*)` and `SELECT` query results are limited to one million rows and may be truncated based on thread distribution. When truncated, a warning appears. To obtain complete results, use [JDBC](/dremio-cloud/explore-analyze/client-apps/drivers/arrow-flight-sql-jdbc/) or [ODBC](/dremio-cloud/explore-analyze/client-apps/drivers/arrow-flight-sql-odbc/) drivers.
+
+ **CSV Download**: CSV download is unavailable for result sets with complex data types (union, map, array). The download and copy results options can be enabled or disabled for a specific project by navigating to **Project Settings** > **Preferences**.
+
+ <div style="page-break-after: always;"></div>
+