pytrilogy 0.0.2.20__py3-none-any.whl → 0.0.2.22__py3-none-any.whl

This diff compares the contents of two publicly released versions of the package, as they appear in their public registry, and is provided for informational purposes only.


@@ -1,6 +1,6 @@
 Metadata-Version: 2.1
 Name: pytrilogy
-Version: 0.0.2.20
+Version: 0.0.2.22
 Summary: Declarative, typed query language that compiles to SQL.
 Home-page:
 Author:
@@ -145,7 +145,7 @@ Run the following from the directory the file is in.
 trilogy run hello.trilogy duckdb
 ```
 
-![UI Preview](./hello_world.png)
+![UI Preview](hello-world.png)
 
 ## Backends
 
@@ -158,9 +158,9 @@ The current Trilogy implementation supports these backends:
 
 ## Basic Example - Python
 
-Trilogy can be run directly in python.
+Trilogy can be run directly in python through the core SDK. Trilogy code can be defined and parsed inline or parsed out of files.
 
-A bigquery example, similar to bigquery [the quickstart](https://cloud.google.com/bigquery/docs/quickstarts/query-public-dataset-console)
+A bigquery example, similar to [the bigquery quickstart](https://cloud.google.com/bigquery/docs/quickstarts/query-public-dataset-console).
 
 ```python
 
@@ -224,7 +224,7 @@ and second the dialect to run.
 To pass arguments to a backend, append additional --<option> flags after specifying the dialect.
 
 Example:
-`trilogy run key in int; datasource test_source ( i:in) grain(in) address test; select in;" duckdb --path <path/to/duckdb>`
+`trilogy run "key x int; datasource test_source ( i:x) grain(x) address test; select x;" duckdb --path <path/to/database>`
 
 ### Bigquery Args
 N/A, only supports default auth. In python you can pass in a custom client.
@@ -267,7 +267,7 @@ Clone repository and install requirements.txt and requirements-test.txt.
 Please open an issue first to discuss what you would like to change, and then create a PR against that issue.
 
 ## Similar in space
-Trilogy combines two aspects; a semantic layer and a query language. We've covered examples of both below:
+Trilogy combines two aspects: a semantic layer and a query language. Examples of both are linked below:
 
 Python "semantic layers" are tools for defining data access to a warehouse in a more abstract way.
 
@@ -284,20 +284,26 @@ but all are worth checking out. Please open PRs/comment for anything missed!
 
 #### IMPORT
 
-`import <path> as <alias>;`
+`import [path] as [alias];`
 
 #### CONCEPT
 
-Types: `string | int | float | bool | date | datetime | time | numeric(scale, precision) | timestamp | interval`;
+Types: `string | int | float | bool | date | datetime | time | numeric(scale, precision) | timestamp | interval | list<[type]> | map<[type], [type]> | struct<name:[type], name:[type]>`;
 
 Key:
-`key <name> <type>;`
+`key [name] [type];`
 
 Property:
-`property <key>.<name> <type>;`
+`property [key].[name] [type];`
+`property x.y int;`
+or
+`property <[key](,[key])?>.[name] [type];`
+`property <x,y>.z int;`
+
 
 Transformation:
-`auto <name> <- <expression>;`
+`auto [name] <- [expression];`
+`auto x <- y + 1;`
 
 #### DATASOURCE
 ```sql
@@ -1,31 +1,31 @@
-trilogy/__init__.py,sha256=jkM9h2qEkIh-R8xZca7g_Qzl5cC8w48aOmkcJUjjTAs,291
+trilogy/__init__.py,sha256=C9onmFcqcrpb9znQMhvQ84guVv4plpSoR0x6qC1SDs4,291
 trilogy/compiler.py,sha256=47DEQpj8HBSa-_TImW-5JCeuQeRkm5NMpJWZG3hSuFU,0
-trilogy/constants.py,sha256=pZkOneh_65f9Ua6NICu1bHAFAbmQxmiXRXS7tsmCWbQ,1235
+trilogy/constants.py,sha256=rHCe0Pe3LuB-VwCr2765QhzkUrTqZKEYPJ7rS0ykxYw,1273
 trilogy/engine.py,sha256=R5ubIxYyrxRExz07aZCUfrTsoXCHQ8DKFTDsobXdWdA,1102
-trilogy/executor.py,sha256=An6YLpHQOt96E7ozRQhwZels2hMsDbh0WV767kKCGU0,11294
+trilogy/executor.py,sha256=Sv623APcNOKScYTmiiSvDcSy_ZZiKa04Wtav6dO-TFs,11760
 trilogy/parser.py,sha256=UtuqSiGiCjpMAYgo1bvNq-b7NSzCA5hzbUW31RXaMII,281
 trilogy/py.typed,sha256=47DEQpj8HBSa-_TImW-5JCeuQeRkm5NMpJWZG3hSuFU,0
 trilogy/utility.py,sha256=zM__8r29EsyDW7K9VOHz8yvZC2bXFzh7xKy3cL7GKsk,707
 trilogy/core/__init__.py,sha256=47DEQpj8HBSa-_TImW-5JCeuQeRkm5NMpJWZG3hSuFU,0
-trilogy/core/constants.py,sha256=LL8NLvxb3HRnAjvofyLRXqQJijLcYiXAQYQzGarVD-g,128
-trilogy/core/enums.py,sha256=A9VC0lbP5eo9sndm2TzA-nNJRRmvbjE918ZiEXtcQ_c,6043
-trilogy/core/env_processor.py,sha256=l7TAB0LalxjTYJdTlcmFIkLXuyxa9lrenWLeZfa9qw0,2276
+trilogy/core/constants.py,sha256=7XaCpZn5mQmjTobbeBn56SzPWq9eMNDfzfsRU-fP0VE,171
+trilogy/core/enums.py,sha256=W4ojA8xWRNh8frrTEYyJNLSm1rDA_O0uBL447USmF6c,6144
+trilogy/core/env_processor.py,sha256=z8pYgl5XpprA4ZzRvn7CVIG0hbMu04BlNkugKlT6i3o,2333
 trilogy/core/environment_helpers.py,sha256=1miP4is4FEoci01KSAy2VZVYmlmT5TOCOALBekd2muQ,7211
-trilogy/core/ergonomics.py,sha256=w3gwXdgrxNHCuaRdyKg73t6F36tj-wIjQf47WZkHmJk,1465
+trilogy/core/ergonomics.py,sha256=ASLDd0RqKWrZiG3XcKHo8nyTjaB_8xfE9t4NZ1UvGpc,1639
 trilogy/core/exceptions.py,sha256=NvV_4qLOgKXbpotgRf7c8BANDEvHxlqRPaA53IThQ2o,561
 trilogy/core/functions.py,sha256=ShFTStIKbgI-3EZIU0xTumI78AC5QlvARwnBM53P2O0,10677
 trilogy/core/graph_models.py,sha256=oJUMSpmYhqXlavckHLpR07GJxuQ8dZ1VbB1fB0KaS8c,2036
 trilogy/core/internal.py,sha256=jNGFHKENnbMiMCtAgsnLZYVSENDK4b5ALecXFZpTDzQ,1075
-trilogy/core/models.py,sha256=O1_HxOQ0muMbjWVcOH1mmEFG3IM7taaPgfgCnZv21W8,150676
+trilogy/core/models.py,sha256=IzB_IYcNmEWLYdqgG6fbplM3tNQOOxhW9oBkLP4XYs4,153920
 trilogy/core/optimization.py,sha256=od_60A9F8J8Nj24MHgrxl4vwRwmBFH13TMdoMQvgVKs,7717
-trilogy/core/query_processor.py,sha256=kXuBsIaRHu1s7zB_rAnT_gRe4-VgRSrPE1TnVJXFLtc,16447
+trilogy/core/query_processor.py,sha256=jTYYC0LjrC0ZSFNXSa26QUGsaImwdvWx2yHFFgWQZRU,16607
 trilogy/core/optimizations/__init__.py,sha256=bWQecbeiwiDx9LJnLsa7dkWxdbl2wcnkcTN69JyP8iI,356
 trilogy/core/optimizations/base_optimization.py,sha256=tWWT-xnTbnEU-mNi_isMNbywm8B9WTRsNFwGpeh3rqE,468
 trilogy/core/optimizations/inline_constant.py,sha256=kHNyc2UoaPVdYfVAPAFwnWuk4sJ_IF5faRtVcDOrBtw,1110
 trilogy/core/optimizations/inline_datasource.py,sha256=AATzQ6YrtW_1-aQFjQyTYqEYKBoMFhek7ADfBr4uUdQ,3634
 trilogy/core/optimizations/predicate_pushdown.py,sha256=1l9WnFOSv79e341typG3tTdk0XGl1J_ToQih3LYoGIY,8435
 trilogy/core/processing/__init__.py,sha256=47DEQpj8HBSa-_TImW-5JCeuQeRkm5NMpJWZG3hSuFU,0
-trilogy/core/processing/concept_strategies_v3.py,sha256=DO9gybVLku8GEkO3uNPaCeqhalnufsjYYbvDs-gkwNc,35295
+trilogy/core/processing/concept_strategies_v3.py,sha256=J4efhZCSGSo_CXVRObn7p3Lxr5Ry_G01265amsr2iIU,35294
 trilogy/core/processing/graph_utils.py,sha256=aq-kqk4Iado2HywDxWEejWc-7PGO6Oa-ZQLAM6XWPHw,1199
 trilogy/core/processing/utility.py,sha256=v06sqXpnuYct_MMZXxEaiP0WwkeblWpO81QG1Ns3yGc,19420
 trilogy/core/processing/node_generators/__init__.py,sha256=-mzYkRsaRNa_dfTckYkKVFSR8h8a3ihEiPJDU_tAmDo,672
@@ -35,10 +35,10 @@ trilogy/core/processing/node_generators/filter_node.py,sha256=Vz9Rb67e1dfZgnliek
 trilogy/core/processing/node_generators/group_node.py,sha256=r54IVEhXW-tzod6uEHIQObrxgQt6aNySk5emWkWyqCU,4938
 trilogy/core/processing/node_generators/group_to_node.py,sha256=R9i_wHipxjXJyfYEwfeTw2EPpuanXVA327XyfcP2tBg,2537
 trilogy/core/processing/node_generators/multiselect_node.py,sha256=_KO9lqzHQoy4VAviO0ttQlmK0tjaqrJj4SJPhmoIYm8,6229
-trilogy/core/processing/node_generators/node_merge_node.py,sha256=4aoSkynWYcKAxeN4fU5jnCdxausa5rNgFokoVhPXI80,13511
-trilogy/core/processing/node_generators/rowset_node.py,sha256=gU_ybfYXO9tZqHjUSABIioVpb8AWtITpegj3IGSf2GI,4587
+trilogy/core/processing/node_generators/node_merge_node.py,sha256=ArwE7QoLgSoc2WUR_z9kyWJROxn9qq3KhMcdMZdH3k0,13977
+trilogy/core/processing/node_generators/rowset_node.py,sha256=KtdN6t2xM8CJxobc4aQX4W8uX98U6IabeuBF_FtBLR4,4583
 trilogy/core/processing/node_generators/select_merge_node.py,sha256=MKjlXqFBSin6cTnS6n5lEcNBJsMvSefDIXOwYNVbM0s,10371
-trilogy/core/processing/node_generators/select_node.py,sha256=vUg3gXHGvagdbniIAE7DdqJcQ0V1VAfHtTrw3edYPso,1734
+trilogy/core/processing/node_generators/select_node.py,sha256=nwXHQF6C-aQUIelx9dyxN2pK3muL-4-6RIqnqQqNwtw,1808
 trilogy/core/processing/node_generators/unnest_node.py,sha256=cZ26CN338CBnd6asML1OBUtNcDzmNlFpY0Vnade4yrc,2256
 trilogy/core/processing/node_generators/window_node.py,sha256=jy3FF8uN0VA7yyrBeR40B9CAqR_5qBP4PiS6Gr-f-7w,2590
 trilogy/core/processing/nodes/__init__.py,sha256=qS5EJDRwwIrCEfS7ibCA2ESE0RPzsAIii1UWd_wNsHA,4760
@@ -50,7 +50,7 @@ trilogy/core/processing/nodes/select_node_v2.py,sha256=gS9OQgS2TSEK59BQ9R0i83pTH
 trilogy/core/processing/nodes/unnest_node.py,sha256=mAmFluzm2yeeiQ6NfIB7BU_8atRGh-UJfPf9ROwbhr8,2152
 trilogy/core/processing/nodes/window_node.py,sha256=ro0QfMFi4ZmIn5Q4D0M_vJWfnHH_C0MN7XkVkx8Gygg,1214
 trilogy/dialect/__init__.py,sha256=47DEQpj8HBSa-_TImW-5JCeuQeRkm5NMpJWZG3hSuFU,0
-trilogy/dialect/base.py,sha256=QDeKbc7vgqfbR649-87cwsOz8G3VY63V19zH6I-WITo,33103
+trilogy/dialect/base.py,sha256=9hT4adhR4NG98AnrSYnJ9wGN0xJvth53fd-xLuyw3nI,33151
 trilogy/dialect/bigquery.py,sha256=15KJ-cOpBlk9O7FPviPgmg8xIydJeKx7WfmL3SSsPE8,2953
 trilogy/dialect/common.py,sha256=Hr0mxcNxjSvhpBM5Wvb_Q7aklAuYj5aBDrW433py0Zs,4403
 trilogy/dialect/config.py,sha256=tLVEMctaTDhUgARKXUNfHUcIolGaALkQ0RavUvXAY4w,2994
@@ -66,18 +66,18 @@ trilogy/hooks/graph_hook.py,sha256=onHvMQPwj_KOS3HOTpRFiy7QLLKAiycq2MzJ_Q0Oh5Y,2
 trilogy/hooks/query_debugger.py,sha256=Pe-Kw1JGngeLqQOMQb0E3-24jXEavqnPCQ-KOfTfjP8,4357
 trilogy/metadata/__init__.py,sha256=47DEQpj8HBSa-_TImW-5JCeuQeRkm5NMpJWZG3hSuFU,0
 trilogy/parsing/__init__.py,sha256=47DEQpj8HBSa-_TImW-5JCeuQeRkm5NMpJWZG3hSuFU,0
-trilogy/parsing/common.py,sha256=-4LM71ocidA8DI2RngqFEOmhzBrIt8VdBTO4x2BpD8E,9502
+trilogy/parsing/common.py,sha256=kbqWy30nnVc7ID-sdSDwxYomnxd3guyuIJF3yvlpQwg,9960
 trilogy/parsing/config.py,sha256=Z-DaefdKhPDmSXLgg5V4pebhSB0h590vI0_VtHnlukI,111
 trilogy/parsing/exceptions.py,sha256=92E5i2frv5hj9wxObJZsZqj5T6bglvPzvdvco_vW1Zk,38
 trilogy/parsing/helpers.py,sha256=AbpHGcgLb-kRsJGnwFEktk7uzpZOCcBY74-YBdrKVGs,1
-trilogy/parsing/parse_engine.py,sha256=_orHmZTdExJYkuNRmAaazdXpZcvtPMLKIBI9vBAgq6w,64871
-trilogy/parsing/render.py,sha256=8yxerPAi4AhlhPBlAfbYbOM3F9rz6HzpWVEWPtK2VEg,12321
-trilogy/parsing/trilogy.lark,sha256=0JAvQBACFNL-X61I0tB_0QPZgsguZgerfHBv903oKh0,11623
+trilogy/parsing/parse_engine.py,sha256=X1EFrHrc22dOsWHbk_5VVhViZZnF_SdmYlz_xksxplk,63751
+trilogy/parsing/render.py,sha256=FRC42ZV2Xg3P4pZ7dBMZCgRdFYf_QbN_CCOgnVMON_g,12395
+trilogy/parsing/trilogy.lark,sha256=3ElzcGWx8exOv9zJwxd_Vs_lau7g97QulwdfWldOLkA,11971
 trilogy/scripts/__init__.py,sha256=47DEQpj8HBSa-_TImW-5JCeuQeRkm5NMpJWZG3hSuFU,0
 trilogy/scripts/trilogy.py,sha256=PHxvv6f2ODv0esyyhWxlARgra8dVhqQhYl0lTrSyVNo,3729
-pytrilogy-0.0.2.20.dist-info/LICENSE.md,sha256=5ZRvtTyCCFwz1THxDTjAu3Lidds9WjPvvzgVwPSYNDo,1042
-pytrilogy-0.0.2.20.dist-info/METADATA,sha256=pmZ2uTW1EBclZPel8B698DU3sdTmutM5vnu25tNTHfM,8132
-pytrilogy-0.0.2.20.dist-info/WHEEL,sha256=GV9aMThwP_4oNCtvEC2ec3qUYutgWeAzklro_0m4WJQ,91
-pytrilogy-0.0.2.20.dist-info/entry_points.txt,sha256=0petKryjvvtEfTlbZC1AuMFumH_WQ9v8A19LvoS6G6c,54
-pytrilogy-0.0.2.20.dist-info/top_level.txt,sha256=cAy__NW_eMAa_yT9UnUNlZLFfxcg6eimUAZ184cdNiE,8
-pytrilogy-0.0.2.20.dist-info/RECORD,,
+pytrilogy-0.0.2.22.dist-info/LICENSE.md,sha256=5ZRvtTyCCFwz1THxDTjAu3Lidds9WjPvvzgVwPSYNDo,1042
+pytrilogy-0.0.2.22.dist-info/METADATA,sha256=KjHfUWqKMsWICew6drUKziQGGQINSUsrXIeggDK4CIw,8403
+pytrilogy-0.0.2.22.dist-info/WHEEL,sha256=GV9aMThwP_4oNCtvEC2ec3qUYutgWeAzklro_0m4WJQ,91
+pytrilogy-0.0.2.22.dist-info/entry_points.txt,sha256=0petKryjvvtEfTlbZC1AuMFumH_WQ9v8A19LvoS6G6c,54
+pytrilogy-0.0.2.22.dist-info/top_level.txt,sha256=cAy__NW_eMAa_yT9UnUNlZLFfxcg6eimUAZ184cdNiE,8
+pytrilogy-0.0.2.22.dist-info/RECORD,,
trilogy/__init__.py CHANGED
@@ -4,6 +4,6 @@ from trilogy.executor import Executor
 from trilogy.parser import parse
 from trilogy.constants import CONFIG
 
-__version__ = "0.0.2.20"
+__version__ = "0.0.2.22"
 
 __all__ = ["parse", "Executor", "Dialects", "Environment", "CONFIG"]
trilogy/constants.py CHANGED
@@ -44,6 +44,7 @@ class Comments:
 class Config:
     strict_mode: bool = True
     human_identifiers: bool = True
+    randomize_cte_names: bool = False
     validate_missing: bool = True
     comments: Comments = field(default_factory=Comments)
     optimizations: Optimizations = field(default_factory=Optimizations)
trilogy/core/constants.py CHANGED
@@ -1,3 +1,4 @@
 CONSTANT_DATASET: str = "preql_internal_constant_dataset"
 ALL_ROWS_CONCEPT = "all_rows"
 INTERNAL_NAMESPACE = "__preql_internal"
+PERSISTED_CONCEPT_PREFIX = "__pre_persist"
trilogy/core/enums.py CHANGED
@@ -12,6 +12,8 @@ class UnnestMode(Enum):
 
 class ConceptSource(Enum):
     MANUAL = "manual"
+    CTE = "cte"
+    PERSIST_STATEMENT = "persist_statement"
     AUTO_DERIVED = "auto_derived"
 
 
@@ -205,6 +207,8 @@ class FunctionClass(Enum):
         FunctionType.CURRENT_DATETIME,
     ]
 
+    ONE_TO_MANY = [FunctionType.UNNEST]
+
 
 class Boolean(Enum):
     TRUE = "true"
trilogy/core/env_processor.py CHANGED
@@ -10,9 +10,11 @@ def add_concept(concept: Concept, g: ReferenceGraph):
     g.add_node(concept)
     # if we have sources, recursively add them
     node_name = concept_to_node(concept)
-    if concept.sources:
-        for source in concept.sources:
+    if concept.concept_arguments:
+        for source in concept.concept_arguments:
             generic = source.with_default_grain()
+            add_concept(generic, g)
+
             g.add_edge(generic, node_name)
     for _, pseudonym in concept.pseudonyms.items():
         pseudonym = pseudonym.with_default_grain()
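The hunk above switches graph population from `sources` to `concept_arguments` and makes the walk recursive, so transitive inputs of a concept also land in the reference graph. A minimal standalone sketch of that recursive pattern (the `Concept` class and graph representation here are simplified stand-ins, not trilogy's real classes):

```python
# Sketch of the recursive dependency walk from the hunk above.
# Names are illustrative; trilogy's real Concept/ReferenceGraph differ.

class Concept:
    def __init__(self, name, concept_arguments=()):
        self.name = name
        self.concept_arguments = list(concept_arguments)

def add_concept(concept, edges, nodes):
    """Register a concept node, then recurse into its arguments so
    transitive inputs get their own nodes and edges."""
    if concept.name in nodes:
        return  # already walked; avoids duplicate work on shared inputs
    nodes.add(concept.name)
    for source in concept.concept_arguments:
        add_concept(source, edges, nodes)  # the recursion added in this release
        edges.append((source.name, concept.name))

leaf = Concept("order_id")
mid = Concept("order_count", [leaf])
top = Concept("orders_per_day", [mid])

nodes, edges = set(), []
add_concept(top, edges, nodes)
```

Without the recursive call, only the direct arguments of `orders_per_day` would be added; with it, `order_id` is reachable in the graph as well.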
trilogy/core/ergonomics.py CHANGED
@@ -1,3 +1,5 @@
+from trilogy.constants import CONFIG
+
 # source: https://github.com/aaronbassett/Pass-phrase
 CTE_NAMES = """quizzical
 highfalutin
@@ -103,8 +105,6 @@ mandrill
 marlin
 monitor
 ocelot
-osprey
-owl
 petrel
 python
 ray
@@ -132,7 +132,6 @@ cuckoo
 darter
 dove
 duck
-eagle
 falcon
 finch
 flamingo
@@ -184,4 +183,12 @@ warbler""".split(
     "\n"
 )
 
-CTE_NAMES = list(set(CTE_NAMES))
+
+def generate_cte_names():
+    if CONFIG.randomize_cte_names:
+        from random import shuffle
+
+        new = [*CTE_NAMES]
+        shuffle(new)
+        return new
+    return CTE_NAMES
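The new `generate_cte_names` ties into the `randomize_cte_names` flag added to `Config`: by default the pool keeps its stable source order, and only when the flag is set is a shuffled copy returned, leaving the module-level list untouched. A self-contained sketch of that pattern (the flag is passed directly here rather than read from trilogy's `CONFIG` object, and the name pool is shortened):

```python
import random

# Abbreviated stand-in for trilogy's CTE name pool.
CTE_NAMES = ["quizzical", "highfalutin", "petrel", "falcon", "finch"]

def generate_cte_names(randomize: bool = False) -> list:
    """Return the CTE name pool; shuffled copy when randomization is on."""
    if randomize:
        pool = [*CTE_NAMES]      # copy so the module-level list stays stable
        random.shuffle(pool)
        return pool
    return CTE_NAMES

stable = generate_cte_names()
shuffled = generate_cte_names(randomize=True)
```

Returning a fresh copy for the randomized case means repeated calls in deterministic mode always yield identical naming, which keeps generated SQL diffable.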
trilogy/core/models.py CHANGED
@@ -44,6 +44,7 @@ from trilogy.core.constants import (
     ALL_ROWS_CONCEPT,
     INTERNAL_NAMESPACE,
     CONSTANT_DATASET,
+    PERSISTED_CONCEPT_PREFIX,
 )
 from trilogy.core.enums import (
     InfiniteFunctionArgs,
@@ -912,10 +913,10 @@ class Grain(Mergeable, BaseModel):
     )
 
     @cached_property
-    def set(self):
+    def set(self) -> set[str]:
         base = []
         for x in self.components_copy:
-            if x.derivation == PurposeLineage.ROWSET:
+            if isinstance(x.lineage, RowsetItem):
                 base.append(x.lineage.content.address)
             else:
                 base.append(x.address)
@@ -3131,6 +3132,12 @@ class EnvironmentConceptDict(dict):
     def values(self) -> ValuesView[Concept]:  # type: ignore
         return super().values()
 
+    def get(self, key: str, default: Concept | None = None) -> Concept | None:  # type: ignore
+        try:
+            return self.__getitem__(key)
+        except UndefinedConceptException:
+            return default
+
     def __getitem__(
         self, key, line_no: int | None = None
     ) -> Concept | UndefinedConcept:
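The new `EnvironmentConceptDict.get` restores plain `dict.get` semantics on top of a `__getitem__` that raises a domain-specific exception. The pattern generalizes to any dict subclass whose subscript lookup raises its own error type; a generic sketch (class and exception names here are illustrative stand-ins):

```python
class UndefinedConceptException(KeyError):
    """Stand-in for trilogy's exception raised on unknown concept lookups."""

class ConceptDict(dict):
    def __getitem__(self, key, line_no=None):
        try:
            return super().__getitem__(key)
        except KeyError:
            raise UndefinedConceptException(f"undefined concept: {key}")

    def get(self, key, default=None):
        # Route through __getitem__ so any lookup logic lives in one place,
        # while restoring dict.get's "return the default" contract.
        try:
            return self[key]
        except UndefinedConceptException:
            return default

d = ConceptDict(revenue=1)
```

Without the override, `dict.get` would bypass the custom `__getitem__` entirely; with it, callers such as `Environment.validate_concept` can probe for a concept without try/except boilerplate.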
@@ -3257,24 +3264,62 @@ class Environment(BaseModel):
         for datasource in self.datasources.values():
             for concept in datasource.output_concepts:
                 concrete_addresses.add(concept.address)
-        self.materialized_concepts = [
-            c for c in self.concepts.values() if c.address in concrete_addresses
-        ] + [
-            c
-            for c in self.alias_origin_lookup.values()
-            if c.address in concrete_addresses
-        ]
+        self.materialized_concepts = unique(
+            [c for c in self.concepts.values() if c.address in concrete_addresses]
+            + [
+                c
+                for c in self.alias_origin_lookup.values()
+                if c.address in concrete_addresses
+            ],
+            "address",
+        )
 
-    def validate_concept(self, lookup: str, meta: Meta | None = None):
+    def validate_concept(self, new_concept: Concept, meta: Meta | None = None):
+        lookup = new_concept.address
         existing: Concept = self.concepts.get(lookup)  # type: ignore
         if not existing:
             return
-        elif existing and self.environment_config.allow_duplicate_declaration:
+
+        def handle_persist():
+            deriv_lookup = (
+                f"{existing.namespace}.{PERSISTED_CONCEPT_PREFIX}_{existing.name}"
+            )
+
+            alt_source = self.alias_origin_lookup.get(deriv_lookup)
+            if not alt_source:
+                return None
+            # if the new concept binding has no lineage
+            # nothing to cause us to think a persist binding
+            # needs to be invalidated
+            if not new_concept.lineage:
+                return existing
+            if str(alt_source.lineage) == str(new_concept.lineage):
+                logger.info(
+                    f"Persisted concept {existing.address} matched redeclaration, keeping current persistence binding."
+                )
+                return existing
+            logger.warning(
+                f"Persisted concept {existing.address} lineage {str(alt_source.lineage)} did not match redeclaration {str(new_concept.lineage)}, overwriting and invalidating persist binding."
+            )
+            for k, datasource in self.datasources.items():
+                if existing.address in datasource.output_concepts:
+                    datasource.columns = [
+                        x
+                        for x in datasource.columns
+                        if x.concept.address != existing.address
+                    ]
+            return None
+
+        if existing and self.environment_config.allow_duplicate_declaration:
+            if existing.metadata.concept_source == ConceptSource.PERSIST_STATEMENT:
+                return handle_persist()
             return
         elif existing.metadata:
+            if existing.metadata.concept_source == ConceptSource.PERSIST_STATEMENT:
+                return handle_persist()
             # if the existing concept is auto derived, we can overwrite it
             if existing.metadata.concept_source == ConceptSource.AUTO_DERIVED:
-                return
+                return None
         elif meta and existing.metadata:
             raise ValueError(
                 f"Assignment to concept '{lookup}' on line {meta.line} is a duplicate"
@@ -3390,11 +3435,12 @@ class Environment(BaseModel):
         _ignore_cache: bool = False,
     ):
         if not force:
-            self.validate_concept(concept.address, meta=meta)
+            existing = self.validate_concept(concept, meta=meta)
+            if existing:
+                concept = existing
         if concept.namespace == DEFAULT_NAMESPACE:
             self.concepts[concept.name] = concept
-        else:
-            self.concepts[concept.address] = concept
+        self.concepts[concept.address] = concept
         from trilogy.core.environment_helpers import generate_related_concepts
 
         generate_related_concepts(concept, self, meta=meta, add_derived=add_derived)
@@ -3408,21 +3454,42 @@ class Environment(BaseModel):
         meta: Meta | None = None,
         _ignore_cache: bool = False,
     ):
-
         self.datasources[datasource.env_label] = datasource
-        for column in datasource.columns:
-            current_concept = column.concept
+        for current_concept in datasource.output_concepts:
             current_derivation = current_concept.derivation
+            # TODO: refine this section;
+            # too hacky for maintainability
             if current_derivation not in (PurposeLineage.ROOT, PurposeLineage.CONSTANT):
-                new_concept = current_concept.model_copy(deep=True)
-                new_concept.set_name("_pre_persist_" + current_concept.name)
-                # remove the associated lineage
-                current_concept.lineage = None
-                self.add_concept(new_concept, meta=meta, force=True, _ignore_cache=True)
-                self.add_concept(
-                    current_concept, meta=meta, force=True, _ignore_cache=True
-                )
-                self.merge_concept(new_concept, current_concept, [])
+                persisted = f"{PERSISTED_CONCEPT_PREFIX}_" + current_concept.name
+                # override the current concept source to reflect that it's now coming from a datasource
+                if (
+                    current_concept.metadata.concept_source
+                    != ConceptSource.PERSIST_STATEMENT
+                ):
+                    new_concept = current_concept.model_copy(deep=True)
+                    new_concept.set_name(persisted)
+                    self.add_concept(
+                        new_concept, meta=meta, force=True, _ignore_cache=True
+                    )
+                    current_concept.metadata.concept_source = (
+                        ConceptSource.PERSIST_STATEMENT
+                    )
+                    # remove the associated lineage
+                    # to make this a root for discovery purposes
+                    # as it now "exists" in a table
+                    current_concept.lineage = None
+                    current_concept = current_concept.with_default_grain()
+                    self.add_concept(
+                        current_concept, meta=meta, force=True, _ignore_cache=True
+                    )
+                    self.merge_concept(new_concept, current_concept, [])
+                else:
+                    self.add_concept(current_concept, meta=meta, _ignore_cache=True)
+
+            # else:
+            #     self.add_concept(
+            #         current_concept, meta=meta, _ignore_cache=True
+            #     )
         if not _ignore_cache:
             self.gen_concept_list_caches()
         return datasource
@@ -4230,6 +4297,9 @@ class RowsetDerivationStatement(Namespaced, BaseModel):
     def __repr__(self):
         return f"RowsetDerivation<{str(self.select)}>"
 
+    def __str__(self):
+        return self.__repr__()
+
     @property
     def derived_concepts(self) -> List[Concept]:
         output: list[Concept] = []
@@ -4248,7 +4318,8 @@ class RowsetDerivationStatement(Namespaced, BaseModel):
                 content=orig_concept, where=self.select.where_clause, rowset=self
             ),
             grain=orig_concept.grain,
-            metadata=orig_concept.metadata,
+            # TODO: add proper metadata
+            metadata=Metadata(concept_source=ConceptSource.CTE),
             namespace=(
                 f"{self.name}.{orig_concept.namespace}"
                 if orig_concept.namespace != self.namespace
@@ -4275,6 +4346,7 @@ class RowsetDerivationStatement(Namespaced, BaseModel):
                 components=[orig[c.address] for c in x.grain.components_copy]
             )
         else:
+
             x.grain = default_grain
         return output
 
@@ -4300,6 +4372,9 @@ class RowsetItem(Mergeable, Namespaced, BaseModel):
             f"<Rowset<{self.rowset.name}>: {str(self.content)} where {str(self.where)}>"
         )
 
+    def __str__(self):
+        return self.__repr__()
+
     def with_merge(self, source: Concept, target: Concept, modifiers: List[Modifier]):
         return RowsetItem(
             content=self.content.with_merge(source, target, modifiers),
trilogy/core/processing/concept_strategies_v3.py CHANGED
@@ -432,7 +432,6 @@ def generate_node(
         conditions=conditions,
     )
     if not check:
-
         logger.info(
             f"{depth_to_prefix(depth)}{LOGGER_PREFIX} Could not resolve root concepts, checking for expanded concepts"
         )
trilogy/core/processing/node_generators/node_merge_node.py CHANGED
@@ -87,12 +87,10 @@ def determine_induced_minimal_nodes(
     for node in G.nodes:
         if concepts.get(node):
             lookup: Concept = concepts[node]
-            if lookup.derivation not in (PurposeLineage.BASIC, PurposeLineage.ROOT):
-                nodes_to_remove.append(node)
-            elif lookup.derivation == PurposeLineage.BASIC and G.out_degree(node) == 0:
+            if lookup.derivation in (PurposeLineage.CONSTANT,):
                 nodes_to_remove.append(node)
             # purge a node if we're already looking for all it's parents
-            elif filter_downstream and lookup.derivation == PurposeLineage.BASIC:
+            if filter_downstream and lookup.derivation not in (PurposeLineage.ROOT,):
                 nodes_to_remove.append(node)
 
     H.remove_nodes_from(nodes_to_remove)
@@ -105,11 +103,12 @@ def determine_induced_minimal_nodes(
     zero_out = list(
         x for x in H.nodes if G.out_degree(x) == 0 and x not in nodelist
     )
+
     try:
         paths = nx.multi_source_dijkstra_path(H, nodelist)
     except nx.exception.NodeNotFound:
+        logger.debug(f"Unable to find paths for {nodelist}")
         return None
-
     H.remove_nodes_from(list(x for x in H.nodes if x not in paths))
     sG: nx.Graph = ax.steinertree.steiner_tree(H, nodelist).copy()
     final: nx.DiGraph = nx.subgraph(G, sG.nodes).copy()
@@ -126,12 +125,24 @@ def determine_induced_minimal_nodes(
     # all concept nodes must have a parent
 
     if not all(
-        [final.in_degree(node) > 0 for node in final.nodes if node.startswith("c~")]
+        [
+            final.in_degree(node) > 0
+            for node in final.nodes
+            if node.startswith("c~") and node in nodelist
+        ]
     ):
+        missing = [
+            node
+            for node in final.nodes
+            if node.startswith("c~") and final.in_degree(node) == 0
+        ]
+        logger.debug(f"Skipping graph for {nodelist} as no in_degree {missing}")
         return None
 
     if not all([node in final.nodes for node in nodelist]):
+        logger.debug(f"Skipping graph for {nodelist} as missing nodes")
         return None
+    logger.debug(f"Found final graph {final.nodes}")
     return final
 
 
@@ -256,7 +267,9 @@ def resolve_weak_components(
 
     subgraphs: list[list[Concept]] = []
     # components = nx.strongly_connected_components(g)
+    node_list = [x for x in g.nodes if x.startswith("c~")]
     components = extract_ds_components(g, node_list)
+    logger.debug(f"Extracted components {components} from {node_list}")
     for component in components:
         # we need to take unique again as different addresses may map to the same concept
         sub_component = unique(
trilogy/core/processing/node_generators/rowset_node.py CHANGED
@@ -96,7 +96,7 @@ def gen_rowset_node(
         # node.set_preexisting_conditions(conditions.conditional if conditions else None)
         return node
 
-    possible_joins = concept_to_relevant_joins(additional_relevant)
+    possible_joins = concept_to_relevant_joins(node.output_concepts)
     if not possible_joins:
         logger.info(
             f"{padding(depth)}{LOGGER_PREFIX} no possible joins for rowset node to get {[x.address for x in local_optional]}; have {[x.address for x in node.output_concepts]}"
@@ -104,7 +104,7 @@ def gen_rowset_node(
         return node
     enrich_node: MergeNode = source_concepts(  # this fetches the parent + join keys
         # to then connect to the rest of the query
-        mandatory_list=additional_relevant + local_optional,
+        mandatory_list=possible_joins + local_optional,
         environment=environment,
         g=g,
         depth=depth + 1,
trilogy/core/processing/node_generators/select_node.py CHANGED
@@ -39,9 +39,10 @@ def gen_select_node(
         ]
     )
     if materialized_lcl != all_lcl:
+        missing = all_lcl.difference(materialized_lcl)
         logger.info(
             f"{padding(depth)}{LOGGER_PREFIX} Skipping select node generation for {concept.address}"
-            f" as it + optional includes non-materialized concepts (looking for all {all_lcl}) "
+            f" as it + optional includes non-materialized concepts (looking for all {all_lcl}, missing {missing}) "
         )
         if fail_if_not_found:
             raise NoDatasourceException(f"No datasource exists for {concept}")
trilogy/core/query_processor.py CHANGED
@@ -11,6 +11,7 @@ from trilogy.core.models import (
     Concept,
     Environment,
     PersistStatement,
+    ConceptDeclarationStatement,
     SelectStatement,
     MultiSelectStatement,
     CTE,
@@ -31,7 +32,7 @@ from trilogy.utility import unique
 
 from trilogy.hooks.base_hook import BaseHook
 from trilogy.constants import logger
-from trilogy.core.ergonomics import CTE_NAMES
+from trilogy.core.ergonomics import generate_cte_names
 from trilogy.core.optimization import optimize_ctes
 from math import ceil
 from collections import defaultdict
@@ -168,15 +169,16 @@ def datasource_to_query_datasource(datasource: Datasource) -> QueryDatasource:
 
 
 def generate_cte_name(full_name: str, name_map: dict[str, str]) -> str:
+    cte_names = generate_cte_names()
     if CONFIG.human_identifiers:
         if full_name in name_map:
             return name_map[full_name]
         suffix = ""
         idx = len(name_map)
-        if idx >= len(CTE_NAMES):
-            int = ceil(idx / len(CTE_NAMES))
+        if idx >= len(cte_names):
+            int = ceil(idx / len(cte_names))
             suffix = f"_{int}"
-        valid = [x for x in CTE_NAMES if x + suffix not in name_map.values()]
+        valid = [x for x in cte_names if x + suffix not in name_map.values()]
         lookup = valid[0]
         new_name = f"{lookup}{suffix}"
         name_map[full_name] = new_name
@@ -393,6 +395,8 @@ def process_auto(
         return process_persist(environment, statement, hooks)
     elif isinstance(statement, SelectStatement):
         return process_query(environment, statement, hooks)
+    elif isinstance(statement, ConceptDeclarationStatement):
+        return None
    raise ValueError(f"Do not know how to process {type(statement)}")
trilogy/dialect/base.py CHANGED
@@ -48,6 +48,7 @@ from trilogy.core.models import (
     MapType,
     StructType,
     MergeStatementV2,
+    Datasource,
 )
 from trilogy.core.query_processor import process_query, process_persist
 from trilogy.dialect.common import render_join, render_unnest
@@ -724,6 +725,7 @@ class BaseDialect:
                     MergeStatementV2,
                     ImportStatement,
                     RowsetDerivationStatement,
+                    Datasource,
                 ),
             ):
                 continue
trilogy/executor.py CHANGED
@@ -23,7 +23,7 @@ from trilogy.dialect.base import BaseDialect
 from trilogy.dialect.enums import Dialects
 from trilogy.parser import parse_text
 from trilogy.hooks.base_hook import BaseHook
-
+from pathlib import Path
 from dataclasses import dataclass
 
 
@@ -152,6 +152,13 @@ class Executor(object):
     def _(self, query: RawSQLStatement) -> CursorResult:
         return self.execute_raw_sql(query.text)
 
+    @execute_query.register
+    def _(self, query: ShowStatement) -> CursorResult:
+        sql = self.generator.generate_queries(
+            self.environment, [query], hooks=self.hooks
+        )
+        return self.execute_query(sql[0])
+
     @execute_query.register
     def _(self, query: ProcessedShowStatement) -> CursorResult:
         return generate_result_set(
@@ -341,3 +348,9 @@ class Executor(object):
         if isinstance(statement, ProcessedQueryPersist):
             self.environment.add_datasource(statement.datasource)
         return output
+
+    def execute_file(self, file: str | Path) -> List[CursorResult]:
+        file = Path(file)
+        with open(file, "r") as f:
+            command = f.read()
+        return self.execute_text(command)
trilogy/parsing/common.py CHANGED
@@ -141,9 +141,12 @@ def function_to_concept(parent: Function, name: str, namespace: str) -> Concept:
  for x in parent.concept_arguments
  if not x.derivation == PurposeLineage.CONSTANT
  ]
- grain = Grain()
+ grain: Grain | None = Grain()
  for x in pkeys:
  grain += x.grain
+ if parent.operator in FunctionClass.ONE_TO_MANY.value:
+ # if the function will create more rows, we don't know what grain this is at
+ grain = None
  modifiers = get_upstream_modifiers(pkeys)
  key_grain = []
  for x in pkeys:
@@ -156,13 +159,24 @@ def function_to_concept(parent: Function, name: str, namespace: str) -> Concept:
  purpose = Purpose.CONSTANT
  else:
  purpose = parent.output_purpose
+ if grain is not None:
+ return Concept(
+ name=name,
+ datatype=parent.output_datatype,
+ purpose=purpose,
+ lineage=parent,
+ namespace=namespace,
+ keys=keys,
+ modifiers=modifiers,
+ grain=grain,
+ )
+
  return Concept(
  name=name,
  datatype=parent.output_datatype,
  purpose=purpose,
  lineage=parent,
  namespace=namespace,
- grain=grain,
  keys=keys,
  modifiers=modifiers,
  )
@@ -461,40 +461,7 @@ class ParseToObjects(Transformer):
  while isinstance(source_value, Parenthetical):
  source_value = source_value.content

- if (
- isinstance(source_value, Function)
- and source_value.operator == FunctionType.STRUCT
- ):
- concept = arbitrary_to_concept(
- source_value,
- name=name,
- namespace=namespace,
- purpose=purpose,
- metadata=metadata,
- )
-
- if concept.metadata:
- concept.metadata.line_number = meta.line
- self.environment.add_concept(concept, meta=meta)
- return ConceptDerivation(concept=concept)
- elif (
- isinstance(source_value, Function)
- and source_value.operator == FunctionType.ALIAS
- ):
- concept = arbitrary_to_concept(
- source_value,
- name=name,
- namespace=namespace,
- purpose=purpose,
- metadata=metadata,
- )
-
- if concept.metadata:
- concept.metadata.line_number = meta.line
- self.environment.add_concept(concept, meta=meta)
- return ConceptDerivation(concept=concept)
-
- elif isinstance(
+ if isinstance(
  source_value, (FilterItem, WindowItem, AggregateWrapper, Function)
  ):
  concept = arbitrary_to_concept(
trilogy/parsing/render.py CHANGED
@@ -186,6 +186,8 @@ class Renderer:

  @to_string.register
  def _(self, arg: "Address"):
+ if arg.is_query:
+ return f"query '''{arg.location}'''"
  return f"address {arg.location}"

  @to_string.register
@@ -200,29 +200,42 @@
  _constant_functions: fcurrent_date | fcurrent_datetime

  //string
- like: "like"i "(" expr "," _string_lit ")"
- ilike: "ilike"i "(" expr "," _string_lit ")"
- alt_like: expr "like"i expr
- upper: "upper"i "(" expr ")"
- lower: "lower"i "(" expr ")"
- fsplit: "split"i "(" expr "," _string_lit ")"
- fstrpos: "strpos"i "(" expr "," expr ")"
+ _LIKE.1: "like("i
+ like: _LIKE expr "," _string_lit ")"
+ _ILIKE.1: "ilike("i
+ ilike: _ILIKE expr "," _string_lit ")"
+ alt_like: expr "like"i expr
+ _UPPER.1: "upper("i
+ upper: _UPPER expr ")"
+ _LOWER.1: "lower("i
+ lower: _LOWER expr ")"
+ _SPLIT.1: "split("i
+ fsplit: _SPLIT expr "," _string_lit ")"
+ _STRPOS.1: "strpos("i
+ fstrpos: _STRPOS expr "," expr ")"
  _SUBSTRING.1: "substring("i
  fsubstring: _SUBSTRING expr "," expr "," expr ")"

  _string_functions: like | ilike | upper | lower | fsplit | fstrpos | fsubstring

  // special aggregate
- fgroup: "group"i "(" expr ")" aggregate_over?
+ _GROUP.1: "group("i
+ fgroup: _GROUP expr ")" aggregate_over?
  //aggregates
  _COUNT.1: "count("i
  count: _COUNT expr ")"
+
  count_distinct: "count_distinct"i "(" expr ")"

- sum: "sum"i "(" expr ")"
- avg: "avg"i "(" expr ")"
- max: "max"i "(" expr ")"
- min: "min"i "(" expr ")"
+ // avoid conflicts with the window
+ _SUM.1: "sum("i
+ sum: _SUM expr ")"
+ _AVG.1: "avg("i
+ avg: _AVG expr ")"
+ _MAX.1: "max("i
+ max: _MAX expr ")"
+ _MIN.1: "min("i
+ min: _MIN expr ")"

  //aggregates can force a grain
  aggregate_all: "*"
@@ -255,10 +268,14 @@
  fyear: _YEAR expr ")"

  DATE_PART: "DAY"i | "WEEK"i | "MONTH"i | "QUARTER"i | "YEAR"i | "MINUTE"i | "HOUR"i | "SECOND"i
- fdate_trunc: "date_trunc"i "(" expr "," DATE_PART ")"
- fdate_part: "date_part"i "(" expr "," DATE_PART ")"
- fdate_add: "date_add"i "(" expr "," DATE_PART "," int_lit ")"
- fdate_diff: "date_diff"i "(" expr "," expr "," DATE_PART ")"
+ _DATE_TRUNC.1: "date_trunc("i
+ fdate_trunc: _DATE_TRUNC expr "," DATE_PART ")"
+ _DATE_PART.1: "date_part("i
+ fdate_part: _DATE_PART expr "," DATE_PART ")"
+ _DATE_ADD.1: "date_add("i
+ fdate_add: _DATE_ADD expr "," DATE_PART "," int_lit ")"
+ _DATE_DIFF.1: "date_diff("i
+ fdate_diff: _DATE_DIFF expr "," expr "," DATE_PART ")"

  _date_functions: fdate | fdate_add | fdate_diff | fdatetime | ftimestamp | fsecond | fminute | fhour | fday | fday_of_week | fweek | fmonth | fquarter | fyear | fdate_part | fdate_trunc

@@ -288,8 +305,8 @@

  map_lit: "{" (literal ":" literal ",")* literal ":" literal ","? "}"

-
- struct_lit: "struct" "(" (IDENTIFIER "=" literal ",")* IDENTIFIER "=" literal ","? ")"
+ _STRUCT.1: "struct("i
+ struct_lit: _STRUCT (IDENTIFIER "=" literal ",")* IDENTIFIER "=" literal ","? ")"

  !bool_lit: "True"i | "False"i