llm-to-toon 0.0.0__py3-none-any.whl → 0.0.32__py3-none-any.whl

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
@@ -1,36 +1,36 @@
- Metadata-Version: 2.4
- Name: llm-to-toon
- Version: 0.0.0
- Summary: Tiny wrapper exposing Prompture helpers to convert LLM output into TOON.
- Author-email: Juan Denis <juan@vene.co>
- License: MIT
- Keywords: llm,toon,prompt,structured-output
- Requires-Python: >=3.10
- Description-Content-Type: text/markdown
- Requires-Dist: prompture>=0.0.1
-
- # llm-to-toon
-
- Tiny wrapper around `prompture` that returns [TOON](https://github.com/jmorganca/python-toon)
- (Token-Oriented Object Notation) instead of JSON. Under the hood it uses
- `prompture.extract_and_jsonify(..., output_format="toon")` and converts the result
- into the ultra-compact TOON text automatically.
-
- Install:
- ```bash
- pip install llm-to-toon
- ```
-
- Usage:
-
- ```python
- from llm_to_toon import from_llm_text
-
- schema = {"name": "string", "age": "int"}
- toon_text = from_llm_text("Name: Juan Age: 30", schema)
- print(toon_text)
- ```
-
- By default the helper spins up the local Ollama driver (`gemma:latest`). Pass your
- own Prompture driver if you want to call OpenAI, Azure, Groq, etc. For the full
- Prompture feature-set see the main project: https://github.com/jhd3197/prompture
+ Metadata-Version: 2.4
+ Name: llm-to-toon
+ Version: 0.0.32
+ Summary: Tiny wrapper exposing Prompture helpers to convert LLM output into TOON.
+ Author-email: Juan Denis <juan@vene.co>
+ License: MIT
+ Keywords: llm,toon,prompt,structured-output
+ Requires-Python: >=3.10
+ Description-Content-Type: text/markdown
+ Requires-Dist: prompture>=0.0.32
+
+ # llm-to-toon
+
+ Tiny wrapper around `prompture` that returns [TOON](https://github.com/jmorganca/python-toon)
+ (Token-Oriented Object Notation) instead of JSON. Under the hood it uses
+ `prompture.extract_and_jsonify(..., output_format="toon")` and converts the result
+ into the ultra-compact TOON text automatically.
+
+ Install:
+ ```bash
+ pip install llm-to-toon
+ ```
+
+ Usage:
+
+ ```python
+ from llm_to_toon import from_llm_text
+
+ schema = {"name": "string", "age": "int"}
+ toon_text = from_llm_text("Name: Juan Age: 30", schema)
+ print(toon_text)
+ ```
+
+ By default the helper spins up the local Ollama driver (`gemma:latest`). Pass your
+ own Prompture driver if you want to call OpenAI, Azure, Groq, etc. For the full
+ Prompture feature-set see the main project: https://github.com/jhd3197/prompture
@@ -0,0 +1,5 @@
+ llm_to_toon/__init__.py,sha256=g-BGDYEAZZNB7ZcfSLpMrrMWkV2XApxVa4EoPveBQXE,951
+ llm_to_toon-0.0.32.dist-info/METADATA,sha256=ktgLQJ5dH7zU-CnCJflPM0lWf3CFIppmNokXcMyOzYU,1114
+ llm_to_toon-0.0.32.dist-info/WHEEL,sha256=_zCd3N1l69ArxyTb8rzEoP9TpbYXkqRFSNOD5OuxnTs,91
+ llm_to_toon-0.0.32.dist-info/top_level.txt,sha256=HgVaTsdKTHcoU9RYQcGqeFWh2ynqXZq_-1TZVkjcek4,12
+ llm_to_toon-0.0.32.dist-info/RECORD,,
@@ -1,5 +0,0 @@
- llm_to_toon/__init__.py,sha256=g-BGDYEAZZNB7ZcfSLpMrrMWkV2XApxVa4EoPveBQXE,951
- llm_to_toon-0.0.0.dist-info/METADATA,sha256=XViq9asm4xQOkXGQxbpcr4NfePYUTtD7XZfGrR8JJis,1148
- llm_to_toon-0.0.0.dist-info/WHEEL,sha256=_zCd3N1l69ArxyTb8rzEoP9TpbYXkqRFSNOD5OuxnTs,91
- llm_to_toon-0.0.0.dist-info/top_level.txt,sha256=HgVaTsdKTHcoU9RYQcGqeFWh2ynqXZq_-1TZVkjcek4,12
- llm_to_toon-0.0.0.dist-info/RECORD,,