pop-python 1.0.0__tar.gz

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
@@ -0,0 +1,21 @@
+ MIT License
+
+ Copyright (c) 2024 sgt1796
+
+ Permission is hereby granted, free of charge, to any person obtaining a copy
+ of this software and associated documentation files (the "Software"), to deal
+ in the Software without restriction, including without limitation the rights
+ to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
+ copies of the Software, and to permit persons to whom the Software is
+ furnished to do so, subject to the following conditions:
+
+ The above copyright notice and this permission notice shall be included in all
+ copies or substantial portions of the Software.
+
+ THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
+ IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
+ FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
+ AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
+ LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
+ OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
+ SOFTWARE.
@@ -0,0 +1,3 @@
+ include README.md
+ recursive-include POP/prompts *.md
+ recursive-include POP/schemas *.json
@@ -0,0 +1,345 @@
+ Metadata-Version: 2.4
+ Name: pop-python
+ Version: 1.0.0
+ Summary: Prompt Oriented Programming (POP): reusable, composable prompt functions for LLMs.
+ Home-page: https://github.com/sgt1796/POP
+ Author: Guotai Shen
+ Author-email: sgt1796@gmail.com
+ Classifier: Development Status :: 3 - Alpha
+ Classifier: Programming Language :: Python :: 3
+ Classifier: License :: OSI Approved :: MIT License
+ Classifier: Operating System :: OS Independent
+ Requires-Python: >=3.8
+ Description-Content-Type: text/markdown
+ License-File: LICENSE
+ Requires-Dist: openai>=1.0.0
+ Requires-Dist: requests>=2.25.0
+ Requires-Dist: python-dotenv
+ Requires-Dist: pydantic>=1.10
+ Requires-Dist: transformers>=4.30.0
+ Requires-Dist: numpy>=1.21
+ Requires-Dist: backoff
+ Requires-Dist: Pillow>=9.0
+ Requires-Dist: google-genai>=0.2.0
+ Dynamic: author
+ Dynamic: author-email
+ Dynamic: classifier
+ Dynamic: description
+ Dynamic: description-content-type
+ Dynamic: home-page
+ Dynamic: license-file
+ Dynamic: requires-dist
+ Dynamic: requires-python
+ Dynamic: summary
+
+ # Prompt Oriented Programming (POP)
+
+ ```python
+ from POP import PromptFunction
+
+ pf = PromptFunction(
+     prompt="Draw a simple ASCII art of <<<object>>>.",
+     client="openai"
+ )
+
+ print(pf.execute(object="a cat"))
+ print(pf.execute(object="a rocket"))
+ ```
+
+ ```
+  /\_/\
+ ( o.o )
+  > ^ <
+
+    /\
+   /  \
+  /    \
+  |    |
+  |    |
+ ```
+
+ ---
+
+ Reusable, composable prompt functions for LLM workflows.
+
+ This release cleans up the architecture, moves all LLM client logic into a separate `LLMClient` module, and extends multi-LLM backend support.
+
+ PyPI:
+ [https://pypi.org/project/pop-python/](https://pypi.org/project/pop-python/)
+
+ GitHub:
+ [https://github.com/sgt1796/POP](https://github.com/sgt1796/POP)
+
+ ---
+
+ ## Table of Contents
+
+ 1. [Overview](#1-overview)
+ 2. [Major Updates](#2-major-updates)
+ 3. [Features](#3-features)
+ 4. [Installation](#4-installation)
+ 5. [Setup](#5-setup)
+ 6. [PromptFunction](#6-promptfunction)
+    * Placeholders
+    * Reserved Keywords
+    * Executing prompts
+    * Improving prompts
+ 7. [Function Schema Generation](#7-function-schema-generation)
+ 8. [Embeddings](#8-embeddings)
+ 9. [Web Snapshot Utility](#9-web-snapshot-utility)
+ 10. [Examples](#10-examples)
+ 11. [Contributing](#11-contributing)
+
+ ---
+
+ # 1. Overview
+
+ Prompt Oriented Programming (POP) is a lightweight framework for building reusable, parameterized prompt functions.
+ Instead of scattering prompt strings across your codebase, POP lets you:
+
+ * encapsulate prompts as objects
+ * pass parameters cleanly via placeholders
+ * select a backend LLM client dynamically
+ * improve prompts using meta-prompting
+ * generate OpenAI-compatible function schemas
+ * use unified embedding tools
+ * work with multiple LLM providers through `LLMClient` subclasses
+
+ POP is designed to be simple, extensible, and production-friendly.
+
+ ---
+
+ # 2. Major Updates
+
+ This version introduces structural and functional improvements:
+
+ ### 2.1. LLMClient moved into its own module
+
+ `LLMClient.py` now holds all LLM backends:
+
+ * OpenAI
+ * Gemini
+ * Deepseek
+ * Doubao
+ * Local PyTorch stub
+ * Extensible architecture for adding new backends
+
+ ### 2.2. Expanded multi-LLM support
+
+ Each backend now has consistent interface behavior and multimodal (text + image) support where applicable.
+
+ ---
+
+ # 3. Features
+
+ * **Reusable Prompt Functions**
+   Use `<<<placeholder>>>` syntax to inject dynamic content.
+
+ * **Multi-LLM Backend**
+   Choose between OpenAI, Gemini, Deepseek, Doubao, or local models.
+
+ * **Prompt Improvement**
+   Improve or rewrite prompts using Fabric-style metaprompts.
+
+ * **Function Schema Generation**
+   Convert natural-language descriptions into OpenAI function schemas.
+
+ * **Unified Embedding Interface**
+   Supports OpenAI embeddings, Jina AI embeddings, and local HuggingFace models.
+
+ * **Webpage Snapshot Utility**
+   Convert any URL into structured text using r.jina.ai, with optional image captioning.
+
+ ---
+
+ # 4. Installation
+
+ Install from PyPI:
+
+ ```bash
+ pip install pop-python
+ ```
+
+ Or install in development mode from GitHub:
+
+ ```bash
+ git clone https://github.com/sgt1796/POP.git
+ cd POP
+ pip install -e .
+ ```
+
+ ---
+
+ # 5. Setup
+
+ Create a `.env` file in your project root:
+
+ ```ini
+ OPENAI_API_KEY=your_openai_key
+ GEMINI_API_KEY=your_gcp_gemini_key
+ DEEPSEEK_API_KEY=your_deepseek_key
+ DOUBAO_API_KEY=your_volcengine_key
+ JINAAI_API_KEY=your_jina_key
+ ```
+
+ All clients automatically read keys from environment variables.
+
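The package declares `python-dotenv` as a dependency, which handles `.env` loading. Conceptually, loading a `.env` file just amounts to putting `KEY=VALUE` pairs into the process environment; a minimal stdlib-only sketch (the helper name is hypothetical, not the package's API):

```python
import os

def load_env_file(path=".env"):
    """Minimal .env loader: KEY=VALUE lines; blank lines and '#' comments ignored.

    Existing environment variables are not overwritten.
    """
    with open(path) as fh:
        for line in fh:
            line = line.strip()
            if not line or line.startswith("#") or "=" not in line:
                continue
            key, _, value = line.partition("=")
            os.environ.setdefault(key.strip(), value.strip())
```

After loading, the clients can pick the keys up with ordinary `os.getenv` calls.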
+ ---
+
+ # 6. PromptFunction
+
+ The core abstraction of POP is the `PromptFunction` class.
+
+ ```python
+ from POP import PromptFunction
+
+ pf = PromptFunction(
+     sys_prompt="You are a helpful AI.",
+     prompt="Give me a summary about <<<topic>>>."
+ )
+
+ print(pf.execute(topic="quantum biology"))
+ ```
+
+ ---
+
+ ## 6.1. Placeholder Syntax
+
+ Use triple angle brackets inside your prompt:
+
+ ```
+ <<<placeholder>>>
+ ```
+
+ These are replaced at execution time.
+
+ Example:
+
+ ```python
+ prompt = "Translate <<<sentence>>> to French."
+ ```
+
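Conceptually, placeholder filling is plain string substitution. A hedged sketch of what `execute()` does with its keyword arguments before calling the model (not the library's actual implementation):

```python
import re

def fill_placeholders(prompt, **kwargs):
    """Replace each <<<name>>> in the prompt with the matching keyword argument."""
    return re.sub(r"<<<(\w+)>>>", lambda m: kwargs[m.group(1)], prompt)

filled = fill_placeholders("Translate <<<sentence>>> to French.", sentence="good morning")
# filled == "Translate good morning to French."
```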
+ ---
+
+ ## 6.2. Reserved Keywords
+
+ Within `.execute()`, the following keyword arguments are **reserved** and should not be used as placeholder names:
+
+ * `model`
+ * `sys`
+ * `fmt`
+ * `tools`
+ * `temp`
+ * `images`
+ * `ADD_BEFORE`
+ * `ADD_AFTER`
+
+ Most of these set generation parameters; `ADD_BEFORE` and `ADD_AFTER` prepend or append the given string to the prompt.
+
+ ---
+
+ ## 6.3. Executing prompts
+
+ ```python
+ result = pf.execute(
+     topic="photosynthesis",
+     model="gpt-4o-mini",
+     temp=0.3
+ )
+ ```
+
+ ---
+
+ ## 6.4. Improving Prompts
+
+ You can ask POP to rewrite or enhance your system prompt:
+
+ ```python
+ better = pf._improve_prompt()
+ print(better)
+ ```
+
+ This uses a Fabric-inspired meta-prompt bundled in the `prompts/` directory.
+
+ ---
+
+ # 7. Function Schema Generation
+
+ POP supports generating **OpenAI function-calling schemas** from natural-language descriptions.
+
+ ```python
+ schema = pf.generate_schema(
+     description="Return the square and cube of a given integer."
+ )
+
+ print(schema)
+ ```
+
+ What this does:
+
+ * Applies a standard meta-prompt
+ * Uses the selected LLM backend
+ * Produces a valid JSON Schema for OpenAI function calling
+ * Optionally saves it under `functions/`
+
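The generated schema is ordinary JSON, so it can be checked and stored like any other file. A sketch of the save step; the schema shown is an assumption based on the OpenAI function-calling format, not output captured from the library:

```python
import json
from pathlib import Path

# Hypothetical schema of the kind generate_schema() would return.
schema = {
    "name": "square_and_cube",
    "description": "Return the square and cube of a given integer.",
    "parameters": {
        "type": "object",
        "properties": {"n": {"type": "integer"}},
        "required": ["n"],
    },
}

# Persist under functions/, one file per schema name.
out_dir = Path("functions")
out_dir.mkdir(exist_ok=True)
(out_dir / f"{schema['name']}.json").write_text(json.dumps(schema, indent=2))
```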
+ ---
+
+ # 8. Embeddings
+
+ POP includes a unified embedding interface:
+
+ ```python
+ from POP.Embedder import Embedder
+
+ embedder = Embedder(use_api="openai")
+ vecs = embedder.get_embedding(["hello world"])
+ ```
+
+ Supported modes:
+
+ * OpenAI embeddings
+ * Jina AI embeddings
+ * Local HuggingFace model embeddings (CPU/GPU)
+
+ Large inputs are chunked automatically when needed.
+
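Chunking long inputs before embedding is straightforward to reason about. A minimal sketch of fixed-size character chunking; the library's actual chunking strategy (token-based, overlap, etc.) may differ:

```python
def chunk_text(text, max_chars=1000):
    """Split text into consecutive pieces of at most max_chars characters."""
    return [text[i:i + max_chars] for i in range(0, len(text), max_chars)]

pieces = chunk_text("a" * 2500, max_chars=1000)
# -> 3 chunks of lengths 1000, 1000, 500
```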
+ ---
+
+ # 9. Web Snapshot Utility
+
+ ```python
+ from POP import get_text_snapshot
+
+ text = get_text_snapshot("https://example.com", image_caption=True)
+ print(text[:500])
+ ```
+
+ Supports:
+
+ * optional image removal
+ * optional image captioning
+ * DOM selector filtering
+ * returning JSON or plain text
+
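Under the hood, the Jina reader service is a plain HTTP endpoint: prefixing the target URL with `https://r.jina.ai/` returns a text rendering of the page, and the `JINAAI_API_KEY` goes in an `Authorization` header. A sketch that builds (but does not send) such a request; the exact headers `get_text_snapshot` sets are an assumption:

```python
import urllib.request

def snapshot_request(url, api_key):
    """Build (but do not send) a request to the r.jina.ai reader endpoint."""
    return urllib.request.Request(
        "https://r.jina.ai/" + url,
        headers={"Authorization": f"Bearer {api_key}"},
    )

req = snapshot_request("https://example.com", "YOUR_JINAAI_API_KEY")
# req.full_url == "https://r.jina.ai/https://example.com"
```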
+ ---
+
+ # 10. Examples
+
+ ```python
+ from POP import PromptFunction
+
+ pf = PromptFunction(prompt="Give me 3 creative names for a <<<thing>>>.")
+
+ print(pf.execute(thing="robot"))
+ print(pf.execute(thing="new language"))
+ ```
+
+ ---
+
+ # 11. Contributing
+
+ Steps:
+
+ 1. Fork the GitHub repo
+ 2. Create a feature branch
+ 3. Add tests or examples
+ 4. Submit a PR with a clear explanation