ai-prompter 0.1.0__tar.gz → 0.2.1__tar.gz

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
@@ -0,0 +1 @@
+ 3.10
@@ -0,0 +1,7 @@
+ .PHONY: tag
+
+ tag:
+ 	@version=$$(grep '^version = ' pyproject.toml | sed 's/version = "\(.*\)"/\1/'); \
+ 	echo "Creating tag v$$version"; \
+ 	git tag "v$$version"; \
+ 	git push origin "v$$version"
@@ -1,11 +1,11 @@
  Metadata-Version: 2.4
  Name: ai-prompter
- Version: 0.1.0
+ Version: 0.2.1
  Summary: A prompt management library using Jinja2 templates to build complex prompts easily.
  Author-email: LUIS NOVO <lfnovo@gmail.com>
  License: MIT
  License-File: LICENSE
- Requires-Python: >=3.10.6
+ Requires-Python: >=3.10
  Requires-Dist: jinja2>=3.1.6
  Requires-Dist: pip>=25.0.1
  Requires-Dist: pydantic>=2.0
@@ -21,7 +21,7 @@ A prompt management library using Jinja2 templates to build complex prompts easi

  - Define prompts as Jinja templates.
  - Load default templates from `src/ai_prompter/prompts`.
- - Override templates via `PROMPT_PATH` environment variable.
+ - Override templates via `PROMPTS_PATH` environment variable.
  - Render prompts with arbitrary data or Pydantic models.
  - Export to LangChain `ChatPromptTemplate`.

@@ -51,34 +51,104 @@ uv add langchain_core
  Configure a custom template path by creating a `.env` file in the project root:

  ```dotenv
- PROMPT_PATH=path/to/custom/templates
+ PROMPTS_PATH=path/to/custom/templates
  ```

  ## Usage

+ ### Basic Usage
+
+ ```python
+ from ai_prompter import Prompter
+
+ # Initialize with a template name
+ prompter = Prompter('my_template')
+
+ # Render a prompt with variables
+ prompt = prompter.render({'variable': 'value'})
+ print(prompt)
+ ```
+
+ ### Custom Prompt Directory
+
+ You can specify a custom directory for your prompt templates using the `prompt_dir` parameter:
+
+ ```python
+ prompter = Prompter(template_text='Hello {{ name }}!', prompt_dir='/path/to/your/prompts')
+ ```
+
+ ### Using Environment Variable for Prompt Path
+
+ Set the `PROMPTS_PATH` environment variable to point to your custom prompts directory:
+
+ ```bash
+ export PROMPTS_PATH=/path/to/your/prompts
+ ```
+
+ The `Prompter` class will check this path if no custom directory is provided in the constructor. If not set, it will also look in the current working directory and `~/ai-prompter/` as fallback options before using the default package prompts.
+
  ### Raw text template

  ```python
  from ai_prompter import Prompter

  template = """Write an article about {{ topic }}."""
- prompter = Prompter(prompt_text=template)
+ prompter = Prompter(template_text=template)
  prompt = prompter.render({"topic": "AI"})
  print(prompt) # Write an article about AI.
  ```

  ### Using File-based Templates

- You can store your templates in files and reference them by name (without the `.jinja` extension). The library looks for templates in the `prompts` directory by default, or you can set a custom directory with the `PROMPT_PATH` environment variable.
+ You can store your templates in files and reference them by name (without the `.jinja` extension). The library looks for templates in the `prompts` directory by default, or you can set a custom directory with the `PROMPTS_PATH` environment variable. You can specify multiple directories separated by `:` (colon), and the library will search through them in order until a matching template is found.

  ```python
  from ai_prompter import Prompter

+ # Set multiple search paths
+ os.environ["PROMPTS_PATH"] = "/path/to/templates1:/path/to/templates2"
+
  prompter = Prompter(prompt_template="greet")
- prompt = prompter.render({"who": "Tester"})
- print(prompt) # GREET Tester
+ result = prompter.render({"name": "World"})
+ print(result) # Output depends on the content of greet.jinja in the first found path
+ ```
+
+ ### Using Raw Text Templates
+
+ Alternatively, you can provide the template content directly as raw text using the `template_text` parameter or the `from_text` class method.
+
+ ```python
+ from ai_prompter import Prompter
+
+ # Using template_text parameter
+ prompter = Prompter(template_text="Hello, {{ name }}!")
+ result = prompter.render({"name": "World"})
+ print(result) # Output: Hello, World!
+
+ # Using from_text class method
+ prompter = Prompter.from_text("Hi, {{ person }}!", model="gpt-4")
+ result = prompter.render({"person": "Alice"})
+ print(result) # Output: Hi, Alice!
+ ```
+
+ ### LangChain Integration
+
+ You can convert your prompts to LangChain's `ChatPromptTemplate` format for use in LangChain workflows. This works for both text-based and file-based templates.
+
+ ```python
+ from ai_prompter import Prompter
+
+ # With text-based template
+ text_prompter = Prompter(template_text="Hello, {{ name }}!")
+ lc_text_prompt = text_prompter.to_langchain()
+
+ # With file-based template
+ file_prompter = Prompter(prompt_template="greet")
+ lc_file_prompt = file_prompter.to_langchain()
  ```

+ **Note**: LangChain integration requires the `langchain-core` package. Install it with `pip install .[langchain]`.
+
  ### Including Other Templates

  You can include other template files within a template using Jinja2's `{% include %}` directive. This allows you to build modular templates.
@@ -138,7 +208,7 @@ The library also automatically provides a `current_time` variable with the curre
  ```python
  from ai_prompter import Prompter

- prompter = Prompter(prompt_text="Current time: {{current_time}}")
+ prompter = Prompter(template_text="Current time: {{current_time}}")
  prompt = prompter.render()
  print(prompt) # Current time: 2025-04-19 23:28:00
  ```
@@ -159,16 +229,6 @@ prompt = prompter.render({"topic": "AI"})
  print(prompt)
  ```

- ### LangChain integration
-
- ```python
- from ai_prompter import Prompter
-
- prompter = Prompter(prompt_template="article")
- lc_template = prompter.to_langchain()
- # use lc_template in LangChain chains
- ```
-
  ### Jupyter Notebook

  See `notebooks/prompter_usage.ipynb` for interactive examples.
@@ -6,7 +6,7 @@ A prompt management library using Jinja2 templates to build complex prompts easi

  - Define prompts as Jinja templates.
  - Load default templates from `src/ai_prompter/prompts`.
- - Override templates via `PROMPT_PATH` environment variable.
+ - Override templates via `PROMPTS_PATH` environment variable.
  - Render prompts with arbitrary data or Pydantic models.
  - Export to LangChain `ChatPromptTemplate`.

@@ -36,34 +36,104 @@ uv add langchain_core
  Configure a custom template path by creating a `.env` file in the project root:

  ```dotenv
- PROMPT_PATH=path/to/custom/templates
+ PROMPTS_PATH=path/to/custom/templates
  ```

  ## Usage

+ ### Basic Usage
+
+ ```python
+ from ai_prompter import Prompter
+
+ # Initialize with a template name
+ prompter = Prompter('my_template')
+
+ # Render a prompt with variables
+ prompt = prompter.render({'variable': 'value'})
+ print(prompt)
+ ```
+
+ ### Custom Prompt Directory
+
+ You can specify a custom directory for your prompt templates using the `prompt_dir` parameter:
+
+ ```python
+ prompter = Prompter(template_text='Hello {{ name }}!', prompt_dir='/path/to/your/prompts')
+ ```
+
+ ### Using Environment Variable for Prompt Path
+
+ Set the `PROMPTS_PATH` environment variable to point to your custom prompts directory:
+
+ ```bash
+ export PROMPTS_PATH=/path/to/your/prompts
+ ```
+
+ The `Prompter` class will check this path if no custom directory is provided in the constructor. If not set, it will also look in the current working directory and `~/ai-prompter/` as fallback options before using the default package prompts.
+
  ### Raw text template

  ```python
  from ai_prompter import Prompter

  template = """Write an article about {{ topic }}."""
- prompter = Prompter(prompt_text=template)
+ prompter = Prompter(template_text=template)
  prompt = prompter.render({"topic": "AI"})
  print(prompt) # Write an article about AI.
  ```

  ### Using File-based Templates

- You can store your templates in files and reference them by name (without the `.jinja` extension). The library looks for templates in the `prompts` directory by default, or you can set a custom directory with the `PROMPT_PATH` environment variable.
+ You can store your templates in files and reference them by name (without the `.jinja` extension). The library looks for templates in the `prompts` directory by default, or you can set a custom directory with the `PROMPTS_PATH` environment variable. You can specify multiple directories separated by `:` (colon), and the library will search through them in order until a matching template is found.

  ```python
  from ai_prompter import Prompter

+ # Set multiple search paths
+ os.environ["PROMPTS_PATH"] = "/path/to/templates1:/path/to/templates2"
+
  prompter = Prompter(prompt_template="greet")
- prompt = prompter.render({"who": "Tester"})
- print(prompt) # GREET Tester
+ result = prompter.render({"name": "World"})
+ print(result) # Output depends on the content of greet.jinja in the first found path
+ ```
+
+ ### Using Raw Text Templates
+
+ Alternatively, you can provide the template content directly as raw text using the `template_text` parameter or the `from_text` class method.
+
+ ```python
+ from ai_prompter import Prompter
+
+ # Using template_text parameter
+ prompter = Prompter(template_text="Hello, {{ name }}!")
+ result = prompter.render({"name": "World"})
+ print(result) # Output: Hello, World!
+
+ # Using from_text class method
+ prompter = Prompter.from_text("Hi, {{ person }}!", model="gpt-4")
+ result = prompter.render({"person": "Alice"})
+ print(result) # Output: Hi, Alice!
+ ```
+
+ ### LangChain Integration
+
+ You can convert your prompts to LangChain's `ChatPromptTemplate` format for use in LangChain workflows. This works for both text-based and file-based templates.
+
+ ```python
+ from ai_prompter import Prompter
+
+ # With text-based template
+ text_prompter = Prompter(template_text="Hello, {{ name }}!")
+ lc_text_prompt = text_prompter.to_langchain()
+
+ # With file-based template
+ file_prompter = Prompter(prompt_template="greet")
+ lc_file_prompt = file_prompter.to_langchain()
  ```

+ **Note**: LangChain integration requires the `langchain-core` package. Install it with `pip install .[langchain]`.
+
  ### Including Other Templates

  You can include other template files within a template using Jinja2's `{% include %}` directive. This allows you to build modular templates.
@@ -123,7 +193,7 @@ The library also automatically provides a `current_time` variable with the curre
  ```python
  from ai_prompter import Prompter

- prompter = Prompter(prompt_text="Current time: {{current_time}}")
+ prompter = Prompter(template_text="Current time: {{current_time}}")
  prompt = prompter.render()
  print(prompt) # Current time: 2025-04-19 23:28:00
  ```
@@ -144,16 +214,6 @@ prompt = prompter.render({"topic": "AI"})
  print(prompt)
  ```

- ### LangChain integration
-
- ```python
- from ai_prompter import Prompter
-
- prompter = Prompter(prompt_template="article")
- lc_template = prompter.to_langchain()
- # use lc_template in LangChain chains
- ```
-
  ### Jupyter Notebook

  See `notebooks/prompter_usage.ipynb` for interactive examples.
@@ -0,0 +1,227 @@
+ {
+  "cells": [
+   {
+    "cell_type": "markdown",
+    "metadata": {},
+    "source": [
+     "# AI Prompter Usage"
+    ]
+   },
+   {
+    "cell_type": "markdown",
+    "metadata": {},
+    "source": [
+     "## Overriding the PROMPTS_PATH"
+    ]
+   },
+   {
+    "cell_type": "code",
+    "execution_count": 1,
+    "metadata": {},
+    "outputs": [],
+    "source": [
+     "from ai_prompter import Prompter\n",
+     "import os\n",
+     "from pathlib import Path\n",
+     "\n",
+     "os.environ['PROMPTS_PATH'] = str(Path('prompts').resolve())\n"
+    ]
+   },
+   {
+    "cell_type": "markdown",
+    "metadata": {},
+    "source": [
+     "## Using Text Templates"
+    ]
+   },
+   {
+    "cell_type": "code",
+    "execution_count": 2,
+    "metadata": {},
+    "outputs": [
+     {
+      "data": {
+       "text/plain": [
+        "'Write an article about AI.'"
+       ]
+      },
+      "execution_count": 2,
+      "metadata": {},
+      "output_type": "execute_result"
+     }
+    ],
+    "source": [
+     "from ai_prompter import Prompter\n",
+     "\n",
+     "template = \"\"\"Write an article about {{topic}}.\"\"\"\n",
+     "\n",
+     "prompter = Prompter(template_text=template)\n",
+     "\n",
+     "prompt = prompter.render(dict(topic=\"AI\"))\n",
+     "\n",
+     "prompt"
+    ]
+   },
+   {
+    "cell_type": "markdown",
+    "metadata": {},
+    "source": [
+     "## Using File Templates"
+    ]
+   },
+   {
+    "cell_type": "code",
+    "execution_count": 3,
+    "metadata": {},
+    "outputs": [
+     {
+      "data": {
+       "text/plain": [
+        "'Write an article about AI.'"
+       ]
+      },
+      "execution_count": 3,
+      "metadata": {},
+      "output_type": "execute_result"
+     }
+    ],
+    "source": [
+     "prompter = Prompter(prompt_template=\"article\") #will look for article.jinja in the prompts directory\n",
+     "\n",
+     "\n",
+     "prompt = prompter.render(dict(topic=\"AI\"))\n",
+     "\n",
+     "prompt"
+    ]
+   },
+   {
+    "cell_type": "markdown",
+    "metadata": {},
+    "source": [
+     "## Using Includes and Ifs"
+    ]
+   },
+   {
+    "cell_type": "code",
+    "execution_count": 4,
+    "metadata": {},
+    "outputs": [
+     {
+      "data": {
+       "text/plain": [
+        "'This is the outer file \\n\\nThis is the inner file\\n\\nValue: a\\n\\n\\n    You selected A\\n\\n\\nThis is the end of the outer file'"
+       ]
+      },
+      "execution_count": 4,
+      "metadata": {},
+      "output_type": "execute_result"
+     }
+    ],
+    "source": [
+     "prompter = Prompter(prompt_template=\"outer\")\n",
+     "\n",
+     "prompt = prompter.render(dict(type=\"a\"))\n",
+     "\n",
+     "prompt"
+    ]
+   },
+   {
+    "cell_type": "markdown",
+    "metadata": {},
+    "source": [
+     "## Langchain compatibility\n",
+     "\n",
+     "Returns a Langchain ChatPromptTemplate"
+    ]
+   },
+   {
+    "cell_type": "code",
+    "execution_count": 5,
+    "metadata": {},
+    "outputs": [
+     {
+      "data": {
+       "text/plain": [
+        "ChatPromptTemplate(input_variables=['topic'], input_types={}, partial_variables={}, messages=[HumanMessagePromptTemplate(prompt=PromptTemplate(input_variables=['topic'], input_types={}, partial_variables={}, template='Write an article about {{topic}}.', template_format='jinja2'), additional_kwargs={})])"
+       ]
+      },
+      "execution_count": 5,
+      "metadata": {},
+      "output_type": "execute_result"
+     }
+    ],
+    "source": [
+     "template = \"\"\"Write an article about {{topic}}.\"\"\"\n",
+     "\n",
+     "prompter = Prompter(template_text=template)\n",
+     "\n",
+     "lc_prompt = prompter.to_langchain()\n",
+     "\n",
+     "lc_prompt"
+    ]
+   },
+   {
+    "cell_type": "code",
+    "execution_count": 6,
+    "metadata": {},
+    "outputs": [
+     {
+      "data": {
+       "text/plain": [
+        "ChatPromptTemplate(input_variables=['type'], input_types={}, partial_variables={}, messages=[HumanMessagePromptTemplate(prompt=PromptTemplate(input_variables=['type'], input_types={}, partial_variables={}, template=\"This is the outer file \\n\\nThis is the inner file\\n\\nValue: {{ type }}\\n\\n{% if type == 'a' %}\\n    You selected A\\n{% else %}\\n    You didn't select A\\n{% endif %}\\n\\n\\nThis is the end of the outer file\\n\", template_format='jinja2'), additional_kwargs={})])"
+       ]
+      },
+      "execution_count": 6,
+      "metadata": {},
+      "output_type": "execute_result"
+     }
+    ],
+    "source": [
+     "prompter = Prompter(prompt_template=\"outer\")\n",
+     "lc_prompt = prompter.to_langchain()\n",
+     "lc_prompt"
+    ]
+   },
+   {
+    "cell_type": "code",
+    "execution_count": 7,
+    "metadata": {},
+    "outputs": [
+     {
+      "data": {
+       "text/plain": [
+        "ChatPromptValue(messages=[HumanMessage(content='This is the outer file \\n\\nThis is the inner file\\n\\nValue: a\\n\\n\\n    You selected A\\n\\n\\n\\nThis is the end of the outer file', additional_kwargs={}, response_metadata={})])"
+       ]
+      },
+      "execution_count": 7,
+      "metadata": {},
+      "output_type": "execute_result"
+     }
+    ],
+    "source": [
+     "lc_prompt.format_prompt(type=\"a\")"
+    ]
+   }
+  ],
+  "metadata": {
+   "kernelspec": {
+    "display_name": ".venv",
+    "language": "python",
+    "name": "python3"
+   },
+   "language_info": {
+    "codemirror_mode": {
+     "name": "ipython",
+     "version": 3
+    },
+    "file_extension": ".py",
+    "mimetype": "text/x-python",
+    "name": "python",
+    "nbconvert_exporter": "python",
+    "pygments_lexer": "ipython3",
+    "version": "3.10.6"
+   }
+  },
+  "nbformat": 4,
+  "nbformat_minor": 2
+ }
@@ -1,5 +1,7 @@
  This is the inner file

+ Value: {{ type }}
+
  {% if type == 'a' %}
  You selected A
  {% else %}
@@ -1,13 +1,13 @@
  [project]
  name = "ai-prompter"
- version = "0.1.0"
+ version = "0.2.1"
  description = "A prompt management library using Jinja2 templates to build complex prompts easily."
  readme = "README.md"
  homepage = "https://github.com/lfnovo/ai-prompter"
  authors = [
      { name = "LUIS NOVO", email = "lfnovo@gmail.com" }
  ]
- requires-python = ">=3.10.6"
+ requires-python = ">=3.10"
  dependencies = [
      "jinja2>=3.1.6",
      "pip>=25.0.1",