bsy-clippy 0.1.0 (tar.gz)
This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
- bsy_clippy-0.1.0/LICENSE +339 -0
- bsy_clippy-0.1.0/PKG-INFO +189 -0
- bsy_clippy-0.1.0/README.md +178 -0
- bsy_clippy-0.1.0/pyproject.toml +25 -0
- bsy_clippy-0.1.0/setup.cfg +4 -0
- bsy_clippy-0.1.0/src/bsy_clippy/__init__.py +5 -0
- bsy_clippy-0.1.0/src/bsy_clippy/cli.py +409 -0
- bsy_clippy-0.1.0/src/bsy_clippy/data/bsy-clippy.txt +2 -0
- bsy_clippy-0.1.0/src/bsy_clippy.egg-info/PKG-INFO +189 -0
- bsy_clippy-0.1.0/src/bsy_clippy.egg-info/SOURCES.txt +12 -0
- bsy_clippy-0.1.0/src/bsy_clippy.egg-info/dependency_links.txt +1 -0
- bsy_clippy-0.1.0/src/bsy_clippy.egg-info/entry_points.txt +2 -0
- bsy_clippy-0.1.0/src/bsy_clippy.egg-info/requires.txt +1 -0
- bsy_clippy-0.1.0/src/bsy_clippy.egg-info/top_level.txt +1 -0
bsy_clippy-0.1.0/LICENSE
ADDED
@@ -0,0 +1,339 @@
GNU GENERAL PUBLIC LICENSE
Version 2, June 1991

Copyright (C) 1989, 1991 Free Software Foundation, Inc.,
51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA
Everyone is permitted to copy and distribute verbatim copies
of this license document, but changing it is not allowed.

Preamble

The licenses for most software are designed to take away your
freedom to share and change it. By contrast, the GNU General Public
License is intended to guarantee your freedom to share and change free
software--to make sure the software is free for all its users. This
General Public License applies to most of the Free Software
Foundation's software and to any other program whose authors commit to
using it. (Some other Free Software Foundation software is covered by
the GNU Lesser General Public License instead.) You can apply it to
your programs, too.

When we speak of free software, we are referring to freedom, not
price. Our General Public Licenses are designed to make sure that you
have the freedom to distribute copies of free software (and charge for
this service if you wish), that you receive source code or can get it
if you want it, that you can change the software or use pieces of it
in new free programs; and that you know you can do these things.

To protect your rights, we need to make restrictions that forbid
anyone to deny you these rights or to ask you to surrender the rights.
These restrictions translate to certain responsibilities for you if you
distribute copies of the software, or if you modify it.

For example, if you distribute copies of such a program, whether
gratis or for a fee, you must give the recipients all the rights that
you have. You must make sure that they, too, receive or can get the
source code. And you must show them these terms so they know their
rights.

We protect your rights with two steps: (1) copyright the software, and
(2) offer you this license which gives you legal permission to copy,
distribute and/or modify the software.

Also, for each author's protection and ours, we want to make certain
that everyone understands that there is no warranty for this free
software. If the software is modified by someone else and passed on, we
want its recipients to know that what they have is not the original, so
that any problems introduced by others will not reflect on the original
authors' reputations.

Finally, any free program is threatened constantly by software
patents. We wish to avoid the danger that redistributors of a free
program will individually obtain patent licenses, in effect making the
program proprietary. To prevent this, we have made it clear that any
patent must be licensed for everyone's free use or not licensed at all.

The precise terms and conditions for copying, distribution and
modification follow.

GNU GENERAL PUBLIC LICENSE
TERMS AND CONDITIONS FOR COPYING, DISTRIBUTION AND MODIFICATION

0. This License applies to any program or other work which contains
a notice placed by the copyright holder saying it may be distributed
under the terms of this General Public License. The "Program", below,
refers to any such program or work, and a "work based on the Program"
means either the Program or any derivative work under copyright law:
that is to say, a work containing the Program or a portion of it,
either verbatim or with modifications and/or translated into another
language. (Hereinafter, translation is included without limitation in
the term "modification".) Each licensee is addressed as "you".

Activities other than copying, distribution and modification are not
covered by this License; they are outside its scope. The act of
running the Program is not restricted, and the output from the Program
is covered only if its contents constitute a work based on the
Program (independent of having been made by running the Program).
Whether that is true depends on what the Program does.

1. You may copy and distribute verbatim copies of the Program's
source code as you receive it, in any medium, provided that you
conspicuously and appropriately publish on each copy an appropriate
copyright notice and disclaimer of warranty; keep intact all the
notices that refer to this License and to the absence of any warranty;
and give any other recipients of the Program a copy of this License
along with the Program.

You may charge a fee for the physical act of transferring a copy, and
you may at your option offer warranty protection in exchange for a fee.

2. You may modify your copy or copies of the Program or any portion
of it, thus forming a work based on the Program, and copy and
distribute such modifications or work under the terms of Section 1
above, provided that you also meet all of these conditions:

a) You must cause the modified files to carry prominent notices
stating that you changed the files and the date of any change.

b) You must cause any work that you distribute or publish, that in
whole or in part contains or is derived from the Program or any
part thereof, to be licensed as a whole at no charge to all third
parties under the terms of this License.

c) If the modified program normally reads commands interactively
when run, you must cause it, when started running for such
interactive use in the most ordinary way, to print or display an
announcement including an appropriate copyright notice and a
notice that there is no warranty (or else, saying that you provide
a warranty) and that users may redistribute the program under
these conditions, and telling the user how to view a copy of this
License. (Exception: if the Program itself is interactive but
does not normally print such an announcement, your work based on
the Program is not required to print an announcement.)

These requirements apply to the modified work as a whole. If
identifiable sections of that work are not derived from the Program,
and can be reasonably considered independent and separate works in
themselves, then this License, and its terms, do not apply to those
sections when you distribute them as separate works. But when you
distribute the same sections as part of a whole which is a work based
on the Program, the distribution of the whole must be on the terms of
this License, whose permissions for other licensees extend to the
entire whole, and thus to each and every part regardless of who wrote it.

Thus, it is not the intent of this section to claim rights or contest
your rights to work written entirely by you; rather, the intent is to
exercise the right to control the distribution of derivative or
collective works based on the Program.

In addition, mere aggregation of another work not based on the Program
with the Program (or with a work based on the Program) on a volume of
a storage or distribution medium does not bring the other work under
the scope of this License.

3. You may copy and distribute the Program (or a work based on it,
under Section 2) in object code or executable form under the terms of
Sections 1 and 2 above provided that you also do one of the following:

a) Accompany it with the complete corresponding machine-readable
source code, which must be distributed under the terms of Sections
1 and 2 above on a medium customarily used for software interchange; or,

b) Accompany it with a written offer, valid for at least three
years, to give any third party, for a charge no more than your
cost of physically performing source distribution, a complete
machine-readable copy of the corresponding source code, to be
distributed under the terms of Sections 1 and 2 above on a medium
customarily used for software interchange; or,

c) Accompany it with the information you received as to the offer
to distribute corresponding source code. (This alternative is
allowed only for noncommercial distribution and only if you
received the program in object code or executable form with such
an offer, in accord with Subsection b above.)

The source code for a work means the preferred form of the work for
making modifications to it. For an executable work, complete source
code means all the source code for all modules it contains, plus any
associated interface definition files, plus the scripts used to
control compilation and installation of the executable. However, as a
special exception, the source code distributed need not include
anything that is normally distributed (in either source or binary
form) with the major components (compiler, kernel, and so on) of the
operating system on which the executable runs, unless that component
itself accompanies the executable.

If distribution of executable or object code is made by offering
access to copy from a designated place, then offering equivalent
access to copy the source code from the same place counts as
distribution of the source code, even though third parties are not
compelled to copy the source along with the object code.

4. You may not copy, modify, sublicense, or distribute the Program
except as expressly provided under this License. Any attempt
otherwise to copy, modify, sublicense or distribute the Program is
void, and will automatically terminate your rights under this License.
However, parties who have received copies, or rights, from you under
this License will not have their licenses terminated so long as such
parties remain in full compliance.

5. You are not required to accept this License, since you have not
signed it. However, nothing else grants you permission to modify or
distribute the Program or its derivative works. These actions are
prohibited by law if you do not accept this License. Therefore, by
modifying or distributing the Program (or any work based on the
Program), you indicate your acceptance of this License to do so, and
all its terms and conditions for copying, distributing or modifying
the Program or works based on it.

6. Each time you redistribute the Program (or any work based on the
Program), the recipient automatically receives a license from the
original licensor to copy, distribute or modify the Program subject to
these terms and conditions. You may not impose any further
restrictions on the recipients' exercise of the rights granted herein.
You are not responsible for enforcing compliance by third parties to
this License.

7. If, as a consequence of a court judgment or allegation of patent
infringement or for any other reason (not limited to patent issues),
conditions are imposed on you (whether by court order, agreement or
otherwise) that contradict the conditions of this License, they do not
excuse you from the conditions of this License. If you cannot
distribute so as to satisfy simultaneously your obligations under this
License and any other pertinent obligations, then as a consequence you
may not distribute the Program at all. For example, if a patent
license would not permit royalty-free redistribution of the Program by
all those who receive copies directly or indirectly through you, then
the only way you could satisfy both it and this License would be to
refrain entirely from distribution of the Program.

If any portion of this section is held invalid or unenforceable under
any particular circumstance, the balance of the section is intended to
apply and the section as a whole is intended to apply in other
circumstances.

It is not the purpose of this section to induce you to infringe any
patents or other property right claims or to contest validity of any
such claims; this section has the sole purpose of protecting the
integrity of the free software distribution system, which is
implemented by public license practices. Many people have made
generous contributions to the wide range of software distributed
through that system in reliance on consistent application of that
system; it is up to the author/donor to decide if he or she is willing
to distribute software through any other system and a licensee cannot
impose that choice.

This section is intended to make thoroughly clear what is believed to
be a consequence of the rest of this License.

8. If the distribution and/or use of the Program is restricted in
certain countries either by patents or by copyrighted interfaces, the
original copyright holder who places the Program under this License
may add an explicit geographical distribution limitation excluding
those countries, so that distribution is permitted only in or among
countries not thus excluded. In such case, this License incorporates
the limitation as if written in the body of this License.

9. The Free Software Foundation may publish revised and/or new versions
of the General Public License from time to time. Such new versions will
be similar in spirit to the present version, but may differ in detail to
address new problems or concerns.

Each version is given a distinguishing version number. If the Program
specifies a version number of this License which applies to it and "any
later version", you have the option of following the terms and conditions
either of that version or of any later version published by the Free
Software Foundation. If the Program does not specify a version number of
this License, you may choose any version ever published by the Free Software
Foundation.

10. If you wish to incorporate parts of the Program into other free
programs whose distribution conditions are different, write to the author
to ask for permission. For software which is copyrighted by the Free
Software Foundation, write to the Free Software Foundation; we sometimes
make exceptions for this. Our decision will be guided by the two goals
of preserving the free status of all derivatives of our free software and
of promoting the sharing and reuse of software generally.

NO WARRANTY

11. BECAUSE THE PROGRAM IS LICENSED FREE OF CHARGE, THERE IS NO WARRANTY
FOR THE PROGRAM, TO THE EXTENT PERMITTED BY APPLICABLE LAW. EXCEPT WHEN
OTHERWISE STATED IN WRITING THE COPYRIGHT HOLDERS AND/OR OTHER PARTIES
PROVIDE THE PROGRAM "AS IS" WITHOUT WARRANTY OF ANY KIND, EITHER EXPRESSED
OR IMPLIED, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF
MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE. THE ENTIRE RISK AS
TO THE QUALITY AND PERFORMANCE OF THE PROGRAM IS WITH YOU. SHOULD THE
PROGRAM PROVE DEFECTIVE, YOU ASSUME THE COST OF ALL NECESSARY SERVICING,
REPAIR OR CORRECTION.

12. IN NO EVENT UNLESS REQUIRED BY APPLICABLE LAW OR AGREED TO IN WRITING
WILL ANY COPYRIGHT HOLDER, OR ANY OTHER PARTY WHO MAY MODIFY AND/OR
REDISTRIBUTE THE PROGRAM AS PERMITTED ABOVE, BE LIABLE TO YOU FOR DAMAGES,
INCLUDING ANY GENERAL, SPECIAL, INCIDENTAL OR CONSEQUENTIAL DAMAGES ARISING
OUT OF THE USE OR INABILITY TO USE THE PROGRAM (INCLUDING BUT NOT LIMITED
TO LOSS OF DATA OR DATA BEING RENDERED INACCURATE OR LOSSES SUSTAINED BY
YOU OR THIRD PARTIES OR A FAILURE OF THE PROGRAM TO OPERATE WITH ANY OTHER
PROGRAMS), EVEN IF SUCH HOLDER OR OTHER PARTY HAS BEEN ADVISED OF THE
POSSIBILITY OF SUCH DAMAGES.

END OF TERMS AND CONDITIONS

How to Apply These Terms to Your New Programs

If you develop a new program, and you want it to be of the greatest
possible use to the public, the best way to achieve this is to make it
free software which everyone can redistribute and change under these terms.

To do so, attach the following notices to the program. It is safest
to attach them to the start of each source file to most effectively
convey the exclusion of warranty; and each file should have at least
the "copyright" line and a pointer to where the full notice is found.

<one line to give the program's name and a brief idea of what it does.>
Copyright (C) <year> <name of author>

This program is free software; you can redistribute it and/or modify
it under the terms of the GNU General Public License as published by
the Free Software Foundation; either version 2 of the License, or
(at your option) any later version.

This program is distributed in the hope that it will be useful,
but WITHOUT ANY WARRANTY; without even the implied warranty of
MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
GNU General Public License for more details.

You should have received a copy of the GNU General Public License along
with this program; if not, write to the Free Software Foundation, Inc.,
51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA.

Also add information on how to contact you by electronic and paper mail.

If the program is interactive, make it output a short notice like this
when it starts in an interactive mode:

Gnomovision version 69, Copyright (C) year name of author
Gnomovision comes with ABSOLUTELY NO WARRANTY; for details type `show w'.
This is free software, and you are welcome to redistribute it
under certain conditions; type `show c' for details.

The hypothetical commands `show w' and `show c' should show the appropriate
parts of the General Public License. Of course, the commands you use may
be called something other than `show w' and `show c'; they could even be
mouse-clicks or menu items--whatever suits your program.

You should also get your employer (if you work as a programmer) or your
school, if any, to sign a "copyright disclaimer" for the program, if
necessary. Here is a sample; alter the names:

Yoyodyne, Inc., hereby disclaims all copyright interest in the program
`Gnomovision' (which makes passes at compilers) written by James Hacker.

<signature of Ty Coon>, 1 April 1989
Ty Coon, President of Vice

This General Public License does not permit incorporating your program into
proprietary programs. If your program is a subroutine library, you may
consider it more useful to permit linking proprietary applications with the
library. If this is what you want to do, use the GNU Lesser General
Public License instead of this License.
bsy_clippy-0.1.0/PKG-INFO
ADDED
@@ -0,0 +1,189 @@
Metadata-Version: 2.4
Name: bsy-clippy
Version: 0.1.0
Summary: Terminal client for interacting with an Ollama server
Author: Sebas
Requires-Python: >=3.8
Description-Content-Type: text/markdown
License-File: LICENSE
Requires-Dist: requests<3,>=2.28
Dynamic: license-file

# bsy-clippy

`bsy-clippy` is a lightweight Python client for interacting with an [Ollama](https://ollama.ai) server.

It supports both **batch (stdin) mode** for one-shot prompts and **interactive mode** for chatting directly in the terminal.
You can also load **system prompts** from a file to guide the LLM’s behavior.

---

## Features

- Connects to the Ollama API over HTTP (`/api/generate`).
- Defaults to:
  - IP: `172.20.0.100`
  - Port: `11434`
  - Model: `qwen3:1.7b`
  - Mode: `batch` (wait for full output)
- Bundled system prompt file that can be overridden with `--system-file`
- Configurable parameters:
  - `-i` / `--ip` → Ollama server IP
  - `-p` / `--port` → Ollama server port
  - `-M` / `--model` → model name
  - `-m` / `--mode` → output mode (`stream` or `batch`)
  - `-t` / `--temperature` → sampling temperature (default: `0.7`)
  - `-s` / `--system-file` → path to a text file with system instructions
  - `-u` / `--user-prompt` → extra user instructions prepended before the data payload
  - `-r` / `--memory-lines` → number of conversation lines to remember in interactive mode
  - `-c` / `--chat-after-stdin` → process stdin once, then drop into interactive chat
- Two modes of operation:
  - **Batch mode** (default) → waits until the answer is complete, then prints only the final result.
  - **Stream mode** → shows the response in real time; tokens appear as they are generated.
- Colored terminal output:
  - **Yellow** = streaming tokens (the model’s “thinking” in progress).
  - **Default terminal color** = final assembled answer.

---
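The request behind these features can be sketched roughly as follows. This is an illustrative sketch, not the package's actual `cli.py` code; the payload fields follow Ollama's documented generate API, and the defaults mirror the ones listed above:

```python
def build_generate_payload(model, prompt, system=None, temperature=0.7, stream=False):
    # JSON body for Ollama's POST /api/generate endpoint.
    payload = {
        "model": model,
        "prompt": prompt,
        "stream": stream,
        "options": {"temperature": temperature},
    }
    if system:
        payload["system"] = system
    return payload


def generate(prompt, ip="172.20.0.100", port=11434, model="qwen3:1.7b", **kwargs):
    # One-shot (batch) request: with stream=False, Ollama returns a single
    # JSON object whose "response" field holds the full answer.
    import requests  # declared dependency: requests<3,>=2.28

    url = f"http://{ip}:{port}/api/generate"
    resp = requests.post(url, json=build_generate_payload(model, prompt, **kwargs), timeout=120)
    resp.raise_for_status()
    return resp.json()["response"]
```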

## Installation

### pipx (recommended)

```bash
pipx install .
```

After updating the source, reinstall with `pipx reinstall bsy-clippy`.

### pip / virtual environments

```bash
pip install .
```

---

## Usage

### System prompt file

By default, `bsy-clippy` loads a bundled prompt (`Be very brief. Be very short.`).
You can change this with `--system-file` or disable it via `--no-default-system`.

Example **bsy-clippy.txt**:

```
You are a helpful assistant specialized in cybersecurity.
Always explain your reasoning clearly, and avoid unnecessary markdown formatting.
```

These lines will be sent to the LLM before every user prompt.

### User prompt parameter

Use `--user-prompt "Classify the following log:"` when piping data so the model receives:

```
system prompt (if any)

user prompt text

data from stdin or interactive input
```
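The stacking shown above amounts to joining the non-empty pieces with blank lines. A minimal sketch (the helper name is hypothetical, not necessarily how `cli.py` assembles it):

```python
def compose_prompt(data, user_prompt=None, system=None):
    # Stack system prompt, user prompt, and piped data in that order,
    # skipping any piece that is absent.
    parts = [p for p in (system, user_prompt, data) if p]
    return "\n\n".join(parts)
```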

### Interactive memory

Set `--memory-lines 6` (or `-r 6`) to keep the last six conversation lines (user + assistant) while chatting.
Only the final assistant reply (not the thinking traces) is stored and sent back on the next turn.

### Chat after stdin

Use `-c` / `--chat-after-stdin` to process piped data first and then remain in interactive mode with the response (and any configured memory) available:

```bash
cat sample.txt | bsy-clippy -u "Summarize this report" -r 6 -c
```

After the initial answer prints, you can continue the conversation while the tool remembers the piped data and the model’s reply.
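The memory behavior described above, a bounded window with thinking traces stripped before storing, can be modeled like this (names are illustrative, not the package's actual internals):

```python
import re
from collections import deque

THINK_RE = re.compile(r"<think>.*?</think>\s*", re.DOTALL)


def strip_thinking(reply):
    # Keep only the final answer; <think>...</think> traces are never remembered.
    return THINK_RE.sub("", reply).strip()


class ChatMemory:
    # Remembers at most `memory_lines` conversation lines (user + assistant);
    # older lines fall off the front of the deque automatically.
    def __init__(self, memory_lines):
        self.lines = deque(maxlen=memory_lines)

    def add(self, role, text):
        self.lines.append(f"{role}: {text}")

    def context(self):
        return "\n".join(self.lines)
```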

---

### Interactive mode (default = batch)

Run without piping input:

```bash
bsy-clippy
```

Example session in **batch mode**:

```
You: Hello!
Hello! How can I assist you today? 😊
```

To force **streaming mode**:

```bash
bsy-clippy --mode stream
```

A streaming session looks like:

```
You: Hello!
LLM (thinking): <think>
Reasoning step by step...
</think>
Hello! How can I assist you today? 😊
```
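In stream mode the client reads Ollama's reply line by line: each line is a small JSON object carrying a `"response"` token fragment, with `"done": true` on the last one. Reassembling the final answer from such a stream can be sketched as below (illustrative, not the package's exact code):

```python
import json


def collect_stream(ndjson_lines):
    # Join token fragments from a streaming /api/generate response.
    # In stream mode each fragment would be printed (in yellow) as it
    # arrives; batch mode prints only the joined result.
    chunks = []
    for raw in ndjson_lines:
        obj = json.loads(raw)
        chunks.append(obj.get("response", ""))
        if obj.get("done"):
            break
    return "".join(chunks)
```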

---

### Batch mode (stdin)

Pipe input directly:

```bash
echo "Tell me a joke" | bsy-clippy
```

Output:

```
Why don’t scientists trust atoms? Because they make up everything!
```

---

### Forcing modes

```bash
bsy-clippy --mode batch
bsy-clippy --mode stream
```

---

### Adjusting temperature

```bash
bsy-clippy --temperature 0.2
bsy-clippy --temperature 1.2
```

---

### Custom server and model

```bash
bsy-clippy --ip 127.0.0.1 --port 11434 --model llama2
```

---

## Requirements

See [`requirements.txt`](requirements.txt).
@@ -0,0 +1,178 @@
|
|
|
1
|
+
# bsy-clippy
|
|
2
|
+
|
|
3
|
+
`bsy-clippy` is a lightweight Python client for interacting with an [Ollama](https://ollama.ai) server.
|
|
4
|
+
|
|
5
|
+
It supports both **batch (stdin) mode** for one-shot prompts and **interactive mode** for chatting directly in the terminal.
|
|
6
|
+
You can also load **system prompts** from a file to guide the LLM’s behavior.
|
|
7
|
+
|
|
8
|
+
---
|
|
9
|
+
|
|
10
|
+
## Features
|
|
11
|
+
|
|
12
|
+
- Connects to Ollama API over HTTP (`/api/generate`).
|
|
13
|
+
- Defaults to:
|
|
14
|
+
- IP: `172.20.0.100`
|
|
15
|
+
- Port: `11434`
|
|
16
|
+
- Model: `qwen3:1.7b`
|
|
17
|
+
- Mode: `batch` (wait for full output)
|
|
18
|
+
- Bundled system prompt file that can be overridden with `--system-file`
|
|
19
|
+
- Configurable parameters:
|
|
20
|
+
- `-i` / `--ip` → Ollama server IP
|
|
21
|
+
- `-p` / `--port` → Ollama server port
|
|
22
|
+
- `-M` / `--model` → model name
|
|
23
|
+
- `-m` / `--mode` → output mode (`stream` or `batch`)
|
|
24
|
+
- `-t` / `--temperature` → sampling temperature (default: `0.7`)
|
|
25
|
+
- `-s` / `--system-file` → path to a text file with system instructions
|
|
26
|
+
- `-u` / `--user-prompt` → extra user instructions prepended before the data payload
|
|
27
|
+
- `-r` / `--memory-lines` → number of conversation lines to remember in interactive mode
|
|
28
|
+
- `-c` / `--chat-after-stdin` → process stdin once, then drop into interactive chat
|
|
29
|
+
- Two modes of operation:
|
|
30
|
+
- **Batch mode** (default) → waits until the answer is complete, then prints only the final result.
|
|
31
|
+
- **Stream mode** → shows response in real-time, tokens appear as they are generated.
|
|
32
|
+
- Colored terminal output:
|
|
33
|
+
- **Yellow** = streaming tokens (the model’s “thinking” in progress).
|
|
34
|
+
- **Default terminal color** = final assembled answer.
|
|
35
|
+
|
|
36
|
+

---

## Installation

### pipx (recommended)

```bash
pipx install .
```

After updating the source, reinstall with `pipx reinstall bsy-clippy`.

### pip / virtual environments

```bash
pip install .
```

---

## Usage

### System prompt file

By default, `bsy-clippy` loads a bundled prompt (`Be very brief. Be very short.`).
You can change this with `--system-file` or disable it via `--no-default-system`.

Example **bsy-clippy.txt**:

```
You are a helpful assistant specialized in cybersecurity.
Always explain your reasoning clearly, and avoid unnecessary markdown formatting.
```

These lines will be sent to the LLM before every user prompt.

### User prompt parameter

Use `--user-prompt "Classify the following log:"` when piping data so the model receives:

```
system prompt (if any)

user prompt text

data from stdin or interactive input
```

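The three pieces are joined with blank lines between the non-empty parts; this mirrors `compose_prompt` in `cli.py`:

```python
def compose_prompt(system_prompt: str, user_prompt: str, data: str) -> str:
    # Keep only non-empty parts and separate them with blank lines,
    # as bsy-clippy does before sending the request.
    parts = [p.strip("\n") for p in (system_prompt, user_prompt, data) if p and p.strip()]
    return "\n\n".join(parts)

print(compose_prompt("Be very brief. Be very short.",
                     "Classify the following log:",
                     "GET /admin HTTP/1.1 404"))
```

Empty pieces simply drop out, so omitting `--user-prompt` leaves no stray blank section in the request.
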
### Interactive memory

Set `--memory-lines 6` (or `-r 6`) to keep the last six conversation lines (user + assistant) while chatting.
Only the final assistant reply (not the thinking traces) is stored and sent back on the next turn.

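Internally the memory is a plain list of `User:`/`Assistant:` lines trimmed to the last N entries after every turn; a small sketch of that behavior with `-r 4`:

```python
from typing import List

def trim_memory(memory: List[str], memory_lines: int) -> List[str]:
    # Keep only the most recent conversation lines, as interactive mode does.
    if memory_lines > 0 and len(memory) > memory_lines:
        return memory[-memory_lines:]
    return memory

history: List[str] = []
for turn in range(3):
    history.append(f"User: question {turn}")
    history.append(f"Assistant: answer {turn}")
    history = trim_memory(history, 4)

print(history)  # only the two most recent full turns survive
```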
### Chat after stdin

Use `-c` / `--chat-after-stdin` to process piped data first and then remain in interactive mode with the response (and any configured memory) available:

```bash
cat sample.txt | bsy-clippy -u "Summarize this report" -r 6 -c
```

After the initial answer prints, you can continue the conversation while the tool remembers the piped data and the model’s reply.

---

### Interactive mode (default = batch)

Run without piping input:

```bash
bsy-clippy
```

Example session in **batch mode**:

```
You: Hello!
Hello! How can I assist you today? 😊
```

To force **streaming mode**:

```bash
bsy-clippy --mode stream
```

A streaming session looks like:

```
You: Hello!
LLM (thinking): <think>
Reasoning step by step...
</think>
Hello! How can I assist you today? 😊
```

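Only the text outside the `<think>…</think>` markers counts as the final answer; it is what batch mode prints and what interactive memory stores. A regex sketch, equivalent in spirit to `strip_think_segments` in `cli.py`:

```python
import re

def strip_think(text: str) -> str:
    # Remove complete <think>...</think> blocks, then any unterminated
    # trailing <think> block, and trim surrounding whitespace.
    text = re.sub(r"<think>.*?</think>", "", text, flags=re.DOTALL)
    text = re.sub(r"<think>.*\Z", "", text, flags=re.DOTALL)
    return text.strip()

raw = "<think>\nReasoning step by step...\n</think>\nHello! How can I assist you today?"
print(strip_think(raw))  # prints only the final answer line
```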
---

### Batch mode (stdin)

Pipe input directly:

```bash
echo "Tell me a joke" | bsy-clippy
```

Output:

```
Why don’t scientists trust atoms? Because they make up everything!
```

---

### Forcing modes

```bash
bsy-clippy --mode batch
bsy-clippy --mode stream
```

---

### Adjusting temperature

```bash
bsy-clippy --temperature 0.2
bsy-clippy --temperature 1.2
```

---

### Custom server and model

```bash
bsy-clippy --ip 127.0.0.1 --port 11434 --model llama2
```

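Whatever server and model you point it at, the client only ever POSTs to the Ollama `/api/generate` endpoint and joins the streamed JSON lines; a minimal sketch of that request, mirroring `call_ollama_batch` in `cli.py`:

```python
import json
import requests

def generate(api_url: str, model: str, prompt: str, temperature: float = 0.7) -> str:
    # Stream newline-delimited JSON from Ollama and join the "response" fields.
    resp = requests.post(
        f"{api_url}/api/generate",
        json={"model": model, "prompt": prompt, "temperature": temperature},
        stream=True,
        timeout=600,
    )
    resp.raise_for_status()
    chunks = []
    for line in resp.iter_lines():
        if line:
            data = json.loads(line.decode("utf-8"))
            chunks.append(data.get("response", ""))
            if data.get("done", False):
                break
    return "".join(chunks)

# generate("http://127.0.0.1:11434", "llama2", "Tell me a joke")
```

This reproduces the CLI's request shape; note that the Ollama API also accepts sampling parameters nested under an `options` object.
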
---

## Requirements

See [`requirements.txt`](requirements.txt).
@@ -0,0 +1,25 @@
[build-system]
requires = ["setuptools>=64", "wheel"]
build-backend = "setuptools.build_meta"

[project]
name = "bsy-clippy"
version = "0.1.0"
description = "Terminal client for interacting with an Ollama server"
readme = "README.md"
requires-python = ">=3.8"
authors = [
    { name = "Sebas" }
]
dependencies = [
    "requests>=2.28,<3",
]

[project.scripts]
"bsy-clippy" = "bsy_clippy.cli:main"

[tool.setuptools.packages.find]
where = ["src"]

[tool.setuptools.package-data]
"bsy_clippy" = ["data/*.txt"]
@@ -0,0 +1,409 @@
"""Command-line interface for the bsy-clippy Ollama client."""
from __future__ import annotations

import argparse
import json
import os
import sys
from importlib import resources
from pathlib import Path
from typing import IO, List, Optional, Sequence, Tuple

import requests

YELLOW = "\033[93m"
ANSWER_COLOR = "\033[96m"
RESET = "\033[0m"

_DEFAULT_SYSTEM_PROMPT = "data/bsy-clippy.txt"


def _read_default_system_prompt() -> str:
    """Return the packaged default system prompt if it exists."""
    try:
        prompt_path = resources.files("bsy_clippy").joinpath(_DEFAULT_SYSTEM_PROMPT)
    except (FileNotFoundError, ModuleNotFoundError, AttributeError):
        return ""

    try:
        return prompt_path.read_text(encoding="utf-8").strip("\n")
    except OSError:
        return ""


def load_system_prompt(path: Optional[str], allow_default: bool = True) -> str:
    """Return the content of a system prompt file, or the packaged default."""
    if path:
        file_path = Path(path)
        if not file_path.exists():
            return _read_default_system_prompt() if allow_default else ""
        try:
            return file_path.read_text(encoding="utf-8").strip("\n")
        except OSError as exc:
            print(f"[Warning] Could not read system prompt file '{path}': {exc}", file=sys.stderr)
            return ""
    if allow_default:
        return _read_default_system_prompt()
    return ""


def compose_prompt(system_prompt: str, user_prompt: str, data: str) -> str:
    """Combine system prompt, user prompt, and data into a single message."""
    parts: List[str] = []
    for part in (system_prompt, user_prompt, data):
        if part and part.strip():
            parts.append(part.strip("\n"))
    return "\n\n".join(parts)


def strip_think_segments(text: str) -> str:
    """Return text with <think> sections removed."""
    if not text:
        return ""

    result: List[str] = []
    idx = 0
    in_think = False

    while idx < len(text):
        if in_think:
            close_idx = text.find("</think>", idx)
            if close_idx == -1:
                break
            idx = close_idx + len("</think>")
            in_think = False
        else:
            open_idx = text.find("<think>", idx)
            if open_idx == -1:
                result.append(text[idx:])
                break

            if open_idx > idx:
                result.append(text[idx:open_idx])
            idx = open_idx + len("<think>")
            in_think = True

    return "".join(result).strip()


def colorize_response(text: str) -> str:
    """Return the response string with ANSI colors applied to think segments."""
    if not text:
        return ""

    idx = 0
    in_think = False
    output: List[str] = []

    while idx < len(text):
        if in_think:
            close_idx = text.find("</think>", idx)
            if close_idx == -1:
                output.append(f"{YELLOW}{text[idx:]}{RESET}")
                break

            if close_idx > idx:
                output.append(f"{YELLOW}{text[idx:close_idx]}{RESET}")
            output.append(f"{YELLOW}</think>{RESET}")
            idx = close_idx + len("</think>")
            in_think = False
        else:
            open_idx = text.find("<think>", idx)
            if open_idx == -1:
                output.append(f"{ANSWER_COLOR}{text[idx:]}{RESET}")
                break

            if open_idx > idx:
                output.append(f"{ANSWER_COLOR}{text[idx:open_idx]}{RESET}")
            output.append(f"{YELLOW}<think>{RESET}")
            idx = open_idx + len("<think>")
            in_think = True

    return "".join(output)


def print_stream_chunk(text: str, in_think: bool) -> Tuple[bool, str]:
    """Stream a chunk of text with think/final color separation."""
    idx = 0
    final_parts: List[str] = []
    while idx < len(text):
        if in_think:
            close_idx = text.find("</think>", idx)
            if close_idx == -1:
                segment = text[idx:]
                if segment:
                    print(f"{YELLOW}{segment}{RESET}", end="", flush=True)
                idx = len(text)
            else:
                segment = text[idx:close_idx]
                if segment:
                    print(f"{YELLOW}{segment}{RESET}", end="", flush=True)
                print(f"{YELLOW}</think>{RESET}", end="", flush=True)
                idx = close_idx + len("</think>")
                in_think = False
        else:
            open_idx = text.find("<think>", idx)
            if open_idx == -1:
                segment = text[idx:]
                if segment:
                    print(f"{ANSWER_COLOR}{segment}{RESET}", end="", flush=True)
                    final_parts.append(segment)
                idx = len(text)
            else:
                segment = text[idx:open_idx]
                if segment:
                    print(f"{ANSWER_COLOR}{segment}{RESET}", end="", flush=True)
                    final_parts.append(segment)
                print(f"{YELLOW}<think>{RESET}", end="", flush=True)
                idx = open_idx + len("<think>")
                in_think = True
    return in_think, "".join(final_parts)


def call_ollama_batch(api_url: str, model: str, prompt: str, temperature: float) -> Tuple[str, str]:
    """Send a prompt to Ollama API and return response text (batch mode)."""
    try:
        response = requests.post(
            f"{api_url}/api/generate",
            json={
                "model": model,
                "prompt": prompt,
                "temperature": temperature,
            },
            stream=True,
            timeout=600,
        )
        response.raise_for_status()

        output: List[str] = []
        for line in response.iter_lines():
            if line:
                try:
                    data = json.loads(line.decode("utf-8"))
                    output.append(data.get("response", ""))
                except Exception:
                    pass
        raw_text = "".join(output)
        return colorize_response(raw_text), strip_think_segments(raw_text)

    except requests.RequestException as exc:
        error_text = f"[Error contacting Ollama API: {exc}]"
        return error_text, ""


def call_ollama_stream(api_url: str, model: str, prompt: str, temperature: float) -> str:
    """Send a prompt to Ollama API and stream response with color separation."""
    try:
        response = requests.post(
            f"{api_url}/api/generate",
            json={
                "model": model,
                "prompt": prompt,
                "temperature": temperature,
            },
            stream=True,
            timeout=600,
        )
        response.raise_for_status()

        in_think = False
        final_parts: List[str] = []
        for line in response.iter_lines():
            if line:
                try:
                    data = json.loads(line.decode("utf-8"))
                    text = data.get("response", "")
                    if text:
                        in_think, segment = print_stream_chunk(text, in_think)
                        if segment:
                            final_parts.append(segment)

                    if data.get("done", False):
                        break
                except Exception:
                    continue
        print()
        return strip_think_segments("".join(final_parts))

    except requests.RequestException as exc:
        print(f"[Error contacting Ollama API: {exc}]")
        return ""


def read_user_input(prompt_text: str, input_stream: Optional[IO[str]]) -> str:
    """Read a line of input, supporting non-tty streams."""
    if input_stream is None:
        return input(prompt_text)

    print(prompt_text, end="", flush=True)
    line = input_stream.readline()
    if not line:
        raise EOFError
    return line.rstrip("\r\n")


def interactive_mode(
    api_url: str,
    model: str,
    mode: str,
    temperature: float,
    system_prompt: str,
    user_prompt: str,
    memory_lines: int,
    memory_seed: Optional[Sequence[str]] = None,
    input_stream: Optional[IO[str]] = None,
) -> None:
    """Interactive chat mode with selectable output mode."""
    print(f"Interactive mode with model '{model}' at {api_url}")
    print(f"Mode: {mode}, Temperature: {temperature}")
    print("Type 'exit' or Ctrl+C to quit.")
    memory: List[str] = list(memory_seed) if memory_seed else []
    if memory_lines > 0 and memory:
        memory[:] = memory[-memory_lines:]
    local_stream = input_stream
    close_stream = False
    if local_stream is None:
        if sys.stdin.isatty():
            local_stream = None
        else:
            # stdin was consumed by a pipe; try to reopen the controlling terminal.
            tty_paths = ["CONIN$"] if os.name == "nt" else ["/dev/tty"]
            for path in tty_paths:
                try:
                    local_stream = open(path, "r", encoding="utf-8", errors="ignore")
                    close_stream = True
                    break
                except OSError:
                    local_stream = None
            if local_stream is None and sys.stdin.isatty():
                local_stream = None
            elif local_stream is None:
                local_stream = sys.stdin

    try:
        while True:
            try:
                prompt = read_user_input("You: ", local_stream)
            except EOFError:
                if local_stream is sys.stdin and not sys.stdin.isatty():
                    print("\n[Warning] No interactive input available; exiting.")
                else:
                    print("\nExiting.")
                break
            except KeyboardInterrupt:
                print("\nExiting.")
                break

            user_text = prompt.strip()
            if user_text.lower() in {"exit", "quit"}:
                break
            history_block = ""
            if memory:
                history_block = "History of Past Interaction:\n" + "\n".join(memory)

            current_block = ""
            if user_text:
                current_block = f"Current User Message:\n{user_text}"

            conversation_parts = [part for part in (history_block, current_block) if part]
            conversation_input = "\n\n".join(conversation_parts)
            final_prompt = compose_prompt(system_prompt, user_prompt, conversation_input)
            if not final_prompt:
                continue
            final_text = ""
            if mode == "stream":
                print("LLM (thinking): ", end="", flush=True)
                final_text = call_ollama_stream(api_url, model, final_prompt, temperature)
            else:
                response_text, final_text = call_ollama_batch(api_url, model, final_prompt, temperature)
                print(response_text)

            if memory_lines > 0:
                if user_text:
                    memory.append(f"User: {user_text}")
                if final_text:
                    memory.append(f"Assistant: {final_text.strip()}")
                if len(memory) > memory_lines:
                    memory[:] = memory[-memory_lines:]
    finally:
        if close_stream and local_stream not in {None, sys.stdin}:
            try:
                local_stream.close()
            except OSError:
                pass


def build_parser() -> argparse.ArgumentParser:
    """Create the argument parser for the CLI."""
    parser = argparse.ArgumentParser(description="bsy-clippy: Ollama API Client")
    parser.add_argument("-i", "--ip", default="172.20.0.100", help="Ollama server IP (default: 172.20.0.100)")
    parser.add_argument("-p", "--port", default="11434", help="Ollama server port (default: 11434)")
    parser.add_argument("-M", "--model", default="qwen3:1.7b", help="Model name (default: qwen3:1.7b)")
    parser.add_argument("-m", "--mode", choices=["stream", "batch"], default="batch", help="Output mode: 'stream' = real-time, 'batch' = wait for final output (default: batch)")
    parser.add_argument("-t", "--temperature", type=float, default=0.7, help="Sampling temperature (default: 0.7, higher = more random)")
    parser.add_argument(
        "-s",
        "--system-file",
        default=None,
        help="Path to a system prompt file (default: packaged prompt)",
    )
    parser.add_argument("-u", "--user-prompt", default="", help="Additional user instructions to prepend before the data")
    parser.add_argument("-r", "--memory-lines", type=int, default=0, help="Remember this many lines of conversation in interactive mode")
    parser.add_argument("-c", "--chat-after-stdin", action="store_true", help="After processing stdin, continue in interactive chat mode")
    parser.add_argument("--no-default-system", action="store_true", help="Disable the packaged default system prompt")
    return parser


def main(argv: Optional[Sequence[str]] = None) -> None:
    """Entry point for the bsy-clippy console script."""
    parser = build_parser()
    args = parser.parse_args(argv)
    api_url = f"http://{args.ip}:{args.port}"
    allow_default = not args.no_default_system
    system_prompt = load_system_prompt(args.system_file, allow_default=allow_default)
    user_prompt = args.user_prompt
    memory_lines = max(0, args.memory_lines)
    chat_after_stdin = args.chat_after_stdin
    mode = args.mode

    if not sys.stdin.isatty():
        data = sys.stdin.read()
        full_prompt = compose_prompt(system_prompt, user_prompt, data)

        if not full_prompt:
            interactive_mode(api_url, args.model, mode, args.temperature, system_prompt, user_prompt, memory_lines)
            return

        memory_seed: List[str] = []
        data_text = data.strip()
        if data_text:
            memory_seed.append(f"User: {data_text}")

        final_text = ""
        if mode == "stream":
            final_text = call_ollama_stream(api_url, args.model, full_prompt, args.temperature)
        else:
            response_text, final_text = call_ollama_batch(api_url, args.model, full_prompt, args.temperature)
            print(response_text)
        if chat_after_stdin:
            if final_text:
                memory_seed.append(f"Assistant: {final_text.strip()}")
            if memory_lines > 0 and memory_seed:
                memory_seed = memory_seed[-memory_lines:]
            interactive_mode(
                api_url,
                args.model,
                mode,
                args.temperature,
                system_prompt,
                user_prompt,
                memory_lines,
                memory_seed if memory_seed else None,
            )
        # One-shot pipe: do not fall through to interactive mode.
        return

    interactive_mode(api_url, args.model, mode, args.temperature, system_prompt, user_prompt, memory_lines)
@@ -0,0 +1,189 @@
Metadata-Version: 2.4
Name: bsy-clippy
Version: 0.1.0
Summary: Terminal client for interacting with an Ollama server
Author: Sebas
Requires-Python: >=3.8
Description-Content-Type: text/markdown
License-File: LICENSE
Requires-Dist: requests<3,>=2.28
Dynamic: license-file
@@ -0,0 +1,12 @@
LICENSE
README.md
pyproject.toml
src/bsy_clippy/__init__.py
src/bsy_clippy/cli.py
src/bsy_clippy.egg-info/PKG-INFO
src/bsy_clippy.egg-info/SOURCES.txt
src/bsy_clippy.egg-info/dependency_links.txt
src/bsy_clippy.egg-info/entry_points.txt
src/bsy_clippy.egg-info/requires.txt
src/bsy_clippy.egg-info/top_level.txt
src/bsy_clippy/data/bsy-clippy.txt
@@ -0,0 +1 @@

@@ -0,0 +1 @@
requests<3,>=2.28
@@ -0,0 +1 @@
bsy_clippy