kopipasta 0.27.0__tar.gz → 0.28.0__tar.gz
This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
Potentially problematic release: this version of kopipasta might be problematic.
- kopipasta-0.28.0/PKG-INFO +109 -0
- kopipasta-0.28.0/README.md +84 -0
- {kopipasta-0.27.0 → kopipasta-0.28.0}/kopipasta/main.py +60 -388
- kopipasta-0.28.0/kopipasta.egg-info/PKG-INFO +109 -0
- kopipasta-0.27.0/requirements.txt → kopipasta-0.28.0/kopipasta.egg-info/requires.txt +0 -2
- kopipasta-0.28.0/requirements.txt +3 -0
- {kopipasta-0.27.0 → kopipasta-0.28.0}/setup.py +1 -1
- kopipasta-0.27.0/PKG-INFO +0 -171
- kopipasta-0.27.0/README.md +0 -144
- kopipasta-0.27.0/kopipasta.egg-info/PKG-INFO +0 -171
- kopipasta-0.27.0/kopipasta.egg-info/requires.txt +0 -5
- {kopipasta-0.27.0 → kopipasta-0.28.0}/LICENSE +0 -0
- {kopipasta-0.27.0 → kopipasta-0.28.0}/MANIFEST.in +0 -0
- {kopipasta-0.27.0 → kopipasta-0.28.0}/kopipasta/__init__.py +0 -0
- {kopipasta-0.27.0 → kopipasta-0.28.0}/kopipasta/import_parser.py +0 -0
- {kopipasta-0.27.0 → kopipasta-0.28.0}/kopipasta.egg-info/SOURCES.txt +0 -0
- {kopipasta-0.27.0 → kopipasta-0.28.0}/kopipasta.egg-info/dependency_links.txt +0 -0
- {kopipasta-0.27.0 → kopipasta-0.28.0}/kopipasta.egg-info/entry_points.txt +0 -0
- {kopipasta-0.27.0 → kopipasta-0.28.0}/kopipasta.egg-info/top_level.txt +0 -0
- {kopipasta-0.27.0 → kopipasta-0.28.0}/setup.cfg +0 -0
kopipasta-0.28.0/PKG-INFO
@@ -0,0 +1,109 @@
+Metadata-Version: 2.1
+Name: kopipasta
+Version: 0.28.0
+Summary: A CLI tool to generate prompts with project structure and file contents
+Home-page: https://github.com/mkorpela/kopipasta
+Author: Mikko Korpela
+Author-email: mikko.korpela@gmail.com
+License: MIT
+Classifier: Development Status :: 3 - Alpha
+Classifier: Intended Audience :: Developers
+Classifier: License :: OSI Approved :: MIT License
+Classifier: Operating System :: OS Independent
+Classifier: Programming Language :: Python :: 3
+Classifier: Programming Language :: Python :: 3.8
+Classifier: Programming Language :: Python :: 3.9
+Classifier: Programming Language :: Python :: 3.10
+Classifier: Programming Language :: Python :: 3.11
+Classifier: Programming Language :: Python :: 3.12
+Requires-Python: >=3.8
+Description-Content-Type: text/markdown
+License-File: LICENSE
+Requires-Dist: pyperclip==1.9.0
+Requires-Dist: requests==2.32.3
+Requires-Dist: Pygments==2.18.0
+
+# kopipasta
+
+[](https://pypi.python.org/pypi/kopipasta)
+[](http://pepy.tech/project/kopipasta)
+
+A CLI tool for taking **full, transparent control** of your LLM context. No black boxes.
+
+<img src="kopipasta.jpg" alt="kopipasta" width="300">
+
+- An LLM told me that "kopi" means Coffee in some languages... and a Diffusion model then made this delicious soup.
+
+## The Philosophy: You Control the Context
+
+Many AI coding assistants use Retrieval-Augmented Generation (RAG) to automatically find what *they think* is relevant context. This is a black box. When the LLM gives a bad answer, you can't debug it because you don't know what context it was actually given.
+
+**`kopipasta` is the opposite.** I built it for myself on the principle of **explicit context control**. You are in the driver's seat. You decide *exactly* what files, functions, and snippets go into the prompt. This transparency is the key to getting reliable, debuggable results from an LLM.
+
+It's a "smart copy" command for your project, not a magic wand.
+
+## How It Works
+
+The workflow is dead simple:
+
+1. **Gather:** Run `kopipasta` and point it at the files, directories, and URLs that matter for your task.
+2. **Select:** The tool interactively helps you choose what to include. For large files, you can send just a snippet or even hand-pick individual functions.
+3. **Define:** Your default editor (`$EDITOR`) opens for you to write your instructions to the LLM.
+4. **Paste:** The final, comprehensive prompt is now on your clipboard, ready to be pasted into ChatGPT, Gemini, Claude, or your LLM of choice.
+
+## Installation
+
+```bash
+# Using pipx (recommended for CLI tools)
+pipx install kopipasta
+
+# Or using standard pip
+pip install kopipasta
+```
+
+## Usage
+
+```bash
+kopipasta [options] [files_or_directories_or_urls...]
+```
+
+**Arguments:**
+
+* `[files_or_directories_or_urls...]`: One or more paths to files, directories, or web URLs to use as the starting point for your context.
+
+**Options:**
+
+* `-t TASK`, `--task TASK`: Provide the task description directly on the command line, skipping the editor.
+
+## Key Features
+
+* **Total Context Control:** Interactively select files, directories, snippets, or even individual functions. You see everything that goes into the prompt.
+* **Transparent & Explicit:** No hidden RAG. You know exactly what's in the prompt because you built it. This makes debugging LLM failures possible.
+* **Web-Aware:** Pulls in content directly from URLs—perfect for API documentation.
+* **Safety First:**
+  * Automatically respects your `.gitignore` rules.
+  * Detects if you're about to include secrets from a `.env` file and asks what to do.
+* **Context-Aware:** Keeps a running total of the prompt size (in characters and estimated tokens) so you don't overload the LLM's context window.
+* **Developer-Friendly:**
+  * Uses your familiar `$EDITOR` for writing task descriptions.
+  * Copies the final prompt directly to your clipboard.
+  * Provides syntax highlighting during chunk selection.
+
+## A Real-World Example
+
+I had a bug where my `setup.py` didn't include all the dependencies from `requirements.txt`.
+
+1. I ran `kopipasta -t "Update setup.py to read dependencies dynamically from requirements.txt" setup.py requirements.txt`.
+2. The tool confirmed the inclusion of both files and copied the complete prompt to my clipboard.
+3. I pasted the prompt into my LLM chat window.
+4. I copied the LLM's suggested code back into my local `setup.py`.
+5. I tested the changes and committed.
+
+No manual file reading, no clumsy copy-pasting, just a clean, context-rich prompt that I had full control over.
+
+## Configuration
+
+Set your preferred command-line editor via the `EDITOR` environment variable.
+```bash
+export EDITOR=nvim # or vim, nano, code --wait, etc.
+```
kopipasta-0.28.0/README.md
@@ -0,0 +1,84 @@
+# kopipasta
+
+[](https://pypi.python.org/pypi/kopipasta)
+[](http://pepy.tech/project/kopipasta)
+
+A CLI tool for taking **full, transparent control** of your LLM context. No black boxes.
+
+<img src="kopipasta.jpg" alt="kopipasta" width="300">
+
+- An LLM told me that "kopi" means Coffee in some languages... and a Diffusion model then made this delicious soup.
+
+## The Philosophy: You Control the Context
+
+Many AI coding assistants use Retrieval-Augmented Generation (RAG) to automatically find what *they think* is relevant context. This is a black box. When the LLM gives a bad answer, you can't debug it because you don't know what context it was actually given.
+
+**`kopipasta` is the opposite.** I built it for myself on the principle of **explicit context control**. You are in the driver's seat. You decide *exactly* what files, functions, and snippets go into the prompt. This transparency is the key to getting reliable, debuggable results from an LLM.
+
+It's a "smart copy" command for your project, not a magic wand.
+
+## How It Works
+
+The workflow is dead simple:
+
+1. **Gather:** Run `kopipasta` and point it at the files, directories, and URLs that matter for your task.
+2. **Select:** The tool interactively helps you choose what to include. For large files, you can send just a snippet or even hand-pick individual functions.
+3. **Define:** Your default editor (`$EDITOR`) opens for you to write your instructions to the LLM.
+4. **Paste:** The final, comprehensive prompt is now on your clipboard, ready to be pasted into ChatGPT, Gemini, Claude, or your LLM of choice.
+
+## Installation
+
+```bash
+# Using pipx (recommended for CLI tools)
+pipx install kopipasta
+
+# Or using standard pip
+pip install kopipasta
+```
+
+## Usage
+
+```bash
+kopipasta [options] [files_or_directories_or_urls...]
+```
+
+**Arguments:**
+
+* `[files_or_directories_or_urls...]`: One or more paths to files, directories, or web URLs to use as the starting point for your context.
+
+**Options:**
+
+* `-t TASK`, `--task TASK`: Provide the task description directly on the command line, skipping the editor.
+
+## Key Features
+
+* **Total Context Control:** Interactively select files, directories, snippets, or even individual functions. You see everything that goes into the prompt.
+* **Transparent & Explicit:** No hidden RAG. You know exactly what's in the prompt because you built it. This makes debugging LLM failures possible.
+* **Web-Aware:** Pulls in content directly from URLs—perfect for API documentation.
+* **Safety First:**
+  * Automatically respects your `.gitignore` rules.
+  * Detects if you're about to include secrets from a `.env` file and asks what to do.
+* **Context-Aware:** Keeps a running total of the prompt size (in characters and estimated tokens) so you don't overload the LLM's context window.
+* **Developer-Friendly:**
+  * Uses your familiar `$EDITOR` for writing task descriptions.
+  * Copies the final prompt directly to your clipboard.
+  * Provides syntax highlighting during chunk selection.
+
+## A Real-World Example
+
+I had a bug where my `setup.py` didn't include all the dependencies from `requirements.txt`.
+
+1. I ran `kopipasta -t "Update setup.py to read dependencies dynamically from requirements.txt" setup.py requirements.txt`.
+2. The tool confirmed the inclusion of both files and copied the complete prompt to my clipboard.
+3. I pasted the prompt into my LLM chat window.
+4. I copied the LLM's suggested code back into my local `setup.py`.
+5. I tested the changes and committed.
+
+No manual file reading, no clumsy copy-pasting, just a clean, context-rich prompt that I had full control over.
+
+## Configuration
+
+Set your preferred command-line editor via the `EDITOR` environment variable.
+```bash
+export EDITOR=nvim # or vim, nano, code --wait, etc.
+```
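The README's "A Real-World Example" section describes updating `setup.py` to read its dependencies dynamically from `requirements.txt`. As a rough illustration of that general pattern (this is a sketch, not the code shipped in this release; the actual `setup.py` change in this diff is a one-line edit that is not shown here), such a `setup.py` might look like:

```python
# Illustrative sketch only: not taken from kopipasta 0.28.0's setup.py.
# It shows the pattern the README example describes, where install_requires
# is populated from requirements.txt instead of being duplicated by hand.
from pathlib import Path

from setuptools import find_packages, setup

# Collect non-empty, non-comment lines from requirements.txt.
requirements = [
    line.strip()
    for line in Path("requirements.txt").read_text().splitlines()
    if line.strip() and not line.strip().startswith("#")
]

setup(
    name="kopipasta",
    packages=find_packages(),
    install_requires=requirements,
)
```

With this pattern the pinned list lives in one place, and the sdist stays in sync with `requirements.txt` as long as that file is shipped with the package (for example via the `MANIFEST.in` listed in this diff).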