langtune 0.0.2__tar.gz → 0.0.3__tar.gz
This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
Potentially problematic release: this version of langtune might be problematic.
- {langtune-0.0.2/src/langtune.egg-info → langtune-0.0.3}/PKG-INFO +26 -25
- {langtune-0.0.2 → langtune-0.0.3}/README.md +25 -24
- {langtune-0.0.2 → langtune-0.0.3}/pyproject.toml +1 -1
- {langtune-0.0.2 → langtune-0.0.3/src/langtune.egg-info}/PKG-INFO +26 -25
- {langtune-0.0.2 → langtune-0.0.3}/LICENSE +0 -0
- {langtune-0.0.2 → langtune-0.0.3}/setup.cfg +0 -0
- {langtune-0.0.2 → langtune-0.0.3}/src/langtune/__init__.py +0 -0
- {langtune-0.0.2 → langtune-0.0.3}/src/langtune.egg-info/SOURCES.txt +0 -0
- {langtune-0.0.2 → langtune-0.0.3}/src/langtune.egg-info/dependency_links.txt +0 -0
- {langtune-0.0.2 → langtune-0.0.3}/src/langtune.egg-info/entry_points.txt +0 -0
- {langtune-0.0.2 → langtune-0.0.3}/src/langtune.egg-info/requires.txt +0 -0
- {langtune-0.0.2 → langtune-0.0.3}/src/langtune.egg-info/top_level.txt +0 -0
{langtune-0.0.2/src/langtune.egg-info → langtune-0.0.3}/PKG-INFO

@@ -1,6 +1,6 @@
 Metadata-Version: 2.4
 Name: langtune
-Version: 0.0.2
+Version: 0.0.3
 Summary: A package for finetuning text models.
 Author-email: Pritesh Raj <priteshraj41@gmail.com>
 License: MIT License
@@ -39,7 +39,7 @@ Requires-Dist: pyyaml
 Requires-Dist: scipy
 Dynamic: license-file

-# Langtune:
+# Langtune: Efficient LoRA Fine-Tuning for Text LLMs

 <hr/>
 <p align="center">
@@ -60,8 +60,8 @@ Dynamic: license-file
 </p>

 <p align="center">
-  <b>Langtune
-  <span style="font-size:1.1em"><i>
+  <b>Langtune is a Python package for fine-tuning large language models on text data using LoRA.</b><br/>
+  <span style="font-size:1.1em"><i>Provides modular components for adapting language models to various NLP tasks.</i></span>
 </p>
 <hr/>

@@ -89,41 +89,40 @@ Dynamic: license-file
 - [Examples & Use Cases](#examples--use-cases)
 - [Extending the Framework](#extending-the-framework)
 - [Contributing](#contributing)
-- [
+- [License](#license)
 - [Citation](#citation)
 - [Acknowledgements](#acknowledgements)
-- [License](#license)

 ---

 ## Features
-- LoRA adapters for
+- LoRA adapters for efficient fine-tuning
 - Modular transformer backbone
-- Model zoo for
+- Model zoo for language models
 - Configurable and extensible codebase
-- Checkpointing and resume
+- Checkpointing and resume
 - Mixed precision and distributed training
--
-- CLI for
--
+- Metrics and visualization tools
+- CLI for training and evaluation
+- Callback support (early stopping, logging, etc.)

 ---

 ## Showcase

-Langtune is
+Langtune is intended for building and fine-tuning large language models with LoRA. It can be used for text classification, summarization, question answering, and other NLP tasks.

 ---

 ## Getting Started

-Install
+Install:

 ```bash
 pip install langtune
 ```

-
+Example usage:

 ```python
 import torch
@@ -145,28 +144,28 @@ with torch.no_grad():
     print('Output shape:', out.shape)
 ```

-
+See the [Documentation](docs/index.md) and `src/langtune/cli/finetune.py` for more details.

 ---

 ## Supported Python Versions
-- Python 3.8
+- Python 3.8 or newer

 ---

 ## Why langtune?

--
-- Modular transformer
-- Unified interface for
--
-- Efficient memory usage
+- Fine-tuning with LoRA adapters
+- Modular transformer design
+- Unified interface for language models
+- Suitable for research and production
+- Efficient memory usage

 ---

 ## Architecture Overview

-Langtune uses a
+Langtune uses a transformer backbone with LoRA adapters in attention and MLP layers. This enables adaptation of pre-trained models with fewer trainable parameters.

 ### Model Data Flow

@@ -337,12 +336,14 @@ model.finetune(dataset, config_path="configs/custom_config.yaml")
 - For advanced usage, see `src/langtune/cli/finetune.py`.

 ## Contributing
-
+Contributions are welcome. See the [Contributing Guide](CONTRIBUTING.md) for details.

-## License
+## License

 This project is licensed under the MIT License. See [LICENSE](LICENSE) for details.

+## Citation
+
 If you use langtune in your research, please cite:

 ```bibtex
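
The new "Architecture Overview" text added in this release says Langtune attaches LoRA adapters to the attention and MLP layers of a transformer backbone, so only a small low-rank update is trained while the pre-trained weights stay frozen. As a rough, self-contained sketch of that general technique in plain PyTorch (this is not langtune's actual code; `LoRALinear`, the rank, and the scaling factor are illustrative assumptions):

```python
# Minimal LoRA sketch: a frozen linear layer plus a trainable low-rank
# update, i.e. y = W x + (alpha / r) * B A x. Illustrative only.
import torch
import torch.nn as nn

class LoRALinear(nn.Module):
    def __init__(self, base: nn.Linear, rank: int = 8, alpha: float = 16.0):
        super().__init__()
        self.base = base
        self.base.weight.requires_grad_(False)   # freeze pre-trained weight
        if self.base.bias is not None:
            self.base.bias.requires_grad_(False)
        # Only these low-rank factors are trained.
        self.A = nn.Parameter(torch.randn(rank, base.in_features) * 0.01)
        self.B = nn.Parameter(torch.zeros(base.out_features, rank))
        self.scale = alpha / rank

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.base(x) + self.scale * (x @ self.A.T @ self.B.T)

layer = LoRALinear(nn.Linear(768, 768), rank=8)
trainable = sum(p.numel() for p in layer.parameters() if p.requires_grad)
total = sum(p.numel() for p in layer.parameters())
print(f"trainable {trainable} of {total} parameters")  # ~12k of ~603k
```

Because `B` starts at zero, the adapted layer initially computes exactly the frozen layer's output, which is why adapters of this form can be dropped into an existing backbone without disturbing it.
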
{langtune-0.0.2 → langtune-0.0.3}/README.md

@@ -1,4 +1,4 @@
-# Langtune:
+# Langtune: Efficient LoRA Fine-Tuning for Text LLMs

 <hr/>
 <p align="center">
@@ -19,8 +19,8 @@
 </p>

 <p align="center">
-  <b>Langtune
-  <span style="font-size:1.1em"><i>
+  <b>Langtune is a Python package for fine-tuning large language models on text data using LoRA.</b><br/>
+  <span style="font-size:1.1em"><i>Provides modular components for adapting language models to various NLP tasks.</i></span>
 </p>
 <hr/>

@@ -48,41 +48,40 @@
 - [Examples & Use Cases](#examples--use-cases)
 - [Extending the Framework](#extending-the-framework)
 - [Contributing](#contributing)
-- [
+- [License](#license)
 - [Citation](#citation)
 - [Acknowledgements](#acknowledgements)
-- [License](#license)

 ---

 ## Features
-- LoRA adapters for
+- LoRA adapters for efficient fine-tuning
 - Modular transformer backbone
-- Model zoo for
+- Model zoo for language models
 - Configurable and extensible codebase
-- Checkpointing and resume
+- Checkpointing and resume
 - Mixed precision and distributed training
--
-- CLI for
--
+- Metrics and visualization tools
+- CLI for training and evaluation
+- Callback support (early stopping, logging, etc.)

 ---

 ## Showcase

-Langtune is
+Langtune is intended for building and fine-tuning large language models with LoRA. It can be used for text classification, summarization, question answering, and other NLP tasks.

 ---

 ## Getting Started

-Install
+Install:

 ```bash
 pip install langtune
 ```

-
+Example usage:

 ```python
 import torch
@@ -104,28 +103,28 @@ with torch.no_grad():
     print('Output shape:', out.shape)
 ```

-
+See the [Documentation](docs/index.md) and `src/langtune/cli/finetune.py` for more details.

 ---

 ## Supported Python Versions
-- Python 3.8
+- Python 3.8 or newer

 ---

 ## Why langtune?

--
-- Modular transformer
-- Unified interface for
--
-- Efficient memory usage
+- Fine-tuning with LoRA adapters
+- Modular transformer design
+- Unified interface for language models
+- Suitable for research and production
+- Efficient memory usage

 ---

 ## Architecture Overview

-Langtune uses a
+Langtune uses a transformer backbone with LoRA adapters in attention and MLP layers. This enables adaptation of pre-trained models with fewer trainable parameters.

 ### Model Data Flow

@@ -296,12 +295,14 @@ model.finetune(dataset, config_path="configs/custom_config.yaml")
 - For advanced usage, see `src/langtune/cli/finetune.py`.

 ## Contributing
-
+Contributions are welcome. See the [Contributing Guide](CONTRIBUTING.md) for details.

-## License
+## License

 This project is licensed under the MIT License. See [LICENSE](LICENSE) for details.

+## Citation
+
 If you use langtune in your research, please cite:

 ```bibtex
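
The README changes above also reference a config-driven entry point (`model.finetune(dataset, config_path="configs/custom_config.yaml")`) built around training only a small set of parameters. A minimal generic loop showing what "fine-tuning with frozen pre-trained weights" means in practice (plain PyTorch; the toy backbone and head below are hypothetical stand-ins, not langtune's API):

```python
# Generic parameter-efficient loop: freeze the "pre-trained" backbone and
# optimize only the small trainable part. Illustrative, not langtune code.
import torch
import torch.nn as nn

torch.manual_seed(0)
backbone = nn.Sequential(nn.Embedding(100, 32), nn.Linear(32, 32), nn.ReLU())
for p in backbone.parameters():
    p.requires_grad_(False)            # pre-trained weights stay fixed
head = nn.Linear(32, 2)                # only this part is trained

opt = torch.optim.AdamW(head.parameters(), lr=1e-3)
x = torch.randint(0, 100, (8, 5))      # 8 toy token sequences of length 5
y = torch.randint(0, 2, (8,))          # toy binary labels
for step in range(3):
    feats = backbone(x).mean(dim=1)    # pool token features -> (8, 32)
    loss = nn.functional.cross_entropy(head(feats), y)
    opt.zero_grad()
    loss.backward()
    opt.step()
    print(f"step {step}: loss {loss.item():.4f}")
```
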
{langtune-0.0.2 → langtune-0.0.3/src/langtune.egg-info}/PKG-INFO

Identical to the PKG-INFO diff shown above (hunks @@ -1,6 +1,6 @@ through @@ -337,12 +336,14 @@): the version bump to 0.0.3 plus the same README-derived description changes.