langtune 0.0.2__py3-none-any.whl → 0.0.3__py3-none-any.whl

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.

Potentially problematic release: this version of langtune might be problematic.

@@ -1,6 +1,6 @@
  Metadata-Version: 2.4
  Name: langtune
- Version: 0.0.2
+ Version: 0.0.3
  Summary: A package for finetuning text models.
  Author-email: Pritesh Raj <priteshraj41@gmail.com>
  License: MIT License
@@ -39,7 +39,7 @@ Requires-Dist: pyyaml
  Requires-Dist: scipy
  Dynamic: license-file

- # Langtune: Large Language Models (LLMs) with Efficient LoRA Fine-Tuning for Text
+ # Langtune: Efficient LoRA Fine-Tuning for Text LLMs

  <hr/>
  <p align="center">
@@ -60,8 +60,8 @@ Dynamic: license-file
  </p>

  <p align="center">
- <b>Langtune provides modular components for text models and LoRA-based fine-tuning.</b><br/>
- <span style="font-size:1.1em"><i>Adapt and fine-tune language models for a range of NLP tasks.</i></span>
+ <b>Langtune is a Python package for fine-tuning large language models on text data using LoRA.</b><br/>
+ <span style="font-size:1.1em"><i>Provides modular components for adapting language models to various NLP tasks.</i></span>
  </p>
  <hr/>

@@ -89,41 +89,40 @@ Dynamic: license-file
  - [Examples & Use Cases](#examples--use-cases)
  - [Extending the Framework](#extending-the-framework)
  - [Contributing](#contributing)
- - [FAQ](#faq)
+ - [License](#license)
  - [Citation](#citation)
  - [Acknowledgements](#acknowledgements)
- - [License](#license)

  ---

  ## Features
- - LoRA adapters for parameter-efficient fine-tuning of LLMs
+ - LoRA adapters for efficient fine-tuning
  - Modular transformer backbone
- - Model zoo for open-source language models
+ - Model zoo for language models
  - Configurable and extensible codebase
- - Checkpointing and resume support
+ - Checkpointing and resume
  - Mixed precision and distributed training
- - Built-in metrics and visualization tools
- - CLI for fine-tuning and evaluation
- - Extensible callbacks (early stopping, logging, etc.)
+ - Metrics and visualization tools
+ - CLI for training and evaluation
+ - Callback support (early stopping, logging, etc.)

  ---

  ## Showcase

- Langtune is a framework for building and fine-tuning large language models with LoRA support. It is suitable for tasks such as text classification, summarization, question answering, and other NLP applications.
+ Langtune is intended for building and fine-tuning large language models with LoRA. It can be used for text classification, summarization, question answering, and other NLP tasks.

  ---

  ## Getting Started

- Install with pip:
+ Install:

  ```bash
  pip install langtune
  ```

- Minimal example:
+ Example usage:

  ```python
  import torch
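The body of the README's minimal example is elided between these two hunks. As a rough stand-in only, using plain PyTorch with a hypothetical toy model (not langtune's actual API, which this diff does not show), a forward pass that ends the same way would be:

```python
import torch
import torch.nn as nn

# Hypothetical toy model; the README's real example model is not shown in this diff.
model = nn.Sequential(nn.Embedding(1000, 64), nn.Linear(64, 1000))
tokens = torch.randint(0, 1000, (1, 16))  # batch of 1, sequence length 16

model.eval()
with torch.no_grad():
    out = model(tokens)
print('Output shape:', out.shape)  # -> torch.Size([1, 16, 1000])
```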
@@ -145,28 +144,28 @@ with torch.no_grad():
      print('Output shape:', out.shape)
  ```

- For more details, see the [Documentation](docs/index.md) and `src/langtune/cli/finetune.py`.
+ See the [Documentation](docs/index.md) and `src/langtune/cli/finetune.py` for more details.

  ---

  ## Supported Python Versions
- - Python 3.8+
+ - Python 3.8 or newer

  ---

  ## Why langtune?

- - Parameter-efficient fine-tuning with LoRA adapters
- - Modular transformer backbone for flexible model design
- - Unified interface for open-source language models
- - Designed for both research and production
- - Efficient memory usage for large models
+ - Fine-tuning with LoRA adapters
+ - Modular transformer design
+ - Unified interface for language models
+ - Suitable for research and production
+ - Efficient memory usage

  ---

  ## Architecture Overview

- Langtune uses a modular transformer backbone with LoRA adapters in attention and MLP layers. This allows adaptation of pre-trained models with fewer trainable parameters.
+ Langtune uses a transformer backbone with LoRA adapters in attention and MLP layers. This enables adaptation of pre-trained models with fewer trainable parameters.

  ### Model Data Flow

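The architecture note above follows the standard LoRA formulation: a frozen pre-trained weight plus a trainable low-rank update, y = Wx + (α/r)·BAx. A generic sketch in plain PyTorch (not langtune's actual implementation, which this diff does not include):

```python
import torch
import torch.nn as nn

class LoRALinear(nn.Module):
    """Frozen pre-trained linear layer plus a trainable low-rank update.
    Only the A and B projections receive gradients."""
    def __init__(self, base: nn.Linear, r: int = 8, alpha: int = 16):
        super().__init__()
        self.base = base
        for p in self.base.parameters():
            p.requires_grad_(False)         # freeze the pre-trained weights
        self.lora_a = nn.Linear(base.in_features, r, bias=False)
        self.lora_b = nn.Linear(r, base.out_features, bias=False)
        nn.init.zeros_(self.lora_b.weight)  # update starts at zero, so initial outputs are unchanged
        self.scale = alpha / r

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.base(x) + self.scale * self.lora_b(self.lora_a(x))
```

Wrapping the attention and MLP projections of a pre-trained model in a layer like this is what keeps the trainable parameter count small.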
@@ -337,12 +336,14 @@ model.finetune(dataset, config_path="configs/custom_config.yaml")
  - For advanced usage, see `src/langtune/cli/finetune.py`.

  ## Contributing
- We welcome contributions. See the [Contributing Guide](CONTRIBUTING.md) for details.
+ Contributions are welcome. See the [Contributing Guide](CONTRIBUTING.md) for details.

- ## License & Citation
+ ## License

  This project is licensed under the MIT License. See [LICENSE](LICENSE) for details.

+ ## Citation
+
  If you use langtune in your research, please cite:

  ```bibtex
@@ -0,0 +1,7 @@
+ langtune/__init__.py,sha256=47DEQpj8HBSa-_TImW-5JCeuQeRkm5NMpJWZG3hSuFU,0
+ langtune-0.0.3.dist-info/licenses/LICENSE,sha256=OtXSlJyUGegIJFsZKNIhua8XYZ2tOgxNP1DYa9Hbgns,1068
+ langtune-0.0.3.dist-info/METADATA,sha256=qZ-qydhQztiysD9u7NWfmVJt7W5m9tZVGj88nvrr0Ns,10835
+ langtune-0.0.3.dist-info/WHEEL,sha256=_zCd3N1l69ArxyTb8rzEoP9TpbYXkqRFSNOD5OuxnTs,91
+ langtune-0.0.3.dist-info/entry_points.txt,sha256=kjkeBLj0Lh5j7cMkm7_f8v4BpVpJDir7_zPh-KJgGcA,65
+ langtune-0.0.3.dist-info/top_level.txt,sha256=R0BPR95TtWx__rtQReaNmb4dbyGx0q2AypdeQYzxj7s,9
+ langtune-0.0.3.dist-info/RECORD,,
@@ -1,7 +0,0 @@
- langtune/__init__.py,sha256=47DEQpj8HBSa-_TImW-5JCeuQeRkm5NMpJWZG3hSuFU,0
- langtune-0.0.2.dist-info/licenses/LICENSE,sha256=OtXSlJyUGegIJFsZKNIhua8XYZ2tOgxNP1DYa9Hbgns,1068
- langtune-0.0.2.dist-info/METADATA,sha256=e-pK5qp4pXXvPp-zq5KaoT8b3GMTuGJBD-jULopsf54,11026
- langtune-0.0.2.dist-info/WHEEL,sha256=_zCd3N1l69ArxyTb8rzEoP9TpbYXkqRFSNOD5OuxnTs,91
- langtune-0.0.2.dist-info/entry_points.txt,sha256=kjkeBLj0Lh5j7cMkm7_f8v4BpVpJDir7_zPh-KJgGcA,65
- langtune-0.0.2.dist-info/top_level.txt,sha256=R0BPR95TtWx__rtQReaNmb4dbyGx0q2AypdeQYzxj7s,9
- langtune-0.0.2.dist-info/RECORD,,
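Each RECORD row is `path,sha256=<digest>,size`, where the digest is the urlsafe base64 encoding of the file's SHA-256 hash with the trailing `=` padding stripped (per the wheel spec). A standard-library check of any entry:

```python
import base64
import hashlib
from pathlib import Path

def record_hash(path: str) -> str:
    """Compute a wheel RECORD-style hash for a file."""
    digest = hashlib.sha256(Path(path).read_bytes()).digest()
    return "sha256=" + base64.urlsafe_b64encode(digest).rstrip(b"=").decode("ascii")

# An empty file, like the zero-byte langtune/__init__.py listed above, yields
# exactly the digest shown in both RECORDs: 47DEQpj8HBSa-_TImW-5JCeuQeRkm5NMpJWZG3hSuFU
```

Note that both versions ship an empty `langtune/__init__.py` (size 0, identical hash); only the wheel metadata changed between 0.0.2 and 0.0.3.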