gpt-batch 0.1.2__tar.gz → 0.1.5__tar.gz

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
@@ -1,6 +1,6 @@
 Metadata-Version: 2.1
 Name: gpt_batch
-Version: 0.1.2
+Version: 0.1.5
 Summary: A package for batch processing with OpenAI API.
 Home-page: https://github.com/fengsxy/gpt_batch
 Author: Ted Yu
@@ -21,7 +21,7 @@ A simple tool to batch process messages using OpenAI's GPT models. `GPTBatcher`
 To get started with `GPTBatcher`, clone this repository to your local machine. Navigate to the repository directory and install the required dependencies (if any) by running:
 
 ```bash
-pip install -r requirements.txt
+pip install gpt_batch
 ```
 
 ## Quick Start
@@ -75,11 +75,4 @@ The `GPTBatcher` class can be customized with several parameters to adjust its p
 
 For more detailed documentation on the parameters and methods, refer to the class docstring.
 
-## License
-
-Specify your licensing information here.
-
-```
-
-This README provides clear instructions on how to install and use the `GPTBatcher`, along with detailed explanations of its configuration parameters. Adjust the "License" section as necessary based on your project's licensing terms.
 
@@ -10,7 +10,7 @@ A simple tool to batch process messages using OpenAI's GPT models. `GPTBatcher`
 To get started with `GPTBatcher`, clone this repository to your local machine. Navigate to the repository directory and install the required dependencies (if any) by running:
 
 ```bash
-pip install -r requirements.txt
+pip install gpt_batch
 ```
 
 ## Quick Start
@@ -63,11 +63,3 @@ The `GPTBatcher` class can be customized with several parameters to adjust its p
 - **miss_index** (list): Tracks indices of requests that failed to process correctly.
 
 For more detailed documentation on the parameters and methods, refer to the class docstring.
-
-## License
-
-Specify your licensing information here.
-
-```
-
-This README provides clear instructions on how to install and use the `GPTBatcher`, along with detailed explanations of its configuration parameters. Adjust the "License" section as necessary based on your project's licensing terms.
@@ -57,22 +57,26 @@ class GPTBatcher:
         new_list = []
         num_workers = self.num_workers
         timeout_duration = self.timeout_duration
-        retry_attempts=2
-
+        retry_attempts = 2
+
         executor = ThreadPoolExecutor(max_workers=num_workers)
         message_chunks = list(self.chunk_list(message_list, num_workers))
-        for chunk in tqdm(message_chunks, desc="Processing messages"):
-            future_to_message = {executor.submit(self.get_attitude, message): message for message in chunk}
-            for _ in range(retry_attempts):
-                done, not_done = wait(future_to_message.keys(), timeout=timeout_duration)
-                for future in not_done:
-                    future.cancel()
-                new_list.extend(future.result() for future in done if future.done())
-                if len(not_done) == 0:
-                    break
-                future_to_message = {executor.submit(self.get_attitude, future_to_message[future]): future_to_message[future] for future, msg in not_done}
-        executor.shutdown(wait=False)
-        return new_list
+        try:
+            for chunk in tqdm(message_chunks, desc="Processing messages"):
+                future_to_message = {executor.submit(self.get_attitude, message): message for message in chunk}
+                for _ in range(retry_attempts):
+                    done, not_done = wait(future_to_message.keys(), timeout=timeout_duration)
+                    for future in not_done:
+                        future.cancel()
+                    new_list.extend(future.result() for future in done if future.done())
+                    if len(not_done) == 0:
+                        break
+                    future_to_message = {executor.submit(self.get_attitude, future_to_message[future]): future for future in not_done}
+        except Exception as e:
+            print(f"Error occurred: {e}")
+        finally:
+            executor.shutdown(wait=False)
+        return new_list
 
     def complete_attitude_list(self,attitude_list, max_length):
         completed_list = []
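The hunk above wraps a general timeout-and-retry pattern built on `concurrent.futures`: submit a chunk of work, `wait` with a timeout, cancel whatever did not finish, collect the completed results, and re-submit the stragglers for a bounded number of rounds. A minimal self-contained sketch of that pattern is below; `run_with_retries` and its `worker` callable are hypothetical stand-ins for the batcher's `get_attitude` method, not part of the package.

```python
# Sketch of the timeout-and-retry pattern used in the diff above.
# `run_with_retries` and `worker` are hypothetical names for illustration.
from concurrent.futures import ThreadPoolExecutor, wait


def run_with_retries(items, worker, num_workers=4, timeout=5.0, retry_attempts=2):
    """Run worker(item) for each item; re-submit items whose futures
    time out, for up to retry_attempts rounds."""
    results = []
    executor = ThreadPoolExecutor(max_workers=num_workers)
    try:
        # Map each future back to the item it was submitted with.
        future_to_item = {executor.submit(worker, item): item for item in items}
        for _ in range(retry_attempts):
            done, not_done = wait(future_to_item.keys(), timeout=timeout)
            for future in not_done:
                future.cancel()  # best effort; running futures keep running
            results.extend(f.result() for f in done if f.done())
            if not not_done:
                break
            # Re-submit only the items that did not finish in time;
            # keeping the *item* as the dict value lets a later round
            # re-submit it again if needed.
            future_to_item = {
                executor.submit(worker, future_to_item[f]): future_to_item[f]
                for f in not_done
            }
    finally:
        # wait=False so a hung worker cannot block shutdown.
        executor.shutdown(wait=False)
    return results
```

Results arrive in completion order, not submission order, which is why the real class tracks a `miss_index` of requests that never completed.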
@@ -1,6 +1,6 @@
 Metadata-Version: 2.1
 Name: gpt-batch
-Version: 0.1.2
+Version: 0.1.5
 Summary: A package for batch processing with OpenAI API.
 Home-page: https://github.com/fengsxy/gpt_batch
 Author: Ted Yu
@@ -21,7 +21,7 @@ A simple tool to batch process messages using OpenAI's GPT models. `GPTBatcher`
 To get started with `GPTBatcher`, clone this repository to your local machine. Navigate to the repository directory and install the required dependencies (if any) by running:
 
 ```bash
-pip install -r requirements.txt
+pip install gpt_batch
 ```
 
 ## Quick Start
@@ -75,11 +75,4 @@ The `GPTBatcher` class can be customized with several parameters to adjust its p
 
 For more detailed documentation on the parameters and methods, refer to the class docstring.
 
-## License
-
-Specify your licensing information here.
-
-```
-
-This README provides clear instructions on how to install and use the `GPTBatcher`, along with detailed explanations of its configuration parameters. Adjust the "License" section as necessary based on your project's licensing terms.
 
@@ -2,7 +2,7 @@ from setuptools import setup, find_packages
 
 setup(
     name='gpt_batch',
-    version='0.1.2',
+    version='0.1.5',
     packages=find_packages(),
     install_requires=[
         'openai', 'tqdm'
File without changes