gpt-pr 0.2.1__tar.gz → 0.4.0__tar.gz

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.

Potentially problematic release.

This version of gpt-pr might be problematic.

@@ -1,6 +1,6 @@
 Metadata-Version: 2.1
 Name: gpt-pr
-Version: 0.2.1
+Version: 0.4.0
 Summary: Automate your GitHub workflow with GPT-PR: an OpenAI powered library for streamlined PR generation.
 Home-page: http://github.com/alissonperez/gpt-pr
 Author: Alisson R. Perez
@@ -8,8 +8,9 @@ GPT-PR is an open-source command-line tool designed to streamline your GitHub wo
 
 - [Features](#features)
 - [Prerequisites](#prerequisites)
-- [Installation and Usage](#installation-and-usage)
-- [Authentication & API Keys](#authentication--api-keys)
+- [Installation](#installation)
+- [Configuration](#configuration)
+- [Usage](#usage)
 - [How to Contribute](#how-to-contribute)
 - [Roadmap](#roadmap)
 
@@ -40,7 +41,7 @@ pip install -U gpt-pr
 
 > Note: Use this command to **update** gpt-pr package to the latest version.
 
-2. Export API keys as environment variables ([Authentication & API Keys](#authentication--api-keys)).
+2. Setup API keys for GitHub and OpenAI, take a look at [Configuration](#configuration).
 
 3. Inside the Git repository you are working on, ensure you have pushed your branch to origin, then run:
 
@@ -63,7 +64,7 @@ cd gpt-pr
 pipenv install
 ```
 
-After exporting api keys as environment variables ([Authentication & API Keys](#authentication--api-keys)), you can use GPT-PR within any git project directory. Suppose you've cloned **this project** to `~/workplace/gpt-pr`, here's how you can use it:
+After setting up API keys ([Configuration](#configuration)), you can use GPT-PR within any git project directory. Suppose you've cloned **this project** to `~/workplace/gpt-pr`, here's how you can use it:
 
 ```bash
 PYTHONPATH=~/workplace/gpt-pr/gpt-pr \
@@ -71,34 +72,12 @@ PIPENV_PIPFILE=~/workplace/gpt-pr/Pipfile \
 pipenv run python ~/workplace/gpt-pr/gptpr/main.py --help
 ```
 
-## Usage
-
-### Generating Github Pull Requests
-
-To create a Pull request from your current branch commits to merge with `main` branch, just run:
-
-```
-gpt-pr
-```
-
-If you would like to compare with other base branch that is not `main`, just use `-b` param:
-
-```
-gpt-pr -b my-other-branch
-```
-
-### Usage help
-
-To show help commands:
-
-```
-gpt-pr -h
-```
-
-## Authentication & API Keys
+## Configuration
 
 ### Setting up GitHub Token (`GH_TOKEN`)
 
+GPT-PR tool will look for a `GH_TOKEN` in current shell env var OR in gpt-pr config file (at `~/.gpt-pr.ini`).
+
 To authenticate with GitHub, generate and export a GitHub Personal Access Token:
 
 1. Navigate to [GitHub's Personal Access Token page](https://github.com/settings/tokens).
@@ -106,7 +85,13 @@ To authenticate with GitHub, generate and export a GitHub Personal Access Token:
 3. Provide a description and select the required permissions `repo` for the token.
 4. Click "Generate token" at the bottom of the page.
 5. Copy the generated token.
-6. Export it as an environment variable:
+6. Set `gh_token` config running (supposing your gh token is `ghp_4Mb1QEr9gY5e8Lk3tN1KjPzX7W9z2V4HtJ2b`):
+
+```bash
+gpt-pr-config set gh_token ghp_4Mb1QEr9gY5e8Lk3tN1KjPzX7W9z2V4HtJ2b
+```
+
+Or just export it as an environment variable in your shell initializer:
 
 ```bash
 export GH_TOKEN=your_generated_token_here
@@ -114,6 +99,8 @@ export GH_TOKEN=your_generated_token_here
 
 ### Setting up OpenAI API Key (`OPENAI_API_KEY`)
 
+GPT-PR tool will look for a `OPENAI_API_KEY` env var in current shell OR in gpt-pr config file (at `~/.gpt-pr.ini`).
+
 This project needs to interact with the ChatGPT API to generate the pull request description. So, you need to generate and export an OpenAI API Key:
 
 1. Navigate to [OpenAI's API Key page](https://platform.openai.com/signup).
@@ -121,12 +108,74 @@ This project needs to interact with the ChatGPT API to generate the pull request
 3. Go to the API Keys section and click "Create new key."
 4. Provide a description and click "Create."
 5. Copy the generated API key.
-6. Export it as an environment variable:
+6. Set `openai_api_key` config running (supposing your openai_api_key is `QEr9gY5e8Lk3tN1KjPzX7W9z2V4Ht`):
+
+```bash
+gpt-pr-config set openai_api_key QEr9gY5e8Lk3tN1KjPzX7W9z2V4Ht
+```
+
+Or just export it as an environment variable in your shell initializer:
 
 ```bash
 export OPENAI_API_KEY=your_generated_api_key_here
 ```
 
+### Change OpenAI model
+
+To change OpenAI model, just run:
+
+```bash
+gpt-pr-config set openai_model gpt-3.5-turbo
+```
+
+To see a full list of available models, access [OpenAI Models Documentation](https://platform.openai.com/docs/models)
+
+### See all configs available
+
+To print all default configs and what is being used, just run:
+
+```bash
+gpt-pr-config print
+```
+
+### Reset config
+
+To reset any config to default value, just run:
+
+```bash
+gpt-pr-config reset config-name
+```
+
+Example:
+
+```bash
+gpt-pr-config reset openai_model
+```
+
+## Usage
+
+### Generating Github Pull Requests
+
+To create a Pull request from your current branch commits to merge with `main` branch, just run:
+
+```
+gpt-pr
+```
+
+If you would like to compare with other base branch that is not `main`, just use `-b` param:
+
+```
+gpt-pr -b my-other-branch
+```
+
+### Usage help
+
+To show help commands:
+
+```
+gpt-pr -h
+```
+
 Output:
 ![image](https://github.com/alissonperez/gpt-pr/assets/756802/cc6c0ca4-5759-44ce-ad35-e4e7305b3875)
 
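The README changes above describe a lookup that checks both the `~/.gpt-pr.ini` config file and a shell environment variable. That resolution order can be sketched with stdlib `configparser`; the function name `resolve_token` is illustrative only, not part of gpt-pr:

```python
# Illustrative sketch (not gpt-pr's actual API): resolve a credential from
# an INI file's [user] section first, then fall back to the environment.
import configparser
import os


def resolve_token(name, config_path, environ=os.environ):
    parser = configparser.ConfigParser()
    parser.read(config_path)  # a missing file is silently ignored
    # fallback='' covers both a missing [user] section and a missing option
    value = parser.get('user', name, fallback='')
    return value or environ.get(name, '')
```

Note that `configparser` option names are case-insensitive by default, so `GH_TOKEN` and `gh_token` refer to the same entry.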
@@ -1,6 +1,6 @@
 Metadata-Version: 2.1
 Name: gpt-pr
-Version: 0.2.1
+Version: 0.4.0
 Summary: Automate your GitHub workflow with GPT-PR: an OpenAI powered library for streamlined PR generation.
 Home-page: http://github.com/alissonperez/gpt-pr
 Author: Alisson R. Perez
@@ -11,10 +11,14 @@ gpt_pr.egg-info/not-zip-safe
 gpt_pr.egg-info/requires.txt
 gpt_pr.egg-info/top_level.txt
 gptpr/__init__.py
+gptpr/checkversion.py
+gptpr/config.py
 gptpr/consolecolor.py
 gptpr/gh.py
 gptpr/gitutil.py
 gptpr/main.py
 gptpr/prdata.py
+gptpr/test_checkversion.py
+gptpr/test_config.py
 gptpr/test_prdata.py
 gptpr/version.py
@@ -1,2 +1,3 @@
 [console_scripts]
 gpt-pr = gptpr.main:main
+gpt-pr-config = gptpr.main:run_config
@@ -1,5 +1,5 @@
 cffi==1.15.1
-cryptography==42.0.5
+cryptography==42.0.7
 fire==0.6.0
 pycparser==2.21
 wcwidth==0.2.13
@@ -12,11 +12,10 @@ prompt-toolkit==3.0.43
 openai==1.14.0
 
 [:python_version < "3.11"]
-exceptiongroup==1.2.0
+exceptiongroup==1.2.1
 
 [:python_version < "3.8"]
 cached-property==1.5.2
-typing-extensions==4.7.1
 
 [:python_version == "3.7"]
 importlib-metadata==6.7.0
@@ -28,7 +27,7 @@ six==1.16.0
 deprecated==1.2.14
 
 [:python_version >= "3.5"]
-idna==3.6
+idna==3.7
 
 [:python_version >= "3.6"]
 certifi==2024.2.2
@@ -52,7 +51,8 @@ requests==2.31.0
 smmap==5.0.1
 sniffio==1.3.1
 termcolor==2.3.0
-tqdm==4.66.2
+tqdm==4.66.4
+typing-extensions==4.7.1
 urllib3==2.0.7
 zipp==3.15.0
 
@@ -0,0 +1,89 @@
+import requests
+import os
+import json
+import tempfile
+from gptpr.version import __version__
+from datetime import datetime, timedelta
+
+from gptpr import consolecolor as cc
+
+
+PACKAGE_NAME = 'gpt-pr'
+CACHE_FILE = os.path.join(os.path.expanduser("~"), '.gpt_pr_update_cache.json')
+CACHE_DURATION = timedelta(days=1)
+
+
+def cache_daily_version(func):
+    def wrapper(*args, **kwargs):
+        cache = load_cache()
+        last_checked = cache.get('last_checked')
+
+        if last_checked:
+            last_checked = datetime.fromisoformat(last_checked)
+
+            if datetime.now() - last_checked < CACHE_DURATION:
+                # Use cached version info
+                latest_version = cache.get('latest_version')
+                return latest_version
+
+        latest_version = func(*args, **kwargs)
+        cache = {
+            'last_checked': datetime.now().isoformat(),
+            'latest_version': latest_version
+        }
+        save_cache(cache)
+
+        return latest_version
+
+    return wrapper
+
+
+def get_cache_file_path():
+    temp_dir = tempfile.gettempdir()
+    return os.path.join(temp_dir, f'{PACKAGE_NAME}_update_cache.json')
+
+
+@cache_daily_version
+def get_latest_version():
+    url = f'https://pypi.org/pypi/{PACKAGE_NAME}/json'
+
+    try:
+        response = requests.get(url)
+        response.raise_for_status()
+        data = response.json()
+        return data['info']['version']
+    except requests.exceptions.RequestException as e:
+        print(f"Error fetching latest version info: {e}")
+        return None
+
+
+def load_cache():
+    cache_file = get_cache_file_path()
+    if os.path.exists(cache_file):
+        with open(cache_file, 'r') as file:
+            return json.load(file)
+
+    return {}
+
+
+def save_cache(data):
+    cache_file = get_cache_file_path()
+    with open(cache_file, 'w') as file:
+        file.write(json.dumps(data))
+
+
+def check_for_updates():
+    latest_version = get_latest_version()
+
+    if latest_version and latest_version != __version__:
+        print('')
+        print(cc.yellow(
+            f'A new version of {PACKAGE_NAME} is available ({latest_version}). '
+            f'You are using version {__version__}. Please update by running'),
+            cc.green(f'pip install --upgrade {PACKAGE_NAME}.'))
+        print('')
+
+
+if __name__ == "__main__":
+    check_for_updates()
+    # Your CLI code here
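The decorator in the new `checkversion.py` above boils down to one pattern: reuse a JSON-cached value while its timestamp is younger than a duration, otherwise call through and rewrite the cache. A minimal self-contained sketch of the same idea (names here are illustrative, not gpt-pr's):

```python
import json
import os
from datetime import datetime, timedelta


def cache_for(func, path, duration=timedelta(days=1)):
    """Wrap a zero-argument func, reusing its JSON-cached result for `duration`."""
    def wrapper():
        if os.path.exists(path):
            with open(path) as fh:
                cache = json.load(fh)
            checked = datetime.fromisoformat(cache['last_checked'])
            if datetime.now() - checked < duration:
                return cache['value']  # still fresh: skip the real call
        value = func()
        with open(path, 'w') as fh:
            json.dump({'last_checked': datetime.now().isoformat(), 'value': value}, fh)
        return value
    return wrapper
```

Persisting the timestamp beside the value is what lets the check survive across CLI invocations, which an in-memory `functools` cache would not.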
@@ -0,0 +1,97 @@
+from copy import deepcopy
+import configparser
+import os
+
+
+def config_command_example(name, value_sample):
+    return f'gpt-pr-config set {name} {value_sample}'
+
+
+CONFIG_README_SECTION = 'https://github.com/alissonperez/gpt-pr?tab=readme-ov-file#authentication--api-keys'
+
+
+class Config:
+
+    config_filename = '.gpt-pr.ini'
+
+    _default_config = {
+        # Github
+        'GH_TOKEN': '',
+
+        # Open AI info
+        'OPENAI_MODEL': 'gpt-4o',
+        'OPENAI_API_KEY': '',
+    }
+
+    def __init__(self, config_dir=None):
+        self.default_config = deepcopy(self._default_config)
+        self._config_dir = config_dir or os.path.expanduser('~')
+        self._config = configparser.ConfigParser()
+        self._initialized = False
+
+    def load(self):
+        if self._initialized:
+            return
+
+        config_file_path = self.get_filepath()
+
+        if os.path.exists(config_file_path):
+            self._config.read(config_file_path)
+            self._ensure_default_values()
+        else:
+            self._config['user'] = {}
+            self._config['DEFAULT'] = deepcopy(self.default_config)
+            self.persist()
+
+        self._initialized = True
+
+    def _ensure_default_values(self):
+        added = False
+        for key, value in self.default_config.items():
+            if key not in self._config['DEFAULT']:
+                self._config['DEFAULT'][key] = value
+                added = True
+
+        if added:
+            self.persist()
+
+    def persist(self):
+        config_file_path = self.get_filepath()
+
+        with open(config_file_path, 'w') as configfile:
+            self._config.write(configfile)
+
+    def get_filepath(self):
+        return os.path.join(self._config_dir, self.config_filename)
+
+    def set_user_config(self, name, value):
+        self.load()
+        self._config['user'][name] = value
+
+    def reset_user_config(self, name):
+        self.load()
+        self._config['user'][name] = self.default_config[name]
+        self.persist()
+
+    def get_user_config(self, name):
+        self.load()
+        return self._config['user'][name]
+
+    def all_values(self):
+        self.load()
+
+        # iterate over all sections and values and return them in a list
+        result = []
+
+        # add default section
+        for option in self._config['DEFAULT']:
+            result.append(('DEFAULT', option, self._config['DEFAULT'][option]))
+
+        for section in self._config.sections():
+            for option in self._config[section]:
+                result.append((section, option, self._config[section][option]))
+
+        return result
+
+
+config = Config()
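The `Config` class above leans on a `configparser` behavior worth calling out: options in the `DEFAULT` section show through every other section until that section overrides them, which is why `gpt-pr-config print` lists both a `DEFAULT` and a `user` copy of each key. A quick demonstration:

```python
import configparser

parser = configparser.ConfigParser()
parser['DEFAULT'] = {'openai_model': 'gpt-4o', 'gh_token': ''}
parser['user'] = {}

# An empty user section inherits values from DEFAULT...
inherited = parser['user']['openai_model']

# ...and an explicit user value shadows the default without changing it.
parser['user']['openai_model'] = 'gpt-3.5-turbo'
overridden = parser['user']['openai_model']
default_still = parser['DEFAULT']['openai_model']
```

This fallback is also why `reset_user_config` only needs to write the default value back into the `user` section.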
@@ -1,14 +1,24 @@
 import os
 from github import Github
 from InquirerPy import inquirer
+from gptpr.config import config, config_command_example, CONFIG_README_SECTION
 
-GH_TOKEN = os.environ.get('GH_TOKEN')
 
-if not GH_TOKEN:
-    print("Please set GH_TOKEN environment variable")
-    exit(1)
+def _get_gh_token():
+    gh_token = config.get_user_config('GH_TOKEN')
+    if not gh_token:
+        gh_token = os.environ.get('GH_TOKEN')
 
-gh = Github(GH_TOKEN)
+    if not gh_token:
+        print('Please set "gh_token" config. Just run:',
+              config_command_example('gh_token', '[my gh token]'),
+              'more about at', CONFIG_README_SECTION)
+        exit(1)
+
+    return gh_token
+
+
+gh = Github(_get_gh_token())
 
 
 def create_pr(pr_data, yield_confirmation):
@@ -0,0 +1,102 @@
+import fire
+from InquirerPy import inquirer
+
+from gptpr.gitutil import get_branch_info
+from gptpr.gh import create_pr
+from gptpr.prdata import get_pr_data
+from gptpr.version import __version__
+from gptpr.config import config, config_command_example, CONFIG_README_SECTION
+from gptpr import consolecolor as cc
+from gptpr.checkversion import check_for_updates
+
+
+def run(base_branch='main', yield_confirmation=False, version=False):
+    '''
+    Create Pull Requests from current branch with base branch (default 'main' branch)
+    '''
+
+    if version:
+        print('Current version:', __version__)
+        return
+
+    branch_info = get_branch_info(base_branch, yield_confirmation)
+
+    if not branch_info:
+        exit(0)
+
+    pr_data = None
+    generate_pr_data = True
+    while generate_pr_data:
+        pr_data = get_pr_data(branch_info)
+        print('')
+        print('#########################################')
+        print(pr_data.to_display())
+        print('#########################################')
+        print('')
+
+        if yield_confirmation:
+            break
+
+        generate_pr_data = not inquirer.confirm(
+            message="Create PR with this? If 'no', let's try again...",
+            default=True).execute()
+
+        if generate_pr_data:
+            print('Generating another PR data...')
+
+    create_pr(pr_data, yield_confirmation)
+
+
+def set_config(name, value):
+    name = name.upper()
+    config.set_user_config(name, value)
+    config.persist()
+
+    print('Config value', cc.bold(name), 'set to', cc.yellow(value))
+
+
+def get_config(name):
+    upper_name = name.upper()
+    print('Config value', cc.bold(name), '=', cc.yellow(config.get_user_config(upper_name)))
+
+
+def reset_config(name):
+    upper_name = name.upper()
+    config.reset_user_config(upper_name)
+    print('Config value', cc.bold(name), '=', cc.yellow(config.get_user_config(upper_name)))
+
+
+def print_config():
+    print('Config values at', cc.yellow(config.get_filepath()))
+    print('')
+    print('To set values, just run:', cc.yellow(config_command_example('[config name]', '[value]')))
+    print('More about at', cc.yellow(CONFIG_README_SECTION))
+    print('')
+    current_section = None
+    for section, option, value in config.all_values():
+        if current_section != section:
+            print('')
+            current_section = section
+
+        print(f'[{cc.bold(section)}]', option, '=', cc.yellow(value))
+
+
+def main():
+    check_for_updates()
+
+    fire.Fire(run)
+
+
+def run_config():
+    check_for_updates()
+
+    fire.Fire({
+        'set': set_config,
+        'get': get_config,
+        'print': print_config,
+        'reset': reset_config
+    })
+
+
+if __name__ == '__main__':
+    main()
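In `run_config` above, `fire.Fire` is given a dict, which turns the first CLI argument (`set`, `get`, `print`, `reset`) into a call on the matching function with the remaining arguments. A rough stdlib approximation of that dispatch (a sketch of the idea, not python-fire's actual behavior):

```python
import sys


def dispatch(commands, argv=None):
    """Route argv[0] to the matching function, passing the rest as positional args."""
    argv = sys.argv[1:] if argv is None else argv
    if not argv or argv[0] not in commands:
        return 'usage: ' + ' | '.join(sorted(commands))
    return commands[argv[0]](*argv[1:])
```

This is the shape behind `gpt-pr-config set gh_token <token>`: the first token selects `set_config`, the rest become its arguments.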
@@ -4,6 +4,7 @@ import os
 from openai import OpenAI
 
 from gptpr.gitutil import BranchInfo
+from gptpr.config import config
 import gptpr.consolecolor as cc
 
 TOKENIZER_RATIO = 4
@@ -37,6 +38,20 @@ def _get_pr_template():
     return pr_template
 
 
+def _get_open_ai_key():
+    api_key = config.get_user_config('OPENAI_API_KEY')
+
+    if not api_key:
+        api_key = os.environ.get('OPENAI_API_KEY')
+
+    if not api_key:
+        print('Please set "openai_api_key" config, just run:',
+              cc.yellow('gpt-pr-config set openai_api_key [open ai key]'))
+        exit(1)
+
+    return api_key
+
+
 @dataclass
 class PrData():
     branch_info: BranchInfo
@@ -108,17 +123,14 @@ def get_pr_data(branch_info):
     else:
         messages.append({'role': 'user', 'content': 'Diff changes:\n' + branch_info.diff})
 
-    openai_api_key = os.environ.get('OPENAI_API_KEY')
-
-    if not openai_api_key:
-        print("Please set OPENAI_API_KEY environment variable.")
-        exit(1)
+    client = OpenAI(api_key=_get_open_ai_key())
 
-    client = OpenAI(api_key=openai_api_key)
+    openai_model = config.get_user_config('OPENAI_MODEL')
+    print('Using OpenAI model:', cc.yellow(openai_model))
 
     chat_completion = client.chat.completions.create(
         messages=messages,
-        model='gpt-4-0613',
+        model=openai_model,
        functions=functions,
        function_call={'name': 'create_pr'},
        temperature=0,
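With the hunk above, the request built in `get_pr_data` looks roughly like the payload below. This is an illustrative sketch only: the `functions` schema is abbreviated and the values are placeholders, shown to make clear how the configured model and the forced `create_pr` function call fit together:

```python
# Placeholder payload sketch; values are illustrative, not gpt-pr's exact schema.
payload = {
    'model': 'gpt-4o',  # now read from the openai_model config instead of hard-coded
    'messages': [{'role': 'user', 'content': 'Diff changes:\n...'}],
    'functions': [{'name': 'create_pr', 'parameters': {'type': 'object'}}],
    'function_call': {'name': 'create_pr'},  # force a structured create_pr reply
    'temperature': 0,  # keep the generated PR description deterministic
}
```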
@@ -0,0 +1,126 @@
+import pytest
+import requests
+import json
+from datetime import datetime
+from unittest.mock import patch, mock_open
+
+from gptpr.version import __version__
+from gptpr.checkversion import (get_latest_version, load_cache,
+                                save_cache, check_for_updates,
+                                CACHE_DURATION)
+
+
+@pytest.fixture
+def mock_requests_get(mocker):
+    return mocker.patch('requests.get')
+
+
+@pytest.fixture
+def mock_os_path_exists(mocker):
+    return mocker.patch('os.path.exists')
+
+
+@pytest.fixture
+def mock_open_file(mocker):
+    return mocker.patch('builtins.open', mock_open())
+
+
+@pytest.fixture
+def mock_datetime(mocker):
+    return mocker.patch('gptpr.checkversion.datetime')
+
+
+def test_get_latest_version(mock_requests_get, mock_os_path_exists):
+    mock_os_path_exists.return_value = False
+    mock_response = mock_requests_get.return_value
+    mock_response.raise_for_status.return_value = None
+    mock_response.json.return_value = {'info': {'version': '2.0.0'}}
+
+    assert get_latest_version() == '2.0.0'
+
+
+def test_get_latest_version_error(mock_requests_get, mock_os_path_exists):
+    mock_os_path_exists.return_value = False
+    mock_requests_get.side_effect = requests.exceptions.RequestException
+
+    assert get_latest_version() is None
+
+
+def test_load_cache(mock_os_path_exists, mock_open_file):
+    mock_os_path_exists.return_value = True
+    mock_open_file.return_value.read.return_value = json.dumps({
+        'last_checked': datetime.now().isoformat(),
+        'latest_version': '2.0.0'
+    })
+
+    cache = load_cache()
+    assert cache['latest_version'] == '2.0.0'
+
+
+def test_load_cache_no_file(mock_os_path_exists):
+    mock_os_path_exists.return_value = False
+
+    cache = load_cache()
+    assert cache == {}
+
+
+def test_save_cache(mock_open_file):
+    data = {
+        'last_checked': datetime.now().isoformat(),
+        'latest_version': '2.0.0'
+    }
+
+    save_cache(data)
+    mock_open_file.return_value.write.assert_called_once_with(json.dumps(data))
+
+
+def test_check_for_updates_new_version(mocker, mock_datetime, mock_requests_get, mock_open_file):
+    # Set up mocks
+    last_checked_str = (datetime(2024, 1, 1) - CACHE_DURATION).isoformat()
+    mock_datetime.now.return_value = datetime(2024, 1, 2)
+    mock_datetime.fromisoformat.return_value = datetime.fromisoformat(last_checked_str)
+    mock_open_file.return_value.read.return_value = json.dumps({
+        'last_checked': last_checked_str,
+        'latest_version': '1.0.0'
+    })
+    mock_requests_get.return_value.raise_for_status.return_value = None
+    mock_requests_get.return_value.json.return_value = {'info': {'version': '2.0.0'}}
+
+    # Capture the print statements
+    with patch('builtins.print') as mocked_print:
+        check_for_updates()
+        assert mocked_print.call_count == 3
+
+
+def test_check_for_updates_no_new_version(mocker, mock_datetime, mock_requests_get, mock_open_file):
+    # Set up mocks
+    last_checked_str = (datetime(2024, 1, 1) - CACHE_DURATION).isoformat()
+    mock_datetime.now.return_value = datetime(2024, 1, 2)
+    mock_datetime.fromisoformat.return_value = datetime.fromisoformat(last_checked_str)
+    mock_open_file.return_value.read.return_value = json.dumps({
+        'last_checked': (datetime(2024, 1, 1) - CACHE_DURATION).isoformat(),
+        'latest_version': __version__
+    })
+    mock_requests_get.return_value.raise_for_status.return_value = None
+    mock_requests_get.return_value.json.return_value = {'info': {'version': __version__}}
+
+    # Capture the print statements
+    with patch('builtins.print') as mocked_print:
+        check_for_updates()
+        assert mocked_print.call_count == 0
+
+
+def test_check_for_updates_cache_valid(mock_datetime, mock_open_file):
+    # Set up mocks
+    last_checked_str = datetime(2024, 1, 2).isoformat()
+    mock_datetime.now.return_value = datetime(2024, 1, 2)
+    mock_datetime.fromisoformat.return_value = datetime.fromisoformat(last_checked_str)
+    mock_open_file.return_value.read.return_value = json.dumps({
+        'last_checked': last_checked_str,
+        'latest_version': __version__
+    })
+
+    # Capture the print statements
+    with patch('builtins.print') as mocked_print:
+        check_for_updates()
+        assert mocked_print.call_count == 0
@@ -0,0 +1,99 @@
+import os
+import configparser
+
+from pytest import fixture
+
+from gptpr.config import Config
+
+
+@fixture
+def temp_config(tmpdir):
+    temp_dir = tmpdir.mkdir('config_dir')
+    config = Config(temp_dir)
+    return config, temp_dir
+
+
+def _check_config(config, temp_dir, config_list):
+    # Read the configuration file and verify its contents
+    config_to_test = configparser.ConfigParser()
+    config_to_test.read(os.path.join(str(temp_dir), config.config_filename))
+
+    for section, key, value in config_list:
+        assert config_to_test[section][key] == value
+
+
+def test_init_config_file(temp_config):
+    config, temp_dir = temp_config
+    config.load()
+
+    # Check if the file exists
+    assert os.path.isfile(os.path.join(str(temp_dir), config.config_filename))
+
+    _check_config(config, temp_dir, [
+        ('DEFAULT', 'OPENAI_MODEL', 'gpt-4o'),
+        ('DEFAULT', 'OPENAI_API_KEY', ''),
+    ])
+
+
+def test_new_default_value_should_be_added(temp_config):
+    config, temp_dir = temp_config
+    config.load()  # data was written to the file
+
+    new_config = Config(temp_dir)
+
+    # Add a new default value
+    new_config.default_config['NEW_DEFAULT'] = 'new_default_value'
+    new_config.load()  # Should update config file...
+
+    _check_config(new_config, temp_dir, [
+        ('DEFAULT', 'NEW_DEFAULT', 'new_default_value'),
+    ])
+
+
+def test_set_user_config(temp_config):
+    config, temp_dir = temp_config
+
+    config.set_user_config('OPENAI_MODEL', 'gpt-3.5')
+    config.persist()
+
+    # Read the configuration file and verify its contents
+    config_to_test = configparser.ConfigParser()
+    config_to_test.read(os.path.join(str(temp_dir), config.config_filename))
+
+    _check_config(config, temp_dir, [
+        ('user', 'OPENAI_MODEL', 'gpt-3.5'),
+        ('user', 'OPENAI_API_KEY', ''),
+    ])
+
+
+def test_all_values(temp_config):
+    config, temp_dir = temp_config
+
+    all_values = config.all_values()
+
+    assert all_values == [
+        ('DEFAULT', 'gh_token', ''),
+        ('DEFAULT', 'openai_model', 'gpt-4o'),
+        ('DEFAULT', 'openai_api_key', ''),
+        ('user', 'gh_token', ''),
+        ('user', 'openai_model', 'gpt-4o'),
+        ('user', 'openai_api_key', ''),
+    ]
+
+
+def test_reset_user_config(temp_config):
+    config, temp_dir = temp_config
+
+    config.set_user_config('OPENAI_MODEL', 'gpt-3.5')
+    config.persist()
+
+    config.reset_user_config('OPENAI_MODEL')
+
+    # Read the configuration file and verify its contents
+    config_to_test = configparser.ConfigParser()
+    config_to_test.read(os.path.join(str(temp_dir), config.config_filename))
+
+    _check_config(config, temp_dir, [
+        ('user', 'OPENAI_MODEL', 'gpt-4o'),
+        ('user', 'OPENAI_API_KEY', ''),
+    ])
@@ -0,0 +1 @@
+__version__ = "0.4.0"
@@ -5,17 +5,17 @@ cached-property==1.5.2; python_version < '3.8'
 certifi==2024.2.2; python_version >= '3.6'
 cffi==1.15.1
 charset-normalizer==3.3.2; python_full_version >= '3.7.0'
-cryptography==42.0.5
+cryptography==42.0.7
 deprecated==1.2.14; python_version >= '2.7' and python_version not in '3.0, 3.1, 3.2, 3.3'
 distro==1.9.0; python_version >= '3.6'
-exceptiongroup==1.2.0; python_version < '3.11'
+exceptiongroup==1.2.1; python_version < '3.11'
 fire==0.6.0
 gitdb==4.0.11; python_version >= '3.7'
 gitpython==3.1.42; python_version >= '3.7'
 h11==0.14.0; python_version >= '3.7'
 httpcore==0.17.3; python_version >= '3.7'
 httpx==0.24.1; python_version >= '3.7'
-idna==3.6; python_version >= '3.5'
+idna==3.7; python_version >= '3.5'
 importlib-metadata==6.7.0; python_version == '3.7'
 inquirerpy==0.3.4; python_version >= '3.7' and python_version < '4.0'
 openai==1.14.0; python_full_version >= '3.7.1'
@@ -32,8 +32,8 @@ six==1.16.0; python_version >= '2.7' and python_version not in '3.0, 3.1, 3.2'
 smmap==5.0.1; python_version >= '3.7'
 sniffio==1.3.1; python_version >= '3.7'
 termcolor==2.3.0; python_version >= '3.7'
-tqdm==4.66.2; python_version >= '3.7'
-typing-extensions==4.7.1; python_version < '3.8'
+tqdm==4.66.4; python_version >= '3.7'
+typing-extensions==4.7.1; python_version >= '3.7'
 urllib3==2.0.7; python_version >= '3.7'
 wcwidth==0.2.13
 wrapt==1.16.0; python_version >= '3.6'
@@ -39,7 +39,10 @@ setup(name='gpt-pr',
       author_email='alissonperez@outlook.com',
       license='MIT',
       entry_points={
-          'console_scripts': ['gpt-pr=gptpr.main:main'],
+          'console_scripts': [
+              'gpt-pr=gptpr.main:main',
+              'gpt-pr-config=gptpr.main:run_config',
+          ],
       },
       packages=find_packages('.'),
       include_package_data=True,
@@ -1,52 +0,0 @@
-import fire
-from InquirerPy import inquirer
-
-from gptpr.gitutil import get_branch_info
-from gptpr.gh import create_pr
-from gptpr.prdata import get_pr_data
-from gptpr.version import __version__
-
-
-def run(base_branch='main', yield_confirmation=False, version=False):
-    '''
-    Create Pull Requests from current branch with base branch (default 'main' branch)
-    '''
-
-    if version:
-        print('Current version:', __version__)
-        return
-
-    branch_info = get_branch_info(base_branch, yield_confirmation)
-
-    if not branch_info:
-        exit(0)
-
-    pr_data = None
-    generate_pr_data = True
-    while generate_pr_data:
-        pr_data = get_pr_data(branch_info)
-        print('')
-        print('#########################################')
-        print(pr_data.to_display())
-        print('#########################################')
-        print('')
-
-        if yield_confirmation:
-            break
-
-        generate_pr_data = not inquirer.confirm(
-            message="Create PR with this? If 'no', let's try again...",
-            default=True).execute()
-
-        if generate_pr_data:
-            print('Generating another PR data...')
-
-    create_pr(pr_data, yield_confirmation)
-
-
-def main():
-    fire.Fire(run)
-
-
-if __name__ == '__main__':
-    main()
@@ -1 +0,0 @@
-__version__ = "0.2.1"