cs2tracker 2.1.6__tar.gz → 2.1.8__tar.gz
This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
Potentially problematic release: this version of cs2tracker might be problematic.
- {cs2tracker-2.1.6 → cs2tracker-2.1.8}/.pylintrc +2 -1
- {cs2tracker-2.1.6 → cs2tracker-2.1.8}/PKG-INFO +9 -6
- {cs2tracker-2.1.6 → cs2tracker-2.1.8}/README.md +8 -5
- {cs2tracker-2.1.6 → cs2tracker-2.1.8}/cs2tracker/_version.py +2 -2
- {cs2tracker-2.1.6 → cs2tracker-2.1.8}/cs2tracker/application.py +68 -22
- {cs2tracker-2.1.6 → cs2tracker-2.1.8}/cs2tracker/constants.py +6 -2
- cs2tracker-2.1.8/cs2tracker/data/config.ini +205 -0
- {cs2tracker-2.1.6 → cs2tracker-2.1.8}/cs2tracker/main.py +5 -1
- {cs2tracker-2.1.6 → cs2tracker-2.1.8}/cs2tracker/scraper.py +200 -46
- {cs2tracker-2.1.6 → cs2tracker-2.1.8}/cs2tracker.egg-info/PKG-INFO +9 -6
- {cs2tracker-2.1.6 → cs2tracker-2.1.8}/cs2tracker.egg-info/SOURCES.txt +1 -1
- cs2tracker-2.1.6/cs2tracker/data/config.ini +0 -198
- {cs2tracker-2.1.6 → cs2tracker-2.1.8}/.flake8 +0 -0
- {cs2tracker-2.1.6 → cs2tracker-2.1.8}/.gitignore +0 -0
- {cs2tracker-2.1.6 → cs2tracker-2.1.8}/.isort.cfg +0 -0
- {cs2tracker-2.1.6 → cs2tracker-2.1.8}/.pre-commit-config.yaml +0 -0
- {cs2tracker-2.1.6 → cs2tracker-2.1.8}/LICENSE.md +0 -0
- {cs2tracker-2.1.6 → cs2tracker-2.1.8}/MANIFEST.in +0 -0
- /cs2tracker-2.1.6/assets/icons8-counter-strike-bubbles-96.png → /cs2tracker-2.1.8/assets/icon.png +0 -0
- {cs2tracker-2.1.6 → cs2tracker-2.1.8}/cs2tracker/__init__.py +0 -0
- {cs2tracker-2.1.6 → cs2tracker-2.1.8}/cs2tracker/__main__.py +0 -0
- {cs2tracker-2.1.6 → cs2tracker-2.1.8}/cs2tracker/data/output.csv +0 -0
- {cs2tracker-2.1.6 → cs2tracker-2.1.8}/cs2tracker/padded_console.py +0 -0
- {cs2tracker-2.1.6 → cs2tracker-2.1.8}/cs2tracker.egg-info/dependency_links.txt +0 -0
- {cs2tracker-2.1.6 → cs2tracker-2.1.8}/cs2tracker.egg-info/entry_points.txt +0 -0
- {cs2tracker-2.1.6 → cs2tracker-2.1.8}/cs2tracker.egg-info/requires.txt +0 -0
- {cs2tracker-2.1.6 → cs2tracker-2.1.8}/cs2tracker.egg-info/top_level.txt +0 -0
- {cs2tracker-2.1.6 → cs2tracker-2.1.8}/pyproject.toml +0 -0
- {cs2tracker-2.1.6 → cs2tracker-2.1.8}/requirements.txt +0 -0
- {cs2tracker-2.1.6 → cs2tracker-2.1.8}/setup.cfg +0 -0
--- a/PKG-INFO
+++ b/PKG-INFO
@@ -1,6 +1,6 @@
 Metadata-Version: 2.4
 Name: cs2tracker
-Version: 2.1.6
+Version: 2.1.8
 Summary: Tracking the steam market prices of CS2 items
 Home-page: https://github.com/ashiven/cs2tracker
 Author: Jannik Novak
@@ -40,12 +40,13 @@ Dynamic: license-file
 
 ### Prerequisites
 
-- Download and install the latest versions of [Python](https://www.python.org/downloads/) and [Pip](https://pypi.org/project/pip/). (Required
+- Download and install the latest versions of [Python](https://www.python.org/downloads/) and [Pip](https://pypi.org/project/pip/). (Required on Linux)
 - Register for the [Crawlbase Smart Proxy API](https://crawlbase.com/) and retrieve your API key. (Optional)
+- Create a [Discord Webhook](https://support.discord.com/hc/en-us/articles/228383668-Intro-to-Webhooks) to be notified about recent price updates. (Optional)
 
 ### Setup
 
-#### Windows Executable
+#### Windows Executable _(no color support)_
 
 - Simply [download the latest executable](https://github.com/ashiven/cs2tracker/releases/latest/download/cs2tracker-windows.zip) and run it.
@@ -58,6 +59,7 @@ Dynamic: license-file
    ```
 
 2. Run it:
+
    ```bash
    cs2tracker
    ```
@@ -65,10 +67,11 @@ Dynamic: license-file
 ### Options
 
 - `Run!` to gather the current market prices of your items and calculate the total amount in USD and EUR.
-- `Edit Config` to
+- `Edit Config` to specify the numbers of items owned in the config file. You can also add items other than cases and sticker capsules following the format in the `Custom Items` section. (item_name = item_owned item_page)
 - `Show History` to see a price chart consisting of past calculations. A new data point is generated once a day upon running the program.
-- `Daily Background
-
+- `Daily Background Calculations` to automatically run a daily calculation of your investment in the background and save the results such that they can later be viewed via `Show History`.
+- `Receive Discord Notifications` to receive a notification on your Discord server when the program has finished calculating your investment. You need to set up a [webhook](https://support.discord.com/hc/en-us/articles/228383668-Intro-to-Webhooks) in your Discord server and enter the webhook url into the `discord_webhook_url` field in the config file.
+- `Proxy Requests` to prevent your requests from being rate limited by the steamcommunity server. You need to register for a free API key on [Crawlbase](crawlbase.com) and enter it into the `api_key` field in the config file.
 
 ---
 
--- a/README.md
+++ b/README.md
@@ -17,12 +17,13 @@
 
 ### Prerequisites
 
-- Download and install the latest versions of [Python](https://www.python.org/downloads/) and [Pip](https://pypi.org/project/pip/). (Required
+- Download and install the latest versions of [Python](https://www.python.org/downloads/) and [Pip](https://pypi.org/project/pip/). (Required on Linux)
 - Register for the [Crawlbase Smart Proxy API](https://crawlbase.com/) and retrieve your API key. (Optional)
+- Create a [Discord Webhook](https://support.discord.com/hc/en-us/articles/228383668-Intro-to-Webhooks) to be notified about recent price updates. (Optional)
 
 ### Setup
 
-#### Windows Executable
+#### Windows Executable _(no color support)_
 
 - Simply [download the latest executable](https://github.com/ashiven/cs2tracker/releases/latest/download/cs2tracker-windows.zip) and run it.
@@ -35,6 +36,7 @@
    ```
 
 2. Run it:
+
    ```bash
    cs2tracker
    ```
@@ -42,10 +44,11 @@
 ### Options
 
 - `Run!` to gather the current market prices of your items and calculate the total amount in USD and EUR.
-- `Edit Config` to
+- `Edit Config` to specify the numbers of items owned in the config file. You can also add items other than cases and sticker capsules following the format in the `Custom Items` section. (item_name = item_owned item_page)
 - `Show History` to see a price chart consisting of past calculations. A new data point is generated once a day upon running the program.
-- `Daily Background
-
+- `Daily Background Calculations` to automatically run a daily calculation of your investment in the background and save the results such that they can later be viewed via `Show History`.
+- `Receive Discord Notifications` to receive a notification on your Discord server when the program has finished calculating your investment. You need to set up a [webhook](https://support.discord.com/hc/en-us/articles/228383668-Intro-to-Webhooks) in your Discord server and enter the webhook url into the `discord_webhook_url` field in the config file.
+- `Proxy Requests` to prevent your requests from being rate limited by the steamcommunity server. You need to register for a free API key on [Crawlbase](crawlbase.com) and enter it into the `api_key` field in the config file.
 
 ---
 
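The `Custom Items` format referenced in the Options list above (`item_name = item_owned item_page`) can be read back with the standard-library `configparser`. A minimal sketch, assuming the section layout of the bundled `config.ini` (the `my_sticker` entry is a made-up example); interpolation is disabled because the Steam market URLs contain `%` escapes:

```python
from configparser import ConfigParser

# Disable interpolation: Steam market URLs contain "%20"-style escapes that
# the default BasicInterpolation would try (and fail) to expand.
config = ConfigParser(interpolation=None)
config.read_string(
    """
[Custom Items]
my_sticker = 2 https://steamcommunity.com/market/listings/730/Sticker%20%7C%20Example
"""
)

for name, value in config.items("Custom Items"):
    owned, page = value.split(maxsplit=1)  # "item_owned item_page"
    print(name, int(owned), page)
```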
--- a/cs2tracker/application.py
+++ b/cs2tracker/application.py
@@ -1,3 +1,4 @@
+import ctypes
 import tkinter as tk
 from subprocess import Popen
 from threading import Thread
@@ -9,6 +10,7 @@ from matplotlib.dates import DateFormatter
 
 from cs2tracker.constants import (
     CONFIG_FILE,
+    ICON_FILE,
     OS,
     OUTPUT_FILE,
     PYTHON_EXECUTABLE,
@@ -18,8 +20,9 @@ from cs2tracker.constants import (
 )
 from cs2tracker.scraper import Scraper
 
-
-
+APPLICATION_NAME = "CS2Tracker"
+
+WINDOW_SIZE = "500x450"
 BACKGROUND_COLOR = "#1e1e1e"
 BUTTON_COLOR = "#3c3f41"
 BUTTON_HOVER_COLOR = "#505354"
@@ -27,9 +30,8 @@ BUTTON_ACTIVE_COLOR = "#5c5f61"
 FONT_STYLE = "Segoe UI"
 FONT_COLOR = "white"
 
-SCRAPER_WINDOW_TITLE = "CS2Tracker"
 SCRAPER_WINDOW_HEIGHT = 40
-SCRAPER_WINDOW_WIDTH =
+SCRAPER_WINDOW_WIDTH = 120
 SCRAPER_WINDOW_BACKGROUND_COLOR = "Black"
 
 
@@ -58,21 +60,42 @@ class Application:
         button.bind("<Leave>", lambda _: button.config(bg=BUTTON_COLOR))
         return button
 
+    def _add_checkbox(self, frame, text, variable, command):
+        checkbox = tk.Checkbutton(
+            frame,
+            text=text,
+            variable=variable,
+            command=command,
+            bg=BACKGROUND_COLOR,
+            fg=FONT_COLOR,
+            selectcolor=BUTTON_COLOR,
+            activebackground=BACKGROUND_COLOR,
+            font=(FONT_STYLE, 10),
+            anchor="w",
+            padx=20,
+        )
+        checkbox.pack(fill="x", anchor="w", pady=2)
+
     def _configure_window(self):
         """Configure the main application window UI and add buttons for the main
         functionalities.
         """
         window = tk.Tk()
-        window.title(
+        window.title(APPLICATION_NAME)
         window.geometry(WINDOW_SIZE)
         window.configure(bg=BACKGROUND_COLOR)
+        if OS == OSType.WINDOWS:
+            app_id = "cs2tracker.unique.id"
+            ctypes.windll.shell32.SetCurrentProcessExplicitAppUserModelID(app_id)
+        icon = tk.PhotoImage(file=ICON_FILE)
+        window.wm_iconphoto(False, icon)
 
         frame = tk.Frame(window, bg=BACKGROUND_COLOR, padx=30, pady=30)
         frame.pack(expand=True, fill="both")
 
         label = tk.Label(
             frame,
-            text=f"Welcome to {
+            text=f"Welcome to {APPLICATION_NAME}!",
             font=(FONT_STYLE, 16, "bold"),
             fg=FONT_COLOR,
             bg=BACKGROUND_COLOR,
@@ -84,30 +107,46 @@ class Application:
         self._add_button(frame, "Show History (Chart)", self._draw_plot)
         self._add_button(frame, "Show History (File)", self._edit_log_file)
 
+        checkbox_frame = tk.Frame(frame, bg=BACKGROUND_COLOR)
+        checkbox_frame.pack(pady=(20, 0), fill="x")
+
         background_checkbox_value = tk.BooleanVar(value=self.scraper.identify_background_task())
-
-
-
-
-
-
-
-
-
-
+        self._add_checkbox(
+            checkbox_frame,
+            "Daily Background Calculations",
+            background_checkbox_value,
+            lambda: self._toggle_background_task(background_checkbox_value.get()),
+        )
+
+        discord_webhook_checkbox_value = tk.BooleanVar(
+            value=self.scraper.config.getboolean(
+                "App Settings", "discord_notifications", fallback=False
+            )
+        )
+        self._add_checkbox(
+            checkbox_frame,
+            "Receive Discord Notifications",
+            discord_webhook_checkbox_value,
+            lambda: self._toggle_discord_webhook(discord_webhook_checkbox_value.get()),
+        )
+
+        use_proxy_checkbox_value = tk.BooleanVar(
+            value=self.scraper.config.getboolean("App Settings", "use_proxy", fallback=False)
+        )
+        self._add_checkbox(
+            checkbox_frame,
+            "Proxy Requests",
+            use_proxy_checkbox_value,
+            lambda: self._toggle_use_proxy(use_proxy_checkbox_value.get()),
         )
-        background_checkbox.pack(pady=20)
 
         return window
 
     def _construct_scraper_command_windows(self):
         """Construct the command to run the scraper in a new window for Windows."""
-        set_utf8_encoding = (
-            "[Console]::InputEncoding = [Console]::OutputEncoding = [System.Text.Encoding]::UTF8;"
-        )
         get_size = "$size = $Host.UI.RawUI.WindowSize;"
         set_size = "$Host.UI.RawUI.WindowSize = $size;"
-        set_window_title = f"$Host.UI.RawUI.WindowTitle = '{
+        set_window_title = f"$Host.UI.RawUI.WindowTitle = '{APPLICATION_NAME}';"
         set_window_width = (
             f"$size.Width = [Math]::Min({SCRAPER_WINDOW_WIDTH}, $Host.UI.RawUI.BufferSize.Width);"
         )
@@ -125,7 +164,6 @@ class Application:
 
         cmd = (
             'start powershell -NoExit -Command "& {'
-            + set_utf8_encoding
             + set_window_title
             + get_size
            + set_window_width
@@ -189,6 +227,14 @@ class Application:
         """Toggle whether a daily price calculation should run in the background."""
         self.scraper.toggle_background_task(enabled)
 
+    def _toggle_use_proxy(self, enabled: bool):
+        """Toggle whether the scraper should use proxy servers for requests."""
+        self.scraper.toggle_use_proxy(enabled)
+
+    def _toggle_discord_webhook(self, enabled: bool):
+        """Toggle whether the scraper should send notifications to a Discord webhook."""
+        self.scraper.toggle_discord_webhook(enabled)
+
 
 def _popen_and_call(popen_args, callback):
     """
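The `_popen_and_call` helper whose signature appears above follows a common pattern: run a subprocess on a worker thread and fire a callback when it exits, so the Tk main loop stays responsive. A minimal sketch of that pattern (the body here is illustrative, not the project's actual implementation):

```python
from subprocess import Popen
from threading import Thread


def popen_and_call(popen_args, callback):
    """Run a subprocess in a background thread and invoke callback when it exits."""

    def runner():
        proc = Popen(popen_args)
        proc.wait()  # block only this worker thread, not the UI thread
        callback()

    thread = Thread(target=runner, daemon=True)
    thread.start()
    return thread
```

Usage would look like `popen_and_call([sys.executable, "-m", "cs2tracker"], on_done)`, with `on_done` scheduled back onto the Tk event loop if it touches widgets.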
--- a/cs2tracker/constants.py
+++ b/cs2tracker/constants.py
@@ -6,8 +6,10 @@ from shutil import copy
 
 try:
     from cs2tracker._version import version  # pylint: disable=E0611
+
+    VERSION = f"v{version}"
 except ImportError:
-
+    VERSION = "latest"
 
 
 class OSType(enum.Enum):
@@ -22,6 +24,7 @@ PYTHON_EXECUTABLE = sys.executable
 
 MODULE_DIR = os.path.dirname(os.path.abspath(__file__))
 PROJECT_DIR = os.path.dirname(MODULE_DIR)
+ICON_FILE = os.path.join(PROJECT_DIR, "assets", "icon.png")
 OUTPUT_FILE = os.path.join(MODULE_DIR, "data", "output.csv")
 CONFIG_FILE = os.path.join(MODULE_DIR, "data", "config.ini")
 BATCH_FILE = os.path.join(MODULE_DIR, "data", "cs2tracker_scraper.bat")
@@ -33,6 +36,7 @@ if RUNNING_IN_EXE:
     MEIPASS_DIR = sys._MEIPASS  # type: ignore pylint: disable=protected-access
     MODULE_DIR = MEIPASS_DIR
     PROJECT_DIR = MEIPASS_DIR
+    ICON_FILE = os.path.join(PROJECT_DIR, "assets", "icon.png")
     CONFIG_FILE_SOURCE = os.path.join(MODULE_DIR, "data", "config.ini")
     OUTPUT_FILE_SOURCE = os.path.join(MODULE_DIR, "data", "output.csv")
 
@@ -60,7 +64,7 @@ BANNER = """
 
 """
 AUTHOR_STRING = (
-    f"Version:
+    f"Version: {VERSION} - {datetime.today().strftime('%Y/%m/%d')} - Jannik Novak @ashiven\n"
 )
 
 
--- /dev/null
+++ b/cs2tracker/data/config.ini
@@ -0,0 +1,205 @@
+[User Settings]
+api_key = None
+discord_webhook_url = None
+
+[App Settings]
+use_proxy = False
+discord_notifications = False
+
+[Custom Items]
+copenhagen_flames_gold_2022 = 0 https://steamcommunity.com/market/listings/730/Sticker%20%7C%20Copenhagen%20Flames%20%28Gold%29%20%7C%20Antwerp%202022
+
+[Cases]
+revolution_case = 0
+recoil_case = 0
+dreams_and_nightmares_case = 0
+operation_riptide_case = 0
+snakebite_case = 0
+operation_broken_fang_case = 0
+fracture_case = 0
+chroma_case = 0
+chroma_2_case = 0
+chroma_3_case = 0
+clutch_case = 0
+csgo_weapon_case = 0
+csgo_weapon_case_2 = 0
+csgo_weapon_case_3 = 0
+cs20_case = 0
+danger_zone_case = 0
+esports_2013_case = 0
+esports_2013_winter_case = 0
+esports_2014_summer_case = 0
+falchion_case = 0
+gamma_case = 0
+gamma_2_case = 0
+glove_case = 0
+horizon_case = 0
+huntsman_case = 0
+operation_bravo_case = 0
+operation_breakout_case = 0
+operation_hydra_case = 0
+operation_phoenix_case = 0
+operation_vanguard_case = 0
+operation_wildfire_case = 0
+prisma_case = 0
+prisma_2_case = 0
+revolver_case = 0
+shadow_case = 0
+shattered_web_case = 0
+spectrum_case = 0
+spectrum_2_case = 0
+winter_offensive_case = 0
+kilowatt_case = 0
+gallery_case = 0
+fever_case = 0
+
+[Katowice 2014 Sticker Capsule]
+katowice_legends = 0
+katowice_challengers = 0
+
+[Cologne 2014 Sticker Capsule]
+cologne_legends = 0
+cologne_challengers = 0
+
+[DreamHack 2014 Sticker Capsule]
+dreamhack_legends = 0
+
+[Katowice 2015 Sticker Capsule]
+katowice_legends = 0
+katowice_challengers = 0
+
+[Cologne 2015 Sticker Capsule]
+cologne_legends = 0
+cologne_challengers = 0
+
+[Cluj-Napoca 2015 Sticker Capsule]
+cluj_napoca_legends = 0
+cluj_napoca_challengers = 0
+cluj_napoca_legends_autographs = 0
+cluj_napoca_challengers_autographs = 0
+
+[Columbus 2016 Sticker Capsule]
+columbus_legends = 0
+columbus_challengers = 0
+columbus_legends_autographs = 0
+columbus_challengers_autographs = 0
+
+[Cologne 2016 Sticker Capsule]
+cologne_legends = 0
+cologne_challengers = 0
+cologne_legends_autographs = 0
+cologne_challengers_autographs = 0
+
+[Atlanta 2017 Sticker Capsule]
+atlanta_legends = 0
+atlanta_challengers = 0
+atlanta_legends_autographs = 0
+atlanta_challengers_autographs = 0
+
+[Krakow 2017 Sticker Capsule]
+krakow_legends = 0
+krakow_challengers = 0
+krakow_legends_autographs = 0
+krakow_challengers_autographs = 0
+
+[Boston 2018 Sticker Capsule]
+boston_legends = 0
+boston_minor_challengers = 0
+boston_returning_challengers = 0
+boston_attending_legends = 0
+boston_minor_challengers_with_flash_gaming = 0
+boston_legends_autographs = 0
+boston_minor_challengers_autographs = 0
+boston_returning_challengers_autographs = 0
+boston_attending_legends_autographs = 0
+boston_minor_challengers_with_flash_gaming_autographs = 0
+
+[London 2018 Sticker Capsule]
+london_legends = 0
+london_minor_challengers = 0
+london_returning_challengers = 0
+london_legends_autographs = 0
+london_minor_challengers_autographs = 0
+london_returning_challengers_autographs = 0
+
+[Katowice 2019 Sticker Capsule]
+katowice_legends = 0
+katowice_minor_challengers = 0
+katowice_returning_challengers = 0
+katowice_legends_autographs = 0
+katowice_minor_challengers_autographs = 0
+katowice_returning_challengers_autographs = 0
+
+[Berlin 2019 Sticker Capsule]
+berlin_legends = 0
+berlin_minor_challengers = 0
+berlin_returning_challengers = 0
+berlin_legends_autographs = 0
+berlin_minor_challengers_autographs = 0
+berlin_returning_challengers_autographs = 0
+
+[2020 RMR Sticker Capsule]
+rmr_legends = 0
+rmr_challengers = 0
+rmr_contenders = 0
+
+[Stockholm 2021 Sticker Capsule]
+stockholm_legends = 0
+stockholm_challengers = 0
+stockholm_contenders = 0
+stockholm_champions_autographs = 0
+stockholm_finalists_autographs = 0
+
+[Antwerp 2022 Sticker Capsule]
+antwerp_legends = 0
+antwerp_challengers = 0
+antwerp_contenders = 0
+antwerp_champions_autographs = 0
+antwerp_challengers_autographs = 0
+antwerp_legends_autographs = 0
+antwerp_contenders_autographs = 0
+
+[Rio 2022 Sticker Capsule]
+rio_legends = 0
+rio_challengers = 0
+rio_contenders = 0
+rio_champions_autographs = 0
+rio_challengers_autographs = 0
+rio_legends_autographs = 0
+rio_contenders_autographs = 0
+
+[Paris 2023 Sticker Capsule]
+paris_legends = 0
+paris_challengers = 0
+paris_contenders = 0
+paris_champions_autographs = 0
+paris_challengers_autographs = 0
+paris_legends_autographs = 0
+paris_contenders_autographs = 0
+
+[Copenhagen 2024 Sticker Capsule]
+copenhagen_legends = 0
+copenhagen_challengers = 0
+copenhagen_contenders = 0
+copenhagen_champions_autographs = 0
+copenhagen_challengers_autographs = 0
+copenhagen_legends_autographs = 0
+copenhagen_contenders_autographs = 0
+
+[Shanghai 2024 Sticker Capsule]
+shanghai_legends = 0
+shanghai_challengers = 0
+shanghai_contenders = 0
+shanghai_champions_autographs = 0
+shanghai_challengers_autographs = 0
+shanghai_legends_autographs = 0
+shanghai_contenders_autographs = 0
+
+[Austin 2025 Sticker Capsule]
+austin_legends = 0
+austin_challengers = 0
+austin_contenders = 0
+austin_champions_autographs = 0
+austin_challengers_autographs = 0
+austin_legends_autographs = 0
+austin_contenders_autographs = 0
--- a/cs2tracker/main.py
+++ b/cs2tracker/main.py
@@ -3,7 +3,7 @@ import sys
 import urllib3
 
 from cs2tracker.application import Application
-from cs2tracker.constants import AUTHOR_STRING, BANNER
+from cs2tracker.constants import AUTHOR_STRING, BANNER, OS, OSType
 from cs2tracker.padded_console import PaddedConsole
 from cs2tracker.scraper import Scraper
 
@@ -19,6 +19,10 @@ def main():
     # Disable warnings for proxy requests
     urllib3.disable_warnings(urllib3.exceptions.InsecureRequestWarning)
 
+    # Set output encoding to UTF-8 with BOM for Windows compatibility
+    if OS == OSType.WINDOWS and sys.stdout is not None:
+        sys.stdout.reconfigure(encoding="utf-8-sig")  # type: ignore
+
     console = PaddedConsole()
     console.print(f"[bold yellow]{BANNER}\n{AUTHOR_STRING}\n")
 
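The `utf-8-sig` reconfiguration added to `main.py` makes Python prepend a UTF-8 byte-order mark to the stream, which legacy Windows consoles and tools use to detect the encoding. A small self-contained illustration of that behaviour, using an in-memory stream rather than `sys.stdout`:

```python
import io

# "utf-8-sig" writes the UTF-8 BOM (b"\xef\xbb\xbf") before the first character
buffer = io.BytesIO()
stream = io.TextIOWrapper(buffer, encoding="utf-8-sig")
stream.write("☆ non-ASCII output ☆")
stream.flush()

print(buffer.getvalue()[:3])  # the three BOM bytes
```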
--- a/cs2tracker/scraper.py
+++ b/cs2tracker/scraper.py
@@ -4,6 +4,7 @@ import time
 from configparser import ConfigParser
 from datetime import datetime
 from subprocess import DEVNULL, call
+from urllib.parse import unquote
 
 from bs4 import BeautifulSoup
 from bs4.element import Tag
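The new `unquote` import suggests the scraper now decodes the URL-encoded market hrefs into readable item names. For reference, this is how `urllib.parse.unquote` behaves on a listing URL like the one in the bundled `config.ini` (the name-extraction step here is illustrative):

```python
from urllib.parse import unquote

href = (
    "https://steamcommunity.com/market/listings/730/"
    "Sticker%20%7C%20Copenhagen%20Flames%20%28Gold%29%20%7C%20Antwerp%202022"
)
# Decode percent-escapes in the last path segment to get a readable name
item_name = unquote(href.rsplit("/", 1)[-1])
print(item_name)  # Sticker | Copenhagen Flames (Gold) | Antwerp 2022
```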
@@ -36,6 +37,10 @@ PRICE_INFO = "Owned: {:<10} Steam market price: ${:<10} Total: ${:<10}\n"
 HTTP_PROXY_URL = "http://{}:@smartproxy.crawlbase.com:8012"
 HTTPS_PROXY_URL = "http://{}:@smartproxy.crawlbase.com:8012"
 
+DC_WEBHOOK_USERNAME = "CS2Tracker"
+DC_WEBHOOK_AVATAR_URL = "https://img.icons8.com/?size=100&id=uWQJp2tLXUH6&format=png&color=000000"
+DC_RECENT_HISTORY_LIMIT = 5
+
 WIN_BACKGROUND_TASK_NAME = "CS2Tracker Daily Calculation"
 WIN_BACKGROUND_TASK_SCHEDULE = "DAILY"
 WIN_BACKGROUND_TASK_TIME = "12:00"
@@ -56,7 +61,7 @@ class Scraper:
 
     def parse_config(self):
         """Parse the configuration file to read settings and user-owned items."""
-        self.config = ConfigParser()
+        self.config = ConfigParser(interpolation=None)
         self.config.read(CONFIG_FILE)
 
     def _start_session(self):
@@ -91,12 +96,22 @@
                 "[bold red][!] Failed to scrape case prices. (Consider using proxies to prevent rate limiting)\n"
             )
 
+        custom_item_usd_total = 0
+        try:
+            custom_item_usd_total = self._scrape_custom_item_prices()
+        except (RequestException, AttributeError, RetryError, ValueError):
+            self.console.print(
+                "[bold red][!] Failed to scrape custom item prices. (Consider using proxies to prevent rate limiting)\n"
+            )
+
         self.usd_total += capsule_usd_total
         self.usd_total += case_usd_total
+        self.usd_total += custom_item_usd_total
         self.eur_total = CurrencyConverter().convert(self.usd_total, "USD", "EUR")
 
         self._print_total()
         self._save_price_log()
+        self._send_discord_notification()
 
         # Reset totals for next run
         self.usd_total, self.eur_total = 0, 0
@@ -122,6 +137,9 @@
 
         This will append a new entry to the output file if no entry has been made for
         today.
+
+        :raises FileNotFoundError: If the output file does not exist.
+        :raises IOError: If there is an error writing to the output file.
         """
         with open(OUTPUT_FILE, "r", encoding="utf-8") as price_logs:
             price_logs_reader = csv.reader(price_logs)
@@ -157,6 +175,8 @@
         data is used for drawing the plot of past prices.
 
         :return: A tuple containing three lists: dates, dollar prices, and euro prices.
+        :raises FileNotFoundError: If the output file does not exist.
+        :raises IOError: If there is an error reading the output file.
         """
         dates, dollars, euros = [], [], []
         with open(OUTPUT_FILE, "r", encoding="utf-8") as price_logs:
@@ -173,6 +193,82 @@
 
         return dates, dollars, euros
 
+    def _construct_recent_calculations_embeds(self):
+        """
+        Construct the embeds for the Discord message that will be sent after a price
+        calculation has been made.
+
+        :return: A list of embeds for the Discord message.
+        """
+        dates, usd_logs, eur_logs = self.read_price_log()
+        dates, usd_logs, eur_logs = reversed(dates), reversed(usd_logs), reversed(eur_logs)
+
+        date_history, usd_history, eur_history = [], [], []
+        for date, usd_log, eur_log in zip(dates, usd_logs, eur_logs):
+            if len(date_history) >= DC_RECENT_HISTORY_LIMIT:
+                break
+            date_history.append(date.strftime("%Y-%m-%d"))
+            usd_history.append(f"${usd_log:.2f}")
+            eur_history.append(f"€{eur_log:.2f}")
+
+        date_history = "\n".join(date_history)
+        usd_history = "\n".join(usd_history)
+        eur_history = "\n".join(eur_history)
+
+        embeds = [
+            {
+                "title": "📊 Recent Price History",
+                "color": 5814783,
+                "fields": [
+                    {
+                        "name": "Date",
+                        "value": date_history,
+                        "inline": True,
+                    },
+                    {
+                        "name": "USD Total",
+                        "value": usd_history,
+                        "inline": True,
+                    },
+                    {
+                        "name": "EUR Total",
+                        "value": eur_history,
+                        "inline": True,
+                    },
+                ],
+            }
+        ]
+
+        return embeds
+
+    def _send_discord_notification(self):
+        """Send a message to a Discord webhook if notifications are enabled in the
+        config file and a webhook URL is provided.
+        """
+        discord_notifications = self.config.getboolean(
+            "App Settings", "discord_notifications", fallback=False
+        )
+        webhook_url = self.config.get("User Settings", "discord_webhook_url", fallback=None)
+        webhook_url = None if webhook_url in ("None", "") else webhook_url
+
+        if discord_notifications and webhook_url:
+            embeds = self._construct_recent_calculations_embeds()
+            try:
+                response = self.session.post(
+                    url=webhook_url,
+                    json={
+                        "embeds": embeds,
+                        "username": DC_WEBHOOK_USERNAME,
+                        "avatar_url": DC_WEBHOOK_AVATAR_URL,
+                    },
+                )
+                response.raise_for_status()
+                self.console.print("[bold steel_blue3][+] Discord notification sent.\n")
+            except RequestException as error:
+                self.console.print(f"[bold red][!] Failed to send Discord notification: {error}\n")
+            except Exception as error:
+                self.console.print(f"[bold red][!] An unexpected error occurred: {error}\n")
+
     @retry(stop=stop_after_attempt(10))
     def _get_page(self, url):
         """
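The notification code above posts a standard Discord webhook payload (`embeds`, `username`, `avatar_url`). A minimal sketch of that payload shape, detached from the scraper (the field values are made up; `requests.post(webhook_url, json=payload)` would deliver it):

```python
import json

payload = {
    "username": "CS2Tracker",
    "embeds": [
        {
            "title": "📊 Recent Price History",
            "color": 5814783,  # embed accent color as a decimal RGB integer
            "fields": [
                {"name": "Date", "value": "2024-01-01", "inline": True},
                {"name": "USD Total", "value": "$10.00", "inline": True},
                {"name": "EUR Total", "value": "€9.10", "inline": True},
            ],
        }
    ],
}

# Discord expects this structure as the JSON body of a POST to the webhook URL
print(json.dumps(payload)[:60])
```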
@@ -184,8 +280,8 @@
         :raises RequestException: If the request fails.
         :raises RetryError: If the retry limit is reached.
         """
-        use_proxy = self.config.getboolean("Settings", "
-        api_key = self.config.get("Settings", "
+        use_proxy = self.config.getboolean("App Settings", "use_proxy", fallback=False)
+        api_key = self.config.get("User Settings", "api_key", fallback=None)
         api_key = None if api_key in ("None", "") else api_key
         if use_proxy and api_key:
             page = self.session.get(
@@ -206,26 +302,26 @@ class Scraper:

         return page

-    def 
+    def _parse_item_price(self, item_page, item_href):
         """
-        Parse the price of
+        Parse the price of an item from the given steamcommunity market page and item
+        href.

-        :param
-
-        :
-        :
-        :raises ValueError: If the capsule listing or price span cannot be found.
+        :param item_page: The HTTP response object containing the item page content.
+        :param item_href: The href of the item listing to find the price for.
+        :return: The price of the item as a float.
+        :raises ValueError: If the item listing or price span cannot be found.
         """
-
-
-        if not isinstance(
-            raise ValueError(f"Failed to find
+        item_soup = BeautifulSoup(item_page.content, "html.parser")
+        item_listing = item_soup.find("a", attrs={"href": f"{item_href}"})
+        if not isinstance(item_listing, Tag):
+            raise ValueError(f"Failed to find item listing: {item_href}")

-
-        if not isinstance(
-            raise ValueError(f"Failed to find price span in
+        item_price_span = item_listing.find("span", attrs={"class": "normal_price"})
+        if not isinstance(item_price_span, Tag):
+            raise ValueError(f"Failed to find price span in item listing: {item_href}")

-        price_str =
+        price_str = item_price_span.text.split()[2]
         price = float(price_str.replace("$", ""))

         return price
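The new `_parse_item_price` is the generic replacement for the old per-category parsers: locate the `<a>` listing by its exact href, then the `normal_price` span inside it, then pull the third whitespace-separated token. The same logic can be exercised against a canned snippet — the markup below imitates the market page's structure and is not a live capture:

```python
from bs4 import BeautifulSoup, Tag

HTML = """
<div id="searchResults">
  <a href="https://steamcommunity.com/market/listings/730/Fracture%20Case">
    <span class="normal_price">Starting at: $0.35 USD</span>
  </a>
</div>
"""


def parse_item_price(html: str, item_href: str) -> float:
    """Find the listing anchor for item_href and extract its USD price."""
    soup = BeautifulSoup(html, "html.parser")
    listing = soup.find("a", attrs={"href": item_href})
    if not isinstance(listing, Tag):
        raise ValueError(f"Failed to find item listing: {item_href}")

    price_span = listing.find("span", attrs={"class": "normal_price"})
    if not isinstance(price_span, Tag):
        raise ValueError(f"Failed to find price span in item listing: {item_href}")

    # "Starting at: $0.35 USD" -> split()[2] == "$0.35"
    price_str = price_span.text.split()[2]
    return float(price_str.replace("$", ""))
```

The `isinstance(..., Tag)` checks are needed because `find` returns `None` on a miss, and `split()[2]` silently assumes the span keeps Steam's "Starting at: $X.XX USD" wording — a layout change there would break the parse.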
@@ -244,7 +340,7 @@ class Scraper:
         hrefs, and names.
         """
         capsule_title = capsule_section.center(MAX_LINE_LEN, SEPARATOR)
-        self.console.print(f"[bold magenta]{capsule_title}")
+        self.console.print(f"[bold magenta]{capsule_title}\n")

         capsule_usd_total = 0
         capsule_page = self._get_page(capsule_info["page"])
@@ -254,7 +350,7 @@ class Scraper:
             if owned == 0:
                 continue

-            price_usd = self.
+            price_usd = self._parse_item_price(capsule_page, capsule_href)
             price_usd_owned = round(float(owned * price_usd), 2)

             self.console.print(f"[bold deep_sky_blue4]{capsule_name}")
@@ -273,29 +369,6 @@ class Scraper:

         return capsule_usd_total

-    def _parse_case_price(self, case_page, case_href):
-        """
-        Parse the price of a case from the given page and href.
-
-        :param case_page: The HTTP response object containing the case page content.
-        :param case_href: The href of the case listing to find the price for.
-        :return: The price of the case as a float.
-        :raises ValueError: If the case listing or price span cannot be found.
-        """
-        case_soup = BeautifulSoup(case_page.content, "html.parser")
-        case_listing = case_soup.find("a", attrs={"href": case_href})
-        if not isinstance(case_listing, Tag):
-            raise ValueError(f"Failed to find case listing: {case_href}")
-
-        price_class = case_listing.find("span", attrs={"class": "normal_price"})
-        if not isinstance(price_class, Tag):
-            raise ValueError(f"Failed to find price class in case listing: {case_href}")
-
-        price_str = price_class.text.split()[2]
-        price = float(price_str.replace("$", ""))
-
-        return price
-
     def _scrape_case_prices(self):
         """
         Scrape prices for all cases defined in the configuration.
@@ -310,20 +383,72 @@ class Scraper:

             case_name = config_case_name.replace("_", " ").title()
             case_title = case_name.center(MAX_LINE_LEN, SEPARATOR)
-            self.console.print(f"[bold magenta]{case_title}")
+            self.console.print(f"[bold magenta]{case_title}\n")

             case_page = self._get_page(CASE_PAGES[case_index])
-            price_usd = self.
+            price_usd = self._parse_item_price(case_page, CASE_HREFS[case_index])
             price_usd_owned = round(float(int(owned) * price_usd), 2)

             self.console.print(PRICE_INFO.format(owned, price_usd, price_usd_owned))
             case_usd_total += price_usd_owned

-            if not self.config.getboolean("Settings", "
+            if not self.config.getboolean("App Settings", "use_proxy", fallback=False):
                 time.sleep(1)

         return case_usd_total

+    def _market_page_from_href(self, item_href):
+        """
+        Convert an href of a Steam Community Market item to a market page URL. This is
+        done by decoding the URL-encoded item name and formatting it into a search URL.
+
+        :param item_href: The href of the item listing, typically ending with the item's
+            name.
+        :return: A URL string for the Steam Community Market page of the item.
+        """
+        url_encoded_name = item_href.split("/")[-1]
+        decoded_name = unquote(url_encoded_name)
+        decoded_name_query = decoded_name.lower().replace(" ", "+")
+        page_url = f"https://steamcommunity.com/market/search?q={decoded_name_query}"
+
+        return page_url
+
+    def _scrape_custom_item_prices(self):
+        """
+        Scrape prices for custom items defined in the configuration.
+
+        For each custom item, it prints the item name, owned count, price per item, and
+        total price for owned items.
+        """
+        custom_item_usd_total = 0
+        for config_custom_item_name, owned_and_href in self.config.items("Custom Items"):
+            if " " not in owned_and_href:
+                self.console.print(
+                    "[bold red][!] Invalid custom item format (<item_name> = <owned_count> <item_url>)\n"
+                )
+                continue
+
+            owned, custom_item_href = owned_and_href.split(" ", 1)
+            if int(owned) == 0:
+                continue
+
+            custom_item_name = config_custom_item_name.replace("_", " ").title()
+            custom_item_title = custom_item_name.center(MAX_LINE_LEN, SEPARATOR)
+            self.console.print(f"[bold magenta]{custom_item_title}\n")

+            custom_item_page_url = self._market_page_from_href(custom_item_href)
+            custom_item_page = self._get_page(custom_item_page_url)
+            price_usd = self._parse_item_price(custom_item_page, custom_item_href)
+            price_usd_owned = round(float(int(owned) * price_usd), 2)
+
+            self.console.print(PRICE_INFO.format(owned, price_usd, price_usd_owned))
+            custom_item_usd_total += price_usd_owned
+
+            if not self.config.getboolean("App Settings", "use_proxy", fallback=False):
+                time.sleep(1)
+
+        return custom_item_usd_total
+
     def identify_background_task(self):
         """
         Search the OS for a daily background task that runs the scraper.
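The new `_market_page_from_href` derives the search-page URL purely by string manipulation on the listing href, so the round trip can be checked in isolation with the stdlib:

```python
from urllib.parse import unquote


def market_page_from_href(item_href: str) -> str:
    """Turn a market listing href into a market search URL for that item."""
    url_encoded_name = item_href.split("/")[-1]     # e.g. "Fracture%20Case"
    decoded_name = unquote(url_encoded_name)        # -> "Fracture Case"
    query = decoded_name.lower().replace(" ", "+")  # -> "fracture+case"
    return f"https://steamcommunity.com/market/search?q={query}"


href = "https://steamcommunity.com/market/listings/730/Fracture%20Case"
print(market_page_from_href(href))
# https://steamcommunity.com/market/search?q=fracture+case
```

Note that only spaces are re-encoded (as a literal `+`); item names containing other reserved characters would need `urllib.parse.quote_plus` for a fully safe query, but the diff keeps the simple replace.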
@@ -406,6 +531,35 @@ class Scraper:
             # TODO: implement toggle for cron jobs
             pass

+    def toggle_use_proxy(self, enabled: bool):
+        """
+        Toggle the use of proxies for requests. This will update the configuration file.
+
+        :param enabled: If True, proxies will be used; if False, they will not be used.
+        """
+        self.config.set("App Settings", "use_proxy", str(enabled))
+        with open(CONFIG_FILE, "w", encoding="utf-8") as config_file:
+            self.config.write(config_file)
+
+        self.console.print(
+            f"[bold green]{'[+] Enabled' if enabled else '[-] Disabled'} proxy usage for requests."
+        )
+
+    def toggle_discord_webhook(self, enabled: bool):
+        """
+        Toggle the use of a Discord webhook to notify users of price calculations.
+
+        :param enabled: If True, the webhook will be used; if False, it will not be
+            used.
+        """
+        self.config.set("App Settings", "discord_notifications", str(enabled))
+        with open(CONFIG_FILE, "w", encoding="utf-8") as config_file:
+            self.config.write(config_file)
+
+        self.console.print(
+            f"[bold green]{'[+] Enabled' if enabled else '[-] Disabled'} Discord webhook notifications."
+        )
+

 if __name__ == "__main__":
     scraper = Scraper()
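Both new toggles follow the same persistence pattern: `config.set(...)` with the boolean stringified, then rewrite the whole file so the change survives restarts. A self-contained sketch using a temp file in place of the package's `CONFIG_FILE`:

```python
import tempfile
from configparser import ConfigParser
from pathlib import Path

config_file = Path(tempfile.mkdtemp()) / "config.ini"
config_file.write_text("[App Settings]\nuse_proxy = False\n", encoding="utf-8")

config = ConfigParser()
config.read(config_file, encoding="utf-8")


def toggle_use_proxy(enabled: bool) -> None:
    # configparser stores strings; str(True) == "True" round-trips via getboolean
    config.set("App Settings", "use_proxy", str(enabled))
    with open(config_file, "w", encoding="utf-8") as handle:
        config.write(handle)


toggle_use_proxy(True)

# re-read from disk to confirm the toggle was persisted, not just kept in memory
reloaded = ConfigParser()
reloaded.read(config_file, encoding="utf-8")
print(reloaded.getboolean("App Settings", "use_proxy"))  # True
```

One caveat of this approach: `ConfigParser.write` rewrites the file from the in-memory state, so any comments a user added to the config are dropped on the first toggle.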
@@ -1,6 +1,6 @@
 Metadata-Version: 2.4
 Name: cs2tracker
-Version: 2.1.6
+Version: 2.1.8
 Summary: Tracking the steam market prices of CS2 items
 Home-page: https://github.com/ashiven/cs2tracker
 Author: Jannik Novak
@@ -40,12 +40,13 @@ Dynamic: license-file

 ### Prerequisites

-- Download and install the latest versions of [Python](https://www.python.org/downloads/) and [Pip](https://pypi.org/project/pip/). (Required
+- Download and install the latest versions of [Python](https://www.python.org/downloads/) and [Pip](https://pypi.org/project/pip/). (Required on Linux)
 - Register for the [Crawlbase Smart Proxy API](https://crawlbase.com/) and retrieve your API key. (Optional)
+- Create a [Discord Webhook](https://support.discord.com/hc/en-us/articles/228383668-Intro-to-Webhooks) to be notified about recent price updates. (Optional)

 ### Setup

-#### Windows Executable
+#### Windows Executable _(no color support)_

 - Simply [download the latest executable](https://github.com/ashiven/cs2tracker/releases/latest/download/cs2tracker-windows.zip) and run it.
@@ -58,6 +59,7 @@ Dynamic: license-file
    ```

 2. Run it:
+
    ```bash
    cs2tracker
    ```
@@ -65,10 +67,11 @@ Dynamic: license-file
 ### Options

 - `Run!` to gather the current market prices of your items and calculate the total amount in USD and EUR.
-- `Edit Config` to
+- `Edit Config` to specify the numbers of items owned in the config file. You can also add items other than cases and sticker capsules following the format in the `Custom Items` section. (item_name = item_owned item_page)
 - `Show History` to see a price chart consisting of past calculations. A new data point is generated once a day upon running the program.
-- `Daily Background
-
+- `Daily Background Calculations` to automatically run a daily calculation of your investment in the background and save the results such that they can later be viewed via `Show History`.
+- `Receive Discord Notifications` to receive a notification on your Discord server when the program has finished calculating your investment. You need to set up a [webhook](https://support.discord.com/hc/en-us/articles/228383668-Intro-to-Webhooks) in your Discord server and enter the webhook url into the `discord_webhook_url` field in the config file.
+- `Proxy Requests` to prevent your requests from being rate limited by the steamcommunity server. You need to register for a free API key on [Crawlbase](crawlbase.com) and enter it into the `api_key` field in the config file.

 ---

@@ -1,198 +0,0 @@
-[Settings]
-Use_Proxy = False
-API_Key = None
-
-[Cases]
-Revolution_Case = 0
-Recoil_Case = 0
-Dreams_And_Nightmares_Case = 0
-Operation_Riptide_Case = 0
-Snakebite_Case = 0
-Operation_Broken_Fang_Case = 0
-Fracture_Case = 0
-Chroma_Case = 0
-Chroma_2_Case = 0
-Chroma_3_Case = 0
-Clutch_Case = 0
-CSGO_Weapon_Case = 0
-CSGO_Weapon_Case_2 = 0
-CSGO_Weapon_Case_3 = 0
-CS20_Case = 0
-Danger_Zone_Case = 0
-eSports_2013_Case = 0
-eSports_2013_Winter_Case = 0
-eSports_2014_Summer_Case = 0
-Falchion_Case = 0
-Gamma_Case = 0
-Gamma_2_Case = 0
-Glove_Case = 0
-Horizon_Case = 0
-Huntsman_Case = 0
-Operation_Bravo_Case = 0
-Operation_Breakout_Case = 0
-Operation_Hydra_Case = 0
-Operation_Phoenix_Case = 0
-Operation_Vanguard_Case = 0
-Operation_Wildfire_Case = 0
-Prisma_Case = 0
-Prisma_2_Case = 0
-Revolver_Case = 0
-Shadow_Case = 0
-Shattered_Web_Case = 0
-Spectrum_Case = 0
-Spectrum_2_Case = 0
-Winter_Offensive_Case = 0
-Kilowatt_Case = 0
-Gallery_Case = 0
-Fever_Case = 0
-
-[Katowice 2014 Sticker Capsule]
-Katowice_Legends = 0
-Katowice_Challengers = 0
-
-[Cologne 2014 Sticker Capsule]
-Cologne_Legends = 0
-Cologne_Challengers = 0
-
-[DreamHack 2014 Sticker Capsule]
-DreamHack_Legends = 0
-
-[Katowice 2015 Sticker Capsule]
-Katowice_Legends = 0
-Katowice_Challengers = 0
-
-[Cologne 2015 Sticker Capsule]
-Cologne_Legends = 0
-Cologne_Challengers = 0
-
-[Cluj-Napoca 2015 Sticker Capsule]
-Cluj_Napoca_Legends = 0
-Cluj_Napoca_Challengers = 0
-Cluj_Napoca_Legends_Autographs = 0
-Cluj_Napoca_Challengers_Autographs = 0
-
-[Columbus 2016 Sticker Capsule]
-Columbus_Legends = 0
-Columbus_Challengers = 0
-Columbus_Legends_Autographs = 0
-Columbus_Challengers_Autographs = 0
-
-[Cologne 2016 Sticker Capsule]
-Cologne_Legends = 0
-Cologne_Challengers = 0
-Cologne_Legends_Autographs = 0
-Cologne_Challengers_Autographs = 0
-
-[Atlanta 2017 Sticker Capsule]
-Atlanta_Legends = 0
-Atlanta_Challengers = 0
-Atlanta_Legends_Autographs = 0
-Atlanta_Challengers_Autographs = 0
-
-[Krakow 2017 Sticker Capsule]
-Krakow_Legends = 0
-Krakow_Challengers = 0
-Krakow_Legends_Autographs = 0
-Krakow_Challengers_Autographs = 0
-
-[Boston 2018 Sticker Capsule]
-Boston_Legends = 0
-Boston_Minor_Challengers = 0
-Boston_Returning_Challengers = 0
-Boston_Attending_Legends = 0
-Boston_Minor_Challengers_with_Flash_Gaming = 0
-Boston_Legends_Autographs = 0
-Boston_Minor_Challengers_Autographs = 0
-Boston_Returning_Challengers_Autographs = 0
-Boston_Attending_Legends_Autographs = 0
-Boston_Minor_Challengers_with_Flash_Gaming_Autographs = 0
-
-[London 2018 Sticker Capsule]
-London_Legends = 0
-London_Minor_Challengers = 0
-London_Returning_Challengers = 0
-London_Legends_Autographs = 0
-London_Minor_Challengers_Autographs = 0
-London_Returning_Challengers_Autographs = 0
-
-[Katowice 2019 Sticker Capsule]
-Katowice_Legends = 0
-Katowice_Minor_Challengers = 0
-Katowice_Returning_Challengers = 0
-Katowice_Legends_Autographs = 0
-Katowice_Minor_Challengers_Autographs = 0
-Katowice_Returning_Challengers_Autographs = 0
-
-[Berlin 2019 Sticker Capsule]
-Berlin_Legends = 0
-Berlin_Minor_Challengers = 0
-Berlin_Returning_Challengers = 0
-Berlin_Legends_Autographs = 0
-Berlin_Minor_Challengers_Autographs = 0
-Berlin_Returning_Challengers_Autographs = 0
-
-[2020 RMR Sticker Capsule]
-RMR_Legends = 0
-RMR_Challengers = 0
-RMR_Contenders = 0
-
-[Stockholm 2021 Sticker Capsule]
-Stockholm_Legends = 0
-Stockholm_Challengers = 0
-Stockholm_Contenders = 0
-Stockholm_Champions_Autographs = 0
-Stockholm_Finalists_Autographs = 0
-
-[Antwerp 2022 Sticker Capsule]
-Antwerp_Legends = 0
-Antwerp_Challengers = 0
-Antwerp_Contenders = 0
-Antwerp_Champions_Autographs = 0
-Antwerp_Challengers_Autographs = 0
-Antwerp_Legends_Autographs = 0
-Antwerp_Contenders_Autographs = 0
-
-[Rio 2022 Sticker Capsule]
-Rio_Legends = 0
-Rio_Challengers = 0
-Rio_Contenders = 0
-Rio_Champions_Autographs = 0
-Rio_Challengers_Autographs = 0
-Rio_Legends_Autographs = 0
-Rio_Contenders_Autographs = 0
-
-[Paris 2023 Sticker Capsule]
-Paris_Legends = 0
-Paris_Challengers = 0
-Paris_Contenders = 0
-Paris_Champions_Autographs = 0
-Paris_Challengers_Autographs = 0
-Paris_Legends_Autographs = 0
-Paris_Contenders_Autographs = 0
-
-[Copenhagen 2024 Sticker Capsule]
-Copenhagen_Legends = 0
-Copenhagen_Challengers = 0
-Copenhagen_Contenders = 0
-Copenhagen_Champions_Autographs = 0
-Copenhagen_Challengers_Autographs = 0
-Copenhagen_Legends_Autographs = 0
-Copenhagen_Contenders_Autographs = 0
-
-[Shanghai 2024 Sticker Capsule]
-Shanghai_Legends = 0
-Shanghai_Challengers = 0
-Shanghai_Contenders = 0
-Shanghai_Champions_Autographs = 0
-Shanghai_Challengers_Autographs = 0
-Shanghai_Legends_Autographs = 0
-Shanghai_Contenders_Autographs = 0
-
-[Austin 2025 Sticker Capsule]
-Austin_Legends = 0
-Austin_Challengers = 0
-Austin_Contenders = 0
-Austin_Champions_Autographs = 0
-Austin_Challengers_Autographs = 0
-Austin_Legends_Autographs = 0
-Austin_Contenders_Autographs = 0
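The replacement 2.1.8 config (205 lines, not shown in this diff) splits the old `[Settings]` section into `[App Settings]` and `[User Settings]`, and adds a `[Custom Items]` section whose values pack an owned count and a listing URL into one string. A sketch of how the scraper-side parsing of that format behaves — section and key names are taken from the code changes above, while the item entry itself is illustrative:

```python
from configparser import ConfigParser

# interpolation=None: URL-encoded item names contain "%", which would
# otherwise trip configparser's BasicInterpolation
config = ConfigParser(interpolation=None)
config.read_string(
    """
[App Settings]
use_proxy = False
discord_notifications = False

[User Settings]
api_key = None

[Custom Items]
fracture_case = 2 https://steamcommunity.com/market/listings/730/Fracture%20Case
"""
)

parsed = []
for name, owned_and_href in config.items("Custom Items"):
    if " " not in owned_and_href:  # expected shape: "<owned_count> <item_url>"
        continue
    owned, href = owned_and_href.split(" ", 1)
    parsed.append((name.replace("_", " ").title(), int(owned), href))
```

Splitting with `split(" ", 1)` keeps the URL intact even though it follows the count in the same value, mirroring `_scrape_custom_item_prices` above.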
/cs2tracker-2.1.6/assets/icons8-counter-strike-bubbles-96.png → /cs2tracker-2.1.8/assets/icon.png
RENAMED

(remaining files unchanged between 2.1.6 and 2.1.8)