TeLLMgramBot 2.2.0.tar.gz → 2.4.0.tar.gz

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
Files changed (24)
  1. {tellmgrambot-2.2.0 → tellmgrambot-2.4.0}/PKG-INFO +57 -43
  2. {tellmgrambot-2.2.0 → tellmgrambot-2.4.0}/README.md +54 -41
  3. {tellmgrambot-2.2.0 → tellmgrambot-2.4.0}/TeLLMgramBot/TeLLMgramBot.py +165 -109
  4. {tellmgrambot-2.2.0 → tellmgrambot-2.4.0}/TeLLMgramBot/conversation.py +4 -4
  5. {tellmgrambot-2.2.0 → tellmgrambot-2.4.0}/TeLLMgramBot/initialize.py +27 -39
  6. {tellmgrambot-2.2.0 → tellmgrambot-2.4.0}/TeLLMgramBot/message_handlers.py +5 -10
  7. tellmgrambot-2.4.0/TeLLMgramBot/providers/__init__.py +0 -0
  8. tellmgrambot-2.4.0/TeLLMgramBot/providers/anthropic_provider.py +53 -0
  9. tellmgrambot-2.4.0/TeLLMgramBot/providers/base.py +27 -0
  10. tellmgrambot-2.4.0/TeLLMgramBot/providers/factory.py +23 -0
  11. tellmgrambot-2.4.0/TeLLMgramBot/providers/openai_provider.py +34 -0
  12. {tellmgrambot-2.2.0 → tellmgrambot-2.4.0}/TeLLMgramBot/tokenGPT.py +33 -24
  13. {tellmgrambot-2.2.0 → tellmgrambot-2.4.0}/TeLLMgramBot/web_utils.py +0 -1
  14. {tellmgrambot-2.2.0 → tellmgrambot-2.4.0}/TeLLMgramBot.egg-info/PKG-INFO +57 -43
  15. {tellmgrambot-2.2.0 → tellmgrambot-2.4.0}/TeLLMgramBot.egg-info/SOURCES.txt +6 -2
  16. {tellmgrambot-2.2.0 → tellmgrambot-2.4.0}/TeLLMgramBot.egg-info/requires.txt +1 -0
  17. {tellmgrambot-2.2.0 → tellmgrambot-2.4.0}/setup.py +3 -2
  18. tellmgrambot-2.2.0/TeLLMgramBot/openai_singleton.py +0 -12
  19. {tellmgrambot-2.2.0 → tellmgrambot-2.4.0}/LICENSE +0 -0
  20. {tellmgrambot-2.2.0 → tellmgrambot-2.4.0}/TeLLMgramBot/__init__.py +0 -0
  21. {tellmgrambot-2.2.0 → tellmgrambot-2.4.0}/TeLLMgramBot/utils.py +0 -0
  22. {tellmgrambot-2.2.0 → tellmgrambot-2.4.0}/TeLLMgramBot.egg-info/dependency_links.txt +0 -0
  23. {tellmgrambot-2.2.0 → tellmgrambot-2.4.0}/TeLLMgramBot.egg-info/top_level.txt +0 -0
  24. {tellmgrambot-2.2.0 → tellmgrambot-2.4.0}/setup.cfg +0 -0
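Items 7–11 above add a new `providers/` package whose `factory.py` presumably dispatches on the model-name prefix, matching the provider table in the README diff below. A minimal Python sketch of that dispatch pattern; every class and function name here is hypothetical, not the package's actual API:

```python
# Hypothetical sketch of prefix-based provider dispatch, inferred from the
# providers/factory.py file listed in this diff; the real API may differ.

class OpenAIProvider:
    """Stand-in for an OpenAI-backed chat provider."""
    def __init__(self, model: str):
        self.model = model

class AnthropicProvider:
    """Stand-in for an Anthropic-backed chat provider."""
    def __init__(self, model: str):
        self.model = model

# Map model-name prefixes to provider classes, per the README's table.
PROVIDER_PREFIXES = {
    "gpt-": OpenAIProvider,
    "claude-": AnthropicProvider,
}

def make_provider(model: str):
    """Pick a provider class from the model name's prefix."""
    for prefix, cls in PROVIDER_PREFIXES.items():
        if model.startswith(prefix):
            return cls(model)
    raise ValueError(f"Unsupported model: {model!r}")
```

This keeps configuration to a single `chat_model`/`url_model` string, which is consistent with the README's claim that switching providers needs no other changes.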
{tellmgrambot-2.2.0 → tellmgrambot-2.4.0}/PKG-INFO
@@ -1,7 +1,7 @@
  Metadata-Version: 2.4
  Name: TeLLMgramBot
- Version: 2.2.0
- Summary: OpenAI GPT, driven by Telegram
+ Version: 2.4.0
+ Summary: LLM-powered Telegram bot (OpenAI + Anthropic)
  Home-page: https://github.com/Digital-Heresy/TeLLMgramBot
  Author: Digital Heresy
  Author-email: ronin.atx@gmail.com
@@ -10,6 +10,7 @@ Requires-Python: >=3.12
  Description-Content-Type: text/markdown
  License-File: LICENSE
  Requires-Dist: openai>2.0
+ Requires-Dist: anthropic>=0.40
  Requires-Dist: PyYAML
  Requires-Dist: httpx
  Requires-Dist: beautifulsoup4
@@ -29,9 +30,9 @@ Dynamic: requires-python
  Dynamic: summary

  # TeLLMgramBot
- The basic goal of this project is to create a bridge between a Telegram Bot and a Large Langage Model (LLM), like OpenAI's GPT models.
+ The basic goal of this project is to create a bridge between a Telegram Bot and a Large Language Model (LLM), supporting both OpenAI's GPT models and Anthropic's Claude models.
  * To use this library, you must have a Telegram account **with a user name**, not just a phone number. If you don't have one, [create one online](https://telegram.org/).
- * If added to a Telgram group, the bot must be [adminstrator](https://www.alphr.com/add-admin-telegram/) in order to respond to a user calling out its name, initials, or nickname.
+ * If added to a Telegram group, the bot must be [administrator](https://www.alphr.com/add-admin-telegram/) in order to respond to a user calling out its name, initials, or nickname.
  <img src="assets/TeLLMgramBot_Logo.png" width=200 align=center />

  ## Telegram Bot + LLM Encapsulation
@@ -39,36 +40,45 @@ The basic goal of this project is to create a bridge between a Telegram Bot and
  * The more dynamic conversation gets handed off to the LLM to manage prompts and responses, and Telegram acts as the interaction broker.
  * Pass the URL in [square brackets] and mention how the bot should interpret it.
  * Example: "What do you think of this article? [https://some_site/article]"
- * This uses another GPT model, preferably GPT-5 or GPT-4o, to support more URL content with its higher token limit.
+ * This uses a separate model (configurable via `url_model`) to support more URL content with its higher token limit.
  * Tokens are used to measure the length of all conversation messages between the Telegram bot assistant and the user. This is useful to:
  * Ensure the length does not go over the model limit. If it does, prune oldest messages to fit within the limit.
  * Remember 50% of the past conversations when starting up TeLLMgramBot again.
  * Users can also clear their conversation history for privacy.

  ## Why Telegram?
- Using Telegram as the interface not only solves "exposing" the interface, but gives you boadloads of interactivity over a standard Command Line interface, or trying to create a website with input boxes and submit buttons to try to handle everything:
+ Using Telegram as the interface not only solves "exposing" the interface, but gives you boatloads of interactivity over a standard Command Line interface, or trying to create a website with input boxes and submit buttons to try to handle everything:
  1. Telegram already lets you paste in verbose, multiline messages.
  2. Telegram already lets you paste in pictures, videos, links, etc.
  3. Telegram already lets you react with emojis, stickers, etc.

+ ## Supported LLM Providers
+ TeLLMgramBot selects the LLM provider automatically based on the model name:
+
+ | Model prefix | Provider | Example models |
+ |---|---|---|
+ | `gpt-` | OpenAI | `gpt-4o`, `gpt-4o-mini`, `gpt-5-mini` |
+ | `claude-` | Anthropic | `claude-sonnet-4-6`, `claude-haiku-4-5` |
+
+ Simply set `chat_model` (and optionally `url_model`) in your `config.yaml` to any supported model and supply the corresponding API key; no other changes needed.
+
  ## Directories
  When initializing TeLLMgramBot, the following directories get created:
  * `configs` - Contains bot configuration files.
- * `commands.txt`
- * Users can type `/help` interacting with the to see this file's text get displayed in Telegram.
  * `config.yaml` (can be a different name)
- * This file sets main OpenAI parameters like naming and GPT models to process.
- * The parameter `url_model` is to read URL content, different than `chat_model` that the bot normally uses to interact with the user.
- * An empty `token_limit` would do the maximum amount of tokens supported by the `chat_model` (e.g. 128000 for `gpt-4o-mini`).
+ * This file sets main bot parameters like naming and the LLM models to use.
+ * `chat_model` is the model used for normal conversation (e.g. `gpt-5-mini` or `claude-sonnet-4-6`).
+ * `url_model` is the model used to read and summarize URL content; it can differ from `chat_model`.
+ * An empty `token_limit` will use the maximum tokens supported by the `chat_model`.
  * `tokenGPT.yaml`
- * This important YAML file contains token size parameters for supported OpenAI models.
- * If the first time, `gpt-5`, `gpt-5-mini`, `gpt-5-nano`, `gpt-4o`, and `gpt-4o-mini` get populated, but the user can specify more models with token size parameters as needed.
+ * Contains token size parameters for all supported models.
+ * On first run, GPT and Claude model families are pre-populated. Additional models can be added manually.
  * `prompts` - Contains prompt files for how the bot interacts with any user.
- * `test_personality.prmpt`
- * This is a sample prompt file as a basis to test this library.
- * The user can create more prompt files as needed for different personalities. See [OpenAI Playground](https://platform.openai.com/playground) to test some ideas.
+ * `test_personality.prmpt` (can be a different name)
+ * A sample prompt file as a basis to test this library.
+ * The user can create more prompt files as needed for different personalities.
  * `url_analysis.prmpt`
- * This is a crucial prompt file to analyze URL content in brackets `[]` in a different model (such as `gpt-4o` or `gpt-4.1`).
+ * Prompt template used to analyze URL content passed in brackets `[]`.
  * `errorlogs`
  * Contains a `tellmgrambot_error.log` file to investigate if there are problems during the interaction.
  * User will also get notified to contact the owner.
@@ -86,24 +96,27 @@ TeLLMgramBot also creates or utilizes the following environment variables that c
  If neither of these are defined, the initialization would use the top-level execution run directory.

  ## API Keys
- To operate TeLLMgramBot, three API keys are required:
- * [OpenAI](https://platform.openai.com/overview) - Drives the actual GPT AI.
- * [Telegram](https://core.telegram.org/api) - Offers a Bot API through BotFather for the messaging platform.
- * [VirusTotal](https://www.virustotal.com/gui/home/) - Performs safety checks on URLs.
+ To operate TeLLMgramBot, the following API keys are required:
+ * **[OpenAI](https://platform.openai.com/overview)** - Required when using a `gpt-*` model.
+ * **[Anthropic](https://console.anthropic.com/)** - Required when using a `claude-*` model.
+ * **[Telegram](https://core.telegram.org/api)** - Always required; offers a Bot API through BotFather.
+ * **[VirusTotal](https://www.virustotal.com/gui/home/)** - Always required; performs safety checks on URLs.

  There are two ways to populate each API key: environment variables or `.key` files.

  ### Environment Variables
- TeLLMgramBot uses the following environment variables that can be pre-loaded with the three API keys respectively:
- 1. `TELLMGRAMBOT_OPENAI_API_KEY`
- 2. `TELLMGRAMBOT_TELEGRAM_API_KEY`
- 3. `TELLMGRAMBOT_VIRUSTOTAL_API_KEY`
+ TeLLMgramBot uses the following environment variables for API keys:
+ 1. `TELLMGRAMBOT_OPENAI_API_KEY` *(OpenAI models)*
+ 2. `TELLMGRAMBOT_ANTHROPIC_API_KEY` *(Anthropic models)*
+ 3. `TELLMGRAMBOT_TELEGRAM_API_KEY`
+ 4. `TELLMGRAMBOT_VIRUSTOTAL_API_KEY`

  During spin-up time, a user can call out `os.environ[env_var]` to set those variables, like the following example:
  ```
  my_keys = Some_Vault_Fetch_Function()

- os.environ['TELLMGRAMBOT_OPENAI_API_KEY'] = my_keys['GPTKey']
+ os.environ['TELLMGRAMBOT_OPENAI_API_KEY'] = my_keys['OpenAIKey']
+ os.environ['TELLMGRAMBOT_ANTHROPIC_API_KEY'] = my_keys['AnthropicKey']
  os.environ['TELLMGRAMBOT_TELEGRAM_API_KEY'] = my_keys['BotFatherToken']
  os.environ['TELLMGRAMBOT_VIRUSTOTAL_API_KEY'] = my_keys['VirusTotalToken']
  ```
@@ -111,27 +124,27 @@ os.environ['TELLMGRAMBOT_VIRUSTOTAL_API_KEY'] = my_keys['VirusTotalToken']
  This means the user can implement whatever key vault they want to fetch the keys at runtime, without needing files stored in the directory.

  ### API Key Files
- The other route is to create three files by the base path during execution or a specified environment variable `TELLMGRAMBOT_KEYS_PATH`. By default, three files are created for the user to input each API key:
+ The other route is to create files by the base path during execution or a specified environment variable `TELLMGRAMBOT_KEYS_PATH`. By default, files are created for the user to input each API key:
  1. `openai.key`
- 2. `telegram.key`
- 3. `virustotal.key`
+ 2. `anthropic.key` _(planned; environment variable only for now, see Phase 3)_
+ 3. `telegram.key`
+ 4. `virustotal.key`

  Each file with the associated API key will update its respective environment variable if not defined.

  ## Bot Setup
- This library includes an example script `test_local.py`, which uses files from the folders `configs` and `prompts` for TeLLMgramBot to process. The bot communicates with OpenAI via the **Responses API**, which replaces the older Chat Completions endpoint.
+ This library includes an example script `test_local.py`, which uses files from the folders `configs` and `prompts` for TeLLMgramBot to process.
  1. Ensure the previous sections are followed with the proper API keys and your Telegram bot set.
  2. Install this library via PIP (`pip install TeLLMgramBot`) and then import into your project.
- 3. Instantiate the bot by passing in various configuration pieces needed below:
+ 3. Instantiate the bot by passing in various configuration pieces needed below.
+    Note the Telegram bot's full name and username auto-populate before startup.
  ```
  telegram_bot = TeLLMgramBot.TelegramBot(
-     bot_username = <Bot username like 'friendly_bot'>,
      bot_owner = <Bot owner's Telegram username>,
-     bot_name = <Bot name like 'Friendly Bot'>,
      bot_nickname = <Bot nickname like 'Botty'>,
      bot_initials = <Bot initials like 'FB'>,
-     chat_model = <Conversation model like 'gpt-4o-mini'>,
-     url_model = <URL analysis model like 'gpt-4o'>,
+     chat_model = <Conversation model like 'gpt-4o-mini' or 'claude-sonnet-4-6'>,
+     url_model = <URL analysis model like 'gpt-4o' or 'claude-haiku-4-5'>,
      token_limit = <Maximum token count set, by default chat_model max>,
      persona_temp = <LLM factual to creative value [0-2], by default 1.0>,
      persona_prompt = <System prompt summarizing bot personality>
@@ -142,14 +155,15 @@ This library includes an example script `test_local.py`, which uses files from t
  telegram_bot.start_polling()
  ```
  Once you see `TeLLMgramBot polling...`, the bot is online in Telegram.
- 5. Typing `/help` shows all available commands reported by the `configs/commands.txt` file.
- 6. Only as owner, type `/start` directly to the bot to initiate user conversations.
+ 5. Converse! Type `/help` for all available commands.

  ## Resources
  * GitHub repository [python-telegram-bot](https://github.com/python-telegram-bot/python-telegram-bot) has guides to create a Telegram bot.
- * For more information on OpenAI models like `gpt-4o` and token limits, see the following:
- * [OpenAI model overview and maximum tokens](https://platform.openai.com/docs/models).
- * [OpenAI message conversion to tokens](https://github.com/openai/openai-python).
- * [OpenAI custom fine-tuning](https://platform.openai.com/docs/guides/model-optimization).
- * [OpenAI's tiktoken library, including some helpful guides](https://github.com/openai/tiktoken/tree/main).
- * [OpenAI Playground](https://platform.openai.com/playground) is a great place to test out prompts and responses.
+ * For more information on OpenAI models and token limits:
+ * [OpenAI model overview and maximum tokens](https://platform.openai.com/docs/models)
+ * [OpenAI message conversion to tokens](https://github.com/openai/openai-python)
+ * [OpenAI custom fine-tuning](https://platform.openai.com/docs/guides/model-optimization)
+ * [OpenAI's tiktoken library](https://github.com/openai/tiktoken/tree/main)
+ * For more information on Anthropic Claude models:
+ * [Anthropic model overview and context windows](https://docs.anthropic.com/en/docs/about-claude/models)
+ * [Anthropic Python SDK](https://github.com/anthropics/anthropic-sdk-python)
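The token handling described in this diff (prune the oldest messages until the conversation fits within the model's token limit) amounts to a simple drop-from-the-front loop. A sketch of that idea; pairing each message with a pre-counted token size is an assumption for illustration, not how `tokenGPT.py` necessarily stores things:

```python
def prune_to_limit(messages: list[tuple[str, int]],
                   token_limit: int) -> list[tuple[str, int]]:
    """Drop oldest (message, token_count) pairs until the total fits the limit."""
    total = sum(count for _, count in messages)
    i = 0
    # Oldest messages come first, so advance past them until we fit.
    while total > token_limit and i < len(messages):
        total -= messages[i][1]
        i += 1
    return messages[i:]
```

With a limit of 100, a history costing 50 + 60 + 30 tokens would lose only its oldest message, keeping the most recent context intact.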
{tellmgrambot-2.2.0 → tellmgrambot-2.4.0}/README.md
@@ -1,7 +1,7 @@
  # TeLLMgramBot
- The basic goal of this project is to create a bridge between a Telegram Bot and a Large Langage Model (LLM), like OpenAI's GPT models.
+ The basic goal of this project is to create a bridge between a Telegram Bot and a Large Language Model (LLM), supporting both OpenAI's GPT models and Anthropic's Claude models.
  * To use this library, you must have a Telegram account **with a user name**, not just a phone number. If you don't have one, [create one online](https://telegram.org/).
- * If added to a Telgram group, the bot must be [adminstrator](https://www.alphr.com/add-admin-telegram/) in order to respond to a user calling out its name, initials, or nickname.
+ * If added to a Telegram group, the bot must be [administrator](https://www.alphr.com/add-admin-telegram/) in order to respond to a user calling out its name, initials, or nickname.
  <img src="assets/TeLLMgramBot_Logo.png" width=200 align=center />

  ## Telegram Bot + LLM Encapsulation
@@ -9,36 +9,45 @@ The basic goal of this project is to create a bridge between a Telegram Bot and
  * The more dynamic conversation gets handed off to the LLM to manage prompts and responses, and Telegram acts as the interaction broker.
  * Pass the URL in [square brackets] and mention how the bot should interpret it.
  * Example: "What do you think of this article? [https://some_site/article]"
- * This uses another GPT model, preferably GPT-5 or GPT-4o, to support more URL content with its higher token limit.
+ * This uses a separate model (configurable via `url_model`) to support more URL content with its higher token limit.
  * Tokens are used to measure the length of all conversation messages between the Telegram bot assistant and the user. This is useful to:
  * Ensure the length does not go over the model limit. If it does, prune oldest messages to fit within the limit.
  * Remember 50% of the past conversations when starting up TeLLMgramBot again.
  * Users can also clear their conversation history for privacy.

  ## Why Telegram?
- Using Telegram as the interface not only solves "exposing" the interface, but gives you boadloads of interactivity over a standard Command Line interface, or trying to create a website with input boxes and submit buttons to try to handle everything:
+ Using Telegram as the interface not only solves "exposing" the interface, but gives you boatloads of interactivity over a standard Command Line interface, or trying to create a website with input boxes and submit buttons to try to handle everything:
  1. Telegram already lets you paste in verbose, multiline messages.
  2. Telegram already lets you paste in pictures, videos, links, etc.
  3. Telegram already lets you react with emojis, stickers, etc.

+ ## Supported LLM Providers
+ TeLLMgramBot selects the LLM provider automatically based on the model name:
+
+ | Model prefix | Provider | Example models |
+ |---|---|---|
+ | `gpt-` | OpenAI | `gpt-4o`, `gpt-4o-mini`, `gpt-5-mini` |
+ | `claude-` | Anthropic | `claude-sonnet-4-6`, `claude-haiku-4-5` |
+
+ Simply set `chat_model` (and optionally `url_model`) in your `config.yaml` to any supported model and supply the corresponding API key; no other changes needed.
+
  ## Directories
  When initializing TeLLMgramBot, the following directories get created:
  * `configs` - Contains bot configuration files.
- * `commands.txt`
- * Users can type `/help` interacting with the to see this file's text get displayed in Telegram.
  * `config.yaml` (can be a different name)
- * This file sets main OpenAI parameters like naming and GPT models to process.
- * The parameter `url_model` is to read URL content, different than `chat_model` that the bot normally uses to interact with the user.
- * An empty `token_limit` would do the maximum amount of tokens supported by the `chat_model` (e.g. 128000 for `gpt-4o-mini`).
+ * This file sets main bot parameters like naming and the LLM models to use.
+ * `chat_model` is the model used for normal conversation (e.g. `gpt-5-mini` or `claude-sonnet-4-6`).
+ * `url_model` is the model used to read and summarize URL content; it can differ from `chat_model`.
+ * An empty `token_limit` will use the maximum tokens supported by the `chat_model`.
  * `tokenGPT.yaml`
- * This important YAML file contains token size parameters for supported OpenAI models.
- * If the first time, `gpt-5`, `gpt-5-mini`, `gpt-5-nano`, `gpt-4o`, and `gpt-4o-mini` get populated, but the user can specify more models with token size parameters as needed.
+ * Contains token size parameters for all supported models.
+ * On first run, GPT and Claude model families are pre-populated. Additional models can be added manually.
  * `prompts` - Contains prompt files for how the bot interacts with any user.
- * `test_personality.prmpt`
- * This is a sample prompt file as a basis to test this library.
- * The user can create more prompt files as needed for different personalities. See [OpenAI Playground](https://platform.openai.com/playground) to test some ideas.
+ * `test_personality.prmpt` (can be a different name)
+ * A sample prompt file as a basis to test this library.
+ * The user can create more prompt files as needed for different personalities.
  * `url_analysis.prmpt`
- * This is a crucial prompt file to analyze URL content in brackets `[]` in a different model (such as `gpt-4o` or `gpt-4.1`).
+ * Prompt template used to analyze URL content passed in brackets `[]`.
  * `errorlogs`
  * Contains a `tellmgrambot_error.log` file to investigate if there are problems during the interaction.
  * User will also get notified to contact the owner.
@@ -56,24 +65,27 @@ TeLLMgramBot also creates or utilizes the following environment variables that c
  If neither of these are defined, the initialization would use the top-level execution run directory.

  ## API Keys
- To operate TeLLMgramBot, three API keys are required:
- * [OpenAI](https://platform.openai.com/overview) - Drives the actual GPT AI.
- * [Telegram](https://core.telegram.org/api) - Offers a Bot API through BotFather for the messaging platform.
- * [VirusTotal](https://www.virustotal.com/gui/home/) - Performs safety checks on URLs.
+ To operate TeLLMgramBot, the following API keys are required:
+ * **[OpenAI](https://platform.openai.com/overview)** - Required when using a `gpt-*` model.
+ * **[Anthropic](https://console.anthropic.com/)** - Required when using a `claude-*` model.
+ * **[Telegram](https://core.telegram.org/api)** - Always required; offers a Bot API through BotFather.
+ * **[VirusTotal](https://www.virustotal.com/gui/home/)** - Always required; performs safety checks on URLs.

  There are two ways to populate each API key: environment variables or `.key` files.

  ### Environment Variables
- TeLLMgramBot uses the following environment variables that can be pre-loaded with the three API keys respectively:
- 1. `TELLMGRAMBOT_OPENAI_API_KEY`
- 2. `TELLMGRAMBOT_TELEGRAM_API_KEY`
- 3. `TELLMGRAMBOT_VIRUSTOTAL_API_KEY`
+ TeLLMgramBot uses the following environment variables for API keys:
+ 1. `TELLMGRAMBOT_OPENAI_API_KEY` *(OpenAI models)*
+ 2. `TELLMGRAMBOT_ANTHROPIC_API_KEY` *(Anthropic models)*
+ 3. `TELLMGRAMBOT_TELEGRAM_API_KEY`
+ 4. `TELLMGRAMBOT_VIRUSTOTAL_API_KEY`

  During spin-up time, a user can call out `os.environ[env_var]` to set those variables, like the following example:
  ```
  my_keys = Some_Vault_Fetch_Function()

- os.environ['TELLMGRAMBOT_OPENAI_API_KEY'] = my_keys['GPTKey']
+ os.environ['TELLMGRAMBOT_OPENAI_API_KEY'] = my_keys['OpenAIKey']
+ os.environ['TELLMGRAMBOT_ANTHROPIC_API_KEY'] = my_keys['AnthropicKey']
  os.environ['TELLMGRAMBOT_TELEGRAM_API_KEY'] = my_keys['BotFatherToken']
  os.environ['TELLMGRAMBOT_VIRUSTOTAL_API_KEY'] = my_keys['VirusTotalToken']
  ```
@@ -81,27 +93,27 @@ os.environ['TELLMGRAMBOT_VIRUSTOTAL_API_KEY'] = my_keys['VirusTotalToken']
  This means the user can implement whatever key vault they want to fetch the keys at runtime, without needing files stored in the directory.

  ### API Key Files
- The other route is to create three files by the base path during execution or a specified environment variable `TELLMGRAMBOT_KEYS_PATH`. By default, three files are created for the user to input each API key:
+ The other route is to create files by the base path during execution or a specified environment variable `TELLMGRAMBOT_KEYS_PATH`. By default, files are created for the user to input each API key:
  1. `openai.key`
- 2. `telegram.key`
- 3. `virustotal.key`
+ 2. `anthropic.key` _(planned; environment variable only for now, see Phase 3)_
+ 3. `telegram.key`
+ 4. `virustotal.key`

  Each file with the associated API key will update its respective environment variable if not defined.

  ## Bot Setup
- This library includes an example script `test_local.py`, which uses files from the folders `configs` and `prompts` for TeLLMgramBot to process. The bot communicates with OpenAI via the **Responses API**, which replaces the older Chat Completions endpoint.
+ This library includes an example script `test_local.py`, which uses files from the folders `configs` and `prompts` for TeLLMgramBot to process.
  1. Ensure the previous sections are followed with the proper API keys and your Telegram bot set.
  2. Install this library via PIP (`pip install TeLLMgramBot`) and then import into your project.
- 3. Instantiate the bot by passing in various configuration pieces needed below:
+ 3. Instantiate the bot by passing in various configuration pieces needed below.
+    Note the Telegram bot's full name and username auto-populate before startup.
  ```
  telegram_bot = TeLLMgramBot.TelegramBot(
-     bot_username = <Bot username like 'friendly_bot'>,
      bot_owner = <Bot owner's Telegram username>,
-     bot_name = <Bot name like 'Friendly Bot'>,
      bot_nickname = <Bot nickname like 'Botty'>,
      bot_initials = <Bot initials like 'FB'>,
-     chat_model = <Conversation model like 'gpt-4o-mini'>,
-     url_model = <URL analysis model like 'gpt-4o'>,
+     chat_model = <Conversation model like 'gpt-4o-mini' or 'claude-sonnet-4-6'>,
+     url_model = <URL analysis model like 'gpt-4o' or 'claude-haiku-4-5'>,
      token_limit = <Maximum token count set, by default chat_model max>,
      persona_temp = <LLM factual to creative value [0-2], by default 1.0>,
      persona_prompt = <System prompt summarizing bot personality>
@@ -112,14 +124,15 @@ This library includes an example script `test_local.py`, which uses files from t
  telegram_bot.start_polling()
  ```
  Once you see `TeLLMgramBot polling...`, the bot is online in Telegram.
- 5. Typing `/help` shows all available commands reported by the `configs/commands.txt` file.
- 6. Only as owner, type `/start` directly to the bot to initiate user conversations.
+ 5. Converse! Type `/help` for all available commands.

  ## Resources
  * GitHub repository [python-telegram-bot](https://github.com/python-telegram-bot/python-telegram-bot) has guides to create a Telegram bot.
- * For more information on OpenAI models like `gpt-4o` and token limits, see the following:
- * [OpenAI model overview and maximum tokens](https://platform.openai.com/docs/models).
- * [OpenAI message conversion to tokens](https://github.com/openai/openai-python).
- * [OpenAI custom fine-tuning](https://platform.openai.com/docs/guides/model-optimization).
- * [OpenAI's tiktoken library, including some helpful guides](https://github.com/openai/tiktoken/tree/main).
- * [OpenAI Playground](https://platform.openai.com/playground) is a great place to test out prompts and responses.
+ * For more information on OpenAI models and token limits:
+ * [OpenAI model overview and maximum tokens](https://platform.openai.com/docs/models)
+ * [OpenAI message conversion to tokens](https://github.com/openai/openai-python)
+ * [OpenAI custom fine-tuning](https://platform.openai.com/docs/guides/model-optimization)
+ * [OpenAI's tiktoken library](https://github.com/openai/tiktoken/tree/main)
+ * For more information on Anthropic Claude models:
+ * [Anthropic model overview and context windows](https://docs.anthropic.com/en/docs/about-claude/models)
+ * [Anthropic Python SDK](https://github.com/anthropics/anthropic-sdk-python)