langchain-githubcopilot-chat 0.1.2__tar.gz → 0.3.0__tar.gz
This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
- langchain_githubcopilot_chat-0.3.0/LICENSE +21 -0
- langchain_githubcopilot_chat-0.3.0/PKG-INFO +137 -0
- langchain_githubcopilot_chat-0.3.0/README.md +115 -0
- {langchain_githubcopilot_chat-0.1.2 → langchain_githubcopilot_chat-0.3.0}/langchain_githubcopilot_chat/__init__.py +7 -10
- langchain_githubcopilot_chat-0.3.0/langchain_githubcopilot_chat/auth.py +96 -0
- {langchain_githubcopilot_chat-0.1.2 → langchain_githubcopilot_chat-0.3.0}/langchain_githubcopilot_chat/chat_models.py +52 -26
- {langchain_githubcopilot_chat-0.1.2 → langchain_githubcopilot_chat-0.3.0}/langchain_githubcopilot_chat/embeddings.py +24 -24
- {langchain_githubcopilot_chat-0.1.2 → langchain_githubcopilot_chat-0.3.0}/pyproject.toml +1 -1
- langchain_githubcopilot_chat-0.1.2/PKG-INFO +0 -69
- langchain_githubcopilot_chat-0.1.2/README.md +0 -46
- langchain_githubcopilot_chat-0.1.2/langchain_githubcopilot_chat/document_loaders.py +0 -73
- langchain_githubcopilot_chat-0.1.2/langchain_githubcopilot_chat/retrievers.py +0 -107
- langchain_githubcopilot_chat-0.1.2/langchain_githubcopilot_chat/toolkits.py +0 -72
- langchain_githubcopilot_chat-0.1.2/langchain_githubcopilot_chat/tools.py +0 -94
- langchain_githubcopilot_chat-0.1.2/langchain_githubcopilot_chat/vectorstores.py +0 -439
- /langchain_githubcopilot_chat-0.1.2/LICENSE → /langchain_githubcopilot_chat-0.3.0/LICENSE.langchain +0 -0
- {langchain_githubcopilot_chat-0.1.2 → langchain_githubcopilot_chat-0.3.0}/langchain_githubcopilot_chat/py.typed +0 -0
langchain_githubcopilot_chat-0.3.0/LICENSE

@@ -0,0 +1,21 @@
+MIT License
+
+Copyright (c) 2026 YIhan Wu
+
+Permission is hereby granted, free of charge, to any person obtaining a copy
+of this software and associated documentation files (the "Software"), to deal
+in the Software without restriction, including without limitation the rights
+to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
+copies of the Software, and to permit persons to whom the Software is
+furnished to do so, subject to the following conditions:
+
+The above copyright notice and this permission notice shall be included in all
+copies or substantial portions of the Software.
+
+THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
+IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
+FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
+AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
+LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
+OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
+SOFTWARE.
langchain_githubcopilot_chat-0.3.0/PKG-INFO

@@ -0,0 +1,137 @@
+Metadata-Version: 2.1
+Name: langchain-githubcopilot-chat
+Version: 0.3.0
+Summary: An integration package connecting GithubcopilotChat and LangChain
+Home-page: https://github.com/langchain-ai/langchain
+License: MIT
+Author: YIhan Wu
+Author-email: iumm@ibat.ac.cn
+Requires-Python: >=3.10,<4.0
+Classifier: License :: OSI Approved :: MIT License
+Classifier: Programming Language :: Python :: 3
+Classifier: Programming Language :: Python :: 3.10
+Classifier: Programming Language :: Python :: 3.11
+Classifier: Programming Language :: Python :: 3.12
+Classifier: Programming Language :: Python :: 3.13
+Requires-Dist: httpx (>=0.28.1)
+Requires-Dist: langchain-core (>=1.1.0,<2.0.0)
+Project-URL: Repository, https://github.com/langchain-ai/langchain
+Project-URL: Release Notes, https://github.com/langchain-ai/langchain/releases?q=tag%3A%22githubcopilot-chat%3D%3D0%22&expanded=true
+Project-URL: Source Code, https://github.com/langchain-ai/langchain/tree/master/libs/partners/githubcopilot-chat
+Description-Content-Type: text/markdown
+
+# LangChain GitHub Copilot Chat
+
+This package provides a LangChain integration for **GitHub Copilot**, allowing you to use Copilot's models (including GPT-4o, Claude 3.5 Sonnet, etc.) as standard LangChain `BaseChatModel` components.
+
+Unlike other integrations, this package mimics the official VS Code Copilot Chat extension behavior, providing access to the full suite of models available to Copilot subscribers.
+
+## 🚀 Features
+
+- **Real Copilot API**: Connects to `api.githubcopilot.com` using official VS Code headers.
+- **Easy Auth**: Built-in GitHub Device Flow for acquiring a valid Copilot Token.
+- **Model Discovery**: Dynamic fetching of all models authorized for your account.
+- **LangChain Native**: Full support for Streaming, Tool Calling, and Async operations.
+
+## 📦 Installation
+
+```bash
+pip install -U langchain-githubcopilot-chat
+```
+
+## 🔐 Authentication
+
+To use GitHub Copilot, you need a valid Copilot Token. You can obtain one interactively using the built-in helper:
+
+```python
+from langchain_githubcopilot_chat import get_vscode_token
+
+# This will prompt you to visit a GitHub URL and enter a code
+token = get_vscode_token()
+print(f"Your Token: {token}")
+```
+
+For custom output handling (e.g., in GUI applications), pass a callback:
+
+```python
+from langchain_githubcopilot_chat import get_copilot_token
+
+def on_message(msg):
+    # Handle status messages (e.g., display in UI)
+    print(f"[Copilot] {msg}")
+
+token = get_copilot_token(callback=on_message)
+```
+
+Alternatively, set it as an environment variable:
+```bash
+export GITHUB_TOKEN="your_copilot_token_here"
+```
+
+## 🛠 Usage
+
+### Chat Models
+
+Access any model supported by Copilot (e.g., `gpt-4o`, `gpt-4o-mini`, `claude-3.5-sonnet`).
+
+```python
+from langchain_githubcopilot_chat import ChatGithubCopilot
+
+# Initialize with a specific model
+llm = ChatGithubCopilot(
+    model="gpt-4o",
+    temperature=0.7
+)
+
+# Simple invocation
+response = llm.invoke("Explain Quantum Entanglement in one sentence.")
+print(response.content)
+
+# Streaming
+for chunk in llm.stream("Write a short poem about coding."):
+    print(chunk.content, end="", flush=True)
+```
+
+### Discovering Available Models
+
+GitHub Copilot periodically updates its available models. You can list what's currently available for your token:
+
+```python
+from langchain_githubcopilot_chat import get_available_models
+
+models = get_available_models()
+for model in models:
+    print(f"ID: {model['id']} - Name: {model.get('name')}")
+```
+
+### Embeddings
+
+Use Copilot's embedding models for RAG or semantic search:
+
+```python
+from langchain_githubcopilot_chat import GithubcopilotChatEmbeddings
+
+embeddings = GithubcopilotChatEmbeddings(model="text-embedding-3-small")
+vector = embeddings.embed_query("GitHub Copilot is awesome!")
+```
+
+## 📖 Advanced: Tool Calling
+
+```python
+from pydantic import BaseModel, Field
+from langchain_githubcopilot_chat import ChatGithubCopilot
+
+class GetWeather(BaseModel):
+    """Get the current weather in a given location."""
+    location: str = Field(..., description="The city and state, e.g. San Francisco, CA")
+
+llm = ChatGithubCopilot(model="gpt-4o")
+llm_with_tools = llm.bind_tools([GetWeather])
+
+ai_msg = llm_with_tools.invoke("What's the weather like in Tokyo?")
+print(ai_msg.tool_calls)
+```
+
+## ⚖️ Disclaimer
+
+This project is an independent community integration and is not affiliated with, endorsed by, or supported by GitHub, Inc. Usage of this package must comply with GitHub's [Terms of Service](https://docs.github.com/en/site-policy/github-terms/github-terms-of-service).
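The tool-calling example in the package README stops at printing `ai_msg.tool_calls`. Completing the loop means running each requested tool locally and returning the result to the model. The offline sketch below dispatches a `tool_calls`-shaped payload; the payload, the `get_weather` implementation, and its return value are made up for illustration and are not part of the package.

```python
# Offline sketch: dispatching LangChain-style tool calls to local functions.
# The shape of `tool_calls` mirrors what `ai_msg.tool_calls` returns
# (a list of dicts with "name", "args", "id"); the weather data is invented.

def get_weather(location: str) -> str:
    """Hypothetical tool implementation."""
    return f"Sunny in {location}"

TOOLS = {"GetWeather": get_weather}

# Sample payload in the shape LangChain produces for ai_msg.tool_calls.
tool_calls = [
    {"name": "GetWeather", "args": {"location": "Tokyo"}, "id": "call_1"},
]

results = []
for call in tool_calls:
    fn = TOOLS[call["name"]]
    # Each result would normally be wrapped in a ToolMessage and sent back.
    results.append({"tool_call_id": call["id"], "content": fn(**call["args"])})

print(results[0]["content"])  # Sunny in Tokyo
```

In a real session each entry of `results` would be sent back to the model as a `ToolMessage` so it can produce the final answer.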
langchain_githubcopilot_chat-0.3.0/README.md

@@ -0,0 +1,115 @@
+# LangChain GitHub Copilot Chat
+
+This package provides a LangChain integration for **GitHub Copilot**, allowing you to use Copilot's models (including GPT-4o, Claude 3.5 Sonnet, etc.) as standard LangChain `BaseChatModel` components.
+
+Unlike other integrations, this package mimics the official VS Code Copilot Chat extension behavior, providing access to the full suite of models available to Copilot subscribers.
+
+## 🚀 Features
+
+- **Real Copilot API**: Connects to `api.githubcopilot.com` using official VS Code headers.
+- **Easy Auth**: Built-in GitHub Device Flow for acquiring a valid Copilot Token.
+- **Model Discovery**: Dynamic fetching of all models authorized for your account.
+- **LangChain Native**: Full support for Streaming, Tool Calling, and Async operations.
+
+## 📦 Installation
+
+```bash
+pip install -U langchain-githubcopilot-chat
+```
+
+## 🔐 Authentication
+
+To use GitHub Copilot, you need a valid Copilot Token. You can obtain one interactively using the built-in helper:
+
+```python
+from langchain_githubcopilot_chat import get_vscode_token
+
+# This will prompt you to visit a GitHub URL and enter a code
+token = get_vscode_token()
+print(f"Your Token: {token}")
+```
+
+For custom output handling (e.g., in GUI applications), pass a callback:
+
+```python
+from langchain_githubcopilot_chat import get_copilot_token
+
+def on_message(msg):
+    # Handle status messages (e.g., display in UI)
+    print(f"[Copilot] {msg}")
+
+token = get_copilot_token(callback=on_message)
+```
+
+Alternatively, set it as an environment variable:
+```bash
+export GITHUB_TOKEN="your_copilot_token_here"
+```
+
+## 🛠 Usage
+
+### Chat Models
+
+Access any model supported by Copilot (e.g., `gpt-4o`, `gpt-4o-mini`, `claude-3.5-sonnet`).
+
+```python
+from langchain_githubcopilot_chat import ChatGithubCopilot
+
+# Initialize with a specific model
+llm = ChatGithubCopilot(
+    model="gpt-4o",
+    temperature=0.7
+)
+
+# Simple invocation
+response = llm.invoke("Explain Quantum Entanglement in one sentence.")
+print(response.content)
+
+# Streaming
+for chunk in llm.stream("Write a short poem about coding."):
+    print(chunk.content, end="", flush=True)
+```
+
+### Discovering Available Models
+
+GitHub Copilot periodically updates its available models. You can list what's currently available for your token:
+
+```python
+from langchain_githubcopilot_chat import get_available_models
+
+models = get_available_models()
+for model in models:
+    print(f"ID: {model['id']} - Name: {model.get('name')}")
+```
+
+### Embeddings
+
+Use Copilot's embedding models for RAG or semantic search:
+
+```python
+from langchain_githubcopilot_chat import GithubcopilotChatEmbeddings
+
+embeddings = GithubcopilotChatEmbeddings(model="text-embedding-3-small")
+vector = embeddings.embed_query("GitHub Copilot is awesome!")
+```
+
+## 📖 Advanced: Tool Calling
+
+```python
+from pydantic import BaseModel, Field
+from langchain_githubcopilot_chat import ChatGithubCopilot
+
+class GetWeather(BaseModel):
+    """Get the current weather in a given location."""
+    location: str = Field(..., description="The city and state, e.g. San Francisco, CA")
+
+llm = ChatGithubCopilot(model="gpt-4o")
+llm_with_tools = llm.bind_tools([GetWeather])
+
+ai_msg = llm_with_tools.invoke("What's the weather like in Tokyo?")
+print(ai_msg.tool_calls)
+```
+
+## ⚖️ Disclaimer
+
+This project is an independent community integration and is not affiliated with, endorsed by, or supported by GitHub, Inc. Usage of this package must comply with GitHub's [Terms of Service](https://docs.github.com/en/site-policy/github-terms/github-terms-of-service).
langchain_githubcopilot_chat/__init__.py

@@ -1,15 +1,11 @@
 from importlib import metadata
 
+from langchain_githubcopilot_chat.auth import get_copilot_token
 from langchain_githubcopilot_chat.chat_models import (
     ChatGithubCopilot,
     ChatGithubcopilotChat,
 )
-from langchain_githubcopilot_chat.document_loaders import GithubcopilotChatLoader
 from langchain_githubcopilot_chat.embeddings import GithubcopilotChatEmbeddings
-from langchain_githubcopilot_chat.retrievers import GithubcopilotChatRetriever
-from langchain_githubcopilot_chat.toolkits import GithubcopilotChatToolkit
-from langchain_githubcopilot_chat.tools import GithubcopilotChatTool
-from langchain_githubcopilot_chat.vectorstores import GithubcopilotChatVectorStore
 
 try:
     __version__ = metadata.version(__package__)

@@ -18,14 +14,15 @@ except metadata.PackageNotFoundError:
     __version__ = ""
 del metadata  # optional, avoids polluting the results of dir(__package__)
 
+get_available_models = ChatGithubCopilot.get_available_models
+get_vscode_token = get_copilot_token
+
 __all__ = [
     "ChatGithubCopilot",
     "ChatGithubcopilotChat",  # backwards-compatible alias
-    "GithubcopilotChatVectorStore",
     "GithubcopilotChatEmbeddings",
-    "
-    "
-    "
-    "GithubcopilotChatTool",
+    "get_copilot_token",
+    "get_vscode_token",
+    "get_available_models",
     "__version__",
 ]
langchain_githubcopilot_chat/auth.py

@@ -0,0 +1,96 @@
+import time
+from typing import Callable, Optional
+
+import httpx
+
+CLIENT_ID = "Iv1.b507a08c87ecfe98"
+
+
+def get_copilot_token(
+    client_id: str = CLIENT_ID, callback: Optional[Callable[[str], None]] = None
+) -> Optional[str]:
+    """
+    Authenticate via GitHub Device Flow to get a Copilot Token.
+    This function will block and wait for the user to complete the
+    authorization in their browser.
+
+    Args:
+        client_id: The GitHub OAuth App Client ID to use. Defaults
+            to the VS Code Copilot Chat client ID.
+        callback: Optional callable that receives status messages instead of
+            printing them. If None, messages are printed to stdout.
+
+    Returns:
+        The fetched Copilot Token string, or None if authentication failed.
+    """
+
+    def _print(msg: str) -> None:
+        if callback:
+            callback(msg)
+        else:
+            print(msg)  # noqa: T201
+
+    _print("1. Requesting device code from GitHub...")
+    with httpx.Client() as client:
+        res = client.post(
+            "https://github.com/login/device/code",
+            headers={"Accept": "application/json"},
+            data={"client_id": client_id, "scope": "read:user"},
+        )
+        res.raise_for_status()
+        data = res.json()
+
+    device_code = data.get("device_code")
+    user_code = data.get("user_code")
+    verification_uri = data.get("verification_uri")
+    interval = data.get("interval", 5)
+
+    _print("\n==========================================")
+    _print(f"Please open your browser to: {verification_uri}")
+    _print(f"And enter the authorization code: {user_code}")
+    _print("==========================================\n")
+    _print(f"Waiting for authorization (checking every {interval} seconds)...")
+
+    access_token = None
+    with httpx.Client() as client:
+        while True:
+            token_res = client.post(
+                "https://github.com/login/oauth/access_token",
+                headers={"Accept": "application/json"},
+                data={
+                    "client_id": client_id,
+                    "device_code": device_code,
+                    "grant_type": "urn:ietf:params:oauth:grant-type:device_code",
+                },
+            ).json()
+
+            if "access_token" in token_res:
+                access_token = token_res["access_token"]
+                _print(
+                    "\n✅ Authorization successful! Exchanging for Copilot Token..."
+                )
+                break
+            elif token_res.get("error") == "authorization_pending":
+                time.sleep(interval)
+            else:
+                _print(f"\n❌ Authorization failed: {token_res}")
+                return None
+
+        # Exchange the standard access token for a Copilot internal token
+        copilot_res = client.get(
+            "https://api.github.com/copilot_internal/v2/token",
+            headers={
+                "Authorization": f"token {access_token}",
+                "Accept": "application/json",
+                "Editor-Version": "vscode/1.104.1",
+                "Editor-Plugin-Version": "copilot-chat/0.26.7",
+            },
+        )
+
+        if copilot_res.status_code == 200:
+            copilot_token = copilot_res.json().get("token")
+            _print("🎉 Successfully acquired Copilot Token!")
+            return copilot_token
+        else:
+            _print(f"❌ Failed to acquire Copilot Token: {copilot_res.text}")
+            return None
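The core of `get_copilot_token` is its polling loop: retry while GitHub reports `authorization_pending`, succeed on `access_token`, and bail out on anything else. That control flow can be isolated as a pure function; the sketch below injects the HTTP call so it runs offline (`poll_for_access_token` and the fake responses are illustrative, not part of the package).

```python
import time

def poll_for_access_token(post, device_code: str, interval: float = 0.0):
    """Poll the token endpoint until GitHub reports success or a hard error.

    `post` is any callable taking the device code and returning a parsed
    JSON dict; injecting it keeps the retry logic testable without a network.
    """
    while True:
        res = post(device_code)
        if "access_token" in res:
            return res["access_token"]
        if res.get("error") == "authorization_pending":
            time.sleep(interval)  # GitHub asks clients to wait `interval` seconds
            continue
        return None  # denied, expired, or unknown error

# Fake endpoint: pending twice, then authorized.
responses = iter([
    {"error": "authorization_pending"},
    {"error": "authorization_pending"},
    {"access_token": "gho_example"},
])
token = poll_for_access_token(lambda code: next(responses), "dev123")
print(token)  # gho_example
```

A production version would also honor GitHub's `slow_down` error by increasing the interval, which the packaged loop does not do.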
langchain_githubcopilot_chat/chat_models.py

@@ -56,10 +56,23 @@ _ROLE_MAP = {
     "tool": "tool",
 }
 
-
-_INFERENCE_PATH = "/
-
-
+_GITHUB_COPILOT_BASE_URL = "https://api.githubcopilot.com"
+_INFERENCE_PATH = "/chat/completions"
+
+COPILOT_EDITOR_VERSION = "vscode/1.104.1"
+COPILOT_PLUGIN_VERSION = "copilot-chat/0.26.7"
+COPILOT_INTEGRATION_ID = "vscode-chat"
+COPILOT_USER_AGENT = "GitHubCopilotChat/0.26.7"
+
+COPILOT_DEFAULT_HEADERS = {
+    "Copilot-Integration-Id": COPILOT_INTEGRATION_ID,
+    "User-Agent": COPILOT_USER_AGENT,
+    "Editor-Version": COPILOT_EDITOR_VERSION,
+    "Editor-Plugin-Version": COPILOT_PLUGIN_VERSION,
+    "editor-version": COPILOT_EDITOR_VERSION,
+    "editor-plugin-version": COPILOT_PLUGIN_VERSION,
+    "copilot-vision-request": "true",
+}
 
 
 def _message_to_dict(message: BaseMessage) -> Dict[str, Any]:
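The hunk above centralizes the VS Code identification headers in `COPILOT_DEFAULT_HEADERS` and merges them into every request after the per-request auth headers. A minimal offline sketch of that merge order (a trimmed subset of the constants; the token value is a placeholder and `build_headers` is a free-function stand-in for the `_build_headers` method):

```python
# Shared identification headers, as added in this hunk (subset shown).
COPILOT_DEFAULT_HEADERS = {
    "Copilot-Integration-Id": "vscode-chat",
    "User-Agent": "GitHubCopilotChat/0.26.7",
    "Editor-Version": "vscode/1.104.1",
    "Editor-Plugin-Version": "copilot-chat/0.26.7",
}

def build_headers(token: str) -> dict:
    # Per-request headers first, then the shared Copilot headers,
    # so the shared values win on any key conflict.
    headers = {
        "Authorization": f"Bearer {token}",
        "Accept": "application/json",
        "Content-Type": "application/json",
    }
    headers.update(COPILOT_DEFAULT_HEADERS)
    return headers

headers = build_headers("fake-token")
print(headers["User-Agent"])  # GitHubCopilotChat/0.26.7
```

Putting `update(COPILOT_DEFAULT_HEADERS)` last means a single constant controls the editor identity for every endpoint, which is why both `_build_headers` and `get_available_models` reuse it.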
@@ -403,19 +416,8 @@ class ChatGithubCopilot(BaseChatModel):
     is used.
     """
 
-    base_url: str =
-    """Base URL for the GitHub
-
-    org: Optional[str] = None
-    """Organisation login for attributed inference requests.
-
-    When set, requests are sent to
-    ``/orgs/{org}/inference/chat/completions`` instead of
-    ``/inference/chat/completions``.
-    """
-
-    api_version: str = _API_VERSION
-    """GitHub Models API version sent as the ``X-GitHub-Api-Version`` header."""
+    base_url: str = _GITHUB_COPILOT_BASE_URL
+    """Base URL for the GitHub Copilot API."""
 
     temperature: Optional[float] = None
     """Sampling temperature in ``[0, 1]``."""
@@ -491,19 +493,43 @@ class ChatGithubCopilot(BaseChatModel):
     @property
     def _inference_url(self) -> str:
         """Return the full chat-completions endpoint URL."""
-
-            path = _ORG_INFERENCE_PATH.format(org=self.org)
-        else:
-            path = _INFERENCE_PATH
-        return self.base_url.rstrip("/") + path
+        return self.base_url.rstrip("/") + _INFERENCE_PATH
 
     def _build_headers(self) -> Dict[str, str]:
-
+        headers = {
             "Authorization": f"Bearer {self._token}",
-            "Accept": "application/
+            "Accept": "application/json",
+            "Content-Type": "application/json",
+        }
+        headers.update(COPILOT_DEFAULT_HEADERS)
+        return headers
+
+    @classmethod
+    def get_available_models(
+        cls, github_token: Optional[str] = None
+    ) -> List[Dict[str, Any]]:
+        """Get the list of available models from the GitHub Copilot API."""
+        token = github_token or os.environ.get("GITHUB_TOKEN")
+        if not token:
+            raise ValueError(
+                "A GitHub token is required. Set the GITHUB_TOKEN environment "
+                "variable or pass ``github_token``."
+            )
+
+        headers = {
+            "Authorization": f"Bearer {token}",
+            "Accept": "application/json",
             "Content-Type": "application/json",
-            "X-GitHub-Api-Version": self.api_version,
         }
+        headers.update(COPILOT_DEFAULT_HEADERS)
+
+        url = f"{_GITHUB_COPILOT_BASE_URL}/models"
+
+        with httpx.Client() as client:
+            response = client.get(url, headers=headers)
+            response.raise_for_status()
+            data = response.json()
+        return data.get("data", [])
 
     def _build_payload(
         self,
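`get_available_models` returns the raw `data` array from the `/models` endpoint. A typical follow-up is filtering that list client-side. The entries below are invented sample data that assume a `capabilities.type` field, so treat the shape as an assumption rather than a documented schema:

```python
# Hypothetical sample in the shape of the /models "data" array.
sample_models = [
    {"id": "gpt-4o", "name": "GPT-4o", "capabilities": {"type": "chat"}},
    {"id": "text-embedding-3-small", "name": "Embedding v3 small",
     "capabilities": {"type": "embeddings"}},
]

def chat_model_ids(models):
    """Keep only the ids of chat-capable models (assumed schema)."""
    return [m["id"] for m in models
            if m.get("capabilities", {}).get("type") == "chat"]

print(chat_model_ids(sample_models))  # ['gpt-4o']
```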
@@ -712,7 +738,7 @@ class ChatGithubCopilot(BaseChatModel):
 
     @property
     def _llm_type(self) -> str:
-        return "
+        return "github-copilot"
 
     @property
     def _identifying_params(self) -> Dict[str, Any]:
langchain_githubcopilot_chat/embeddings.py

@@ -9,10 +9,23 @@ import httpx
 from langchain_core.embeddings import Embeddings
 from pydantic import BaseModel, Field, SecretStr, model_validator
 
-
-_EMBEDDINGS_PATH = "/
-
-
+_GITHUB_COPILOT_BASE_URL = "https://api.githubcopilot.com"
+_EMBEDDINGS_PATH = "/embeddings"
+
+COPILOT_EDITOR_VERSION = "vscode/1.104.1"
+COPILOT_PLUGIN_VERSION = "copilot-chat/0.26.7"
+COPILOT_INTEGRATION_ID = "vscode-chat"
+COPILOT_USER_AGENT = "GitHubCopilotChat/0.26.7"
+
+COPILOT_DEFAULT_HEADERS = {
+    "Copilot-Integration-Id": COPILOT_INTEGRATION_ID,
+    "User-Agent": COPILOT_USER_AGENT,
+    "Editor-Version": COPILOT_EDITOR_VERSION,
+    "Editor-Plugin-Version": COPILOT_PLUGIN_VERSION,
+    "editor-version": COPILOT_EDITOR_VERSION,
+    "editor-plugin-version": COPILOT_PLUGIN_VERSION,
+    "copilot-vision-request": "true",
+}
 
 
 class GithubcopilotChatEmbeddings(BaseModel, Embeddings):
@@ -109,18 +122,8 @@ class GithubcopilotChatEmbeddings(BaseModel, Embeddings):
     is used.
     """
 
-    base_url: str =
-    """Base URL for the GitHub
-
-    org: Optional[str] = None
-    """Organisation login for attributed inference requests.
-
-    When set, requests are sent to
-    ``/orgs/{org}/inference/embeddings`` instead of ``/inference/embeddings``.
-    """
-
-    api_version: str = _API_VERSION
-    """GitHub Models API version sent as the ``X-GitHub-Api-Version`` header."""
+    base_url: str = _GITHUB_COPILOT_BASE_URL
+    """Base URL for the GitHub Copilot API."""
 
     dimensions: Optional[int] = None
     """Number of output embedding dimensions.
@@ -173,19 +176,16 @@ class GithubcopilotChatEmbeddings(BaseModel, Embeddings):
     @property
     def _embeddings_url(self) -> str:
         """Return the full embeddings endpoint URL."""
-
-            path = _ORG_EMBEDDINGS_PATH.format(org=self.org)
-        else:
-            path = _EMBEDDINGS_PATH
-        return self.base_url.rstrip("/") + path
+        return self.base_url.rstrip("/") + _EMBEDDINGS_PATH
 
     def _build_headers(self) -> Dict[str, str]:
-
+        headers = {
             "Authorization": f"Bearer {self._token}",
-            "Accept": "application/
+            "Accept": "application/json",
             "Content-Type": "application/json",
-            "X-GitHub-Api-Version": self.api_version,
         }
+        headers.update(COPILOT_DEFAULT_HEADERS)
+        return headers
 
     def _build_payload(self, input: Union[str, List[str]]) -> Dict[str, Any]:
         """Assemble the JSON body for the embeddings API."""
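Once `GithubcopilotChatEmbeddings.embed_query` returns a vector per input, ranking documents by cosine similarity is the usual next step for RAG or semantic search. A dependency-free sketch with toy two-dimensional vectors (not tied to the package's API):

```python
import math

def cosine(a, b):
    """Cosine similarity between two equal-length float vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

# Toy vectors standing in for embed_query / embed_documents output.
query = [1.0, 0.0]
docs = {"doc_a": [0.9, 0.1], "doc_b": [0.0, 1.0]}

# Pick the document whose embedding points most in the query's direction.
best = max(docs, key=lambda k: cosine(query, docs[k]))
print(best)  # doc_a
```

Real embedding vectors have hundreds or thousands of dimensions, but the ranking logic is identical.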
pyproject.toml

@@ -4,7 +4,7 @@ build-backend = "poetry.core.masonry.api"
 
 [tool.poetry]
 name = "langchain-githubcopilot-chat"
-version = "0.1.2"
+version = "0.3.0"
 description = "An integration package connecting GithubcopilotChat and LangChain"
 authors = ["YIhan Wu <iumm@ibat.ac.cn>"]
 readme = "README.md"
langchain_githubcopilot_chat-0.1.2/PKG-INFO

@@ -1,69 +0,0 @@
-Metadata-Version: 2.1
-Name: langchain-githubcopilot-chat
-Version: 0.1.2
-Summary: An integration package connecting GithubcopilotChat and LangChain
-Home-page: https://github.com/langchain-ai/langchain
-License: MIT
-Author: YIhan Wu
-Author-email: iumm@ibat.ac.cn
-Requires-Python: >=3.10,<4.0
-Classifier: License :: OSI Approved :: MIT License
-Classifier: Programming Language :: Python :: 3
-Classifier: Programming Language :: Python :: 3.10
-Classifier: Programming Language :: Python :: 3.11
-Classifier: Programming Language :: Python :: 3.12
-Classifier: Programming Language :: Python :: 3.13
-Requires-Dist: httpx (>=0.28.1)
-Requires-Dist: langchain-core (>=1.1.0,<2.0.0)
-Project-URL: Repository, https://github.com/langchain-ai/langchain
-Project-URL: Release Notes, https://github.com/langchain-ai/langchain/releases?q=tag%3A%22githubcopilot-chat%3D%3D0%22&expanded=true
-Project-URL: Source Code, https://github.com/langchain-ai/langchain/tree/master/libs/partners/githubcopilot-chat
-Description-Content-Type: text/markdown
-
-# langchain-githubcopilot-chat
-
-This package contains the LangChain integration with GithubcopilotChat
-
-## Installation
-
-```bash
-pip install -U langchain-githubcopilot-chat
-```
-
-And you should configure credentials by setting the following environment variables:
-
-* TODO: fill this out
-
-## Chat Models
-
-`ChatGithubcopilotChat` class exposes chat models from GithubcopilotChat.
-
-```python
-from langchain_githubcopilot_chat import ChatGithubcopilotChat
-
-llm = ChatGithubcopilotChat()
-llm.invoke("Sing a ballad of LangChain.")
-```
-
-## Embeddings
-
-`GithubcopilotChatEmbeddings` class exposes embeddings from GithubcopilotChat.
-
-```python
-from langchain_githubcopilot_chat import GithubcopilotChatEmbeddings
-
-embeddings = GithubcopilotChatEmbeddings()
-embeddings.embed_query("What is the meaning of life?")
-```
-
-## LLMs
-
-`GithubcopilotChatLLM` class exposes LLMs from GithubcopilotChat.
-
-```python
-from langchain_githubcopilot_chat import GithubcopilotChatLLM
-
-llm = GithubcopilotChatLLM()
-llm.invoke("The meaning of life is")
-```
-