nexlify-mcp-server 0.1.0.tar.gz → 0.1.1.tar.gz

@@ -0,0 +1,2 @@
+ NEXLIFY_API_BASE_URI=<your_api_base_uri>
+ MCP_TIMEOUT=500 # Timeout in seconds
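The two variables added above are referenced later in the README, which notes they can be loaded with `python-dotenv` in custom scripts. A minimal sketch of that pattern, assuming the optional `python-dotenv` package is installed (it is not one of this package's declared dependencies):

```python
import os

from dotenv import load_dotenv  # optional helper: pip install python-dotenv

load_dotenv()  # reads .env from the current working directory

base_uri = os.environ["NEXLIFY_API_BASE_URI"]
timeout_seconds = int(os.environ.get("MCP_TIMEOUT", "500"))  # value is in seconds
print(base_uri, timeout_seconds)
```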
@@ -0,0 +1,21 @@
+ .PHONY: help run build install publish
+
+ PYTHON_PATH ?= /usr/local/bin/python3.12 # Default Python path; override as needed
+ PYPI_TOKEN ?= # PyPI publish token; override when running make (e.g., make publish PYPI_TOKEN=your-token)
+
+ help: ## Display available commands
+ 	@grep -h -E '^[a-zA-Z_-]+:.*?## .*$$' $(MAKEFILE_LIST) | awk 'BEGIN {FS = ":.*?## "}; {printf "\033[36m%-30s\033[0m %s\n", $$1, $$2}'
+
+ # Application commands
+
+ run: ## Run the application
+ 	uv run app
+
+ build: ## Build the application
+ 	uv build
+
+ install: ## Install the local build package (For Development only)
+ 	uv pip install --python $(PYTHON_PATH) dist/nexlify_mcp_server-0.1.1-py3-none-any.whl
+
+ publish: build ## Publish the application
+ 	uv publish --token $(PYPI_TOKEN)
@@ -1,75 +1,142 @@
+ Metadata-Version: 2.4
+ Name: nexlify-mcp-server
+ Version: 0.1.1
+ Summary: The Nexlify MCP Server is a lightweight Python package designed to integrate GitHub Copilot with the Nexlify AI system.
+ Project-URL: Repository, https://github.com/DeepakPant93/nexlify
+ Project-URL: Documentation, https://github.com/DeepakPant93/nexlify/tree/main/nexlify-mcp-server
+ Author-email: Deepak Pant <mail2deepakpant@gmail.com>
+ License-Expression: MIT
+ Requires-Python: >=3.10
+ Requires-Dist: httpx>=0.28.1
+ Requires-Dist: mcp[cli]>=1.12.1
+ Description-Content-Type: text/markdown
+
  # Nexlify MCP Server Package

  ## Overview

- The Nexlify MCP Server is a lightweight Python package designed to integrate GitHub Copilot with the Nexlify AI system. It acts as a bridge, allowing developers to send queries from their IDE directly to the Nexlify API server—a CrewAI-based agentic AI service. This server executes queries against a vector database (powered by Qdrant) for internal documentation and performs restricted searches on whitelisted URLs (e.g., GitHub, StackOverflow) to retrieve relevant results. The package implements the Model Context Protocol (MCP) for seamless communication with GitHub Copilot, enhancing developer productivity by providing RAG-based (Retrieval-Augmented Generation) responses within the IDE[1].
+ The Nexlify MCP Server is a lightweight Python package designed to integrate GitHub Copilot with the Nexlify AI system. It acts as a bridge, allowing developers to send queries from their IDE directly to the Nexlify API server—a CrewAI-based agentic AI service. This server executes queries against a vector database (powered by Qdrant) for internal documentation and performs restricted searches on whitelisted URLs (e.g., GitHub, StackOverflow) to retrieve relevant results. The package implements the Model Context Protocol (MCP) for seamless communication with GitHub Copilot, enhancing developer productivity by providing RAG-based (Retrieval-Augmented Generation) responses within the IDE.

  Key features include:
+
  - Simple query forwarding to the Nexlify CrewAI microservice.
  - Support for semantic searches using embeddings stored in Qdrant.
  - Restriction to whitelisted URLs for safe and targeted internet searches.
  - Easy setup for local running and IDE integration.

- This package is part of the Nexlify MVP, which leverages technologies like FastAPI, CrewAI, and OpenAI for embedding generation[1].
+ This package is part of the Nexlify MVP, which leverages technologies like FastAPI, CrewAI, and Gemini AI for embedding generation.

  ## Installation

- To install the Nexlify MCP package, use pip. It is published on PyPI for easy access.
+ To install the latest version of the Nexlify MCP package, use pip. It is published on PyPI for easy access.

  ```bash
  pip install nexlify-mcp-server
  ```

+
  ### Requirements
+
  - Python 3.10 or higher.
  - Dependencies: `requests` (automatically installed via pip).

+
  ## Configuration

  Before using the package, configure your environment and IDE.

  ### Environment Variables
- Create a `.env` file in your project root with the following:
+
+ Create a `.env` file in your project root with the following (take reference from `.env.example`):

  ```
- CREW_AI_URL=http://nexlify-ai-agentics-server:8001 # URL of the CrewAI microservice /search endpoint
+ NEXLIFY_API_BASE_URI=<your_api_base_uri>
+ MCP_TIMEOUT=500 # Timeout in seconds
  ```

  Load these variables using `python-dotenv` if needed in custom scripts.

  ### IDE Setup
- - **VS Code**: Add the MCP server configuration to `.vscode/mcp.json` or `settings.json`. Enable MCP discovery with `"chat.mcp.discovery.enabled": true` and specify the local server URL (e.g., `http://localhost:8000`)[1].
- - **IntelliJ IDEA**: Configure via the Tools menu. Add the MCP server endpoint and enable integration for GitHub Copilot queries[1].
+
+ - **VS Code**: Add the MCP server configuration to `.vscode/mcp.json` or `settings.json`. Enable MCP discovery with `"chat.mcp.discovery.enabled": true` and specify the local server URL (e.g., `http://localhost:8000`).
+ - **IntelliJ IDEA**: Configure via the Tools menu. Add the MCP server endpoint and enable integration for GitHub Copilot queries.

  Ensure the Nexlify CrewAI microservice is running and accessible (e.g., via Docker Compose or AWS EC2 deployment).

+ Add this JSON configuration in your current workspace, file -> .vscode/mcp.json
+
+ ```json
+ {
+   "servers": {
+     "nexlify-mcp-server": {
+       "type": "stdio",
+       "command": "nexlify_mcp_server",
+       "env": {
+         "NEXLIFY_API_BASE_URI": "${input:nexlify-app-uri}",
+       }
+
+     }
+   },
+   "inputs": [
+     {
+       "id": "nexlify-app-uri",
+       "type": "promptString",
+       "description": "Enter the URL of your Netlify app.",
+       "default": "http://0.0.0.0:8000",
+     },
+   ],
+ }
+ ```
+
+
  ## Usage

  ### Running the MCP Server
- Run the server locally to handle queries from GitHub Copilot:
+
+ To run the MCP server, execute this command:

  ```bash
- python -m nexlify_mcp
+ nexlify-mcp-server
  ```

  This starts a lightweight server that listens for MCP requests and forwards them to the configured CrewAI URL.

  ### Querying from IDE
+
  Once running and configured in your IDE:
+
  1. Open GitHub Copilot chat in VS Code or IntelliJ.
  2. Submit a query (e.g., "How do I fix this Python error?").
  3. The MCP server forwards the query to the CrewAI microservice.
  4. The CrewAI service:
- - Queries the vector database for internal results.
- - Searches whitelisted URLs for external insights.
+    - Queries the vector database for internal results.
+    - Searches whitelisted URLs for external insights.
  5. Consolidated results are returned and displayed in the IDE.

+ ## Publishing the Package
+
+ To publish the package, create a PyPI token using this URL: (https://pypi.org/manage/account/token/)
+
+ ## Package Details
+
+ To check the package details, follow this link: (https://pypi.org/project/nexlify-mcp-server/)
+
+ ## Make Commands
+
+ | Command | Description | Usage Example |
+ | :-- | :-- | :-- |
+ | run | Run the application | make run |
+ | build | Build the application | make build |
+ | install | Install the local build package (For Development only) | make install PYTHON_PATH="<your-python-path>" |
+ | publish | Publish the application | make publish |
+
  ## Limitations

  - Relies on the availability of the Nexlify CrewAI microservice.
  - Queries are limited to text-based inputs; no support for file uploads in MVP.
- - Internet searches are restricted to whitelisted URLs for safety[1].
+ - Internet searches are restricted to whitelisted URLs for safety.
+

  ## License

- This package is licensed under the MIT License. See the LICENSE file in the repository for details.
+ This package is licensed under the [MIT License](../LICENSE). See the LICENSE file in the repository for details.
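The README above describes a stdio MCP server that forwards developer queries to the Nexlify API at `NEXLIFY_API_BASE_URI`. Below is a minimal sketch of how such a server could be wired with the two declared dependencies, the `mcp` SDK and `httpx`. The tool name `ask_nexlify`, the `/search` path (borrowed from the comment on the old `CREW_AI_URL` variable), and the request payload are illustrative assumptions, not the package's actual implementation.

```python
# Hypothetical sketch -- not the package's real source; endpoint and payload are assumed.
import os

import httpx
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("nexlify-mcp-server")

API_BASE = os.environ.get("NEXLIFY_API_BASE_URI", "http://0.0.0.0:8000")
TIMEOUT = float(os.environ.get("MCP_TIMEOUT", "500"))  # seconds


@mcp.tool()
def ask_nexlify(query: str) -> str:
    """Forward a developer query to the Nexlify CrewAI service and return the raw answer."""
    response = httpx.post(f"{API_BASE}/search", json={"query": query}, timeout=TIMEOUT)
    response.raise_for_status()
    return response.text


def run() -> None:
    # Mirrors the [project.scripts] entry point shape: nexlify_mcp_server.main:run
    mcp.run(transport="stdio")


if __name__ == "__main__":
    run()
```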
@@ -1,86 +1,129 @@
- Metadata-Version: 2.4
- Name: nexlify-mcp-server
- Version: 0.1.0
- Summary: Add your description here
- Author-email: Deepak Pant <mail2deepakpant@gmail.com>
- License-Expression: MIT
- Requires-Python: >=3.10
- Requires-Dist: httpx>=0.28.1
- Requires-Dist: mcp[cli]>=1.12.1
- Description-Content-Type: text/markdown
-
  # Nexlify MCP Server Package

  ## Overview

- The Nexlify MCP Server is a lightweight Python package designed to integrate GitHub Copilot with the Nexlify AI system. It acts as a bridge, allowing developers to send queries from their IDE directly to the Nexlify API server—a CrewAI-based agentic AI service. This server executes queries against a vector database (powered by Qdrant) for internal documentation and performs restricted searches on whitelisted URLs (e.g., GitHub, StackOverflow) to retrieve relevant results. The package implements the Model Context Protocol (MCP) for seamless communication with GitHub Copilot, enhancing developer productivity by providing RAG-based (Retrieval-Augmented Generation) responses within the IDE[1].
+ The Nexlify MCP Server is a lightweight Python package designed to integrate GitHub Copilot with the Nexlify AI system. It acts as a bridge, allowing developers to send queries from their IDE directly to the Nexlify API server—a CrewAI-based agentic AI service. This server executes queries against a vector database (powered by Qdrant) for internal documentation and performs restricted searches on whitelisted URLs (e.g., GitHub, StackOverflow) to retrieve relevant results. The package implements the Model Context Protocol (MCP) for seamless communication with GitHub Copilot, enhancing developer productivity by providing RAG-based (Retrieval-Augmented Generation) responses within the IDE.

  Key features include:
+
  - Simple query forwarding to the Nexlify CrewAI microservice.
  - Support for semantic searches using embeddings stored in Qdrant.
  - Restriction to whitelisted URLs for safe and targeted internet searches.
  - Easy setup for local running and IDE integration.

- This package is part of the Nexlify MVP, which leverages technologies like FastAPI, CrewAI, and OpenAI for embedding generation[1].
+ This package is part of the Nexlify MVP, which leverages technologies like FastAPI, CrewAI, and Gemini AI for embedding generation.

  ## Installation

- To install the Nexlify MCP package, use pip. It is published on PyPI for easy access.
+ To install the latest version of the Nexlify MCP package, use pip. It is published on PyPI for easy access.

  ```bash
  pip install nexlify-mcp-server
  ```

+
  ### Requirements
+
  - Python 3.10 or higher.
  - Dependencies: `requests` (automatically installed via pip).

+
  ## Configuration

  Before using the package, configure your environment and IDE.

  ### Environment Variables
- Create a `.env` file in your project root with the following:
+
+ Create a `.env` file in your project root with the following (take reference from `.env.example`):

  ```
- CREW_AI_URL=http://nexlify-ai-agentics-server:8001 # URL of the CrewAI microservice /search endpoint
+ NEXLIFY_API_BASE_URI=<your_api_base_uri>
+ MCP_TIMEOUT=500 # Timeout in seconds
  ```

  Load these variables using `python-dotenv` if needed in custom scripts.

  ### IDE Setup
- - **VS Code**: Add the MCP server configuration to `.vscode/mcp.json` or `settings.json`. Enable MCP discovery with `"chat.mcp.discovery.enabled": true` and specify the local server URL (e.g., `http://localhost:8000`)[1].
- - **IntelliJ IDEA**: Configure via the Tools menu. Add the MCP server endpoint and enable integration for GitHub Copilot queries[1].
+
+ - **VS Code**: Add the MCP server configuration to `.vscode/mcp.json` or `settings.json`. Enable MCP discovery with `"chat.mcp.discovery.enabled": true` and specify the local server URL (e.g., `http://localhost:8000`).
+ - **IntelliJ IDEA**: Configure via the Tools menu. Add the MCP server endpoint and enable integration for GitHub Copilot queries.

  Ensure the Nexlify CrewAI microservice is running and accessible (e.g., via Docker Compose or AWS EC2 deployment).

+ Add this JSON configuration in your current workspace, file -> .vscode/mcp.json
+
+ ```json
+ {
+   "servers": {
+     "nexlify-mcp-server": {
+       "type": "stdio",
+       "command": "nexlify_mcp_server",
+       "env": {
+         "NEXLIFY_API_BASE_URI": "${input:nexlify-app-uri}",
+       }
+
+     }
+   },
+   "inputs": [
+     {
+       "id": "nexlify-app-uri",
+       "type": "promptString",
+       "description": "Enter the URL of your Netlify app.",
+       "default": "http://0.0.0.0:8000",
+     },
+   ],
+ }
+ ```
+
+
  ## Usage

  ### Running the MCP Server
- Run the server locally to handle queries from GitHub Copilot:
+
+ To run the MCP server, execute this command:

  ```bash
- python -m nexlify_mcp
+ nexlify-mcp-server
  ```

  This starts a lightweight server that listens for MCP requests and forwards them to the configured CrewAI URL.

  ### Querying from IDE
+
  Once running and configured in your IDE:
+
  1. Open GitHub Copilot chat in VS Code or IntelliJ.
  2. Submit a query (e.g., "How do I fix this Python error?").
  3. The MCP server forwards the query to the CrewAI microservice.
  4. The CrewAI service:
- - Queries the vector database for internal results.
- - Searches whitelisted URLs for external insights.
+    - Queries the vector database for internal results.
+    - Searches whitelisted URLs for external insights.
  5. Consolidated results are returned and displayed in the IDE.

+ ## Publishing the Package
+
+ To publish the package, create a PyPI token using this URL: (https://pypi.org/manage/account/token/)
+
+ ## Package Details
+
+ To check the package details, follow this link: (https://pypi.org/project/nexlify-mcp-server/)
+
+ ## Make Commands
+
+ | Command | Description | Usage Example |
+ | :-- | :-- | :-- |
+ | run | Run the application | make run |
+ | build | Build the application | make build |
+ | install | Install the local build package (For Development only) | make install PYTHON_PATH="<your-python-path>" |
+ | publish | Publish the application | make publish |
+
  ## Limitations

  - Relies on the availability of the Nexlify CrewAI microservice.
  - Queries are limited to text-based inputs; no support for file uploads in MVP.
- - Internet searches are restricted to whitelisted URLs for safety[1].
+ - Internet searches are restricted to whitelisted URLs for safety.
+

  ## License

- This package is licensed under the MIT License. See the LICENSE file in the repository for details.
+ This package is licensed under the [MIT License](../LICENSE). See the LICENSE file in the repository for details.
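Steps 1-5 in the "Querying from IDE" section above are carried out by the IDE's MCP client over stdio. To exercise the server outside an IDE, a hedged client-side sketch using the same `mcp` SDK is shown below; it assumes the `nexlify_mcp_server` console script from `pyproject.toml` is on PATH and discovers the exposed tools at runtime rather than assuming their names.

```python
# Hypothetical client sketch -- mimics what an MCP-capable IDE does when it launches the server.
import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client


async def main() -> None:
    # Launch the installed console script as a stdio MCP server.
    params = StdioServerParameters(
        command="nexlify_mcp_server",
        env={"NEXLIFY_API_BASE_URI": "http://0.0.0.0:8000"},  # assumed local Nexlify API
    )
    async with stdio_client(params) as (read_stream, write_stream):
        async with ClientSession(read_stream, write_stream) as session:
            await session.initialize()
            tools = await session.list_tools()
            print("Exposed tools:", [tool.name for tool in tools.tools])


if __name__ == "__main__":
    asyncio.run(main())
```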
@@ -1,7 +1,7 @@
  [project]
  name = "nexlify-mcp-server"
- version = "0.1.0"
- description = "Add your description here"
+ version = "0.1.1"
+ description = "The Nexlify MCP Server is a lightweight Python package designed to integrate GitHub Copilot with the Nexlify AI system."
  readme = "README.md"
  requires-python = ">=3.10"
  license = "MIT"
@@ -11,9 +11,15 @@ dependencies = [
      "mcp[cli]>=1.12.1",
  ]

+ [project.urls]
+ Repository = "https://github.com/DeepakPant93/nexlify"
+ Documentation = "https://github.com/DeepakPant93/nexlify/tree/main/nexlify-mcp-server"
+
  [project.scripts]
  nexlify_mcp_server = "nexlify_mcp_server.main:run"

+
+
  [build-system]
  requires = ["hatchling"]
  build-backend = "hatchling.build"
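After upgrading, the version bump and the Project-URL entries added in this release can be confirmed from an installed environment with the standard-library `importlib.metadata`; a small sketch:

```python
from importlib.metadata import metadata, version

# Prints 0.1.1 once the new release is installed (0.1.0 beforehand).
print(version("nexlify-mcp-server"))

# The Project-URL entries introduced in 0.1.1 (Repository, Documentation).
for url in metadata("nexlify-mcp-server").get_all("Project-URL") or []:
    print(url)
```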