opspilot-ai 0.1.1.tar.gz → 0.1.2.tar.gz
This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
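A diff like the one below can be reproduced locally by downloading both source distributions from PyPI and comparing the unpacked trees. This is a minimal sketch; the archive names are assumed to follow PyPI's standard `opspilot_ai-<version>.tar.gz` sdist naming.

```shell
# Download both sdists from PyPI (no dependencies, source only)
pip download opspilot-ai==0.1.1 --no-deps --no-binary :all: -d old/
pip download opspilot-ai==0.1.2 --no-deps --no-binary :all: -d new/

# Unpack and compare the two trees as a recursive unified diff
tar -xzf old/opspilot_ai-0.1.1.tar.gz -C old/
tar -xzf new/opspilot_ai-0.1.2.tar.gz -C new/
diff -ru old/opspilot_ai-0.1.1 new/opspilot_ai-0.1.2
```

Note that `diff` exits with status 1 when the trees differ, so scripts should treat only exit statuses greater than 1 as errors.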
- {opspilot_ai-0.1.1 → opspilot_ai-0.1.2}/PKG-INFO +154 -44
- {opspilot_ai-0.1.1 → opspilot_ai-0.1.2}/README.md +153 -43
- {opspilot_ai-0.1.1 → opspilot_ai-0.1.2}/pyproject.toml +1 -1
- {opspilot_ai-0.1.1 → opspilot_ai-0.1.2}/.env.example +0 -0
- {opspilot_ai-0.1.1 → opspilot_ai-0.1.2}/ARCHITECTURE.md +0 -0
- {opspilot_ai-0.1.1 → opspilot_ai-0.1.2}/DEPLOYMENT.md +0 -0
- {opspilot_ai-0.1.1 → opspilot_ai-0.1.2}/LICENSE +0 -0
- {opspilot_ai-0.1.1 → opspilot_ai-0.1.2}/MANIFEST.in +0 -0
- {opspilot_ai-0.1.1 → opspilot_ai-0.1.2}/opspilot/__init__.py +0 -0
- {opspilot_ai-0.1.1 → opspilot_ai-0.1.2}/opspilot/agents/fixer.py +0 -0
- {opspilot_ai-0.1.1 → opspilot_ai-0.1.2}/opspilot/agents/planner.py +0 -0
- {opspilot_ai-0.1.1 → opspilot_ai-0.1.2}/opspilot/agents/remediation.py +0 -0
- {opspilot_ai-0.1.1 → opspilot_ai-0.1.2}/opspilot/agents/verifier.py +0 -0
- {opspilot_ai-0.1.1 → opspilot_ai-0.1.2}/opspilot/cli.py +0 -0
- {opspilot_ai-0.1.1 → opspilot_ai-0.1.2}/opspilot/config.py +0 -0
- {opspilot_ai-0.1.1 → opspilot_ai-0.1.2}/opspilot/context/__init__.py +0 -0
- {opspilot_ai-0.1.1 → opspilot_ai-0.1.2}/opspilot/context/deployment_history.py +0 -0
- {opspilot_ai-0.1.1 → opspilot_ai-0.1.2}/opspilot/context/deps.py +0 -0
- {opspilot_ai-0.1.1 → opspilot_ai-0.1.2}/opspilot/context/docker.py +0 -0
- {opspilot_ai-0.1.1 → opspilot_ai-0.1.2}/opspilot/context/env.py +0 -0
- {opspilot_ai-0.1.1 → opspilot_ai-0.1.2}/opspilot/context/logs.py +0 -0
- {opspilot_ai-0.1.1 → opspilot_ai-0.1.2}/opspilot/context/production_logs.py +0 -0
- {opspilot_ai-0.1.1 → opspilot_ai-0.1.2}/opspilot/context/project.py +0 -0
- {opspilot_ai-0.1.1 → opspilot_ai-0.1.2}/opspilot/diffs/redis.py +0 -0
- {opspilot_ai-0.1.1 → opspilot_ai-0.1.2}/opspilot/graph/engine.py +0 -0
- {opspilot_ai-0.1.1 → opspilot_ai-0.1.2}/opspilot/graph/nodes.py +0 -0
- {opspilot_ai-0.1.1 → opspilot_ai-0.1.2}/opspilot/memory.py +0 -0
- {opspilot_ai-0.1.1 → opspilot_ai-0.1.2}/opspilot/memory_redis.py +0 -0
- {opspilot_ai-0.1.1 → opspilot_ai-0.1.2}/opspilot/state.py +0 -0
- {opspilot_ai-0.1.1 → opspilot_ai-0.1.2}/opspilot/tools/__init__.py +0 -0
- {opspilot_ai-0.1.1 → opspilot_ai-0.1.2}/opspilot/tools/dep_tools.py +0 -0
- {opspilot_ai-0.1.1 → opspilot_ai-0.1.2}/opspilot/tools/env_tools.py +0 -0
- {opspilot_ai-0.1.1 → opspilot_ai-0.1.2}/opspilot/tools/log_tools.py +0 -0
- {opspilot_ai-0.1.1 → opspilot_ai-0.1.2}/opspilot/tools/pattern_analysis.py +0 -0
- {opspilot_ai-0.1.1 → opspilot_ai-0.1.2}/opspilot/utils/__init__.py +0 -0
- {opspilot_ai-0.1.1 → opspilot_ai-0.1.2}/opspilot/utils/llm.py +0 -0
- {opspilot_ai-0.1.1 → opspilot_ai-0.1.2}/opspilot/utils/llm_providers.py +0 -0
- {opspilot_ai-0.1.1 → opspilot_ai-0.1.2}/opspilot_ai.egg-info/SOURCES.txt +0 -0
- {opspilot_ai-0.1.1 → opspilot_ai-0.1.2}/setup.cfg +0 -0
- {opspilot_ai-0.1.1 → opspilot_ai-0.1.2}/setup.py +0 -0
- {opspilot_ai-0.1.1 → opspilot_ai-0.1.2}/tests/__init__.py +0 -0
- {opspilot_ai-0.1.1 → opspilot_ai-0.1.2}/tests/test_llm_providers.py +0 -0
- {opspilot_ai-0.1.1 → opspilot_ai-0.1.2}/tests/test_pattern_analysis.py +0 -0
- {opspilot_ai-0.1.1 → opspilot_ai-0.1.2}/tests/test_production_logs.py +0 -0
- {opspilot_ai-0.1.1 → opspilot_ai-0.1.2}/tests/test_remediation.py +0 -0
{opspilot_ai-0.1.1 → opspilot_ai-0.1.2}/PKG-INFO

@@ -1,6 +1,6 @@
 Metadata-Version: 2.4
 Name: opspilot-ai
-Version: 0.1.1
+Version: 0.1.2
 Summary: AI-powered production incident analysis tool using multi-agent systems
 Home-page: https://github.com/choudharikiranv15/OpsPilot-AI
 Author: Kiran Vijaykumar Choudhari

@@ -87,86 +87,196 @@ OpsPilot-AI uses a **multi-agent AI architecture** to understand your project's
 
 ---
 
-## 🚀 Quick Start
+## 🚀 Quick Start (2 Minutes Setup)
 
-###
+### Step 1: Install OpsPilot-AI
 
-- Python 3.8 or higher
-- At least ONE of the following LLM providers:
-  - [Ollama](https://ollama.ai/) (local, free, recommended)
-  - Google Gemini API key (free tier)
-  - OpenRouter API key (free models available)
-  - HuggingFace API token (free tier)
-
-### Installation
-
-**From PyPI (Recommended):**
 ```bash
-# Basic installation
 pip install opspilot-ai
+```
 
-
-pip install opspilot-ai[redis]
+### Step 2: Setup LLM (Choose ONE Option)
 
-
-pip install opspilot-ai[all]
-```
+OpsPilot-AI needs an LLM to analyze your code. Choose **one** of these options:
 
-
-```bash
-# Clone repository
-git clone https://github.com/choudharikiranv15/OpsPilot-AI.git
-cd opspilot
+---
 
-
-pip install -e ".[all]"
-```
+#### Option A: Ollama (Recommended - Free & Private)
 
-
+Ollama runs locally on your machine. Your code never leaves your computer.
 
-**
+**For macOS/Linux:**
 ```bash
 # Install Ollama
 curl -fsSL https://ollama.ai/install.sh | sh
 
-# Pull
+# Pull the AI model (one-time download, ~4GB)
 ollama pull llama3
+
+# Verify it's running
+ollama list
+```
+
+**For Windows:**
+1. Download Ollama from [ollama.ai/download](https://ollama.ai/download)
+2. Install and run the application
+3. Open terminal and run:
+```bash
+ollama pull llama3
+```
+
+---
+
+#### Option B: Cloud API (No Local Install Required)
+
+Use cloud-based LLMs with free tiers. Set **one** of these environment variables:
+
+**Google Gemini (Recommended Cloud Option):**
+```bash
+# Get free API key: https://makersuite.google.com/app/apikey
+export GOOGLE_API_KEY="your-api-key-here"
 ```
 
-**
+**OpenRouter (100+ Models Available):**
 ```bash
-#
-
+# Get free API key: https://openrouter.ai/keys
+export OPENROUTER_API_KEY="your-api-key-here"
+```
 
-
-
-#
-
+**HuggingFace:**
+```bash
+# Get free token: https://huggingface.co/settings/tokens
+export HUGGINGFACE_API_KEY="your-api-key-here"
 ```
 
-
+**Windows Users (set environment variable):**
+```cmd
+set GOOGLE_API_KEY=your-api-key-here
+```
 
-
+---
 
-
+### Step 3: Analyze Your Project
 
 ```bash
-#
+# Navigate to your project
+cd /path/to/your/project
+
+# Run analysis
 opspilot analyze
+```
+
+That's it! OpsPilot-AI will analyze your project and provide diagnosis.
+
+---
 
+## 📖 Usage Examples
+
+### Basic Commands
+
+```bash
+# Quick analysis (fastest)
+opspilot analyze --mode quick
+
+# Deep analysis (thorough, recommended)
+opspilot analyze --mode deep
+
+# Verbose output (see what's happening)
+opspilot analyze --verbose
+```
+
+### Advanced Commands
+
+```bash
 # Analyze with production logs from S3
 opspilot analyze --log-source s3://my-bucket/logs/app.log
 
-# Analyze with deployment correlation
+# Analyze with deployment correlation (links errors to git commits)
 opspilot analyze --deployment-analysis --since-hours 48
 
-# JSON output for automation
+# JSON output for CI/CD automation
 opspilot analyze --json --mode quick
 
-#
+# Analyze specific log file
+opspilot analyze --log-source /var/log/myapp/error.log
+
+# Full debugging output
 opspilot analyze --verbose --debug
 ```
 
+### Analysis Modes
+
+| Mode | Speed | LLM Calls | Use Case |
+|------|-------|-----------|----------|
+| `quick` | Fast | 1 | Quick check, CI/CD pipelines |
+| `deep` | Thorough | Up to 4 | Detailed incident analysis |
+| `explain` | Instant | 0 | Context gathering only (no LLM) |
+
+### Installation Options
+
+```bash
+# Basic (just the CLI)
+pip install opspilot-ai
+
+# With Redis support (remembers past incidents)
+pip install "opspilot-ai[redis]"
+
+# With AWS support (S3, CloudWatch logs)
+pip install "opspilot-ai[aws]"
+
+# With Kubernetes support (K8s pod logs)
+pip install "opspilot-ai[k8s]"
+
+# Everything included
+pip install "opspilot-ai[all]"
+```
+
+---
+
+## 🔧 Troubleshooting
+
+### "No LLM providers available"
+
+You need to set up an LLM. Choose one:
+
+```bash
+# Option 1: Install Ollama (recommended)
+curl -fsSL https://ollama.ai/install.sh | sh
+ollama pull llama3
+
+# Option 2: Set a cloud API key
+export GOOGLE_API_KEY="your-key"
+```
+
+### "Ollama connection refused"
+
+Make sure Ollama is running:
+```bash
+# Start Ollama service
+ollama serve
+
+# In another terminal, verify it works
+ollama list
+```
+
+### "No logs found"
+
+OpsPilot-AI looks for logs in these locations:
+- `./logs/` directory
+- `*.log` files in project root
+- Files specified with `--log-source`
+
+```bash
+# Specify log file directly
+opspilot analyze --log-source ./my-app/error.log
+```
+
+### "Context collected: 0 env vars"
+
+Make sure you have a `.env` file in your project directory, or environment variables set.
+
+---
+
 **Example Output:**
 
 ```
{opspilot_ai-0.1.1 → opspilot_ai-0.1.2}/README.md

@@ -28,86 +28,196 @@ OpsPilot-AI uses a **multi-agent AI architecture** to understand your project's
 
 ---
 
-## 🚀 Quick Start
+## 🚀 Quick Start (2 Minutes Setup)
 
-###
+### Step 1: Install OpsPilot-AI
 
-- Python 3.8 or higher
-- At least ONE of the following LLM providers:
-  - [Ollama](https://ollama.ai/) (local, free, recommended)
-  - Google Gemini API key (free tier)
-  - OpenRouter API key (free models available)
-  - HuggingFace API token (free tier)
-
-### Installation
-
-**From PyPI (Recommended):**
 ```bash
-# Basic installation
 pip install opspilot-ai
+```
 
-
-pip install opspilot-ai[redis]
+### Step 2: Setup LLM (Choose ONE Option)
 
-
-pip install opspilot-ai[all]
-```
+OpsPilot-AI needs an LLM to analyze your code. Choose **one** of these options:
 
-
-```bash
-# Clone repository
-git clone https://github.com/choudharikiranv15/OpsPilot-AI.git
-cd opspilot
+---
 
-
-pip install -e ".[all]"
-```
+#### Option A: Ollama (Recommended - Free & Private)
 
-
+Ollama runs locally on your machine. Your code never leaves your computer.
 
-**
+**For macOS/Linux:**
 ```bash
 # Install Ollama
 curl -fsSL https://ollama.ai/install.sh | sh
 
-# Pull
+# Pull the AI model (one-time download, ~4GB)
 ollama pull llama3
+
+# Verify it's running
+ollama list
+```
+
+**For Windows:**
+1. Download Ollama from [ollama.ai/download](https://ollama.ai/download)
+2. Install and run the application
+3. Open terminal and run:
+```bash
+ollama pull llama3
+```
+
+---
+
+#### Option B: Cloud API (No Local Install Required)
+
+Use cloud-based LLMs with free tiers. Set **one** of these environment variables:
+
+**Google Gemini (Recommended Cloud Option):**
+```bash
+# Get free API key: https://makersuite.google.com/app/apikey
+export GOOGLE_API_KEY="your-api-key-here"
 ```
 
-**
+**OpenRouter (100+ Models Available):**
 ```bash
-#
-
+# Get free API key: https://openrouter.ai/keys
+export OPENROUTER_API_KEY="your-api-key-here"
+```
 
-
-
-#
-
+**HuggingFace:**
+```bash
+# Get free token: https://huggingface.co/settings/tokens
+export HUGGINGFACE_API_KEY="your-api-key-here"
 ```
 
-
+**Windows Users (set environment variable):**
+```cmd
+set GOOGLE_API_KEY=your-api-key-here
+```
 
-
+---
 
-
+### Step 3: Analyze Your Project
 
 ```bash
-#
+# Navigate to your project
+cd /path/to/your/project
+
+# Run analysis
 opspilot analyze
+```
+
+That's it! OpsPilot-AI will analyze your project and provide diagnosis.
+
+---
 
+## 📖 Usage Examples
+
+### Basic Commands
+
+```bash
+# Quick analysis (fastest)
+opspilot analyze --mode quick
+
+# Deep analysis (thorough, recommended)
+opspilot analyze --mode deep
+
+# Verbose output (see what's happening)
+opspilot analyze --verbose
+```
+
+### Advanced Commands
+
+```bash
 # Analyze with production logs from S3
 opspilot analyze --log-source s3://my-bucket/logs/app.log
 
-# Analyze with deployment correlation
+# Analyze with deployment correlation (links errors to git commits)
 opspilot analyze --deployment-analysis --since-hours 48
 
-# JSON output for automation
+# JSON output for CI/CD automation
 opspilot analyze --json --mode quick
 
-#
+# Analyze specific log file
+opspilot analyze --log-source /var/log/myapp/error.log
+
+# Full debugging output
 opspilot analyze --verbose --debug
 ```
 
+### Analysis Modes
+
+| Mode | Speed | LLM Calls | Use Case |
+|------|-------|-----------|----------|
+| `quick` | Fast | 1 | Quick check, CI/CD pipelines |
+| `deep` | Thorough | Up to 4 | Detailed incident analysis |
+| `explain` | Instant | 0 | Context gathering only (no LLM) |
+
+### Installation Options
+
+```bash
+# Basic (just the CLI)
+pip install opspilot-ai
+
+# With Redis support (remembers past incidents)
+pip install "opspilot-ai[redis]"
+
+# With AWS support (S3, CloudWatch logs)
+pip install "opspilot-ai[aws]"
+
+# With Kubernetes support (K8s pod logs)
+pip install "opspilot-ai[k8s]"
+
+# Everything included
+pip install "opspilot-ai[all]"
+```
+
+---
+
+## 🔧 Troubleshooting
+
+### "No LLM providers available"
+
+You need to set up an LLM. Choose one:
+
+```bash
+# Option 1: Install Ollama (recommended)
+curl -fsSL https://ollama.ai/install.sh | sh
+ollama pull llama3
+
+# Option 2: Set a cloud API key
+export GOOGLE_API_KEY="your-key"
+```
+
+### "Ollama connection refused"
+
+Make sure Ollama is running:
+```bash
+# Start Ollama service
+ollama serve
+
+# In another terminal, verify it works
+ollama list
+```
+
+### "No logs found"
+
+OpsPilot-AI looks for logs in these locations:
+- `./logs/` directory
+- `*.log` files in project root
+- Files specified with `--log-source`
+
+```bash
+# Specify log file directly
+opspilot analyze --log-source ./my-app/error.log
+```
+
+### "Context collected: 0 env vars"
+
+Make sure you have a `.env` file in your project directory, or environment variables set.
+
+---
+
 **Example Output:**
 
 ```