testdriverai 7.1.3 → 7.2.0
This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
- package/.github/workflows/acceptance.yaml +81 -0
- package/.github/workflows/publish.yaml +44 -0
- package/.github/workflows/test-init.yml +145 -0
- package/agent/index.js +18 -19
- package/agent/lib/commander.js +2 -2
- package/agent/lib/commands.js +324 -124
- package/agent/lib/redraw.js +99 -39
- package/agent/lib/sandbox.js +98 -6
- package/agent/lib/sdk.js +25 -0
- package/agent/lib/system.js +2 -1
- package/agent/lib/validation.js +6 -6
- package/docs/docs.json +211 -101
- package/docs/snippets/tests/type-repeated-replay.mdx +1 -1
- package/docs/v7/_drafts/caching-selectors.mdx +24 -0
- package/docs/v7/_drafts/migration.mdx +3 -3
- package/docs/v7/api/act.mdx +2 -2
- package/docs/v7/api/assert.mdx +2 -2
- package/docs/v7/api/assertions.mdx +21 -21
- package/docs/v7/api/elements.mdx +78 -0
- package/docs/v7/api/find.mdx +38 -0
- package/docs/v7/api/focusApplication.mdx +2 -2
- package/docs/v7/api/hover.mdx +2 -2
- package/docs/v7/features/ai-native.mdx +57 -71
- package/docs/v7/features/application-logs.mdx +353 -0
- package/docs/v7/features/browser-logs.mdx +414 -0
- package/docs/v7/features/cache-management.mdx +402 -0
- package/docs/v7/features/continuous-testing.mdx +346 -0
- package/docs/v7/features/coverage.mdx +508 -0
- package/docs/v7/features/data-driven-testing.mdx +441 -0
- package/docs/v7/features/easy-to-write.mdx +2 -73
- package/docs/v7/features/enterprise.mdx +155 -39
- package/docs/v7/features/fast.mdx +63 -81
- package/docs/v7/features/managed-sandboxes.mdx +384 -0
- package/docs/v7/features/network-monitoring.mdx +568 -0
- package/docs/v7/features/observable.mdx +3 -22
- package/docs/v7/features/parallel-execution.mdx +381 -0
- package/docs/v7/features/powerful.mdx +1 -1
- package/docs/v7/features/reports.mdx +414 -0
- package/docs/v7/features/sandbox-customization.mdx +229 -0
- package/docs/v7/features/scalable.mdx +217 -2
- package/docs/v7/features/stable.mdx +106 -147
- package/docs/v7/features/system-performance.mdx +616 -0
- package/docs/v7/features/test-analytics.mdx +373 -0
- package/docs/v7/features/test-cases.mdx +393 -0
- package/docs/v7/features/test-replays.mdx +408 -0
- package/docs/v7/features/test-reports.mdx +308 -0
- package/docs/v7/getting-started/{running-and-debugging.mdx → debugging-tests.mdx} +12 -142
- package/docs/v7/getting-started/quickstart.mdx +22 -305
- package/docs/v7/getting-started/running-tests.mdx +173 -0
- package/docs/v7/overview/readme.mdx +1 -1
- package/docs/v7/overview/what-is-testdriver.mdx +2 -14
- package/docs/v7/presets/chrome-extension.mdx +147 -122
- package/interfaces/cli/commands/init.js +78 -20
- package/interfaces/cli/lib/base.js +3 -2
- package/interfaces/logger.js +0 -2
- package/interfaces/shared-test-state.mjs +0 -5
- package/interfaces/vitest-plugin.mjs +69 -42
- package/lib/core/Dashcam.js +65 -66
- package/lib/vitest/hooks.mjs +42 -50
- package/manual/test-init-command.js +223 -0
- package/package.json +2 -2
- package/schema.json +5 -5
- package/sdk-log-formatter.js +351 -176
- package/sdk.d.ts +8 -8
- package/sdk.js +436 -121
- package/setup/aws/cloudformation.yaml +2 -2
- package/setup/aws/self-hosted.yml +1 -1
- package/test/testdriver/chrome-extension.test.mjs +55 -72
- package/test/testdriver/element-not-found.test.mjs +2 -1
- package/test/testdriver/hover-image.test.mjs +1 -1
- package/test/testdriver/hover-text-with-description.test.mjs +0 -3
- package/test/testdriver/scroll-until-text.test.mjs +10 -6
- package/test/testdriver/setup/lifecycleHelpers.mjs +19 -24
- package/test/testdriver/setup/testHelpers.mjs +18 -23
- package/vitest.config.mjs +3 -3
- package/.github/workflows/linux-tests.yml +0 -28
- package/docs/v7/getting-started/generating-tests.mdx +0 -525
- package/test/testdriver/auto-cache-key-demo.test.mjs +0 -56

@@ -0,0 +1,414 @@

---
title: "Test Reports & Analytics"
description: "Comprehensive test reporting and analytics for teams"
icon: "chart-line"
---

TestDriver provides detailed test reports and analytics through the web dashboard at [app.testdriver.ai](https://app.testdriver.ai), giving you complete visibility into test runs, failures, and trends.

## Web Dashboard

Access comprehensive test analytics and reports:

<Card title="TestDriver Dashboard" icon="chart-simple" href="https://app.testdriver.ai">
  View all test runs, replays, and analytics at app.testdriver.ai
</Card>

### Dashboard Features

<CardGroup cols={2}>
  <Card title="Test Runs" icon="list">
    - All test executions
    - Pass/fail status
    - Duration and timing
    - Filter by date, project, or status
  </Card>

  <Card title="Dashcam Replays" icon="video">
    - Video replays of every test
    - Step-by-step action logs
    - Screenshots at each step
    - Shareable replay links
  </Card>

  <Card title="Analytics" icon="chart-line">
    - Success rate trends
    - Test duration over time
    - Flaky test detection
    - Cache hit rates
  </Card>

  <Card title="Team Collaboration" icon="users">
    - Share replays with team
    - Comment on test runs
    - Tag team members
    - Export reports
  </Card>
</CardGroup>

## Test Run Details

Each test run includes comprehensive information:

```javascript
import { test } from 'vitest';
import { chrome } from 'testdriverai/presets';

test('detailed test run', async (context) => {
  const { testdriver, dashcam } = await chrome(context, {
    url: 'https://example.com'
  });

  await testdriver.find('login button').click();
  await testdriver.assert('login form is visible');

  // After test completes, view detailed report at:
  console.log('Replay URL:', dashcam.url);
  // https://app.testdriver.ai/replay/abc123
});
```

### Available Report Data

<AccordionGroup>
  <Accordion title="Execution Details">
    - Test name and file
    - Start and end time
    - Total duration
    - Pass/fail status
    - Error messages and stack traces
    - Exit code
  </Accordion>

  <Accordion title="Action Log">
    - Every TestDriver command executed
    - Timestamp for each action
    - AI vision analysis results
    - Element coordinates found
    - Screenshots before/after each action
  </Accordion>

  <Accordion title="Performance Metrics">
    - Time per command
    - AI inference latency
    - Cache hit/miss ratio
    - Network request timing
    - Screenshot capture time
  </Accordion>

  <Accordion title="Environment Info">
    - Operating system
    - Browser version
    - Screen resolution
    - Sandbox ID
    - API version
  </Accordion>
</AccordionGroup>

## JUnit XML Reports

Generate JUnit-compatible XML reports for CI/CD integration:

```bash
# Generate JUnit report
npx vitest run --reporter=junit --outputFile=test-results.xml
```

```xml test-results.xml
<?xml version="1.0" encoding="UTF-8"?>
<testsuites name="vitest tests" tests="12" failures="1" errors="0" time="125.3">
  <testsuite name="login.test.js" tests="3" failures="1" time="31.2">
    <testcase name="user can login" time="12.4" />
    <testcase name="invalid credentials show error" time="9.8">
      <failure message="AssertionError: expected login to fail">
        Stack trace details...
      </failure>
    </testcase>
    <testcase name="password reset works" time="9.0" />
  </testsuite>
</testsuites>
```

<Check>
  JUnit reports integrate with Jenkins, Azure DevOps, TeamCity, CircleCI, and GitHub Actions for test result visualization.
</Check>

## CI/CD Integration

TestDriver reports integrate seamlessly with popular CI/CD platforms:

<Tabs>
  <Tab title="GitHub Actions">
    ```yaml .github/workflows/test.yml
    name: E2E Tests

    on: [push, pull_request]

    jobs:
      test:
        runs-on: ubuntu-latest
        steps:
          - uses: actions/checkout@v3
          - uses: actions/setup-node@v3
          - run: npm install
          - run: npx vitest run --reporter=junit --outputFile=test-results.xml
            env:
              TD_API_KEY: ${{ secrets.TD_API_KEY }}

          # Upload test results
          - uses: actions/upload-artifact@v3
            if: always()
            with:
              name: test-results
              path: test-results.xml
    ```

    Test results appear in the GitHub Actions UI with pass/fail status.
  </Tab>

  <Tab title="GitLab CI">
    ```yaml .gitlab-ci.yml
    test:
      image: node:20
      script:
        - npm install
        - npx vitest run --reporter=junit --outputFile=junit.xml
      artifacts:
        when: always
        reports:
          junit: junit.xml
      variables:
        TD_API_KEY: $TD_API_KEY
    ```

    GitLab automatically parses JUnit reports and displays test results in merge requests.
  </Tab>

  <Tab title="CircleCI">
    ```yaml .circleci/config.yml
    version: 2.1
    jobs:
      test:
        docker:
          - image: cimg/node:20.0
        steps:
          - checkout
          - run: npm install
          - run: npx vitest run --reporter=junit --outputFile=test-results/junit.xml
          - store_test_results:
              path: test-results
          - store_artifacts:
              path: test-results
    ```

    CircleCI displays test insights on the Tests tab.
  </Tab>

  <Tab title="Jenkins">
    ```groovy Jenkinsfile
    pipeline {
      agent any
      stages {
        stage('Test') {
          steps {
            sh 'npm install'
            sh 'npx vitest run --reporter=junit --outputFile=test-results.xml'
          }
        }
      }
      post {
        always {
          junit 'test-results.xml'
        }
      }
    }
    ```

    Jenkins publishes test results with trends and history.
  </Tab>
</Tabs>

## Custom Reporting

Access raw test data programmatically:

```javascript
import { test } from 'vitest';
import { chrome } from 'testdriverai/presets';
import fs from 'fs';

test('custom reporting', async (context) => {
  const { testdriver, dashcam } = await chrome(context, {
    url: 'https://example.com'
  });

  const startTime = Date.now();

  await testdriver.find('button').click();
  await testdriver.assert('page loaded');

  const duration = Date.now() - startTime;

  // Build custom report
  const report = {
    testName: 'custom reporting',
    duration,
    replayUrl: dashcam.url,
    status: 'passed',
    timestamp: new Date().toISOString()
  };

  // Save to custom format
  fs.writeFileSync(
    'custom-report.json',
    JSON.stringify(report, null, 2)
  );
});
```

## Analytics & Trends

View historical trends in the web dashboard:

<CardGroup cols={2}>
  <Card title="Success Rate" icon="percent">
    Track test stability over time
    - Daily/weekly/monthly success rates
    - Identify flaky tests
    - Compare across branches
  </Card>

  <Card title="Performance Trends" icon="gauge">
    Monitor test performance
    - Average test duration
    - Slowest tests
    - Cache effectiveness
  </Card>

  <Card title="Usage Metrics" icon="chart-bar">
    Track team usage
    - Tests run per day
    - Active team members
    - API usage
  </Card>

  <Card title="Cost Analysis" icon="dollar-sign">
    Optimize spending
    - Cost per test
    - Sandbox usage
    - Cache savings
  </Card>
</CardGroup>

## Export Options

Download reports in multiple formats:

<Tabs>
  <Tab title="JSON">
    ```json
    {
      "testRuns": [
        {
          "id": "run_abc123",
          "name": "login test",
          "status": "passed",
          "duration": 12400,
          "timestamp": "2024-12-09T10:30:00Z",
          "replayUrl": "https://app.testdriver.ai/replay/abc123"
        }
      ],
      "summary": {
        "total": 100,
        "passed": 95,
        "failed": 5,
        "successRate": 0.95
      }
    }
    ```
  </Tab>

  <Tab title="CSV">
    ```csv
    Test Name,Status,Duration (ms),Timestamp,Replay URL
    login test,passed,12400,2024-12-09T10:30:00Z,https://app.testdriver.ai/replay/abc123
    checkout test,passed,18200,2024-12-09T10:31:00Z,https://app.testdriver.ai/replay/def456
    search test,failed,5600,2024-12-09T10:32:00Z,https://app.testdriver.ai/replay/ghi789
    ```
  </Tab>

  <Tab title="PDF">
    Generate PDF reports with:
    - Executive summary
    - Test results table
    - Embedded screenshots
    - Trend charts
    - Recommendations
  </Tab>
</Tabs>

## Slack/Teams Integration

Get test notifications in your team chat:

```yaml .github/workflows/test.yml
- name: Notify Slack
  if: failure()
  uses: slackapi/slack-github-action@v1
  with:
    payload: |
      {
        "text": "E2E Tests Failed",
        "blocks": [
          {
            "type": "section",
            "text": {
              "type": "mrkdwn",
              "text": "❌ E2E tests failed on ${{ github.ref }}\n<${{ env.REPLAY_URL }}|View Replay>"
            }
          }
        ]
      }
  env:
    SLACK_WEBHOOK_URL: ${{ secrets.SLACK_WEBHOOK }}
```
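
The notification step above reads the replay link from `env.REPLAY_URL`, which is not defined anywhere in the snippet. One way to populate it (a sketch, not the only approach, assuming a GitHub-hosted runner and the `dashcam` handle from the earlier examples) is to append the URL to the `$GITHUB_ENV` file from inside the test so later workflow steps can read it:

```javascript
import { test } from 'vitest';
import { chrome } from 'testdriverai/presets';
import fs from 'fs';

test('export replay url', async (context) => {
  const { testdriver, dashcam } = await chrome(context, {
    url: 'https://example.com'
  });

  await testdriver.assert('page loaded');

  // GITHUB_ENV points to a file on GitHub-hosted runners; variables appended
  // here become environment variables for subsequent workflow steps.
  if (process.env.GITHUB_ENV) {
    fs.appendFileSync(process.env.GITHUB_ENV, `REPLAY_URL=${dashcam.url}\n`);
  }
});
```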

<Card title="Notifications Setup" icon="bell" href="/v7/guides/notifications">
  Configure Slack, Teams, Discord, and email notifications
</Card>

## Learn More

<CardGroup cols={2}>
  <Card
    title="Observability & Debugging"
    icon="microscope"
    href="/v7/features/observable"
  >
    Learn about Dashcam replays and debugging
  </Card>

  <Card
    title="Continuous Testing"
    icon="arrows-spin"
    href="/v7/features/continuous-testing"
  >
    CI/CD integration guide
  </Card>

  <Card
    title="Dashboard"
    icon="chart-simple"
    href="https://app.testdriver.ai"
  >
    View your test reports
  </Card>

  <Card
    title="API Documentation"
    icon="code"
    href="/v7/api/overview"
  >
    Programmatic access to reports
  </Card>
</CardGroup>

@@ -0,0 +1,229 @@

---
title: 'Sandbox Customization'
description: 'Customize your test environments with pre-installed applications, dependencies, and configurations'
---

## Overview

TestDriver sandboxes can be customized to match your testing requirements. Install applications, configure system settings, and prepare environments programmatically to ensure your tests run in the right context.

## Provisioning API

The provisioning API allows you to customize sandbox environments before running tests.

### Installing Applications

Use the provisioning API to install software packages:

```javascript
import { ai } from '@testdriverai/sdk';

// Install applications before tests
await ai('install Chrome browser');
await ai('install Node.js version 18');
await ai('install PostgreSQL database');
```

### System Configuration

Configure system settings and preferences:

```javascript
// Set environment variables
await ai('set NODE_ENV to production');

// Configure system settings
await ai('enable dark mode');
await ai('set screen resolution to 1920x1080');
```

## Exec Command

The `exec` command allows you to run shell commands directly in the sandbox:

```javascript
import { exec } from '@testdriverai/sdk';

// Install packages via package manager
await exec('apt-get update && apt-get install -y git');

// Run setup scripts
await exec('npm install -g typescript');

// Configure applications
await exec('cp config.json /etc/myapp/config.json');
```

### Examples

#### Installing Development Tools

```javascript
// Install common development tools
await exec('apt-get update');
await exec('apt-get install -y curl wget git vim');
```

#### Setting Up a Web Server

```javascript
// Install and configure nginx
await exec('apt-get install -y nginx');
await exec('systemctl start nginx');
await exec('systemctl enable nginx');
```

#### Installing Language Runtimes

```javascript
// Install Python and pip
await exec('apt-get install -y python3 python3-pip');
await exec('pip3 install requests pytest');
```

## Pre-Test Setup

Run setup commands before each test:

```javascript
import { beforeAll } from 'vitest';
import { ai, exec } from '@testdriverai/sdk';

beforeAll(async () => {
  // Install dependencies
  await exec('npm install');

  // Start services
  await exec('npm run start-server &');

  // Wait for service to be ready
  await ai('wait for server to start on port 3000');
});
```

## Environment Templates

Create reusable environment configurations:

```javascript
// environments/chrome-dev.js
import { exec } from '@testdriverai/sdk';

export async function setupChromeDev() {
  await exec('apt-get update');
  await exec('apt-get install -y google-chrome-stable');
  await exec('npm install -g lighthouse');
}

// Use in tests
import { setupChromeDev } from './environments/chrome-dev';

beforeAll(async () => {
  await setupChromeDev();
});
```

## Custom VM Images (Enterprise)

For more complex customization needs, Enterprise customers can work with TestDriver to create custom VM images.

### Benefits of Custom AMIs

- **Faster Startup**: Pre-installed applications ready to use
- **Consistency**: Identical environments across all test runs
- **Complex Setup**: Handle intricate installation procedures once
- **Version Control**: Track and manage environment changes

### What Can Be Pre-Installed

- Desktop applications (browsers, IDEs, etc.)
- System dependencies and libraries
- Language runtimes and frameworks
- Database servers and tools
- Custom certificates and credentials
- System configurations and settings

### Working with TestDriver

Enterprise customers receive dedicated support for custom environments:

1. **Requirements Gathering**: Discuss your environment needs
2. **AMI Development**: We build and test the custom image
3. **Validation**: Review and approve the environment
4. **Deployment**: Roll out to your test infrastructure
5. **Maintenance**: Regular updates and security patches

Learn more about custom environments in our [Enterprise documentation](/v7/features/enterprise#custom-environments).

## Best Practices

### Keep Setup Scripts Fast

- Pre-install large dependencies in custom AMIs when possible
- Cache downloaded files and packages
- Run setup in parallel when independent (see the sketch below)
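
A minimal sketch of the parallel-setup point above, assuming the `exec` helper imported from `@testdriverai/sdk` earlier on this page and that the two installs do not depend on each other:

```javascript
import { exec } from '@testdriverai/sdk';

// Independent setup steps can run concurrently instead of one after another.
await Promise.all([
  exec('npm install -g typescript@5.3.3'),
  exec('pip3 install requests pytest'),
]);
```

Steps that share a package manager lock (for example, two `apt-get install` calls) should still run sequentially.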

### Make Setup Idempotent

- Check if software is already installed before installing
- Use version checks to avoid unnecessary reinstalls
- Handle errors gracefully

```javascript
async function ensureNodeInstalled() {
  try {
    await exec('node --version');
  } catch {
    await exec('curl -fsSL https://deb.nodesource.com/setup_18.x | bash -');
    await exec('apt-get install -y nodejs');
  }
}
```
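
The presence check above can be extended to the version check mentioned in the list. A hypothetical sketch, assuming `exec` resolves with the command's stdout (adjust to whatever the helper actually returns):

```javascript
async function ensureTypescript(requiredMajor = 5) {
  try {
    // Assumption: exec resolves with stdout, e.g. "Version 5.3.3".
    const output = await exec('tsc --version');
    const major = parseInt(String(output).match(/(\d+)\.\d+\.\d+/)?.[1] ?? '0', 10);
    if (major >= requiredMajor) return; // already new enough, skip the reinstall
  } catch {
    // tsc not installed at all
  }
  await exec('npm install -g typescript@5.3.3');
}
```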

### Version Lock Dependencies

- Specify exact versions of tools and packages
- Avoid "latest" tags that can change
- Document version requirements

```javascript
// Good - specific version
await exec('npm install -g typescript@5.3.3');

// Avoid - version may change
await exec('npm install -g typescript');
```

## Troubleshooting

### Installation Failures

Check logs for detailed error messages:

```javascript
try {
  await exec('apt-get install -y mypackage');
} catch (error) {
  console.error('Installation failed:', error);
}
```

### Permission Issues

Some commands may require sudo:

```javascript
await exec('sudo apt-get update');
```

### Network Issues

Ensure your sandbox has internet access for downloads:

```javascript
// Test connectivity
await exec('ping -c 3 google.com');
```

## Learn More

- [Exec API Reference](/v7/api/exec)
- [Managed Sandboxes](/v7/features/managed-sandboxes)
- [Enterprise Features](/v7/features/enterprise)