@akshayram1/omnibrowser-agent 0.2.32 → 0.3.1
- package/.env +1 -0
- package/README.github.md +37 -0
- package/demo1.gif +0 -0
- package/dist/background.js +1 -1
- package/dist/background.js.map +1 -1
- package/dist/content.js +65 -15
- package/dist/content.js.map +3 -3
- package/dist/lib.js +109 -19
- package/dist/lib.js.map +3 -3
- package/dist/manifest.json +1 -1
- package/dist/types/lib/index.d.ts +1 -0
- package/dist/types/shared/contracts.d.ts +4 -0
- package/dist/types/shared/parse-action.d.ts +1 -0
- package/dist/types/shared/safety.d.ts +2 -2
- package/icons/big.png +0 -0
- package/icons/logo.png +0 -0
- package/icons/logo_horizontal.png +0 -0
- package/notebook/.env +1 -0
- package/notebook/README.md +39 -0
- package/notebook/custom_quantized_llm_colab copy.ipynb +7084 -0
- package/notebook/data/omnibrowser_planner_train.jsonl +500 -0
- package/package.json +1 -1
package/notebook/.env
ADDED
@@ -0,0 +1 @@
+HF_TOKEN=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

package/notebook/README.md
ADDED
@@ -0,0 +1,39 @@
+# Custom Quantized LLM Notebook Assets
+
+This folder contains everything needed to bootstrap a custom OmniBrowser planner model using Google Colab:
+
+- `custom_quantized_llm_colab.ipynb` — QLoRA fine-tune + MLC quantization workflow.
+- `data/omnibrowser_planner_train.jsonl` — starter training dataset (500 rows, JSONL chat format).
+- `scripts/generate_dataset.mjs` — generates chain-based training data that matches `src/core/prompt.ts`.
+- `scripts/validate_dataset.mjs` — validates assistant JSON shape + selector correctness.
+
+## Generate dataset
+
+```bash
+node notebook/scripts/generate_dataset.mjs 300 notebook/data/omnibrowser_planner_train.jsonl
+```
+
+With an explicit seed:
+
+```bash
+node notebook/scripts/generate_dataset.mjs 500 notebook/data/omnibrowser_planner_train.jsonl 1337
+```
+
+## Validate dataset
+
+```bash
+node notebook/scripts/validate_dataset.mjs notebook/data/omnibrowser_planner_train.jsonl
+```
+
+## Colab usage
+
+1. Upload `notebook/data/omnibrowser_planner_train.jsonl` to Colab.
+2. Open and run `notebook/custom_quantized_llm_colab.ipynb`.
+3. Set your Hugging Face repo names inside the notebook cells.
+4. Train, merge, quantize (`q4f16_1`), and upload.
+
+## Notes
+
+- The dataset is intentionally a starter baseline; expand it to 200-500+ high-quality real traces before final training.
+- Rows are generated as linked 3-5 step chains with accumulated history/memory and 6-15 distractor candidates per step.
+- Keep selector strings exact. Any mismatch between candidates and `action.selector` will degrade planner reliability.
|