amaprice 1.0.4 → 1.0.6

package/README.md CHANGED
@@ -1,6 +1,11 @@
  # amaprice
 
- CLI tool to look up and track Amazon product prices.
+ [![License: MIT](https://img.shields.io/badge/license-MIT-blue.svg)](LICENSE)
+ [![Node.js >=20](https://img.shields.io/badge/node-%3E%3D20-339933?logo=node.js&logoColor=white)](https://nodejs.org/)
+ [![Playwright](https://img.shields.io/badge/Playwright-Chromium-2EAD33?logo=playwright&logoColor=white)](https://playwright.dev/)
+ [![Supabase](https://img.shields.io/badge/Supabase-Postgres-3ECF8E?logo=supabase&logoColor=white)](https://supabase.com/)
+
+ `amaprice` is a terminal-first CLI to check Amazon prices, track products, and build shared price history automatically.
 
  ## Install
 
@@ -8,28 +13,19 @@ CLI tool to look up and track Amazon product prices.
  npm install -g amaprice
  ```
 
- ## Usage
+ ## Quickstart
 
  ```bash
- # One-shot price lookup
+ # one-shot lookup
  amaprice price "https://www.amazon.de/dp/B0DZ5P7JD6"
- amaprice price B0DZ5P7JD6
- amaprice price
- # then paste full Amazon URL or ASIN when prompted
-
- # JSON output (for scripts / AI agents)
- amaprice price "https://www.amazon.de/dp/B0DZ5P7JD6" --json
 
- # Track a product's price over time
- amaprice track "https://www.amazon.de/dp/B0DZ5P7JD6"
- amaprice track B0DZ5P7JD6
- amaprice track
- # then paste full Amazon URL or ASIN when prompted
+ # start tracking with a tier
+ amaprice track B0DZ5P7JD6 --tier daily
 
- # View price history
- amaprice history B0DZ5P7JD6
+ # show history
+ amaprice history B0DZ5P7JD6 --limit 30
 
- # List all tracked products
+ # list tracked products
  amaprice list
  ```
 
@@ -37,23 +33,155 @@ amaprice list
 
  | Command | Description |
  |---|---|
- | `amaprice price [url\|asin]` | One-shot price lookup (or prompt if omitted) |
- | `amaprice track [url\|asin]` | Track a product's price (or prompt if omitted) |
- | `amaprice history <url\|asin>` | Show price history (`--limit N`, default 30) |
- | `amaprice list` | Show all tracked products with latest price |
+ | `amaprice [url\|asin]` | Shortcut for `amaprice price [url\|asin]` |
+ | `amaprice price [url\|asin]` | One-shot lookup and silent history insert |
+ | `amaprice track [url\|asin]` | Track product + current price |
+ | `amaprice history <url\|asin>` | Show history (`--limit N`) |
+ | `amaprice list` | List tracked products + latest price |
+ | `amaprice sync --limit <n>` | Run background sync for due products |
+ | `amaprice tier <url\|asin> <hourly\|daily\|weekly>` | Set tier for tracked product |
+
+ All commands support `--json`.
+
+ ## Testing
+
+ Run regression and parser tests:
+
+ ```bash
+ npm test
+ ```
+
+ ## Tiered Background Model
+
+ Each product has:
+ - `tier`: `hourly`, `daily`, or `weekly`
+ - `tier_mode`: `auto` or `manual`
+ - `next_scrape_at`: when the worker should scrape next
+
+ How tiers are determined in `auto` mode:
+ - `hourly`: 2+ price changes in 48h, or >=5% change across 7 days
+ - `daily`: normal active products
+ - `weekly`: no observed change in 30 days
+
+ Worker behavior:
+ - claims due products
+ - scrapes with Playwright
+ - writes `price_history`
+ - writes `scrape_attempts` telemetry for block/error monitoring
+ - resets/backs off on failures
+ - updates next run with jitter
+
+ ## Database Migration (Supabase)
+
+ Run these migrations in the Supabase SQL Editor:
+
+ `supabase/migrations/20260220_add_tier_scheduler.sql`
+
+ `supabase/migrations/20260220_add_scrape_attempts.sql`
+
+ `supabase/migrations/20260220_add_worker_health_view.sql`
+
+ `supabase/migrations/20260220_grant_worker_health_select.sql`
+
+ `supabase/migrations/20260220_add_price_history_currency.sql`
+
+ These migrations add tier fields, indexes, telemetry, worker health rollups, and `price_history.currency`.
+
+ ## Block Detection Queries
+
+ Products currently failing or likely blocked:
+
+ ```sql
+ select asin, tier, consecutive_failures, last_error, last_scraped_at, next_scrape_at
+ from products
+ where consecutive_failures >= 3
+    or last_error ilike '%captcha%'
+    or last_error ilike '%robot%'
+    or last_error ilike '%503%'
+ order by consecutive_failures desc, next_scrape_at asc;
+ ```
+
+ Hourly block-rate from telemetry:
+
+ ```sql
+ select
+   date_trunc('hour', scraped_at) as hour,
+   count(*) as total,
+   sum(case when blocked_signal then 1 else 0 end) as blocked,
+   round(100.0 * sum(case when blocked_signal then 1 else 0 end) / nullif(count(*), 0), 2) as blocked_pct
+ from scrape_attempts
+ where scraped_at >= now() - interval '24 hours'
+ group by 1
+ order by 1 desc;
+ ```
+
+ Single-row worker health view:
+
+ ```sql
+ select * from worker_health;
+ ```
+
+ ## Local/Worker Environment
+
+ Use env vars (recommended):
+
+ ```bash
+ export SUPABASE_URL="https://<project-ref>.supabase.co"
+ export SUPABASE_KEY="<anon-or-service-role-key>"
+ ```
+
+ For production background workers, prefer the Supabase **service role key**.
 
- All commands support `--json` for machine-readable output.
+ ## Railway Worker Deployment
 
- If your URL contains query parameters (`?` / `&`), either wrap it in quotes or run the command without an argument and paste the full URL into the prompt.
+ This repo includes:
+ - `src/worker.js` (long-running loop worker)
+ - `railway.json` + `Dockerfile` (Playwright-ready runtime)
 
- ## Community Price Database
+ Steps:
+ 1. Create a Railway project from this repo.
+ 2. Add env vars: `SUPABASE_URL`, `SUPABASE_KEY`.
+ 3. Optional env vars:
+    - `SYNC_INTERVAL_MINUTES=5`
+    - `SYNC_LIMIT=20`
+ 4. Ensure the builder is set to Dockerfile (root `Dockerfile`).
+ 5. Deploy.
+ 6. Confirm logs show `[worker] processed=...`.
 
- amaprice contributes anonymized price data (product title, ASIN, price, and timestamp) to a shared database. This means every lookup helps build a broader price history that benefits all users — the more people use amaprice, the richer the tracking data becomes for everyone. No personal or device information is collected.
+ If Railway still uses Railpack instead of the Dockerfile, set the builder to Dockerfile manually in the Railway service settings and redeploy.
+
+ One-shot run for testing:
+
+ ```bash
+ npm run worker:once
+ ```
+
+ ## Vercel Website Deployment (`amaprice.sh`)
+
+ The marketing site is a lean Next.js app in `website/`.
+
+ Steps:
+ 1. Import the repo in Vercel.
+ 2. Leave the project at repo root (deployment is controlled by root `vercel.json`).
+ 3. Set website env vars:
+    - `NEXT_PUBLIC_SUPABASE_URL`
+    - `NEXT_PUBLIC_SUPABASE_ANON_KEY`
+ 4. Deploy.
+ 5. Add the domain `amaprice.sh` in Vercel Domains and assign it to this project.
+ 6. Redirect `www.amaprice.sh` to `amaprice.sh`.
+
+ Local website development:
+
+ ```bash
+ cd website
+ npm install
+ npm run dev
+ ```
 
- ## Requirements
+ ## Community Price Data
 
- - Node.js >= 18
- - Chromium is installed automatically via Playwright
+ `amaprice` contributes anonymized price snapshots (title, ASIN, price, timestamp) to a shared dataset.
+ No personal/device data is stored.
 
  ## License
 
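The `auto`-mode tier rules described in the README above can be sketched as a small pure function. The function name `classifyTier` and its input shape are illustrative assumptions, not taken from the package source:

```javascript
// Sketch of the auto-tiering rules from the README; names are hypothetical.
function classifyTier({ changesLast48h, pctChange7d, daysSinceLastChange }) {
  // hourly: 2+ price changes in 48h, or >=5% change across 7 days
  if (changesLast48h >= 2 || Math.abs(pctChange7d) >= 5) return 'hourly';
  // weekly: no observed change in 30 days
  if (daysSinceLastChange >= 30) return 'weekly';
  // daily: everything else (normal active products)
  return 'daily';
}

console.log(classifyTier({ changesLast48h: 3, pctChange7d: 1, daysSinceLastChange: 0 })); // → "hourly"
```

The ordering matters: a volatile product that also went quiet for 30 days should stay `hourly`, so the volatility checks run first.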
package/bin/cli.js CHANGED
@@ -2,6 +2,16 @@
 
  const { program } = require('commander');
  const pkg = require('../package.json');
+ const KNOWN_COMMANDS = new Set(['price', 'track', 'history', 'list', 'sync', 'tier', 'help']);
+
+ const userArgs = process.argv.slice(2);
+ if (userArgs.length > 0) {
+   const firstArg = userArgs[0];
+   // Convenience mode: treat `amaprice <url-or-asin>` as `amaprice price <url-or-asin>`.
+   if (!firstArg.startsWith('-') && !KNOWN_COMMANDS.has(firstArg)) {
+     process.argv.splice(2, 0, 'price');
+   }
+ }
 
  program
    .name('amaprice')
@@ -13,5 +23,7 @@ require('../src/commands/price')(program);
  require('../src/commands/track')(program);
  require('../src/commands/history')(program);
  require('../src/commands/list')(program);
+ require('../src/commands/sync')(program);
+ require('../src/commands/tier')(program);
 
  program.parse();
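The argv rewrite added to `bin/cli.js` above can be factored into a pure helper to show its behavior in isolation. `withDefaultCommand` is an illustrative refactor, not a function in the package:

```javascript
const KNOWN_COMMANDS = new Set(['price', 'track', 'history', 'list', 'sync', 'tier', 'help']);

// Return argv with 'price' spliced in after the node/script entries when the
// first user argument is neither a flag nor a known subcommand.
function withDefaultCommand(argv) {
  const userArgs = argv.slice(2);
  if (userArgs.length > 0) {
    const firstArg = userArgs[0];
    if (!firstArg.startsWith('-') && !KNOWN_COMMANDS.has(firstArg)) {
      return [...argv.slice(0, 2), 'price', ...userArgs];
    }
  }
  return argv;
}

console.log(withDefaultCommand(['node', 'cli', 'B0DZ5P7JD6'])); // → [ 'node', 'cli', 'price', 'B0DZ5P7JD6' ]
```

Flags (`--help`, `--version`) and known subcommands pass through untouched, so only bare URLs/ASINs get the `price` default.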
package/package.json CHANGED
@@ -1,6 +1,6 @@
  {
    "name": "amaprice",
-   "version": "1.0.4",
+   "version": "1.0.6",
    "description": "CLI tool to scrape and track Amazon product prices",
    "main": "src/scraper.js",
    "type": "commonjs",
@@ -8,6 +8,10 @@
      "amaprice": "bin/cli.js"
    },
    "scripts": {
+     "sync": "node bin/cli.js sync",
+     "worker": "node src/worker.js",
+     "worker:once": "SYNC_RUN_ONCE=1 node src/worker.js",
+     "test": "node --test",
      "postinstall": "npx --yes playwright install chromium"
    },
    "files": [
@@ -17,7 +21,7 @@
      "LICENSE"
    ],
    "engines": {
-     "node": ">=18"
+     "node": ">=20"
    },
    "keywords": [
      "amazon",
@@ -16,6 +16,10 @@ module.exports = function (program) {
          title: p.title,
          url: p.url,
          domain: p.domain,
+         tier: p.tier ?? 'daily',
+         tierMode: p.tier_mode ?? 'auto',
+         active: p.is_active ?? true,
+         nextScrapeAt: p.next_scrape_at ?? null,
          latestPrice: p.latestPrice ? parseFloat(p.latestPrice.price) : null,
          currency: p.latestPrice?.currency ?? null,
          lastScraped: p.latestPrice?.scraped_at ?? null,
@@ -30,7 +34,9 @@ module.exports = function (program) {
          const price = p.latestPrice
            ? formatPrice(parseFloat(p.latestPrice.price), p.latestPrice.currency)
            : 'N/A';
-         console.log(`  ${p.asin}  ${price}  ${p.title}`);
+         const tier = p.tier || 'daily';
+         const status = p.is_active === false ? 'paused' : tier;
+         console.log(`  ${p.asin}  ${price}  [${status}]  ${p.title}`);
        }
      }
    } catch (err) {
@@ -1,7 +1,8 @@
  const { normalizeAmazonInput } = require('../url');
  const { resolveCliInput } = require('../input');
  const { scrapePrice } = require('../scraper');
- const { upsertProduct, insertPrice } = require('../db');
+ const { upsertProduct, insertPrice, updateProductById } = require('../db');
+ const { normalizeTier, computeNextScrapeAt } = require('../tiering');
 
  module.exports = function (program) {
    program
@@ -49,6 +50,15 @@ module.exports = function (program) {
          price: result.price.numeric,
          currency: result.price.currency,
        });
+       const tier = normalizeTier(product.tier, 'daily');
+       await updateProductById(product.id, {
+         last_price: result.price.numeric,
+         last_scraped_at: new Date().toISOString(),
+         consecutive_failures: 0,
+         last_error: null,
+         next_scrape_at: computeNextScrapeAt(tier),
+         last_price_change_at: new Date().toISOString(),
+       });
      } catch {
        // Silent — don't disrupt the user experience
      }
@@ -0,0 +1,38 @@
+ const { runDueSync } = require('../sync-runner');
+
+ module.exports = function (program) {
+   program
+     .command('sync')
+     .description('Run background sync for due products (for cron/worker usage)')
+     .option('--limit <n>', 'Max products to process in one run', '20')
+     .option('--json', 'Output as JSON')
+     .action(async (opts) => {
+       const limit = Math.max(1, parseInt(opts.limit, 10) || 20);
+
+       try {
+         const report = await runDueSync({ limit });
+         if (opts.json) {
+           console.log(JSON.stringify(report));
+         } else {
+           if (report.processed === 0) {
+             console.log('No due products.');
+             return;
+           }
+
+           console.log(`Processed: ${report.processed}`);
+           console.log(`Success: ${report.success}`);
+           console.log(`Failed: ${report.failed}`);
+           for (const item of report.items) {
+             if (item.status === 'ok') {
+               console.log(`  OK   ${item.asin} ${item.price} ${item.currency} tier=${item.tier}`);
+             } else {
+               console.log(`  FAIL ${item.asin} tier=${item.tier} ${item.error}`);
+             }
+           }
+         }
+       } catch (err) {
+         console.error(`Error: ${err.message}`);
+         process.exit(1);
+       }
+     });
+ };
@@ -0,0 +1,75 @@
+ const { extractAsin } = require('../url');
+ const { getProductByAsin, updateProductByAsin } = require('../db');
+ const { normalizeTier, computeNextScrapeAt } = require('../tiering');
+
+ module.exports = function (program) {
+   program
+     .command('tier <url-or-asin> <tier>')
+     .description('Set polling tier for a tracked product (hourly|daily|weekly)')
+     .option('--auto', 'Enable automatic re-tiering based on price behavior')
+     .option('--manual', 'Keep this tier fixed')
+     .option('--activate', 'Enable background sync for this product')
+     .option('--deactivate', 'Disable background sync for this product')
+     .option('--json', 'Output as JSON')
+     .action(async (urlOrAsin, tierArg, opts) => {
+       const asin = extractAsin(urlOrAsin);
+       if (!asin) {
+         console.error('Error: Could not extract ASIN from input.');
+         process.exit(1);
+       }
+
+       const tier = normalizeTier(tierArg);
+       if (!tier) {
+         console.error('Error: Tier must be one of: hourly, daily, weekly.');
+         process.exit(1);
+       }
+
+       if (opts.activate && opts.deactivate) {
+         console.error('Error: Use either --activate or --deactivate, not both.');
+         process.exit(1);
+       }
+
+       try {
+         const product = await getProductByAsin(asin);
+         if (!product) {
+           console.error(`Error: Product with ASIN ${asin} is not tracked yet.`);
+           process.exit(1);
+         }
+
+         let tierMode = product.tier_mode || 'auto';
+         if (opts.auto) tierMode = 'auto';
+         if (opts.manual) tierMode = 'manual';
+         if (!opts.auto && !opts.manual) tierMode = 'manual';
+
+         const patch = {
+           tier,
+           tier_mode: tierMode,
+           next_scrape_at: computeNextScrapeAt(tier),
+         };
+         if (opts.activate) patch.is_active = true;
+         if (opts.deactivate) patch.is_active = false;
+
+         const updated = await updateProductByAsin(asin, patch);
+
+         if (opts.json) {
+           console.log(JSON.stringify({
+             asin: updated.asin,
+             tier: updated.tier,
+             tierMode: updated.tier_mode,
+             active: updated.is_active,
+             nextScrapeAt: updated.next_scrape_at,
+           }));
+         } else {
+           console.log(`ASIN: ${updated.asin}`);
+           console.log(`Tier: ${updated.tier}`);
+           console.log(`Tier mode: ${updated.tier_mode}`);
+           console.log(`Active: ${updated.is_active ? 'yes' : 'no'}`);
+           console.log(`Next scrape: ${updated.next_scrape_at}`);
+         }
+       } catch (err) {
+         console.error(`Error: ${err.message}`);
+         process.exit(1);
+       }
+     });
+ };
+
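The `normalizeTier` and `computeNextScrapeAt` helpers imported from `../tiering` are not shown in this diff. A plausible sketch, where the exact intervals and the ±10% jitter range are assumptions (the README only says the worker "updates next run with jitter"):

```javascript
// Hypothetical sketch of src/tiering.js; intervals and jitter are assumed.
const TIER_INTERVALS_MS = {
  hourly: 60 * 60 * 1000,
  daily: 24 * 60 * 60 * 1000,
  weekly: 7 * 24 * 60 * 60 * 1000,
};

// Return a valid tier name, or the fallback (undefined when no fallback given).
function normalizeTier(value, fallback) {
  const tier = String(value ?? '').toLowerCase();
  return TIER_INTERVALS_MS[tier] ? tier : fallback;
}

// Next run = now + tier interval, plus up to +/-10% jitter, as an ISO timestamp.
function computeNextScrapeAt(tier, now = Date.now()) {
  const base = TIER_INTERVALS_MS[tier] ?? TIER_INTERVALS_MS.daily;
  const jitter = base * 0.1 * (Math.random() * 2 - 1);
  return new Date(now + base + jitter).toISOString();
}
```

This two-argument `normalizeTier` shape matches both call sites in the diff: `normalizeTier(tierArg)` returns `undefined` for invalid input, while `normalizeTier(product.tier, 'daily')` falls back to a default.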
@@ -1,12 +1,17 @@
  const { normalizeAmazonInput } = require('../url');
  const { resolveCliInput } = require('../input');
  const { scrapePrice } = require('../scraper');
- const { upsertProduct, insertPrice } = require('../db');
+ const { upsertProduct, insertPrice, updateProductById } = require('../db');
+ const { normalizeTier, computeNextScrapeAt } = require('../tiering');
 
  module.exports = function (program) {
    program
      .command('track [input...]')
      .description('Save product + current price to Supabase')
+     .option('--tier <tier>', 'Set polling tier: hourly|daily|weekly')
+     .option('--manual-tier', 'Pin this product to its current tier (disable auto-tier)')
+     .option('--auto-tier', 'Enable automatic tiering for this product')
+     .option('--inactive', 'Track product but do not include it in background sync')
      .option('--json', 'Output as JSON')
      .action(async (inputParts, opts) => {
        const input = await resolveCliInput(inputParts);
@@ -16,6 +21,16 @@ module.exports = function (program) {
          process.exit(1);
        }
 
+       const selectedTier = opts.tier ? normalizeTier(opts.tier) : undefined;
+       if (opts.tier && !selectedTier) {
+         console.error('Error: Tier must be one of: hourly, daily, weekly.');
+         process.exit(1);
+       }
+       if (opts.manualTier && opts.autoTier) {
+         console.error('Error: Use either --manual-tier or --auto-tier, not both.');
+         process.exit(1);
+       }
+
        try {
          const result = await scrapePrice(normalized.url);
 
@@ -29,6 +44,10 @@ module.exports = function (program) {
            title: result.title,
            url: result.url,
            domain: result.domain,
+           tier: selectedTier,
+           tierMode: opts.manualTier ? 'manual' : (opts.autoTier ? 'auto' : undefined),
+           isActive: opts.inactive ? false : undefined,
+           nextScrapeAt: selectedTier ? computeNextScrapeAt(selectedTier) : undefined,
          });
 
          const priceRecord = await insertPrice({
@@ -37,6 +56,20 @@ module.exports = function (program) {
            currency: result.price.currency,
          });
 
+         const nextTier = normalizeTier(product.tier, selectedTier || 'daily');
+         try {
+           await updateProductById(product.id, {
+             last_price: result.price.numeric,
+             last_scraped_at: priceRecord.scraped_at,
+             consecutive_failures: 0,
+             last_error: null,
+             next_scrape_at: computeNextScrapeAt(nextTier),
+             last_price_change_at: priceRecord.scraped_at,
+           });
+         } catch {
+           // Background scheduling fields may not exist before migration.
+         }
+
          if (opts.json) {
            console.log(JSON.stringify({
              product: result.title,
@@ -47,11 +80,15 @@ module.exports = function (program) {
              productId: product.id,
              priceRecordId: priceRecord.id,
              trackedAt: priceRecord.scraped_at,
+             tier: nextTier,
+             tierMode: opts.manualTier ? 'manual' : (opts.autoTier ? 'auto' : (product.tier_mode || 'auto')),
+             active: opts.inactive ? false : (product.is_active ?? true),
            }));
          } else {
            console.log(`Tracking: ${result.title}`);
            console.log(`ASIN: ${result.asin}`);
            console.log(`Price: ${result.priceRaw}`);
+           console.log(`Tier: ${nextTier}`);
            console.log(`Saved to Supabase.`);
          }
        } catch (err) {
package/src/config.js CHANGED
@@ -1,4 +1,7 @@
- const SUPABASE_URL = 'https://fetgmcukbeetwdahrkhe.supabase.co';
- const SUPABASE_KEY = 'eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyJpc3MiOiJzdXBhYmFzZSIsInJlZiI6ImZldGdtY3VrYmVldHdkYWhya2hlIiwicm9sZSI6ImFub24iLCJpYXQiOjE3NzE2MTQ2MTQsImV4cCI6MjA4NzE5MDYxNH0.KOymOB5I05eO_MMXyVHkQ2PukXkDIbFVKmukOI71r4Y';
+ const DEFAULT_SUPABASE_URL = 'https://fetgmcukbeetwdahrkhe.supabase.co';
+ const DEFAULT_SUPABASE_KEY = 'eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyJpc3MiOiJzdXBhYmFzZSIsInJlZiI6ImZldGdtY3VrYmVldHdkYWhya2hlIiwicm9sZSI6ImFub24iLCJpYXQiOjE3NzE2MTQ2MTQsImV4cCI6MjA4NzE5MDYxNH0.KOymOB5I05eO_MMXyVHkQ2PukXkDIbFVKmukOI71r4Y';
+
+ const SUPABASE_URL = process.env.SUPABASE_URL || DEFAULT_SUPABASE_URL;
+ const SUPABASE_KEY = process.env.SUPABASE_KEY || process.env.SUPABASE_ANON_KEY || DEFAULT_SUPABASE_KEY;
 
  module.exports = { SUPABASE_URL, SUPABASE_KEY };