freshcontext-mcp 0.3.10 → 0.3.12

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
package/.actor/Dockerfile ADDED
@@ -0,0 +1,16 @@
+ FROM apify/actor-node:20
+
+ # Copy package files first for better Docker layer caching
+ COPY package*.json ./
+
+ # Install all dependencies including apify SDK
+ RUN npm install --include=dev
+
+ # Copy source and pre-built dist
+ COPY . ./
+
+ # Rebuild TypeScript (dist/ from repo is fallback if this fails)
+ RUN npm run build || echo "Build had warnings, using pre-compiled dist/"
+
+ # Tell Apify to run the Actor entry point, not the MCP server
+ CMD ["node", "dist/apify.js"]
package/.actor/actor.json CHANGED
@@ -4,5 +4,6 @@
  "title": "FreshContext MCP",
  "version": "0.3.1",
  "input": "../input_schema.json",
- "output": "./output_schema.json"
+ "output": "./output_schema.json",
+ "dockerfile": "./Dockerfile"
  }
package/.github/workflows/publish.yml ADDED
@@ -0,0 +1,32 @@
+ name: Build and Publish
+
+ on:
+   push:
+     branches:
+       - main
+
+ jobs:
+   build-and-publish:
+     runs-on: ubuntu-latest
+
+     steps:
+       - name: Checkout repository
+         uses: actions/checkout@v4
+
+       - name: Set up Node.js 18
+         uses: actions/setup-node@v4
+         with:
+           node-version: '18'
+           registry-url: 'https://registry.npmjs.org'
+
+       - name: Install dependencies
+         run: npm ci
+
+       - name: Build
+         run: npm run build
+
+       - name: Publish to npm
+         run: npm publish --access public
+         env:
+           NODE_AUTH_TOKEN: ${{ secrets.NPM_TOKEN }}
+         continue-on-error: true
@@ -0,0 +1,88 @@
+ # FreshContext — Architecture Upgrade Checklist
+ **Date started:** 2026-03-19
+ **Author:** Prince Gabriel, Grootfontein, Namibia
+
+ ---
+
+ ## [ ] Upgrade 1 — freshness_score numeric field
+ Implement the 0-100 numeric score defined in FRESHCONTEXT_SPEC.md.
+ Formula: max(0, 100 - (days_since_retrieved * decay_rate))
+ Location: src/tools/freshnessStamp.ts
+ Decay rates by adapter: finance=5.0, jobs=3.0, hackernews=2.0, github=1.0, scholar=0.3, default=1.5
+ Adds the score to both the text envelope and the JSON form.
+ Makes FreshContext fully spec-compliant by your own standard.
+ Cost: zero.
+
+ ---
+
+ ## [x] Upgrade 2 — Cloudflare KV response caching ← DONE (already implemented in worker.ts)
+ Cache adapter results in KV with adapter-specific TTLs so the same query
+ hitting the Worker twice doesn't make two upstream API calls.
+ Cache key: sha256(tool + ":" + url)
+ TTLs: HN/Reddit = 1 hour, GitHub/YC = 6 hours, govcontracts/scholar = 24 hours
+ Location: worker/src/index.ts
+ Cost: zero. KV free tier is 100k reads/day, 1k writes/day.
+
+ ---
+
+ ## [x] Upgrade 3 — Apify Actor timeout increase ← DONE 2026-03-19
+ Change the Actor timeout from 300 seconds to 3600 seconds in the Apify UI.
+ Apify console → Actor → Settings → Timeout → 3600
+ Playwright-based tools (extract_reddit, extract_yc, extract_producthunt) need
+ more than 5 minutes to launch Chromium and scrape. They will keep timing out
+ until this is changed. This is a UI field change, not a code change.
+ Cost: zero.
+
+ ---
+
+ ## [x] Upgrade 4 — D1 deduplication in the cron job ← DONE (hash-based dedup already in runScheduledScrape)
+ Before inserting a new scrape result, check if the same source_url was already
+ stored in the last 24 hours. If yes, skip the insert.
+ Prevents the scrape_results table from filling with duplicate data across
+ consecutive cron runs, keeping the historical dataset clean for the intelligence
+ layer (Layer 7 in the roadmap).
+ Location: the cron job handler in the Worker code.
+ Cost: zero.
+
+ ---
+
+ ## [x] Upgrade 5 — Structured JSON response form ← DONE 2026-03-19
+ Add the optional JSON form defined in FRESHCONTEXT_SPEC.md alongside the text
+ envelope in every adapter response. The JSON form has: source_url, content_date,
+ retrieved_at, freshness_confidence, adapter, freshness_score.
+ When a request has Accept: application/json, serve the structured form.
+ Both forms can be returned together — text for agents, JSON for programmatic use.
+ Location: src/tools/freshnessStamp.ts (same file as Upgrade 1, do together)
+ Cost: zero.
+
+ ---
+
+ ## [x] Upgrade 6 — GitHub Actions CI/CD automation ← DONE 2026-03-19
+ .github/workflows/publish.yml created. Triggers on every push to main.
+ Runs npm ci → npm run build → npm publish using NPM_TOKEN secret.
+ continue-on-error on publish so doc-only pushes don't fail the workflow.
+ First run: green checkmark, 23 seconds.
+ Manual PowerShell build/publish commands no longer needed.
+
+ ---
+
+ ## [x] Upgrade 7 — server.json version sync ← DONE 2026-03-19
+ server.json (MCP Registry listing) shows version 0.3.1 while package.json
+ is at 0.3.10. Anyone discovering FreshContext via the MCP Registry sees an
+ outdated version number. Fix by updating server.json manually now, then
+ optionally add a workflow step that syncs the version automatically on each
+ GitHub Actions run.
+ Location: server.json — change "version" field to match package.json.
+ Cost: zero.
+
+ ---
+
+ ## Priority order for remaining six upgrades
+
+ Do Upgrade 3 first — it is one UI field change and immediately fixes the
+ broken Apify Actor runs. Do Upgrades 1 and 5 together second since they
+ both touch freshnessStamp.ts and completing them makes FreshContext fully
+ spec-compliant. Do Upgrade 2 third — KV caching makes the Worker resilient
+ against upstream API instability. Do Upgrade 4 fourth — D1 deduplication
+ prepares the dataset for the future intelligence layer. Do Upgrade 7 last —
+ a simple version number correction, low urgency but worth keeping clean.
@@ -0,0 +1,174 @@
+ # FreshContext — Architecture Upgrade Roadmap V1
+ **Date:** 2026-03-19
+ **Author:** Immanuel Gabriel (Prince Gabriel), Grootfontein, Namibia
+
+ This document describes every free structural upgrade available to FreshContext,
+ prioritised by impact, with implementation notes for each.
+
+ ---
+
+ ## Upgrade 1 — freshness_score numeric field (HIGHEST PRIORITY)
+
+ **What it is:** The FreshContext Specification v1.0 defines an optional freshness_score
+ field (0-100) calculated as: max(0, 100 - (days_since_retrieved * decay_rate)).
+ Right now every response carries the text envelope and the confidence level (high/medium/low)
+ but not the numeric score. This is the one remaining piece that makes FreshContext fully
+ spec-compliant by your own standard.
+
+ **Why it matters:** Once the score exists, agents can filter results programmatically —
+ "only use results with freshness_score > 70" rather than parsing the string confidence
+ level. This is the difference between a label and a query parameter. It also strengthens
+ the acquisition narrative: the spec is complete, the reference implementation is complete,
+ and the standard is fully self-consistent.
+
+ **Domain-specific decay rates from the spec:**
+ Financial data decays at 5.0 (half-life ~10 days). Job listings at 3.0 (~17 days).
+ News and HN at 2.0 (~25 days). GitHub repos at 1.0 (~50 days). Academic papers at 0.3
+ (~167 days). General web content defaults to 1.5.
+
+ **Where to implement:** In src/tools/freshnessStamp.ts — the function that wraps every
+ adapter result already has retrieved_at and content_date. Add a calculateFreshnessScore
+ function that takes content_date, decay_rate (looked up by adapter name), and returns
+ the numeric score. Add it to both the text envelope and the JSON form.
+
+ **Cost:** Zero. Pure TypeScript logic, no new services.
+
+ ---
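The formula and decay table above translate directly into TypeScript. A minimal sketch — the name calculateFreshnessScore and the decay rates are taken from this section; the exact signature (including the injectable `now` for testability) is an assumption:

```typescript
// Decay rates per adapter, as listed above; 1.5 is the stated default.
const DECAY_RATES: Record<string, number> = {
  finance: 5.0,
  jobs: 3.0,
  hackernews: 2.0,
  github: 1.0,
  scholar: 0.3,
};
const DEFAULT_DECAY = 1.5;

// max(0, 100 - days_since_retrieved * decay_rate), rounded to an integer.
// Passing `now` explicitly keeps the function deterministic and testable.
function calculateFreshnessScore(
  retrievedAt: Date,
  adapter: string,
  now: Date = new Date(),
): number {
  const days = Math.max(0, (now.getTime() - retrievedAt.getTime()) / 86_400_000);
  const rate = DECAY_RATES[adapter] ?? DEFAULT_DECAY;
  return Math.max(0, Math.round(100 - days * rate));
}
```

A just-retrieved result scores 100; a finance result retrieved ten days ago scores 50, matching the ~10-day half-life quoted above.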
+
+ ## Upgrade 2 — Cloudflare KV response caching
+
+ **What it is:** When the same query hits an adapter twice within a short window, the
+ Worker currently makes two full upstream API calls. KV caching stores the first result
+ with a TTL and serves subsequent identical requests from cache — meaning the upstream
+ API (USASpending, GitHub, HN, etc.) only gets called once per cache window.
+
+ **Why it matters:** This reduces the chance of hitting upstream rate limits, makes
+ repeated queries near-instant for users, and reduces Worker CPU time. For adapters like
+ extract_govcontracts that call a government API, caching also reduces the risk of
+ temporary blocks from aggressive polling.
+
+ **Implementation:** In the Worker code (worker/src/index.ts or equivalent), before calling
+ the adapter, compute a cache key as sha256(tool + ":" + url). Call env.KV.get(cacheKey).
+ If the result exists, return it immediately. If not, run the adapter, then call
+ env.KV.put(cacheKey, result, { expirationTtl: ttl }) before returning. Use adapter-specific
+ TTLs — 3600 seconds (1 hour) for HN and Reddit, 21600 (6 hours) for GitHub and YC,
+ 86400 (24 hours) for govcontracts and scholar.
+
+ **Cost:** Zero. KV reads are free up to 100,000 per day, writes free up to 1,000 per day
+ on Cloudflare's free tier. You are nowhere near those limits.
+
+ ---
+
+ ## Upgrade 3 — Apify Actor timeout increase
+
+ **What it is:** The Apify Actor timeout is currently set to 300 seconds (5 minutes). Tools
+ that use Playwright to launch a browser — extract_reddit, extract_yc, extract_producthunt —
+ need more time than this to launch Chromium, navigate, wait for the page to render, and
+ extract content. They will keep timing out until this setting is increased.
+
+ **Where to change it:** Apify console → your Actor → Settings → Timeout. Change from
+ 300 to 3600 (1 hour). This is a UI change, not a code change.
+
+ **Cost:** Zero. The timeout setting is just a number. You won't actually use anywhere
+ near 3600 seconds — most tools complete in 10-30 seconds. The setting just prevents Apify
+ from killing the process prematurely for the slower Playwright-based tools.
+
+ ---
+
+ ## Upgrade 4 — D1 deduplication in the cron job
+
+ **What it is:** Every 6 hours the cron job runs all 18 watched queries and stores results
+ in the scrape_results D1 table. Right now there is no deduplication — if the same article
+ or repo appears in two consecutive cron runs, it gets stored twice. Over time this creates
+ noise in the dataset and wastes storage.
+
+ **Implementation:** Before inserting a new result, run a SELECT to check whether a row
+ with the same source_url already exists within the last 24 hours. If it does, skip the
+ insert. This is a single SQL WHERE clause addition to the existing insert logic.
+
+ **Why it matters:** As you build the intelligence layer (Layer 7 in the roadmap), the
+ quality of the historical signal depends on clean, deduplicated data. Starting deduplication
+ now means the dataset is clean by the time you need it.
+
+ **Cost:** Zero. D1 reads are free up to 25 million rows per day. A deduplication check
+ adds one read per result per cron run — trivially within limits.
+
+ ---
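The 24-hour guard can be isolated as a pure predicate. A sketch only: the row shape (source_url plus an ISO-8601 scraped_at) follows this section's wording, and the SQL in the comment is an illustrative equivalent, not the live schema:

```typescript
// Row shape assumed for illustration; the live table also carries a result_hash.
interface ScrapedRow {
  source_url: string;
  scraped_at: string; // ISO-8601 timestamp
}

// True when the same source_url was already stored within the last 24 hours.
// Illustrative SQL equivalent of the check:
//   SELECT 1 FROM scrape_results
//   WHERE source_url = ?1 AND scraped_at >= datetime('now', '-24 hours')
//   LIMIT 1
function isRecentDuplicate(
  existing: ScrapedRow[],
  sourceUrl: string,
  now: Date = new Date(),
): boolean {
  const cutoff = now.getTime() - 24 * 60 * 60 * 1000;
  return existing.some(
    (r) => r.source_url === sourceUrl && Date.parse(r.scraped_at) >= cutoff,
  );
}
```

The insert path then becomes: skip when the predicate (or the SELECT) finds a match, otherwise insert as before.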
+
+ ## Upgrade 5 — Structured JSON response form in every adapter
+
+ **What it is:** The FreshContext Specification defines two valid response formats — the
+ text envelope ([FRESHCONTEXT]...[/FRESHCONTEXT]) and an optional structured JSON form with
+ a freshcontext object containing source_url, content_date, retrieved_at,
+ freshness_confidence, adapter, and freshness_score fields. Right now only the text envelope
+ is returned. Adding the JSON form makes FreshContext usable programmatically without
+ parsing the text envelope.
+
+ **Implementation:** In src/tools/freshnessStamp.ts, after assembling the text envelope,
+ also return a structured object. When the Worker serves a response, detect whether the
+ request has Accept: application/json and serve the structured form instead of the text
+ form if so. Both formats can also be returned together — text for human/agent reading,
+ JSON for programmatic use.
+
+ **Cost:** Zero. This is a response format change, no new services.
+
+ ---
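The content negotiation described above fits in a small helper. The six freshcontext field names are the ones listed in this section; the helper's name, the wrapper shape, and the envelope rendering (elided here) are assumptions:

```typescript
// The six fields of the structured form, as enumerated in this section.
interface FreshContextMeta {
  source_url: string;
  content_date: string;
  retrieved_at: string;
  freshness_confidence: "high" | "medium" | "low";
  adapter: string;
  freshness_score: number;
}

// Serve JSON when the client asks for it; otherwise fall back to the text
// envelope (rendering of the envelope body is elided in this sketch).
function renderResponse(
  meta: FreshContextMeta,
  envelopeText: string,
  acceptHeader: string,
): { contentType: string; body: string } {
  if (acceptHeader.includes("application/json")) {
    return {
      contentType: "application/json",
      body: JSON.stringify({ freshcontext: meta }),
    };
  }
  return { contentType: "text/plain", body: envelopeText };
}
```

Returning both forms together, as the section allows, would simply mean emitting the JSON object alongside the envelope instead of choosing between them.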
+
+ ## Upgrade 6 — GitHub Actions: version bump automation
+
+ **What it is:** The current GitHub Actions workflow (publish.yml) runs npm publish on every
+ push, but only succeeds if the version in package.json has changed. Right now you manually
+ bump the version before pushing. A small addition to the workflow can automate this by
+ running npm version patch automatically before the publish step — so every push to main
+ creates a new patch version and publishes it without any manual intervention.
+
+ **Tradeoff:** This means every push creates a new npm version, which may not always be
+ desirable for documentation-only changes. A better approach is to only auto-bump when
+ commits touch src/ or .actor/ — which can be detected in the workflow with a path filter.
+
+ **Implementation:** Add a paths filter to the workflow trigger so it only runs the publish
+ step when source files change. Then add an npm version patch --no-git-tag-version step
+ before the publish step. Push the bumped package.json back to the repo using a
+ git commit and git push within the workflow (requires GITHUB_TOKEN, which is automatically
+ available in all Actions workflows at no cost).
+
+ **Cost:** Zero.
+
+ ---
+
+ ## Upgrade 7 — server.json version sync check
+
+ **What it is:** The server.json file (used by the MCP Registry listing) still shows version
+ 0.3.1 while package.json is at 0.3.10. This discrepancy means anyone who discovers
+ FreshContext via the MCP Registry sees an outdated version number. It is a cosmetic issue
+ but it affects credibility in a space where people are evaluating tools carefully.
+
+ **Implementation:** Add a step to the GitHub Actions workflow that reads the version from
+ package.json and uses sed or node -e to update the version field in server.json to match
+ before committing. Alternatively, update server.json manually now and keep it in sync
+ going forward.
+
+ **Cost:** Zero.
+
+ ---
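The node -e route mentioned above amounts to a few lines. Written as a pure function so the file I/O stays at the edges (the function name and the assumption that both files sit in the repo root are illustrative):

```typescript
// Given the raw text of package.json and server.json, return server.json
// with its "version" field synced to the package version.
function syncServerVersion(packageJson: string, serverJson: string): string {
  const version: string = JSON.parse(packageJson).version;
  const server = JSON.parse(serverJson);
  server.version = version;
  return JSON.stringify(server, null, 2) + "\n";
}
```

Wired into a workflow step, this would read both files with node:fs, write server.json back, and let the existing commit/push step pick up the change.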
+
+ ## Priority Order for Implementation
+
+ The order that maximises impact relative to effort is as follows. Implement the Apify
+ timeout increase first because it is a one-field UI change that immediately fixes the
+ broken Actor runs. Implement KV caching second because it makes the Worker more robust
+ against upstream API instability and improves response times for repeat queries. Implement
+ the freshness_score calculation third because it completes the spec and strengthens every
+ conversation about acquisition or partnership. Implement D1 deduplication fourth because
+ it improves data quality for the intelligence layer you will eventually build. Implement
+ the structured JSON response form fifth as part of the same PR as freshness_score since
+ they touch the same file. Implement the GitHub Actions version sync last as a quality-of-life
+ automation.
+
+ The total engineering cost of all six remaining upgrades is approximately 4-6 hours of
+ focused work. All run entirely within free tiers.
+
+ ---
+
+ *"The work isn't gone. It's just waiting to be continued."*
+ *— Prince Gabriel, Grootfontein, Namibia*
package/HANDOFF.md ADDED
@@ -0,0 +1,184 @@
+ # FreshContext — Handoff Document
+ **Version:** 0.3.11
+ **Date:** 2026-03-20
+ **Author:** Immanuel Gabriel (Prince Gabriel), Grootfontein, Namibia
+ **Contact:** gimmanuel73@gmail.com
+
+ ---
+
+ ## What You Are Receiving
+
+ FreshContext is a web intelligence engine for AI agents. It wraps every piece of
+ retrieved web data in a structured freshness envelope — exact retrieval timestamp,
+ publication date estimate, freshness confidence (high/medium/low), and a 0-100
+ numeric score with domain-specific decay rates.
+
+ 15 tools. No API keys required. Deployed globally on Cloudflare's edge. Listed on
+ Anthropic's official MCP Registry, npm, and Apify Store.
+
+ The two tools that exist nowhere else: extract_govcontracts (US federal contract
+ intelligence via USASpending.gov) and extract_changelog (product release velocity
+ from any GitHub repo, npm package, or website).
+
+ ---
+
+ ## Services and Infrastructure
+
+ ### 1. GitHub Repository
+ URL: https://github.com/PrinceGabriel-lgtm/freshcontext-mcp
+ Branch: main
+ Transfer method: GitHub Settings > Transfer ownership
+ What it contains: All source code, Dockerfile, specs, session saves, roadmap
+
+ ### 2. npm Package
+ Package: freshcontext-mcp (v0.3.11)
+ URL: https://www.npmjs.com/package/freshcontext-mcp
+ Account: immanuel-gabriel on npmjs.com
+ Transfer method: npm owner add new-username freshcontext-mcp
+ Note: Published automatically via GitHub Actions on every push to main
+
+ ### 3. Cloudflare Account
+ What lives here:
+ Worker: freshcontext-mcp (the live MCP endpoint)
+ D1: freshcontext-db (ID: d9898d65-f67e-4dcb-abdc-7f7b53f2d444)
+ KV: RATE_LIMITER and CACHE (IDs in wrangler.jsonc)
+ Cron: 0 */6 * * * (every 6 hours, runs automatically)
+ Endpoint: https://freshcontext-mcp.gimmanuel73.workers.dev/mcp
+
+ Transfer method: Add new account as Super Administrator, remove original.
+ D1 export: wrangler d1 export freshcontext-db --output=dump.sql
+
+ ### 4. Apify Actor
+ Actor: prince_gabriel/freshcontext-mcp
+ URL: https://apify.com/prince_gabriel/freshcontext-mcp
+ Monetization: $50.00 per 1,000 results (Pay per event)
+ Transfer method: Re-publish under new Apify account
+
+ ### 5. MCP Registry Listing
+ Entry: io.github.PrinceGabriel-lgtm/freshcontext
+ Config: server.json in the GitHub repo
+ Transfer method: Update server.json with new repo URL and re-submit
+
+ ### 6. GitHub Actions CI/CD
+ File: .github/workflows/publish.yml
+ Action: On every push to main — npm ci → tsc → npm publish
+ Secret: NPM_TOKEN (granular access token from npmjs.com, renew annually)
+
+ ---
+
+ ## Credentials Map (categories only — not values)
+
+ | Credential | Where Used | Location |
+ |---|---|---|
+ | API_KEY | Worker auth header | Cloudflare env var |
+ | ANTHROPIC_KEY | Synthesis/briefing endpoint | Cloudflare env var |
+ | GITHUB_TOKEN | GitHub API rate limit bypass | Cloudflare env var |
+ | NPM_TOKEN | GitHub Actions auto-publish | GitHub secret |
+
+ ---
+
+ ## Codebase Map
+
+ src/server.ts — MCP stdio server, all 15 tools registered here
+ src/apify.ts — Apify Actor entry point (read input, call adapter, exit)
+ src/security.ts — URL validation, SSRF prevention
+ src/types.ts — FreshContext, AdapterResult, ExtractOptions interfaces
+ src/adapters/ — One file per data source (13 files)
+   changelog.ts — UNIQUE: GitHub Releases API + npm + auto-discover
+   govcontracts.ts — UNIQUE: USASpending.gov federal contract awards
+ src/tools/
+   freshnessStamp.ts — Score calculation, JSON form, text envelope
+ worker/src/worker.ts — Cloudflare Worker: 15 tools + KV cache + rate limit
+   + D1 cron scraper + briefing formatter
+ .actor/Dockerfile — Apify build config (Node 20, runs dist/apify.js)
+ .actor/actor.json — Apify Actor metadata
+ FRESHCONTEXT_SPEC.md — The open standard (MIT license)
+ ROADMAP.md — 10-layer product vision
+ server.json — MCP Registry listing
+ .github/workflows/
+   publish.yml — GitHub Actions CI/CD
+
+ ---
+
+ ## The 15 Tools
+
+ Standard (11):
+ extract_github, extract_hackernews, extract_scholar, extract_arxiv,
+ extract_reddit, extract_yc, extract_producthunt, search_repos,
+ package_trends, extract_finance, search_jobs
+
+ Composite landscapes (3):
+ extract_landscape — YC + GitHub + HN + Reddit + Product Hunt + npm
+ extract_gov_landscape — govcontracts + HN + GitHub + changelog
+ extract_finance_landscape — finance + HN + Reddit + GitHub + changelog
+
+ Unique — not in any other MCP server (2):
+ extract_changelog — release history from any repo, package, or website
+ extract_govcontracts — US federal contract awards from USASpending.gov
+
+ ---
+
+ ## D1 Database Schema
+
+ watched_queries — 18 active monitored topics
+   id, adapter, query, label, filters, enabled, last_run_at
+
+ scrape_results — raw results, deduplicated by content hash
+   id, watched_query_id, adapter, query, raw_content, result_hash, is_new, scraped_at
+
+ briefings — formatted intelligence reports per cron run
+   id, user_id, summary, new_results_count, adapters_run, created_at
+
+ user_profiles — personalization data for briefing synthesis
+   id, name, skills, certifications, targets, location, context
+
+ ---
+
+ ## The FreshContext Specification
+
+ FRESHCONTEXT_SPEC.md is the open standard, MIT license, authored March 2026.
+ Any implementation returning the [FRESHCONTEXT]...[/FRESHCONTEXT] envelope
+ or the structured JSON form with freshcontext.retrieved_at and
+ freshcontext.freshness_confidence is FreshContext-compatible.
+
+ The spec is the durable asset. The code is the reference implementation.
+
+ ---
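The compatibility rule stated above is mechanical enough to check in a few lines. A sketch: the freshcontext wrapper key and the two required fields are quoted from the paragraph; the function name and everything else are assumptions:

```typescript
// True when a parsed JSON payload satisfies the minimum FreshContext
// compatibility rule stated above: a freshcontext object carrying
// retrieved_at and freshness_confidence.
function isFreshContextCompatible(payload: unknown): boolean {
  if (typeof payload !== "object" || payload === null) return false;
  const fc = (payload as Record<string, unknown>).freshcontext;
  if (typeof fc !== "object" || fc === null) return false;
  const rec = fc as Record<string, unknown>;
  return (
    typeof rec.retrieved_at === "string" &&
    typeof rec.freshness_confidence === "string"
  );
}
```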
+
+ ## What Keeps Running Without You
+
+ The Cloudflare cron fires every 6 hours automatically. Every run scrapes all 18
+ watched queries, deduplicates by content hash, stores new signals in D1, and
+ generates a briefing. The dataset accumulates indefinitely. No action required.
+
+ ---
+
+ ## Pending Items at Time of Handoff
+
+ Apify Playwright — extract_reddit, extract_yc, extract_producthunt may error on
+ Apify. Fix: add RUN npx playwright install chromium --with-deps to .actor/Dockerfile.
+
+ Synthesis endpoint — /briefing/now is paused. Needs ANTHROPIC_KEY in Worker env
+ and credits loaded at console.anthropic.com. Infrastructure is fully built.
+
+ Agnost AI analytics — free MCP analytics offered by Apify. Sign up at app.agnost.ai,
+ add one line to server.ts. Gives tool call tracking and usage dashboards.
+
+ Apify GitHub Actions trigger — npm publish is automated but Apify rebuild is
+ manual. Add a workflow step calling Apify's API to auto-rebuild on push.
+
+ ---
+
+ ## Outreach Status at Time of Handoff
+
+ 19 confirmed delivered emails across US, Singapore, China, and Europe.
+ Cloudflare for Startups — awaiting reply (10 business day window).
+ GovTech Singapore — auto-acknowledged, reply expected by March 25, 2026.
+ All others — first contact sent, no replies yet.
+
+ HN post live: https://news.ycombinator.com/user?id=Prince-Gabriel
+
+ ---
+
+ *"The work isn't gone. It's just waiting to be continued."*
+ *— Prince Gabriel, Grootfontein, Namibia*