datasette-ts 0.0.19 → 0.0.20

package/README.md CHANGED
@@ -26,7 +26,14 @@ datasette-ts inspect ./my.db --inspect-file inspect-data.json
 
 ## Deploy to Cloudflare
 
-Prereqs: Alchemy configured (`npx alchemy login`)
+First, set up Alchemy ([docs](https://alchemy.run/getting-started)):
+
+```bash
+npx alchemy configure
+npx alchemy login
+```
+
+Then deploy:
 
 ```bash
 # Deploy with name derived from filename
@@ -35,8 +42,11 @@ datasette-ts deploy cloudflare ./my.db
 # Deploy with explicit name
 datasette-ts deploy cloudflare ./my.db --name my-app
 
-# Deploy with specific Cloudflare profile
+# Deploy with a specific Alchemy profile
 datasette-ts deploy cloudflare ./my.db --name my-app --profile prod
+
+# Deploy with metadata (robots settings, cache settings, etc.)
+datasette-ts deploy cloudflare ./my.db --metadata datasette.yml
 ```
 
 This creates a Cloudflare Worker and D1 database with your data.
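The new `--metadata datasette.yml` flag is shown above only by name. As a sketch, assuming datasette-ts accepts Datasette-style metadata as YAML, a minimal `datasette.yml` enabling the robots plugin configured elsewhere in this README might look like:

```yaml
# Hypothetical datasette.yml — mirrors the JSON "datasette-block-robots"
# plugin config documented in this README; not verified against 0.0.20.
plugins:
  datasette-block-robots:
    allow_only_index: true
```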
@@ -50,6 +60,46 @@ datasette-ts inspect --help
 datasette-ts deploy cloudflare --help
 ```
 
+## HTTP caching
+
+Responses default to `Cache-Control: max-age=5`. You can override this
+default with a setting, or per request with the `_ttl` query string
+parameter:
+
+```bash
+# Set the default max-age for all responses (in seconds)
+datasette-ts serve ./my.db --setting default_cache_ttl 60
+
+# Override cache behavior for a single request
+curl "http://127.0.0.1:8001/?_ttl=0"
+curl "http://127.0.0.1:8001/?_ttl=120"
+```
+
+- `default_cache_ttl` of `0` disables caching by sending `Cache-Control: no-store`.
+- `_ttl=0` disables caching for that request; `_ttl=<seconds>` sets `max-age`.
+
+## Robots.txt
+
+Serves a robots.txt that blocks crawlers by default. Configure it
+via metadata:
+
+```json
+{
+  "plugins": {
+    "datasette-block-robots": {
+      "allow_only_index": true
+    }
+  }
+}
+```
+
+Options:
+- `allow_only_index`: allow indexing the homepage while blocking each database path.
+- `disallow`: custom list of paths to disallow, e.g. `["/mydb/mytable"]`.
+- `literal`: full robots.txt contents (overrides all other settings).
+
+The `/robots.txt` endpoint returns the generated file as `text/plain`.
+
 ## Build
 
 ```bash
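The HTTP caching rule documented in this release (a built-in default of `max-age=5`, a `default_cache_ttl` setting, and a per-request `_ttl` override where `0` means `no-store`) can be sketched as a small pure function. This is an illustration of the described behavior, not the package's actual implementation, and the function name is invented:

```typescript
// Sketch of the documented caching rule: a per-request _ttl overrides
// default_cache_ttl, and a TTL of 0 disables caching via no-store.
function cacheControl(defaultTtl: number, ttlParam: string | null): string {
  const ttl = ttlParam !== null ? Number(ttlParam) : defaultTtl;
  return ttl === 0 ? "no-store" : `max-age=${ttl}`;
}

console.log(cacheControl(5, null));  // → max-age=5 (built-in default)
console.log(cacheControl(60, "0"));  // → no-store (_ttl=0 for one request)
console.log(cacheControl(5, "120")); // → max-age=120 (_ttl=120)
```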