react-hook-eslint 1.0.1

package/docs/help.md ADDED
# Help

* [Log rotation](#rotate)
* [Reopening log files](#reopening)
* [Saving to multiple files](#multiple)
* [Log filtering](#filter-logs)
* [Transports and systemd](#transport-systemd)
* [Log to different streams](#multi-stream)
* [Duplicate keys](#dupe-keys)
* [Log levels as labels instead of numbers](#level-string)
* [Pino with `debug`](#debug)
* [Unicode and Windows terminal](#windows)
* [Mapping Pino Log Levels to Google Cloud Logging (Stackdriver) Severity Levels](#stackdriver)
* [Using Grafana Loki to evaluate pino logs in a Kubernetes cluster](#grafana-loki)
* [Avoid Message Conflict](#avoid-message-conflict)
* [Best performance for logging to `stdout`](#best-performance-for-stdout)
* [Testing](#testing)

<a id="rotate"></a>
## Log rotation

Use a separate tool for log rotation; we recommend [logrotate](https://github.com/logrotate/logrotate).
Consider an application that writes its logs to `/var/log/myapp.log` like so:

```
$ node server.js > /var/log/myapp.log
```

We would rotate our log files with logrotate by adding the following to `/etc/logrotate.d/myapp`:

```
/var/log/myapp.log {
  su root
  daily
  rotate 7
  delaycompress
  compress
  notifempty
  missingok
  copytruncate
}
```

The `copytruncate` configuration has a very slight possibility of losing log lines due
to the gap between copying and truncating: additional lines may be written after the
copy but before the truncate. To perform log rotation without `copytruncate`, see the
[Reopening log files](#reopening) help.

<a id="reopening"></a>
## Reopening log files

In cases where a log rotation tool doesn't offer copy-truncate capabilities,
or where using them is deemed inappropriate, `pino.destination`
can reopen file paths after a file has been moved away.

One way to use this is to set up a `SIGUSR2` or `SIGHUP` signal handler that
reopens the log file destination, making sure to write the process PID out
somewhere so the log rotation tool knows where to send the signal.

```js
const fs = require('node:fs')
const pino = require('pino')

// write the process pid to a well-known location for later;
// fs.writeFileSync expects a string or buffer, so convert the pid
fs.writeFileSync('/var/run/myapp.pid', String(process.pid))

const dest = pino.destination('/log/file')
const logger = pino(dest)
process.on('SIGHUP', () => dest.reopen())
```

The log rotation tool can then be configured to send this signal to the process
after a log rotation event has occurred.

Given a similar scenario as in the [Log rotation](#rotate) section, a basic
`logrotate` config that aligns with this strategy would look similar to the following:

```
/var/log/myapp.log {
  su root
  daily
  rotate 7
  delaycompress
  compress
  notifempty
  missingok
  postrotate
    kill -HUP `cat /var/run/myapp.pid`
  endscript
}
```

<a id="multiple"></a>
## Saving to multiple files

See [`pino.multistream`](/docs/api.md#pino-multistream).

<a id="filter-logs"></a>
## Log Filtering

The Pino philosophy advocates the use of common, preexisting system utilities.

Some recommendations in line with this philosophy are:

1. Use [`grep`](https://linux.die.net/man/1/grep):
   ```sh
   $ # View all "INFO" level logs
   $ node app.js | grep '"level":30'
   ```
1. Use [`jq`](https://stedolan.github.io/jq/):
   ```sh
   $ # View all "ERROR" level logs
   $ node app.js | jq 'select(.level == 50)'
   ```

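When neither utility is available, the same selection is a few lines of Node. The sketch below is an illustration in this spirit, not part of Pino itself:

```js
// Sketch: select NDJSON log lines at a given numeric level,
// mirroring `jq 'select(.level == 50)'`. Illustration only.
function selectLevel (ndjson, level) {
  return ndjson
    .split('\n')
    .filter(Boolean)
    .filter((line) => JSON.parse(line).level === level)
}

const logs = [
  '{"level":30,"msg":"listening"}',
  '{"level":50,"msg":"boom"}'
].join('\n')

console.log(selectLevel(logs, 50)) // the single level-50 line
```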
<a id="transport-systemd"></a>
## Transports and systemd

`systemd` makes it complicated to use pipes in services. One method for overcoming
this challenge is to use a subshell:

```
ExecStart=/bin/sh -c '/path/to/node app.js | pino-transport'
```

<a id="multi-stream"></a>
## Log to different streams

Pino's default log destination is the singular destination of `stdout`. While
not recommended for performance reasons, multiple destinations can be targeted
by using [`pino.multistream`](/docs/api.md#pino-multistream).

In this example, we use `stderr` for `error` level logs and `stdout` as default
for all other levels (e.g. `debug`, `info`, and `warn`).

```js
const pino = require('pino')
const streams = [
  {level: 'debug', stream: process.stdout},
  {level: 'error', stream: process.stderr},
  {level: 'fatal', stream: process.stderr}
]

const logger = pino({
  name: 'my-app',
  level: 'debug' // must be the lowest level of all streams
}, pino.multistream(streams))
```

<a id="dupe-keys"></a>
## How Pino handles duplicate keys

Duplicate keys are possible when a child logger logs an object with a key that
collides with a key in the child logger's bindings.

See the [child logger duplicate keys caveat](/docs/child-loggers.md#duplicate-keys-caveat)
for information on how this is handled.

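As a pure-JavaScript illustration of why such collisions matter (this is not Pino's own code): when a serialized log line contains the same key twice, `JSON.parse` keeps the last occurrence, so one value silently shadows the other for any downstream consumer:

```js
// A log line with a duplicate "name" key, as can happen when a child
// logger's bindings and the logged object collide. Illustration only;
// Pino's actual handling is described in the caveat linked above.
const line = '{"level":30,"name":"from-bindings","name":"from-log-call","msg":"hi"}'

// JSON.parse resolves duplicate keys by keeping the last occurrence
const parsed = JSON.parse(line)
console.log(parsed.name) // 'from-log-call'
```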
<a id="level-string"></a>
## Log levels as labels instead of numbers

Pino log lines are meant to be parsable. Thus, Pino's default mode of operation
is to print the level value instead of the string name.
However, you can use the [`formatters`](/docs/api.md#formatters-object) option
with a [`level`](/docs/api.md#level) function to print the string name instead of the level value:

```js
const pino = require('pino')

const log = pino({
  formatters: {
    level: (label) => {
      return { level: label }
    }
  }
})

log.info('message')

// {"level":"info","time":1661632832200,"pid":18188,"hostname":"foo","msg":"message"}
```

Although it works, we recommend using one of these options instead if you are able:

1. If the only change desired is the name, then a transport can be used. One such
   transport is [`pino-text-level-transport`](https://npm.im/pino-text-level-transport).
1. Use a prettifier like [`pino-pretty`](https://npm.im/pino-pretty) to make
   the logs human friendly.

<a id="debug"></a>
## Pino with `debug`

The popular [`debug`](https://npm.im/debug) module is used in many modules across the ecosystem.

The [`pino-debug`](https://github.com/pinojs/pino-debug) module
can capture calls to `debug` loggers and run them
through `pino` instead. This results in a 10x (20x in asynchronous mode)
performance improvement, even though `pino-debug` is logging additional
data and wrapping it in JSON.

To quickly enable this, install [`pino-debug`](https://github.com/pinojs/pino-debug)
and preload it with the `-r` flag, enabling any `debug` logs with the
`DEBUG` environment variable:

```sh
$ npm i pino-debug
$ DEBUG=* node -r pino-debug app.js
```

[`pino-debug`](https://github.com/pinojs/pino-debug) also offers fine-grained control to map specific `debug`
namespaces to `pino` log levels. See [`pino-debug`](https://github.com/pinojs/pino-debug)
for more.

<a id="windows"></a>
## Unicode and Windows terminal

Pino uses [sonic-boom](https://github.com/mcollina/sonic-boom) to speed
up logging. Internally, it uses [`fs.write`](https://nodejs.org/dist/latest-v10.x/docs/api/fs.html#fs_fs_write_fd_string_position_encoding_callback) to write log lines directly to a file
descriptor. On Windows, Unicode output is not handled properly in the
terminal (both `cmd.exe` and PowerShell), and as such the output could
be visualized incorrectly if the log lines include UTF-8 characters. It
is possible to configure the terminal to visualize those characters
correctly by executing `chcp 65001` in the terminal (see
[`chcp`](https://ss64.com/nt/chcp.html)). This is a known limitation of
Node.js.

<a id="stackdriver"></a>
## Mapping Pino Log Levels to Google Cloud Logging (Stackdriver) Severity Levels

Google Cloud Logging uses `severity` levels instead of log levels. As a result, all logs may show as INFO
level logs while completely ignoring the level set in the pino log. Google Cloud Logging also prefers that
log data is present inside a `message` key instead of the default `msg` key that Pino uses. Use a technique
similar to the one below to retain log levels in Google Cloud Logging:

```js
const pino = require('pino')

// https://cloud.google.com/logging/docs/reference/v2/rest/v2/LogEntry#logseverity
const PinoLevelToSeverityLookup = {
  trace: 'DEBUG',
  debug: 'DEBUG',
  info: 'INFO',
  warn: 'WARNING',
  error: 'ERROR',
  fatal: 'CRITICAL'
}

const defaultPinoConf = {
  messageKey: 'message',
  formatters: {
    level (label, number) {
      return {
        severity: PinoLevelToSeverityLookup[label] || PinoLevelToSeverityLookup.info,
        level: number
      }
    }
  }
}

module.exports = function createLogger (options) {
  return pino(Object.assign({}, options, defaultPinoConf))
}
```

A library that configures Pino for
[Google Cloud Structured Logging](https://cloud.google.com/logging/docs/structured-logging)
is available at
[@google-cloud/pino-logging-gcp-config](https://www.npmjs.com/package/@google-cloud/pino-logging-gcp-config).

This library has the following features:

+ Converts Pino log levels to Google Cloud Logging log levels, as above.
+ Uses `message` instead of `msg` for the message key, as above.
+ Adds a millisecond-granularity timestamp in the
  [structure](https://cloud.google.com/logging/docs/agent/logging/configuration#timestamp-processing)
  recognised by Google Cloud Logging, e.g.
  `"timestamp":{"seconds":1445470140,"nanos":123000000}`.
+ Adds a sequential
  [`insertId`](https://cloud.google.com/logging/docs/reference/v2/rest/v2/LogEntry#FIELDS.insert_id)
  to ensure log messages with identical timestamps are ordered correctly.
+ Logs including an `Error` object have the
  [`stack_trace`](https://cloud.google.com/error-reporting/docs/formatting-error-messages#log-error)
  property set so that the error is forwarded to Google Cloud Error Reporting.
+ Includes a
  [`ServiceContext`](https://cloud.google.com/error-reporting/reference/rest/v1beta1/ServiceContext)
  object in the logs for Google Cloud Error Reporting, auto-detected from the
  environment if not specified.
+ Maps the OpenTelemetry properties `span_id`, `trace_id`, and `trace_flags`
  to the equivalent Google Cloud Logging fields.

<a id="grafana-loki"></a>
## Using Grafana Loki to evaluate pino logs in a Kubernetes cluster

To get pino logs into Grafana Loki there are two options:

1. **Push:** Use [pino-loki](https://github.com/Julien-R44/pino-loki) to send logs directly to Loki.
1. **Pull:** Configure Grafana Promtail to read and properly parse the logs before sending them to Loki.
   Similar to Google Cloud Logging, this involves remapping the log levels. See this [article](https://medium.com/@janpaepke/structured-logging-in-the-grafana-monitoring-stack-8aff0a5af2f5) for details.

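The remapping itself is small. A sketch of the numeric-level-to-label translation follows; the label names are illustrative, not a Promtail or Loki requirement:

```js
// Sketch: translate Pino's default numeric levels to the string labels
// commonly used when querying in Loki. The label set is illustrative.
const levelLabels = {
  10: 'trace',
  20: 'debug',
  30: 'info',
  40: 'warn',
  50: 'error',
  60: 'fatal'
}

function lokiLevel (pinoLevel) {
  return levelLabels[pinoLevel] || 'unknown'
}

console.log(lokiLevel(30)) // 'info'
```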
<a id="avoid-message-conflict"></a>
## Avoid Message Conflict

As described in the [`message` documentation](/docs/api.md#message), when a log
is written like `log.info({ msg: 'a message' }, 'another message')` then the
final output JSON will have `"msg":"another message"` and the `'a message'`
string will be lost. To overcome this, the [`logMethod` hook](/docs/api.md#logmethod)
can be used:

```js
'use strict'

const log = require('pino')({
  level: 'debug',
  hooks: {
    logMethod (inputArgs, method) {
      if (inputArgs.length === 2 && inputArgs[0].msg) {
        inputArgs[0].originalMsg = inputArgs[0].msg
      }
      return method.apply(this, inputArgs)
    }
  }
})

log.info('no original message')
log.info({ msg: 'mapped to originalMsg' }, 'a message')

// {"level":30,"time":1596313323106,"pid":63739,"hostname":"foo","msg":"no original message"}
// {"level":30,"time":1596313323107,"pid":63739,"hostname":"foo","msg":"a message","originalMsg":"mapped to originalMsg"}
```

<a id="best-performance-for-stdout"></a>
## Best performance for logging to `stdout`

The best performance for logging directly to stdout is _usually_ achieved by using the
default configuration:

```js
const log = require('pino')()
```

You should only need to configure custom transports or other settings
if you have broader logging requirements.

<a id="testing"></a>
## Testing

See [`pino-test`](https://github.com/pinojs/pino-test).
package/docs/lts.md ADDED
## Long Term Support

Pino's Long Term Support (LTS) is provided according to the schedule laid
out in this document:

1. Major releases, the "X" in [semantic versioning][semver] X.Y.Z release
   versions, are supported for a minimum period of six months from their release
   date. The release date of any specific version can be found at
   [https://github.com/pinojs/pino/releases](https://github.com/pinojs/pino/releases).

1. Major releases will receive security updates for an additional six months
   from the release of the next major release. After this period
   we will still review and release security fixes as long as they are
   provided by the community and they do not violate other constraints,
   e.g. the minimum supported Node.js version.

1. Major releases will be tested and verified against all Node.js
   release lines that are supported by the
   [Node.js LTS policy](https://github.com/nodejs/Release) within the
   LTS period of that given Pino release line. This implies that only
   the latest Node.js release of a given line is supported.

A "month" is defined as 30 consecutive days.

> ## Security Releases and Semver
>
> As a consequence of providing long-term support for major releases, there
> are occasions where we need to release breaking changes as a _minor_
> version release. Such changes will _always_ be noted in the
> [release notes](https://github.com/pinojs/pino/releases).
>
> To avoid automatically receiving breaking security updates it is possible to use
> the tilde (`~`) range qualifier. For example, to get patches for the 6.1
> release while avoiding automatic updates to the next minor release, specify
> the dependency as `"pino": "~6.1.x"`. This can leave your application vulnerable,
> so please use it with caution.

[semver]: https://semver.org/

<a name="lts-schedule"></a>

### Schedule

| Version | Release Date | End Of LTS Date | Node.js        |
| :------ | :----------- | :-------------- | :------------- |
| 9.x     | 2024-04-26   | TBD             | 18, 20, 22     |
| 8.x     | 2022-06-01   | 2024-10-26      | 14, 16, 18, 20 |
| 7.x     | 2021-10-14   | 2023-06-01      | 12, 14, 16     |
| 6.x     | 2020-03-07   | 2022-04-14      | 10, 12, 14, 16 |

<a name="supported-os"></a>

### CI tested operating systems

Pino uses GitHub Actions for CI testing. Please refer to
[GitHub's documentation regarding workflow runners](https://docs.github.com/en/actions/using-github-hosted-runners/about-github-hosted-runners#supported-runners-and-hardware-resources)
for details on what the latest virtual environment is in relation to
the YAML workflow labels below:

| OS      | YAML Workflow Label | Node.js    |
| ------- | ------------------- | ---------- |
| Linux   | `ubuntu-latest`     | 18, 20, 22 |
| Windows | `windows-latest`    | 18, 20, 22 |
| MacOS   | `macos-latest`      | 18, 20, 22 |
package/docs/pretty.md ADDED
# Pretty Printing

By default, Pino log lines are newline delimited JSON (NDJSON). This is perfect
for production usage and long-term storage. It's not so great for development
environments. Thus, Pino logs can be prettified by using a Pino prettifier
module like [`pino-pretty`][pp]:

1. Install a prettifier module as a separate dependency, e.g. `npm install pino-pretty`.
2. Instantiate the logger with the `transport.target` option set to `'pino-pretty'`:
   ```js
   const pino = require('pino')
   const logger = pino({
     transport: {
       target: 'pino-pretty'
     }
   })

   logger.info('hi')
   ```
3. The `transport` option can also have an `options` object containing `pino-pretty` options:
   ```js
   const pino = require('pino')
   const logger = pino({
     transport: {
       target: 'pino-pretty',
       options: {
         colorize: true
       }
     }
   })

   logger.info('hi')
   ```

[pp]: https://github.com/pinojs/pino-pretty
# Redaction

> Redaction is not supported in the browser [#670](https://github.com/pinojs/pino/issues/670)

To redact sensitive information, supply paths to keys that hold sensitive data
using the `redact` option. Note that paths that contain hyphens need to use
brackets to access the hyphenated property:

```js
const logger = require('pino')({
  redact: ['key', 'path.to.key', 'stuff.thats[*].secret', 'path["with-hyphen"]']
})

logger.info({
  key: 'will be redacted',
  path: {
    to: {key: 'sensitive', another: 'thing'}
  },
  stuff: {
    thats: [
      {secret: 'will be redacted', logme: 'will be logged'},
      {secret: 'as will this', logme: 'as will this'}
    ]
  }
})
```

This will output:

```json
{"level":30,"time":1527777350011,"pid":3186,"hostname":"Davids-MacBook-Pro-3.local","key":"[Redacted]","path":{"to":{"key":"[Redacted]","another":"thing"}},"stuff":{"thats":[{"secret":"[Redacted]","logme":"will be logged"},{"secret":"[Redacted]","logme":"as will this"}]}}
```

The `redact` option can take an array (as shown in the above example) or
an object. This allows control over *how* information is redacted.

For instance, setting the censor:

```js
const logger = require('pino')({
  redact: {
    paths: ['key', 'path.to.key', 'stuff.thats[*].secret'],
    censor: '**GDPR COMPLIANT**'
  }
})

logger.info({
  key: 'will be redacted',
  path: {
    to: {key: 'sensitive', another: 'thing'}
  },
  stuff: {
    thats: [
      {secret: 'will be redacted', logme: 'will be logged'},
      {secret: 'as will this', logme: 'as will this'}
    ]
  }
})
```

This will output:

```json
{"level":30,"time":1527778563934,"pid":3847,"hostname":"Davids-MacBook-Pro-3.local","key":"**GDPR COMPLIANT**","path":{"to":{"key":"**GDPR COMPLIANT**","another":"thing"}},"stuff":{"thats":[{"secret":"**GDPR COMPLIANT**","logme":"will be logged"},{"secret":"**GDPR COMPLIANT**","logme":"as will this"}]}}
```

The `redact.remove` option also allows for the key and value to be removed from the output:

```js
const logger = require('pino')({
  redact: {
    paths: ['key', 'path.to.key', 'stuff.thats[*].secret'],
    remove: true
  }
})

logger.info({
  key: 'will be redacted',
  path: {
    to: {key: 'sensitive', another: 'thing'}
  },
  stuff: {
    thats: [
      {secret: 'will be redacted', logme: 'will be logged'},
      {secret: 'as will this', logme: 'as will this'}
    ]
  }
})
```

This will output:

```json
{"level":30,"time":1527782356751,"pid":5758,"hostname":"Davids-MacBook-Pro-3.local","path":{"to":{"another":"thing"}},"stuff":{"thats":[{"logme":"will be logged"},{"logme":"as will this"}]}}
```

See [pino options in API](/docs/api.md#redact-array-object) for `redact` API details.

<a name="paths"></a>
## Path Syntax

The syntax for paths supplied to the `redact` option conforms to the syntax of path lookups
in standard ECMAScript, with the following additions:

* paths may start with bracket notation
* paths may contain the asterisk `*` to denote a wildcard

Note that paths are **case sensitive**.

By way of example, the following are all valid paths:

* `a.b.c`
* `a["b-c"].d`
* `["a-b"].c`
* `a.b.*`
* `a[*].b`

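To make the wildcard semantics concrete, the following is a minimal sketch of how such paths behave, covering only dot segments and the `[*]` array wildcard from the list above. It is an illustration only; Pino's real implementation is `fast-redact`:

```js
// Minimal sketch of redaction-path behavior: dot segments plus the
// `[*]` array wildcard. Illustration only; not fast-redact.
function redact (obj, path, censor = '[Redacted]') {
  // normalize `a[*].b` into dot-separated segments: ['a', '[*]', 'b']
  const segments = path.replace(/\[\*\]/g, '.[*]').split('.').filter(Boolean)
  walk(obj, segments)
  return obj

  function walk (node, segs) {
    if (node === null || typeof node !== 'object') return
    const [head, ...rest] = segs
    if (head === '[*]') {
      // wildcard: descend into every array element
      if (Array.isArray(node)) for (const item of node) walk(item, rest)
      return
    }
    if (rest.length === 0) {
      if (head in node) node[head] = censor
      return
    }
    walk(node[head], rest)
  }
}

const log = { stuff: { thats: [{ secret: 's1', logme: 'ok' }] } }
redact(log, 'stuff.thats[*].secret')
console.log(JSON.stringify(log))
// {"stuff":{"thats":[{"secret":"[Redacted]","logme":"ok"}]}}
```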
## Overhead

Pino's redaction functionality is built on top of [`fast-redact`](https://github.com/davidmarkclements/fast-redact),
which adds about 2% overhead to `JSON.stringify` when using paths without wildcards.

When Pino is used with a single redacted path, the overhead is within measurement noise;
no way to deterministically measure its effect has been found, because redaction is not
a bottleneck in that case.

However, wildcard redaction does carry a non-trivial cost relative to explicitly declaring the keys
(50% in a case where four keys are redacted across two objects). See
the [`fast-redact` benchmarks](https://github.com/davidmarkclements/fast-redact#benchmarks) for details.

## Safety

The `redact` option is intended as an initialization-time configuration option.
Path strings must not originate from user input: the `fast-redact` module uses a
VM context to syntax-check the paths, and user input should never be combined with
such an approach. See the [`fast-redact` Caveat](https://github.com/davidmarkclements/fast-redact#caveat)
and the [`fast-redact` Approach](https://github.com/davidmarkclements/fast-redact#approach) for in-depth information.