pino-sdk 9.7.0

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.

Potentially problematic release: this version of pino-sdk might be problematic.

Files changed (202)
  1. package/.eslintignore +2 -0
  2. package/.eslintrc +8 -0
  3. package/.github/dependabot.yml +13 -0
  4. package/.github/workflows/bench.yml +61 -0
  5. package/.github/workflows/ci.yml +88 -0
  6. package/.github/workflows/lock-threads.yml +30 -0
  7. package/.github/workflows/target-main.yml +23 -0
  8. package/.nojekyll +0 -0
  9. package/.prettierignore +1 -0
  10. package/.taprc.yaml +8 -0
  11. package/CNAME +1 -0
  12. package/CONTRIBUTING.md +30 -0
  13. package/LICENSE +21 -0
  14. package/README.md +161 -0
  15. package/SECURITY.md +68 -0
  16. package/benchmarks/basic.bench.js +95 -0
  17. package/benchmarks/child-child.bench.js +52 -0
  18. package/benchmarks/child-creation.bench.js +73 -0
  19. package/benchmarks/child.bench.js +62 -0
  20. package/benchmarks/deep-object.bench.js +88 -0
  21. package/benchmarks/formatters.bench.js +50 -0
  22. package/benchmarks/internal/custom-levels.js +67 -0
  23. package/benchmarks/internal/just-pino-heavy.bench.js +76 -0
  24. package/benchmarks/internal/just-pino.bench.js +182 -0
  25. package/benchmarks/internal/parent-vs-child.bench.js +75 -0
  26. package/benchmarks/internal/redact.bench.js +86 -0
  27. package/benchmarks/long-string.bench.js +81 -0
  28. package/benchmarks/multi-arg.bench.js +193 -0
  29. package/benchmarks/multistream.js +98 -0
  30. package/benchmarks/object.bench.js +82 -0
  31. package/benchmarks/utils/generate-benchmark-doc.js +36 -0
  32. package/benchmarks/utils/runbench.js +138 -0
  33. package/benchmarks/utils/wrap-log-level.js +55 -0
  34. package/bin.js +6 -0
  35. package/browser.js +505 -0
  36. package/build/sync-version.js +10 -0
  37. package/docs/api.md +1490 -0
  38. package/docs/asynchronous.md +40 -0
  39. package/docs/benchmarks.md +55 -0
  40. package/docs/browser.md +242 -0
  41. package/docs/bundling.md +40 -0
  42. package/docs/child-loggers.md +95 -0
  43. package/docs/ecosystem.md +84 -0
  44. package/docs/help.md +345 -0
  45. package/docs/lts.md +64 -0
  46. package/docs/pretty.md +35 -0
  47. package/docs/redaction.md +135 -0
  48. package/docs/transports.md +1263 -0
  49. package/docs/web.md +309 -0
  50. package/docsify/sidebar.md +26 -0
  51. package/examples/basic.js +43 -0
  52. package/examples/transport.js +68 -0
  53. package/favicon-16x16.png +0 -0
  54. package/favicon-32x32.png +0 -0
  55. package/favicon.ico +0 -0
  56. package/file.js +12 -0
  57. package/inc-version.sh +42 -0
  58. package/index.html +55 -0
  59. package/lib/caller.js +30 -0
  60. package/lib/constants.js +28 -0
  61. package/lib/deprecations.js +8 -0
  62. package/lib/levels.js +241 -0
  63. package/lib/meta.js +3 -0
  64. package/lib/multistream.js +188 -0
  65. package/lib/proto.js +234 -0
  66. package/lib/redaction.js +118 -0
  67. package/lib/symbols.js +74 -0
  68. package/lib/time.js +11 -0
  69. package/lib/tools.js +390 -0
  70. package/lib/transport-stream.js +56 -0
  71. package/lib/transport.js +167 -0
  72. package/lib/worker.js +194 -0
  73. package/package.json +119 -0
  74. package/pino-banner.png +0 -0
  75. package/pino-logo-hire.png +0 -0
  76. package/pino-tree.png +0 -0
  77. package/pino2.d.ts +913 -0
  78. package/pino2.js +234 -0
  79. package/pretty-demo.png +0 -0
  80. package/test/basic.test.js +874 -0
  81. package/test/broken-pipe.test.js +57 -0
  82. package/test/browser-child.test.js +132 -0
  83. package/test/browser-disabled.test.js +87 -0
  84. package/test/browser-early-console-freeze.test.js +12 -0
  85. package/test/browser-is-level-enabled.test.js +104 -0
  86. package/test/browser-levels.test.js +241 -0
  87. package/test/browser-serializers.test.js +352 -0
  88. package/test/browser-timestamp.test.js +88 -0
  89. package/test/browser-transmit.test.js +417 -0
  90. package/test/browser.test.js +679 -0
  91. package/test/complex-objects.test.js +34 -0
  92. package/test/crlf.test.js +32 -0
  93. package/test/custom-levels.test.js +253 -0
  94. package/test/error.test.js +398 -0
  95. package/test/errorKey.test.js +34 -0
  96. package/test/escaping.test.js +91 -0
  97. package/test/esm/esm.mjs +12 -0
  98. package/test/esm/index.test.js +34 -0
  99. package/test/esm/named-exports.mjs +27 -0
  100. package/test/exit.test.js +77 -0
  101. package/test/fixtures/broken-pipe/basic.js +9 -0
  102. package/test/fixtures/broken-pipe/destination.js +10 -0
  103. package/test/fixtures/broken-pipe/syncfalse.js +12 -0
  104. package/test/fixtures/console-transport.js +13 -0
  105. package/test/fixtures/crashing-transport.js +13 -0
  106. package/test/fixtures/default-exit.js +8 -0
  107. package/test/fixtures/destination-exit.js +8 -0
  108. package/test/fixtures/eval/index.js +13 -0
  109. package/test/fixtures/eval/node_modules/14-files.js +3 -0
  110. package/test/fixtures/eval/node_modules/2-files.js +3 -0
  111. package/test/fixtures/eval/node_modules/file1.js +5 -0
  112. package/test/fixtures/eval/node_modules/file10.js +5 -0
  113. package/test/fixtures/eval/node_modules/file11.js +5 -0
  114. package/test/fixtures/eval/node_modules/file12.js +5 -0
  115. package/test/fixtures/eval/node_modules/file13.js +5 -0
  116. package/test/fixtures/eval/node_modules/file14.js +11 -0
  117. package/test/fixtures/eval/node_modules/file2.js +5 -0
  118. package/test/fixtures/eval/node_modules/file3.js +5 -0
  119. package/test/fixtures/eval/node_modules/file4.js +5 -0
  120. package/test/fixtures/eval/node_modules/file5.js +5 -0
  121. package/test/fixtures/eval/node_modules/file6.js +5 -0
  122. package/test/fixtures/eval/node_modules/file7.js +5 -0
  123. package/test/fixtures/eval/node_modules/file8.js +5 -0
  124. package/test/fixtures/eval/node_modules/file9.js +5 -0
  125. package/test/fixtures/noop-transport.js +10 -0
  126. package/test/fixtures/pretty/null-prototype.js +8 -0
  127. package/test/fixtures/stdout-hack-protection.js +11 -0
  128. package/test/fixtures/syncfalse-child.js +6 -0
  129. package/test/fixtures/syncfalse-exit.js +9 -0
  130. package/test/fixtures/syncfalse-flush-exit.js +10 -0
  131. package/test/fixtures/syncfalse.js +6 -0
  132. package/test/fixtures/syntax-error-esm.mjs +2 -0
  133. package/test/fixtures/to-file-transport-with-transform.js +20 -0
  134. package/test/fixtures/to-file-transport.js +13 -0
  135. package/test/fixtures/to-file-transport.mjs +8 -0
  136. package/test/fixtures/transport/index.js +12 -0
  137. package/test/fixtures/transport/package.json +5 -0
  138. package/test/fixtures/transport-exit-immediately-with-async-dest.js +16 -0
  139. package/test/fixtures/transport-exit-immediately.js +11 -0
  140. package/test/fixtures/transport-exit-on-ready.js +12 -0
  141. package/test/fixtures/transport-main.js +9 -0
  142. package/test/fixtures/transport-many-lines.js +29 -0
  143. package/test/fixtures/transport-string-stdout.js +9 -0
  144. package/test/fixtures/transport-transform.js +21 -0
  145. package/test/fixtures/transport-uses-pino-config.js +33 -0
  146. package/test/fixtures/transport-with-on-exit.js +12 -0
  147. package/test/fixtures/transport-worker-data.js +19 -0
  148. package/test/fixtures/transport-worker.js +15 -0
  149. package/test/fixtures/transport-wrong-export-type.js +3 -0
  150. package/test/fixtures/ts/to-file-transport-with-transform.ts +18 -0
  151. package/test/fixtures/ts/to-file-transport.ts +11 -0
  152. package/test/fixtures/ts/transpile.cjs +36 -0
  153. package/test/fixtures/ts/transport-exit-immediately-with-async-dest.ts +15 -0
  154. package/test/fixtures/ts/transport-exit-immediately.ts +10 -0
  155. package/test/fixtures/ts/transport-exit-on-ready.ts +11 -0
  156. package/test/fixtures/ts/transport-main.ts +8 -0
  157. package/test/fixtures/ts/transport-string-stdout.ts +8 -0
  158. package/test/fixtures/ts/transport-worker.ts +14 -0
  159. package/test/formatters.test.js +355 -0
  160. package/test/helper.d.ts +4 -0
  161. package/test/helper.js +128 -0
  162. package/test/hooks.test.js +118 -0
  163. package/test/http.test.js +242 -0
  164. package/test/internals/version.test.js +15 -0
  165. package/test/is-level-enabled.test.js +185 -0
  166. package/test/jest/basic.spec.js +10 -0
  167. package/test/levels.test.js +772 -0
  168. package/test/metadata.test.js +106 -0
  169. package/test/mixin-merge-strategy.test.js +55 -0
  170. package/test/mixin.test.js +218 -0
  171. package/test/multistream.test.js +673 -0
  172. package/test/pkg/index.js +46 -0
  173. package/test/pkg/pkg.config.json +17 -0
  174. package/test/pkg/pkg.test.js +58 -0
  175. package/test/redact.test.js +847 -0
  176. package/test/serializers.test.js +253 -0
  177. package/test/stdout-protection.test.js +39 -0
  178. package/test/syncfalse.test.js +188 -0
  179. package/test/timestamp.test.js +121 -0
  180. package/test/transport/big.test.js +43 -0
  181. package/test/transport/bundlers-support.test.js +97 -0
  182. package/test/transport/caller.test.js +23 -0
  183. package/test/transport/core.test.js +643 -0
  184. package/test/transport/core.test.ts +236 -0
  185. package/test/transport/core.transpiled.test.ts +112 -0
  186. package/test/transport/crash.test.js +34 -0
  187. package/test/transport/module-link.test.js +239 -0
  188. package/test/transport/pipeline.test.js +135 -0
  189. package/test/transport/repl.test.js +14 -0
  190. package/test/transport/syncTrue.test.js +55 -0
  191. package/test/transport/syncfalse.test.js +68 -0
  192. package/test/transport/targets.test.js +44 -0
  193. package/test/transport/uses-pino-config.test.js +167 -0
  194. package/test/transport-stream.test.js +26 -0
  195. package/test/types/pino-import.test-d.ts +29 -0
  196. package/test/types/pino-multistream.test-d.ts +28 -0
  197. package/test/types/pino-top-export.test-d.ts +35 -0
  198. package/test/types/pino-transport.test-d.ts +145 -0
  199. package/test/types/pino-type-only.test-d.ts +64 -0
  200. package/test/types/pino.test-d.ts +527 -0
  201. package/test/types/pino.ts +78 -0
  202. package/tsconfig.json +14 -0
@@ -0,0 +1,1263 @@
# Transports

Pino transports can be used for both transmitting and transforming log output.

The way Pino generates logs:

1. Reduces the impact of logging on an application to the absolute minimum.
2. Gives greater flexibility in how logs are processed and stored.

It is recommended that any log transformation or transmission is performed either
in a separate thread or a separate process.

Before Pino v7, transports would ideally operate in a separate process; these are
now referred to as [Legacy Transports](#legacy-transports).

From Pino v7 upwards, transports can also operate inside a [Worker Thread][worker-thread]
and can be used or configured via the options object passed to `pino` on initialization.
In this case the transports always operate asynchronously (unless `options.sync` is set
to `true` in the transport options), and logs are flushed as quickly as possible.

[worker-thread]: https://nodejs.org/dist/latest-v14.x/docs/api/worker_threads.html

## v7+ Transports

A transport is a module that exports a default function that returns a writable stream:

```js
import { createWriteStream } from 'node:fs'

export default (options) => {
  return createWriteStream(options.destination)
}
```

Let's imagine the above defines our "transport" as the file `my-transport.mjs`
(ESM files are supported even if the project is written in CJS).

We would set up our transport by creating a transport stream with `pino.transport`
and passing it to the `pino` function:

```js
const pino = require('pino')
const transport = pino.transport({
  target: '/absolute/path/to/my-transport.mjs'
})
pino(transport)
```

The transport code will be executed in a separate worker thread. The main thread
will write logs to the worker thread, which will write them to the stream returned
from the function exported from the transport file/module.

The exported function can also be async. If we use an async function we can throw early
if the transport could not be opened. As an example:

```js
import fs from 'node:fs'
import { once } from 'events'

export default async (options) => {
  const stream = fs.createWriteStream(options.destination)
  await once(stream, 'open')
  return stream
}
```

While initializing the stream we're able to use `await` to perform asynchronous operations. In this
case, we wait for the write stream's `open` event.

Let's imagine the above was published to npm with the module name `some-file-transport`.

The `options.destination` value can be set when creating the transport stream with `pino.transport` like so:

```js
const pino = require('pino')
const transport = pino.transport({
  target: 'some-file-transport',
  options: { destination: '/dev/null' }
})
pino(transport)
```

Note here we've specified a module by package name rather than by relative path. The options object we provide
is serialized and injected into the transport worker thread, then passed to the module's exported function.
This means that the options object can only contain types that are supported by the
[Structured Clone Algorithm][sca], which is used to (de)serialize objects between threads.
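This constraint can be checked up front with Node's global `structuredClone` (available since Node 17), which applies the same algorithm used at the worker-thread boundary: plain data clones cleanly, while functions and other non-cloneable values throw a `DataCloneError`.

```javascript
// Plain data (strings, numbers, nested objects, arrays) clones fine,
// so it is safe to pass in a transport's options object.
const ok = structuredClone({ destination: '/dev/null', levels: { foo: 35 } })
console.log(ok.destination) // '/dev/null'

// Functions are not structured-cloneable, so an options object
// containing one cannot cross the worker-thread boundary.
try {
  structuredClone({ format: (line) => line.toUpperCase() })
} catch (err) {
  console.log(err.name) // 'DataCloneError'
}
```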

What if we wanted to use both transports, but send only error logs to `my-transport.mjs` while
sending all logs to `some-file-transport`? We can use the `pino.transport` function's `level` option:

```js
const pino = require('pino')
const transport = pino.transport({
  targets: [
    { target: '/absolute/path/to/my-transport.mjs', level: 'error' },
    { target: 'some-file-transport', options: { destination: '/dev/null' } }
  ]
})
pino(transport)
```

If we're using custom levels, they should be passed in when using more than one transport:

```js
const pino = require('pino')
const transport = pino.transport({
  targets: [
    { target: '/absolute/path/to/my-transport.mjs', level: 'error' },
    { target: 'some-file-transport', options: { destination: '/dev/null' } }
  ],
  levels: { foo: 35 }
})
pino(transport)
```

It is also possible to use the `dedupe` option to send each log only to the target with the highest matching level:

```js
const pino = require('pino')
const transport = pino.transport({
  targets: [
    { target: '/absolute/path/to/my-transport.mjs', level: 'error' },
    { target: 'some-file-transport', options: { destination: '/dev/null' } }
  ],
  dedupe: true
})
pino(transport)
```

To make Pino log synchronously, pass `sync: true` in the transport options:

```js
const pino = require('pino')
const transport = pino.transport({
  targets: [
    { target: '/absolute/path/to/my-transport.mjs', level: 'error' }
  ],
  dedupe: true,
  sync: true
})
pino(transport)
```

For more details on `pino.transport` see the [API docs for `pino.transport`][pino-transport].

[pino-transport]: /docs/api.md#pino-transport
[sca]: https://developer.mozilla.org/en-US/docs/Web/API/Web_Workers_API/Structured_clone_algorithm

<a id="writing"></a>
### Writing a Transport

The module [pino-abstract-transport](https://github.com/pinojs/pino-abstract-transport) provides
a simple utility to parse each line. Its usage is highly recommended.

You can see an example using an async iterator with ESM:

```js
import build from 'pino-abstract-transport'
import SonicBoom from 'sonic-boom'
import { once } from 'events'

export default async function (opts) {
  // SonicBoom is necessary to avoid loops with the main thread.
  // It is the same as pino.destination().
  const destination = new SonicBoom({ dest: opts.destination || 1, sync: false })
  await once(destination, 'ready')

  return build(async function (source) {
    for await (const obj of source) {
      const toDrain = !destination.write(obj.msg.toUpperCase() + '\n')
      // This block will handle backpressure
      if (toDrain) {
        await once(destination, 'drain')
      }
    }
  }, {
    async close (err) {
      destination.end()
      await once(destination, 'close')
    }
  })
}
```

or using Node.js streams and CommonJS:

```js
'use strict'

const build = require('pino-abstract-transport')
const SonicBoom = require('sonic-boom')

module.exports = function (opts) {
  const destination = new SonicBoom({ dest: opts.destination || 1, sync: false })
  return build(function (source) {
    source.pipe(destination)
  }, {
    close (err, cb) {
      destination.end()
      destination.on('close', cb.bind(null, err))
    }
  })
}
```

(It is also possible to use async iterators with CommonJS, and streams with ESM.)

To consume async iterators in batches, consider using the [hwp](https://github.com/mcollina/hwp) library.

The `close()` function is needed to make sure that the stream is closed and flushed when its
callback is called or the returned promise resolves. Otherwise, log lines will be lost.

### Writing to a custom transport & stdout

In case you want to both use a custom transport and output the log entries with default processing to STDOUT, you can use the `'pino/file'` transport configured with `destination: 1`:

```js
const transports = [
  {
    target: 'pino/file',
    options: { destination: 1 } // this writes to STDOUT
  },
  {
    target: 'my-custom-transport',
    options: { someParameter: true }
  }
]

const logger = pino(pino.transport({ targets: transports }))
```

### Creating a transport pipeline

As an example, the following transport returns a `Transform` stream:

```js
import build from 'pino-abstract-transport'
import { pipeline, Transform } from 'node:stream'

export default async function (options) {
  return build(function (source) {
    const myTransportStream = new Transform({
      // Make sure autoDestroy is set,
      // this is needed in Node v12 or when using the
      // readable-stream module.
      autoDestroy: true,

      objectMode: true,
      transform (chunk, enc, cb) {
        // modifies the payload somehow
        chunk.service = 'pino'

        // stringify the payload again
        this.push(`${JSON.stringify(chunk)}\n`)
        cb()
      }
    })
    pipeline(source, myTransportStream, () => {})
    return myTransportStream
  }, {
    // This is needed to be able to pipeline transports.
    enablePipelining: true
  })
}
```

Then you can pipeline them with:

```js
import pino from 'pino'

const logger = pino({
  transport: {
    pipeline: [{
      target: './my-transform.js'
    }, {
      // Use target: 'pino/file' with STDOUT descriptor 1 to write
      // logs without any change.
      target: 'pino/file',
      options: { destination: 1 }
    }]
  }
})

logger.info('hello world')
```

__NOTE: there is no "default" destination for a pipeline; it must end with a terminating target, i.e. a `Writable` stream.__

### TypeScript compatibility

Pino provides basic support for transports written in TypeScript.

Ideally, they should be transpiled to ensure maximum compatibility, but sometimes
you might want to use tools such as TS-Node to execute your TypeScript
code without an explicit transpilation step.

You can use your TypeScript code without explicit transpilation, but there are
some known caveats:
- For "pure" TypeScript code, ES imports are still not supported (ES imports are
  supported once the code is transpiled).
- Only TS-Node is supported for now; there's no TSM support.
- Running transport TypeScript code with TS-Node appears to be problematic on
  Windows systems; there's no official support for that yet.

### Notable transports

#### `pino/file`

The `pino/file` transport routes logs to a file (or file descriptor).

The `options.destination` property may be set to specify the desired file destination.

```js
const pino = require('pino')
const transport = pino.transport({
  target: 'pino/file',
  options: { destination: '/path/to/file' }
})
pino(transport)
```

By default, the `pino/file` transport assumes the directory of the destination file exists. If it does not exist, the transport will throw an error when it attempts to open the file for writing. The `mkdir` option may be set to `true` to configure the transport to create the directory, if it does not exist, before opening the file for writing.

```js
const pino = require('pino')
const transport = pino.transport({
  target: 'pino/file',
  options: { destination: '/path/to/file', mkdir: true }
})
pino(transport)
```

By default, the `pino/file` transport appends to the destination file if it exists. The `append` option may be set to `false` to configure the transport to truncate the file upon opening it for writing.

```js
const pino = require('pino')
const transport = pino.transport({
  target: 'pino/file',
  options: { destination: '/path/to/file', append: false }
})
pino(transport)
```

The `options.destination` property may also be a number representing a file descriptor. Typically this would be `1` to write to STDOUT or `2` to write to STDERR. If `options.destination` is not set, it defaults to `1`, which means logs will be written to STDOUT. If `options.destination` is a string integer, e.g. `'1'`, it will be coerced to a number and used as a file descriptor. If this is not desired, provide a full path, e.g. `/tmp/1`.
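
The string-to-descriptor coercion described above can be sketched as follows; `resolveDestination` is a hypothetical helper for illustration, not pino's actual implementation:

```javascript
// A string made up only of digits is treated as a file descriptor;
// anything else is treated as a file path.
function resolveDestination (destination) {
  if (typeof destination === 'string' && /^\d+$/.test(destination)) {
    return Number(destination) // '1' -> fd 1 (STDOUT)
  }
  return destination // '/tmp/1' stays a file path
}

console.log(resolveDestination('1'))      // 1
console.log(resolveDestination('/tmp/1')) // '/tmp/1'
console.log(resolveDestination(2))        // 2 (already a descriptor)
```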

The difference between using the `pino/file` transport builtin and using `pino.destination` is that `pino.destination` runs in the main thread, whereas `pino/file` sets up `pino.destination` in a worker thread.

#### `pino-pretty`

The [`pino-pretty`][pino-pretty] transport prettifies logs.

By default the `pino-pretty` builtin logs to STDOUT.

The `options.destination` property may be set to log pretty logs to a file descriptor or file. For example, the following writes the prettified logs to STDOUT:

```js
const pino = require('pino')
const transport = pino.transport({
  target: 'pino-pretty',
  options: { destination: 1 } // use 2 for STDERR
})
pino(transport)
```

### Asynchronous startup

The new transports boot asynchronously, and calling `process.exit()` before the transport
starts will cause logs not to be delivered.

```js
const pino = require('pino')
const transport = pino.transport({
  targets: [
    { target: '/absolute/path/to/my-transport.mjs', level: 'error' },
    { target: 'some-file-transport', options: { destination: '/dev/null' } }
  ]
})
const logger = pino(transport)

logger.info('hello')

// If logs are printed before the transport is ready when process.exit(0) is called,
// they will be lost.
transport.on('ready', function () {
  process.exit(0)
})
```

## Legacy Transports

A legacy Pino "transport" is a supplementary tool that consumes Pino logs.

Consider the following example for creating a transport:

```js
const { pipeline, Writable } = require('node:stream')
const split = require('split2')

const myTransportStream = new Writable({
  write (chunk, enc, cb) {
    // apply a transform and send to STDOUT
    console.log(chunk.toString().toUpperCase())
    cb()
  }
})

pipeline(process.stdin, split(JSON.parse), myTransportStream)
```

The above defines our "transport" as the file `my-transport-process.js`.

Logs can now be consumed using shell piping:

```sh
node my-app-which-logs-stuff-to-stdout.js | node my-transport-process.js
```

Ideally, a transport should consume logs in a separate process to the application.
Using transports in the same process causes unnecessary load and slows down
Node's single-threaded event loop.

## Known Transports

PRs to this document are welcome for any new transports!

### Pino v7+ Compatible

+ [@axiomhq/pino](#@axiomhq/pino)
+ [@logtail/pino](#@logtail/pino)
+ [@macfja/pino-fingers-crossed](#macfja-pino-fingers-crossed)
+ [@openobserve/pino-openobserve](#pino-openobserve)
+ [pino-airbrake-transport](#pino-airbrake-transport)
+ [pino-axiom](#pino-axiom)
+ [pino-datadog-transport](#pino-datadog-transport)
+ [pino-discord-webhook](#pino-discord-webhook)
+ [pino-elasticsearch](#pino-elasticsearch)
+ [pino-hana](#pino-hana)
+ [pino-logfmt](#pino-logfmt)
+ [pino-loki](#pino-loki)
+ [pino-opentelemetry-transport](#pino-opentelemetry-transport)
+ [pino-pretty](#pino-pretty)
+ [pino-roll](#pino-roll)
+ [pino-seq-transport](#pino-seq-transport)
+ [pino-sentry-transport](#pino-sentry-transport)
+ [pino-slack-webhook](#pino-slack-webhook)
+ [pino-telegram-webhook](#pino-telegram-webhook)
+ [pino-yc-transport](#pino-yc-transport)

### Legacy

+ [pino-applicationinsights](#pino-applicationinsights)
+ [pino-azuretable](#pino-azuretable)
+ [pino-cloudwatch](#pino-cloudwatch)
+ [pino-couch](#pino-couch)
+ [pino-datadog](#pino-datadog)
+ [pino-gelf](#pino-gelf)
+ [pino-http-send](#pino-http-send)
+ [pino-kafka](#pino-kafka)
+ [pino-logdna](#pino-logdna)
+ [pino-logflare](#pino-logflare)
+ [pino-loki](#pino-loki)
+ [pino-mq](#pino-mq)
+ [pino-mysql](#pino-mysql)
+ [pino-papertrail](#pino-papertrail)
+ [pino-pg](#pino-pg)
+ [pino-redis](#pino-redis)
+ [pino-sentry](#pino-sentry)
+ [pino-seq](#pino-seq)
+ [pino-socket](#pino-socket)
+ [pino-stackdriver](#pino-stackdriver)
+ [pino-syslog](#pino-syslog)
+ [pino-websocket](#pino-websocket)

<a id="@axiomhq/pino"></a>
### @axiomhq/pino

[@axiomhq/pino](https://www.npmjs.com/package/@axiomhq/pino) is the official [Axiom](https://axiom.co/) transport for Pino, using [axiom-js](https://github.com/axiomhq/axiom-js).

```javascript
import pino from 'pino';

const logger = pino(
  { level: 'info' },
  pino.transport({
    target: '@axiomhq/pino',
    options: {
      dataset: process.env.AXIOM_DATASET,
      token: process.env.AXIOM_TOKEN,
    },
  }),
);
```

Then you can use the logger as usual:

```js
logger.info('Hello from pino!');
```

For further examples, head over to the [examples](https://github.com/axiomhq/axiom-js/tree/main/examples/pino) directory.

<a id="@logtail/pino"></a>
### @logtail/pino

The [@logtail/pino](https://www.npmjs.com/package/@logtail/pino) NPM package is a transport that forwards logs to [Logtail](https://logtail.com) by [Better Stack](https://betterstack.com).

[Quick start guide ⇗](https://betterstack.com/docs/logs/javascript/pino)

<a id="macfja-pino-fingers-crossed"></a>
### @macfja/pino-fingers-crossed

[@macfja/pino-fingers-crossed](https://github.com/MacFJA/js-pino-fingers-crossed) is a Pino v7+ transport that holds logs until a given log level is reached, so that logs are emitted only when it matters.

```js
const pino = require('pino');
const { default: fingersCrossed, enable } = require('@macfja/pino-fingers-crossed')

const logger = pino(fingersCrossed());

logger.info('Will appear immediately')
logger.error('Will appear immediately')

logger.setBindings({ [enable]: 50 })
logger.info('Will NOT appear immediately')
logger.info('Will NOT appear immediately')
logger.error('Will appear immediately, as well as the 2 previous messages') // error logs are level 50
logger.info('Will NOT appear')
logger.info({ [enable]: false }, 'Will appear immediately')
logger.info('Will NOT appear')
```

<a id="pino-openobserve"></a>
### @openobserve/pino-openobserve

[@openobserve/pino-openobserve](https://github.com/openobserve/pino-openobserve) is a
Pino v7+ transport that will send logs to an
[OpenObserve](https://openobserve.ai) instance.

```js
const pino = require('pino');
const OpenobserveTransport = require('@openobserve/pino-openobserve');

const logger = pino({
  level: 'info',
  transport: {
    target: OpenobserveTransport,
    options: {
      url: 'https://your-openobserve-server.com',
      organization: 'your-organization',
      streamName: 'your-stream',
      auth: {
        username: 'your-username',
        password: 'your-password',
      },
    },
  },
});
```

For full documentation check the [README](https://github.com/openobserve/pino-openobserve).

<a id="pino-airbrake-transport"></a>
### pino-airbrake-transport

[pino-airbrake-transport][pino-airbrake-transport] is a Pino v7+ compatible transport to forward log events to [Airbrake][Airbrake]
from a dedicated worker:

```js
const pino = require('pino')
const transport = pino.transport({
  target: 'pino-airbrake-transport',
  options: {
    airbrake: {
      projectId: 1,
      projectKey: "REPLACE_ME",
      environment: "production",
      // additional options for airbrake
      performanceStats: false,
    },
  },
  level: "error", // minimum log level that should be sent to airbrake
})
pino(transport)
```

[pino-airbrake-transport]: https://github.com/enricodeleo/pino-airbrake-transport
[Airbrake]: https://airbrake.io/

587
+ <a id="pino-applicationinsights"></a>
588
+ ### pino-applicationinsights
589
+ The [pino-applicationinsights](https://www.npmjs.com/package/pino-applicationinsights) module is a transport that will forward logs to [Azure Application Insights](https://docs.microsoft.com/en-us/azure/azure-monitor/app/app-insights-overview).
590
+
591
+ Given an application `foo` that logs via pino, you would use `pino-applicationinsights` like so:
592
+
593
+ ``` sh
594
+ $ node foo | pino-applicationinsights --key blablabla
595
+ ```
596
+
597
+ For full documentation of command line switches read [README](https://github.com/ovhemert/pino-applicationinsights#readme)
598
+
<a id="pino-axiom"></a>
### pino-axiom

[pino-axiom](https://www.npmjs.com/package/pino-axiom) is a transport that will forward logs to [Axiom](https://axiom.co).

```javascript
const pino = require('pino')
const transport = pino.transport({
  target: 'pino-axiom',
  options: {
    orgId: 'YOUR-ORG-ID',
    token: 'YOUR-TOKEN',
    dataset: 'YOUR-DATASET',
  },
})
pino(transport)
```

<a id="pino-azuretable"></a>
### pino-azuretable

The [pino-azuretable](https://www.npmjs.com/package/pino-azuretable) module is a transport that will forward logs to [Azure Table Storage](https://azure.microsoft.com/en-us/services/storage/tables/).

Given an application `foo` that logs via pino, you would use `pino-azuretable` like so:

```sh
$ node foo | pino-azuretable --account storageaccount --key blablabla
```

For full documentation of command line switches read the [README](https://github.com/ovhemert/pino-azuretable#readme).

<a id="pino-cloudwatch"></a>
### pino-cloudwatch

[pino-cloudwatch][pino-cloudwatch] is a transport that buffers and forwards logs to [Amazon CloudWatch][].

```sh
$ node app.js | pino-cloudwatch --group my-log-group
```

[pino-cloudwatch]: https://github.com/dbhowell/pino-cloudwatch
[Amazon CloudWatch]: https://aws.amazon.com/cloudwatch/

<a id="pino-couch"></a>
### pino-couch

[pino-couch][pino-couch] uploads each log line as a [CouchDB][CouchDB] document.

```sh
$ node app.js | pino-couch -U https://couch-server -d mylogs
```

[pino-couch]: https://github.com/IBM/pino-couch
[CouchDB]: https://couchdb.apache.org

<a id="pino-datadog"></a>
### pino-datadog

The [pino-datadog](https://www.npmjs.com/package/pino-datadog) module is a transport that will forward logs to [DataDog](https://www.datadoghq.com/) through its API.

Given an application `foo` that logs via pino, you would use `pino-datadog` like so:

```sh
$ node foo | pino-datadog --key blablabla
```

For full documentation of command line switches read the [README](https://github.com/ovhemert/pino-datadog#readme).

<a id="pino-datadog-transport"></a>
### pino-datadog-transport

[pino-datadog-transport][pino-datadog-transport] is a Pino v7+ compatible transport to forward log events to [Datadog][Datadog]
from a dedicated worker:

```js
const pino = require('pino')
const transport = pino.transport({
  target: 'pino-datadog-transport',
  options: {
    ddClientConf: {
      authMethods: {
        apiKeyAuth: <your datadog API key>
      }
    },
  },
  level: "error", // minimum log level that should be sent to datadog
})
pino(transport)
```

[pino-datadog-transport]: https://github.com/theogravity/datadog-transports
[Datadog]: https://www.datadoghq.com/

#### Logstash

The [pino-socket][pino-socket] module can also be used to upload logs to
[Logstash][logstash] via:

```sh
$ node app.js | pino-socket -a 127.0.0.1 -p 5000 -m tcp
```

This assumes Logstash is running on the same host and configured as
follows:

```
input {
  tcp {
    port => 5000
  }
}

filter {
  json {
    source => "message"
  }
}

output {
  elasticsearch {
    hosts => "127.0.0.1:9200"
  }
}
```

See <https://www.elastic.co/guide/en/kibana/current/setup.html> to learn
how to set up [Kibana][kibana].

For Docker users, see
<https://github.com/deviantony/docker-elk> to set up an ELK stack.

<a id="pino-discord-webhook"></a>
### pino-discord-webhook

[pino-discord-webhook](https://github.com/fabulousgk/pino-discord-webhook) is a Pino v7+ compatible transport to forward log events to a [Discord](http://discord.com) webhook from a dedicated worker.

```js
import pino from 'pino'

const logger = pino({
  transport: {
    target: 'pino-discord-webhook',
    options: {
      webhookUrl: 'https://discord.com/api/webhooks/xxxx/xxxx',
    }
  }
})
```

<a id="pino-elasticsearch"></a>
### pino-elasticsearch

[pino-elasticsearch][pino-elasticsearch] uploads the log lines in bulk
to [Elasticsearch][elasticsearch], to be displayed in [Kibana][kibana].

It is extremely simple to use and set up:

```sh
$ node app.js | pino-elasticsearch
```

This assumes Elasticsearch is running on localhost.

To connect to an external Elasticsearch instance (recommended for production):

* Check that `network.host` is defined in the `elasticsearch.yml` configuration file. See [Elasticsearch Network Settings documentation](https://www.elastic.co/guide/en/elasticsearch/reference/current/modules-network.html#common-network-settings) for more details.
* Launch:

```sh
$ node app.js | pino-elasticsearch --node http://192.168.1.42:9200
```

This assumes Elasticsearch is running on `192.168.1.42`.

To connect to AWS Elasticsearch:

```sh
$ node app.js | pino-elasticsearch --node https://es-url.us-east-1.es.amazonaws.com --es-version 6
```

Then [create an index pattern](https://www.elastic.co/guide/en/kibana/current/setup.html) on `'pino'` (the default index key for `pino-elasticsearch`) on the Kibana instance.

[pino-elasticsearch]: https://github.com/pinojs/pino-elasticsearch
[elasticsearch]: https://www.elastic.co/products/elasticsearch
[kibana]: https://www.elastic.co/products/kibana

<a id="pino-gelf"></a>
### pino-gelf

Pino GELF ([pino-gelf]) is a transport for the Pino logger. Pino GELF receives Pino logs from stdin and transforms them into [GELF format][gelf] before sending them to a remote [Graylog server][graylog] via UDP.

```sh
$ node your-app.js | pino-gelf log
```

[pino-gelf]: https://github.com/pinojs/pino-gelf
[gelf]: https://docs.graylog.org/en/2.1/pages/gelf.html
[graylog]: https://www.graylog.org/

<a id="pino-hana"></a>
### pino-hana

[pino-hana](https://github.com/HiImGiovi/pino-hana) is a Pino v7+ transport that saves pino logs to a SAP HANA database.

```js
const pino = require('pino')
const logger = pino({
  transport: {
    target: 'pino-hana',
    options: {
      connectionOptions: {
        host: <hana db host>,
        port: <hana db port>,
        user: <hana db user>,
        password: <hana db password>,
      },
      schema: <schema of the table in which you want to save the logs>,
      table: <table in which you want to save the logs>,
    },
  },
})

logger.info('hi') // this log will be saved into SAP HANA
```

For more detailed information about its usage please check the official [documentation](https://github.com/HiImGiovi/pino-hana#readme).

<a id="pino-http-send"></a>
### pino-http-send

[pino-http-send](https://npmjs.com/package/pino-http-send) is a configurable and low-overhead
transport that batches logs and sends them to a specified URL.

```console
$ node app.js | pino-http-send -u http://localhost:8080/logs
```

<a id="pino-kafka"></a>
### pino-kafka

[pino-kafka](https://github.com/ayZagen/pino-kafka) is a transport that sends logs to [Apache Kafka](https://kafka.apache.org/).

```sh
$ node index.js | pino-kafka -b 10.10.10.5:9200 -d mytopic
```

<a id="pino-logdna"></a>
### pino-logdna

[pino-logdna](https://github.com/logdna/pino-logdna) is a transport that sends logs to [LogDNA](https://logdna.com).

```sh
$ node index.js | pino-logdna --key YOUR_INGESTION_KEY
```

Tags and other metadata can be included using the available command line options. See the [pino-logdna README](https://github.com/logdna/pino-logdna#options) for a full list.

<a id="pino-logflare"></a>
### pino-logflare

[pino-logflare](https://github.com/Logflare/pino-logflare) is a transport that sends logs to a [Logflare](https://logflare.app) `source`.

```sh
$ node index.js | pino-logflare --key YOUR_KEY --source YOUR_SOURCE
```

<a id="pino-logfmt"></a>
### pino-logfmt

[pino-logfmt](https://github.com/botflux/pino-logfmt) is a Pino v7+ transport that formats logs into [logfmt](https://brandur.org/logfmt). This transport can output the formatted logs to stdout or to a file.

```js
import pino from 'pino'

const logger = pino({
  transport: {
    target: 'pino-logfmt'
  }
})
```

<a id="pino-loki"></a>
### pino-loki

pino-loki is a transport that forwards logs to [Grafana Loki](https://grafana.com/oss/loki/).
It can be used from the command line in a separate process, or in a dedicated worker:

CLI:
```console
node app.js | pino-loki --hostname localhost:3100 --labels='{ "application": "my-application"}' --user my-username --password my-password
```

Worker:
```js
const pino = require('pino')
const transport = pino.transport({
  target: 'pino-loki',
  options: { host: 'localhost:3100' }
})
pino(transport)
```

For full documentation and configuration, see the [README](https://github.com/Julien-R44/pino-loki).

<a id="pino-mq"></a>
### pino-mq

The `pino-mq` transport will take all messages received on `process.stdin` and send them over a message bus using JSON serialization.

This is useful for:

* moving backpressure from the application to the broker
* offloading message transformation to another component

```sh
node app.js | pino-mq -u "amqp://guest:guest@localhost/" -q "pino-logs"
```

Alternatively, a configuration file can be used:

```sh
node app.js | pino-mq -c pino-mq.json
```

A base configuration file can be initialized with:

```sh
pino-mq -g
```

For full documentation of command line switches and configuration see [the `pino-mq` README](https://github.com/itavy/pino-mq#readme).

<a id="pino-mysql"></a>
### pino-mysql

[pino-mysql][pino-mysql] loads pino logs into [MySQL][MySQL] and [MariaDB][MariaDB].

```sh
$ node app.js | pino-mysql -c db-configuration.json
```

`pino-mysql` can extract and save log fields into corresponding database fields
and/or save the entire log stream as a [JSON Data Type][JSONDT].

For full documentation and command line switches read the [README][pino-mysql].

[pino-mysql]: https://www.npmjs.com/package/pino-mysql
[MySQL]: https://www.mysql.com/
[MariaDB]: https://mariadb.org/
[JSONDT]: https://dev.mysql.com/doc/refman/8.0/en/json.html

<a id="pino-opentelemetry-transport"></a>
### pino-opentelemetry-transport

[pino-opentelemetry-transport](https://www.npmjs.com/package/pino-opentelemetry-transport) is a transport that will forward logs to an [OpenTelemetry log collector](https://opentelemetry.io/docs/collector/) using [OpenTelemetry JS instrumentation](https://opentelemetry.io/docs/instrumentation/js/).

```javascript
const pino = require('pino')

const transport = pino.transport({
  target: 'pino-opentelemetry-transport',
  options: {
    resourceAttributes: {
      'service.name': 'test-service',
      'service.version': '1.0.0'
    }
  }
})

pino(transport)
```

Documentation on running a minimal example is available in the [README](https://github.com/Vunovati/pino-opentelemetry-transport#minimalistic-example).

<a id="pino-papertrail"></a>
### pino-papertrail

pino-papertrail is a transport that will forward logs to the [papertrail](https://papertrailapp.com) log service through a UDPv4 socket.

Given an application `foo` that logs via pino, and a papertrail destination that collects logs on UDP port `12345` at address `bar.papertrailapp.com`, you would use `pino-papertrail`
like so:

```sh
node yourapp.js | pino-papertrail --host bar.papertrailapp.com --port 12345 --appname foo
```

For full documentation of command line switches read the [README](https://github.com/ovhemert/pino-papertrail#readme).

<a id="pino-pg"></a>
### pino-pg

[pino-pg](https://www.npmjs.com/package/pino-pg) stores logs into PostgreSQL.
Full documentation in the [README](https://github.com/Xstoudi/pino-pg).

<a id="pino-redis"></a>
### pino-redis

[pino-redis][pino-redis] loads pino logs into [Redis][Redis].

```sh
$ node app.js | pino-redis -U redis://username:password@localhost:6379
```

[pino-redis]: https://github.com/buianhthang/pino-redis
[Redis]: https://redis.io/

<a id="pino-roll"></a>
### pino-roll

`pino-roll` is a Pino transport that automatically rolls your log files based on size or time frequency.

```js
import { join } from 'path';
import pino from 'pino';

const transport = pino.transport({
  target: 'pino-roll',
  options: { file: join('logs', 'log'), frequency: 'daily', mkdir: true }
});

const logger = pino(transport);
```

Then you can use the logger as usual:

```js
logger.info('Hello from pino-roll!');
```

For full documentation check the [README](https://github.com/mcollina/pino-roll?tab=readme-ov-file#pino-roll).

<a id="pino-sentry"></a>
### pino-sentry

[pino-sentry][pino-sentry] loads pino logs into [Sentry][Sentry].

```sh
$ node app.js | pino-sentry --dsn=https://******@sentry.io/12345
```

For full documentation of command line switches see the [pino-sentry README](https://github.com/aandrewww/pino-sentry/blob/master/README.md).

[pino-sentry]: https://www.npmjs.com/package/pino-sentry
[Sentry]: https://sentry.io/

<a id="pino-sentry-transport"></a>
### pino-sentry-transport

[pino-sentry-transport][pino-sentry-transport] is a Pino v7+ compatible transport to forward log events to [Sentry][Sentry]
from a dedicated worker:

```js
const pino = require('pino')
const transport = pino.transport({
  target: 'pino-sentry-transport',
  options: {
    sentry: {
      dsn: 'https://******@sentry.io/12345',
    }
  }
})
pino(transport)
```

[pino-sentry-transport]: https://github.com/tomer-yechiel/pino-sentry-transport

<a id="pino-seq"></a>
### pino-seq

[pino-seq][pino-seq] supports both out-of-process and in-process log forwarding to [Seq][Seq].

```sh
$ node app.js | pino-seq --serverUrl http://localhost:5341 --apiKey 1234567890 --property applicationName=MyNodeApp
```

[pino-seq]: https://www.npmjs.com/package/pino-seq
[Seq]: https://datalust.co/seq

<a id="pino-seq-transport"></a>
### pino-seq-transport

[pino-seq-transport][pino-seq-transport] is a Pino v7+ compatible transport to forward log events to [Seq][Seq]
from a dedicated worker:

```js
const pino = require('pino')
const transport = pino.transport({
  target: '@autotelic/pino-seq-transport',
  options: { serverUrl: 'http://localhost:5341' }
})
pino(transport)
```

[pino-seq-transport]: https://github.com/autotelic/pino-seq-transport

<a id="pino-slack-webhook"></a>
### pino-slack-webhook

[pino-slack-webhook][pino-slack-webhook] is a Pino v7+ compatible transport to forward log events to [Slack][Slack]
from a dedicated worker:

```js
const pino = require('pino')
const transport = pino.transport({
  target: '@youngkiu/pino-slack-webhook',
  options: {
    webhookUrl: 'https://hooks.slack.com/services/xxx/xxx/xxx',
    channel: '#pino-log',
    username: 'webhookbot',
    icon_emoji: ':ghost:'
  }
})
pino(transport)
```

[pino-slack-webhook]: https://github.com/youngkiu/pino-slack-webhook
[Slack]: https://slack.com/

[pino-pretty]: https://github.com/pinojs/pino-pretty

<a id="pino-socket"></a>
### pino-socket

[pino-socket][pino-socket] is a transport that will forward logs to an IPv4
UDP or TCP socket.

As an example, use `socat` to fake a listener:

```sh
$ socat -v udp4-recvfrom:6000,fork exec:'/bin/cat'
```

Then run an application that uses `pino` for logging:

```sh
$ node app.js | pino-socket -p 6000
```

Logs from the application should be observed on both consoles.

[pino-socket]: https://www.npmjs.com/package/pino-socket

<a id="pino-stackdriver"></a>
### pino-stackdriver

The [pino-stackdriver](https://www.npmjs.com/package/pino-stackdriver) module is a transport that will forward logs to the [Google Stackdriver](https://cloud.google.com/logging/) log service through its API.

Given an application `foo` that logs via pino, a Stackdriver log project `bar`, and credentials in the file `/credentials.json`, you would use `pino-stackdriver` like so:

```sh
$ node foo | pino-stackdriver --project bar --credentials /credentials.json
```

For full documentation of command line switches read the [README](https://github.com/ovhemert/pino-stackdriver#readme).

<a id="pino-syslog"></a>
### pino-syslog

[pino-syslog][pino-syslog] is a transforming transport that converts
`pino` NDJSON logs to [RFC3164][rfc3164] compatible log messages. The `pino-syslog` module does not
forward the logs anywhere; it merely rewrites the messages to `stdout`. But
when used in combination with `pino-socket` the log messages can be relayed to a syslog server:

```sh
$ node app.js | pino-syslog | pino-socket -a syslog.example.com
```

Example output for the "hello world" log:

```
<134>Apr  1 16:44:58 MacBook-Pro-3 none[94473]: {"pid":94473,"hostname":"MacBook-Pro-3","level":30,"msg":"hello world","time":1459529098958}
```

[pino-syslog]: https://www.npmjs.com/package/pino-syslog
[rfc3164]: https://tools.ietf.org/html/rfc3164
[logstash]: https://www.elastic.co/products/logstash

<a id="pino-telegram-webhook"></a>
### pino-telegram-webhook

[pino-telegram-webhook](https://github.com/Jhon-Mosk/pino-telegram-webhook) is a Pino v7+ transport for sending messages to [Telegram](https://telegram.org/).

```js
const pino = require('pino');

const logger = pino({
  transport: {
    target: 'pino-telegram-webhook',
    level: 'error',
    options: {
      chatId: -1234567890,
      botToken: "123456:ABC-DEF1234ghIkl-zyx57W2v1u123ew11",
      extra: {
        parse_mode: "HTML",
      },
    },
  },
})

logger.error('<b>test log!</b>');
```

The `extra` parameter is optional. Any parameter supported by the [`sendMessage`](https://core.telegram.org/bots/api#sendmessage) method can be passed through it.

<a id="pino-websocket"></a>
### pino-websocket

[pino-websocket](https://www.npmjs.com/package/@abeai/pino-websocket) is a transport that will forward each log line to a websocket server.

```sh
$ node app.js | pino-websocket -a my-websocket-server.example.com -p 3004
```

For full documentation of command line switches read the [README](https://github.com/abeai/pino-websocket#readme).

<a id="pino-yc-transport"></a>
### pino-yc-transport

[pino-yc-transport](https://github.com/Jhon-Mosk/pino-yc-transport) is a Pino v7+ transport for writing to [Yandex Cloud Logging](https://yandex.cloud/ru/services/logging) from serverless functions or containers.

```js
const pino = require("pino");

const config = {
  level: "debug",
  transport: {
    target: "pino-yc-transport",
  },
};

const logger = pino(config);

logger.debug("some message")
logger.debug({ foo: "bar" });
logger.debug("some message %o, %s", { foo: "bar" }, "baz");
logger.info("info");
logger.warn("warn");
logger.error("error");
logger.error(new Error("error"));
logger.fatal("fatal");
```

<a id="communication-between-pino-and-transport"></a>
## Communication between Pino and Transports

Here we discuss some technical details of how Pino communicates with its [worker threads](https://nodejs.org/api/worker_threads.html).

Pino uses [`thread-stream`](https://github.com/pinojs/thread-stream) to create a stream for transports.
When we create a stream with `thread-stream`, `thread-stream` spawns a [worker](https://github.com/pinojs/thread-stream/blob/f19ac8dbd602837d2851e17fbc7dfc5bbc51083f/index.js#L50-L60) (an independent JavaScript execution thread).

### Error messages

How are error messages propagated from a transport worker to Pino?

Let's assume we have a transport with an error listener:

```js
// index.js
const transport = pino.transport({
  target: './transport.js'
})

transport.on('error', err => {
  console.error('error caught', err)
})

const log = pino(transport)
```

When our worker emits an error event, the worker has listeners for it: [error](https://github.com/pinojs/thread-stream/blob/f19ac8dbd602837d2851e17fbc7dfc5bbc51083f/lib/worker.js#L59-L70) and [unhandledRejection](https://github.com/pinojs/thread-stream/blob/f19ac8dbd602837d2851e17fbc7dfc5bbc51083f/lib/worker.js#L135-L141). These listeners send the error message to the main thread where Pino is present.

When Pino receives the error message, it further [emits](https://github.com/pinojs/thread-stream/blob/f19ac8dbd602837d2851e17fbc7dfc5bbc51083f/index.js#L349) the error message. Finally, the error message arrives at our `index.js` and is caught by our error listener.