logtunnel 1.0.0 → 1.1.0

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
package/README.md CHANGED
@@ -1,61 +1,259 @@
1
- # Logtunnel
1
+ # logtunnel
2
2
 
3
- CLI tool that allows you to format, filter and search your log output
3
+ `logtunnel` (`lt`) is a CLI tool that helps you search logs, parse them into structured data, filter by fields, and reformat them for reading or other tools.
4
4
 
5
- ## Requirements
5
+ ## Installation
6
6
 
7
- - [NodeJS >= 12](https://nodejs.org/en/download/)
8
- - [NPM](https://docs.npmjs.com/cli/v7/configuring-npm/install)
7
+ ```bash
8
+ npm i -g logtunnel
9
+ ```
9
10
 
10
- ## Installation
11
+ If you are on Linux, you might need `sudo` depending on your setup.
12
+
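+ To confirm the install worked and `lt` is on your `PATH`, you can print the version (the `-v`/`--version` flag is listed in the reference below):
+
+ ```bash
+ lt --version
+ ```
+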
13
+ ## Tutorial
14
+
15
+ ### The simplest form: find lines
16
+
17
+ `lt <filter>` is shorthand for “keep only lines that match this regex” (case-insensitive).
18
+
19
+ ```bash
20
+ curl -s https://cdn.codetunnel.net/lt/text.log | lt error
21
+ ```
22
+
23
+ You can also pass `-f`/`--filter` multiple times (AND behavior: all filters must match). For lines containing both `checkout` and `alice`, you could use:
24
+
25
+ ```bash
26
+ curl -s https://cdn.codetunnel.net/lt/text.log | lt -f checkout -f alice
27
+ ```
28
+
29
+ ### Ignore noise
30
+
31
+ Use `-i`/`--ignore` to drop lines that match those regexes:
32
+
33
+ ```bash
34
+ curl -s https://cdn.codetunnel.net/lt/text.log | lt -i healthz -i metrics
35
+ ```
36
+
37
+ Find errors while ignoring known noise:
38
+
39
+ ```bash
40
+ curl -s https://cdn.codetunnel.net/lt/text.log | lt -f error -i NullPointer -i "retrying in"
41
+ ```
42
+
43
+ Tip: `-f` and `-i` always run against the original input line (before parsing). If you want “filter by JSON fields”, use `-F` with a parser.
44
+
45
+ ### Parse logs (turn text into structured data)
46
+
47
+ Parsing turns each line into an “event object”, enabling field filters (`-F`) and structured outputs (`-o json`, `-o logfmt`, `-o table`, templates, etc.).
48
+
49
+ Supported parsers:
50
+
51
+ - `-p json` (one JSON object per line)
52
+ - `-p logfmt` (key=value log lines)
53
+ - `-p table` (space-aligned tables like `kubectl get pods`)
54
+ - `-p '<regex with named groups>'` (custom parsing using RegExp named groups)
55
+
56
+ #### Parse JSON and format a clean line
57
+
58
+ ```bash
59
+ curl -s https://cdn.codetunnel.net/lt/json.log | lt -p json -o '[{{ts}} {{upper level}}] {{message}}'
60
+ ```
61
+
62
+ #### Parse logfmt and convert to JSON (great for piping into other tools)
63
+
64
+ ```bash
65
+ curl -s https://cdn.codetunnel.net/lt/logfmt.log | lt -p logfmt -o json
66
+ ```
67
+
68
+ #### Parse JSON and show human-friendly structured output
69
+
70
+ The default output (no `-o`) is “inspect”: objects are pretty-printed with colors.
71
+
72
+ ```bash
73
+ curl -s https://cdn.codetunnel.net/lt/json.log | lt -p json
74
+ ```
75
+
76
+ Use `-o inspect` to force multi-line output (useful for large or nested objects):
77
+
78
+ ```bash
79
+ curl -s https://cdn.codetunnel.net/lt/json.log | lt -p json -o inspect
80
+ ```
81
+
82
+ ### Custom formats
83
+
84
+ When you pass a string to `-o` that isn’t one of `json|logfmt|inspect|original|table`, `lt` treats it as a Bigodon template (a safe Mustache/Handlebars-like language).
85
+
86
+ It supports:
87
+
88
+ - Variables: `{{message}}`, `{{ts}}`, `{{kubernetes.pod}}`
89
+ - Helpers: `{{upper level}}`, `{{lower user.email}}`, `{{toFixed delay_ms 2}}`
90
+ - Nested expressions: `{{capitalize (lower level)}}`
91
+
92
+ Example (compact “service log line”):
93
+
94
+ ```bash
95
+ curl -s https://cdn.codetunnel.net/lt/json.log | lt -p json -o '[{{ts}}] {{service}} {{kubernetes.namespace}}/{{kubernetes.pod}} {{upper level}} {{message}}'
96
+ ```
97
+
98
+ You can find the Bigodon language reference [here](https://github.com/gabriel-pinheiro/bigodon/blob/main/LANGUAGE.md) and the available helpers [here](https://github.com/gabriel-pinheiro/bigodon/blob/main/HELPERS.md).
99
+
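+ For example, combining the nested-expression and `toFixed` forms listed above (this assumes `delay_ms` in the sample log is numeric, as the field-filter examples below suggest):
+
+ ```bash
+ curl -s https://cdn.codetunnel.net/lt/json.log | lt -p json -o '{{capitalize (lower level)}}: {{message}} ({{toFixed delay_ms 2}}ms)'
+ ```
+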
100
+ ### Field filters (`-F <expression>`)
101
+
102
+ `-F/--field` filters *parsed objects* (so it requires `-p ...`). You can specify `-F` multiple times; all field filters must match (AND behavior).
103
+
104
+ Common helpers you’ll use in field filters:
105
+
106
+ - comparisons: `gt`, `gte`, `lt`, `lte`, `eq`, `and`, `or`, `not`
107
+ - strings: `lower`, `upper`, `startsWith`, `endsWith`
108
+ - `includes` works for strings and arrays
109
+
110
+ Show only slow requests (delay over 200ms):
111
+
112
+ ```bash
113
+ curl -s https://cdn.codetunnel.net/lt/json.log | lt -p json -F 'gt delay_ms 200' -o inspect
114
+ ```
115
+
116
+ Case-insensitive “message contains alice”:
117
+
118
+ ```bash
119
+ curl -s https://cdn.codetunnel.net/lt/json.log | lt -p json -F 'includes (lower message) "alice"' -o '[{{ts}} {{upper level}}] {{message}}'
120
+ ```
121
+
122
+ Combine multiple conditions:
123
+
124
+ ```bash
125
+ curl -s https://cdn.codetunnel.net/lt/json.log | lt -p json -F 'and (eq level "error") (gt http.status 499)' -o '[{{ts}} {{upper level}}] {{message}}'
126
+ ```
127
+
128
+ Show the *original raw line* after field filtering:
129
+
130
+ ```bash
131
+ curl -s https://cdn.codetunnel.net/lt/json.log | lt -p json -F 'gt delay_ms 200' -o original
132
+ ```
133
+
134
+ #### Tip: combine general filters with field filters
135
+
136
+ Inclusion (`-f`) and exclusion (`-i`) filters are ~5x faster than field filters (`-F`) because they skip the parsing step. If you can apply a broader `-f`/`-i` filter before the more specific `-F` filter, it'll be much quicker on large files. If you are seeing poor performance with a field filter like:
137
+
138
+ ```bash
139
+ curl -s https://cdn.codetunnel.net/lt/json.log | lt -p json -F 'eq level "error"' -o original
140
+ ```
141
+
142
+ And you can't use a plain text filter like the one below, because it would also show log lines of level INFO whose message merely contains the string "error":
143
+
144
+ ```bash
145
+ curl -s https://cdn.codetunnel.net/lt/json.log | lt -f error
146
+ ```
147
+
148
+ You can combine both, using the broader `-f` filter to reduce the number of lines that get parsed before the more specific `-F` runs:
149
+
150
+ ```bash
151
+ curl -s https://cdn.codetunnel.net/lt/json.log | lt -f error -p json -F 'eq level "error"' -o original
152
+ ```
153
+
154
+ ### Kubernetes tables
155
+
156
+ `-p table` is designed for outputs like `kubectl get pods -A` (space-separated columns).
157
+
158
+ Find all pods (`kubectl get pods -A`) but ignore lines containing `kube-system`:
159
+
160
+ ```bash
161
+ curl -s https://cdn.codetunnel.net/lt/table.log | lt -i kube-system
162
+ ```
163
+
164
+ Because the pods in the `kube-system` namespace have longer names, this output is still as wide as the original table (it may need a wider terminal before wrapping): Kubernetes sized the columns using those rows as well. Other values, such as the time since the last restart or longer status names (`CrashLoopBackOff`), can also make the original table wider.
165
+
166
+ You can use `-p table` to parse the table, filters to include/exclude rows, and `-o table` to output a new table sized only by the selected rows. The command above printed the table as wide as Kubernetes generated it; this one prints a narrower one:
167
+
168
+ ```bash
169
+ curl -s https://cdn.codetunnel.net/lt/table.log | lt -p table -o table -i kube-system
170
+ ```
171
+
172
+ If you are looking for all pods containing the word `gateway`, you might end up excluding the headers row:
173
+
174
+ ```bash
175
+ curl -s https://cdn.codetunnel.net/lt/table.log | lt gateway
176
+ ```
177
+
178
+ You can always print the headers row with `-H`:
179
+
180
+ ```bash
181
+ curl -s https://cdn.codetunnel.net/lt/table.log | lt -H gateway
182
+ ```
183
+
184
+ ### The Kubernetes `-k` option and more examples
185
+
186
+ On kubectl commands, you most likely want to parse the table (`-p table`) and reformat it as a new table (`-o table`). You can use `-k` as an alias for `-p table -o table`:
187
+
188
+ ```bash
189
+ curl -s https://cdn.codetunnel.net/lt/table.log | lt -k payment
190
+ ```
191
+
192
+ When parsing the table with `-p table`, you can filter with custom logic using field filters (`-F`). For example, get pods with at least one restart:
193
+ ```bash
194
+ curl -s https://cdn.codetunnel.net/lt/table.log | lt -k -F 'gt RESTARTS 0'
195
+ ```
196
+
197
+ Show pods that are not fully ready (`READY` looks like `1/2`):
11
198
 
12
- ``sudo npm i -g logtunnel``
199
+ ```bash
200
+ curl -s https://cdn.codetunnel.net/lt/table.log | lt -p '(?<up>\d+)/(?<total>\d+)' -F 'lt up total' -H -o original
201
+ ```
13
202
 
14
- ## Usage
203
+ Or, chaining multiple `lt` invocations to re-format the table:
15
204
 
16
- Find logs that contain "alice":
17
- ``curl -s https://cdn.codetunnel.net/lt/text.log | lt alice``
205
+ ```bash
206
+ curl -s https://cdn.codetunnel.net/lt/table.log | lt -p '(?<up>\d+)/(?<total>\d+)' -F 'lt up total' -H -o original | lt -k
207
+ ```
18
208
 
19
- Find logs that contain "alice" and "purchase":
20
- ``curl -s https://cdn.codetunnel.net/lt/text.log | lt -f alice -f purchase``
209
+ Or, doing the split with Bigodon helpers in a single field filter:
21
210
 
22
- Find logs that contain "alice" and ignore the ones that contain "info":
23
- ``curl -s https://cdn.codetunnel.net/lt/text.log | lt -f alice -i info``
211
+ ```bash
212
+ curl -s https://cdn.codetunnel.net/lt/table.log | lt -k -F 'lt (itemAt (split READY "/") 0) (itemAt (split READY "/") 1)'
213
+ ```
24
214
 
25
- Parse logs as JSON and output them with that template:
26
- ``curl -s https://cdn.codetunnel.net/lt/json.log | lt -p json -o '[{{lvl}}] {{log}}'``
215
+ Convert `kubectl` table output to logfmt for easier downstream filtering:
27
216
 
28
- Parse logs as JSON, apply template and find the ones containing "alice":
29
- ``curl -s https://cdn.codetunnel.net/lt/json.log | lt -p json -o '[{{lvl}}] {{log}}' -f alice``
217
+ ```bash
218
+ curl -s https://cdn.codetunnel.net/lt/table.log | lt -p table -o logfmt
219
+ ```
30
220
 
31
- Parse logs as JSON, apply template and show the ones with "delay > 200":
32
- ``curl -s https://cdn.codetunnel.net/lt/json.log | lt -p json -o '[{{lvl}} in {{delay}}ms] {{log}}' -F 'delay > 200'``
221
+ ### Custom regex parsing (`-p '(?<name>...)'`)
33
222
 
34
- Parse logs as JSON, apply template and show the ones with "log" containing "Alice":
35
- ``curl -s https://cdn.codetunnel.net/lt/json.log | lt -p json -o '[{{lvl}}] {{log}}' -F 'log.toLowerCase().includes("alice")'``
223
+ Use a regex with named groups to “extract fields” from unstructured text:
36
224
 
37
- Parse logs as logfmt, show the ones with "delay > 200" and show their original line (as if no parsing happened):
38
- ``curl -s https://cdn.codetunnel.net/lt/logfmt.log | lt -p logfmt -o original -F 'delay > 200'``
225
+ ```bash
226
+ curl -s https://cdn.codetunnel.net/lt/text.log | lt -p '(?<ts>\S+) \[(?<level>\w+)\] (?<message>.*)' -o logfmt
227
+ ```
39
228
 
40
- Parse logs as JSON and output them as a table:
41
- ``curl -s https://cdn.codetunnel.net/lt/json.log | lt -p json -o table``
229
+ Then field-filter on extracted fields:
42
230
 
43
- Parse logs with regex, and output in logfmt:
44
- ``curl -s https://cdn.codetunnel.net/lt/text.log | lt -p '\[(?<lvl>\S*) in\s*(?<delay>\d*)ms\] (?<log>.*)' -o logfmt``
231
+ ```bash
232
+ curl -s https://cdn.codetunnel.net/lt/text.log | lt -p '(?<delay_ms>\d+)ms' -F 'gt delay_ms 200' -o original
233
+ ```
45
234
 
46
- Parse logs with regex, and show the ones with "delay > 200":
47
- ``curl -s https://cdn.codetunnel.net/lt/text.log | lt -p '(?<delay>\d+)ms' -o original -F 'delay > 200'``
235
+ ## Reference
48
236
 
49
- Parse table and show rows containing "cilium":
50
- ``curl -s https://cdn.codetunnel.net/lt/table.log | lt -p table -o original -f cilium``
237
+ ### Options
51
238
 
52
- Parse table, show rows containing "cilium" and the first headers row:
53
- ``curl -s https://cdn.codetunnel.net/lt/table.log | lt -p table -o original -f cilium -H``
239
+ - `lt <filter>`: shorthand for a single text filter
240
+ - `-f, --filter <regex>`: keep lines that match this regex (repeatable)
241
+ - `-i, --ignore <regex>`: drop lines that match this regex (repeatable)
242
+ - `-p, --parser <json|logfmt|table|regex>`: parse each line into an object
243
+ - `-F, --field <bigodon expression>`: filter parsed objects by expression (repeatable)
244
+ - `-o, --output <format|template>`: output `json|logfmt|inspect|original|table` or a Bigodon template
245
+ - `-H, --headers`: always output the first input line (table headers)
246
+ - `-k, --kubectl`: shortcut for `-p table -o table`
247
+ - `-h, --help`: show help
248
+ - `-v, --version`: show version
54
249
 
55
- Parse table, show rows with RESTARTS > 0:
56
- ``curl -s https://cdn.codetunnel.net/lt/table.log | lt -p table -o original -F 'RESTARTS > 0' -H``
250
+ ### Formats at a glance
57
251
 
58
- Show rows that are not ready:
59
- ``curl -s https://cdn.codetunnel.net/lt/table.log | lt -p '(?<up>\d)/(?<total>\d)' -o original -F 'up < total' -H``
252
+ - `-o original`: print the original input line (even after parsing/filtering)
253
+ - `-o inspect` (the default): print objects with colors for humans
254
+ - `-o json`: emit JSON objects
255
+ - `-o logfmt`: emit `key=value` lines
256
+ - `-o table`: render a table from parsed objects (buffers until EOF)
257
+ - `-o '<bigodon template>'`: render a custom line from parsed objects
60
258
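+
+ For example, rendering parsed JSON as a table (output is buffered until the input ends):
+
+ ```bash
+ curl -s https://cdn.codetunnel.net/lt/json.log | lt -p json -o table
+ ```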
 
61
- For more information ``lt --help``
259
+ For the built-in help (includes examples): `lt --help`.
package/package.json CHANGED
@@ -1,6 +1,6 @@
1
1
  {
2
2
  "name": "logtunnel",
3
- "version": "1.0.0",
3
+ "version": "1.1.0",
4
4
  "description": "CLI tool that allows you to format, filter and search your log output",
5
5
  "main": "src/main.js",
6
6
  "bin": {
package/src/definition.js CHANGED
@@ -34,7 +34,7 @@ const definition = {
34
34
  type: 'string',
35
35
  },
36
36
  F: {
37
- description: 'Show only logs that match the field filter. You can use JavaScript.',
37
+ description: 'Show only logs that match the field filter (Bigodon expression). Requires a parser like -p json.',
38
38
  alias: 'field',
39
39
  type: 'string',
40
40
  multiple: true,
@@ -44,41 +44,42 @@ const definition = {
44
44
  alias: 'headers',
45
45
  type: 'boolean',
46
46
  },
47
+ k: {
48
+ description: 'Shortcut for kubectl tables (equivalent to -p table -o table).',
49
+ alias: 'kubectl',
50
+ type: 'boolean',
51
+ },
47
52
  };
48
53
 
49
54
  const $ = '$ '.gray;
50
55
  const examples = [
51
56
  '\n\nExamples:\n',
52
- 'Find logs that contain "alice":'.dim,
53
- $ + 'curl -s https://cdn.codetunnel.net/lt/text.log | lt alice',
54
- 'Find logs that contain "alice" and "purchase":'.dim,
55
- $ + 'curl -s https://cdn.codetunnel.net/lt/text.log | lt -f alice -f purchase',
56
- 'Find logs that contain "alice" and ignore the ones that contain "info"'.dim,
57
- $ + 'curl -s https://cdn.codetunnel.net/lt/text.log | lt -f alice -i info',
58
- 'Parse logs as JSON and output them with that template'.dim,
59
- $ + 'curl -s https://cdn.codetunnel.net/lt/json.log | lt -p json -o \'[{{lvl}}] {{log}}\'',
60
- 'Parse logs as JSON, apply template and find the ones containing "alice"'.dim,
61
- $ + 'curl -s https://cdn.codetunnel.net/lt/json.log | lt -p json -o \'[{{lvl}}] {{log}}\' -f alice',
62
- 'Parse logs as JSON, apply template and show the ones with "delay > 200"'.dim,
63
- $ + 'curl -s https://cdn.codetunnel.net/lt/json.log | lt -p json -o \'[{{lvl}} in {{delay}}ms] {{log}}\' -F \'delay > 200\'',
64
- 'Parse logs as JSON, apply template and show the ones with "log" containing "Alice"'.dim,
65
- $ + 'curl -s https://cdn.codetunnel.net/lt/json.log | lt -p json -o \'[{{lvl}}] {{log}}\' -F \'log.toLowerCase().includes("alice")\'',
66
- 'Parse logs as logfmt, show the ones with "delay > 200" and show their original line (as if no parsing happened)'.dim,
67
- $ + 'curl -s https://cdn.codetunnel.net/lt/logfmt.log | lt -p logfmt -o original -F \'delay > 200\'',
68
- 'Parse logs as JSON and output them as a table'.dim,
69
- $ + 'curl -s https://cdn.codetunnel.net/lt/json.log | lt -p json -o table',
70
- 'Parse logs with regex, and output in logfmt'.dim,
71
- $ + 'curl -s https://cdn.codetunnel.net/lt/text.log | lt -p \'\\[(?<lvl>\\S*) in\\s*(?<delay>\\d*)ms\\] (?<log>.*)\' -o logfmt',
72
- 'Parse logs with regex, and show the ones with "delay > 200"'.dim,
73
- $ + 'curl -s https://cdn.codetunnel.net/lt/text.log | lt -p \'(?<delay>\\d+)ms\' -o original -F \'delay > 200\'',
74
- 'Parse table and show rows containing "cilium"'.dim,
75
- $ + 'curl -s https://cdn.codetunnel.net/lt/table.log | lt -p table -o original -f cilium',
76
- 'Parse table, show rows containing "cilium" and the first headers row'.dim,
77
- $ + 'curl -s https://cdn.codetunnel.net/lt/table.log | lt -p table -o original -f cilium -H',
78
- 'Parse table, show rows with RESTARTS > 0'.dim,
79
- $ + 'curl -s https://cdn.codetunnel.net/lt/table.log | lt -p table -o original -F \'RESTARTS > 0\' -H',
80
- 'Show rows that are not ready'.dim,
81
- $ + 'curl -s https://cdn.codetunnel.net/lt/table.log | lt -p \'(?<up>\\d)/(?<total>\\d)\' -o original -F \'up < total\' -H',
57
+ 'Find lines containing "error" (shorthand for a single -f filter):'.dim,
58
+ $ + 'curl -s https://cdn.codetunnel.net/lt/text.log | lt error',
59
+ 'Find lines containing "checkout" and "alice" (AND):'.dim,
60
+ $ + 'curl -s https://cdn.codetunnel.net/lt/text.log | lt -f checkout -f alice',
61
+ 'Ignore noise (drop health/metrics lines):'.dim,
62
+ $ + 'curl -s https://cdn.codetunnel.net/lt/text.log | lt -i healthz -i metrics',
63
+ 'Find errors while ignoring known spam:'.dim,
64
+ $ + 'curl -s https://cdn.codetunnel.net/lt/text.log | lt -f error -i NullPointer -i "retrying in"',
65
+ 'Parse JSON (default output is inspect):'.dim,
66
+ $ + 'curl -s https://cdn.codetunnel.net/lt/json.log | lt -p json',
67
+ 'Parse JSON and format a clean line:'.dim,
68
+ $ + 'curl -s https://cdn.codetunnel.net/lt/json.log | lt -p json -o \'[{{ts}} {{upper level}}] {{message}}\'',
69
+ 'Field filter: only slow requests (delay_ms > 200):'.dim,
70
+ $ + 'curl -s https://cdn.codetunnel.net/lt/json.log | lt -p json -F \'gt delay_ms 200\' -o original',
71
+ 'Field filter: combine conditions:'.dim,
72
+ $ + 'curl -s https://cdn.codetunnel.net/lt/json.log | lt -p json -F \'and (eq level "error") (gt http.status 499)\' -o \'[{{ts}} {{upper level}}] {{message}}\'',
73
+ 'Parse logfmt and convert to JSON:'.dim,
74
+ $ + 'curl -s https://cdn.codetunnel.net/lt/logfmt.log | lt -p logfmt -o json',
75
+ 'Kubernetes tables: find rows containing "gateway", keep headers:'.dim,
76
+ $ + 'curl -s https://cdn.codetunnel.net/lt/table.log | lt -H gateway',
77
+ 'Kubernetes tables: -k is shorthand for -p table -o table:'.dim,
78
+ $ + 'curl -s https://cdn.codetunnel.net/lt/table.log | lt -k payment',
79
+ 'Kubernetes tables: show pods that are not fully ready:'.dim,
80
+ $ + 'curl -s https://cdn.codetunnel.net/lt/table.log | lt -k -F \'lt (itemAt (split READY "/") 0) (itemAt (split READY "/") 1)\'',
81
+ 'Custom regex parsing (extract fields) and output as logfmt:'.dim,
82
+ $ + 'curl -s https://cdn.codetunnel.net/lt/text.log | lt -p \'(?<ts>\\S+) \\[(?<level>\\w+)\\] (?<message>.*)\' -o logfmt',
82
83
  ];
83
84
 
84
85
  const usage = Bossy.usage(definition, 'lt [options]\n Or: lt <filter>') + examples.join('\n');
package/src/pipeline.js CHANGED
@@ -11,7 +11,7 @@ const debug = require('debug')('logtunnel:pipeline');
11
11
  class LogPipeline {
12
12
  constructor(args, stdout) {
13
13
  this.firstLine = null;
14
- this.args = args;
14
+ this.args = this._normalizeArgs(args);
15
15
  this.stdout = stdout;
16
16
  this.outputTransformer = outputFactory(this.args.output);
17
17
  this.isOutputBuffered = Boolean(this.outputTransformer.flush);
@@ -22,8 +22,8 @@ class LogPipeline {
22
22
  onLogLine(line) {
23
23
  try {
24
24
  debug('got line: ' + line)
25
- this._logLine(line);
26
- this._updateFirstLine(line);
25
+ const isFirstLine = this._updateFirstLine(line);
26
+ this._logLine(line, isFirstLine);
27
27
  } catch (e) {
28
28
  // Covering this would kill the process
29
29
  /* $lab:coverage:off$ */
@@ -33,11 +33,15 @@ class LogPipeline {
33
33
  }
34
34
  }
35
35
 
36
- async _logLine(line) {
36
+ async _logLine(line, isFirstLine) {
37
37
  let output = line;
38
38
 
39
39
  for (let transformer of this.transformers) {
40
- const result = await transformer.run(output, line, this);
40
+ const result = await transformer.run(
41
+ output,
42
+ line,
43
+ isFirstLine ? null : this.firstLine,
44
+ );
41
45
 
42
46
  // Transformer accepted the line
43
47
  if(result === true) {
@@ -65,13 +69,15 @@ class LogPipeline {
65
69
 
66
70
  _updateFirstLine(line) {
67
71
  if (this.firstLine) {
68
- return;
72
+ return false;
69
73
  }
70
74
 
71
75
  this.firstLine = line;
72
76
  if(this.args.headers && !this.isOutputBuffered) {
73
77
  this.stdout.write(this.firstLine + '\n');
74
78
  }
79
+
80
+ return true;
75
81
  }
76
82
 
77
83
  onEnd() {
@@ -97,6 +103,17 @@ class LogPipeline {
97
103
  this.outputTransformer,
98
104
  ];
99
105
  }
106
+
107
+ _normalizeArgs(args) {
108
+ const normalized = { ...args };
109
+
110
+ if (normalized.kubectl) {
111
+ normalized.parser = 'table';
112
+ normalized.output = 'table';
113
+ }
114
+
115
+ return normalized;
116
+ }
100
117
  }
101
118
 
102
119
  module.exports.LogPipeline = LogPipeline;
@@ -3,13 +3,14 @@ class TableParse {
3
3
  this.headers = null;
4
4
  }
5
5
 
6
- run(line, _original, pipeline) {
7
- if (!pipeline.firstLine) {
6
+ run(line, _original, firstLine) {
7
+ if (!firstLine) {
8
8
  // Ignore first line, it's the headers
9
+ this.headers = this._splitColumns(line);
9
10
  return false;
10
11
  }
11
12
  if (!this.headers) {
12
- this.headers = this._splitColumns(pipeline.firstLine);
13
+ this.headers = this._splitColumns(firstLine);
13
14
  }
14
15
 
15
16
  const columns = this._splitColumns(line);
@@ -51,22 +51,20 @@ describe('parsers', () => {
51
51
  ];
52
52
 
53
53
  it('should ignore headers', () => {
54
- const pipeline = { firstLine: null };
55
54
  const parser = parseFactory('table');
56
55
  const line = tableRows[0];
57
56
 
58
- expect(parser.run(line, line, pipeline)).to.be.false();
57
+ expect(parser.run(line, line, null)).to.be.false();
59
58
  });
60
59
 
61
60
  it('should parse subsequent rows', () => {
62
- const pipeline = { firstLine: tableRows[0] };
63
61
  const parser = parseFactory('table');
64
62
  let line;
65
63
 
66
64
  line = tableRows[1];
67
- expect(parser.run(line, line, pipeline)).to.equal({ NAME: 'foo', TYPE: 'bar' });
65
+ expect(parser.run(line, line, tableRows[0])).to.equal({ NAME: 'foo', TYPE: 'bar' });
68
66
  line = tableRows[2];
69
- expect(parser.run(line, line, pipeline)).to.equal({ NAME: 'baz', TYPE: 'qux' });
67
+ expect(parser.run(line, line, tableRows[0])).to.equal({ NAME: 'baz', TYPE: 'qux' });
70
68
  });
71
69
  });
72
70
  });
@@ -1,6 +1,6 @@
1
1
  const Lab = require('@hapi/lab');
2
2
  const Code = require('@hapi/code');
3
- const { runPipeline, _, f, i, F, p, o, H } = require('./utils');
3
+ const { runPipeline, _, f, i, F, p, o, H, k } = require('./utils');
4
4
 
5
5
  const { describe, it } = exports.lab = Lab.script();
6
6
  const { expect } = Code;
@@ -186,6 +186,25 @@ describe('pipeline', () => {
186
186
  expect(actual).to.equal(expected);
187
187
  });
188
188
 
189
+ it('should not duplicate headers when parsing and reformatting tables with ignore filters', async () => {
190
+ const args = [
191
+ p('table'),
192
+ o('table'),
193
+ i('kube'),
194
+ ];
195
+ const actual = await runPipeline([
196
+ 'NAMESPACE NAME',
197
+ 'kube-system coredns',
198
+ 'default api',
199
+ ], args);
200
+ const expected = [
201
+ 'NAMESPACE NAME',
202
+ 'default api',
203
+ ];
204
+
205
+ expect(actual).to.equal(expected);
206
+ });
207
+
189
208
  it('should not specify null prototype of regex parser', async () => {
190
209
  const args = [
191
210
  p('(?<num>\\d*)'),
@@ -196,4 +215,22 @@ describe('pipeline', () => {
196
215
 
197
216
  expect(actual).not.to.include('null prototype');
198
217
  });
218
+
219
+ it('should parse and format kubectl tables with -k', async () => {
220
+ const args = [
221
+ k(),
222
+ ];
223
+ const actual = await runPipeline([
224
+ 'NAME AGE',
225
+ 'foo 1',
226
+ 'bar 22',
227
+ ], args);
228
+ const expected = [
229
+ 'NAME AGE',
230
+ 'foo 1',
231
+ 'bar 22',
232
+ ];
233
+
234
+ expect(actual).to.equal(expected);
235
+ });
199
236
  });
package/test/utils.js CHANGED
@@ -67,5 +67,6 @@ const F = str => ({ field: [str] });
67
67
  const p = str => ({ parser: str });
68
68
  const o = str => ({ output: str });
69
69
  const H = () => ({ headers: true });
70
+ const k = () => ({ kubectl: true });
70
71
 
71
- module.exports = { pod, slowPod, runPipeline, _, f, i, F, p, o, H };
72
+ module.exports = { pod, slowPod, runPipeline, _, f, i, F, p, o, H, k };