kafka-console 2.0.76 → 2.1.0

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
package/README.md CHANGED
@@ -1,181 +1,421 @@
- # Kafka CLI tool
+ # Kafka Console CLI

- Command line tool to sufficiently and easy work with Kafka
+ A powerful and easy-to-use command-line interface for Apache Kafka operations.

  [![NPM version][npm-image]][npm-url]
  [![Downloads][downloads-image]][npm-url]

  ## Table of Contents

- - [Features](#features)
- - [Installing](#installing)
- - [Examples](#examples)
- - [Consumer](#consumer)
- - [Producer](#producer)
- - [Formatters](#formatters)
- - [Environment](#environment)
- - [License](#license)
+ - [Features](#features)
+ - [Installation](#installation)
+ - [Quick Start](#quick-start)
+ - [Commands](#commands)
+ - [Consuming Messages](#consuming-messages)
+ - [Producing Messages](#producing-messages)
+ - [Topic Management](#topic-management)
+ - [Cluster Information](#cluster-information)
+ - [Authentication](#authentication)
+ - [Message Formats](#message-formats)
+ - [Environment Variables](#environment-variables)
+ - [Common Use Cases](#common-use-cases)
+ - [Troubleshooting](#troubleshooting)
+ - [License](#license)

  ## Features

- - Producer
- - Consumer groups with seek and timeout
- - Built-in message encoders/decoders with types: json, js, raw
- - Custom message encoders/decoders as a js module
- - Message headers
- - GZIP compression
- - Plain, SSL and SASL_SSL implementations
- - Admin client
- - TypeScript support
-
- ## Installing
-
- ```sh
+ - ✅ **Consumer & Producer** - Full support for consuming and producing messages
+ - **Multiple Authentication Methods** - Plain, SCRAM-SHA-256/512, AWS IAM, OAuth Bearer
+ - **Flexible Message Formats** - JSON, JavaScript, raw text, or custom formatters
+ - **Consumer Groups** - Full consumer group support with offset management
+ - **Time-based Consumption** - Read messages from specific timestamps
+ - **SSL/TLS Support** - Secure connections to Kafka clusters
+ - ✅ **Topic Management** - Create, delete, and inspect topics
+ - **Headers Support** - Read and write message headers
+ - **GZIP Compression** - Automatic compression support
+ - ✅ **TypeScript** - Full TypeScript support
+
+ ## Installation
+
+ ### Global Installation (Recommended)
+ ```bash
  npm install -g kafka-console
  ```

- ## Examples
+ ### Local Installation
+ ```bash
+ npm install kafka-console
+ ```

- ### Common options
+ ### Using without Installation
+ ```bash
+ npx kafka-console [command]
  ```
- -b, --brokers <brokers> bootstrap server host (default: "localhost:9092")
- -l, --log-level <logLevel> log level
- -t, --timeout <timeout> set a timeout of operation (default: "0")
- -p, --pretty pretty print (default: false)
- --ssl enable ssl (default: false)
- --mechanism <mechanism> sasl mechanism
- --username <username> sasl username
- --password <password> sasl password
- --auth-id <authId> sasl aws authorization identity
- --access-key-id <accessKeyId> sasl aws access key id
- --secret-access-key <secretAccessKey> sasl aws secret access key
- --session-token <seccionToken> sasl aws session token
- --oauth-bearer <oauthBearer> sasl oauth bearer token
- -V, --version output the version number
- -h, --help display help for command
+
+ ## Quick Start
+
+ ### 1. List all topics
+ ```bash
+ kafka-console list --brokers localhost:9092
  ```

- ### Commands
+ ### 2. Consume messages from a topic
+ ```bash
+ kafka-console consume my-topic --brokers localhost:9092
  ```
- consume [options] <topic> Consume kafka topic events
- produce [options] <topic> Produce kafka topic events
- metadata Displays kafka server metadata
- list|ls [options] Lists kafka topics
- config [options] Describes config for specific resource
- topic:create <topic> Creates kafka topic
- topic:delete <topic> Deletes kafka topic
- topic:offsets <topic> [timestamp] Shows kafka topic offsets
- help [command] display help for command
+
+ ### 3. Produce a message to a topic
+ ```bash
+ echo '{"message": "Hello Kafka!"}' | kafka-console produce my-topic --brokers localhost:9092
  ```

- ### Consumer
+ ## Commands
+
+ ### Consuming Messages

- `npx kafka-console consume [options] <topic>`
+ ```bash
+ kafka-console consume <topic> [options]
+ ```

  #### Options
+ | Option | Description | Default |
+ |--------|-------------|---------|
+ | `-g, --group <group>` | Consumer group name | `kafka-console-consumer-{timestamp}` |
+ | `-f, --from <from>` | Start position (timestamp/ISO date/0 for beginning) | latest |
+ | `-c, --count <count>` | Number of messages to read | unlimited |
+ | `-s, --skip <skip>` | Number of messages to skip | 0 |
+ | `-o, --output <file>` | Write output to file | stdout |
+ | `-d, --data-format <format>` | Message format (json/js/raw/custom) | json |
+ | `-p, --pretty` | Pretty print JSON output | false |
+
+ #### Examples
+
+ **Consume from beginning and pretty print:**
+ ```bash
+ kafka-console consume my-topic --from 0 --pretty
+ ```
+
+ **Consume last 10 messages:**
+ ```bash
+ kafka-console consume my-topic --count 10
  ```
- -g, --group <group> consumer group name (default: "kafka-console-consumer-TIMESTAMP")
- -d, --data-format <data-format> messages data-format: json, js, raw (default: "json")
- -o, --output <filename> write output to specified filename
- -f, --from <from> read messages from the specific timestamp in milliseconds or ISO 8601 format. Set 0 to read from the beginning
- -c, --count <count> a number of messages to read (default: null)
- -s, --skip <skip> a number of messages to skip (default: 0)
- -h, --help display help for command
+
+ **Consume from specific timestamp:**
+ ```bash
+ kafka-console consume my-topic --from "2024-01-01T00:00:00Z"
  ```
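Editor's note: per the option help, `--from` accepts epoch milliseconds, an ISO 8601 date, or `0` for the beginning. A minimal sketch of that resolution logic, using a hypothetical `resolveFromTimestamp` helper (not the CLI's actual implementation):

```javascript
// Hypothetical helper mirroring the documented `--from` behavior:
// epoch milliseconds, an ISO 8601 string, or 0 for "beginning".
function resolveFromTimestamp(from) {
  if (from === undefined || from === null) return null; // latest (no seek)
  if (String(from) === '0') return 0;                   // from the beginning
  const asNumber = Number(from);
  if (Number.isFinite(asNumber)) return asNumber;       // epoch milliseconds
  const parsed = Date.parse(from);                      // ISO 8601 string
  if (Number.isNaN(parsed)) throw new Error(`Invalid --from value: ${from}`);
  return parsed;
}
```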

- General usage with authentication
- ```sh
- npx kafka-console --brokers $KAFKA_BROKERS --ssl --mechanism plain --username $KAFKA_USERNAME --password $KAFKA_PASSWORD consume $KAFKA_TOPIC --group $KAFKA_TOPIC_GROUP
+ **Consume with specific consumer group:**
+ ```bash
+ kafka-console consume my-topic --group my-consumer-group
  ```

- Stdout from timestamp `jq` example
- ```sh
- npx kafka-console consume $KAFKA_TOPIC --from 0 | jq .value
+ **Save output to file:**
+ ```bash
+ kafka-console consume my-topic --output messages.json
  ```

- Custom data formatter example
- ```sh
- npx kafka-console consume $KAFKA_TOPIC --data-format ./formatter/avro.js | jq
+ **Extract specific fields with jq:**
+ ```bash
+ kafka-console consume my-topic | jq '.value.userId'
  ```

- ### Producer
+ ### Producing Messages

- ```sh
- npx kafka-console produce [options] <topic>
+ ```bash
+ kafka-console produce <topic> [options]
  ```

  #### Options
+ | Option | Description | Default |
+ |--------|-------------|---------|
+ | `-i, --input <file>` | Read input from file | stdin |
+ | `-d, --data-format <format>` | Message format (json/js/raw/custom) | json |
+ | `-h, --header <header>` | Add message header (format: key:value) | none |
+ | `-w, --wait <ms>` | Wait time between messages | 0 |
+
+ #### Examples
+
+ **Produce single message:**
+ ```bash
+ echo '{"user": "john", "action": "login"}' | kafka-console produce my-topic
+ ```
+
+ **Produce from file:**
+ ```bash
+ kafka-console produce my-topic --input messages.json
+ ```
+
+ **Produce with headers:**
+ ```bash
+ echo '{"data": "test"}' | kafka-console produce my-topic --header "source:api" --header "version:1.0"
  ```
- -d, --data-format <data-format> messages data-format: json, js, raw (default: "json")
- -i, --input <filename> input filename
- -w, --wait <wait> wait the time in ms after sending a message (default: 0)
- -h, --header <header> set a static header (default: [])
- --help display help for command
+
+ **Produce multiple messages from JSON array:**
+ ```bash
+ cat users.json | jq -c '.[]' | kafka-console produce my-topic
  ```
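Editor's note: the `--header` flag documented above takes repeated `key:value` pairs. A sketch of how such flags could be folded into a Kafka headers object, with a hypothetical `parseHeaders` helper (the CLI's real parsing may differ):

```javascript
// Hypothetical helper: fold repeated `--header "key:value"` flags into an
// object; splits on the first colon so values may themselves contain colons.
function parseHeaders(flags) {
  return flags.reduce((headers, flag) => {
    const idx = flag.indexOf(':');
    if (idx === -1) throw new Error(`Invalid header, expected key:value: ${flag}`);
    headers[flag.slice(0, idx)] = flag.slice(idx + 1);
    return headers;
  }, {});
}
```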

- General usage
- ```sh
- npx kafka-console produce $KAFKA_TOPIC -b $KAFKA_BROKERS --ssl --mechanism plain --username $KAFKA_USERNAME --password $KAFKA_PASSWORD
+ **Produce with key (for partitioning):**
+ ```bash
+ echo '{"key": "user123", "value": {"name": "John"}}' | kafka-console produce my-topic
  ```

- Produce a json data from stdin with custom formatter
- ```sh
- npx kafka-console payload.txt|kcli produce $KAFKA_TOPIC --data-format ./formatter/avro.js
+ ### Topic Management
+
+ #### Create Topic
+ ```bash
+ kafka-console topic:create my-new-topic
  ```

- Produce a json data from stdin
- ```sh
- node payloadGenerator.js|npx kafka-console produce $KAFKA_TOPIC
+ #### Delete Topic
+ ```bash
+ kafka-console topic:delete old-topic
  ```

- Produce a json array data from stdin
- ```sh
- cat payload.json|jq -r -c .[]|npx kafka-console produce $KAFKA_TOPIC
+ #### Show Topic Offsets
+ ```bash
+ kafka-console topic:offsets my-topic
  ```

- Payload single message input interface
- ```typescript
- interface Payload {
- key?: string; // kafka
- value: any;
- headers?: { [key: string]: value };
- }
+ #### Show Topic Offsets for Specific Timestamp
+ ```bash
+ kafka-console topic:offsets my-topic "2024-01-01T00:00:00Z"
  ```
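Editor's note: the old README (removed above) documented the single-message input shape as `{ key?: string; value: any; headers?: ... }`, which is how the "produce with key" example works. A sketch normalizing one parsed stdin line into that shape, using a hypothetical `toPayload` helper:

```javascript
// Hypothetical helper: normalize one parsed stdin line into the documented
// Payload shape { key?, value, headers? }. An object with a `value` field is
// treated as a full payload; any other JSON becomes the message value.
function toPayload(input) {
  if (input !== null && typeof input === 'object' && 'value' in input) {
    const { key, value, headers } = input;
    return { key, value, headers };
  }
  return { value: input };
}
```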

- ### Formatters
+ ### Cluster Information

- ```typescript
- export interface Encoder<T> {
- (value: T): Promise<string | Buffer> | string | Buffer;
- }
+ #### List All Topics
+ ```bash
+ kafka-console list
+ ```

- export interface Decoder<T> {
- (value: Buffer): Promise<T> | T;
- }
+ #### List Including Internal Topics
+ ```bash
+ kafka-console list --all
+ ```

- export interface Formatter<T> {
- encode: Encoder<T>;
- decode: Decoder<T>;
- }
+ #### Show Cluster Metadata
+ ```bash
+ kafka-console metadata
  ```

- ## Supported Environment Variables
+ #### Show Topic Configuration
+ ```bash
+ kafka-console config --resource topic --resourceName my-topic
+ ```

- - KAFKA_BROKERS
- - KAFKA_TIMEOUT
- - KAFKA_MECHANISM
- - KAFKA_USERNAME
- - KAFKA_PASSWORD
- - KAFKA_AUTH_ID
- - KAFKA_ACCESS_KEY_ID
- - KAFKA_SECRET_ACCESS_KEY
- - KAFKA_SESSION_TOKEN
- - KAFKA_OAUTH_BEARER
+ ## Authentication
+
+ ### SSL/TLS Connection
+ ```bash
+ kafka-console consume my-topic \
+ --brokers broker1:9093,broker2:9093 \
+ --ssl
+ ```
+
+ ### SASL/PLAIN
+ ```bash
+ kafka-console consume my-topic \
+ --brokers broker:9093 \
+ --ssl \
+ --mechanism plain \
+ --username myuser \
+ --password mypassword
+ ```
+
+ ### SASL/SCRAM-SHA-256
+ ```bash
+ kafka-console consume my-topic \
+ --brokers broker:9093 \
+ --ssl \
+ --mechanism scram-sha-256 \
+ --username myuser \
+ --password mypassword
+ ```
+
+ ### AWS IAM
+ ```bash
+ kafka-console consume my-topic \
+ --brokers broker:9093 \
+ --ssl \
+ --mechanism aws \
+ --access-key-id AKIAXXXXXXXX \
+ --secret-access-key XXXXXXXXXX \
+ --session-token XXXXXXXXXX
+ ```
+
+ ### OAuth Bearer
+ ```bash
+ kafka-console consume my-topic \
+ --brokers broker:9093 \
+ --ssl \
+ --mechanism oauthbearer \
+ --oauth-bearer "eyJhbGciOiJIUzI1NiIs..."
+ ```
+
+ ## Message Formats
+
+ ### JSON Format (Default)
+ Messages are parsed as JSON:
+ ```bash
+ echo '{"name": "Alice", "age": 30}' | kafka-console produce my-topic
+ ```
+
+ ### Raw Format
+ Messages are sent as plain text:
+ ```bash
+ echo "Plain text message" | kafka-console produce my-topic --data-format raw
+ ```
+
+ ### JavaScript Format
+ Messages can contain JavaScript exports:
+ ```bash
+ echo 'module.exports = { timestamp: Date.now() }' | kafka-console produce my-topic --data-format js
+ ```
+
+ ### Custom Formatter
+ Create a custom formatter module:
+
+ ```javascript
+ // formatter/custom.js
+ module.exports = {
+ encode: (value) => Buffer.from(JSON.stringify(value)),
+ decode: (buffer) => JSON.parse(buffer.toString())
+ };
+ ```
+
+ Use the custom formatter:
+ ```bash
+ kafka-console consume my-topic --data-format ./formatter/custom.js
+ ```
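Editor's note: a formatter is just an `encode`/`decode` pair (the old README typed this as `Formatter<T>`), so it is easy to sanity-check in isolation. A quick round trip of the same encode/decode pair shown in `formatter/custom.js`, inlined here rather than loaded as a module:

```javascript
// Same encode/decode pair as the formatter/custom.js example above,
// inlined so the round trip can be checked without Kafka.
const formatter = {
  encode: (value) => Buffer.from(JSON.stringify(value)),
  decode: (buffer) => JSON.parse(buffer.toString()),
};

const original = { name: 'Alice', age: 30 };
const decoded = formatter.decode(formatter.encode(original));
// decoded is deep-equal to original
```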

+ ## Environment Variables
+
+ Set environment variables to avoid repeating common options:
+
+ ```bash
+ export KAFKA_BROKERS=broker1:9092,broker2:9092
+ export KAFKA_USERNAME=myuser
+ export KAFKA_PASSWORD=mypassword
+ export KAFKA_MECHANISM=plain
+ export KAFKA_TIMEOUT=30000
+ ```
+
+ All supported environment variables:
+ - `KAFKA_BROKERS` - Comma-separated list of brokers
+ - `KAFKA_TIMEOUT` - Operation timeout in milliseconds
+ - `KAFKA_MECHANISM` - SASL mechanism
+ - `KAFKA_USERNAME` - SASL username
+ - `KAFKA_PASSWORD` - SASL password
+ - `KAFKA_AUTH_ID` - AWS authorization identity
+ - `KAFKA_ACCESS_KEY_ID` - AWS access key ID
+ - `KAFKA_SECRET_ACCESS_KEY` - AWS secret access key
+ - `KAFKA_SESSION_TOKEN` - AWS session token
+ - `KAFKA_OAUTH_BEARER` - OAuth bearer token
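Editor's note: the README does not spell out how environment variables interact with flags; the usual convention (assumed here, not confirmed by the source) is that an explicit CLI flag wins over the environment variable, which wins over the built-in default. A sketch with a hypothetical `resolveOption` helper:

```javascript
// Hypothetical precedence helper (assumption, not the CLI's actual code):
// CLI flag > environment variable > built-in default.
function resolveOption(flagValue, envName, fallback, env = process.env) {
  if (flagValue !== undefined) return flagValue;
  if (env[envName] !== undefined) return env[envName];
  return fallback;
}
```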
+
+ ## Common Use Cases
+
+ ### Monitor Topic in Real-time
+ ```bash
+ kafka-console consume logs --group monitor-group --pretty
+ ```
+
+ ### Replay Messages from Yesterday
+ ```bash
+ kafka-console consume events --from "$(date -d yesterday --iso-8601)"
+ ```
+
+ ### Copy Messages Between Topics
+ ```bash
+ kafka-console consume source-topic | kafka-console produce destination-topic
+ ```
+
+ ### Filter Messages
+ ```bash
+ kafka-console consume all-events | jq 'select(.value.type == "ERROR")'
+ ```
+
+ ### Count Messages in Topic
+ ```bash
+ kafka-console consume my-topic --from 0 | wc -l
+ ```
+
+ ### Sample Messages
+ ```bash
+ kafka-console consume large-topic --count 100 --pretty
+ ```
+
+ ### Debug Message Headers
+ ```bash
+ kafka-console consume my-topic | jq '.headers'
+ ```
+
+ ## Troubleshooting
+
+ ### Connection Issues
+
+ **Problem:** Cannot connect to Kafka broker
+ ```bash
+ Error: KafkaJSConnectionError: Connection timeout
+ ```
+ **Solution:**
+ - Verify broker addresses are correct
+ - Check network connectivity: `telnet broker-host 9092`
+ - Ensure security groups/firewalls allow connection
+ - For Docker: use host network or proper port mapping
+
+ ### Authentication Failures
+
+ **Problem:** Authentication failed
+ ```bash
+ Error: KafkaJSProtocolError: SASL authentication failed
+ ```
+ **Solution:**
+ - Verify credentials are correct
+ - Check SASL mechanism matches broker configuration
+ - Ensure SSL is enabled if required: `--ssl`
+
+ ### Consumer Group Issues
+
+ **Problem:** Not receiving messages
+ **Solution:**
+ - Check consumer group offset: `kafka-console topic:offsets my-topic --group my-group`
+ - Reset to beginning: `--from 0`
+ - Use a new consumer group name
+
+ ### Message Format Errors
+
+ **Problem:** JSON parsing errors
+ ```bash
+ SyntaxError: Unexpected token...
+ ```
+ **Solution:**
+ - Verify message format matches specified data-format
+ - Use `--data-format raw` for non-JSON messages
+ - Check for malformed JSON with: `jq . < input.json`
+
+ ### Performance Issues
+
+ **Problem:** Slow message consumption
+ **Solution:**
+ - Increase batch size in consumer configuration
+ - Use multiple consumer instances with same group
+ - Check network latency to brokers
+
+ ### SSL/TLS Issues
+
+ **Problem:** SSL handshake failed
+ **Solution:**
+ - Ensure `--ssl` flag is used
+ - Verify broker SSL port (usually 9093)
+ - Check certificate validity

  ## License
+
  License [The MIT License](http://opensource.org/licenses/MIT)
  Copyright (c) 2024 Ivan Zakharchanka

  [npm-url]: https://www.npmjs.com/package/kafka-console
  [downloads-image]: https://img.shields.io/npm/dw/kafka-console.svg?maxAge=43200
- [npm-image]: https://img.shields.io/npm/v/kafka-console.svg?maxAge=43200
+ [npm-image]: https://img.shields.io/npm/v/kafka-console.svg?maxAge=43200
@@ -24,7 +24,7 @@ exports.default = fetchTopicOffset;
  const kafka_1 = require("../utils/kafka");
  function fetchTopicOffset(topic_1, timestamp_1, opts_1, _a) {
  return __awaiter(this, arguments, void 0, function* (topic, timestamp, opts, { parent }) {
- const _b = Object.assign(Object.assign({}, parent.opts()), opts), { brokers, logLevel, ssl, pretty } = _b, rest = __rest(_b, ["brokers", "logLevel", "ssl", "pretty"]);
+ const _b = Object.assign(Object.assign({}, parent.opts()), opts), { brokers, group, logLevel, ssl, pretty } = _b, rest = __rest(_b, ["brokers", "group", "logLevel", "ssl", "pretty"]);
  const space = pretty ? 2 : 0;
  const sasl = (0, kafka_1.getSASL)(rest);
  const client = (0, kafka_1.createClient)(brokers, ssl, sasl, logLevel);
@@ -37,6 +37,10 @@ function fetchTopicOffset(topic_1, timestamp_1, opts_1, _a) {
  const topicOffsets = yield admin.fetchTopicOffsetsByTimestamp(topic, unixTimestamp);
  console.log(JSON.stringify(topicOffsets, null, space));
  }
+ if (group) {
+ const topicOffsets = yield admin.fetchOffsets({ groupId: group, topics: [topic] });
+ console.log(JSON.stringify(topicOffsets, null, space));
+ }
  else {
  const topicOffsets = yield admin.fetchTopicOffsets(topic);
  console.log(JSON.stringify(topicOffsets, null, space));
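Editor's note: the new `--group` branch above fetches a group's committed offsets, which pairs naturally with the existing `fetchTopicOffsets` call to estimate consumer lag. A sketch with a hypothetical `computeLag` helper, assuming the KafkaJS result shapes (`fetchTopicOffsets(topic)` → `[{ partition, offset, high, low }]`; each entry of the `fetchOffsets` result → `{ topic, partitions: [{ partition, offset }] }`, where a committed offset of `"-1"` means the group has no position yet):

```javascript
// Hypothetical helper: merge topic high-water marks with one topic's
// committed group offsets into per-partition lag estimates.
function computeLag(topicOffsets, groupTopicOffsets) {
  const committed = new Map(
    groupTopicOffsets.partitions.map((p) => [p.partition, Number(p.offset)])
  );
  return topicOffsets.map(({ partition, high }) => {
    const position = committed.get(partition);
    const lag = position === undefined || position < 0
      ? Number(high)              // nothing committed yet: whole partition is lag
      : Number(high) - position;
    return { partition, lag };
  });
}
```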
package/build/index.js CHANGED
@@ -83,6 +83,7 @@ commander
  commander
  .command('topic:offsets <topic> [timestamp]')
  .description('Shows kafka topic offsets')
+ .option('-g, --group <group>', 'consumer group name', `kafka-console-consumer-${Date.now()}`)
  .action(fetchTopicOffsets_1.default);
  commander.on('--help', function () {
  [
package/package.json CHANGED
@@ -1,6 +1,6 @@
  {
  "name": "kafka-console",
- "version": "2.0.76",
+ "version": "2.1.0",
  "description": "Kafka CLI tool",
  "main": "index.js",
  "bin": {
package/codecov DELETED
Binary file
package/codecov.SHA256SUM DELETED
@@ -1 +0,0 @@
- 5cc89518864eaae49f3819e42ba353f9a3c676912d4472351d31970643f1a49a codecov
@@ -1,16 +0,0 @@
- -----BEGIN PGP SIGNATURE-----
-
- iQIzBAABCgAdFiEEJwNOf9uFDgu8LGL/gGuyiu13mGkFAmiKiI8ACgkQgGuyiu13
- mGlk4xAAmsinowQbB0DMmXRHBW5lHO4dS8WQDq8vUsAO++eomoi6w3dH0DWdv1+2
- PY3v2UmCgEqS42fm+avl5LkIEyhE8n2MCwPevgtj72KGEeswoLSXk7T8XB/E5fbj
- TAAUjxeES/0r8ugbs3JuXw7ttl9Tt2XoGb3sMBAWSCLN1UxUEh5oc22UKjDaNSve
- NRirNwhO6BndacABO2Kt+Wu1UbBznyvy4XQ9jmF1wPdGUZeyHMROcePUof8xc3At
- IUIbdlAIIEG7o3WXzCHoRwKZ7P1PMXNdXBTprIeUNGueMi9OJdo/5tFSDyOkQl8z
- BbK6NqljjMXDaCH1gXS4OAyxHevUd/BDOlkKzKT9krThDQ5KMb36jag21VNI3WSi
- NASgbl83raq2taqPCOLOpQZoROuXtTLyYc1623D+3nrv+wL7lwJTR1b+BJ0Zyhgf
- mKQ3uu94cQHxn1fXPranBeZwD47xXpLIPHpsLXPgK1T/Ge6odW33rnhasQvWkJdu
- +MDVvBcKKzGILlpmHRXCNTqcDWO/Pe6I4I1fP1vGH3XRELLHeSiCDKYrY/PH28zj
- xJTPXd89xu6IEeSU9PtIvQ4oCeUahOnSbBUbCMzl5/2FNu1pIPPR6yv6XA313Zdi
- OPkrKo/0qtm/z0ZVn44zNzqbUgx7QiKBinFNuAJ2N6mMqkE4wdM=
- =9CDH
- -----END PGP SIGNATURE-----