@backstage/plugin-events-backend-module-kafka 0.0.0-nightly-20251213024329 → 0.0.0-nightly-20260108025012
This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
- package/CHANGELOG.md +22 -3
- package/README.md +142 -25
- package/config.d.ts +550 -136
- package/dist/KafkaConsumingEventPublisher/KafkaConsumingEventPublisher.cjs.js +104 -0
- package/dist/KafkaConsumingEventPublisher/KafkaConsumingEventPublisher.cjs.js.map +1 -0
- package/dist/KafkaConsumingEventPublisher/config.cjs.js +78 -0
- package/dist/KafkaConsumingEventPublisher/config.cjs.js.map +1 -0
- package/dist/{service/eventsModuleKafkaConsumingEventPublisher.cjs.js → KafkaConsumingEventPublisher/module.cjs.js} +10 -9
- package/dist/KafkaConsumingEventPublisher/module.cjs.js.map +1 -0
- package/dist/KafkaPublishingEventConsumer/KafkaPublishingEventConsumer.cjs.js +73 -0
- package/dist/KafkaPublishingEventConsumer/KafkaPublishingEventConsumer.cjs.js.map +1 -0
- package/dist/KafkaPublishingEventConsumer/config.cjs.js +44 -0
- package/dist/KafkaPublishingEventConsumer/config.cjs.js.map +1 -0
- package/dist/KafkaPublishingEventConsumer/module.cjs.js +36 -0
- package/dist/KafkaPublishingEventConsumer/module.cjs.js.map +1 -0
- package/dist/index.cjs.js +10 -3
- package/dist/index.cjs.js.map +1 -1
- package/dist/index.d.ts +7 -4
- package/dist/utils/LoggerServiceAdapter.cjs.js.map +1 -0
- package/dist/utils/config.cjs.js +46 -0
- package/dist/utils/config.cjs.js.map +1 -0
- package/dist/utils/kafkaTransformers.cjs.js +24 -0
- package/dist/utils/kafkaTransformers.cjs.js.map +1 -0
- package/package.json +6 -6
- package/dist/publisher/KafkaConsumerClient.cjs.js +0 -44
- package/dist/publisher/KafkaConsumerClient.cjs.js.map +0 -1
- package/dist/publisher/KafkaConsumingEventPublisher.cjs.js +0 -63
- package/dist/publisher/KafkaConsumingEventPublisher.cjs.js.map +0 -1
- package/dist/publisher/LoggerServiceAdapter.cjs.js.map +0 -1
- package/dist/publisher/config.cjs.js +0 -102
- package/dist/publisher/config.cjs.js.map +0 -1
- package/dist/service/eventsModuleKafkaConsumingEventPublisher.cjs.js.map +0 -1
- package/dist/{publisher → utils}/LoggerServiceAdapter.cjs.js +0 -0
package/CHANGELOG.md
CHANGED

````diff
@@ -1,14 +1,33 @@
 # @backstage/plugin-events-backend-module-kafka
 
-## 0.0.0-nightly-20251213024329
+## 0.0.0-nightly-20260108025012
+
+### Minor Changes
+
+- ef5bbd8: Add support for Kafka offset configuration (`fromBeginning`) and `autoCommit`
 
 ### Patch Changes
 
 - Updated dependencies
-  - @backstage/plugin-events-node@0.0.0-nightly-20251213024329
-  - @backstage/backend-plugin-api@0.0.0-nightly-20251213024329
+  - @backstage/backend-plugin-api@1.6.0
   - @backstage/config@1.3.6
   - @backstage/types@1.2.2
+  - @backstage/plugin-events-node@0.4.18
+
+## 0.2.0
+
+### Minor Changes
+
+- 2c74ea9: Added support for multiple named instances in `kafkaConsumingEventPublisher` configuration. The previous single configuration format is still supported for backward compatibility.
+- 2c74ea9: Added `KafkaPublishingEventConsumer` to support sending Backstage events to Kafka topics.
+
+  This addition enables Backstage to publish events to external Kafka systems, complementing the existing ability to receive events from Kafka. This allows for better integration with external systems that rely on Kafka for event streaming.
+
+### Patch Changes
+
+- Updated dependencies
+  - @backstage/plugin-events-node@0.4.18
+  - @backstage/backend-plugin-api@1.6.0
 
 ## 0.1.6-next.1
 
````
package/README.md
CHANGED

````diff
@@ -2,65 +2,182 @@
 
 Welcome to the `events-backend-module-kafka` backend module!
 
-This package is a module for the `events-backend` backend plugin and extends the events system with
+This package is a module for the `events-backend` backend plugin and extends the events system with a `KafkaConsumingEventPublisher` and `KafkaPublishingEventConsumer`.
 
-This
+This module provides two-way integration with Kafka:
+
+- **KafkaConsumingEventPublisher**: Receives events from Kafka queues and publishes them to the Backstage events system
+- **KafkaPublishingEventConsumer**: Consumes events from Backstage and publishes them to Kafka queues
 
 ## Configuration
 
-To set up Kafka
+To set up Kafka integration, you need to configure one or both of the following components:
+
+### KafkaConsumingEventPublisher Configuration
+
+To receive events from Kafka queues and publish them to Backstage:
 
 ```yaml
 events:
   modules:
     kafka:
       kafkaConsumingEventPublisher:
+        production: # Instance name, will be included in logs
+          clientId: your-client-id # (Required) Client ID used by Backstage to identify itself when connecting to the Kafka cluster.
+          brokers: # (Required) List of brokers in the Kafka cluster to connect to.
+            - broker1
+            - broker2
+          topics:
+            - topic: 'backstage.topic' # (Required) Replace with actual topic name as expected by subscribers
+              kafka:
+                topics: # (Required) The Kafka topics to subscribe to.
+                  - topic1
+                groupId: your-group-id # (Required) The groupId to be used by the topic consumers.
+                # Optional offset management settings (these can be omitted to use defaults):
+                # fromBeginning: false # Start from earliest offset when no committed offset exists. Default: not set (latest)
+                # autoCommit: true # Enable auto-commit. Default: true (for backward compatibility)
+                # pauseOnError: false # Pause consumer on error. Default: false (for backward compatibility)
+```
+
````
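With this in place, events received from the Kafka topics are re-published on the Backstage events bus under the topic given as `topic` (`backstage.topic` above). As a rough, editorial sketch of how a backend module might subscribe to them via the standard `EventsService` from `@backstage/plugin-events-node` (the module id and handler below are illustrative, not part of this package):

```ts
import { createBackendModule, coreServices } from '@backstage/backend-plugin-api';
import { eventsServiceRef } from '@backstage/plugin-events-node';

// Illustrative subscriber for events forwarded from Kafka to the
// 'backstage.topic' Backstage topic configured above.
export const exampleKafkaSubscriber = createBackendModule({
  pluginId: 'events',
  moduleId: 'example-kafka-subscriber',
  register(reg) {
    reg.registerInit({
      deps: { events: eventsServiceRef, logger: coreServices.logger },
      async init({ events, logger }) {
        await events.subscribe({
          id: 'example-kafka-subscriber',
          topics: ['backstage.topic'],
          onEvent: async params => {
            logger.info(`Received event on ${params.topic}`);
          },
        });
      },
    });
  },
});
```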
````diff
+### KafkaPublishingEventConsumer Configuration
+
+To publish events from Backstage to Kafka queues, you can configure the `KafkaPublishingEventConsumer`:
+
+```yaml
+events:
+  modules:
+    kafka:
+      kafkaPublishingEventConsumer:
+        production: # Instance name, will be included in logs
+          clientId: your-client-id # (Required) Client ID used by Backstage to identify itself when connecting to the Kafka cluster.
+          brokers: # (Required) List of brokers in the Kafka cluster to connect to.
+            - broker1
+            - broker2
+          topics:
+            - topic: 'catalog.entity.created' # (Required) The Backstage topic to consume from
+              kafka:
+                topic: kafka-topic-name # (Required) The Kafka topic to publish to
 ```
 
 For a complete list of all available fields that can be configured, refer to the [config.d.ts file](./config.d.ts).
 
````
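In the opposite direction, anything published on the configured Backstage topic (`catalog.entity.created` above) is forwarded to the named Kafka topic. A hedged sketch of what the producing side might look like from a backend plugin, with a placeholder payload:

```ts
import { eventsServiceRef } from '@backstage/plugin-events-node';

// Illustrative publish call; with the configuration above, this event
// would be forwarded to the 'kafka-topic-name' Kafka topic.
async function emitEntityCreated(events: typeof eventsServiceRef.T) {
  await events.publish({
    topic: 'catalog.entity.created',
    eventPayload: { entityRef: 'component:default/example' },
  });
}
```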
````diff
+### Offset Management
+
+The plugin supports configurable offset management to control message delivery semantics:
+
+#### Auto Commit (Default - Backward Compatible)
+
+By default (`autoCommit: true` or not specified), Kafka automatically commits offsets at regular intervals. This is the original behavior and ensures backward compatibility.
+
+#### Manual Commit (Opt-in for Reliability)
+
+When you explicitly set `autoCommit: false`, the plugin will:
+
+1. Start consuming from the last committed offset for the consumer group
+2. Process each message by publishing it to the Backstage events system
+3. Only commit the offset after successful processing
+4. If processing fails, pause the consumer and do not commit the offset
+
+**Example configuration for manual commit:**
+
+```yaml
+kafka:
+  topics:
+    - topic1
+  groupId: my-group
+  autoCommit: false # Enable manual commit
+```
+
````
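To make the manual-commit flow concrete, here is an editorial sketch of the pattern described above, written against the `kafkajs` consumer API that this module appears to build on; the topic, group, and handler are placeholders rather than package internals:

```ts
import { Kafka } from 'kafkajs';

// Sketch of manual offset commits: only advance the committed offset
// once the message has been handed to the events system successfully.
const kafka = new Kafka({ clientId: 'backstage-events', brokers: ['broker1:9092'] });
const consumer = kafka.consumer({ groupId: 'my-group' });

async function run(publishToBackstage: (payload: string) => Promise<void>) {
  await consumer.connect();
  await consumer.subscribe({ topics: ['topic1'] });

  await consumer.run({
    autoCommit: false, // mirrors `autoCommit: false` in the YAML above
    eachMessage: async ({ topic, partition, message }) => {
      await publishToBackstage(message.value?.toString() ?? '');
      // Commit the next offset only after successful processing.
      await consumer.commitOffsets([
        { topic, partition, offset: (Number(message.offset) + 1).toString() },
      ]);
    },
  });
}
```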
````diff
+#### Error Handling
+
+The `pauseOnError` option controls how the consumer behaves when message processing fails:
+
+**Skip Failed Messages (Default - Backward Compatible)**
+
+By default (`pauseOnError: false` or not specified), the consumer will skip failed messages and continue processing:
+
+- The consumer logs the error but continues processing subsequent messages
+- If `autoCommit: false`, the offset is still committed to skip the failed message
+- If `autoCommit: true`, Kafka's auto-commit handles the offset
+- Recommended when occasional message failures are acceptable and should not block processing
+
+**Pause on Error (Opt-in)**
+
+When you explicitly set `pauseOnError: true`, the consumer will pause when an error occurs during message processing:
+
+- The consumer pauses and stops processing new messages
+- The failed message offset is not committed
+- The error is re-thrown and logged
+- Recommended when you want to investigate and fix issues before continuing
+
+**Example configuration to pause on error:**
+
+```yaml
+kafka:
+  topics:
+    - topic1
+  groupId: my-group
+  autoCommit: false
+  pauseOnError: true # Pause consumer when a message fails
+```
+
+**Note:** When using the default behavior (`pauseOnError: false`) with `autoCommit: false`, failed messages will have their offsets committed, meaning they will be skipped and not reprocessed. Use this configuration carefully based on your application's requirements.
+
````
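As a rough illustration of the two error-handling modes, again sketched against the `kafkajs` consumer API with placeholder names (not the package's actual implementation):

```ts
import type { Consumer, EachMessagePayload } from 'kafkajs';

// Sketch of the pauseOnError branch: on failure, either pause the
// consumer and rethrow (pauseOnError: true) or log and move on
// (pauseOnError: false, the default).
async function handleMessage(
  consumer: Consumer,
  payload: EachMessagePayload,
  pauseOnError: boolean,
  process: (p: EachMessagePayload) => Promise<void>,
) {
  try {
    await process(payload);
  } catch (error) {
    if (pauseOnError) {
      consumer.pause([{ topic: payload.topic, partitions: [payload.partition] }]);
      throw error; // offset is not committed; processing stops here
    }
    console.warn(`Skipping failed message at offset ${payload.message.offset}`, error);
    // Falling through lets the offset be committed, so the message is skipped.
  }
}
```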
````diff
+#### Starting Position
+
+The `fromBeginning` option controls where the consumer starts when no committed offset exists:
+
+- `fromBeginning: true` - Start from the earliest available message
+- `fromBeginning: false` (default) - Start from the latest message (only new messages)
+
+Once the consumer group has committed an offset, it will always resume from that position, regardless of the `fromBeginning` setting.
+
````
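For orientation, `fromBeginning` presumably maps onto the subscribe option of the same name in the underlying `kafkajs` client; a minimal sketch with placeholder topic and group:

```ts
import { Kafka } from 'kafkajs';

// `fromBeginning` only matters when the consumer group has no committed
// offset yet; afterwards the group always resumes from its committed position.
async function subscribeFromBeginning() {
  const kafka = new Kafka({ clientId: 'backstage-events', brokers: ['broker1:9092'] });
  const consumer = kafka.consumer({ groupId: 'my-group' });
  await consumer.connect();
  await consumer.subscribe({ topics: ['topic1'], fromBeginning: true });
}
```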
````diff
 ### Optional SSL Configuration
 
-If your Kafka cluster requires SSL, you can configure it
+If your Kafka cluster requires SSL, you can configure it for both `kafkaConsumingEventPublisher` and `kafkaPublishingEventConsumer` instances:
 
 ```yaml
 events:
   modules:
     kafka:
       kafkaConsumingEventPublisher:
+        production:
+          # ... other configuration ...
+          ssl:
+            rejectUnauthorized: true # (Optional) If true, the server certificate is verified against the list of supplied CAs.
+            ca: [path/to/ca-cert] # (Optional) Array of trusted certificates in PEM format.
+            key: path/to/client-key # (Optional) Private key in PEM format.
+            cert: path/to/client-cert # (Optional) Public x509 certificate in PEM format.
+      kafkaPublishingEventConsumer:
+        production:
+          # ... other configuration ...
+          ssl:
+            # Same SSL configuration options as above
 ```
 
 ### Optional SASL Authentication Configuration
 
-If your Kafka cluster requires
+If your Kafka cluster requires SASL authentication, you can configure it for both components:
 
 ```yaml
 events:
   modules:
     kafka:
       kafkaConsumingEventPublisher:
+        production:
+          # ... other configuration ...
+          sasl:
+            mechanism: 'plain' # SASL mechanism ('plain', 'scram-sha-256' or 'scram-sha-512')
+            username: your-username # SASL username
+            password: your-password # SASL password
+      kafkaPublishingEventConsumer:
+        production:
+          # ... other configuration ...
+          sasl:
+            # Same SASL configuration options as above
 ```
 
+These SSL and SASL configurations apply to both Kafka components and provide enhanced security for your Kafka connections.
 
````
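For orientation, these blocks mirror the connection options of the underlying `kafkajs` client; a hedged sketch of the equivalent programmatic configuration, with placeholder values:

```ts
import { readFileSync } from 'fs';
import { Kafka } from 'kafkajs';

// Roughly how the ssl/sasl blocks above translate to kafkajs client options.
const kafka = new Kafka({
  clientId: 'your-client-id',
  brokers: ['broker1:9093'],
  ssl: {
    rejectUnauthorized: true,
    ca: [readFileSync('path/to/ca-cert', 'utf-8')],
    key: readFileSync('path/to/client-key', 'utf-8'),
    cert: readFileSync('path/to/client-cert', 'utf-8'),
  },
  sasl: {
    mechanism: 'plain',
    username: 'your-username',
    password: 'your-password',
  },
});
```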
````diff
 ## Installation
 
````