karafka 2.4.9 → 2.4.11

Files changed (47)
  1. checksums.yaml +4 -4
  2. checksums.yaml.gz.sig +0 -0
  3. data/.ruby-version +1 -1
  4. data/CHANGELOG.md +14 -0
  5. data/Gemfile.lock +6 -6
  6. data/config/locales/errors.yml +1 -0
  7. data/config/locales/pro_errors.yml +18 -0
  8. data/lib/active_job/queue_adapters/karafka_adapter.rb +6 -0
  9. data/lib/karafka/base_consumer.rb +23 -0
  10. data/lib/karafka/contracts/consumer_group.rb +17 -0
  11. data/lib/karafka/instrumentation/logger_listener.rb +3 -0
  12. data/lib/karafka/instrumentation/notifications.rb +3 -0
  13. data/lib/karafka/instrumentation/vendors/appsignal/client.rb +32 -11
  14. data/lib/karafka/instrumentation/vendors/appsignal/errors_listener.rb +1 -1
  15. data/lib/karafka/messages/message.rb +6 -0
  16. data/lib/karafka/pro/loader.rb +2 -1
  17. data/lib/karafka/pro/routing/features/recurring_tasks/builder.rb +9 -8
  18. data/lib/karafka/pro/routing/features/scheduled_messages/builder.rb +131 -0
  19. data/lib/karafka/pro/routing/features/scheduled_messages/config.rb +28 -0
  20. data/lib/karafka/pro/routing/features/scheduled_messages/contracts/topic.rb +40 -0
  21. data/lib/karafka/pro/routing/features/scheduled_messages/proxy.rb +27 -0
  22. data/lib/karafka/pro/routing/features/scheduled_messages/topic.rb +44 -0
  23. data/lib/karafka/pro/routing/features/scheduled_messages.rb +24 -0
  24. data/lib/karafka/pro/scheduled_messages/consumer.rb +185 -0
  25. data/lib/karafka/pro/scheduled_messages/contracts/config.rb +56 -0
  26. data/lib/karafka/pro/scheduled_messages/contracts/message.rb +77 -0
  27. data/lib/karafka/pro/scheduled_messages/daily_buffer.rb +79 -0
  28. data/lib/karafka/pro/scheduled_messages/day.rb +45 -0
  29. data/lib/karafka/pro/scheduled_messages/deserializers/headers.rb +46 -0
  30. data/lib/karafka/pro/scheduled_messages/deserializers/payload.rb +35 -0
  31. data/lib/karafka/pro/scheduled_messages/dispatcher.rb +122 -0
  32. data/lib/karafka/pro/scheduled_messages/errors.rb +28 -0
  33. data/lib/karafka/pro/scheduled_messages/max_epoch.rb +41 -0
  34. data/lib/karafka/pro/scheduled_messages/proxy.rb +176 -0
  35. data/lib/karafka/pro/scheduled_messages/schema_validator.rb +37 -0
  36. data/lib/karafka/pro/scheduled_messages/serializer.rb +55 -0
  37. data/lib/karafka/pro/scheduled_messages/setup/config.rb +60 -0
  38. data/lib/karafka/pro/scheduled_messages/state.rb +62 -0
  39. data/lib/karafka/pro/scheduled_messages/tracker.rb +64 -0
  40. data/lib/karafka/pro/scheduled_messages.rb +67 -0
  41. data/lib/karafka/processing/executor.rb +6 -0
  42. data/lib/karafka/processing/strategies/default.rb +10 -0
  43. data/lib/karafka/railtie.rb +0 -20
  44. data/lib/karafka/version.rb +1 -1
  45. data.tar.gz.sig +0 -0
  46. metadata +26 -3
  47. metadata.gz.sig +3 -2
checksums.yaml CHANGED
@@ -1,7 +1,7 @@
  ---
  SHA256:
- metadata.gz: 69bfb8dac259cdb89eadfa86b8a13b7e901fbf82f7562e0869c7bd88f6111f30
- data.tar.gz: 68959cc13a83db2611a41556ce99db5fed6f98272626dc12f64e4c18415bc633
+ metadata.gz: 346743e75bc80a6a3e04361aa6b7b5caa540b31697bc25bb138baa490edd0a93
+ data.tar.gz: 6a87b0f7af16210b93732f55e98072b0c2eaf4541b700bb1929864b515a91747
  SHA512:
- metadata.gz: 54d7bb2ad1d65df9b2e92f28a8ae2255dac95c37fcbc3b1bde554a7d6be0c48c79ee164f6d7b8ebd57d9b09a18ff3f955cc9a193e5ca0207403d503224fc0f61
- data.tar.gz: 94b08e703378d7b074d32f220dbb24a02a0e3039f948780bb7c434be721df53a52b03ca349d3e26fd8330ff6d7049d389fe27965b8120ef9d4047b1acc3984d1
+ metadata.gz: c20b7bb58d31b7771e593334783285edf13bfaff977350226aeb5a979fe1fd92482bb27acf72385a05a762d69581260196f953bdcea4aafbd1d50d0842fec9a2
+ data.tar.gz: 07b991c9048f20352c1670a5782c028757e177f7f062ce3a1ce113fd01a97184717b8c7d3612d515993cb048e21b3dcf1804b22852c1b6d9bccebb16463ca2aa
checksums.yaml.gz.sig CHANGED
Binary file
data/.ruby-version CHANGED
@@ -1 +1 @@
- 3.3.4
+ 3.3.5
data/CHANGELOG.md CHANGED
@@ -1,5 +1,19 @@
  # Karafka Framework Changelog

+ ## 2.4.11 (2024-09-04)
+ - [Enhancement] Validate envelope target topic type for Scheduled Messages.
+ - [Enhancement] Support for `enqueue_after_transaction_commit` in Rails Active Job.
+ - [Fix] Fix invalid reference to AppSignal version.
+
+ ## 2.4.10 (2024-09-03)
+ - **[Feature]** Provide Kafka-based Scheduled Messages to be able to send messages in the future via a proxy topic.
+ - [Enhancement] Introduce an `#assigned` hook for consumers to be able to trigger actions when a consumer is built and assigned but before the first consume/ticking, etc.
+ - [Enhancement] Provide `Karafka::Messages::Message#tombstone?` to be able to quickly check if a message is a tombstone message.
+ - [Enhancement] Provide a more flexible API for Recurring Tasks topics reconfiguration.
+ - [Enhancement] Remove the no longer needed Rails connection releaser.
+ - [Enhancement] Update the AppSignal client to support newer versions (tombruijn and hieuk09).
+ - [Fix] Fix a case where it was possible to define multiple subscription groups for the same topic with different consumers.
+
  ## 2.4.9 (2024-08-23)
  - **[Feature]** Provide Kafka based Recurring (Cron) Tasks.
  - [Enhancement] Wrap worker work with Rails Reloader/Executor (fusion2004)
data/Gemfile.lock CHANGED
@@ -1,7 +1,7 @@
  PATH
  remote: .
  specs:
- karafka (2.4.9)
+ karafka (2.4.11)
  base64 (~> 0.2)
  karafka-core (>= 2.4.3, < 2.5.0)
  karafka-rdkafka (>= 0.17.2)
@@ -48,16 +48,16 @@ GEM
  concurrent-ruby (~> 1.0)
  karafka-core (2.4.4)
  karafka-rdkafka (>= 0.15.0, < 0.18.0)
- karafka-rdkafka (0.17.3)
+ karafka-rdkafka (0.17.6)
  ffi (~> 1.15)
  mini_portile2 (~> 2.6)
  rake (> 12)
  karafka-testing (2.4.6)
  karafka (>= 2.4.0, < 2.5.0)
  waterdrop (>= 2.7.0)
- karafka-web (0.10.0)
+ karafka-web (0.10.2)
  erubi (~> 1.4)
- karafka (>= 2.4.7, < 2.5.0)
+ karafka (>= 2.4.10, < 2.5.0)
  karafka-core (>= 2.4.0, < 2.5.0)
  roda (~> 3.68, >= 3.69)
  tilt (~> 2.0)
@@ -97,7 +97,7 @@ GEM
  karafka-core (>= 2.4.3, < 3.0.0)
  karafka-rdkafka (>= 0.15.1)
  zeitwerk (~> 2.3)
- zeitwerk (2.6.17)
+ zeitwerk (2.6.18)

  PLATFORMS
  ruby
@@ -116,4 +116,4 @@ DEPENDENCIES
  simplecov

  BUNDLED WITH
- 2.5.11
+ 2.4.22
data/config/locales/errors.yml CHANGED
@@ -140,6 +140,7 @@ en:
  consumer_group:
  missing: needs to be present
  topics_names_not_unique: all topic names within a single consumer group must be unique
+ topics_many_consumers_same_topic: 'topic within a single consumer group cannot have distinct consumers'
  id_format: 'needs to be a string with a Kafka accepted format'
  topics_format: needs to be a non-empty array
  topics_namespaced_names_not_unique: |
data/config/locales/pro_errors.yml CHANGED
@@ -64,6 +64,8 @@ en:
  swarm_nodes_with_non_existent_nodes: includes unreachable nodes ids

  recurring_tasks.active_format: 'needs to be boolean'
+ scheduled_messages.active_format: 'needs to be boolean'
+ scheduled_messages.active_missing: 'needs to be boolean'

  direct_assignments.active_missing: needs to be present
  direct_assignments.active_format: 'needs to be boolean'
@@ -109,6 +111,15 @@ en:
  recurring_tasks.deserializer_format: 'needs to be configured'
  recurring_tasks.logging_format: needs to be a boolean

+ scheduled_messages.consumer_class_format: 'must be a class'
+ scheduled_messages.dispatcher_class_format: 'must be a class'
+ scheduled_messages.flush_batch_size_format: needs to be an integer bigger than 0
+ scheduled_messages.interval_format: needs to be an integer bigger or equal to 1000
+ scheduled_messages.deserializers.headers_format: cannot be nil
+ scheduled_messages.deserializers.payload_format: cannot be nil
+ scheduled_messages.group_id_format: 'needs to be a string with a Kafka accepted format'
+ scheduled_messages.states_postfix_format: 'needs to be a string with a Kafka accepted format'
+
  routing:
  swarm_nodes_not_used: 'At least one of the nodes has no assignments'

@@ -118,3 +129,10 @@ en:
  enabled_format: needs to be a boolean
  changed_format: needs to be a boolean
  previous_time_format: needs to be a numerical or time
+
+ scheduled_messages_message:
+ key_missing: must be present and should be unique within the partition
+ key_format: needs to be a non-empty string unique within the partition
+ headers_schedule_target_epoch_in_the_past: 'scheduling cannot happen in the past'
+ headers_format: are not correct
+ not_a_scheduled_messages_topic: 'the envelope topic is not a scheduled messages topic'
data/lib/active_job/queue_adapters/karafka_adapter.rb CHANGED
@@ -29,6 +29,12 @@ module ActiveJob
  def enqueue_at(_job, _timestamp)
  raise NotImplementedError, 'This queueing backend does not support scheduling jobs.'
  end
+
+ # @return [true] should we by default enqueue after the transaction and not during.
+ # Defaults to true to prevent weird issues during rollbacks, etc.
+ def enqueue_after_transaction_commit?
+ true
+ end
  end
  end
  end
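The point of `enqueue_after_transaction_commit?` returning `true` is that Active Job buffers the enqueue until the surrounding database transaction commits, so a rollback never publishes a Kafka message for a job whose records no longer exist. A toy model of that behavior (names here are illustrative, not Karafka or Rails API):

```ruby
# Toy model of after-commit job dispatch: jobs enqueued mid-transaction are
# buffered as callbacks and only dispatched on commit, so a rollback
# discards them without ever publishing.
class ToyTransaction
  attr_reader :dispatched

  def initialize
    @callbacks = []
    @dispatched = []
  end

  # Buffer the enqueue instead of performing it immediately
  def enqueue(job)
    @callbacks << -> { @dispatched << job }
  end

  def commit!
    @callbacks.each(&:call)
  end

  def rollback!
    @callbacks.clear
  end
end

tx = ToyTransaction.new
tx.enqueue(:mail_job)
tx.rollback!          # nothing was published
tx.enqueue(:sync_job)
tx.commit!            # only the post-rollback job goes out
```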
data/lib/karafka/base_consumer.rb CHANGED
@@ -31,6 +31,20 @@ module Karafka
  @used = false
  end

+ # Trigger method running after consumer is fully initialized.
+ #
+ # @private
+ def on_initialized
+ handle_initialized
+ rescue StandardError => e
+ Karafka.monitor.instrument(
+ 'error.occurred',
+ error: e,
+ caller: self,
+ type: 'consumer.initialized.error'
+ )
+ end
+
  # Can be used to run preparation code prior to the job being enqueued
  #
  # @private
@@ -176,6 +190,15 @@ module Karafka

  private

+ # Method called post-initialization of a consumer when all basic things are assigned.
+ # Since initialization via `#initialize` is complex and some states are set a bit later, this
+ # hook allows to initialize resources once at a time when topic, partition and other things
+ # are assigned to the consumer
+ #
+ # @note Please keep in mind that it will run many times when persistence is off. Basically once
+ # each batch.
+ def initialized; end
+
  # Method that will perform business logic and on data received from Kafka (it will consume
  # the data)
  # @note This method needs to be implemented in a subclass. We stub it here as a failover if
data/lib/karafka/contracts/consumer_group.rb CHANGED
@@ -25,6 +25,23 @@ module Karafka
  [[%i[topics], :names_not_unique]]
  end

+ # Prevent same topics subscriptions in one CG with different consumer classes
+ # This should prevent users from accidentally creating multi-sg one CG setup with weird
+ # different consumer usage. If you need to consume same topic twice, use distinct CGs.
+ virtual do |data, errors|
+ next unless errors.empty?
+
+ topics_consumers = Hash.new { |h, k| h[k] = Set.new }
+
+ data.fetch(:topics).map do |topic|
+ topics_consumers[topic[:name]] << topic[:consumer]
+ end
+
+ next if topics_consumers.values.map(&:size).all? { |count| count == 1 }
+
+ [[%i[topics], :many_consumers_same_topic]]
+ end
+
  virtual do |data, errors|
  next unless errors.empty?
  next unless ::Karafka::App.config.strict_topics_namespacing
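The grouping logic of this new rule can be exercised standalone: collect the set of consumer classes per topic name and flag any topic that appears with more than one. The helper name below is illustrative, not part of the contract API:

```ruby
require 'set'

# Sketch of the new virtual rule: one Set of consumer classes per topic name;
# the consumer group is invalid if any topic ends up with more than one.
def many_consumers_same_topic?(topics)
  topics_consumers = Hash.new { |h, k| h[k] = Set.new }
  topics.each { |topic| topics_consumers[topic[:name]] << topic[:consumer] }
  !topics_consumers.values.map(&:size).all? { |count| count == 1 }
end

valid   = [{ name: 'a', consumer: 'AConsumer' }, { name: 'b', consumer: 'BConsumer' }]
invalid = [{ name: 'a', consumer: 'AConsumer' }, { name: 'a', consumer: 'OtherConsumer' }]
```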
data/lib/karafka/instrumentation/logger_listener.rb CHANGED
@@ -275,6 +275,9 @@ module Karafka
  details = (error.backtrace || []).join("\n")

  case type
+ when 'consumer.initialized.error'
+ error "Consumer initialized error: #{error}"
+ error details
  when 'consumer.consume.error'
  error "Consumer consuming error: #{error}"
  error details
data/lib/karafka/instrumentation/notifications.rb CHANGED
@@ -48,6 +48,9 @@ module Karafka
  connection.listener.stopping
  connection.listener.stopped

+ consumer.initialize
+ consumer.initialized
+
  consumer.before_schedule_consume
  consumer.consume
  consumer.consumed
data/lib/karafka/instrumentation/vendors/appsignal/client.rb CHANGED
@@ -23,11 +23,16 @@ module Karafka
  # @param action_name [String] action name. For processing this should be equal to
  # consumer class + method name
  def start_transaction(action_name)
- transaction = ::Appsignal::Transaction.create(
- SecureRandom.uuid,
- namespace_name,
- ::Appsignal::Transaction::GenericRequest.new({})
- )
+ transaction =
+ if version_4_or_newer?
+ ::Appsignal::Transaction.create(namespace_name)
+ else
+ ::Appsignal::Transaction.create(
+ SecureRandom.uuid,
+ namespace_name,
+ ::Appsignal::Transaction::GenericRequest.new({})
+ )
+ end

  transaction.set_action_if_nil(action_name)
  end
@@ -45,10 +50,10 @@ module Karafka
  def metadata=(metadata_hash)
  return unless transaction?

- transaction = ::Appsignal::Transaction.current
+ current_transaction = transaction

  stringify_hash(metadata_hash).each do |key, value|
- transaction.set_metadata(key, value)
+ current_transaction.set_metadata(key, value)
  end
  end

@@ -78,15 +83,20 @@ module Karafka
  )
  end

- # Sends the error that occurred to Appsignal
+ # Report the error that occurred to Appsignal
  #
  # @param error [Object] error we want to ship to Appsignal
- def send_error(error)
+ def report_error(error)
+ if ::Appsignal.respond_to?(:report_error)
+ # This helper will always report the error
+ ::Appsignal.report_error(error) do |transaction|
+ transaction.set_namespace(namespace_name)
+ end
  # If we have an active transaction we should use it instead of creating a generic one
  # That way proper namespace and other data may be transferred
  #
  # In case there is no transaction, a new generic background job one will be used
- if transaction?
+ elsif transaction?
  transaction.set_error(error)
  else
  ::Appsignal.send_error(error) do |transaction|
@@ -99,7 +109,11 @@ module Karafka
  # @param name [Symbol] probe name
  # @param probe [Proc] code to run every minute
  def register_probe(name, probe)
- ::Appsignal::Minutely.probes.register(name, probe)
+ if ::Appsignal::Probes.respond_to?(:register)
+ ::Appsignal::Probes.register(name, probe)
+ else
+ ::Appsignal::Minutely.probes.register(name, probe)
+ end
  end

  private
@@ -129,6 +143,13 @@ module Karafka
  def namespace_name
  @namespace_name ||= ::Appsignal::Transaction::BACKGROUND_JOB
  end
+
+ # @return [Boolean] is this v4+ version of Appsignal gem or older. Used for backwards
+ # compatibility checks.
+ def version_4_or_newer?
+ @version_4_or_newer ||=
+ Gem::Version.new(::Appsignal::VERSION) >= Gem::Version.new('4.0.0')
+ end
  end
  end
  end
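The `version_4_or_newer?` gate above relies on `Gem::Version`, which compares version strings segment by segment rather than lexicographically. Extracted into a standalone form for clarity (the parameter here replaces `::Appsignal::VERSION`):

```ruby
require 'rubygems'

# The version gate from the diff, parameterized: Gem::Version compares
# '4.10.0' > '4.9.0' correctly, where plain string comparison would not.
def version_4_or_newer?(version_string)
  Gem::Version.new(version_string) >= Gem::Version.new('4.0.0')
end
```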
data/lib/karafka/instrumentation/vendors/appsignal/errors_listener.rb CHANGED
@@ -21,7 +21,7 @@ module Karafka
  #
  # @param event [Karafka::Core::Monitoring::Event]
  def on_error_occurred(event)
- client.send_error(event[:error])
+ client.report_error(event[:error])
  end
  end
  end
data/lib/karafka/messages/message.rb CHANGED
@@ -51,6 +51,12 @@ module Karafka
  @deserialized
  end

+ # @return [Boolean] true if the message has a key and raw payload is nil, it is a tombstone
+ # event. Otherwise it is not.
+ def tombstone?
+ !raw_key.nil? && @raw_payload.nil?
+ end
+
  private

  # @return [Object] deserialized data
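The `tombstone?` predicate encodes Kafka's compaction convention: a record with a key but a nil payload marks that key for deletion. The same logic on a minimal stand-in struct (not the real `Karafka::Messages::Message`):

```ruby
# Minimal model of the new predicate: a tombstone needs a key (so compaction
# knows what to delete) and must carry no payload.
ToyMessage = Struct.new(:raw_key, :raw_payload) do
  def tombstone?
    !raw_key.nil? && raw_payload.nil?
  end
end

ToyMessage.new('user-1', nil).tombstone?  # deletion marker for key 'user-1'
ToyMessage.new('user-1', '{}').tombstone? # regular keyed message
```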
data/lib/karafka/pro/loader.rb CHANGED
@@ -76,7 +76,8 @@ module Karafka
  [
  Encryption,
  Cleaner,
- RecurringTasks
+ RecurringTasks,
+ ScheduledMessages
  ]
  end

data/lib/karafka/pro/routing/features/recurring_tasks/builder.rb CHANGED
@@ -22,9 +22,7 @@ module Karafka
  #
  # @param active [Boolean] should recurring tasks be active. We use a boolean flag to
  # have API consistency in the system, so it matches other routing related APIs.
- # @param block [Proc] optional reconfiguration of the tasks topic definitions.
- # @note Since we cannot provide two blocks, reconfiguration of logs topic can be only
- # done if user explicitly redefines it in the routing.
+ # @param block [Proc] optional reconfiguration of the topics definitions.
  def recurring_tasks(active = false, &block)
  return unless active

@@ -39,7 +37,7 @@ module Karafka
  consumer_group tasks_cfg.group_id do
  # Registers the primary topic that we use to control schedules execution. This is
  # the one that we use to trigger recurring tasks.
- topic(topics_cfg.schedules) do
+ schedules_topic = topic(topics_cfg.schedules) do
  consumer tasks_cfg.consumer_class
  deserializer tasks_cfg.deserializer
  # Because the topic method name as well as builder proxy method name is the same
@@ -85,14 +83,15 @@ module Karafka
  during_retry: false
  )

- next unless block
-
- instance_eval(&block)
+ # If this is the direct schedules redefinition style, we run it
+ # The second one (see end of this method) allows for linear reconfiguration of
+ # both the topics
+ instance_eval(&block) if block && block.arity.zero?
  end

  # This topic is to store logs that we can then inspect either from the admin or via
  # the Web UI
- topic(topics_cfg.logs) do
+ logs_topic = topic(topics_cfg.logs) do
  active(false)
  deserializer tasks_cfg.deserializer
  target.recurring_tasks(true)
@@ -104,6 +103,8 @@ module Karafka
  'retention.ms': 604_800_000
  )
  end
+
+ yield(schedules_topic, logs_topic) if block && block.arity.positive?
  end
  end

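The "more flexible API" mentioned in the changelog hinges on `block.arity`: a zero-arity block keeps the old behavior (instance-evaluated inside the primary topic definition), while a block that accepts arguments is called once with both topic objects. A standalone sketch of that dispatch, with illustrative names only:

```ruby
# Sketch of the arity-based reconfiguration dispatch: Proc#arity is 0 for
# `{ ... }` and positive for `{ |schedules, logs| ... }`, which is how the
# builder decides between inline and two-topic reconfiguration styles.
class ToyBuilder
  def build(&block)
    schedules_topic = :schedules
    logs_topic = :logs

    if block && block.arity.zero?
      # Old style: evaluated in the builder's context, as instance_eval would be
      instance_eval(&block)
      :inline_style
    elsif block && block.arity.positive?
      # New style: both topics handed to the caller for linear reconfiguration
      yield(schedules_topic, logs_topic)
      :linear_style
    else
      :defaults
    end
  end
end
```

A caller picks the style simply by the block signature: `build { ... }` versus `build { |schedules, logs| ... }`.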
data/lib/karafka/pro/routing/features/scheduled_messages/builder.rb ADDED
@@ -0,0 +1,131 @@
+ # frozen_string_literal: true
+
+ # This Karafka component is a Pro component under a commercial license.
+ # This Karafka component is NOT licensed under LGPL.
+ #
+ # All of the commercial components are present in the lib/karafka/pro directory of this
+ # repository and their usage requires commercial license agreement.
+ #
+ # Karafka has also commercial-friendly license, commercial support and commercial components.
+ #
+ # By sending a pull request to the pro components, you are agreeing to transfer the copyright of
+ # your code to Maciej Mensfeld.
+
+ module Karafka
+ module Pro
+ module Routing
+ module Features
+ class ScheduledMessages < Base
+ # Routing extensions for scheduled messages
+ module Builder
+ # Enabled scheduled messages operations and adds needed topics and other stuff.
+ #
+ # @param group_name [String, false] name for scheduled messages topic that is also used
+ # as a group identifier. Users can have multiple schedule topics flows to prevent key
+ # collisions, prioritize and do other stuff. `false` if not active.
+ # @param block [Proc] optional reconfiguration of the topics definitions.
+ # @note Namespace for topics should include the divider as it is not automatically
+ # added.
+ def scheduled_messages(group_name = false, &block)
+ return unless group_name
+
+ # Load zlib only if user enables scheduled messages
+ require 'zlib'
+
+ # We set it to 5 so we have enough space to handle more events. All related topics
+ # should have same partition count.
+ default_partitions = 5
+ msg_cfg = App.config.scheduled_messages
+
+ consumer_group msg_cfg.group_id do
+ # Registers the primary topic that we use to control schedules execution. This is
+ # the one that we use to trigger scheduled messages.
+ messages_topic = topic(group_name) do
+ instance_eval(&block) if block && block.arity.zero?
+
+ consumer msg_cfg.consumer_class
+
+ deserializers(
+ headers: msg_cfg.deserializers.headers
+ )
+
+ # Because the topic method name as well as builder proxy method name is the same
+ # we need to reference it via target directly
+ target.scheduled_messages(true)
+
+ # We manage offsets directly because messages can have both schedules and
+ # commands and we need to apply them only when we need to
+ manual_offset_management(true)
+
+ # We use multi-batch operations and in-memory state for schedules. This needs to
+ # always operate without re-creation.
+ consumer_persistence(true)
+
+ # This needs to be enabled for the eof to work correctly
+ kafka('enable.partition.eof': true, inherit: true)
+ eofed(true)
+
+ # Since this is a topic that gets replayed because of schedule management, we do
+ # want to get more data faster during recovery
+ max_messages(10_000)
+
+ max_wait_time(1_000)
+
+ # This is a setup that should allow messages to be compacted fairly fast. Since
+ # each dispatched message should be removed via tombstone, they do not have to
+ # be present in the topic for too long.
+ config(
+ partitions: default_partitions,
+ # Will ensure, that after tombstone is present, given scheduled message, that
+ # has been dispatched is removed by Kafka
+ 'cleanup.policy': 'compact',
+ # When 10% or more dispatches are done, compact data
+ 'min.cleanable.dirty.ratio': 0.1,
+ # Frequent segment rotation to support intense compaction
+ 'segment.ms': 3_600_000,
+ 'delete.retention.ms': 3_600_000,
+ 'segment.bytes': 52_428_800
+ )
+
+ # This is the core of execution. Since we dispatch data in time intervals, we
+ # need to be able to do this even when no new data is coming
+ periodic(
+ interval: msg_cfg.interval,
+ during_pause: false,
+ during_retry: false
+ )
+
+ # If this is the direct schedules redefinition style, we run it
+ # The second one (see end of this method) allows for linear reconfiguration of
+ # both the topics
+ instance_eval(&block) if block && block.arity.zero?
+ end
+
+ # Holds states of scheduler per each of the partitions since they tick
+ # independently. We only hold future statistics not to have to deal with
+ # any type of state restoration
+ states_topic = topic("#{group_name}#{msg_cfg.states_postfix}") do
+ active(false)
+ target.scheduled_messages(true)
+ config(
+ partitions: default_partitions,
+ 'cleanup.policy': 'compact',
+ 'min.cleanable.dirty.ratio': 0.1,
+ 'segment.ms': 3_600_000,
+ 'delete.retention.ms': 3_600_000,
+ 'segment.bytes': 52_428_800
+ )
+ deserializers(
+ payload: msg_cfg.deserializers.payload
+ )
+ end
+
+ yield(messages_topic, states_topic) if block && block.arity.positive?
+ end
+ end
+ end
+ end
+ end
+ end
+ end
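The builder configures the schedules topic with `'cleanup.policy': 'compact'` so that dispatched schedules, once tombstoned, eventually disappear. Kafka's compaction semantics that this relies on can be modeled in a few lines (a toy in-memory model, not how brokers actually store segments):

```ruby
# Toy model of a compacted topic: compaction keeps only the newest record per
# key, and a nil-payload tombstone removes the key entirely - which is why
# each dispatched scheduled message is followed by a tombstone.
def compact(records)
  records.each_with_object({}) do |(key, payload), log|
    if payload.nil?
      log.delete(key) # tombstone: the schedule was dispatched, drop it
    else
      log[key] = payload # newer value for the same key wins
    end
  end
end

compact(
  [
    ['msg-1', 'schedule v1'],
    ['msg-1', 'schedule v2'],
    ['msg-2', 'schedule'],
    ['msg-2', nil]
  ]
)
```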
data/lib/karafka/pro/routing/features/scheduled_messages/config.rb ADDED
@@ -0,0 +1,28 @@
+ # frozen_string_literal: true
+
+ # This Karafka component is a Pro component under a commercial license.
+ # This Karafka component is NOT licensed under LGPL.
+ #
+ # All of the commercial components are present in the lib/karafka/pro directory of this
+ # repository and their usage requires commercial license agreement.
+ #
+ # Karafka has also commercial-friendly license, commercial support and commercial components.
+ #
+ # By sending a pull request to the pro components, you are agreeing to transfer the copyright of
+ # your code to Maciej Mensfeld.
+
+ module Karafka
+ module Pro
+ module Routing
+ module Features
+ class ScheduledMessages < Base
+ # Scheduled messages configuration
+ Config = Struct.new(
+ :active,
+ keyword_init: true
+ ) { alias_method :active?, :active }
+ end
+ end
+ end
+ end
+ end
data/lib/karafka/pro/routing/features/scheduled_messages/contracts/topic.rb ADDED
@@ -0,0 +1,40 @@
+ # frozen_string_literal: true
+
+ # This Karafka component is a Pro component under a commercial license.
+ # This Karafka component is NOT licensed under LGPL.
+ #
+ # All of the commercial components are present in the lib/karafka/pro directory of this
+ # repository and their usage requires commercial license agreement.
+ #
+ # Karafka has also commercial-friendly license, commercial support and commercial components.
+ #
+ # By sending a pull request to the pro components, you are agreeing to transfer the copyright of
+ # your code to Maciej Mensfeld.
+
+ module Karafka
+ module Pro
+ module Routing
+ module Features
+ class ScheduledMessages < Base
+ # Namespace for scheduled messages contracts
+ module Contracts
+ # Rules around scheduled messages settings
+ class Topic < Karafka::Contracts::Base
+ configure do |config|
+ config.error_messages = YAML.safe_load(
+ File.read(
+ File.join(Karafka.gem_root, 'config', 'locales', 'pro_errors.yml')
+ )
+ ).fetch('en').fetch('validations').fetch('topic')
+ end
+
+ nested(:scheduled_messages) do
+ required(:active) { |val| [true, false].include?(val) }
+ end
+ end
+ end
+ end
+ end
+ end
+ end
+ end
data/lib/karafka/pro/routing/features/scheduled_messages/proxy.rb ADDED
@@ -0,0 +1,27 @@
+ # frozen_string_literal: true
+
+ # This Karafka component is a Pro component under a commercial license.
+ # This Karafka component is NOT licensed under LGPL.
+ #
+ # All of the commercial components are present in the lib/karafka/pro directory of this
+ # repository and their usage requires commercial license agreement.
+ #
+ # Karafka has also commercial-friendly license, commercial support and commercial components.
+ #
+ # By sending a pull request to the pro components, you are agreeing to transfer the copyright of
+ # your code to Maciej Mensfeld.
+
+ module Karafka
+ module Pro
+ module Routing
+ module Features
+ class ScheduledMessages < Base
+ # Routing proxy extensions for scheduled messages
+ module Proxy
+ include Builder
+ end
+ end
+ end
+ end
+ end
+ end
data/lib/karafka/pro/routing/features/scheduled_messages/topic.rb ADDED
@@ -0,0 +1,44 @@
+ # frozen_string_literal: true
+
+ # This Karafka component is a Pro component under a commercial license.
+ # This Karafka component is NOT licensed under LGPL.
+ #
+ # All of the commercial components are present in the lib/karafka/pro directory of this
+ # repository and their usage requires commercial license agreement.
+ #
+ # Karafka has also commercial-friendly license, commercial support and commercial components.
+ #
+ # By sending a pull request to the pro components, you are agreeing to transfer the copyright of
+ # your code to Maciej Mensfeld.
+
+ module Karafka
+ module Pro
+ module Routing
+ module Features
+ class ScheduledMessages < Base
+ # Topic extensions to be able to check if given topic is a scheduled messages topic
+ # Please note, that this applies to both the schedules topic and logs topic
+ module Topic
+ # @param active [Boolean] should this topic be considered related to scheduled messages
+ def scheduled_messages(active = false)
+ @scheduled_messages ||= Config.new(active: active)
+ end
+
+ # @return [Boolean] is this an ActiveJob topic
+ def scheduled_messages?
+ scheduled_messages.active?
+ end
+
+ # @return [Hash] topic with all its native configuration options plus scheduled
+ # messages namespace settings
+ def to_h
+ super.merge(
+ scheduled_messages: scheduled_messages.to_h
+ ).freeze
+ end
+ end
+ end
+ end
+ end
+ end
+ end