rdkafka 0.14.0 → 0.15.1

Files changed (48)
  1. checksums.yaml +4 -4
  2. checksums.yaml.gz.sig +0 -0
  3. data/.github/FUNDING.yml +1 -0
  4. data/.github/workflows/ci.yml +2 -3
  5. data/.ruby-version +1 -1
  6. data/CHANGELOG.md +25 -0
  7. data/README.md +44 -22
  8. data/docker-compose.yml +3 -1
  9. data/ext/Rakefile +43 -26
  10. data/lib/rdkafka/admin/acl_binding_result.rb +51 -0
  11. data/lib/rdkafka/admin/create_acl_handle.rb +28 -0
  12. data/lib/rdkafka/admin/create_acl_report.rb +24 -0
  13. data/lib/rdkafka/admin/create_partitions_handle.rb +27 -0
  14. data/lib/rdkafka/admin/create_partitions_report.rb +6 -0
  15. data/lib/rdkafka/admin/delete_acl_handle.rb +30 -0
  16. data/lib/rdkafka/admin/delete_acl_report.rb +23 -0
  17. data/lib/rdkafka/admin/delete_groups_handle.rb +28 -0
  18. data/lib/rdkafka/admin/delete_groups_report.rb +24 -0
  19. data/lib/rdkafka/admin/describe_acl_handle.rb +30 -0
  20. data/lib/rdkafka/admin/describe_acl_report.rb +23 -0
  21. data/lib/rdkafka/admin.rb +443 -0
  22. data/lib/rdkafka/bindings.rb +125 -2
  23. data/lib/rdkafka/callbacks.rb +196 -1
  24. data/lib/rdkafka/config.rb +24 -3
  25. data/lib/rdkafka/consumer/headers.rb +1 -1
  26. data/lib/rdkafka/consumer/topic_partition_list.rb +8 -7
  27. data/lib/rdkafka/consumer.rb +80 -29
  28. data/lib/rdkafka/producer/delivery_handle.rb +12 -1
  29. data/lib/rdkafka/producer/delivery_report.rb +16 -3
  30. data/lib/rdkafka/producer.rb +42 -12
  31. data/lib/rdkafka/version.rb +3 -3
  32. data/lib/rdkafka.rb +11 -0
  33. data/rdkafka.gemspec +2 -2
  34. data/spec/rdkafka/admin/create_acl_handle_spec.rb +56 -0
  35. data/spec/rdkafka/admin/create_acl_report_spec.rb +18 -0
  36. data/spec/rdkafka/admin/delete_acl_handle_spec.rb +85 -0
  37. data/spec/rdkafka/admin/delete_acl_report_spec.rb +72 -0
  38. data/spec/rdkafka/admin/describe_acl_handle_spec.rb +85 -0
  39. data/spec/rdkafka/admin/describe_acl_report_spec.rb +73 -0
  40. data/spec/rdkafka/admin_spec.rb +204 -0
  41. data/spec/rdkafka/config_spec.rb +8 -0
  42. data/spec/rdkafka/consumer_spec.rb +89 -0
  43. data/spec/rdkafka/producer/delivery_report_spec.rb +4 -0
  44. data/spec/rdkafka/producer_spec.rb +26 -2
  45. data/spec/spec_helper.rb +3 -1
  46. data.tar.gz.sig +0 -0
  47. metadata +29 -4
  48. metadata.gz.sig +0 -0
checksums.yaml CHANGED
@@ -1,7 +1,7 @@
  ---
  SHA256:
- metadata.gz: 59f5b664693b87cd66340f821027e45a86c70c0f652c91a0ec6826ba69e02969
- data.tar.gz: 79e734db66b41d1618a581686df428de75912d06d1e41fe9e2340e63ace3d0da
+ metadata.gz: 8636c80e1798cf24b34cf25a20ca24f35e2951fb179843a3a85a94fe0274ca76
+ data.tar.gz: f115aa7fff4961d42280a7ad6fd78fed40568b936139d588ae2362a2f0f45c25
  SHA512:
- metadata.gz: cf9821c62c9f958196d78cbe2d8f50b860d06084017cfc3fbf3ffd6c289278d8a8b54f3b40d16d3d3e6c1d71f1112de897cce67cfc09071cb8fa5a0895bd5ed7
- data.tar.gz: a8e9fab48d749ca62fcdfeda7c2a3d2f85cff7849338cedbe43bbcb8d9300979c0ec44c59a49de8f4efd2ee9a3d7feb3a1fa718f2311c89c6b5aedd99c6d870c
+ metadata.gz: e5b5368a732e42b1c57aff93a7172c95b0bad93ac646dcba495c0509c5b6c29cf8753601d1f04f028c98579846e5db7e7119ae9a6a4cca2f41441316b60b5c9c
+ data.tar.gz: 8596d6944d5151df3ad875d93dbd7cf2aee00c1ded85e053e96880b8c5420ca6ba18a72f57cc867a7bfedf38060be3c9ea4334d79d7a21b2503a078bce1d266a
checksums.yaml.gz.sig CHANGED
Binary file
data/.github/FUNDING.yml ADDED
@@ -0,0 +1 @@
+ custom: ['https://karafka.io/#become-pro']
data/.github/workflows/ci.yml CHANGED
@@ -22,16 +22,15 @@ jobs:
        fail-fast: false
        matrix:
          ruby:
-           - '3.3.0-preview2'
+           - '3.3'
            - '3.2'
            - '3.1'
            - '3.1.0'
            - '3.0'
            - '3.0.0'
            - '2.7'
-           - '2.7.0'
          include:
-           - ruby: '3.2'
+           - ruby: '3.3'
              coverage: 'true'
      steps:
        - uses: actions/checkout@v4
data/.ruby-version CHANGED
@@ -1 +1 @@
- 3.2.2
+ 3.3.0
data/CHANGELOG.md CHANGED
@@ -1,5 +1,30 @@
  # Rdkafka Changelog

+ ## 0.15.1 (2024-01-30)
+ - [Enhancement] Provide support for NixOS (alexandriainfantino)
+ - [Enhancement] Replace `rd_kafka_offset_store` with `rd_kafka_offsets_store` (mensfeld)
+ - [Enhancement] Alias `topic_name` as `topic` in the delivery report (mensfeld)
+ - [Enhancement] Provide a `label` reference on the producer handle and delivery report for improved traceability (mensfeld)
+ - [Enhancement] Include the error when invoking `create_result` on the producer handle (mensfeld)
+ - [Enhancement] Skip intermediate array creation on delivery report callback execution (one per message) (mensfeld)
+ - [Enhancement] Report `-1` instead of `nil` when `partition_count` fails (mensfeld)
+ - [Fix] Fix return type on `#rd_kafka_poll` (mensfeld)
+ - [Fix] `uint8_t` does not exist on Apple Silicon (mensfeld)
+ - [Fix] Missing ACL `RD_KAFKA_RESOURCE_BROKER` constant reference (mensfeld)
+ - [Fix] Partition cache caches an invalid nil result for `PARTITIONS_COUNT_TTL` (mensfeld)
+ - [Change] Rename `matching_acl_pattern_type` to `matching_acl_resource_pattern_type` to align the whole API (mensfeld)
+
+ ## 0.15.0 (2023-12-03)
+ - **[Feature]** Add `Admin#metadata` (mensfeld)
+ - **[Feature]** Add `Admin#create_partitions` (mensfeld)
+ - **[Feature]** Add `Admin#delete_group` utility (piotaixr)
+ - **[Feature]** Add create and delete ACL features to the admin functions (vgnanasekaran)
+ - **[Feature]** Support `#assignment_lost?` on a consumer to check for involuntary assignment revocation (mensfeld)
+ - [Enhancement] Expose an alternative way of managing consumer events via a separate queue (mensfeld)
+ - [Enhancement] **Bump** librdkafka to 2.3.0 (mensfeld)
+ - [Enhancement] Increase the `#lag` and `#query_watermark_offsets` default timeouts from 100ms to 1000ms to compensate for network glitches and remote cluster operations (mensfeld)
+ - [Change] Use `SecureRandom.uuid` instead of `random` for test consumer groups (mensfeld)
+
  ## 0.14.0 (2023-11-21)
  - [Enhancement] Add `raise_response_error` flag to the `Rdkafka::AbstractHandle`.
  - [Enhancement] Allow for setting `statistics_callback` as nil to reset predefined settings configured by a different gem (mensfeld)
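To make the `label` and `topic` entries above concrete, here is a minimal producer-side sketch; the broker address, topic, and label value are illustrative:

```ruby
require "rdkafka"

producer = Rdkafka::Config.new("bootstrap.servers": "localhost:9092").producer

# `label:` attaches an arbitrary reference to both the delivery handle and the
# delivery report, for traceability across the produce/ack cycle (0.15.1).
handle = producer.produce(topic: "events", payload: "hello", label: "order-42")

report = handle.wait(max_wait_timeout: 10)
report.label # => "order-42"
report.topic # => "events" (alias of #topic_name added in 0.15.1)

producer.close
```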
data/README.md CHANGED
@@ -18,21 +18,31 @@ become EOL.
  
  `rdkafka` was written because of the need for a reliable Ruby client for Kafka that supports modern Kafka at [AppSignal](https://appsignal.com). AppSignal runs it in production on very high-traffic systems.
  
- The most important pieces of a Kafka client are implemented. We're
- working towards feature completeness. You can track that here:
- https://github.com/appsignal/rdkafka-ruby/milestone/1
+ The most important pieces of a Kafka client are implemented, and we aim to provide all relevant consumer, producer, and admin APIs.
  
  ## Table of content
  
+ - [Project Scope](#project-scope)
  - [Installation](#installation)
  - [Usage](#usage)
-   * [Consuming messages](#consuming-messages)
-   * [Producing messages](#producing-messages)
- - [Higher level libraries](#higher-level-libraries)
-   * [Message processing frameworks](#message-processing-frameworks)
-   * [Message publishing libraries](#message-publishing-libraries)
+   * [Consuming Messages](#consuming-messages)
+   * [Producing Messages](#producing-messages)
+ - [Higher Level Libraries](#higher-level-libraries)
+   * [Message Processing Frameworks](#message-processing-frameworks)
+   * [Message Publishing Libraries](#message-publishing-libraries)
  - [Development](#development)
  - [Example](#example)
+ - [Versions](#versions)
+
+ ## Project Scope
+
+ While rdkafka-ruby aims to simplify the use of librdkafka in Ruby applications, it's important to understand the limitations of this library:
+
+ - **No Complex Producers/Consumers**: This library does not intend to offer complex producers or consumers. The aim is to stick closely to the functionalities provided by librdkafka itself.
+
+ - **Focus on librdkafka Capabilities**: Features that can be achieved directly in Ruby, without specific needs from librdkafka, are outside the scope of this library.
+
+ - **Existing High-Level Functionalities**: Certain high-level functionalities, like the producer metadata cache and the simple consumer, are already part of the library. Although they fall slightly outside the primary goal, they will remain part of the contract, given their existing usage.
  
  
  ## Installation
@@ -42,9 +52,9 @@ If you have any problems installing the gem, please open an issue.
  
  ## Usage
  
- See the [documentation](https://www.rubydoc.info/github/appsignal/rdkafka-ruby) for full details on how to use this gem. Two quick examples:
+ See the [documentation](https://karafka.io/docs/code/rdkafka-ruby/) for full details on how to use this gem. Two quick examples:
  
- ### Consuming messages
+ ### Consuming Messages
  
  Subscribe to a topic and get messages. Kafka will automatically spread
  the available partitions over consumers with the same group id.
@@ -62,7 +72,7 @@ consumer.each do |message|
  end
  ```
  
- ### Producing messages
+ ### Producing Messages
  
  Produce a number of messages, put the delivery handles in an array, and
  wait for them before exiting. This way the messages will be batched and
@@ -89,41 +99,42 @@ Note that creating a producer consumes some resources that will not be
  released until `#close` is explicitly called, so be sure to call
  `Config#producer` only as necessary.
  
- ## Higher level libraries
+ ## Higher Level Libraries
  
  Currently, there are two actively developed frameworks based on rdkafka-ruby that provide a higher-level API for working with Kafka messages, and one library for publishing messages.
  
- ### Message processing frameworks
+ ### Message Processing Frameworks
  
  * [Karafka](https://github.com/karafka/karafka) - Ruby and Rails efficient Kafka processing framework.
  * [Racecar](https://github.com/zendesk/racecar) - A simple framework for Kafka consumers in Ruby
  
- ### Message publishing libraries
+ ### Message Publishing Libraries
  
  * [WaterDrop](https://github.com/karafka/waterdrop) – Standalone Karafka library for producing Kafka messages.
  
  ## Development
  
- A Docker Compose file is included to run Kafka. To run
- that:
+ Contributors are encouraged to focus on enhancements that align with the core goal of the library. We appreciate contributions but will likely not accept pull requests for features that:
+
+ - Implement functionalities that can be achieved using standard Ruby capabilities without changes to the underlying rdkafka-ruby bindings.
+ - Deviate significantly from the primary aim of providing librdkafka bindings with Ruby-friendly interfaces.
+
+ A Docker Compose file is included to run Kafka. To run that:
  
  ```
  docker-compose up
  ```
  
- Run `bundle` and `cd ext && bundle exec rake && cd ..` to download and
- compile `librdkafka`.
+ Run `bundle` and `cd ext && bundle exec rake && cd ..` to download and compile `librdkafka`.
  
- You can then run `bundle exec rspec` to run the tests. To see rdkafka
- debug output:
+ You can then run `bundle exec rspec` to run the tests. To see rdkafka debug output:
  
  ```
  DEBUG_PRODUCER=true bundle exec rspec
  DEBUG_CONSUMER=true bundle exec rspec
  ```
  
- After running the tests, you can bring the cluster down to start with a
- clean slate:
+ After running the tests, you can bring the cluster down to start with a clean slate:
  
  ```
  docker-compose down
@@ -137,3 +148,14 @@ To see everything working, run these in separate tabs:
  bundle exec rake consume_messages
  bundle exec rake produce_messages
  ```
+
+ ## Versions
+
+ | rdkafka-ruby | librdkafka |
+ |-|-|
+ | 0.15.0 (2023-12-03) | 2.3.0 (2023-10-25) |
+ | 0.14.0 (2023-11-21) | 2.2.0 (2023-07-12) |
+ | 0.13.0 (2023-07-24) | 2.0.2 (2023-01-20) |
+ | 0.12.0 (2022-06-17) | 1.9.0 (2022-06-16) |
+ | 0.11.0 (2021-11-17) | 1.8.2 (2021-10-18) |
+ | 0.10.0 (2021-09-07) | 1.5.0 (2020-07-20) |
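The README's two quick examples are largely elided from this diff; for reference, a condensed sketch of the consuming example, assuming a local broker and an illustrative topic and group id:

```ruby
require "rdkafka"

config = {
  "bootstrap.servers": "localhost:9092",
  "group.id": "ruby-test" # illustrative group id
}

consumer = Rdkafka::Config.new(config).consumer
consumer.subscribe("ruby-test-topic")

# Kafka spreads the available partitions over consumers sharing a group id.
consumer.each do |message|
  puts "Message received: #{message}"
end
```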
data/docker-compose.yml CHANGED
@@ -3,7 +3,7 @@ version: '2'
3
3
  services:
4
4
  kafka:
5
5
  container_name: kafka
6
- image: confluentinc/cp-kafka:7.5.2
6
+ image: confluentinc/cp-kafka:7.5.3
7
7
 
8
8
  ports:
9
9
  - 9092:9092
@@ -23,3 +23,5 @@ services:
23
23
  KAFKA_AUTO_CREATE_TOPICS_ENABLE: 'true'
24
24
  KAFKA_TRANSACTION_STATE_LOG_REPLICATION_FACTOR: 1
25
25
  KAFKA_TRANSACTION_STATE_LOG_MIN_ISR: 1
26
+ KAFKA_ALLOW_EVERYONE_IF_NO_ACL_FOUND: "true"
27
+ KAFKA_AUTHORIZER_CLASS_NAME: org.apache.kafka.metadata.authorizer.StandardAuthorizer
data/ext/Rakefile CHANGED
@@ -1,40 +1,57 @@
  # frozen_string_literal: true
  
  require File.expand_path('../../lib/rdkafka/version', __FILE__)
- require "mini_portile2"
  require "fileutils"
  require "open-uri"
  
  task :default => :clean do
-   # Download and compile librdkafka
-   recipe = MiniPortile.new("librdkafka", Rdkafka::LIBRDKAFKA_VERSION)
+   # For nix users, nix can't locate the file paths because the packages it requires aren't managed by the system
+   # but by nix itself, so using the normal file paths doesn't work for nix users.
+   #
+   # mini_portile causes an issue because its dependencies are downloaded on the fly and therefore don't exist /
+   # aren't accessible in the nix environment.
+   if ENV.fetch('RDKAFKA_EXT_PATH', '').empty?
+     # Download and compile librdkafka if RDKAFKA_EXT_PATH is not set
+     require "mini_portile2"
+     recipe = MiniPortile.new("librdkafka", Rdkafka::LIBRDKAFKA_VERSION)
  
-   # Use default homebrew openssl if we're on mac and the directory exists
-   # and each of flags is not empty
-   if recipe.host&.include?("darwin") && system("which brew &> /dev/null") && Dir.exist?("#{homebrew_prefix = %x(brew --prefix openssl).strip}")
-     ENV["CPPFLAGS"] = "-I#{homebrew_prefix}/include" unless ENV["CPPFLAGS"]
-     ENV["LDFLAGS"] = "-L#{homebrew_prefix}/lib" unless ENV["LDFLAGS"]
-   end
+     # Use the default homebrew openssl if we're on mac, the directory exists,
+     # and each of the flags is not already set
+     if recipe.host&.include?("darwin") && system("which brew &> /dev/null") && Dir.exist?("#{homebrew_prefix = %x(brew --prefix openssl).strip}")
+       ENV["CPPFLAGS"] = "-I#{homebrew_prefix}/include" unless ENV["CPPFLAGS"]
+       ENV["LDFLAGS"] = "-L#{homebrew_prefix}/lib" unless ENV["LDFLAGS"]
+     end
  
-   recipe.files << {
-     :url => "https://codeload.github.com/confluentinc/librdkafka/tar.gz/v#{Rdkafka::LIBRDKAFKA_VERSION}",
-     :sha256 => Rdkafka::LIBRDKAFKA_SOURCE_SHA256
-   }
-   recipe.configure_options = ["--host=#{recipe.host}"]
-   recipe.cook
-   # Move dynamic library we're interested in
-   if recipe.host.include?('darwin')
-     from_extension = '1.dylib'
-     to_extension = 'dylib'
+     recipe.files << {
+       :url => "https://codeload.github.com/edenhill/librdkafka/tar.gz/v#{Rdkafka::LIBRDKAFKA_VERSION}",
+       :sha256 => Rdkafka::LIBRDKAFKA_SOURCE_SHA256
+     }
+     recipe.configure_options = ["--host=#{recipe.host}"]
+     recipe.cook
+     # Move the dynamic library we're interested in
+     if recipe.host.include?('darwin')
+       from_extension = '1.dylib'
+       to_extension = 'dylib'
+     else
+       from_extension = 'so.1'
+       to_extension = 'so'
+     end
+     lib_path = File.join(File.dirname(__FILE__), "ports/#{recipe.host}/librdkafka/#{Rdkafka::LIBRDKAFKA_VERSION}/lib/librdkafka.#{from_extension}")
+     FileUtils.mv(lib_path, File.join(File.dirname(__FILE__), "librdkafka.#{to_extension}"))
+     # Clean up files created by mini_portile that we don't need in the gem
+     FileUtils.rm_rf File.join(File.dirname(__FILE__), "tmp")
+     FileUtils.rm_rf File.join(File.dirname(__FILE__), "ports")
  else
-     from_extension = 'so.1'
-     to_extension = 'so'
+     # Otherwise, copy the existing libraries to ./ext
+     if ENV['RDKAFKA_EXT_PATH'].nil? || ENV['RDKAFKA_EXT_PATH'].empty?
+       raise "RDKAFKA_EXT_PATH must be set in your nix config when running under nix"
+     end
+     files = [
+       File.join(ENV['RDKAFKA_EXT_PATH'], 'lib', 'librdkafka.dylib'),
+       File.join(ENV['RDKAFKA_EXT_PATH'], 'lib', 'librdkafka.so')
+     ]
+     files.each { |ext| FileUtils.cp(ext, File.dirname(__FILE__)) if File.exist?(ext) }
  end
-   lib_path = File.join(File.dirname(__FILE__), "ports/#{recipe.host}/librdkafka/#{Rdkafka::LIBRDKAFKA_VERSION}/lib/librdkafka.#{from_extension}")
-   FileUtils.mv(lib_path, File.join(File.dirname(__FILE__), "librdkafka.#{to_extension}"))
-   # Cleanup files created by miniportile we don't need in the gem
-   FileUtils.rm_rf File.join(File.dirname(__FILE__), "tmp")
-   FileUtils.rm_rf File.join(File.dirname(__FILE__), "ports")
  end
  
  task :clean do
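A sketch of how the new `RDKAFKA_EXT_PATH` branch is exercised under nix; the store path is illustrative, and the copy shown mirrors what the task's fallback branch does:

```ruby
require "fileutils"

# Point the build at a prebuilt librdkafka instead of compiling via mini_portile2.
ENV["RDKAFKA_EXT_PATH"] = "/nix/store/example-librdkafka-2.3.0" # illustrative path

# Equivalent of the task's fallback: copy whichever shared library exists.
%w[librdkafka.dylib librdkafka.so].each do |lib|
  src = File.join(ENV["RDKAFKA_EXT_PATH"], "lib", lib)
  FileUtils.cp(src, __dir__) if File.exist?(src)
end
```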
data/lib/rdkafka/admin/acl_binding_result.rb ADDED
@@ -0,0 +1,51 @@
+ # frozen_string_literal: true
+
+ module Rdkafka
+   class Admin
+     # Extracts attributes of rd_kafka_AclBinding_t
+     class AclBindingResult
+       attr_reader :result_error, :error_string, :matching_acl_resource_type,
+                   :matching_acl_resource_name, :matching_acl_resource_pattern_type,
+                   :matching_acl_principal, :matching_acl_host, :matching_acl_operation,
+                   :matching_acl_permission_type
+
+       # This attribute was initially released under the name that is now an alias.
+       # We keep it for backwards compatibility, but it was renamed for consistency.
+       alias matching_acl_pattern_type matching_acl_resource_pattern_type
+
+       def initialize(matching_acl)
+         rd_kafka_error_pointer = Rdkafka::Bindings.rd_kafka_AclBinding_error(matching_acl)
+         @result_error = Rdkafka::Bindings.rd_kafka_error_code(rd_kafka_error_pointer)
+         error_string = Rdkafka::Bindings.rd_kafka_error_string(rd_kafka_error_pointer)
+
+         if error_string != FFI::Pointer::NULL
+           @error_string = error_string.read_string
+         end
+
+         @matching_acl_resource_type = Rdkafka::Bindings.rd_kafka_AclBinding_restype(matching_acl)
+         matching_acl_resource_name = Rdkafka::Bindings.rd_kafka_AclBinding_name(matching_acl)
+
+         if matching_acl_resource_name != FFI::Pointer::NULL
+           @matching_acl_resource_name = matching_acl_resource_name.read_string
+         end
+
+         @matching_acl_resource_pattern_type = Rdkafka::Bindings.rd_kafka_AclBinding_resource_pattern_type(matching_acl)
+         matching_acl_principal = Rdkafka::Bindings.rd_kafka_AclBinding_principal(matching_acl)
+
+         if matching_acl_principal != FFI::Pointer::NULL
+           @matching_acl_principal = matching_acl_principal.read_string
+         end
+
+         matching_acl_host = Rdkafka::Bindings.rd_kafka_AclBinding_host(matching_acl)
+
+         if matching_acl_host != FFI::Pointer::NULL
+           @matching_acl_host = matching_acl_host.read_string
+         end
+
+         @matching_acl_operation = Rdkafka::Bindings.rd_kafka_AclBinding_operation(matching_acl)
+         @matching_acl_permission_type = Rdkafka::Bindings.rd_kafka_AclBinding_permission_type(matching_acl)
+       end
+     end
+   end
+ end
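A sketch of what an `AclBindingResult` exposes once a describe or delete ACL operation has completed; the values are illustrative:

```ruby
# Given a report from Admin#describe_acl (see the sketch at the end of this diff):
result = report.acls.first

result.matching_acl_resource_name         # => "events"
result.matching_acl_principal             # => "User:alice"
result.matching_acl_host                  # => "*"
result.matching_acl_operation             # an rd_kafka_acl_operation_t value
result.matching_acl_resource_pattern_type # preferred name since 0.15.1
result.matching_acl_pattern_type          # backwards-compatible alias
```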
data/lib/rdkafka/admin/create_acl_handle.rb ADDED
@@ -0,0 +1,28 @@
+ # frozen_string_literal: true
+
+ module Rdkafka
+   class Admin
+     class CreateAclHandle < AbstractHandle
+       layout :pending, :bool,
+              :response, :int,
+              :response_string, :pointer
+
+       # @return [String] the name of the operation
+       def operation_name
+         "create acl"
+       end
+
+       # @return [CreateAclReport] instance with rdkafka_response of 0 and an empty rdkafka_response_string if the ACL creation was successful
+       def create_result
+         CreateAclReport.new(rdkafka_response: self[:response], rdkafka_response_string: self[:response_string])
+       end
+
+       def raise_error
+         raise RdkafkaError.new(
+           self[:response],
+           broker_message: self[:response_string].read_string
+         )
+       end
+     end
+   end
+ end
data/lib/rdkafka/admin/create_acl_report.rb ADDED
@@ -0,0 +1,24 @@
+ # frozen_string_literal: true
+
+ module Rdkafka
+   class Admin
+     class CreateAclReport
+       # Upon successful ACL creation, RD_KAFKA_RESP_ERR_NO_ERROR (0) is returned as rdkafka_response
+       # @return [Integer]
+       attr_reader :rdkafka_response
+
+       # Upon successful ACL creation, an empty string is returned as rdkafka_response_string
+       # @return [String]
+       attr_reader :rdkafka_response_string
+
+       def initialize(rdkafka_response:, rdkafka_response_string:)
+         @rdkafka_response = rdkafka_response
+         if rdkafka_response_string != FFI::Pointer::NULL
+           @rdkafka_response_string = rdkafka_response_string.read_string
+         end
+       end
+     end
+   end
+ end
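Putting the handle and report together, a sketch of creating an ACL, assuming the keyword-argument `Admin#create_acl` API added in 0.15.0; broker address and ACL values are illustrative:

```ruby
require "rdkafka"

admin = Rdkafka::Config.new("bootstrap.servers": "localhost:9092").admin

# Allow User:alice to read from the "events" topic.
handle = admin.create_acl(
  resource_type:         Rdkafka::Bindings::RD_KAFKA_RESOURCE_TOPIC,
  resource_name:         "events",
  resource_pattern_type: Rdkafka::Bindings::RD_KAFKA_RESOURCE_PATTERN_LITERAL,
  principal:             "User:alice",
  host:                  "*",
  operation:             Rdkafka::Bindings::RD_KAFKA_ACL_OPERATION_READ,
  permission_type:       Rdkafka::Bindings::RD_KAFKA_ACL_PERMISSION_TYPE_ALLOW
)

report = handle.wait(max_wait_timeout: 15)
report.rdkafka_response # => 0 (RD_KAFKA_RESP_ERR_NO_ERROR) on success

admin.close
```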
data/lib/rdkafka/admin/create_partitions_handle.rb ADDED
@@ -0,0 +1,27 @@
+ module Rdkafka
+   class Admin
+     class CreatePartitionsHandle < AbstractHandle
+       layout :pending, :bool,
+              :response, :int,
+              :error_string, :pointer,
+              :result_name, :pointer
+
+       # @return [String] the name of the operation
+       def operation_name
+         "create partitions"
+       end
+
+       # @return [CreatePartitionsReport] report on whether the partition creation was successful
+       def create_result
+         CreatePartitionsReport.new(self[:error_string], self[:result_name])
+       end
+
+       def raise_error
+         raise RdkafkaError.new(
+           self[:response],
+           broker_message: CreateTopicReport.new(self[:error_string], self[:result_name]).error_string
+         )
+       end
+     end
+   end
+ end
data/lib/rdkafka/admin/create_partitions_report.rb ADDED
@@ -0,0 +1,6 @@
+ module Rdkafka
+   class Admin
+     class CreatePartitionsReport < CreateTopicReport
+     end
+   end
+ end
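A sketch of `Admin#create_partitions` from 0.15.0, assuming a `(topic_name, partition_count)` signature; the topic name and count are illustrative:

```ruby
require "rdkafka"

admin = Rdkafka::Config.new("bootstrap.servers": "localhost:9092").admin

# Grow the "events" topic to 5 partitions (the count is the new total).
admin.create_partitions("events", 5).wait(max_wait_timeout: 15)

admin.close
```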
data/lib/rdkafka/admin/delete_acl_handle.rb ADDED
@@ -0,0 +1,30 @@
+ # frozen_string_literal: true
+
+ module Rdkafka
+   class Admin
+     class DeleteAclHandle < AbstractHandle
+       layout :pending, :bool,
+              :response, :int,
+              :response_string, :pointer,
+              :matching_acls, :pointer,
+              :matching_acls_count, :int
+
+       # @return [String] the name of the operation
+       def operation_name
+         "delete acl"
+       end
+
+       # @return [DeleteAclReport] instance with an array of matching_acls
+       def create_result
+         DeleteAclReport.new(matching_acls: self[:matching_acls], matching_acls_count: self[:matching_acls_count])
+       end
+
+       def raise_error
+         raise RdkafkaError.new(
+           self[:response],
+           broker_message: self[:response_string].read_string
+         )
+       end
+     end
+   end
+ end
data/lib/rdkafka/admin/delete_acl_report.rb ADDED
@@ -0,0 +1,23 @@
+ # frozen_string_literal: true
+
+ module Rdkafka
+   class Admin
+     class DeleteAclReport
+       # Deleted ACLs
+       # @return [Array<AclBindingResult>]
+       attr_reader :deleted_acls
+
+       def initialize(matching_acls:, matching_acls_count:)
+         @deleted_acls = []
+         if matching_acls != FFI::Pointer::NULL
+           acl_binding_result_pointers = matching_acls.read_array_of_pointer(matching_acls_count)
+           (1..matching_acls_count).each do |matching_acl_index|
+             @deleted_acls << AclBindingResult.new(acl_binding_result_pointers[matching_acl_index - 1])
+           end
+         end
+       end
+     end
+   end
+ end
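A delete counterpart to the create sketch above, reusing the `admin` instance from that example; matching bindings are removed and reported back, and the filter values remain illustrative:

```ruby
handle = admin.delete_acl(
  resource_type:         Rdkafka::Bindings::RD_KAFKA_RESOURCE_TOPIC,
  resource_name:         "events",
  resource_pattern_type: Rdkafka::Bindings::RD_KAFKA_RESOURCE_PATTERN_LITERAL,
  principal:             "User:alice",
  host:                  "*",
  operation:             Rdkafka::Bindings::RD_KAFKA_ACL_OPERATION_READ,
  permission_type:       Rdkafka::Bindings::RD_KAFKA_ACL_PERMISSION_TYPE_ALLOW
)

report = handle.wait(max_wait_timeout: 15)
report.deleted_acls.size # => number of ACL bindings that were removed
```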
data/lib/rdkafka/admin/delete_groups_handle.rb ADDED
@@ -0,0 +1,28 @@
+ # frozen_string_literal: true
+
+ module Rdkafka
+   class Admin
+     class DeleteGroupsHandle < AbstractHandle
+       layout :pending, :bool, # TODO: ???
+              :response, :int,
+              :error_string, :pointer,
+              :result_name, :pointer
+
+       # @return [String] the name of the operation
+       def operation_name
+         "delete groups"
+       end
+
+       def create_result
+         DeleteGroupsReport.new(self[:error_string], self[:result_name])
+       end
+
+       def raise_error
+         raise RdkafkaError.new(
+           self[:response],
+           broker_message: create_result.error_string
+         )
+       end
+     end
+   end
+ end
data/lib/rdkafka/admin/delete_groups_report.rb ADDED
@@ -0,0 +1,24 @@
+ # frozen_string_literal: true
+
+ module Rdkafka
+   class Admin
+     class DeleteGroupsReport
+       # Any error message generated from the delete groups operation
+       # @return [String]
+       attr_reader :error_string
+
+       # The name of the group that was deleted
+       # @return [String]
+       attr_reader :result_name
+
+       def initialize(error_string, result_name)
+         if error_string != FFI::Pointer::NULL
+           @error_string = error_string.read_string
+         end
+         if result_name != FFI::Pointer::NULL
+           @result_name = result_name.read_string
+         end
+       end
+     end
+   end
+ end
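A sketch of `Admin#delete_group` from 0.15.0, assuming it takes the consumer group id; the group generally must have no active members, and the id here is illustrative:

```ruby
require "rdkafka"

admin = Rdkafka::Config.new("bootstrap.servers": "localhost:9092").admin

# Remove an empty, obsolete consumer group.
admin.delete_group("obsolete-consumer-group").wait(max_wait_timeout: 15)

admin.close
```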
data/lib/rdkafka/admin/describe_acl_handle.rb ADDED
@@ -0,0 +1,30 @@
+ # frozen_string_literal: true
+
+ module Rdkafka
+   class Admin
+     class DescribeAclHandle < AbstractHandle
+       layout :pending, :bool,
+              :response, :int,
+              :response_string, :pointer,
+              :acls, :pointer,
+              :acls_count, :int
+
+       # @return [String] the name of the operation.
+       def operation_name
+         "describe acl"
+       end
+
+       # @return [DescribeAclReport] instance with an array of ACLs that match the request filters.
+       def create_result
+         DescribeAclReport.new(acls: self[:acls], acls_count: self[:acls_count])
+       end
+
+       def raise_error
+         raise RdkafkaError.new(
+           self[:response],
+           broker_message: self[:response_string].read_string
+         )
+       end
+     end
+   end
+ end
data/lib/rdkafka/admin/describe_acl_report.rb ADDED
@@ -0,0 +1,23 @@
+ # frozen_string_literal: true
+
+ module Rdkafka
+   class Admin
+     class DescribeAclReport
+       # ACLs that exist in the cluster for the resource_type, resource_name, and pattern_type filters provided in the request.
+       # @return [Array<AclBindingResult>] array of matching ACLs.
+       attr_reader :acls
+
+       def initialize(acls:, acls_count:)
+         @acls = []
+         if acls != FFI::Pointer::NULL
+           acl_binding_result_pointers = acls.read_array_of_pointer(acls_count)
+           (1..acls_count).each do |acl_index|
+             @acls << AclBindingResult.new(acl_binding_result_pointers[acl_index - 1])
+           end
+         end
+       end
+     end
+   end
+ end
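Finally, a sketch of querying matching bindings via `Admin#describe_acl`, assuming the same keyword API as the create/delete sketches and that `nil` and the `ANY` constants widen the filter; all values are illustrative:

```ruby
handle = admin.describe_acl(
  resource_type:         Rdkafka::Bindings::RD_KAFKA_RESOURCE_TOPIC,
  resource_name:         "events",
  resource_pattern_type: Rdkafka::Bindings::RD_KAFKA_RESOURCE_PATTERN_LITERAL,
  principal:             nil, # nil matches any principal
  host:                  nil, # nil matches any host
  operation:             Rdkafka::Bindings::RD_KAFKA_ACL_OPERATION_ANY,
  permission_type:       Rdkafka::Bindings::RD_KAFKA_ACL_PERMISSION_TYPE_ANY
)

report = handle.wait(max_wait_timeout: 15)
report.acls.each do |acl|
  puts "#{acl.matching_acl_principal} -> #{acl.matching_acl_resource_name}"
end
```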