logstash-filter-csv 3.0.6 → 3.1.1

checksums.yaml CHANGED
@@ -1,7 +1,7 @@
  ---
- SHA1:
- metadata.gz: 9ff0bd6fdf58706e32acf77c116694d1274b7d11
- data.tar.gz: 28b97ce191613e3245376d2c21475623c4a0f985
+ SHA256:
+ metadata.gz: bf9f8d4119cd1f50a3456eecc603223146a3a8af699161e69620830e94a89bd2
+ data.tar.gz: 955a26060fa7c92bdbc4f00b632e8c33ef2ade7ed54d03084503666d5f89a583
  SHA512:
- metadata.gz: a1533f34f35e0701ae830109a548f0ff1d260147c99e36f4fd82ebdb2700a22d7500da3edb48bc15c549c2e63fa529967017d7f11b3e4e009e1977ced0b21b29
- data.tar.gz: a04d372cb6114b935828ec6de9ca6d2821c1afab13287819b200f2ca3f1196e638368be53e78e40da5878ad7a405f5d3d51ea2774de99c34982bc17c304dd020
+ metadata.gz: 8439e88c6dac47b52866730cb04c015cb1e88f439316063da35fb9264383ac7e177d45801aa0da64756b79bbfeee5d6ee0bbd31bc04b2e33475f839288b4da36
+ data.tar.gz: ce650dfb546568cd8983d4949662739fbe794b097ea391b4a20b0d46d2d793ad2090e9b212248def1d32721baf26f364c2a5d4e69275a0eef4e1f959b386dd05
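The release moves the recorded digests from SHA-1 to SHA-256. A downloaded artifact can be checked against such a digest with coreutils; a minimal self-contained sketch (the filenames here are illustrative stand-ins, not the gem's real archives):

```shell
# Create a throwaway artifact, record its SHA-256, then verify it --
# the same shape as checking a downloaded archive against checksums.yaml.
printf 'example payload' > artifact.tar.gz
sha256sum artifact.tar.gz > artifact.tar.gz.sha256
sha256sum --check artifact.tar.gz.sha256
```

The final command prints `artifact.tar.gz: OK` when the digest matches.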
data/CHANGELOG.md CHANGED
@@ -1,3 +1,25 @@
+ ## 3.1.1
+ - Refactor: unified ECS target + validate field reference [#86](https://github.com/logstash-plugins/logstash-filter-csv/pull/86)
+
+ ## 3.1.0
+ - Add ECS support [#85](https://github.com/logstash-plugins/logstash-filter-csv/pull/85)
+
+ ## 3.0.11
+ - [DOC] Fixed formatting to improve readability [#84](https://github.com/logstash-plugins/logstash-filter-csv/pull/84)
+
+ ## 3.0.10
+ - [DOC] Fix asciidoc formatting for example [#73](https://github.com/logstash-plugins/logstash-filter-csv/pull/73)
+
+ ## 3.0.9
+ - [DOC] Document that the `autodetect_column_names` and `skip_header` options work only when the number of Logstash
+ pipeline workers is set to `1`.
+
+ ## 3.0.8
+ - feature: Added support for tagging empty rows which users can reference to conditionally drop events
+
+ ## 3.0.7
+ - Update gemspec summary
+
  ## 3.0.6
  - Fix a bug where `[nested][field]` references were incorrectly used. (#24, #52)
 
data/CONTRIBUTORS CHANGED
@@ -12,6 +12,7 @@ Contributors:
  * Pier-Hugues Pellerin (ph)
  * Richard Pijnenburg (electrical)
  * Suyog Rao (suyograo)
+ * Abdul Haseeb Hussain (AbdulHaseebHussain)
 
  Note: If you've sent us patches, bug reports, or otherwise contributed to
  Logstash, and you aren't on the list above and want to be, please let us know
data/LICENSE CHANGED
@@ -1,13 +1,202 @@
- Copyright (c) 2012–2016 Elasticsearch <http://www.elastic.co>
 
- Licensed under the Apache License, Version 2.0 (the "License");
- you may not use this file except in compliance with the License.
- You may obtain a copy of the License at
+ Apache License
+ Version 2.0, January 2004
+ http://www.apache.org/licenses/
 
- http://www.apache.org/licenses/LICENSE-2.0
+ TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION
 
- Unless required by applicable law or agreed to in writing, software
- distributed under the License is distributed on an "AS IS" BASIS,
- WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
- See the License for the specific language governing permissions and
- limitations under the License.
+ 1. Definitions.
+
+ "License" shall mean the terms and conditions for use, reproduction,
+ and distribution as defined by Sections 1 through 9 of this document.
+
+ "Licensor" shall mean the copyright owner or entity authorized by
+ the copyright owner that is granting the License.
+
+ "Legal Entity" shall mean the union of the acting entity and all
+ other entities that control, are controlled by, or are under common
+ control with that entity. For the purposes of this definition,
+ "control" means (i) the power, direct or indirect, to cause the
+ direction or management of such entity, whether by contract or
+ otherwise, or (ii) ownership of fifty percent (50%) or more of the
+ outstanding shares, or (iii) beneficial ownership of such entity.
+
+ "You" (or "Your") shall mean an individual or Legal Entity
+ exercising permissions granted by this License.
+
+ "Source" form shall mean the preferred form for making modifications,
+ including but not limited to software source code, documentation
+ source, and configuration files.
+
+ "Object" form shall mean any form resulting from mechanical
+ transformation or translation of a Source form, including but
+ not limited to compiled object code, generated documentation,
+ and conversions to other media types.
+
+ "Work" shall mean the work of authorship, whether in Source or
+ Object form, made available under the License, as indicated by a
+ copyright notice that is included in or attached to the work
+ (an example is provided in the Appendix below).
+
+ "Derivative Works" shall mean any work, whether in Source or Object
+ form, that is based on (or derived from) the Work and for which the
+ editorial revisions, annotations, elaborations, or other modifications
+ represent, as a whole, an original work of authorship. For the purposes
+ of this License, Derivative Works shall not include works that remain
+ separable from, or merely link (or bind by name) to the interfaces of,
+ the Work and Derivative Works thereof.
+
+ "Contribution" shall mean any work of authorship, including
+ the original version of the Work and any modifications or additions
+ to that Work or Derivative Works thereof, that is intentionally
+ submitted to Licensor for inclusion in the Work by the copyright owner
+ or by an individual or Legal Entity authorized to submit on behalf of
+ the copyright owner. For the purposes of this definition, "submitted"
+ means any form of electronic, verbal, or written communication sent
+ to the Licensor or its representatives, including but not limited to
+ communication on electronic mailing lists, source code control systems,
+ and issue tracking systems that are managed by, or on behalf of, the
+ Licensor for the purpose of discussing and improving the Work, but
+ excluding communication that is conspicuously marked or otherwise
+ designated in writing by the copyright owner as "Not a Contribution."
+
+ "Contributor" shall mean Licensor and any individual or Legal Entity
+ on behalf of whom a Contribution has been received by Licensor and
+ subsequently incorporated within the Work.
+
+ 2. Grant of Copyright License. Subject to the terms and conditions of
+ this License, each Contributor hereby grants to You a perpetual,
+ worldwide, non-exclusive, no-charge, royalty-free, irrevocable
+ copyright license to reproduce, prepare Derivative Works of,
+ publicly display, publicly perform, sublicense, and distribute the
+ Work and such Derivative Works in Source or Object form.
+
+ 3. Grant of Patent License. Subject to the terms and conditions of
+ this License, each Contributor hereby grants to You a perpetual,
+ worldwide, non-exclusive, no-charge, royalty-free, irrevocable
+ (except as stated in this section) patent license to make, have made,
+ use, offer to sell, sell, import, and otherwise transfer the Work,
+ where such license applies only to those patent claims licensable
+ by such Contributor that are necessarily infringed by their
+ Contribution(s) alone or by combination of their Contribution(s)
+ with the Work to which such Contribution(s) was submitted. If You
+ institute patent litigation against any entity (including a
+ cross-claim or counterclaim in a lawsuit) alleging that the Work
+ or a Contribution incorporated within the Work constitutes direct
+ or contributory patent infringement, then any patent licenses
+ granted to You under this License for that Work shall terminate
+ as of the date such litigation is filed.
+
+ 4. Redistribution. You may reproduce and distribute copies of the
+ Work or Derivative Works thereof in any medium, with or without
+ modifications, and in Source or Object form, provided that You
+ meet the following conditions:
+
+ (a) You must give any other recipients of the Work or
+ Derivative Works a copy of this License; and
+
+ (b) You must cause any modified files to carry prominent notices
+ stating that You changed the files; and
+
+ (c) You must retain, in the Source form of any Derivative Works
+ that You distribute, all copyright, patent, trademark, and
+ attribution notices from the Source form of the Work,
+ excluding those notices that do not pertain to any part of
+ the Derivative Works; and
+
+ (d) If the Work includes a "NOTICE" text file as part of its
+ distribution, then any Derivative Works that You distribute must
+ include a readable copy of the attribution notices contained
+ within such NOTICE file, excluding those notices that do not
+ pertain to any part of the Derivative Works, in at least one
+ of the following places: within a NOTICE text file distributed
+ as part of the Derivative Works; within the Source form or
+ documentation, if provided along with the Derivative Works; or,
+ within a display generated by the Derivative Works, if and
+ wherever such third-party notices normally appear. The contents
+ of the NOTICE file are for informational purposes only and
+ do not modify the License. You may add Your own attribution
+ notices within Derivative Works that You distribute, alongside
+ or as an addendum to the NOTICE text from the Work, provided
+ that such additional attribution notices cannot be construed
+ as modifying the License.
+
+ You may add Your own copyright statement to Your modifications and
+ may provide additional or different license terms and conditions
+ for use, reproduction, or distribution of Your modifications, or
+ for any such Derivative Works as a whole, provided Your use,
+ reproduction, and distribution of the Work otherwise complies with
+ the conditions stated in this License.
+
+ 5. Submission of Contributions. Unless You explicitly state otherwise,
+ any Contribution intentionally submitted for inclusion in the Work
+ by You to the Licensor shall be under the terms and conditions of
+ this License, without any additional terms or conditions.
+ Notwithstanding the above, nothing herein shall supersede or modify
+ the terms of any separate license agreement you may have executed
+ with Licensor regarding such Contributions.
+
+ 6. Trademarks. This License does not grant permission to use the trade
+ names, trademarks, service marks, or product names of the Licensor,
+ except as required for reasonable and customary use in describing the
+ origin of the Work and reproducing the content of the NOTICE file.
+
+ 7. Disclaimer of Warranty. Unless required by applicable law or
+ agreed to in writing, Licensor provides the Work (and each
+ Contributor provides its Contributions) on an "AS IS" BASIS,
+ WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
+ implied, including, without limitation, any warranties or conditions
+ of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A
+ PARTICULAR PURPOSE. You are solely responsible for determining the
+ appropriateness of using or redistributing the Work and assume any
+ risks associated with Your exercise of permissions under this License.
+
+ 8. Limitation of Liability. In no event and under no legal theory,
+ whether in tort (including negligence), contract, or otherwise,
+ unless required by applicable law (such as deliberate and grossly
+ negligent acts) or agreed to in writing, shall any Contributor be
+ liable to You for damages, including any direct, indirect, special,
+ incidental, or consequential damages of any character arising as a
+ result of this License or out of the use or inability to use the
+ Work (including but not limited to damages for loss of goodwill,
+ work stoppage, computer failure or malfunction, or any and all
+ other commercial damages or losses), even if such Contributor
+ has been advised of the possibility of such damages.
+
+ 9. Accepting Warranty or Additional Liability. While redistributing
+ the Work or Derivative Works thereof, You may choose to offer,
+ and charge a fee for, acceptance of support, warranty, indemnity,
+ or other liability obligations and/or rights consistent with this
+ License. However, in accepting such obligations, You may act only
+ on Your own behalf and on Your sole responsibility, not on behalf
+ of any other Contributor, and only if You agree to indemnify,
+ defend, and hold each Contributor harmless for any liability
+ incurred by, or claims asserted against, such Contributor by reason
+ of your accepting any such warranty or additional liability.
+
+ END OF TERMS AND CONDITIONS
+
+ APPENDIX: How to apply the Apache License to your work.
+
+ To apply the Apache License to your work, attach the following
+ boilerplate notice, with the fields enclosed by brackets "[]"
+ replaced with your own identifying information. (Don't include
+ the brackets!) The text should be enclosed in the appropriate
+ comment syntax for the file format. We also recommend that a
+ file or class name and description of purpose be included on the
+ same "printed page" as the copyright notice for easier
+ identification within third-party archives.
+
+ Copyright 2020 Elastic and contributors
+
+ Licensed under the Apache License, Version 2.0 (the "License");
+ you may not use this file except in compliance with the License.
+ You may obtain a copy of the License at
+
+ http://www.apache.org/licenses/LICENSE-2.0
+
+ Unless required by applicable law or agreed to in writing, software
+ distributed under the License is distributed on an "AS IS" BASIS,
+ WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ See the License for the specific language governing permissions and
+ limitations under the License.
data/README.md CHANGED
@@ -1,6 +1,6 @@
  # Logstash Plugin
 
- [![Travis Build Status](https://travis-ci.org/logstash-plugins/logstash-filter-csv.svg)](https://travis-ci.org/logstash-plugins/logstash-filter-csv)
+ [![Travis Build Status](https://travis-ci.com/logstash-plugins/logstash-filter-csv.svg)](https://travis-ci.com/logstash-plugins/logstash-filter-csv)
 
  This is a plugin for [Logstash](https://github.com/elastic/logstash).
 
data/VERSION ADDED
@@ -0,0 +1 @@
+ 3.1.1
data/docs/index.asciidoc CHANGED
@@ -21,8 +21,14 @@ include::{include_path}/plugin_header.asciidoc[]
  ==== Description
 
  The CSV filter takes an event field containing CSV data, parses it,
- and stores it as individual fields (can optionally specify the names).
- This filter can also parse data with any separator, not just commas.
+ and stores it as individual fields with optionally-specified field names.
+ This filter can parse data with any separator, not just commas.
+
+ [id="plugins-{type}s-{plugin}-ecs_metadata"]
+ ==== Event Metadata and the Elastic Common Schema (ECS)
+ The plugin behaves the same regardless of ECS compatibility, except that it logs a warning when ECS is enabled and `target` isn't set.
+
+ TIP: Set the `target` option to avoid potential schema conflicts.
 
  [id="plugins-{type}s-{plugin}-options"]
  ==== Csv Filter Configuration Options
@@ -36,9 +42,12 @@ This plugin supports the following configuration options plus the <<plugins-{typ
  | <<plugins-{type}s-{plugin}-autogenerate_column_names>> |<<boolean,boolean>>|No
  | <<plugins-{type}s-{plugin}-columns>> |<<array,array>>|No
  | <<plugins-{type}s-{plugin}-convert>> |<<hash,hash>>|No
+ | <<plugins-{type}s-{plugin}-ecs_compatibility>> |<<string,string>>|No
  | <<plugins-{type}s-{plugin}-quote_char>> |<<string,string>>|No
  | <<plugins-{type}s-{plugin}-separator>> |<<string,string>>|No
  | <<plugins-{type}s-{plugin}-skip_empty_columns>> |<<boolean,boolean>>|No
+ | <<plugins-{type}s-{plugin}-skip_empty_rows>> |<<boolean,boolean>>|No
+ | <<plugins-{type}s-{plugin}-skip_header>> |<<boolean,boolean>>|No
  | <<plugins-{type}s-{plugin}-source>> |<<string,string>>|No
  | <<plugins-{type}s-{plugin}-target>> |<<string,string>>|No
  |=======================================================================
@@ -57,6 +66,8 @@ filter plugins.
  Define whether column names should be auto-detected from the header column or not.
  Defaults to false.
 
+ Logstash pipeline workers must be set to `1` for this option to work.
+
  [id="plugins-{type}s-{plugin}-autogenerate_column_names"]
  ===== `autogenerate_column_names`
 
@@ -88,7 +99,8 @@ in the data than specified in this column list, extra columns will be auto-numbe
  Define a set of datatype conversions to be applied to columns.
  Possible conversions are integer, float, date, date_time, boolean
 
- # Example:
+ Example:
+
  [source,ruby]
  filter {
  csv {
@@ -99,6 +111,18 @@ Possible conversions are integer, float, date, date_time, boolean
  }
  }
 
+ [id="plugins-{type}s-{plugin}-ecs_compatibility"]
+ ===== `ecs_compatibility`
+
+ * Value type is <<string,string>>
+ * Supported values are:
+ ** `disabled`: does not use ECS-compatible field names
+ ** `v1`: uses the value in `target` as the field name
+
+ Controls this plugin's compatibility with the
+ {ecs-ref}[Elastic Common Schema (ECS)].
+ See <<plugins-{type}s-{plugin}-ecs_metadata>> for detailed information.
+
  [id="plugins-{type}s-{plugin}-quote_char"]
  ===== `quote_char`
 
@@ -129,6 +153,33 @@ Optional.
  Define whether empty columns should be skipped.
  Defaults to false. If set to true, columns containing no value will not get set.
 
+ [id="plugins-{type}s-{plugin}-skip_empty_rows"]
+ ===== `skip_empty_rows`
+
+ * Value type is <<boolean,boolean>>
+ * Default value is `false`
+
+ Define whether empty rows should be skipped.
+ Defaults to false. If set to true, rows containing no value will be tagged with "_csvskippedemptyfield".
+ This tag can be referenced by users if they wish to cancel events using an 'if' conditional statement.
+
+ [id="plugins-{type}s-{plugin}-skip_header"]
+ ===== `skip_header`
+
+ * Value type is <<boolean,boolean>>
+ * Default value is `false`
+
+ Define whether the header should be skipped.
+ Defaults to false. If set to true, the header will be skipped.
+ This assumes the header is not repeated within further rows, as such rows will also be skipped.
+ If `skip_header` is set without `autodetect_column_names`, then `columns` should be set; any row that
+ exactly matches the specified column values will be skipped.
+ If both `skip_header` and `autodetect_column_names` are set, then `columns` should not be specified; in this case
+ `autodetect_column_names` fills the columns setting in the background from the first event seen, and any
+ subsequent row that matches the autodetected values will be skipped.
+
+ Logstash pipeline workers must be set to `1` for this option to work.
+
  [id="plugins-{type}s-{plugin}-source"]
  ===== `source`
 
@@ -148,6 +199,5 @@ Define target field for placing the data.
  Defaults to writing to the root of the event.
 
 
- 
  [id="plugins-{type}s-{plugin}-common-options"]
- include::{include_path}/{type}.asciidoc[]
+ include::{include_path}/{type}.asciidoc[]
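The empty-row handling documented above follows from Ruby's stdlib CSV behavior: `CSV.parse_line` returns `nil` for an empty input line, which is the condition the filter detects before tagging the event. A minimal standalone sketch:

```ruby
require "csv"

# An empty source line yields nil rather than an array of columns;
# this is what `skip_empty_rows` detects before tagging the event
# with "_csvskippedemptyfield".
empty = CSV.parse_line("")
puts empty.inspect                     # => nil

# A non-empty line parses into an array of column values.
puts CSV.parse_line("a,b,c").inspect   # => ["a", "b", "c"]
```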
@@ -1,6 +1,9 @@
  # encoding: utf-8
  require "logstash/filters/base"
  require "logstash/namespace"
+ require 'logstash/plugin_mixins/ecs_compatibility_support'
+ require 'logstash/plugin_mixins/ecs_compatibility_support/target_check'
+ require 'logstash/plugin_mixins/validator_support/field_reference_validation_adapter'
 
  require "csv"
 
@@ -8,6 +11,11 @@ require "csv"
  # and stores it as individual fields (can optionally specify the names).
  # This filter can also parse data with any separator, not just commas.
  class LogStash::Filters::CSV < LogStash::Filters::Base
+ include LogStash::PluginMixins::ECSCompatibilitySupport(:disabled, :v1, :v8 => :v1)
+ include LogStash::PluginMixins::ECSCompatibilitySupport::TargetCheck
+
+ extend LogStash::PluginMixins::ValidatorSupport::FieldReferenceValidationAdapter
+
  config_name "csv"
 
  # The CSV data in the value of the `source` field will be expanded into a
@@ -35,16 +43,25 @@ class LogStash::Filters::CSV < LogStash::Filters::Base
 
  # Define target field for placing the data.
  # Defaults to writing to the root of the event.
- config :target, :validate => :string
+ config :target, :validate => :field_reference
 
  # Define whether column names should be autogenerated or not.
  # Defaults to true. If set to false, columns not having a header specified will not be parsed.
  config :autogenerate_column_names, :validate => :boolean, :default => true
 
+ # Define whether the header should be skipped or not.
+ # Defaults to false. If set to true, the header is dropped.
+ config :skip_header, :validate => :boolean, :default => false
+
  # Define whether empty columns should be skipped.
  # Defaults to false. If set to true, columns containing no value will not get set.
  config :skip_empty_columns, :validate => :boolean, :default => false
 
+ # Define whether empty rows should be skipped.
+ # Defaults to false. If set to true, rows containing no value will be tagged with _csvskippedemptyfield.
+ # This tag can be referenced by users if they wish to cancel events using an 'if' conditional statement.
+ config :skip_empty_rows, :validate => :boolean, :default => false
+
  # Define a set of datatype conversions to be applied to columns.
  # Possible conversions are integer, float, date, date_time, boolean
  #
@@ -104,7 +121,7 @@ class LogStash::Filters::CSV < LogStash::Filters::Base
  # @convert_symbols contains the symbolized types to avoid symbol conversion in the transform method
  @convert_symbols = @convert.inject({}){|result, (k, v)| result[k] = v.to_sym; result}
 
- # make sure @target is in the format [field name] if defined, i.e. surrounded by brakets
+ # make sure @target is in the format [field name] if defined, i.e. surrounded by brackets
  @target = "[#{@target}]" if @target && !@target.start_with?("[")
 
  # if the zero byte character is entered in the config, set the value
@@ -116,11 +133,12 @@ class LogStash::Filters::CSV < LogStash::Filters::Base
  end
 
  def filter(event)
- @logger.debug? && @logger.debug("Running csv filter", :event => event)
+ @logger.debug? && @logger.debug("Running csv filter", :event => event.to_hash)
 
  if (source = event.get(@source))
  begin
- values = CSV.parse_line(source, :col_sep => @separator, :quote_char => @quote_char)
+
+ values = CSV.parse_line(source, :col_sep => @separator, :quote_char => @quote_char)
 
  if (@autodetect_column_names && @columns.empty?)
  @columns = values
@@ -128,6 +146,17 @@ class LogStash::Filters::CSV < LogStash::Filters::Base
  return
  end
 
+ if (@skip_header && (!@columns.empty?) && (@columns == values))
+ event.cancel
+ return
+ end
+
+ if(@skip_empty_rows && values.nil?)
+ # applies a tag to empty rows; users can cancel the event by referencing this tag in an 'if' conditional statement
+ event.tag("_csvskippedemptyfield")
+ return
+ end
+
  values.each_index do |i|
  unless (@skip_empty_columns && (values[i].nil? || values[i].empty?))
  unless ignore_field?(i)
@@ -145,7 +174,7 @@ class LogStash::Filters::CSV < LogStash::Filters::Base
  end
  end
 
- @logger.debug? && @logger.debug("Event after csv filter", :event => event)
+ @logger.debug? && @logger.debug("Event after csv filter", :event => event.to_hash)
  end
 
  private
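The parsing step in the filter delegates to Ruby's stdlib CSV with the configured `separator` and `quote_char`. A standalone sketch of the same `CSV.parse_line` call, using the inputs the plugin's own specs exercise:

```ruby
require "csv"

# Custom separator, as in the filter's "separator" option:
puts CSV.parse_line("big,bird;sesame street", :col_sep => ";").inspect
# => ["big,bird", "sesame street"]

# Custom quote character, as in the "quote_char" option:
puts CSV.parse_line("big,bird,'sesame street'", :quote_char => "'").inspect
# => ["big", "bird", "sesame street"]
```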
@@ -1,9 +1,11 @@
+ CSV_VERSION = File.read(File.expand_path(File.join(File.dirname(__FILE__), "VERSION"))).strip unless defined?(CSV_VERSION)
+
  Gem::Specification.new do |s|
 
  s.name = 'logstash-filter-csv'
- s.version = '3.0.6'
+ s.version = CSV_VERSION
  s.licenses = ['Apache License (2.0)']
- s.summary = "The CSV filter takes an event field containing CSV data, parses it, and stores it as individual fields (can optionally specify the names)."
+ s.summary = "Parses comma-separated value data into individual fields"
  s.description = "This gem is a Logstash plugin required to be installed on top of the Logstash core pipeline using $LS_HOME/bin/logstash-plugin install gemname. This gem is not a stand-alone program"
  s.authors = ["Elastic"]
  s.email = 'info@elastic.co'
@@ -21,6 +23,8 @@ Gem::Specification.new do |s|
 
  # Gem dependencies
  s.add_runtime_dependency "logstash-core-plugin-api", ">= 1.60", "<= 2.99"
+ s.add_runtime_dependency 'logstash-mixin-ecs_compatibility_support', '~> 1.3'
+ s.add_runtime_dependency 'logstash-mixin-validator_support', '~> 1.0'
 
  s.add_development_dependency 'logstash-devutils'
  end
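The gemspec change above single-sources the version from the new `VERSION` file. The same read-and-strip pattern can be exercised on its own; a small sketch using a temporary directory as a stand-in for the gem root:

```ruby
require "tmpdir"

Dir.mktmpdir do |root|
  # Stand-in for the VERSION file the gemspec now reads.
  File.write(File.join(root, "VERSION"), "3.1.1\n")

  # Same pattern as the gemspec: read the file and strip the trailing newline.
  version = File.read(File.expand_path(File.join(root, "VERSION"))).strip
  puts version  # => 3.1.1
end
```

Keeping the version in one file means a release only has to bump `VERSION`, rather than editing the gemspec.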
@@ -1,243 +1,260 @@
  # encoding: utf-8
  require "logstash/devutils/rspec/spec_helper"
  require "logstash/filters/csv"
+ require 'logstash/plugin_mixins/ecs_compatibility_support/spec_helper'
 
- describe LogStash::Filters::CSV do
+ describe LogStash::Filters::CSV, :ecs_compatibility_support, :aggregate_failures do
+ ecs_compatibility_matrix(:disabled, :v1, :v8 => :v1) do |ecs_select|
 
- subject(:plugin) { LogStash::Filters::CSV.new(config) }
- let(:config) { Hash.new }
+ subject(:plugin) { LogStash::Filters::CSV.new(config) }
+ let(:config) { Hash.new }
 
- let(:doc) { "" }
- let(:event) { LogStash::Event.new("message" => doc) }
+ let(:doc) { "" }
+ let(:event) { LogStash::Event.new("message" => doc) }
 
- describe "registration" do
+ describe "registration" do
 
- context "when using invalid data types" do
- let(:config) do
- { "convert" => { "custom1" => "integer", "custom3" => "wrong_type" },
- "columns" => ["custom1", "custom2", "custom3"] }
- end
-
- it "should register" do
- input = LogStash::Plugin.lookup("filter", "csv").new(config)
- expect {input.register}.to raise_error(LogStash::ConfigurationError)
- end
- end
- end
-
- describe "receive" do
-
- before(:each) do
- plugin.register
- end
-
- describe "all defaults" do
-
- let(:config) { Hash.new }
-
- let(:doc) { "big,bird,sesame street" }
-
- it "extract all the values" do
- plugin.filter(event)
- expect(event.get("column1")).to eq("big")
- expect(event.get("column2")).to eq("bird")
- expect(event.get("column3")).to eq("sesame street")
- end
-
- it "should not mutate the source field" do
- plugin.filter(event)
- expect(event.get("message")).to be_kind_of(String)
- end
- end
-
- describe "custom separator" do
- let(:doc) { "big,bird;sesame street" }
+ context "when using invalid data types" do
+ let(:config) do
+ { "convert" => { "custom1" => "integer", "custom3" => "wrong_type" },
+ "columns" => ["custom1", "custom2", "custom3"] }
+ end
 
- let(:config) do
- { "separator" => ";" }
- end
- it "extract all the values" do
- plugin.filter(event)
- expect(event.get("column1")).to eq("big,bird")
- expect(event.get("column2")).to eq("sesame street")
+ it "should register" do
+ input = LogStash::Plugin.lookup("filter", "csv").new(config)
+ expect {input.register}.to raise_error(LogStash::ConfigurationError)
+ end
  end
  end
 
- describe "quote char" do
- let(:doc) { "big,bird,'sesame street'" }
+ describe "receive" do
 
- let(:config) do
- { "quote_char" => "'"}
+ before(:each) do
+ allow_any_instance_of(described_class).to receive(:ecs_compatibility).and_return(ecs_compatibility)
+ plugin.register
  end
 
- it "extract all the values" do
- plugin.filter(event)
- expect(event.get("column1")).to eq("big")
- expect(event.get("column2")).to eq("bird")
- expect(event.get("column3")).to eq("sesame street")
- end
+ describe "all defaults" do
 
- context "using the default one" do
- let(:doc) { 'big,bird,"sesame, street"' }
  let(:config) { Hash.new }
 
+ let(:doc) { "big,bird,sesame street" }
+
  it "extract all the values" do
  plugin.filter(event)
  expect(event.get("column1")).to eq("big")
  expect(event.get("column2")).to eq("bird")
- expect(event.get("column3")).to eq("sesame, street")
+ expect(event.get("column3")).to eq("sesame street")
+ end
+
+ it "should not mutate the source field" do
+ plugin.filter(event)
+ expect(event.get("message")).to be_kind_of(String)
  end
  end
 
- context "using a null" do
- let(:doc) { 'big,bird,"sesame" street' }
+ describe "empty message" do
+ let(:doc) { "" }
+
  let(:config) do
- { "quote_char" => "\x00" }
+ { "skip_empty_rows" => true }
  end
 
- it "extract all the values" do
+ it "skips empty rows" do
  plugin.filter(event)
- expect(event.get("column1")).to eq("big")
- expect(event.get("column2")).to eq("bird")
- expect(event.get("column3")).to eq('"sesame" street')
+ expect(event.get("tags")).to include("_csvskippedemptyfield")
+ expect(event).not_to be_cancelled
  end
  end
-
- context "using a null as read from config" do
- let(:doc) { 'big,bird,"sesame" street' }
+
+ describe "custom separator" do
+ let(:doc) { "big,bird;sesame street" }
+
  let(:config) do
- { "quote_char" => "\\x00" }
+ { "separator" => ";" }
  end
-
  it "extract all the values" do
  plugin.filter(event)
- expect(event.get("column1")).to eq("big")
- expect(event.get("column2")).to eq("bird")
- expect(event.get("column3")).to eq('"sesame" street')
+ expect(event.get("column1")).to eq("big,bird")
+ expect(event.get("column2")).to eq("sesame street")
  end
  end
- end
-
- describe "given column names" do
- let(:doc) { "big,bird,sesame street" }
- let(:config) do
- { "columns" => ["first", "last", "address" ] }
- end
 
- it "extract all the values" do
- plugin.filter(event)
- expect(event.get("first")).to eq("big")
- expect(event.get("last")).to eq("bird")
- expect(event.get("address")).to eq("sesame street")
- end
+ describe "quote char" do
+ let(:doc) { "big,bird,'sesame street'" }
 
- context "parse csv without autogeneration of names" do
-
- let(:doc) { "val1,val2,val3" }
  let(:config) do
- { "autogenerate_column_names" => false,
- "columns" => ["custom1", "custom2"] }
+ { "quote_char" => "'"}
  end
 
  it "extract all the values" do
  plugin.filter(event)
- expect(event.get("custom1")).to eq("val1")
- expect(event.get("custom2")).to eq("val2")
- expect(event.get("column3")).to be_falsey
+ expect(event.get("column1")).to eq("big")
+ expect(event.get("column2")).to eq("bird")
+ expect(event.get("column3")).to eq("sesame street")
  end
- end
 
- context "parse csv skipping empty columns" do
+ context "using the default one" do
+ let(:doc) { 'big,bird,"sesame, street"' }
+ let(:config) { Hash.new }
 
- let(:doc) { "val1,,val3" }
+ it "extract all the values" do
+ plugin.filter(event)
+ expect(event.get("column1")).to eq("big")
+ expect(event.get("column2")).to eq("bird")
+ expect(event.get("column3")).to eq("sesame, street")
+ end
+ end
 
- let(:config) do
- { "skip_empty_columns" => true,
- "source" => "datafield",
- "columns" => ["custom1", "custom2", "custom3"] }
+ context "using a null" do
+ let(:doc) { 'big,bird,"sesame" street' }
+ let(:config) do
+ { "quote_char" => "\x00" }
+ end
+
+ it "extract all the values" do
+ plugin.filter(event)
+ expect(event.get("column1")).to eq("big")
+ expect(event.get("column2")).to eq("bird")
+ expect(event.get("column3")).to eq('"sesame" street')
+ end
  end
 
- let(:event) { LogStash::Event.new("datafield" => doc) }
+ context "using a null as read from config" do
+ let(:doc) { 'big,bird,"sesame" street' }
+ let(:config) do
+ { "quote_char" => "\\x00" }
+ end
 
- it "extract all the values" do
- plugin.filter(event)
- expect(event.get("custom1")).to eq("val1")
- expect(event.get("custom2")).to be_falsey
- expect(event.get("custom3")).to eq("val3")
129
+ it "extract all the values" do
130
+ plugin.filter(event)
131
+ expect(event.get("column1")).to eq("big")
132
+ expect(event.get("column2")).to eq("bird")
133
+ expect(event.get("column3")).to eq('"sesame" street')
134
+ end
167
135
  end
168
136
  end
169
137
 
170
- context "parse csv with more data than defined" do
171
- let(:doc) { "val1,val2,val3" }
138
+ describe "given column names" do
139
+ let(:doc) { "big,bird,sesame street" }
172
140
  let(:config) do
173
- { "columns" => ["custom1", "custom2"] }
141
+ { "columns" => ["first", "last", "address" ] }
174
142
  end
175
143
 
176
144
  it "extract all the values" do
177
145
  plugin.filter(event)
178
- expect(event.get("custom1")).to eq("val1")
179
- expect(event.get("custom2")).to eq("val2")
180
- expect(event.get("column3")).to eq("val3")
146
+ expect(event.get("first")).to eq("big")
147
+ expect(event.get("last")).to eq("bird")
148
+ expect(event.get("address")).to eq("sesame street")
181
149
  end
182
- end
183
150
 
184
- context "parse csv from a given source" do
185
- let(:doc) { "val1,val2,val3" }
186
- let(:config) do
187
- { "source" => "datafield",
188
- "columns" => ["custom1", "custom2", "custom3"] }
151
+ context "parse csv without autogeneration of names" do
152
+
153
+ let(:doc) { "val1,val2,val3" }
154
+ let(:config) do
155
+ { "autogenerate_column_names" => false,
156
+ "columns" => ["custom1", "custom2"] }
157
+ end
158
+
159
+ it "extract all the values" do
160
+ plugin.filter(event)
161
+ expect(event.get("custom1")).to eq("val1")
162
+ expect(event.get("custom2")).to eq("val2")
163
+ expect(event.get("column3")).to be_falsey
164
+ end
189
165
  end
190
- let(:event) { LogStash::Event.new("datafield" => doc) }
191
166
 
192
- it "extract all the values" do
193
- plugin.filter(event)
194
- expect(event.get("custom1")).to eq("val1")
195
- expect(event.get("custom2")).to eq("val2")
196
- expect(event.get("custom3")).to eq("val3")
167
+
168
+ context "parse csv and skip the header" do
169
+
170
+ let(:doc) { "first_column,second_column,third_column" }
171
+ let(:config) do
172
+ { "skip_header" => true,
173
+ "columns" => ["first_column", "second_column", "third_column"] }
174
+ end
175
+
176
+ it "expects the event to be cancelled" do
177
+ plugin.filter(event)
178
+ expect(event).to be_cancelled
179
+ end
197
180
  end
198
- end
199
181
 
200
- context "that use [@metadata]" do
201
- let(:metadata_field) { "[@metadata][one]" }
202
- let(:config) do
203
- {
204
- "columns" => [ metadata_field, "foo" ]
205
- }
182
+ context "parse csv skipping empty columns" do
183
+
184
+ let(:doc) { "val1,,val3" }
185
+
186
+ let(:config) do
187
+ { "skip_empty_columns" => true,
188
+ "source" => "datafield",
189
+ "columns" => ["custom1", "custom2", "custom3"] }
190
+ end
191
+
192
+ let(:event) { LogStash::Event.new("datafield" => doc) }
193
+
194
+ it "extract all the values" do
195
+ plugin.filter(event)
196
+ expect(event.get("custom1")).to eq("val1")
197
+ expect(event.get("custom2")).to be_falsey
198
+ expect(event.get("custom3")).to eq("val3")
199
+ end
206
200
  end
207
201
 
208
- let(:event) { LogStash::Event.new("message" => "hello,world") }
202
+ context "parse csv with more data than defined" do
203
+ let(:doc) { "val1,val2,val3" }
204
+ let(:config) do
205
+ { "columns" => ["custom1", "custom2"] }
206
+ end
209
207
 
210
- before do
211
- plugin.filter(event)
208
+ it "extract all the values" do
209
+ plugin.filter(event)
210
+ expect(event.get("custom1")).to eq("val1")
211
+ expect(event.get("custom2")).to eq("val2")
212
+ expect(event.get("column3")).to eq("val3")
213
+ end
212
214
  end
213
215
 
214
- it "should work correctly" do
215
- expect(event.get(metadata_field)).to eq("hello")
216
+ context "parse csv from a given source" do
217
+ let(:doc) { "val1,val2,val3" }
218
+ let(:config) do
219
+ { "source" => "datafield",
220
+ "columns" => ["custom1", "custom2", "custom3"] }
221
+ end
222
+ let(:event) { LogStash::Event.new("datafield" => doc) }
223
+
224
+ it "extract all the values" do
225
+ plugin.filter(event)
226
+ expect(event.get("custom1")).to eq("val1")
227
+ expect(event.get("custom2")).to eq("val2")
228
+ expect(event.get("custom3")).to eq("val3")
229
+ end
216
230
  end
217
- end
218
- end
219
231
 
220
- describe "givin target" do
221
- let(:config) do
222
- { "target" => "data" }
223
- end
224
- let(:doc) { "big,bird,sesame street" }
225
- let(:event) { LogStash::Event.new("message" => doc) }
226
-
227
- it "extract all the values" do
228
- plugin.filter(event)
229
- expect(event.get("data")["column1"]).to eq("big")
230
- expect(event.get("data")["column2"]).to eq("bird")
231
- expect(event.get("data")["column3"]).to eq("sesame street")
232
+ context "that use [@metadata]" do
233
+ let(:metadata_field) { "[@metadata][one]" }
234
+ let(:config) do
235
+ {
236
+ "columns" => [ metadata_field, "foo" ]
237
+ }
238
+ end
239
+
240
+ let(:event) { LogStash::Event.new("message" => "hello,world") }
241
+
242
+ before do
243
+ plugin.filter(event)
244
+ end
245
+
246
+ it "should work correctly" do
247
+ expect(event.get(metadata_field)).to eq("hello")
248
+ end
249
+ end
232
250
  end
233
251
 
234
- context "when having also source" do
252
+ describe "givin target" do
235
253
  let(:config) do
236
- { "source" => "datain",
237
- "target" => "data" }
254
+ { "target" => "data" }
238
255
  end
239
- let(:event) { LogStash::Event.new("datain" => doc) }
240
256
  let(:doc) { "big,bird,sesame street" }
257
+ let(:event) { LogStash::Event.new("message" => doc) }
241
258
 
242
259
  it "extract all the values" do
243
260
  plugin.filter(event)
@@ -245,135 +262,152 @@ describe LogStash::Filters::CSV do
245
262
  expect(event.get("data")["column2"]).to eq("bird")
246
263
  expect(event.get("data")["column3"]).to eq("sesame street")
247
264
  end
248
- end
249
-
250
- context "which uses [nested][fieldref] syntax" do
251
- let(:target) { "[foo][bar]" }
252
- let(:config) do
253
- {
254
- "target" => target
255
- }
256
- end
257
265
 
258
- let(:event) { LogStash::Event.new("message" => "hello,world") }
259
-
260
- before do
261
- plugin.filter(event)
262
- end
263
-
264
- it "should set fields correctly in the target" do
265
- expect(event.get("#{target}[column1]")).to eq("hello")
266
- expect(event.get("#{target}[column2]")).to eq("world")
266
+ context "when having also source" do
267
+ let(:config) do
268
+ { "source" => "datain",
269
+ "target" => "data" }
270
+ end
271
+ let(:event) { LogStash::Event.new("datain" => doc) }
272
+ let(:doc) { "big,bird,sesame street" }
273
+
274
+ it "extract all the values" do
275
+ plugin.filter(event)
276
+ expect(event.get("data")["column1"]).to eq("big")
277
+ expect(event.get("data")["column2"]).to eq("bird")
278
+ expect(event.get("data")["column3"]).to eq("sesame street")
279
+ end
267
280
  end
268
281
 
269
- context "with nested fieldrefs as columns" do
282
+ context "which uses [nested][fieldref] syntax" do
283
+ let(:target) { "[foo][bar]" }
270
284
  let(:config) do
271
285
  {
272
- "target" => target,
273
- "columns" => [ "[test][one]", "[test][two]" ]
286
+ "target" => target
274
287
  }
275
288
  end
276
289
 
290
+ let(:event) { LogStash::Event.new("message" => "hello,world") }
291
+
292
+ before do
293
+ plugin.filter(event)
294
+ end
295
+
277
296
  it "should set fields correctly in the target" do
278
- expect(event.get("#{target}[test][one]")).to eq("hello")
279
- expect(event.get("#{target}[test][two]")).to eq("world")
297
+ expect(event.get("#{target}[column1]")).to eq("hello")
298
+ expect(event.get("#{target}[column2]")).to eq("world")
299
+ end
300
+
301
+ context "with nested fieldrefs as columns" do
302
+ let(:config) do
303
+ {
304
+ "target" => target,
305
+ "columns" => [ "[test][one]", "[test][two]" ]
306
+ }
307
+ end
308
+
309
+ it "should set fields correctly in the target" do
310
+ expect(event.get("#{target}[test][one]")).to eq("hello")
311
+ expect(event.get("#{target}[test][two]")).to eq("world")
312
+ end
280
313
  end
281
- end
282
314
 
315
+ end
283
316
  end
284
- end
285
317
 
286
- describe "using field convertion" do
318
+ describe "using field convertion" do
287
319
 
288
- let(:config) do
289
- {
320
+ let(:config) do
321
+ {
290
322
  "convert" => {
291
- "column1" => "integer",
292
- "column3" => "boolean",
293
- "column4" => "float",
294
- "column5" => "date",
295
- "column6" => "date_time",
296
- "column7" => "date",
297
- "column8" => "date_time",
323
+ "column1" => "integer",
324
+ "column3" => "boolean",
325
+ "column4" => "float",
326
+ "column5" => "date",
327
+ "column6" => "date_time",
328
+ "column7" => "date",
329
+ "column8" => "date_time",
298
330
  }
299
- }
300
- end
301
- # 2017-06-01,2001-02-03T04:05:06+07:00
302
- let(:doc) { "1234,bird,false,3.14159265359,2017-06-01,2001-02-03 04:05:06,invalid_date,invalid_date_time" }
303
- let(:event) { LogStash::Event.new("message" => doc) }
331
+ }
332
+ end
333
+ # 2017-06-01,2001-02-03T04:05:06+07:00
334
+ let(:doc) { "1234,bird,false,3.14159265359,2017-06-01,2001-02-03 04:05:06,invalid_date,invalid_date_time" }
335
+ let(:event) { LogStash::Event.new("message" => doc) }
304
336
 
305
- it "converts to integer" do
306
- plugin.filter(event)
307
- expect(event.get("column1")).to eq(1234)
308
- end
337
+ it "converts to integer" do
338
+ plugin.filter(event)
339
+ expect(event.get("column1")).to eq(1234)
340
+ end
309
341
 
310
- it "does not convert without converter" do
311
- plugin.filter(event)
312
- expect(event.get("column2")).to eq("bird")
313
- end
342
+ it "does not convert without converter" do
343
+ plugin.filter(event)
344
+ expect(event.get("column2")).to eq("bird")
345
+ end
314
346
 
315
- it "converts to boolean" do
316
- plugin.filter(event)
317
- expect(event.get("column3")).to eq(false)
318
- end
347
+ it "converts to boolean" do
348
+ plugin.filter(event)
349
+ expect(event.get("column3")).to eq(false)
350
+ end
319
351
 
320
- it "converts to float" do
321
- plugin.filter(event)
322
- expect(event.get("column4")).to eq(3.14159265359)
323
- end
352
+ it "converts to float" do
353
+ plugin.filter(event)
354
+ expect(event.get("column4")).to eq(3.14159265359)
355
+ end
324
356
 
325
- it "converts to date" do
326
- plugin.filter(event)
327
- expect(event.get("column5")).to be_a(LogStash::Timestamp)
328
- expect(event.get("column5").to_s).to eq(LogStash::Timestamp.new(Date.parse("2017-06-01").to_time).to_s)
329
- end
357
+ it "converts to date" do
358
+ plugin.filter(event)
359
+ expect(event.get("column5")).to be_a(LogStash::Timestamp)
360
+ expect(event.get("column5").to_s).to eq(LogStash::Timestamp.new(Date.parse("2017-06-01").to_time).to_s)
361
+ end
330
362
 
331
- it "converts to date_time" do
332
- plugin.filter(event)
333
- expect(event.get("column6")).to be_a(LogStash::Timestamp)
334
- expect(event.get("column6").to_s).to eq(LogStash::Timestamp.new(DateTime.parse("2001-02-03 04:05:06").to_time).to_s)
335
- end
363
+ it "converts to date_time" do
364
+ plugin.filter(event)
365
+ expect(event.get("column6")).to be_a(LogStash::Timestamp)
366
+ expect(event.get("column6").to_s).to eq(LogStash::Timestamp.new(DateTime.parse("2001-02-03 04:05:06").to_time).to_s)
367
+ end
336
368
 
337
- it "tries to converts to date but return original" do
338
- plugin.filter(event)
339
- expect(event.get("column7")).to eq("invalid_date")
340
- end
369
+ it "tries to converts to date but return original" do
370
+ plugin.filter(event)
371
+ expect(event.get("column7")).to eq("invalid_date")
372
+ end
341
373
 
342
- it "tries to converts to date_time but return original" do
343
- plugin.filter(event)
344
- expect(event.get("column8")).to eq("invalid_date_time")
345
- end
374
+ it "tries to converts to date_time but return original" do
375
+ plugin.filter(event)
376
+ expect(event.get("column8")).to eq("invalid_date_time")
377
+ end
346
378
 
347
- context "when using column names" do
379
+ context "when using column names" do
348
380
 
381
+ let(:config) do
382
+ { "convert" => { "custom1" => "integer", "custom3" => "boolean" },
383
+ "columns" => ["custom1", "custom2", "custom3"] }
384
+ end
385
+
386
+ it "get converted values to the expected type" do
387
+ plugin.filter(event)
388
+ expect(event.get("custom1")).to eq(1234)
389
+ expect(event.get("custom2")).to eq("bird")
390
+ expect(event.get("custom3")).to eq(false)
391
+ end
392
+ end
393
+ end
394
+
395
+ describe "given autodetect option" do
396
+ let(:header) { LogStash::Event.new("message" => "first,last,address") }
397
+ let(:doc) { "big,bird,sesame street" }
349
398
  let(:config) do
350
- { "convert" => { "custom1" => "integer", "custom3" => "boolean" },
351
- "columns" => ["custom1", "custom2", "custom3"] }
399
+ { "autodetect_column_names" => true }
352
400
  end
353
401
 
354
- it "get converted values to the expected type" do
402
+ it "extract all the values with the autodetected header" do
403
+ plugin.filter(header)
355
404
  plugin.filter(event)
356
- expect(event.get("custom1")).to eq(1234)
357
- expect(event.get("custom2")).to eq("bird")
358
- expect(event.get("custom3")).to eq(false)
405
+ expect(event.get("first")).to eq("big")
406
+ expect(event.get("last")).to eq("bird")
407
+ expect(event.get("address")).to eq("sesame street")
359
408
  end
360
409
  end
361
410
  end
362
411
 
363
- describe "given autodetect option" do
364
- let(:header) { LogStash::Event.new("message" => "first,last,address") }
365
- let(:doc) { "big,bird,sesame street" }
366
- let(:config) do
367
- { "autodetect_column_names" => true }
368
- end
369
-
370
- it "extract all the values with the autodetected header" do
371
- plugin.filter(header)
372
- plugin.filter(event)
373
- expect(event.get("first")).to eq("big")
374
- expect(event.get("last")).to eq("bird")
375
- expect(event.get("address")).to eq("sesame street")
376
- end
377
- end
378
412
  end
379
413
  end
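The specs above exercise the options this release adds or reorganizes (`skip_empty_rows`, `separator`, `quote_char`, `skip_header`, `columns`, `target`, `autodetect_column_names`). As a non-authoritative sketch (field values and layout are illustrative, not taken from the gem's docs), a pipeline combining several of them might look like:

```
filter {
  csv {
    source          => "message"
    separator       => ";"                 # as in the "custom separator" spec
    quote_char      => "'"                 # as in the "quote char" spec
    skip_empty_rows => true                # tags empty rows with _csvskippedemptyfield (3.0.8+)
    columns         => ["first", "last", "address"]
    target          => "data"              # nest parsed fields under [data]
  }
}
```

Per the 3.0.9 changelog entry, `skip_header` and `autodetect_column_names` behave correctly only when the number of pipeline workers is set to `1`.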
metadata CHANGED
@@ -1,14 +1,14 @@
  --- !ruby/object:Gem::Specification
  name: logstash-filter-csv
  version: !ruby/object:Gem::Version
- version: 3.0.6
+ version: 3.1.1
  platform: ruby
  authors:
  - Elastic
  autorequire:
  bindir: bin
  cert_chain: []
- date: 2017-11-03 00:00:00.000000000 Z
+ date: 2021-06-08 00:00:00.000000000 Z
  dependencies:
  - !ruby/object:Gem::Dependency
  requirement: !ruby/object:Gem::Requirement
@@ -30,6 +30,34 @@ dependencies:
  - - "<="
  - !ruby/object:Gem::Version
  version: '2.99'
+ - !ruby/object:Gem::Dependency
+ requirement: !ruby/object:Gem::Requirement
+ requirements:
+ - - "~>"
+ - !ruby/object:Gem::Version
+ version: '1.3'
+ name: logstash-mixin-ecs_compatibility_support
+ prerelease: false
+ type: :runtime
+ version_requirements: !ruby/object:Gem::Requirement
+ requirements:
+ - - "~>"
+ - !ruby/object:Gem::Version
+ version: '1.3'
+ - !ruby/object:Gem::Dependency
+ requirement: !ruby/object:Gem::Requirement
+ requirements:
+ - - "~>"
+ - !ruby/object:Gem::Version
+ version: '1.0'
+ name: logstash-mixin-validator_support
+ prerelease: false
+ type: :runtime
+ version_requirements: !ruby/object:Gem::Requirement
+ requirements:
+ - - "~>"
+ - !ruby/object:Gem::Version
+ version: '1.0'
  - !ruby/object:Gem::Dependency
  requirement: !ruby/object:Gem::Requirement
  requirements:
@@ -44,7 +72,9 @@ dependencies:
  - - ">="
  - !ruby/object:Gem::Version
  version: '0'
- description: This gem is a Logstash plugin required to be installed on top of the Logstash core pipeline using $LS_HOME/bin/logstash-plugin install gemname. This gem is not a stand-alone program
+ description: This gem is a Logstash plugin required to be installed on top of the
+ Logstash core pipeline using $LS_HOME/bin/logstash-plugin install gemname. This
+ gem is not a stand-alone program
  email: info@elastic.co
  executables: []
  extensions: []
@@ -56,6 +86,7 @@ files:
  - LICENSE
  - NOTICE.TXT
  - README.md
+ - VERSION
  - docs/index.asciidoc
  - lib/logstash/filters/csv.rb
  - logstash-filter-csv.gemspec
@@ -82,9 +113,9 @@ required_rubygems_version: !ruby/object:Gem::Requirement
  version: '0'
  requirements: []
  rubyforge_project:
- rubygems_version: 2.4.8
+ rubygems_version: 2.6.13
  signing_key:
  specification_version: 4
- summary: The CSV filter takes an event field containing CSV data, parses it, and stores it as individual fields (can optionally specify the names).
+ summary: Parses comma-separated value data into individual fields
  test_files:
  - spec/filters/csv_spec.rb