logstash-filter-csv 3.0.5 → 3.1.0

checksums.yaml CHANGED
@@ -1,7 +1,7 @@
  ---
- SHA1:
- metadata.gz: 55248ee94900a2352a2900c98c891c5de8ebfe3b
- data.tar.gz: fe7817db9a950532cf8b32650413d6b8c74cbb71
+ SHA256:
+ metadata.gz: 61380dfaee5f7c768ba33dbeb372644941053ccf3694c441b5f6bc5b7758af46
+ data.tar.gz: c9a4bfd2b2a2c0f4d4ccac90ba8d01722a5ed7833294e8cb749f1b4c2685d2ba
  SHA512:
- metadata.gz: 48756013179fb793a380dd17f4b078c4acb9332ecb2da455279214b8653bdfd27b01cb251252d6b12b062bc726426e214ae3bc0a18bac1a7bd5fd5879212893f
- data.tar.gz: c588100c860435e658c98ff3be641a3794374593558cf8715096687524e9ef9712416f1de32ce0444f75420dea288d3fee094401c53944b3098e103ddabd4a36
+ metadata.gz: 472e1b1028701925b45aaa1990349cd6f277326e1151ba8eaacc7d48125d67528f8873c08bab7698237cdb1e35eaf39506de51623758327f91a29af36f4059f1
+ data.tar.gz: 0cc77c798c6fa3319f63aa4c9d56314686fa5bab7f6cbb0d6f221532e2007aacc1545235bfbe37fb3822125b62201bd78ef7fd3e6d48732e18f7d1990674db7b
data/CHANGELOG.md CHANGED
@@ -1,3 +1,25 @@
+ ## 3.1.0
+ - Add ECS support [#85](https://github.com/logstash-plugins/logstash-filter-csv/pull/85)
+
+ ## 3.0.11
+ - [DOC] Fixed formatting to improve readability [#84](https://github.com/logstash-plugins/logstash-filter-csv/pull/84)
+
+ ## 3.0.10
+ - [DOC] Fix asciidoc formatting for example [#73](https://github.com/logstash-plugins/logstash-filter-csv/pull/73)
+
+ ## 3.0.9
+ - [DOC] Document that the `autodetect_column_names` and `skip_header` options work only when the number of Logstash
+ pipeline workers is set to `1`.
+
+ ## 3.0.8
+ - feature: Added support for tagging empty rows which users can reference to conditionally drop events
+
+ ## 3.0.7
+ - Update gemspec summary
+
+ ## 3.0.6
+ - Fix a bug where `[nested][field]` references were incorrectly used. (#24, #52)
+
  ## 3.0.5
  - Fix some documentation issues
 
data/CONTRIBUTORS CHANGED
@@ -12,6 +12,7 @@ Contributors:
  * Pier-Hugues Pellerin (ph)
  * Richard Pijnenburg (electrical)
  * Suyog Rao (suyograo)
+ * Abdul Haseeb Hussain (AbdulHaseebHussain)
 
  Note: If you've sent us patches, bug reports, or otherwise contributed to
  Logstash, and you aren't on the list above and want to be, please let us know
data/LICENSE CHANGED
@@ -1,13 +1,202 @@
- Copyright (c) 2012–2016 Elasticsearch <http://www.elastic.co>
 
- Licensed under the Apache License, Version 2.0 (the "License");
- you may not use this file except in compliance with the License.
- You may obtain a copy of the License at
+ Apache License
+ Version 2.0, January 2004
+ http://www.apache.org/licenses/
 
- http://www.apache.org/licenses/LICENSE-2.0
+ TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION
 
- Unless required by applicable law or agreed to in writing, software
- distributed under the License is distributed on an "AS IS" BASIS,
- WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
- See the License for the specific language governing permissions and
- limitations under the License.
+ 1. Definitions.
+
+ "License" shall mean the terms and conditions for use, reproduction,
+ and distribution as defined by Sections 1 through 9 of this document.
+
+ "Licensor" shall mean the copyright owner or entity authorized by
+ the copyright owner that is granting the License.
+
+ "Legal Entity" shall mean the union of the acting entity and all
+ other entities that control, are controlled by, or are under common
+ control with that entity. For the purposes of this definition,
+ "control" means (i) the power, direct or indirect, to cause the
+ direction or management of such entity, whether by contract or
+ otherwise, or (ii) ownership of fifty percent (50%) or more of the
+ outstanding shares, or (iii) beneficial ownership of such entity.
+
+ "You" (or "Your") shall mean an individual or Legal Entity
+ exercising permissions granted by this License.
+
+ "Source" form shall mean the preferred form for making modifications,
+ including but not limited to software source code, documentation
+ source, and configuration files.
+
+ "Object" form shall mean any form resulting from mechanical
+ transformation or translation of a Source form, including but
+ not limited to compiled object code, generated documentation,
+ and conversions to other media types.
+
+ "Work" shall mean the work of authorship, whether in Source or
+ Object form, made available under the License, as indicated by a
+ copyright notice that is included in or attached to the work
+ (an example is provided in the Appendix below).
+
+ "Derivative Works" shall mean any work, whether in Source or Object
+ form, that is based on (or derived from) the Work and for which the
+ editorial revisions, annotations, elaborations, or other modifications
+ represent, as a whole, an original work of authorship. For the purposes
+ of this License, Derivative Works shall not include works that remain
+ separable from, or merely link (or bind by name) to the interfaces of,
+ the Work and Derivative Works thereof.
+
+ "Contribution" shall mean any work of authorship, including
+ the original version of the Work and any modifications or additions
+ to that Work or Derivative Works thereof, that is intentionally
+ submitted to Licensor for inclusion in the Work by the copyright owner
+ or by an individual or Legal Entity authorized to submit on behalf of
+ the copyright owner. For the purposes of this definition, "submitted"
+ means any form of electronic, verbal, or written communication sent
+ to the Licensor or its representatives, including but not limited to
+ communication on electronic mailing lists, source code control systems,
+ and issue tracking systems that are managed by, or on behalf of, the
+ Licensor for the purpose of discussing and improving the Work, but
+ excluding communication that is conspicuously marked or otherwise
+ designated in writing by the copyright owner as "Not a Contribution."
+
+ "Contributor" shall mean Licensor and any individual or Legal Entity
+ on behalf of whom a Contribution has been received by Licensor and
+ subsequently incorporated within the Work.
+
+ 2. Grant of Copyright License. Subject to the terms and conditions of
+ this License, each Contributor hereby grants to You a perpetual,
+ worldwide, non-exclusive, no-charge, royalty-free, irrevocable
+ copyright license to reproduce, prepare Derivative Works of,
+ publicly display, publicly perform, sublicense, and distribute the
+ Work and such Derivative Works in Source or Object form.
+
+ 3. Grant of Patent License. Subject to the terms and conditions of
+ this License, each Contributor hereby grants to You a perpetual,
+ worldwide, non-exclusive, no-charge, royalty-free, irrevocable
+ (except as stated in this section) patent license to make, have made,
+ use, offer to sell, sell, import, and otherwise transfer the Work,
+ where such license applies only to those patent claims licensable
+ by such Contributor that are necessarily infringed by their
+ Contribution(s) alone or by combination of their Contribution(s)
+ with the Work to which such Contribution(s) was submitted. If You
+ institute patent litigation against any entity (including a
+ cross-claim or counterclaim in a lawsuit) alleging that the Work
+ or a Contribution incorporated within the Work constitutes direct
+ or contributory patent infringement, then any patent licenses
+ granted to You under this License for that Work shall terminate
+ as of the date such litigation is filed.
+
+ 4. Redistribution. You may reproduce and distribute copies of the
+ Work or Derivative Works thereof in any medium, with or without
+ modifications, and in Source or Object form, provided that You
+ meet the following conditions:
+
+ (a) You must give any other recipients of the Work or
+ Derivative Works a copy of this License; and
+
+ (b) You must cause any modified files to carry prominent notices
+ stating that You changed the files; and
+
+ (c) You must retain, in the Source form of any Derivative Works
+ that You distribute, all copyright, patent, trademark, and
+ attribution notices from the Source form of the Work,
+ excluding those notices that do not pertain to any part of
+ the Derivative Works; and
+
+ (d) If the Work includes a "NOTICE" text file as part of its
+ distribution, then any Derivative Works that You distribute must
+ include a readable copy of the attribution notices contained
+ within such NOTICE file, excluding those notices that do not
+ pertain to any part of the Derivative Works, in at least one
+ of the following places: within a NOTICE text file distributed
+ as part of the Derivative Works; within the Source form or
+ documentation, if provided along with the Derivative Works; or,
+ within a display generated by the Derivative Works, if and
+ wherever such third-party notices normally appear. The contents
+ of the NOTICE file are for informational purposes only and
+ do not modify the License. You may add Your own attribution
+ notices within Derivative Works that You distribute, alongside
+ or as an addendum to the NOTICE text from the Work, provided
+ that such additional attribution notices cannot be construed
+ as modifying the License.
+
+ You may add Your own copyright statement to Your modifications and
+ may provide additional or different license terms and conditions
+ for use, reproduction, or distribution of Your modifications, or
+ for any such Derivative Works as a whole, provided Your use,
+ reproduction, and distribution of the Work otherwise complies with
+ the conditions stated in this License.
+
+ 5. Submission of Contributions. Unless You explicitly state otherwise,
+ any Contribution intentionally submitted for inclusion in the Work
+ by You to the Licensor shall be under the terms and conditions of
+ this License, without any additional terms or conditions.
+ Notwithstanding the above, nothing herein shall supersede or modify
+ the terms of any separate license agreement you may have executed
+ with Licensor regarding such Contributions.
+
+ 6. Trademarks. This License does not grant permission to use the trade
+ names, trademarks, service marks, or product names of the Licensor,
+ except as required for reasonable and customary use in describing the
+ origin of the Work and reproducing the content of the NOTICE file.
+
+ 7. Disclaimer of Warranty. Unless required by applicable law or
+ agreed to in writing, Licensor provides the Work (and each
+ Contributor provides its Contributions) on an "AS IS" BASIS,
+ WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
+ implied, including, without limitation, any warranties or conditions
+ of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A
+ PARTICULAR PURPOSE. You are solely responsible for determining the
+ appropriateness of using or redistributing the Work and assume any
+ risks associated with Your exercise of permissions under this License.
+
+ 8. Limitation of Liability. In no event and under no legal theory,
+ whether in tort (including negligence), contract, or otherwise,
+ unless required by applicable law (such as deliberate and grossly
+ negligent acts) or agreed to in writing, shall any Contributor be
+ liable to You for damages, including any direct, indirect, special,
+ incidental, or consequential damages of any character arising as a
+ result of this License or out of the use or inability to use the
+ Work (including but not limited to damages for loss of goodwill,
+ work stoppage, computer failure or malfunction, or any and all
+ other commercial damages or losses), even if such Contributor
+ has been advised of the possibility of such damages.
+
+ 9. Accepting Warranty or Additional Liability. While redistributing
+ the Work or Derivative Works thereof, You may choose to offer,
+ and charge a fee for, acceptance of support, warranty, indemnity,
+ or other liability obligations and/or rights consistent with this
+ License. However, in accepting such obligations, You may act only
+ on Your own behalf and on Your sole responsibility, not on behalf
+ of any other Contributor, and only if You agree to indemnify,
+ defend, and hold each Contributor harmless for any liability
+ incurred by, or claims asserted against, such Contributor by reason
+ of your accepting any such warranty or additional liability.
+
+ END OF TERMS AND CONDITIONS
+
+ APPENDIX: How to apply the Apache License to your work.
+
+ To apply the Apache License to your work, attach the following
+ boilerplate notice, with the fields enclosed by brackets "[]"
+ replaced with your own identifying information. (Don't include
+ the brackets!) The text should be enclosed in the appropriate
+ comment syntax for the file format. We also recommend that a
+ file or class name and description of purpose be included on the
+ same "printed page" as the copyright notice for easier
+ identification within third-party archives.
+
+ Copyright 2020 Elastic and contributors
+
+ Licensed under the Apache License, Version 2.0 (the "License");
+ you may not use this file except in compliance with the License.
+ You may obtain a copy of the License at
+
+ http://www.apache.org/licenses/LICENSE-2.0
+
+ Unless required by applicable law or agreed to in writing, software
+ distributed under the License is distributed on an "AS IS" BASIS,
+ WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ See the License for the specific language governing permissions and
+ limitations under the License.
data/README.md CHANGED
@@ -1,6 +1,6 @@
  # Logstash Plugin
 
- [![Travis Build Status](https://travis-ci.org/logstash-plugins/logstash-filter-csv.svg)](https://travis-ci.org/logstash-plugins/logstash-filter-csv)
+ [![Travis Build Status](https://travis-ci.com/logstash-plugins/logstash-filter-csv.svg)](https://travis-ci.com/logstash-plugins/logstash-filter-csv)
 
  This is a plugin for [Logstash](https://github.com/elastic/logstash).
 
data/VERSION ADDED
@@ -0,0 +1 @@
+ 3.1.0
data/docs/index.asciidoc CHANGED
@@ -21,8 +21,14 @@ include::{include_path}/plugin_header.asciidoc[]
  ==== Description
 
  The CSV filter takes an event field containing CSV data, parses it,
- and stores it as individual fields (can optionally specify the names).
- This filter can also parse data with any separator, not just commas.
+ and stores it as individual fields with optionally-specified field names.
+ This filter can parse data with any separator, not just commas.
+
+ [id="plugins-{type}s-{plugin}-ecs_metadata"]
+ ==== Event Metadata and the Elastic Common Schema (ECS)
+ The plugin behaves the same regardless of ECS compatibility, except that it logs a warning when ECS is enabled and `target` isn't set.
+
+ TIP: Set the `target` option to avoid potential schema conflicts.
 
  [id="plugins-{type}s-{plugin}-options"]
  ==== Csv Filter Configuration Options
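As an illustration of the TIP above, a minimal pipeline fragment that sets `target` (the `message` source and `[document]` target names here are only examples, not defaults introduced by this release):

```
filter {
  csv {
    source => "message"
    target => "[document]"   # parsed columns land under [document] instead of the event root
  }
}
```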
@@ -36,9 +42,12 @@ This plugin supports the following configuration options plus the <<plugins-{typ
  | <<plugins-{type}s-{plugin}-autogenerate_column_names>> |<<boolean,boolean>>|No
  | <<plugins-{type}s-{plugin}-columns>> |<<array,array>>|No
  | <<plugins-{type}s-{plugin}-convert>> |<<hash,hash>>|No
+ | <<plugins-{type}s-{plugin}-ecs_compatibility>> | <<string,string>>|No
  | <<plugins-{type}s-{plugin}-quote_char>> |<<string,string>>|No
  | <<plugins-{type}s-{plugin}-separator>> |<<string,string>>|No
  | <<plugins-{type}s-{plugin}-skip_empty_columns>> |<<boolean,boolean>>|No
+ | <<plugins-{type}s-{plugin}-skip_empty_rows>> |<<boolean,boolean>>|No
+ | <<plugins-{type}s-{plugin}-skip_header>> |<<boolean,boolean>>|No
  | <<plugins-{type}s-{plugin}-source>> |<<string,string>>|No
  | <<plugins-{type}s-{plugin}-target>> |<<string,string>>|No
  |=======================================================================
@@ -57,6 +66,8 @@ filter plugins.
  Define whether column names should be auto-detected from the header column or not.
  Defaults to false.
 
+ Logstash pipeline workers must be set to `1` for this option to work.
+
  [id="plugins-{type}s-{plugin}-autogenerate_column_names"]
  ===== `autogenerate_column_names`
 
@@ -88,7 +99,8 @@ in the data than specified in this column list, extra columns will be auto-numbe
  Define a set of datatype conversions to be applied to columns.
  Possible conversions are integer, float, date, date_time, boolean
 
- # Example:
+ Example:
+
  [source,ruby]
  filter {
  csv {
@@ -99,6 +111,18 @@ Possible conversions are integer, float, date, date_time, boolean
  }
  }
 
+ [id="plugins-{type}s-{plugin}-ecs_compatibility"]
+ ===== `ecs_compatibility`
+
+ * Value type is <<string,string>>
+ * Supported values are:
+ ** `disabled`: does not use ECS-compatible field names
+ ** `v1`: uses the value in `target` as field name
+
+ Controls this plugin's compatibility with the
+ {ecs-ref}[Elastic Common Schema (ECS)].
+ See <<plugins-{type}s-{plugin}-ecs_metadata>> for detailed information.
+
  [id="plugins-{type}s-{plugin}-quote_char"]
  ===== `quote_char`
 
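The `convert` option documented above maps column names to type names. A rough plain-Ruby sketch of that per-column conversion (the converter lambdas and sample column names are assumptions mirroring the documented types, not the plugin's internals):

```ruby
# Hypothetical converters keyed by the documented type names;
# unknown or unmapped columns pass through unchanged via the default.
CONVERTERS = Hash.new(->(v) { v })
CONVERTERS["integer"] = ->(v) { v.to_i }
CONVERTERS["float"]   = ->(v) { v.to_f }
CONVERTERS["boolean"] = ->(v) { v.strip.downcase == "true" }

convert = { "column1" => "integer", "column2" => "boolean" }
row     = { "column1" => "1234", "column2" => "TRUE", "column3" => "plain" }

# Apply the configured conversion (if any) to each parsed column value.
converted = row.to_h { |name, value| [name, CONVERTERS[convert[name]].call(value)] }
# converted => {"column1"=>1234, "column2"=>true, "column3"=>"plain"}
```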
@@ -129,6 +153,33 @@ Optional.
  Define whether empty columns should be skipped.
  Defaults to false. If set to true, columns containing no value will not get set.
 
+ [id="plugins-{type}s-{plugin}-skip_empty_rows"]
+ ===== `skip_empty_rows`
+
+ * Value type is <<boolean,boolean>>
+ * Default value is `false`
+
+ Define whether empty rows can be skipped.
+ Defaults to false. If set to true, rows containing no value will be tagged with "_csvskippedemptyfield".
+ This tag can be referenced by users if they wish to cancel events using an 'if' conditional statement.
+
+ [id="plugins-{type}s-{plugin}-skip_header"]
+ ===== `skip_header`
+
+ * Value type is <<boolean,boolean>>
+ * Default value is `false`
+
+ Define whether the header should be skipped.
+ Defaults to false. If set to true, the header will be skipped.
+ Assumes that the header is not repeated in later rows; any row that repeats it will also be skipped.
+ If `skip_header` is set without `autodetect_column_names` being set, then `columns` should be set, which
+ will result in the skipping of any row that exactly matches the specified column values.
+ If `skip_header` and `autodetect_column_names` are both specified, then `columns` should not be specified; in this case
+ `autodetect_column_names` will fill the `columns` setting in the background from the first event seen, and any
+ subsequent rows whose values match what was autodetected will be skipped.
+
+ Logstash pipeline workers must be set to `1` for this option to work.
+
  [id="plugins-{type}s-{plugin}-source"]
  ===== `source`
 
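The header-matching rule described above can be sketched in plain Ruby (standalone illustration using the stdlib `csv`, not the plugin itself; column names are arbitrary examples):

```ruby
require "csv"

# Configured column names, as `columns` (or autodetection) would set them.
columns = ["first", "last", "address"]

# A row counts as a header (and would be skipped) only when its parsed
# values exactly match the configured column names.
header_like = ->(line) { CSV.parse_line(line) == columns }

header_like.call("first,last,address")     # => true  (row skipped)
header_like.call("big,bird,sesame street") # => false (row parsed normally)
```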
@@ -148,6 +199,5 @@ Define target field for placing the data.
  Defaults to writing to the root of the event.
 
 
-
  [id="plugins-{type}s-{plugin}-common-options"]
- include::{include_path}/{type}.asciidoc[]
+ include::{include_path}/{type}.asciidoc[]
@@ -1,6 +1,7 @@
  # encoding: utf-8
  require "logstash/filters/base"
  require "logstash/namespace"
+ require "logstash/plugin_mixins/ecs_compatibility_support"
 
  require "csv"
 
@@ -8,6 +9,8 @@ require "csv"
  # and stores it as individual fields (can optionally specify the names).
  # This filter can also parse data with any separator, not just commas.
  class LogStash::Filters::CSV < LogStash::Filters::Base
+ include LogStash::PluginMixins::ECSCompatibilitySupport(:disabled, :v1, :v8 => :v1)
+
  config_name "csv"
 
  # The CSV data in the value of the `source` field will be expanded into a
@@ -41,10 +44,19 @@ class LogStash::Filters::CSV < LogStash::Filters::Base
  # Defaults to true. If set to false, columns not having a header specified will not be parsed.
  config :autogenerate_column_names, :validate => :boolean, :default => true
 
+ # Define whether the header should be skipped or not.
+ # Defaults to false. If set to true, the header is dropped.
+ config :skip_header, :validate => :boolean, :default => false
+
  # Define whether empty columns should be skipped.
  # Defaults to false. If set to true, columns containing no value will not get set.
  config :skip_empty_columns, :validate => :boolean, :default => false
 
+ # Define whether empty rows could potentially be skipped.
+ # Defaults to false. If set to true, rows containing no value will be tagged with _csvskippedemptyfield.
+ # This tag can be referenced by users if they wish to cancel events using an 'if' conditional statement.
+ config :skip_empty_rows, :validate => :boolean, :default => false
+
  # Define a set of datatype conversions to be applied to columns.
  # Possible conversions are integer, float, date, date_time, boolean
  #
@@ -93,6 +105,15 @@ class LogStash::Filters::CSV < LogStash::Filters::Base
  CONVERTERS.default = lambda {|v| v}
  CONVERTERS.freeze
 
+ def initialize(params)
+ super
+ if ecs_compatibility != :disabled && @target.nil?
+ logger.info('ECS compatibility is enabled but no `target` option was specified, it is recommended'\
+ ' to set the option to avoid potential schema conflicts (if your data is ECS compliant or'\
+ ' non-conflicting feel free to ignore this message)')
+ end
+ end
+
  def register
  # validate conversion types to be the valid ones.
  bad_types = @convert.values.select do |type|
@@ -105,7 +126,7 @@ class LogStash::Filters::CSV < LogStash::Filters::Base
  @convert_symbols = @convert.inject({}){|result, (k, v)| result[k] = v.to_sym; result}
 
  # make sure @target is in the format [field name] if defined, i.e. surrounded by brackets
- @target = "[#{@target}]" if @target && @target !~ /^\[[^\[\]]+\]$/
+ @target = "[#{@target}]" if @target && !@target.start_with?("[")
 
  # if the zero byte character is entered in the config, set the value
  if (@quote_char == "\\x00")
@@ -120,7 +141,8 @@
 
  if (source = event.get(@source))
  begin
- values = CSV.parse_line(source, :col_sep => @separator, :quote_char => @quote_char)
+
+ values = CSV.parse_line(source, :col_sep => @separator, :quote_char => @quote_char)
 
  if (@autodetect_column_names && @columns.empty?)
  @columns = values
@@ -128,6 +150,17 @@
  return
  end
 
+ if (@skip_header && (!@columns.empty?) && (@columns == values))
+ event.cancel
+ return
+ end
+
+ if(@skip_empty_rows && values.nil?)
+ # applies tag to empty rows, users can cancel event referencing this tag in an 'if' conditional statement
+ event.tag("_csvskippedemptyfield")
+ return
+ end
+
  values.each_index do |i|
  unless (@skip_empty_columns && (values[i].nil? || values[i].empty?))
  unless ignore_field?(i)
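The empty-row branch added above relies on `CSV.parse_line` returning `nil` for an empty source string; a standalone sketch of that check (the tag name comes from the patch, the Logstash event object is abstracted away):

```ruby
require "csv"

values = CSV.parse_line("")          # an empty line parses to nil
tags   = []
# Mirror of the patched branch: tag instead of silently dropping,
# so a user conditional can decide whether to cancel the event.
tags << "_csvskippedemptyfield" if values.nil?
# tags => ["_csvskippedemptyfield"]
```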
@@ -151,10 +184,14 @@
  private
 
  # construct the correct Event field reference for given field_name, taking into account @target
- # @param field_name [String] the bare field name without brakets
+ # @param field_name [String] the field name.
  # @return [String] fully qualified Event field reference also taking into account @target prefix
  def field_ref(field_name)
- "#{@target}[#{field_name}]"
+ if field_name.start_with?("[")
+ "#{@target}#{field_name}"
+ else
+ "#{@target}[#{field_name}]"
+ end
  end
 
  def ignore_field?(index)
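The patched `field_ref` above now handles `[nested][field]` references as well as bare names; a self-contained sketch of that logic (the `[document]` target is an arbitrary example value):

```ruby
target = "[document]"

field_ref = lambda do |field_name|
  if field_name.start_with?("[")
    "#{target}#{field_name}"      # already-bracketed nested reference: keep as-is
  else
    "#{target}[#{field_name}]"    # bare name: wrap it in brackets
  end
end

field_ref.call("ip")               # => "[document][ip]"
field_ref.call("[nested][field]")  # => "[document][nested][field]"
```

Before this change, a column named `[nested][field]` would have been wrapped again, producing the malformed reference the 3.0.6 changelog entry mentions.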
@@ -1,9 +1,11 @@
+ CSV_VERSION = File.read(File.expand_path(File.join(File.dirname(__FILE__), "VERSION"))).strip unless defined?(CSV_VERSION)
+
  Gem::Specification.new do |s|
 
  s.name = 'logstash-filter-csv'
- s.version = '3.0.5'
+ s.version = CSV_VERSION
  s.licenses = ['Apache License (2.0)']
- s.summary = "The CSV filter takes an event field containing CSV data, parses it, and stores it as individual fields (can optionally specify the names)."
+ s.summary = "Parses comma-separated value data into individual fields"
  s.description = "This gem is a Logstash plugin required to be installed on top of the Logstash core pipeline using $LS_HOME/bin/logstash-plugin install gemname. This gem is not a stand-alone program"
  s.authors = ["Elastic"]
  s.email = 'info@elastic.co'
@@ -21,7 +23,7 @@ Gem::Specification.new do |s|
 
  # Gem dependencies
  s.add_runtime_dependency "logstash-core-plugin-api", ">= 1.60", "<= 2.99"
-
  s.add_development_dependency 'logstash-devutils'
+ s.add_runtime_dependency 'logstash-mixin-ecs_compatibility_support', '~>1.2'
  end
 
@@ -1,224 +1,260 @@
  # encoding: utf-8
  require "logstash/devutils/rspec/spec_helper"
  require "logstash/filters/csv"
+ require 'logstash/plugin_mixins/ecs_compatibility_support/spec_helper'
 
- describe LogStash::Filters::CSV do
+ describe LogStash::Filters::CSV, :ecs_compatibility_support, :aggregate_failures do
+ ecs_compatibility_matrix(:disabled, :v1, :v8 => :v1) do |ecs_select|
 
- subject(:plugin) { LogStash::Filters::CSV.new(config) }
- let(:config) { Hash.new }
+ subject(:plugin) { LogStash::Filters::CSV.new(config) }
+ let(:config) { Hash.new }
 
- let(:doc) { "" }
- let(:event) { LogStash::Event.new("message" => doc) }
+ let(:doc) { "" }
+ let(:event) { LogStash::Event.new("message" => doc) }
 
- describe "registration" do
+ describe "registration" do
 
- context "when using invalid data types" do
- let(:config) do
- { "convert" => { "custom1" => "integer", "custom3" => "wrong_type" },
- "columns" => ["custom1", "custom2", "custom3"] }
- end
+ context "when using invalid data types" do
+ let(:config) do
+ { "convert" => { "custom1" => "integer", "custom3" => "wrong_type" },
+ "columns" => ["custom1", "custom2", "custom3"] }
+ end
 
- it "should register" do
- input = LogStash::Plugin.lookup("filter", "csv").new(config)
- expect {input.register}.to raise_error(LogStash::ConfigurationError)
+ it "should register" do
+ input = LogStash::Plugin.lookup("filter", "csv").new(config)
+ expect {input.register}.to raise_error(LogStash::ConfigurationError)
+ end
  end
  end
- end
-
- describe "receive" do
-
- before(:each) do
- plugin.register
- end
-
- describe "all defaults" do
 
- let(:config) { Hash.new }
+ describe "receive" do
 
- let(:doc) { "big,bird,sesame street" }
-
- it "extract all the values" do
- plugin.filter(event)
- expect(event.get("column1")).to eq("big")
- expect(event.get("column2")).to eq("bird")
- expect(event.get("column3")).to eq("sesame street")
+ before(:each) do
+ allow_any_instance_of(described_class).to receive(:ecs_compatibility).and_return(ecs_compatibility)
+ plugin.register
  end
 
- it "should not mutate the source field" do
- plugin.filter(event)
- expect(event.get("message")).to be_kind_of(String)
- end
- end
+ describe "all defaults" do
 
- describe "custom separator" do
- let(:doc) { "big,bird;sesame street" }
+ let(:config) { Hash.new }
 
- let(:config) do
- { "separator" => ";" }
- end
- it "extract all the values" do
- plugin.filter(event)
- expect(event.get("column1")).to eq("big,bird")
- expect(event.get("column2")).to eq("sesame street")
- end
- end
+ let(:doc) { "big,bird,sesame street" }
 
- describe "quote char" do
- let(:doc) { "big,bird,'sesame street'" }
+ it "extract all the values" do
+ plugin.filter(event)
+ expect(event.get("column1")).to eq("big")
+ expect(event.get("column2")).to eq("bird")
+ expect(event.get("column3")).to eq("sesame street")
+ end
 
- let(:config) do
- { "quote_char" => "'"}
+ it "should not mutate the source field" do
+ plugin.filter(event)
+ expect(event.get("message")).to be_kind_of(String)
+ end
  end
 
- it "extract all the values" do
- plugin.filter(event)
- expect(event.get("column1")).to eq("big")
- expect(event.get("column2")).to eq("bird")
- expect(event.get("column3")).to eq("sesame street")
- end
+ describe "empty message" do
+ let(:doc) { "" }
 
- context "using the default one" do
- let(:doc) { 'big,bird,"sesame, street"' }
- let(:config) { Hash.new }
+ let(:config) do
+ { "skip_empty_rows" => true }
+ end
 
- it "extract all the values" do
+ it "skips empty rows" do
  plugin.filter(event)
- expect(event.get("column1")).to eq("big")
- expect(event.get("column2")).to eq("bird")
- expect(event.get("column3")).to eq("sesame, street")
+ expect(event.get("tags")).to include("_csvskippedemptyfield")
+ expect(event).not_to be_cancelled
  end
  end
 
- context "using a null" do
- let(:doc) { 'big,bird,"sesame" street' }
+ describe "custom separator" do
+ let(:doc) { "big,bird;sesame street" }
+
  let(:config) do
- { "quote_char" => "\x00" }
+ { "separator" => ";" }
  end
-
  it "extract all the values" do
  plugin.filter(event)
- expect(event.get("column1")).to eq("big")
- expect(event.get("column2")).to eq("bird")
- expect(event.get("column3")).to eq('"sesame" street')
+ expect(event.get("column1")).to eq("big,bird")
+ expect(event.get("column2")).to eq("sesame street")
  end
  end
-
- context "using a null as read from config" do
- let(:doc) { 'big,bird,"sesame" street' }
+
+ describe "quote char" do
+ let(:doc) { "big,bird,'sesame street'" }
+
  let(:config) do
- { "quote_char" => "\\x00" }
+ { "quote_char" => "'"}
  end
 
  it "extract all the values" do
  plugin.filter(event)
  expect(event.get("column1")).to eq("big")
  expect(event.get("column2")).to eq("bird")
- expect(event.get("column3")).to eq('"sesame" street')
+ expect(event.get("column3")).to eq("sesame street")
  end
- end
- end
 
- describe "given column names" do
- let(:doc) { "big,bird,sesame street" }
- let(:config) do
- { "columns" => ["first", "last", "address" ] }
- end
+ context "using the default one" do
+ let(:doc) { 'big,bird,"sesame, street"' }
+ let(:config) { Hash.new }
 
- it "extract all the values" do
- plugin.filter(event)
- expect(event.get("first")).to eq("big")
- expect(event.get("last")).to eq("bird")
- expect(event.get("address")).to eq("sesame street")
- end
+ it "extract all the values" do
+ plugin.filter(event)
+ expect(event.get("column1")).to eq("big")
+ expect(event.get("column2")).to eq("bird")
+ expect(event.get("column3")).to eq("sesame, street")
+ end
+ end
+
+ context "using a null" do
+ let(:doc) { 'big,bird,"sesame" street' }
+ let(:config) do
+ { "quote_char" => "\x00" }
+ end
+
+ it "extract all the values" do
+ plugin.filter(event)
+ expect(event.get("column1")).to eq("big")
+ expect(event.get("column2")).to eq("bird")
+ expect(event.get("column3")).to eq('"sesame" street')
+ end
+ end
 
- context "parse csv without autogeneration of names" do
+ context "using a null as read from config" do
+ let(:doc) { 'big,bird,"sesame" street' }
+ let(:config) do
+ { "quote_char" => "\\x00" }
+ end
+
+ it "extract all the values" do
+ plugin.filter(event)
+ expect(event.get("column1")).to eq("big")
+ expect(event.get("column2")).to eq("bird")
+ expect(event.get("column3")).to eq('"sesame" street')
+ end
+ end
+ end
 
- let(:doc) { "val1,val2,val3" }
+ describe "given column names" do
+ let(:doc) { "big,bird,sesame street" }
  let(:config) do
- { "autogenerate_column_names" => false,
- "columns" => ["custom1", "custom2"] }
+ { "columns" => ["first", "last", "address" ] }
  end
 
  it "extract all the values" do
  plugin.filter(event)
- expect(event.get("custom1")).to eq("val1")
- expect(event.get("custom2")).to eq("val2")
- expect(event.get("column3")).to be_falsey
+ expect(event.get("first")).to eq("big")
+ expect(event.get("last")).to eq("bird")
148
+ expect(event.get("address")).to eq("sesame street")
147
149
  end
148
- end
149
150
 
150
- context "parse csv skipping empty columns" do
151
+ context "parse csv without autogeneration of names" do
151
152
 
152
- let(:doc) { "val1,,val3" }
153
+ let(:doc) { "val1,val2,val3" }
154
+ let(:config) do
155
+ { "autogenerate_column_names" => false,
156
+ "columns" => ["custom1", "custom2"] }
157
+ end
153
158
 
154
- let(:config) do
155
- { "skip_empty_columns" => true,
156
- "source" => "datafield",
157
- "columns" => ["custom1", "custom2", "custom3"] }
159
+ it "extract all the values" do
160
+ plugin.filter(event)
161
+ expect(event.get("custom1")).to eq("val1")
162
+ expect(event.get("custom2")).to eq("val2")
163
+ expect(event.get("column3")).to be_falsey
164
+ end
158
165
  end
159
166
 
160
- let(:event) { LogStash::Event.new("datafield" => doc) }
161
167
 
162
- it "extract all the values" do
163
- plugin.filter(event)
164
- expect(event.get("custom1")).to eq("val1")
165
- expect(event.get("custom2")).to be_falsey
166
- expect(event.get("custom3")).to eq("val3")
167
- end
168
- end
168
+ context "parse csv and skip the header" do
169
169
 
170
- context "parse csv with more data than defined" do
171
- let(:doc) { "val1,val2,val3" }
172
- let(:config) do
173
- { "columns" => ["custom1", "custom2"] }
170
+ let(:doc) { "first_column,second_column,third_column" }
171
+ let(:config) do
172
+ { "skip_header" => true,
173
+ "columns" => ["first_column", "second_column", "third_column"] }
174
+ end
175
+
176
+ it "expects the event to be cancelled" do
177
+ plugin.filter(event)
178
+ expect(event).to be_cancelled
179
+ end
174
180
  end
175
181
 
176
- it "extract all the values" do
177
- plugin.filter(event)
178
- expect(event.get("custom1")).to eq("val1")
179
- expect(event.get("custom2")).to eq("val2")
180
- expect(event.get("column3")).to eq("val3")
182
+ context "parse csv skipping empty columns" do
183
+
184
+ let(:doc) { "val1,,val3" }
185
+
186
+ let(:config) do
187
+ { "skip_empty_columns" => true,
188
+ "source" => "datafield",
189
+ "columns" => ["custom1", "custom2", "custom3"] }
190
+ end
191
+
192
+ let(:event) { LogStash::Event.new("datafield" => doc) }
193
+
194
+ it "extract all the values" do
195
+ plugin.filter(event)
196
+ expect(event.get("custom1")).to eq("val1")
197
+ expect(event.get("custom2")).to be_falsey
198
+ expect(event.get("custom3")).to eq("val3")
199
+ end
181
200
  end
182
- end
183
201
 
184
- context "parse csv from a given source" do
185
- let(:doc) { "val1,val2,val3" }
186
- let(:config) do
187
- { "source" => "datafield",
188
- "columns" => ["custom1", "custom2", "custom3"] }
202
+ context "parse csv with more data than defined" do
203
+ let(:doc) { "val1,val2,val3" }
204
+ let(:config) do
205
+ { "columns" => ["custom1", "custom2"] }
206
+ end
207
+
208
+ it "extract all the values" do
209
+ plugin.filter(event)
210
+ expect(event.get("custom1")).to eq("val1")
211
+ expect(event.get("custom2")).to eq("val2")
212
+ expect(event.get("column3")).to eq("val3")
213
+ end
189
214
  end
190
- let(:event) { LogStash::Event.new("datafield" => doc) }
191
215
 
192
- it "extract all the values" do
193
- plugin.filter(event)
194
- expect(event.get("custom1")).to eq("val1")
195
- expect(event.get("custom2")).to eq("val2")
196
- expect(event.get("custom3")).to eq("val3")
216
+ context "parse csv from a given source" do
217
+ let(:doc) { "val1,val2,val3" }
218
+ let(:config) do
219
+ { "source" => "datafield",
220
+ "columns" => ["custom1", "custom2", "custom3"] }
221
+ end
222
+ let(:event) { LogStash::Event.new("datafield" => doc) }
223
+
224
+ it "extract all the values" do
225
+ plugin.filter(event)
226
+ expect(event.get("custom1")).to eq("val1")
227
+ expect(event.get("custom2")).to eq("val2")
228
+ expect(event.get("custom3")).to eq("val3")
229
+ end
197
230
  end
198
- end
199
- end
200
231
 
201
- describe "givin target" do
202
- let(:config) do
203
- { "target" => "data" }
204
- end
205
- let(:doc) { "big,bird,sesame street" }
206
- let(:event) { LogStash::Event.new("message" => doc) }
207
-
208
- it "extract all the values" do
209
- plugin.filter(event)
210
- expect(event.get("data")["column1"]).to eq("big")
211
- expect(event.get("data")["column2"]).to eq("bird")
212
- expect(event.get("data")["column3"]).to eq("sesame street")
232
+ context "that use [@metadata]" do
233
+ let(:metadata_field) { "[@metadata][one]" }
234
+ let(:config) do
235
+ {
236
+ "columns" => [ metadata_field, "foo" ]
237
+ }
238
+ end
239
+
240
+ let(:event) { LogStash::Event.new("message" => "hello,world") }
241
+
242
+ before do
243
+ plugin.filter(event)
244
+ end
245
+
246
+ it "should work correctly" do
247
+ expect(event.get(metadata_field)).to eq("hello")
248
+ end
249
+ end
213
250
  end
214
251
 
215
- context "when having also source" do
252
+ describe "givin target" do
216
253
  let(:config) do
217
- { "source" => "datain",
218
- "target" => "data" }
254
+ { "target" => "data" }
219
255
  end
220
- let(:event) { LogStash::Event.new("datain" => doc) }
221
256
  let(:doc) { "big,bird,sesame street" }
257
+ let(:event) { LogStash::Event.new("message" => doc) }
222
258
 
223
259
  it "extract all the values" do
224
260
  plugin.filter(event)
@@ -226,100 +262,152 @@ describe LogStash::Filters::CSV do
  expect(event.get("data")["column2"]).to eq("bird")
  expect(event.get("data")["column3"]).to eq("sesame street")
  end
- end
- end
 
- describe "using field convertion" do
+ context "when having also source" do
+ let(:config) do
+ { "source" => "datain",
+ "target" => "data" }
+ end
+ let(:event) { LogStash::Event.new("datain" => doc) }
+ let(:doc) { "big,bird,sesame street" }
+
+ it "extract all the values" do
+ plugin.filter(event)
+ expect(event.get("data")["column1"]).to eq("big")
+ expect(event.get("data")["column2"]).to eq("bird")
+ expect(event.get("data")["column3"]).to eq("sesame street")
+ end
+ end
 
- let(:config) do
- {
- "convert" => {
- "column1" => "integer",
- "column3" => "boolean",
- "column4" => "float",
- "column5" => "date",
- "column6" => "date_time",
- "column7" => "date",
- "column8" => "date_time",
+ context "which uses [nested][fieldref] syntax" do
+ let(:target) { "[foo][bar]" }
+ let(:config) do
+ {
+ "target" => target
  }
- }
- end
- # 2017-06-01,2001-02-03T04:05:06+07:00
- let(:doc) { "1234,bird,false,3.14159265359,2017-06-01,2001-02-03 04:05:06,invalid_date,invalid_date_time" }
- let(:event) { LogStash::Event.new("message" => doc) }
+ end
 
- it "converts to integer" do
- plugin.filter(event)
- expect(event.get("column1")).to eq(1234)
- end
+ let(:event) { LogStash::Event.new("message" => "hello,world") }
 
- it "does not convert without converter" do
- plugin.filter(event)
- expect(event.get("column2")).to eq("bird")
- end
+ before do
+ plugin.filter(event)
+ end
 
- it "converts to boolean" do
- plugin.filter(event)
- expect(event.get("column3")).to eq(false)
- end
+ it "should set fields correctly in the target" do
+ expect(event.get("#{target}[column1]")).to eq("hello")
+ expect(event.get("#{target}[column2]")).to eq("world")
+ end
 
- it "converts to float" do
- plugin.filter(event)
- expect(event.get("column4")).to eq(3.14159265359)
- end
+ context "with nested fieldrefs as columns" do
+ let(:config) do
+ {
+ "target" => target,
+ "columns" => [ "[test][one]", "[test][two]" ]
+ }
+ end
 
- it "converts to date" do
- plugin.filter(event)
- expect(event.get("column5")).to be_a(LogStash::Timestamp)
- expect(event.get("column5").to_s).to eq(LogStash::Timestamp.new(Date.parse("2017-06-01").to_time).to_s)
- end
+ it "should set fields correctly in the target" do
+ expect(event.get("#{target}[test][one]")).to eq("hello")
+ expect(event.get("#{target}[test][two]")).to eq("world")
+ end
+ end
 
- it "converts to date_time" do
- plugin.filter(event)
- expect(event.get("column6")).to be_a(LogStash::Timestamp)
- expect(event.get("column6").to_s).to eq(LogStash::Timestamp.new(DateTime.parse("2001-02-03 04:05:06").to_time).to_s)
+ end
  end
 
- it "tries to converts to date but return original" do
- plugin.filter(event)
- expect(event.get("column7")).to eq("invalid_date")
- end
+ describe "using field convertion" do
 
- it "tries to converts to date_time but return original" do
- plugin.filter(event)
- expect(event.get("column8")).to eq("invalid_date_time")
- end
+ let(:config) do
+ {
+ "convert" => {
+ "column1" => "integer",
+ "column3" => "boolean",
+ "column4" => "float",
+ "column5" => "date",
+ "column6" => "date_time",
+ "column7" => "date",
+ "column8" => "date_time",
+ }
+ }
+ end
+ # 2017-06-01,2001-02-03T04:05:06+07:00
+ let(:doc) { "1234,bird,false,3.14159265359,2017-06-01,2001-02-03 04:05:06,invalid_date,invalid_date_time" }
+ let(:event) { LogStash::Event.new("message" => doc) }
 
- context "when using column names" do
+ it "converts to integer" do
+ plugin.filter(event)
+ expect(event.get("column1")).to eq(1234)
+ end
 
- let(:config) do
- { "convert" => { "custom1" => "integer", "custom3" => "boolean" },
- "columns" => ["custom1", "custom2", "custom3"] }
+ it "does not convert without converter" do
+ plugin.filter(event)
+ expect(event.get("column2")).to eq("bird")
  end
 
- it "get converted values to the expected type" do
+ it "converts to boolean" do
  plugin.filter(event)
- expect(event.get("custom1")).to eq(1234)
- expect(event.get("custom2")).to eq("bird")
- expect(event.get("custom3")).to eq(false)
+ expect(event.get("column3")).to eq(false)
+ end
+
+ it "converts to float" do
+ plugin.filter(event)
+ expect(event.get("column4")).to eq(3.14159265359)
+ end
+
+ it "converts to date" do
+ plugin.filter(event)
+ expect(event.get("column5")).to be_a(LogStash::Timestamp)
+ expect(event.get("column5").to_s).to eq(LogStash::Timestamp.new(Date.parse("2017-06-01").to_time).to_s)
+ end
+
+ it "converts to date_time" do
+ plugin.filter(event)
+ expect(event.get("column6")).to be_a(LogStash::Timestamp)
+ expect(event.get("column6").to_s).to eq(LogStash::Timestamp.new(DateTime.parse("2001-02-03 04:05:06").to_time).to_s)
  end
- end
- end
 
- describe "given autodetect option" do
- let(:header) { LogStash::Event.new("message" => "first,last,address") }
- let(:doc) { "big,bird,sesame street" }
- let(:config) do
- { "autodetect_column_names" => true }
+ it "tries to converts to date but return original" do
+ plugin.filter(event)
+ expect(event.get("column7")).to eq("invalid_date")
+ end
+
+ it "tries to converts to date_time but return original" do
+ plugin.filter(event)
+ expect(event.get("column8")).to eq("invalid_date_time")
+ end
+
+ context "when using column names" do
+
+ let(:config) do
+ { "convert" => { "custom1" => "integer", "custom3" => "boolean" },
+ "columns" => ["custom1", "custom2", "custom3"] }
+ end
+
+ it "get converted values to the expected type" do
+ plugin.filter(event)
+ expect(event.get("custom1")).to eq(1234)
+ expect(event.get("custom2")).to eq("bird")
+ expect(event.get("custom3")).to eq(false)
+ end
+ end
  end
 
- it "extract all the values with the autodetected header" do
- plugin.filter(header)
- plugin.filter(event)
- expect(event.get("first")).to eq("big")
- expect(event.get("last")).to eq("bird")
- expect(event.get("address")).to eq("sesame street")
+ describe "given autodetect option" do
+ let(:header) { LogStash::Event.new("message" => "first,last,address") }
+ let(:doc) { "big,bird,sesame street" }
+ let(:config) do
+ { "autodetect_column_names" => true }
+ end
+
+ it "extract all the values with the autodetected header" do
+ plugin.filter(header)
+ plugin.filter(event)
+ expect(event.get("first")).to eq("big")
+ expect(event.get("last")).to eq("bird")
+ expect(event.get("address")).to eq("sesame street")
+ end
  end
  end
+
  end
  end
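The spec changes above exercise the plugin's `separator`, `quote_char`, and `columns` options. Since logstash-filter-csv parses rows with Ruby's standard `CSV` library, the values the specs expect can be sanity-checked outside Logstash with plain `CSV.parse_line` — a minimal sketch using only the Ruby standard library (this is an illustration, not the plugin's own code):

```ruby
require "csv"

# Default separator and quote char: a double-quoted field may contain commas.
row = CSV.parse_line('big,bird,"sesame, street"')
# row is ["big", "bird", "sesame, street"]

# Custom separator (";"): commas become ordinary characters, as in the
# "custom separator" spec.
row_semi = CSV.parse_line("big,bird;sesame street", col_sep: ";")
# row_semi is ["big,bird", "sesame street"]

# Custom quote char ("'"): single-quoted fields are unwrapped, as in the
# "quote char" spec.
row_quote = CSV.parse_line("big,bird,'sesame street'", quote_char: "'")
# row_quote is ["big", "bird", "sesame street"]

# Pairing parsed values with configured column names, mirroring what the
# "given column names" spec expects from the filter.
columns = ["first", "last", "address"]
fields  = columns.zip(CSV.parse_line("big,bird,sesame street")).to_h
# fields is {"first" => "big", "last" => "bird", "address" => "sesame street"}
```

Inside the filter these parsed values are then written onto the event (optionally under `target`), which is what the `event.get(...)` expectations verify.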
metadata CHANGED
@@ -1,14 +1,14 @@
  --- !ruby/object:Gem::Specification
  name: logstash-filter-csv
  version: !ruby/object:Gem::Version
- version: 3.0.5
+ version: 3.1.0
  platform: ruby
  authors:
  - Elastic
  autorequire:
  bindir: bin
  cert_chain: []
- date: 2017-08-15 00:00:00.000000000 Z
+ date: 2021-06-03 00:00:00.000000000 Z
  dependencies:
  - !ruby/object:Gem::Dependency
  requirement: !ruby/object:Gem::Requirement
@@ -44,7 +44,23 @@ dependencies:
  - - ">="
  - !ruby/object:Gem::Version
  version: '0'
- description: This gem is a Logstash plugin required to be installed on top of the Logstash core pipeline using $LS_HOME/bin/logstash-plugin install gemname. This gem is not a stand-alone program
+ - !ruby/object:Gem::Dependency
+ requirement: !ruby/object:Gem::Requirement
+ requirements:
+ - - "~>"
+ - !ruby/object:Gem::Version
+ version: '1.2'
+ name: logstash-mixin-ecs_compatibility_support
+ prerelease: false
+ type: :runtime
+ version_requirements: !ruby/object:Gem::Requirement
+ requirements:
+ - - "~>"
+ - !ruby/object:Gem::Version
+ version: '1.2'
+ description: This gem is a Logstash plugin required to be installed on top of the
+ Logstash core pipeline using $LS_HOME/bin/logstash-plugin install gemname. This
+ gem is not a stand-alone program
  email: info@elastic.co
  executables: []
  extensions: []
@@ -56,6 +72,7 @@ files:
  - LICENSE
  - NOTICE.TXT
  - README.md
+ - VERSION
  - docs/index.asciidoc
  - lib/logstash/filters/csv.rb
  - logstash-filter-csv.gemspec
@@ -82,9 +99,9 @@ required_rubygems_version: !ruby/object:Gem::Requirement
  version: '0'
  requirements: []
  rubyforge_project:
- rubygems_version: 2.4.8
+ rubygems_version: 2.6.13
  signing_key:
  specification_version: 4
- summary: The CSV filter takes an event field containing CSV data, parses it, and stores it as individual fields (can optionally specify the names).
+ summary: Parses comma-separated value data into individual fields
  test_files:
  - spec/filters/csv_spec.rb