log2json-loggers 0.1.9 → 0.1.10

Files changed (2)
  1. data/lib/log2json/railslogger.rb +12 -1
  2. metadata +60 -109
data/lib/log2json/railslogger.rb CHANGED
@@ -15,6 +15,16 @@ require 'logger'
 
 module Log2Json
 
+  LEVELS = {
+    :debug => Logger::DEBUG,
+    :info => Logger::INFO,
+    :warn => Logger::WARN,
+    :error => Logger::ERROR,
+    :fatal => Logger::FATAL,
+    :unknown => 5
+  }
+  LEVELS.default = Logger::INFO
+
   def self.log_formatter
     proc do |severity, datetime, progname, msg|
       "#{datetime.strftime('%Y-%m-%dT%H:%M:%S%z')}: [#{severity}] #{$$} #{msg.gsub(/\n/, '#012')}\n"
@@ -37,6 +47,7 @@ module Log2Json
       config.active_record.colorize_logging = false
     end
     logger = ::Logger.new(path)
+    logger.level = LEVELS[config.log_level]
     logger.formatter = ::Log2Json::log_formatter
     if defined?(ActiveSupport::TaggedLogging)
       ActiveSupport::TaggedLogging.new(logger)
@@ -49,8 +60,8 @@ module Log2Json
   #
   def self.create_custom_unicorn_logger(config)
     logger = ::Logger.new(config.set[:stderr_path])
+    logger.level = Logger::INFO
     logger.formatter = ::Log2Json::log_formatter
-    logger
   end
 
 
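In short, 0.1.10 makes the Rails logger honor `config.log_level` through the new `LEVELS` lookup (any unrecognized value falls back to `Logger::INFO` via the hash default) and pins the custom Unicorn logger at `Logger::INFO`. A minimal sketch of how that lookup behaves on its own, outside Rails; the `$stdout` logger and the sample keys below are illustrative only, not part of the gem:

    require 'logger'

    # Same mapping the gem now defines as Log2Json::LEVELS.
    LEVELS = {
      :debug   => Logger::DEBUG,   # 0
      :info    => Logger::INFO,    # 1
      :warn    => Logger::WARN,    # 2
      :error   => Logger::ERROR,   # 3
      :fatal   => Logger::FATAL,   # 4
      :unknown => 5
    }
    LEVELS.default = Logger::INFO  # Hash#default: any other key maps to INFO

    logger = Logger.new($stdout)
    logger.level = LEVELS[:warn]     # a Rails-style config.log_level of :warn => WARN
    logger.level = LEVELS[:verbose]  # not in the hash, so the default gives INFO
    logger.level = LEVELS["debug"]   # a String key also falls back to INFO

Because the fallback is silent, a misspelled or String-valued `config.log_level` ends up at INFO rather than raising an error.
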
metadata CHANGED
@@ -1,134 +1,85 @@
---- !ruby/object:Gem::Specification
+--- !ruby/object:Gem::Specification
 name: log2json-loggers
-version: !ruby/object:Gem::Version
-  hash: 9
+version: !ruby/object:Gem::Version
+  version: 0.1.10
   prerelease:
-  segments:
-  - 0
-  - 1
-  - 9
-  version: 0.1.9
 platform: ruby
-authors:
+authors:
 - Jack Kuan
 autorequire:
 bindir: bin
 cert_chain: []
-
-date: 2013-10-04 00:00:00 Z
+date: 2013-10-16 00:00:00.000000000 Z
 dependencies: []
-
-description: |+
-  Log2json lets you read, filter and send logs as JSON objects via Unix pipes.
-  It is inspired by Logstash, and is meant to be compatible with it at the JSON
-  event/record level so that it can easily work with Kibana.
-
-  Reading logs is done via a shell script(eg, `tail`) running in its own process.
-  You then configure(see the `syslog2json` or the `nginxlog2json` script for
-  examples) and run your filters in Ruby using the `Log2Json` module and its
-  contained helper classes.
-
-  `Log2Json` reads from stdin the logs(one log record per line), parses the log
-  lines into JSON records, and then serializes and writes the records to stdout,
-  which then can be piped to another process for processing or sending it to
-  somewhere else.
-
-  Currently, Log2json ships with a `tail-log` script that can be run as the input
-  process. It's the same as using the Linux `tail` utility with the `-v -F`
-  options except that it also tracks the positions(as the numbers of lines read
-  from the beginning of the files) in a few files in the file system so that if the
-  input process is interrupted, it can continue reading from where it left off
-  next time if the files had been followed. This feature is similar to the sincedb
-  feature in Logstash's file input.
-
-  Note: If you don't need the tracking feature(ie, you are fine with always
-  tailling from the end of file with `-v -F -n0`), then you can just use the `tail`
-  utility that comes with your Linux distribution.(Or more specifically, the
-  `tail` from the GNU coreutils). Other versions of the `tail` utility may also
-  work, but are not tested. The input protocol expected by Log2json is very
-  simple and documented in the source code.
-
-  ** The `tail-log` script uses a patched version of `tail` from the GNU coreutils
-  package. A binary of the `tail` utility compiled for Ubuntu 12.04 LTS is
-  included with the Log2json gem. If the binary doesn't work for your
-  distribution, then you'll need to get GNU coreutils-8.13, apply the patch(it
-  can be found in the src/ directory of the installed gem), and then replace
-  the bin/tail binary in the directory of the installed gem with your version
-  of the binary. **
-
-  P.S. If you know of a way to configure and compile ONLY the tail program in
-  coreutils, please let me know! The reason I'm not building tail post gem
-  installation is that it takes too long to configure && make because that
-  actually builds every utilties in coreutils.
-
-
-  For shipping logs to Redis, there's the `lines2redis` script that can be used as
-  the output process in the pipe. For shipping logs from Redis to ElasticSearch,
-  Log2json provides a `redis2es` script.
-
-  Finally here's an example of Log2json in action:
-
-  From a client machine:
-
-  tail-log /var/log/{sys,mail}log /var/log/{kern,auth}.log | syslog2json |
-  queue=jsonlogs \
-  flush_size=20 \
-  flush_interval=30 \
-  lines2redis host.to.redis.server 6379 0 # use redis DB 0
-
-
-  On the Redis server:
-
-  redis_queue=jsonlogs redis2es host.to.es.server
-
-
-
-  Resources that help writing log2json filters:
-  - look at log2json.rb source and example filters
-  - http://grokdebug.herokuapp.com/
-  - http://www.ruby-doc.org/stdlib-1.9.3/libdoc/date/rdoc/DateTime.html#method-i-strftime
-
+description: ! "Log2json lets you read, filter and send logs as JSON objects via Unix
+  pipes.\nIt is inspired by Logstash, and is meant to be compatible with it at the
+  JSON\nevent/record level so that it can easily work with Kibana. \n\nReading logs
+  is done via a shell script(eg, `tail`) running in its own process.\nYou then configure(see
+  the `syslog2json` or the `nginxlog2json` script for\nexamples) and run your filters
+  in Ruby using the `Log2Json` module and its\ncontained helper classes.\n\n`Log2Json`
+  reads from stdin the logs(one log record per line), parses the log\nlines into JSON
+  records, and then serializes and writes the records to stdout,\nwhich then can be
+  piped to another process for processing or sending it to\nsomewhere else.\n\nCurrently,
+  Log2json ships with a `tail-log` script that can be run as the input\nprocess. It's
+  the same as using the Linux `tail` utility with the `-v -F`\noptions except that
+  it also tracks the positions(as the numbers of lines read\nfrom the beginning of
+  the files) in a few files in the file system so that if the\ninput process is interrupted,
+  it can continue reading from where it left off\nnext time if the files had been
+  followed. This feature is similar to the sincedb\nfeature in Logstash's file input.\n\nNote:
+  If you don't need the tracking feature(ie, you are fine with always\ntailling from
+  the end of file with `-v -F -n0`), then you can just use the `tail`\nutility that
+  comes with your Linux distribution.(Or more specifically, the\n`tail` from the GNU
+  coreutils). Other versions of the `tail` utility may also\nwork, but are not tested.
+  The input protocol expected by Log2json is very\nsimple and documented in the source
+  code.\n\n** The `tail-log` script uses a patched version of `tail` from the GNU
+  coreutils\npackage. A binary of the `tail` utility compiled for Ubuntu 12.04 LTS
+  is\nincluded with the Log2json gem. If the binary doesn't work for your\ndistribution,
+  then you'll need to get GNU coreutils-8.13, apply the patch(it\ncan be found in
+  the src/ directory of the installed gem), and then replace\nthe bin/tail binary
+  in the directory of the installed gem with your version\nof the binary. ** \n\nP.S.
+  If you know of a way to configure and compile ONLY the tail program in\ncoreutils,
+  please let me know! The reason I'm not building tail post gem\ninstallation is that
+  it takes too long to configure && make because that\nactually builds every utilties
+  in coreutils.\n\n\nFor shipping logs to Redis, there's the `lines2redis` script
+  that can be used as\nthe output process in the pipe. For shipping logs from Redis
+  to ElasticSearch,\nLog2json provides a `redis2es` script.\n\nFinally here's an example
+  of Log2json in action:\n\nFrom a client machine:\n\n tail-log /var/log/{sys,mail}log
+  /var/log/{kern,auth}.log | syslog2json |\n queue=jsonlogs \\\n flush_size=20
+  \\\n flush_interval=30 \\\n lines2redis host.to.redis.server 6379
+  0 # use redis DB 0\n\n\nOn the Redis server:\n\n redis_queue=jsonlogs redis2es
+  host.to.es.server\n\n\nResources that help writing log2json filters:\n\n - look
+  at log2json.rb source and example filters\n - http://grokdebug.herokuapp.com/\n
+  \ - http://www.ruby-doc.org/stdlib-1.9.3/libdoc/date/rdoc/DateTime.html#method-i-strftime\n\n"
 email: jack.kuan@thescore.com
 executables: []
-
 extensions: []
-
 extra_rdoc_files: []
-
-files:
+files:
 - lib/log2json/railslogger.rb
 homepage:
 licenses: []
-
 post_install_message:
 rdoc_options: []
-
-require_paths:
+require_paths:
 - lib
-required_ruby_version: !ruby/object:Gem::Requirement
+required_ruby_version: !ruby/object:Gem::Requirement
   none: false
-  requirements:
-  - - ">="
-    - !ruby/object:Gem::Version
-      hash: 3
-      segments:
-      - 0
-      version: "0"
-required_rubygems_version: !ruby/object:Gem::Requirement
+  requirements:
+  - - ! '>='
+    - !ruby/object:Gem::Version
+      version: '0'
+required_rubygems_version: !ruby/object:Gem::Requirement
   none: false
-  requirements:
-  - - ">="
-    - !ruby/object:Gem::Version
-      hash: 3
-      segments:
-      - 0
-      version: "0"
+  requirements:
+  - - ! '>='
+    - !ruby/object:Gem::Version
+      version: '0'
 requirements: []
-
 rubyforge_project:
-rubygems_version: 1.8.15
+rubygems_version: 1.8.23
 signing_key:
 specification_version: 3
-summary: Custom loggers for Rails and Unicorn that use log2json's single-line log format.
+summary: Custom loggers for Rails and Unicorn that use log2json's single-line log
+  format.
 test_files: []
-
+has_rdoc: