profanalyzer 1.0.0 → 1.2.0

data/History.txt CHANGED
@@ -1,3 +1,8 @@
+ === 1.2.0 / 2011-08-11
+
+ * Added #flagged_words to find the discovered offending words. (Thanks nilmethod/Chris Hobbs!)
+ * You can now instantiate Profanalyzer to have multiple analyzers with potentially different settings.
+
  === 0.2.1 / 2009-03-25
 
  * Fixed some wordlist errors.
data/Manifest.txt CHANGED
@@ -1,6 +1,6 @@
  History.txt
  Manifest.txt
- README.markdown
+ README.md
  Rakefile
  config/list.yml
  lib/profanalyzer.rb
data/README.md CHANGED
@@ -8,7 +8,7 @@ Profanalyzer has one purpose: analyze a block of text for profanity. It is able
 
  What sets it slightly apart from other filters is that it classifies each blocked word as "profane", "racist", or "sexual" - although right now, each word is considered "profane". It also rates each word on a scale from 0-5, which is based on my subjective opinion, as well as whether the word is commonly used in non-profane situations, such as "ass" in "assess".
 
- The Profanalyzer will default to a tolerance of 2, which will kick back the arguably non-profane words. It will also test against all words, including racist or sexual words.
+ The Profanalyzer will default to a tolerance of 4, which will only flag the more offensive words. It will also test against all words, including racist or sexual words.
 
  Lastly, it allows for custom substitutions! For example, the filter at the website http://www.fark.com/ turns the word "fuck" into "fark", and "shit" into "shiat". You can specify these if you want.
 
@@ -45,6 +45,19 @@ Or do specific checking:
  Profanalyzer.profane? "mick" #==> false
  Profanalyzer.profane? "vagina" #==> true
 
+ You can obtain a list of the words which fell afoul of profanity checking:
+
+ Profanalyzer.flagged_words("shit damn foo") #==> ["shit", "damn"]
+ Profanalyzer.flagged_words("profanalyzer is rad!") #==> []
+
+ # With custom settings
+ Profanalyzer.check_all = false
+ Profanalyzer.check_racist = false
+ Profanalyzer.flagged_words("you're a mick") #==> []
+
+ # You can pass options to the method itself:
+ Profanalyzer.flagged_words("you're a mick", :racist => false) #==> []
+
  Lastly, you can add custom substitutions:
 
  Profanalyzer.substitute("shit","shiat")
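The word-matching behavior behind `flagged_words` can be sketched independently of the gem. This is a minimal stand-in, not the gem's implementation: the hardcoded `WORDLIST` is a hypothetical substitute for the gem's `config/list.yml`.

```ruby
# Minimal sketch of word-boundary profanity matching.
# WORDLIST is a hypothetical stand-in for config/list.yml.
WORDLIST = ["shit", "damn", "mick"]

def flagged_words(text, list = WORDLIST)
  # \b anchors keep "ass" from matching inside "assess"-style words.
  list.select { |word| text =~ /\b#{Regexp.escape(word)}\b/i }
end

puts flagged_words("shit damn foo").inspect    # => ["shit", "damn"]
puts flagged_words("assess the plan").inspect  # => []
```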
@@ -53,6 +66,19 @@ Lastly, you can add custom substitutions:
  Profanalyzer.substitute(:fuck => :fark)
  Profanalyzer.filter("fuck") #==> "fark"
 
+ ## Non-Global-State Use
+
+ If you don't want to rely on global state, perhaps because you need
+ different profanity settings in different contexts, simply create an
+ instance of the Profanalyzer class and call the same methods on that
+ instance:
+
+ analyzer = Profanalyzer.new
+ analyzer.tolerance = 5
+ analyzer.profane? 'hooker' #==> false
+ analyzer.filter 'fuck' #==> '#!$%'
+
+ Changing one instance's settings won't affect any other analyzers.
 
  ## Requirements
 
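The per-instance isolation described in that README section can be sketched without the gem. `Analyzer` here is a hypothetical toy class, not Profanalyzer itself; it only illustrates why instance state keeps two analyzers from affecting each other.

```ruby
# Sketch of per-instance settings isolation (hypothetical Analyzer class).
class Analyzer
  attr_accessor :tolerance

  def initialize(tolerance = 4)
    @tolerance = tolerance  # each instance carries its own setting
  end

  # A word is flagged when its badness meets or exceeds this tolerance.
  def profane?(word, badness)
    badness >= @tolerance
  end
end

strict  = Analyzer.new(1)
lenient = Analyzer.new(5)
puts strict.profane?("mick", 3)   # => true
puts lenient.profane?("mick", 3)  # => false (strict's setting didn't leak)
```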
@@ -62,6 +88,8 @@ hoe - a gem for building gems, which I used for profanalyzer.
 
  * Michael Edgar <adgar@carboni.ca>
  * Thomas Hanley <tjhanley.com@gmail.com>
+ * Peter Vandenberk <pvandenberk@mac.com>
+ * Christopher M. Hobbs <chris@altbit.org> (nilmethod)
 
  ## Installation
 
data/Rakefile CHANGED
@@ -4,27 +4,10 @@ require 'rubygems'
  require 'hoe'
  require './lib/profanalyzer.rb'
 
- Hoe.new('profanalyzer', Profanalyzer::VERSION) do |p|
- p.rubyforge_name = 'profanalyzer' # if different than lowercase project name
- p.developer('Michael J. Edgar', 'edgar@triqweb.com')
+ Hoe.spec('profanalyzer') do |p|
+ p.developer('Michael J. Edgar', 'adgar@carboni.ca')
  p.remote_rdoc_dir = ''
  p.summary = "Analyzes a block of text for profanity. It is able to filter profane words as well."
- desc 'Post your blog announcement to blogger.'
- task :post_blogger do
- require 'blogger'
- p.with_config do |config, path|
- break unless config['blogs']
- subject, title, body, urls = p.announcement
-
- config['blogs'].each do |site|
- next unless site['url'] =~ /www\.blogger\.com/
- acc = Blogger::Account.new(site['user'],site['password'])
- post = Blogger::Post.new(:title => title, :content => body, :categories => p.blog_categories, :formatter => :rdiscount)
- acc.post(site['blog_id'], post)
-
- end
- end
- end
  desc 'Pushes rdocs to carbonica'
  task :carbonica => :redocs do
  sh "scp -r doc/ adgar@carboni.ca@carboni.ca:/var/www/html/projects/#{p.name}/"
data/lib/profanalyzer.rb CHANGED
@@ -98,34 +98,61 @@ require 'yaml'
  # SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.
  class Profanalyzer
 
- VERSION = "1.0.0"
+ VERSION = "1.2.0"
 
- @@full_list = YAML::load_file(File.dirname(__FILE__)+"/../config/list.yml")
- @@racist_list = @@full_list.select {|w| w[:racist]}
- @@sexual_list = @@full_list.select {|w| w[:sexual]}
+ DEFAULT_TOLERANCE = 4
 
- @@settings = {:racism => :forbidden, :sexual => :forbidden, :profane => :forbidden, :tolerance => 4, :custom_subs => {}}
+ FULL = YAML::load_file(File.dirname(__FILE__)+"/../config/list.yml")
+ RACIST = FULL.select {|w| w[:racist]}
+ SEXUAL = FULL.select {|w| w[:sexual]}
 
- def self.forbidden_words_from_settings # :nodoc:
+ DEFAULT_SETTINGS = {:racism => :forbidden,
+ :sexual => :forbidden,
+ :profane => :forbidden,
+ :tolerance => DEFAULT_TOLERANCE,
+ :custom_subs => {}}
+
+ def self.singleton_class
+ class << self; self; end
+ end unless defined?(singleton_class)
+
+ def self.forward_to_default(*methods)
+ methods.each do |method|
+ singleton_class.class_eval do
+ define_method method do |*args|
+ DEFAULT_INSTANCE.send(method, *args)
+ end
+ end
+ end
+ end
+
+ def initialize(settings=DEFAULT_SETTINGS)
+ @settings = settings.dup
+ end
+
+ DEFAULT_INSTANCE = new
+
+ def forbidden_words_from_settings # :nodoc:
  banned_words = []
 
- @@full_list.each do |word|
- banned_words << word[:word] if @@settings[:tolerance] <= word[:badness]
- end if @@settings[:profane] == :forbidden
+ FULL.each do |word|
+ banned_words << word[:word] if @settings[:tolerance] <= word[:badness]
+ end if @settings[:profane] == :forbidden
 
- return banned_words if @@settings[:profane] == :forbidden #save some processing
+ return banned_words if @settings[:profane] == :forbidden #save some processing
 
- @@racist_list.each do |word|
- banned_words << word[:word] if @@settings[:tolerance] <= word[:badness]
- end if @@settings[:racism] == :forbidden
+ RACIST.each do |word|
+ banned_words << word[:word] if @settings[:tolerance] <= word[:badness]
+ end if @settings[:racism] == :forbidden
 
- @@sexual_list.each do |word|
- banned_words << word[:word] if @@settings[:tolerance] <= word[:badness]
- end if @@settings[:sexual] == :forbidden
+ SEXUAL.each do |word|
+ banned_words << word[:word] if @settings[:tolerance] <= word[:badness]
+ end if @settings[:sexual] == :forbidden
  banned_words
  end
+ forward_to_default :forbidden_words_from_settings
 
- def self.update_settings_from_hash(hash)
+ def update_settings_from_hash(hash)
  self.tolerance = hash[:tolerance] if hash.has_key? :tolerance
  self.check_racist = hash[:racist] if hash.has_key? :racist
  self.check_sexual = hash[:sexual] if hash.has_key? :sexual
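The `forward_to_default` machinery introduced above is the crux of the 1.2.0 refactor: instance methods carry the behavior, and class methods are generated that delegate to a shared default instance, so the old `Profanalyzer.profane?` API keeps working. The pattern in isolation, with a hypothetical `Counter` class standing in for Profanalyzer:

```ruby
# Sketch of the forward-to-default-instance pattern: generated class
# methods delegate to a shared DEFAULT_INSTANCE.
class Counter
  def self.forward_to_default(*methods)
    methods.each do |method|
      singleton_class.class_eval do
        # Defines a class method that forwards to the default instance.
        define_method(method) { |*args| DEFAULT_INSTANCE.send(method, *args) }
      end
    end
  end

  def initialize
    @count = 0
  end

  def increment
    @count += 1
  end

  def count
    @count
  end

  DEFAULT_INSTANCE = new
  forward_to_default :increment, :count
end

Counter.increment
Counter.increment
puts Counter.count      # => 2
puts Counter.new.count  # => 0 (fresh instances are independent of the default)
```

This keeps the global-style API backward compatible while making the class instantiable, which is exactly the trade-off the changelog entry describes.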
@@ -137,6 +164,7 @@ class Profanalyzer
  self.check_all = true
  end
  end
+ forward_to_default :update_settings_from_hash
 
  # Decides whether the given string is profane, given Profanalyzer's current
  # settings. Examples:
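`update_settings_from_hash` maps an options hash onto the corresponding setters, which is how the per-call options of `profane?` and friends reach the settings. A standalone sketch of that mapping, using a hypothetical `Settings` class rather than the gem's:

```ruby
# Sketch of mapping an options hash onto setters, in the style of
# update_settings_from_hash (hypothetical Settings class).
class Settings
  attr_accessor :tolerance, :check_racist, :check_sexual

  def update_from_hash(hash)
    # Only keys present in the hash touch the corresponding setting.
    self.tolerance    = hash[:tolerance] if hash.has_key?(:tolerance)
    self.check_racist = hash[:racist]    if hash.has_key?(:racist)
    self.check_sexual = hash[:sexual]    if hash.has_key?(:sexual)
  end
end

s = Settings.new
s.update_from_hash(:tolerance => 2, :racist => false)
puts s.tolerance     # => 2
puts s.check_racist  # => false
```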
@@ -157,23 +185,65 @@ class Profanalyzer
  # [:+racist+] Set to +true+ or +false+ to specify racial slur checking
  # [:+tolerance+] Sets the tolerance. 0-5.
  #
- def self.profane?(*args)
+ def profane?(*args)
  str = args[0]
  if (args.size > 1 && args[1].is_a?(Hash))
- oldsettings = @@settings
- self.update_settings_from_hash args[1]
+ oldsettings = @settings
+ update_settings_from_hash args[1]
  end
- banned_words = self.forbidden_words_from_settings
+ banned_words = forbidden_words_from_settings
  banned_words.each do |word|
  if str =~ /\b#{word}\b/i
- @@settings = oldsettings if oldsettings
+ @settings = oldsettings if oldsettings
  return true
  end
  end
- @@settings = oldsettings if oldsettings
+ @settings = oldsettings if oldsettings
  false
  end
-
+ forward_to_default :profane?
+
+ # Returns an array of words that match the currently set rules against the
+ # provided string. The array will be empty if no words are matched.
+ #
+ # Example:
+ # Profanalyzer.flagged_words("shit damn foo") #==> ["shit", "damn"]
+ # Profanalyzer.flagged_words("profanalyzer is rad!") #==> []
+ #
+ # With custom settings:
+ # Profanalyzer.check_all = false
+ # Profanalyzer.check_racist = false
+ # Profanalyzer.flagged_words("you're a mick") #==> []
+ #
+ # You can pass options to the method itself:
+ # Profanalyzer.flagged_words("you're a mick", :racist => false) #==> []
+ #
+ # Available options:
+ #
+ # [:+all+] Set to +true+ or +false+ to specify checking all words in the blacklist
+ # [:+sexual+] Set to +true+ or +false+ to specify sexual checking
+ # [:+racist+] Set to +true+ or +false+ to specify racial slur checking
+ # [:+tolerance+] Sets the tolerance. 0-5.
+ def flagged_words(*args)
+ flagged_words = []
+ str = args[0]
+
+ if (args.size > 1 && args[1].is_a?(Hash))
+ oldsettings = @settings
+ update_settings_from_hash args[1]
+ end
+
+ banned_words = forbidden_words_from_settings
+ banned_words.each do |word|
+ if str =~ /\b#{word}\b/i
+ flagged_words << word
+ end
+ end
+ @settings = oldsettings if oldsettings
+ return flagged_words
+ end
+ forward_to_default :flagged_words
+
  # Filters the provided string using the currently set rules, with #!@$%-like
  # characters substituted in.
  #
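Both `profane?` and `flagged_words` above share a save/override/restore idiom: per-call options temporarily replace the settings, and the old settings are put back before returning. The idiom in isolation, with a hypothetical `Checker` class:

```ruby
# Sketch of the temporary-settings-override pattern used by profane?,
# flagged_words, filter, and strip (hypothetical Checker class).
class Checker
  def initialize
    @settings = { :tolerance => 4 }
  end

  def flagged?(badness, opts = {})
    oldsettings = @settings
    @settings = @settings.merge(opts) unless opts.empty?  # per-call override
    result = badness >= @settings[:tolerance]
    @settings = oldsettings                               # always restore
    result
  end
end

c = Checker.new
puts c.flagged?(3)                   # => false (default tolerance 4)
puts c.flagged?(3, :tolerance => 2)  # => true  (override for this call only)
puts c.flagged?(3)                   # => false (settings were restored)
```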
@@ -195,16 +265,16 @@ class Profanalyzer
  # [:+racist+] Set to +true+ or +false+ to specify racial slur checking
  # [:+tolerance+] Sets the tolerance. 0-5.
  #
- def self.filter(*args)
+ def filter(*args)
  str = args[0]
  if (args.size > 1 && args[1].is_a?(Hash))
- oldsettings = @@settings
+ oldsettings = @settings
  self.update_settings_from_hash args[1]
  end
 
  retstr = str
 
- @@settings[:custom_subs].each do |k,v|
+ @settings[:custom_subs].each do |k,v|
  retstr.gsub!(/\b#{k.to_s}\b/i,v.to_s)
  end
 
@@ -213,20 +283,21 @@ class Profanalyzer
  retstr.gsub!(/\b#{word}\b/i,
  "#!$%@&!$%@%@&!$#!$%@&!$%@%@&!#!$%@&!$%@%@&!"[0..(word.length-1)])
  end
- @@settings = oldsettings if oldsettings
+ @settings = oldsettings if oldsettings
  retstr
  end
-
- def self.strip(*args)
+ forward_to_default :filter
+
+ def strip(*args)
  str = args[0]
  if (args.size > 1 && args[1].is_a?(Hash))
- oldsettings = @@settings
+ oldsettings = @settings
  self.update_settings_from_hash args[1]
  end
 
  retstr = str
 
- @@settings[:custom_subs].each do |k,v|
+ @settings[:custom_subs].each do |k,v|
  retstr.gsub!(/\b#{k.to_s}\b/i,v.to_s)
  end
 
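The masking `#filter` performs is a slice of a long grawlix string, cut to the banned word's length. A self-contained sketch of that technique (the wordlist argument is hypothetical; the gem derives it from settings):

```ruby
# Sketch of the grawlix masking used by #filter: each banned word is
# replaced by a slice of a repeating "#!$%@&" string matching its length.
GRAWLIX = "#!$%@&" * 8

def mask(text, banned)
  banned.each do |word|
    # The block form gets each match, so the mask length tracks the word.
    text = text.gsub(/\b#{Regexp.escape(word)}\b/i) { |m| GRAWLIX[0, m.length] }
  end
  text
end

puts mask("damn that test", ["damn"])  # => "#!$% that test"
```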
@@ -234,49 +305,59 @@ class Profanalyzer
  banned_words.each do |word|
  retstr.gsub!(/\b#{word}\b/i,"")
  end
- @@settings = oldsettings if oldsettings
+ @settings = oldsettings if oldsettings
  retstr
  end
+ forward_to_default :strip
 
  # Sets Profanalyzer's tolerance. Value should be an integer such that
  # 0 <= T <= 5.
- def self.tolerance=(new_tol)
- @@settings[:tolerance] = new_tol
+ def tolerance=(new_tol)
+ @settings[:tolerance] = new_tol
+ end
+
+ # Returns Profanalyzer's tolerance. Value will be an integer
+ # 0 <= T <= 5.
+ def tolerance
+ @settings[:tolerance]
  end
 
  # Sets Profanalyzer to scan (or not scan) for racist words, based on
  # the set tolerance.
  # This is set to +true+ by default.
- def self.check_racist=(check)
- @@settings[:racism] = (check) ? :forbidden : :ignore
+ def check_racist=(check)
+ @settings[:racism] = (check) ? :forbidden : :ignore
  end
 
  # Sets Profanalyzer to scan (or not scan) for sexual words, based on the set tolerance.
  # This is set to +true+ by default.
- def self.check_sexual=(check)
- @@settings[:sexual] = (check) ? :forbidden : :ignore
+ def check_sexual=(check)
+ @settings[:sexual] = (check) ? :forbidden : :ignore
  end
 
  # Sets Profanalyzer to scan (or not scan) for all profane words, based on the set tolerance.
  # This is set to +true+ by default.
- def self.check_all=(check)
- @@settings[:profane] = (check) ? :forbidden : :ignore
+ def check_all=(check)
+ @settings[:profane] = (check) ? :forbidden : :ignore
  end
 
  # Sets the list of substitutions to the hash passed in. Substitutions are
  # performed such that +Profanalyzer.filter(key) = value+.
- def self.subtitutions=(hash)
- @@settings[:custom_subs] = hash
+ def substitutions=(hash)
+ @settings[:custom_subs] = hash
  end
 
  # Sets a custom substitution for the filter.
  # Can be passed as +substitute("foo","bar")+ or +"foo" => "bar"+
- def self.substitute(*args)
+ def substitute(*args)
  case args[0]
  when String
- @@settings[:custom_subs].merge!(args[0] => args[1])
+ @settings[:custom_subs].merge!(args[0] => args[1])
  when Hash
- @@settings[:custom_subs].merge!(args[0])
+ @settings[:custom_subs].merge!(args[0])
  end
  end
- end
+
+ forward_to_default :tolerance, :tolerance=, :check_racist=, :check_sexual=, :check_all=
+ forward_to_default :substitutions=, :substitute
+ end
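`substitute` above accepts two call shapes, a string pair or a hash, and merges both into one substitutions map. A standalone sketch of that dual-form API, with a hypothetical `Subs` class in place of Profanalyzer:

```ruby
# Sketch of the dual-form substitution API: substitute("foo", "bar") and
# substitute(:foo => :bar) both merge into one substitutions hash.
class Subs
  def initialize
    @subs = {}
  end

  def substitute(*args)
    case args[0]
    when String then @subs[args[0]] = args[1]  # substitute("foo", "bar")
    when Hash   then @subs.merge!(args[0])     # substitute(:foo => :bar)
    end
  end

  def apply(text)
    @subs.each { |k, v| text = text.gsub(/\b#{k}\b/i, v.to_s) }
    text
  end
end

s = Subs.new
s.substitute("shit", "shiat")
s.substitute(:fuck => :fark)
puts s.apply("shit happens")  # => "shiat happens"
```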
@@ -3,10 +3,22 @@ require "profanalyzer"
 
  class TestProfanalyzer < Test::Unit::TestCase
 
+ def test_default_tolerance
+ assert_equal Profanalyzer::DEFAULT_TOLERANCE, Profanalyzer.tolerance
+ end
+
+ def test_profanalyzer_tolerance
+ 0.upto(5) do |tolerance|
+ Profanalyzer.tolerance = tolerance # setter
+ assert_equal tolerance, Profanalyzer.tolerance # getter
+ end
+ end
+
  def test_single_word
  Profanalyzer.tolerance = 0
  Profanalyzer.check_all = true
  assert_equal(true, Profanalyzer.profane?("asshole"))
+ assert_equal(["asshole"], Profanalyzer.flagged_words("asshole"))
  end
 
  def test_single_racist_word
@@ -16,6 +28,8 @@ class TestProfanalyzer < Test::Unit::TestCase
  Profanalyzer.check_racist = true
  assert_equal(true, Profanalyzer.profane?("spic"))
  assert_equal(false, Profanalyzer.profane?("pussy"))
+ assert_equal(["spic"], Profanalyzer.flagged_words("spic"))
+ assert_equal([], Profanalyzer.flagged_words("pussy"))
  end
 
  def test_single_sexual_word
@@ -25,6 +39,8 @@ class TestProfanalyzer < Test::Unit::TestCase
  Profanalyzer.check_sexual = true
  assert_equal(true, Profanalyzer.profane?("vagina"))
  assert_equal(false, Profanalyzer.profane?("nigger"))
+ assert_equal(["vagina"], Profanalyzer.flagged_words("vagina"))
+ assert_equal([], Profanalyzer.flagged_words("nigger"))
  end
 
  def test_tolerance
@@ -32,6 +48,8 @@ class TestProfanalyzer < Test::Unit::TestCase
  Profanalyzer.check_all = true
  assert_equal(false, Profanalyzer.profane?("asskisser")) # badness = 3
  assert_equal(true, Profanalyzer.profane?("fuck")) # badness = 5
+ assert_equal([], Profanalyzer.flagged_words("asskisser")) # badness = 3
+ assert_equal(["fuck"], Profanalyzer.flagged_words("fuck")) # badness = 5
  end
 
  def test_sexual_tolerance
@@ -41,6 +59,8 @@ class TestProfanalyzer < Test::Unit::TestCase
  Profanalyzer.check_sexual = true
  assert_equal(false, Profanalyzer.profane?("vagina")) # badness = 3
  assert_equal(true, Profanalyzer.profane?("cunt")) # badness = 5
+ assert_equal([], Profanalyzer.flagged_words("vagina")) # badness = 3
+ assert_equal(["cunt"], Profanalyzer.flagged_words("cunt")) # badness = 5
  end
 
  def test_racist_tolerance
@@ -50,6 +70,8 @@ class TestProfanalyzer < Test::Unit::TestCase
  Profanalyzer.check_racist = true
  assert_equal(false, Profanalyzer.profane?("mick")) # badness = 3
  assert_equal(true, Profanalyzer.profane?("nigger")) # badness = 5
+ assert_equal([], Profanalyzer.flagged_words("mick")) # badness = 3
+ assert_equal(["nigger"], Profanalyzer.flagged_words("nigger")) # badness = 5
  end
 
  def test_filter
@@ -80,7 +102,7 @@ class TestProfanalyzer < Test::Unit::TestCase
  assert_equal(filtered_string, Profanalyzer.filter(original_string))
  end
 
- def test_subtitutions
+ def test_substitutions
  Profanalyzer.substitute("shit","shiat")
  assert_equal("shiat", Profanalyzer.filter("shit"))
 
@@ -91,4 +113,10 @@ class TestProfanalyzer < Test::Unit::TestCase
  assert_equal("fark", Profanalyzer.filter("fuck"))
  end
 
+ def test_multiple_matches_in_flagged_words
+ Profanalyzer.tolerance = 0
+ Profanalyzer.check_all = true
+ assert_equal(["shit", "mick", "cocksucking"], Profanalyzer.flagged_words("You're a cocksucking piece of shit, you mick."))
+ end
+
  end
@@ -5,31 +5,42 @@ class TestProfanalyzer < Test::Unit::TestCase
 
  def test_single_word_advanced
  assert_equal(true, Profanalyzer.profane?("asshole", :tolerance => 0, :all => true))
+ assert_equal(["asshole"], Profanalyzer.flagged_words("asshole", :tolerance => 0, :all => true))
  end
 
  def test_single_racist_word_advanced
  assert_equal(true, Profanalyzer.profane?("spic", :tolerance => 0, :racist => true, :sexual => false))
  assert_equal(false, Profanalyzer.profane?("pussy", :tolerance => 0, :racist => true, :sexual => false))
+ assert_equal(["spic"], Profanalyzer.flagged_words("spic", :tolerance => 0, :racist => true, :sexual => false))
+ assert_equal([], Profanalyzer.flagged_words("pussy", :tolerance => 0, :racist => true, :sexual => false))
  end
 
  def test_single_sexual_word_advanced
  assert_equal(true, Profanalyzer.profane?("vagina", :tolerance => 0, :racist => false, :sexual => true))
  assert_equal(false, Profanalyzer.profane?("nigger", :tolerance => 0, :racist => false, :sexual => true))
+ assert_equal(["vagina"], Profanalyzer.flagged_words("vagina", :tolerance => 0, :racist => false, :sexual => true))
+ assert_equal([], Profanalyzer.flagged_words("nigger", :tolerance => 0, :racist => false, :sexual => true))
  end
 
  def test_tolerance_advanced
  assert_equal(false, Profanalyzer.profane?("asskisser", :tolerance => 4, :all => true)) # badness = 3
  assert_equal(true, Profanalyzer.profane?("fuck", :tolerance => 4, :all => true)) # badness = 5
+ assert_equal([], Profanalyzer.flagged_words("asskisser", :tolerance => 4, :all => true)) # badness = 3
+ assert_equal(["fuck"], Profanalyzer.flagged_words("fuck", :tolerance => 4, :all => true)) # badness = 5
  end
 
  def test_sexual_tolerance_advanced
  assert_equal(false, Profanalyzer.profane?("vagina", :tolerance => 4, :racist => false, :sexual => true)) # badness = 3
  assert_equal(true, Profanalyzer.profane?("cunt", :tolerance => 4, :racist => false, :sexual => true)) # badness = 5
+ assert_equal([], Profanalyzer.flagged_words("vagina", :tolerance => 4, :racist => false, :sexual => true)) # badness = 3
+ assert_equal(["cunt"], Profanalyzer.flagged_words("cunt", :tolerance => 4, :racist => false, :sexual => true)) # badness = 5
  end
 
  def test_racist_tolerance_advanced
  assert_equal(false, Profanalyzer.profane?("mick", :tolerance => 4, :racist => true, :sexual => false)) # badness = 3
  assert_equal(true, Profanalyzer.profane?("nigger", :tolerance => 4, :racist => true, :sexual => false)) # badness = 5
+ assert_equal([], Profanalyzer.flagged_words("mick", :tolerance => 4, :racist => true, :sexual => false)) # badness = 3
+ assert_equal(["nigger"], Profanalyzer.flagged_words("nigger", :tolerance => 4, :racist => true, :sexual => false)) # badness = 5
  end
 
  def test_filter_advanced
metadata CHANGED
@@ -4,9 +4,9 @@ version: !ruby/object:Gem::Version
  prerelease: false
  segments:
  - 1
+ - 2
  - 0
- - 0
- version: 1.0.0
+ version: 1.2.0
  platform: ruby
  authors:
  - Michael J. Edgar
@@ -14,7 +14,7 @@ autorequire:
  bindir: bin
  cert_chain: []
 
- date: 2010-04-06 00:00:00 -04:00
+ date: 2011-08-11 00:00:00 -04:00
  default_executable:
  dependencies:
  - !ruby/object:Gem::Dependency
@@ -27,8 +27,8 @@ dependencies:
  segments:
  - 2
  - 0
- - 2
- version: 2.0.2
+ - 4
+ version: 2.0.4
  type: :development
  version_requirements: *id001
  - !ruby/object:Gem::Dependency
@@ -40,9 +40,9 @@ dependencies:
  - !ruby/object:Gem::Version
  segments:
  - 0
- - 2
+ - 6
  - 1
- version: 0.2.1
+ version: 0.6.1
  type: :development
  version_requirements: *id002
  - !ruby/object:Gem::Dependency
@@ -61,7 +61,7 @@ dependencies:
  version_requirements: *id003
  description: ""
  email:
- - edgar@triqweb.com
+ - adgar@carboni.ca
  executables: []
 
  extensions: []
@@ -72,7 +72,7 @@ extra_rdoc_files:
  files:
  - History.txt
  - Manifest.txt
- - README.markdown
+ - README.md
  - Rakefile
  - config/list.yml
  - lib/profanalyzer.rb