rwikibot 1.0.10 → 2.0.0

data/CHANGELOG CHANGED
@@ -1,5 +1,33 @@
  RWikiBot ChangeLog
 
+ 2.0 **BREAKING RELEASE** - this will break pretty much all 1.X bots.
+ - It's smart enough now to handle query-continues! As such, there's a new dependency - deep_merge. Get it - it's available as a gem.
+ - Made a Page class. Now, instead of bot.delete_page(title...), you do page.delete
+ - Got rid of the stupid Result class. That was a terrible decision.
+ - edit_page is now save, part of the Page class concept
+ - Changed user-agent to bot/RWikiBot/2.0
+ - page_exists has been replaced with the page attribute "missing" - this aligns with MediaWiki's behavior
+ - Removed redirects since it doesn't matter anymore now that the API is solidified
+ - Removed image_embedded_in since it's defunct
+ - Removed revisions due to limited functionality
+ - Removed pageid_to_title because the API is smart enough to not require conversion
+ - Made the utilities all private because utilities don't need to be public
+ - Shortened up the code considerably by returning make_request when I can
+ - Moved make_request and its new friend raw_call to utilities, because that's where they belong
+ - Removed normalize since I can get the "missing" attribute when creating a page, and that's good 'nuff
+
+ 1.2
+ - Added edit functionality to complete the API
+ - Played catch-up with a whole lot of methods
+
+ 1.1
+ - Switched from YAML to XML for a handful of reasons, most of them being poorly formatted YAML errors in the API
+ - Disabled cookies on login since they're hard to maintain, bad long-term practice, and the API supports going without them (you're welcome)
+ - Struck the "name" and "prefix" fields from initialize. Those were antiquated anyway.
+ - Reordered the methods to align with documentation
+ - Changed User-Agent to bot-RWikiBot/1.1
+ - Refined error handling (sorta)
+
  1.0.10
  - Changed a get back to post since the bug wasn't with me.
 
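The query-continue entry above is the headline change. Here is a runnable plain-Ruby sketch of why stitching continued API batches needs a *deep* merge rather than Hash#merge. This hand-rolled `deep_merge` is only an illustration, not the deep_merge gem's implementation, and the response hashes are made-up examples:

```ruby
# Two hypothetical 'list=allpages' batches, as parsed response hashes:
batch1 = { 'query' => { 'allpages' => { 'p' => [{ 'title' => 'Apple' }] } } }
batch2 = { 'query' => { 'allpages' => { 'p' => [{ 'title' => 'Baseball' }] } } }

# A shallow merge overwrites batch1's page list entirely:
shallow = batch1.merge(batch2)

# A deep merge walks nested hashes and concatenates arrays instead:
def deep_merge(a, b)
  a.merge(b) do |_key, old, new|
    if old.is_a?(Hash) && new.is_a?(Hash)
      deep_merge(old, new)   # recurse into nested hashes
    elsif old.is_a?(Array) && new.is_a?(Array)
      old + new              # accumulate continued results
    else
      new
    end
  end
end

merged = deep_merge(batch1, batch2)

puts shallow['query']['allpages']['p'].length  # 1 - batch1's page was lost
puts merged['query']['allpages']['p'].length   # 2 - both pages kept
```

This is the reason a plain `merge` can't be used when following `query-continue`: each batch nests its payload under the same keys, so the last batch would win.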
data/README CHANGED
@@ -1,3 +1,17 @@
- This is RWikiBot 1.0.
+ This is RWikiBot 2.0
 
- This is a test README file.
+ For the most up-to-date information on RWikiBot, how to use it, and why it's amazing, please visit:
+ http://www.rwikibot.net
+
+ New in 2.0 (highlights)
+ - It's smart enough now to handle query-continues! As such, there's a new dependency - deep_merge. Get it - it's available as a gem.
+ - Made a Page class. Now, instead of bot.delete_page(title...), you do page.delete
+ - Got rid of the stupid Result class. That was a terrible decision.
+ - edit_page is now save, part of the Page class concept
+ - Changed user-agent to bot/RWikiBot/2.0
+
+ For more, see the CHANGELOG.
+
+ Author:: Eddie Roger (mailto:eddieroger@nospam@gmail.com) http://www.eddieroger.com
+ Copyright:: Copyright (c) 2008 Eddie Roger
+ License:: GNU/GPL 2.0
data/lib/errors.rb ADDED
@@ -0,0 +1,14 @@
+ module RWBErrors
+   class LoginError < StandardError
+   end
+
+   class RWikiBotError < StandardError
+   end
+
+   class VersionTooLowError < StandardError
+   end
+
+   class NotLoggedInError < StandardError
+   end
+
+ end
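Since every class above is a plain StandardError subclass, callers can rescue them selectively. A self-contained sketch of the intended usage (the module is reproduced inline so the snippet runs on its own, and the error message is invented):

```ruby
module RWBErrors
  class LoginError < StandardError; end
  class RWikiBotError < StandardError; end
  class VersionTooLowError < StandardError; end
  class NotLoggedInError < StandardError; end
end

include RWBErrors

begin
  # In 2.0, login raises instead of returning FALSE on failure:
  raise LoginError, 'WrongPass: incorrect password'
rescue LoginError => e
  puts "Could not log in: #{e.message}"
end
```

Because they all inherit from StandardError, a bare `rescue => e` in bot scripts still catches everything RWikiBot raises.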
data/lib/pages.rb ADDED
@@ -0,0 +1,206 @@
+ ## This class defines a MediaWiki page. Think of it like this: the bot handles site-related stuff, but to take action on a page, you need a page object.
+ require 'rwikibot'
+ require 'errors'
+
+ module Pages
+   include RWBErrors
+
+   class Page
+     attr_reader :title, :namespace, :new, :length, :counter, :lastrevid
+
+     # Creates a new Page object.
+     def initialize(bot, title='')
+       @bot = bot
+
+       # Get page attributes
+       info = info(title)
+
+       @title     = info['title']
+       @namespace = info['ns']
+       @new       = info.has_key?('new')
+       @length    = info['length']
+       @counter   = info['counter']
+       @lastrevid = info['lastrevid']
+       @missing   = info.has_key?('missing')
+     end
+
+     # This will get only the content of the article. It is a modification of revisions to specifically pull the content. I thought it would be useful.
+     def content(options = nil)
+       post_me = {'prop' => 'revisions', 'titles' => @title, 'rvprop' => 'content'}
+
+       # Handle any additional options
+       if options != nil
+         options.each_pair do |key, value|
+           post_me[key] = value
+         end
+       end
+
+       # Make the request. Because we care.
+       revisions_result = @bot.make_request('query', post_me)
+       return revisions_result.fetch('pages').fetch('page').fetch('revisions').fetch('rev')
+     end
+
+     # If you have to ask what this method does, don't use it. Seriously, use with caution - this method does not have a confirmation step, and deletions (while restorable) are immediate.
+     def delete(reason="Deleted by RWikiBot")
+       raise RWBErrors::VersionTooLowError unless @bot.meets_version_requirement(1,12)
+       raise RWBErrors::NotLoggedInError unless @bot.logged_in?
+
+       post_me = {
+         'title'  => @title,
+         'token'  => get_token('delete'),
+         'reason' => reason
+       }
+
+       return @bot.make_request('delete', post_me)
+     end
+
+     # This method fetches any article that links to this one. Returned in alphabetical order.
+     def backlinks(options = nil)
+       raise RWBErrors::VersionTooLowError unless @bot.meets_version_requirement(1,9)
+
+       # Limits vary based on user rights of the Bot. Set to bot.
+       post_me = {'list' => 'backlinks', 'bltitle' => @title }
+
+       if options != nil
+         options.each_pair do |key, value|
+           post_me[key] = value
+         end
+       end
+
+       # Make the request
+       backlinks_result = @bot.make_request('query', post_me)
+       return backlinks_result.fetch('backlinks').fetch('bl')
+     end
+
+     # This method pulls any page that includes the template requested. Please note - the template must be the full name, like "Template:Disputed" or "Template:Awesome".
+     def embedded_in(options = nil)
+       raise RWBErrors::VersionTooLowError unless @bot.meets_version_requirement(1,9)
+
+       # Limits vary based on user rights of the Bot. Set to bot.
+       post_me = {'list' => 'embeddedin', 'eititle' => @title }
+
+       if options != nil
+         options.each_pair do |key, value|
+           post_me[key] = value
+         end
+       end
+
+       # Make the request
+       embeddedin_result = @bot.make_request('query', post_me)
+       return embeddedin_result.fetch('embeddedin').fetch('ei')
+     end
+
+     # I decided to split this up since I wanted to normalize the bot framework as much as possible, or in other words, make it as easy to use as possible. I think the sacrifice of more methods is worth having more English-looking code. It's the Ruby way.
+     # Info will return information about the page, from namespace to normalized title, last touched, etc.
+     def info(titles)
+       # Basic query info
+       post_me = {"prop" => "info", 'titles' => titles}
+
+       # Make the request
+       info_result = @bot.make_request('query', post_me)
+
+       return info_result.fetch('pages').fetch('page')
+     end
+
+     # This method will let you move a page from one name to another. A move token is required for this to work. Keep that in mind. (get_token much?)
+     def move(to, reason, movetalk = true, noredirect = false)
+       raise RWBErrors::VersionTooLowError unless @bot.meets_version_requirement(1,12)
+       raise RWBErrors::NotLoggedInError unless @bot.logged_in?
+
+       post_me = {
+         'from'   => @title,
+         'to'     => "#{to}",
+         'token'  => get_token('move'),
+         'reason' => "#{reason}"
+       }
+
+       # These ifs are necessary because movetalk and noredirect should only be part of post_me if the passed vars are true (which they are by default)
+       if movetalk
+         post_me['movetalk'] = ''
+       end
+       if noredirect
+         post_me['noredirect'] = ''
+       end
+
+       return @bot.make_request('move', post_me)
+     end
+
+     # Rollback does what it says - rolls back an article one version in the wiki. This is a function that requires not only a token, but a previous user.
+     def rollback(summary="", markbot=true)
+       temp_token = get_token("rollback") # special for rollback. Stupid rollback.
+
+       post_me = {
+         'title'   => @title,
+         'token'   => temp_token['token'],
+         'user'    => temp_token['user'],
+         'summary' => summary
+       }
+
+       if markbot
+         post_me['markbot'] = ''
+       end
+
+       return @bot.make_request('rollback', post_me)
+     end
+
+     # This method is used to edit pages. Not much more to say about it. Be sure you're logged in and got a token (get_token). Options is a hash of extra values allowed by the API.
+     def save(content, summary = nil, options = nil)
+       post_me = {
+         'text'     => "#{content}",
+         'token'    => get_token("edit"),
+         'title'    => @title,
+         # 'lgtoken' => @config['lgtoken'],
+         'summary'  => "#{summary}",
+         'edittime' => Time.now.strftime("%Y%m%d%H%M%S")
+         # 'userid'  => @config.fetch('lguserid'),
+       }
+
+       unless options.nil?
+         options.each do |key, value|
+           post_me[key] = value
+         end
+       end
+
+       return @bot.make_request('edit', post_me).fetch('result')
+     end
+
+     private
+
+     # This method should universally return tokens; just give title and type. You will receive a token string (suitable for use in other methods), so plan accordingly.
+     # Use an edit token for both editing and creating articles (edit_article, create_article). For rollback, more than just a token is required. So, for token=rollback, you get a hash of token|user. Just the way it goes.
+     def get_token(intoken)
+       if intoken.downcase == 'rollback'
+         # specific to rollback
+         post_me = {
+           'prop'    => 'revisions',
+           'rvtoken' => intoken,
+           'titles'  => @title
+         }
+       else
+         post_me = {
+           'prop'    => 'info',
+           'intoken' => intoken,
+           'titles'  => @title
+         }
+       end
+       raw_token = @bot.make_request('query', post_me)
+
+       if intoken.downcase == 'rollback'
+         # Damn this decision to make rollback special! Wasn't mine, I just have to live by it.
+         token2 = raw_token.fetch('pages').fetch('page').fetch('revisions').fetch('rev')
+         return {'token' => token2.fetch('rollbacktoken'), 'user' => token2.fetch('user')}
+       else
+         return raw_token.fetch('pages').fetch('page').fetch("#{intoken}token")
+       end
+     end
+   end # class
+
+ end # module
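The private get_token method above builds one of two query shapes depending on the token type. That branch, extracted as a standalone, network-free sketch (`token_query` is a hypothetical helper name, not part of the gem):

```ruby
# Rollback tokens are fetched per-revision, alongside the last editor;
# every other token type (edit, delete, move...) hangs off prop=info.
def token_query(intoken, title)
  if intoken.downcase == 'rollback'
    { 'prop' => 'revisions', 'rvtoken' => intoken, 'titles' => title }
  else
    { 'prop' => 'info', 'intoken' => intoken, 'titles' => title }
  end
end

puts token_query('edit', 'Main Page').inspect
puts token_query('rollback', 'Main Page').inspect
```

This is why get_token returns a plain token string in the common case but a `{'token' => ..., 'user' => ...}` hash for rollback: the rollback response carries the revision's last editor, which the rollback action also requires.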
data/lib/rwikibot.rb CHANGED
@@ -1,674 +1,195 @@
- # RWikiBot 0.1
- #
- # This is a framework upon which to create MediaWiki Bots. It provides a set of methods to acccess MediaWiki's API and return information in
+ # This is a framework upon which to create MediaWiki Bots. It provides a set of methods to access MediaWiki's API and return information in
  # various forms, depending on the type of information returned. By abstracting these methods into a Bot object, cleaner script code can be
- # written later. Furthermore, it facilitates the updating of the API without breaking old bots. Last, but not least, its good to abstract.
- #
- # Configuration options are kept in config.yaml and processed on bot creation. I have worked really hard to make sure this file doesn't need editing,
- # so please don't make changes here, and email me if you find something that I missed.
- #
- # Author:: Edwin Sidney Roger (mailto:eddieroger@gmail.com)
- # Copyright:: Copyright (c) 2007 Edwin Sidney Roger
+ # written later. Furthermore, it facilitates the updating of the API without breaking old bots. Last, but not least, it's good to abstract.
+ #
+ # Author:: Eddie Roger (mailto:eddieroger@gmail.com)
+ # Copyright:: Copyright (c) 2008 Eddie Roger
  # License:: GNU/GPL 2.0
+
+ # ruby requires
  require 'net/http'
  require 'uri'
- require 'yaml'
  require 'cgi'
- require 'logger'
 
+ # my requires
+ require 'errors'
+ require 'utilities'
+ require 'pages'
+
+ # gem requires
+ require 'xmlsimple'
+ require 'deep_merge' # New in 2.0!
 
  # This is the main bot object. The goal is to represent every API method in some form here, and then write separate, cleaner scripts in individual bot files utilizing this framework. Basically, this is an include at best.
  class RWikiBot
 
+   include RWBErrors
+   include RWBUtilities
+   include Pages
 
-   attr_accessor :http, :config, :botname
-
-   # New bots hope for three attributes, but require none. The first is simply the name of the bot for logging purposes. The second is the debug level constant, and third is the logfile.
-   #
-   # Example: bot = RWikiBot.new("My Neat Bot", Logger::DEBUG, "./rwikibot.log")
-   def initialize (name = "TestBot", username = 'RWikiBot', password = 'rwikibot', domain = 'en', api_path = 'http://localhost:8888/wiki/api.php', wikicookieprefix = 'wikidb_wiki')
+   attr_reader :config
 
-     @botname = name
+   def initialize (username = 'rwikibot', password = '', api_path = 'http://www.rwikibot.net/wiki/api.php', domain = '', login=false)
      @config = Hash.new
-
-     #We log. Log4r in the house.
-     ##@wikibotlogger = Logger.new('rwikibot.log')
-
-     #INFO level is nice. I like it as production default because I like log files. Feel free to change. It will change once config loaded
-     ##@wikibotlogger.level = Logger::DEBUG
-     ###@wikibotlogger.info "New RWikiBot created. My name is #{@botname}"
-
+
      # This had to come back since I was having config loading issues when being called from MediaWiki
      @config['username'] = username
      @config['password'] = password
-     @config['domain'] = domain
      @config['api_path'] = api_path
-     @config['wikicookieprefix'] = wikicookieprefix
-
+     @config['domain'] = domain
+     @config['cookies'] = ""
      @config['logged_in'] = FALSE
      @config['uri'] = URI.parse(@config.fetch('api_path'))
-
-     ##change - make an HTTP object here for use later, now that we have config
-     @http = Net::HTTP.new(@config.fetch('uri').host, @config.fetch('uri').port)
-     ###@wikibotlogger.debug("INIT - Created HTTP object. Result is: #{@http}}")
 
-     @config['cookie'] = nil
+     # This has to be last, methinks
+     @config['api_version'] = version.to_f
+
+     if login
+       login
+     end
+
   end
 
-   # Login
-   #
-   # This is the method that will allow the bot to log in to the wiki. Its not always necessary, but bots need to log in to save changes or retrieve watchlists.
-   #
-   # No variables are accepted and the username/password are stored in config.yaml.
+   # This is the method that will allow the bot to log in to the wiki. It's not always necessary, but bots need to log in to save changes or retrieve watchlists.
   def login
-
-     ###@wikibotlogger.debug("LOGIN - Preparing login information...")
+     raise VersionTooLowError unless meets_version_requirement(0,0)
+
     post_me = {'lgname'=>@config.fetch('username'),'lgpassword'=>@config.fetch('password')}
     if @config.has_key?('domain') && (@config.fetch('domain') != nil)
       post_me['lgdomain'] = @config.fetch('domain')
     end
-
-     ###@wikibotlogger.debug("LOGIN - Asking make_request to perform login...")
+
+     # Calling make_request to actually log in
     login_result = make_request('login', post_me)
-     ##@wikibotlogger.debug("LOGIN - We should have a result of type login now.")
-
-     ##@wikibotlogger.debug("LOGIN - login_result received. Result is: #{login_result.fetch('result')}")
-
+
     # Now we need to change some @config stuff, specifically that we're logged in and the variables of that
     # This will also change the make_request, but I'll comment there
-     if login_result.fetch('result') == "Success"
+     if login_result['result'] == "Success"
       # All lg variables are directly from API and stored in config that way
-       ##@wikibotlogger.info("LOGIN - Login successful. Wiki user is: #{login_result.fetch('lgusername')}")
-       @config['logged_in'] = TRUE
-       @config['lgusername'] = login_result.fetch('lgusername')
-       @config['lguserid'] = login_result.fetch('lguserid')
-       @config['lgtoken'] = login_result.fetch('lgtoken')
-       ##@wikibotlogger.debug("LOGIN - MediaWiki API variables stored in @config")
-       return TRUE
-     else
-       ##@wikibotlogger.error("LOGIN - Login Error. Wiki API said: #{login_result.fetch('result')}")
-       return FALSE
-     end
-
-   end
-
95
- # Watchlist
96
- #
97
- # This method will get the watchlist for the bot's MediaWiki username. This is really onlu useful if you want the bot to watch a specific list of pages, and would require the bot maintainer to login to the wiki as the bot to set the watchlist.
98
- #
99
- # INPUT:: Options is a hash of API allowed fields that will be passed.
100
- #
101
- # OUTPUT:: Returns an array of hashes.
102
- def watchlist (options=nil)
103
- # Get the bot's watchlist
104
- ##@wikibotlogger.debug("WATCHLIST - Preparing request information...")
105
- post_me = {'list'=>'watchlist'}
106
-
107
- if options != nil
108
- options.each do |key, value|
109
- post_me[key] = value
110
- end
111
- end
112
-
113
- # Make the request
114
- ##@wikibotlogger.debug "WATCHLIST - Asking make_request to get watchlist..."
115
- watchlist_result = make_request('query', post_me)
116
- ##@wikibotlogger.debug "WATCHLIST - We should have a result of type query now."
117
-
118
- #Process into a Hash for return
119
- ##@wikibotlogger.debug "WATCHLIST - Processing result..."
120
- return watchlist_result.fetch('watchlist')
121
-
122
- end
123
-
124
- # Query
125
- #
126
- # This method will return Wiki-wide recent changes, almost as if looking at the Special page Recent Changes. But, in this format, a bot can handle it. Also we're using the API. And bots can't read.
127
- #
128
- # INPUT:: A hash of API-allowed keys and values. Default is same as API default.
129
- # PARAMETERS:: letype (flt), lefrom (paging timestamp), leto (flt), ledirection (dflt=older), leuser (flt), letitle (flt), lelimit (dflt=10, max=500/5000)
130
- # OUTPUT:: An array of hashes.
131
- def recent_changes (options=nil)
132
-
133
- # This will allow any given bot to get recent changes. Then act on it. But that's another method
134
- # TODO - Persistent timestamp storage
135
-
136
- ##@wikibotlogger.debug("RECENT CHANGES - Preparing request information...")
137
- post_me = {"list" => "recentchanges", 'rclimit' => '5000'}
138
- if options != nil
139
- options.each do |key, value|
140
- post_me[key] = value
141
- end
142
- end
143
-
144
- # Make the request
145
- ##@wikibotlogger.debug "RECENT CHANGES - Asking make_request to get recentchanges..."
146
- recentchanges_result = make_request('query' , post_me)
147
- ##@wikibotlogger.debug "RECENT CHANGES - We should have a result of type query now."
148
-
149
- # Figure out what to do now. Process, I'd think
150
- ##@wikibotlogger.debug "RECENT CHANGES - Processing result..."
151
-
152
- return recentchanges_result.fetch('recentchanges')
153
-
154
- end
155
-
156
- # List
157
- #
158
- # This will reutrn a list of the most recent log events. Useful for bots who want to validate log events, or even just a notify bot that checks for events and sends them off.
159
- #
160
- # INPUT:: A hash of API-allowed keys and values. Default is same as API default.
161
- #
162
- # OUTPUT:: An array of hashes containing log events.
163
- def log_events (options = nil)
164
-
165
- ##@wikibotlogger.debug "LOG EVENTS - Preparing request information..."
166
-
167
- # Make the request
168
- post_me = {"list" => "logevents"}
169
-
170
- if options != nil
171
- ##@wikibotlogger.debug("LOG EVENTS - Additional options added by requestor. Adding to post_me...")
172
- options.each_pair do |key, value|
173
- post_me[key] = value
174
- ##@wikibotlogger.debug "LOG EVENTS - Added #{post_me[key]}"
175
- end
176
- ##@wikibotlogger.debug("LOG EVENTS - No more additional options. Moving on...")
177
- end
178
-
179
- #Make the request!
180
- ##@wikibotlogger.debug "LOG EVENTS = Asking make_request to get logevents"
181
- logevents_result = make_request('query', post_me)
182
- ##@wikibotlogger.debug "LOG EVENTS - We should have a result of type logevents now."
183
-
184
- # Process results
185
- ##@wikibotlogger.debug "LOG EVENTS - Processing result..."
186
-
187
-
188
- return logevents_result.fetch('logevents')
69
+ @config['logged_in'] = TRUE
70
+ @config['lgusername'] = login_result.fetch('lgusername')
71
+ @config['lguserid'] = login_result.fetch('lguserid')
72
+ @config['lgtoken'] = login_result.fetch('lgtoken')
73
+ @config['_session'] = login_result.fetch('sessionid')
74
+ @config['cookieprefix'] = login_result.fetch('cookieprefix')
75
+ # puts "You are now logged in as: #{@config['lgusername']}"
76
+ return true
77
+ else
78
+ # puts "Error logging in. Error was: "
79
+ raise LoginError, "#{login_result['result']}: #{login_result['details']}"
189
80
 
81
+ end #if
190
82
  end
191
-
192
- # Query
193
- #
194
- # This will return any redirects from an article title so that you know where it ends. Useful to check for redirects, but mostly here for completeness of the framework.
195
- #
196
- # INPUT:: A string of pipe-delimited titles ('Apple|Baseball|Car port'), and an optional hash of API acceptable values.
197
- # OUTPUT:: An array of redirects.
198
- def redirects (title, options = nil)
199
-
200
- # Prepare the request
201
- ##@wikibotlogger.debug "REDIRECTS - Preparing request information..."
202
- post_me = {'titles' => title, 'redirects'=>'', 'prop' => 'info'}
203
-
204
- if options != nil
205
- ##@wikibotlogger.debug("REDIRECTS - Additional options added by requestor. Adding to post_me...")
206
- options.each_pair do |key, value|
207
- post_me[key] = value
208
- ##@wikibotlogger.debug "REDIRECTS - Added #{post_me[key]}"
209
- end
210
- ##@wikibotlogger.debug("REDIRECTS - No more additional options. Moving on...")
211
- end
212
-
213
- #Make the request
214
- ##@wikibotlogger.debug "REDIRECTS - Asking make_request find redirects..."
215
- redirects_result = make_request('query', post_me)
216
- ##@wikibotlogger.debug "REDIRECTS - We should have a result now..."
217
-
218
- ##@wikibotlogger.debug "REDIRECTS - Processing result..."
219
-
220
-
221
- return redirects_result.fetch('pages')
222
-
223
- end
224
-
225
-
226
- # Query
227
- #
228
- # This is a lot like REDIRECTS method, except its just a true/false to validate whether or not an article is a redirect. We could write the logic into the final bot app, but we're awesome and we include a quicky method.
229
- #
230
- # INPUT:: Title (please, just one!)
231
- # OUTPUT:: True/False
232
- def redirect? (title)
233
-
234
- # Prepare the request
235
- ##@wikibotlogger.debug "REDIRECT? - Preparing request information..."
236
- post_me = {'titles' => title, 'redirects'=>'', 'prop' => 'info'}
237
-
238
-
239
- #Make the request
240
- ##@wikibotlogger.debug "REDIRECT? - Asking make_request find redirects..."
241
- redirects_result = make_request('query', post_me)
242
- ##@wikibotlogger.debug "REDIRECT? - We should have a result now..."
243
-
244
- ##@wikibotlogger.debug "REDIRECT? - Processing result..."
245
-
246
-
247
- return redirects_result.has_key?('redirects')
248
-
249
- end
250
-
251
-
252
- # Query
253
- #
254
- # This little ditty returns a normalized version of the title passed to it. It is super useful because it will normalize an otherise poorly entered title, but most importantly it will let us know if an article exists or not by if it is able to normalize.
255
- #
256
- # INPUT:: Titles, either singular or pipe-delimited.
257
- # OUTPUT:: An array of normalized hashes.
258
- def normalize (title)
259
-
260
- # Prepare the request
261
- ##@wikibotlogger.debug "NORMALIZE - Preparing request information..."
262
- post_me = {'titles' => title}
263
-
264
- #Make the request
265
- ##@wikibotlogger.debug "NORMALIZE - Asking make_request to normalize titles..."
266
- normalized_result = make_request('query', post_me)
267
- ##@wikibotlogger.debug "NORMALIZE - We should have a result now..."
268
-
269
- ##@wikibotlogger.debug "NORMALIZE - Processing result..."
270
-
271
-
272
- return normalized_result.fetch('pages')
273
-
83
+
84
+ # Use Page to create a new page object that you can then manipulate. You could create a page on it's own, but if you do, be _sure_ to pass your bot along with the title, otherwise you won't get access to the super-fun make_request object that is pretty much required.
85
+ def Page(title='')
86
+ return Page.new(self, title)
274
87
  end
275
88
 
276
- # List
277
- #
278
89
  # This will return a list of all pages in a given namespace. It returns a list of pages in with the normalized title and page ID, suitable for usage elsewhere. Accepts all parameters from the API in Hash form.
279
- # Default is namespace => 0, which is just plain pages. Nothing 'special'.
280
- # Also note that if the username the Bot uses is not of type Bot in the Wiki, you will be limited to 50 articles. Also log in, or you get an error.
281
- #
282
- # INPUT:: A hash of API-allowed keys and values. Default is same as API default.
283
- # PARAMETERS:: apfrom (paging), apnamespace (dflt=0), apredirect (flt), aplimit (dflt=10, max=500/5000)
284
- # OUTPUT:: An array of hashes with information about the pages.
90
+ # Default is namespace => 0, which is just plain pages. Nothing 'special'.
285
91
  def all_pages (options = nil)
286
-
92
+ raise VersionTooLowError unless meets_version_requirement(1,9)
287
93
  # This will get all pages. Limits vary based on user rights of the Bot. Set to bot.
288
94
  ##@wikibotlogger.debug "ALL PAGES - Preparing request information..."
289
95
  post_me = {'list' => 'allpages', 'apnamespace' => '0', 'aplimit' => '5000'}
290
-
291
-
96
+
97
+
292
98
  if options != nil
293
- ##@wikibotlogger.debug("ALL PAGES - Additional options added by requestor. Adding to post_me...")
294
99
  options.each_pair do |key, value|
295
100
  post_me[key] = value
296
- ##@wikibotlogger.debug "ALL PAGES - Added #{post_me[key]}"
297
101
  end
298
- ##@wikibotlogger.debug("ALL PAGES - No more additional options. Moving on...")
299
102
  end
300
-
103
+
301
104
  #make the request
302
- ##@wikibotlogger.debug "ALL PAGES - Asking make_request to get all pages..."
303
105
  allpages_result = make_request('query', post_me)
304
- ##@wikibotlogger.debug "ALL PAGES - We should have a result now..."
305
-
306
- return allpages_result.fetch('allpages')
307
-
106
+ allpages_result.fetch('allpages')['p']
308
107
  end
309
108
 
310
- # List
311
- #
312
- # This method fetches any article that links to the article given in 'title'. Returned in alphabetical order.
313
- #
314
- # INPUT:: A normalized article title or titles (pipe delimited), and a hash of API-allowed keys and values. Default is same as API default.
315
- # PARAMETERS:: blfrom (paging), blnamespace (flt), blredirect (flt), bllimit (dflt=10, max=500/5000)
316
- # OUTPUT:: An array of hashes with backlinked articles.
317
- def backlinks (titles, options = nil)
318
-
319
- # This will get all pages. Limits vary based on user rights of the Bot. Set to bot.
320
- ##@wikibotlogger.debug "BACKLINKS - Preparing request information..."
321
- post_me = {'list' => 'backlinks', 'titles' => "#{title}" }
322
-
323
-
109
+ # This method will get the watchlist for the bot's MediaWiki username. This is really onlu useful if you want the bot to watch a specific list of pages, and would require the bot maintainer to login to the wiki as the bot to set the watchlist.
110
+ def watchlist (options=nil)
111
+ raise VersionTooLowError unless meets_version_requirement(1,10)
112
+ raise NotLoggedInError unless logged_in?
113
+
114
+ # Get the bot's watchlist
115
+ post_me = {'list'=>'watchlist'}
116
+
324
117
  if options != nil
325
- ##@wikibotlogger.debug("BACKLINKS - Additional options added by requestor. Adding to post_me...")
326
- options.each_pair do |key, value|
118
+ options.each do |key, value|
327
119
  post_me[key] = value
328
- ##@wikibotlogger.debug "BACKLINKS - Added #{post_me[key]}"
329
120
  end
330
- ##@wikibotlogger.debug("BACKLINKS - No more additional options. Moving on...")
331
121
  end
332
-
333
- #make the request
334
- ##@wikibotlogger.debug "BACKLINKS - Asking make_request to get backlinks..."
335
- backlinks_result = make_request('query', post_me)
336
- ##@wikibotlogger.debug "BACKLINKS - We should have a result now..."
337
- return backlinks_result.fetch('backlinks')
338
-
122
+
123
+ # Make the request
124
+ return make_request('query', post_me).fetch('watchlist').fetch('item')
339
125
  end
340
-
341
- # List
342
- #
343
- # This method pulls any page that includes the template requested. Please note - the template must be the full name, like "Template:Disputed" or "Template:Awesome". Just one, please.
344
- #
345
- # INPUT:: A normalized template title, and a hash of API-allowed keys and values. Default is same as API default.
346
- # PARAMETERS:: eifrom (paging), einamespace (flt), eiredirect (flt), eilimit (dflt=10, max=500/5000)
347
- # OUTPUT:: An array of hashes with articles using said template.
348
- def embedded_in (title, options = nil)
349
-
350
- # This will get all pages. Limits vary based on user rights of the Bot. Set to bot.
351
- ##@wikibotlogger.debug "EMBEDDED IN - Preparing request information..."
352
- post_me = {'list' => 'embeddedin', 'titles' => "#{title}" }
353
-
354
-
126
+
127
+ # This method will return Wiki-wide recent changes, almost as if looking at the Special page Recent Changes. But, in this format, a bot can handle it. Also we're using the API. And bots can't read.
128
+ def recent_changes (options=nil)
129
+ raise VersionTooLowError unless meets_version_requirement(1,10)
130
+
131
+ post_me = {"list" => "recentchanges", 'rclimit' => '5000'}
355
132
  if options != nil
356
- ##@wikibotlogger.debug("EMBEDDED IN - Additional options added by requestor. Adding to post_me...")
357
- options.each_pair do |key, value|
133
+ options.each do |key, value|
358
134
  post_me[key] = value
359
- ##@wikibotlogger.debug "EMBEDDED IN - Added #{post_me[key]}"
360
135
  end
361
- ##@wikibotlogger.debug("EMBEDDED IN - No more additional options. Moving on...")
362
136
  end
363
-
364
- #make the request
365
- ##@wikibotlogger.debug "EMBEDDED IN - Asking make_request to get backlinks..."
366
- embeddedin_result = make_request('query', post_me)
367
- ##@wikibotlogger.debug "EMBEDDED IN - We should have a result now..."
368
- return embeddedin_result.fetch('embeddedin')
369
-
370
- end
371
-
372
- # List
373
- #
374
- # A whole lot like EMBEDDED_IN, except this bad boy has the job of handling Image: instead of Template:. I guess bots may want to watch images. Its really for completeness. But, people do do things with pictures. Maybe it handles Media: as well, but no promisesses.
375
- #
376
- # INPUT:: A normalized image title, and a hash of API-allowed keys and values. Default is same as API default.
377
- # PARAMETERS:: iefrom (paging), ienamespace (flt), ielimit (dflt=10, max=500/5000)
378
- # OUTPUT:: An array of hashes with images page links
379
- # def image_embedded_in (title, options = nil) :nodoc:
380
- #
381
- # # This will get all pages. Limits vary based on user rights of the Bot. Set to bot.
382
- # ##@wikibotlogger.debug "IMAGE EMBEDDED IN - Preparing request information..."
383
- # post_me = {'list' => 'embeddedin', 'titles' => "#{title}" }
384
- #
385
- #
386
- # if options != nil
387
- # ##@wikibotlogger.debug("IMAGE EMBEDDED IN - Additional options added by requestor. Adding to post_me...")
388
- # options.each_pair do |key, value|
389
- # post_me[key] = value
390
- # ##@wikibotlogger.debug "IMAGE EMBEDDED IN - Added #{post_me[key]}"
391
- # end
392
- # ##@wikibotlogger.debug("IMAGE EMBEDDED IN - No more additional options. Moving on...")
393
- # end
394
- #
395
- # #make the request
396
- # ##@wikibotlogger.debug "IMAGE EMBEDDED IN - Asking make_request to get backlinks..."
397
- # imageembeddedin_result = make_request('query', post_me)
398
- # ##@wikibotlogger.debug "IMAGE EMBEDDED IN - We should have a result now..."
399
- # return imageembeddedin_result.fetch('embeddedin')
400
- #
401
- # end
402
-
403
- # Prop = Info
404
- #
405
- # I decided to split this up since I wanted to normalize the bot framework as much as possible, or in other words, make it as easy to use as possible. I think the sacrifice of more methods is worth having more English looking code. Its the Ruby way.
406
- # Info will return information about the page, from namespace to normalized title, last touched, etc.
407
- #
408
- # INPUT:: This method only takes titles, but will accept a pipe-delimited string. Ex: "Apple|Baseball|Horse|Main Page"
409
- #
410
- # OUTPUT:: An array of hashes.
411
- def info (titles)
412
-
413
- # Basic quqery info
414
- ##@wikibotlogger.debug "INFO - Preparing the query..."
415
- post_me = {"prop" => "info", 'titles' => titles}
416
-
137
+
417
138
  # Make the request
418
- ##@wikibotlogger.debug "INFO - Asking make_request to get info"
419
- info_result = make_request('query', post_me)
420
- ##@wikibotlogger.debug "INFO - We should have a result set now..."
421
-
422
- # Result processing
423
- ##@wikibotlogger.debug "INFO - Preparing results..."
424
-
425
- return info_result.fetch('pages')
426
-
139
+ return make_request('query', post_me).fetch('recentchanges').fetch('rc')
427
140
  end
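The option-merging pattern used by recent_changes is simple enough to sketch in isolation. The parameter values below are illustrative API keys, not a call against a live wiki:

```ruby
# Defaults mirror recent_changes' post_me hash; caller-supplied
# options win on conflict, exactly as the each loop does it.
post_me = { "list" => "recentchanges", "rclimit" => "5000" }
options = { "rclimit" => "10", "rcnamespace" => "0" }  # illustrative values

options.each { |key, value| post_me[key] = value } if options

# post_me is now ready to hand to make_request('query', post_me)
```

The same effect could be had with `post_me.merge!(options)`, but the explicit loop matches the library's style.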
428
-
429
- # Prop - Revisions
430
- #
431
- # This is the main way of accessing content and page specific information from the wiki. It has multiple uses as described in the API, Its also considerably more complex than the other methods. Enjoy it.
432
- # A final note - I'd really be familiar with this method in the API since I've spent a lot of time trying to figure it out myself.
433
- #
434
- # Please be sure to add the RVPROP key at least, otherwise you'll just get the basic information of revid, oldid and pageid. Boring.
435
- #
436
- # INPUT:: A string of article titles (pipe-delimited), and a hash of API-allowed keys and values. Default is same as API default.
437
- #
438
- # OUTPUT:: An array of hashes.
439
- def revisions(titles, options = nil)
440
-
441
- # Prepare the request! Notify the logger!
442
- ##@wikibotlogger.debug "REVISIONS - Preparing the requeset..."
443
- post_me = {'prop' => 'revisions', 'titles' => titles}
444
-
445
- # Handle any additional options
141
+
142
+ # This will return a list of the most recent log events. Useful for bots that want to validate log events, or even just a notifier bot that checks for events and sends them off.
143
+ def log_events (options = nil)
144
+ raise VersionTooLowError unless meets_version_requirement(1,11)
145
+ post_me = {"list" => "logevents"}
146
+
446
147
  if options != nil
447
- ##@wikibotlogger.debug("REVISIONS - Additional options added by requestor. Adding to post_me...")
448
148
  options.each_pair do |key, value|
449
149
  post_me[key] = value
450
- ##@wikibotlogger.debug "REVISIONS - Added #{post_me[key]}"
451
150
  end
452
- ##@wikibotlogger.debug("REVISIONS - No more additional options. Moving on...")
453
151
  end
454
-
455
- # Make the request. Becuase we care.
456
- ##@wikibotlogger.debug "REVISIONS - Asking make_request to get revision for articles(s) #{titles}"
457
- revisions_result = make_request('query', post_me )
458
- ##@wikibotlogger.debug "REVISIONS - We should have a result now..."
459
-
460
- #Process the results
461
- ##@wikibotlogger.debug "REVISIONS - Preparing results..."
462
-
463
- return revisions_result.fetch('pages')
464
-
152
+
153
+ #Make the request!
154
+ return make_request('query', post_me).fetch('logevents').fetch('item')
465
155
  end
466
-
467
- # Meta
468
- #
469
- # This is the only meta method. It will return site information. I chose not to allow it to specify, and it will only return all known properties.
470
- # api.php?action=query&meta=siteinfo&siprop=general|namespaces
471
- #
472
- # INPUT:: siprop is either 'general' or 'namespaces'.
473
- #
474
- # OUTPUT:: A hash of values about site information.
156
+
157
+ # This is the only meta method. It will return site information. I chose not to allow fine-grained property selection; it will only return all known properties for the chosen siprop.
475
158
  def site_info (siprop = 'general')
476
-
477
- ##@wikibotlogger.debug "SITE INFO - Preparing request information..."
478
-
159
+ raise VersionTooLowError unless meets_version_requirement(1,9)
479
160
  # Make the request
480
161
  post_me = {"meta" => "siteinfo" , "siprop" => siprop}
481
-
482
-
162
+
483
163
  #Make the request!
484
- ##@wikibotlogger.debug "SITE INFO - Asking make_request to get site info"
485
164
  siteinfo_result = make_request('query', post_me)
486
- ##@wikibotlogger.debug "SITE INFO - We should have a result of type site info now."
487
-
488
- # Process results
489
- ##@wikibotlogger.debug "SITE INFO - Processing result..."
490
-
165
+
491
166
  if siprop == 'general'
492
167
  return siteinfo_result.fetch('general')
493
168
  else
494
- return siteinfo_result.fetch('namespaces')
495
- end
496
-
497
- end
498
-
499
- # The point of this method is to iterate through an array of hashes, which most of the other methods return, and remove multiple instances of the same wiki page. We're more than often only concerned with the most recent revision, so we'll delete old ones.
500
- #
501
- # Hashes don't respond to the the Array.uniq method. So this is the same-ish
502
- # INPUT:: An array of hashes.
503
- # OUTPUT:: An array of hashes that are unique.
504
- def make_unique(array)
505
-
506
- test_array = array
507
- count = 0
508
-
509
- # First, let's make one big loop to go through each item in the array.
510
- array.reverse.each do |current_item|
511
-
512
- # Now, let's loop double time.
513
- test_array.each do |test_item|
514
-
515
- # Some comparisons...
516
- if (current_item.fetch('title') == test_item.fetch('title') && current_item.fetch('revid') > test_item.fetch('revid') )
517
-
518
- # At this point, current is the same article as test, and current is newer. Delete test
519
- array.delete(test_item)
520
- count += 1
521
-
522
- end
523
- end
169
+ return siteinfo_result.fetch('namespaces').fetch('ns')
524
170
  end
525
-
526
- puts "Deleted #{count} items."
527
-
528
- return array
529
- end
530
-
531
- # This is a little something I cooked up because it seems like a totally logical thing for bots to want to do. Basically, you feed it a page title - any you want (that's the point) - and it returns TRUE or FALSE if the page exists inside the wiki. Technically, it pulls an attribute "missing", and in its presense, reports TRUE since the page is fake.
532
- # That's something a bot would want to do, right?
533
- #
534
- # INPUT:: A title. Just one!
535
- #
536
- # OUTPUT:: TRUE/FALSE, depending on which is correct
537
- def page_exists? (title)
538
-
539
- # Prepare the request
540
- ##@wikibotlogger.debug "PAGE EXISTS? - Preparing request information..."
541
- post_me = {'titles' => title}
542
-
543
- #Make the request
544
- ##@wikibotlogger.debug "PAGE EXISTS? - Asking make_request to verify page existence..."
545
- page_exists_result = make_request('query', post_me)
546
- ##@wikibotlogger.debug "PAGE EXISTS? - We should have a result now..."
547
-
548
- ##@wikibotlogger.debug "PAGE EXISTS? - Processing result..."
549
-
550
-
551
- if page_exists_result.fetch('pages')[0].has_key?('missing')
552
- ##@wikibotlogger.debug "PAGE EXISTS? - The page #{title} does NOT exist. Sorry."
553
- return false
554
- else
555
- ##@wikibotlogger.debug "PAGE EXISTS? - The page #{title} DOES exist. You lucky, lucky bot."
556
- return true
557
- end
558
-
559
- end
560
-
561
- # This method will return the version of the MediaWiki server. This is done by parsing the version number from the generator attribute of the the site_info method. Useful? Yes - maybe yout bot is only compatible with MediaWiki 1.9.0 depending on what methods you use. I like it, anwyay.
562
- #
563
- # INPUT:: None
564
- #
565
- # OUTPUT:: Version number
566
- def version
567
- # Almost TOO simple...
568
- return site_info.fetch('generator').split(' ')[1]
569
171
  end
570
-
571
- # This method turns a pageid into a title. Why? Because I've written the rest of the methods title-centric, and I want to keep it that way. But, sometiumes you get a list of ids and not titles, and we have to do something about that.
572
- #
573
- # INPUT:: PageID - just one!
574
- # OUTPUT:: A title in string form.
575
- def pageid_to_title(id)
576
-
577
- # Prepare the request! Notify the logger!
578
- ##@wikibotlogger.debug "PAGEID TO TITLE - Preparing the requeset..."
579
- post_me = {'prop' => 'info', 'pageids' => id}
580
-
581
- # Make the request. Becuase we care.
582
- ##@wikibotlogger.debug "PAGEID TO TITLE - Asking make_request to get revision for pageid #{id}"
583
- id_result = make_request('query', post_me )
584
- ##@wikibotlogger.debug "PAGEID TO TITLE - We should have a result now..."
585
-
586
- #Process the results
587
- ##@wikibotlogger.debug "PAGEID TO TITLE - Preparing results..."
588
-
589
- return id_result.fetch('pages')[0].fetch('title')
590
172
 
173
+ # Get information about the current user
174
+ def user_info (uiprop = nil)
175
+ raise VersionTooLowError unless meets_version_requirement(1,11)
176
+ # Make the request
177
+ post_me = {"meta" => "userinfo" }
178
+ post_me['uiprop'] = uiprop unless uiprop.nil?
179
+
180
+ return make_request('query',post_me).fetch('userinfo')
591
181
  end
592
-
593
-
594
- private
595
-
596
- # Make Request is a method that actually handles making the request to the API. Since the API is somewhat standardized, this method is able to accept the action and a hash of variables, and it handles all the fun things MediaWiki likes to be weird over, like cookies and limits and actions. Its very solid, but I didn't want it public because it also does some post processing, and that's not very OO.
597
- def make_request (action, post_this)
598
-
599
- # Now, we do error handling. We're big like that.
600
- begin
601
-
602
- #Housekeeping. We need to add format and action to the request hash
603
- post_this['format'] = 'yaml'
604
- post_this['action'] = action
605
- if @config.fetch('logged_in')
606
- post_this['lgusername'] = @config.fetch('lgusername')
607
- post_this['lgtoken'] = @config.fetch('lgtoken')
608
- post_this['lguserid'] = @config.fetch('lguserid')
609
- end
610
-
611
- #change - preparing a POST string instead of hash.
612
- post_string = ''
613
- post_this.each_pair do |key, value|
614
- post_string << "#{key}=#{value}&"
615
- end
616
- ##@wikibotlogger.info("MAKE REQUEST - Post String is: #{post_string}")
617
- ##@wikibotlogger.debug "MAKE REQUEST - Sending request to: #{@config.fetch('uri')}"
618
-
619
- #Send the actual request
620
- ##@wikibotlogger.debug "MAKE REQUEST - Sending request..."
621
- #'User-agent' => 'RWikiBot/0.1'
622
- # This is working agian. There was a bug in Ruby 1.8.4
623
- resp = @http.post( @config.fetch('uri').path , post_string , {'User-agent'=>'RWikiBot/1.0', 'Cookie' => bake(@config.fetch('cookie')) } ) #
624
- #resp = @http.get( "/wiki/api.php?#{post_string}" , {'User-agent'=>'RWikiBot/1.0', 'Cookie' => bake(@config.fetch('cookie')) } ) #
625
- ##@wikibotlogger.debug "MAKE REQUEST - Response: "
626
- ##@wikibotlogger.debug resp.body
627
- ##@wikibotlogger.debug "MAKE REQUEST - End Response "
628
- result = YAML.load(resp.body)
629
-
630
- #Process response
631
- ##@wikibotlogger.debug "MAKE REQUEST - Response received. Processing..."
632
- return_result = result.fetch(action)
633
-
634
- # A small check to make sure we don't need to save cookie data. Because usually, we don't
635
- if @config.fetch('logged_in') == FALSE
636
- @config['cookie'] = resp.header['set-cookie'] if @config.fetch('cookie').nil?
637
- ##@wikibotlogger.debug "MAKE REQUEST - Received login cookie. Cookie is: #{resp.header['set-cookie']}"
638
- end
639
-
640
- #Return the response
641
- ##@wikibotlogger.debug "MAKE REQUEST - Response cleaned. Returning result."
642
- return return_result
643
-
644
- # The short form is that the API's error message causes an ArgumentError, so we gotta handle that. I notified MediaWiki
645
- rescue ArgumentError
646
- ##@wikibotlogger.error "MAKE REQUEST - The request has failed."
647
- ##@wikibotlogger.error "MAKE REQUEST - Error is: #{$!}"
648
-
649
- raise ArgumentError, "Problem processing returned YAML result"
650
- end
651
- end
652
-
653
- # For some odd reason, MediaWiki sends back three cookies the first time you establis a session. For some even more odd, Net::HTTP doesn't treat the set-cookie headers individually the way that normal borowsers do. So, I have this processing method to handle cookie logic - when to send, when to set, etc. And it bakes cookies. Get it? Bake.
654
- def bake(cookie)
655
-
656
- # Make some dough. Let CGI parse set-cookie, since there's multiple set-cookies and this is hard
657
- dough = CGI::Cookie::parse(cookie)
658
-
659
- # Take the dough and get my three elements out
660
-
661
- if @config.fetch('logged_in') == TRUE
662
- ##@wikibotlogger.debug "BAKE - Baking a cookie... Smells like wiki."
663
- # return "#{dough.fetch(@config.fetch('wikicookieprefix')+'UserID')}; #{dough.fetch(@config.fetch('wikicookieprefix')+'UserName')}; #{dough.fetch(@config.fetch('wikicookieprefix')+'Token')}"
664
- return "#{dough.fetch('wikidb_UserID')}; #{dough.fetch('wikidb_UserName')}; #{dough.fetch('wikidb_Token')}"
665
-
666
- else
667
- ##@wikibotlogger.debug "BAKE - No cookies to bake this time."
668
- return ""
182
+
183
+ end
184
+
185
+ # I'm never happy with good enough, and when it comes to my hashes, I like to see their members. So I changed Hash#to_s. Overriding methods makes me happy.
186
+ class Hash
187
+ def to_s
188
+ out = "{"
189
+ self.each do |key, value|
190
+ out += "#{key} => #{value},"
669
191
  end
670
-
671
- # No shit, this took me almost a week to figure it out
192
+ out = out.chop
193
+ out += "}"
672
194
  end
673
-
674
- end
195
+ end
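The same override can be written with map/join, which sidesteps the concatenate-then-chop dance and also behaves on an empty hash. This is a sketch in the spirit of the shipped code, not the shipped code itself:

```ruby
# Monkey-patched Hash#to_s: same "{key => value,...}" shape as the
# version above, built by joining pairs instead of chopping a trailing comma.
class Hash
  def to_s
    "{" + map { |key, value| "#{key} => #{value}" }.join(",") + "}"
  end
end

{ "a" => 1, "b" => 2 }.to_s  # "{a => 1,b => 2}"
```

Note the chop-based original turns an empty hash into `"}"`, while the join version yields `"{}"`.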
data/lib/utilities.rb ADDED
@@ -0,0 +1,127 @@
1
+ ## These methods aren't part of the API, so I'm moving them out.
2
+ ## But they're still useful. Very.
3
+
4
+ module RWBUtilities
5
+
6
+ private
7
+
8
+ # is_redirect?
9
+ #
10
+ # Tests to see if a given page title is redirected to another page. Very Ruby.
11
+ def is_redirect? (title)
12
+
13
+ post_me = {'titles' => title, 'redirects'=>'', 'prop' => 'info'}
14
+
15
+ result = make_request('query', post_me)
16
+
17
+ if (result['result'] == "Success") && (result.has_key?("redirects"))
18
+ return true
19
+ else
20
+ return false
21
+ end
22
+ end
23
+
24
+ # logged_in?
25
+ #
26
+ # A quick method of checking whether or not we're logged in, since I don't want @config exposed
27
+ #
28
+ # INPUT:: None
29
+ # OUTPUT:: boolean
30
+ def logged_in?
31
+ return @config['logged_in']
32
+ end
33
+
34
+ # The point of this method is to iterate through an array of hashes, which most of the other methods return, and remove multiple instances of the same wiki page. More often than not we only care about the most recent revision, so we'll delete old ones.
35
+ #
36
+ # Hashes don't respond to the Array.uniq method, so this is the same-ish.
37
+ def make_unique(array)
38
+
39
+ test_array = array
40
+ count = 0
41
+
42
+ # First, let's make one big loop to go through each item in the array.
43
+ array.reverse.each do |current_item|
44
+
45
+ # Now, let's loop double time.
46
+ test_array.each do |test_item|
47
+
48
+ # Some comparisons...
49
+ if (current_item.fetch('title') == test_item.fetch('title') && current_item.fetch('revid') > test_item.fetch('revid') )
50
+
51
+ # At this point, current is the same article as test, and current is newer. Delete test
52
+ array.delete(test_item)
53
+ count += 1
54
+
55
+ end
56
+ end
57
+ end
58
+
59
+ return array
60
+ end
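The double loop in make_unique can also be expressed with group_by and max_by. A sketch on hypothetical revision hashes shaped like the API's output:

```ruby
# Keep only the newest revision of each title, the same result
# make_unique produces, without mutating the input array.
revs = [
  { "title" => "Apple", "revid" => 3 },
  { "title" => "Apple", "revid" => 7 },
  { "title" => "Pear",  "revid" => 2 },
]

newest = revs.group_by { |r| r["title"] }
             .map { |_, group| group.max_by { |r| r["revid"] } }

# newest holds Apple at revid 7 and Pear at revid 2
```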
61
+
62
+ # This method will return the version of the MediaWiki server, parsed from the generator attribute of the site_info method. Useful? Yes - maybe your bot is only compatible with MediaWiki 1.9.0 depending on what methods you use. I like it, anyway.
63
+ def version
64
+ # Almost TOO simple...
65
+ return site_info.fetch('generator').split(' ')[1]
66
+ end
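The generator attribute is a plain string, so the parse is just a split. The value below is illustrative rather than a live call, and the comparison is a guess at what a meets_version_requirement(major, minor) guard presumably checks:

```ruby
# site_info returns something like "MediaWiki 1.13.2" in 'generator'
generator = "MediaWiki 1.13.2"        # illustrative, not a live call
version   = generator.split(' ')[1]   # the same parse the method uses

# Hypothetical version guard in the spirit of meets_version_requirement(1, 11)
major, minor = version.split('.').map(&:to_i)
meets_1_11 = (major > 1) || (major == 1 && minor >= 11)
```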
67
+
68
+ # Make Request is a method that actually handles making the request to the API. Since the API is somewhat standardized, this method is able to accept the action and a hash of variables, and it handles all the fun things MediaWiki likes to be weird over, like cookies and limits and actions. It's very solid, but I didn't want it public because it also does some post processing, and that's not very OO.
69
+ def make_request (action, post_this)
70
+
71
+ #Housekeeping. We need to add format and action to the request hash
72
+ post_this['format'] = 'xml'
73
+ post_this['action'] = action
74
+
75
+ if (@config['logged_in'])
76
+ cookies = "#{@config['cookieprefix']}UserName=#{@config['lgusername']}; #{@config['cookieprefix']}UserID=#{@config['lguserid']}; #{@config['cookieprefix']}Token=#{@config['lgtoken']}; #{@config['cookieprefix']}_session=#{@config['_session']}"
77
+ else
78
+ cookies = ""
79
+ end
80
+
81
+ headers = {
82
+ 'User-agent'=>'bot-RWikiBot/2.0-rc1',
83
+ 'Cookie' => cookies
84
+ }
85
+
86
+ r = Hash.new
87
+ until post_this.nil?
88
+ return_result, post_this = raw_call(headers, post_this)
89
+ r.deep_merge(return_result.fetch(action))
90
+ end
91
+
92
+ return r
93
+ end #make_request
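The deep_merge call is what lets paged result sets accumulate under one hash. The hand-rolled version below is a stand-in for illustration (the real code depends on the deep_merge gem); it shows why a plain Hash#merge wouldn't do:

```ruby
# Recursively fold `other` into `base`, preserving sibling keys at
# every nesting level, which is what merging query-continue pages needs.
def deep_merge!(base, other)
  other.each do |key, value|
    if base[key].is_a?(Hash) && value.is_a?(Hash)
      deep_merge!(base[key], value)
    else
      base[key] = value
    end
  end
  base
end

page1 = { "pages" => { "1" => { "title" => "Apple" } } }
page2 = { "pages" => { "2" => { "title" => "Pear" } } }
deep_merge!(page1, page2)
# page1["pages"] now holds both "1" and "2"; a shallow Hash#merge
# would have replaced the whole "pages" sub-hash with page2's.
```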
94
+
95
+ # Raw Call handles the actual, physical conversation with the wiki. It is broken out to handle query-continues where applicable. So, all the methods call make_request, and it calls raw_call until raw_call returns a nil post_this.
96
+ def raw_call(headers, post_this)
97
+ request = Net::HTTP::Post.new(@config.fetch('uri').path, headers)
98
+ request.set_form_data(post_this)
99
+ response = Net::HTTP.new(@config.fetch('uri').host, @config.fetch('uri').port).start {
100
+ |http| http.request(request)
101
+ }
102
+
103
+ # Extra cookie handling, because editing is based on session IDs and the wiki generates a new one on each request until you start sending it back. I doubt this will change.
104
+ if (response.header['set-cookie'] != nil)
105
+ @config['_session'] = response.header['set-cookie'].split("=")[1]
106
+ end
107
+
108
+ return_result = XmlSimple.xml_in(response.body, { 'ForceArray' => false })
109
+ # puts "==>>Result is: #{return_result}"
110
+
111
+ if return_result.has_key?('error')
112
+ raise RWikiBotError, "#{return_result.fetch('error').fetch('code').capitalize}: #{return_result.fetch('error').fetch('info')}"
113
+ end
114
+
115
+ if return_result.has_key?('query-continue')
116
+ return_result.fetch('query-continue').each do |key, value|
117
+ return_result.fetch('query-continue').fetch(key).each do |x,y|
118
+ post_this[x] = y
119
+ end
120
+ end
121
+ else
122
+ post_this = nil
123
+ end
124
+
125
+ return return_result, post_this
126
+ end #raw_call
127
+ end
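The make_request/raw_call contract (loop until the continue parameters come back nil) can be simulated without a network. The paged data here is fabricated purely for illustration:

```ruby
# Stand-in for a paged API: each call returns one page of items plus
# the parameters for the next call, or nil when the data is exhausted,
# mimicking MediaWiki's query-continue block.
PAGES = [
  { "items" => [1, 2], "continue" => { "offset" => 2 } },
  { "items" => [3, 4], "continue" => { "offset" => 4 } },
  { "items" => [5] },
].freeze

def fake_raw_call(post_this)
  page = PAGES[post_this["offset"] / 2]
  next_post = page.key?("continue") ? post_this.merge(page["continue"]) : nil
  [page["items"], next_post]
end

results = []
post_this = { "offset" => 0 }
until post_this.nil?
  items, post_this = fake_raw_call(post_this)
  results.concat(items)
end
# results == [1, 2, 3, 4, 5]
```

This is the same shape as make_request's `until post_this.nil?` loop, with the HTTP round trip swapped for an in-memory lookup.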
metadata CHANGED
@@ -1,52 +1,59 @@
1
1
  --- !ruby/object:Gem::Specification
2
- rubygems_version: 0.9.2
3
- specification_version: 1
4
2
  name: rwikibot
5
3
  version: !ruby/object:Gem::Version
6
- version: 1.0.10
7
- date: 2007-05-29 00:00:00 -05:00
8
- summary: A library for creating MediaWiki bots.
9
- require_paths:
10
- - lib
11
- email: eddieroger @nospam@ gmail.com
12
- homepage:
13
- rubyforge_project:
14
- description:
15
- autorequire: rwikibot
16
- default_executable:
17
- bindir: bin
18
- has_rdoc: "true"
19
- required_ruby_version: !ruby/object:Gem::Version::Requirement
20
- requirements:
21
- - - ">"
22
- - !ruby/object:Gem::Version
23
- version: 0.0.0
24
- version:
4
+ version: 2.0.0
25
5
  platform: ruby
26
- signing_key:
27
- cert_chain:
28
- post_install_message:
29
6
  authors:
30
7
  - Eddie Roger
31
- files:
32
- - lib/rwikibot.rb
33
- - test/rwikibot.log
34
- - sample-config.yaml
35
- - README
36
- - CHANGELOG
37
- test_files: []
8
+ autorequire:
9
+ bindir: bin
10
+ cert_chain: []
38
11
 
39
- rdoc_options:
40
- - --inline-source
41
- extra_rdoc_files:
42
- - README
43
- - CHANGELOG
44
- - sample-config.yaml
12
+ date: 2008-12-29 00:00:00 -06:00
13
+ default_executable:
14
+ dependencies: []
15
+
16
+ description:
17
+ email: eddieroger @nospam@ gmail.com
45
18
  executables: []
46
19
 
47
20
  extensions: []
48
21
 
22
+ extra_rdoc_files:
23
+ - README
24
+ - CHANGELOG
25
+ files:
26
+ - lib/errors.rb
27
+ - lib/pages.rb
28
+ - lib/rwikibot.rb
29
+ - lib/utilities.rb
30
+ - README
31
+ - CHANGELOG
32
+ has_rdoc: "true"
33
+ homepage: http://www.rwikibot.net
34
+ post_install_message:
35
+ rdoc_options:
36
+ - --inline-source --force-update
37
+ require_paths:
38
+ - lib
39
+ required_ruby_version: !ruby/object:Gem::Requirement
40
+ requirements:
41
+ - - ">="
42
+ - !ruby/object:Gem::Version
43
+ version: "0"
44
+ version:
45
+ required_rubygems_version: !ruby/object:Gem::Requirement
46
+ requirements:
47
+ - - ">="
48
+ - !ruby/object:Gem::Version
49
+ version: "0"
50
+ version:
49
51
  requirements: []
50
52
 
51
- dependencies: []
53
+ rubyforge_project: RWikiBot
54
+ rubygems_version: 1.3.1
55
+ signing_key:
56
+ specification_version: 2
57
+ summary: A library for creating MediaWiki bots.
58
+ test_files: []
52
59
 
data/sample-config.yaml DELETED
@@ -1,6 +0,0 @@
1
- default:
2
- username: "RWikiBot"
3
- password: "rwikibot"
4
- domain: 'en'
5
- api_path: "http://localhost:8888/wiki/api.php"
6
- wikicookieprefix: 'wikidb_wiki_' #this is the MediaWiki default
data/test/rwikibot.log DELETED
@@ -1,3 +0,0 @@
1
- # Logfile created on Wed Mar 07 18:06:37 -0600 2007 by logger.rb/1.5.2.9
2
- I, [2007-03-07T18:06:37.367701 #19288] INFO -- : New RWikiBot created. My name is Unnamed Bot
3
- D, [2007-03-07T18:06:37.367942 #19288] DEBUG -- : Loading configuration...