protk 1.2.0 → 1.2.1

data/README.md CHANGED
@@ -14,70 +14,103 @@ Protk is a wrapper for various proteomics tools. It aims to present a consistent
 
  Protk depends on ruby 1.9. The recommended way to install ruby and manage ruby gems is with rvm. Install rvm using this command.

- curl -L https://get.rvm.io | bash -s stable
+ ```sh
+ curl -L https://get.rvm.io | bash -s stable
+ ```

  Next install ruby and protk's dependencies

  On OSX

- rvm install 1.9.3 --with-gcc=clang
- rvm use 1.9.3
- gem install protk
- protk_setup.rb package_manager
- protk_setup.rb system_packages
- protk_setup.rb all
-
+ ```sh
+ rvm install 1.9.3 --with-gcc=clang
+ rvm use 1.9.3
+ gem install protk
+ protk_setup.rb package_manager
+ protk_setup.rb system_packages
+ protk_setup.rb all
+ ```
  On Linux
-
- rvm install 1.9.3
- rvm use 1.9.3
- gem install protk
- sudo protk_setup.rb system_packages
- protk_setup all

+ ```sh
+ rvm install 1.9.3
+ rvm use 1.9.3
+ gem install protk
+ sudo ~/.rvm/bin/rvm 1.9.3 do protk_setup.rb system_packages
+ protk_setup all
+ ```
+
+ Instead of using protk_setup.rb all it might be preferable to only install some of the protk tool dependencies. 'all' is just an alias for the following full target list, any of which can be omitted, with the consequence that tools depending on that component will not be available.
+
+ ```sh
+ protk_setup.rb tpp omssa blast msgfplus pwiz openms galaxyenv
+ ```
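
For example, to set up only a couple of the listed components and skip the rest, a reduced target list might look like the following (any subset of the targets above works; this particular selection is just an illustration):

```sh
protk_setup.rb tpp omssa galaxyenv
```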

  ## Sequence databases

  After running the setup.sh script you should run manage_db.rb to install specific sequence databases for use by the search engines. Protk comes with several predefined database configurations. For example, to install a database consisting of human entries from Swissprot plus known contaminants use the following commands;

- manage_db.rb add crap
- manage_db.rb add sphuman
+ ```sh
+ manage_db.rb add --predefined crap
+ manage_db.rb add --predefined sphuman
+ manage_db.rb update crap
+ manage_db.rb update sphuman
+ ```
+
+ You should now be able to run database searches, specifying this database by using the -d sphuman flag. Every month or so Swissprot will release a new database version. You can keep your database up to date using the manage_db.rb update command. This will update the database only if any of its source files (or ftp release notes) have changed. The manage_db.rb tool also allows completely custom databases to be configured. Setup requires adding quite a few command-line options, but once set up, databases can easily be updated without further config. The example below shows the command-line arguments required to manually configure the sphuman database.

- You should now be able to run database searches, specifying this database by using the -d sphuman flag. Every month or so swissprot will release a new database version. You can keep your database up to date using;
+ ```sh
+ manage_db.rb add --ftp-source 'ftp://ftp.uniprot.org/pub/databases/uniprot/current_release/knowledgebase/complete/uniprot_sprot.fasta.gz ftp://ftp.uniprot.org/pub/databases/uniprot/current_release/knowledgebase/complete/reldate.txt' --include-filters '/OS=Homo\ssapiens/' --id-regex 'sp\|.*\|(.*?)\s' --add-decoys --make-blast-index --archive-old sphuman
+ ```
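
Once a database such as sphuman has been added, the -d flag mentioned above selects it by name in the protk search wrappers. As an illustration only (the input file name is a placeholder), a search against it with the mascot_search.rb tool that appears later in this diff might look like:

```sh
# Illustrative only: selects the sphuman database configured above by name
mascot_search.rb -d sphuman file1.mgf
```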

- manage_db.rb update sphuman
+ ## Annotation databases

- This will update the database only if any of its source files (or ftp release notes) have changed. The manage_db.rb tool also allows completely custom databases to be configured. Setup requires adding quite a few command-line options but once setup databases can easily be updated without further config. The example below shows the commandline arguments required to manually configure the sphuman database.
+ The manage_db.rb script also deals with annotation databases. At this stage the only annotation database it supports is the Swissprot Uniprot database, which contains detailed information on protein entries. The following commands download and index this database.

- manage_db.rb add --ftp-source 'ftp://ftp.uniprot.org/pub/databases/uniprot/current_release/knowledgebase/complete/uniprot_sprot.fasta.gz ftp://ftp.uniprot.org/pub/databases/uniprot/current_release/knowledgebase/complete/reldate.txt' --include-filters '/OS=Homo\ssapiens/' --id-regex 'sp\|.*\|(.*?)\s' --add-decoys --make-blast-index --archive-old sphuman
+ ```sh
+ manage_db.rb add --predefined swissprot_annotation
+ manage_db.rb update swissprot_annotation
+ ```

+ Once this step is complete you should be able to use annotation tools such as the uniprot_annotation.rb tool.
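
The exact options of uniprot_annotation.rb are not shown in this diff, so the following is only a hypothetical sketch (the -d and -o flags and the file names are assumptions; check `uniprot_annotation.rb --help` for the real interface):

```sh
# Hypothetical invocation: flag names and file names are assumed, not documented here
uniprot_annotation.rb -d swissprot_annotation -o annotated.protXML identified_proteins.protXML
```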

  ## Galaxy integration

  Although all the protk tools can be run directly from the command-line a nicer way to run them (and visualise outputs) is to use the galaxy web application.

+ The preferred method for use of protk with galaxy is to install via the [galaxy toolshed](http://toolshed.g2.bx.psu.edu). You can find protk tools under the Proteomics heading. You will still need to install and set up galaxy itself (step 1 below), and you should familiarise yourself with the admin functions of galaxy, including tool installation. Lots of instructions on this are available on the [toolshed wiki](http://wiki.galaxyproject.org/Tool%20Shed). One final note: although protk-based tools are configured to automatically install their dependencies via the toolshed, you should take careful note of the system packages that must be installed before proceeding. In addition, it is a good idea to install low-level tools (e.g. the galaxy_protk repository) before installing dependent tools (e.g. TPP Prophets).
+
  1. Check out and install the latest stable galaxy [see the official galaxy wiki for more detailed setup instructions](http://wiki.g2.bx.psu.edu/Admin/Get%20Galaxy,"galaxy wiki")

- hg clone https://bitbucket.org/galaxy/galaxy-dist
- cd galaxy-dist
- sh run.sh
+ ```sh
+ hg clone https://bitbucket.org/galaxy/galaxy-dist
+ cd galaxy-dist
+ sh run.sh
+ ```
+
+
+ 2. Make the protk tools available to galaxy. (Legacy: this step is not needed when installing via the toolshed.)

- 2. Make the protk tools available to galaxy.
   - Create a directory for galaxy tool dependencies. It's best if this directory is outside the galaxy-dist directory. I usually create a directory called `tool_depends` alongside `galaxy-dist`.
   - Open the file `universe_wsgi.ini` in the `galaxy-dist` directory and set the configuration option `tool_dependency_dir` to point to the directory you just created
   - Create a protkgem directory inside `<tool_dependency_dir>`.

- cd <tool_dependency_dir>
- mkdir protkgem
- cd protkgem
- mkdir rvm193
- ln -s rvm193 default
- cd default
- ln -s ~/.protk/galaxy/env.sh env.sh
-
- 3. Install any of the Proteomics tools that depend on protk from the galaxy toolshed
-
- 4. After installing the protk wrapper tools from the toolshed it will be necessary to tell those tools about databases you have installed. Use the manage_db.rb tool to do this.
-
+ ```sh
+ cd <tool_dependency_dir>
+ mkdir protkgem
+ cd protkgem
+ mkdir rvm193
+ ln -s rvm193 default
+ cd default
+ ln -s ~/.protk/galaxy/env.sh env.sh
+ ```
+
+ 3. After installing the protk wrapper tools from the toolshed it will be necessary to tell those tools about databases you have installed. Use the manage_db.rb tool to do this. In particular the manage_db.rb tool has a -G option to automatically tell galaxy about the location of its databases. To use this, though, you will need to tell protk about the location of your galaxy install. To do this:
+ - Create a file named `config.yml` inside your .protk directory
+ - Add the line `galaxy_root: /home/galaxy/galaxy-dist` to config.yml, substituting the actual path to the root directory of your galaxy installation
+
+ ```sh
+ echo 'galaxy_root: /home/galaxy/galaxy-dist' > ~/.protk/config.yml
+ ```
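
As a sketch of the -G option mentioned in step 3 (whether it is combined with update exactly like this is an assumption, not something shown in this diff; check `manage_db.rb --help`):

```sh
# Assumed usage: after updating, -G pushes the database location into the galaxy config
manage_db.rb update -G sphuman
```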
@@ -82,12 +82,22 @@ protxml_parser=XML::Parser.file(tool.protxml)
  protxml_doc=protxml_parser.parse
  proteins = protxml_doc.find('.//protxml:protein','protxml:http://regis-web.systemsbiology.net/protXML')

- p "Indexing sixframe translations"
- db_filename = Pathname.new(tool.sixframe).realpath.to_s

- if tool.skip_fasta_indexing
+ db_filename = nil
+ case
+ when Pathname.new(tool.sixframe).exist? # It's an explicitly named db
+ db_filename = Pathname.new(tool.sixframe).realpath.to_s
+ else
+ db_filename=Constants.new.current_database_for_name(tool.sixframe)
+ end
+
+ db_indexfilename = "#{db_filename}.pin"
+
+ if File.exist?(db_indexfilename)
+ p "Using existing indexed translations"
  orf_lookup = FastaDB.new(db_filename)
  else
+ p "Indexing sixframe translations"
  orf_lookup = FastaDB.create(db_filename,db_filename,'prot')
  end

@@ -114,6 +124,7 @@ for prot in proteins
  p "Looking up #{protein_name}"
  orf = orf_lookup.get_by_id protein_name
  if ( orf == nil)
+ p "Failed lookup for #{protein_name}"
  raise KeyError
  end

@@ -121,11 +132,12 @@ for prot in proteins
  position = orf.identifiers.description.split('|').collect { |pos| pos.to_i }

  if ( position.length != 2 )
+ p "Badly formatted entry #{orf}"
  raise EncodingError
  end
  orf_name = orf.entry_id.scan(/lcl\|(.*)/)[0][0]
  frame=orf_name.scan(/frame_(\d)/)[0][0]
- scaffold_name = orf_name.scan(/(scaffold_\d+)/)[0][0]
+ scaffold_name = orf_name.scan(/(scaffold_?\d+)_/)[0][0]

  strand = (frame.to_i > 3) ? '-' : '+'
  # strand = +1
@@ -177,7 +189,6 @@ for prot in proteins

  rescue KeyError,EncodingError
  skipped+=0
- p "Lookup failed for #{protein_name}"
  end

  # p orf_name
@@ -13,6 +13,130 @@ require 'protk/command_runner'
  require 'protk/search_tool'
  require 'rest_client'

+ def login(mascot_cgi,username,password)
+
+ authdict={}
+ authdict[:username]=username
+ authdict[:password]=password
+ authdict[:action]="login"
+ authdict[:savecookie]="1"
+
+ p "Logging in to #{mascot_cgi}/login.pl"
+ p authdict
+ response = RestClient.post "#{mascot_cgi}/login.pl", authdict
+
+ cookie = response.cookies
+ cookie
+ end
+
+ def export_results(mascot_cgi,session_cookie,results_path,format)
+ export_dict={}
+ export_dict[:generate_file]=1
+ export_dict[:pep_query]=1
+ export_dict[:file]=results_path #"../data/20130423/F208623.dat"
+ export_dict[:export_format]=format
+ export_dict[:protein_master]=1
+ export_dict[:prot_hit_num]=1
+ export_dict[:show_unassigned]=1
+ export_dict[:query_title]=1
+ export_dict[:pep_expect]=1
+ export_dict[:pep_rank]=1
+ export_dict[:search_master]=1
+ export_dict[:pep_var_mod]=1
+ export_dict[:pep_isbold]=1
+ export_dict[:report]=0
+ export_dict[:show_queries]=1
+ export_dict[:pep_exp_mz]=1
+ export_dict[:pep_exp_z]=1
+ export_dict[:query_master]=0
+ export_dict[:pep_scan_title]=1
+ export_dict[:query_qualifiers]=1
+ export_dict[:_showsubsets]=1
+ export_dict[:_sigthreshold]=0.99
+ export_dict[:pep_isunique]=1
+ export_dict[:show_header]=1
+ export_dict[:pep_ident]=1
+ export_dict[:query_peaks]=1
+ export_dict[:pep_seq]=1
+ export_dict[:query_raw]=1
+ export_dict[:pep_score]=1
+ export_dict[:show_same_sets]=1
+ export_dict[:do_export]=1
+ export_dict[:peptide_master]=1
+ export_dict[:prot_score]=1
+ export_dict[:prot_acc]=1
+ export_dict[:show_params]=1
+ export_dict[:pep_homol]=1
+ export_dict[:show_mods]=1
+
+ # RestClient.add_before_execution_proc do |req, params|
+ # require 'debugger'; debugger
+ # p req
+ # p params
+ # end
+
+
+ export_url="#{mascot_cgi}/export_dat_2.pl"
+
+ begin
+ RestClient.post(export_url , export_dict , {:cookies=>session_cookie}){ |response, request, result, &block|
+ # require 'debugger'; debugger
+ if ( response.code==303)
+ sleep(5)
+ end
+ response.return!(request, result, &block)
+ }
+ # response = RestClient.post export_url , export_dict , {:cookies=>session_cookie}
+ rescue
+ p "Ignoring exception"
+ # require 'debugger'; debugger
+ end
+
+ begin
+ p response.to_s[0,1000]
+ rescue
+ end
+
+ fout = File.new("results.xml", "w+")
+
+ fout.write response
+ end
+
+ def search_params_dictionary(search_tool,input_file)
+ var_mods = search_tool.var_mods.split(",").collect { |mod| mod.lstrip.rstrip }.reject {|e| e.empty? }.join(",")
+ fix_mods = search_tool.fix_mods.split(",").collect { |mod| mod.lstrip.rstrip }.reject { |e| e.empty? }.join(",")
+
+ # None is given by a an empty galaxy multi-select list and we need to turn it into an empty string
+ #
+ var_mods="" if var_mods=="None"
+ fix_mods="" if fix_mods=="None"
+
+ postdict={}
+ postdict[:SEARCH]="MIS"
+ postdict[:CHARGE]=search_tool.allowed_charges
+ postdict[:CLE]=search_tool.enzyme
+ postdict[:PFA]=search_tool.missed_cleavages
+ postdict[:COM]="Protk"
+ postdict[:DB]=search_tool.database
+ postdict[:INSTRUMENT]=search_tool.instrument
+ postdict[:IT_MODS]=var_mods
+ postdict[:ITOL]=search_tool.fragment_tol
+ postdict[:ITOLU]=search_tool.fragment_tolu
+ postdict[:MASS]=search_tool.precursor_search_type
+ postdict[:MODS]=fix_mods
+ postdict[:REPORT]="AUTO"
+ postdict[:TAXONOMY]="All entries"
+ postdict[:TOL]=search_tool.precursor_tol
+ postdict[:TOLU]=search_tool.precursor_tolu
+ postdict[:USEREMAIL]=search_tool.email
+ postdict[:USERNAME]=search_tool.username
+ postdict[:FILE]=File.new(input_file)
+ postdict[:FORMVER]='1.01'
+ postdict[:INTERMEDIATE]=''
+
+ postdict
+ end
+
  # Environment with global constants
  #
  genv=Constants.new
@@ -20,138 +144,81 @@ genv=Constants.new
  # Setup specific command-line options for this tool. Other options are inherited from SearchTool
  #
  search_tool=SearchTool.new({:msms_search=>true,:background=>false,:database=>true,:explicit_output=>true,:over_write=>true,:msms_search_detailed_options=>true})
+
  search_tool.jobid_prefix="o"

  search_tool.option_parser.banner = "Run a Mascot msms search on a set of mgf input files.\n\nUsage: mascot_search.rb [options] file1.mgf file2.mgf ..."
  search_tool.options.output_suffix="_mascot"

- search_tool.options.mascot_server="#{genv.default_mascot_server}/mascot/cgi/"
- #search_tool.option_parser.on( '-P', '--http-proxy url', 'The url to a proxy server' ) do |url|
- # search_tool.options.mascot_server=url
- #end
-
- #search_tool.options.http_proxy="http://squid.latrobe.edu.au:8080"
- #search_tool.option_parser.on( '-P', '--http-proxy url', 'The url to a proxy server' ) do |url|
- # search_tool.options.http_proxy=url
- #end
+ search_tool.options.mascot_server="#{genv.default_mascot_server}/mascot/cgi"

- search_tool.option_parser.parse!
+ search_tool.options.httpproxy=""
+ search_tool.option_parser.on( '--proxy url', 'The url to a proxy server' ) do |urll|
+ search_tool.options.httpproxy=urll
+ end

+ search_tool.options.mascot_password=""
+ search_tool.option_parser.on( '--password psswd', 'Password to use when Mascot security is enabled' ) do |psswd|
+ search_tool.options.mascot_password=psswd
+ end

+ search_tool.options.use_security=FALSE
+ search_tool.option_parser.on( '--use-security', 'When Mascot security is enabled this is required' ) do
+ search_tool.options.use_security=TRUE
+ end

+ search_tool.option_parser.parse!


- # Set search engine specific parameters on the SearchTool object
- #
  fragment_tol = search_tool.fragment_tol
  precursor_tol = search_tool.precursor_tol



- mascot_cgi=search_tool.mascot_server
+ mascot_cgi=search_tool.mascot_server.chomp('/')

- unless ( mascot_cgi =~ /^http:\/\//)
+ unless ( mascot_cgi =~ /^http[s]?:\/\//)
  mascot_cgi = "http://#{mascot_cgi}"
  end

- #
- RestClient.proxy=search_tool.http_proxy
-
- genv.log("Var mods #{search_tool.var_mods} and fixed #{search_tool.fix_mods}",:info)
-
- var_mods = search_tool.var_mods.split(",").collect { |mod| mod.lstrip.rstrip }.reject {|e| e.empty? }.join(",")
- fix_mods = search_tool.fix_mods.split(",").collect { |mod| mod.lstrip.rstrip }.reject { |e| e.empty? }.join(",")
-
- # None is given by a an empty galaxy multi-select list and we need to turn it into an empty string
- #
- var_mods="" if var_mods=="None"
- fix_mods="" if fix_mods=="None"
-
- postdict={}
-
- # CHARGE
- #
- postdict[:CHARGE]=search_tool.allowed_charges
-
- # CLE
- #
- postdict[:CLE]=search_tool.enzyme
-
- # PFA
- #
- postdict[:PFA]=search_tool.missed_cleavages
-
- # COM (Search title)
- #
- postdict[:COM]="Protk"
-
- # DB (Database)
- #
- postdict[:DB]=search_tool.database
+ mascot_xcgi = "#{mascot_cgi.chomp('cgi')}x-cgi"

- # INSTRUMENT
  #
- postdict[:INSTRUMENT]=search_tool.instrument
+ RestClient.proxy=search_tool.httpproxy

- # IT_MODS (Variable Modifications)
- #
- postdict[:IT_MODS]=var_mods
-
- # ITOL (Fragment ion tolerance)
- #
- postdict[:ITOL]=search_tool.fragment_tol
-
- # ITOLU (Fragment ion tolerance units)
- #
- postdict[:ITOLU]=search_tool.fragment_tolu
-
- # MASS (Monoisotopic and Average)
- #
- postdict[:MASS]=search_tool.precursor_search_type
+ genv.log("Var mods #{search_tool.var_mods} and fixed #{search_tool.fix_mods}",:info)

- # MODS (Fixed modifications)
- #
- postdict[:MODS]=fix_mods

- # REPORT (What to include in the search report. For command-line searches this is pretty much irrelevant because we retrieve the entire results file anyway)
- #
- postdict[:REPORT]="AUTO"

- # TAXONOMY (Blank because we don't allow taxonomy)
- #
- postdict[:TAXONOMY]="All entries"
+ cookie=""
+ openurlcookie=""

- # TOL (Precursor ion tolerance (Unit dependent))
- #
- postdict[:TOL]=search_tool.precursor_tol
+ if ( search_tool.use_security)
+ # Login
+ #
+ genv.log("Logging in",:info)

- # TOLU (Tolerance Units)
- #
- postdict[:TOLU]=search_tool.precursor_tolu
+ # authdict={}
+ # authdict[:username]=search_tool.username
+ # authdict[:password]=search_tool.mascot_password
+ # authdict[:action]="login"
+ # authdict[:savecookie]="1"

- # Email
- #
- postdict[:USEREMAIL]=search_tool.email
+ # response = RestClient.post "#{mascot_cgi}/login.pl", authdict

- # Username
- #
- postdict[:USERNAME]=search_tool.username
+ cookie = login(mascot_cgi,search_tool.username,search_tool.mascot_password)

+ #cookie = response.cookies
+ openurlcookie = "MASCOT_SESSION=#{cookie['MASCOT_SESSION']}; MASCOT_USERID=#{cookie['MASCOT_USERID']}; MASCOT_USERNAME=#{cookie['MASCOT_USERNAME']}"
+ end

- # FILE
- #
- postdict[:FILE]=File.new(ARGV[0])
+ postdict = search_params_dictionary search_tool, ARGV[0]

- postdict[:FORMVER]='1.01'
- postdict[:INTERMEDIATE]=''

  genv.log("Sending #{postdict}",:info)

- postdict.each do |kv|
- p "#{kv}|\n"
- end

- search_response=RestClient.post "#{mascot_cgi}/nph-mascot.exe?1", postdict
+ search_response=RestClient.post "#{mascot_cgi}/nph-mascot.exe?1", postdict, {:cookies=>cookie}

  genv.log("Mascot search response was #{search_response}",:info)
 
@@ -168,8 +235,14 @@ else
  results_date=results[1]
  results_file=results[2]

+ # results=/master_results_?2?\.pl\?file=(\.*\/data\/.*\/.+\.dat)/.match(search_response)
+ # results_file = results[1]
+ # export_results mascot_cgi,cookie,results_file,"XML"
+

- get_url= "#{mascot_cgi}/../x-cgi/ms-status.exe?Autorefresh=false&Show=RESULTFILE&DateDir=#{results_date}&ResJob=#{results_file}"
+ get_url= "#{mascot_xcgi}/ms-status.exe?Autorefresh=false&Show=RESULTFILE&DateDir=#{results_date}&ResJob=#{results_file}"
+
+ genv.log("Getting results file at #{get_url}",:info)

  if ( search_tool.explicit_output!=nil)
  output_path=search_tool.explicit_output
@@ -181,7 +254,11 @@ else
  #
  require 'open-uri'
  open("#{output_path}", 'wb') do |file|
- file << open("#{get_url}").read
+ file << open("#{get_url}","Cookie"=>openurlcookie).read
  end
+
+
+ #open("F1.dat", 'wb') do |file| file << open("#{get_url}","Cookie" => cookie).read end
+
  end
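
Taken together, the mascot_search.rb changes above add --proxy, --password and --use-security options on top of the existing SearchTool flags. A sketch of how they might be combined (the server, credentials, database and file names are placeholders, and -d plus the trailing mgf argument follow the usage banner shown earlier):

```sh
# Placeholder values throughout; --proxy, --password and --use-security are the options added in this release
mascot_search.rb --use-security --password 'secret' \
  --proxy http://proxy.example.com:8080 \
  -d sphuman file1.mgf
```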