taverna-t2flow 0.1.1

data/ChangeLog.rdoc ADDED
@@ -0,0 +1,8 @@
1
+ = Version 0.1.1
2
+ Released:: Wednesday, September 16, 2009
3
+
4
+ == Added Functionality
5
+ - Retrieval of Taverna local workers.
6
+
7
+ === New instance method in T2Flow::Model
8
+ local_workers
data/LICENCE ADDED
@@ -0,0 +1,165 @@
1
+ GNU LESSER GENERAL PUBLIC LICENSE
2
+ Version 3, 29 June 2007
3
+
4
+ Copyright (C) 2007 Free Software Foundation, Inc. <http://fsf.org/>
5
+ Everyone is permitted to copy and distribute verbatim copies
6
+ of this license document, but changing it is not allowed.
7
+
8
+
9
+ This version of the GNU Lesser General Public License incorporates
10
+ the terms and conditions of version 3 of the GNU General Public
11
+ License, supplemented by the additional permissions listed below.
12
+
13
+ 0. Additional Definitions.
14
+
15
+ As used herein, "this License" refers to version 3 of the GNU Lesser
16
+ General Public License, and the "GNU GPL" refers to version 3 of the GNU
17
+ General Public License.
18
+
19
+ "The Library" refers to a covered work governed by this License,
20
+ other than an Application or a Combined Work as defined below.
21
+
22
+ An "Application" is any work that makes use of an interface provided
23
+ by the Library, but which is not otherwise based on the Library.
24
+ Defining a subclass of a class defined by the Library is deemed a mode
25
+ of using an interface provided by the Library.
26
+
27
+ A "Combined Work" is a work produced by combining or linking an
28
+ Application with the Library. The particular version of the Library
29
+ with which the Combined Work was made is also called the "Linked
30
+ Version".
31
+
32
+ The "Minimal Corresponding Source" for a Combined Work means the
33
+ Corresponding Source for the Combined Work, excluding any source code
34
+ for portions of the Combined Work that, considered in isolation, are
35
+ based on the Application, and not on the Linked Version.
36
+
37
+ The "Corresponding Application Code" for a Combined Work means the
38
+ object code and/or source code for the Application, including any data
39
+ and utility programs needed for reproducing the Combined Work from the
40
+ Application, but excluding the System Libraries of the Combined Work.
41
+
42
+ 1. Exception to Section 3 of the GNU GPL.
43
+
44
+ You may convey a covered work under sections 3 and 4 of this License
45
+ without being bound by section 3 of the GNU GPL.
46
+
47
+ 2. Conveying Modified Versions.
48
+
49
+ If you modify a copy of the Library, and, in your modifications, a
50
+ facility refers to a function or data to be supplied by an Application
51
+ that uses the facility (other than as an argument passed when the
52
+ facility is invoked), then you may convey a copy of the modified
53
+ version:
54
+
55
+ a) under this License, provided that you make a good faith effort to
56
+ ensure that, in the event an Application does not supply the
57
+ function or data, the facility still operates, and performs
58
+ whatever part of its purpose remains meaningful, or
59
+
60
+ b) under the GNU GPL, with none of the additional permissions of
61
+ this License applicable to that copy.
62
+
63
+ 3. Object Code Incorporating Material from Library Header Files.
64
+
65
+ The object code form of an Application may incorporate material from
66
+ a header file that is part of the Library. You may convey such object
67
+ code under terms of your choice, provided that, if the incorporated
68
+ material is not limited to numerical parameters, data structure
69
+ layouts and accessors, or small macros, inline functions and templates
70
+ (ten or fewer lines in length), you do both of the following:
71
+
72
+ a) Give prominent notice with each copy of the object code that the
73
+ Library is used in it and that the Library and its use are
74
+ covered by this License.
75
+
76
+ b) Accompany the object code with a copy of the GNU GPL and this license
77
+ document.
78
+
79
+ 4. Combined Works.
80
+
81
+ You may convey a Combined Work under terms of your choice that,
82
+ taken together, effectively do not restrict modification of the
83
+ portions of the Library contained in the Combined Work and reverse
84
+ engineering for debugging such modifications, if you also do each of
85
+ the following:
86
+
87
+ a) Give prominent notice with each copy of the Combined Work that
88
+ the Library is used in it and that the Library and its use are
89
+ covered by this License.
90
+
91
+ b) Accompany the Combined Work with a copy of the GNU GPL and this license
92
+ document.
93
+
94
+ c) For a Combined Work that displays copyright notices during
95
+ execution, include the copyright notice for the Library among
96
+ these notices, as well as a reference directing the user to the
97
+ copies of the GNU GPL and this license document.
98
+
99
+ d) Do one of the following:
100
+
101
+ 0) Convey the Minimal Corresponding Source under the terms of this
102
+ License, and the Corresponding Application Code in a form
103
+ suitable for, and under terms that permit, the user to
104
+ recombine or relink the Application with a modified version of
105
+ the Linked Version to produce a modified Combined Work, in the
106
+ manner specified by section 6 of the GNU GPL for conveying
107
+ Corresponding Source.
108
+
109
+ 1) Use a suitable shared library mechanism for linking with the
110
+ Library. A suitable mechanism is one that (a) uses at run time
111
+ a copy of the Library already present on the user's computer
112
+ system, and (b) will operate properly with a modified version
113
+ of the Library that is interface-compatible with the Linked
114
+ Version.
115
+
116
+ e) Provide Installation Information, but only if you would otherwise
117
+ be required to provide such information under section 6 of the
118
+ GNU GPL, and only to the extent that such information is
119
+ necessary to install and execute a modified version of the
120
+ Combined Work produced by recombining or relinking the
121
+ Application with a modified version of the Linked Version. (If
122
+ you use option 4d0, the Installation Information must accompany
123
+ the Minimal Corresponding Source and Corresponding Application
124
+ Code. If you use option 4d1, you must provide the Installation
125
+ Information in the manner specified by section 6 of the GNU GPL
126
+ for conveying Corresponding Source.)
127
+
128
+ 5. Combined Libraries.
129
+
130
+ You may place library facilities that are a work based on the
131
+ Library side by side in a single library together with other library
132
+ facilities that are not Applications and are not covered by this
133
+ License, and convey such a combined library under terms of your
134
+ choice, if you do both of the following:
135
+
136
+ a) Accompany the combined library with a copy of the same work based
137
+ on the Library, uncombined with any other library facilities,
138
+ conveyed under the terms of this License.
139
+
140
+ b) Give prominent notice with the combined library that part of it
141
+ is a work based on the Library, and explaining where to find the
142
+ accompanying uncombined form of the same work.
143
+
144
+ 6. Revised Versions of the GNU Lesser General Public License.
145
+
146
+ The Free Software Foundation may publish revised and/or new versions
147
+ of the GNU Lesser General Public License from time to time. Such new
148
+ versions will be similar in spirit to the present version, but may
149
+ differ in detail to address new problems or concerns.
150
+
151
+ Each version is given a distinguishing version number. If the
152
+ Library as you received it specifies that a certain numbered version
153
+ of the GNU Lesser General Public License "or any later version"
154
+ applies to it, you have the option of following the terms and
155
+ conditions either of that published version or of any later version
156
+ published by the Free Software Foundation. If the Library as you
157
+ received it does not specify a version number of the GNU Lesser
158
+ General Public License, you may choose any version of the GNU Lesser
159
+ General Public License ever published by the Free Software Foundation.
160
+
161
+ If the Library as you received it specifies that a proxy can decide
162
+ whether future versions of the GNU Lesser General Public License shall
163
+ apply, that proxy's public statement of acceptance of any version is
164
+ permanent authorization for you to choose that version for the
165
+ Library.
data/README.rdoc ADDED
@@ -0,0 +1,80 @@
1
+ = Taverna[http://taverna.sourceforge.net] 2 Interaction Gem
2
+
3
+ Authors:: Emmanuel Tagarira, David Withers
4
+ Version:: 0.1.1
5
+ Contact:: mailto:mannie@mygrid.org.uk
6
+ URL:: http://taverna.sourceforge.net/
7
+ Licence:: LGPL 3 (See LICENCE or http://www.gnu.org/licenses/lgpl.html)
8
+ Copyright:: (c) 2008-2009 University of Manchester, UK
9
+
10
+
11
+
12
+ == Synopsis
13
+
14
+ This is a Ruby library for interacting with Taverna[http://taverna.sourceforge.net] version 2 workflows (T2Flows). It relies on the functionality provided by the libxml library. To utilise this gem to its full potential, you will need to have the following installed on your system:
15
+ - GraphViz[http://www.graphviz.org/Download.php] (a graph visualization package)
16
+ - Rdoc gem
17
+ - Darkfish-Rdoc gem
18
+
19
+
20
+
21
+ == The T2Flow Model Implementation
22
+
23
+ Much like the Taverna[http://taverna.sourceforge.net] 1 Scufl model, T2Flows contain Processors, Sinks, and Sources. T2Flows, however, encapsulate these elements within Dataflow objects. A Dataflow object is the container for all the different elements present within a Taverna[http://taverna.sourceforge.net] 2 workflow, so a single T2Flow may hold one or more dataflows. WorkflowDescriptions and Links, from the original Taverna[http://taverna.sourceforge.net] 1 workflows (Scufl), have been renamed in T2Flows to DataflowAnnotations and Datalinks respectively. DataflowAnnotations, however, allow for multiple titles, authors, and descriptions, as opposed to the single-value attributes held for Scufls.
24
+
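+ As a rough sketch (assuming +model+ is a T2Flow::Model obtained via T2Flow::Parser, as shown in the Usage section below), the structure can be navigated like this:
+ main = model.main # the top-level (containing) dataflow
+ model.dataflows.size # 1, or more when workflows are nested
+ main.annotations.titles # a list (may be nil if the workflow carries no titles)
+ main.annotations.authors # likewise a list, not a single value
+ main.datalinks.each { |link| puts "#{link.source} -> #{link.sink}" }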
25
+
26
+
27
+ == Installation
28
+ To install the gems required by the Taverna 2 gem, type the following into your command prompt:
29
+ [sudo] gem install libxml-ruby
30
+ [sudo] gem install rdoc
31
+ [sudo] gem install darkfish-rdoc
32
+
33
+ When you have the required gems on your system, you may install this gem using the following commands:
34
+ gem sources -a http://gems.github.com
35
+ [sudo] gem install mannie-taverna-t2flow
36
+
37
+
38
+
39
+ == Usage
40
+
41
+ To generate a T2Flow model using the gem, you need to include the following lines in your Ruby code:
42
+ require "t2flow/model.rb"
43
+ require "t2flow/parser.rb"
44
+
45
+ To generate the model you can then use the gem as follows:
46
+ foo = File.new("path/to/workflow/file", "r")
47
+ bar = T2Flow::Parser.new.parse(foo)
48
+
49
+ Alternatively:
50
+ foo = File.new("path/to/workflow/file", "r").read
51
+ bar = T2Flow::Parser.new.parse(foo)
52
+
53
+ You will then be able to use your T2Flow model to retrieve information about the workflow by invoking the different methods and attributes.
54
+ bar.INVOKED
55
+ ... where INVOKED is the method or attribute required.
56
+
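+ For instance, a minimal sketch using the +bar+ model from above (all of the methods shown are defined on T2Flow::Model):
+ bar.annotations.name # name of the top-level dataflow
+ bar.sources.each { |source| puts source.name } # workflow inputs
+ bar.sinks.each { |sink| puts sink.name } # workflow outputs
+ bar.all_processors.map { |processor| processor.type } # e.g. "beanshell", "workflow"
+ bar.local_workers # processors that are Taverna local workers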
57
+ You can also interact with remote workflows.
58
+ require "open-uri"
59
+
60
+ foo = URI.parse("xxxx://uri_to_workflow").read
61
+ bar = T2Flow::Parser.new.parse(foo)
62
+
63
+ To enable you to draw images of the T2Flow, you need to include:
64
+ require "t2flow/dot.rb"
65
+
66
+ To be able to use any functionality included in "t2flow/dot.rb", you need to have GraphViz[http://www.graphviz.org/Download.php] installed on your system. Once this package has been installed, you may use the gem to draw an image showing the structure of the T2Flow as follows.
67
+ out_file = File.new("path/to/file/you/want/the/dot/script/to/be/written", "w+")
68
+ T2Flow::Dot.new.write_dot(out_file, bar)
69
+ `dot -Tpng -o"path/to/the/output/image" #{out_file.path}`
70
+ The last line draws a PNG image using +out_file+. To learn more about dot, try typing into your command prompt:
71
+ % man dot
72
+ or
73
+ % dot -h
74
+
75
+
76
+
77
+ == References
78
+
79
+ [1] http://taverna.sourceforge.net
80
+ [2] http://www.graphviz.org
data/lib/t2flow/dot.rb ADDED
@@ -0,0 +1,250 @@
1
+ module T2Flow
2
+
3
+ # This class enables you to write the script that will be used by dot
4
+ # (which is part of GraphViz[http://www.graphviz.org/Download.php])
5
+ # to generate the image showing the structure of a given model.
6
+ # To get started quickly, you could try:
7
+ # out_file = File.new("path/to/file/you/want/the/dot/script/to/be/written", "w+")
8
+ # workflow = File.new("path/to/workflow/file", "r").read
9
+ # model = T2Flow::Parser.new.parse(workflow)
10
+ # T2Flow::Dot.new.write_dot(out_file, model)
11
+ # `dot -Tpng -o"path/to/the/output/image" #{out_file.path}`
12
+ class Dot
13
+
14
+ @@processor_colours = {
15
+ 'apiconsumer' => 'palegreen',
16
+ 'beanshell' => 'burlywood2',
17
+ 'biomart' => 'lightcyan2',
18
+ 'local' => 'mediumorchid2',
19
+ 'biomobywsdl' => 'darkgoldenrod1',
20
+ 'biomobyobject' => 'gold',
21
+ 'biomobyparser' => 'white',
22
+ 'inferno' => 'violetred1',
23
+ 'notification' => 'mediumorchid2',
24
+ 'rdfgenerator' => 'purple',
25
+ 'rserv' => 'lightgoldenrodyellow',
26
+ 'seqhound' => '#836fff',
27
+ 'soaplabwsdl' => 'lightgoldenrodyellow',
28
+ 'stringconstant' => 'lightsteelblue',
29
+ 'talisman' => 'plum2',
30
+ 'bsf' => 'burlywood2',
31
+ 'abstractprocessor' => 'lightgoldenrodyellow',
32
+ 'rshell' => 'lightgoldenrodyellow',
33
+ 'arbitrarywsdl' => 'darkolivegreen3',
34
+ 'workflow' => 'crimson'}
35
+
36
+ @@fill_colours = %w{white aliceblue antiquewhite beige}
37
+
38
+ @@ranksep = '0.22'
39
+ @@nodesep = '0.05'
40
+
41
+ # Creates a new dot object for interaction.
42
+ def initialize
43
+ # @port_style IS CURRENTLY UNUSED. IGNORE!!!
44
+ @port_style = 'none' # 'all', 'bound' or 'none'
45
+ end
46
+
47
+ # Writes to the given stream (File, StringIO, etc) the script to generate
48
+ # the image showing the internals of the given workflow model.
49
+ # === Usage
50
+ # stream = File.new("path/to/file/you/want/the/dot/script/to/be/written", "w+")
51
+ # workflow = .......
52
+ # model = T2Flow::Parser.new.parse(workflow)
53
+ # T2Flow::Dot.new.write_dot(stream, model)
54
+ def write_dot(stream, model)
55
+ @t2flow_model = model
56
+ stream.puts 'digraph t2flow_graph {'
57
+ stream.puts ' graph ['
58
+ stream.puts ' style=""'
59
+ stream.puts ' labeljust="left"'
60
+ stream.puts ' clusterrank="local"'
61
+ stream.puts " ranksep=\"#@@ranksep\""
62
+ stream.puts " nodesep=\"#@@nodesep\""
63
+ stream.puts ' ]'
64
+ stream.puts
65
+ stream.puts ' node ['
66
+ stream.puts ' fontname="Helvetica",'
67
+ stream.puts ' fontsize="10",'
68
+ stream.puts ' fontcolor="black", '
69
+ stream.puts ' shape="box",'
70
+ stream.puts ' height="0",'
71
+ stream.puts ' width="0",'
72
+ stream.puts ' color="black",'
73
+ stream.puts ' fillcolor="lightgoldenrodyellow",'
74
+ stream.puts ' style="filled"'
75
+ stream.puts ' ];'
76
+ stream.puts
77
+ stream.puts ' edge ['
78
+ stream.puts ' fontname="Helvetica",'
79
+ stream.puts ' fontsize="8",'
80
+ stream.puts ' fontcolor="black",'
81
+ stream.puts ' color="black"'
82
+ stream.puts ' ];'
83
+ write_dataflow(stream, model.main)
84
+ stream.puts '}'
85
+
86
+ stream.flush
87
+ end
88
+
89
+ def write_dataflow(stream, dataflow, prefix="", name="", depth=0) # :nodoc:
90
+ if name != ""
91
+ stream.puts "subgraph cluster_#{prefix}#{name} {"
92
+ stream.puts " label=\"#{name}\""
93
+ stream.puts ' fontname="Helvetica"'
94
+ stream.puts ' fontsize="10"'
95
+ stream.puts ' fontcolor="black"'
96
+ stream.puts ' clusterrank="local"'
97
+ stream.puts " fillcolor=\"#{@@fill_colours[depth % @@fill_colours.length]}\""
98
+ stream.puts ' style="filled"'
99
+ end
100
+ dataflow.processors.each {|processor| write_processor(stream, processor, prefix, depth)}
101
+ write_source_cluster(stream, dataflow.sources, prefix)
102
+ write_sink_cluster(stream, dataflow.sinks, prefix)
103
+ dataflow.datalinks.each {|link| write_link(stream, link, dataflow, prefix)}
104
+ dataflow.coordinations.each {|coordination| write_coordination(stream, coordination, dataflow, prefix)}
105
+ if name != ""
106
+ stream.puts '}'
107
+ end
108
+ end
109
+
110
+ def write_processor(stream, processor, prefix, depth) # :nodoc:
111
+ # nested workflows
112
+ if "#{processor.type}" == "workflow"
113
+ dataflow = @t2flow_model.dataflow(processor.dataflow_id)
114
+ write_dataflow(stream, dataflow, prefix + dataflow.annotations.name, dataflow.annotations.name, depth.next)
115
+ else
116
+ stream.puts " \"#{prefix}#{processor.name}\" ["
117
+ stream.puts " fillcolor=\"#{get_colour processor.type}\","
118
+ stream.puts ' shape="box",'
119
+ stream.puts ' style="filled",'
120
+ stream.puts ' height="0",'
121
+ stream.puts ' width="0",'
122
+ stream.puts " label=\"#{processor.name}\""
123
+ stream.puts ' ];'
124
+ end
125
+ end
126
+
127
+ def write_source_cluster(stream, sources, prefix) # :nodoc:
128
+ if sources.length > 0
129
+ stream.puts " subgraph cluster_#{prefix}sources {"
130
+ stream.puts ' style="dotted"'
131
+ stream.puts ' label="Workflow Inputs"'
132
+ stream.puts ' fontname="Helvetica"'
133
+ stream.puts ' fontsize="10"'
134
+ stream.puts ' fontcolor="black"'
135
+ stream.puts ' rank="same"'
136
+ stream.puts " \"#{prefix}WORKFLOWINTERNALSOURCECONTROL\" ["
137
+ stream.puts ' shape="triangle",'
138
+ stream.puts ' width="0.2",'
139
+ stream.puts ' height="0.2",'
140
+ stream.puts ' fillcolor="brown1"'
141
+ stream.puts ' label=""'
142
+ stream.puts ' ]'
143
+ sources.each {|source| write_source(stream, source, prefix)}
144
+ stream.puts ' }'
145
+ end
146
+ end
147
+
148
+ def write_source(stream, source, prefix) # :nodoc:
149
+ stream.puts " \"#{prefix}WORKFLOWINTERNALSOURCE_#{source.name}\" ["
150
+ stream.puts ' shape="box",'
151
+ stream.puts " label=\"#{source.name}\""
152
+ stream.puts ' width="0",'
153
+ stream.puts ' height="0",'
154
+ stream.puts ' fillcolor="skyblue"'
155
+ stream.puts ' ]'
156
+ end
157
+
158
+ def write_sink_cluster(stream, sinks, prefix) # :nodoc:
159
+ if sinks.length > 0
160
+ stream.puts " subgraph cluster_#{prefix}sinks {"
161
+ stream.puts ' style="dotted"'
162
+ stream.puts ' label="Workflow Outputs"'
163
+ stream.puts ' fontname="Helvetica"'
164
+ stream.puts ' fontsize="10"'
165
+ stream.puts ' fontcolor="black"'
166
+ stream.puts ' rank="same"'
167
+ stream.puts " \"#{prefix}WORKFLOWINTERNALSINKCONTROL\" ["
168
+ stream.puts ' shape="invtriangle",'
169
+ stream.puts ' width="0.2",'
170
+ stream.puts ' height="0.2",'
171
+ stream.puts ' fillcolor="chartreuse3"'
172
+ stream.puts ' label=""'
173
+ stream.puts ' ]'
174
+ sinks.each {|sink| write_sink(stream, sink, prefix)}
175
+ stream.puts ' }'
176
+ end
177
+ end
178
+
179
+ def write_sink(stream, sink, prefix) # :nodoc:
180
+ stream.puts " \"#{prefix}WORKFLOWINTERNALSINK_#{sink.name}\" ["
181
+ stream.puts ' shape="box",'
182
+ stream.puts " label=\"#{sink.name}\""
183
+ stream.puts ' width="0",'
184
+ stream.puts ' height="0",'
185
+ stream.puts ' fillcolor="lightsteelblue2"'
186
+ stream.puts ' ]'
187
+ end
188
+
189
+ def write_link(stream, link, dataflow, prefix) # :nodoc:
190
+ if dataflow.sources.select{|s| s.name == link.source} != []
191
+ stream.write " \"#{prefix}WORKFLOWINTERNALSOURCE_#{link.source}\""
192
+ else
193
+ processor = dataflow.processors.select{|p| p.name == link.source.split(':')[0]}[0]
194
+ if "#{processor.type}" == "workflow"
195
+ df = @t2flow_model.dataflow(processor.dataflow_id)
196
+ stream.write " \"#{prefix}#{df.annotations.name}WORKFLOWINTERNALSINK_#{link.source.split(':')[1]}\""
197
+ else
198
+ stream.write " \"#{prefix}#{processor.name}\""
199
+ end
200
+ end
201
+ stream.write '->'
202
+ if dataflow.sinks.select{|s| s.name == link.sink} != []
203
+ stream.write "\"#{prefix}WORKFLOWINTERNALSINK_#{link.sink}\""
204
+ else
205
+ processor = dataflow.processors.select{|p| p.name == link.sink.split(':')[0]}[0]
206
+ if "#{processor.type}" == "workflow"
207
+ df = @t2flow_model.dataflow(processor.dataflow_id)
208
+ stream.write "\"#{prefix}#{df.annotations.name}WORKFLOWINTERNALSOURCE_#{link.sink.split(':')[1]}\""
209
+ else
210
+ stream.write "\"#{prefix}#{processor.name}\""
211
+ end
212
+ end
213
+ stream.puts ' ['
214
+ stream.puts ' ];'
215
+ end
216
+
217
+ def write_coordination(stream, coordination, dataflow, prefix) # :nodoc:
218
+ stream.write " \"#{prefix}#{coordination.control}"
219
+ processor = dataflow.processors.select{|p| p.name == coordination.control}[0]
220
+
221
+ stream.write 'WORKFLOWINTERNALSINKCONTROL' if "#{processor.type}" == "workflow"
222
+ stream.write '"->"'
223
+ stream.write "#{prefix}#{coordination.target}"
224
+ processor = dataflow.processors.select{|p| p.name == coordination.target}[0]
225
+ stream.write 'WORKFLOWINTERNALSOURCECONTROL' if "#{processor.type}" == "workflow"
226
+ stream.write "\""
227
+ stream.puts ' ['
228
+ stream.puts ' color="gray",'
229
+ stream.puts ' arrowhead="odot",'
230
+ stream.puts ' arrowtail="none"'
231
+ stream.puts ' ];'
232
+ end
233
+
234
+ def get_colour(processor_name) # :nodoc:
235
+ colour = @@processor_colours[processor_name]
236
+ if colour
237
+ colour
238
+ else
239
+ 'white'
240
+ end
241
+ end
242
+
243
+ # Returns true if the given name is a known processor type; false otherwise
244
+ def Dot.is_processor?(processor_name)
245
+ !@@processor_colours[processor_name].nil?
246
+ end
247
+
248
+ end
249
+
250
+ end
@@ -0,0 +1,330 @@
1
+ # This is the module containing the T2Flow model implementation i.e. the model structure/definition and all its internals.
2
+
3
+ module T2Flow # :nodoc:
4
+
5
+ # The model for a given Taverna 2 workflow.
6
+ class Model
7
+ # The list of all the dataflows that make up the workflow.
8
+ attr_accessor :dataflows
9
+
10
+ # The list of any dependencies that have been found inside the workflow.
11
+ attr_accessor :dependencies
12
+
13
+ # Creates an empty model for a Taverna 2 workflow.
14
+ def initialize
15
+ @dataflows = []
16
+ end
17
+
18
+ # Retrieve the top level dataflow, i.e. the MAIN (containing) dataflow.
19
+ def main
20
+ @dataflows[0]
21
+ end
22
+
23
+ # Retrieve the dataflow with the given ID
24
+ def dataflow(df_id)
25
+ df = @dataflows.select { |x| x.dataflow_id == df_id }
26
+ return df[0]
27
+ end
28
+
29
+ # Retrieve ALL the processors containing beanshells within the workflow.
30
+ def beanshells
31
+ self.all_processors.select { |x| x.type == "beanshell" }
32
+ end
33
+
34
+ # Retrieve ALL processors that are web services WITHIN the model.
35
+ def web_services
36
+ self.all_processors.select { |x| x.type =~ /wsdl|soaplab|biomoby/i }
37
+ end
38
+
39
+ # Retrieve ALL local workers WITHIN the workflow
40
+ def local_workers
41
+ self.all_processors.select { |x| x.type =~ /local/i }
42
+ end
43
+
44
+ # Retrieve the datalinks from the top level of a nested workflow.
45
+ # If the workflow is not nested, retrieve all datalinks.
46
+ def datalinks
47
+ self.main.datalinks
48
+ end
49
+
50
+ # Retrieve ALL the datalinks within a nested workflow
51
+ def all_datalinks
52
+ links = []
53
+ @dataflows.each { |dataflow| links << dataflow.datalinks }
54
+ return links.flatten
55
+ end
56
+
57
+ # Retrieve the annotations specific to the workflow. This does not return
58
+ # any annotations from workflows encapsulated within the main workflow.
59
+ def annotations
60
+ self.main.annotations
61
+ end
62
+
63
+ # Retrieve processors from the top level of a nested workflow.
64
+ # If the workflow is not nested, retrieve all processors.
65
+ def processors
66
+ self.main.processors
67
+ end
68
+
69
+ # Retrieve ALL the processors found in a nested workflow
70
+ def all_processors
71
+ procs =[]
72
+ @dataflows.each { |dataflow| procs << dataflow.processors }
73
+ return procs.flatten
74
+ end
75
+
76
+ # Retrieve the sources(inputs) to the workflow
77
+ def sources
78
+ self.main.sources
79
+ end
80
+
81
+ # Retrieve ALL the sources(inputs) within the workflow
82
+ def all_sources
83
+ sources =[]
84
+ @dataflows.each { |dataflow| sources << dataflow.sources }
85
+ return sources.flatten
86
+ end
87
+
88
+ # Retrieve the sinks(outputs) to the workflow
89
+ def sinks
90
+ self.main.sinks
91
+ end
92
+
93
+ # Retrieve ALL the sinks(outputs) within the workflow
94
+ def all_sinks
95
+ sinks =[]
96
+ @dataflows.each { |dataflow| sinks << dataflow.sinks }
97
+ return sinks.flatten
98
+ end
99
+
100
+ # Retrieve the unique dataflow ID for the top level dataflow.
101
+ def model_id
102
+ self.main.dataflow_id
103
+ end
104
+
105
+ # For the given dataflow, return the beanshells and/or services which
106
+ # have direct links to or from the given processor.
107
+ # If no dataflow is specified, the top-level dataflow is used.
108
+ # This does a recursive search in nested workflows.
109
+ # == Usage
110
+ # my_processor = model.processor[0]
111
+ # linked_processors = model.get_processors_linked_to(my_processor)
112
+ # processors_feeding_into_my_processor = linked_processors.sources
113
+ # processors_feeding_from_my_processor = linked_processors.sinks
114
+ def get_processor_links(processor)
115
+ return nil unless processor
116
+ proc_links = ProcessorLinks.new
117
+
118
+ # SOURCES
119
+ sources = self.all_datalinks.select { |x| x.sink =~ /#{processor.name}:.+/ }
120
+ proc_links.sources = []
121
+
122
+ # SINKS
123
+ sinks = self.all_datalinks.select { |x| x.source =~ /#{processor.name}:.+/ }
124
+ proc_links.sinks = []
125
+ temp_sinks = []
126
+ sinks.each { |x| temp_sinks << x.sink }
127
+
128
+ # Match links by port into format
129
+ # my_port:name_of_link_im_linked_to:its_port
130
+ sources.each do |connection|
131
+ link = connection.sink
132
+ connected_proc_name = link.split(":")[0]
133
+ my_connection_port = link.split(":")[1]
134
+
135
+ if my_connection_port
136
+ source = my_connection_port << ":" << connection.source
137
+ proc_links.sources << source if source.split(":").size == 3
138
+ end
139
+ end
140
+
141
+ sinks.each do |connection|
142
+ link = connection.source
143
+ connected_proc_name = link.split(":")[0]
144
+ my_connection_port = link.split(":")[1]
145
+
146
+ if my_connection_port
147
+ sink = my_connection_port << ":" << connection.sink
148
+ proc_links.sinks << sink if sink.split(":").size == 3
149
+ end
150
+ end
151
+
152
+ return proc_links
153
+ end
154
+ end
155
+
156
+
157
+
158
+ # The entity within the Taverna 2 model which contains the different
159
+ # elements of the workflow: processors, sinks, sources, etc...
160
+ class Dataflow
161
+ # This returns a DataflowAnnotation object.
162
+ attr_accessor :annotations
163
+
164
+ # Retrieve the list of processors specific to the dataflow.
165
+ attr_accessor :processors
166
+
167
+ # Retrieve the list of datalinks specific to the dataflow.
168
+ attr_accessor :datalinks
169
+
170
+ # Retrieve the list of sources specific to the dataflow.
171
+ attr_accessor :sources
172
+
173
+ # Retrieve the list of sinks specific to the dataflow.
174
+ attr_accessor :sinks
175
+
176
+ # Retrieve the list of coordinations specific to the dataflow.
177
+ attr_accessor :coordinations
178
+
179
+ # The unique identifier of the dataflow.
180
+ attr_accessor :dataflow_id
181
+
182
+ # Creates a new Dataflow object.
183
+ def initialize
184
+ @annotations = DataflowAnnotation.new
185
+ @processors = []
186
+ @datalinks = []
187
+ @sources = []
188
+ @sinks = []
189
+ @coordinations = []
190
+ end
191
+
192
+ # Retrieve beanshell processors specific to this dataflow.
193
+ def beanshells
194
+ @processors.select { |x| x.type == "beanshell" }
195
+ end
196
+ end
197
+
198
+
199
+
200
+ # This is the (shim) object within the workflow. This can be a beanshell,
201
+ # a webservice, a workflow, etc...
202
+ class Processor
203
+ # A string containing name of the processor.
204
+ attr_accessor :name
205
+
206
+ # A string containing the description of the processor if available.
207
+ # Returns nil otherwise.
208
+ attr_accessor :description
209
+
210
+ # A string for the type of processor, e.g. beanshell, workflow, webservice, etc...
211
+ attr_accessor :type
212
+
213
+ # For processors that have type "dataflow", this is the the reference
214
+ # to the dataflow. For all other processor types, this is nil.
215
+ attr_accessor :dataflow_id
216
+
217
+ # This only has a value in beanshell processors. This is the actual script
218
+ # embedded with the processor which does all the "work"
219
+ attr_accessor :script
220
+
221
+ # This is a list of inputs that the processor can take in.
222
+ attr_accessor :inputs
223
+
224
+ # This is a list of outputs that the processor can produce.
225
+ attr_accessor :outputs
226
+
227
+ # For processors of type "arbitrarywsdl", this is the URI to the location
228
+ # of the wsdl file.
229
+ attr_accessor :wsdl
230
+
231
+ # For processors of type "arbitrarywsdl", this is the operation invoked.
232
+ attr_accessor :wsdl_operation
233
+
234
+ # For soaplab and biomoby services, this is the endpoint URI.
235
+ attr_accessor :endpoint
236
+
237
+ # Authority name for the biomoby service.
238
+ attr_accessor :biomoby_authority_name
239
+
240
+ # Service name for the biomoby service. This is not necessarily the same
241
+ # as the processors name.
242
+ attr_accessor :biomoby_service_name
243
+
244
+ # Category for the biomoby service.
245
+ attr_accessor :biomoby_category
246
+ end
247
+
248
+
249
+ # This object is returned after invoking model.get_processor_links(processor)
250
+ # . The object contains two lists of processors. Each element consists of:
251
+ # the input or output port the processor uses as a link, the name of the
252
+ # processor being linked, and the port of the processor used for the linking,
253
+ # all seperated by a colon (:) i.e.
254
+ # my_port:name_of_processor:processor_port
255
+ class ProcessorLinks
256
+ # The processors whose output is fed as input into the processor used in
257
+ # model.get_processors_linked_to(processor).
258
+ attr_accessor :sources
259
+
260
+ # A list of processors that are fed the output from the processor (used in
261
+ # model.get_processors_linked_to(processor) ) as input.
262
+ attr_accessor :sinks
263
+ end
264
+
265
+
266
+
267
+ # This is the annotation object specific to the dataflow it belongs to.
268
+ # A DataflowAnnotation contains metadata about a given dataflow element.
269
+ class DataflowAnnotation
270
+ # The name of the dataflow.
271
+ attr_accessor :name
272
+
273
+ # A list of titles that have been assigned to the dataflow.
274
+ attr_accessor :titles
275
+
276
+ # A list of descriptive strings about the dataflow.
277
+ attr_accessor :descriptions
278
+
279
+ # A list of authors of the dataflow
280
+ attr_accessor :authors
281
+ end
282
+
283
+
284
+
285
+ # This represents a connection between any of the following pair of entities:
286
+ # {processor -> processor}, {workflow -> workflow}, {workflow -> processor},
287
+ # and {processor -> workflow}.
288
+ class Datalink
289
+ # The name of the source (the starting point of the connection).
290
+ attr_accessor :source
291
+
292
+ # The name of the sink (the endpoint of the connection).
293
+ attr_accessor :sink
294
+ end
295
+
296
+
297
+
298
+ # This is a representation of the 'Run after...' function in Taverna
299
+ # where the selected processor or workflow is set to run after another.
300
+ class Coordination
301
+ # The name of the processor/workflow which is to run first.
302
+ attr_accessor :control
303
+
304
+ # The name of the processor/workflow which is to run after the control.
305
+ attr_accessor :target
306
+ end
307
+
308
+
309
+
310
+ # This is the start node of a Datalink. Each source has a name and a port
311
+ # which is seperated by a colon; ":".
312
+ # This is represented as "source of a processor:port_name".
313
+ # A string that does not contain a colon can often be returned, signifiying
314
+ # a workflow source as opposed to that of a processor.
315
+ class Source
316
+ attr_accessor :name, :descriptions, :example_values
317
+ end
318
+
319
+
320
+
321
+ # This is the start node of a Datalink. Each sink has a name and a port
322
+ # which is seperated by a colon; ":".
323
+ # This is represented as "sink of a processor:port_name".
324
+ # A string that does not contain a colon can often be returned, signifiying
325
+ # a workflow sink as opposed to that of a processor.
326
+ class Sink
327
+ attr_accessor :name, :descriptions, :example_values
328
+ end
329
+
330
+ end
@@ -0,0 +1,252 @@
1
+ require "libxml"
2
+
3
+ module T2Flow
4
+
5
+ class Parser
6
+ # Returns the model for the given t2flow_file.
7
+ # The method accepts objects of classes File, StringIO and String only.
8
+ # ===Usage
9
+ # foo = ... # stuff to initialize foo here
10
+ # bar = T2Flow::Parser.new.parse(foo)
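+ # For instance (a sketch mirroring the README; the path is a placeholder):
+ # foo = File.new("path/to/workflow/file", "r")
+ # bar = T2Flow::Parser.new.parse(foo)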
11
+ def parse(t2flow)
12
+ case t2flow.class.to_s
13
+ when /^string$/i
14
+ document = LibXML::XML::Parser.string(t2flow).parse
15
+ when /^stringio|file$/i
16
+ t2flow.rewind
17
+ document = LibXML::XML::Parser.string(t2flow.read).parse
18
+ else
19
+ raise "Error parsing file."
20
+ end
21
+
22
+ root = document.root
23
+ raise "Doesn't appear to be a workflow!" if root.name != "workflow"
24
+ version = root["version"]
25
+
26
+ create_model(root, version)
27
+ end
28
+
29
+ def create_model(element, version) # :nodoc:
30
+ model = Model.new
31
+
32
+ local_depends = element.find("//localDependencies")
33
+ if local_depends
34
+ local_depends.each do |dependency|
35
+ dependency.each do |dep|
36
+ model.dependencies = [] if model.dependencies.nil?
37
+ model.dependencies << dep.content unless dep.content =~ /^\s*$/
38
+ end
39
+ end
40
+ model.dependencies.uniq! if model.dependencies
41
+ end
42
+
43
+ element.each do |dataflow|
44
+ dataflow_obj = Dataflow.new
45
+ dataflow_obj.dataflow_id = dataflow["id"]
46
+
47
+ dataflow.each do |elt|
48
+ case elt.name
49
+ when "name"
50
+ dataflow_obj.annotations.name = elt.content
51
+ when "inputPorts"
52
+ elt.each { |port| add_source(dataflow_obj, port) }
53
+ when "outputPorts"
54
+ elt.each { |port| add_sink(dataflow_obj, port) }
55
+ when "processors"
56
+ elt.each { |proc| add_processor(dataflow_obj, proc) }
57
+ when "datalinks"
58
+ elt.each { |link| add_link(dataflow_obj, link) }
59
+ when "conditions"
60
+ elt.each { |coord| add_coordination(dataflow_obj, coord) }
61
+ when "annotations"
62
+ elt.each { |ann| add_annotation(dataflow_obj, ann) }
63
+ end # case elt.name
64
+ end # dataflow.each
65
+
66
+ model.dataflows << dataflow_obj
67
+ end # element.each
68
+
69
+ temp = model.processors.select { |x| x.type == "workflow" }
70
+ temp.each do |proc|
71
+ df = model.dataflow(proc.dataflow_id)
72
+ df.annotations.name = proc.name
73
+ end
74
+
75
+ return model
76
+ end
77
+
78
+ def add_source(dataflow, port) # :nodoc:
79
+ source = Source.new
80
+
81
+ port.each do |elt|
82
+ case elt.name
83
+ when "name": source.name = elt.content
84
+ when "annotations"
85
+ elt.each do |ann|
86
+ node = LibXML::XML::Parser.string("#{ann}").parse
87
+ content_node = node.find_first("//annotationBean")
88
+ content = content_node.child.next.content
89
+
90
+ case content_node["class"]
91
+ when /freetextdescription/i
92
+ source.descriptions = [] unless source.descriptions
93
+ source.descriptions << content
94
+ when /examplevalue/i
95
+ source.example_values = [] unless source.example_values
96
+ source.example_values << content
97
+ end # case
98
+ end # elt.each
99
+ end # case
100
+ end # port.each
101
+
102
+ dataflow.sources << source
103
+ end
104
+
105
+ def add_sink(dataflow, port) # :nodoc:
106
+ sink = Sink.new
107
+
108
+ port.each do |elt|
109
+ case elt.name
110
+ when "name": sink.name = elt.content
111
+ when "annotations"
112
+ elt.each do |ann|
113
+ node = LibXML::XML::Parser.string("#{ann}").parse
114
+ content_node = node.find_first("//annotationBean")
115
+ content = content_node.child.next.content
116
+
117
+ case content_node["class"]
118
+ when /freetextdescription/i
119
+ sink.descriptions = [] unless sink.descriptions
120
+ sink.descriptions << content
121
+ when /examplevalue/i
122
+ sink.example_values = [] unless sink.example_values
123
+ sink.example_values << content
124
+ end # case
125
+ end # elt.each
126
+ end # case
127
+ end # port.each
128
+
129
+ dataflow.sinks << sink
130
+ end
131
+
132
+ def add_processor(dataflow, element) # :nodoc:
133
+ processor = Processor.new
134
+
135
+ temp_inputs = []
136
+ temp_outputs = []
137
+
138
+ element.each do |elt|
139
+ case elt.name
140
+ when "name"
141
+ processor.name = elt.content
142
+ when /inputports/i # ports from services
143
+ elt.each { |port|
144
+ port.each { |x| temp_inputs << x.content if x.name=="name" }
145
+ }
146
+ when /outputports/i # ports from services
147
+ elt.each { |port|
148
+ port.each { |x| temp_outputs << x.content if x.name=="name" }
149
+ }
150
+ when "activities" # a processor can only have one kind of activity
151
+ activity = elt.child
152
+ activity.each do |node|
153
+ if node.name == "configBean"
154
+ activity_node = node.child
155
+
156
+ if node["encoding"] == "dataflow"
157
+ processor.dataflow_id = activity_node["ref"]
158
+ processor.type = "workflow"
159
+ else
160
+ processor.type = (activity_node.name =~ /martquery/i ?
161
+ "biomart" : activity_node.name.split(".")[-2])
162
+
163
+ activity_node.each do |value_node|
164
+ case value_node.name
165
+ when "wsdl"
166
+ processor.wsdl = value_node.content
167
+ when "operation"
168
+ processor.wsdl_operation = value_node.content
169
+ when /endpoint/i
170
+ processor.endpoint = value_node.content
171
+ when /servicename/i
172
+ processor.biomoby_service_name = value_node.content
173
+ when /authorityname/i
174
+ processor.biomoby_authority_name = value_node.content
175
+ when "category"
176
+ processor.biomoby_category = value_node.content
177
+ when "script"
178
+ processor.script = value_node.content
179
+ when "inputs" # ALL ports present in beanshell
180
+ value_node.each { |input|
181
+ input.each { |x|
182
+ processor.inputs = [] if processor.inputs.nil?
183
+ processor.inputs << x.content if x.name == "name"
184
+ }
185
+ }
186
+ when "outputs" # ALL ports present in beanshell
187
+ value_node.each { |output|
188
+ output.each { |x|
189
+ processor.outputs = [] if processor.outputs.nil?
190
+ processor.outputs << x.content if x.name == "name"
191
+ }
192
+ }
193
+ end # case value_node.name
194
+ end # activity_node.each
195
+ end # if else node["encoding"] == "dataflow"
196
+ end # if node.name == "configBean"
197
+ end # activity.each
198
+ end # case elt.name
199
+ end # element.each
200
+
201
+ processor.inputs = temp_inputs if processor.inputs.nil? && !temp_inputs.empty?
202
+ processor.outputs = temp_outputs if processor.outputs.nil? && !temp_outputs.empty?
203
+ dataflow.processors << processor
204
+ end
205
+
206
+ def add_link(dataflow, link) # :nodoc:
207
+ datalink = Datalink.new
208
+
209
+ link.each do |sink_source|
210
+ case sink_source.name
211
+ when "sink"
212
+ datalink.sink = sink_source.first.content
213
+ datalink.sink += ":" + sink_source.last.content if sink_source["type"] == "processor"
214
+ when "source"
215
+ datalink.source = sink_source.first.content
216
+ datalink.source += ":" + sink_source.last.content if sink_source["type"] == "processor"
217
+ end
218
+ end
219
+
220
+ dataflow.datalinks << datalink
221
+ end
222
+
223
+ def add_coordination(dataflow, condition) # :nodoc:
224
+ coordination = Coordination.new
225
+
226
+ coordination.control = condition["control"]
227
+ coordination.target = condition["target"]
228
+
229
+ dataflow.coordinations << coordination
230
+ end
231
+
232
+ def add_annotation(dataflow, annotation) # :nodoc:
233
+ node = LibXML::XML::Parser.string("#{annotation}").parse
234
+ content_node = node.find_first("//annotationBean")
235
+ content = content_node.child.next.content
236
+
237
+ case content_node["class"]
238
+ when /freetextdescription/i
239
+ dataflow.annotations.descriptions = [] unless dataflow.annotations.descriptions
240
+ dataflow.annotations.descriptions << content
241
+ when /descriptivetitle/i
242
+ dataflow.annotations.titles = [] unless dataflow.annotations.titles
243
+ dataflow.annotations.titles << content
244
+ when /author/i
245
+ dataflow.annotations.authors = [] unless dataflow.annotations.authors
246
+ dataflow.annotations.authors << content
247
+ end # case
248
+ end
249
+
250
+ end
251
+
252
+ end
metadata ADDED
@@ -0,0 +1,126 @@
1
+ --- !ruby/object:Gem::Specification
2
+ name: taverna-t2flow
3
+ version: !ruby/object:Gem::Version
4
+ hash: 25
5
+ prerelease: false
6
+ segments:
7
+ - 0
8
+ - 1
9
+ - 1
10
+ version: 0.1.1
11
+ platform: ruby
12
+ authors:
13
+ - Emmanuel Tagarira
14
+ - David Withers
15
+ autorequire: t2flow
16
+ bindir: bin
17
+ cert_chain:
18
+ date: 2009-09-16 00:00:00 +01:00
19
+ default_executable:
20
+ dependencies:
21
+ - !ruby/object:Gem::Dependency
22
+ name: libxml-ruby
23
+ prerelease: false
24
+ requirement: &id001 !ruby/object:Gem::Requirement
25
+ none: false
26
+ requirements:
27
+ - - ">="
28
+ - !ruby/object:Gem::Version
29
+ hash: 21
30
+ segments:
31
+ - 1
32
+ - 1
33
+ - 3
34
+ version: 1.1.3
35
+ type: :runtime
36
+ version_requirements: *id001
37
+ - !ruby/object:Gem::Dependency
38
+ name: rdoc
39
+ prerelease: false
40
+ requirement: &id002 !ruby/object:Gem::Requirement
41
+ none: false
42
+ requirements:
43
+ - - ">="
44
+ - !ruby/object:Gem::Version
45
+ hash: 25
46
+ segments:
47
+ - 2
48
+ - 4
49
+ - 3
50
+ version: 2.4.3
51
+ type: :runtime
52
+ version_requirements: *id002
53
+ - !ruby/object:Gem::Dependency
54
+ name: darkfish-rdoc
55
+ prerelease: false
56
+ requirement: &id003 !ruby/object:Gem::Requirement
57
+ none: false
58
+ requirements:
59
+ - - ">="
60
+ - !ruby/object:Gem::Version
61
+ hash: 25
62
+ segments:
63
+ - 1
64
+ - 1
65
+ - 5
66
+ version: 1.1.5
67
+ type: :runtime
68
+ version_requirements: *id003
69
+ description: This is a gem developed by myGrid for the purpose of interacting with Taverna 2 workflows. An example use would be the image generation for the model representing Taverna 2 workflows, as used in myExperiment.
70
+ email: mannie@mygrid.org.uk
71
+ executables: []
72
+
73
+ extensions: []
74
+
75
+ extra_rdoc_files:
76
+ - README.rdoc
77
+ - LICENCE
78
+ - ChangeLog.rdoc
79
+ files:
80
+ - lib/t2flow/dot.rb
81
+ - lib/t2flow/model.rb
82
+ - lib/t2flow/parser.rb
83
+ - README.rdoc
84
+ - LICENCE
85
+ - ChangeLog.rdoc
86
+ has_rdoc: true
87
+ homepage: http://www.mygrid.org.uk/
88
+ licenses: []
89
+
90
+ post_install_message:
91
+ rdoc_options:
92
+ - -N
93
+ - --tab-width=2
94
+ - --main=README.rdoc
95
+ - --exclude='t2flow.gemspec|test'
96
+ require_paths:
97
+ - lib
98
+ required_ruby_version: !ruby/object:Gem::Requirement
99
+ none: false
100
+ requirements:
101
+ - - ">="
102
+ - !ruby/object:Gem::Version
103
+ hash: 21
104
+ segments:
105
+ - 1
106
+ - 0
107
+ - 1
108
+ version: 1.0.1
109
+ required_rubygems_version: !ruby/object:Gem::Requirement
110
+ none: false
111
+ requirements:
112
+ - - ">="
113
+ - !ruby/object:Gem::Version
114
+ hash: 3
115
+ segments:
116
+ - 0
117
+ version: "0"
118
+ requirements: []
119
+
120
+ rubyforge_project:
121
+ rubygems_version: 1.3.7
122
+ signing_key:
123
+ specification_version: 1
124
+ summary: Support for interacting with the Taverna 2 workflow system (T2Flow).
125
+ test_files: []
126
+