cerebrum 0.1.1

@@ -0,0 +1,7 @@
+ ---
+ SHA1:
+   metadata.gz: 0f30759c9f08ac49ff7c52a6a0fd97dcf288d255
+   data.tar.gz: 94a095b0fe5853e8b7e5b8e6f88c4cfe8ea1fd7a
+ SHA512:
+   metadata.gz: 7c6cc26527c029730a51e2c1f5a36eed725d1b4e3c473732b67bc02673da9b28557adeb9a88c6ea2b1916f432a5d673ffc17cd52d8236decba912bf54c2b85a7
+   data.tar.gz: 6b755cd92c8fbb84c69dde3b4751247c670278272112128ca6412c798b15a57ad26770367afed48315b5c7dce905ef1d6e0a4949e165927985f8890fd79782d8
@@ -0,0 +1,9 @@
+ /.bundle/
+ /.yardoc
+ /Gemfile.lock
+ /_yardoc/
+ /coverage/
+ /doc/
+ /pkg/
+ /spec/reports/
+ /tmp/
@@ -0,0 +1,5 @@
+ language: ruby
+ rvm:
+   - 2.1.8
+ before_install: gem install bundler -v 1.11.2
+ script: bundle exec rake
data/Gemfile ADDED
@@ -0,0 +1,7 @@
+ source 'https://rubygems.org'
+
+ # Specify your gem's dependencies in cerebrum.gemspec
+ gemspec
+ gem 'pry-nav', group: [:development, :test]
+ gem 'bundler', group: [:development, :test]
+ gem 'minitest', group: [:development, :test]
@@ -0,0 +1,21 @@
+ The MIT License (MIT)
+
+ Copyright (c) 2016 Irfan Sharif
+
+ Permission is hereby granted, free of charge, to any person obtaining a copy
+ of this software and associated documentation files (the "Software"), to deal
+ in the Software without restriction, including without limitation the rights
+ to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
+ copies of the Software, and to permit persons to whom the Software is
+ furnished to do so, subject to the following conditions:
+
+ The above copyright notice and this permission notice shall be included in
+ all copies or substantial portions of the Software.
+
+ THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
+ IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
+ FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
+ AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
+ LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
+ OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN
+ THE SOFTWARE.
@@ -0,0 +1,136 @@
+ # GEM: cerebrum ![](https://travis-ci.org/irfansharif/cerebrum.svg?branch=master)
+
+ `cerebrum` is an implementation of
+ [ANNs](https://en.wikipedia.org/wiki/Artificial_neural_network) in Ruby.
+ There's no practical reason to train a neural network in Ruby; this project
+ exists to experiment and play around with the bare fundamentals of ANNs. The
+ original idea comes from [brain](https://github.com/harthur/brain), which is
+ currently unmaintained; extensions on top of that are personal experimentation.
+
+ ## Installation
+
+ Add this line to your application's Gemfile:
+
+ ```ruby
+ gem 'cerebrum'
+ ```
+
+ And then execute:
+
+     $ bundle
+
+ Or install it yourself as:
+
+     $ gem install cerebrum
+
+ ## Usage
+
+ ```ruby
+ require 'cerebrum'
+
+ network = Cerebrum.new
+
+ network.train([
+   {input: [0, 0], output: [0]},
+   {input: [0, 1], output: [1]},
+   {input: [1, 0], output: [1]},
+   {input: [1, 1], output: [0]}
+ ])
+
+ result = network.run([1, 0])
+ # => [0.9333206724219677]
+ ```
+
+ ### Training
+
+ Use `Cerebrum#train` to train the network with an array of training data.
+
+ #### Data format
+
+ Each training pattern should have an `input:` and an `output:`, both of which
+ can either be an array of numbers from `0` to `1` or a hash of numbers from `0`
+ to `1`. An example of the latter is demonstrated below:
+
+ ```ruby
+ network = Cerebrum.new
+
+ network.train([
+   {input: { r: 0.03, g: 0.7, b: 0.5 }, output: { black: 1 }},
+   {input: { r: 0.16, g: 0.09, b: 0.2 }, output: { white: 1 }},
+   {input: { r: 0.5, g: 0.5, b: 1.0 }, output: { white: 1 }}
+ ])
+
+ result = network.run({ r: 1, g: 0.4, b: 0 })
+ # => { :black=>0.011967728530458011, :white=>0.9871010273923573 }
+ ```
+
+ #### Cerebrum Options
+
+ `Cerebrum.new` takes a hash of options that set network-wide defaults; any of
+ them can be overridden per call to `Cerebrum#train`:
+
+ ```ruby
+ network = Cerebrum.new({
+   learning_rate: 0.3,
+   momentum: 0.1,
+   binary_thresh: 0.5,
+   hidden_layers: [3, 4]
+ })
+ ```
+
+ #### Training Options
+
+ `Cerebrum#train` optionally takes a configuration hash as the second argument:
+
+ ```ruby
+ network.train(data, {
+   error_threshold: 0.005,
+   iterations: 20000,
+   log: true,
+   log_period: 100,
+   learning_rate: 0.3
+ })
+ ```
+
+ The network will train until the training error has gone below the threshold or
+ the max number of iterations has been reached, whichever comes first.
+
+ By default training won't let you know how it's doing until the end, but set
+ `log` to `true` to get periodic updates on the current training error of the
+ network. The training error should decrease over time. The updates will be
+ printed to the console. If you set `log` to a function, that function will be
+ called with the updates instead of printing to the console.
+
+ The `learning_rate` is a number from `0` to `1` that influences how quickly the
+ network trains. A learning rate close to `0` makes training slower; one closer
+ to `1` trains faster but risks settling into a local minimum and performing
+ badly on new data.
+
+ #### Output
+
+ The output of `Cerebrum#train` is a hash of information about how the training went:
+
+ ```ruby
+ network.train(data, options)
+ # => { error: 0.005324233132423, iterations: 9001 }
+ ```
+
+ ## Development
+
+ After checking out the repo, run `bin/setup` to install dependencies. Then, run
+ `rake test` to run the tests. You can also run `bin/console` for an interactive
+ prompt that will allow you to experiment. To install this gem onto your local
+ machine, run `bundle exec rake install`. To release a new version, update the
+ version number in `version.rb`, and then run `bundle exec rake release`, which
+ will create a git tag for the version, push git commits and tags, and push the
+ `.gem` file to [rubygems.org](https://rubygems.org).
+
+ ## Contributing
+
+ Bug reports and pull requests are welcome on GitHub at
+ [irfansharif/cerebrum](https://github.com/irfansharif/cerebrum).
+
+ ## License
+
+ The gem is available as open source under the terms of the [MIT License](http://opensource.org/licenses/MIT).
@@ -0,0 +1,10 @@
+ require "bundler/gem_tasks"
+ require "rake/testtask"
+
+ Rake::TestTask.new(:test) do |t|
+   t.libs << "test"
+   t.libs << "lib"
+   t.test_files = FileList['test/**/*_test.rb']
+ end
+
+ task default: :test
@@ -0,0 +1,11 @@
+ #!/usr/bin/env ruby
+
+ require "bundler/setup"
+ require "cerebrum"
+
+ # You can add fixtures and/or initialization code here to make experimenting
+ # with your gem easier. You can also use a different console, if you like.
+
+ # (If you use this, don't forget to add pry to your Gemfile!)
+ require "pry"
+ Pry.start
@@ -0,0 +1,8 @@
+ #!/usr/bin/env bash
+ set -euo pipefail
+ IFS=$'\n\t'
+ set -vx
+
+ bundle install
+
+ # Do any other automated setup that you need to do here
@@ -0,0 +1,32 @@
+ # coding: utf-8
+ lib = File.expand_path('../lib', __FILE__)
+ $LOAD_PATH.unshift(lib) unless $LOAD_PATH.include?(lib)
+ require 'cerebrum/version'
+
+ Gem::Specification.new do |spec|
+   spec.name          = "cerebrum"
+   spec.version       = Cerebrum::VERSION
+   spec.authors       = ["Irfan Sharif", "Arham Ahmed"]
+   spec.email         = ["irfanmahmoudsharif@gmail.com", "mohammad.a.ahmed@uwaterloo.ca"]
+
+   spec.summary       = %q{Artificial Neural Networks in Ruby}
+   spec.homepage      = "https://github.com/irfansharif/cerebrum"
+   spec.license       = "MIT"
+
+   # Prevent pushing this gem to RubyGems.org by setting 'allowed_push_host', or
+   # delete this section to allow pushing this gem to any host.
+   if spec.respond_to?(:metadata)
+     spec.metadata['allowed_push_host'] = "https://rubygems.org/"
+   else
+     raise "RubyGems 2.0 or newer is required to protect against public gem pushes."
+   end
+
+   spec.files         = `git ls-files -z`.split("\x0").reject { |f| f.match(%r{^(test|spec|features)/}) }
+   spec.bindir        = "exe"
+   spec.executables   = spec.files.grep(%r{^exe/}) { |f| File.basename(f) }
+   spec.require_paths = ["lib"]
+
+   spec.add_development_dependency "bundler", "~> 1.11"
+   spec.add_development_dependency "rake", "~> 10.0"
+   spec.add_development_dependency "minitest", "~> 5.0"
+ end
@@ -0,0 +1,2 @@
+ require "cerebrum/version"
+ require "cerebrum/cerebrum"
@@ -0,0 +1,166 @@
+ require_relative "data_scrubber"
+ require_relative "cerebrum_helper"
+
+ class Cerebrum
+   include CerebrumHelper
+   include DataScrubber
+
+   attr_accessor :learning_rate, :momentum, :binary_thresh, :hidden_layers,
+                 :input_lookup_table, :output_lookup_table
+
+   def initialize(learning_rate: 0.3, momentum: 0.1, binary_thresh: 0.5, hidden_layers: nil)
+     @learning_rate = learning_rate
+     @momentum = momentum
+     @binary_thresh = binary_thresh
+     @hidden_layers = hidden_layers
+   end
+
+   def train_pattern(input, target, learning_rate)
+     learning_rate ||= @learning_rate
+
+     run_input(input)
+     calculate_deltas(target)
+     adjust_weights(learning_rate)
+     mean_squared_error(@errors[@layers])
+   end
+
+   def train(training_set, options = Hash.new)
+     @input_lookup_table ||= get_input_lookup_table(training_set)
+     @output_lookup_table ||= get_output_lookup_table(training_set)
+     training_set = scrub_dataset(training_set)
+
+     iterations = options[:iterations] || 20000
+     error_threshold = options[:error_threshold] || 0.005
+     log = options[:log] || false
+     log_period = options[:log_period] || 10
+     learning_rate = options[:learning_rate] || @learning_rate
+     error = Float::INFINITY
+     current_iteration = 0
+
+     input_size = training_set[0][:input].length
+     output_size = training_set[0][:output].length
+
+     # Fall back to a single hidden layer when none was specified.
+     hidden_layer_sizes = @hidden_layers || [[3, (input_size / 2).floor].max]
+     layer_sizes = [input_size, hidden_layer_sizes, output_size].flatten
+     construct_network(layer_sizes)
+
+     iterations.times do |i|
+       current_iteration = i
+       training_set_errors = training_set.map { |ex| train_pattern(ex[:input], ex[:output], learning_rate) }
+       error = training_set_errors.inject(:+) / training_set.length
+       puts "(#{i}) training error: #{error}" if log && (i % log_period).zero?
+
+       break if error < error_threshold
+     end
+
+     { error: error, iterations: current_iteration }
+   end
+
+   def run(input)
+     input = to_vector_given_features(input, @input_lookup_table) if @input_lookup_table
+     output = run_input(input)
+     @output_lookup_table ? to_features_given_vector(output, @output_lookup_table) : output
+   end
+
+   private
+
+   def construct_network(layer_sizes)
+     @layer_sizes = layer_sizes
+     @layers = layer_sizes.length - 1 # Excluding output layer
+
+     @biases, @weights, @outputs = [], [], []
+     @deltas, @changes, @errors = [], [], []
+
+     (@layers + 1).times do |layer| # Including output layer
+       layer_size = @layer_sizes[layer]
+       @deltas[layer] = zeros(layer_size)
+       @errors[layer] = zeros(layer_size)
+       @outputs[layer] = zeros(layer_size)
+
+       next if layer == 0
+
+       @biases[layer] = randos(layer_size)
+       @weights[layer] = Array.new(layer_size)
+       @changes[layer] = Array.new(layer_size)
+       previous_layer_size = @layer_sizes[layer - 1]
+
+       layer_size.times do |node|
+         @weights[layer][node] = randos(previous_layer_size)
+         @changes[layer][node] = zeros(previous_layer_size)
+       end
+     end
+   end
+
+   def mean_squared_error(errors)
+     sum_of_squares = errors.map { |error| error**2 }.reduce(:+)
+     Float(sum_of_squares) / errors.length
+   end
+
+   def run_input(input)
+     @outputs[0] = input
+
+     (@layers + 1).times do |layer| # Including output layer
+       next if layer == 0
+
+       layer_size = @layer_sizes[layer]
+       previous_layer_size = @layer_sizes[layer - 1]
+
+       layer_size.times do |node|
+         weights = @weights[layer][node]
+         sum = @biases[layer][node]
+         previous_layer_size.times do |prev_node|
+           sum += @outputs[layer - 1][prev_node] * weights[prev_node]
+         end
+         @outputs[layer][node] = activation_function(sum)
+       end
+     end
+
+     @outputs.last
+   end
+
+   def calculate_deltas(target)
+     @layers.downto(0) do |layer|
+       layer_size = @layer_sizes[layer]
+
+       layer_size.times do |node|
+         output = @outputs[layer][node]
+         error = 0
+
+         if layer == @layers # Output layer
+           error = target[node] - output
+         else # Hidden layer
+           deltas = @deltas[layer + 1]
+           deltas.each_with_index do |delta, next_node|
+             error += delta * @weights[layer + 1][next_node][node]
+           end
+         end
+         @errors[layer][node] = error
+         @deltas[layer][node] = error * output * (1 - output)
+       end
+     end
+   end
+
+   def adjust_weights(rate)
+     1.upto(@layers) do |layer|
+       prev_layer_output = @outputs[layer - 1]
+       layer_size = @layer_sizes[layer]
+
+       layer_size.times do |node|
+         delta = @deltas[layer][node]
+         prev_layer_output.length.times do |prev_node|
+           change = @changes[layer][node][prev_node]
+           change = rate * delta * prev_layer_output[prev_node] + (@momentum * change)
+
+           @changes[layer][node][prev_node] = change
+           @weights[layer][node][prev_node] += change
+         end
+
+         @biases[layer][node] += rate * delta
+       end
+     end
+   end
+
+   def activation_function(sum)
+     1 / (1 + Math.exp(-sum))
+   end
+ end
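The math in `activation_function` and `adjust_weights` above is compact enough to check by hand. The sketch below recomputes one sigmoid activation and one momentum-based weight update outside the class; the helper names and all numeric values are invented for illustration, not part of the gem:

```ruby
# Sigmoid activation, mirroring Cerebrum#activation_function.
def sigmoid(sum)
  1.0 / (1 + Math.exp(-sum))
end

# One weight update with momentum, mirroring Cerebrum#adjust_weights:
#   change = rate * delta * prev_output + momentum * previous_change
def weight_update(weight, prev_change, rate:, momentum:, delta:, prev_output:)
  change = rate * delta * prev_output + momentum * prev_change
  [weight + change, change]
end

sigmoid(0)  # => 0.5, the sigmoid's midpoint
new_weight, change = weight_update(0.2, 0.05,
                                   rate: 0.3, momentum: 0.1,
                                   delta: 0.4, prev_output: 1.0)
change      # ≈ 0.125  (0.3 * 0.4 * 1.0 + 0.1 * 0.05)
new_weight  # ≈ 0.325  (0.2 + 0.125)
```

The momentum term reuses the previous step's change, which is what `@changes` stores per weight in the class above.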
@@ -0,0 +1,37 @@
+ module CerebrumHelper
+   private
+
+   def zeros(size)
+     Array.new(size, 0)
+   end
+
+   def randos(size)
+     Array.new(size) { rand }
+   end
+
+   # [{a: 1}, {b: 6, c: 7}] -> {a: 0, b: 1, c: 2}
+   def features_to_vector_index_lookup_table(features)
+     flattened_feature_keys = features.inject(:merge)
+     reindex_hash_values(flattened_feature_keys)
+   end
+
+   # Reindexes hash values in key order: {a: 6, b: 7} -> {a: 0, b: 1}
+   def reindex_hash_values(hash)
+     hash.each_with_index { |pair, index| hash[pair[0]] = index }
+   end
+
+   # Given lookup table {a: 0, b: 1} and features {a: 6}, returns [6, 0]
+   def to_vector_given_features(features, lookup_table)
+     lookup_table.map { |k, _v| features[k] || 0 }
+   end
+
+   # Given lookup table {a: 0, b: 1} and vector [6, 7], returns {a: 6, b: 7}
+   def to_features_given_vector(vector, lookup_table)
+     lookup_table.keys.zip(vector).to_h
+   end
+
+   # [5, 3] -> {5 => 0, 3 => 1}
+   def lookup_table_from_array(arr)
+     Hash[arr.each_with_index.map { |val, i| [val, i] }]
+   end
+ end
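The lookup-table helpers above are easiest to follow with concrete shapes. This standalone sketch reproduces their logic in plain Ruby; the `a`/`b`/`c` keys and values are invented for illustration:

```ruby
# Build an index lookup table from an array of feature hashes, mirroring
# CerebrumHelper#features_to_vector_index_lookup_table.
features = [{ a: 1 }, { b: 6, c: 7 }]
merged = features.inject(:merge)           # { a: 1, b: 6, c: 7 }
lookup = merged.keys.each_with_index.to_h  # { a: 0, b: 1, c: 2 }

# Project a feature hash onto a fixed-length vector, missing keys defaulting
# to 0, mirroring #to_vector_given_features.
vector = lookup.map { |key, _index| { a: 6 }[key] || 0 }  # [6, 0, 0]

# And back again: pair the table's keys with vector entries, mirroring
# #to_features_given_vector.
restored = lookup.keys.zip(vector).to_h  # { a: 6, b: 0, c: 0 }
```

Because Ruby hashes preserve insertion order, mapping over the lookup table yields vector positions that match the stored indices.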
@@ -0,0 +1,33 @@
+ module DataScrubber
+   private
+
+   def scrub_dataset(dataset)
+     dataset = scrub_input(dataset) unless dataset[0][:input].is_a? Array
+     dataset = scrub_output(dataset) unless dataset[0][:output].is_a? Array
+     dataset
+   end
+
+   def get_input_lookup_table(dataset)
+     input_features = dataset.map { |ex| ex[:input] }
+     input_features.first.is_a?(Array) ? nil : features_to_vector_index_lookup_table(input_features)
+   end
+
+   def get_output_lookup_table(dataset)
+     output_features = dataset.map { |ex| ex[:output] }
+     output_features.first.is_a?(Array) ? nil : features_to_vector_index_lookup_table(output_features)
+   end
+
+   def scrub_input(dataset)
+     input_lookup_table = get_input_lookup_table(dataset)
+     dataset.each do |ex|
+       ex[:input] = to_vector_given_features(ex[:input], input_lookup_table)
+     end
+   end
+
+   def scrub_output(dataset)
+     output_lookup_table = get_output_lookup_table(dataset)
+     dataset.each do |ex|
+       ex[:output] = to_vector_given_features(ex[:output], output_lookup_table)
+     end
+   end
+ end
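Taken together, the scrubbing turns hash-formatted examples into fixed-length vectors keyed by a shared lookup table. A minimal standalone sketch of that transformation; the color keys and values are invented for illustration:

```ruby
# Hash-formatted training data, in the shape Cerebrum#train accepts.
dataset = [
  { input: { r: 0.5, g: 0.2 }, output: { white: 1 } },
  { input: { b: 0.9 },         output: { black: 1 } }
]

# Shared key orderings over all examples, mirroring
# DataScrubber#get_input_lookup_table / #get_output_lookup_table.
input_keys  = dataset.map { |ex| ex[:input] }.inject(:merge).keys   # [:r, :g, :b]
output_keys = dataset.map { |ex| ex[:output] }.inject(:merge).keys  # [:white, :black]

# Scrub each example into vectors, filling missing features with 0.
scrubbed = dataset.map do |ex|
  {
    input:  input_keys.map  { |k| ex[:input][k]  || 0 },
    output: output_keys.map { |k| ex[:output][k] || 0 }
  }
end
scrubbed[1]  # => { input: [0, 0, 0.9], output: [0, 1] }
```

Merging every example's keys first is what lets sparse hashes like `{ b: 0.9 }` land in consistent vector positions across the whole dataset.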
@@ -0,0 +1,3 @@
+ class Cerebrum
+   VERSION = "0.1.1"
+ end
metadata ADDED
@@ -0,0 +1,103 @@
+ --- !ruby/object:Gem::Specification
+ name: cerebrum
+ version: !ruby/object:Gem::Version
+   version: 0.1.1
+ platform: ruby
+ authors:
+ - Irfan Sharif
+ - Arham Ahmed
+ autorequire:
+ bindir: exe
+ cert_chain: []
+ date: 2016-04-21 00:00:00.000000000 Z
+ dependencies:
+ - !ruby/object:Gem::Dependency
+   name: bundler
+   requirement: !ruby/object:Gem::Requirement
+     requirements:
+     - - "~>"
+       - !ruby/object:Gem::Version
+         version: '1.11'
+   type: :development
+   prerelease: false
+   version_requirements: !ruby/object:Gem::Requirement
+     requirements:
+     - - "~>"
+       - !ruby/object:Gem::Version
+         version: '1.11'
+ - !ruby/object:Gem::Dependency
+   name: rake
+   requirement: !ruby/object:Gem::Requirement
+     requirements:
+     - - "~>"
+       - !ruby/object:Gem::Version
+         version: '10.0'
+   type: :development
+   prerelease: false
+   version_requirements: !ruby/object:Gem::Requirement
+     requirements:
+     - - "~>"
+       - !ruby/object:Gem::Version
+         version: '10.0'
+ - !ruby/object:Gem::Dependency
+   name: minitest
+   requirement: !ruby/object:Gem::Requirement
+     requirements:
+     - - "~>"
+       - !ruby/object:Gem::Version
+         version: '5.0'
+   type: :development
+   prerelease: false
+   version_requirements: !ruby/object:Gem::Requirement
+     requirements:
+     - - "~>"
+       - !ruby/object:Gem::Version
+         version: '5.0'
+ description:
+ email:
+ - irfanmahmoudsharif@gmail.com
+ - mohammad.a.ahmed@uwaterloo.ca
+ executables: []
+ extensions: []
+ extra_rdoc_files: []
+ files:
+ - ".gitignore"
+ - ".travis.yml"
+ - Gemfile
+ - LICENSE.txt
+ - README.md
+ - Rakefile
+ - bin/console
+ - bin/setup
+ - cerebrum.gemspec
+ - lib/cerebrum.rb
+ - lib/cerebrum/cerebrum.rb
+ - lib/cerebrum/cerebrum_helper.rb
+ - lib/cerebrum/data_scrubber.rb
+ - lib/cerebrum/version.rb
+ homepage: https://github.com/irfansharif/cerebrum
+ licenses:
+ - MIT
+ metadata:
+   allowed_push_host: https://rubygems.org/
+ post_install_message:
+ rdoc_options: []
+ require_paths:
+ - lib
+ required_ruby_version: !ruby/object:Gem::Requirement
+   requirements:
+   - - ">="
+     - !ruby/object:Gem::Version
+       version: '0'
+ required_rubygems_version: !ruby/object:Gem::Requirement
+   requirements:
+   - - ">="
+     - !ruby/object:Gem::Version
+       version: '0'
+ requirements: []
+ rubyforge_project:
+ rubygems_version: 2.6.3
+ signing_key:
+ specification_version: 4
+ summary: Artificial Neural Networks in Ruby
+ test_files: []