neural-monkey 0.2.0__tar.gz

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
@@ -0,0 +1,152 @@
+ Metadata-Version: 2.4
+ Name: neural-monkey
+ Version: 0.2.0
+ Summary: A neural network library for beginners.
+ Home-page: https://github.com/19919rohit/Neural-Monkey
+ Author: Neunix Studios
+ Author-email: neunixstudios@gmail.com
+ Classifier: Programming Language :: Python :: 3
+ Classifier: License :: OSI Approved :: MIT License
+ Classifier: Operating System :: OS Independent
+ Requires-Python: >=3.7
+ Description-Content-Type: text/markdown
+
+ # Monkey – Simple Neural Networks for Beginners
+
+ Monkey is a lightweight Python library for building and training simple neural networks.
+ It is designed to help beginners learn concepts such as layers, activations, forward propagation, backpropagation, and simple attention mechanisms in a hands-on, interactive way.
+
+ ---
+
+ ## Features
+
+ - Create fully connected neural networks (Dense layers)
+ - Choose activation functions: **ReLU**, **Sigmoid**, **Tanh**
+ - Train networks using gradient descent with an adjustable learning rate and epoch count
+ - Make predictions on new inputs
+ - Optional lightweight **AttentionBlock** for sequence inputs
+ - Beginner-friendly API with minimal setup
+ - Pure Python, no external dependencies
+
+ ---
+
+ ## Installation
+
+ ```bash
+ pip install neural-monkey
+ ```
+
+ The package is published as `neural-monkey` but imported as `monkey`.
+
+ ---
+
+ ## Quick Start Examples
+
+ ### 1. Predict the sum of two numbers
+
+ ```python
+ from monkey.nn import NeuralNet
+
+ # Training data
+ x_train = [[2, 8], [9, 3], [7, 4], [1, 1]]
+ y_train = [[sum(pair)] for pair in x_train]
+
+ # Create network
+ nn = NeuralNet(input_size=2)
+ nn.add_layer(neurons=5, activation='relu')
+ nn.add_layer(neurons=1, activation='relu', layer='output')
+
+ # Train the network
+ nn.train(x_train, y_train, epochs=500, lr=0.1)
+
+ # Predict
+ print("Prediction for [3, 5]:", nn.predict([3, 5])[0])
+ ```
+
+ ### 2. Predict using Sigmoid activation
+
+ ```python
+ nn = NeuralNet(input_size=2)
+ nn.add_layer(neurons=4, activation='sigmoid')
+ nn.add_layer(neurons=1, activation='sigmoid', layer='output')
+
+ nn.train(x_train, y_train, epochs=1000, lr=0.05)
+ print("Prediction for [2, 2]:", nn.predict([2, 2])[0])
+ ```
+
+ Note that a sigmoid output layer is bounded to (0, 1), so it cannot reach targets larger than 1; this example is mainly useful for comparing how activations behave.
+
+ ### 3. Using AttentionBlock for sequences
+
+ ```python
+ from monkey.attention import AttentionBlock
+
+ seq_input = [[0.8, 0.2, 0.1], [0.5, 0.1, 0.3], [0.2, 0.7, 0.6]]
+ attn = AttentionBlock(input_size=3, output_size=3)
+ seq_output = attn.forward(seq_input)
+
+ print("Attention output:", seq_output)
+ ```
+
+ ---
+
+ ## API Reference
+
+ | Class / Function | Description |
+ |-----------------------|--------------------------------------------------------------------|
+ | `NeuralNet` | Core class for creating and training fully connected networks |
+ | `NeuralNet.add_layer` | Add a layer to the network; choose neurons, activation, layer type |
+ | `NeuralNet.predict` | Make predictions for a given input |
+ | `NeuralNet.train` | Train the network on input/output data |
+ | `AttentionBlock` | Simple attention mechanism for sequences |
+ | `relu(x)` | Rectified Linear Unit activation function |
+ | `sigmoid(x)` | Sigmoid activation function |
+ | `tanh(x)` | Tanh activation function |
+
+ **Layer parameters in `add_layer`:**
+
+ - `neurons`: Number of neurons in the layer
+ - `activation`: One of `'relu'`, `'sigmoid'`, `'tanh'`, or `'linear'`
+ - `layer`: `'hidden'` (default) or `'output'`
+ - `input_size`: Optional; only needed for the first layer
+
+ ---
+
+ ## Testing Monkey
+
+ You can run the provided test script:
+
+ ```bash
+ python tests/test.py
+ ```
+
+ Sample test scenarios include:
+
+ - Predicting sums of two numbers
+ - Predicting multiplication or XOR patterns
+ - Testing different activation functions
+ - Testing AttentionBlock on sequence data
+
+ ---
+
+ ## Learning Tips for Beginners
+
+ - Start with a single hidden layer and a few neurons
+ - Use small datasets (like the sum of two numbers) for testing
+ - Adjust `lr` and `epochs` to see their effect on convergence
+ - Observe how activation functions change the output and training speed
+ - Experiment with the AttentionBlock for sequence-based learning
+
+ ---
+
+ ## Contributing
+
+ Monkey is beginner-friendly and open to contributions!
+
+ - Experiment with new activation functions
+ - Add utilities or visualization helpers
+ - Improve training efficiency
+
+ GitHub repo: https://github.com/19919rohit/Neural-Monkey
+
+ ---
+
+ ## License
+
+ MIT License
@@ -0,0 +1,49 @@
+ # monkey/__init__.py
+
+ """
+ Monkey Library
+ --------------
+ A simple neural network library with dense layers, activations, and attention blocks.
+
+ Author: NEUNIX STUDIOS
+ """
+
+ # Core neural network
+ from .nn import NeuralNet, Dense
+
+ # Activations
+ from .activations import (
+     activation_map,
+     relu,
+     relu_derivative,
+     sigmoid,
+     sigmoid_derivative,
+     tanh,
+     tanh_derivative,
+     linear,
+     linear_derivative,
+ )
+
+ # Attention
+ from .attention import AttentionBlock
+
+ # Public API
+ __all__ = [
+     # Neural network components
+     "NeuralNet",
+     "Dense",
+
+     # Activations
+     "activation_map",
+     "relu",
+     "relu_derivative",
+     "sigmoid",
+     "sigmoid_derivative",
+     "tanh",
+     "tanh_derivative",
+     "linear",
+     "linear_derivative",
+
+     # Attention
+     "AttentionBlock",
+ ]
@@ -0,0 +1,38 @@
+ import math
+
+ # Sigmoid activation and derivative
+ def sigmoid(x):
+     return 1 / (1 + math.exp(-x))
+
+ def sigmoid_derivative(x):
+     s = sigmoid(x)
+     return s * (1 - s)
+
+ # ReLU activation and derivative
+ def relu(x):
+     return max(0, x)
+
+ def relu_derivative(x):
+     return 1 if x > 0 else 0
+
+ # Tanh activation and derivative
+ def tanh(x):
+     return math.tanh(x)
+
+ def tanh_derivative(x):
+     return 1 - math.tanh(x) ** 2
+
+ # Linear activation and derivative
+ def linear(x):
+     return x
+
+ def linear_derivative(x):
+     return 1
+
+ # Mapping of activation names to (function, derivative) pairs
+ activation_map = {
+     'relu': (relu, relu_derivative),
+     'sigmoid': (sigmoid, sigmoid_derivative),
+     'tanh': (tanh, tanh_derivative),
+     'linear': (linear, linear_derivative),
+ }
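Each derivative in this module takes the *pre-activation* input `x`, not the activated output. A quick finite-difference check (a standalone sketch mirroring the definitions above, not part of the package) confirms they match the numerical slope:

```python
import math

def sigmoid(x):
    return 1 / (1 + math.exp(-x))

def sigmoid_derivative(x):
    s = sigmoid(x)
    return s * (1 - s)

def tanh_derivative(x):
    return 1 - math.tanh(x) ** 2

def numeric_slope(f, x, h=1e-6):
    # Central finite-difference approximation of f'(x)
    return (f(x + h) - f(x - h)) / (2 * h)

for x in (-1.5, 0.3, 2.0):
    assert abs(sigmoid_derivative(x) - numeric_slope(sigmoid, x)) < 1e-6
    assert abs(tanh_derivative(x) - numeric_slope(math.tanh, x)) < 1e-6
print("derivative checks passed")
```

The same check would fail if the derivatives were fed the activated output instead of `x`, which is why callers must keep the pre-activation value around for backpropagation.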
@@ -0,0 +1,43 @@
+ import random
+
+ class AttentionBlock:
+     """
+     Simplified attention block for sequence inputs.
+     - Transforms a sequence of vectors using query, key, value matrices
+     - Produces an output sequence where each vector attends to the others
+     """
+
+     class Layer:
+         def __init__(self, input_size, output_size):
+             self.weights = [[random.uniform(-0.5, 0.5) for _ in range(output_size)] for _ in range(input_size)]
+             self.biases = [0] * output_size
+
+         def forward(self, x):
+             return [sum(x[i] * self.weights[i][j] for i in range(len(x))) + self.biases[j]
+                     for j in range(len(self.weights[0]))]
+
+     def __init__(self, input_size, output_size):
+         self.Wq = self.Layer(input_size, output_size)
+         self.Wk = self.Layer(input_size, output_size)
+         self.Wv = self.Layer(input_size, output_size)
+
+     def forward(self, X):
+         Q = [self.Wq.forward(x) for x in X]
+         K = [self.Wk.forward(x) for x in X]
+         V = [self.Wv.forward(x) for x in X]
+
+         # Compute attention scores (dot product of each query with each key)
+         attn_scores = [[sum(qi * kj for qi, kj in zip(q, k)) for k in K] for q in Q]
+
+         # Normalize each row by its sum (a simplified stand-in for softmax);
+         # fall back to uniform weights when a row sums to zero
+         attn_weights = []
+         for row in attn_scores:
+             total = sum(row)
+             if total == 0:
+                 attn_weights.append([1 / len(row)] * len(row))
+             else:
+                 attn_weights.append([s / total for s in row])
+
+         # Weighted output: each output vector is a weighted sum of the value vectors
+         output = []
+         for w_row in attn_weights:
+             out_vec = [0] * len(V[0])
+             for i, w in enumerate(w_row):
+                 for j, v_val in enumerate(V[i]):
+                     out_vec[j] += w * v_val
+             output.append(out_vec)
+         return output
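The normalization here divides each score row by its sum rather than applying a softmax, so weights can be negative when scores are; the uniform fallback only guards against a zero row sum. A small self-contained sketch of just the weighting step (helper names `normalize_rows` and `weighted_sum` are illustrative, not part of the package):

```python
def normalize_rows(scores):
    # Row-sum normalization as in AttentionBlock.forward:
    # fall back to uniform weights when a row sums to zero.
    out = []
    for row in scores:
        total = sum(row)
        if total == 0:
            out.append([1 / len(row)] * len(row))
        else:
            out.append([s / total for s in row])
    return out

def weighted_sum(weights, values):
    # Each output vector is a weighted combination of the value vectors.
    return [
        [sum(w * v[j] for w, v in zip(w_row, values)) for j in range(len(values[0]))]
        for w_row in weights
    ]

scores = [[2.0, 1.0, 1.0], [0.0, 0.0, 0.0]]
values = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]
print(weighted_sum(normalize_rows(scores), values))
```

With these inputs the first row's weights are [0.5, 0.25, 0.25] and the second row falls back to uniform weights, so each output row is a convex combination of the value vectors.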
@@ -0,0 +1,120 @@
+ import random
+ from .activations import activation_map
+
+ # --- Dense layer ---
+ class Dense:
+     def __init__(self, input_size, output_size, activation, activation_deriv):
+         # Initialize weights randomly and biases to zero
+         self.weights = [[random.uniform(-0.5, 0.5) for _ in range(output_size)] for _ in range(input_size)]
+         self.biases = [0] * output_size
+         self.activation = activation
+         self.activation_deriv = activation_deriv
+         self.last_input = None
+         self.last_z = None
+         self.last_output = None
+         self.is_output = False
+
+     def forward(self, x):
+         self.last_input = x
+         # Linear combination + bias (saved for the backward pass)
+         self.last_z = [sum(x[i] * self.weights[i][j] for i in range(len(x))) + self.biases[j]
+                        for j in range(len(self.weights[0]))]
+         # Activation
+         self.last_output = [self.activation(v) for v in self.last_z]
+         return self.last_output
+
+     def backward(self, grad_output, lr):
+         # Per-neuron delta: upstream gradient times the activation derivative,
+         # evaluated at the pre-activation value (the derivatives in
+         # activations.py expect the pre-activation input, not the output).
+         deltas = [grad_output[j] * self.activation_deriv(self.last_z[j])
+                   for j in range(len(self.biases))]
+         grad_input = [0] * len(self.last_input)
+         for i in range(len(self.last_input)):
+             for j in range(len(self.biases)):
+                 grad_input[i] += deltas[j] * self.weights[i][j]
+                 # Update weights
+                 self.weights[i][j] -= lr * deltas[j] * self.last_input[i]
+         # Update each bias once (not once per input connection)
+         for j in range(len(self.biases)):
+             self.biases[j] -= lr * deltas[j]
+         return grad_input
+
+
+ # --- Neural Network ---
+ class NeuralNet:
+     def __init__(self, input_size=None, lr=0.1):
+         self.layers = []
+         self.input_size = input_size
+         self.learning_rate = lr
+
+     def add_layer(self, neurons, activation='relu', layer='hidden', input_size=None):
+         """
+         Add a layer to the network.
+         - neurons: number of neurons in the layer
+         - activation: activation function name ('relu', 'sigmoid', 'tanh', 'linear')
+         - layer: 'hidden' or 'output'
+         - input_size: only needed for the first layer (if not given to __init__)
+         """
+         # Determine input size
+         if not self.layers:
+             input_size = input_size or self.input_size
+             if input_size is None:
+                 raise ValueError("Input size must be specified")
+         else:
+             input_size = len(self.layers[-1].biases)
+
+         # Fetch activation functions (unknown names fall back to linear)
+         act, act_deriv = activation_map.get(activation, (lambda x: x, lambda x: 1))
+
+         # Create layer
+         new_layer = Dense(input_size, neurons, act, act_deriv)
+
+         # Only the most recently added output layer keeps the flag
+         if layer.lower() == 'output':
+             if self.layers and getattr(self.layers[-1], 'is_output', False):
+                 self.layers[-1].is_output = False
+             new_layer.is_output = True
+         else:
+             new_layer.is_output = False
+
+         self.layers.append(new_layer)
+
+     def predict(self, x):
+         for layer in self.layers:
+             x = layer.forward(x)
+         return x
+
+     def train(self, x_train, y_train=None, epochs=1000, shuffle=True, verbose=100, lr=None, next_step=False):
+         """
+         Flexible training:
+         - Supervised learning if y_train is provided.
+         - Self-supervised reconstruction if y_train is None.
+         - Next-step prediction for flat numeric sequences if next_step=True.
+         """
+         learning_rate = lr if lr is not None else self.learning_rate
+
+         for epoch in range(epochs):
+             total_loss = 0
+             data = x_train
+             targets = y_train
+
+             # For sequence next-step learning: each window of two values
+             # predicts the value that follows (assumes a flat numeric sequence)
+             if next_step:
+                 data = [x_train[i:i+2] for i in range(len(x_train) - 2)]
+                 targets = [x_train[i+2] for i in range(len(x_train) - 2)]
+
+             # If no targets, use self-supervised (auto-reconstruction)
+             combined = list(zip(data, targets)) if targets is not None else [(x, x) for x in data]
+             if shuffle:
+                 random.shuffle(combined)
+
+             for x, y in combined:
+                 # Ensure x and y are lists
+                 x_list = x if isinstance(x, list) else [x]
+                 y_list = y if isinstance(y, list) else [y]
+
+                 output = self.predict(x_list)
+
+                 # Gradient of the squared-error loss w.r.t. the output
+                 grad = [2 * (o - t) for o, t in zip(output, y_list)]
+                 total_loss += sum((o - t) ** 2 for o, t in zip(output, y_list))
+
+                 # Backpropagate through layers
+                 for layer in reversed(self.layers):
+                     grad = layer.backward(grad, learning_rate)
+
+             if verbose and epoch % verbose == 0:
+                 print(f"Epoch {epoch}, Loss: {total_loss:.6f}")
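To make the update rule concrete, here is a self-contained miniature of the same per-sample loop (one linear neuron, squared-error loss), an illustrative sketch rather than the library itself, showing the loss shrinking under gradient descent:

```python
import random

random.seed(0)

# One linear neuron: y_hat = w1*x1 + w2*x2 + b, trained with the same
# per-sample rule as Dense.backward with a linear activation:
# w -= lr * 2*(y_hat - y) * x,  b -= lr * 2*(y_hat - y)
w = [random.uniform(-0.5, 0.5), random.uniform(-0.5, 0.5)]
b = 0.0
x_train = [[2, 8], [9, 3], [7, 4], [1, 1]]
y_train = [sum(p) for p in x_train]

def loss():
    # Total squared error over the training set
    return sum((w[0] * x[0] + w[1] * x[1] + b - y) ** 2
               for x, y in zip(x_train, y_train))

before = loss()
lr = 0.005
for _ in range(500):
    for x, y in zip(x_train, y_train):
        y_hat = w[0] * x[0] + w[1] * x[1] + b
        grad = 2 * (y_hat - y)      # d(loss)/d(y_hat) for this sample
        w[0] -= lr * grad * x[0]
        w[1] -= lr * grad * x[1]
        b -= lr * grad
after = loss()
print(f"loss before: {before:.4f}, after: {after:.6f}")
```

Since the targets are exactly `x1 + x2`, the weights converge toward `[1, 1]` with bias near 0, and the final loss is close to zero; the learning rate and sample values here are chosen so the per-sample updates stay stable.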
@@ -0,0 +1,12 @@
+ README.md
+ pyproject.toml
+ setup.cfg
+ monkey/__init__.py
+ monkey/activations.py
+ monkey/attention.py
+ monkey/nn.py
+ neural_monkey.egg-info/PKG-INFO
+ neural_monkey.egg-info/SOURCES.txt
+ neural_monkey.egg-info/dependency_links.txt
+ neural_monkey.egg-info/top_level.txt
+ tests/test.py
@@ -0,0 +1,3 @@
+ [build-system]
+ requires = ["setuptools>=42", "wheel"]
+ build-backend = "setuptools.build_meta"
@@ -0,0 +1,24 @@
+ [metadata]
+ name = neural-monkey
+ version = 0.2.0
+ author = Neunix Studios
+ author_email = neunixstudios@gmail.com
+ description = A neural network library for beginners.
+ long_description = file: README.md
+ long_description_content_type = text/markdown
+ url = https://github.com/19919rohit/Neural-Monkey
+ classifiers =
+     Programming Language :: Python :: 3
+     License :: OSI Approved :: MIT License
+     Operating System :: OS Independent
+
+ [options]
+ packages = find:
+ python_requires = >=3.7
+
+ [options.packages.find]
+
+ [egg_info]
+ tag_build =
+ tag_date = 0
+
@@ -0,0 +1,22 @@
+ from monkey.nn import NeuralNet
+
+ # Training data: simple sums
+ x_train = [[0, 0], [0, 1], [1, 0], [1, 1], [2, 3], [3, 4], [5, 2], [6, 1]]
+ y_train = [[sum(pair)] for pair in x_train]
+
+ # Create the neural network
+ nn = NeuralNet(input_size=2)
+
+ # Add one hidden layer with 5 neurons, ReLU activation
+ nn.add_layer(neurons=5, activation='relu')
+
+ # Output layer with 1 neuron (the sum); ReLU works here since sums are non-negative
+ nn.add_layer(neurons=1, activation='relu', layer='output')
+
+ # Train the network
+ nn.train(x_train, y_train, epochs=1000, lr=0.01)
+
+ # Test predictions
+ test_cases = [[1, 2], [3, 5], [4, 4], [6, 7]]
+ for case in test_cases:
+     print(f"Prediction for {case}: {nn.predict(case)[0]}")