Why does this simple neural network not learn? - Printable Version

+- Python Forum (https://python-forum.io)
+-- Forum: Python Coding (https://python-forum.io/forum-7.html)
+--- Forum: Data Science (https://python-forum.io/forum-44.html)
+--- Thread: Why does this simple neural network not learn? (/thread-20798.html)
Why does this simple neural network not learn? - PythonIsGreat - Aug-30-2019

Hey everyone,

this small perceptron is trained with the delta rule but does not make much progress while learning. The model should solve the simple OR problem.

```python
import random as r

def train(rounds):
    weight_one = 0.5
    weight_two = 0.8
    learning_rate = 0.1
    for i in range(rounds):
        input_one = r.randint(0, 1)
        input_two = r.randint(0, 1)
        # target for OR: 0 only when both inputs are 0
        if input_one == 0 and input_two == 0:
            wish_output = 0
        else:
            wish_output = 1
        output = input_one * weight_one + input_two * weight_two
        print(i, ' X1: ', input_one, ' X2: ', input_two, ' output = ', output)
        error = wish_output - output
        # delta rule: adjust each weight in proportion to error and input
        delta_weight_one = learning_rate * error * input_one
        delta_weight_two = learning_rate * error * input_two
        weight_one = weight_one + delta_weight_one
        weight_two = weight_two + delta_weight_two

train(99999999)
```

I hope someone finds my mistake.

RE: Why does this simple neural network not learn? - ThomasL - Aug-30-2019

You forgot the bias weight.

```python
import random

LR = 0.1
OR_GATE = {(0, 0): 0, (0, 1): 1, (1, 0): 1, (1, 1): 1}

def train(rounds):
    # bias and weights start at random values
    b = random.random()
    w1 = random.random()
    w2 = random.random()
    for i in range(rounds):
        input_one = random.randint(0, 1)
        input_two = random.randint(0, 1)
        wish_output = OR_GATE[(input_one, input_two)]
        output = b + input_one * w1 + input_two * w2
        error = wish_output - output
        # the bias is updated like a weight on a constant input of 1
        delta_b = LR * error
        delta_w1 = LR * error * input_one
        delta_w2 = LR * error * input_two
        b = b + delta_b
        w1 = w1 + delta_w1
        w2 = w2 + delta_w2
    for x1, x2 in [(0, 0), (0, 1), (1, 0), (1, 1)]:
        print(x1, x2, b + x1 * w1 + x2 * w2)

train(100000)
```

The output looks good. You still need to define a threshold, e.g. 0.5: if the output is below it, the prediction is 0; if it is above, the prediction is 1. That's it.
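To make the thresholding step concrete, here is a minimal sketch of the same biased perceptron with a `predict` helper added; the helper, the fixed seed, and the `threshold` parameter are my own additions for illustration, not part of the original replies:

```python
import random

LR = 0.1
OR_GATE = {(0, 0): 0, (0, 1): 1, (1, 0): 1, (1, 1): 1}

def train(rounds, seed=42):
    # fixed seed (an assumption, for reproducibility) for weights and samples
    rng = random.Random(seed)
    b, w1, w2 = rng.random(), rng.random(), rng.random()
    for _ in range(rounds):
        x1 = rng.randint(0, 1)
        x2 = rng.randint(0, 1)
        error = OR_GATE[(x1, x2)] - (b + x1 * w1 + x2 * w2)
        # delta rule updates; the bias acts as a weight on a constant input of 1
        b += LR * error
        w1 += LR * error * x1
        w2 += LR * error * x2
    return b, w1, w2

def predict(b, w1, w2, x1, x2, threshold=0.5):
    # step function: threshold the raw linear output at 0.5
    return 1 if b + x1 * w1 + x2 * w2 > threshold else 0

b, w1, w2 = train(100000)
for x1, x2 in OR_GATE:
    print(x1, x2, predict(b, w1, w2, x1, x2))
```

With the bias, the least-squares solution the delta rule drifts toward gives raw outputs of roughly 0.25 for (0,0) and 0.75 or more for the other three inputs, so a 0.5 threshold separates the classes cleanly.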