Python Forum

Full Version: program error
Dear all, I tried to run a program that I found on the internet, but it fails with an error. Here are the program lines:
import numpy as np
def sigmoid(x):
    # Our activation function: f(x) = 1 / (1 + e^(-x))
    return 1 / (1 + np.exp(-x))
class Neuron:
    def __init__(self, weights, bias):
        self.weights = weights
        self.bias = bias
def feedforward(self, inputs):
     # Weight inputs, add bias, then use the activation function
     total = np.dot(self.weights, inputs) + self.bias
     return sigmoid(total)
weights = np.array([0, 1]) # w1 = 0, w2 = 1
bias = 4                   # b = 4
n = Neuron(weights, bias)
x = np.array([2, 3])       # x1 = 2, x2 = 3
print(n.feedforward(x))    # 0.9990889488055994
---------------------------------------------------------------------------
Error:
AttributeError                            Traceback (most recent call last)
<ipython-input-6-50a424dd081a> in <module>
      1 x = np.array([2, 3])       # x1 = 2, x2 = 3
----> 2 print(n.feedforward(x))    # 0.9990889488055994

AttributeError: 'Neuron' object has no attribute 'feedforward'
Can anyone help me find the problem in the program?
Your indentation is off. You want lines 9-12 (the feedforward method) to be indented one more level so that they are part of the class.
Many thanks for your help. Sorry about the tag errors; I'm starting today and hope to improve in my next posts.
Regarding the problem you mentioned, I adjusted the indentation on lines 9-12, but the error message still appears.
I don't have much knowledge of Python programming, so maybe that's my biggest problem.
Could you support me?
Again, could you help me?
Not tested, but:
import numpy as np
def sigmoid(x):
    # Our activation function: f(x) = 1 / (1 + e^(-x))
    return 1 / (1 + np.exp(-x))
class Neuron:
    def __init__(self, weights, bias):
        self.weights = weights
        self.bias = bias
    def feedforward(self, inputs):
        # Weight inputs, add bias, then use the activation function
        total = np.dot(self.weights, inputs) + self.bias
        return sigmoid(total)
weights = np.array([0, 1]) # w1 = 0, w2 = 1
bias = 4                   # b = 4
n = Neuron(weights, bias)
x = np.array([2, 3])       # x1 = 2, x2 = 3
print(n.feedforward(x))    # 0.9990889488055994
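As a quick sanity check (not part of the original program), you can also reproduce the expected output by doing the feedforward computation by hand:

```python
import numpy as np

# Manual walk-through of n.feedforward(x):
# dot product: w1*x1 + w2*x2 = 0*2 + 1*3 = 3, plus bias 4 gives total = 7
total = np.dot(np.array([0, 1]), np.array([2, 3])) + 4

# sigmoid(7) = 1 / (1 + e^(-7))
output = 1 / (1 + np.exp(-total))
print(output)  # ≈ 0.9990889488055994
```

If this number matches what the class prints, the method is wired up correctly.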
If you get an error, always post the full traceback and the code that produces it.
It worked.
Many thanks.