
Neural Network Sigmoid Function

I'm trying to make a neural network and I have a couple of questions. My sigmoid function looks something like this:

```python
s = 1 / (1 + (2.7183 ** (-self.values)))
if s > self.weight:
    self.value  # (snippet truncated in the question)
```

Solution 1:

You are mashing together several different NN concepts.

The logistic function (the standard form of the sigmoid) already serves as a threshold. Specifically, it is a differentiable threshold, which is essential for the backpropagation learning algorithm. So you don't need that piecewise threshold function (the if statement).
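As a sketch, the logistic sigmoid and the derivative that backpropagation relies on (using `math.exp` rather than the hardcoded `2.7183` from the question) could look like this:

```python
import math

def sigmoid(x):
    """Logistic sigmoid: a smooth, differentiable threshold in (0, 1)."""
    return 1.0 / (1.0 + math.exp(-x))

def sigmoid_derivative(x):
    """Derivative of the sigmoid, s * (1 - s), used by backpropagation."""
    s = sigmoid(x)
    return s * (1.0 - s)
```

Because the output varies smoothly between 0 and 1, there is no need for a separate hard cutoff.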

The weights are analogues for synaptic strength and are applied during summation (or feedforward propagation). So each connection between a pair of nodes has a weight that is multiplied by the sending node's activation level (the output of the threshold function).
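A minimal sketch of that summation for a single node (the function and parameter names here are illustrative, not from the question's code):

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def node_output(inputs, weights, bias=0.0):
    """Multiply each sending node's activation by its connection weight,
    sum the results, then apply the sigmoid threshold."""
    total = sum(i * w for i, w in zip(inputs, weights)) + bias
    return sigmoid(total)
```

Note that the weight belongs to the connection and is applied *before* the threshold function, not compared against its output.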

Finally, even with these changes, a fully-connected neural network with all positive weights will probably still produce all 1's for the output: every weighted sum is positive, so every sigmoid output saturates toward 1. You can either include negative weights corresponding to inhibitory connections, or reduce connectivity significantly (e.g. a 0.1 probability that a node in layer n connects to a node in layer n+1).
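One way to sketch both fixes at once, assuming a simple list-of-lists weight matrix (the `connect_prob` parameter and function name are illustrative):

```python
import random

def init_weights(n_in, n_out, connect_prob=0.1):
    """Weight matrix with values drawn uniformly from [-1, 1] (so some
    connections are inhibitory); each connection only exists with
    probability connect_prob, the rest are zeroed out (no connection)."""
    return [[random.uniform(-1.0, 1.0) if random.random() < connect_prob else 0.0
             for _ in range(n_out)]
            for _ in range(n_in)]
```

With mixed-sign weights the summed input to a node can be negative as well as positive, so the sigmoid outputs spread across (0, 1) instead of clustering at 1.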
