A single neuron
Every neural network is built from this one operation: multiply each input by a weight, sum the products, add a bias, then apply an activation function.
[Interactive diagram: inputs 2.0, -1.0, 3.0 are multiplied by weights 0.5, 0.8, -0.3, summed, and the bias 0.1 is added: 2.0·0.5 + (-1.0)·0.8 + 3.0·(-0.3) + 0.1 = -0.6. The result is then passed through the activation to give the output.]
Formula: output = activation(x₁·w₁ + x₂·w₂ + x₃·w₃ + bias)

This is the fundamental building block. A neural network is just many of these connected together.
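In code, the whole operation is only a few lines. A minimal sketch in Python, assuming a sigmoid activation (the text does not specify which activation function is used):

```python
import math

def neuron(inputs, weights, bias):
    # Weighted sum: x1*w1 + x2*w2 + x3*w3 + bias
    z = sum(x * w for x, w in zip(inputs, weights)) + bias
    # Sigmoid activation (an assumption; any activation fits the formula)
    return 1 / (1 + math.exp(-z))

# Values from the diagram: inputs 2.0, -1.0, 3.0; weights 0.5, 0.8, -0.3; bias 0.1
out = neuron([2.0, -1.0, 3.0], [0.5, 0.8, -0.3], 0.1)
print(round(out, 3))  # weighted sum is -0.6; sigmoid(-0.6) ≈ 0.354
```

Swapping in a different activation only changes the last line of `neuron`; the weighted sum is the same everywhere.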
Think Deeper
Try this:
Set all weights to 0. What does the neuron output for any input? Why is this a problem?
With all weights at 0, the weighted sum is always 0, so the output is activation(0 + bias) = activation(bias): a constant, no matter what the input is. The neuron ignores its input entirely, and if every neuron in a layer starts this way, they all compute the same thing and receive the same updates. This symmetry problem is why weight initialisation matters.
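The constant-output behaviour is easy to check directly. A quick sketch, again assuming a sigmoid activation:

```python
import math

def neuron(inputs, weights, bias):
    z = sum(x * w for x, w in zip(inputs, weights)) + bias
    return 1 / (1 + math.exp(-z))  # sigmoid (an assumed activation)

# With all weights at 0, wildly different inputs give the identical output:
w = [0.0, 0.0, 0.0]
a = neuron([2.0, -1.0, 3.0], w, 0.1)
b = neuron([100.0, -50.0, 7.0], w, 0.1)
print(a == b)  # True: both are just sigmoid(0.1)
```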
Cybersecurity tie-in: each input could be a log feature, such as bytes_per_second, port_risk_score, or packet_rate. The weights learn how much each feature matters for detecting attacks: a high weight on bytes_per_second means the neuron pays close attention to transfer speed.
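That idea can be sketched with a single neuron scoring log features. All feature values and weights below are hypothetical, purely for illustration:

```python
import math

# Hypothetical log features (scaled to comparable ranges) and learned weights
features = {"bytes_per_second": 0.92, "port_risk_score": 0.30, "packet_rate": 0.55}
weights  = {"bytes_per_second": 2.10, "port_risk_score": 0.40, "packet_rate": 0.70}
bias = -1.5

# Same neuron operation as before: weighted sum + bias, then activation
z = sum(features[k] * weights[k] for k in features) + bias
score = 1 / (1 + math.exp(-z))  # sigmoid squashes the sum into a 0-1 "suspicion score"
print(round(score, 2))
```

Because bytes_per_second carries the largest weight here, raising that one feature moves the score far more than raising the others, which is exactly what "the neuron pays attention to transfer speed" means.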