Step 3: Build a Network

Layers, widths, and parameter counts


Build a network

Stack neurons into layers. Each layer's output feeds into the next. The total parameter count (weights + biases) determines model capacity.
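As a minimal sketch of "each layer's output feeds into the next", here is a hypothetical fully connected network in NumPy (the 10 → 32 → 16 → 1 shape matches the example explored below; weight scale and ReLU choice are assumptions, not something the lesson prescribes):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical architecture: 10 inputs -> 32 -> 16 -> 1 output
sizes = [10, 32, 16, 1]

# One weight matrix (n_in x n_out) and bias vector (n_out) per layer
params = [(rng.standard_normal((n_in, n_out)) * 0.1, np.zeros(n_out))
          for n_in, n_out in zip(sizes[:-1], sizes[1:])]

def forward(x, params):
    """Each layer's output feeds the next; ReLU between hidden layers."""
    for i, (W, b) in enumerate(params):
        x = x @ W + b
        if i < len(params) - 1:        # no activation on the output layer
            x = np.maximum(x, 0.0)
    return x

y = forward(rng.standard_normal((4, 10)), params)  # batch of 4 samples
print(y.shape)  # (4, 1): one prediction per sample
```

Every column of each weight matrix is one neuron; stacking matrices is stacking layers.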

[Interactive explorer: Input (10) → Layer 1 (32) → Layer 2 (16) → Output (1); hidden-layer widths are adjustable, with 0 switching a layer off]

Parameter count examples

Name     Architecture              Parameters
Tiny     10 → 16 → 1                      193
Small    10 → 32 → 16 → 1                 897
Medium   10 → 64 → 32 → 1               2,817
Large    10 → 128 → 64 → 32 → 1        11,777
Huge     10 → 256 → 128 → 64 → 1       44,033
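The counts above follow directly from the dense-layer formula (weights = n_in × n_out, biases = n_out per layer). A short sketch that reproduces the table:

```python
def param_count(sizes):
    """Weights (n_in * n_out) plus biases (n_out) for each dense layer."""
    return sum(n_in * n_out + n_out
               for n_in, n_out in zip(sizes[:-1], sizes[1:]))

for name, sizes in [("Tiny",   [10, 16, 1]),
                    ("Small",  [10, 32, 16, 1]),
                    ("Medium", [10, 64, 32, 1]),
                    ("Large",  [10, 128, 64, 32, 1]),
                    ("Huge",   [10, 256, 128, 64, 1])]:
    print(f"{name:6} {param_count(sizes):>7,}")
```

For Tiny, that is (10·16 + 16) + (16·1 + 1) = 176 + 17 = 193.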

Think Deeper

A network with 10 inputs and layers [256, 128, 64, 1] has 44,000+ parameters but only 2,000 training samples. What happens?

With far more parameters than training samples, severe overfitting is almost certain: the network memorises the training data instead of learning generalisable patterns. A common rule of thumb: keep the parameter count well below 10x your number of training samples.
Cybersecurity tie-in: Security datasets are often small (hundreds of labelled attacks, not millions). A network with 50,000 parameters trained on 2,000 samples will memorise the training data. Keep architectures small for tabular security data.
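One way to make that rule of thumb concrete is a quick budget check before training. This is a sketch of the heuristic stated above, not a substitute for a validation set; the function name and the `max_ratio` default are assumptions:

```python
def param_count(sizes):
    """Weights plus biases for a stack of dense layers."""
    return sum(n_in * n_out + n_out
               for n_in, n_out in zip(sizes[:-1], sizes[1:]))

def flag_overfit_risk(sizes, n_samples, max_ratio=10):
    """True if parameters exceed the rule-of-thumb budget of
    max_ratio * n_samples (hypothetical helper, heuristic only)."""
    return param_count(sizes) > max_ratio * n_samples

# The "Huge" network vs 2,000 labelled samples: 44,033 params, budget 20,000
print(flag_overfit_risk([10, 256, 128, 64, 1], 2_000))  # True  -> over budget
print(flag_overfit_risk([10, 32, 16, 1], 2_000))        # False -> 897 params is fine
```

For a typical small security dataset, only the smaller architectures in the table above pass this check.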
