Step 5: The Training Loop

Epochs, batches, and loss curves


The training loop

Training repeats a cycle: forward pass → compute loss → backward pass (compute gradients) → update weights → repeat. Each full pass through the entire training set is one epoch; within an epoch, the data is processed in mini-batches, with one weight update per batch.
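The cycle above can be sketched as a minimal training loop. This is a hand-rolled example using NumPy and a toy linear model (the dataset, learning rate, and batch size here are made-up illustration values, not anything from a real framework):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy dataset: y = 3x + 1 plus noise (hypothetical, for illustration only)
X = rng.uniform(-1, 1, size=(256, 1))
y = 3 * X[:, 0] + 1 + rng.normal(0, 0.1, size=256)

w, b = 0.0, 0.0           # weights to learn
lr, batch_size = 0.1, 32  # learning rate and batch size

for epoch in range(20):                      # one epoch = one full pass over the data
    idx = rng.permutation(len(X))            # shuffle so batches differ each epoch
    for start in range(0, len(X), batch_size):
        batch = idx[start:start + batch_size]
        xb, yb = X[batch, 0], y[batch]
        pred = w * xb + b                    # forward pass
        err = pred - yb
        loss = np.mean(err ** 2)             # compute loss (mean squared error)
        grad_w = 2 * np.mean(err * xb)       # backward pass (gradients)
        grad_b = 2 * np.mean(err)
        w -= lr * grad_w                     # update weights
        b -= lr * grad_b
    print(f"epoch {epoch + 1}: loss={loss:.4f}")
```

After a few epochs, `w` and `b` settle near the true values 3 and 1; the same four-step cycle is what frameworks like PyTorch run for you, just with many more weights.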

[Interactive chart: per-epoch Loss, Train Acc, and Val Acc]

Think Deeper

Watch the loss curve. Does it ever go UP during training? Why?

Yes — loss on individual batches can go up, because each batch is a small random sample of the data and some batches are just harder than others. What matters is the trend, which should be downward. If the overall trend goes up, the most common culprit is a learning rate that is too high.
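A common way to see the trend through that batch-to-batch noise is to smooth the loss curve with a moving average. A minimal sketch (the per-batch loss values below are synthetic, generated just to illustrate the effect):

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic per-batch losses: a downward trend plus batch-to-batch noise
steps = np.arange(300)
batch_loss = 2.0 * np.exp(-steps / 100) + rng.normal(0, 0.15, size=300)

# Smooth with a moving average over the last 25 batches
window = 25
smoothed = np.convolve(batch_loss, np.ones(window) / window, mode="valid")

# Raw losses jump around; the smoothed curve reveals the downward trend
print(f"raw step-to-step change (std):      {np.std(np.diff(batch_loss)):.3f}")
print(f"smoothed step-to-step change (std): {np.std(np.diff(smoothed)):.3f}")
```

The raw curve often rises from one batch to the next, but the smoothed curve falls steadily — the same reason training dashboards usually plot a smoothed loss.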
Cybersecurity tie-in: Each epoch is one pass through all your training logs. With 100,000 firewall entries and batch_size=32, each epoch makes ~3,125 weight updates. More epochs = more chances to learn, but also more risk of memorising noise.
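The arithmetic behind that "~3,125 weight updates" figure is just the dataset size divided by the batch size, rounded up because the last batch may be smaller:

```python
import math

n_samples = 100_000   # firewall log entries (from the tie-in above)
batch_size = 32

# One weight update per batch; the last batch may be partial, so round up
updates_per_epoch = math.ceil(n_samples / batch_size)
print(updates_per_epoch)  # → 3125
```

Multiply by the number of epochs to get the total update count — e.g. 10 epochs here would mean 31,250 updates.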
