| Topic | 2025 | 2024 | Practice | Count | Priority |
|---|---|---|---|---|---|
| Bias-Variance / Design Choices | Q2 (3m) | Q2 (6m) | Q2+Q3 (11m) | 4 | MUST |
| CNN Calculations | Q6 (4m) | Q6 (4m) | Q7 (5m) | 3 | MUST |
| Transformer / Attention | Q5 (4m) | Q5 (4m) | Q6 (4m) | 3 | MUST |
| Data Preprocessing | Q1 (2m) | Q1 (4m) | Q1 (5m) | 3 | MUST |
| Learning Rate / Optimizers | Q4 (4m) | Q4 (4m) | — | 2 | HIGH |
| Confusion Matrix Metrics | — | Q3 (4m) | Q4 (3m) | 2 | HIGH |
| Activation Functions | Q3 (3m) | — | — | 1 | MED |
| RNN vs Transformer | — | Q5 (4m) | — | 1 | MED |
| DNN Training Challenges | — | Q7 (4m) | — | 1 | MED |
| Batch Normalisation | — | — | Q5 (5m) | 1 | MED |
| Priority | Rule | Your Action |
|---|---|---|
| MUST | Every exam, >= 3 appearances | Master completely. Can explain on a whiteboard from memory. |
| HIGH | 2 out of 3 exams | Understand well. Can calculate and explain. |
| MED | 1 out of 3 exams | Know the key points. Can write 3-4 sentences if asked. |
**Bias-Variance / Design Choices**
- Diagnose overfitting vs underfitting from numbers/curves
- For each proposed fix, answer YES/NO and link it to the specific diagnosis
- Never confuse the two: regularisation fights overfitting, NOT underfitting
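The diagnosis rule can be sketched in a few lines. The accuracy thresholds (`gap_tol`, `low_tol`) are illustrative assumptions, not values from the course:

```python
def diagnose(train_acc, val_acc, gap_tol=0.10, low_tol=0.70):
    """Rough diagnosis from final train/validation accuracy.

    Thresholds are illustrative choices; in an exam, read them off
    the curves or numbers given in the question.
    """
    if train_acc < low_tol:
        return "underfitting"   # high bias: poor even on the training data
    if train_acc - val_acc > gap_tol:
        return "overfitting"    # high variance: large train/validation gap
    return "ok"

# train 0.99, val 0.75 -> large gap -> overfitting (regularisation helps here)
# train 0.60, val 0.58 -> low everywhere -> underfitting (more capacity helps)
```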
**CNN Calculations**
- Two formulas: conv output and pool output
- Practice multi-layer pipeline calculations
- Know valid vs same padding
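The two formulas above, as a quick self-check (the 32×32 pipeline is an illustrative example, not a past-exam question):

```python
def conv_out(n, f, p=0, s=1):
    # output size = floor((n + 2p - f) / s) + 1
    # valid padding: p = 0; same padding (stride 1): p = (f - 1) // 2
    return (n + 2 * p - f) // s + 1

def pool_out(n, f, s=None):
    # pooling uses the same formula with no padding; stride defaults to f
    s = f if s is None else s
    return (n - f) // s + 1

# example pipeline: 32x32 input -> 5x5 conv (valid, stride 1) -> 2x2 max pool
n = conv_out(32, 5)   # 28
n = pool_out(n, 2)    # 14
```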
**Transformer / Attention**
- Masked attention = prevent seeing future tokens
- Multi-head attention = multiple perspectives simultaneously
- ViT: patches → embeddings → [CLS] token → classification
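A minimal sketch of what the causal mask does to the attention weights, assuming NumPy and raw (T, T) attention logits; real implementations apply this inside scaled dot-product attention:

```python
import numpy as np

def causal_attention(scores):
    """Masked softmax: token i may only attend to positions j <= i."""
    T = scores.shape[0]
    mask = np.triu(np.ones((T, T), dtype=bool), k=1)  # True above the diagonal
    masked = np.where(mask, -np.inf, scores)          # block future tokens
    e = np.exp(masked - masked.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

w = causal_attention(np.zeros((3, 3)))
# row 0 puts all weight on token 0; row 2 spreads weight over tokens 0..2
```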
**Data Preprocessing**
- Which imputation for which data type
- When to remove an attribute vs impute it
- Read a pipeline → infer the data characteristics
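The "which imputation for which data type" rule, sketched with the standard library (the toy data is made up for illustration):

```python
from statistics import median, mode

ages = [25, None, 40, 33]                 # numeric attribute with a gap
colours = ["red", "blue", None, "red"]    # categorical attribute with a gap

# numeric -> impute with the median (robust to outliers; mean also common)
med = median(a for a in ages if a is not None)
ages = [med if a is None else a for a in ages]

# categorical -> impute with the mode (most frequent category)
top = mode(c for c in colours if c is not None)
colours = [top if c is None else c for c in colours]
```

If most values of an attribute are missing, removing the attribute usually beats imputing it.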
**Learning Rate** — curve shapes, momentum, LR schedules
**Confusion Matrix** — calculate accuracy/precision/recall; spot class-imbalance traps
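Two common LR schedules as formulas you can evaluate by hand (the hyperparameter values are illustrative, not from the course):

```python
import math

def step_decay(lr0, epoch, drop=0.5, every=10):
    # multiply the LR by `drop` every `every` epochs
    return lr0 * drop ** (epoch // every)

def cosine_decay(lr0, epoch, total):
    # smooth anneal from lr0 down to 0 over `total` epochs
    return lr0 * 0.5 * (1 + math.cos(math.pi * epoch / total))

step_decay(0.1, 20)         # 0.1 * 0.5**2 = 0.025
cosine_decay(0.1, 50, 100)  # halfway point: ~0.05
```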
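The three metrics and the classic imbalance trap in one snippet (the counts are an illustrative example):

```python
def metrics(tp, fp, fn, tn):
    total = tp + fp + fn + tn
    accuracy  = (tp + tn) / total
    precision = tp / (tp + fp) if tp + fp else 0.0  # guard empty denominator
    recall    = tp / (tp + fn) if tp + fn else 0.0
    return accuracy, precision, recall

# imbalance trap: 10 positives, 90 negatives, model predicts ALL negative:
# tp=0, fp=0, fn=10, tn=90 -> accuracy 0.90 but recall 0.0 on the positives
acc, prec, rec = metrics(0, 0, 10, 90)
```

High accuracy alone proves nothing under class imbalance; always check recall on the minority class.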
```
Bias-Variance/DC    ████████████████████ 20 marks
CNN                 █████████████        13 marks
Transformer         ████████████         12 marks
Data Preprocessing  ███████████          11 marks
Learning Rate       ████████              8 marks
Eval Metrics        ███████               7 marks
Batch Norm          █████                 5 marks
DNN Training        ████                  4 marks
RNN                 ████                  4 marks
Activation Func     ███                   3 marks
```