Seed Programming

Industrial Artificial Intelligence

Fee: PKR 12,000 per month

Registration Fee: PKR 1,000 (non-refundable)

In Collaboration with Information Technology University (ITU)

Who can join this course?

If you're at the beginning of a BS degree in Computer Science, Software Engineering, Data Science, Machine Learning, or Artificial Intelligence at any university or affiliated college in Lahore or its vicinity, this is an excellent opportunity to build a strong foundation for your journey in this field.

About Course

Applied AI Engineering: From Probability to Generative Models & LLM Systems is a 6-month intensive, hands-on program designed to take you from core mathematical and algorithmic foundations to deploying state-of-the-art AI systems, including Generative Models, Transformers, and Large Language Models (LLMs).
You'll master the essentials of linear algebra, calculus, probability, optimization, and neural networks, then progress to building generative models (VAEs, diffusion), training transformers, and fine-tuning LLMs. Along the way you'll practice mathematical derivations, implement models in NumPy and PyTorch, benchmark performance, and deploy fully functional AI applications. By the end, you'll be ready to build, optimize, and deploy AI systems and to work as an ML / Generative AI Engineer in real-world environments.

Course Outline

Month 1: Linear Algebra + Calculus for ML (Compressed & Focused)
Week 1: Vectors, Geometry, Matrix Operations

Math: Vector spaces, norms, dot product, matrix multiplication, linear transformations

NumPy Lab: Matrix multiplication from scratch, KNN classifier, 2D visualization

Torch Lab: GPU vs CPU tensors, batch operations & broadcasting

Mini Project: Implement KNN fully in NumPy
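
To give a flavour of the Week 1 mini project, here is one way a KNN classifier could look in plain NumPy (the function name and majority-vote rule are illustrative, not the course's reference solution):

```python
import numpy as np

def knn_predict(X_train, y_train, X_test, k=3):
    """Classify each test point by majority vote among its k nearest training points."""
    # Pairwise Euclidean distances, shape (n_test, n_train), via broadcasting
    d = np.linalg.norm(X_test[:, None, :] - X_train[None, :, :], axis=2)
    # Indices of the k closest training points for each test point
    nn = np.argsort(d, axis=1)[:, :k]
    # Majority vote over the neighbours' labels
    return np.array([np.bincount(y_train[row]).argmax() for row in nn])
```

The broadcasting trick in the distance computation is exactly the kind of vectorized thinking the NumPy labs aim to build.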

Week 2: Eigenvalues, SVD, PCA

Math: Eigen decomposition, SVD, PCA derivation, low-rank approximation

NumPy Lab: PCA manually, image compression

Torch Lab: PCA on high-dimensional data, compare reconstruction errors
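
The manual PCA from the Week 2 NumPy lab can be sketched via SVD of the centered data (a minimal version; the lab's actual interface may differ):

```python
import numpy as np

def pca(X, n_components):
    """Project X onto its top principal components using SVD of the centered data."""
    mu = X.mean(axis=0)
    Xc = X - mu
    # Rows of Vt are principal directions, ordered by singular value
    U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
    W = Vt[:n_components].T          # (d, k) projection matrix
    Z = Xc @ W                       # low-dimensional codes
    X_rec = Z @ W.T + mu             # reconstruction from the codes
    return Z, X_rec
```

Keeping all components reconstructs the data exactly; dropping components trades reconstruction error for compression, which is the idea behind the image-compression exercise.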

Week 3: Multivariable Calculus & Gradients

Math: Partial derivatives, gradients, Jacobian, chain rule, backprop idea

NumPy Lab: Compute gradients numerically, gradient descent, 3D loss visualization

Torch Lab: Autograd exploration, manual vs autograd comparison
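
The numerical-gradient idea from the Week 3 NumPy lab can be sketched with central differences, checked against a gradient we know analytically (the quadratic here is just an illustrative test function):

```python
import numpy as np

def numerical_grad(f, x, eps=1e-5):
    """Central-difference estimate of the gradient of f at x."""
    g = np.zeros_like(x)
    for i in range(x.size):
        e = np.zeros_like(x)
        e.flat[i] = eps
        g.flat[i] = (f(x + e) - f(x - e)) / (2 * eps)
    return g

# Check against the analytic gradient of f(x) = x . x, which is 2x
x = np.array([1.0, -2.0, 3.0])
approx = numerical_grad(lambda v: v @ v, x)
```

The same comparison, done against PyTorch's autograd instead of a hand-derived gradient, is the substance of the Torch lab.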

Week 4: Optimization Theory

Math: Convexity, SGD, Momentum, Adam, convergence

NumPy Lab: Implement optimizers, compare convergence rates

Torch Lab: Custom optimizer, learning rate scheduling
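
As a taste of the Week 4 optimizer lab, heavy-ball momentum can be implemented in a few lines (the hyperparameters and the 1-D quadratic below are illustrative choices):

```python
import numpy as np

def minimize(grad, x0, lr=0.1, beta=0.9, steps=200):
    """Gradient descent with momentum: accumulate a velocity, then step along it."""
    x = x0.astype(float)
    v = np.zeros_like(x)
    for _ in range(steps):
        v = beta * v - lr * grad(x)   # velocity update
        x = x + v
    return x

# Minimize f(x) = (x - 3)^2, whose gradient is 2(x - 3); the minimum is at x = 3
x_star = minimize(lambda x: 2 * (x - 3), np.array([0.0]))
```

Comparing convergence of plain SGD, momentum, and Adam on the same loss surface is the core of the convergence-rate comparison.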

Month 2: Probability & Density Estimation
Week 5: Probability Foundations

Random variables, PMF vs PDF, expectation, variance, covariance, Law of Large Numbers

NumPy Lab: Monte Carlo simulations

Torch Lab: torch.distributions exploration
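
The Monte Carlo lab in Week 5 can be previewed with the classic estimate of π: sample uniform points in a square and count the fraction landing inside the unit circle (the sample size and seed are arbitrary):

```python
import numpy as np

# Monte Carlo estimate of pi from the fraction of points inside the unit circle
rng = np.random.default_rng(42)
pts = rng.uniform(-1, 1, size=(200_000, 2))
inside = (pts ** 2).sum(axis=1) <= 1.0
pi_hat = 4 * inside.mean()   # area ratio circle/square = pi/4
```

The Law of Large Numbers from the same week is what guarantees this estimate tightens as the sample count grows.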

Week 6: Gaussian & Multivariate Distributions

Multivariate Gaussian, covariance matrix geometry, likelihood function

NumPy Lab: PDF implementation, contour plots, log-likelihood

Torch Lab: Fit Gaussian via gradient descent
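
The Week 6 PDF implementation amounts to writing out the multivariate Gaussian log-density; a minimal sketch (using `slogdet` and `solve` for numerical stability):

```python
import numpy as np

def gaussian_logpdf(x, mu, Sigma):
    """Log-density of N(mu, Sigma), written out term by term from the formula."""
    d = mu.size
    diff = x - mu
    sign, logdet = np.linalg.slogdet(Sigma)            # log |Sigma|
    quad = diff @ np.linalg.solve(Sigma, diff)          # (x-mu)^T Sigma^{-1} (x-mu)
    return -0.5 * (d * np.log(2 * np.pi) + logdet + quad)
```

Summing this over a dataset gives the log-likelihood used in the contour-plot and gradient-descent fitting exercises.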

Week 7: Maximum Likelihood Estimation

MLE derivation, linear & logistic regression

NumPy Lab: Closed-form linear regression, logistic regression from scratch

Torch Lab: Build training loop, loss functions, checkpointing
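
The closed-form part of Week 7 is the normal equations; a minimal NumPy sketch (with an explicit bias column, solved without forming the matrix inverse):

```python
import numpy as np

def fit_linear(X, y):
    """Ordinary least squares via the normal equations, with an intercept term."""
    Xb = np.hstack([X, np.ones((X.shape[0], 1))])   # append a bias column
    # Solve (X^T X) w = X^T y rather than inverting X^T X explicitly
    return np.linalg.solve(Xb.T @ Xb, Xb.T @ y)
```

Deriving this solution from the MLE of a Gaussian noise model is the mathematical half of the week; logistic regression, which has no closed form, motivates the gradient-based training loop in the Torch lab.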

Week 8: Latent Variables & EM

Mixture models, EM algorithm, GMM as density estimator

NumPy Lab: EM implementation, clustering visualization

Torch Lab: Log-sum-exp trick, stable implementation
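
The log-sum-exp trick from the Week 8 Torch lab fits in three lines: shift by the maximum before exponentiating so large log-probabilities don't overflow:

```python
import numpy as np

def logsumexp(a):
    """Numerically stable log(sum(exp(a))): subtract the max before exponentiating."""
    m = np.max(a)
    return m + np.log(np.sum(np.exp(a - m)))
```

A naive `np.log(np.sum(np.exp(a)))` overflows to `inf` once entries of `a` exceed roughly 700, which is exactly the failure mode the EM implementation has to avoid when summing mixture-component likelihoods.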

Month 3: Neural Networks & Generalization
Week 9: Neural Networks from Scratch

Backpropagation, activation functions, loss functions

NumPy & Torch Labs
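
A condensed sketch of what the Week 9 backprop lab involves: one hidden layer, a forward pass, and gradients derived by the chain rule (the tanh activation, squared-error loss, and layer sizes are illustrative choices, not the lab's exact specification):

```python
import numpy as np

def forward_backward(x, y, W1, W2):
    """One-hidden-layer net with tanh and squared-error loss; manual backprop."""
    h = np.tanh(W1 @ x)                      # hidden activations
    y_hat = W2 @ h                           # linear output layer
    loss = 0.5 * np.sum((y_hat - y) ** 2)
    # Backward pass: chain rule, layer by layer
    d_yhat = y_hat - y                       # dL/dy_hat
    dW2 = np.outer(d_yhat, h)
    d_h = W2.T @ d_yhat
    dW1 = np.outer(d_h * (1 - h ** 2), x)    # tanh'(z) = 1 - tanh(z)^2
    return loss, dW1, dW2
```

Verifying these gradients against finite differences, then reproducing the whole thing with `torch.autograd`, is the point of pairing the NumPy and Torch labs.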

Week 10: Regularization & Generalization

Bias-variance, L1/L2, dropout, batch norm

Labs: Overfitting experiments, regularization comparison

Week 11: CNNs

Manual convolutions, Torch CNN for CIFAR/MNIST, train & evaluate

Week 12: Density Estimation with Neural Networks

Autoregressive models, Normalizing flows, Energy-based models

Month 4: Generative AI
Week 13: Variational Autoencoders

ELBO derivation, KL divergence, reparameterization trick

NumPy & Torch Labs: Visualize latent space, full VAE implementation
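
The reparameterization trick at the heart of the Week 13 VAE lab is tiny in code: sample z = mu + sigma * eps so the randomness is external to the parameters and gradients can flow through mu and sigma (shown here in NumPy for clarity; the lab itself would use PyTorch):

```python
import numpy as np

def reparameterize(mu, log_var, rng):
    """Sample z ~ N(mu, sigma^2) as mu + sigma * eps, with eps ~ N(0, 1)."""
    eps = rng.standard_normal(np.shape(mu))
    return mu + np.exp(0.5 * log_var) * eps   # sigma = exp(log_var / 2)
```

Parameterizing the variance as `log_var` keeps sigma positive without constraints, a convention the ELBO derivation in the same week relies on.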

Week 14: Diffusion Models

Forward & reverse process, noise prediction objective

Torch Lab: Train simple diffusion, visualize denoising
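
The forward (noising) process from Week 14 has a closed form that the lab builds on; a minimal NumPy sketch, where `alpha_bar_t` is the cumulative noise-schedule product at step t (notation follows the standard DDPM formulation):

```python
import numpy as np

def q_sample(x0, alpha_bar_t, rng):
    """Forward process: x_t = sqrt(abar_t) * x0 + sqrt(1 - abar_t) * eps."""
    eps = rng.standard_normal(np.shape(x0))
    x_t = np.sqrt(alpha_bar_t) * x0 + np.sqrt(1 - alpha_bar_t) * eps
    return x_t, eps
```

The noise-prediction objective then trains a network to recover `eps` from `x_t` and `t`; the reverse (denoising) process inverts this one step at a time.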

Week 15: Conditional Generation

Conditional VAE/diffusion, classifier-free guidance

Week 16: Evaluation of Generative Models

FID score, likelihood vs perceptual quality, mode collapse

Month 5: Transformers & Deep Learning Systems
Week 17: Attention Mechanism

Scaled dot-product attention, multi-head attention
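
Scaled dot-product attention, the core of Week 17, fits in a few NumPy lines (single head, no masking, shown without batching for clarity):

```python
import numpy as np

def attention(Q, K, V):
    """softmax(Q K^T / sqrt(d_k)) V  -- scaled dot-product attention."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                   # similarity of queries to keys
    scores -= scores.max(axis=-1, keepdims=True)      # stabilize the softmax
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)    # rows sum to 1
    return weights @ V, weights
```

Multi-head attention runs several of these in parallel on learned projections of Q, K, and V and concatenates the results, which is where the week goes next.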

Week 18: Transformer Architecture

Encoder, decoder, positional encoding, mini transformer lab

Week 19: Training at Scale

Mixed precision, gradient clipping, checkpointing, experiment tracking

Week 20: Deployment

TorchScript, ONNX, FastAPI, Docker; deploy transformer as REST API

Month 6: Large Language Models (LLM Engineering)
Week 21: LLM Internals

Causal LM, tokenization, scaling laws

Lab: Train tiny GPT from scratch

Week 22: Fine-Tuning LLMs

LoRA, QLoRA, instruction tuning, task-specific chatbot

Week 23: RAG Systems

Embeddings, vector DBs, hybrid search, FAISS lab
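
The retrieval step at the center of the Week 23 RAG lab is, at its simplest, cosine similarity between a query embedding and document embeddings. A brute-force NumPy stand-in for what FAISS does at scale (the toy dimensionality and `top_k` name are illustrative):

```python
import numpy as np

def top_k(query, doc_vecs, k=2):
    """Indices of the k document embeddings most cosine-similar to the query."""
    q = query / np.linalg.norm(query)
    D = doc_vecs / np.linalg.norm(doc_vecs, axis=1, keepdims=True)
    sims = D @ q                      # cosine similarity against every document
    return np.argsort(-sims)[:k]      # best matches first
```

FAISS replaces this exhaustive scan with approximate nearest-neighbour indexes, and hybrid search blends these dense scores with keyword-based ones.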

Week 24: LLM Deployment & Optimization

Quantization, vLLM, API deployment, monitoring lab

Recorded Lectures

With lifetime access to our lecture content, you can revisit and refresh your concepts at your convenience.