Seed Programming

Mathematics For Computer Science

Fee: PKR 12,000 per month

Registration Fee: PKR 1,000 (non-refundable)

In Collaboration with Information Technology University (ITU)

Who can join this course?

If you're at the beginning of a BS degree in Computer Science, Software Engineering, Data Science, Machine Learning, or Artificial Intelligence at any university or affiliated college in Lahore or its vicinity, this is an excellent opportunity to build a strong foundation for your journey in this field.

About the Course

Mathematical Foundations for AI and Algorithmic Thinking is a comprehensive course designed to build the core mathematical intuition required for Artificial Intelligence, Machine Learning, and advanced algorithms.
Master the core pillars of AI: Discrete Mathematics, Calculus, Probability, and Linear Algebra, transformed from abstract formulas into practical tools for building intelligent technology. Explore the ideas behind intelligent systems through hands-on mathematical modeling that connects theory to neural networks, optimization, probabilistic learning, and high-dimensional data. This journey is designed to turn mathematical principles into clarity, confidence, and creative problem-solving, whether your path leads to Machine Learning, Data Science, competitive programming, or AI research.

Course Outline

Conceptual Foundations of Mathematics for AI

Month 1

Module 1: Discrete Mathematics for AI
Lecture 1: Logic, Proofs & Sets

Propositions & truth tables (AND, OR, XOR, …), logical implication & equivalence, quantifiers (∀, ∃), direct proof vs contradiction, arithmetic & geometric series proofs, prime numbers, AI applications (rule-based systems, dataset splits, feature sets)

Lecture 2: Mathematical Induction & Time Complexity

Mathematical induction proofs, recursion, Big O / Theta notation, basics of sets and relations (closure, transitivity, symmetry)

Lecture 3: Counting & Combinatorics

Counting principles, permutations & combinations, binomial theorem, inclusion-exclusion principle, AI applications (model capacity, token sequences, parameter search space)
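As a taste of the kind of computation this lecture covers, here is a short Python sketch of the counting tools listed above (the numbers are illustrative, not course material):

```python
import math

# Permutations and combinations: ways to arrange or choose k of n items.
n, k = 5, 2
perms = math.perm(n, k)   # ordered selections: 5 * 4 = 20
combs = math.comb(n, k)   # unordered selections: 10

# Binomial theorem check: the row sums of Pascal's triangle equal 2**n.
assert sum(math.comb(n, i) for i in range(n + 1)) == 2 ** n

# Inclusion-exclusion on two sets: |A ∪ B| = |A| + |B| - |A ∩ B|.
A, B = {1, 2, 3, 4}, {3, 4, 5}
assert len(A | B) == len(A) + len(B) - len(A & B)

print(perms, combs)  # 20 10
```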

Lecture 4: Graph Theory Foundations

Graphs, nodes, edges, directed vs undirected, adjacency matrices, paths & connectivity (DFS/BFS), DAGs & topological sorting, AI applications (neural networks, transformers attention graph, knowledge graphs)

Module 2: Probability & Statistics for AI
Lecture 1: Probability Fundamentals

Basic probability concepts, conditional probability, Bayes’ theorem, random variables, distributions (uniform, normal, binomial), AI applications (classification probabilities, feature likelihoods)
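To illustrate the Bayes' theorem material above, here is a minimal Python sketch of a hypothetical spam filter; all probabilities are made-up figures for demonstration only:

```python
# Bayes' theorem: P(spam | word) = P(word | spam) * P(spam) / P(word)
p_spam = 0.2                 # prior: 20% of mail is spam (assumed figure)
p_word_given_spam = 0.6      # the word appears in 60% of spam (assumed)
p_word_given_ham = 0.05      # and in 5% of legitimate mail (assumed)

# Total probability of seeing the word at all.
p_word = p_word_given_spam * p_spam + p_word_given_ham * (1 - p_spam)

# Posterior: probability the mail is spam, given the word was seen.
p_spam_given_word = p_word_given_spam * p_spam / p_word
print(round(p_spam_given_word, 2))  # 0.75
```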

Lecture 2: Descriptive Statistics & Data Summarization

Mean, median, mode, variance, standard deviation, skewness & kurtosis, data visualization, AI applications (dataset exploration, outlier detection, preprocessing)

Lecture 3: Inferential Statistics & Hypothesis Testing

Confidence intervals, t-tests, chi-square tests, p-values, correlation & regression, AI applications (model evaluation, A/B testing, significance of features)

Lecture 4: Multivariate Probability & Distributions

Joint, marginal, and conditional distributions, independence, covariance & correlation matrices, AI applications (Bayesian networks, multivariate Gaussian, feature interactions)

Month 2

Module 3: Linear Algebra for AI
Lecture 1: Vectors & Matrices

Vector operations, dot/cross product, matrix operations, identity & diagonal matrices, AI applications (data representation, embeddings, transformations)

Lecture 2: Matrix Decomposition & Inverse

Determinants, rank, inverse matrices, LU decomposition, eigenvalues & eigenvectors, AI applications (PCA, dimensionality reduction, covariance analysis)
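A small NumPy sketch of the eigenvalue material above, using a toy symmetric matrix (standing in for a covariance matrix, as in PCA):

```python
import numpy as np

# Eigen decomposition of a symmetric matrix: in PCA, the covariance
# matrix's eigenvectors give the principal directions of the data.
C = np.array([[2.0, 1.0],
              [1.0, 2.0]])

vals, vecs = np.linalg.eigh(C)   # eigh: solver for symmetric matrices
print(vals)                      # eigenvalues, ascending: [1. 3.]

# Check the defining property C v = lambda v for the largest eigenvalue.
v = vecs[:, -1]
assert np.allclose(C @ v, vals[-1] * v)
```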

Lecture 3: Vector Spaces & Transformations

Linear independence, basis, orthogonality, linear transformations, AI applications (feature space transformations, embeddings, attention mechanisms)

Lecture 4: Applications in AI

Matrix calculus, gradients, singular value decomposition (SVD), AI applications (neural networks, recommendation systems, latent factor models)

Module 4: Algorithms & Data Structures for AI
Lecture 1: Algorithm Basics & Complexity

Algorithm design principles, complexity analysis (Big O, Omega, Theta), recursion, AI applications (efficient search, optimization, model training)

Lecture 2: Arrays, Linked Lists & Stacks/Queues

Data structures fundamentals, dynamic arrays, linked lists, stacks & queues, AI applications (data buffering, sequential processing, memory-efficient storage)

Lecture 3: Trees, Graphs & Search Algorithms

Binary trees, BSTs, DAGs, DFS/BFS, shortest path algorithms, AI applications (knowledge graphs, decision trees, search problems)

Lecture 4: Advanced Topics & Optimization

Greedy algorithms, divide & conquer, dynamic programming, priority queues & heaps, AI applications (pathfinding, resource allocation, optimization problems)
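The priority-queue and pathfinding topics above come together in Dijkstra's shortest-path algorithm; a compact sketch on a made-up toy graph:

```python
import heapq

def dijkstra(graph, start):
    """Shortest distances from start, using a heap as a priority queue."""
    dist = {start: 0}
    heap = [(0, start)]
    while heap:
        d, u = heapq.heappop(heap)
        if d > dist.get(u, float("inf")):
            continue  # stale heap entry; a shorter path was already found
        for v, w in graph.get(u, []):
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                heapq.heappush(heap, (nd, v))
    return dist

# Toy graph: adjacency lists of (neighbor, edge weight) pairs.
g = {"A": [("B", 1), ("C", 4)], "B": [("C", 2)], "C": []}
print(dijkstra(g, "A"))  # {'A': 0, 'B': 1, 'C': 3}
```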

In-Depth Exploration of Mathematics for AI

Month 3

Discrete Mathematics for AI
Lecture 1: Foundations of Logic, Proofs & Sets

Core Concepts: Propositions, truth tables (AND, OR, XOR, NOT), logical implication, equivalence, quantifiers (∀, ∃), proof techniques, the √N divisor test for primality.

Intuition: Building the grammar of mathematics to define truth clearly.

AI/CS Link: Logical rules power rule-based systems and model assumptions.

Lecture 2: Mathematical Induction & Recurrence Relations

Core Concepts: Induction, recursion, arithmetic & geometric series.

Intuition: Domino-effect reasoning for infinite processes.

AI/CS Link: Used in proving correctness of recursive algorithms.

Lecture 3: Growth Rates & Computational Complexity

Core Concepts: Big O, Theta, Omega, complexity classes.

Intuition: How algorithms scale with input size.

AI/CS Link: Determines efficiency of training and inference.

Lecture 4: Advanced Set Theory & Relations

Core Concepts: Set operations, Cartesian products, relations properties.

Intuition: Structuring and organizing data relationships.

AI/CS Link: Used in dataset structuring and feature grouping.

Lecture 5: Counting, Combinatorics & Binomials

Core Concepts: Permutations, combinations, binomial theorem, inclusion-exclusion.

Intuition: Counting possible configurations in systems.

AI/CS Link: Defines model capacity and search space.

Lecture 6: Graph Theory Foundations

Core Concepts: Graphs, nodes, edges, adjacency, connectivity.

Intuition: Modeling relationships as networks.

AI/CS Link: Used in neural networks and knowledge graphs.

Lecture 7: Trees, DAGs & Topological Sorting

Core Concepts: Trees, DAGs, DFS, BFS, topological sorting.

Intuition: Hierarchical and ordered data flow.

AI/CS Link: Backpropagation and transformer architectures.

Lecture 8: Formal Languages & Automata

Core Concepts: FSM, regular expressions, formal systems.

Intuition: Rules that define valid data sequences.

AI/CS Link: Tokenization and NLP pipelines.
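The link from regular expressions to tokenization can be seen in one line of Python; the pattern and sentence below are illustrative only:

```python
import re

# A finite-state idea in practice: a regular expression splitting text
# into word and number tokens, as in a very simple NLP tokenizer.
tokens = re.findall(r"[A-Za-z]+|\d+", "GPT-4 has 8 experts")
print(tokens)  # ['GPT', '4', 'has', '8', 'experts']
```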

Month 4

Calculus for AI
Lecture 1: Limits, Continuity & Derivatives

Core Concepts: Limits (0/0, ∞/∞), continuity, basic derivatives (xⁿ, sin, cos, eˣ, ln x).

Intuition: Measuring change at an infinitesimal level.

AI/CS Link: Foundation for gradients in optimization.

Lecture 2: Differentiation Rules & Chain Rule

Core Concepts: Sum, product, quotient, chain rule, L’Hôpital’s rule.

Intuition: Handling complex nested functions.

AI/CS Link: Core mechanism behind backpropagation.
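A short sketch of the chain rule in action, checked numerically the way gradients are often verified in ML code (the function is an arbitrary example):

```python
import math

# Chain rule: for f(x) = sin(x**2), f'(x) = cos(x**2) * 2x.
def f(x):
    return math.sin(x ** 2)

def f_prime(x):
    return math.cos(x ** 2) * 2 * x   # outer derivative times inner derivative

# Compare against a central finite-difference estimate of the slope.
x, h = 1.3, 1e-6
numeric = (f(x + h) - f(x - h)) / (2 * h)
assert abs(numeric - f_prime(x)) < 1e-6
```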

Lecture 3: Optimization & Critical Points

Core Concepts: Maxima, minima, inflection points, second derivative test.

Intuition: Finding peaks and valleys of functions.

AI/CS Link: Training models by minimizing loss functions.

Lecture 4: Taylor Series & Linearization

Core Concepts: Taylor expansion, linear approximation.

Intuition: Approximating complex functions locally.

AI/CS Link: Understanding local behavior of loss surfaces.

Lecture 5: Integration & The Fundamental Theorem of Calculus

Core Concepts: Riemann sums, Fundamental Theorem of Calculus.

Intuition: Accumulating values over intervals.

AI/CS Link: Used in expectations and probabilistic models.

Lecture 6: Multivariable Calculus (Partial Derivatives & Gradients)

Core Concepts: Partial derivatives, gradient vector.

Intuition: Change across multiple dimensions.

AI/CS Link: Updating all parameters in ML models.

Lecture 7: Jacobian & Vector Functions

Core Concepts: Jacobian matrix, multivariable mappings.

Intuition: Derivatives for multi-input/output systems.

AI/CS Link: Gradient flow in deep networks.

Lecture 8: Convexity & High-Dimensional Optimization

Core Concepts: Convex functions, Hessian matrix.

Intuition: Bowl-shaped functions guarantee optimal solutions.

AI/CS Link: Stable training in ML algorithms.

Month 5

Probability Foundations for AI
Lecture 1: Discrete Probability & Kolmogorov Axioms

Core Concepts: Probability axioms, expectation, variance.

Intuition: Quantifying uncertainty.

AI/CS Link: Base of all probabilistic models.

Lecture 2: Conditional Probability & Bayes

Core Concepts: Conditional probability, Bayes’ theorem.

Intuition: Updating beliefs with new data.

AI/CS Link: Bayesian inference and prediction.

Lecture 3: Random Variables (Expectation & Variance)

Core Concepts: PMF, PDF, expectation, variance.

Intuition: Representing uncertainty numerically.

AI/CS Link: Loss calculations and distributions.

Lecture 4: Continuous Distributions & The Gaussian

Core Concepts: Gaussian, uniform, exponential distributions.

Intuition: Modeling real-world randomness.

AI/CS Link: Used in generative models.

Lecture 5: Covariance & High-Dimensional Geometry

Core Concepts: Covariance, correlation matrices.

Intuition: Measuring relationships between variables.

AI/CS Link: Feature relationships and embeddings.

Lecture 6: Limit Theorems & Sampling

Core Concepts: LLN, CLT, sampling.

Intuition: Why averages stabilize.

AI/CS Link: Stochastic gradient descent.
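The law of large numbers can be demonstrated in a few lines; this sketch simulates a fair die (seeded for reproducibility):

```python
import random

# Law of large numbers: the sample mean of a fair die approaches
# the true mean of 3.5 as the number of rolls grows.
random.seed(0)
rolls = [random.randint(1, 6) for _ in range(100_000)]
mean = sum(rolls) / len(rolls)
print(round(mean, 2))  # close to 3.5
```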

Lecture 7: Likelihood & Parameter Estimation

Core Concepts: Likelihood, MLE.

Intuition: Finding best model parameters.

AI/CS Link: Basis of training loss functions.

Lecture 8: Information Theory (Entropy & Divergence)

Core Concepts: Entropy, cross-entropy, KL divergence.

Intuition: Measuring information and difference.

AI/CS Link: Core loss functions in ML.
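Cross-entropy, the loss function mentioned above, fits in a few lines of Python; the target and prediction vectors are made-up numbers:

```python
import math

# Cross-entropy between a one-hot target and a predicted distribution:
# the standard classification loss.
def cross_entropy(target, predicted):
    return -sum(t * math.log(p) for t, p in zip(target, predicted) if t > 0)

target = [0.0, 1.0, 0.0]          # true class is index 1
predicted = [0.1, 0.7, 0.2]       # a model's softmax output (illustrative)

loss = cross_entropy(target, predicted)
print(round(loss, 4))  # -log(0.7) ≈ 0.3567
```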

Month 6

Linear Algebra for AI
Lecture 1: Vectors, Norms & Dot Products

Core Concepts: Vectors, norms, dot products.

Intuition: Representing data in space.

AI/CS Link: Embeddings and similarity.
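Dot products and norms combine into cosine similarity, the embedding-similarity measure mentioned above; a minimal sketch:

```python
import math

# Cosine similarity: the dot product of two vectors divided by the
# product of their lengths, giving the cosine of the angle between them.
def cosine(u, v):
    dot = sum(a * b for a, b in zip(u, v))
    return dot / (math.hypot(*u) * math.hypot(*v))

print(round(cosine([1.0, 0.0], [1.0, 1.0]), 4))  # 0.7071, i.e. cos 45°
```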

Lecture 2: Matrices & Linear Transformations

Core Concepts: Matrix multiplication, transformations.

Intuition: Transforming data across spaces.

AI/CS Link: Neural network layers.

Lecture 3: Linear Systems

Core Concepts: Solving Ax = b, Gaussian elimination.

Intuition: Finding intersections of equations.

AI/CS Link: Regression models.
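Solving Ax = b, the computation at the heart of this lecture, in NumPy on a toy 2×2 system (the numbers are illustrative):

```python
import numpy as np

# A linear system: 3x + y = 9 and x + 2y = 8.
A = np.array([[3.0, 1.0],
              [1.0, 2.0]])
b = np.array([9.0, 8.0])

x = np.linalg.solve(A, b)   # internally uses Gaussian elimination (LU)
print(x)  # [2. 3.]
```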

Lecture 4: Subspaces, Orthogonality & Projections

Core Concepts: Basis, orthogonality, projections.

Intuition: Reducing dimensions.

AI/CS Link: Dimensionality reduction.

Lecture 5: Eigenvalues & Eigenvectors

Core Concepts: Eigen decomposition.

Intuition: Natural directions of transformation.

AI/CS Link: Stability and clustering.

Lecture 6: Singular Value Decomposition (SVD)

Core Concepts: Singular Value Decomposition.

Intuition: Matrix factorization.

AI/CS Link: Compression and LoRA.

Lecture 7: Principal Component Analysis (PCA)

Core Concepts: Principal Component Analysis.

Intuition: Capturing maximum variance.

AI/CS Link: Feature reduction.

Lecture 8: Tensors & Multilinear Algebra

Core Concepts: Tensors, multidimensional arrays.

Intuition: Extending vectors and matrices.

AI/CS Link: Deep learning frameworks.

Recorded Lectures

With lifetime access to our lecture content, you can revisit and refresh your concepts at your convenience.