Machine Learning & AI
Master ML algorithms, neural networks, computer vision, and NLP.
- From Python Data Stack to Deep Learning
- Interactive AI visualizations
- Learn Pandas, Scikit-Learn & Neural Networks
Start your 7-day free trial
Get full access to all learning paths across the platform.
All Lessons
Linear Regression: Finding the Line
Understand how linear regression finds the best-fit line through data points.
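The "best-fit line" idea can be sketched in a few lines of plain Python using the closed-form least-squares solution; the data points below are made up for illustration:

```python
# Sketch: closed-form simple linear regression (ordinary least squares).
# The (x, y) points are illustrative, not from the lesson.
xs = [1.0, 2.0, 3.0, 4.0, 5.0]
ys = [2.1, 4.0, 6.2, 7.9, 10.1]

n = len(xs)
mean_x = sum(xs) / n
mean_y = sum(ys) / n

# Slope = covariance(x, y) / variance(x); the intercept pins the line to the means.
slope = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys)) / \
        sum((x - mean_x) ** 2 for x in xs)
intercept = mean_y - slope * mean_x

def predict(x):
    return slope * x + intercept
```

The same fit is what `sklearn.linear_model.LinearRegression` computes for one feature, just generalized to many features at once.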
Python Data Stack: Pandas & NumPy
Master the essential tools for data manipulation: DataFrames and arrays.
Classification: Predicting Categories
Learn how to predict categories (like churn vs. retention) using Logistic Regression with Scikit-Learn.
Evaluating Models: Train & Test Splits
Learn how to evaluate your models accurately and prevent overfitting by splitting your data.
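The core of a train/test split is just "shuffle, then slice." A minimal stand-in for scikit-learn's `train_test_split`, assuming a fixed seed for reproducibility:

```python
import random

def train_test_split(data, test_ratio=0.25, seed=42):
    """Shuffle then slice -- a minimal stand-in for sklearn's train_test_split."""
    rng = random.Random(seed)          # fixed seed => reproducible split
    shuffled = data[:]                 # copy so the caller's list is untouched
    rng.shuffle(shuffled)
    cut = int(len(shuffled) * (1 - test_ratio))
    return shuffled[:cut], shuffled[cut:]

train, test = train_test_split(list(range(100)), test_ratio=0.2)
```

Holding the test slice out of training is what lets you detect overfitting: a model that memorized the training set will score noticeably worse on the unseen slice.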
Generative AI Foundations
Understand how Large Language Models (LLMs) process text as tokens and learn to estimate inference costs.
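Cost estimation reduces to token counting and per-token pricing. The prices below are hypothetical placeholders, not real provider rates, and the 4-characters-per-token figure is only a common rule of thumb for English text:

```python
# Rough inference-cost estimator. Prices are ASSUMED placeholders,
# not real provider rates -- always check your provider's pricing page.
PRICE_PER_1K_INPUT = 0.003    # USD per 1,000 input tokens (assumed)
PRICE_PER_1K_OUTPUT = 0.015   # USD per 1,000 output tokens (assumed)

def estimate_cost(input_tokens, output_tokens):
    return (input_tokens / 1000) * PRICE_PER_1K_INPUT + \
           (output_tokens / 1000) * PRICE_PER_1K_OUTPUT

def rough_token_count(text):
    # Rule of thumb: ~4 characters of English per token.
    return max(1, len(text) // 4)

prompt = "Summarize the quarterly report in three bullet points."
cost = estimate_cost(rough_token_count(prompt), 200)
```

Note that output tokens are typically priced several times higher than input tokens, so long generations dominate the bill.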
Support Vector Machines (SVM)
Learn how SVMs find the optimal mathematical boundary between different classes of data.
Model Tuning & Cross-Validation
Take your models to the next level by systematically finding the best hyperparameters.
The Transformer Architecture
Explore the groundbreaking architecture that powers ChatGPT, Claude, and modern AI.
Advanced Prompt Engineering
Master techniques like Few-Shot prompting and Chain of Thought to get better results from LLMs.
Retrieval-Augmented Generation (RAG)
Learn how to give AI models access to custom documents and real-time knowledge.
Data Preprocessing & Cleaning
Learn to handle missing data, drop duplicates, and prepare clean datasets.
Exploratory Data Analysis (EDA)
Group data, understand distributions, and uncover hidden insights.
Feature Engineering
Create new features and encode text so ML models can understand them.
Decision Trees & Ensembles
Learn how algorithms can make decisions through a series of yes/no questions, and how Random Forests combine them.
Unsupervised Learning (Clustering)
Group similar data points together without knowing the answers beforehand using K-Means.
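K-Means alternates two steps: assign each point to its nearest centroid, then move each centroid to its cluster's mean. A minimal one-dimensional sketch with k=2 and made-up data:

```python
# Minimal 1-D K-Means sketch (k=2). Data and initialization are illustrative;
# real K-Means (e.g. sklearn.cluster.KMeans) uses smarter init and convergence checks.
points = [1.0, 1.5, 2.0, 10.0, 10.5, 11.0]
centroids = [points[0], points[-1]]   # naive initialization

for _ in range(10):
    # Assignment step: each point joins its nearest centroid's cluster.
    clusters = [[], []]
    for p in points:
        nearest = min(range(2), key=lambda i: abs(p - centroids[i]))
        clusters[nearest].append(p)
    # Update step: each centroid moves to its cluster's mean.
    centroids = [sum(c) / len(c) for c in clusters]
```

On this data the centroids settle at the two group means; no labels were ever provided, which is what makes the method unsupervised.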
Probability & Statistics
Master the mathematical language of uncertainty that powers all machine learning models.
Math for ML: Vectors & Matrices
Explore the core linear algebra operations that make neural networks and embeddings possible.
PCA & Dimensionality Reduction
Learn how to compress hundreds of features into their most important components.
Naive Bayes & NLP Basics
Use probability to classify text and build a classic spam filter.
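The classic spam filter compares, for each class, the prior probability of the class times the probability of each word given that class. A toy sketch with a made-up six-message corpus and add-one (Laplace) smoothing:

```python
import math
from collections import Counter

# Toy training corpus -- hypothetical messages, not real data.
spam = ["win money now", "free money offer", "win a free prize"]
ham  = ["meeting at noon", "project update attached", "lunch at noon?"]

spam_words = Counter(w for msg in spam for w in msg.split())
ham_words  = Counter(w for msg in ham for w in msg.split())
vocab = set(spam_words) | set(ham_words)

def log_prob(words, counts, class_docs, all_docs):
    # log P(class) + sum of log P(word | class), with add-one smoothing
    score = math.log(class_docs / all_docs)
    denom = sum(counts.values()) + len(vocab)
    for w in words:
        score += math.log((counts[w] + 1) / denom)
    return score

def classify(message):
    words = message.split()
    s = log_prob(words, spam_words, len(spam), len(spam) + len(ham))
    h = log_prob(words, ham_words, len(ham), len(spam) + len(ham))
    return "spam" if s > h else "ham"
```

Working in log space avoids multiplying many tiny probabilities into floating-point underflow, and the smoothing keeps unseen words from zeroing out a class entirely.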
Reinforcement Learning (RL)
Teach an AI to play games by maximizing rewards using Q-Learning.
AI Agents & Tool Use
How modern LLMs function autonomously using external tools and reasoning loops.
Neural Networks 101: The Perceptron
Understand the biological inspiration behind AI: the artificial neuron.
Forward Propagation & Deep Networks
Stack neurons into layers to create Deep Neural Networks capable of complex logic.
Loss Functions & Evaluation
Learn how networks measure how 'wrong' their predictions are.
Gradient Descent & Backpropagation
Understand the mathematical engine that actually allows Neural Networks to learn.
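The core update is just "step against the gradient." On a toy one-dimensional loss f(w) = (w - 3)^2, whose minimum sits at w = 3, the whole engine fits in a few lines:

```python
# Gradient descent on f(w) = (w - 3)^2, minimized at w = 3.
# The gradient f'(w) = 2*(w - 3) points uphill, so we step against it.
w = 0.0
learning_rate = 0.1

for step in range(100):
    grad = 2 * (w - 3)
    w -= learning_rate * grad   # move opposite the gradient
```

Each step shrinks the distance to the minimum by a constant factor (here 0.8), so `w` converges to 3. Backpropagation is the chain-rule machinery that computes this same gradient efficiently for every weight in a deep network.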
Optimizers: Beyond Vanilla Descent
Why plain gradient descent is rarely used in practice, and how optimizers like Momentum and Adam speed up training.

Training Loops in PyTorch
Write the standard 5-step PyTorch training loop used by researchers worldwide.
Regularization, Dropout & BatchNorm
Prevent networks from memorizing the training data with regularization techniques such as Dropout and Batch Normalization.
Convolutional Neural Networks (CNNs)
How AI processes visual data using Convolutions and Pooling.
Recurrent Networks: RNNs & LSTMs
Processing sequential data like heartbeat signals, stock prices, and text.
Autoencoders & Latent Spaces
Compressing reality into vectors to build the foundation of Generative AI.
Gradient Boosting Machines
Learn how gradient boosting builds trees sequentially, each correcting errors of the previous one.
XGBoost in Practice
Master XGBoost: regularization, feature importance, handling missing values, and hyperparameter tuning.
LightGBM & Fast Training
Train models faster with LightGBM's histogram-based splits, leaf-wise growth, and categorical support.
CatBoost & Categorical Features
Use CatBoost for datasets with many categorical features without manual encoding.
Ensemble Stacking Techniques
Combine multiple models into a meta-learner using stacking to boost predictive performance.
Model Blending Strategies
Blend predictions from diverse models using weighted averaging and cross-validated blending strategies.
DBSCAN Clustering
Discover clusters of arbitrary shape with DBSCAN: density-based grouping without specifying K.
Gaussian Mixture Models
Model data as mixtures of Gaussian distributions for soft clustering and density estimation.
Hierarchical Clustering
Build hierarchical cluster trees (dendrograms) with agglomerative and divisive methods.
Evaluating Cluster Quality
Evaluate cluster quality without ground-truth labels using internal metrics such as the Silhouette Score, and use them to choose the number of clusters.
t-SNE for Visualization
Visualize high-dimensional data in 2D/3D with t-SNE while preserving local neighborhood structure.
UMAP Dimensionality Reduction
Use UMAP for fast, scalable dimensionality reduction that preserves both local and global structure.
Anomaly Detection Methods
Detect outliers and anomalies with Isolation Forest, Local Outlier Factor, and statistical methods.
Autoencoders for Unsupervised Learning
Learn how autoencoders compress and reconstruct data for feature learning and anomaly detection.
Text Preprocessing for NLP
Clean and prepare text data: tokenization, stopword removal, stemming, lemmatization, and normalization.
Bag of Words & TF-IDF
Represent text as numerical vectors using bag-of-words, n-grams, and TF-IDF weighting.
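TF-IDF multiplies how often a term appears in a document (term frequency) by how rare it is across the corpus (inverse document frequency). A minimal sketch on a made-up three-document corpus:

```python
import math

# Tiny corpus -- illustrative documents, not from the lesson.
docs = [
    "the cat sat on the mat",
    "the dog sat on the log",
    "cats and dogs are pets",
]
tokenized = [d.split() for d in docs]

def tf_idf(term, doc_index):
    doc = tokenized[doc_index]
    tf = doc.count(term) / len(doc)               # term frequency within the document
    df = sum(1 for d in tokenized if term in d)   # document frequency across the corpus
    idf = math.log(len(docs) / df) if df else 0.0 # rarer terms weigh more
    return tf * idf
```

Because "the" appears in two of the three documents while "cat" appears in only one, "cat" gets the higher weight in document 0 despite "the" occurring more often. Scikit-learn's `TfidfVectorizer` applies the same idea (with some smoothing variants) to whole corpora at once.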
Word Embeddings (Word2Vec)
Understand word embeddings: how Word2Vec, GloVe, and FastText capture semantic meaning in vectors.
Sequence Models for NLP
Apply recurrent models (RNNs, LSTMs, GRUs) to text tasks like translation and summarization.
Named Entity Recognition
Extract named entities (people, places, organizations) from text using sequence labeling models.
Sentiment Analysis
Build sentiment classifiers that determine whether text expresses positive, negative, or neutral opinions.
Text Classification Pipelines
Create end-to-end text classification pipelines: preprocessing, vectorization, training, and evaluation.
Image Preprocessing Techniques
Prepare images for model input: resizing, normalization, color space conversion, and batch loading.
Feature Extraction from Images
Extract visual features from images using traditional methods (HOG, SIFT) and CNN feature maps.
Object Detection Fundamentals
Detect and localize objects in images with anchor boxes, YOLO, and two-stage detector architectures.
Image Segmentation
Segment images at the pixel level with semantic, instance, and panoptic segmentation approaches.
Data Augmentation for Vision
Expand training datasets with augmentation: flipping, rotation, cropping, color jitter, and mixup.
Transfer Learning for Vision
Leverage pre-trained models (ResNet, EfficientNet) and fine-tune them for your specific vision task.
Face Recognition Systems
Build face recognition systems: face detection, alignment, embedding extraction, and identity matching.
Variational Autoencoders (VAEs)
Generate new data with Variational Autoencoders: latent space sampling, the ELBO loss, and interpolation.
GANs Introduction
Understand Generative Adversarial Networks: the generator-discriminator game and training dynamics.
Conditional GANs
Control GAN outputs with conditional generation: class-conditional, text-to-image, and style transfer.
Diffusion Models
Learn diffusion models: the forward noising process, reverse denoising, and modern architectures.
Text-to-Image Generation
Generate images from text prompts with models like Stable Diffusion: architecture, guidance, and fine-tuning.
Fine-Tuning Large Language Models
Fine-tune large language models on custom data: LoRA, QLoRA, instruction tuning, and dataset preparation.
RLHF & AI Alignment
Align AI models with human preferences using RLHF: reward modeling, PPO training, and evaluation.
Time Series Decomposition
Decompose time series into trend, seasonal, and residual components for better understanding and forecasting.
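For an additive series, a centered moving average over one full seasonal period estimates the trend, because a seasonal pattern that sums to zero over a period cancels out in the average. A sketch on a synthetic series (linear trend plus a period-4 seasonal pattern, invented for illustration):

```python
# Sketch: additive decomposition via a centered moving average.
# Synthetic series: linear trend 0.5*t plus a repeating period-4 seasonal pattern.
season = [2.0, -1.0, -2.0, 1.0]          # sums to zero over one period
series = [0.5 * t + season[t % 4] for t in range(12)]

period = 4

def moving_average_trend(xs, p):
    # Even period: average p+1 points with half-weighted endpoints,
    # so the window stays centered and covers exactly one full cycle.
    half = p // 2
    trend = []
    for i in range(half, len(xs) - half):
        window = xs[i - half:i + half + 1]
        trend.append((0.5 * window[0] + sum(window[1:-1]) + 0.5 * window[-1]) / p)
    return trend

trend = moving_average_trend(series, period)
# Subtracting the trend leaves the seasonal (plus residual) component.
detrended = [series[i + period // 2] - t for i, t in enumerate(trend)]
```

On this noise-free series the recovered trend is exactly 0.5*t and the detrended values reproduce the seasonal pattern; real data adds a residual term on top, which is what libraries like `statsmodels.tsa.seasonal_decompose` separate out.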
ARIMA Forecasting
Forecast stationary time series with ARIMA: differencing, autocorrelation, and parameter selection.
Prophet for Time Series
Use Facebook Prophet for automatic seasonality detection, holiday effects, and changepoint handling.
LSTMs for Forecasting
Apply LSTMs to time series forecasting: sequence windowing, multi-step predictions, and feature engineering.
Anomaly Detection in Time Series
Detect anomalies in time series data using statistical tests, sliding windows, and deep learning methods.
Multivariate Time Series
Model multiple correlated time series simultaneously with VAR, multivariate LSTM, and attention mechanisms.
Forecasting Pipelines
Build production forecasting pipelines: data ingestion, model training, prediction serving, and monitoring.
Experiment Tracking (MLflow)
Track ML experiments with MLflow: parameters, metrics, artifacts, and experiment comparison dashboards.
Model Versioning & Registry
Version models and manage the model registry for staging, production, and rollback across environments.
Feature Stores
Centralize feature computation and serving with feature stores for consistent training and inference.
Model Serving & APIs
Deploy models as REST APIs: model serialization, containerization, batching, and latency optimization.
A/B Testing for ML Models
Run A/B tests on ML models to measure real-world impact and make data-driven deployment decisions.
Monitoring Model Drift
Monitor models in production for data drift, concept drift, and performance degradation over time.
ML Pipeline Orchestration
Orchestrate ML pipelines with Airflow, Kubeflow, or Vertex AI for reproducible, automated workflows.
Graph Neural Networks
Apply neural networks to graph-structured data: node classification, link prediction, and graph generation.
Federated Learning
Train models across decentralized data sources without sharing raw data using federated learning.
Reinforcement Learning Deep Dive
Deep dive into RL: Q-learning, policy gradients, actor-critic methods, and environment design.
Multi-Agent Systems
Build systems where multiple AI agents collaborate or compete to solve complex tasks together.
Neural Architecture Search
Automatically discover optimal neural network architectures with NAS, DARTS, and efficiency-aware search.
Self-Supervised Learning
Learn representations from unlabeled data with self-supervised methods: contrastive learning and masking.
Few-Shot & Zero-Shot Learning
Generalize to new tasks with minimal examples using few-shot and zero-shot learning techniques.
Multimodal AI Models
Build models that process and combine multiple data types: text, images, audio, and video together.
AI Safety & Alignment
Understand AI safety challenges: alignment, interpretability, robustness, and value specification.
Efficient Inference Techniques
Speed up model inference with quantization, distillation, caching, and hardware-specific optimizations.
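Quantization maps float weights onto a small integer range, trading a little precision for smaller, faster models. A sketch of symmetric int8 quantization on a made-up weight vector (real stacks do this per-tensor or per-channel with calibration):

```python
# Sketch: symmetric int8 quantization of a weight vector.
# Weights are illustrative; production tools pick scales via calibration.
weights = [0.82, -0.41, 0.05, -0.99, 0.33]

scale = max(abs(w) for w in weights) / 127   # map the largest |w| to 127

def quantize(w):
    return max(-128, min(127, round(w / scale)))

def dequantize(q):
    return q * scale

quantized = [quantize(w) for w in weights]
max_error = max(abs(w - dequantize(q)) for w, q in zip(weights, quantized))
```

Each weight now fits in one byte instead of four, and the round-trip error stays within half a quantization step, which is usually small enough that accuracy barely moves.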
Model Compression & Pruning
Reduce model size with pruning, weight sharing, low-rank factorization, and knowledge distillation.
Edge AI Deployment
Deploy ML models to edge devices: mobile, IoT, and embedded systems with TensorFlow Lite and ONNX.
Responsible AI Practices
Build fair, transparent, and accountable AI systems with bias detection, explainability, and governance.