Programming assignments and lecture notes from the Deep Learning Specialization taught by Andrew Ng and offered by deeplearning.ai on Coursera.
This repository contains my work on the assignments. The codebase, lecture notes, and citations are from the Deep Learning Specialization on Coursera, unless otherwise noted.
Don't miss Tess Fernandez's vivid notes on this course.
The first course in the Deep Learning Specialization focuses on the foundational concepts of neural networks and deep learning.
Learn about the key technology trends driving the rise of deep learning; build, train, and apply fully connected deep neural networks; implement efficient (vectorized) neural networks; identify key parameters in a neural network’s architecture; and apply deep learning to your own applications.
- Week 2 - A1: Logistic Regression with a Neural Network mindset
- Week 3 - A1: Planar data classification with one hidden layer
- Week 4 - A1: Building your Deep Neural Network: Step by Step
- Week 4 - A2: Deep Neural Network for Image Classification: Application
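The core idea behind the Week 2 assignment is vectorization: computing the forward pass and gradients for all training examples at once with matrix operations instead of Python loops. A minimal sketch of vectorized logistic regression (illustrative only, not the assignment's actual code; the function name `propagate` and the toy data are assumptions):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def propagate(w, b, X, Y):
    """One vectorized forward/backward pass for logistic regression.

    X: (n_features, m) inputs, Y: (1, m) labels, w: (n_features, 1), b: scalar.
    Returns the cross-entropy cost and the gradients dw, db.
    """
    m = X.shape[1]
    A = sigmoid(w.T @ X + b)                 # predictions for all m examples at once
    cost = -np.mean(Y * np.log(A) + (1 - Y) * np.log(1 - A))
    dw = (X @ (A - Y).T) / m                 # gradient w.r.t. w
    db = np.mean(A - Y)                      # gradient w.r.t. b
    return cost, dw, db

# A few gradient-descent steps on a toy, linearly separable dataset
rng = np.random.default_rng(0)
X = rng.standard_normal((2, 8))
Y = (X[0:1] + X[1:2] > 0).astype(float)
w, b = np.zeros((2, 1)), 0.0
for _ in range(200):
    cost, dw, db = propagate(w, b, X, Y)
    w -= 0.5 * dw
    b -= 0.5 * db
```

Because every example is processed in one matrix multiply, the same code scales from 8 examples to millions with no change.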
The second course in the Deep Learning Specialization focuses on opening the black box of deep learning to understand the processes that drive performance and systematically generate good results.
Learn best practices for setting up train/dev/test sets and analyzing bias and variance when building deep learning applications; use standard neural network techniques such as careful weight initialization, L2 and dropout regularization, hyperparameter tuning, batch normalization, and gradient checking; implement and apply optimization algorithms such as mini-batch gradient descent, Momentum, RMSprop, and Adam, and check their convergence; and implement a neural network in TensorFlow.
- Week 1 - A1: Initialization
- Week 1 - A2: Regularization
- Week 1 - A3: Gradient Checking
- Week 2 - A1: Optimization Methods
- Week 3 - A1: TensorFlow Tutorial
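The Week 2 assignment builds up from Momentum and RMSprop to Adam, which combines both: an exponentially weighted average of the gradients (Momentum) and of their squares (RMSprop), each bias-corrected. A rough sketch of a single Adam update on a scalar problem (the helper name `adam_step` and the toy objective are illustrative, not the assignment's code):

```python
import numpy as np

def adam_step(param, grad, m, v, t, lr=0.01, beta1=0.9, beta2=0.999, eps=1e-8):
    """One Adam update: Momentum term m, RMSprop-style term v, bias correction by step t."""
    m = beta1 * m + (1 - beta1) * grad          # first moment (Momentum)
    v = beta2 * v + (1 - beta2) * grad**2       # second moment (RMSprop)
    m_hat = m / (1 - beta1**t)                  # bias-corrected first moment
    v_hat = v / (1 - beta2**t)                  # bias-corrected second moment
    param = param - lr * m_hat / (np.sqrt(v_hat) + eps)
    return param, m, v

# Minimize f(x) = x^2, whose gradient is 2x, starting from x = 5
x = np.array(5.0)
m = v = np.zeros_like(x)
for t in range(1, 501):
    x, m, v = adam_step(x, 2 * x, m, v, t, lr=0.1)
```

Setting `beta1=0` recovers plain RMSprop, and dropping the `v` scaling recovers Momentum, which is why the assignment treats Adam as the combination of the two.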
The third course in the Deep Learning Specialization focuses on learning how to build a successful machine learning project and practicing decision making as a machine learning project leader.
Learn how to diagnose errors in a machine learning system; prioritize strategies to reduce errors; understand complex ML settings such as mismatched training/test sets and comparing to and/or outperforming human performance; and apply end-to-end learning, transfer learning, and multi-task learning.
This is also a standalone course for learners who have basic knowledge of machine learning. This course draws on Andrew Ng's experience building and shipping many deep learning products. If you aspire to become a technical leader who can set the direction for an AI team, this course provides the "industry experience" that you might otherwise only get after years of ML work experience.
There are no programming assignments for this course.
- Week 1: Machine Learning Strategies 1
- Week 2: Machine Learning Strategies 2
The fourth course in the Deep Learning Specialization focuses on understanding how computer vision has evolved and becoming familiar with its exciting applications, such as autonomous driving, face recognition, reading radiology images, and more.
Learn how to build a convolutional neural network, including recent variations such as residual networks; apply convolutional networks to visual detection and recognition tasks; and use neural style transfer to generate art and apply these algorithms to a variety of image, video, and other 2D or 3D data.
- Week 1 - A1: Convolutional Model: Step by Step
- Week 1 - A2: Convolutional Neural Networks: Application
- Week 2 - A1: Residual Networks
- Week 2 - A2: Transfer Learning with MobileNetV2
- Week 3 - A1: Object detection with YOLO
- Week 3 - A2: Image Segmentation with U-Net
- Week 4 - A1: Face Recognition
- Week 4 - A2: Deep Learning & Art: Neural Style Transfer
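The "Step by Step" assignment in Week 1 builds a convolution layer from its smallest unit: slide a filter over the input, and at each position take the elementwise product with the current slice, sum, and add a bias. A naive single-channel sketch of that idea (function names and shapes here are illustrative assumptions, not the assignment's API):

```python
import numpy as np

def conv_single_step(a_slice, W, b):
    """Apply one filter to one slice: elementwise product, sum, plus bias."""
    return np.sum(a_slice * W) + float(b)

def conv_forward_naive(A, W, b, stride=1):
    """Valid (no padding) convolution of a single-channel image A (H, W) with an f x f filter."""
    H, Wd = A.shape
    f = W.shape[0]
    out_h = (H - f) // stride + 1
    out_w = (Wd - f) // stride + 1
    Z = np.zeros((out_h, out_w))
    for i in range(out_h):
        for j in range(out_w):
            hs, ws = i * stride, j * stride
            Z[i, j] = conv_single_step(A[hs:hs + f, ws:ws + f], W, b)
    return Z
```

Real layers add padding, multiple channels, and multiple filters, but each output value is still exactly this single step.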
The fifth course in the Deep Learning Specialization focuses on sequence models and their applications in speech recognition, music synthesis, chatbots, machine translation, natural language processing (NLP), and more.
Learn how to build and train Recurrent Neural Networks (RNNs) and commonly used variants such as GRUs and LSTMs; apply RNNs to character-level language modeling; gain experience with natural language processing and word embeddings; and use HuggingFace tokenizers and transformer models to solve various NLP tasks such as NER and question answering.
- Week 1 - A1: Building a Recurrent Neural Network - Step by Step
- Week 1 - A2: Character-level language model
- Week 1 - A3: Jazz improvisation with LSTM
- Week 2 - A1: Word Vector Representation and Debiasing
- Week 2 - A2: Emojify!
- Week 3 - A1: Neural Machine Translation with Attention
- Week 3 - A2: Trigger Word Detection
- Week 4 - A1: Transformer Network
- Week 4 - U1: Transformer Network Preprocessing
- Week 4 - U2: Transformer Network Application: Named-Entity Recognition
- Week 4 - U3: Transformer Network Application: Question Answering
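The Week 1 "Step by Step" assignment starts from a single RNN cell: the new hidden state is a tanh of a linear combination of the previous hidden state and the current input, and the output is a softmax over a linear projection of that state. A minimal sketch under assumed shapes (the weight names `Wax`, `Waa`, `Wya` follow the course's notation; everything else here is illustrative):

```python
import numpy as np

def softmax(z):
    """Numerically stable column-wise softmax."""
    e = np.exp(z - z.max(axis=0, keepdims=True))
    return e / e.sum(axis=0, keepdims=True)

def rnn_cell_forward(xt, a_prev, Wax, Waa, Wya, ba, by):
    """One RNN time step.

    xt: (n_x, m) input at time t; a_prev: (n_a, m) previous hidden state.
    Returns the next hidden state (n_a, m) and output distribution (n_y, m).
    """
    a_next = np.tanh(Waa @ a_prev + Wax @ xt + ba)   # hidden-state update
    y_pred = softmax(Wya @ a_next + by)              # per-example output distribution
    return a_next, y_pred
```

Unrolling this cell over a sequence, feeding each `a_next` back in as `a_prev`, gives the full RNN forward pass; GRUs and LSTMs replace the tanh update with gated variants.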