Summary of Machine Learning Taught by Andrew

Time flies. I spent almost a month finishing Andrew's course, Machine Learning. When he said, "And I wanted to say: thank you very much for having been a student in this class," I felt a little sad, and I wanted to say, "Thanks for being my teacher, Andrew." After that, I plan to move on to Deep Learning, which is also taught by Andrew.

Introduction to ML strategy

Ideas

  • Collect more data
  • Collect a more diverse training set
  • Train the algorithm longer with gradient descent
  • Try Adam instead of gradient descent
  • Try bigger network
  • Try smaller network
  • Try dropout
  • Add L2 regularization
  • Change the network architecture (e.g., activation functions, hidden units)
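
Several of these ideas can be tried with only a line or two of framework code. As a rough sketch (PyTorch is used here purely for illustration; the course's own assignments are in Octave/MATLAB, and the layer sizes below are arbitrary placeholders), this is how dropout, L2 regularization, and the Adam optimizer might be wired into a small network:

```python
import torch
import torch.nn as nn

# A small fully connected network; 20 inputs and 64 hidden units are
# arbitrary placeholder sizes, not values from the course.
model = nn.Sequential(
    nn.Linear(20, 64),
    nn.ReLU(),
    nn.Dropout(p=0.5),   # "Try dropout"
    nn.Linear(64, 1),
)

# "Try Adam instead of gradient descent"; weight_decay adds an L2 penalty
# on the weights, which plays the role of "Add L2 regularization" here.
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3, weight_decay=1e-4)
```

Swapping the optimizer, the dropout rate, or the layer widths is cheap to try, which is why these items show up as quick experiments in the strategy list.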

Our main topics

  • Supervised learning: linear regression, logistic regression, neural networks, SVMs
  • Unsupervised learning: K-means, PCA, anomaly detection
  • Special applications/special topics: recommender systems, large-scale machine learning
  • Advice on building a machine learning system: bias/variance, regularization; deciding what to work on next; evaluation of learning algorithms, learning curves, error analysis, ceiling analysis
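
To make the first topic concrete, here is a minimal NumPy sketch of linear regression fitted with batch gradient descent; the toy data and parameter values are my own illustration, not taken from the course materials:

```python
import numpy as np

def gradient_descent(X, y, alpha=0.01, num_iters=1000):
    """Batch gradient descent for linear regression with squared-error cost."""
    m, n = X.shape
    theta = np.zeros(n)
    for _ in range(num_iters):
        error = X @ theta - y                 # h_theta(x) - y for every example
        theta -= alpha * (X.T @ error) / m    # theta := theta - alpha * dJ/dtheta
    return theta

# Toy usage: y = 2 + 3x, with a column of ones for the intercept term.
X = np.c_[np.ones(100), np.linspace(0, 1, 100)]
y = 2 + 3 * X[:, 1]
print(gradient_descent(X, y, alpha=0.5, num_iters=5000))  # approx. [2. 3.]
```

The update used above, theta := theta - (alpha/m) * X^T (X theta - y), is the vectorized gradient step for the squared-error cost that the course derives for linear regression.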

Programming Assignments

anthonyweidai/machine-learning-ex-cousera-andrew

Xiaohu’s Blogs

Summary of Machine Learning Taught by Andrew

The framework of machine learning

Machine learning mathematics

Backpropagation of Machine Learning

The model of artificial neuron and its general principles - Class review

How to solve the problem of overfitting - Class review

Machine learning questions

How does regularization work in ML and DL? - Class review

How to debug your machine learning system

What is a support vector machine/SVM

Unsupervised learning and clustering algorithms

Dimensionality reduction for input data

Anomaly Detection - Class Review

Recommender Systems - Class Review

Large Scale Machine Learning - Class Review

Photo Optical Character Recognition/OCR - Class Review

References

[1] Andrew Ng, Machine Learning

[2] DeepLearning.AI, Improving Deep Neural Networks: Hyperparameter Tuning, Regularization and Optimization