This course was taught by Dr. Thomas Miller, the faculty director of the Data Science program (formerly the Predictive Analytics program; I will post a separate article discussing the name change from the Master of Science in Predictive Analytics (MSPA) to the Master of Science in Data Science (MSDS)). Overall, this was an excellent survey of machine learning, and it is a required core course for all students in the program. It is most definitely a foundational course for any student of data science today. It is also a foundational course for the Artificial Intelligence and Deep Learning specialization, which is currently under development (more on this in a subsequent post as well). The course covers the following topics:
- Supervised, Unsupervised, and Semi-supervised learning
- Regression versus Classification
- Decision Trees and Random Forests
- Dimensionality Reduction techniques
- Clustering Techniques
- Feature Engineering
- Artificial Neural Networks
- Deep Neural Networks
- Convolutional Neural Networks (CNN)
- Recurrent Neural Networks (RNN)
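To give a flavor of the material, here is a minimal scikit-learn sketch of my own (not course code) that touches a few of the topics above: a train/test split, a random forest classifier, and a simple accuracy evaluation on the built-in iris dataset.

```python
# Supervised classification with a random forest on the iris dataset.
# My own illustrative sketch, not an assignment from the course.
from sklearn.datasets import load_iris
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=42)

# 100 trees, each fit on a bootstrap sample of the training data
clf = RandomForestClassifier(n_estimators=100, random_state=42)
clf.fit(X_train, y_train)

acc = accuracy_score(y_test, clf.predict(X_test))
print(f"test accuracy: {acc:.3f}")
```

The same fit/predict/score pattern carries through nearly every scikit-learn model used in the course, which is part of what makes the library such a good teaching tool.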
This course uses Python and the Python libraries Scikit-Learn and TensorFlow. In addition to using Jupyter Notebooks to run my code, I also learned how to run TensorFlow from the command line, which is a faster way to run neural networks through a large number of epochs. The course is currently offered in R as well, but the R section will be discontinued, with only the Python/TensorFlow course offered starting in the fall semester. Dr. Miller commented that Python will be used much more extensively going forward, especially in the AI/Deep Learning specialization courses. R apparently will still be offered in the analytics/modeling courses – 410 (Regression Analysis) and 411 (Generalized Linear Models). I learned to use Python/Scikit-Learn/TensorFlow at an intermediate level, and I feel I have a solid programming foundation to build upon.
There is required reading every week, mainly from the two required textbooks, along with a few articles. There were a total of five sync sessions, which reviewed various topics. I wish the sync sessions had been a little more robust and had covered the current assignments and the coding required to complete them; I found that format very helpful in previous courses. There were weekly discussion board assignments covering basic concepts, which turned out to be very informative, especially since many of the topics on the final exam were covered in these discussions. There were also weekly programming assignments, in which you either develop the code yourself or build upon a skeletal code base provided. These ranged from very easy to very difficult, especially once you moved into artificial neural networks. There was a non-proctored final exam and a proctored final exam.
Géron, A. 2017. Hands-On Machine Learning with Scikit-Learn & TensorFlow: Concepts, Tools, and Techniques to Build Intelligent Systems. Sebastopol, Calif.: O'Reilly. [ISBN-13: 978-1-491-96229-9] Source code available at https://github.com/ageron/handson-ml. This was the primary textbook for most of the course. It is an excellent text with lots of great coding examples.
Müller, A. C. and Guido, S. 2017. Introduction to Machine Learning with Python: A Guide for Data Scientists. Sebastopol, Calif.: O’Reilly. [ISBN-13: 978-1449369415] Code examples at https://github.com/amueller/introduction_to_ml_with_python
Izenman, A. J. 2008. Modern Multivariate Statistical Techniques: Regression, Classification, and Manifold Learning. New York: Springer. [ISBN-13: 978-0-387-78188-4] This was used very little.
Learning Outcomes (from syllabus):
Practical Machine Learning is a survey course with a long list of learning outcomes:
- Explain the learning algorithm trade-offs, balancing performance within training data and robustness on unobserved test data
- Distinguish between supervised and unsupervised learning methods
- Distinguish between regression and classification problems
- Explain bootstrap and cross-validation procedures
- Explore and visualize data and perform basic statistical analysis
- List alternative methods for evaluating classifiers
- List alternative methods for evaluating regression
- Demonstrate the application of traditional statistical methods for classification and regression
- Demonstrate the application of trees and random forests for classification and regression
- Demonstrate principal components for dimension reduction
- Demonstrate principal components regression
- Describe hierarchical and non-hierarchical clustering techniques
- Describe how semi-supervised learning may be utilized in addressing classification and regression problems
- Explain how measurement and feature engineering are relevant to modeling
- Describe how artificial neural networks are constructed from logical connections of artificial neurons and activation functions
- Demonstrate the use of artificial neural networks (including deep neural networks) in classification and regression
- Describe how convolutional neural networks are constructed
- Describe how recurrent neural networks are constructed
- Distinguish between autoencoders and other forms of unsupervised learning
- Describe applications of autoencoders
- Explain how the results of machine learning can be useful to business managers
- Transform data and research results into actionable insights
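Several of these outcomes reduce to very compact scikit-learn idioms. The cross-validation outcome, for example, looks something like this (my own minimal sketch, not from the syllabus):

```python
# 5-fold cross-validation of a logistic regression classifier.
# Each fold is held out once for evaluation while the other
# four folds are used to train the model.
from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

X, y = load_iris(return_X_y=True)
model = LogisticRegression(max_iter=1000)

scores = cross_val_score(model, X, y, cv=5)
print(f"mean accuracy: {scores.mean():.3f} (+/- {scores.std():.3f})")
```

Reporting the mean and spread across folds, rather than a single score, is exactly the training-data-versus-unobserved-data trade-off the first outcome describes.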
Here are the weekly topics and assignments:
Week 1. Introduction to Machine Learning
- Assignment 1. Exploring and Visualizing Data
Week 2. Supervised Learning for Classification
- Assignment 2. Evaluating Classification Models
Week 3. Supervised Learning for Regression
- Assignment 3. Evaluating Regression Models
Week 4. Trees and Random Forests
- Assignment 4. Random Forests
Week 5. Unsupervised Learning
- Assignment 5. Principal Components Analysis
Week 6. Neural Networks
- Assignment 6. Neural Networks
Week 7. Deep Learning for Computer Vision
- Assignment 7. Deep Learning
Week 8. Deep Learning for Natural Language Processing
- Assignment 8. Natural Language Processing
Week 9. Neural Networks: Autoencoders
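The flavor of Week 5's principal components work can be sketched in a few lines of scikit-learn (my own illustration on a built-in dataset, not the actual assignment data):

```python
# Principal components analysis: project the 4-dimensional iris
# measurements onto their first two principal components.
from sklearn.datasets import load_iris
from sklearn.decomposition import PCA

X, _ = load_iris(return_X_y=True)

pca = PCA(n_components=2)
X2 = pca.fit_transform(X)

print(X2.shape)                       # reduced data, two columns
print(pca.explained_variance_ratio_)  # variance captured per component
```

Checking `explained_variance_ratio_` tells you how much information survives the reduction, which is the central question in any dimensionality-reduction assignment.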
There were two final examinations, one non-proctored and the other proctored. The non-proctored exam was open book and tested your ability to examine data, apply various analytical techniques, and interpret the results of the analyses. The proctored final exam was closed book and covered general concepts.
This was a great overview of some of the more important topics in machine learning. I was able to get a good theoretical background in these topics and learned the coding necessary to apply them. This is a great foundation upon which to build more advanced and in-depth use of these techniques. This course really challenged me to rethink which analytical techniques I should be learning and applying in the future, to the point that I am going to change my specialization to Artificial Intelligence and Deep Learning.