This course is a down-to-earth, shy but confident take on machine learning techniques that you can put to work today. The course is down-to-earth: it makes everything as simple as possible - but not simpler. The course is shy but confident: it is authoritative, drawn from decades of practical experience - but shies away from needlessly complicating things. You can put ML to work today: if machine learning is a car, this course will have you driving today. It won't tell you what the carburetor is. The course is very visual: most of the techniques are explained with the help of animations so you understand them better. The course is practical as well: there are hundreds of lines of commented source code that you can use directly to apply natural language processing and machine learning to text summarization and text classification in Python. The course is also quirky. The examples are irreverent, and there are lots of little touches: repetition, zooming out so we remember the big picture, and active learning with plenty of quizzes. There's also a peppy soundtrack and art - all of which studies show improve cognition and recall.
Yep! Analytics professionals, modelers and big data professionals who haven't had exposure to machine learning. Yep! Engineers who want to understand or learn machine learning and apply it to the problems they are solving. Yep! Product managers who want to have intelligent conversations with data scientists and engineers about machine learning. Yep! Tech executives and investors who are interested in big data, machine learning or natural language processing. Yep! MBA graduates or business professionals who are looking to move to a heavily quantitative role.
Identify situations that call for the use of Machine Learning. Understand which type of Machine Learning problem you are solving and choose the appropriate solution. Use Machine Learning and Natural Language Processing to solve problems like text classification and text summarization in Python.
There are no prerequisites: knowledge of some undergraduate-level mathematics would help, but is not mandatory. A working knowledge of Python would be helpful if you want to run the source code that is provided.
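To give a flavor of the kind of Python you'll write in the spam-detection and text-classification lessons, here is a minimal Naive Bayes classifier in plain Python. This is an illustrative sketch, not the course's own source code - all names and the toy training set are ours:

```python
import math
from collections import Counter

def train(messages):
    """messages: list of (text, label) pairs, e.g. label in {'spam', 'ham'}."""
    word_counts = {label: Counter() for _, label in messages}
    label_counts = Counter()
    for text, label in messages:
        label_counts[label] += 1
        word_counts[label].update(text.lower().split())
    return word_counts, label_counts

def classify(text, word_counts, label_counts):
    """Pick the label maximizing log P(label) + sum of log P(word | label)."""
    vocab = set().union(*word_counts.values())
    total = sum(label_counts.values())
    best_label, best_score = None, float('-inf')
    for label in label_counts:
        score = math.log(label_counts[label] / total)  # log prior
        denom = sum(word_counts[label].values()) + len(vocab)
        for word in text.lower().split():
            # Laplace (add-one) smoothing so unseen words don't zero things out
            score += math.log((word_counts[label][word] + 1) / denom)
        if score > best_score:
            best_label, best_score = label, score
    return best_label

training = [
    ("win cash prize now", "spam"),
    ("free prize claim now", "spam"),
    ("meeting at noon tomorrow", "ham"),
    ("lunch tomorrow at noon", "ham"),
]
wc, lc = train(training)
print(classify("claim your free cash", wc, lc))  # -> spam
```

Don't worry if the probabilities look mysterious right now - Lessons 5 through 8 build up exactly this machinery, from random variables to Bayes' theorem to the classifier itself.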
Lesson 1
Machine Learning: Why should you jump on the bandwagon?
Lesson 2
Plunging In - Machine Learning Approaches to Spam Detection
Lesson 3
Spam Detection with Machine Learning Continued
Lesson 4
Get the Lay of the Land : Types of Machine Learning Problems
Lesson 5
Naive Bayes Classifier - Random Variables
Lesson 6
Bayes Theorem
Lesson 7
Naive Bayes Classifier
Lesson 8
Naive Bayes Classifier : An Example
Lesson 9
K-Nearest Neighbors
Lesson 10
K-Nearest Neighbors : A few wrinkles
Lesson 11
Support Vector Machines Introduced
Lesson 12
Support Vector Machines : Maximum Margin Hyperplane and Kernel Trick
Lesson 13
Clustering as a form of Unsupervised learning - Clustering introduction
Lesson 14
Clustering : K-Means and DBSCAN
Lesson 15
Association Detection - Association rules learning
Lesson 16
Dimensionality Reduction
Lesson 17
Principal Component Analysis
Lesson 18
Artificial Neural Networks : Perceptrons Introduced
Lesson 19
Regression as a form of supervised learning - Regression Introduced : Linear and Logistic Regression
Lesson 20
Bias Variance Trade-off
Lesson 21
Natural Language Processing and Python - Natural Language Processing with NLTK
Lesson 22
Natural Language Processing with NLTK - See it in action
Lesson 23
Web Scraping with BeautifulSoup
Lesson 24
A Serious NLP Application : Text Auto Summarization using Python
Lesson 25
Python Drill : Autosummarize News Articles I
Lesson 26
Python Drill : Autosummarize News Articles II
Lesson 27
Python Drill : Autosummarize News Articles III
Lesson 28
Put it to work : News Article Classification using K-Nearest Neighbors
Lesson 29
Put it to work : News Article Classification using Naive Bayes Classifier
Lesson 30
Python Drill : Scraping News Websites
Lesson 31
Document Distance using TF-IDF
Lesson 32
Put it to work : News Article Clustering with K-Means and TF-IDF
Lesson 33
Sentiment Analysis - A Sneak Peek at what's coming up
Lesson 34
Sentiment Analysis - What's all the fuss about?
Lesson 35
ML Solutions for Sentiment Analysis - the devil is in the details
Lesson 36
Sentiment Lexicons (with an introduction to WordNet and SentiWordNet)
Lesson 37
Regular Expressions
Lesson 38
Regular Expressions in Python
Lesson 39
Put it to work : Twitter Sentiment Analysis
Lesson 40
Twitter Sentiment Analysis - Work the API
Lesson 41
Twitter Sentiment Analysis - Regular Expressions for Preprocessing
Lesson 42
Twitter Sentiment Analysis - Naive Bayes, SVM and SentiWordNet
Lesson 43
Decision Tree - Planting the seed - What are Decision Trees?
Lesson 44
Growing the Tree - Decision Tree Learning
Lesson 45
Branching out - Information Gain
Lesson 46
Decision Tree Algorithms
Lesson 47
Titanic : Decision Trees predict Survival (Kaggle) - I
Lesson 48
Titanic : Decision Trees predict Survival (Kaggle) - II
Lesson 49
Titanic : Decision Trees predict Survival (Kaggle) - III
Lesson 50
A Few Useful Things to Know About Overfitting - Overfitting - the bane of Machine Learning
Lesson 51
Overfitting Continued
Lesson 52
Cross Validation
Lesson 53
Simplicity is a virtue - Regularization
Lesson 54
The Wisdom of Crowds - Ensemble Learning
Lesson 55
Ensemble Learning continued - Bagging, Boosting and Stacking
Lesson 56
Random Forest - Random Forests - Much more than trees
Lesson 57
Back on the Titanic - Cross Validation and Random Forests
Lesson 58
Recommendation System - What do Amazon and Netflix have in common?
Lesson 59
Recommendation Engines - A look inside
Lesson 60
What are you made of? - Content-Based Filtering
Lesson 61
With a little help from friends - Collaborative Filtering
Lesson 62
A Neighbourhood Model for Collaborative Filtering
Lesson 63
Top Picks for You! - Recommendations with Neighbourhood Models
Lesson 64
Discover the Underlying Truth - Latent Factor Collaborative Filtering
Lesson 65
Latent Factor Collaborative Filtering contd.
Lesson 66
Gray Sheep and Shillings - Challenges with Collaborative Filtering
Lesson 67
The Apriori Algorithm for Association Rules
Lesson 68
Recommendation Systems in Python - Back to Basics : Numpy in Python
Lesson 69
Back to Basics : Numpy and Scipy in Python
Lesson 70
Movielens and Pandas
Lesson 71
Code Along - What's my favorite movie? - Data Analysis with Pandas
Lesson 72
Code Along - Movie Recommendation with Nearest Neighbour CF
Lesson 73
Code Along - Top Movie Picks (Nearest Neighbour CF)
Lesson 74
Code Along - Movie Recommendations with Matrix Factorization
Lesson 75
Code Along - Association Rules with the Apriori Algorithm
Lesson 76
A Taste of Deep Learning and Computer Vision - Computer Vision - An Introduction
Lesson 77
Perceptron Revisited
Lesson 78
Deep Learning Networks Introduced
Lesson 79
Code Along - Handwritten Digit Recognition -I
Lesson 80
Code Along - Handwritten Digit Recognition - II
Lesson 81
Code Along - Handwritten Digit Recognition - III
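Several of the lessons above (Document Distance using TF-IDF, News Article Clustering with K-Means and TF-IDF) revolve around measuring how similar two documents are. As a taste of the idea, here is a minimal TF-IDF and cosine-similarity sketch in plain Python - again an illustration in our own words, not the course's code:

```python
import math
from collections import Counter

def tfidf_vectors(docs):
    """Turn tokenized documents into sparse TF-IDF weight dicts."""
    n = len(docs)
    # document frequency: in how many documents each term appears
    df = Counter(term for doc in docs for term in set(doc))
    vectors = []
    for doc in docs:
        tf = Counter(doc)
        # weight = (term frequency in doc) * log(inverse document frequency)
        vectors.append({t: (c / len(doc)) * math.log(n / df[t])
                        for t, c in tf.items()})
    return vectors

def cosine(u, v):
    """Cosine similarity between two sparse term-weight dicts."""
    norm = lambda w: math.sqrt(sum(x * x for x in w.values()))
    if norm(u) == 0 or norm(v) == 0:
        return 0.0
    dot = sum(u[t] * v[t] for t in u if t in v)
    return dot / (norm(u) * norm(v))

docs = [
    "the cat sat on the mat".split(),
    "the cat chased the mouse".split(),
    "stocks fell sharply on tuesday".split(),
]
v = tfidf_vectors(docs)
# the two cat stories score as more similar than the cat and stock stories
print(cosine(v[0], v[1]) > cosine(v[0], v[2]))  # -> True
```

The course builds this up properly - TF-IDF in Lesson 31, then feeds it into K-Nearest Neighbors, Naive Bayes and K-Means in Lessons 28, 29 and 32.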
Loonycorn
Loonycorn is us, Janani Ravi, Vitthal Srinivasan, Swetha Kolalapudi and Navdeep Singh. Between the four of us, we have studied at Stanford, IIM Ahmedabad, the IITs and have spent years (decades, actually) working in tech, in the Bay Area, New York, Singapore and Bangalore. Janani: 7 years at Google (New York, Singapore); Studied at Stanford; also worked at Flipkart and Microsoft. Vitthal: Also Google (Singapore) and studied at Stanford; Flipkart, Credit Suisse and INSEAD too. Swetha: Early Flipkart employee, IIM Ahmedabad and IIT Madras alum. Navdeep: longtime Flipkart employee too, and IIT Guwahati alum. We hope you will try our offerings, and think you'll like them.