Review of Python Courses (Part 30)
Posted by Mark on February 26, 2021 at 07:32 | Last modified: February 17, 2021 13:32

In Part 29, I summarized my Datacamp courses 86-88. Today I will continue with the next three.
As a reminder, I introduced you to my recent work learning Python here.
My course #89 was a case study in Python machine learning (a brief code sketch follows the topic list). This course covers:
- Introducing the challenge
- Exploring the data
- Looking at the datatypes (converting dtype for all dataframe categories)
- How do we measure success?
- It’s time to build a model (from sklearn.linear_model import LogisticRegression)
- Making predictions (from sklearn.multiclass import OneVsRestClassifier)
- A very brief introduction to NLP
- Representing text numerically (from sklearn.feature_extraction.text import CountVectorizer)
- Pipelines, feature and text preprocessing (from sklearn.pipeline import Pipeline, FeatureUnion)
- Text features and feature unions (from sklearn.preprocessing import FunctionTransformer, Imputer)
- Choosing a classification model (from sklearn.ensemble import RandomForestClassifier)
- Learning from the expert: processing
- Learning from the expert: a stats trick (from sklearn.preprocessing import PolynomialFeatures)
- Learning from the expert: the winning model (from sklearn.feature_extraction.text import HashingVectorizer)
- Next steps and the social impact of your work
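To make a couple of these pieces concrete, here is a minimal sketch of how a FeatureUnion can merge a numeric column with bag-of-words text features ahead of a one-vs-rest logistic regression. The DataFrame, column names, and labels are invented for illustration; this is not the course's exact code.

```python
# Minimal sketch (toy data, invented column names): combine a numeric column and
# a bag-of-words text column with FeatureUnion, then fit a one-vs-rest logistic
# regression, roughly in the spirit of the case-study pipeline.
import pandas as pd
from sklearn.pipeline import Pipeline, FeatureUnion
from sklearn.preprocessing import FunctionTransformer
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.multiclass import OneVsRestClassifier

df = pd.DataFrame({
    "text":   ["teacher salary", "school bus fuel", "math textbooks", "bus driver wages"],
    "budget": [50000.0, 1200.0, 800.0, 30000.0],
})
# Multilabel indicator targets (one column per label), as OneVsRestClassifier expects.
y = pd.get_dummies(pd.Series(["staff", "transport", "supplies", "transport"])).astype(int)

get_text = FunctionTransformer(lambda x: x["text"], validate=False)         # 1-D text column
get_numeric = FunctionTransformer(lambda x: x[["budget"]], validate=False)  # 2-D numeric block

pipeline = Pipeline([
    ("union", FeatureUnion([
        ("numeric", get_numeric),
        ("text", Pipeline([("select", get_text),
                           ("vectorize", CountVectorizer())])),
    ])),
    ("clf", OneVsRestClassifier(LogisticRegression())),
])

pipeline.fit(df, y)
print(pipeline.predict(df))
```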
My course #90 was Ensemble Methods in Python (see the sketch after the list). Topics covered in this course include:
- Introduction to ensemble methods (from sklearn.ensemble import MetaEstimator)
- Voting (from sklearn.ensemble import VotingClassifier, VotingRegressor)
- Averaging
- The strength of “weak” models
- Bootstrap aggregating
- Bagging classifier: nuts and bolts
- Bagging parameters: tips and tricks
- The effectiveness of gradual learning
- Adaptive boosting: award-winning model (from sklearn.ensemble import AdaBoostClassifier, AdaBoostRegressor)
- Gradient boosting (from sklearn.ensemble import GradientBoostingClassifier, GradientBoostingRegressor)
- Gradient boosting flavors (import xgboost as xgb; import lightgbm as lgb; import catboost as cb)
- The intuition behind stacking
- Build your first stacked ensemble
- Let’s mlxtend it! (from mlxtend.classifier import StackingClassifier; from mlxtend.regressor import StackingRegressor)
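As a small illustration of two of these ideas, here is a sketch of hard voting across heterogeneous models alongside adaptive boosting. The dataset is synthetic and the hyperparameters are arbitrary; it is not the course's exact code.

```python
# Minimal sketch (synthetic data, arbitrary hyperparameters): hard voting across
# three different model types, compared with AdaBoost on the same split.
from sklearn.datasets import make_classification
from sklearn.ensemble import VotingClassifier, AdaBoostClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.tree import DecisionTreeClassifier
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

X, y = make_classification(n_samples=500, n_features=10, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Hard voting: each fitted model casts one vote, and the majority class wins.
voter = VotingClassifier(estimators=[
    ("lr", LogisticRegression(max_iter=1000)),
    ("dt", DecisionTreeClassifier(max_depth=4)),
    ("knn", KNeighborsClassifier()),
])
voter.fit(X_train, y_train)

# Boosting: a sequence of weak learners, each weighted toward prior mistakes.
booster = AdaBoostClassifier(n_estimators=100, random_state=0)
booster.fit(X_train, y_train)

print("voting  :", accuracy_score(y_test, voter.predict(X_test)))
print("boosting:", accuracy_score(y_test, booster.predict(X_test)))
```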
My course #91 was Data Analysis in Spreadsheets (a rough Python analogue follows the list). This course covers:
- First function: ROUND
- Function composition: SQRT
- Functions and ranges: MIN, MAX
- Selecting ranges: SUM, AVERAGE, MEDIAN
- Multiple arguments: RANK
- String manipulation: LEFT, RIGHT
- String information: LEN, SEARCH
- Combining strings: CONCATENATE
- Date functions: WEEKDAY
- Comparing dates
- Combining functions
- Flow control: IF
- Nested logical functions: IF
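This last course is about Google Sheets rather than Python, but most of the functions map cleanly onto Python builtins. The following sketch, using an invented list of prices, is my own rough translation rather than anything from the course.

```python
# Rough Python analogue (my own mapping, made-up data) of a few of the
# spreadsheet functions listed above.
import math

prices = [19.99, 4.50, 127.25, 8.00]

rounded = round(sum(prices), 1)              # ROUND(SUM(range), 1)
lowest, highest = min(prices), max(prices)   # MIN / MAX
average = sum(prices) / len(prices)          # AVERAGE
root = math.sqrt(highest)                    # SQRT

label = "Item-0042"
left3 = label[:3]                            # LEFT(label, 3)
right4 = label[-4:]                          # RIGHT(label, 4)
length = len(label)                          # LEN
combined = left3 + "_" + right4              # CONCATENATE

# IF(average > 10, "expensive", "cheap")
verdict = "expensive" if average > 10 else "cheap"

print(rounded, lowest, highest, round(average, 2), round(root, 2), combined, verdict)
```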
I will review more courses next time.