  • Course Skill Level:

    Foundational to Intermediate

  • Course Duration:

    3 days

  • Course Delivery Format:

    Live, instructor-led.

  • Course Category:

    AI / Machine Learning

  • Course Code:

    ENLEPYL21E09

Who should attend & recommended skills:

Those with Python, Linux, & linear algebra skills who wish to create ensemble models using Python

  • This course is geared for those who want to combine popular machine learning techniques to create ensemble models using Python.
  • Skill level: Foundation-level ensemble learning with Python for intermediate-skilled team members. This is not a basic class.
  • Python: Basic to Intermediate (1-5 years’ experience)
  • Linear Algebra: Basic (1-2 years’ experience)
  • Linux: Basic (1-2 years’ experience), including familiarity with basic commands such as ls, cd, cp, and su

About this course

Ensembling is a technique for combining two or more similar or dissimilar machine learning algorithms to create a model that delivers superior predictive power. This course demonstrates how a variety of weak algorithms can be combined into a strong predictive model. With its hands-on approach, you’ll get up to speed not only with the basic theory but also with the application of different ensemble learning techniques.

Using examples and real-world datasets, you’ll produce better machine learning models to solve supervised learning problems such as classification and regression. You’ll also leverage ensemble learning techniques such as clustering to produce unsupervised machine learning models. As you progress, the lessons cover machine learning algorithms that are widely used in practice to make predictions and classifications, and you’ll get to grips with Python libraries such as scikit-learn and Keras for implementing different ensemble models.

By the end of this course, you will be well versed in ensemble learning, able to judge which ensemble method a given problem calls for, and ready to implement these methods successfully in real-world scenarios.
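
As a quick taste of that idea, the following is a minimal, illustrative sketch (not course material) comparing a single weak learner with an ensemble built from many copies of it, using scikit-learn; the breast-cancer dataset and every hyperparameter here are arbitrary choices for demonstration only.

    # Illustrative sketch: a single weak learner vs. an ensemble of weak learners.
    # Dataset and hyperparameters are arbitrary, not taken from the course.
    from sklearn.datasets import load_breast_cancer
    from sklearn.ensemble import AdaBoostClassifier
    from sklearn.model_selection import cross_val_score
    from sklearn.tree import DecisionTreeClassifier

    X, y = load_breast_cancer(return_X_y=True)

    # A decision "stump" (a depth-1 tree) is a classic weak learner.
    stump = DecisionTreeClassifier(max_depth=1, random_state=0)

    # Boosting combines many stumps into a far stronger model.
    ensemble = AdaBoostClassifier(
        DecisionTreeClassifier(max_depth=1), n_estimators=200, random_state=0
    )

    print("single stump :", cross_val_score(stump, X, y, cv=5).mean())
    print("AdaBoost(200):", cross_val_score(ensemble, X, y, cv=5).mean())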

Skills acquired & topics covered

Working in a hands-on learning environment, led by our Python expert instructor, students will learn about and explore:

  • Implementing ensemble models using algorithms such as random forests and AdaBoost
  • Applying boosting, bagging, and stacking ensemble methods to improve the prediction accuracy of your model
  • Exploring real-world data sets and practical examples coded in scikit-learn and Keras
  • Implementing ensemble methods to generate models with high accuracy
  • Overcoming challenges such as bias and variance
  • Exploring machine learning algorithms to evaluate model performance
  • Understanding how to construct, evaluate, and apply ensemble models
  • Analyzing tweets in real time using Twitter’s streaming API
  • Using Keras to build an ensemble of neural networks for the MovieLens dataset (see the sketch after this list)
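
The last bullet above hints at ensembles of neural networks. Below is a hypothetical, minimal sketch of that pattern: several independently trained Keras networks whose predictions are averaged. It uses a synthetic regression problem instead of MovieLens, and the layer sizes, epochs, and ensemble size are arbitrary choices.

    # Illustrative sketch: averaging an ensemble of small Keras networks.
    # Synthetic data stands in for MovieLens; all sizes/epochs are arbitrary.
    import numpy as np
    from sklearn.datasets import make_regression
    from sklearn.model_selection import train_test_split
    from tensorflow import keras

    X, y = make_regression(n_samples=1000, n_features=20, noise=5.0, random_state=0)
    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

    def make_net():
        # Each member gets its own random initialization, so members differ.
        model = keras.Sequential([
            keras.Input(shape=(X.shape[1],)),
            keras.layers.Dense(32, activation="relu"),
            keras.layers.Dense(1),
        ])
        model.compile(optimizer="adam", loss="mse")
        return model

    # Train several networks independently and average their predictions.
    members = [make_net() for _ in range(3)]
    for net in members:
        net.fit(X_train, y_train, epochs=20, verbose=0)

    preds = np.mean([net.predict(X_test, verbose=0).ravel() for net in members], axis=0)
    print("ensemble test MSE:", np.mean((preds - y_test) ** 2))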

Course breakdown / modules

  • Technical requirements
  • Learning from data
  • Supervised and unsupervised learning
  • Performance measures
  • Machine learning algorithms

  • Technical requirements
  • Bias, variance, and the trade-off
  • Ensemble learning
  • Difficulties in ensemble learning

  • Technical requirements
  • Hard and soft voting
  • Python implementation
  • Using scikit-learn
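
The module above covers majority (hard) and probability-averaged (soft) voting. As a preview only, here is a minimal sketch using scikit-learn's VotingClassifier; the dataset and the three base learners are arbitrary demonstration choices, not the course's own examples.

    # Illustrative sketch of hard vs. soft voting with scikit-learn.
    from sklearn.datasets import load_breast_cancer
    from sklearn.ensemble import VotingClassifier
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import cross_val_score
    from sklearn.naive_bayes import GaussianNB
    from sklearn.neighbors import KNeighborsClassifier

    X, y = load_breast_cancer(return_X_y=True)
    base = [
        ("lr", LogisticRegression(max_iter=5000)),
        ("nb", GaussianNB()),
        ("knn", KNeighborsClassifier()),
    ]

    hard = VotingClassifier(estimators=base, voting="hard")  # majority of predicted labels
    soft = VotingClassifier(estimators=base, voting="soft")  # average of predicted probabilities

    print("hard voting:", cross_val_score(hard, X, y, cv=5).mean())
    print("soft voting:", cross_val_score(soft, X, y, cv=5).mean())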

  • Technical requirements
  • Meta-learning
  • Deciding on an ensemble composition
  • Python implementation
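
This module deals with stacking, where a meta-learner is trained on the predictions of the base learners. The sketch below is a minimal illustration using scikit-learn's StackingClassifier; the dataset, base learners, and meta-learner are arbitrary choices.

    # Illustrative sketch of stacking (meta-learning) with scikit-learn.
    from sklearn.datasets import load_breast_cancer
    from sklearn.ensemble import StackingClassifier
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import cross_val_score
    from sklearn.neighbors import KNeighborsClassifier
    from sklearn.tree import DecisionTreeClassifier

    X, y = load_breast_cancer(return_X_y=True)

    stack = StackingClassifier(
        estimators=[
            ("tree", DecisionTreeClassifier(max_depth=3, random_state=0)),
            ("knn", KNeighborsClassifier()),
        ],
        final_estimator=LogisticRegression(max_iter=5000),  # the meta-learner
        cv=5,  # base-learner predictions are generated out-of-fold
    )

    print("stacking accuracy:", cross_val_score(stack, X, y, cv=5).mean())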

  • Technical requirements
  • Bootstrapping
  • Bagging
  • Python implementation
  • Using scikit-learn
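
This module covers bootstrapping and bagging. As a minimal, illustrative preview, the sketch below bags decision trees with scikit-learn's BaggingClassifier and compares the result with a single tree; the dataset and parameters are arbitrary.

    # Illustrative sketch of bootstrap aggregation (bagging) with scikit-learn.
    from sklearn.datasets import load_breast_cancer
    from sklearn.ensemble import BaggingClassifier
    from sklearn.model_selection import cross_val_score
    from sklearn.tree import DecisionTreeClassifier

    X, y = load_breast_cancer(return_X_y=True)

    single_tree = DecisionTreeClassifier(random_state=0)

    # Each ensemble member is trained on a bootstrap sample (drawn with replacement).
    bagged_trees = BaggingClassifier(
        DecisionTreeClassifier(), n_estimators=100, bootstrap=True, random_state=0
    )

    print("single tree :", cross_val_score(single_tree, X, y, cv=5).mean())
    print("bagged trees:", cross_val_score(bagged_trees, X, y, cv=5).mean())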

  • Technical requirements
  • AdaBoost
  • Gradient boosting
  • Using scikit-learn
  • XGBoost
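
This module moves on to boosting with AdaBoost, gradient boosting, and XGBoost. The sketch below shows gradient boosting with scikit-learn only; XGBoost offers a similar fit/predict interface if it is installed. Dataset and hyperparameters are arbitrary demonstration choices.

    # Illustrative sketch of gradient boosting with scikit-learn.
    from sklearn.datasets import load_breast_cancer
    from sklearn.ensemble import GradientBoostingClassifier
    from sklearn.model_selection import cross_val_score

    X, y = load_breast_cancer(return_X_y=True)

    # Trees are added sequentially; each new tree corrects the errors of the
    # ensemble built so far.
    gbm = GradientBoostingClassifier(
        n_estimators=200, learning_rate=0.1, max_depth=2, random_state=0
    )

    print("gradient boosting accuracy:", cross_val_score(gbm, X, y, cv=5).mean())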

  • Technical requirements
  • Understanding random forest trees
  • Creating forests
  • Using scikit-learn
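
This module looks at random forests. The minimal sketch below fits scikit-learn's RandomForestClassifier and reads off the per-feature importances the fitted model exposes; the dataset and settings are arbitrary.

    # Illustrative sketch of a random forest with scikit-learn.
    from sklearn.datasets import load_breast_cancer
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import train_test_split

    X, y = load_breast_cancer(return_X_y=True)
    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

    forest = RandomForestClassifier(n_estimators=200, random_state=0)
    forest.fit(X_train, y_train)

    print("test accuracy:", forest.score(X_test, y_test))
    print("largest feature importance:", forest.feature_importances_.max())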

  • Technical requirements
  • Consensus clustering
  • Using OpenEnsembles

  • Technical requirements
  • Getting familiar with the dataset
  • Exploratory analysis
  • Voting
  • Stacking
  • Bagging
  • Boosting
  • Using random forests
  • Comparative analysis of ensembles

  • Technical requirements
  • Time series data
  • Voting
  • Stacking
  • Bagging
  • Boosting
  • Random forests

  • Technical requirements
  • Sentiment analysis tools
  • Getting Twitter data
  • Creating a model
  • Classifying tweets in real time

  • Technical requirements
  • Demystifying recommendation systems
  • Neural recommendation systems
  • Using Keras for movie recommendations

  • Technical requirements
  • Understanding the World Happiness Report
  • Creating the ensemble
  • Gaining insights