Boosting Algorithms in machine learning, part 1: AdaBoost

In machine learning, boosting is an ensemble learning method that combines several weak learners into a single strong learner. The idea is to train the weak learners sequentially, each trying its best not to repeat the mistakes of its predecessor, eventually building a strong ensemble.

In this article, we will learn about one of the most popular boosting techniques, AdaBoost, and see how elegantly it lets each weak learner pass its mistakes on to the next one, eventually improving the quality of the predictions.

In the opening sentence, I mentioned that boosting is a kind of ensemble learning that combines several weak learners into a single strong learner. Before digging in, let's get familiar with this terminology.

What is an Ensemble?

An ensemble is a technique in machine learning where multiple models are combined to improve the overall performance and generalization of a system. The idea behind ensembling is to leverage the strengths of different models and compensate for their individual weaknesses. By combining diverse models, the ensemble can often achieve better results than any of its individual components. Two common ensemble strategies are:

  1. Bagging (Bootstrap Aggregating): In bagging, multiple instances of the same learning algorithm are trained on different subsets of the training data. Each model is trained independently, and their predictions are combined through averaging or voting to make the final prediction. Random Forest is a popular example of a bagging ensemble, where decision trees are trained on different subsets of the data and their predictions are averaged.

  2. Boosting: Boosting focuses on sequentially training models, with each subsequent model giving more attention to the instances that the previous models found difficult. The predictions of each model are combined in a weighted manner, giving more weight to the models that perform better. Examples of boosting algorithms include AdaBoost, Gradient Boosting Machines (GBM), and XGBoost.
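To make the contrast concrete, here is a minimal sketch comparing the two strategies using scikit-learn (this assumes scikit-learn is installed; the dataset is synthetic, generated just for illustration):

```python
# Comparing bagging and boosting on the same synthetic dataset.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier, AdaBoostClassifier
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=500, n_features=10, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

# Bagging: independent trees on bootstrap samples, predictions combined by voting.
bagging = RandomForestClassifier(n_estimators=100, random_state=42)
bagging.fit(X_train, y_train)

# Boosting: shallow trees trained sequentially, each focusing on earlier mistakes.
boosting = AdaBoostClassifier(n_estimators=100, random_state=42)
boosting.fit(X_train, y_train)

print("Random Forest (bagging) accuracy:", bagging.score(X_test, y_test))
print("AdaBoost (boosting) accuracy:", boosting.score(X_test, y_test))
```

Both ensembles use decision trees underneath, but the forest trains them independently in parallel, while AdaBoost trains them one after another with reweighted samples.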

Understand why there is a need for boosting algorithms

Introduction to AdaBoost algorithms
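The core of AdaBoost can be summarized by a few textbook formulas (here h_t is the t-th weak learner, w_i the weight of sample i, and labels are y_i ∈ {−1, +1}):

```latex
% Weighted error of weak learner h_t under the current sample weights:
\epsilon_t = \sum_{i=1}^{N} w_i \,\mathbb{1}\left[h_t(x_i) \neq y_i\right]

% Learner weight: the more accurate the learner, the larger its vote:
\alpha_t = \frac{1}{2} \ln\!\left(\frac{1 - \epsilon_t}{\epsilon_t}\right)

% Sample-weight update: misclassified points are up-weighted
% (Z_t is a normalizer so the weights sum to 1):
w_i \leftarrow \frac{w_i \exp\!\left(-\alpha_t \, y_i \, h_t(x_i)\right)}{Z_t}

% Final strong classifier: a weighted majority vote of all weak learners:
H(x) = \operatorname{sign}\!\left(\sum_{t=1}^{T} \alpha_t \, h_t(x)\right)
```

The weight update is where mistakes get "passed on": points the current learner got wrong carry more weight in the next round, forcing the next learner to focus on them.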

Implement the AdaBoost algorithm for binary classification in Python and look at the individual weak learners as well as the final predictions with some interesting visualizations.
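As a starting point, here is a from-scratch sketch of AdaBoost using one-feature threshold "stumps" as weak learners. Function and variable names are my own for illustration, not from any library, and the tiny dataset at the end exists only as a sanity check:

```python
# Minimal AdaBoost for binary labels y in {-1, +1} with decision stumps.
import numpy as np

def fit_stump(X, y, w):
    """Pick the (feature, threshold, polarity) stump with lowest weighted error."""
    best, best_err = None, np.inf
    for j in range(X.shape[1]):
        for thresh in np.unique(X[:, j]):
            for polarity in (1, -1):
                pred = np.where(polarity * (X[:, j] - thresh) >= 0, 1, -1)
                err = np.sum(w[pred != y])
                if err < best_err:
                    best_err, best = err, (j, thresh, polarity)
    return best, best_err

def stump_predict(stump, X):
    j, thresh, polarity = stump
    return np.where(polarity * (X[:, j] - thresh) >= 0, 1, -1)

def adaboost_fit(X, y, n_rounds=10):
    w = np.full(len(y), 1.0 / len(y))   # start with uniform sample weights
    stumps, alphas = [], []
    for _ in range(n_rounds):
        stump, err = fit_stump(X, y, w)
        err = max(err, 1e-10)            # guard against log(0) for perfect stumps
        alpha = 0.5 * np.log((1 - err) / err)
        pred = stump_predict(stump, X)
        w *= np.exp(-alpha * y * pred)   # up-weight the misclassified points
        w /= w.sum()                     # renormalize so weights sum to 1
        stumps.append(stump)
        alphas.append(alpha)
    return stumps, alphas

def adaboost_predict(stumps, alphas, X):
    scores = sum(a * stump_predict(s, X) for s, a in zip(stumps, alphas))
    return np.sign(scores)

# Tiny sanity check on linearly separable 1-D data:
X = np.array([[1.0], [2.0], [3.0], [6.0], [7.0], [8.0]])
y = np.array([-1, -1, -1, 1, 1, 1])
stumps, alphas = adaboost_fit(X, y, n_rounds=5)
print(adaboost_predict(stumps, alphas, X))  # all training points classified correctly
```

Each round implements the formulas from the previous section directly: find the best stump under the current weights, compute its vote alpha, then reweight the samples before the next round.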


Last updated 1 year ago
